<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: gignico</title><link>https://news.ycombinator.com/user?id=gignico</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Mon, 13 Apr 2026 17:26:29 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=gignico" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by gignico in "A Canonical Generalization of OBDD"]]></title><description><![CDATA[
<p>Thanks!<p>The arXiv submission says the paper was submitted to SAT26; did it get accepted?</p>
]]></description><pubDate>Mon, 13 Apr 2026 09:09:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=47749566</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47749566</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47749566</guid></item><item><title><![CDATA[New comment by gignico in "A Canonical Generalization of OBDD"]]></title><description><![CDATA[
<p>Positively surprised to see stuff like this on the HN front page!<p>If any author is around, do you have an implementation that can be compared with CUDD and similar BDD libraries?</p>
]]></description><pubDate>Mon, 13 Apr 2026 06:48:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=47748529</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47748529</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47748529</guid></item><item><title><![CDATA[New comment by gignico in "Scientists are working on "everything vaccines""]]></title><description><![CDATA[
<p>I know about reproductive pressure and I’ve read The Selfish Gene. What you say is correct, but it does not explain the “if evolution did not, better not do it” attitude of the original comment, which I think is wrong for many reasons, as I’ve written.</p>
]]></description><pubDate>Sun, 05 Apr 2026 14:47:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=47650003</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47650003</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47650003</guid></item><item><title><![CDATA[New comment by gignico in "Scientists are working on "everything vaccines""]]></title><description><![CDATA[
<p>The problem is implying that “if evolution did not do it there must be a reason”, because 1) it makes evolution look like an engineer evaluating trade-offs, which it is not, and 2) it treats the current state of affairs as the final “product”, which it is not. For example, flowers did not exist in the Jurassic, so somebody looking at what evolution had done until then would say “if evolution did not invent flowers, then we’d better not do it”. But of course that’s absurd.<p>Also, as I said, evolution is not a process toward a goal. There are 8 billion people in the world, which shows Homo sapiens is quite fit for its environment, so the pressure to evolve further features is quite low.</p>
]]></description><pubDate>Sun, 05 Apr 2026 13:26:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47649229</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47649229</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47649229</guid></item><item><title><![CDATA[New comment by gignico in "Modern Generic SVGA driver for Windows 3.1"]]></title><description><![CDATA[
<p>> Running Windows 3.1 in True Color Full HD<p>People from the time would be astonished by the hardware we have now, yet bloated software gobbles up every ounce of performance. What a waste! </granny mode=off></p>
]]></description><pubDate>Sun, 05 Apr 2026 07:23:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=47647010</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47647010</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47647010</guid></item><item><title><![CDATA[New comment by gignico in "Scientists are working on "everything vaccines""]]></title><description><![CDATA[
<p>Evolution is not a process toward better quality of life and life expectancy for individuals. As long as enough individuals reach the age to procreate in their environment, evolution is done. Evolution didn’t train our bodies to reject the diseases we already have vaccines for either, so your reasoning would apply to smallpox as well. And what about viruses that appeared after Homo sapiens evolved (such as HIV)?</p>
]]></description><pubDate>Sat, 04 Apr 2026 11:30:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=47638103</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47638103</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47638103</guid></item><item><title><![CDATA[New comment by gignico in "Intuiting Pratt Parsing"]]></title><description><![CDATA[
<p>Until you need to do more than all-or-nothing parsing :) See tree-sitter, for example, or any other efficient incremental parser behind an LSP implementation.</p>
]]></description><pubDate>Wed, 01 Apr 2026 11:23:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=47599384</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47599384</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47599384</guid></item><item><title><![CDATA[New comment by gignico in "Reports of code's death are greatly exaggerated"]]></title><description><![CDATA[
<p>I don’t know if someone has said it already, but when Steve Jobs quoted this famous line (originally Mark Twain’s: “reports of my death are greatly exaggerated”) he died only a few years later.<p>Hope this does not happen to code :)</p>
]]></description><pubDate>Mon, 23 Mar 2026 13:52:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=47489568</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47489568</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47489568</guid></item><item><title><![CDATA[New comment by gignico in "My “grand vision” for Rust"]]></title><description><![CDATA[
<p>Your statement contradicts itself. It was unusably hard before non-lexical lifetimes, but they vastly increased the complexity? Then maybe what’s complex for <i>compiler writers</i> to implement can make users’ lives easier by lowering the complexity of <i>their</i> code?</p>
]]></description><pubDate>Mon, 09 Mar 2026 16:40:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=47311445</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47311445</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47311445</guid></item><item><title><![CDATA[New comment by gignico in "My “grand vision” for Rust"]]></title><description><![CDATA[
<p>I'm not sure it shows that. Even basic features of Rust we take for granted come from concepts common users do not need to understand. Borrowing and lifetimes draw from affine types, but nobody cares about that when writing Rust code. If in 2012 you had read a similar article explaining borrow checking in academic terms, you would have thought Rust would be unusably hard, which it is not.<p>Also, I do not think that adding features is always bad, to the point of comparing it with Scala. Most of the things the article mentions will be almost invisible to users. For example, the `!Forget` thing it mentions will just mean users get new errors for things that would previously have caused memory leaks. What a disgrace!<p>Then, pattern types let you remove panics from code, which is super helpful in the many critical contexts where Rust is used in production, even in the Linux kernel once they bump the language version far enough.</p>
]]></description><pubDate>Mon, 09 Mar 2026 16:02:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=47310839</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47310839</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47310839</guid></item><item><title><![CDATA[New comment by gignico in "Devirtualization and Static Polymorphism"]]></title><description><![CDATA[
<p>Well, Rust too has been around for more than 10 years now. I'm not implying Rust <i>invented</i> the approach; surely academia knew about it decades earlier. I was rather commenting on how one's mental model of things can change by learning new languages.</p>
]]></description><pubDate>Mon, 09 Mar 2026 08:38:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=47306288</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47306288</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47306288</guid></item><item><title><![CDATA[New comment by gignico in "My “grand vision” for Rust"]]></title><description><![CDATA[
<p>I think the misunderstanding here is that the article was not intended for users but for other language designers.<p>As a user, using a feature such as pattern types will feel natural if you know the rest of the language.<p>Do you have a function that accepts an enum `MyEnum` but has an `unreachable!()` for some variant you know is impossible at that point?<p>Then you can accept a `MyEnum is MyEnum::Variant | MyEnum::OtherVariant` instead of `MyEnum` to state which variants are accepted, and the pattern match will not need that `unreachable!()` anymore.<p>The fact that someone does not know this is called "refinement types" does not limit their ability to use the feature effectively.</p>
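A minimal sketch of the situation described above, in today's stable Rust (the `MyEnum`, `handle`, and the invariant are illustrative; pattern-type syntax is not yet available, so it appears only in a comment):

```rust
#[allow(dead_code)]
#[derive(Debug)]
enum MyEnum {
    Variant,
    OtherVariant,
    Impossible, // callers uphold an invariant: this never reaches `handle`
}

// Today the invariant lives in a runtime panic inside the match.
fn handle(e: MyEnum) -> u32 {
    match e {
        MyEnum::Variant => 1,
        MyEnum::OtherVariant => 2,
        MyEnum::Impossible => unreachable!("callers never pass Impossible"),
    }
}

// With pattern types (hypothetical, unstable syntax) the signature would
// instead be `fn handle(e: MyEnum is MyEnum::Variant | MyEnum::OtherVariant)`
// and the `Impossible` arm, with its panic, would disappear entirely.

fn main() {
    assert_eq!(handle(MyEnum::Variant), 1);
    assert_eq!(handle(MyEnum::OtherVariant), 2);
    println!("ok");
}
```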
]]></description><pubDate>Mon, 09 Mar 2026 08:27:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=47306221</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47306221</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47306221</guid></item><item><title><![CDATA[New comment by gignico in "Galileo's handwritten notes found in ancient astronomy text"]]></title><description><![CDATA[
<p>It’s unbelievable how that 16th-century book looks like it was written in LaTeX. Or plain TeX, probably, given its age XD</p>
]]></description><pubDate>Sat, 07 Mar 2026 07:35:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=47285390</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47285390</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47285390</guid></item><item><title><![CDATA[New comment by gignico in "10-202: Introduction to Modern AI (CMU)"]]></title><description><![CDATA[
<p>I think the problem is the under-representation of other branches of AI research: knowledge representation, automated reasoning, planning, etc.<p>These are important topics with important industrial applications, whose only downsides are that they are not suitable for implementing friendly chatbots or for raising the stock prices of Silicon Valley companies.</p>
]]></description><pubDate>Sun, 01 Mar 2026 08:47:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=47204929</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47204929</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47204929</guid></item><item><title><![CDATA[New comment by gignico in "Devirtualization and Static Polymorphism"]]></title><description><![CDATA[
<p>> Under the hood, a virtual table (vtable) is created for each class, and a pointer (vptr) to the vtable is added to each instance.<p>Coming from C++ I assumed this was the only way, but Rust has an interesting approach where individual objects pay no per-instance cost because virtual dispatch is handled by fat pointers. You carry the `vptr` around in fat pointers (`&dyn MyTrait`) only when needed, not in every instance.</p>
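This can be checked directly with `std::mem::size_of` (the `Shape`/`Circle` names are just for illustration): the value itself is exactly its fields, while the vtable pointer appears only in the `&dyn` reference.

```rust
use std::mem::size_of;

trait Shape {
    fn area(&self) -> f64;
}

struct Circle {
    r: f64,
}

impl Shape for Circle {
    fn area(&self) -> f64 {
        std::f64::consts::PI * self.r * self.r
    }
}

fn main() {
    // A Circle instance carries no vptr: it is just its f64 field.
    assert_eq!(size_of::<Circle>(), size_of::<f64>());
    // A plain reference is one word; a trait-object reference is a
    // fat pointer: data pointer + vtable pointer, i.e. two words.
    assert_eq!(size_of::<&dyn Shape>(), 2 * size_of::<&Circle>());
    println!("ok");
}
```

So a million `Circle`s in a `Vec<Circle>` cost no vtable overhead at all; you only pay the extra word at the point where you actually need dynamic dispatch.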
]]></description><pubDate>Wed, 25 Feb 2026 22:10:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=47158754</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47158754</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47158754</guid></item><item><title><![CDATA[New comment by gignico in "Defer available in gcc and clang"]]></title><description><![CDATA[
<p>That’s great news!</p>
]]></description><pubDate>Fri, 20 Feb 2026 10:12:31 +0000</pubDate><link>https://news.ycombinator.com/item?id=47086013</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47086013</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47086013</guid></item><item><title><![CDATA[New comment by gignico in "Defer available in gcc and clang"]]></title><description><![CDATA[
<p>I was under the wrong impression that defer had already been approved for the next standard.</p>
]]></description><pubDate>Fri, 20 Feb 2026 08:15:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=47085172</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47085172</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47085172</guid></item><item><title><![CDATA[New comment by gignico in "Defer available in gcc and clang"]]></title><description><![CDATA[
<p>Absolutely, it's not their first language. In our curriculum C programming is part of the Operating Systems course and comes after Computer Architecture, where they see assembly. So its purpose is to go low-level and show what's under the hood. To learn programming itself they use other languages (currently Java, for better or worse, but I don't have a voice in that choice).</p>
]]></description><pubDate>Fri, 20 Feb 2026 08:14:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47085165</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47085165</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47085165</guid></item><item><title><![CDATA[New comment by gignico in "Defer available in gcc and clang"]]></title><description><![CDATA[
<p>I’m about to start teaching C programming classes to first-year university CS students. Would you teach `defer` straight away to manage allocated memory?</p>
]]></description><pubDate>Fri, 20 Feb 2026 06:52:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=47084606</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=47084606</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47084606</guid></item><item><title><![CDATA[New comment by gignico in "Claude’s C Compiler vs. GCC"]]></title><description><![CDATA[
<p>Exactly. This flawed argument that everything will be fixed by future models drives me crazy every time.</p>
]]></description><pubDate>Mon, 09 Feb 2026 06:45:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=46942284</link><dc:creator>gignico</dc:creator><comments>https://news.ycombinator.com/item?id=46942284</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46942284</guid></item></channel></rss>