<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: snarkconjecture</title><link>https://news.ycombinator.com/user?id=snarkconjecture</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Fri, 24 Apr 2026 08:28:13 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=snarkconjecture" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by snarkconjecture in "Framework Laptop 13 Pro"]]></title><description><![CDATA[
<p>I am much <i>more</i> likely to fault them for omitting important information <i>specifically to hide a weak point of the product</i> rather than out of laziness.</p>
]]></description><pubDate>Tue, 21 Apr 2026 20:05:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=47853835</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=47853835</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47853835</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Quine Game"]]></title><description><![CDATA[
<p>Write a Python quine with three or four hands tied behind your back</p>
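For reference, one standard Python quine construction (a generic example, not a solution under the game's handicaps): the string holds a template of the whole program, %r re-quotes it, and %% escapes the percent sign.

```python
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints exactly those two lines: source and output are identical character for character.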
]]></description><pubDate>Thu, 02 Apr 2026 07:21:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47611071</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=47611071</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47611071</guid></item><item><title><![CDATA[Quine Game]]></title><description><![CDATA[
<p>Article URL: <a href="https://adam.scherl.is/static/quine-game/">https://adam.scherl.is/static/quine-game/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47611070">https://news.ycombinator.com/item?id=47611070</a></p>
<p>Points: 1</p>
<p># Comments: 1</p>
]]></description><pubDate>Thu, 02 Apr 2026 07:21:16 +0000</pubDate><link>https://adam.scherl.is/static/quine-game/</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=47611070</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47611070</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Too Much Color"]]></title><description><![CDATA[
<p>Computer screens have three-dimensional color spaces. Tetrachromacy doesn't change that.</p>
]]></description><pubDate>Fri, 20 Mar 2026 10:12:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=47452596</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=47452596</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47452596</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Show HN: What's my JND? – a colour guessing game"]]></title><description><![CDATA[
<p>Tetrachromacy wouldn't affect a test taken through a phone screen.</p>
]]></description><pubDate>Wed, 11 Mar 2026 05:28:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=47331996</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=47331996</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47331996</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Billion-Parameter Theories"]]></title><description><![CDATA[
<p>Deep neural networks can generalize well even when they're far into the overparametrized regime where classical statistical learning theory predicts overfitting. This is usually called "double descent" and there are many papers on it.</p>
]]></description><pubDate>Tue, 10 Mar 2026 23:55:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=47330276</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=47330276</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47330276</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Why do people keep writing about the imaginary compound Cr2Gr2Te6?"]]></title><description><![CDATA[
<p>It's more like saying pi is approximately "3..14". Easily corrected syntax errors aren't as bad as semantic errors.</p>
]]></description><pubDate>Wed, 27 Aug 2025 03:05:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=45035025</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=45035025</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45035025</guid></item><item><title><![CDATA[Hex Runes]]></title><description><![CDATA[
<p>Article URL: <a href="https://adamscherlis.github.io/blog/hex-runes/">https://adamscherlis.github.io/blog/hex-runes/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=43215384">https://news.ycombinator.com/item?id=43215384</a></p>
<p>Points: 2</p>
<p># Comments: 0</p>
]]></description><pubDate>Sat, 01 Mar 2025 03:16:51 +0000</pubDate><link>https://adamscherlis.github.io/blog/hex-runes/</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=43215384</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43215384</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Hot take: GPT 4.5 is a nothing burger"]]></title><description><![CDATA[
<p>Version numbers for LLMs don't mean anything consistent. They don't even publicly announce, at this point, which models are built from new base models and which aren't. I'm pretty sure Claude 3.5 was a new set of base models after Claude 3.<p>What do you mean by "it's a 1.0" and "3rd iteration"? I'm having trouble parsing those in context.</p>
]]></description><pubDate>Fri, 28 Feb 2025 23:49:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=43213482</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=43213482</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43213482</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Iterated Log Coding"]]></title><description><![CDATA[
<p>Not really. Dirac's trick works entirely at a depth of two logs, using repeated square roots like unary notation to increment the number. It requires O(n) symbols to represent the number n, i.e. O(2^n) symbols to represent n bits of precision. This scheme allows arbitrary nesting depth of logs (or exps), and can represent a number to n bits of precision in O(n) symbols.</p>
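To make the contrast concrete, here is a sketch of Dirac's trick in Python (a hypothetical helper, not from the article): n square roots of 2 under two logs encode the integer n, so the expression size grows linearly in n itself, not in its bit length.

```python
import math

def dirac(n: int) -> float:
    """Encode n as -log2(log2(sqrt(sqrt(...sqrt(2)...)))) with n square roots."""
    x = 2.0
    for _ in range(n):        # each sqrt is one unary "tick"
        x = math.sqrt(x)      # x == 2 ** (2 ** -n) after the loop
    return -math.log2(math.log2(x))
```

round(dirac(12)) recovers 12 up to floating-point rounding, but the expression needs twelve sqrt symbols, where positional notation would need only O(log n) digits.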
]]></description><pubDate>Wed, 26 Feb 2025 21:06:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=43188208</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=43188208</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43188208</guid></item><item><title><![CDATA[Iterated Log Coding]]></title><description><![CDATA[
<p>Article URL: <a href="https://adamscherlis.github.io/blog/iterlog-coding/">https://adamscherlis.github.io/blog/iterlog-coding/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=43181610">https://news.ycombinator.com/item?id=43181610</a></p>
<p>Points: 109</p>
<p># Comments: 36</p>
]]></description><pubDate>Wed, 26 Feb 2025 07:43:21 +0000</pubDate><link>https://adamscherlis.github.io/blog/iterlog-coding/</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=43181610</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43181610</guid></item><item><title><![CDATA[Iterated Log Coding]]></title><description><![CDATA[
<p>Article URL: <a href="https://adamscherlis.github.io/blog/iterlog-coding/">https://adamscherlis.github.io/blog/iterlog-coding/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=42607880">https://news.ycombinator.com/item?id=42607880</a></p>
<p>Points: 3</p>
<p># Comments: 0</p>
]]></description><pubDate>Mon, 06 Jan 2025 05:13:03 +0000</pubDate><link>https://adamscherlis.github.io/blog/iterlog-coding/</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=42607880</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42607880</guid></item><item><title><![CDATA[New comment by snarkconjecture in "QwQ: Alibaba's O1-like reasoning LLM"]]></title><description><![CDATA[
<p>I think it's better phrased as "find the best rule", with a tacit understanding that people mostly agree on what makes a rule decent vs. terrible (maybe not on what makes one great) and a tacit promise that the sequence presented has at least one decent rule and does not have multiple.<p>A rule being "good" is largely about simplicity, which is also essentially the trick that deep learning uses to escape no-free-lunch theorems.</p>
]]></description><pubDate>Fri, 29 Nov 2024 05:48:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=42271195</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=42271195</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42271195</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Wealth Distribution in the United States"]]></title><description><![CDATA[
<p>For the individuals shown in the graph, this buys about $6k per American (and after the first year you can't do it again).</p>
]]></description><pubDate>Thu, 10 Oct 2024 02:45:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=41794986</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=41794986</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41794986</guid></item><item><title><![CDATA[New comment by snarkconjecture in "The bunkbed conjecture is false"]]></title><description><![CDATA[
<p>There are two kinds of naturalness principle in physics, sometimes called "technical naturalness" and "Dirac naturalness" respectively.<p>Dirac naturalness is as you describe: skepticism towards extremely large numbers, end of story. It has the flaw you (and every other person who's ever heard it) point out.<p>Technical (or 't Hooft) naturalness is different, and specific to quantum field theory.<p>To cut a long story short, the "effective", observable parameters of the Standard Model, such as the mass of the electron, are really the sums of enormous numbers of contributions from different processes happening in quantum superposition. (Keywords: Feynman diagram, renormalization, effective field theory.) The underlying, "bare" parameters each end up affecting many different observables. You can think of this as a big machine with N knobs and N dials, but where each dial is sensitive to each knob in a complicated way.<p>Technical naturalness states: the sum of the contributions to e.g. the Higgs boson mass should not be many orders of magnitude smaller than each individual contribution, without good reason.<p>The Higgs mass that we observe is not technically natural. As far as we can tell, thousands of different effects due to unrelated processes are all cancelling out to dozens of decimal places, for no reason anyone can discern. There's a dial at 0.000000.., and turning any knob by a tiny bit would put it at 3 or -2 or something.<p>There are still critiques to be made here. Maybe the "underlying" parameters aren't really the only fundamental ones, and somehow the effective ones are also fundamental? Maybe there's some reason things cancel out, which we just haven't done the right math to discover? Maybe there's new physics beyond the SM (as we know there eventually has to be)?<p>But overall it's a situation that, imo, demands an explanation beyond "eh, sometimes numbers are big". 
If you want to say that physical calculations "explain" anything -- if, for example, you think electromagnetism and thermodynamics can "explain" the approximate light output of a 100W bulb -- then you should care about this.</p>
]]></description><pubDate>Thu, 03 Oct 2024 02:09:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=41726666</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=41726666</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41726666</guid></item><item><title><![CDATA[New comment by snarkconjecture in "The Webb Telescope further deepens the Hubble tension controversy in cosmology"]]></title><description><![CDATA[
<p>She's saying that a different model -- one of the three disagreeing methods for distance ladder measurements -- must be wrong, because they disagree with each other. But if one or more of those models are wrong, then there's not much evidence that the LambdaCDM model is wrong.<p>Conversely, the hypothesis that LambdaCDM is wrong does nothing to explain why the distance ladder methods disagree.<p>She clearly isn't saying that any model is infallible, she's just saying that clear flaws with one set of models throw into question some specific accusations that a different model is wrong.<p>You actually need to pay attention to the details; the physicists certainly are. Glib contrarianism isn't very useful here.</p>
]]></description><pubDate>Tue, 13 Aug 2024 19:09:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=41238576</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=41238576</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41238576</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Tsung-Dao Lee, physicist who challenged a law of nature, has died"]]></title><description><![CDATA[
<p>It's the magnetic field that has the arbitrary sign convention. You can't determine the direction of a magnetic field from observations without using the right hand rule.</p>
]]></description><pubDate>Wed, 07 Aug 2024 14:23:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=41181783</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=41181783</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41181783</guid></item><item><title><![CDATA[New comment by snarkconjecture in "What Is Entropy?"]]></title><description><![CDATA[
<p>That definition doesn't work well because you can have changes in entropy even if no energy is transferred, e.g. by exchanging some other conserved quantity.<p>The side note is wrong in letter and spirit; turning potential energy into heat is one way for something to be irreversible, but neither of those statements is true.<p>For example, consider an iron ball being thrown sideways. It hits a pile of sand and stops. The iron ball is not affected structurally, but its kinetic energy is transferred (almost entirely) to heat energy. If the ball is thrown slightly upwards, potential energy increases but the process is still irreversible.<p>Also, the changes of potential energy in corresponding parts of two Carnot cycles are directionally the same, even if one is ideal (reversible) and one is not (irreversible).</p>
]]></description><pubDate>Mon, 22 Jul 2024 20:06:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=41039223</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=41039223</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41039223</guid></item><item><title><![CDATA[New comment by snarkconjecture in "A Dramatic Reading: I Will Fucking Piledrive You If You Mention AI Again"]]></title><description><![CDATA[
<p>It's not a dupe.<p>"I commissioned a professional voice actor to give a full dramatic reading of that blog post."</p>
]]></description><pubDate>Sat, 06 Jul 2024 22:57:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=40893973</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=40893973</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40893973</guid></item><item><title><![CDATA[New comment by snarkconjecture in "Well-known paradox of R-squared is still buggin me"]]></title><description><![CDATA[
<p>R² has an extremely nice property: if you add a bunch of linearly uncorrelated variables together, then compute R² between each variable and the sum, the R² values will sum to 1.</p>
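A quick numeric check of that property (a toy sketch with hand-picked values that are exactly uncorrelated in-sample):

```python
def r_squared(x, y):
    """Squared Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov ** 2 / (vx * vy)

# Three zero-mean, pairwise-uncorrelated variables and their sum.
x1 = [1, -1, 0, 0]
x2 = [0, 0, 1, -1]
x3 = [1, 1, -1, -1]
total = [a + b + c for a, b, c in zip(x1, x2, x3)]

print(sum(r_squared(x, total) for x in (x1, x2, x3)))  # 1.0
```

With uncorrelated variables, each R² against the sum is Var(x_i)/Var(sum), and the variances of the parts add up to the variance of the whole.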
]]></description><pubDate>Thu, 04 Jul 2024 08:00:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=40873184</link><dc:creator>snarkconjecture</dc:creator><comments>https://news.ycombinator.com/item?id=40873184</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40873184</guid></item></channel></rss>