<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: rdlecler1</title><link>https://news.ycombinator.com/user?id=rdlecler1</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 09 Apr 2026 10:36:11 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=rdlecler1" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by rdlecler1 in "You Can Just Buy Far-UVC"]]></title><description><![CDATA[
<p>That wavelength penetrates the skin; you need to be around 222nm for human safety.<p>Uviquity has prototypes of a 220nm solid-state chip they’ll commercialize next year (we’re an investor). A single far-UVC photon will destroy the COVID virus.<p><a href="https://uviquity.com/" rel="nofollow">https://uviquity.com/</a></p>
]]></description><pubDate>Wed, 14 Jan 2026 20:49:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=46623115</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=46623115</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46623115</guid></item><item><title><![CDATA[New comment by rdlecler1 in "The Curious Similarity Between LLMs and Quantum Mechanics"]]></title><description><![CDATA[
<p>You clearly have deep understanding of base reality and the emergence of complexity. So please share an actual argument.</p>
]]></description><pubDate>Tue, 11 Feb 2025 19:20:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=43017084</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=43017084</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43017084</guid></item><item><title><![CDATA[The Curious Similarity Between LLMs and Quantum Mechanics]]></title><description><![CDATA[
<p>Article URL: <a href="https://robleclerc.substack.com/p/the-curious-similarity-between-llms">https://robleclerc.substack.com/p/the-curious-similarity-between-llms</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=43014716">https://news.ycombinator.com/item?id=43014716</a></p>
<p>Points: 16</p>
<p># Comments: 14</p>
]]></description><pubDate>Tue, 11 Feb 2025 16:24:30 +0000</pubDate><link>https://robleclerc.substack.com/p/the-curious-similarity-between-llms</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=43014716</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43014716</guid></item><item><title><![CDATA[Super Intelligence ≠ Hyper Intelligence]]></title><description><![CDATA[
<p>Article URL: <a href="https://robleclerc.substack.com/p/super-intelligence-hyper-intelligence">https://robleclerc.substack.com/p/super-intelligence-hyper-intelligence</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=43005528">https://news.ycombinator.com/item?id=43005528</a></p>
<p>Points: 3</p>
<p># Comments: 0</p>
]]></description><pubDate>Mon, 10 Feb 2025 21:38:56 +0000</pubDate><link>https://robleclerc.substack.com/p/super-intelligence-hyper-intelligence</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=43005528</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43005528</guid></item><item><title><![CDATA[New comment by rdlecler1 in "Elon Musk wants to dominate robotaxis–first he needs to catch up to Waymo"]]></title><description><![CDATA[
<p>It’s the other way around. Waymo may do the job exquisitely, but exquisite is also expensive. Tesla has always constrained their system to lower-cost hardware. It’ll be easier for Tesla to eventually punch through than for Waymo to cut their costs to that of a Tesla robotaxi.</p>
]]></description><pubDate>Thu, 10 Oct 2024 22:50:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=41804380</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=41804380</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41804380</guid></item><item><title><![CDATA[New comment by rdlecler1 in "The AI startup drama that's damaging Y Combinator's reputation"]]></title><description><![CDATA[
<p>Power eventually turns a high trust society into a low trust society when that power is used to one’s own advantage.</p>
]]></description><pubDate>Thu, 03 Oct 2024 13:37:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=41730681</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=41730681</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41730681</guid></item><item><title><![CDATA[New comment by rdlecler1 in "Apple removes nearly 100 VPNs used by Russians to bypass censorship"]]></title><description><![CDATA[
<p>Think different. No kidding. No one is going to think this.</p>
]]></description><pubDate>Wed, 02 Oct 2024 01:30:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=41716333</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=41716333</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41716333</guid></item><item><title><![CDATA[New comment by rdlecler1 in "YC criticized for backing AI startup that simply cloned another AI startup"]]></title><description><![CDATA[
<p>YC is putting through 200+ companies per batch and making a tiny investment. It’s far more efficient to swallow the occasional bad bet than to do a lot of due diligence… Now the VCs that came in after… that’s another story.</p>
]]></description><pubDate>Tue, 01 Oct 2024 16:39:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=41710794</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=41710794</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41710794</guid></item><item><title><![CDATA[New comment by rdlecler1 in "If AI is helping people code better, why aren't products getting better?"]]></title><description><![CDATA[
<p>This is a great point. LLMs may flood an app with more features without making it better. At the end of the day you still need to make something people love. AI may help you build that faster if you know what to build, but it’s not going to make a bad product great.</p>
]]></description><pubDate>Mon, 23 Sep 2024 15:20:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=41627106</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=41627106</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41627106</guid></item><item><title><![CDATA[New comment by rdlecler1 in "Myocarditis complications more common after Covid infection than vaccination"]]></title><description><![CDATA[
<p>Remember there is still survival bias here. We’re not seeing the complications from people who died of the infection.</p>
]]></description><pubDate>Tue, 27 Aug 2024 04:43:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=41364629</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=41364629</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41364629</guid></item><item><title><![CDATA[New comment by rdlecler1 in "America’s Transit Exceptionalism"]]></title><description><![CDATA[
<p>Do driverless cars make all this moot?</p>
]]></description><pubDate>Tue, 23 Jul 2024 23:03:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=41051930</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=41051930</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41051930</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>Super interesting.</p>
]]></description><pubDate>Mon, 15 Jul 2024 23:42:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=40972454</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40972454</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40972454</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>This sits in a larger field of complexity theory and complex adaptive systems. There was also some interesting work on “Artificial Life,” although that research program seems to have fallen out of favor. My introduction in 1995 was the book Chaos, and then Stuart Kauffman’s At Home in the Universe. Wolfram’s A New Kind of Science was also interesting.</p>
]]></description><pubDate>Mon, 15 Jul 2024 21:18:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=40971540</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40971540</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40971540</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>I could have bogged the essay down with qualifiers to address all the potential straw man objections, but that didn't seem productive. It's easy to take an uncharitable view on this, but I do explain more about GRNs later in the essay. I worked with them for 8 years, and yes, they do act like the rudimentary brains of the cell, and that's the reason this system is selected again and again by evolution.</p>
]]></description><pubDate>Mon, 15 Jul 2024 19:18:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=40970682</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40970682</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40970682</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>I'm arguing that this approximator is necessary (but not sufficient) for this class of networks. I've proposed some conjectures on what we might expect to see, but there are certainly other salient ingredients and common principles that we haven't discovered, and I think it's important to hunt for them.</p>
]]></description><pubDate>Mon, 15 Jul 2024 18:02:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=40970055</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40970055</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40970055</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>This is a great question, and I don't yet have an answer. I'm going to butcher this description, so please be charitable, but functionally, the attention mechanism reduces the dimensions and uses the coincidence between the Q and K linear layers to narrow down to a subset of the input, and then the softmax amplifies the signal.<p>One unsatisfying argument might be that this might fall into implementation details for this particular class. Another prediction might be that an attention mechanism is an essential element of these networks that appears in other networks of this class. Another is that this is a decent approximation, but has limitations, and we'll figure out how the brain does it and replace it with that.</p>
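The mechanism described above can be sketched in a few lines. This is a minimal numpy version of standard scaled dot-product attention (single head, no learned projections), offered only as an illustration of the Q–K "coincidence" and softmax amplification, not as a transcription of any particular implementation:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention over row-vector queries/keys/values."""
    # Dot products between queries and keys score the "coincidence"
    # between Q and K; high scores pick out a subset of the input.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax amplifies the strongest matches toward 1 and
    # suppresses the rest toward 0 (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of the value rows.
    return weights @ V
```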
]]></description><pubDate>Mon, 15 Jul 2024 17:40:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=40969894</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40969894</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40969894</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>More generally there are graph neural networks, for instance, but then you’re including many dynamic networks that are not open-ended or evolvable. The idea is to identify common dynamics and add constraints on the types of networks that are included to find general principles within that class. Loosen the constraints and you make the class too broad and can’t identify common principles.</p>
]]></description><pubDate>Mon, 15 Jul 2024 16:03:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=40969078</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40969078</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40969078</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>Genetic systems code gene regulatory networks. I spent most of the essay on them.</p>
]]></description><pubDate>Mon, 15 Jul 2024 15:11:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=40968562</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40968562</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40968562</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>Those neurons are being trained from the day we’re born. Reality corresponds to about 11 million bits per second. What I suspect is happening is that we train higher and higher levels of abstraction, and we get to a point where new knowledge involves training a new permutation of a few high-level neurons.</p>
]]></description><pubDate>Mon, 15 Jul 2024 04:52:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=40965355</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40965355</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40965355</guid></item><item><title><![CDATA[New comment by rdlecler1 in "General Theory of Neural Networks"]]></title><description><![CDATA[
<p>Activation functions are implementation details. See appendix for the general formula.</p>
]]></description><pubDate>Mon, 15 Jul 2024 04:46:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=40965335</link><dc:creator>rdlecler1</dc:creator><comments>https://news.ycombinator.com/item?id=40965335</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40965335</guid></item></channel></rss>