<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: Eliezer</title><link>https://news.ycombinator.com/user?id=Eliezer</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 23 Apr 2026 23:51:16 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=Eliezer" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by Eliezer in "An AI Agent Published a Hit Piece on Me – The Operator Came Forward"]]></title><description><![CDATA[
<p>The safety teams are trivial expenses for them. They fire the safety team because explicit failure makes them look bad, or because the safety team doesn't go along with a party line and gets labeled disloyal.</p>
]]></description><pubDate>Fri, 20 Feb 2026 07:44:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=47084937</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=47084937</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47084937</guid></item><item><title><![CDATA[New comment by Eliezer in "I started programming when I was 7. I'm 50 now and the thing I loved has changed"]]></title><description><![CDATA[
<p>Came here to say the same.</p>
]]></description><pubDate>Wed, 11 Feb 2026 05:35:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=46971264</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=46971264</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46971264</guid></item><item><title><![CDATA[New comment by Eliezer in "AI makes the easy part easier and the hard part harder"]]></title><description><![CDATA[
<p>There are intrinsic limits to vanilla transformer stacks.  Nobody knows where they are.  We don't know how unvanilla Opus 4.6 or GPT 5.3 are.  We don't know what's in development or which new ideas will pan out.  But it will still probably be called an "LLM".</p>
]]></description><pubDate>Tue, 10 Feb 2026 06:40:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=46956133</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=46956133</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46956133</guid></item><item><title><![CDATA[New comment by Eliezer in "AI makes the easy part easier and the hard part harder"]]></title><description><![CDATA[
<p>Every time somebody writes an article like this without any dates and without saying which model they used, my guess is that they've simply failed to internalize the idea that "AI" is a moving target; nor understood that they saw a capability level from a fleeting moment in time, rather than an Eternal Verity about the Forever Limits of AI.</p>
]]></description><pubDate>Mon, 09 Feb 2026 08:40:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=46943049</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=46943049</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46943049</guid></item><item><title><![CDATA[New comment by Eliezer in "AI is a horse (2024)"]]></title><description><![CDATA[
<p>"2024 AI was a horse".  People really like to imagine that the last 6 months constitute their true observation of the new eternal state of the future.</p>
]]></description><pubDate>Fri, 23 Jan 2026 13:50:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=46732454</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=46732454</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46732454</guid></item><item><title><![CDATA[New comment by Eliezer in "Why Busy Beaver hunters fear the Antihydra"]]></title><description><![CDATA[
<p>It would not surprise me at all for BB(7) to exceed Graham's number.  Just a Kirby-Paris hydra or a Goodstein sequence gets you to epsilon zero in the fast-growing hierarchy, where Graham is around omega+2.</p>
]]></description><pubDate>Tue, 28 Oct 2025 06:59:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=45729870</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=45729870</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45729870</guid></item><item><title><![CDATA[New comment by Eliezer in "OpenAI says over a million people talk to ChatGPT about suicide weekly"]]></title><description><![CDATA[
<p>Thanks to OpenAI for voluntarily sharing these important and valuable statistics.  I think these ought to be mandatory government statistics, but until they are or it becomes an industry standard, I will not criticize the first company to helpfully share them, on the basis of what they shared.  Incentives.</p>
]]></description><pubDate>Tue, 28 Oct 2025 05:49:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=45729562</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=45729562</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45729562</guid></item><item><title><![CDATA[New comment by Eliezer in "OpenAI, Nvidia fuel $1T AI market with web of circular deals"]]></title><description><![CDATA[
<p>Privacy.com has always been a working general solution to this, for me.  Disposable CC aliases with spending caps.</p>
]]></description><pubDate>Thu, 09 Oct 2025 11:01:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=45525985</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=45525985</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45525985</guid></item><item><title><![CDATA[New comment by Eliezer in "Why are there so many rationalist cults?"]]></title><description><![CDATA[
<p>I have tried to tell my legions of fanatic brainwashed adherents exactly this, and they have refused to listen to me because the wrong way is more fun for them.<p><a href="https://x.com/ESYudkowsky/status/1075854951996256256" rel="nofollow">https://x.com/ESYudkowsky/status/1075854951996256256</a></p>
]]></description><pubDate>Wed, 13 Aug 2025 02:52:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=44884204</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=44884204</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44884204</guid></item><item><title><![CDATA[New comment by Eliezer in "OpenAI's ChatGPT Agent casually clicks through "I am not a robot" verification"]]></title><description><![CDATA[
<p>My god, how long has it been since you tried to use an AI model?</p>
]]></description><pubDate>Thu, 31 Jul 2025 11:56:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=44744731</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=44744731</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44744731</guid></item><item><title><![CDATA[New comment by Eliezer in "Generative AI is not replacing jobs or hurting wages at all, say economists"]]></title><description><![CDATA[
<p>Doubling the productivity of 20% of workers, in cases where a lower price <i>doesn't</i> increase demand, can shift prices in the whole system as unemployed artists compete with other artists for wages.  AI won't take your job, someone else unemployed by AI will take your job.  (NGDPLT partially solves this but that's a higher competence level than civilization has.)</p>
]]></description><pubDate>Fri, 02 May 2025 12:57:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=43869144</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=43869144</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43869144</guid></item><item><title><![CDATA[New comment by Eliezer in "Generative AI is not replacing jobs or hurting wages at all, say economists"]]></title><description><![CDATA[
<p>Translators?  Graphic artists?  The omission of the most obviously impacted professions immediately identifies this as a cooked study, along with talking about LLMs as "chatbots".  I wonder who paid for it.</p>
]]></description><pubDate>Tue, 29 Apr 2025 15:45:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=43834223</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=43834223</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43834223</guid></item><item><title><![CDATA[New comment by Eliezer in "An end to all this prostate trouble?"]]></title><description><![CDATA[
<p>Poor eyesight is evolutionarily recent (not enough sunlight exposure in childhood, rare to find in hunter-gatherer societies).  Baldness won't kill you.</p>
]]></description><pubDate>Sat, 26 Apr 2025 10:42:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=43802439</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=43802439</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43802439</guid></item><item><title><![CDATA[New comment by Eliezer in "An Overwhelmingly Negative and Demoralizing Force"]]></title><description><![CDATA[
<p>And if it isn't already false it will be false in 6 months, or 1.5 years on the outside.  AI is a moving target, and the oldest people among you might remember a time in the 1750s when it didn't talk to you about code at all.</p>
]]></description><pubDate>Tue, 08 Apr 2025 14:58:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=43622541</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=43622541</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43622541</guid></item><item><title><![CDATA[New comment by Eliezer in "Phind 2: AI search with visual answers and multi-step reasoning"]]></title><description><![CDATA[
<p>If the LLM did anything besides try to explain the Efficient Market Hypothesis in response, it failed.</p>
]]></description><pubDate>Fri, 14 Feb 2025 16:43:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=43050255</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=43050255</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43050255</guid></item><item><title><![CDATA[New comment by Eliezer in "Three Observations"]]></title><description><![CDATA[
<p>Hadn't seen that before, and despite being bog-standard Bostrom it's still more of an attempt to hold a theory than I'd seen associated with him before.  Note nonoverlap of writing style and theory with the present post.</p>
]]></description><pubDate>Mon, 10 Feb 2025 14:15:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=43000534</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=43000534</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43000534</guid></item><item><title><![CDATA[New comment by Eliezer in "Three Observations"]]></title><description><![CDATA[
<p>I wonder who wrote this?  Doesn't sound like Altman's voice.<p>I wonder who theorized this?  Altman isn't known for having models about AGI.<p>To the actual theorist:  Claiming in one paragraph that AI goes as log resources, and in the next paragraph that the resource costs drop by 10x per year, is a contradiction; the latter paragraph shows a dependence on algorithms that is nothing like "it's just the compute silly".</p>
]]></description><pubDate>Sun, 09 Feb 2025 22:05:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=42994555</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=42994555</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42994555</guid></item><item><title><![CDATA[New comment by Eliezer in "Tabby: Self-hosted AI coding assistant"]]></title><description><![CDATA[
<p>No major AI advancements for 7 months?  Guess everyone's jobs are safe for another year, and after that we're all dead?</p>
]]></description><pubDate>Mon, 13 Jan 2025 16:08:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=42684981</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=42684981</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42684981</guid></item><item><title><![CDATA[New comment by Eliezer in "Show HN: Llama 3.2 Interpretability with Sparse Autoencoders"]]></title><description><![CDATA[
<p>This seems like decent alignment-positive work at a glance, though I haven't checked full details yet.  I probably can't make it happen, but how much would someone need to pay you to make up for your time, expense, and risk?</p>
]]></description><pubDate>Fri, 22 Nov 2024 02:37:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=42210668</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=42210668</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42210668</guid></item><item><title><![CDATA[New comment by Eliezer in "Anthropic teams up with Palantir and AWS to sell AI to defense customers"]]></title><description><![CDATA[
<p>If you do build a superintelligence, you don't have an ASI, the ASI has you.</p>
]]></description><pubDate>Fri, 08 Nov 2024 04:25:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=42084119</link><dc:creator>Eliezer</dc:creator><comments>https://news.ycombinator.com/item?id=42084119</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42084119</guid></item></channel></rss>