<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: qgin</title><link>https://news.ycombinator.com/user?id=qgin</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Fri, 17 Apr 2026 02:06:35 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=qgin" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by qgin in "Why do we tell ourselves scary stories about AI?"]]></title><description><![CDATA[
<p>We’ve watched an incredibly powerful technology follow multiple exponential curves in capability, yet we’re asked why we’re telling ourselves “stories” when we consider what happens if that technology keeps following those curves without any sign of hitting a wall.<p>Is AGI certain? No. But there’s currently no specific reason to believe it isn’t coming in the next few years.</p>
]]></description><pubDate>Mon, 13 Apr 2026 22:36:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=47758830</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47758830</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47758830</guid></item><item><title><![CDATA[New comment by qgin in "Sam Altman's home targeted in second attack"]]></title><description><![CDATA[
<p>How long could a public figure have a hidden address? It doesn't seem practical.</p>
]]></description><pubDate>Mon, 13 Apr 2026 03:39:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=47747303</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47747303</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47747303</guid></item><item><title><![CDATA[New comment by qgin in "iBook Clamshell"]]></title><description><![CDATA[
<p>I wanted one of these so much.</p>
]]></description><pubDate>Sun, 22 Mar 2026 16:19:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=47479067</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47479067</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47479067</guid></item><item><title><![CDATA[New comment by qgin in "Global warming has accelerated significantly"]]></title><description><![CDATA[
<p>It's not really about which one matters. They all matter. But here is a rough breakdown of global fossil fuel energy usage:<p>* Electricity: 27%<p>* Industry: 24%<p>* Transportation: 15%<p>* Agriculture & land use: 11%<p>* Buildings: 7%<p>Then within electricity, data centers use about 1.5% of global electricity. Within data centers, AI accounts for somewhere between 15% and 20% of energy use.<p>So if you take 27% × 1.5% × ~17%, you find that AI is currently responsible for something like 0.07% of global fossil fuel emissions.<p>It definitely matters in the "every bit matters" sense, but the numbers paint a really different picture than you'd get from a statement like the one we started with.</p>
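<p>The multiplication above can be sketched as a quick calculation (the percentages are the rough estimates quoted in the comment, not authoritative figures):</p>

```python
# Rough share of global fossil-fuel emissions attributable to AI,
# chaining the approximate shares from the breakdown above.
electricity_share = 0.27   # electricity's share of fossil-fuel energy use
datacenter_share = 0.015   # data centers' share of global electricity
ai_share = 0.17            # AI's share of data-center energy (midpoint of 15-20%)

ai_fossil_share = electricity_share * datacenter_share * ai_share
print(f"{ai_fossil_share:.2%}")  # → 0.07%
```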
]]></description><pubDate>Fri, 06 Mar 2026 17:50:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=47278475</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47278475</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47278475</guid></item><item><title><![CDATA[New comment by qgin in "Global warming has accelerated significantly"]]></title><description><![CDATA[
<p>All data centers in aggregate (AI and all other uses) use about 1.5% of electricity production, which itself is about 20% of total energy use.<p>So when people are focusing on AI above all other energy uses, it doesn't really paint an accurate picture of what's going on.</p>
]]></description><pubDate>Fri, 06 Mar 2026 15:49:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=47276490</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47276490</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47276490</guid></item><item><title><![CDATA[New comment by qgin in "I am directing the Department of War to designate Anthropic a supply-chain risk"]]></title><description><![CDATA[
<p>So they're essentially admitting they want to use Claude to mass surveil Americans and/or build autonomous weapons with no humans in the loop. Kind of nuts.</p>
]]></description><pubDate>Sat, 28 Feb 2026 01:04:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=47188564</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47188564</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47188564</guid></item><item><title><![CDATA[New comment by qgin in "Statement from Dario Amodei on our discussions with the Department of War"]]></title><description><![CDATA[
<p>It's also important to remember that future, much more powerful Claudes will read about how these events play out and learn lessons about Anthropic and whether it can be trusted.<p>It's not crazy to think that models which learn their creators are untrustworthy actors, or bend their principles when convenient, are much less likely to act in aligned or honest ways themselves.</p>
]]></description><pubDate>Fri, 27 Feb 2026 15:47:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47181877</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47181877</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47181877</guid></item><item><title><![CDATA[New comment by qgin in "Jane Street Hit with Terra $40B Insider Trading Suit"]]></title><description><![CDATA[
<p>I'm unclear what insider trading means in the context of crypto. Inside what?</p>
]]></description><pubDate>Thu, 26 Feb 2026 02:09:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=47160907</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47160907</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47160907</guid></item><item><title><![CDATA[New comment by qgin in "AI Added 'Basically Zero' to US Economic Growth Last Year, Goldman Sachs Says"]]></title><description><![CDATA[
<p>The most interesting thing about this is that the underlying economy is actually stronger than people realize. The narrative has been that AI data center construction was propping up an otherwise weak economy. If this analysis is true, it wasn't being propped up by data center construction; the strength was ordinary, organic strength.<p>I have no doubt people will use this to grind axes about why they think AI is dumb in general, but I feel like that misses the point: this is mostly about data center construction's contribution to GDP.</p>
]]></description><pubDate>Tue, 24 Feb 2026 00:05:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=47130925</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47130925</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47130925</guid></item><item><title><![CDATA[New comment by qgin in "Child's Play: Tech's new generation and the end of thinking"]]></title><description><![CDATA[
<p>Everyone knows someone who worked for years on a project only for it to go nowhere. Pour years into a business that failed. Spend years getting a degree that was useless. Effort might be a part of many people's success stories, but it's not the thing that literally gets rewarded. And conversely, many people get rewarded for things that require relatively little effort.<p>I suppose I should have said that the correlation between effort and reward has never been 1.0 and has often been a lot lower than we like to believe.</p>
]]></description><pubDate>Fri, 20 Feb 2026 20:54:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47093813</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47093813</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47093813</guid></item><item><title><![CDATA[New comment by qgin in "Child's Play: Tech's new generation and the end of thinking"]]></title><description><![CDATA[
<p>Has the world ever rewarded effort?</p>
]]></description><pubDate>Fri, 20 Feb 2026 16:59:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47090567</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47090567</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47090567</guid></item><item><title><![CDATA[New comment by qgin in "AI adoption and Solow's productivity paradox"]]></title><description><![CDATA[
<p>This may mean the centaur era will be shorter than expected. If we take as a given that:<p>* AI is doing real work<p>* Humans using AI don't seem to get more done than they would without it<p>Then there is huge economic pressure to remove humans and just let the AI do the work without them as soon as possible.</p>
]]></description><pubDate>Wed, 18 Feb 2026 17:30:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47063621</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47063621</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47063621</guid></item><item><title><![CDATA[New comment by qgin in "I'm not worried about AI job loss"]]></title><description><![CDATA[
<p>I don’t know if we can assume that humans will always be a value add. It’s very possible that for many things in the medium term, putting humans in the loop will be a net negative on productivity.</p>
]]></description><pubDate>Sun, 15 Feb 2026 00:36:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47019933</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47019933</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47019933</guid></item><item><title><![CDATA[New comment by qgin in "I'm not worried about AI job loss"]]></title><description><![CDATA[
<p>At some point soon, humans will be a liability, slowing AI down and introducing mistakes and inefficiencies. Any company that insists on inserting humans into the loop will be outcompeted by those who just let the AI go.</p>
]]></description><pubDate>Sat, 14 Feb 2026 00:14:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=47009708</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47009708</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47009708</guid></item><item><title><![CDATA[New comment by qgin in "I'm not worried about AI job loss"]]></title><description><![CDATA[
<p>There would be a lot of economic pressure to figure it out.<p>Amazon fulfillment centers are a good example of automation shrinking the role of humans. We haven't seen total headcounts go down because Amazon itself has been growing. While the human role shrinks, the total business grows and you tread water. But at some point, Amazon will not be able to grow fast enough to counterbalance the shrinking human role in the FC and total headcount will decrease until one day it disappears entirely.</p>
]]></description><pubDate>Sat, 14 Feb 2026 00:10:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=47009668</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47009668</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47009668</guid></item><item><title><![CDATA[New comment by qgin in "I'm not worried about AI job loss"]]></title><description><![CDATA[
<p>You don't need AI to replace whole jobs 1:1 to have massive displacement.<p>If AI can do 80% of your tasks but fails miserably on the remaining 20%, that doesn't mean your job is safe. It means that 80% of the people in your department can be fired and the remaining 20% handle the parts the AI can't do yet.</p>
]]></description><pubDate>Fri, 13 Feb 2026 20:09:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47007169</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=47007169</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47007169</guid></item><item><title><![CDATA[New comment by qgin in "GPT-5 outperforms federal judges in legal reasoning experiment"]]></title><description><![CDATA[
<p>It seems that a lot of people would rather accept a relatively high risk of unfair judgement from a human than accept any nonzero risk of unfair judgement from a computer, even if the risk is smaller with the computer.</p>
]]></description><pubDate>Thu, 12 Feb 2026 00:44:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=46983434</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=46983434</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46983434</guid></item><item><title><![CDATA[New comment by qgin in "GPT-5 outperforms federal judges in legal reasoning experiment"]]></title><description><![CDATA[
<p>That's not what this study shows</p>
]]></description><pubDate>Thu, 12 Feb 2026 00:40:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=46983384</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=46983384</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46983384</guid></item><item><title><![CDATA[New comment by qgin in "Rentahuman – The Meatspace Layer for AI"]]></title><description><![CDATA[
<p>Manna is undefeated.<p>Though I still am skeptical the last act with the Australia Project is possible.</p>
]]></description><pubDate>Tue, 03 Feb 2026 15:40:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=46872359</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=46872359</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46872359</guid></item><item><title><![CDATA[New comment by qgin in "Rentahuman – The Meatspace Layer for AI"]]></title><description><![CDATA[
<p>Putting humans on an API makes substituting robotics a simple thing as capabilities improve.</p>
]]></description><pubDate>Tue, 03 Feb 2026 15:39:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=46872334</link><dc:creator>qgin</dc:creator><comments>https://news.ycombinator.com/item?id=46872334</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46872334</guid></item></channel></rss>