<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: atrocious</title><link>https://news.ycombinator.com/user?id=atrocious</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sun, 05 Apr 2026 13:29:04 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=atrocious" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by atrocious in "The risk of AI isn't making us lazy, but making "lazy" look productive"]]></title><description><![CDATA[
<p>An LLM can't invent meaning in a text where there is none. It's equivalent to CSI's classic "zoom, enhance" on resolution-limited photographs. You need to consider that you're learning a load of rubbish from LLMs.</p>
]]></description><pubDate>Sun, 29 Mar 2026 09:01:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=47561504</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=47561504</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47561504</guid></item><item><title><![CDATA[New comment by atrocious in "The risk of AI isn't making us lazy, but making "lazy" look productive"]]></title><description><![CDATA[
<p>Pretend learning is absolutely the key point, for me. There is danger in shifting our reasoning from knowing "stuff" to knowing a symbolic summary of "stuff" (helpfully generated by an LLM at varying levels of accuracy).<p>Previously, we saw a shift with search engines where we no longer needed to learn data because we could use a search engine as a mental signpost to the data, freeing up capacity for other thought.<p>LLMs are shifting knowledge creation to this mental pointer model. We don't need to know real "stuff" because we know how to look it up later (never?).<p>Each of these summaries is a secondary source, delivered through an agent biased by whatever is in its current context window. Like a game of telephone, the summaries are inherently lossy; each one may be 95% correct, but crucially we don't know which 5% is incorrect.<p>When our basis for decision making is a collection of 100s or 1000s of LLM-generated "Schrodinger's facts", we risk cumulative, cascading errors. We will be wrong in unpredictable, chaotic ways.<p>We are voluntarily capping ourselves at this childish level of thought, because it feels like we are exercising our critical judgement the same as ever. However, the integrity of the inputs has been compromised. Bad inputs always lead to bad outputs.</p>
]]></description><pubDate>Sun, 29 Mar 2026 08:44:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47561424</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=47561424</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47561424</guid></item><item><title><![CDATA[New comment by atrocious in "How we rebuilt Next.js with AI in one week"]]></title><description><![CDATA[
<p>I agree, but it's historically been as simple as "you already use AWS for everything else". For example, Elasticsearch.</p>
]]></description><pubDate>Fri, 27 Feb 2026 13:41:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47180390</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=47180390</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47180390</guid></item><item><title><![CDATA[New comment by atrocious in "How we rebuilt Next.js with AI in one week"]]></title><description><![CDATA[
<p>Established corporations will be doing the yoinking, with pre-existing credibility. There's a huge incentive to offer these copied services for cents on the dollar, as a way to kill the competition.</p>
]]></description><pubDate>Wed, 25 Feb 2026 09:43:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=47149443</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=47149443</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47149443</guid></item><item><title><![CDATA[New comment by atrocious in "German court sends VW execs to prison over Dieselgate scandal"]]></title><description><![CDATA[
<p>The law locks up the man or woman
Who steals the goose from off the common
But leaves the greater villain loose
Who steals the common from off the goose.</p>
]]></description><pubDate>Mon, 26 May 2025 18:52:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=44100406</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=44100406</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44100406</guid></item><item><title><![CDATA[New comment by atrocious in "Opinions I have formed about the “geospatial industry”"]]></title><description><![CDATA[
<p>My academic and work history is with GIS, but I've spent the last 4 years in fintech. What does the cutting edge of GIS look like in tech nowadays?</p>
]]></description><pubDate>Tue, 23 Jun 2020 08:17:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=23611194</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=23611194</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=23611194</guid></item><item><title><![CDATA[New comment by atrocious in "Ask HN: How do you guys prevent back problems?"]]></title><description><![CDATA[
<p>Pilates class at my local gym. Friendly, low-stress, and only once or twice a week for 45 minutes. Sorted out my core strength, messed-up shoulders, and back.</p>
]]></description><pubDate>Sat, 04 May 2019 14:53:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=19826790</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=19826790</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=19826790</guid></item><item><title><![CDATA[New comment by atrocious in "Please do not attempt to simplify this code"]]></title><description><![CDATA[
<p>Which metrics?</p>
]]></description><pubDate>Thu, 27 Dec 2018 22:35:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=18773254</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=18773254</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=18773254</guid></item><item><title><![CDATA[New comment by atrocious in "Ask HN: Older textbooks/papers you consider classics still worth studying today?"]]></title><description><![CDATA[
<p>PAIP - Norvig 1992. It appears to be available online <a href="https://github.com/norvig/paip-lisp" rel="nofollow">https://github.com/norvig/paip-lisp</a></p>
]]></description><pubDate>Fri, 16 Nov 2018 22:22:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=18472432</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=18472432</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=18472432</guid></item><item><title><![CDATA[New comment by atrocious in "Ask HN: How do I come up with a name for programming language?"]]></title><description><![CDATA[
<p>Try smashing bits of related words together. Heavy math: Heath.</p>
]]></description><pubDate>Mon, 29 Oct 2018 22:31:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=18332207</link><dc:creator>atrocious</dc:creator><comments>https://news.ycombinator.com/item?id=18332207</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=18332207</guid></item></channel></rss>