<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: lowsong</title><link>https://news.ycombinator.com/user?id=lowsong</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 08 Apr 2026 11:15:18 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=lowsong" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>I'm going to take your comment at face value, and I'm also going to assume that you're US-based.<p>You <i>need</i> to take a step back and look at the economic reality of the majority of Americans today. Many live paycheck-to-paycheck, even those with "middle class" incomes. For many, a $200 one-off bill is debilitating, let alone a recurring subscription. If you don't know that, you have a dangerously narrow view of the economy.</p>
]]></description><pubDate>Mon, 06 Apr 2026 14:13:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=47661207</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47661207</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47661207</guid></item><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>The median per capita income in the United States is $37,683/year.[0] Depending on your state, after taxes, that's something like ~$2,600/month. You're asking them to spend almost 10% of their post-tax income just for the <i>opportunity</i> to create software. With rent, food, and other living expenses, many households at that income level simply cannot afford this.<p>And this is the <i>median</i> income. If it's a struggle for someone on this income then it's worse for <i>half of all Americans</i>, and American incomes are higher than most of the rest of the world's.<p>[0]: <a href="https://en.wikipedia.org/wiki/Per_capita_personal_income_in_the_United_States" rel="nofollow">https://en.wikipedia.org/wiki/Per_capita_personal_income_in_...</a></p>
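The share-of-income figure above can be sanity-checked with a quick sketch (the ~$2,600/month post-tax figure is an approximation that varies by state):

```python
# Rough check of the share-of-income claim above.
# Assumptions: ~$2,600/month post-tax income (varies by state),
# and a $200/month subscription.
monthly_after_tax = 2_600
subscription = 200

share = subscription / monthly_after_tax
print(f"{share:.1%}")  # prints "7.7%"
```

So roughly 8% of post-tax income at the median, before rent and food — consistent with the "almost 10%" claim.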
]]></description><pubDate>Mon, 06 Apr 2026 10:53:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47659248</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47659248</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47659248</guid></item><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>> I don't know what to say to you. More people are coding now with AI than ever coded before. If your argument was true, then that would just mean that there are more elites than ever. Obviously that's not what's happening.<p>I don't know how I can explain this any more clearly.<p>If you need AI to create software, and the cost of AI is $200/month, then only people who can afford $200/month can create software.<p>Costs will increase. The current cost is subsidised by investor funding. Sell at a loss to get people hooked on the product and then raise the price to make money, a "high-growth business model" as you say.<p>The cost to make a competitor to Anthropic or OpenAI is tens or hundreds of billions of dollars upfront. There will be few competitors and minimal market pressure to reduce prices, even <i>if</i> the unit costs of inference are low.<p>$200/month is already out of reach of the majority of the population. Any increase from here means only a small percentage of the richest people can afford it.<p>I don't know what definition of "elite" you're using, but "technology limited so that only a small percentage of the population can afford it" is... an elite group.<p>This is fun and all, but I think we've reached the end of the productive discussion to be had and I don't have much more to say. Charitably, we're living in completely different realities. I just hope when the bubble pops the fall isn't too hard for you.</p>
]]></description><pubDate>Mon, 06 Apr 2026 00:05:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=47655310</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47655310</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47655310</guid></item><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>> Democratizing is defined as "the process of making technology, information, or power accessible, available, or appealing to everyone, rather than just experts or elites."<p>Your definition only supports my point. The transfer of skill from something you learn to something you <i>pay</i> to do is the exact and complete opposite of your stated definition. It turns the activity from one that requires you to learn it into one that only <i>those who can afford to pay</i> can do.<p>It is quite literally making this technology, information, and power available to <i>only</i> the elite.<p>> Uhhh. Maybe you don't know any AI investors, but the payout is coming NOW.<p>What payout? Zero AI companies are profitable. If you're invested in one of these companies you could be a billionaire on paper, but until it's liquid it's meaningless. There are plenty of investors who stand to make a lot of money if these big companies exit, but there's no guarantee that will happen.<p>The only people making money at the moment are either taking cash salaries from AI labs or speculating on Nvidia stock. Neither of which has much to do with the tech itself and everything to do with the hype.</p>
]]></description><pubDate>Sun, 05 Apr 2026 22:29:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=47654606</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47654606</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47654606</guid></item><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>AI has none of these things.<p>1. As I said before, we've long since reached diminishing returns on models. We simply don't have enough compute or training data left to make them dramatically better.<p>2. This is <i>only</i> true if it actually pans out, which is still an open question.<p>3. Just... not using it? It has to justify its existence. If it's not of benefit vs. the cost, then why bother?<p>4. The public hates AI. The proliferation of "AI slop" makes people despise the technology wholesale.</p>
]]></description><pubDate>Sun, 05 Apr 2026 22:08:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=47654437</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47654437</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47654437</guid></item><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>On a SWE salary maybe. If the baseline cost of doing business is a $5k GPU you've excluded like a quarter of the US working population immediately.</p>
]]></description><pubDate>Sun, 05 Apr 2026 20:35:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=47653606</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47653606</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47653606</guid></item><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>> This is what everyone says when technology democratizes something that was previously reserved for a small number of experts.<p>What part of renting your ability to do your job is "democratizing"? The current state of AI is the literal opposite. The same goes for local models that require thousands of dollars of GPUs to run.<p>Over the past 20 years software engineering has become something that just about anyone can do with little more than a shitty laptop, an internet connection, and the time and effort. How is a world where that ability is rented out to only those who can pay "democratic"?<p>> When the printing press was invented, scribes complained that it would lead to a flood of poorly written, untrustworthy information. And you know what? It did. And nobody cares.<p>A bad book is just a bad book. If a novel is $10 at the airport and it's complete garbage then I'm out $10 and a couple of hours. As you say, who cares. With a bad vibe-coded app you've leaked your email inbox and bank account, and you're out way more than $10. The risk profile of AI is far higher.<p>The same is even more true for businesses. The cost of a cyberattack or an outage is measured in the millions of dollars. It's simple maths: the cost of the risk of compromise far outweighs the savings from cheaper upfront software.<p>> You cut out the part where I said it only popped economically, but the technology continued to improve.<p>Improving AI models requires <i>billions</i> of dollars a year in hardware, infrastructure, and energy. Do you think that investors will continue to pour that level of investment into improving AI models for a payout that might only come ten to fifteen years down the road? Once the economic bubble pops, the models we have are the end of the road.</p>
]]></description><pubDate>Sun, 05 Apr 2026 19:20:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=47652889</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47652889</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47652889</guid></item><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>> What you're missing is that fewer and fewer projects are going to need a ton of technical depth.
> I have friends who'd never written a line of code in their lives who now use multiple simple vibe-coded apps at work daily.<p>Again, it's the opposite. A landscape of vibe-coded micro apps is a landscape of buggy, vulnerable points of failure. When you buy a product, software or hardware, you do more than buy the functionality; you buy the assurance that it will work. AI does not change this. Vibe code an app to automate your lightbulbs all you like, but nobody is going to pay millions of dollars a year for vibe-coded slop apps, and apps like that are what keep the tech industry afloat.<p>> Humanity is not going to stop pouring more and more money into AI.<p>There's no more money to pour into it. Even if there were, we're out of GPU capacity, we're running low on the power and infrastructure to run these giant data centres, and it takes decades to bring new fabs or power plants online. It is physically impossible to continue this level of growth in AI investment. Every company that's invested in AI has done so on the promise of continued improvement, but the moment that stops being true everything shifts.<p>> The AI bubble isn't going to pop. This is like saying the internet bubble is going to pop in 1999.<p>The internet bubble <i>did</i> pop. What happened after was an assessment of how much the tech was actually worth, and the future we have now, 26 years later, bears little resemblance to the hype of 1999. What makes you think this will be different?<p>Once the hype fades, the long-term unsuitability for large projects becomes obvious, and token costs increase by ten or one hundred times, are businesses really going to pay thousands of dollars a month for agent subscriptions to vibe code little apps here and there?</p>
]]></description><pubDate>Sun, 05 Apr 2026 17:48:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=47651957</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47651957</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47651957</guid></item><item><title><![CDATA[New comment by lowsong in "Eight years of wanting, three months of building with AI"]]></title><description><![CDATA[
<p>> However, code quality is becoming less and less relevant in the age of AI coding, and to ignore that is to have our heads stuck in the sand. Just because we don't like it doesn't mean it's not true.<p>It's the opposite: code quality is becoming <i>more and more</i> relevant. Before now, you could only neglect quality for so long before the time to implement any change grew so long that the project completely stalled.<p>That's still true. The only thing AI has changed is that it lets you charge further and further into technical debt before you see the problems. But instead of a gradual ramp-up, the problems now arrive as a cliff: the moment the current crop of models can't operate on the codebase effectively any more, you're completely lost.<p>> We are in the very earliest months of AI actually being somewhat competent at this. It's unlikely that it will plateau and stop improving.<p>We hit the plateau on model improvement a few years back. We've only continued to see any improvement at all because of the exponential increase in money poured into it.<p>> It's only trending in one direction. And it isn't going to stop.<p>Sure it can. When the bubble pops there will be a question: is using an agent cost-effective? Even if you think it is at $200/month/user, we'll see how that holds up once costs skyrocket after OpenAI and Anthropic run out of money to burn and their investors want some returns.<p>Think about it this way: if your job survived the popularity of offshoring to engineers paid 10% of your salary, why would AI tooling kill it?</p>
]]></description><pubDate>Sun, 05 Apr 2026 17:34:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47651792</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47651792</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47651792</guid></item><item><title><![CDATA[New comment by lowsong in "The threat is comfortable drift toward not understanding what you're doing"]]></title><description><![CDATA[
<p>> agents aren’t going away<p>Why not? Once the true cost of token generation is passed on to the end user and costs go up by 10 or 100 times, and once the honeymoon delusion of "oh wow, I can just prompt the AI to write code" fades, there's a big question as to whether what's left is worth it. If it isn't, agents will most certainly go away, and all of this will be consigned to the "failed hype" bin along with cryptocurrency and the "metaverse".</p>
]]></description><pubDate>Sun, 05 Apr 2026 17:26:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=47651690</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47651690</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47651690</guid></item><item><title><![CDATA[New comment by lowsong in "Coding agents could make free software matter again"]]></title><description><![CDATA[
<p>I worry people are lacking context about how SaaS products are purchased if they think LLMs and "vibe coding" are going to replace them. It's almost never about the feature set. Often it's capex vs opex budgeting (i.e., it's easier to get approval for a monthly cost than an upfront capital cost), but the biggest factor is liability.<p>Companies buy these contracts for support and to have a throat to choke if things go wrong. It doesn't matter how much you pay your AI vendor: if you use their product to "vibe code" a SaaS replacement and it fails in some way and you lose a bunch of money/time/customers/reputation/whatever, then that's on you.<p>This is as much a political consideration as a financial one. If you're a C-suite and you let your staff build something (LLM-generated or not) and it gets compromised, then you're the one who signed off on the risky project and it's your ass on the line. If you buy a big established SaaS, do your compliance due diligence (SOC2, ISO27001, etc.), and they get compromised, then you were just following best practice. Coding agents don't change this.<p>The truth is that the people choosing what to buy or build are usually not the people using the end result. If someone down the food chain has to spend a bunch of time on "brittle hacks" to make their workflow function, the decision makers aren't going to care at all. All they want is the minimum that meets the requirement and won't come back to bite them later.<p>SaaS isn't about software, it's about shifting blame.</p>
]]></description><pubDate>Mon, 30 Mar 2026 01:18:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=47569351</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47569351</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47569351</guid></item><item><title><![CDATA[New comment by lowsong in "2026 tech layoffs reach 45,000 in March"]]></title><description><![CDATA[
<p>There's little to no evidence that companies are actually doing layoffs to focus on "AI-enabled" work.<p>What we're seeing are layoffs driven by interest rates and concerns about the economic outlook. Companies are using "AI" as a fig-leaf justification, and people are apparently falling for it.</p>
]]></description><pubDate>Sat, 14 Mar 2026 22:24:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=47381933</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47381933</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47381933</guid></item><item><title><![CDATA[New comment by lowsong in "Things I've Done with AI"]]></title><description><![CDATA[
<p>It's not even as simple as "views as replaceable". It's pure economics. It's someone looking at a spreadsheet going "We spend a lot of money on SWE salaries; our financial results look better if we fire some of them. Is there a cheaper option?"<p>From that perspective, yes, some management view SWEs as replaceable. My argument is that all attempts to actually act on that have failed to date, and the most financially successful companies are staffed by upper management who know that removing much of the SWE staff would doom the company in the medium term.<p>It's a move of either desperation ("we'll go bankrupt if we don't do this"), or short-sightedness ("if I cut 40% of headcount, our P&L will look better, which will produce better quarterly results, which is likely to raise the share price, which gives me a bigger performance bonus. Who cares what happens after that."), or a lack of experience managing software companies and seeing this play out before.<p>AI, even if it lives up to the hype, is no different.</p>
]]></description><pubDate>Wed, 11 Mar 2026 20:59:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=47341723</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47341723</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47341723</guid></item><item><title><![CDATA[New comment by lowsong in "Things I've Done with AI"]]></title><description><![CDATA[
<p>> you're admitting that businesses do see SWE as cogs in a wheel and seasonally try to replace them...<p>Not quite. I agree that companies will <i>try</i> to do this, but every company that has tried to treat engineering staff as replaceable units of person-hours has failed.<p>> Metrics, performance reviews, sprint velocity, delivery timelines, all orbit around observable artifacts because those are what management systems can actually track objectively and equitably. It's a handy abstraction just like looking only at the ins/outs of a logic gate as opposed to looking at the implementation and wiring.<p>Yes, and these metrics are, usually, worthless.<p>It's not that companies and managers will not <i>try</i> to replace engineers with AI. I'm sure they will. I'm sure many will be laid off because "AI does it cheaper now".<p>My point is that companies that have gone down this route in the past have failed, and AI is no different. Companies that lean strongly into AI as a workforce replacement will fail too.</p>
]]></description><pubDate>Tue, 10 Mar 2026 17:23:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47326216</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47326216</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47326216</guid></item><item><title><![CDATA[New comment by lowsong in "Things I've Done with AI"]]></title><description><![CDATA[
<p>> From the company’s POV employees function as cogs in a larger system whose purpose is to generate value considering that businesses are structured to optimize outcomes i.e. Profit. If tech appears that can produce the same output more cheaply or efficiently, companies will most definitely as we've seen so far explore replacing people with it.<p>Businesses <i>wish</i> this were the case, and many will even say it or start to believe it. But it doesn't bear out in practice.<p>Think about it this way: engineers are expensive, so a company is going to want as few of them as possible doing as much work as possible. Long before LLMs came along there were many rounds of "replace the expensive engineers" fads.<p>Visual programming was going to destroy the industry: any idiot could drag and drop a few boxes and put together software. Turns out that didn't work, and now visual programming is all but dead. Then we had consultants and software consultancies. Why keep engineers on staff and deal with benefits and HR functions when you can hire consultants for just long enough to get the job done and then end their contracts? Then we had offshoring. Why hire expensive developers in markets like California when you can hire far cheaper engineers abroad in countries with lower wages and laxer employment law? (It's not a quality thing either; many of these engineers are unquestionably excellent.)<p>Or think about what happens when software companies get acquired. It's almost unheard of for the acquiring company to lay off all of the engineering staff from the acquired company right away; if anything it's the opposite, with vesting incentives to convince engineers to stay.<p>If all that mattered was the code and the systems, and people were cogs that produced code that businesses wanted to optimise, then none of these actions make sense. You'd see companies offshore and use whichever consultancy does "good enough" work as cheaply as possible. You'd see engineers from acquisitions laid off immediately, replaced with cheaper staff as fast as possible.<p>There are businesses that operate like this; it happens all the time. But all of the most successful and profitable tech companies in the world <i>don't</i>. Why?</p>
]]></description><pubDate>Mon, 09 Mar 2026 22:55:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47316881</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47316881</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47316881</guid></item><item><title><![CDATA[New comment by lowsong in "Code Review for Claude Code"]]></title><description><![CDATA[
<p>> Reviews are billed on token usage and generally average $15–25, scaling with PR size and complexity.<p>You've got to be completely insane to use AI coding tools at this point.<p>This is the subsidised cost to get users hooked; it could trivially end up ten times this amount. Plus, you've got the ultimate perverse incentive: the company selling you the model time to create the PRs is also selling you the review of those same PRs.</p>
]]></description><pubDate>Mon, 09 Mar 2026 22:15:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=47316398</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47316398</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47316398</guid></item><item><title><![CDATA[New comment by lowsong in "Things I've Done with AI"]]></title><description><![CDATA[
<p>> At work, all that matters is that value is delivered to the business. Code needs to be maintainable so that new requirements can be met. Code follows design patterns, when appropriate, because they are known solutions to common problems, and thus are easy to talk about with others. Code has type systems and static analysis so that programmers make fewer mistakes.<p>This is a narrow view of software engineering. Thinking that your role is "code that works" is hardly better than thinking you're a "(human) resource that produces code". Your job is to provide value. You do that by building knowledge: not only of the system you're developing, but of the problem space you're exploring, the customers you're serving, and the innovations you can deliver that your competitors can't.<p>It's like saying that a soccer player's purpose is "to kick a ball", and therefore a machine that launches balls faster and further than any human will replace all soccer players, and soon all professional teams will be made up of robots.</p>
]]></description><pubDate>Mon, 09 Mar 2026 22:00:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47316212</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47316212</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47316212</guid></item><item><title><![CDATA[New comment by lowsong in "Addicted to Claude Code–Help"]]></title><description><![CDATA[
<p>Listen, if you truly want help, you've made the first step by realising what's wrong, but you won't get help here.<p>This community is obsessively pro-AI. Asking here is the equivalent of asking the guy who has sat at the slot machine next to you for the past three hours whether he thinks you have a gambling problem. Of course he's going to say "no" or try to justify it; to do otherwise would be to admit to himself that he has a problem.<p>I don't have advice for you, other than to look at what recovering gambling, drug, or alcohol addicts do. The path out of any addiction is long and painful, but it can be done. Good luck.</p>
]]></description><pubDate>Sat, 07 Mar 2026 18:37:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=47290261</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47290261</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47290261</guid></item><item><title><![CDATA[New comment by lowsong in "Verification debt: the hidden cost of AI-generated code"]]></title><description><![CDATA[
<p>> AI is actually better getting those built as long as you clean it up afterwards<p>I've never seen a quick PoC get cleaned up. Not once.<p>I'm sure it happens sometimes, but it's very rare in the industry. The reality is that a PoC usually becomes "good enough" and gets moved into production with only the most perfunctory of cleanup.</p>
]]></description><pubDate>Sat, 07 Mar 2026 18:13:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=47290026</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47290026</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47290026</guid></item><item><title><![CDATA[New comment by lowsong in "Ask HN: Do You Enjoy Your Career in Tech Nowadays?"]]></title><description><![CDATA[
<p>I've been in this game a long time and I've seen a lot, but this AI hype cycle is exhausting. Like with no technology before it, I've watched extremely smart and capable engineers fall into AI like it's a cult. Colleagues and friends I've known for years have dropped headfirst into this shit.<p>At first I was interested in the tech, so I dove deep into it and understood as much as I could. I understand how an LLM works and what it can and can't do. I realised pretty quickly that its uses are limited. I figured it would blow over in a few years, the real use cases would be weeded out, and we'd all move on to the next thing like normal.<p>What I didn't account for is how addictive this technology is. The moment something "feels" like a person, it's ascribed magical qualities, and people fall for it. Anyone can; it doesn't matter how smart you are.<p>For the past six months I've felt nothing beyond a deep melancholic sadness. Not because my industry is changing; it isn't, not really. These models will not replace people, and anyone who thinks they can is either trying to sell you something or is delusional. The readjustment and the end of the hype cycle will come eventually. But I fear many people will never be able to let it go. I'm saddened that we're going to lose a generation of brilliant people to fiddling with token predictors, and many of them will never recover from it.<p>AI will set the industry back twenty years. Not because we will be replaced, but because so many people will be dragged into psychosis and addiction, or will waste decades chasing a future built on a lie.<p>And there's nothing any of us can do about it now.</p>
]]></description><pubDate>Fri, 06 Mar 2026 02:01:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=47269892</link><dc:creator>lowsong</dc:creator><comments>https://news.ycombinator.com/item?id=47269892</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47269892</guid></item></channel></rss>