<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: jplusequalt</title><link>https://news.ycombinator.com/user?id=jplusequalt</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Fri, 17 Apr 2026 05:48:00 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=jplusequalt" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by jplusequalt in "AI-assisted cognition endangers human development?"]]></title><description><![CDATA[
<p>>our evolved relationship with food is not inherently good, and it's better for us to change our behaviors than to abandon our advancements and return to the food-scarce world we're adapted to.<p>So are you arguing we should change our relationship with human intelligence? What does that even mean?</p>
]]></description><pubDate>Wed, 15 Apr 2026 22:39:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=47786268</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47786268</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47786268</guid></item><item><title><![CDATA[New comment by jplusequalt in "Claude Code Routines"]]></title><description><![CDATA[
<p>>You can lessen your dependence on the specific details of how /loop, code routines, etc. work by asking the LLM to do simpler tasks, and instead, having a proper workflow engine be in charge of the workflow aspects.<p>Or, you know, by writing the code yourself?</p>
]]></description><pubDate>Wed, 15 Apr 2026 13:51:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=47778962</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47778962</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47778962</guid></item><item><title><![CDATA[New comment by jplusequalt in "AI Will Be Met with Violence, and Nothing Good Will Come of It"]]></title><description><![CDATA[
<p>Plenty of people do.</p>
]]></description><pubDate>Mon, 13 Apr 2026 14:28:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=47752488</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47752488</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47752488</guid></item><item><title><![CDATA[New comment by jplusequalt in "AI Will Be Met with Violence, and Nothing Good Will Come of It"]]></title><description><![CDATA[
<p>>Depends on how much is provided - the simple fix is to provide enough.<p>To "provide enough" would cost trillions of extra dollars per year.</p>
]]></description><pubDate>Mon, 13 Apr 2026 13:17:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=47751534</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47751534</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47751534</guid></item><item><title><![CDATA[New comment by jplusequalt in "AI Will Be Met with Violence, and Nothing Good Will Come of It"]]></title><description><![CDATA[
<p>A UBI is basically impossible to implement on a large scale without there being significant downsides. In what world does increasing the budget by a trillion dollars or more work out well?</p>
]]></description><pubDate>Sun, 12 Apr 2026 17:06:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=47742008</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47742008</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47742008</guid></item><item><title><![CDATA[New comment by jplusequalt in "AI Will Be Met with Violence, and Nothing Good Will Come of It"]]></title><description><![CDATA[
<p>True, but again, the other points are more damning.<p>We're talking about an increased federal budget in the hundreds of billions to trillions to support such a UBI. That would cause a massive increase in taxation on the people who can still find jobs.<p>To make matters worse, the government in 10-15 years will likely be spending ~25% of its budget on interest payments alone. Hiking the federal budget up even more sounds like a hard sell.</p>
]]></description><pubDate>Sun, 12 Apr 2026 16:25:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47741538</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47741538</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47741538</guid></item><item><title><![CDATA[New comment by jplusequalt in "AI Will Be Met with Violence, and Nothing Good Will Come of It"]]></title><description><![CDATA[
<p>UBI:<p>- 1. Will require a large increase in taxation.<p>- 2. Will likely cause some form of inflation.<p>- 3. Will not provide enough money for a majority of people to survive on.<p>- 4. Has no significant political support in the US.</p>
]]></description><pubDate>Sun, 12 Apr 2026 16:05:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=47741331</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47741331</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47741331</guid></item><item><title><![CDATA[New comment by jplusequalt in "AI Will Be Met with Violence, and Nothing Good Will Come of It"]]></title><description><![CDATA[
<p>Raising taxes is the only possible way a UBI would be feasible, and even then it wouldn't be a large enough amount for most people to live off of.<p>Also, a UBI is likely to cause inflation.</p>
]]></description><pubDate>Sun, 12 Apr 2026 15:55:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=47741199</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47741199</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47741199</guid></item><item><title><![CDATA[New comment by jplusequalt in "Sam Altman may control our future – can he be trusted?"]]></title><description><![CDATA[
<p>>People are voting with their wallets<p>A handful of people's wallets are much deeper than those of vast swaths of the population. None of this AI shit would be happening without their funding.</p>
]]></description><pubDate>Tue, 07 Apr 2026 13:53:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=47675419</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47675419</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47675419</guid></item><item><title><![CDATA[New comment by jplusequalt in "Sam Altman may control our future – can he be trusted?"]]></title><description><![CDATA[
<p>>They would see us adapting to the new conditions in a relatively short while.<p>Say ~5 million jobs in the next 10 years are automated away, which industries do those people move to?<p>With college being exorbitantly expensive, that locks many people out of re-skilling in other fields.<p>As people race to other industries, that forces down wages, because now there is a larger pool to select from.<p>How do we ensure people are taken care of when UBI is all but fiscally impossible in the US?</p>
]]></description><pubDate>Tue, 07 Apr 2026 13:46:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=47675326</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47675326</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47675326</guid></item><item><title><![CDATA[New comment by jplusequalt in "Sam Altman may control our future – can he be trusted?"]]></title><description><![CDATA[
<p>It's easy to advocate for something when you know it's essentially impossible to implement.</p>
]]></description><pubDate>Tue, 07 Apr 2026 13:40:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=47675228</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47675228</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47675228</guid></item><item><title><![CDATA[New comment by jplusequalt in "AI may be making us think and write more alike"]]></title><description><![CDATA[
<p>>But generally this isn't what happens, because often a lot of what we're seeing is just this new thing occupying the zeitgeist. Eventually, its novelty passes, the underlying norms of human behaviour reassert themselves, and society regresses to the mean. Not completely unchanged, but not as radically transformed as we feared either. The new phenomenon goes from being the latest fashion to overexposed and lame, then either fades away entirely, retreats to a niche, or settles in as just one strand of mainstream civilisational diversity<p>The internet didn't follow this trajectory. Neither did smartphones.<p>Surprise, surprise, it's the same people trying to entrench AI in our society.</p>
]]></description><pubDate>Tue, 07 Apr 2026 13:27:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47675070</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47675070</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47675070</guid></item><item><title><![CDATA[New comment by jplusequalt in "Slop is not necessarily the future"]]></title><description><![CDATA[
<p>Neither is the original assertion. There are thousands of examples of exceptionally well-crafted codebases that are used by many. I would posit the Linux kernel as an example, which is arguably the most used piece of software in the world.</p>
]]></description><pubDate>Tue, 31 Mar 2026 19:53:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=47592569</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47592569</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47592569</guid></item><item><title><![CDATA[New comment by jplusequalt in "The ladder is missing rungs – Engineering Progression When AI Ate the Middle"]]></title><description><![CDATA[
<p>>It's interesting to watch the same class of people who told coal miners "they should learn to code" back in the early 2010s now getting the same comeuppance.<p>There are millions of software engineers in the US alone. Don't put all of them into a single bucket.</p>
]]></description><pubDate>Tue, 31 Mar 2026 15:18:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=47588634</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47588634</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47588634</guid></item><item><title><![CDATA[New comment by jplusequalt in "The ladder is missing rungs – Engineering Progression When AI Ate the Middle"]]></title><description><![CDATA[
<p><a href="https://en.wikipedia.org/wiki/Hedonic_treadmill" rel="nofollow">https://en.wikipedia.org/wiki/Hedonic_treadmill</a><p>Economists typically point to this phenomenon when people talk about the relatively stable working hours over the last 50-60 years. I've seen some of them argue it's an issue of supply/demand, and that if people truly wanted to work less we'd see more demand for such careers. I think this ignores that retirement/medical benefits are almost exclusively tied to jobs expecting you to work 40 hours a week.</p>
]]></description><pubDate>Tue, 31 Mar 2026 15:15:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=47588571</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47588571</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47588571</guid></item><item><title><![CDATA[New comment by jplusequalt in "I am leaving the AI party after one drink"]]></title><description><![CDATA[
<p>>What exactly did AI take from me? Spending hours of research on Google and Youtube to glean little incomplete bits and pieces? Calling a yard service?<p>An opportunity for a deeper understanding of gardening? If you spend hours researching gardening and come away with an incomplete understanding of what you were attempting to do, I'm not sure that's immediately the fault of the research available. It could be that you just didn't do a good job searching for the necessary information.<p>In this way, AI can be a boon. It helps you figure out what you actually want to know in the moment. But I think it would be a step too far to say that a smattering of specific questions can replace the sturdy foundation provided by a typical education--e.g. through apprenticeship, books, etc.<p>>It's also clearly obvious when AI gives bad or incorrect advice<p>Is it? Isn't this a __core__ problem that researchers around the world are trying to solve? Also, __how__ could you make such a statement unless you already possessed the knowledge ahead of time to make such a judgment? I think it's hard to know whether something is bad advice by looking at just cause and effect. It could be that you simply lack the understanding to put the advice into practice.</p>
]]></description><pubDate>Fri, 27 Mar 2026 18:12:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=47546239</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47546239</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47546239</guid></item><item><title><![CDATA[New comment by jplusequalt in "I am leaving the AI party after one drink"]]></title><description><![CDATA[
<p>>It only a loss if you think the skill and ability you are losing is intrinsically valuable<p>What about the skill of learning itself? I would suggest that's one of the most important skills humans have evolved. The more integrated AI becomes in our societies, the more it will automate away potential opportunities for learning. I can foresee a world tightly integrated with AI where people are not only physically sedentary, but mentally as well.<p>As we progress further into the future, we need more educated people than ever to tackle the exponentially increasing complexities of our society. But AI presents an obstacle that many will never cross due to how convenient it is to skip the messy work of understanding.<p>Also, this problem is not unique to AI. It existed before the GPTs and Claudes of the world. But it's a problem of scale, and every company on Earth right now is trying to scale AI up as fast as possible.</p>
]]></description><pubDate>Fri, 27 Mar 2026 17:48:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=47545934</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47545934</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47545934</guid></item><item><title><![CDATA[New comment by jplusequalt in "I am leaving the AI party after one drink"]]></title><description><![CDATA[
<p>>Socrates' exact perspective on writing<p>Again, writing replacing memorization is not a good 1:1 comparison to AI replacing technical understanding. Someone still needs to understand what is written and act upon that knowledge. That requires skill and experience in the domain they're working within.<p>However, a person using an AI does not need to understand the underlying problem to get results. A person can ask Claude Code to write them a web app dashboard without having ever learned JS/CSS/HTML. It does not require them to have skills within a domain.<p>Also, we need to be honest with ourselves. Human brains did not evolve for the instant gratification of modern technology. We've already seen what technology has done to our attention spans. I am concerned over what further reliance on technology, particularly AI, will do to our brains.</p>
]]></description><pubDate>Fri, 27 Mar 2026 17:35:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=47545777</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47545777</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47545777</guid></item><item><title><![CDATA[New comment by jplusequalt in "You are not your job"]]></title><description><![CDATA[
<p>>If you're going to make the claim that people hold intrinsic value, people are going to challenge you for proof.<p>But this is assuming we share the same set of axioms?<p>It sounds like you don't accept humans having intrinsic value as a core axiom. However, I do, and it makes zero sense to me to try and "prove" such a notion.</p>
]]></description><pubDate>Mon, 23 Mar 2026 15:39:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=47491014</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47491014</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47491014</guid></item><item><title><![CDATA[New comment by jplusequalt in "You are not your job"]]></title><description><![CDATA[
<p>>Ultimately, out there, people value you by just 5-6 things and almost never they are your beliefs or personal values.<p>This is a rigid world view, and it is demonstrably wrong. I won't even argue with you, because others have already pointed out elsewhere why this cannot be true.<p>What I will say is that I hope you're doing alright.</p>
]]></description><pubDate>Mon, 23 Mar 2026 14:49:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=47490316</link><dc:creator>jplusequalt</dc:creator><comments>https://news.ycombinator.com/item?id=47490316</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47490316</guid></item></channel></rss>