<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: noduerme</title><link>https://news.ycombinator.com/user?id=noduerme</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 15 Apr 2026 07:49:45 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=noduerme" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by noduerme in "Nucleus Nouns"]]></title><description><![CDATA[
<p>Domain-specific knowledge has been the moat for a long time, hasn't it? Outsourcing isn't new. Maybe you can do in a weekend for $500 what would have taken a month and $20k before, but code itself isn't a barrier to competition - even really good code.<p>On the other hand, much of the code I write is in an industry where training and operations manuals are closely guarded corporate secrets that make up the recipe or soul of a company. The job of the SWE is to deeply understand the processes and procedures that employees follow, and to write code that helps facilitate those and then gets out of the way. A lot of it comes from walking around and seeing how people are actually using the software, what works, and what's a pain point. I've always maintained that the value is in the operations manuals, and the code is just a logical extension of that. But that's where SaaS is usually insufficient: regardless of how versatile and broad it is, it doesn't usually encapsulate enough domain knowledge, let alone the proprietary stuff.</p>
]]></description><pubDate>Tue, 14 Apr 2026 23:38:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=47772882</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47772882</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47772882</guid></item><item><title><![CDATA[New comment by noduerme in "Nucleus Nouns"]]></title><description><![CDATA[
<p>This is quite insightful. Often in my projects, some central nouns come up over and over again in discussions about architecture, until they become embedded in the actual software and the interface.<p>Since I started using AI tools to assist, I've found a lot of both utility and frustration revolves around my use of these nouns in prompts (in the context of, e.g. "during the 'quickpay' confirmation phase..."). When the bot settles into understanding these nouns, it seems to get a better handle on the architecture as a whole. When it suddenly forgets them and has to go figure out what they mean by scanning the code base, I know it's about to do something staggeringly redundant and stupid.</p>
]]></description><pubDate>Tue, 14 Apr 2026 23:20:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=47772724</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47772724</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47772724</guid></item><item><title><![CDATA[New comment by noduerme in "Sam Altman's response to Molotov cocktail incident"]]></title><description><![CDATA[
<p>We needn't be cowed into saying otherwise, but throwing a bomb at him is something else entirely. If you're convinced that wicked people are running the world, the response isn't to be wicked.</p>
]]></description><pubDate>Sat, 11 Apr 2026 04:27:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47727396</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47727396</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47727396</guid></item><item><title><![CDATA[New comment by noduerme in "Will I ever own a zettaflop?"]]></title><description><![CDATA[
<p>Oof. You may have just named the next frontier in our becoming unmoored from base reality. That's gonna be a gold rush. And I hate that this just gave me a brutally demented idea for a dating platform. I have a conscience, damnit.</p>
]]></description><pubDate>Sat, 11 Apr 2026 03:25:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=47727034</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47727034</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47727034</guid></item><item><title><![CDATA[New comment by noduerme in "Sam Altman's response to Molotov cocktail incident"]]></title><description><![CDATA[
<p>But longshot bettors have it easy. Society quickly forgets all the predictions that don't come true. It remembers the one that did, and treats the prognosticator as a prophet. In social terms, predicting doom is an asymmetrical strategy, because you only have to be right once.<p>Which is also to say it's a cheap bet that anyone with no reputation can afford. Hence, not believing that doomsayers mean what they say is a sort of societal hedge against people flooding the zone with doomsday scenarios about everything.</p>
]]></description><pubDate>Sat, 11 Apr 2026 03:16:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47726975</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47726975</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47726975</guid></item><item><title><![CDATA[New comment by noduerme in "Sam Altman's response to Molotov cocktail incident"]]></title><description><![CDATA[
<p>If your only measure is whether something is effective, then state and corporate violence will always be a lot more effective than individual acts of violence. You could even say that individual violence <i>helps</i> the state to commit violence, by providing justification and by removing the moral imperative to avoid violence.</p>
]]></description><pubDate>Sat, 11 Apr 2026 03:07:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=47726917</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47726917</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47726917</guid></item><item><title><![CDATA[New comment by noduerme in "Sam Altman's response to Molotov cocktail incident"]]></title><description><![CDATA[
<p>>> Yeah, the words and narratives that Sam Altman promoted caused so much fear and uncertainty and anger that someone thought their only option was to attempt a horrific crime.<p>The problem with this inversion of your first statement (that violence is not the answer), which everyone justifying violence in this thread seems to forget, is that there is <i>always someone</i> who feels this way about <i>anything</i>.<p>The words and narratives of Martin Luther King, Jr., for example, caused so much fear and uncertainty and anger in some people that they thought their only option was to commit a horrific crime.<p>Someone responded to you below saying if you feel that peaceful revolution is impossible, then violent revolution is necessary. That person feels that they are on the side of justice. What they forget is that <i>so does everyone else</i>.<p>The reason revolutions rarely stop where a reasonable person would want them to stop, and instead continue into eating their own and counter-revolutions, is that once you say that it's understandable to take out a proponent of (X narrative), there's no end to the number of people who will justify violence in the same way against any other narrative as well.<p>We can all well think that Altman is opening Pandora's Box, but that doesn't justify opening it ourselves, or giving a pass to wannabe revolutionaries who would.<p>In <i>retrospect</i>, too, we can say that the assassination of Hitler, had it succeeded, would have been a good thing. We can say that the elimination of the ayatollah by the US was a good thing. What we cannot say is that an individual's perception gives them a right to commit murder.</p>
]]></description><pubDate>Sat, 11 Apr 2026 02:53:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=47726851</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47726851</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47726851</guid></item><item><title><![CDATA[New comment by noduerme in "Will I ever own a zettaflop?"]]></title><description><![CDATA[
<p>For one thing, most news websites would have to load at least 10,000x as much useless javascript to achieve the same performance.</p>
]]></description><pubDate>Fri, 10 Apr 2026 07:53:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=47714926</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47714926</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47714926</guid></item><item><title><![CDATA[New comment by noduerme in "Ask HN: Is it weird that Anthropic raised my API limit from $500/mo to $200k?"]]></title><description><![CDATA[
<p>Kinda wondering if Claude is reporting back on the value of the project it's working on, going <i>"this guy's making money, let's put him on some list"</i></p>
]]></description><pubDate>Wed, 01 Apr 2026 07:42:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=47598027</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47598027</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47598027</guid></item><item><title><![CDATA[New comment by noduerme in "Index providers shouldn't bend the rules for Elon Musk"]]></title><description><![CDATA[
<p>Would recommend that anyone who sees this read<p><a href="https://news.ycombinator.com/item?id=47392550">https://news.ycombinator.com/item?id=47392550</a></p>
]]></description><pubDate>Wed, 01 Apr 2026 05:51:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=47597280</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47597280</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47597280</guid></item><item><title><![CDATA[New comment by noduerme in "Improving my focus by giving up my big monitor"]]></title><description><![CDATA[
<p>I've been a laptop purist most of my life, and prefer to work outside my house / office. Only recently I got a Big Monitor™ for a mini pc. It's really messed with my head. Now when I look at my 15" laptop everything looks incredibly small. Not just that, but the scroll direction is opposite on the pc, so if I'm working side by side I find myself accidentally scrolling each one backwards, or actually typing into the wrong keyboard. Somehow I survived this long with just laptop screens and I don't think it's a mistake that my focus was preserved through that.</p>
]]></description><pubDate>Wed, 01 Apr 2026 05:49:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=47597274</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47597274</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47597274</guid></item><item><title><![CDATA[Ask HN: Is it weird that Anthropic raised my API limit from $500/mo to $200k?]]></title><description><![CDATA[
<p>So a few weeks ago I spun up an openclaw instance and gave it a Claude API key. I was coaching it through turning an old client/server codebase into a headless one (something I wrote 15 years ago for projecting sports gambling outcomes). When I'd spent about $250 in tokens last week, I got a little worried that maybe I'd have to raise my $500/mo API limit, but I decided to wait on that. A few days ago, when I hit $300 in spend, lo and behold, my limit was raised to $1000/mo. Great.<p>Today I checked my spend again and found that my limit had been raised to $200,000 per month. I'm not complaining, but that <i>seems a little strange to me</i>. My total all-time spend is about $400 right now. I don't think they've checked my credit score. Not to sound paranoid, but ... okay, I'm going to sound paranoid for a second. Is Claude reporting back that my sports prediction software actually works? Who the ** gets a $199,000 increase in their monthly limit when they pass $400 in spend?<p>To put it another way, they just gave me a limit that surpasses my annual net income. Or is that just the normal jump from $1k to $200k that's baked into their billing system, based on some usage-frequency metric?</p>
<hr>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47597238">https://news.ycombinator.com/item?id=47597238</a></p>
<p>Points: 2</p>
<p># Comments: 4</p>
]]></description><pubDate>Wed, 01 Apr 2026 05:44:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=47597238</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47597238</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47597238</guid></item><item><title><![CDATA[New comment by noduerme in "Ask HN: How do you feel when your coding assistant loses context?"]]></title><description><![CDATA[
<p>This is the first project where I've really let AI do more than work on a single file at a time. The trouble is, there's no way for it to be useful without a fairly large context. When it runs out, it starts doing things that are actively destructive, yet very subtle and easy to miss at the same time. Mainly, it forgets the architecture. A couple days ago, it had a good handle on a database table that I was writing side by side with an API that ran queries and did calculations on the data. I read the code it wrote for a particular API call, and didn't notice that it had started flipping the sign of one of the columns in a query, because it had misinterpreted the column name. A few minutes before that, it had written another query correctly, but from that point on it kept flipping the sign on that column. I only noticed after having it write several other queries, when it oddly mentioned in its "thinking" that X was Y-Z. Reading the thinking has been the main clue as to when it loses track, but if I didn't know exactly why X was Y+Z, the code built on that API would have given subtly inconsistent results that would have been very hard to trace.</p>
]]></description><pubDate>Thu, 26 Mar 2026 00:38:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=47525312</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47525312</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47525312</guid></item><item><title><![CDATA[New comment by noduerme in "Ask HN: How do you feel when your coding assistant loses context?"]]></title><description><![CDATA[
<p>The fact that it's supposed to act like a tool, and you come to treat it as one, makes it <i>more</i> frustrating. What if you bought a very expensive knife that went dull every 10 minutes? Of course, you would mutter some curses at the knife; that doesn't mean you believe it to be sentient.</p>
]]></description><pubDate>Wed, 25 Mar 2026 23:32:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=47524733</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47524733</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47524733</guid></item><item><title><![CDATA[New comment by noduerme in "Ask HN: How do you feel when your coding assistant loses context?"]]></title><description><![CDATA[
<p>That's interesting. I mean, I've got an openclaw setup with Claude that is merging and storing chats from whatsapp and the web client once a day, has a ton of context accessible... but there's something about being right in the middle of solving a hard technical problem where you're deep in the weeds about which columns should represent which data, and suddenly it's like, what were we talking about? Oh, I should try reading the database structure again from scratch. I don't think that's a problem that any clever arrangement of memory or personality files can actually solve.</p>
]]></description><pubDate>Wed, 25 Mar 2026 12:25:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=47516411</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47516411</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47516411</guid></item><item><title><![CDATA[New comment by noduerme in "Jury finds Meta liable in case over child sexual exploitation on its platforms"]]></title><description><![CDATA[
<p>To be fair, they're just an evil corporation making lemonade out of lemons. I'm sure they'd be happier pushing porn and nazism to hundreds of millions of underage users, but if certain governments want them to write all that bunk code to verify everyone's ID, they might as well make money off the data.</p>
]]></description><pubDate>Wed, 25 Mar 2026 12:21:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=47516372</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47516372</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47516372</guid></item><item><title><![CDATA[Ask HN: How do you feel when your coding assistant loses context?]]></title><description><![CDATA[
<p>Background: My dad, my mom's dad, and my uncle all suffered from dementia; having a deep, multi-threaded conversation which you were invested in, where you suddenly need to remind the other person of what you were talking about, or who they are, has emotional consequences that range from deep frustration to helpless anger.<p>Can you <i>feel</i> when your agent has just compressed or lost context? Can you tell by how it bullshits you that it knows where it is, while it's trying to grasp what was going on? What's your emotional response to that?</p>
<hr>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47516319">https://news.ycombinator.com/item?id=47516319</a></p>
<p>Points: 4</p>
<p># Comments: 16</p>
]]></description><pubDate>Wed, 25 Mar 2026 12:14:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=47516319</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47516319</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47516319</guid></item><item><title><![CDATA[New comment by noduerme in "Amazon Acquires Fauna Robotics"]]></title><description><![CDATA[
<p>Congrats to this team, who went public here two months ago to little fanfare. Very cool looking platform.<p>Previous:
<a href="https://news.ycombinator.com/item?id=46781990">https://news.ycombinator.com/item?id=46781990</a></p>
]]></description><pubDate>Tue, 24 Mar 2026 20:05:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=47508313</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47508313</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47508313</guid></item><item><title><![CDATA[Amazon Acquires Fauna Robotics]]></title><description><![CDATA[
<p>Article URL: <a href="https://www.bloomberg.com/news/articles/2026-03-24/amazon-acquires-fauna-robotics-entering-consumer-humanoid-market">https://www.bloomberg.com/news/articles/2026-03-24/amazon-acquires-fauna-robotics-entering-consumer-humanoid-market</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47508312">https://news.ycombinator.com/item?id=47508312</a></p>
<p>Points: 1</p>
<p># Comments: 1</p>
]]></description><pubDate>Tue, 24 Mar 2026 20:05:08 +0000</pubDate><link>https://www.bloomberg.com/news/articles/2026-03-24/amazon-acquires-fauna-robotics-entering-consumer-humanoid-market</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47508312</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47508312</guid></item><item><title><![CDATA[New comment by noduerme in "“Collaboration” is bullshit"]]></title><description><![CDATA[
<p>It's frustrating to pull more weight and take ownership when other people don't. But what's legitimately soul-killing to an individual and deadly to an organization is the collective impulse to avoid giving those people credit when it's due. Most of the 20% out there pulling more than their weight just want some acknowledgement. Not giving them that is one way to quickly hollow out your company.</p>
]]></description><pubDate>Mon, 23 Mar 2026 03:56:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=47485359</link><dc:creator>noduerme</dc:creator><comments>https://news.ycombinator.com/item?id=47485359</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47485359</guid></item></channel></rss>