<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: Difwif</title><link>https://news.ycombinator.com/user?id=Difwif</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 08 Apr 2026 02:21:25 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=Difwif" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by Difwif in "Tell HN: I'm 60 years old. Claude Code has re-ignited a passion"]]></title><description><![CDATA[
<p>I think it's pretty obvious what category you see yourself in.<p>I don't think you're a hacker. I think you enjoy writing code (good for you). Some of us just enjoy making the computer execute our ideas - like a digital magician. I've also gotten very good at the code-writing and debugging part. I've even enjoyed it for long periods of time, but there are times when I can't execute my ideas because they're bigger than what I can reasonably do by myself. Then my job becomes pitching, hiring, and managing humans. Now I write code to write code, and no project seems too big.<p>But I'm looking forward to collapsing the many layers of abstraction we've created to move bits and control devices. For me, it was always about what we could do with the computers.</p>
]]></description><pubDate>Sat, 07 Mar 2026 22:25:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47292050</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=47292050</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47292050</guid></item><item><title><![CDATA[New comment by Difwif in "Your brain on ChatGPT: Accumulation of cognitive debt when using an AI assistant"]]></title><description><![CDATA[
<p>This statement feels like a farmer making a case for tending the land by hand instead of with a tractor because the tractor produces too many crops. Modern farming requires you to have an ecosystem of supporting tools to handle the scale, and you need to learn new skills, like being a diesel mechanic.<p>How we work changes, and the extra complexity buys us productivity. The vast majority of software will be AI-generated, tools will exist to continuously test/refine it, and hand-written code will be for artists, hobbyists, and an ever-shrinking set of hard problems where a human still wins.</p>
]]></description><pubDate>Thu, 22 Jan 2026 12:29:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=46718384</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=46718384</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46718384</guid></item><item><title><![CDATA[New comment by Difwif in "Ford F-150 Lightning outsold the Cybertruck and was then canceled for poor sales"]]></title><description><![CDATA[
<p>He's alluded to thinking that Asians and Indians are "better" on some metrics so supremacy still seems a bit sensationalist. He certainly doesn't think all races are equal.</p>
]]></description><pubDate>Thu, 15 Jan 2026 06:01:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=46628623</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=46628623</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46628623</guid></item><item><title><![CDATA[New comment by Difwif in "AI is a business model stress test"]]></title><description><![CDATA[
<p>This is temporary. AI models have their own Moore's law. Yes, the mega corps will have the best models, but soon enough what is currently SOTA will be open source and will run on your own local machine if you want.<p>The mega corps are getting all of us and the investors to fund the R&D.</p>
]]></description><pubDate>Sun, 11 Jan 2026 17:48:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=46577797</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=46577797</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46577797</guid></item><item><title><![CDATA[New comment by Difwif in "WorldGen – Text to Immersive 3D Worlds"]]></title><description><![CDATA[
<p>This just seems like an engineered pipeline of existing GenAI to get a 3D procedurally generated world that doesn't even look SOTA. I'm really sorry to dunk on this for those who worked on it, but this doesn't look like progress to me. The current approach looks like a dead end.<p>An end-to-end _trained_ model that spits out a textured mesh with the same result would have been an innovation. The fact that they didn't do that suggests they're missing something fundamental for world-model training.<p>The best thing I can say is that maybe they can use this to bootstrap a dataset for a future model.</p>
]]></description><pubDate>Sun, 23 Nov 2025 00:28:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=46019604</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=46019604</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46019604</guid></item><item><title><![CDATA[New comment by Difwif in "USDA head says 'everyone' on SNAP will now have to reapply"]]></title><description><![CDATA[
<p>[removed]</p>
]]></description><pubDate>Fri, 14 Nov 2025 23:30:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=45933416</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45933416</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45933416</guid></item><item><title><![CDATA[New comment by Difwif in "First recording of a dying human brain shows waves similar to memory flashbacks (2022)"]]></title><description><![CDATA[
<p>A valid rationalization but never an excuse. At some point the buck has to stop being passed around. Standing up to all instances of violence is the only way to stop the endless cycles.</p>
]]></description><pubDate>Mon, 03 Nov 2025 20:11:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=45803813</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45803813</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45803813</guid></item><item><title><![CDATA[New comment by Difwif in "Uv is the best thing to happen to the Python ecosystem in a decade"]]></title><description><![CDATA[
<p>Pixi has also been such a breath of fresh air for me. I think it's as big a deal as uv (it uses uv under the hood for the pure-Python parts).<p>It's still very immature, but if you have a mixture of languages (C, C++, Python, Rust, etc.) I highly recommend checking it out.</p>
]]></description><pubDate>Wed, 29 Oct 2025 21:20:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=45753195</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45753195</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45753195</guid></item><item><title><![CDATA[New comment by Difwif in "Asahi Linux Still Working on Apple M3 Support, M1n1 Bootloader Going Rust"]]></title><description><![CDATA[
<p>Well, you should tell that to Dell, because I have coworkers with a range of their models who are constantly fighting with webcams, audio, Bluetooth, wifi, and Nvidia driver updates.</p>
]]></description><pubDate>Fri, 24 Oct 2025 19:32:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=45698276</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45698276</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45698276</guid></item><item><title><![CDATA[New comment by Difwif in "Asahi Linux Still Working on Apple M3 Support, M1n1 Bootloader Going Rust"]]></title><description><![CDATA[
<p>I used to be in this camp until I tried and bought an M1 MacBook as my daily driver. I thought I'd be running Linux on a ThinkPad/XPS until I die. I don't love macOS, but POSIX is mostly good enough for me, and the hardware is so good that I'm willing to look past the shortfalls.<p>Seriously, I would love to switch back to a full-time Linux distro, but I'm more interested in getting work done and having a stable & performant platform. Losing a day of productivity fixing drivers and patching kernels gets old. The M-series laptops have been the perfect balance for me so far.</p>
]]></description><pubDate>Fri, 24 Oct 2025 19:17:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=45698143</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45698143</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45698143</guid></item><item><title><![CDATA[New comment by Difwif in "Addendum to GPT-5 system card: GPT-5-Codex"]]></title><description><![CDATA[
<p>Is this available to use now in Codex? Should I see a new /model?</p>
]]></description><pubDate>Mon, 15 Sep 2025 19:48:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=45254131</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45254131</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45254131</guid></item><item><title><![CDATA[New comment by Difwif in "Microsoft is officially sending employees back to the office"]]></title><description><![CDATA[
<p>(2) seems like a media narrative rather than truth. I don't think that would be anywhere remotely high on a CEO's priority list unless they were a commercial real estate company.<p>It's far more likely a mixture of (1) and actual results - in-person/hybrid teams produce better outcomes (even if the reason why hasn't been deeply evaluated, or ultimately falls on management).</p>
]]></description><pubDate>Wed, 10 Sep 2025 14:31:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=45198277</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45198277</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45198277</guid></item><item><title><![CDATA[New comment by Difwif in "Why language models hallucinate"]]></title><description><![CDATA[
<p>It would be interesting to see two versions of a model: a primary model tuned for precision and correctness that works with, or orchestrates, a creative model tuned for generating new (and potentially incorrect) ideas. The primary model is responsible for evaluating and reasoning about the ideas/hallucinations. Feels like a left/right-brain architecture (even though that's an antiquated model of human brain hemispheres).</p>
]]></description><pubDate>Sat, 06 Sep 2025 22:35:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=45153536</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45153536</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45153536</guid></item><item><title><![CDATA[New comment by Difwif in "A staff engineer's journey with Claude Code"]]></title><description><![CDATA[
<p>I took a quick informal poll of my coworkers, and the majority of us have found workflows where CC is producing 70-99% of the code in PRs on average. We're getting more done faster. Most of these people have anywhere from 5-12 years of professional experience. There are some concerns that maybe more bugs are slipping through (but there's also more code being produced).<p>We agree most problems stem from neglecting the basics:
1. Getting lazy and auto-accepting edits. Always review changes and make sure you understand everything.
2. Skipping clearly written specification documents before starting complex work items.
3. Not breaking down tasks into manageable chunks of scope.
4. Messy, hard-to-digest code architecture. If it's hard for a human to understand (e.g., poor separation of concerns), it will be hard for the LLM too.<p>But yeah, I would never waste my time making that video. I'm having too much fun turning ideas into products to care about proving a point.</p>
]]></description><pubDate>Wed, 03 Sep 2025 12:23:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=45114880</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45114880</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45114880</guid></item><item><title><![CDATA[New comment by Difwif in "Some thoughts on LLMs and software development"]]></title><description><![CDATA[
<p>My parents could have said your first paragraph when I tried to teach them that they could Google their questions and find answers.<p>Technology moves forward, and productivity improves for those who move with it.</p>
]]></description><pubDate>Fri, 29 Aug 2025 05:55:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=45060693</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45060693</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45060693</guid></item><item><title><![CDATA[New comment by Difwif in "Claim: GPT-5-pro can prove new interesting mathematics"]]></title><description><![CDATA[
<p>Seems short-sighted to me. LLMs could have any data in their training set encoded as tokens. Either new specialized tokens are explicitly included (e.g., vision models) or, as is usually the case, everything is encoded as language (e.g., the research paper and the CSV with the data).<p>Improving next-token prediction performance on these datasets and generalizing requires a much richer latent space. I think it could theoretically lead to better results from cross-domain connections (e.g., being fluent in a specific area of advanced mathematics, quantum mechanics, and materials engineering is key to a particular breakthrough).</p>
]]></description><pubDate>Mon, 25 Aug 2025 15:34:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=45014990</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=45014990</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45014990</guid></item><item><title><![CDATA[New comment by Difwif in "NSF and Nvidia award Ai2 $152M to support building an open AI ecosystem"]]></title><description><![CDATA[
<p>You're right. It's time to ban the evil numbers from being matrix multiplied. Contact your local representative about CPU control.</p>
]]></description><pubDate>Thu, 14 Aug 2025 15:52:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=44901919</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=44901919</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44901919</guid></item><item><title><![CDATA[New comment by Difwif in "Historical Tech Tree"]]></title><description><![CDATA[
<p>Looking forward to the new Civilization mod that uses this.</p>
]]></description><pubDate>Thu, 07 Aug 2025 20:37:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=44830026</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=44830026</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44830026</guid></item><item><title><![CDATA[New comment by Difwif in "GPT-5"]]></title><description><![CDATA[
<p>My mental model is a bit different:<p>Context -> Attention Span<p>Model weights/Inference -> System 1 thinking (intuition)<p>Computer memory (files) -> Long term memory<p>Chain of thought/Reasoning -> System 2 thinking<p>Prompts/Tool Output -> Sensing<p>Tool Use -> Actuation<p>System 2 performance is heavily dependent on System 1 having the right intuitive models for effective problem solving via tool use. Tools are also what load long term memories into attention.</p>
]]></description><pubDate>Thu, 07 Aug 2025 19:42:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=44829399</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=44829399</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44829399</guid></item><item><title><![CDATA[New comment by Difwif in "GPT-5"]]></title><description><![CDATA[
<p>And LLM memories are stored in an electrical charge trapped in a floating gate transistor (or as magnetization of a ferromagnetic region on an alloy platter).<p>Or they write CLAUDE.md files. Whatever you want to call it.</p>
]]></description><pubDate>Thu, 07 Aug 2025 19:18:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=44829117</link><dc:creator>Difwif</dc:creator><comments>https://news.ycombinator.com/item?id=44829117</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44829117</guid></item></channel></rss>