<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: gradys</title><link>https://news.ycombinator.com/user?id=gradys</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sun, 05 Apr 2026 23:55:59 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=gradys" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[Will there still be professional software developers?]]></title><description><![CDATA[
<p>Article URL: <a href="https://grady.io/will-there-still-be-pro-devs/">https://grady.io/will-there-still-be-pro-devs/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47649904">https://news.ycombinator.com/item?id=47649904</a></p>
<p>Points: 3</p>
<p># Comments: 1</p>
]]></description><pubDate>Sun, 05 Apr 2026 14:36:09 +0000</pubDate><link>https://grady.io/will-there-still-be-pro-devs/</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=47649904</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47649904</guid></item><item><title><![CDATA[Slop is content without grounding]]></title><description><![CDATA[
<p>Article URL: <a href="https://grady.io/slop-is-content-without-grounding/">https://grady.io/slop-is-content-without-grounding/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47639173">https://news.ycombinator.com/item?id=47639173</a></p>
<p>Points: 4</p>
<p># Comments: 0</p>
]]></description><pubDate>Sat, 04 Apr 2026 14:04:14 +0000</pubDate><link>https://grady.io/slop-is-content-without-grounding/</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=47639173</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47639173</guid></item><item><title><![CDATA[Will there still be professional software developers?]]></title><description><![CDATA[
<p>Article URL: <a href="https://grady.io/will-there-still-be-pro-devs/">https://grady.io/will-there-still-be-pro-devs/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47297464">https://news.ycombinator.com/item?id=47297464</a></p>
<p>Points: 1</p>
<p># Comments: 0</p>
]]></description><pubDate>Sun, 08 Mar 2026 14:10:47 +0000</pubDate><link>https://grady.io/will-there-still-be-pro-devs/</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=47297464</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47297464</guid></item><item><title><![CDATA[New comment by gradys in "OpenAI has deleted the word 'safely' from its mission"]]></title><description><![CDATA[
<p>This was long before. Google had conversational LLMs before ChatGPT (though they weren’t as good in my recollection), and they declined to productize. There was a sense at the time that you couldn’t productize anything with truly open-ended content generation because you couldn’t guarantee it wouldn’t say something problematic.<p>See Facebook’s Galactica project for an example of what Google was afraid would happen: <a href="https://www.technologyreview.com/2022/11/18/1063487/meta-large-language-model-ai-only-survived-three-days-gpt-3-science/amp/" rel="nofollow">https://www.technologyreview.com/2022/11/18/1063487/meta-lar...</a></p>
]]></description><pubDate>Sat, 14 Feb 2026 06:17:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=47012093</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=47012093</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47012093</guid></item><item><title><![CDATA[New comment by gradys in "Bun v1.3.9"]]></title><description><![CDATA[
<p>Both would be understood and are roughly interchangeable.<p>"Sequential" feels more appropriate to me for the task runner scenario where we wait for one task to finish before running the next.<p>"Series" suggests a kind of concurrency to me because of the electrical circuit context, where the outputs of one are flowing into the next, but both are running concurrently. Processes that are Unix-piped into each other would be another thing that feels more like a "series" than a "sequence".</p>
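<p>A minimal sketch of the distinction in Python (task-a and task-b are hypothetical commands):</p>
<pre><code>import subprocess

# "Sequential": task-b starts only after task-a has exited.
subprocess.run(["task-a"], check=True)
subprocess.run(["task-b"], check=True)

# "Series" (a pipeline): both run concurrently,
# with task-a's stdout flowing into task-b's stdin.
a = subprocess.Popen(["task-a"], stdout=subprocess.PIPE)
b = subprocess.Popen(["task-b"], stdin=a.stdout)
a.stdout.close()  # so task-b sees EOF when task-a exits
b.wait()
</code></pre>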
]]></description><pubDate>Sun, 08 Feb 2026 18:22:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=46936991</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=46936991</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46936991</guid></item><item><title><![CDATA[New comment by gradys in "Structured outputs on the Claude Developer Platform"]]></title><description><![CDATA[
<p>You might be using JSON mode, which doesn’t guarantee a schema will be followed, or structured outputs not in strict mode. It is possible to get the property that the response is either a valid instance of the schema or an error (e.g., for a refusal).</p>
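<p>For illustration, here is the OpenAI flavor of strict structured outputs (the Claude API’s equivalent differs in shape; the model name and schema are placeholders):</p>
<pre><code>from openai import OpenAI

client = OpenAI()
resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Extract: Jane, age 34."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "person",
            "strict": True,  # strict mode: schema-valid output or an error
            "schema": {
                "type": "object",
                "properties": {
                    "name": {"type": "string"},
                    "age": {"type": "integer"},
                },
                "required": ["name", "age"],
                "additionalProperties": False,  # required by strict mode
            },
        },
    },
)
print(resp.choices[0].message.content)  # valid JSON per the schema, or the call failed
</code></pre>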
]]></description><pubDate>Fri, 14 Nov 2025 22:18:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=45932827</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=45932827</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45932827</guid></item><item><title><![CDATA[New comment by gradys in "Nvidia’s $589B DeepSeek rout"]]></title><description><![CDATA[
<p>Is it luck, or is scaling arithmetic genuinely a useful capability to offer the world?</p>
]]></description><pubDate>Tue, 28 Jan 2025 03:24:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=42848540</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=42848540</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42848540</guid></item><item><title><![CDATA[New comment by gradys in "DSPy – Programming–not prompting–LMs"]]></title><description><![CDATA[
<p>I assume you know how to program in Python? I would start with just the client libraries of the model providers you want to use. LLMs are conceptually simple when treated as black boxes. String in, string out. You don't necessarily need a framework.</p>
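<p>A minimal sketch of the no-framework approach (shown with the OpenAI SDK; any provider’s client works the same way):</p>
<pre><code>from openai import OpenAI

client = OpenAI()
resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model
    messages=[{"role": "user", "content": "Summarize RAG in one sentence."}],
)
print(resp.choices[0].message.content)  # string in, string out
</code></pre>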
]]></description><pubDate>Fri, 06 Dec 2024 22:23:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=42345212</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=42345212</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42345212</guid></item><item><title><![CDATA[New comment by gradys in "Google threatened with being broken up by US"]]></title><description><![CDATA[
<p>I’m not as familiar with the Baby Bells, so this is a surprising comparison to me. Bell Labs was famously so productive while it had the monopoly money hose, and not as much came from it after Bell was broken up.<p>What are the most noteworthy accomplishments of the Baby Bells?</p>
]]></description><pubDate>Wed, 09 Oct 2024 14:56:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=41788578</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=41788578</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41788578</guid></item><item><title><![CDATA[New comment by gradys in "Better RAG Results with Reciprocal Rank Fusion and Hybrid Search"]]></title><description><![CDATA[
<p>Sounds more like Generation Augmented Retrieval in that case.</p>
]]></description><pubDate>Thu, 30 May 2024 21:59:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=40529191</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=40529191</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40529191</guid></item><item><title><![CDATA[New comment by gradys in "Context caching guide"]]></title><description><![CDATA[
<p>I don’t know of any public details on how they implement Context Caching, but that is presumably exactly what they are doing. Just caching the text would yield minimal savings.</p>
]]></description><pubDate>Thu, 16 May 2024 20:06:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=40382546</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=40382546</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40382546</guid></item><item><title><![CDATA[New comment by gradys in "Context caching guide"]]></title><description><![CDATA[
<p>The size of the cached internal state of the network processing the book is much larger than the size of the book. The resource that is preserved with caching is the compute required to recreate that state.</p>
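<p>A back-of-envelope sketch of the size gap (all model dimensions here are assumed for illustration; the provider hasn’t published them):</p>
<pre><code># KV cache vs. raw text for a 200k-token "book", fp16, assumed dims
n_layers, n_kv_heads, head_dim = 32, 8, 128
seq_len, bytes_per_value = 200_000, 2
kv_bytes = 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_value
text_bytes = seq_len * 4  # ~4 bytes of raw text per token, roughly
print(f"KV cache: {kv_bytes / 1e9:.1f} GB vs text: {text_bytes / 1e6:.1f} MB")
# -> KV cache: 26.2 GB vs text: 0.8 MB
</code></pre>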
]]></description><pubDate>Thu, 16 May 2024 19:46:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=40382366</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=40382366</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40382366</guid></item><item><title><![CDATA[New comment by gradys in "GPT-4o"]]></title><description><![CDATA[
<p>I’m guessing there was an instrumental reason for this, for instance to check that the model was listening before launching into what they wanted to demo.</p>
]]></description><pubDate>Mon, 13 May 2024 23:31:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=40349821</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=40349821</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40349821</guid></item><item><title><![CDATA[The Schulze Method]]></title><description><![CDATA[
<p>Article URL: <a href="https://colab.research.google.com/drive/1R8xxUVdhjFQPZJdyJBC6Q5nFr01nSf0t?usp=sharing">https://colab.research.google.com/drive/1R8xxUVdhjFQPZJdyJBC6Q5nFr01nSf0t?usp=sharing</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=38992049">https://news.ycombinator.com/item?id=38992049</a></p>
<p>Points: 1</p>
<p># Comments: 0</p>
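<p>For readers who don’t want to open the notebook, a minimal sketch of the Schulze widest-path computation (not the notebook’s code; the toy counts are hypothetical):</p>
<pre><code># d[i][j] = number of voters preferring candidate i to candidate j
def schulze_winners(d):
    n = len(d)
    # Start from pairwise victories only.
    p = [[d[i][j] if d[i][j] > d[j][i] else 0 for j in range(n)]
         for i in range(n)]
    # Floyd-Warshall-style pass: strength of the strongest path.
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            for k in range(n):
                if i != k and j != k:
                    p[j][k] = max(p[j][k], min(p[j][i], p[i][k]))
    # Winners beat or tie every other candidate on path strength.
    return [i for i in range(n)
            if all(p[i][j] >= p[j][i] for j in range(n) if j != i)]

# Toy election with 3 candidates (hypothetical pairwise counts).
print(schulze_winners([[0, 20, 26], [25, 0, 16], [19, 29, 0]]))  # -> [0]
</code></pre>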
]]></description><pubDate>Sun, 14 Jan 2024 16:57:03 +0000</pubDate><link>https://colab.research.google.com/drive/1R8xxUVdhjFQPZJdyJBC6Q5nFr01nSf0t?usp=sharing</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=38992049</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38992049</guid></item><item><title><![CDATA[New comment by gradys in "Reasons to grow and keep big muscles"]]></title><description><![CDATA[
<p>There are many different rules of thumb on this floating around. 1g per lb of body weight is indeed something a number of well-informed people recommend, but those people generally acknowledge that it’s something like an upper bound on the amount that is useful.</p>
]]></description><pubDate>Wed, 03 Jan 2024 14:48:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=38854662</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=38854662</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38854662</guid></item><item><title><![CDATA[New comment by gradys in "Reasons to grow and keep big muscles"]]></title><description><![CDATA[
<p>Cardio isn’t super important for what? It certainly has longevity benefits over and above those from resistance training.</p>
]]></description><pubDate>Wed, 03 Jan 2024 14:43:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=38854616</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=38854616</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38854616</guid></item><item><title><![CDATA[New comment by gradys in "Simplifying Transformer Blocks"]]></title><description><![CDATA[
<p>Attention itself was the key idea of that paper and, as you sort of acknowledge, was definitely not just throwing things at the wall. It was the culmination of a long line of work gradually progressing toward fully dynamic routing via attention, and it was motivated, if not by deep theory, at least by deep intuition from linguistics. The other details of transformers are perhaps somewhat arbitrary, but they made sense to everyone at the time. There was no claim that those other details were optimal - just that they were one way of surrounding the attention mechanism with computing machinery that worked.</p>
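<p>The mechanism itself, stripped of the surrounding machinery, as a minimal NumPy sketch:</p>
<pre><code>import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # query-key affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # each row is a routing distribution
    return weights @ V  # dynamically weighted mix of values

# Toy shapes: 4 tokens, dimension 8.
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
print(attention(Q, K, V).shape)  # (4, 8)
</code></pre>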
]]></description><pubDate>Tue, 28 Nov 2023 16:47:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=38447767</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=38447767</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38447767</guid></item><item><title><![CDATA[New comment by gradys in "Amtrak Explorer"]]></title><description><![CDATA[
<p>The key feature of the NEC is that the cities are large and close together. This is true almost nowhere else in the country.</p>
]]></description><pubDate>Thu, 28 Sep 2023 22:25:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=37696702</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=37696702</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37696702</guid></item><item><title><![CDATA[New comment by gradys in "Pijul: Version-Control Post-Git [video]"]]></title><description><![CDATA[
<p>What does understanding and reasoning mean?</p>
]]></description><pubDate>Sat, 12 Aug 2023 12:37:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=37099563</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=37099563</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37099563</guid></item><item><title><![CDATA[New comment by gradys in "Show HN: I built a transit travel time map"]]></title><description><![CDATA[
<p>I'd love to be able to get a print of the map with a customized zoom level and origin point.</p>
]]></description><pubDate>Mon, 24 Jul 2023 16:59:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=36851038</link><dc:creator>gradys</dc:creator><comments>https://news.ycombinator.com/item?id=36851038</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36851038</guid></item></channel></rss>