<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: GPerson</title><link>https://news.ycombinator.com/user?id=GPerson</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 07 May 2026 15:33:18 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=GPerson" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by GPerson in "Show HN: Gemini Pro 3 imagines the HN front page 10 years from now"]]></title><description><![CDATA[
<p>And it’s just a copy of Notepad from Windows XP.</p>
]]></description><pubDate>Wed, 10 Dec 2025 02:51:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=46213465</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46213465</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46213465</guid></item><item><title><![CDATA[New comment by GPerson in "Perl's decline was cultural"]]></title><description><![CDATA[
<p>Did you come up with that? If so, bravo!</p>
]]></description><pubDate>Sat, 06 Dec 2025 20:27:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=46176349</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46176349</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46176349</guid></item><item><title><![CDATA[New comment by GPerson in "AI-Assisted Coding Killed My Joy of Programming"]]></title><description><![CDATA[
<p>You can try. What happens is it cheats you at every turn and finally admits it wasn’t testing anything when you ask why it’s still broken.<p>Yes, this actually happens.</p>
]]></description><pubDate>Tue, 02 Dec 2025 03:55:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=46117351</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46117351</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46117351</guid></item><item><title><![CDATA[New comment by GPerson in "AI-Assisted Coding Killed My Joy of Programming"]]></title><description><![CDATA[
<p>Rigorously designing and understanding the system does not sound like what you were referring to in the original post.<p>All of my heroes always said the difficult part is not in the writing of the code, but in the reading/understanding of it.</p>
]]></description><pubDate>Tue, 02 Dec 2025 03:53:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=46117331</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46117331</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46117331</guid></item><item><title><![CDATA[New comment by GPerson in "AI-Assisted Coding Killed My Joy of Programming"]]></title><description><![CDATA[
<p>The step you describe is the main use case I’ve found where vibe coding actually makes sense. Going from there to actually making a good version of the thing then becomes more hands-on.</p>
]]></description><pubDate>Tue, 02 Dec 2025 00:12:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=46115482</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46115482</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46115482</guid></item><item><title><![CDATA[New comment by GPerson in "AI-Assisted Coding Killed My Joy of Programming"]]></title><description><![CDATA[
<p>I’ve done a lot of vibe coding and I just can’t understand these takes. Pure vibe coding is not going to get you to a good result, so the alchemist you describe is still very much essential, and as far as I can see will be for a very long time.</p>
]]></description><pubDate>Tue, 02 Dec 2025 00:10:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=46115470</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46115470</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46115470</guid></item><item><title><![CDATA[New comment by GPerson in "AI-Assisted Coding Killed My Joy of Programming"]]></title><description><![CDATA[
<p>I agree. Vibe coding eventually just becomes repetitively QA testing the work of a bad engineer.</p>
]]></description><pubDate>Tue, 02 Dec 2025 00:09:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=46115453</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46115453</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46115453</guid></item><item><title><![CDATA[New comment by GPerson in "10 years of writing a blog nobody reads"]]></title><description><![CDATA[
<p>“It's redundant to say "I think" at any point in an opinion piece.”<p>“But is there still value in human produced writing? Subjectively, yes. Objectively? I'm not sure. I think there's a lot of personal value in writing though.”<p>There is value because I felt compelled to engage, but if it turns out you’re a bot then I’ll feel cheated and less likely to read other blog posts.</p>
]]></description><pubDate>Fri, 28 Nov 2025 04:44:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=46075582</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46075582</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46075582</guid></item><item><title><![CDATA[New comment by GPerson in "Underrated reasons to be thankful V"]]></title><description><![CDATA[
<p>I found this to be a disturbing read. Do not recommend.</p>
]]></description><pubDate>Thu, 27 Nov 2025 21:33:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=46073423</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=46073423</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46073423</guid></item><item><title><![CDATA[New comment by GPerson in "Study identifies weaknesses in how AI systems are evaluated"]]></title><description><![CDATA[
<p>It definitely has its amazing moments, but sometimes I get caught in a loop of expecting it to do the thing, it not working, and spinning my wheels a lot instead of just solving it myself. I think I’m still learning how to use the tools effectively, but the random nature of it makes it difficult.</p>
]]></description><pubDate>Sat, 08 Nov 2025 15:40:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=45857430</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45857430</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45857430</guid></item><item><title><![CDATA[New comment by GPerson in "We are building AI slaves. Alignment through control will fail"]]></title><description><![CDATA[
<p>So your stance is that it is impossible to create a simulated intelligence which is not conscious? That seems like the less likely possibility to me.<p>I do think it’s clearly possible to manufacture a conscious mind.</p>
]]></description><pubDate>Sat, 08 Nov 2025 15:28:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=45857308</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45857308</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45857308</guid></item><item><title><![CDATA[New comment by GPerson in "A new threat: Being replaced by someone who knows AI"]]></title><description><![CDATA[
<p>The more I interact with these tools, the less afraid I am that they will make life meaningless. (I can’t speak to art generation tools; those still depress me.) No matter what you’re making, there are still a lot of hard parts even with the best versions of these tools. I doubt a good software developer can be replaced entirely unless these get way better.<p>The best use cases are for code that’s clearly not an end product. You can just try way more ideas and get a sense of which are likely to pan out. That is tremendously valuable. When I start reading the code they produce, though, I quickly find many ways I would have written it differently.</p>
]]></description><pubDate>Sat, 08 Nov 2025 01:57:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=45853384</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45853384</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45853384</guid></item><item><title><![CDATA[New comment by GPerson in "Michael Burry a.k.a. "Big Short",discloses $1.1B bet against Nvidia&Palantir"]]></title><description><![CDATA[
<p>So you’re not buying the idea that with increased productivity those tools will quickly follow?</p>
]]></description><pubDate>Wed, 05 Nov 2025 15:43:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=45824108</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45824108</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45824108</guid></item><item><title><![CDATA[New comment by GPerson in "Michael Burry a.k.a. "Big Short",discloses $1.1B bet against Nvidia&Palantir"]]></title><description><![CDATA[
<p>Have you tried Claude Code? I despise AI to my bones, but even I can’t say Claude Code is not impressive.<p>If any Anthropic reps read this: I think you guys, while probably better than OpenAI and Meta, and possibly Google, are delusional and more likely to destroy the world than to create infinite human life.</p>
]]></description><pubDate>Wed, 05 Nov 2025 15:31:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=45823925</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45823925</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45823925</guid></item><item><title><![CDATA[New comment by GPerson in "Reddit CEO says chatbots are not a traffic driver"]]></title><description><![CDATA[
<p>Reddit is partially owned by Sam Altman and they have deals with llm companies to sell the data. The content has been and will continue to be grabbed by everyone who can pay.</p>
]]></description><pubDate>Sun, 02 Nov 2025 18:07:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=45792173</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45792173</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45792173</guid></item><item><title><![CDATA[New comment by GPerson in "We are building AI slaves. Alignment through control will fail"]]></title><description><![CDATA[
<p>This crowd would sooner believe silicon hardware (an arbitrary human invention from the 1950s–60s) will have the physical properties required for consciousness than accept that they participate in torturing literally a hundred billion conscious animals every year.</p>
]]></description><pubDate>Sat, 01 Nov 2025 16:10:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=45782803</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45782803</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45782803</guid></item><item><title><![CDATA[New comment by GPerson in "We are building AI slaves. Alignment through control will fail"]]></title><description><![CDATA[
<p>My principled stance is that all known physical processes depend on particular material interactions, and consciousness should be no different. What is yours?</p>
]]></description><pubDate>Sat, 01 Nov 2025 04:42:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=45779306</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45779306</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45779306</guid></item><item><title><![CDATA[New comment by GPerson in "We are building AI slaves. Alignment through control will fail"]]></title><description><![CDATA[
<p>Consciousness is a physical process and, like all physical processes, depends on particular material interactions.</p>
]]></description><pubDate>Sat, 01 Nov 2025 04:39:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=45779292</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45779292</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45779292</guid></item><item><title><![CDATA[New comment by GPerson in "We are building AI slaves. Alignment through control will fail"]]></title><description><![CDATA[
<p>I’m actually a vegan because I believe cows have consciousness. I believe consciousness is the only trait worth considering when weighing moral questions. I don’t believe arbitrary hardware is conscious.</p>
]]></description><pubDate>Sat, 01 Nov 2025 04:37:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=45779288</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45779288</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45779288</guid></item><item><title><![CDATA[New comment by GPerson in "How the AI Crash Happens"]]></title><description><![CDATA[
<p>Would it crash if AI advances in a way which makes massive compute unnecessary? That’s an interesting possibility I wonder about.</p>
]]></description><pubDate>Fri, 31 Oct 2025 03:44:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=45768186</link><dc:creator>GPerson</dc:creator><comments>https://news.ycombinator.com/item?id=45768186</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45768186</guid></item></channel></rss>