<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: zulban</title><link>https://news.ycombinator.com/user?id=zulban</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 13 May 2026 14:37:41 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=zulban" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by zulban in "Googlebook"]]></title><description><![CDATA[
<p>"This is why AI doesn't sell."<p>There are several AI companies with billions in yearly revenue that didn't even exist a few years ago, and many more earning millions. Saying AI doesn't sell is completely delusional. You're in an anti-AI bubble.</p>
]]></description><pubDate>Tue, 12 May 2026 20:30:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=48114095</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=48114095</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48114095</guid></item><item><title><![CDATA[New comment by zulban in "Canvas online again as ShinyHunters threatens to leak schools’ data"]]></title><description><![CDATA[
<p>"It should be illegal for any company to pay ransomware attacks. Period. No pay out ever."<p>You seem to think "if it's illegal it won't happen". Instead you need to think about unintended consequences and what would actually happen if this were law. Victims would hesitate to contact the police for help before deciding whether to pay, or would never report the attack at all.</p>
]]></description><pubDate>Fri, 08 May 2026 14:24:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=48063703</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=48063703</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48063703</guid></item><item><title><![CDATA[New comment by zulban in "LinkedIn is scanning browser extensions"]]></title><description><![CDATA[
<p>There's a third choice. Say you'll do it but do it poorly, or drag your feet forever. It's hard to prove you intentionally did a bad job.<p>If that's the game you're playing though, maybe it's time to find another job too ;)</p>
]]></description><pubDate>Thu, 30 Apr 2026 21:49:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47968673</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47968673</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47968673</guid></item><item><title><![CDATA[New comment by zulban in "I cancelled Claude: Token issues, declining quality, and poor support"]]></title><description><![CDATA[
<p>Curious. That's not my experience whatsoever.<p>I tried Claude recently and it one-shotted fixes for 9/9 of the bugs I gave it on my large, older Unity C# project. Only 2 of the 9 needed minor tweaks for personal style (functionally the same).<p>Maybe it helps that I separately have a CLI with very extensive unit tests. Or that I just signed up. Or that I use Claude late in the evenings (off hours). I also give it very targeted instructions, and if it's taking longer than a couple of minutes, I abort and try a different or more precise prompt. Maybe the backend recognizes that I use it sparingly and I get better service.<p>The author describes what sound like very large tasks that I'd never hand off to an AI to run wild on in 2026.<p>Anyway, I thought I'd give a different perspective than this thread.</p>
]]></description><pubDate>Fri, 24 Apr 2026 22:32:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=47896633</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47896633</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47896633</guid></item><item><title><![CDATA[New comment by zulban in "I cancelled Claude: Token issues, declining quality, and poor support"]]></title><description><![CDATA[
<p>People hired to do jobs they cannot do have many, many more methods than that. For thousands of years.</p>
]]></description><pubDate>Fri, 24 Apr 2026 22:24:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=47896563</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47896563</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47896563</guid></item><item><title><![CDATA[New comment by zulban in "ChatGPT Images 2.0"]]></title><description><![CDATA[
<p>The claim is that people don't / shouldn't want to see something if humans can't be bothered to make it. I provided a counterexample. So the claim is nonsense.</p>
]]></description><pubDate>Wed, 22 Apr 2026 21:32:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47869555</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47869555</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47869555</guid></item><item><title><![CDATA[New comment by zulban in "ChatGPT Images 2.0"]]></title><description><![CDATA[
<p>You're presenting this as legally clear, but it's not. To the detriment of your point.<p>If I download all BSD software, count how many times "if" appears, and distribute that total, I've not violated the BSD license. AI-generated code is different from that, but not totally different.<p>Ignore nuance and the adults will ignore you.</p>
]]></description><pubDate>Wed, 22 Apr 2026 12:57:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=47862914</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47862914</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47862914</guid></item><item><title><![CDATA[New comment by zulban in "ChatGPT Images 2.0"]]></title><description><![CDATA[
<p>Maybe reread my comment. Would you not want to see a Mount Everest-sized Lego cat? Even if it were my cat?<p>Again, your quip sounds good, but when you think about it, it's flatly wrong.</p>
]]></description><pubDate>Wed, 22 Apr 2026 03:49:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=47858718</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47858718</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47858718</guid></item><item><title><![CDATA[New comment by zulban in "ChatGPT Images 2.0"]]></title><description><![CDATA[
<p>Nobody can be bothered to build my cat out of Lego at the size of Mount Everest, but if an AI did, I'd sure love to see it.<p>Your quip is pithy but meaningless.</p>
]]></description><pubDate>Tue, 21 Apr 2026 22:55:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=47855824</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47855824</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47855824</guid></item><item><title><![CDATA[New comment by zulban in "Louis Zocchi, games industry pioneer, has died"]]></title><description><![CDATA[
<p>Nothing unreasonable about wanting to live healthier and longer. It's not likely though.</p>
]]></description><pubDate>Tue, 21 Apr 2026 12:27:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=47847833</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47847833</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47847833</guid></item><item><title><![CDATA[New comment by zulban in "Updating Gun Rocket through 10 years of Unity Engine"]]></title><description><![CDATA[
<p>Fun. I've upgraded my game a few times over the years. It started in 2018, so I began with a version slightly older than that. Some of these changes seem familiar, and I had a fairly similar experience since my game has also always been C# and simple. I have always carefully avoided any fancy new Unity features and just use the core engine to deliver my game to many platforms. Neat to hear the author worked through the deprecation renames, which I also remember.</p>
]]></description><pubDate>Sun, 19 Apr 2026 06:25:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=47822209</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47822209</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47822209</guid></item><item><title><![CDATA[New comment by zulban in "€54k spike in 13h from unrestricted Firebase browser key accessing Gemini APIs"]]></title><description><![CDATA[
<p>Ridiculous. They are clearly not trying at all. A hard wall that prevents going 100x over budget in a couple of hours is not some devilishly complicated distributed-systems problem.<p>Don't toe the party line.<p>Same reason Azure AI only offers easy per-minute rate limits, not per-day, per-week, or per-month ones. Open source proxy projects do it easily, though. Think about the incentives.<p>Going over a hard cap by 3% would be a reasonable failure; going over by 30000% is not.</p>
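<p>For what it's worth, the hard cap described above is a few lines in a self-hosted proxy. A minimal sketch of the idea, assuming a single-process proxy with per-request cost estimates (all names here are hypothetical, not any provider's API):</p>

```python
# Hypothetical sketch: a hard spend cap enforced inside a billing proxy.
# Nothing here is a real Azure/Google API; it only illustrates the mechanism.
import threading
import time


class SpendCap:
    """Reject requests once estimated spend in the current window hits a hard cap."""

    def __init__(self, cap_usd: float, window_seconds: float = 86400.0):
        self.cap_usd = cap_usd              # the hard budget ceiling
        self.window_seconds = window_seconds  # e.g. one day, week, or month
        self._spent = 0.0
        self._window_start = time.monotonic()
        self._lock = threading.Lock()

    def try_charge(self, estimated_cost_usd: float) -> bool:
        """Reserve the cost before forwarding a request; False means hard-reject."""
        with self._lock:
            now = time.monotonic()
            # Reset the counter when the window rolls over.
            if now - self._window_start >= self.window_seconds:
                self._window_start = now
                self._spent = 0.0
            # Over the cap: refuse the request instead of billing it.
            if self._spent + estimated_cost_usd > self.cap_usd:
                return False
            self._spent += estimated_cost_usd
            return True
```

<p>A distributed version needs a shared counter (e.g. in Redis) and tolerates small overshoot while replicas sync, which is exactly the reasonable ~3% failure mode, not 30000%.</p>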
]]></description><pubDate>Thu, 16 Apr 2026 13:11:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=47792469</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47792469</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47792469</guid></item><item><title><![CDATA[New comment by zulban in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>I'm not sure how one example contradicts documented huge overall trends, but okay.</p>
]]></description><pubDate>Sat, 28 Mar 2026 19:25:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=47557480</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47557480</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47557480</guid></item><item><title><![CDATA[New comment by zulban in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>> LLMs outputs, for example, are notoriously unreproducible.<p>Only in the same way that an individual in a medical study cannot be "reproduced" for the next study. However, the overall statistical outcomes of studying a specific LLM can be reproduced.</p>
]]></description><pubDate>Sat, 28 Mar 2026 19:24:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47557470</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47557470</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47557470</guid></item><item><title><![CDATA[New comment by zulban in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>Generally, published papers don't give a damn about reproducibility. I've seen it identified as a crisis by many. Publishers, reviewers, and researchers mostly don't care about that level of basic rigor. There are no professional repercussions or embarrassment.<p>Agreed: if I were a reviewer for LLM papers, failing to list the versions and prompts used would be an instant rejection.</p>
]]></description><pubDate>Sat, 28 Mar 2026 17:24:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=47556608</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47556608</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47556608</guid></item><item><title><![CDATA[New comment by zulban in "Migrating to the EU"]]></title><description><![CDATA[
<p>Not comfortable. But making choices in the real world is about choosing the best option, not the perfect option.</p>
]]></description><pubDate>Mon, 23 Mar 2026 13:28:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=47489266</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47489266</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47489266</guid></item><item><title><![CDATA[New comment by zulban in "2% of ICML papers desk rejected because the authors used LLM in their reviews"]]></title><description><![CDATA[
<p>I've learned a bit today about how often people on HN read the article before commenting. Or are potentially bots that are way off. The title alone isn't enough to fully grasp what happened here, or the methods used.<p>The detection was extremely conservative; the real number must be much higher.</p>
]]></description><pubDate>Thu, 19 Mar 2026 13:27:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=47439049</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47439049</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47439049</guid></item><item><title><![CDATA[New comment by zulban in "Digg is gone again"]]></title><description><![CDATA[
<p>"So if someone compromises your identity they can unperson you?"<p>You've identified a problem that unrelated systems also have, like banks and identity theft. This solution isn't responsible for causing that problem.<p>"How will the AI be detected? By another AI?"<p>However a platform likes. Let the best platform win.</p>
]]></description><pubDate>Sat, 14 Mar 2026 16:28:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=47378297</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47378297</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47378297</guid></item><item><title><![CDATA[New comment by zulban in "Digg is gone again"]]></title><description><![CDATA[
<p>But then if the AI is detected that person can be permanently banned. No more AI. No new accounts.</p>
]]></description><pubDate>Sat, 14 Mar 2026 13:48:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=47376655</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47376655</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47376655</guid></item><item><title><![CDATA[New comment by zulban in "Tell HN: I'm 60 years old. Claude Code has re-ignited a passion"]]></title><description><![CDATA[
<p>> I love it. It feels like it did back then. I’m chasing the midnight hour and not getting any sleep.<p>I highly recommend this blog post about vibe coding, gambling, and flow. Glad you're having a great time! Just something to consider.<p><a href="https://www.fast.ai/posts/2026-01-28-dark-flow/" rel="nofollow">https://www.fast.ai/posts/2026-01-28-dark-flow/</a></p>
]]></description><pubDate>Sat, 07 Mar 2026 14:55:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=47288193</link><dc:creator>zulban</dc:creator><comments>https://news.ycombinator.com/item?id=47288193</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47288193</guid></item></channel></rss>