<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: qsera</title><link>https://news.ycombinator.com/user?id=qsera</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sun, 12 Apr 2026 11:46:29 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=qsera" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by qsera in "OpenAI backs Illinois bill that would limit when AI labs can be held liable"]]></title><description><![CDATA[
<p>It won't be, because it is marketing, and we are eating it up.</p>
]]></description><pubDate>Fri, 10 Apr 2026 16:21:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=47720408</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47720408</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47720408</guid></item><item><title><![CDATA[New comment by qsera in "OpenAI backs Illinois bill that would limit when AI labs can be held liable"]]></title><description><![CDATA[
<p>Another marketing gimmick...</p>
]]></description><pubDate>Fri, 10 Apr 2026 16:20:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=47720400</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47720400</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47720400</guid></item><item><title><![CDATA[New comment by qsera in "OpenAI backs Illinois bill that would limit when AI labs can be held liable"]]></title><description><![CDATA[
<p><a href="https://en.wikipedia.org/wiki/National_Childhood_Vaccine_Injury_Act" rel="nofollow">https://en.wikipedia.org/wiki/National_Childhood_Vaccine_Inj...</a></p>
]]></description><pubDate>Fri, 10 Apr 2026 14:24:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=47718637</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47718637</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47718637</guid></item><item><title><![CDATA[New comment by qsera in "White House staff told not to place bets on prediction markets"]]></title><description><![CDATA[
<p>Mmm... yeah. But the stock market is not something new, and this possibility has always existed.<p>So I was only thinking about Polymarket and stuff like that.</p>
]]></description><pubDate>Fri, 10 Apr 2026 14:17:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=47718547</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47718547</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47718547</guid></item><item><title><![CDATA[New comment by qsera in "OpenAI backs Illinois bill that would limit when AI labs can be held liable"]]></title><description><![CDATA[
<p>>But that would be an insane thing to claim.<p>The trick is to make people behave like that without actually claiming it. AI companies seem to have aced it.</p>
]]></description><pubDate>Fri, 10 Apr 2026 14:09:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=47718435</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47718435</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47718435</guid></item><item><title><![CDATA[New comment by qsera in "OpenAI Backs Bill That Would Limit Liability for AI-Enabled Mass Deaths"]]></title><description><![CDATA[
<p>> it simply shouldn’t be possible with a proper approach to testing.<p>It just has to be delayed, like many years after application. Or triggered only under very specific and rare circumstances: not likely in a trial, but near certain at population scale.<p>Or both...<p>On top of that, if I remember correctly, this liability waiver also exists for vaccines.</p>
]]></description><pubDate>Fri, 10 Apr 2026 14:08:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47718410</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47718410</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47718410</guid></item><item><title><![CDATA[New comment by qsera in "White House staff told not to place bets on prediction markets"]]></title><description><![CDATA[
<p>It also incentivizes people to follow events and remember things, so that they are better equipped to make predictions.<p>I think it incentivizes people to use their heads, and maybe even to select better leaders.</p>
]]></description><pubDate>Fri, 10 Apr 2026 13:56:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=47718252</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47718252</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47718252</guid></item><item><title><![CDATA[New comment by qsera in "White House staff told not to place bets on prediction markets"]]></title><description><![CDATA[
<p>I think the proper thing to do would be to forbid these markets from offering bets that can be won using only insider information.</p>
]]></description><pubDate>Fri, 10 Apr 2026 13:54:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=47718220</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47718220</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47718220</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p>> LLMs are circuit builders<p>I think they are circuit "approximators". In other words, the result of a glorified linear regression...</p>
]]></description><pubDate>Fri, 10 Apr 2026 01:12:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=47712358</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47712358</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47712358</guid></item><item><title><![CDATA[New comment by qsera in "Claude mixes up who said what"]]></title><description><![CDATA[
<p>Obviously, a truly intelligent entity would do a risk/benefit analysis and act accordingly.</p>
]]></description><pubDate>Fri, 10 Apr 2026 01:10:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=47712347</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47712347</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47712347</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p>I think it matters for the question that I was responding to.</p>
]]></description><pubDate>Fri, 10 Apr 2026 01:08:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=47712338</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47712338</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47712338</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p><a href="https://arxiv.org/abs/2603.09678" rel="nofollow">https://arxiv.org/abs/2603.09678</a></p>
]]></description><pubDate>Thu, 09 Apr 2026 14:59:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=47704636</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47704636</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47704636</guid></item><item><title><![CDATA[New comment by qsera in "Claude mixes up who said what"]]></title><description><![CDATA[
<p>>If you were there, what would you do?<p>Show it to my boss and let them decide.</p>
]]></description><pubDate>Thu, 09 Apr 2026 14:10:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=47704012</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47704012</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47704012</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p>But that is bolted on and is not a core behavior.</p>
]]></description><pubDate>Wed, 08 Apr 2026 18:17:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=47694157</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47694157</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47694157</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p>Source?</p>
]]></description><pubDate>Wed, 08 Apr 2026 17:34:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=47693533</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47693533</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47693533</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p>For starters, natural brains have the innate ability to differentiate between things they know and things they have no possibility of knowing...</p>
]]></description><pubDate>Wed, 08 Apr 2026 17:28:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=47693444</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47693444</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47693444</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p>I am not the OP, and I have only used the free version of ChatGPT. The other day I asked it something. It answered. Then I asked it to provide sources. It provided sources, and also changed its original answer. When I checked, the new answer was wrong, and the sources didn't actually contain the information I had asked for. So it hallucinated the answers as well as the sources...</p>
]]></description><pubDate>Wed, 08 Apr 2026 17:15:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47693229</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47693229</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47693229</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p>But why do you need an example? Isn't it pretty well understood that LLMs will have trouble responding to stuff that is underrepresented in the training data?<p>You just won't have any clue what that could be.</p>
]]></description><pubDate>Wed, 08 Apr 2026 17:04:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=47693041</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47693041</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47693041</guid></item><item><title><![CDATA[New comment by qsera in "ML promises to be profoundly weird"]]></title><description><![CDATA[
<p>>95% is not my experience and frankly dishonest.<p>Quite frankly, this is exactly like two people using the same compression program on two different files and getting vastly different compression ratios (because one file has a lot of redundancy and the other does not).</p>
]]></description><pubDate>Wed, 08 Apr 2026 17:00:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=47692977</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47692977</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47692977</guid></item><item><title><![CDATA[New comment by qsera in "AI Assistance Reduces Persistence and Hurts Independent Performance"]]></title><description><![CDATA[
<p>The AI marketing just needs the publicity people generate during the honeymoon phase. The falling-out is generally gradual, and typically happens in silence.</p>
]]></description><pubDate>Wed, 08 Apr 2026 15:00:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=47691178</link><dc:creator>qsera</dc:creator><comments>https://news.ycombinator.com/item?id=47691178</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47691178</guid></item></channel></rss>