<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: KalMann</title><link>https://news.ycombinator.com/user?id=KalMann</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 13 May 2026 16:20:20 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=KalMann" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by KalMann in "As researchers age, they produce less disruptive work"]]></title><description><![CDATA[
<p>Not a physicist either, but my understanding is that if you believe we can discover all the laws of physics that explain how the world operates, then this problem needs to have a solution.<p>Like, we have formulas describing how gravity works. We can test these formulas by observing the motion of planets and galaxies. Is the theory true? There's lots of evidence for it, so it feels like it's got to be pretty close to "the truth".<p>We also have formulas describing how elementary particles behave. These formulas have been tested to a very high degree of precision, so they seem to be close to the truth as well. But if you use both our formulas for gravitation and our formulas for elementary particles, you can derive a contradiction. So these two theories cannot simultaneously be true; there's got to be something wrong with them.<p>I suppose there's the possibility that at a certain point nature simply doesn't follow any laws and you can't possibly make sense of it.</p>
]]></description><pubDate>Tue, 12 May 2026 21:38:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=48114958</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=48114958</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48114958</guid></item><item><title><![CDATA[New comment by KalMann in "Googlebook"]]></title><description><![CDATA[
<p>Are you sure? I think "normies" would prefer to see and try on the clothes they buy.</p>
]]></description><pubDate>Tue, 12 May 2026 20:44:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=48114269</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=48114269</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48114269</guid></item><item><title><![CDATA[New comment by KalMann in "Software engineering may no longer be a lifetime career"]]></title><description><![CDATA[
<p>I can give you the exact mathematical formula used to statistically optimize the output of a neural network from input examples. Can you do the same for the brain?</p>
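<p>For what it's worth, here is a minimal sketch of that formula (generic supervised training; the symbols are illustrative, not from any particular paper): minimize the empirical loss over the parameters \(\theta\) by gradient descent,
$$\mathcal{L}(\theta)=\frac{1}{N}\sum_{i=1}^{N}\ell\!\left(f_\theta(x_i),\,y_i\right),\qquad \theta \leftarrow \theta-\eta\,\nabla_\theta\mathcal{L}(\theta),$$
where \(f_\theta\) is the network, \(\ell\) is a loss such as cross-entropy, and \(\eta\) is the learning rate. Nothing comparably explicit exists for the brain.</p>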
]]></description><pubDate>Mon, 11 May 2026 19:25:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=48099524</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=48099524</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48099524</guid></item><item><title><![CDATA[New comment by KalMann in "Software engineering may no longer be a lifetime career"]]></title><description><![CDATA[
<p>&gt; But we are? That's our education system.<p>That is not what the education system does; that's an obvious distortion of reality. LLMs are trained over billions of documents to statistically predict the next word and thereby gain an understanding of language. They do this statistical processing in order to mimic humans' natural language-learning ability, and there has been continued evidence of the limitations of this approach for accurately mimicking the totality of human cognition.</p>
]]></description><pubDate>Mon, 11 May 2026 19:19:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=48099460</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=48099460</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48099460</guid></item><item><title><![CDATA[New comment by KalMann in "Software engineering may no longer be a lifetime career"]]></title><description><![CDATA[
<p>&gt; So are humans.<p>AI advocates are _way_ too confident about the nature of human cognition. Questions that philosophers and cognitive scientists have debated for decades are now "obvious" according to you people, though you never provide any argument to support your statements.</p>
]]></description><pubDate>Mon, 11 May 2026 19:17:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=48099416</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=48099416</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48099416</guid></item><item><title><![CDATA[New comment by KalMann in "Think Linear Algebra (2023)"]]></title><description><![CDATA[
<p>Did you misstate your comment? The Jordan normal form is more general than spectral decomposition so it should come after.</p>
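<p>A small example in case it helps: the matrix \(\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}\) is not diagonalizable, so it admits no spectral decomposition, yet it is already in Jordan normal form. Every square complex matrix has a Jordan form, which is why the Jordan form is the more general tool.</p>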
]]></description><pubDate>Mon, 11 May 2026 16:34:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=48097260</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=48097260</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48097260</guid></item><item><title><![CDATA[New comment by KalMann in "You can beat the binary search"]]></title><description><![CDATA[
<p>I don't really see how this implies the above commenter's statement is "simply not true".</p>
]]></description><pubDate>Thu, 30 Apr 2026 17:16:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=47965533</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=47965533</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47965533</guid></item><item><title><![CDATA[New comment by KalMann in "Waymo halts service during S.F. blackout after causing traffic jams"]]></title><description><![CDATA[
<p>I think this was a failure. The gold standard should be: if every human driver were replaced with an AI, how well would the system function? This incident makes it look like things would be catastrophic, showing how humans continue to be much more versatile and capable than AI.<p>I suppose if you lower the standards for what you hope AI can accomplish, it wouldn't be considered a failure.</p>
]]></description><pubDate>Mon, 22 Dec 2025 18:39:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=46357206</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46357206</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46357206</guid></item><item><title><![CDATA[New comment by KalMann in "Your job is to deliver code you have proven to work"]]></title><description><![CDATA[
<p>This but with hate.</p>
]]></description><pubDate>Fri, 19 Dec 2025 16:15:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=46327413</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46327413</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46327413</guid></item><item><title><![CDATA[New comment by KalMann in "40 percent of fMRI signals do not correspond to actual brain activity"]]></title><description><![CDATA[
<p>Why are you phrasing your correction in the form of a question? I think it's pretty reasonable to infer that he mistakenly thought it was a Stanford study because the link was from Stanford.</p>
]]></description><pubDate>Wed, 17 Dec 2025 17:00:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=46302123</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46302123</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46302123</guid></item><item><title><![CDATA[New comment by KalMann in "40 percent of fMRI signals do not correspond to actual brain activity"]]></title><description><![CDATA[
<p>> Further, an ad hominem is when a person attacks someone's character without any base.<p>That is not what an ad hominem is.</p>
]]></description><pubDate>Tue, 16 Dec 2025 22:37:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=46295624</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46295624</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46295624</guid></item><item><title><![CDATA[New comment by KalMann in "Show HN: Gemini Pro 3 imagines the HN front page 10 years from now"]]></title><description><![CDATA[
<p>&gt; It does exactly the same, predicts tokens,<p>That is an absolutely wild claim you've made. You're being way too presumptuous.</p>
]]></description><pubDate>Wed, 10 Dec 2025 16:26:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=46219733</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46219733</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46219733</guid></item><item><title><![CDATA[New comment by KalMann in "Show HN: Gemini Pro 3 imagines the HN front page 10 years from now"]]></title><description><![CDATA[
<p>&gt; Models don't have access to "reality"<p>This is an explanation of why models "hallucinate", not a criticism of the provided definition of hallucination.</p>
]]></description><pubDate>Wed, 10 Dec 2025 16:12:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=46219522</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46219522</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46219522</guid></item><item><title><![CDATA[New comment by KalMann in "Researchers Find Microbe Capable of Producing Oxygen from Martian Soil"]]></title><description><![CDATA[
<p>I think the problem with your post is that it started a list of "incorrect statements" with a statement that wasn't incorrect.</p>
]]></description><pubDate>Wed, 03 Dec 2025 16:48:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=46136712</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46136712</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46136712</guid></item><item><title><![CDATA[New comment by KalMann in "Boom, bubble, bust, boom. Why should AI be different?"]]></title><description><![CDATA[
<p>You're missing the point. Those kinds of narrow AI applications are not the motivation for the trillions of dollars being poured into AI. Of course AI has a variety of applications across many disciplines, as it has for decades. The motivation behind the massive investment in AI is, as forgetfulness said, to reap the benefits of "revolutionizing the workplace".</p>
]]></description><pubDate>Fri, 21 Nov 2025 21:53:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=46009425</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46009425</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46009425</guid></item><item><title><![CDATA[New comment by KalMann in "Arduino published updated terms and conditions: no longer an open commons"]]></title><description><![CDATA[
<p>Aren't you forgetting about the software that makes it so easy and straightforward for newcomers to flash programs and experiment with the microcontroller?</p>
]]></description><pubDate>Fri, 21 Nov 2025 20:36:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=46008679</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=46008679</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46008679</guid></item><item><title><![CDATA[New comment by KalMann in "Nano Banana Pro"]]></title><description><![CDATA[
<p>I'd try some more if I were you. I saw an example of a generated infographic that was greatly improved over anything I've seen an image generator do before. What you desire seems within the realm of possibility.</p>
]]></description><pubDate>Thu, 20 Nov 2025 16:35:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=45994506</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=45994506</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45994506</guid></item><item><title><![CDATA[New comment by KalMann in "Linear algebra explains why some words are effectively untranslatable"]]></title><description><![CDATA[
<p>He probably meant to say "vector" the second time he said "matrix".</p>
]]></description><pubDate>Fri, 14 Nov 2025 20:48:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=45931993</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=45931993</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45931993</guid></item><item><title><![CDATA[New comment by KalMann in "GPT-5.1: A smarter, more conversational ChatGPT"]]></title><description><![CDATA[
<p>Expecting every little fact to have an "authoritative source" is just annoying faux intellectualism. You can ask someone why they believe something, listen to their reasoning, and decide for yourself whether you find it convincing, without invoking such a pretentious phrase. There are conclusions you can reason your way to and reach without an "official citation".</p>
]]></description><pubDate>Wed, 12 Nov 2025 23:52:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=45908600</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=45908600</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45908600</guid></item><item><title><![CDATA[New comment by KalMann in "Yann LeCun to depart Meta and launch AI startup focused on 'world models'"]]></title><description><![CDATA[
<p>I think his point is it means a lot in the context of investors seeking to make returns on their investments.</p>
]]></description><pubDate>Wed, 12 Nov 2025 19:00:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=45904443</link><dc:creator>KalMann</dc:creator><comments>https://news.ycombinator.com/item?id=45904443</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45904443</guid></item></channel></rss>