<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: wave_1</title><link>https://news.ycombinator.com/user?id=wave_1</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Mon, 04 May 2026 14:57:01 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=wave_1" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by wave_1 in "xLSTM code release by NX-AI"]]></title><description><![CDATA[
<p>state tracking...</p>
]]></description><pubDate>Wed, 05 Jun 2024 18:50:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=40588798</link><dc:creator>wave_1</dc:creator><comments>https://news.ycombinator.com/item?id=40588798</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40588798</guid></item><item><title><![CDATA[New comment by wave_1 in "Feynman Symbolic Regression Database"]]></title><description><![CDATA[
<p>If we can figure out symbolic regression in a universal fashion, that would be AGI in a way, right?</p>
]]></description><pubDate>Mon, 13 May 2024 16:31:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=40345114</link><dc:creator>wave_1</dc:creator><comments>https://news.ycombinator.com/item?id=40345114</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40345114</guid></item><item><title><![CDATA[New comment by wave_1 in "FireChat was a tool for revolution, then disappeared"]]></title><description><![CDATA[
<p>haha the best reply on the Internet</p>
]]></description><pubDate>Mon, 29 Apr 2024 22:46:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=40205032</link><dc:creator>wave_1</dc:creator><comments>https://news.ycombinator.com/item?id=40205032</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40205032</guid></item><item><title><![CDATA[New comment by wave_1 in "What can LLMs never do?"]]></title><description><![CDATA[
<p>Thanks. This is just in the lab stage, but it's moving closer to release, exactly so that you can play with it! I have one angel investor supporting this, and it's intended for commercial applications in the paralegal space initially (a controlled, structured environment). But you just gave me the motivation to "put it out there" so people can play with it. It'll take a bit of time, but I will do a Show HN when it's ready. Otherwise, it would just be teasing people to talk about it on the main HN stage without giving access. Hold tight! And thanks again!</p>
]]></description><pubDate>Sat, 27 Apr 2024 20:49:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=40183389</link><dc:creator>wave_1</dc:creator><comments>https://news.ycombinator.com/item?id=40183389</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40183389</guid></item><item><title><![CDATA[New comment by wave_1 in "What can LLMs never do?"]]></title><description><![CDATA[
<p>I built an agentic AI that leverages #6 and #7 at the end of the article, as well as techniques not yet published. It tackles hallucination relative not to the world at large but to the facts, entities, and causal relationships contained in a document (which is really bad reasoning, if we assume LLMs are "reasoning" to begin with). It also tackles cross-reasoning over very large token distances.<p><a href="https://www.youtube.com/watch?v=99NPzteAz94" rel="nofollow">https://www.youtube.com/watch?v=99NPzteAz94</a><p>This is my first post on HN in 10 years.</p>
]]></description><pubDate>Sat, 27 Apr 2024 20:11:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=40183053</link><dc:creator>wave_1</dc:creator><comments>https://news.ycombinator.com/item?id=40183053</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40183053</guid></item></channel></rss>