<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: tyronehed</title><link>https://news.ycombinator.com/user?id=tyronehed</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Fri, 17 Apr 2026 10:34:18 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=tyronehed" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by tyronehed in "Author of "Careless People" banned from saying anything negative about Meta"]]></title><description><![CDATA[
<p>This is a great book. I thoroughly enjoyed reading it.
She was extremely fair to Facebook executives.</p>
]]></description><pubDate>Sat, 04 Apr 2026 18:03:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47641574</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=47641574</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47641574</guid></item><item><title><![CDATA[New comment by tyronehed in "Clawback of $1.1B for PBS and NPR puts rural stations at risk"]]></title><description><![CDATA[
<p>So, you oppose their desire to diversify their funding?
NPR gets 10% of its funding from tax dollars.
The people you are harming are small rural public radio stations in red areas.</p>
]]></description><pubDate>Sat, 19 Jul 2025 19:47:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=44618726</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=44618726</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44618726</guid></item><item><title><![CDATA[New comment by tyronehed in "Ask HN: Where do you guys find audiobooks?"]]></title><description><![CDATA[
<p>archive.org.
Look for the audiobook "Mawson's Will: The Greatest Survival Story Ever Told"</p>
]]></description><pubDate>Fri, 18 Jul 2025 17:52:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=44607800</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=44607800</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44607800</guid></item><item><title><![CDATA[New comment by tyronehed in "All AI models might be the same"]]></title><description><![CDATA[
<p>Especially if they are all me-too copies of a Transformer.<p>When we arrive at AGI, you can be certain it will not contain a Transformer.</p>
]]></description><pubDate>Thu, 17 Jul 2025 19:30:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=44597194</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=44597194</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44597194</guid></item><item><title><![CDATA[New comment by tyronehed in "Ask HN: Any insider takes on Yann LeCun's push against current architectures?"]]></title><description><![CDATA[
<p>As soon as you need to start leaning heavily on error correction, that is an indication that your architecture and solution are not correct. The final solution will need to be elegant and very close to a perfect solution immediately.<p>You must always keep close to the only known example we have of intelligence, which is the human brain. As soon as you start to wander away from the way the human brain does it, you are on your own, no longer relying on any known example of intelligence. Certainly that might be possible, but since there is only one known example of intelligence in this universe, it seems ridiculous to do anything but stick close to that example: the human brain.</p>
]]></description><pubDate>Sat, 15 Mar 2025 05:02:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=43370111</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43370111</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43370111</guid></item><item><title><![CDATA[New comment by tyronehed in "Ask HN: Any insider takes on Yann LeCun's push against current architectures?"]]></title><description><![CDATA[
<p>This is actually a lazy approach as you describe it. Instead, what is needed is an elegant and simple approach that is 99% of the way there out of the gate. As soon as you start doing statistical tweaking and overfitting models, you are not approaching a solution.</p>
]]></description><pubDate>Fri, 14 Mar 2025 18:46:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=43365867</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43365867</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43365867</guid></item><item><title><![CDATA[New comment by tyronehed in "Any insider takes on Yann LeCun's push against current architectures?"]]></title><description><![CDATA[
<p>When an architecture is based around world model building, it is a natural outcome that similar concepts and things end up being stored in similar places. They overlap.
As soon as your solution starts to get mathematically complex, you are departing from what the human brain does. I am not saying that in some universe it might not be possible to make a statistical intelligence, but when you go in that direction you are straying away from the only existing intelligence that we know about: the human brain. So the best solutions will closely echo neuroscience.</p>
]]></description><pubDate>Fri, 14 Mar 2025 18:45:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=43365847</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43365847</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43365847</guid></item><item><title><![CDATA[New comment by tyronehed in "Ask HN: Any insider takes on Yann LeCun's push against current architectures?"]]></title><description><![CDATA[
<p>Since this exposes the answer, the new architecture has to be based on world model building.</p>
]]></description><pubDate>Fri, 14 Mar 2025 18:43:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=43365825</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43365825</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43365825</guid></item><item><title><![CDATA[New comment by tyronehed in "Any insider takes on Yann LeCun's push against current architectures?"]]></title><description><![CDATA[
<p>The alternative architectures must learn from streaming data, must be error tolerant, and must have the characteristic that similar objects or concepts naturally come near to each other. They must naturally overlap.</p>
]]></description><pubDate>Fri, 14 Mar 2025 18:42:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=43365809</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43365809</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43365809</guid></item><item><title><![CDATA[New comment by tyronehed in "Ask HN: Any insider takes on Yann LeCun's push against current architectures?"]]></title><description><![CDATA[
<p>Any transformer-based LLM will never achieve AGI because it is only trying to pick the next word. You need a larger amount of planning to achieve AGI.
Also, the characteristics of LLMs do not resemble any existing intelligence that we know of. Does a baby require 2 years of statistical analysis to become useful? No. Transformer architectures are parlor tricks. They are a glorified Google, but they are not actually doing anything or planning.
If you want that, then you have to base your architecture on the known examples of intelligence that we are aware of in the universe. And that is not a transformer. In fact, whatever AGI emerges will absolutely not contain a transformer.</p>
]]></description><pubDate>Fri, 14 Mar 2025 18:40:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=43365788</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43365788</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43365788</guid></item><item><title><![CDATA[New comment by tyronehed in "OpenAI declares AI race "over" if training on copyrighted works isn't fair use"]]></title><description><![CDATA[
<p>Only Transformer-based architectures are over.<p>It amazes me that everyone so fetishizes Transformer architectures that they cannot imagine an alternative, when the alternative is obvious.</p>
]]></description><pubDate>Fri, 14 Mar 2025 05:27:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=43359838</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43359838</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43359838</guid></item><item><title><![CDATA[New comment by tyronehed in "The Einstein AI Model"]]></title><description><![CDATA[
<p>The first thing you need to understand is that no current LLM-based, transformer-architected AI is going to get to AGI. The design in essence is not capable of that kind of creativity. In fact, no AI that has at its root a statistical analysis or probabilistic correlation will get us past the glorified Google parlor trick that is the modern LLM in every form.<p>A great leap in IP, one unfortunately too important to blab about widely, is the solution to this problem and the architecture that will be contained in the ultimate AGI solution that emerges.</p>
]]></description><pubDate>Mon, 10 Mar 2025 21:29:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=43326335</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43326335</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43326335</guid></item><item><title><![CDATA[New comment by tyronehed in "Ask HN: How to find intellectually challenging work?"]]></title><description><![CDATA[
<p>You need to start looking for big problems to solve.
Bruce Williams, a guy who knew what he was talking about, once said that the way to get rich is to solve a problem.<p>So what you need to do is start looking for big problems. Not big problems that are only seen by developers, but big problems that are seen by the general public. People will pay for solutions.
You will be motivated by the solution: something you need to research the hell out of, something that requires making connections no one else has made before. That's it.</p>
]]></description><pubDate>Mon, 10 Mar 2025 21:15:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=43326185</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43326185</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43326185</guid></item><item><title><![CDATA[New comment by tyronehed in "Tattoo ink exposure is associated with lymphoma and skin cancers"]]></title><description><![CDATA[
<p>In life it's often more important what you don't do than what you do do.</p>
]]></description><pubDate>Tue, 04 Mar 2025 15:48:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=43256204</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43256204</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43256204</guid></item><item><title><![CDATA[New comment by tyronehed in "Starship's Eighth Flight Test"]]></title><description><![CDATA[
<p>I used to follow SpaceX so closely but now, given the behavior of Elon Musk, I cannot take any interest in any company connected to him.</p>
]]></description><pubDate>Mon, 24 Feb 2025 22:09:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=43165520</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43165520</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43165520</guid></item><item><title><![CDATA[New comment by tyronehed in "Turning Off All 8k EV Chargers at All Federal Government Buildings"]]></title><description><![CDATA[
<p>This is just anti environmentalism, plain and simple. Sticking it to the libs is their number one priority.</p>
]]></description><pubDate>Sun, 23 Feb 2025 20:38:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=43153002</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43153002</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43153002</guid></item><item><title><![CDATA[New comment by tyronehed in "Turning Off All 8k EV Chargers at All Federal Government Buildings"]]></title><description><![CDATA[
<p>What a bunch of petty sons of b***. Trump has an aesthetic aversion to electric cars, which seems doubly bizarre since his partner is Elon Musk.
But this is just extreme and ridiculous. The right has always had a fetish for fighting against the environment. Trump will only be alive a few more years, and he doesn't care what happens to the planet.</p>
]]></description><pubDate>Sun, 23 Feb 2025 20:35:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=43152977</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43152977</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43152977</guid></item><item><title><![CDATA[New comment by tyronehed in "DOGE employees don't understand the basics of SQL"]]></title><description><![CDATA[
<p>News for you. Elon Musk is not an engineer. Elon Musk does not know computer science. Yes, he's able to read books and learn subjects but he has never worked as a software engineer in any professional setting. You attribute competence to him in so many areas where he does not deserve it.</p>
]]></description><pubDate>Fri, 21 Feb 2025 01:16:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=43122913</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43122913</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43122913</guid></item><item><title><![CDATA[New comment by tyronehed in "DOGE employees don't understand the basics of SQL"]]></title><description><![CDATA[
<p>When a foreign key field is left empty, that implies that no foreign key relationship was claimed. By definition a primary key cannot be null, but a foreign key, which is a reference to another table's primary key, can be. It merely means that no foreign key relationship is claimed.</p>
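<p>The distinction can be sketched with SQLite (a hypothetical two-table schema; the table and column names here are invented for illustration, not taken from any real system):</p>

```python
import sqlite3

# Hypothetical schema for illustration; table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when this is on
conn.execute("CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE employees (
        id INTEGER PRIMARY KEY,                     -- primary key: never NULL
        name TEXT,
        dept_id INTEGER REFERENCES departments(id)  -- foreign key: may be NULL
    )
""")
conn.execute("INSERT INTO departments VALUES (1, 'Engineering')")
conn.execute("INSERT INTO employees VALUES (1, 'Ada', 1)")
# A NULL foreign key passes the constraint check: no relationship is claimed.
conn.execute("INSERT INTO employees VALUES (2, 'Bob', NULL)")
rows = conn.execute("SELECT name, dept_id FROM employees ORDER BY id").fetchall()
print(rows)  # [('Ada', 1), ('Bob', None)]
```

<p>A non-NULL dept_id that points at a missing departments row would raise an IntegrityError; a NULL one is simply an unclaimed relationship.</p>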
]]></description><pubDate>Fri, 21 Feb 2025 01:15:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=43122905</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43122905</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43122905</guid></item><item><title><![CDATA[New comment by tyronehed in "Ask HN: What is the best method for turning a scanned book as a PDF into text?"]]></title><description><![CDATA[
<p>For years I have been printing PDFs off on regular paper and then binding them into books.
1. Print it at work when no one is looking.
2. Get two rigid boards and squeeze the stack of paper together. I customarily use two wooden armrests that originally came from a garden-furniture lounger.
3. Squeeze the paper with just 1/4 inch showing.
4. Use wood glue and, with your finger working like a toothbrush, work the glue into the pages at the gluing end.
5. Get a 14-inch by 4-inch strip of canvas. I use cutoff painter's canvas.
6. Hang all this by the boards and put glue on top of the canvas strip as well.
7. When it dries, remove the boards and glue down the sides.
You now have a strong, bound book out of those printed pages.</p>
]]></description><pubDate>Sun, 16 Feb 2025 18:10:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=43070140</link><dc:creator>tyronehed</dc:creator><comments>https://news.ycombinator.com/item?id=43070140</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43070140</guid></item></channel></rss>