<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: narenm16</title><link>https://news.ycombinator.com/user?id=narenm16</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 23 Apr 2026 12:21:50 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=narenm16" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by narenm16 in "AI 2027"]]></title><description><![CDATA[
<p>I agree. Scaling up these large models feels like such an inefficient route that it seems to be driving the search for new ideas (test-time compute, etc.).</p><p>We'll likely reach a point where it's infeasible for deep learning alone to fully encompass human-level reasoning, and we'll need neuroscience discoveries to continue progress. Altman seems to be hyping up "bigger is better," not just for model parameters but for OpenAI's valuation.</p>
]]></description><pubDate>Fri, 04 Apr 2025 02:20:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=43577671</link><dc:creator>narenm16</dc:creator><comments>https://news.ycombinator.com/item?id=43577671</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43577671</guid></item></channel></rss>