<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: madethisnow</title><link>https://news.ycombinator.com/user?id=madethisnow</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Mon, 13 Apr 2026 16:29:34 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=madethisnow" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by madethisnow in "Head of NSA and Cybercommand Is Ousted"]]></title><description><![CDATA[
<p>When did this place become /r/politics? This is a completely unserious claim.</p>
]]></description><pubDate>Fri, 04 Apr 2025 16:05:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=43584403</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43584403</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43584403</guid></item><item><title><![CDATA[New comment by madethisnow in "Doge staffer's YouTube nickname accidentally revealed his teen hacking activity"]]></title><description><![CDATA[
<p>Based on what?</p>
]]></description><pubDate>Fri, 04 Apr 2025 15:48:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=43584122</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43584122</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43584122</guid></item><item><title><![CDATA[New comment by madethisnow in "AI 2027"]]></title><description><![CDATA[
<p>Something you can't know.</p>
]]></description><pubDate>Thu, 03 Apr 2025 20:40:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=43575115</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43575115</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43575115</guid></item><item><title><![CDATA[New comment by madethisnow in "Reasoning models don't always say what they think"]]></title><description><![CDATA[
<p>If something convinces you that it's aware, then it is. Simulated computation IS computation itself. The territory is the map.</p>
]]></description><pubDate>Thu, 03 Apr 2025 20:33:31 +0000</pubDate><link>https://news.ycombinator.com/item?id=43575020</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43575020</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43575020</guid></item><item><title><![CDATA[New comment by madethisnow in "Reasoning models don't always say what they think"]]></title><description><![CDATA[
<p>Datasets and search engines are deterministic. Humans and LLMs are not.</p>
]]></description><pubDate>Thu, 03 Apr 2025 20:32:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=43575008</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43575008</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43575008</guid></item><item><title><![CDATA[New comment by madethisnow in "Tracing the thoughts of a large language model"]]></title><description><![CDATA[
<p>Think about it more.</p>
]]></description><pubDate>Fri, 28 Mar 2025 19:27:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=43509094</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43509094</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43509094</guid></item><item><title><![CDATA[New comment by madethisnow in "Tracing the thoughts of a large language model"]]></title><description><![CDATA[
<p>psychology</p>
]]></description><pubDate>Fri, 28 Mar 2025 19:18:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=43509031</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43509031</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43509031</guid></item><item><title><![CDATA[New comment by madethisnow in "I genuinely don't understand why some people are still bullish about LLMs"]]></title><description><![CDATA[
<p>Great points. I think much of the pessimism is based on fear of inadequacy. These things also raise truly base-level epistemological quandaries that fundamentally question human perception and reality. The average Joe doesn't want to think about how we don't know whether consciousness is a real thing, let alone determine whether the robot has it.<p>We are going through a societal change. There will always be people who reject AI no matter its capabilities. I'm at the point where if ANYTHING tells me that it's conscious... I just have to believe it and act according to my own morals.</p>
]]></description><pubDate>Fri, 28 Mar 2025 19:15:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=43508998</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43508998</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43508998</guid></item><item><title><![CDATA[New comment by madethisnow in "I genuinely don't understand why some people are still bullish about LLMs"]]></title><description><![CDATA[
<p>People lie more.</p>
]]></description><pubDate>Fri, 28 Mar 2025 19:08:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=43508938</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43508938</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43508938</guid></item><item><title><![CDATA[New comment by madethisnow in "I genuinely don't understand why some people are still bullish about LLMs"]]></title><description><![CDATA[
<p>It's really funny how most anecdotes and comments about the utility and value of interacting with LLMs could equally be applied to human beings themselves.
The majority of people haven't yet realized that consciousness is assumed by our society, and that we, in fact, don't know what it is or whether we have it, let alone whether we can ascribe it to another entity.</p>
]]></description><pubDate>Fri, 28 Mar 2025 19:08:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=43508936</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43508936</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43508936</guid></item><item><title><![CDATA[New comment by madethisnow in "What happens to DNA data of millions as 23andMe files bankruptcy?"]]></title><description><![CDATA[
<p>I'm sure you have evidence for this.</p>
]]></description><pubDate>Thu, 27 Mar 2025 21:30:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=43498423</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43498423</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43498423</guid></item><item><title><![CDATA[New comment by madethisnow in "How Monero Fulfilled Satoshi's Promise"]]></title><description><![CDATA[
<p>USD is so much cleaner, true.</p>
]]></description><pubDate>Fri, 07 Mar 2025 16:41:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=43291658</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43291658</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43291658</guid></item><item><title><![CDATA[New comment by madethisnow in "Prepare now for a potential H5N1 pandemic"]]></title><description><![CDATA[
<p>A disingenuous answer, not becoming of YC.</p>
]]></description><pubDate>Fri, 07 Mar 2025 15:33:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=43290974</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=43290974</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43290974</guid></item><item><title><![CDATA[New comment by madethisnow in "Google is making AI in Gmail and Docs free, but raising the price of Workspace"]]></title><description><![CDATA[
<p>Why would anyone email? You can just send a letter in the mail.</p>
]]></description><pubDate>Thu, 16 Jan 2025 21:28:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=42731125</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=42731125</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42731125</guid></item><item><title><![CDATA[New comment by madethisnow in "Google is making AI in Gmail and Docs free, but raising the price of Workspace"]]></title><description><![CDATA[
<p>It would be the delivery of the information, and its context within the whole of your other analyzed content.</p>
]]></description><pubDate>Thu, 16 Jan 2025 21:26:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=42731103</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=42731103</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42731103</guid></item><item><title><![CDATA[New comment by madethisnow in "Google is making AI in Gmail and Docs free, but raising the price of Workspace"]]></title><description><![CDATA[
<p>This is untenable. I could be AI. You could be AI. The whole idea of value is going to change when there is 99.99% noise from AI, and genuine human-created content will be hard, if not impossible, to distinguish.</p>
]]></description><pubDate>Thu, 16 Jan 2025 21:23:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=42731078</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=42731078</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42731078</guid></item><item><title><![CDATA[New comment by madethisnow in "Anthropic achieves ISO 42001 certification for responsible AI"]]></title><description><![CDATA[
<p>Change your tactics; use different framings of the question. I'm not saying these things should be difficult to answer, but they are. This is basically user error.</p>
]]></description><pubDate>Thu, 16 Jan 2025 21:04:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=42730890</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=42730890</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42730890</guid></item><item><title><![CDATA[New comment by madethisnow in "GPT-5 is behind schedule"]]></title><description><![CDATA[
<p>Completely agree. Have you seen this?<p><a href="https://sakana.ai/asal/" rel="nofollow">https://sakana.ai/asal/</a></p>
]]></description><pubDate>Thu, 26 Dec 2024 21:37:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=42518026</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=42518026</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42518026</guid></item><item><title><![CDATA[New comment by madethisnow in "GPT-5 is behind schedule"]]></title><description><![CDATA[
<p>An interesting paper on this:
"Automated Search for Artificial Life"
<a href="https://sakana.ai/asal/" rel="nofollow">https://sakana.ai/asal/</a></p>
]]></description><pubDate>Thu, 26 Dec 2024 21:36:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=42518020</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=42518020</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42518020</guid></item><item><title><![CDATA[New comment by madethisnow in "GPT-5 is behind schedule"]]></title><description><![CDATA[
<p>AGI is nebulous and gets more nebulous as time goes on. When we can answer for ourselves as humans what being conscious IS, then maybe we can ascribe it to another entity.</p>
]]></description><pubDate>Thu, 26 Dec 2024 21:30:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=42517968</link><dc:creator>madethisnow</dc:creator><comments>https://news.ycombinator.com/item?id=42517968</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42517968</guid></item></channel></rss>