<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: SmooL</title><link>https://news.ycombinator.com/user?id=SmooL</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 06 May 2026 21:39:55 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=SmooL" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by SmooL in "US and UK refuse to sign AI safety declaration at summit"]]></title><description><![CDATA[
<p>I can trivially just print any AI image I want, then take a "verified" picture of it with my camera. That seems like a pretty large failure point.</p>
]]></description><pubDate>Thu, 13 Feb 2025 04:10:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=43032572</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=43032572</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43032572</guid></item><item><title><![CDATA[New comment by SmooL in "Things we learned about LLMs in 2024"]]></title><description><![CDATA[
<p>The thinking goes:
- any job that can be done on a computer is immediately outsourced to AI, since the AI is smarter and cheaper than humans
- humanoid robots are built that are cheap to produce, using tech advances that the AI discovered
- any job that can be done by a human is immediately outsourced to a robot, since the robot is better/faster/stronger/cheaper than humans</p>
]]></description><pubDate>Tue, 31 Dec 2024 20:10:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=42561433</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=42561433</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42561433</guid></item><item><title><![CDATA[New comment by SmooL in "Tesla Optimus Bots Were Remotely Operated at Cybercab Event"]]></title><description><![CDATA[
<p>That OpenAI demo is available right now to subscribers; I have access to it</p>
]]></description><pubDate>Tue, 15 Oct 2024 02:20:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=41844304</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=41844304</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41844304</guid></item><item><title><![CDATA[New comment by SmooL in "AI-Driven Drone Surveillance Is Leading to Home Insurance Cancellations"]]></title><description><![CDATA[
<p>Ironically, it seems the article itself was written by AI</p>
]]></description><pubDate>Fri, 09 Aug 2024 16:56:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=41203516</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=41203516</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41203516</guid></item><item><title><![CDATA[New comment by SmooL in "As AI booms, land near nuclear power plants becomes hot real estate"]]></title><description><![CDATA[
<p>There's no good reason for BTC to be set up to use that much energy. There are viable, proven alternatives like proof of stake. I'd also be more forgiving if BTC only required a fixed amount of power, but no, the system is designed such that the power requirement tracks the cost of flops/watt - it's specifically designed so that the system will use more power as power becomes cheaper.</p>
]]></description><pubDate>Mon, 25 Mar 2024 23:29:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=39822474</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=39822474</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39822474</guid></item><item><title><![CDATA[New comment by SmooL in "Digital forgeries are hard"]]></title><description><![CDATA[
<p>This is/could be a legitimate use of blockchain, e.g. see <a href="https://gwern.net/timestamping" rel="nofollow">https://gwern.net/timestamping</a></p>
]]></description><pubDate>Thu, 14 Mar 2024 17:28:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=39706690</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=39706690</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39706690</guid></item><item><title><![CDATA[New comment by SmooL in "How far are we from intelligent visual deductive reasoning?"]]></title><description><![CDATA[
<p>Isn't that a bit of a cop-out though? They're just redefining the word "intelligence" to be something else (here, the ability to learn new things). That's fine and all, but that doesn't answer the question we actually care about, which is the absolute magnitude of the ability. It doesn't matter how "long" an entity took to learn, or how "efficient" they were in learning - at the end of the day, AlphaZero will crush Magnus.</p>
]]></description><pubDate>Mon, 11 Mar 2024 03:06:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=39664583</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=39664583</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39664583</guid></item><item><title><![CDATA[New comment by SmooL in "'We had to educate Oracle about our contract,' CIO says after Big Red audit"]]></title><description><![CDATA[
<p>It was a multi year effort mainly around 2015/16 but extended into 2018 IIRC</p>
]]></description><pubDate>Wed, 06 Mar 2024 18:12:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=39618966</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=39618966</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39618966</guid></item><item><title><![CDATA[New comment by SmooL in "Nanoplastics in water – surprisingly large amounts discovered and its not good"]]></title><description><![CDATA[
<p>Genuine question: are the health risks really that bad? Plastic has been around for decades - surely these micro/nano plastics have been around for decades as well? I can't help but feel that if there were obvious and large health effects, we'd have noticed by now.</p>
]]></description><pubDate>Wed, 14 Feb 2024 21:27:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=39375670</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=39375670</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39375670</guid></item><item><title><![CDATA[New comment by SmooL in "'Pataphysics"]]></title><description><![CDATA[
<p>This is either brilliantly absurd, or ridiculously stupid, depending on your POV</p>
]]></description><pubDate>Thu, 01 Feb 2024 02:30:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=39212189</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=39212189</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39212189</guid></item><item><title><![CDATA[New comment by SmooL in "Tesla braces for its first trial involving Autopilot fatality"]]></title><description><![CDATA[
<p>This is not a fair comparison - Tesla is comparing miles driven by their systems, which are inherently limited in scope to "easier" scenarios (only lane assist/only on highways/etc.), against the total US average, which includes _all_ gnarly road situations.</p>
]]></description><pubDate>Mon, 28 Aug 2023 14:18:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=37294394</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=37294394</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37294394</guid></item><item><title><![CDATA[New comment by SmooL in "Iron fuel shows its mettle"]]></title><description><![CDATA[
<p>Yes, but transmission lines can only go so far, and you still lack the ability to arbitrage over time instead of just spatially. E.g. from a solar power POV, it's night everywhere in a given region at the same time.</p>
]]></description><pubDate>Fri, 23 Jun 2023 19:20:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=36451080</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=36451080</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36451080</guid></item><item><title><![CDATA[New comment by SmooL in "Iron fuel shows its mettle"]]></title><description><![CDATA[
<p>> My gut feeling is that transmission lines would still be cheaper<p>Transmission lines are great for moving electricity, but only if there's demand for that electricity _right now_. Otherwise, you have to store it - which is a problem, because battery tech right now isn't great (or rather, it's not good enough for grid-scale requirements). This iron powder could be thought of as a "battery". It might be harder to move than a transmission line, but it's _stored_ energy and can be redeemed at a later time.</p>
]]></description><pubDate>Fri, 23 Jun 2023 16:21:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=36449017</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=36449017</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36449017</guid></item><item><title><![CDATA[New comment by SmooL in "Gopher Wrangling: Effective error handling in Go"]]></title><description><![CDATA[
<p>1. It's opinionated, so there's often only one way of doing things. Largely, the "one way" is a good way, so people appreciate the forced consistency
2. It's simple. It is very easy to read and write. It is hard to shoot yourself in the foot.
3. It's powerful. They have a few core abstractions that compose well (generic io, http stuff).
4. It's fast. It runs fast because it's compiled, and it compiles fast because it's simple.<p>Me personally: I appreciate the simplicity of it. It's a great language for working with in a team. I wish it were more functional and had better ways to handle errors, but the simplicity of it all was a breath of fresh air when using it in a work environment.</p>
]]></description><pubDate>Tue, 20 Jun 2023 04:29:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=36399727</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=36399727</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36399727</guid></item><item><title><![CDATA[New comment by SmooL in "Obsidian-Copilot: A Prototype Assistant for Writing and Thinking"]]></title><description><![CDATA[
<p>I've thought about doing this as well, but I haven't tried it yet. Are there any resources/blogs that cover strategies for how to best chunk &amp; embed arbitrary text?</p>
]]></description><pubDate>Tue, 13 Jun 2023 21:15:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=36317465</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=36317465</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36317465</guid></item><item><title><![CDATA[New comment by SmooL in "Debunking the Myth of Dollar Cost Averaging"]]></title><description><![CDATA[
<p>Doesn't this completely miss the point? DCA is about reducing volatility, not maximizing return. Your expected value is higher without DCA, but it's not about the expected value - it's about tightening the stddev of possible outcomes. "A bird in the hand is worth two in the bush".</p>
]]></description><pubDate>Sat, 10 Jun 2023 16:54:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=36272226</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=36272226</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36272226</guid></item><item><title><![CDATA[New comment by SmooL in "How much memory do you need to run 1M concurrent tasks?"]]></title><description><![CDATA[
<p>Yeah this isn't using goroutines at all? I don't see how this is a good comparison of goroutines vs coroutines.</p>
]]></description><pubDate>Sun, 21 May 2023 23:21:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=36025558</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=36025558</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36025558</guid></item><item><title><![CDATA[New comment by SmooL in "How much memory do you need to run 1M concurrent tasks?"]]></title><description><![CDATA[
<p>It's not _quite_ the same: you can't call async code from a sync context (hence the color issue), but I can always pass a "context.Background()" or such as a context value if I don't already have one.</p>
]]></description><pubDate>Sun, 21 May 2023 23:16:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=36025517</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=36025517</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36025517</guid></item><item><title><![CDATA[New comment by SmooL in "Avoiding hallucinations in LLM-powered applications"]]></title><description><![CDATA[
<p>The proposed solution is to feed relevant data from a database of "ground truth facts" into the query (I'm assuming using the usual method of similarity search leveraging embedding vectors).<p>This solution... doesn't prohibit hallucinations? As far as I can tell it only makes them less likely. The AI is still totally capable of hallucinating, it's just less likely to hallucinate an answer to _question X_ if the query includes data that has the answer.<p>I've been thinking that it might be useful if you could actually _remove_ all the stored facts that the LLM has inside of it. I believe that an LLM that didn't natively know a whole bunch of random trivia facts, didn't know basic math, didn't know much about anything _except_ what was put into the initial query would be valuable. The AI can't hallucinate anything if it doesn't know anything to hallucinate.<p>How you achieve this practically I have no clue. I'm not sure it's even possible to remove the knowledge that 1+1=2 without removing the knowledge of how to write a python script one could execute to figure it out.</p>
]]></description><pubDate>Wed, 03 May 2023 01:16:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=35796499</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=35796499</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35796499</guid></item><item><title><![CDATA[New comment by SmooL in "SpaceX moves Starship to launch site, and liftoff could be just days away"]]></title><description><![CDATA[
<p>This is slightly misleading - what matters isn't the peak G force experienced, but rather the integral of force over time. In a roller coaster you might experience 5Gs, but only for a moment. Astronauts experience 3Gs, but for up to 5 minutes continuously.</p>
]]></description><pubDate>Sun, 02 Apr 2023 00:06:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=35405585</link><dc:creator>SmooL</dc:creator><comments>https://news.ycombinator.com/item?id=35405585</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35405585</guid></item></channel></rss>