<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: etaioinshrdlu</title><link>https://news.ycombinator.com/user?id=etaioinshrdlu</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 16 Apr 2026 16:34:20 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=etaioinshrdlu" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by etaioinshrdlu in "Making WebAssembly a first-class language on the Web"]]></title><description><![CDATA[
<p>What do you think the incentive is to LLM-post on HN (or any site)?</p>
]]></description><pubDate>Wed, 11 Mar 2026 19:31:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47340105</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=47340105</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47340105</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Claude Code On-the-Go"]]></title><description><![CDATA[
<p>It makes sense - I built something very similar for my company over the last couple of weeks :)<p>I have a tweak that allows pasting images to Claude Code over SSH.<p>How it works:<p>PTY interception: it creates a pseudo-terminal (PTY) to wrap the SSH process, allowing it to sit as a "man-in-the-middle" between your keyboard and the remote shell.<p>Bracketed paste detection: it monitors stdin for "bracketed paste" sequences (the control codes terminals send when you Cmd+V or drag-and-drop a file).<p>The "hook": when a paste occurs, it pauses execution and scans the text for local macOS file paths.<p>Auto-sync: if a local path is found, it immediately syncs that file to the remote server (using the provided SSH key) in the background.<p>Transparent forwarding: once the sync is complete, it forwards the original text to the shell.<p>The result: you can drag and drop a file from your local Finder into a remote SSH session, and the file is automatically uploaded to the server before the path appears on the command line. It also works with copy-paste and screenshots.</p>
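The bracketed-paste detection step described above can be sketched roughly like this. This is a minimal illustration, not the actual tool: the function name, the path regex, and the chunk-at-a-time handling are my own assumptions.

```python
import re

# Terminals wrap pasted text in these control sequences when
# "bracketed paste" mode is enabled (DECSET 2004).
PASTE_START = "\x1b[200~"
PASTE_END = "\x1b[201~"

# Crude heuristic for local macOS file paths (illustrative only).
PATH_RE = re.compile(r"(/Users/[^\s'\"]+|~/[^\s'\"]+)")

def extract_pasted_paths(stdin_chunk: str) -> list[str]:
    """Return any local-looking file paths found inside bracketed pastes."""
    paths = []
    start = stdin_chunk.find(PASTE_START)
    while start != -1:
        end = stdin_chunk.find(PASTE_END, start)
        if end == -1:
            break  # paste not yet complete; wait for more input
        body = stdin_chunk[start + len(PASTE_START):end]
        paths.extend(PATH_RE.findall(body))
        start = stdin_chunk.find(PASTE_START, end)
    return paths
```

In the full flow, each extracted path would be synced to the remote host (e.g. via scp in the background) before the original chunk is forwarded to the PTY.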
]]></description><pubDate>Sun, 04 Jan 2026 22:10:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=46492831</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=46492831</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46492831</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "LLMs Are Not Fun"]]></title><description><![CDATA[
<p>Like others here, I disagree completely. I find them very fun, almost too fun, like intellectual crack. The craziest ideas are now within reach.</p>
]]></description><pubDate>Mon, 29 Dec 2025 19:24:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=46424378</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=46424378</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46424378</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "A 1961 Relay Computer Running in the Browser"]]></title><description><![CDATA[
<p>"Before microchips existed, computers were built with mechanical relays." Should probably say something about vacuum tubes as well!</p>
]]></description><pubDate>Mon, 17 Nov 2025 02:55:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=45950472</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=45950472</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45950472</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Show HN: qqqa – A fast, stateless LLM-powered assistant for your shell"]]></title><description><![CDATA[
<p>Cool! Right now it's just IP address rate limiting and the costs have not mattered too much, but yes long term I am not sure what we'll do...</p>
]]></description><pubDate>Thu, 06 Nov 2025 22:31:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=45841317</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=45841317</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45841317</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Show HN: qqqa – A fast, stateless LLM-powered assistant for your shell"]]></title><description><![CDATA[
<p>I can suggest our service (previously discussed here: <a href="https://news.ycombinator.com/item?id=44849129">https://news.ycombinator.com/item?id=44849129</a>), which might be helpful -- if you want a zero-setup backend to try qqqa, ch.at might be a useful option. We built ch.at, a single-binary, OpenAI-compatible chat service with no accounts, no logs, and no tracking. You can point qqqa at our API endpoint and it should “just work”:<p>OpenAI-compatible endpoint: <a href="https://ch.at/v1/chat/completions" rel="nofollow">https://ch.at/v1/chat/completions</a> (supports streamed responses)<p>Also accessible via HTTP, SSH, and DNS for quick tests: curl ch.at/?q=… , ssh ch.at
Privacy note: we don’t log anything, but upstream LLM providers might...</p>
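For anyone wiring a client to the endpoint above, building the OpenAI-compatible request looks roughly like this. The endpoint URL comes from the comment; the model name is an illustrative placeholder, since ch.at chooses its own upstream model.

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "gpt-4o") -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completions request for ch.at.

    No API key header is sent: the service uses no accounts.
    """
    payload = {
        "model": model,  # placeholder; ch.at picks its own upstream model
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        "https://ch.at/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Sending it is then just `urllib.request.urlopen(build_chat_request("hello"))`, or the same shape from any OpenAI-compatible client pointed at that base URL.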
]]></description><pubDate>Thu, 06 Nov 2025 21:02:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=45840380</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=45840380</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45840380</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "I have released a 69.0MB version of Windows 7 x86"]]></title><description><![CDATA[
<p>This is impressive and it also kind of demonstrates how bloated Windows really is. You can fit a ton more functionality into even 1MB.</p>
]]></description><pubDate>Thu, 30 Oct 2025 18:50:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=45763770</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=45763770</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45763770</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Pyrex catalog from from 1938 with hand-drawn lab glassware [pdf]"]]></title><description><![CDATA[
<p>A theory I've been developing for a while: as technology makes things easier, the perceived average quality goes down over time. I've yet to fully understand the factors that drive this trend, but I feel certain AI will put it in overdrive! I'm not a luddite or a hater, actually - but the trend is pretty apparent...</p>
]]></description><pubDate>Mon, 27 Oct 2025 22:34:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=45727128</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=45727128</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45727128</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Show HN: Bottlefire – Build single-executable microVMs from Docker images"]]></title><description><![CDATA[
<p>Interesting - I somehow didn't realize that KVM didn't require root access.<p>Also, I wonder if this could be adapted to use Apple's Hypervisor.framework. That one also doesn't require root and ought to be able to spin up and down very quickly.</p>
]]></description><pubDate>Wed, 10 Sep 2025 05:35:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=45193717</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=45193717</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45193717</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Ch.at – A lightweight LLM chat service accessible through HTTP, SSH, DNS and API"]]></title><description><![CDATA[
<p>One interesting thing I forgot to mention: the server streams HTML back to the client, and almost all browsers, going back to the very beginning, will render it as it streams.<p>However, we don't parse markdown on the server and convert it to HTML. Rather, we just prompt the model to emit HTML directly.</p>
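The stream-as-HTML idea can be sketched as a generator that emits the page skeleton first and then forwards model tokens verbatim. This is an assumption-laden illustration, not the actual ch.at code: the token iterator and tag structure are invented for the example.

```python
from typing import Iterable, Iterator

def stream_html(tokens: Iterable[str]) -> Iterator[str]:
    """Yield an HTML page incrementally: header first, then each model
    token as it arrives, then the closing tags. Browsers render each
    chunk as soon as it is flushed, so no client-side JS is needed.

    Assumes the model was prompted to emit HTML directly, as described
    above; there is no markdown-to-HTML conversion step here.
    """
    yield "<!DOCTYPE html><html><body><div>"
    for token in tokens:
        yield token  # written to the socket chunk by chunk
    yield "</div></body></html>"
```

The design choice is that the "rendering pipeline" is the browser's own progressive HTML parser, which has handled partial documents since the earliest web.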
]]></description><pubDate>Sun, 10 Aug 2025 06:17:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=44853144</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44853144</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44853144</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Ch.at – A lightweight LLM chat service accessible through HTTP, SSH, DNS and API"]]></title><description><![CDATA[
<p>Author here, was a bit surprised to see this here. I thought there needed to be a good zero-JS LLM site for computer people, and we thought it would be fun to add various other protocols. The short domain hack of "ch.at" was exciting because it felt like the natural domain for such a service.<p>It has not been expensive to operate so far. If that ever changes, we can think about rate limiting it.<p>We used GPT-4o because it seemed like a decent general default model. We're considering adding an OpenRouter interface to a smorgasbord of additional LLMs.<p>One day, on a plane, before paying for WiFi, I noticed that DNS queries were still allowed and thought it would be nice to chat with an LLM over DNS.<p>We are not logging anything, but OpenAI must be...</p>
]]></description><pubDate>Sun, 10 Aug 2025 01:44:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=44852102</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44852102</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44852102</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Ch.at – A lightweight LLM chat service accessible through HTTP, SSH, DNS and API"]]></title><description><![CDATA[
<p>These can definitely be added</p>
]]></description><pubDate>Sun, 10 Aug 2025 01:38:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=44852078</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44852078</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44852078</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Ch.at – A lightweight LLM chat service accessible through HTTP, SSH, DNS and API"]]></title><description><![CDATA[
<p>Do we know each other :0 :)</p>
]]></description><pubDate>Sun, 10 Aug 2025 01:37:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=44852074</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44852074</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44852074</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Many lung cancers are now in nonsmokers"]]></title><description><![CDATA[
<p>Much of LA has some of the worst air in the country, so I think it selects for people who don't care about being poisoned by their environment.</p>
]]></description><pubDate>Wed, 23 Jul 2025 00:39:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=44654568</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44654568</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44654568</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Shutting Down Clear Linux OS"]]></title><description><![CDATA[
<p>Sounds like bloat removal and minimalism.</p>
]]></description><pubDate>Sat, 19 Jul 2025 00:58:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=44611550</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44611550</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44611550</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Shutting Down Clear Linux OS"]]></title><description><![CDATA[
<p>What optimizations did they do that had the biggest effect? Can they be brought into the mainline Linux kernel and distros?</p>
]]></description><pubDate>Sat, 19 Jul 2025 00:29:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=44611376</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44611376</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44611376</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "LLM Inevitabilism"]]></title><description><![CDATA[
<p>It would be profitable even if we self-hosted the LLMs, which we've done. The only thing subsidized is the training costs. So maybe people will one day stop training AI models.</p>
]]></description><pubDate>Tue, 15 Jul 2025 16:52:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=44573233</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44573233</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44573233</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "LLM Inevitabilism"]]></title><description><![CDATA[
<p>I've operated a top-20-ish LLM service for over 2 years, very comfortably profitable with ads. As for the raw costs: you can measure the cost of getting an LLM answer from, say, OpenAI, and the equivalent search query from Bing/Google/Exa will cost over 10x more...</p>
]]></description><pubDate>Tue, 15 Jul 2025 06:33:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=44568437</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44568437</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44568437</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "LLM Inevitabilism"]]></title><description><![CDATA[
<p>This is wrong: LLMs have been cheap enough to run profitably on ads alone (search-style or banner-ad-style) for over 2 years now. And they are getting cheaper over time at the same quality.<p>It is even cheaper to serve an LLM answer than to call a web search API!<p>There is zero chance all the users evaporate unless something much better comes along, or the tech is banned, etc...</p>
]]></description><pubDate>Tue, 15 Jul 2025 05:33:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=44568113</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44568113</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44568113</guid></item><item><title><![CDATA[New comment by etaioinshrdlu in "Anthropic signs a $200M deal with the Department of Defense"]]></title><description><![CDATA[
<p>LLMs are a key enabling technology for extracting real insights from the enormous amount of surveillance data the USA captures. I think it's no overstatement to say we are entering a new era here!<p>Previously, the data may have been collected, but there was so much of it that, on average, effectively no one was "looking" at it. Now it can all be looked at.</p>
]]></description><pubDate>Mon, 14 Jul 2025 20:53:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=44565133</link><dc:creator>etaioinshrdlu</dc:creator><comments>https://news.ycombinator.com/item?id=44565133</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44565133</guid></item></channel></rss>