<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: coinexpert</title><link>https://news.ycombinator.com/user?id=coinexpert</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sat, 25 Apr 2026 14:19:34 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=coinexpert" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by coinexpert in "Can I run AI locally?"]]></title><description><![CDATA[
<p>Most of the friction around local AI comes from juggling a different runtime for each provider. We built Milady specifically to solve that — one unified runtime that works with Ollama, OpenAI, Anthropic, and others, so you can switch providers without rewriting a line of code. Fully offline-capable, zero telemetry. Happy to answer questions if anyone's curious: milady.ai</p>
]]></description><pubDate>Wed, 18 Mar 2026 15:02:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=47426645</link><dc:creator>coinexpert</dc:creator><comments>https://news.ycombinator.com/item?id=47426645</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47426645</guid></item></channel></rss>