<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: luew</title><link>https://news.ycombinator.com/user?id=luew</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sun, 26 Apr 2026 09:51:15 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=luew" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[Open-Source Inference is growing 10% week over week this year]]></title><description><![CDATA[
<p>We're a small inference provider that launched publicly two weeks ago, and we've seen a crazy surge in demand.<p>I reached out to a number of other inference providers, such as fireworks, togetherAI, simpliAI, etc., and asked about their growth, what they're seeing in this space, and what they predict for the rest of the year.<p>A higher-up at fireworks told me that since January the space as a whole has averaged 10% week-over-week growth -- that kind of explosive growth feels unreal. I honestly expect a price squeeze later this year; no one can keep up.<p>I think we're already starting to see this in GPU prices -- H100s have gone from $1.30 on demand to around $1.90, and H200 nodes are all sold out with 3+ month waitlists. B300s have nearly a year-long waitlist from the quotes I got. Insane market.</p>
<hr>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47906463">https://news.ycombinator.com/item?id=47906463</a></p>
<p>Points: 2</p>
<p># Comments: 1</p>
]]></description><pubDate>Sun, 26 Apr 2026 01:33:31 +0000</pubDate><link>https://news.ycombinator.com/item?id=47906463</link><dc:creator>luew</dc:creator><comments>https://news.ycombinator.com/item?id=47906463</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47906463</guid></item><item><title><![CDATA[New comment by luew in "DeepSeek v4"]]></title><description><![CDATA[
<p>We will be hosting it soon at getlilac.com!</p>
]]></description><pubDate>Fri, 24 Apr 2026 05:35:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=47885998</link><dc:creator>luew</dc:creator><comments>https://news.ycombinator.com/item?id=47885998</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47885998</guid></item><item><title><![CDATA[New comment by luew in "GPU's Are Being Wasted"]]></title><description><![CDATA[
<p>Hey, Lucas from Lilac here.<p>I wrote a short piece on GPU waste. We've spent the past 9 months talking with enterprises and neoclouds to get a good understanding of utilization rates (and they are shockingly low).<p>Most clusters run with less than half of their GPUs utilized, which translates to billions of dollars in wasted compute. Our goal at Lilac is to help solve this problem. Let me know what you think!</p>
]]></description><pubDate>Wed, 25 Mar 2026 23:45:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=47524875</link><dc:creator>luew</dc:creator><comments>https://news.ycombinator.com/item?id=47524875</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47524875</guid></item><item><title><![CDATA[GPU's Are Being Wasted]]></title><description><![CDATA[
<p>Article URL: <a href="https://getlilac.com/blog/gpu-scarcity-paradox">https://getlilac.com/blog/gpu-scarcity-paradox</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47524874">https://news.ycombinator.com/item?id=47524874</a></p>
<p>Points: 5</p>
<p># Comments: 2</p>
]]></description><pubDate>Wed, 25 Mar 2026 23:45:34 +0000</pubDate><link>https://getlilac.com/blog/gpu-scarcity-paradox</link><dc:creator>luew</dc:creator><comments>https://news.ycombinator.com/item?id=47524874</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47524874</guid></item></channel></rss>