<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: sacrelege</title><link>https://news.ycombinator.com/user?id=sacrelege</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sat, 18 Apr 2026 10:51:46 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=sacrelege" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by sacrelege in "N-Day-Bench – Can LLMs find real vulnerabilities in real codebases?"]]></title><description><![CDATA[
<p>No, I don't, since there seems to be a silent degradation bug</p>
]]></description><pubDate>Tue, 14 Apr 2026 17:02:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=47768252</link><dc:creator>sacrelege</dc:creator><comments>https://news.ycombinator.com/item?id=47768252</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47768252</guid></item><item><title><![CDATA[New comment by sacrelege in "N-Day-Bench – Can LLMs find real vulnerabilities in real codebases?"]]></title><description><![CDATA[
<p>Ah thanks, I love coffee<p>At a high level, it's a mix of our own GPU capacity plus the ability to burst into external nodes when things get busy. Right now we're running a bunch of RTX PRO 6000s, which basically forces you into workstation/server boards since you need full x16 PCIe 5.0 lanes per card.<p>We operate a small private datacenter, which gives us some flexibility in how we deploy and scale hardware. On the software side, we're currently running LiteLLM as a load balancer in front of the inference servers, though I'm in the process of replacing that with a custom Rust-based implementation.<p>We've only been online since the beginning of this month, so I can't really say much about the economics yet, but we've had some really nice feedback from early customers so far. :)</p>
]]></description><pubDate>Tue, 14 Apr 2026 10:45:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=47763862</link><dc:creator>sacrelege</dc:creator><comments>https://news.ycombinator.com/item?id=47763862</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47763862</guid></item><item><title><![CDATA[New comment by sacrelege in "N-Day-Bench – Can LLMs find real vulnerabilities in real codebases?"]]></title><description><![CDATA[
<p>We typically observe throughput of around 100–110 toks/s, which drops to around 90–100 toks/s at larger context sizes.<p>While we don't guarantee a fixed toks/s rate, we scale by provisioning external GPU nodes during peak demand. These nodes run our own dockerized environment over a secure tunnel.<p>Our goal is to ensure a consistent baseline performance of at least 60–80 toks/s, even under high load.</p>
]]></description><pubDate>Tue, 14 Apr 2026 10:27:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=47763714</link><dc:creator>sacrelege</dc:creator><comments>https://news.ycombinator.com/item?id=47763714</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47763714</guid></item><item><title><![CDATA[New comment by sacrelege in "N-Day-Bench – Can LLMs find real vulnerabilities in real codebases?"]]></title><description><![CDATA[
<p>Thanks for putting N-Day-Bench together - really interesting benchmark design and results.<p>I'd love to see how the model we serve, Qwen3.5 122B A10B, stacks up against the rest on this benchmark. AI Router Switzerland (aiRouter.ch) can sponsor free API access for about a month if that would help with adding it to the evaluation set.</p>
]]></description><pubDate>Tue, 14 Apr 2026 02:54:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=47760698</link><dc:creator>sacrelege</dc:creator><comments>https://news.ycombinator.com/item?id=47760698</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47760698</guid></item><item><title><![CDATA[New comment by sacrelege in "Bitcoin is a disaster"]]></title><description><![CDATA[
<p>The posted thread is rather interesting, and I would love to see more critical discussions like this about the issues of Bitcoin. Unlike the often-seen circle jerk of Bitcoin maxis, this could actually lead to solutions, or to issues getting addressed before they become bigger.</p>
]]></description><pubDate>Fri, 01 Jan 2021 17:22:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=25604790</link><dc:creator>sacrelege</dc:creator><comments>https://news.ycombinator.com/item?id=25604790</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25604790</guid></item><item><title><![CDATA[New comment by sacrelege in "3D printed 3-sided cryptocurrency themed LED lamp"]]></title><description><![CDATA[
<p>I'm proud of my work, I hope all of you like it too :)</p>
]]></description><pubDate>Thu, 10 Dec 2020 23:56:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=25380463</link><dc:creator>sacrelege</dc:creator><comments>https://news.ycombinator.com/item?id=25380463</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25380463</guid></item><item><title><![CDATA[3D printed 3-sided cryptocurrency themed LED lamp]]></title><description><![CDATA[
<p>Article URL: <a href="https://bitlamp.ch">https://bitlamp.ch</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=25380462">https://news.ycombinator.com/item?id=25380462</a></p>
<p>Points: 1</p>
<p># Comments: 1</p>
]]></description><pubDate>Thu, 10 Dec 2020 23:56:05 +0000</pubDate><link>https://bitlamp.ch</link><dc:creator>sacrelege</dc:creator><comments>https://news.ycombinator.com/item?id=25380462</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25380462</guid></item></channel></rss>