<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: icyfox</title><link>https://news.ycombinator.com/user?id=icyfox</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Mon, 27 Apr 2026 11:34:15 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=icyfox" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by icyfox in "John Ternus to become Apple CEO"]]></title><description><![CDATA[
<p>So much of what Apple has lost over the last 10 years comes down to a lower bar for what counts as good enough.<p>You see this most obviously in software and marketing - the kinds of decisions where only a few people sign off at the end, and where "good enough" is whatever those few people decide it is. You see it less in hardware and procurement, where there's a rigorous review cycle and scrutiny at every level of the stack. Work there is more immediately measurable: benchmarks for performance, dollars for cost.<p>The "vibe" of software, or of a PDF [^1], is much harder to catch that way. There's no benchmark that flags it, and most conventional executives aren't drilling down to that level of detail to see it either.<p>You want distributed decision-making, of course. But that only works well if it's distributed to people who've cultivated their own taste and who will make good calls under pressure. I'm not sure how much of that gets fixed by a leadership change at the top. Taste isn't really something a CEO can decree into a 60,000-person org. But I've only heard good things about Ternus, so I'm optimistic. Fingers crossed for a bright new chapter.<p>[^1]: <a href="https://www.apple.com/promo/pdf/US_FY26_Earth_Day_Promo_TandCs.pdf" rel="nofollow">https://www.apple.com/promo/pdf/US_FY26_Earth_Day_Promo_Tand...</a></p>
]]></description><pubDate>Mon, 20 Apr 2026 21:05:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=47840728</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=47840728</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47840728</guid></item><item><title><![CDATA[New comment by icyfox in "Bring Back MiniDV with This Raspberry Pi FireWire Hat"]]></title><description><![CDATA[
<p>As far as I've seen, local OSS video understanding models just really aren't there yet. I briefly looked at facial recognition models, but a good amount of the signal was actually in the video's audio rather than the raw video frames. It depends on the accuracy you're looking for at the end of the day.</p>
]]></description><pubDate>Wed, 01 Apr 2026 19:27:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47605384</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=47605384</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47605384</guid></item><item><title><![CDATA[New comment by icyfox in "Bring Back MiniDV with This Raspberry Pi FireWire Hat"]]></title><description><![CDATA[
<p>Mostly wanted to fully automate the pipeline (auto-rewind tape, scan tape head position, etc) and iMovie is just using the same AVFoundation APIs under the hood that you can call manually. Took some notes here if helpful:
<a href="https://pierce.dev/notes/automating-our-home-video-imports" rel="nofollow">https://pierce.dev/notes/automating-our-home-video-imports</a><p>Wish vhsdecode was easier to use in practice! Such a cool idea but a bit too inconvenient to hack your own hardware like this...</p>
]]></description><pubDate>Wed, 01 Apr 2026 05:46:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=47597249</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=47597249</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47597249</guid></item><item><title><![CDATA[New comment by icyfox in "Bring Back MiniDV with This Raspberry Pi FireWire Hat"]]></title><description><![CDATA[
<p>Digitizing my old tapes was one of the most rewarding side projects I did over the last year. I managed to get in under the wire (pun intended) of FireWire compatibility on Sequoia and a long daisy chain of adapters. But it was clear the days of this approach were numbered. I'm optimistic these third-party accessories will become standardized into cheap, self-contained boxes where people can easily transfer their stuff before the camcorders degrade.<p>My pipeline went camera -> dvrescue -> ffmpeg -> clip chunking -> Gemini for auto-tagging of family members and the locations where things were shot.<p>We now have all our family's footage hosted on a NAS, with Jellyfin serving over Tailscale to my parents' MacBooks. I found the clip chunking in particular made the footage a lot more watchable than just importing the two-hour tapes, although ymmv.</p>
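For the ffmpeg leg of a pipeline like this, a minimal sketch of re-encoding a captured DV file to H.264 - the filenames, codec choices, and CRF value here are my own illustrative defaults, not the author's actual settings:

```python
def transcode_cmd(src, dst, crf=18):
    """Build an ffmpeg command that re-encodes a captured DV file to H.264.

    Pass the returned list to subprocess.run(). Settings are illustrative:
    -crf 18 is near-visually-lossless, -preset slow trades speed for size.
    """
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-crf", str(crf), "-preset", "slow",
        "-c:a", "aac", "-b:a", "192k",
        dst,
    ]

# "tape01.dv" / "tape01.mp4" are placeholder names.
print(" ".join(transcode_cmd("tape01.dv", "tape01.mp4")))
```

From there the resulting MP4s can feed whatever chunking and tagging steps come next.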
]]></description><pubDate>Wed, 01 Apr 2026 05:17:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47597095</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=47597095</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47597095</guid></item><item><title><![CDATA[New comment by icyfox in "Miami, your Waymo ride is ready"]]></title><description><![CDATA[
<p>Market used to be closed to all cars (2021-2025); only taxis and buses were allowed, but that changed recently:<p><a href="https://www.sfmta.com/blog/creating-better-market-street-car-free-enforcement-resume" rel="nofollow">https://www.sfmta.com/blog/creating-better-market-street-car...</a>
<a href="https://www.planetizen.com/news/2025/08/135849-sfs-market-street-will-no-longer-be-car-free" rel="nofollow">https://www.planetizen.com/news/2025/08/135849-sfs-market-st...</a><p>Wonder if that explains your observed preference. I'd bet Waymos will start utilizing the route again if it aligns with Google's mapping algo.</p>
]]></description><pubDate>Thu, 22 Jan 2026 23:22:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=46726413</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46726413</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46726413</guid></item><item><title><![CDATA[New comment by icyfox in "Miami, your Waymo ride is ready"]]></title><description><![CDATA[
<p>I'm not commenting on the externalities. For that I'd also cite economic impact, job loss, occasional emergency-services issues, etc. I'm talking about the experience when you yourself are taking a ride. I haven't met a single person who's said "this sucked - I'm going back to Uber".</p>
]]></description><pubDate>Thu, 22 Jan 2026 17:32:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=46722411</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46722411</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46722411</guid></item><item><title><![CDATA[New comment by icyfox in "Miami, your Waymo ride is ready"]]></title><description><![CDATA[
<p>Waymo is such an interesting case study. For most other ~AI deployments you have strong public reaction to the proliferation of slop, non-human failure modes, cost cutting at the expense of quality, etc. But I haven't met a single person who doesn't like the experience of Waymo. They ended up cracking the code on what I suspect people really want:<p>- consistent car quality<p>- safety of the ride (conservative driving, and no wariness of a stranger at the wheel)<p>- no randomly chatty driver<p>All of those feel like a breath of fresh air, especially when stacked up against the current state of Uber & Lyft rides. People really just want consistency. I don't actually think you needed AI to get there (I've had occasional rides in black cars that provided the same experience). Waymo was just right time, right place, right price.</p>
]]></description><pubDate>Thu, 22 Jan 2026 17:28:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=46722327</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46722327</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46722327</guid></item><item><title><![CDATA[A deep dive on agent sandboxes]]></title><description><![CDATA[
<p>Article URL: <a href="https://pierce.dev/notes/a-deep-dive-on-agent-sandboxes">https://pierce.dev/notes/a-deep-dive-on-agent-sandboxes</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=46595393">https://news.ycombinator.com/item?id=46595393</a></p>
<p>Points: 68</p>
<p># Comments: 20</p>
]]></description><pubDate>Mon, 12 Jan 2026 23:02:27 +0000</pubDate><link>https://pierce.dev/notes/a-deep-dive-on-agent-sandboxes</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46595393</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46595393</guid></item><item><title><![CDATA[New comment by icyfox in "Scientists discover oldest poison, on 60k-year-old arrows"]]></title><description><![CDATA[
<p>Fair point about the source, but the classification usually follows the mode of delivery, not the organism of origin.<p>Many plant-derived compounds function as venoms once introduced into the bloodstream (arrow coatings, darts, etc.), even if they’re also toxic when ingested. Curare is one example of a plant-based compound - lethal in blood, but largely harmless if eaten.<p>So while Boophone is absolutely a poison in the ecological sense, using it on arrows still fits the venom/toxin distinction better than a purely ingested poison. Otherwise why would people hunt with this if they got sick the second they ate the meat?</p>
]]></description><pubDate>Fri, 09 Jan 2026 22:54:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=46560578</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46560578</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46560578</guid></item><item><title><![CDATA[New comment by icyfox in "Scientists discover oldest poison, on 60k-year-old arrows"]]></title><description><![CDATA[
<p>At the risk of being overly pedantic, toxicologists would typically classify this as venom.<p>Venom is inert if ingested; it's only a problem if it gets into your bloodstream. So arrows that were laced with venom and thereby contaminated meat were actually perfectly safe to eat.<p>Poison is different. If ingested, inhaled, or absorbed, it will kill you.</p>
]]></description><pubDate>Fri, 09 Jan 2026 21:23:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=46559542</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46559542</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46559542</guid></item><item><title><![CDATA[Messages in bottles across the digital sea]]></title><description><![CDATA[
<p>Article URL: <a href="https://adrift.today/">https://adrift.today/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=46445277">https://news.ycombinator.com/item?id=46445277</a></p>
<p>Points: 4</p>
<p># Comments: 1</p>
]]></description><pubDate>Wed, 31 Dec 2025 15:59:09 +0000</pubDate><link>https://adrift.today/</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46445277</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46445277</guid></item><item><title><![CDATA[New comment by icyfox in "Show HN: Gemini Pro 3 imagines the HN front page 10 years from now"]]></title><description><![CDATA[
<p>Exactly half of these HN usernames actually exist. So either there are enough people on HN that follow common conventions for Gemini to guess from a more general distribution, or Gemini has memorized some of the more popular posters. The ones that are missing:<p>- aphyr_bot
 - bio_hacker
 - concerned_grandson
 - cyborg_sec
 - dang_fan
 - edge_compute
 - founder_jane
 - glasshole2
 - monad_lover
 - muskwatch
 - net_hacker
 - oldtimer99
 - persistence_is_key
 - physics_lover
 - policy_wonk
 - pure_coder
 - qemu_fan
 - retro_fix
 - skeptic_ai
 - stock_watcher<p>Huge opportunity for someone to become the actual dang fan.</p>
]]></description><pubDate>Tue, 09 Dec 2025 16:52:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=46207231</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46207231</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46207231</guid></item><item><title><![CDATA[New comment by icyfox in "Z-Image: Powerful and highly efficient image generation model with 6B parameters"]]></title><description><![CDATA[
<p>We talked about this model in some depth on the last Pretrained episode:
<a href="https://youtu.be/5weFerGhO84?si=Eh_92_9PPKyiTU_h&t=1743" rel="nofollow">https://youtu.be/5weFerGhO84?si=Eh_92_9PPKyiTU_h&t=1743</a><p>Some interesting takeaways imo:<p>- Uses existing model backbones for text encoding & semantic tokens (why reinvent the wheel if you don't need to?)<p>- Trains on a whole lot of synthetic captions of different lengths, ostensibly generated using some existing vision LLM<p>- Solid text generation support is facilitated by training on all OCR'd text from the ground truth image. This seems to match how Nano Banana Pro got so good as well; I've seen its thinking tokens sketch out exactly what text to say in the image before it renders.</p>
]]></description><pubDate>Sun, 07 Dec 2025 04:18:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=46179113</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46179113</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46179113</guid></item><item><title><![CDATA[New comment by icyfox in "Show HN: SerpApi MCP Server"]]></title><description><![CDATA[
<p>I used Serp via API many moons ago. The most interesting part of the company imo is their legal defense of different plans:<p><pre><code>  Production - $150
  15,000 searches / month
  U.S. Legal Shield
</code></pre>
ie. "Our U.S. Legal Shield protects your right to crawl and parse public search engine data under the First Amendment. We assume scraping and parsing liability for customers on most recurring plans unless your usage is illegal."<p>I imagine at least some portion of companies use them just for this liability shield.</p>
]]></description><pubDate>Fri, 05 Dec 2025 18:44:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=46165445</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46165445</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46165445</guid></item><item><title><![CDATA[New comment by icyfox in "Reverse engineering a $1B Legal AI tool exposed 100k+ confidential files"]]></title><description><![CDATA[
<p>I'm always a bit surprised how long it can take to triage and fix these pretty glaring security vulnerabilities. A disclosure on October 27, 2025 with an email confirmation only on November 4, 2025 seems like a long time to leave their entire client file system exposed. Sure, the actual bug ended up being (what I imagine to be) a <1hr fix plus the time for QA testing to make sure it didn't break anything.<p>Is the issue that people aren't checking their security@ email addresses? People are on holiday? These emails get so much spam that it's really hard to separate the noise from the legit signal? I'm genuinely curious.</p>
]]></description><pubDate>Wed, 03 Dec 2025 18:24:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=46138029</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46138029</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46138029</guid></item><item><title><![CDATA[New comment by icyfox in "Launch HN: Karumi (YC F25) – Personalized, agentic product demos"]]></title><description><![CDATA[
<p>Seems like the live demo is getting the HN hug of death - been waiting for ~5 minutes now. A bit ironic given their landing page: "Don’t make your prospects wait–ever again"<p>In its current iteration this demo might net discourage your future clients rather than encourage them.<p>I like the idea in general as an alternative to needing to book with a BDE. I'd always prefer to just self-serve on a new product; anything that gates my time (sales calls, popover walkthroughs, etc) is something I'd prefer to skip. But I know non-engineering customers really love these calls to see the power of a new platform. I wonder if they'll be as engaged during an AI walkthrough versus when there's a person on the other end of the phone.</p>
]]></description><pubDate>Mon, 24 Nov 2025 19:03:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=46037785</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=46037785</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46037785</guid></item><item><title><![CDATA[New comment by icyfox in "Gemini 3"]]></title><description><![CDATA[
<p>I'm not sure how concerned people should be about the trend lines. If you're building a product that already works well, you shouldn't feel the need to upgrade to a larger-parameter model. If your product doesn't work and the new architectures unlock performance that would give you a feasible business, even a 2x on input tokens shouldn't be the dealbreaker.<p>If we're paying more for a more petaflop-heavy model, it makes sense that costs would go up. What would really concern me is if companies start ratcheting prices up for models with the same level of performance. My hope is that raw hardware costs and OSS releases keep a lid on the margin pressure.</p>
]]></description><pubDate>Tue, 18 Nov 2025 17:23:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=45969247</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=45969247</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45969247</guid></item><item><title><![CDATA[New comment by icyfox in "Gemini 3"]]></title><description><![CDATA[
<p>Pretty happy the under-200k-token pricing is staying in the same ballpark as Gemini 2.5 Pro:<p>Input: $1.25 -> $2.00 (per 1M tokens)<p>Output: $10.00 -> $12.00<p>Squeezes a bit more margin out of app-layer companies, certainly, but there's a good chance that for tasks that really require a SOTA model it can be more than justified.</p>
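Back-of-envelope on what that increase means in practice, using the per-1M-token prices above. The daily workload (50M input / 5M output tokens) is a made-up example, not anyone's real numbers:

```python
# Per-1M-token prices quoted in the comment above.
GEMINI_25_PRO = {"input": 1.25, "output": 10.00}
GEMINI_3 = {"input": 2.00, "output": 12.00}

def daily_cost(prices, input_tokens, output_tokens):
    """Dollar cost for one day of usage at the given per-1M-token prices."""
    return (input_tokens / 1e6) * prices["input"] + (output_tokens / 1e6) * prices["output"]

# Hypothetical workload: 50M input tokens, 5M output tokens per day.
old = daily_cost(GEMINI_25_PRO, 50e6, 5e6)  # 50*1.25 + 5*10 = 112.50
new = daily_cost(GEMINI_3, 50e6, 5e6)       # 50*2.00 + 5*12 = 160.00
print(f"${old:.2f} -> ${new:.2f} (+{100 * (new - old) / old:.0f}%)")
# -> $112.50 -> $160.00 (+42%)
```

So for an input-heavy workload the blended increase lands around 40-45%, driven mostly by the input-price bump rather than the output one.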
]]></description><pubDate>Tue, 18 Nov 2025 16:28:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=45968392</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=45968392</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45968392</guid></item><item><title><![CDATA[New comment by icyfox in "Automating our home video imports"]]></title><description><![CDATA[
<p>AdaptiveDetector definitely did a better job, will append these new stats to the post:<p>precision 0.397, recall 0.727, F1 0.513, mean temporal error 0.307 s</p>
]]></description><pubDate>Wed, 12 Nov 2025 04:03:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=45896268</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=45896268</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45896268</guid></item><item><title><![CDATA[New comment by icyfox in "Automating our home video imports"]]></title><description><![CDATA[
<p>Nice to see you on here! I used the ContentDetector with a threshold of 27.0 and otherwise default parameters. I realize I could have done a grid sweep to really home in on a good param range, but because I had only one labeled input video I wanted something that would work well enough out of the box. I imagine this dataset is rather... heterogeneous.<p>If you happen to know a better a priori threshold I would be happy to re-run the analysis and update the chart.</p>
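For anyone wanting to reproduce the comparison, a minimal sketch using PySceneDetect's modern API with the threshold from the comment. The merge helper and the placeholder filename are my own additions, not part of the original analysis:

```python
def merge_short_scenes(scenes, min_len=2.0):
    """Fold scenes shorter than min_len seconds into their predecessor,
    so a noisy tape doesn't explode into hundreds of tiny clips.
    `scenes` is a list of (start_seconds, end_seconds) tuples."""
    merged = []
    for start, end in scenes:
        if merged and (end - start) < min_len:
            prev_start, _ = merged[-1]
            merged[-1] = (prev_start, end)  # absorb the short scene
        else:
            merged.append((start, end))
    return merged

def detect_scenes(path, threshold=27.0):
    """Run PySceneDetect's ContentDetector and return (start_s, end_s) tuples.
    Requires `pip install scenedetect[opencv]`; imported lazily so the
    pure helper above works without the library installed."""
    from scenedetect import detect, ContentDetector
    return [(s.get_seconds(), e.get_seconds())
            for s, e in detect(path, ContentDetector(threshold=threshold))]

# Usage ("tape.mp4" is a placeholder):
#   scenes = merge_short_scenes(detect_scenes("tape.mp4"))
```

Swapping `ContentDetector` for `AdaptiveDetector` (as discussed above) is a one-line change in `detect_scenes`.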
]]></description><pubDate>Wed, 12 Nov 2025 01:24:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=45895332</link><dc:creator>icyfox</dc:creator><comments>https://news.ycombinator.com/item?id=45895332</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45895332</guid></item></channel></rss>