<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: pbbakkum</title><link>https://news.ycombinator.com/user?id=pbbakkum</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Mon, 06 Apr 2026 04:38:28 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=pbbakkum" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by pbbakkum in "OpenAI charges by the minute, so speed up your audio"]]></title><description><![CDATA[
<p>This is great, thank you for sharing. I work on these APIs at OpenAI, and it's a surprise to me that this still works reasonably well at 2x or 3x speed. On the other hand, for phone channels we get 8 kHz audio that is upsampled to 24 kHz for the model, and it still works well. Note there's probably a measurable decrease in transcription accuracy that worsens as you deviate from 1x speed. Also, we really need to support bigger/longer file uploads :)</p>
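As a rough sketch of why the speed-up saves money: billing is per minute of audio submitted, so playing the file back faster shrinks the billed duration proportionally. The per-minute price below is a placeholder for illustration, not a real OpenAI rate.

```python
# Placeholder price, NOT a real OpenAI rate; substitute the current
# per-minute transcription price from the pricing page.
PRICE_PER_MINUTE = 0.006

def billed_cost(duration_seconds: float, speed_factor: float,
                price_per_minute: float = PRICE_PER_MINUTE) -> float:
    """Cost of transcribing audio after speeding it up by speed_factor.

    Billing is per minute of (sped-up) audio, so a 2x speed-up
    halves the billed duration.
    """
    billed_minutes = (duration_seconds / speed_factor) / 60
    return billed_minutes * price_per_minute
```

So a 60-minute recording sped up 2x bills as 30 minutes, at the cost of some transcription accuracy as noted above.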
]]></description><pubDate>Wed, 25 Jun 2025 21:55:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=44382153</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=44382153</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44382153</guid></item><item><title><![CDATA[New comment by pbbakkum in "Ask HN: Conversational AI to Learn a Language"]]></title><description><![CDATA[
<p>Heyo, I work on the realtime API. This is a very cool app!<p>For transcription I would recommend trying out the "gpt-4o-transcribe" or "gpt-4o-mini-transcribe" models, which will be more accurate than "whisper-1". On any model you can set the language parameter; see the docs here: <a href="https://platform.openai.com/docs/api-reference/realtime-client-events/session/update#realtime-client-events/session/update-session" rel="nofollow">https://platform.openai.com/docs/api-reference/realtime-clie...</a>. Transcription doesn't guarantee ordering relative to the rest of the response, but the idea is to optimize for conversational-feeling latency. Hope this is helpful.</p>
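A minimal sketch of what that session.update event might look like with the transcription model and language set. The field names follow my reading of the docs linked above and may not match the current API exactly; check the reference before relying on them.

```python
# Sketch of a Realtime API session.update payload that picks a
# transcription model and pins the input language. Field names are
# based on the linked docs and may differ from the live API.
session_update = {
    "type": "session.update",
    "session": {
        "input_audio_transcription": {
            "model": "gpt-4o-transcribe",
            "language": "en",  # ISO-639-1 code for the expected language
        },
    },
}
```

This payload would be sent as a JSON event over the realtime WebSocket connection after the session is established.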
]]></description><pubDate>Fri, 23 May 2025 00:30:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=44068597</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=44068597</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44068597</guid></item><item><title><![CDATA[butterfish.nvim – Neovim plugin for fluent coding with LLMs]]></title><description><![CDATA[
<p>Article URL: <a href="https://github.com/bakks/butterfish.nvim">https://github.com/bakks/butterfish.nvim</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=38972332">https://news.ycombinator.com/item?id=38972332</a></p>
<p>Points: 3</p>
<p># Comments: 0</p>
]]></description><pubDate>Fri, 12 Jan 2024 18:59:03 +0000</pubDate><link>https://github.com/bakks/butterfish.nvim</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=38972332</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38972332</guid></item><item><title><![CDATA[New comment by pbbakkum in "Show HN: Inshellisense – IDE style shell autocomplete"]]></title><description><![CDATA[
<p>If you're interested in GPT-powered shell autocomplete, check out <a href="https://butterfi.sh" rel="nofollow noreferrer">https://butterfi.sh</a><p>This also enables shell-aware LLM prompting!</p>
]]></description><pubDate>Mon, 06 Nov 2023 22:09:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=38169771</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=38169771</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38169771</guid></item><item><title><![CDATA[A single golden pattern: the story of Butterfish, an LLM-powered shell]]></title><description><![CDATA[
<p>Article URL: <a href="https://pbbakkum.com/blog/20230927/">https://pbbakkum.com/blog/20230927/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=37695059">https://news.ycombinator.com/item?id=37695059</a></p>
<p>Points: 2</p>
<p># Comments: 0</p>
]]></description><pubDate>Thu, 28 Sep 2023 20:06:05 +0000</pubDate><link>https://pbbakkum.com/blog/20230927/</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=37695059</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37695059</guid></item><item><title><![CDATA[New comment by pbbakkum in "Butterfish – A Shell with AI Superpowers"]]></title><description><![CDATA[
<p>Do you work from the command line? Butterfish is a project I wrote for myself to use AI prompting seamlessly, directly from the shell. I hope it's useful to others; give it a try and send feedback!</p>
]]></description><pubDate>Wed, 27 Sep 2023 22:31:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=37682315</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=37682315</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37682315</guid></item><item><title><![CDATA[Butterfish – A Shell with AI Superpowers]]></title><description><![CDATA[
<p>Article URL: <a href="https://butterfi.sh">https://butterfi.sh</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=37682314">https://news.ycombinator.com/item?id=37682314</a></p>
<p>Points: 2</p>
<p># Comments: 1</p>
]]></description><pubDate>Wed, 27 Sep 2023 22:31:19 +0000</pubDate><link>https://butterfi.sh</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=37682314</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37682314</guid></item><item><title><![CDATA[New comment by pbbakkum in "Amazon acquires Fig"]]></title><description><![CDATA[
<p>I've experimented with it. The reason I haven't yet added it is that I want deployment to be seamless, and it's not trivial to ship a binary that would (without extra fuss or configuration) efficiently support Metal and CUDA, plus download the models in a graceful way. This is of course possible, but still hard, and it's not clear if it's the right place to spend energy. I'm curious how you think about it: is your primary desire to work offline, to avoid sending data to OpenAI, or both?</p>
]]></description><pubDate>Mon, 28 Aug 2023 19:11:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=37299045</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=37299045</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37299045</guid></item><item><title><![CDATA[New comment by pbbakkum in "Amazon acquires Fig"]]></title><description><![CDATA[
<p>Plugging a project of mine: I've been working on a similar idea for the era of LLMs: <a href="https://butterfi.sh" rel="nofollow noreferrer">https://butterfi.sh</a>.<p>It's much more bare-bones than Fig but perhaps useful if you're looking for an alternative! Send me feedback!</p>
]]></description><pubDate>Mon, 28 Aug 2023 18:51:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=37298791</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=37298791</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37298791</guid></item><item><title><![CDATA[New comment by pbbakkum in "Btop++: Resource monitor for processor, memory, disks, network and processes"]]></title><description><![CDATA[
<p>If you're looking for a lighter-weight TUI for top-style system information, check out a recent project of mine here: <a href="https://github.com/bakks/poptop">https://github.com/bakks/poptop</a></p>
]]></description><pubDate>Sun, 25 Jun 2023 20:49:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=36472446</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=36472446</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36472446</guid></item><item><title><![CDATA[Code Spelunker: A Code Search Command Line Tool]]></title><description><![CDATA[
<p>Article URL: <a href="https://boyter.org/posts/code-spelunker-a-code-search-command-line-tool/">https://boyter.org/posts/code-spelunker-a-code-search-command-line-tool/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=36206688">https://news.ycombinator.com/item?id=36206688</a></p>
<p>Points: 1</p>
<p># Comments: 1</p>
]]></description><pubDate>Tue, 06 Jun 2023 00:59:02 +0000</pubDate><link>https://boyter.org/posts/code-spelunker-a-code-search-command-line-tool/</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=36206688</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36206688</guid></item><item><title><![CDATA[New comment by pbbakkum in "CLI tools for working with ChatGPT and other LLMs"]]></title><description><![CDATA[
<p>Really like the design of these tools so that you can easily pipe between them; this is a good way to make things composable. Also really cool to see all of the other CLI tools folks have posted here, lots that I wasn't aware of.<p>I've been experimenting with CLI/LLM tools and found my favorite approach is to make the LLM constantly accessible in my shell. The way I do this is to add a transparent wrapper around whatever your shell is (bash, zsh, etc.), send commands that start with capital letters to ChatGPT, and manage a history of local commands and GPT responses. This means you can ask questions about a command's output, autocomplete based on ChatGPT suggestions, etc.<p>You can see this approach here; I hope it proves useful to other folks!
<a href="https://github.com/bakks/butterfish">https://github.com/bakks/butterfish</a></p>
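The routing rule described above can be sketched in a few lines. This is a toy illustration of the capital-letter heuristic, not Butterfish's actual code; the handler names are hypothetical stand-ins.

```python
# Toy sketch of the wrapper's routing rule: lines starting with a
# capital letter go to the LLM, everything else runs as a shell
# command. "llm" / "shell" are hypothetical handler labels.
def route(line: str) -> str:
    """Decide which handler a line of shell input should go to."""
    stripped = line.strip()
    if stripped and stripped[0].isupper():
        return "llm"    # e.g. "Why did that command fail?"
    return "shell"      # e.g. "ls -la"
```

The nice property of this heuristic is that ordinary shell commands (almost always lowercase) pass through untouched, while natural-language questions read naturally with a leading capital.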
]]></description><pubDate>Fri, 19 May 2023 15:57:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=36003453</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=36003453</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36003453</guid></item><item><title><![CDATA[Butterfish – Let's do useful things with LLMs from the command line]]></title><description><![CDATA[
<p>Article URL: <a href="https://github.com/bakks/butterfish">https://github.com/bakks/butterfish</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=35002221">https://news.ycombinator.com/item?id=35002221</a></p>
<p>Points: 2</p>
<p># Comments: 0</p>
]]></description><pubDate>Thu, 02 Mar 2023 21:18:42 +0000</pubDate><link>https://github.com/bakks/butterfish</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=35002221</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35002221</guid></item><item><title><![CDATA[Poptop – A modern top command with dynamic charting]]></title><description><![CDATA[
<p>Article URL: <a href="https://github.com/bakks/poptop">https://github.com/bakks/poptop</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=33537712">https://news.ycombinator.com/item?id=33537712</a></p>
<p>Points: 4</p>
<p># Comments: 0</p>
]]></description><pubDate>Wed, 09 Nov 2022 20:57:20 +0000</pubDate><link>https://github.com/bakks/poptop</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=33537712</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=33537712</guid></item><item><title><![CDATA[New comment by pbbakkum in "Introducing Network Service Tiers"]]></title><description><![CDATA[
<p>A few notes here:<p>- An unmentioned alternative to this pricing is that GCP has a deal with Cloudflare that gives you a 50% discount on what is now called Premium pricing for traffic that egresses GCP through Cloudflare. This is cheaper for Google because GCP and Cloudflare have a peering arrangement. Of course, you also have to pay Cloudflare for bandwidth.<p>- This announcement is actually a small price cut compared to existing network egress prices for the 1-10 TiB/month and 150+ TiB/month buckets.<p>- The biggest advantage of using private networks is often client latency, since packets avoid points of congestion on the open internet. They don't really highlight this, instead showing a chart of throughput to a single client, which only matters for a subset of GCP customers. The throughput chart is also a little bit deceptive because of the y-axis they've chosen.<p>- Other important things to consider if you're optimizing a website for latency are CDN usage and where SSL negotiation takes place. For a single small HTTPS request, doing SSL negotiation at the network edge can make a pretty big latency difference.<p>- Interesting number: Google capex (excluding other Alphabet capex) in both 2015 and 2016 was around $10B, at least part of that going to the networking tech discussed in the post. I expect they're continuing to invest in this space.<p>- A common trend with GCP products is moving away from flat-rate pricing models to models which incentivize users in ways that reflect underlying costs. For example, BigQuery users are priced per-query, which is uncommon for analytical databases. It's possible that network pricing could reflect that in the future. For example, there is probably more slack network capacity at 3am than 8am.</p>
]]></description><pubDate>Wed, 23 Aug 2017 18:13:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=15083748</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=15083748</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=15083748</guid></item><item><title><![CDATA[Quizlet Tests Cloud Spanner – The Most Sophisticated Cloud Database]]></title><description><![CDATA[
<p>Article URL: <a href="https://quizlet.com/blog/quizlet-cloud-spanner">https://quizlet.com/blog/quizlet-cloud-spanner</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=13645015">https://news.ycombinator.com/item?id=13645015</a></p>
<p>Points: 91</p>
<p># Comments: 3</p>
]]></description><pubDate>Tue, 14 Feb 2017 17:06:03 +0000</pubDate><link>https://quizlet.com/blog/quizlet-cloud-spanner</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=13645015</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=13645015</guid></item><item><title><![CDATA[New comment by pbbakkum in "Why We Moved from Amazon Web Services to Google Cloud Platform"]]></title><description><![CDATA[
<p>We've been running it for about a year to power Quizlet; overall things have been good and we're happy. AWS and GCP are complicated enough that they're tough to compare holistically, but on most of the things we care about we find GCP to be equivalent to or better than AWS, sometimes significantly so. It really does have better networking and disk technology, and the pricing is much better. Here's the analysis we did: <a href="https://quizlet.com/blog/whats-the-best-cloud-probably-gcp" rel="nofollow">https://quizlet.com/blog/whats-the-best-cloud-probably-gcp</a>.<p>Rough patches are:<p>- Live migration is sometimes not seamless.<p>- Pub/Sub is missing some core features.</p>
]]></description><pubDate>Fri, 05 Aug 2016 20:22:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=12235491</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=12235491</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=12235491</guid></item><item><title><![CDATA[New comment by pbbakkum in "Ask HN: Which cloud provider to use in 2016? AWS or GCE?"]]></title><description><![CDATA[
<p>We thought a lot about this question to pick a cloud for Quizlet (~200 cloud machines) and ended up going with GCP. Here's a summary of our analysis; hope it's helpful: <a href="https://quizlet.com/blog/whats-the-best-cloud-probably-gcp" rel="nofollow">https://quizlet.com/blog/whats-the-best-cloud-probably-gcp</a></p>
]]></description><pubDate>Mon, 18 Apr 2016 01:07:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=11517136</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=11517136</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=11517136</guid></item><item><title><![CDATA[New comment by pbbakkum in "What's the Best Cloud? Probably GCP"]]></title><description><![CDATA[
<p>To be more explicit on this point: I think Azure is a good product, and the growth that Microsoft is seeing speaks for itself. However, for our use case, which is at least somewhat representative of a rapidly growing Linux-based startup, we didn't see any compelling advantage in using the Azure compute product (we may use one of their ML products in the future). Hence, it made sense to narrow the focus somewhat to what we thought were the best options. We eliminated Azure from our list because our preliminary analysis didn't uncover any big advantages to using it over the other clouds, we wanted a cloud focused more on Linux, and we don't currently use any products in the Microsoft ecosystem.<p>Yes, I know Azure runs Linux; let me unpack that point: we had previously run on a cloud that didn't treat Linux as its flagship OS. The effect we observed was that Linux was a second-class citizen in terms of features and performance. Perhaps it's unfair to project that onto Azure, but I think it's true that AWS and GCP think about Linux first, and Azure doesn't. Running a company on the cloud means relying on the compute product (GCE/EC2) as the foundation for your infrastructure, so we think this makes a difference.<p>It would be valuable for a lot of people to see more comprehensive stats across all clouds. I would love to see this personally, and I think it would help people make better decisions about cloud infrastructure.</p>
]]></description><pubDate>Thu, 10 Mar 2016 19:29:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=11261534</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=11261534</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=11261534</guid></item><item><title><![CDATA[New comment by pbbakkum in "What's the Best Cloud? Probably GCP"]]></title><description><![CDATA[
<p>Sure - this isn't really a dimension of comparison, just something that I found interesting / surprising. It seems like SDN is probably the future, and this is an illustration of how it's different.</p>
]]></description><pubDate>Thu, 10 Mar 2016 18:56:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=11261242</link><dc:creator>pbbakkum</dc:creator><comments>https://news.ycombinator.com/item?id=11261242</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=11261242</guid></item></channel></rss>