<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: scriptsmith</title><link>https://news.ycombinator.com/user?id=scriptsmith</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 06 May 2026 13:56:20 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=scriptsmith" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>Depends on where you get it. By default the flags will be enabled, but some packagers may choose to disable them. I haven't seen a major distro release Chromium 148 yet.<p>Oddly, though, Chromium won't be able to actually use the model even though it can download it, because the inference engine is a closed-source blob.<p><a href="https://adsm.dev/posts/prompt-api/#which-browsers-support-the-api" rel="nofollow">https://adsm.dev/posts/prompt-api/#which-browsers-support-th...</a></p>
]]></description><pubDate>Wed, 06 May 2026 08:06:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=48033582</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48033582</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48033582</guid></item><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>I wrote a more detailed blog post here:<p><a href="https://news.ycombinator.com/item?id=48028662">https://news.ycombinator.com/item?id=48028662</a></p>
]]></description><pubDate>Wed, 06 May 2026 04:23:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=48032145</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48032145</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48032145</guid></item><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>Chromium doesn't support this API because it needs a binary blob to run the inference, although in theory it may still be configured to download the weights:<p><a href="https://adsm.dev/posts/prompt-api/#which-browsers-support-the-api" rel="nofollow">https://adsm.dev/posts/prompt-api/#which-browsers-support-th...</a></p>
]]></description><pubDate>Wed, 06 May 2026 00:13:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=48030524</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48030524</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48030524</guid></item><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>It can only be called after the user has interacted with the page, but there's no dialog from the browser.<p><a href="https://developer.chrome.com/docs/ai/get-started#user-activation" rel="nofollow">https://developer.chrome.com/docs/ai/get-started#user-activa...</a></p>
]]></description><pubDate>Tue, 05 May 2026 23:46:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=48030317</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48030317</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48030317</guid></item><item><title><![CDATA[New comment by scriptsmith in "The Prompt API is now on by default in Chrome"]]></title><description><![CDATA[
<p>Author here. After trying out the Prompt API over the last week, I wrote up some details on the Chromium internals and how to use the API, and made some toy demos.<p>It's a 4 GB model used for on-device inference.</p>
]]></description><pubDate>Tue, 05 May 2026 21:58:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=48029212</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48029212</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48029212</guid></item><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>In my understanding, yes. I wrote a blog post about some of the internals here: <a href="https://news.ycombinator.com/item?id=48028662">https://news.ycombinator.com/item?id=48028662</a></p>
]]></description><pubDate>Tue, 05 May 2026 21:15:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=48028690</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48028690</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48028690</guid></item><item><title><![CDATA[The Prompt API is now on by default in Chrome]]></title><description><![CDATA[
<p>Article URL: <a href="https://adsm.dev/posts/prompt-api/">https://adsm.dev/posts/prompt-api/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=48028662">https://news.ycombinator.com/item?id=48028662</a></p>
<p>Points: 2</p>
<p># Comments: 2</p>
]]></description><pubDate>Tue, 05 May 2026 21:12:57 +0000</pubDate><link>https://adsm.dev/posts/prompt-api/</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48028662</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48028662</guid></item><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>Google has been trialling the Prompt API in Chrome for over a year, since before Gemma 4 existed. But they are indicating they'll move to Gemma 4: <a href="https://groups.google.com/a/chromium.org/g/blink-dev/c/iR6R7-nQeHI/m/AM0yj_xTBgAJ" rel="nofollow">https://groups.google.com/a/chromium.org/g/blink-dev/c/iR6R7...</a></p>
]]></description><pubDate>Tue, 05 May 2026 11:32:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=48021035</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48021035</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48021035</guid></item><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>It's based on Gemma 3n, and it's not the best.<p>I find it works fine for simple classification, translation, and interpretation of images & audio. It can write longer prose, but it's pretty bad at it.<p>It can also constrain its output to a JSON schema or regexp, for anything you might want to do with structured data.</p>
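For what it's worth, a minimal sketch of that structured-output path, assuming the <i>responseConstraint</i> option described in the Chrome Prompt API docs (the schema and prompt text are illustrative), feature-detected so it's a no-op in browsers without the API:

```javascript
// Constrain Prompt API output to a JSON Schema via responseConstraint
// (per the Chrome Prompt API docs). Schema and prompt are examples.
const sentimentSchema = {
  type: "object",
  properties: {
    sentiment: { type: "string", enum: ["positive", "negative", "neutral"] },
  },
  required: ["sentiment"],
};

async function classify(text) {
  // `LanguageModel` only exists in supporting Chromium builds.
  if (typeof LanguageModel === "undefined") return null;
  const session = await LanguageModel.create();
  const raw = await session.prompt(`Classify the sentiment of: ${text}`, {
    responseConstraint: sentimentSchema,
  });
  return JSON.parse(raw);
}
```

The constraint forces the model's raw output to parse as an object matching the schema, which sidesteps the usual "please reply with only JSON" prompt gymnastics.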
]]></description><pubDate>Tue, 05 May 2026 09:25:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=48020033</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48020033</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48020033</guid></item><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>Those flags already exist, but will default to enabled in 148.<p>That other flag is for using a different open-source inference engine in place of the (from what I can tell) closed-source one that's used by default.</p>
]]></description><pubDate>Tue, 05 May 2026 08:46:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=48019702</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48019702</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48019702</guid></item><item><title><![CDATA[New comment by scriptsmith in "Google Chrome silently installs a 4 GB AI model on your device without consent"]]></title><description><![CDATA[
<p>If Chrome has the <i>#optimization-guide-on-device-model</i> and <i>#prompt-api-for-gemini-nano</i> flags enabled, either because it's part of some Origin Trial / early stable rollout or similar, then web pages will have access to the new Prompt API, which allows any webpage to initiate the (one-time) download of the ~2.7 GiB CPU or ~4.0 GiB GPU model using LanguageModel.create().<p><a href="https://developer.chrome.com/docs/ai/prompt-api" rel="nofollow">https://developer.chrome.com/docs/ai/prompt-api</a><p>When Chrome 148 releases tomorrow, this will be the default behaviour on desktop.<p>Before downloading, it should check for 22 GiB of free disk space on the volume holding your Chrome data dir, and at least double the model size in free space in your tmp dir.</p>
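A minimal sketch of how a page would trigger that, using the <i>LanguageModel</i> global from the Chrome docs (availability / create / prompt); guarded so it degrades cleanly where the API doesn't exist, and the prompt text is just an example:

```javascript
// Sketch of Prompt API usage. `LanguageModel` only exists in
// supporting Chromium builds, so feature-detect first.
async function tryPromptApi() {
  if (typeof LanguageModel === "undefined") {
    return "Prompt API not available in this environment";
  }
  // One of "unavailable" | "downloadable" | "downloading" | "available".
  const availability = await LanguageModel.availability();
  if (availability === "unavailable") return "unavailable";
  // create() is what can kick off the one-time multi-GiB model download
  // (it requires a prior user gesture on the page).
  const session = await LanguageModel.create();
  return session.prompt("Reply with the single word: ok");
}

tryPromptApi().then(console.log);
```

Note that create() resolving may mean gigabytes have just landed on disk, which is the crux of the headline complaint.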
]]></description><pubDate>Tue, 05 May 2026 08:26:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=48019542</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=48019542</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48019542</guid></item><item><title><![CDATA[New comment by scriptsmith in "No, it doesn't cost Anthropic $5k per Claude Code user"]]></title><description><![CDATA[
<p>Yes, you could turn it around to say that using Anthropic models in Cursor, Copilot, Junie, etc. is 'subsidising' Claude Code users.</p>
]]></description><pubDate>Tue, 10 Mar 2026 05:13:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=47319297</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=47319297</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47319297</guid></item><item><title><![CDATA[New comment by scriptsmith in "Ki Editor - an editor that operates on the AST"]]></title><description><![CDATA[
<p>The "First-class syntactic selection" reminds me of my most used shortcut(s) in JetBrains IDEs: Expand / Shrink Selection.<p><pre><code>  Ctrl + W
  Ctrl + Shift + W
</code></pre>
<a href="https://www.jetbrains.com/help/idea/working-with-source-code.html#editor_code_selection" rel="nofollow">https://www.jetbrains.com/help/idea/working-with-source-code...</a><p>It really changed my perspective on interacting with the 'text' of a file.<p>VS Code, Zed, etc. have similar operations, but in my experience they expand and shrink too coarsely.</p>
]]></description><pubDate>Sat, 07 Mar 2026 12:14:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47286943</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=47286943</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47286943</guid></item><item><title><![CDATA[New comment by scriptsmith in "Quantized Llama models with increased speed and a reduced memory footprint"]]></title><description><![CDATA[
<p>Yes, I've used the v3.2 3B-Instruct model in a Slack app. Specifically using vLLM, with a template: <a href="https://github.com/vllm-project/vllm/blob/main/examples/tool_chat_template_llama3.2_json.jinja">https://github.com/vllm-project/vllm/blob/main/examples/tool...</a><p>Works as expected if you provide a few system prompts with context.</p>
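Roughly, the serving setup looks like this (a sketch only: flag names are from vLLM's OpenAI-compatible server docs for Llama 3 tool calling, and the model ID and template path should be adjusted to your setup):

```shell
# Serve Llama 3.2 3B-Instruct with the JSON tool-call chat template.
# --chat-template, --enable-auto-tool-choice and --tool-call-parser are
# vLLM OpenAI-server flags; paths/IDs here are illustrative.
vllm serve meta-llama/Llama-3.2-3B-Instruct \
  --enable-auto-tool-choice \
  --tool-call-parser llama3_json \
  --chat-template examples/tool_chat_template_llama3.2_json.jinja
```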
]]></description><pubDate>Thu, 24 Oct 2024 22:05:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=41940328</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=41940328</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41940328</guid></item><item><title><![CDATA[New comment by scriptsmith in "Zero regrets: Firefox power user kept 7,500 tabs open for two years"]]></title><description><![CDATA[
<p>To keep on top of tabs in Firefox, I use 'Auto Tab Discard' [1] to discard tabs after a certain amount of inactivity. Then when I need to clean up my list of tabs, I click on any discarded tabs I want to keep, and then use my extension 'Close Discarded Tabs' [2] to clear the rest.<p>[1] <a href="https://addons.mozilla.org/en-US/firefox/addon/auto-tab-discard/" rel="nofollow">https://addons.mozilla.org/en-US/firefox/addon/auto-tab-disc...</a><p>[2] <a href="https://addons.mozilla.org/en-US/firefox/addon/close-discarded-tabs/" rel="nofollow">https://addons.mozilla.org/en-US/firefox/addon/close-discard...</a></p>
]]></description><pubDate>Mon, 05 Aug 2024 05:00:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=41158293</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=41158293</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41158293</guid></item><item><title><![CDATA[New comment by scriptsmith in "CrowdStrike Update: Windows Bluescreen and Boot Loops"]]></title><description><![CDATA[
<p>Someone posted this in the thread, but I also can't log in to verify<p>> Summary<p>> CrowdStrike is aware of reports of crashes on Windows hosts related to the Falcon Sensor.<p>> Details<p>> Symptoms include hosts experiencing a bugcheck\blue screen error related to the Falcon Sensor.<p>> Current Action<p>> Our Engineering teams are actively working to resolve this issue and there is no need to open a support ticket.<p>> Status updates will be posted below as we have more information to share, including when the issue is resolved.<p>> Latest Updates<p>> 2024-07-19 05:30 AM UTC | Tech Alert Published.<p>> Support<p>> Find answers and contact Support with our Support Portal</p>
]]></description><pubDate>Fri, 19 Jul 2024 05:46:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=41002264</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=41002264</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41002264</guid></item><item><title><![CDATA[New comment by scriptsmith in "CrowdStrike Update: Windows Bluescreen and Boot Loops"]]></title><description><![CDATA[
<p>It's crowdstrike: <a href="https://www.reddit.com/r/crowdstrike/comments/1e6vmkf/bsod_error_in_latest_crowdstrike_update/" rel="nofollow">https://www.reddit.com/r/crowdstrike/comments/1e6vmkf/bsod_e...</a><p>> 7/18/24 10:20PT - Hello everyone - We have widespread reports of BSODs on windows hosts, occurring on multiple sensor versions. Investigating cause. TA will be published shortly. Pinned thread.<p>> SCOPE: EU-1, US-1, US-2 and US-GOV-1<p>> Edit 10:36PT - TA posted: <a href="https://supportportal.crowdstrike.com/s/article/Tech-Alert-Windows-crashes-related-to-Falcon-Sensor-2024-07-19" rel="nofollow">https://supportportal.crowdstrike.com/s/article/Tech-Alert-W...</a><p>>  Edit 11:27 PM PT:<p>> Workaround Steps:<p>>    Boot Windows into Safe Mode or the Windows Recovery Environment<p>>    Navigate to the C:\Windows\System32\drivers\CrowdStrike directory<p>>    Locate the file matching “C-00000291*.sys”, and delete it.<p>>    Boot the host normally.</p>
]]></description><pubDate>Fri, 19 Jul 2024 05:27:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=41002199</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=41002199</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41002199</guid></item><item><title><![CDATA[New comment by scriptsmith in "Training for one trillion parameter model backed by Intel and US govt has begun"]]></title><description><![CDATA[
<p>Seems like he actually disagrees here:<p><i>If you train a bigger model on more text, we have a lot of confidence that the next-word prediction task will improve. So algorithmic progress is not necessary, it's a very nice bonus, but we can sort of get more powerful models for free, because we can just get a bigger computer, which we can say with some confidence we're going to get, and just train a bigger model for longer, and we are very confident we are going to get a better result.</i><p><a href="https://youtu.be/zjkBMFhNj_g?t=1543" rel="nofollow noreferrer">https://youtu.be/zjkBMFhNj_g?t=1543</a> (23:43)</p>
]]></description><pubDate>Fri, 24 Nov 2023 13:08:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=38403594</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=38403594</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38403594</guid></item><item><title><![CDATA[New comment by scriptsmith in "Code Llama, a state-of-the-art large language model for coding"]]></title><description><![CDATA[
<p>How are people using these local code models? I would much prefer using them in-context in an editor, but most of them seem to be deployed only in an instruction context. There's a lot of value in not having to context-switch or have a conversation.<p>I see the GitHub Copilot extension gets a new release every few days, so is it just that the way they're integrated is more complicated and not worth the effort?</p>
]]></description><pubDate>Thu, 24 Aug 2023 15:05:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=37249662</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=37249662</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37249662</guid></item><item><title><![CDATA[New comment by scriptsmith in "tRPC – Build and consume typesafe APIs without schemas or code generation"]]></title><description><![CDATA[
<p>Is there some trick to doing validation of request data using this process? That's a valuable part of using something like tRPC, JSON Schema + type generation, zod, etc.</p>
]]></description><pubDate>Sat, 12 Aug 2023 11:58:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=37099295</link><dc:creator>scriptsmith</dc:creator><comments>https://news.ycombinator.com/item?id=37099295</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37099295</guid></item></channel></rss>