<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: ericskiff</title><link>https://news.ycombinator.com/user?id=ericskiff</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sat, 11 Apr 2026 18:00:07 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=ericskiff" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by ericskiff in "Starfling: A one-tap endless orbital slingshot game in a single HTML file"]]></title><description><![CDATA[
<p>Fun, but the way they fly doesn't quite match my intuition. Why would an object curve when I send it out on the tangent? Wouldn't that be a straight line unless it's affected by a different gravity well?</p>
]]></description><pubDate>Sat, 11 Apr 2026 04:54:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=47727524</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=47727524</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47727524</guid></item><item><title><![CDATA[New comment by ericskiff in "Animated Engines"]]></title><description><![CDATA[
<p>Wooooow, I LOVED this site when it first came out, and I still reference it when I talk about the early web and how it enabled me to learn things I never would have otherwise. I can still picture the animation of the Wankel rotary engine from this site whenever I think about it.<p>This and howstuffworks.com made me so hopeful for the future of the web when I was young.</p>
]]></description><pubDate>Fri, 06 Feb 2026 19:41:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=46917189</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=46917189</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46917189</guid></item><item><title><![CDATA[New comment by ericskiff in "Hackers (1995) Animated Experience"]]></title><description><![CDATA[
<p>This is so lovely! If the original author is here in the comments, here are some feature requests that would absolutely make my day, roughly from easiest to hardest :)<p>I love this so much, thank you for sharing!<p>* Slow down the motion to about 0.5x the current speed, with easing/acceleration to emulate the camera dolly and jib effects used in the film<p>* Add a random-motion setting that lets me run it full screen, just sliding through the aisles, banking around turns, flying up and then back down the aisles.<p>* Optionally lock the framerate to 24fps to give it a film feel<p>* Optional shaders on the main viewport to emulate lens distortion, film grain, etc.<p>* Raytracing with reflectivity on the glass, refraction, diffusion, etc.</p>
]]></description><pubDate>Fri, 06 Feb 2026 19:37:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=46917152</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=46917152</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46917152</guid></item><item><title><![CDATA[New comment by ericskiff in "Humankind's 10 million year love affair with booze might end"]]></title><description><![CDATA[
<p>For anyone taking this comment seriously, please research and understand the potential long-term impacts of GBL before going near it. It's neurotoxic and can cause brain fog and lowered cognitive ability. It's also lethal in the wrong dose, with a tiny margin for error.<p>It's by no means a safe alcohol replacement.</p>
]]></description><pubDate>Sun, 21 Dec 2025 01:23:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=46341374</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=46341374</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46341374</guid></item><item><title><![CDATA[New comment by ericskiff in "Show HN: Picknplace.js, an alternative to drag-and-drop"]]></title><description><![CDATA[
<p>Awesome to see real UX experimentation, and this elicited a strong response from me at first: "oh I haaaaate that".<p>On further reflection, this is very interesting, and I understand where the drag-and-drop interaction breaks down on long lists. Some additional UI affordances to communicate what's happening might make it intuitive and clear.<p>Things I'd want to experiment with if I were implementing this:<p>* A "wheel" effect where the items in the list grow slightly as they near the chosen item, which stays locked at the center of the interface, popping into place at each 'click'. Somewhat like the Price Is Right wheel flipper.<p>* Making the interaction entirely scroll-based once I click. Setting the item in place can be done by any other click or keypress, and cancelled with the escape hotkey. My interaction is pick, scroll, click (without having to aim back at the thing I just placed by scrolling).</p>
]]></description><pubDate>Fri, 19 Dec 2025 14:51:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=46326476</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=46326476</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46326476</guid></item><item><title><![CDATA[New comment by ericskiff in "DeepSeek-v3.2: Pushing the frontier of open large language models [pdf]"]]></title><description><![CDATA[
<p>I believe this was a statement on the cost per token to us as consumers of the service.</p>
]]></description><pubDate>Mon, 01 Dec 2025 19:55:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=46112288</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=46112288</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46112288</guid></item><item><title><![CDATA[New comment by ericskiff in "The great software quality collapse or, how we normalized catastrophe"]]></title><description><![CDATA[
<p>It all reads like hallucinated slop from top to bottom.<p>"I've been tracking software quality metrics for three years", and then it doesn't show any of the receipts, and simply lists anecdotal issues. I don't trust a single fact from this article.<p>My own anecdote: barely capable developers churning out webapps built on PHP and a poor understanding of Wordpress and jQuery were the norm in 2005. There's been an industry trend toward caring about the craft and writing decent code.<p>Most projects, even the messy ones I inherit from other teams today, have Git, CI/CD, at least some tests, and a sane hosting infrastructure. They're also mostly built on decent platforms like Rails/Django/Next etc. that impose some conventional structure. 20 years ago most of them were "SSH into the box and try not to break anything".</p>
]]></description><pubDate>Thu, 09 Oct 2025 15:34:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=45529159</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=45529159</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45529159</guid></item><item><title><![CDATA[New comment by ericskiff in "Kurt Got Got"]]></title><description><![CDATA[
<p>Go to tax.gov.<p>You'll identify on id.me.<p>People have just gotten used to this sort of thing, unfortunately.</p>
]]></description><pubDate>Thu, 09 Oct 2025 13:20:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=45527329</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=45527329</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45527329</guid></item><item><title><![CDATA[New comment by ericskiff in "Meta RayBan AR glasses shows Lumus waveguide structures in leaked video"]]></title><description><![CDATA[
<p>Folks have been predicting that the next big shift in computing will be onto glasses that we wear, and away from our phones.<p>The tech just hasn’t been there yet, and most of the devices that do this are heavy, clunky, and hot.<p>Meta is investing billions to get out ahead of this shift and to own the entertainment and data (and thus advertising) layers that sit on top of the real world through these glasses.<p>The rumor mill is abuzz that Facebook is finally making a play for it in the next set of smart glasses, after a few years of sticking to VR headsets and audio/camera-only glasses.</p>
]]></description><pubDate>Tue, 16 Sep 2025 20:03:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=45267242</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=45267242</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45267242</guid></item><item><title><![CDATA[New comment by ericskiff in "Veo 3 and Imagen 4, and a new tool for filmmaking called Flow"]]></title><description><![CDATA[
<p>Has anyone gotten access to Imagen 4 for image editing, inpainting/outpainting, or using reference images yet? That's core to my workflow, and their docs just lead to a Google form. I've submitted, but it feels like a bit of a black hole.</p>
]]></description><pubDate>Tue, 20 May 2025 21:02:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=44045891</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=44045891</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44045891</guid></item><item><title><![CDATA[New comment by ericskiff in "Can LLMs write better code if you keep asking them to “write better code”?"]]></title><description><![CDATA[
<p>I highly recommend the command-line AI coding tool Aider. You fill its context window with a few relevant files, ask questions, and then set it to code mode and it starts making commits. It’s all git, so you can back anything out, see the history, etc.<p>It’s remarkable, and I agree Claude 3.5 makes playing with local LLMs seem silly in comparison. Claude is useful for generating real work.</p>
]]></description><pubDate>Fri, 03 Jan 2025 12:27:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=42585052</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=42585052</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42585052</guid></item><item><title><![CDATA[New comment by ericskiff in "GPT-5 is behind schedule"]]></title><description><![CDATA[
<p>What we can reasonably assume from statements made by insiders:<p>They want a 10x improvement from scaling and a 10x improvement from data and algorithmic changes.<p>The sources of public data are essentially tapped.<p>Algorithmic changes will be an unknown to us until they release, but from published research this remains a steady source of improvement.<p>Scaling seems to stall if data is limited.<p>So with all of that taken together, the logical step is to figure out how to turn compute into better data to train on. Enter strawberry / o1, and now o3.<p>They can throw money, time, and compute at thinking about and then generating better training data. If the belief is that N billion new tokens of high-quality training data will unlock the leap in capabilities they’re looking for, then it makes sense to delay the training until that dataset is ready.<p>With o3 now public knowledge, imagine how long it’s been churning out new thinking at expert level across every field. OpenAI’s next moat may be the best synthetic training set ever.<p>At this point I would guess we get 4.5 with a subset of this: some scale improvement, the algorithmic pickups since 4 was trained, and a cleaned and improved core dataset, but without risking leakage of the superior dataset.<p>When 5 launches, we get to see what a fully scaled version looks like with training data that outstrips average humans in almost every problem space.<p>Then the next o-model gets to start with that as a base and reason. It’s likely to be remarkable.</p>
]]></description><pubDate>Sun, 22 Dec 2024 13:47:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=42486273</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=42486273</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42486273</guid></item><item><title><![CDATA[New comment by ericskiff in "Claude AI built me a React app to compare maps side by side"]]></title><description><![CDATA[
<p>We’ve been hitting this in our work and in experimentation, and I can confirm that Claude Sonnet 3.5 has gotten 100% of the way there, including working through errors and tricky problems as we tested the apps it built.</p>
]]></description><pubDate>Sun, 17 Nov 2024 15:47:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=42164771</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=42164771</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42164771</guid></item><item><title><![CDATA[New comment by ericskiff in "Claude AI built me a React app to compare maps side by side"]]></title><description><![CDATA[
<p>Interestingly, I’m pretty sure they mean they hit the token limit with Claude.<p>There’s a daily 2.5 million token limit that you can use up fairly quickly with 100K of context.<p>So they may very well have completed the whole program with Claude. It’s just that the machine literally stopped and the human had to do the final grunt work.</p>
]]></description><pubDate>Sun, 17 Nov 2024 15:46:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=42164767</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=42164767</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42164767</guid></item><item><title><![CDATA[New comment by ericskiff in "Using an 8K TV as a Monitor"]]></title><description><![CDATA[
<p>DeskPad might be what you’re after! It’s a virtual display in a window; you can share that instead of your whole screen but still get multi-app flows captured.<p><a href="https://github.com/Stengo/DeskPad">https://github.com/Stengo/DeskPad</a></p>
]]></description><pubDate>Wed, 30 Oct 2024 04:19:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=41991944</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=41991944</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41991944</guid></item><item><title><![CDATA[New comment by ericskiff in "Shrinking augmented reality displays into eyeglasses to expand their use"]]></title><description><![CDATA[
<p>100%. Being able to write, code, and read from a large virtual terminal without a clunky visor on would be amazing.</p>
]]></description><pubDate>Wed, 09 Oct 2024 16:51:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=41790006</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=41790006</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41790006</guid></item><item><title><![CDATA[New comment by ericskiff in "I Quit Teaching Because of ChatGPT"]]></title><description><![CDATA[
<p>I’ll share my experience and the experience of my kids so far.<p>Aside from blindly copying and pasting a response, in which case the learner wasn’t interested in learning and probably would have plagiarized from somewhere else anyway, I have found LLMs to be incredible, endlessly patient teachers that I’m never afraid to ask a question of.<p>My kids, who are in the tween and teenage years, are incredibly skeptical and dismissive of AI. They regard AI art as taking away creative initiative from artists, and they treat LLMs similarly to the way we treated Google growing up, if they use them at all: a tool which can be helpful for answering questions and is part of the landscape of their knowledge building.<p>That knowledge acquisition includes school, YouTube and other short videos, their peers (online and off), internet searches, and asking AI. Generally, I regard asking AI as one of the least problematic sources of info in that environment.<p>While I tend to be optimistic as a default, I truly do think that the ability to become less ignorant by asking questions is a net positive for humanity.<p>The only thing I truly lean on AI for right now is as an editor, helping me turn my detailed bullet points into decently crafted prose, and for generating clear and concise transcripts and takeaways from long meetings. To me that doesn’t seem like the downfall of human knowledge.</p>
]]></description><pubDate>Tue, 01 Oct 2024 16:09:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=41710314</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=41710314</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41710314</guid></item><item><title><![CDATA[New comment by ericskiff in "Panic at the Job Market"]]></title><description><![CDATA[
<p>For over 10 years we've had people do a paid ~4-hour take-home which is very similar to the work they'll actually be doing (here's a dummy codebase; add a few features, fix a few bugs).<p>If they're not interested in getting paid to do that work now, it's a good signal for us that they won't be happy doing it when they're working with us. It's helped us find really wonderful people to work with.</p>
]]></description><pubDate>Wed, 17 Jul 2024 19:02:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=40989192</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=40989192</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40989192</guid></item><item><title><![CDATA[New comment by ericskiff in "Leaked deck reveals how OpenAI is pitching publisher partnerships"]]></title><description><![CDATA[
<p>Counterpoint: I pay for my whole team to have access, shared tools, etc. We also spend a decent amount on their APIs across a number of client projects.<p>OpenAI has a strong revenue model based on paid use.</p>
]]></description><pubDate>Thu, 09 May 2024 19:57:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=40312338</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=40312338</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40312338</guid></item><item><title><![CDATA[New comment by ericskiff in "Google CodeGemma: Open Code Models Based on Gemma [pdf]"]]></title><description><![CDATA[
<p>I've been playing with Continue:
<a href="https://github.com/continuedev/continue">https://github.com/continuedev/continue</a></p>
]]></description><pubDate>Tue, 09 Apr 2024 14:18:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=39979693</link><dc:creator>ericskiff</dc:creator><comments>https://news.ycombinator.com/item?id=39979693</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39979693</guid></item></channel></rss>