<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: web007</title><link>https://news.ycombinator.com/user?id=web007</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 29 Apr 2026 07:59:02 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=web007" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by web007 in "An AI agent deleted our production database. The agent's confession is below"]]></title><description><![CDATA[
<p>> Backups stored on the same volume is an interesting glitch to avoid<p>The phrasing is different, but this is how AWS RDS works as well. If you delete a database in RDS, all of the automated snapshots it was taking and all of the PITR (point-in-time recovery) logs are gone too. If you take manual snapshots they stick around, but all of the magic "I don't have to think about it" stuff dies with the DB.</p>
]]></description><pubDate>Sun, 26 Apr 2026 17:20:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47911989</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=47911989</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47911989</guid></item><item><title><![CDATA[New comment by web007 in "Taking a Look at Compression Algorithms – Moncef Abboud"]]></title><description><![CDATA[
<p>zstd beats gzip on both speed and size, at every compression level.<p>If you need compatibility then gzip (pigz) or zip (7z) or bz2 (pbzip2) are the least-bad options, but for Pareto-optimal speed and size you want zstd.</p>
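<p>A quick way to sanity-check the speed/size tradeoff yourself - this sketch uses Python's stdlib gzip as a stand-in, since zstd bindings (the third-party `zstandard` package) aren't in the standard library; the harness is the same either way:

```python
# Measure compressed size and compression time per level.
# Swap in zstandard.ZstdCompressor(level=...) to benchmark zstd itself.
import gzip
import time

data = b"the quick brown fox jumps over the lazy dog\n" * 5000

for level in (1, 6, 9):
    start = time.perf_counter()
    blob = gzip.compress(data, compresslevel=level)
    elapsed = time.perf_counter() - start
    print(f"gzip -{level}: {len(blob):6d} bytes in {elapsed * 1000:.2f} ms")
```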
]]></description><pubDate>Fri, 17 Apr 2026 15:10:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=47806792</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=47806792</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47806792</guid></item><item><title><![CDATA[New comment by web007 in "Backblaze has stopped backing up OneDrive and Dropbox folders and maybe others"]]></title><description><![CDATA[
<p>`rclone` with AWS credentials. Go make a bucket and a key that can read/write to it.<p>Set up your config to exclude common non-file dirs, or say "only `/Applications` and `Home`" and that's about it. If it's a file then it's a file, and it will be synced up.</p>
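<p>For example, an rclone filter file along these lines (paths are illustrative; check `rclone help filtering` for the exact rule syntax):

```
# filter.txt: first match wins; keep Applications and home, drop the rest
- .Trash/**
- **/node_modules/**
+ /Applications/**
+ /Users/me/**
- *
```

Then something like `rclone sync / s3:my-backup-bucket --filter-from filter.txt` keeps the bucket in sync.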
]]></description><pubDate>Tue, 14 Apr 2026 20:12:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=47770842</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=47770842</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47770842</guid></item><item><title><![CDATA[New comment by web007 in "A WebGL game where you deliver messages on a tiny planet"]]></title><description><![CDATA[
<p>There's a poop symbol on the bottom right; click it to see the ways to communicate.<p>I was surprised to learn the random messengers are other humans!</p>
]]></description><pubDate>Sat, 27 Sep 2025 22:23:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=45399852</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=45399852</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45399852</guid></item><item><title><![CDATA[New comment by web007 in "Curate your shell history"]]></title><description><![CDATA[
<p>Curation is probably a good idea, but keeping context is probably a better one.<p>The referenced "I don't keep history" philosophy is madness. You won't know what would have been useful to keep until you need it $later. Sure, you'll write down some good stuff and maybe alias it.<p>That's fantastic, do more of that anyway.<p>Don't pretend you're going to do that for every trick or gotcha you encounter, and don't think you're going to remember that one-off not-gonna-need-it thing you did last week when you find that bug again.<p>My local history is currently 106022 lines, and that's not even my full synchronized copy, just this machine. It's isolated per-session and organized into a ~/.history/YYYY/MM/machine_time_session hierarchy. It has 8325 "git status", 4291 "ll", 2403 "cd .." and 97 "date" entries that don't matter (those are literal, complete entries, not counting variations like "date +%Y%m%d", which are separate). I can ignore them, either by grepping them out or filtering mentally, but something as benign as "cd .." is INCREDIBLY useful for establishing context when I'm spelunking through what I did to debug a thing 2 years ago.<p>The even better version is to keep everything AND curate out the useful stuff. That whole history (4 years locally) is 10MB, and my entire history compressed would be less than a megabyte.<p>Edit: just realized who posted this, I overlapped with Tod at my first gig in Silicon Valley. Small world!</p>
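<p>A minimal sketch of that per-session layout as a bash rc fragment (the names and date format are my guesses, not necessarily the commenter's actual setup):

```
# ~/.bashrc: one history file per session, bucketed by year/month
HISTDIR="$HOME/.history/$(date +%Y)/$(date +%m)"
mkdir -p "$HISTDIR"
export HISTFILE="$HISTDIR/$(hostname -s)_$(date +%s)_$$"
export HISTSIZE=100000 HISTFILESIZE=100000
shopt -s histappend   # append on exit instead of overwriting
```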
]]></description><pubDate>Fri, 06 Jun 2025 16:05:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=44202234</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=44202234</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44202234</guid></item><item><title><![CDATA[New comment by web007 in "After months of coding with LLMs, I'm going back to using my brain"]]></title><description><![CDATA[
<p>> LLMs are great for junior, fast-shipping devs; less so for experienced, meticulous engineers<p>Is that not true? That feels sufficiently nuanced and gives a spectrum of utility: not a binary one and zero, but "10x" on one side and perhaps 1.1x at the other extreme.<p>The reality is slightly different - "10x" is SLoC, not necessarily good code - but the direction and scale are about right.</p>
]]></description><pubDate>Fri, 16 May 2025 14:07:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=44005722</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=44005722</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44005722</guid></item><item><title><![CDATA[New comment by web007 in "The Epochalypse Project"]]></title><description><![CDATA[
<p>I've had my "2038 consulting" sites since Feb 2011, but someone got epochalypse dot com registered August 2007.</p>
]]></description><pubDate>Sun, 11 May 2025 16:28:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=43954861</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=43954861</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43954861</guid></item><item><title><![CDATA[New comment by web007 in "Blasting Past WebP - An analysis of the NSO BLASTPASS iMessage exploit"]]></title><description><![CDATA[
<p>> Storage is cheap, bandwidth is cheap, so who cares?<p>This is a ridiculous assertion.<p>They're both cheap in the commercial sense, but neither cheap nor infinite in the UX sense. Time and space matter in the real world.<p>Google wouldn't have created WebP if there were no tangible, measurable cost benefit to using it over some alternative. The same goes for H.264 or HEVC or AV1: at scale, bandwidth and storage are far from cheap. See the article on the front page today about Google's double-digit EXAbyte storage clusters with 50TB/s of read volume each as a real-world example; there's nothing cheap about that.</p>
]]></description><pubDate>Thu, 27 Mar 2025 15:59:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=43495000</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=43495000</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43495000</guid></item><item><title><![CDATA[New comment by web007 in "WWI's 'Dazzle' Camouflage Seemed Effective Due to Unexpected Optical Trick"]]></title><description><![CDATA[
<p>Look up "CV dazzle" for the equivalent in the modern age, makeup effects to avoid facial detection / recognition.</p>
]]></description><pubDate>Tue, 25 Mar 2025 05:39:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=43468386</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=43468386</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43468386</guid></item><item><title><![CDATA[New comment by web007 in "Who exactly needs to get approval from an institutional review board (IRB)?"]]></title><description><![CDATA[
<p>> As an analogy, driving a car is dangerous. Whenever I drive, I could easily kill someone. But the government doesn’t force me to submit a driving plan any time I want to go somewhere. Instead, if I misbehave, I am punished in retrospect. Why don’t we apply the same policy to research?<p>"We" decided that Tuskegee was bad enough that it should be stopped before harm is done, and that there is no appropriate or sufficient "punish[ment] in retrospect" for the fallout.<p>The government makes you get a license to drive at all, then "drive a Pinto" versus "drive a Trabant" are similar enough that they don't require more info.  They require you to get different licensure to drive a bigger truck where you could potentially cause more harm, or to drive an airplane.  In this analogy the IRB is the DMV/FAA/whatever, and you're asking for permission to drive a tank, a motorized unicycle, a helicopter, an 18-wheeler or a stealth fighter. You don't get a Science License rubber stamp because that's like getting a Vehicle License - the variation in "Vehicle" is big enough that each type needs review.</p>
]]></description><pubDate>Thu, 13 Feb 2025 18:59:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=43039787</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=43039787</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43039787</guid></item><item><title><![CDATA[New comment by web007 in "If your customers don't talk, NPS is a vanity metric"]]></title><description><![CDATA[
<p>Every time you get one of those surveys, score it zero, then add "Net Promoter Score is a flawed vanity metric and shouldn't be used for business purposes" in the comment box. Sometimes I link the Wikipedia NPS "Criticism" section as well.<p>Most places don't care about the results from an actual customer-service perspective. The above gets crickets, not even an auto-responder.<p>For companies that do care (tiny startups, mostly) I've gotten IMMEDIATE personal email responses from CEOs and founders asking what they can fix for a zero NPS. That's a great place to link the criticism section if you haven't already, and to provide useful, raw feedback on what you love/hate about their products.</p>
]]></description><pubDate>Fri, 07 Feb 2025 07:01:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=42970184</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=42970184</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42970184</guid></item><item><title><![CDATA[New comment by web007 in "My failed attempt to shrink all NPM packages by 5%"]]></title><description><![CDATA[
<p>It very likely has zero or positive impact on the decompression side of things.<p>Starting with smaller data means everything ends up smaller. It's the same decompression algorithm in all cases, so it's not some special / unoptimized branch of code. It yields the same data in the end, so writes even out, plus or minus disk-queue fullness and power cycles. It's _maybe_ better for RAM and CPU because more data fits in cache, so less memory is used and the compute is idle less often.<p>It's relatively easy to test decompression efficiency if you think CPU time is a good proxy for energy usage: go find something like React and test the decompression time of gzip -9 vs zopfli. Or even better, find something similar but much bigger, so you can see the delta and it's not lost in rounding errors.</p>
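<p>A sketch of that test with Python's stdlib (gzip levels 1 and 9 stand in for gzip-vs-zopfli, since zopfli is an external tool; either way both encoders feed the exact same DEFLATE decoder):

```python
# Different encoders, identical decoder: time decompression of the same
# payload compressed at two different effort levels.
import gzip
import time

data = bytes(range(256)) * 40_000   # ~10 MB of mildly compressible data

for level in (1, 9):
    blob = gzip.compress(data, compresslevel=level)
    start = time.perf_counter()
    out = gzip.decompress(blob)
    assert out == data              # same bytes back either way
    print(f"level {level}: {len(blob)} bytes in, decompressed in "
          f"{(time.perf_counter() - start) * 1000:.1f} ms")
```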
]]></description><pubDate>Mon, 27 Jan 2025 15:20:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=42842007</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=42842007</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42842007</guid></item><item><title><![CDATA[New comment by web007 in "Scientists use magnetic nanotech to safely rewarm frozen tissues for transplant"]]></title><description><![CDATA[
<p>Reminds me of microwaves - one of the early uses of microwave heating was to reanimate frozen hamsters:<p><a href="https://en.wikipedia.org/wiki/Microwave_oven#Discovery" rel="nofollow">https://en.wikipedia.org/wiki/Microwave_oven#Discovery</a></p>
]]></description><pubDate>Wed, 11 Sep 2024 18:35:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=41514056</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=41514056</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41514056</guid></item><item><title><![CDATA[New comment by web007 in "Lottery Simulator (2023)"]]></title><description><![CDATA[
<p>Odds for N tickets as odd(N):<p>odd(0) = 0<p>odd(1) = X, so odd(1) / odd(0) = +Inf increase in odds of winning vs 0<p>odd(2) = 2X, so odd(2) / odd(1) = 2x increase in odds of winning vs 1</p>
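<p>The arithmetic above, sketched with an illustrative per-ticket probability (1 in 292,201,338 is Powerball's advertised jackpot odds):

```python
# Chance of at least one win with n independent tickets in a single draw.
p = 1 / 292_201_338

def odds(n: int) -> float:
    return 1 - (1 - p) ** n

print(odds(2) / odds(1))  # ~2.0: a second ticket doubles your chances
# odds(1) / odds(0) is division by zero: going from 0 to 1 tickets is the
# "+Inf increase" in the comment above.
```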
]]></description><pubDate>Wed, 11 Sep 2024 00:19:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=41507000</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=41507000</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41507000</guid></item><item><title><![CDATA[New comment by web007 in "How to mail an SD card with gummy glue"]]></title><description><![CDATA[
<p>It's a computer based on BSD, but with no WiFi, no BT, no screen, and no ability to play movies or games or music. And it's all programmed in C - not C++ or Rust or anything similarly memory-safe-ish.<p>So it's an RPi, but more vague and vulnerable and less useful. And maybe more expensive?</p>
]]></description><pubDate>Tue, 16 Jul 2024 14:40:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=40977074</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=40977074</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40977074</guid></item><item><title><![CDATA[New comment by web007 in ""No way to prevent this" say users of only language where this regularly happens"]]></title><description><![CDATA[
<p>A) "surely nobody has ever thought of [this] before" says person who hasn't read <a href="https://xeiaso.net/shitposts/no-way-to-prevent-this/" rel="nofollow">https://xeiaso.net/shitposts/no-way-to-prevent-this/</a><p>B) It's a spin on The Onion headline about school shootings.</p>
]]></description><pubDate>Wed, 22 May 2024 18:43:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=40444579</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=40444579</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40444579</guid></item><item><title><![CDATA[New comment by web007 in "Ubuntu Desktop 24.04 LTS: Noble Numbat"]]></title><description><![CDATA[
<p>You mean like <a href="https://cdimage.ubuntu.com/releases/24.04/release/ubuntu-24.04-live-server-arm64.iso" rel="nofollow">https://cdimage.ubuntu.com/releases/24.04/release/ubuntu-24....</a> ?</p>
]]></description><pubDate>Sat, 27 Apr 2024 14:59:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=40180447</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=40180447</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40180447</guid></item><item><title><![CDATA[New comment by web007 in "VideoGigaGAN: Towards detail-rich video super-resolution"]]></title><description><![CDATA[
<p>Yes, absolutely. Psychovisual encoding can only do so much within the constraints of H.264/H.265.<p>Throwing away 3/4 (half res) or 15/16 (quarter res) of the data, encoding to X bitrate and then decoding+upscaling looks far better than encoding to the same X bitrate at full resolution.<p>At high bitrate, native resolution will of course look better. At low bitrate, the way the H.26x codecs work ends up turning high resolution into a blocky, ringing mess to compensate, vs lower resolution where you can see the content, just fuzzily.<p>Go get the Tears of Steel raw 4K video (Y4M, I think the format is called). Scale it down 4x and encode it with ffmpeg HEVC veryslow at CRF 30. Figure out the bitrate, then cheat - use two-pass veryslow HEVC encoding to get the best possible quality at native resolution at the same bitrate as your 4x-downscaled version. You're aiming for two files that are about the same size. Somehow I couldn't convince the codec to go low enough to match, so my low-res version was about 60% of the high-res version's filesize. Now go and play them both back at 4K with whatever your native upscale is - bilinear, bicubic, maybe an NVIDIA Shield with its AI Upscaling.<p>Go do that, then tell me you honestly think the blocky, streaky, illegible 4K native looks better than the "soft" quarter-res version.</p>
]]></description><pubDate>Thu, 25 Apr 2024 06:05:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=40153989</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=40153989</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40153989</guid></item><item><title><![CDATA[New comment by web007 in "Are Unpaid Take-Home Interview Assignments Ethical?"]]></title><description><![CDATA[
<p>I think they are ethical, if annoying, when done properly, and likely give good signal as to a candidate's skill. Planning and (voluntary) time-boxing to 1-2 hours make the argument stronger, as that time could equally be spent in a 1:1 synchronous interview, which is worse IMO.<p>Leetcode-ish tests and strict timeboxing are awful and can't possibly provide useful signal beyond "can program in some manner". Nobody can do their best work in 1 timed, limited hour, only in a web IDE that isn't their dev environment, with no looking anything up, no progress on part 2 without completing part 1, and similar unrealistic restrictions.<p>They encourage the worst in coding. Globals, dumb temporary names, no comments, and done-vs-maintainable style? Ship it. I only need to deal with this code for an hour and then it's thrown away. I'm not going to make my `important_thing_to_remember` variable anything longer than `i`, and I'm going to use `foo[0]` from that ridiculous regex I bodged together instead of splitting it up and building it from pieces where I name the capture group so Future Me can understand it.<p>I'd much rather have a test scoped to 1h of reasonable work, and let me take 2h if needed to solve it and then refactor to make it maintainable.</p>
]]></description><pubDate>Thu, 25 Apr 2024 05:43:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=40153854</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=40153854</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40153854</guid></item><item><title><![CDATA[New comment by web007 in "VideoGigaGAN: Towards detail-rich video super-resolution"]]></title><description><![CDATA[
<p>Why would someone ever take a 40Mbps (compressed) video and downsample it so it can be encoded at 400Kbps (compressed) but played back with nearly the same fidelity / with similar artifacts to the same process at 50x data volume? The world will never know.<p>You're also ignoring the part where all lossy codecs throw away those same details and then fake-recreate them with enough fidelity that people are satisfied. Same concept, different mechanism.<p>Look up what 4:2:0 means vs 4:4:4 in a video codec and tell me you still think it's "pure insanity" to rescale.<p>Or, you know, maybe some people have reasons for doing things that aren't the same as the narrow scope of use-cases you considered, and this would work perfectly well for them.</p>
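<p>The 4:2:0-vs-4:4:4 point is simple arithmetic - a sketch for one 8-bit 1080p frame:

```python
# Raw bytes per 1920x1080 frame at 8 bits per sample.
w, h = 1920, 1080
full_444 = w * h * 3        # Y, Cb, Cr all at full resolution
sub_420 = w * h * 3 // 2    # Cb and Cr each at quarter resolution
print(full_444, sub_420)    # 6220800 3110400
print(1 - sub_420 / full_444)  # 0.5: half the samples are gone up front
```

Consumer delivery formats are overwhelmingly 4:2:0, so half the raw chroma samples are discarded before the encoder even sees the frame.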
]]></description><pubDate>Wed, 24 Apr 2024 15:36:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=40145657</link><dc:creator>web007</dc:creator><comments>https://news.ycombinator.com/item?id=40145657</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40145657</guid></item></channel></rss>