<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: SoSoRoCoCo</title><link>https://news.ycombinator.com/user?id=SoSoRoCoCo</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Mon, 13 Apr 2026 07:08:58 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=SoSoRoCoCo" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by SoSoRoCoCo in "How I hijacked the top-level domain of a sovereign state"]]></title><description><![CDATA[
<p>Wow. An entire country can accidentally be hosed if the domain name its NS servers live under expires? Is it really that perilous?</p>
]]></description><pubDate>Fri, 15 Jan 2021 19:10:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=25794894</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25794894</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25794894</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Uganda's internet Shutdown"]]></title><description><![CDATA[
<p>I hope we never find out what happens if the US internet goes down. But with 2020's track record and what 2021 is starting to look like, we might find out!</p>
]]></description><pubDate>Fri, 15 Jan 2021 19:04:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=25794805</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25794805</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25794805</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Signal is having technical difficulties"]]></title><description><![CDATA[
<p>Same here!<p>Last night a friend from India popped up on Signal. I told him "Welcome!" and he said "You finally wore me down, I've left WhatsApp and I'm trying to move my family off of it..."</p>
]]></description><pubDate>Fri, 15 Jan 2021 18:36:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=25794375</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25794375</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25794375</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Oberon OS Walkthrough (2009)"]]></title><description><![CDATA[
<p>> Everything is a Command Line<p>This is a really controversial pattern for GUIs. In one camp, the GUI is really a skin over the CLI that acts like a virtual user, translating GUI inputs into underlying CLI commands. In the other camp, the GUI is all there is (e.g., Windows) and there is no underlying OS that can be accessed via CLI: in fact, the CLI is a "fake GUI" (Win32 apps written without a window). I can't say which is better, but it is fascinating to see that this was an "original pattern".</p>
]]></description><pubDate>Fri, 15 Jan 2021 04:32:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=25787035</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25787035</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25787035</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Dungeon Magazine"]]></title><description><![CDATA[
<p>BYTE ftw! Typing in Apple //e ASM code for hours and hours... ... and hours ... and then later, 2600 magazine.</p>
]]></description><pubDate>Fri, 15 Jan 2021 02:27:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=25786206</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25786206</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25786206</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Dungeon Magazine"]]></title><description><![CDATA[
<p>I'm glad to have this to read. It brings back memories of hanging out at the local RC shop that sold D&D stuff, back in the '80s.<p>What I've always wanted to read are transcripts of the early players (like Gygax), perhaps at GenCon, working out very difficult encounters. For example: how did the early DMs approach role-playing a supergenius demigod with high-level cleric/MU spells? That's got to be a very hard character to inhabit. I mean the Lolth module, Q1? Come on, there are so many high-level intelligent creatures in that endgame...</p>
]]></description><pubDate>Fri, 15 Jan 2021 02:26:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=25786203</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25786203</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25786203</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Machine Learning: The Great Stagnation"]]></title><description><![CDATA[
<p>Oops! Thanks, it was from this challenge. Lots of neat stuff in here.<p><a href="http://dcase.community/challenge2021/index" rel="nofollow">http://dcase.community/challenge2021/index</a></p>
]]></description><pubDate>Fri, 15 Jan 2021 00:29:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=25785219</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25785219</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25785219</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Machine Learning: The Great Stagnation"]]></title><description><![CDATA[
<p>Oh. I see what you mean. Yeah, I guess by definition backpropagation is trial-and-error. Huh, I never thought of it that way. Thanks for clarifying, I thought you were being saucy: my apologies for being snarky.</p>
]]></description><pubDate>Thu, 14 Jan 2021 20:41:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=25782176</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25782176</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25782176</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Machine Learning: The Great Stagnation"]]></title><description><![CDATA[
<p>Well, if your only experience is reading python opencv stack overflow posts, then of course...</p>
]]></description><pubDate>Thu, 14 Jan 2021 20:05:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=25781733</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25781733</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25781733</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Machine Learning: The Great Stagnation"]]></title><description><![CDATA[
<p>This article is dead-on, but I think it is missing a fairly large segment where ML is actually working well: anomaly detection and industrial defect detection.<p>While I agree that everyone was shocked, myself included, when we saw how well SSD and YOLO worked, the last-mile problem is stagnating. What I mean is: 7 years ago I wrote an image pipeline for a company using traditional AI methods. It was extremely challenging. When we saw SSDMobileNet do the same job 10x faster with a fraction of the code, our jaws dropped. Which is why the dev ship turned on a dime: there's something big in there.<p>The industry has stagnated for exactly the reasons brought up: we don't know how to squeeze out the last mile because NNs are EFFING HARD and the research is very math-heavy: i.e., it cannot be hacked by a Zuck-type into a half-assed product overnight, it needs to be carefully researched for years. This makes programmers sad, because by nature we love to brute-force trial-and-error our code, and homey don't play that game with machine learning.<p>However, places where it isn't stagnating are things like vibration and anomaly detection. This is a case where <a href="https://github.com/YumaKoizumi/ToyADMOS-dataset" rel="nofollow">https://github.com/YumaKoizumi/ToyADMOS-dataset</a> really shines, because it adds something that didn't exist before, and it doesn't have to be 100% perfect: anything is better than nothing.<p>At Embedded World last year I saw tons of FPGA solutions for rejecting parts on assembly lines. Since every object appears in nearly canonical form (good lighting, centered, homogeneous presentation), NNs are kicking ass bigtime in that space.<p>It is important to remember that Self-Driving Car Magic is just the consumer-facing hype machine. ML/NNs are working spectacularly well in some domains.</p>
]]></description><pubDate>Thu, 14 Jan 2021 16:36:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=25778415</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25778415</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25778415</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel is dying and we can’t save it"]]></title><description><![CDATA[
<p>This question has raged since the 90's. I worked on the Itanium (Madison and McKinley), and the VLIW architecture was brilliant. This was during the time of the Power4 and the DEC Alpha, two non-x86 competing architectures that were dominating the "Workstation" market (remember that term?). It looked like the server world was going to have three architectural options (Sun was dying, and Motorola's 68000 line wasn't up to the task).<p>Microsoft even had a version of NT 3.5 for The Itanic. It seemed we were just about to achieve critical mass in the server world to switch to a new architecture with huge address space, ECC up the wazoo, and massive integer performance.<p>Then the PC revolution took off with Win95, and the second war with AMD happened (and NexGen, sorta). This couldn't be solved with legal battles. It put all hands on deck, because there was SO much money to be made with x86 compatibility. The presence of x86 "up and down" AMD and Intel's roadmaps took over the server market as well: it was x86 all over the place.<p>And that, chil'ren, is why x86 was reborn in the 90's just as it was close to being wiped out.<p>Now Apple has proven you can seamlessly sneak in a brand-new architecture, get hyooj gainz, and we are none the wiser. This is fantastic news. I think we are truly on the cusp of x86 losing its dominance in the consumer space after almost 35 years.</p>
]]></description><pubDate>Thu, 14 Jan 2021 16:08:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=25778041</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25778041</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25778041</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel is dying and we can’t save it"]]></title><description><![CDATA[
<p>Ah, good info. Thanks.</p>
]]></description><pubDate>Thu, 14 Jan 2021 03:28:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=25771502</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25771502</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25771502</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel is dying and we can’t save it"]]></title><description><![CDATA[
<p>That only lists the number, not what the number actually means in terms of actual lithography or, more importantly, transistor performance. It used to be the minimum feature size, or just the L of the gate, but with FinFET it can be an overloaded term. Scaling of .75 in x and .7 in y led to a 10% performance improvement per node at Intel. TSMC hasn't been that clear. And that doesn't even account for the increase in metal layers, the pitch of the layers (or whether they use poly for the lower layers to get even faster gains), or the average track density due to above/below electromigration minimums.<p>EDIT: All of this stuff is usually stated at ISSCC every time a new process is announced, so it isn't under NDA. I haven't followed this in years, which is why I was asking for a process person to step in.</p>
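<p>For what it's worth, the linear shrink factors quoted above imply area and density numbers like this (a back-of-the-envelope sketch; only the .75/.7 factors come from the comment, the rest is plain arithmetic, not a claim about any specific Intel or TSMC process):</p>

```python
# Classic node-shrink arithmetic: linear scale factors in x and y
# multiply to give the per-feature area scale, and its reciprocal
# is the theoretical transistor-density gain.
x_scale, y_scale = 0.75, 0.7

area_scale = x_scale * y_scale      # each feature takes ~52.5% of old area
density_gain = 1.0 / area_scale     # ~1.9x more transistors per mm^2

print(f"area scale: {area_scale:.3f}")
print(f"density gain: {density_gain:.2f}x")
```

<p>Note that the 10% performance-per-node figure is a separate claim about transistor speed; the density gain above says nothing about switching performance.</p>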
]]></description><pubDate>Thu, 14 Jan 2021 03:23:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=25771472</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25771472</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25771472</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel is dying and we can’t save it"]]></title><description><![CDATA[
<p>> 3-5 times behind TSMC lithographically,<p>I'm not entirely sure that is true. TSMC plays fast and loose with their definition of node. Any fab folks want to speak up?<p>What I mean by that:<p><a href="https://www.pcgamesn.com/amd/tsmc-7nm-5nm-and-3nm-are-just-numbers" rel="nofollow">https://www.pcgamesn.com/amd/tsmc-7nm-5nm-and-3nm-are-just-n...</a><p>"And also goes some way to explaining why, despite TSMC offering a nominally 7nm process, the general consensus has been that Intel’s 10nm design is pretty much analogous. But what’s 3nm between fabs? At that level, probably quite a lot. But if the 7nm node is more of a branding exercise than genuinely denoting the physical properties of that production process then you can understand why there’s supposedly not a lot in it."</p>
]]></description><pubDate>Thu, 14 Jan 2021 03:16:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=25771428</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25771428</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25771428</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel is dying and we can’t save it"]]></title><description><![CDATA[
<p>Yeah, agreed. The designers / integration will probably get the newest nodes--and the headaches with getting their yields up! I suspect the older high-yield nodes will be filled with tenants pretty quickly. I don't have much knowledge of how this is going, at least from the inside.</p>
]]></description><pubDate>Thu, 14 Jan 2021 03:14:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=25771411</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25771411</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25771411</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel is dying and we can’t save it"]]></title><description><![CDATA[
<p>Good observation. The only thing that kept MS going was the massive inertia of Windows &amp; Office. It bought them time to pivot to the cloud. And now we have things like VS Code, an Electron app, which is the first Microsoft product I've adored in two decades. And TypeScript, which has made the world a better place.</p>
]]></description><pubDate>Thu, 14 Jan 2021 03:10:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=25771371</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25771371</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25771371</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel is dying and we can’t save it"]]></title><description><![CDATA[
<p>Intel innovates in process. Everything else is ruled by backwards compatibility and frenetic management too scared to stay the course. (The vast majority of projects are killed if they don't tape out in ~2 years.)<p>Intel will shift to a TSMC model. They have the best fabs on the planet, and the best fab engineers. I believe it is something like $3 million lost per minute if a fab sits idle. They already started moving that way a few years ago; I suspect this will be their final form.<p>IMHO: The only thing holding them back from the transition is the hundreds of small boondoggle groups staffed by old-timers too scared to retire and too scared to do something daring, yet who somehow still hang on to their hidey-holes. They lost a ton of key architects to Apple a few years ago, which I also suspect is the reason why the M1 is so badass.<p>...and if you really want to get sentimental, here's an AMD poster we had in our cubes back in ~1991:<p><a href="https://linustechtips.com/uploads/monthly_2016_03/Szg2Ppo.jpg.50131cfba4af86263bc0f3ff2b0e5559.jpg" rel="nofollow">https://linustechtips.com/uploads/monthly_2016_03/Szg2Ppo.jp...</a><p>The Sanders-as-Indiana-Jones bit was both funny and infuriating...<p>(The Farrah Fawcett-looking woman was Sanders's bombshell wife; compare to Grove at the time, who drove a beat-up old car.)</p>
]]></description><pubDate>Thu, 14 Jan 2021 03:08:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=25771358</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25771358</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25771358</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel CEO Bob Swan to step down, VMware CEO Pat Gelsinger to replace him"]]></title><description><![CDATA[
<p>Nope, I can't find it on Google. Nada. Not even on the Wayback Machine. It may have been internal, but I definitely remember it. It was Pope Benedict.</p>
]]></description><pubDate>Wed, 13 Jan 2021 18:18:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=25765916</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25765916</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25765916</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "Intel CEO Bob Swan to step down, VMware CEO Pat Gelsinger to replace him"]]></title><description><![CDATA[
<p>Gelsinger was pushed out as CTO after horribly failing to address AMD's competitive threats. I have no idea how he wormed his way back in, but this does not bode well for Intel's future: Gelsinger is proof of the "Peter Principle", being promoted too high.<p>EDIT: I was a bit harsh, toned it down.<p>EDIT 2: This is probably petty, but I can't ignore the fact that there was a significant hubbub at Intel regarding his use of the "Dr" prefix. He scrubbed it from his bio and internal pages when it was pointed out that it didn't come from an accredited university and was honorary. He also caught flak internally for having the Pope bless a wafer for Intel's future success. It was very weird, especially given the high percentage of Muslim engineers at Intel, and its focus on neutrality.</p>
]]></description><pubDate>Wed, 13 Jan 2021 16:46:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=25764520</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25764520</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25764520</guid></item><item><title><![CDATA[New comment by SoSoRoCoCo in "What Is Social Cooling?"]]></title><description><![CDATA[
<p>Your content isn't anonymous: who you are is distilled from what you say, creating a unique fingerprint. It doesn't matter if your name is hnaccy or SoSoRoCoCo.<p>I change details, times, various things, just a little. Whenever I need to make a reference as a citation, 20% of the facts change just slightly. Sometimes I'm a woman, sometimes I'm from a different state, sometimes it is a different company I worked for. These details don't matter to the content, but there is so much contradiction in my history that I hope to defeat the algorithms. I got the idea from how Firefox defeats browser fingerprinting (or tries to).<p>I also create new accounts routinely.<p>However, even HN knows who I am, because I occasionally forget to use my randomized VPN, and HN keeps track of you.<p>I know this because I was banned from an IP address once and had to get a new one from my ISP to come back to HN.</p>
]]></description><pubDate>Wed, 13 Jan 2021 05:00:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=25758379</link><dc:creator>SoSoRoCoCo</dc:creator><comments>https://news.ycombinator.com/item?id=25758379</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=25758379</guid></item></channel></rss>