<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: progmetaldev</title><link>https://news.ycombinator.com/user?id=progmetaldev</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Tue, 07 Apr 2026 08:11:30 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=progmetaldev" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by progmetaldev in "The cult of vibe coding is dogfooding run amok"]]></title><description><![CDATA[
<p>Since most devs won't actually deal with fintech (I don't know the stats on HN, but I'm talking about devs as one industry), your first "a" example might actually be better than your first "b" example, depending on the complexity of the software. In lots (probably most) of industries, a "good codebase" means the architecture decisions were solid, even if the domain/service layer is bad. Maybe my experiences don't match most of the HN crowd's, but I usually get stuck with very detailed domain/service rules while the architecture is the problem: too much memory or CPU being burned just to abstract away the actual rules of the application (its purpose). Usually when I've been brought in to rebuild an application, the client is fine with the results but upset over the performance and/or cost of running it. For anything of actual complexity, it's usually the supporting code that is the biggest failure, because complex apps usually have decent requirements. Now, if the requirements were bad, the architecture was bad, AND the domain/service layer is bad, I don't know if there's anything that can fix that.</p>
]]></description><pubDate>Tue, 07 Apr 2026 04:58:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=47670926</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47670926</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47670926</guid></item><item><title><![CDATA[New comment by progmetaldev in "'Backrooms' and the Rise of the Institutional Gothic"]]></title><description><![CDATA[
<p>My book states that the coloring is in the newest version. I suppose I should go back and check the copyright, in case mine is actually the newest; it's fairly old.<p>I enjoy the general feeling of the book: claustrophobia inside something that grows infinitely. That's the best way I can describe it, but I haven't gotten more than a quarter of the way through. I do need to go back and fully read it.</p>
]]></description><pubDate>Tue, 07 Apr 2026 01:41:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47669735</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47669735</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47669735</guid></item><item><title><![CDATA[New comment by progmetaldev in "The cult of vibe coding is dogfooding run amok"]]></title><description><![CDATA[
<p>I mean, hasn't it learned from reading others' code? I don't think it can be any better than the common patterns and practices it has been trained on. Some outlier of amazing code is probably not going to make much of a difference, unless I am completely misunderstanding LLMs (which I very well may be, and would gladly take any criticism of my take here).</p>
]]></description><pubDate>Tue, 07 Apr 2026 01:36:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=47669703</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47669703</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47669703</guid></item><item><title><![CDATA[New comment by progmetaldev in "The cult of vibe coding is dogfooding run amok"]]></title><description><![CDATA[
<p>To me, it instead sounds like you care about the code you produce. You judge it more harshly than you probably do other code. It sounds like you are also meeting deadlines, so I'd call that a success, and more productive than what a lot of people tend to put out into the world.<p>I often have a lot of time between projects, and am able to really think about things and write code that I'm happy with. Even when I do that, I do some more research, or work on another project, and immediately I'm picking apart sections of my code that I really took the time to "get right." Sometimes it can be worse if you are given vast amounts of time to build your solution, where some form of deadline might have pushed you to make the decisions you were able to put off. At least that's my perspective on it: I feel like if you love writing software, you are going to keep improving nearly constantly, and look back at what you've done and be able to pick it apart.<p>To keep myself from getting too distressed looking at past code now, I tend to look at the overall architecture and success of the project (in terms of performing what it was supposed to, not necessarily monetarily). If I see a piece of code that I feel could have been written far better, I look at how it fits into the rest. I tend to work on very small teams, and am often making architecture decisions that touch large areas of the code, so this may just be my perspective from not working on a large team. I still think that if you care about your craft, you will be harsher on yourself than you deserve.</p>
]]></description><pubDate>Tue, 07 Apr 2026 01:32:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=47669686</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47669686</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47669686</guid></item><item><title><![CDATA[New comment by progmetaldev in "IBM Announces Strategic Collaboration with Arm"]]></title><description><![CDATA[
<p>Fresh out of college, I had an interview for a job working with COBOL. There were classes being held to teach people development, as well as how to maintain existing COBOL software. It came down to myself and another recruit, and that other recruit already had COBOL experience. Naturally, I was not chosen over someone who already had knowledge of the working language being used.<p>Although I probably don't make nearly as much as that COBOL developer over 20 years later, I would be willing to bet that I am happier and haven't locked myself into a specific technology the way that developer probably has. Money is great, but if you actually care about what you do, I expect that being stuck on the same codebase for years isn't too satisfying (at least on code you didn't have a hand in creating from the very start). Too many people translate money into happiness, and I guess there is a balance there, but usually it's not possible to maintain happiness based on money when you do the same thing day in and day out.</p>
]]></description><pubDate>Fri, 03 Apr 2026 05:25:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=47623441</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47623441</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47623441</guid></item><item><title><![CDATA[New comment by progmetaldev in "IBM Announces Strategic Collaboration with Arm"]]></title><description><![CDATA[
<p>I think a large number of people seem to forget the trust that companies have built in IBM over decades. The mainframe market is IBM, and IBM already had a hold there. People want to believe that dropping such a large company could be done with a rewrite, but as long as IBM is there to support what they already have in place, it's unlikely companies will move away. Obviously a team with experience moving away from IBM technology to something more "modern" could go with another platform running on different hardware, but you don't hear about those migrations much because they are rare (for a reason: IBM also offers support that companies love to cling to).<p>I don't blame companies that are already tied up in IBM tech for sticking with what they have. As boring and dated as IBM tech might be, it's still running a ton of infrastructure, and you don't get to be that kind of company without being solid and reliable. That's what companies want, even if a development team wants to flex their skills in something new and not tied to IBM.</p>
]]></description><pubDate>Fri, 03 Apr 2026 05:12:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=47623386</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47623386</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47623386</guid></item><item><title><![CDATA[New comment by progmetaldev in "'Backrooms' and the Rise of the Institutional Gothic"]]></title><description><![CDATA[
<p>When I was younger, I went into some places that I shouldn't have (legally). For me, something scarier is walking through offices and tunnels that look completely deserted, and then coming upon an office or room where it's clear someone has recently been there. Whether it was someone homeless looking for a place to stay, or an employee that's still on the payroll, both would freak me out far more than a completely empty space.<p>I started House of Leaves a couple of times, but I always end up spending more time online than reading fiction. I need to actually sit down and read it. I used to read lots of fiction before my addiction to technology. Every five years or so, I go back and read Hermann Hesse's Steppenwolf, but it's a fairly small book to get through. I bought the House of Leaves version that has the color coding and bizarre layout (not sure if that was always in every version). I suppose I spend enough time watching horror movies and playing odd games that I could afford the time to read House of Leaves.<p>I do watch a lot of amateur found footage films on YouTube, along with analog horror. I remember when The Blair Witch Project first came out, and it reminded me of strange dreams and nightmares I've had, and I think that's part of where the attraction to liminal spaces comes from. It's something humans can relate to, but it's harder to put a specific label on the feeling you get when consuming this type of content.</p>
]]></description><pubDate>Fri, 03 Apr 2026 01:07:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=47622212</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47622212</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47622212</guid></item><item><title><![CDATA[New comment by progmetaldev in "Axios compromised on NPM – Malicious versions drop remote access trojan"]]></title><description><![CDATA[
<p>It's more to do with the standard library being so barren of common application needs, and developers looking for a solution that the community has gotten behind. Axios has been a common dependency in many codebases because it is a solid solution that many have already used. Every developer could try building all the libraries they would otherwise reach for themselves, but then each company has taken on the task of ensuring its own (much larger) codebase is free from security issues, on top of taking care of its own features and bugs.</p>
]]></description><pubDate>Tue, 31 Mar 2026 11:26:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=47585775</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47585775</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47585775</guid></item><item><title><![CDATA[New comment by progmetaldev in "How to turn anything into a router"]]></title><description><![CDATA[
<p>Isn't the issue that a lot of these devices have vulnerabilities and aren't updated often enough, rather than the devices being of Chinese origin? Look at hardware for the home market: most of it hasn't received an update in years, if not a decade. Widely deployed hardware with out-of-date software seems like it just takes a script crawling home IP address space, like a Metasploit module, no?<p>Maybe I'm misunderstanding the link to Chinese vs. non-Chinese router vendors?</p>
]]></description><pubDate>Tue, 31 Mar 2026 05:20:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=47583036</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47583036</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47583036</guid></item><item><title><![CDATA[New comment by progmetaldev in "How to turn anything into a router"]]></title><description><![CDATA[
<p>Sometimes a client that isn't too difficult is worth keeping if they come at you with projects that expand your knowledge.</p>
]]></description><pubDate>Tue, 31 Mar 2026 05:04:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=47582947</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47582947</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47582947</guid></item><item><title><![CDATA[New comment by progmetaldev in "How to turn anything into a router"]]></title><description><![CDATA[
<p>Squid caching takes me back. I was dealing with the network for a large car dealership (2006), and they were having issues with pages appearing out of date, as well as salespeople who couldn't help themselves from looking at adult websites. I had to figure out the entire network (it was put in place before I ever showed up to provide support), which included both the physical and software layers. Not only was I on ladders in the service area using a network tone device (for those that don't know, you connect a device that pushes a tone down the line, then run a probe along the wires at the other end and listen for that tone to find the correct one), but I also had to figure out this server running a Squid cache that stood in front of everything.<p>Eventually I got all the devices mapped from origin to their patch cables in the server room, and I started looking into the Squid cache. It turned out that they were caching everything, as well as blocking websites. I figured out what websites the staff needed to do their jobs and turned off caching for those, while also learning the ACLs for blocking websites. Anything else was allowed, but the Squid cache would hold a copy for some set amount of time (I think it was 24 hours, so if a page was legitimate they only had to wait a day, and it also saved quite a bit of bandwidth - although I think this was used more to monitor user activity).<p>It was frustrating as someone new to large LANs, as well as to in-house caching, but I had been using Linux since an early version of Slackware in the late 1990s. Even to this day, as someone that writes software and does DevOps, that knowledge has helped my debugging skills tremendously. Dealing with caching is a skill I feel you need to be burned by in order to finally understand it, and recognize when it's occurring.<p>I cut my teeth on Linux through a teacher that set up a web server in 1997, and not only gave students access to upload their web files, but also a terminal to message each other and see who was online.</p>
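<p>For anyone curious what that kind of setup looks like, here is a minimal squid.conf sketch combining a site block list, selective no-cache rules, and a 24-hour cache lifetime. All domain names and the subnet are hypothetical stand-ins, not the dealership's actual config, and only a handful of directives are shown.</p>

```
# Hypothetical ACLs - illustrative only, not the original config.
# Block a list of unwanted sites for everyone.
acl blocked_sites dstdomain .blocked-example.com .another-blocked.example
http_access deny blocked_sites

# Sites the sales staff need fresh every time: never serve from cache.
acl live_sites dstdomain .inventory.example.com
cache deny live_sites

# Everything else may be cached, subject to the refresh rule below.
cache allow all

# Keep cached copies for up to 24 hours (1440 minutes).
refresh_pattern . 0 20% 1440

# Allow the local LAN, deny everyone else.
acl lan src 192.168.0.0/24
http_access allow lan
http_access deny all
```

<p>Squid evaluates http_access rules top to bottom and applies the first match, so the deny for blocked_sites has to come before the allow for the LAN.</p>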
]]></description><pubDate>Tue, 31 Mar 2026 02:18:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=47582020</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47582020</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47582020</guid></item><item><title><![CDATA[New comment by progmetaldev in "America Is Now a Rogue Superpower"]]></title><description><![CDATA[
<p>I think the problem is that the "deep state" really came into public consciousness with Trump, on his first run. While I agree with your definition of the deep state, that is not what most people think of these days, and Trump is probably the deepest of deep state you can legally be. He ran against the deep state while being deeply embedded inside it. It was just easier to pass off because he wasn't a politician (at least from an American point of view; I'm not sure of your country of origin).</p>
]]></description><pubDate>Mon, 30 Mar 2026 23:44:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47581065</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47581065</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47581065</guid></item><item><title><![CDATA[New comment by progmetaldev in "90% of Claude-linked output going to GitHub repos w <2 stars"]]></title><description><![CDATA[
<p>I agree with this. I've just seen a huge pile-on against Microsoft over Azure in regards to this GitHub migration. There are already plenty of legitimate reasons to be upset with Microsoft without needing to drag in Azure.</p>
]]></description><pubDate>Mon, 30 Mar 2026 20:04:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=47579076</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47579076</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47579076</guid></item><item><title><![CDATA[New comment by progmetaldev in "90% of Claude-linked output going to GitHub repos w <2 stars"]]></title><description><![CDATA[
<p>I keep hearing this, and I know Azure has had some issues recently, but I rarely have an issue with Azure like I do with GitHub. I have close to 100 websites on Azure, running on .NET, mostly on Azure App Service (some on Windows Server 2016 VMs). These sites don't see the kind of traffic or number of features that GitHub has, but if we're talking about Azure being the issue, I wonder if I just don't see it because there aren't enough people dependent on these sites compared to GitHub?<p>Or instead, is it mistakes made migrating to Azure, rather than Azure itself being the problem? Changing providers can be difficult, especially if you relied on any proprietary services from the old provider.</p>
]]></description><pubDate>Wed, 25 Mar 2026 22:35:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=47524204</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47524204</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47524204</guid></item><item><title><![CDATA[New comment by progmetaldev in "Local Stack Archived their GitHub repo and requires an account to run"]]></title><description><![CDATA[
<p>Does the amount of labor that was provided by a community make a difference? What if it was minimal? Where do you draw the line (any piece of code accepted, or a "large portion" of code)?<p>I didn't downvote you, but I suspect lumping PRs together with issues is what most people have an issue with. Issues obviously help to improve software, but only indirectly, through the fixing or writing of code.<p>Maybe I'm in the minority, but I also think that if it were a requirement to never close-source your project after it's already been open-sourced, we'd have far fewer open source projects available. Often a project is created on a company's dime and open-sourced to draw attention to the developers' skills and ability to solve a problem. If the code were legally disallowed from being close-sourced in the future, we might see far less code available universally. A working repository of code is potentially a reference for another developer to learn something new. I don't have any examples on hand, but I know for a fact that I've read code that had been open source and was later close-sourced, and learned something from the open source version (even if it was out of date for the latest libraries/platform).</p>
]]></description><pubDate>Tue, 24 Mar 2026 02:25:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=47497963</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47497963</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47497963</guid></item><item><title><![CDATA[New comment by progmetaldev in "Local Stack Archived their GitHub repo and requires an account to run"]]></title><description><![CDATA[
<p>I agree, along with the child comment. I think the issue is that if there weren't some kind of ability to "rug pull," we would see far fewer open source contributions in the first place.<p>I hate that a company can take a fully open-source project, turn it into a commercial offering, and drop support for the project's open source model. I am fine with a project's maintainers stopping support for a project because they have other things to deal with, or are just burnt out. I understand that both of these things are allowed under the specific license you choose, and I still believe you should have the freedom to do what was done here (while not agreeing with what was done, I still think it should be allowed). If you want to guarantee your code is allowed to live on as fully open, you pick that license. If you don't, but want to contribute as a means of selling your talent, I still think the world would have far less software if this was discouraged. The source from before the license change is still legal, and I feel that even if the project doesn't get forked, it is still there for others to learn from.<p>With that said, I'm wondering if there has ever been a legal case where source was previously fully open, then became closed, and someone was taken to court over using portions of the code that were previously open. It seems cut and dried that such a case would be thrown out, but what if the code was referenced and then rewritten? What if there was code in the open source version that obviously needed to be rewritten, but the authors closed the source, and then someone did the obvious rewrite? This is more of a thought experiment than anything, but I wonder if there's any precedent for this, or if you'd just have to put up the money for attorneys to prove that it was an obvious change?</p>
]]></description><pubDate>Tue, 24 Mar 2026 02:12:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47497885</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47497885</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47497885</guid></item><item><title><![CDATA[New comment by progmetaldev in "Building a TB-303 from Scratch"]]></title><description><![CDATA[
<p>You could always invite him over to check it out, just to see how he reacts to it, and if he's interested more in the tech aspect or the music aspect. Due to the increased rarity of the device, you'd probably want to find out if he would actually use the device, or try taking it apart to see how it works. I'm not sure how old your daughter is, but you could try asking her if she would be upset if you allowed the neighbor to play with the device, just to avoid any ill feelings.<p>It sounds like you've got some great options either way. I wish I had a neighbor growing up that had cool music gear (although I did get to grow up with a dad that got me into computers before I could read, so that definitely built my love for technology). Sounds like you're the kind of dad more kids these days need in their lives.</p>
]]></description><pubDate>Wed, 11 Mar 2026 20:26:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=47341050</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=47341050</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47341050</guid></item><item><title><![CDATA[New comment by progmetaldev in "GitHub is down again"]]></title><description><![CDATA[
<p>There are still some processes that require a waterfall method for development, though. One example would be if you have a designer, and also a front-end developer who is waiting for a feature to be complete before coming in and starting their development. I know on HN it's common for people to be full-stack developers, or for front-end developers to be able to work with a mockup and write the code before a designer gets involved, but there are plenty of companies that don't work that way. Even if a company is working in an agile manner, there still may come a time where work stalls until some part of a system is finished by another team or team member, especially in a monorepo. Of course they could change the organization of their project, but the time suck of doing that (like going with microservices) is probably going to waste quite a bit more time than GitHub's downtime does.</p>
]]></description><pubDate>Mon, 09 Feb 2026 18:57:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=46949326</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=46949326</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46949326</guid></item><item><title><![CDATA[New comment by progmetaldev in "10 Years of Let's Encrypt"]]></title><description><![CDATA[
<p>Dun and Bradstreet (?). I believe I'm remembering this correctly. I still deal with a few financial institutions that insist on using an EV SSL certificate on their websites. I may be wrong, but I believe that having an EV SSL certificate comes with a larger insurance payout should security be compromised through the EV certificate (although I imagine that would be nearly impossible to prove).<p>When I last reissued an EV SSL certificate (recently), I had to create a CNAME record to prove domain ownership, as well as provide the financial institution's CEO's information, which they matched against Dun &amp; Bradstreet and then called to confirm. The entire process took about three days to complete.</p>
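<p>For anyone who hasn't been through it, the domain-control step is just publishing a DNS record the CA hands you. A hypothetical zone-file line might look like this (the host label, token, and CA domain are all invented, not from the actual issuance):</p>

```
; Hypothetical CA validation record - the CA supplies the exact
; host label and target; these values are made up for illustration.
_dnsauth.example-bank.com.  3600  IN  CNAME  a1b2c3d4e5.validation.example-ca.com.
```

<p>The CA then looks up that name, sees it resolves to the token it issued, and treats that as proof you control the zone; the identity and D&amp;B checks happen separately.</p>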
]]></description><pubDate>Wed, 10 Dec 2025 00:47:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=46212703</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=46212703</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46212703</guid></item><item><title><![CDATA[New comment by progmetaldev in "Why WinQuake exists and how it works"]]></title><description><![CDATA[
<p>This is a great write-up for those of us who were into Quake when it was released. Tuning your performance was a huge undertaking back when you were trying to run Quake while also running Windows 95. I got into Quake because of all the available map-editing tools you could use with it, and the multiplayer aspect, which previously had been very difficult to get working without a LAN.</p>
]]></description><pubDate>Thu, 04 Dec 2025 08:04:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=46144981</link><dc:creator>progmetaldev</dc:creator><comments>https://news.ycombinator.com/item?id=46144981</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46144981</guid></item></channel></rss>