<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: spease</title><link>https://news.ycombinator.com/user?id=spease</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Fri, 17 Apr 2026 06:15:24 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=spease" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by spease in "Fluorite – A console-grade game engine fully integrated with Flutter"]]></title><description><![CDATA[
<p>Yeah, I cannot understand why people think a gigabyte of RAM is needed in this context, unless they’re imagining what this would take with a Python HTTPS server streaming video via WebRTC to an Electron GUI running out of local Docker containers or something. A gigabyte ought to be enough memory for an hour of compressed video.<p>It’s like saying your family of four is going to take a vacation, so you might need to reserve an entire Hyatt for a week, rather than a single room in a Motel 6.</p>
]]></description><pubDate>Thu, 12 Feb 2026 21:39:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=46995618</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46995618</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46995618</guid></item><item><title><![CDATA[New comment by spease in "Fluorite – A console-grade game engine fully integrated with Flutter"]]></title><description><![CDATA[
<p>I really feel like a lot of the people objecting in this thread have only written web apps in Python, and their closest experience with the audio-visual space is WebRTC.<p>Tech for cars <i>is</i> “standard-sized”. Not everything revolves around datacenters and tech; the car industry easily predates the computer industry, and it operates on much tighter margins under much stricter regulations.<p>So a smaller, simpler chip that ultimately consumes fewer physical resources at scale and is simpler to test is better when you’re planning to sell millions of units and you need to prove that it isn’t going to fail and kill somebody. Or, if it does fail and kill somebody, it’s simpler to analyze to figure out why that happened. You’ve also got to worry about failure modes like a separate RAM module not being seated properly at the factory and slipping out of the socket someday while the car is in motion.<p>Now - yes, modern cars have gotten more complex, and are more likely to run some software on Linux rather than an RTOS or ASIC. But the original complaint was that a backup camera adds non-negligible complexity / cost.<p>For a budget car where that would even make sense, you’re expecting to sell at high volume and basically nothing else requires electronics. So sourcing 1 GB RAM chips and a motherboard to slot them into would be complete overkill and probably a regulatory nightmare, when you could just buy an off-the-shelf industrial-grade microcontroller package that gets fabbed en masse, dozens or hundreds of units to a single silicon wafer.</p>
]]></description><pubDate>Thu, 12 Feb 2026 21:24:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=46995416</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46995416</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46995416</guid></item><item><title><![CDATA[New comment by spease in "Fluorite – A console-grade game engine fully integrated with Flutter"]]></title><description><![CDATA[
<p>The original comment was complaining that backup cameras seemingly add significant electronics requirements.<p>In practice, you’re not going to tie intimate knowledge of the matrix headlights into the infotainment system; that’s just bad engineering. At most it would know how to switch them on and off, plus maybe a few high-level settings like brightness or color or some kind of frequency adjustment - not managing every single LED. I can’t imagine a budget car ever exposing all that to the end user anyway. Even if you did, it would take a legendarily bad implementation to require a gigabyte of RAM to manage dozens of LEDs. Like, is it launching a separate Node instance exposing a separate HTTPS port for every LED at that point?<p>Ditto for the satellite radio. That can be, and probably is, a separate module, and it’s more of a radio / AV domain piece of tech that’s going to operate in a world that historically hasn’t had the luxury of gigabytes of RAM.<p>Sensors - if this is a self-driving car with 3D LIDAR and 360-degree image sensors, the backup camera requirement is obviously negligible.<p>Remember, we had TV for most of the 20th century, before <i>integrated circuits</i> even existed, let alone computers and RAM. We didn’t magically lose the ability to send video around without the luxury of storing hundreds of frames’ worth of data.<p>Yeah, at some point it makes more sense to make or grab a chip with slightly more RAM so it has more market reach, but cars are manufactured at a scale where they are actually drivers of microcontroller technology. We’re talking about a few dollars for a chip in a car sold for thousands of dollars used, or tens of thousands new.<p>There is just no way that adding a backup camera is an existential issue for a product line.</p>
]]></description><pubDate>Thu, 12 Feb 2026 21:00:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=46995088</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46995088</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46995088</guid></item><item><title><![CDATA[New comment by spease in "Fluorite – A console-grade game engine fully integrated with Flutter"]]></title><description><![CDATA[
<p><a href="https://wiki.st.com/stm32mpu/wiki/How_to_display_on_HDMI" rel="nofollow">https://wiki.st.com/stm32mpu/wiki/How_to_display_on_HDMI</a><p>But mostly it’s the fundamental problem space from an A/V perspective. You don’t need iPhone-grade image processing - you just need to convert the raw signal from the CMOS sensor to some flavor of YUV or RGB and get it over to the screen via whatever interface the screen exposes.<p>Broadcast HD was designed to be compatible with essentially stateless one-way transmission over the air. And that was a follow-on to analog encodings whose timing was laid down by the scanning CRT gun, derived from dividing the power-line frequency, in an era when 1 GB of RAM would have been sci-fi. We still use 29.97 / 59.94 fps because the color signal was shimmed into the 30 fps black-and-white standard when color TV was introduced in the 1950s - that’s how tight this domain is.</p>
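The 29.97 / 59.94 figures fall straight out of the arithmetic: adding the color subcarrier scaled the nominal 30 fps monochrome rate down by a factor of 1000/1001. A minimal sketch of those standard NTSC constants:

```rust
// NTSC timing arithmetic: introducing the color subcarrier forced the
// nominal 30 fps monochrome frame rate down by a factor of 1000/1001.
fn ntsc_frame_rate() -> f64 {
    30.0 * 1000.0 / 1001.0 // ≈ 29.97 fps
}

fn ntsc_field_rate() -> f64 {
    2.0 * ntsc_frame_rate() // ≈ 59.94 fields/s (two interlaced fields per frame)
}

fn main() {
    println!("frame rate: {:.4} fps", ntsc_frame_rate());
    println!("field rate: {:.4} fields/s", ntsc_field_rate());
}
```

The point being that the whole timing chain is fixed-function arithmetic, not something that ever assumed large frame buffers.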
]]></description><pubDate>Thu, 12 Feb 2026 20:41:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=46994820</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46994820</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46994820</guid></item><item><title><![CDATA[New comment by spease in "Fluorite – A console-grade game engine fully integrated with Flutter"]]></title><description><![CDATA[
<p>A gigabyte!?<p>You shouldn’t need any dedicated RAM. A decent microcontroller should be able to handle converting the output from the camera to the display and also provide infotainment software that talks to the CAN bus or Ethernet.<p>And the bare minimum is probably just a camera and a display.<p>Even buffering a full HD frame would only require a few megabytes.<p>Pretty sure the law doesn’t require an Electron app running a VLM (yet), which is about the only thing that would justify anything approaching gigabytes of RAM.</p>
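The “few megabytes” claim checks out with back-of-the-envelope math (assuming 24-bit RGB; real automotive pipelines often use a YUV format, which is smaller still):

```rust
// Size of one uncompressed frame buffer: width × height × bytes per pixel.
fn frame_bytes(width: usize, height: usize, bytes_per_pixel: usize) -> usize {
    width * height * bytes_per_pixel
}

fn main() {
    // Full HD at 24-bit RGB: about 6 MB per frame.
    let rgb = frame_bytes(1920, 1080, 3);
    // YUV 4:2:2 averages 2 bytes/pixel: about 4 MB per frame.
    let yuv422 = frame_bytes(1920, 1080, 2);
    println!("1080p RGB frame:     {:.1} MB", rgb as f64 / 1_000_000.0);
    println!("1080p YUV 4:2:2 frame: {:.1} MB", yuv422 as f64 / 1_000_000.0);
}
```

Even triple-buffering a 1080p RGB stream stays under 20 MB, orders of magnitude below a gigabyte.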
]]></description><pubDate>Wed, 11 Feb 2026 20:42:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=46980620</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46980620</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46980620</guid></item><item><title><![CDATA[New comment by spease in "Rust’s Standard Library on the GPU"]]></title><description><![CDATA[
<p>There was a library for Rust called “faster” which worked similarly to Rayon, but for SIMD.<p>The simpleminded way to do what you’re saying would be to have the compiler create separate PTX and native versions of a Rayon structure, and then choose which to invoke at runtime.</p>
]]></description><pubDate>Wed, 28 Jan 2026 10:36:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=46793588</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46793588</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46793588</guid></item><item><title><![CDATA[New comment by spease in "AISLE’s autonomous analyzer found all CVEs in the January OpenSSL release"]]></title><description><![CDATA[
<p>“…just think, Wally, everything that makes this thing go was supplied by the lowest bidder.”<p>- astronaut</p>
]]></description><pubDate>Wed, 28 Jan 2026 04:22:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=46791021</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46791021</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46791021</guid></item><item><title><![CDATA[New comment by spease in "AISLE’s autonomous analyzer found all CVEs in the January OpenSSL release"]]></title><description><![CDATA[
<p>> Surely everyone would want such a key piece of technology to be air tight and easy to debug<p>1. Tragedy of the Commons (<a href="https://en.wikipedia.org/wiki/Tragedy_of_the_commons" rel="nofollow">https://en.wikipedia.org/wiki/Tragedy_of_the_commons</a>) / Bystander Effect (<a href="https://en.wikipedia.org/wiki/Bystander_effect" rel="nofollow">https://en.wikipedia.org/wiki/Bystander_effect</a>)<p>2. In practice, the risk of <i>introducing</i> a breakage probably makes upstream averse to refactoring for aesthetics alone; you’d need to prove that there’s a functional bug. But of course, you’re less likely to notice a functional bug if the aesthetic is so bad you can’t follow the code. And when people need a new feature, that will get shoehorned in while changing as little code as possible, because nobody fully understands why everything is there. Especially when execution speed is a potential attack vector.<p>So maybe shades of the trolley problem too - people would rather passively let multiple bugs exist, than be actively responsible for introducing one.</p>
]]></description><pubDate>Wed, 28 Jan 2026 04:17:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=46790987</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46790987</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46790987</guid></item><item><title><![CDATA[New comment by spease in "Fedora Asahi Remix is now working on Apple M3"]]></title><description><![CDATA[
<p>Just because the fed exists doesn’t mean the entire economy shuts down with the government.<p>It depends on how it’s structured.</p>
]]></description><pubDate>Mon, 26 Jan 2026 22:37:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=46772617</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46772617</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46772617</guid></item><item><title><![CDATA[New comment by spease in "Gemini 3 Pro: the frontier of vision AI"]]></title><description><![CDATA[
<p>> It is the first model to get partial-credit on an LLM image test I have. Which is counting the legs of a dog. Specifically, a dog with 5 legs. This is a wild test, because LLMs get really pushy and insistent that the dog only has 4 legs.<p>I wonder if “How many legs do you see?” is close enough to “How many lights do you see?” that the LLMs are responding based on the memes surrounding the Star Trek episode “Chain of Command”.<p><a href="https://youtu.be/S9brF-wlja8" rel="nofollow">https://youtu.be/S9brF-wlja8</a></p>
]]></description><pubDate>Sat, 06 Dec 2025 11:51:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=46172600</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=46172600</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46172600</guid></item><item><title><![CDATA[New comment by spease in "Valdi – A cross-platform UI framework that delivers native performance"]]></title><description><![CDATA[
<p>I started with desktop applications, so my go-to for GUI work has been Qt, especially QML. It works on Windows / macOS / Linux as well as iOS and Android. I think there’s now a way to compile QML to WebAssembly as well. It also has a ton of support classes that are loosely analogous to the various *Kit frameworks supplied on iOS and Android.<p>The downside is that the core of Qt is in C++, so it’s mostly seen in (or used for?) embedded contexts.<p>I recently used Slint as well, which isn’t anywhere near as mature, but is at least written in Rust and has some type-safety benefits.<p>SwiftUI is pretty good too, and I wish I got to work on Apple platforms more.<p>To me, the simplicity of creating a “Button” when you want a button makes more sense than a React component that’s a div styled by layers of CSS and brought to life by JavaScript.<p>But I’m kind of bummed that I started down that route (well, that and writing partial UI systems for game / media engines a few times), because most people learned web apps and the DOM, and that’s made it harder to get the kind of work I identify with.<p>So it’s hard for me to recommend Qt due to the career implications… but at the same time, for the projects I’ve worked on, it’s made a smaller amount of work go a longer way, with a more native feel than Electron apps seem to have.</p>
]]></description><pubDate>Sun, 09 Nov 2025 11:29:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=45864796</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=45864796</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45864796</guid></item><item><title><![CDATA[New comment by spease in "Linux phones are more important now than ever"]]></title><description><![CDATA[
<p>Yeah, they could call it Tizen or something.</p>
]]></description><pubDate>Tue, 16 Sep 2025 02:54:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=45257534</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=45257534</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45257534</guid></item><item><title><![CDATA[New comment by spease in "Experimenting with Local LLMs on macOS"]]></title><description><![CDATA[
<p>Yes. And everyone is glossing over the benefit of unified memory for LLM applications. Apple may not have the models, but it has customer goodwill, a platform, and the logistical infrastructure to roll them out. It probably even has the cash to buy some AI companies outright; maybe not the big ones (for a reasonable amount, anyway), but small-to-midsize ones with domain-specific models that could be combined.<p>Not to mention the “default browser” leverage it has with iPhones, iPads, and Watches.</p>
]]></description><pubDate>Tue, 09 Sep 2025 04:48:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=45177531</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=45177531</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45177531</guid></item><item><title><![CDATA[New comment by spease in "I tried Servo"]]></title><description><![CDATA[
<p>Slint recently added Bevy support. I’ve been keeping an eye on it since I’ve used Qt and love working in QML.</p>
]]></description><pubDate>Thu, 31 Jul 2025 16:08:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=44747065</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=44747065</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44747065</guid></item><item><title><![CDATA[New comment by spease in "Web fingerprinting is worse than I thought (2023)"]]></title><description><![CDATA[
<p>Whose interests corporations act in is not arbitrary, it’s tied to how they make money.<p>Meta and Google make their money primarily from advertisers, Apple makes money from consumers buying iPhones. One of the upsides to paying for something is that the company is incentivized to keep you paying or get you to pay more.<p>Something I remind people who buy cheaper Android phones and then complain about ads - the OS development is being subsidized by those ads. From Google’s perspective, securing their revenue stream is the justification for Chrome and Android’s existence. It’s not a purely altruistic move to fund their open source development.<p>Charts of the revenue stream for some major tech companies:<p><a href="https://www.visualcapitalist.com/charted-how-does-meta-make-money/" rel="nofollow">https://www.visualcapitalist.com/charted-how-does-meta-make-...</a><p><a href="https://www.visualcapitalist.com/alphabets-revenue-breakdown-in-2024/" rel="nofollow">https://www.visualcapitalist.com/alphabets-revenue-breakdown...</a><p><a href="https://www.visualcapitalist.com/charted-how-apple-makes-its-391b-in-revenue/" rel="nofollow">https://www.visualcapitalist.com/charted-how-apple-makes-its...</a><p><a href="https://www.visualcapitalist.com/how-amazon-makes-its-billions/" rel="nofollow">https://www.visualcapitalist.com/how-amazon-makes-its-billio...</a><p><a href="https://www.visualcapitalist.com/how-microsoft-makes-its-billions/" rel="nofollow">https://www.visualcapitalist.com/how-microsoft-makes-its-bil...</a><p>Older aggregate chart:<p><a href="https://www.visualcapitalist.com/how-big-tech-makes-their-billions-2022/" rel="nofollow">https://www.visualcapitalist.com/how-big-tech-makes-their-bi...</a></p>
]]></description><pubDate>Thu, 24 Jul 2025 18:18:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=44674140</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=44674140</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44674140</guid></item><item><title><![CDATA[New comment by spease in "Peep Show is the most realistic portrayal of evil I have seen (2020)"]]></title><description><![CDATA[
<p>Reminds me of “The Bully and the Beast” by Orson Scott Card.</p>
]]></description><pubDate>Mon, 21 Jul 2025 08:28:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=44632941</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=44632941</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44632941</guid></item><item><title><![CDATA[New comment by spease in "Trying Guix: A Nixer's impressions"]]></title><description><![CDATA[
<p>Personally, I’d really like a cross-platform declarative package manager in a mainstream or mainstream-style language, where the nixpkgs equivalent can be JITed or AOTed, including the shell scripts, so it isn’t painful to work with and can switch into an environment almost instantly.<p>Nix the language isn’t that complex syntactically; it’s really the way that nixpkgs and things like overrides are implemented, the lack of a standard interface between environments on Darwin and NixOS, needing overlays with multiple levels of depth, etc., that makes things complex.<p>The infuriating thing about Nix is that it’s functionally capable of doing what I want, but it’s patently obvious that the people at the wheel are not particularly inclined to design for a casual user who can’t keep a hundred memorized idiosyncrasies in their head just to work on their build scripts.</p>
]]></description><pubDate>Sat, 19 Jul 2025 15:19:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=44616238</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=44616238</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44616238</guid></item><item><title><![CDATA[New comment by spease in "Cancer DNA is detectable in blood years before diagnosis"]]></title><description><![CDATA[
<p>> It would be very expensive to run such a trial, over a long period of time, and the administrators would feel ethically bound to unblind and then report on every tiny incidentaloma, which completely fucks the training process.<p>I wonder if our current research process is only considered the gold standard because doing things probabilistically is the only way we can manage the complexity of the human body to date.<p>It’s like me running an application many, many times with many different configurations and datasets, while scanning some memory addresses at runtime before and after the test runs, to figure out whether a specific bug exists in a specific feature.<p>Wouldn’t it be a lot easier if I could look at the relevant function in the source code and understand its implementation to determine whether the bug was logically possible?<p>We currently can’t decompile the human body or understand the way it’s “implemented”, but tech is rapidly developing tools that could be used for such a thing: either a way to aggregate and reason about more information about the human body than any one person can hold in mind in a lifetime, or a way to simulate it with enough granularity to be meaningful.<p>Alternatively, the double-blindedness of a study might not be as necessary if you can continually and objectively quantify the agreement of the results with the hypothesis.<p>I.e., if your AI model is reporting low agreement while the researchers are reporting high agreement, that could be a signal that external investigation is warranted, or prompt the researchers to question their own biases where they would previously have succumbed to confirmation bias.<p>All of this is fuzzy anyway - we likely won’t ever understand everything at 100% or have perfect outcomes, but if you can cut the overhead of each study by an order of magnitude, you can run more studies to fine-tune the results.<p>Alternatively, you can have an AI passively rerunning studies to verify reproducibility and flag cases where it fails, whereas now the way the system values contributions makes it far less worthwhile for a human author to invest the time, effort, and money. I.e., improve recovery from a bad study a lot quicker, rather than improving accuracy.<p>EDIT: These are probably all ideas other people have had before, so sorry to anyone who reaches the end of my brainstorming and didn’t come out with anything new. :)</p>
]]></description><pubDate>Sat, 19 Jul 2025 15:07:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=44616092</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=44616092</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44616092</guid></item><item><title><![CDATA[New comment by spease in "How to write Rust in the Linux kernel: part 3"]]></title><description><![CDATA[
<p>Yeah, Rust tends to have enough type-safety that you can encode project-specific constraints into it, lowering the learning curve to that of Rust itself rather than all the idiosyncratic, unconscious, and undocumented design constraints you’d otherwise need to abide by to keep from destabilizing things.</p>
]]></description><pubDate>Sat, 19 Jul 2025 14:48:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=44615912</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=44615912</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44615912</guid></item><item><title><![CDATA[New comment by spease in "How to write Rust in the Linux kernel: part 3"]]></title><description><![CDATA[
<p>> I had my journey of using Rust to implement a WinAPI-based app in which I did lots of unsafe (from Rust's point of view) operations. I spent more time on fighting with Rust's protections than on the actual implementation. The result code wasn't safe and looked horrible - totally opposite of what Rust code should look like. And yes, I followed "the Rust way" of doing things.<p>I hate to directly question your last statement, but “tons of unsafe” is a red flag about the way things are being done. You should only need it for direct interactions with a C API, for the lowest level of interacting with bare-metal memory registers, or for the lowest-level high-performance code that’s verging on (or actually using) assembly.<p>I can see how doing Windows API work would cause a lot of FFI usage of unsafe, but that should be contained in the winapi crate or a Windows API wrapper crate that converts things to idiomatic Rust, with the rest of the codebase dealing only in logic.<p>It’s entirely possible you hit an edge case where you were writing glue code that exclusively talked to hand-optimized SSE algorithms or something, but that is exceedingly rare compared to someone experienced with C who’s just starting out, doing things the C way and fighting through unsafe rather than looking for the higher-level patterns that are idiomatic.<p>> I've gone through hundreds of source codes in Kernel, and most of them wouldn't pass code review in my company - cutting-corners everywhere, multipurpose functions, lack of documentation; those are issues which can't be fixed by just changing a language.<p>Except they are mitigated to such a degree that it again makes me doubt you were coding idiomatically.<p>Rust tends to force you to exhaustively handle code paths, so cutting corners requires a lot of explicit opting out.<p>Type-safety tends to help <i>a lot</i> in making multipurpose functions self-verifying.
Function chains come up a lot more in Rust, and type inference works so well because of this.<p>Documentation is a lot easier when you can just have people run “cargo doc” and add a bit of markdown and some doctests that get exercised by CI.<p>> Sure it will protect developers from missing "free()" here and there, but it's gonna bite from another side.<p>RAII is the smallest difference in DX between Rust and C. For me the biggest differences were the tooling, and type-safety making it a lot easier to catch bugs at compile time and enforce the design assumptions I had about the architecture.<p>I don’t necessarily expect it to be easier to write Rust code, but I do expect it to catch more issues up front rather than at runtime or in the field, so you end up doing more code-compile iterations (seconds to minutes) and fewer code-customer iterations (hours to days or weeks).<p>As for C++: Rust has less language surface area to learn before you reach the modern “safe” constructs, and those constructs enforce their own proper usage rather than relying on the developer to, so I’d expect an intermediate Rust developer to write better code than a C++ developer until the latter reaches expert level.</p>
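The pattern being described - quarantining `unsafe` behind a small safe wrapper so the rest of the codebase deals only in logic - might look something like this sketch. The `unsafe fn` here is a hypothetical stand-in for a raw C API (e.g. a WinAPI call), not any real binding:

```rust
// Hypothetical stand-in for a raw C API that reads `len` bytes starting at
// `ptr`; callers must guarantee the pointer/length pair is valid.
unsafe fn raw_read(ptr: *const u8, len: usize) -> Vec<u8> {
    unsafe { std::slice::from_raw_parts(ptr, len) }.to_vec()
}

// Safe idiomatic wrapper: the slice type already guarantees a valid
// pointer/length pair, so the single `unsafe` block lives here instead of
// being scattered through the codebase.
fn read_buffer(buf: &[u8]) -> Vec<u8> {
    // SAFETY: `buf` is a valid slice, so its pointer and length are valid.
    unsafe { raw_read(buf.as_ptr(), buf.len()) }
}

fn main() {
    let data = [1u8, 2, 3];
    assert_eq!(read_buffer(&data), vec![1, 2, 3]);
    println!("safe wrapper ok");
}
```

Callers of `read_buffer` never touch `unsafe` at all, which is the sense in which the FFI crate should absorb the unsafety.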
]]></description><pubDate>Sat, 19 Jul 2025 14:39:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=44615828</link><dc:creator>spease</dc:creator><comments>https://news.ycombinator.com/item?id=44615828</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44615828</guid></item></channel></rss>