<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: BulgarianIdiot</title><link>https://news.ycombinator.com/user?id=BulgarianIdiot</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Fri, 03 Apr 2026 18:13:37 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=BulgarianIdiot" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by BulgarianIdiot in "MS Teams is down"]]></title><description><![CDATA[
<p>You win btw.</p>
]]></description><pubDate>Wed, 28 Jun 2023 16:07:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=36508509</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36508509</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36508509</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Is ORM still an anti-pattern?"]]></title><description><![CDATA[
<p>The abstract concept of ORM is fine; unfortunately, the particular ORMs we use suffer from pretending that all objects and their relationships exist in RAM, that there's a single instance of every entity, and that it's always up to date. An abstraction that isn't merely wrong, but falls apart in your face at every opportunity.<p>This problem is not unique to mapping objects to SQL databases; it applies to mapping objects to anything remote at all, say a GraphQL or REST API.<p>OOP as we presently interpret it is an inherently synchronous, reference-based (or handle-based, rather) paradigm that only works when all your state is local and in RAM.<p>This is why distributed object protocols keep failing. They'll keep failing until OOP reorients to use values for messages and makes references explicit, so their impact is seen and felt (and it's especially seen and felt when you reference an entity on another machine halfway around the world, in terms of lag, fragility, eventual consistency and everything).<p>We see hints of this with value types in Swift and .NET, though I'd say they're rather rudimentary so far. But it's coming. The ideal example of such a setup is Erlang, a language Joe Armstrong called "probably the only OOP language in the world", a statement Alan Kay agrees with (source: <a href="https://www.quora.com/What-does-Alan-Kay-think-about-Joe-Armstrong-claiming-that-Erlang-might-be-the-only-object-oriented-language-and-also-his-thesis-supervisor-s-claim-that-Erlang-is-extremely-object-oriented" rel="nofollow noreferrer">https://www.quora.com/What-does-Alan-Kay-think-about-Joe-Arm...</a> ).</p>
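A minimal sketch of the stale "single instance" illusion, in plain Python with a toy identity map rather than a real ORM (all names here are made up for illustration): two sessions each believe they hold *the* User, and one goes stale the moment the other writes.

```python
class User:
    """A plain entity object, as an ORM would hand back."""
    def __init__(self, uid, name):
        self.id, self.name = uid, name

class Session:
    """Caches one instance per id -- the 'single, always up to date
    object' illusion typical ORM sessions maintain."""
    def __init__(self, db):
        self.db, self.identity_map = db, {}

    def get(self, uid):
        # Return the cached instance if we have one; never refresh it.
        if uid not in self.identity_map:
            row = self.db[uid]
            self.identity_map[uid] = User(uid, row["name"])
        return self.identity_map[uid]

    def save(self, user):
        self.db[user.id] = {"name": user.name}

# The "remote" store, shared by two independent sessions.
db = {1: {"name": "Ada"}}
a, b = Session(db), Session(db)

u1 = a.get(1)          # session a's copy of entity 1
u2 = b.get(1)          # session b's copy of the "same" entity
u2.name = "Grace"
b.save(u2)             # b writes through to the store

print(u1.name)         # Ada -- session a's instance is now stale
print(a.get(1).name)   # Ada -- still cached, not refreshed
```

A real ORM hides this behind caching and lazy loading, but the failure mode is the same once state stops being purely local.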
]]></description><pubDate>Wed, 28 Jun 2023 07:45:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=36503553</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36503553</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36503553</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Why a browser and mail combination is worth it"]]></title><description><![CDATA[
<p>At this point, browsers are more complicated than the operating systems we had back when browser+mail suites were the norm.<p>And I bet Gmail.com running in a modern browser is more complicated than a mail client from that time.<p>As such, browsers are now an app platform. They don't need to ship with prebaked apps like mail. They need APIs instead: a powerful runtime, a visualization layer, background services, notifications... and they have them.</p>
]]></description><pubDate>Tue, 27 Jun 2023 09:47:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=36490831</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36490831</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36490831</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "PSA: Twitter one-click dev subscriptions"]]></title><description><![CDATA[
<p>Chargeback from your CC provider/bank?</p>
]]></description><pubDate>Tue, 20 Jun 2023 19:50:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=36409511</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36409511</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36409511</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Representing Enums in PostgreSQL"]]></title><description><![CDATA[
<p>I find it odd how many schema changes in modern RDBMSes must be done on the whole table at once. You can split a table into chunks and recode each chunk gradually, in a way that doesn't change the meaning of the data (so no downtime) but removes dead entries like retired enum values.<p>In a way, you've described how to emulate this process by hand. The question is: why the heck wouldn't databases do this themselves? Same with adding and dropping columns.<p>Consider how PostgreSQL encodes NULLs, for example: it skips them as fields in the row and records them in a null bitmap at the front of the row. Meaning rows are not uniformly sized; there's no offset = row * rowsize + field_offset style of addressing for reading a field in PG, where recoding some of the rows would break the entire table.<p>And yet we have all those huge monolithic operations that need to be done atomically. So weird.</p>
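The chunked recoding described above can be sketched in a few lines. A rough illustration using SQLite as a stand-in (the table, column, and enum values are made up): an obsolete enum value is rewritten in small batches, so no single statement touches, or locks, the whole table at once.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
# Seed 1000 rows: half carry the dead enum value 'pending_old'.
conn.executemany(
    "INSERT INTO orders (status) VALUES (?)",
    [("pending_old",) if i % 2 else ("shipped",) for i in range(1000)],
)

BATCH = 100
while True:
    with conn:  # each batch commits as its own short transaction
        cur = conn.execute(
            "UPDATE orders SET status = 'pending' WHERE id IN ("
            "  SELECT id FROM orders WHERE status = 'pending_old' LIMIT ?)",
            (BATCH,),
        )
    if cur.rowcount == 0:
        break  # nothing left to recode

remaining = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status = 'pending_old'"
).fetchone()[0]
print(remaining)  # 0
```

Because each batch is its own small transaction, concurrent readers and writers only ever contend with one short-lived lock at a time, which is exactly the property the big atomic ALTERs lack.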
]]></description><pubDate>Tue, 20 Jun 2023 17:19:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=36407321</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36407321</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36407321</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Read the new Twitter CEO’s first email to employees"]]></title><description><![CDATA[
<p>Elon would rather lose 50 billion (and yes, that includes what he owes the bank and his co-investors) than admit defeat. He sees his "genius who makes no mistakes, it's all 6D chess" image as his primary moneymaker.</p>
]]></description><pubDate>Mon, 12 Jun 2023 22:29:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=36302198</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36302198</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36302198</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Read the new Twitter CEO’s first email to employees"]]></title><description><![CDATA[
<p>What do you mean ruined your objectivity? It seems like it repaired it.</p>
]]></description><pubDate>Mon, 12 Jun 2023 22:27:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=36302180</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36302180</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36302180</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "A developer's view of Vision Pro"]]></title><description><![CDATA[
<p>I realize, yes. And do you realize that if you're using a keyboard and mouse, you may as well not literally wear a computer *on your head*? Are you aware of displays? They can support themselves. On desks. Or wall mounts. Compared to Vision Pro, it feels like magic. Self-supporting displays. It's the future. Everything is about to change once people learn about them.</p>
]]></description><pubDate>Mon, 12 Jun 2023 21:32:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=36301476</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36301476</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36301476</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "A developer's view of Vision Pro"]]></title><description><![CDATA[
<p>None of what I said is restricted to the current state of LLMs. I was speaking very broadly about the nature of AI in our world, and going back to the creation of DNA...</p>
]]></description><pubDate>Mon, 12 Jun 2023 21:32:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=36301474</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36301474</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36301474</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "A developer's view of Vision Pro"]]></title><description><![CDATA[
<p>Maybe, but not in terms of needing code.<p>You see, the rules of formal languages that encode formal constraints on systems pre-date computers by centuries. Think of math proofs, for example. Sure, we can encode symbols as emojis, or geometric figures, or whatever. But in the end, it's sequences of symbols; that's the nature of it. And tapping symbols one by one with a headset will suck, no matter how programming looks.<p>Those formal languages in fact pre-date our species, too. Think about what DNA is. Oh yeah, spooky, isn't it? A sequence of symbols (GTCA) encoding a sequence of more complex symbols (proteins). Spooky! But yes, DNA is our code. And it works the same as our programming code.<p>Now I know where you're going: LLMs. Let's assume an LLM writes the code for you. You still have to read it, which you can do fine with a headset (provided it's not as enclosing, heavy, and short on battery as Vision Pro v1). But if you spot that something's off, you need to adjust it. Go directly for the kill and make that surgical series of edits, you know? Or... you can spend the rest of the day hopelessly trying to explain to Siri, 2030 edition, what you want, instead of going in and doing it yourself, for that "last mile".<p>Because if AI can do the last mile itself, to the point where you don't even need to verify it... first, that's the fast track to AI shipping code we don't understand, and basically handing our entire civilization over to it. And second... we wouldn't need to code, but we also wouldn't need to exist, and therefore wouldn't need headsets.<p>So in the worldlines where we DO exist... Vision Pro sucks for coding, because it's a shitty human interface for editing code.<p>And in the worldlines where we DO NOT exist... Vision Pro sucks for coding, because AI doesn't need headsets.</p>
]]></description><pubDate>Mon, 12 Jun 2023 19:40:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=36299760</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36299760</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36299760</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "A developer's view of Vision Pro"]]></title><description><![CDATA[
<p>Do you recall the wave of people breaking their TVs with Nintendo Wii controllers?<p>I'd expect a similar wave of people breaking their expensive Vision Pros if they try cooking with one. It's a terrible idea. First, most kitchens are cramped, full of low-hanging cabinets to smash your Vision's fragile glass into.<p>And then, keeping those open vents around vapors full of fat and tasty food bits is a great way to cover the circuits with grease.<p>We already have a solution for something telling you what to do next, and it's called an iPad with a stand. A phone also does the job, and has a much lower chance of incidents than a headset, though yes, you may need to wash your hands from time to time to scroll. Or... you can simply use assistive features and voice for that. Siri is going to get a lot smarter thanks to LLMs, much sooner than Vision Pro will become light and practical for such purposes.<p>Regarding this "it'll know where things are in your home", let's use basic logic here. It can learn the layout of your rooms and where your immovable furniture is. But no, it can't know where everything that moves is, because that would mean you literally can't move anything unless you have the headset on to track its location. Or maybe slap expensive AirTags on every single jar and utensil. Every solution here is hilariously impractical. And... we end up where I started: broken Vision Pro glass as you slam it into a cupboard while trying to fish out a jar of condiments.<p>I don't know what it is about VR that makes people pull out the fantasy scenarios. It's simply a (bulky) screen with passthrough. It's not a wizard. It can't know things unless there's a way for it to find them out.<p>We can imagine a super-thin model that you keep on your face 24/7 and sleep in, so it tracks your entire life forever and knows you better than you know yourself. And it synchronizes with your spouse and children, who also wear their own headsets 24/7. And it's unbreakable. And the battery never runs out. We can imagine many things. They don't exist, and won't exist any time soon. "Not on the horizon", as Steve Jobs used to say.</p>
]]></description><pubDate>Mon, 12 Jun 2023 19:29:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=36299600</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36299600</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36299600</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "AI Generated artistic QR Codes"]]></title><description><![CDATA[
<p>Great example of how an AI can be given a few partially opposing constraints and told "find me an output that fully satisfies both at once". This thing would take a person weeks of tweaking to get right.</p>
]]></description><pubDate>Mon, 12 Jun 2023 19:23:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=36299527</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36299527</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36299527</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Mojo is a much better “Objective-C without the C” than Swift ever was"]]></title><description><![CDATA[
<p>I'm usually a kind fella, but this article was criminally stupid.</p>
]]></description><pubDate>Mon, 12 Jun 2023 18:22:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=36298461</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36298461</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36298461</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "A developer's view of Vision Pro"]]></title><description><![CDATA[
<p>> In short, my brain has crossed a Rubicon and now feels like experiences constrained to small, rectangular screens are lesser experiences.<p>The funny thing about this statement is that Vision Pro is in fact the smallest rectangular screen of any device he's written code for.<p>He's hyped up. That's normal. In time he'll understand that Vision Pro doesn't provide any better UX for common activities. In fact it's worse in many ways.<p>Where Vision Pro may shine is tasks where you need to perceive and manipulate complex three-dimensional objects as they would be in physical space. I see great uses in engineering, design, art. It'd be great for previewing interior design, designing cars, architecture, creating machinery and so on.<p>It'll also be great for previewing products, so online stores become a lot more viable than they are now, as you get a sense of an item's size and style in Vision Pro.<p>It may also be great for education, training, simulations.<p>It has many great uses. But basic apps aren't it. And most people won't care. This thing sucks to wear for more than 20 minutes. It's heavy and uncomfortable. You can't share your experience with others, either. It costs a lot. And you can't multitask with it; I can walk somewhere and do something on my phone.<p>The input model also sucks. To code, for example, you need to hook up a Bluetooth keyboard and mouse. Looking at symbols one by one to finger-tap them would be comically slow. At which point, you may as well just get 2-3 screens and work on a normal workstation. For less money.</p>
]]></description><pubDate>Mon, 12 Jun 2023 18:16:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=36298347</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36298347</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36298347</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Reddit.com appears to be having an outage"]]></title><description><![CDATA[
<p>My sci-fi theory is that the boycott freed up so much unused RAM and CPU at their datacenter that it went into meltdown.</p>
]]></description><pubDate>Mon, 12 Jun 2023 16:12:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=36295875</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36295875</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36295875</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Generative AI has a serious problem with bias"]]></title><description><![CDATA[
<p>AI doesn't have a problem with bias. It is bias. That's how it works; it's its nature. I also find it very interesting that when faced with these biases, we think the AI is wrong, instead of looking at the society it reflects. There's nothing wrong with the AI. It's just an algorithm processing our data.<p>Also, it is a fact that people of certain origins are overrepresented in given populations, classes and so on. The solution is NOT to turn a blind eye to this, or to lobotomize the AI in order to conceal it.</p>
]]></description><pubDate>Mon, 12 Jun 2023 15:00:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=36294785</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36294785</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36294785</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Twitter is refusing to pay its Google Cloud bill"]]></title><description><![CDATA[
<p>Calling his Tesla stock "savings" and using market cap to determine its liquid purchasing power is astoundingly ignorant. Tell me: if you have, say, 100 million in savings and you spend 20 million, does the remainder also magically shrink to 20 million, like what happened to Musk in 2022?<p>Also, do you realize he paid only about a quarter of Twitter's price himself? The rest is <i>loans</i> and other investors burning cash by trusting him and rolling over.<p>Tell me: why did a wealthy man who merely spent "20% of his savings" take on high-interest debt worth over 13 billion?</p>
]]></description><pubDate>Sun, 11 Jun 2023 19:20:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=36284662</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36284662</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36284662</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "If We’re Not Careful, the AI Revolution Could Become the ‘Great Homogenization’"]]></title><description><![CDATA[
<p>Oh, it will be. Already everyone can program, be a voice artist, draw paintings, write poems, and whatnot. The actual skills will be lost, because cheap and quick beats depth and intent. Basically every time.</p>
]]></description><pubDate>Sun, 11 Jun 2023 16:30:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=36282821</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36282821</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36282821</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Twitter is refusing to pay its Google Cloud bills"]]></title><description><![CDATA[
<p>And yet they bought a recruiting company to the tune of 30 million a month ago.<p>Clearly Elon doesn't even have the decency to file for bankruptcy. He's following the Trump playbook: bold claims, bouncing checks, blaming everyone else and playing the victim.</p>
]]></description><pubDate>Sun, 11 Jun 2023 14:43:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=36281877</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36281877</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36281877</guid></item><item><title><![CDATA[New comment by BulgarianIdiot in "Twitter is refusing to pay its Google Cloud bill"]]></title><description><![CDATA[
<p>We don't like it, but he keeps testing his power and getting his way. Sometimes I don't even know how. But the truth is that law and order stop applying once you have power or wealth.</p>
]]></description><pubDate>Sat, 10 Jun 2023 23:54:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=36276571</link><dc:creator>BulgarianIdiot</dc:creator><comments>https://news.ycombinator.com/item?id=36276571</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36276571</guid></item></channel></rss>