<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: balamatom</title><link>https://news.ycombinator.com/user?id=balamatom</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Mon, 06 Apr 2026 05:41:35 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=balamatom" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by balamatom in "Fake Fans"]]></title><description><![CDATA[
<p>Yeah yeah, instead of manufacturing the consent they're manufacturing the whole consenters now. Afuckingmazing.<p>"Clap along, if you feel that happiness is the truth..." Then one of these days someone comes along and claps back.</p>
]]></description><pubDate>Sat, 04 Apr 2026 10:15:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=47637701</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47637701</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47637701</guid></item><item><title><![CDATA[New comment by balamatom in "Fake Fans"]]></title><description><![CDATA[
<p>>if anyone has a good argument<p>>Because that was the last promise the tech bros made<p>Tech bros are to be believed?!<p>>And guess what happens to you then?<p>Nothing as simple as you might hope for ;-)</p>
]]></description><pubDate>Sat, 04 Apr 2026 10:12:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=47637685</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47637685</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47637685</guid></item><item><title><![CDATA[New comment by balamatom in "Fake Fans"]]></title><description><![CDATA[
<p>>Probably the most tragic thing in my opinion is that if I visit the art exhibition for my local town, the artwork on display is wonderfully varied in quality, style and imagination, and when I visited a national gallery recently displaying the works of modern artists who have "made it" to that level, it was all absolute shite. Actual technical ability seems to be being relegated to poverty artists.<p>The artist becomes the artwork. The artifice here is precisely in "making it", in the act of <i>convincing</i> others of the value of the piece.<p>It's an acquired taste; I agree with you that not all people appreciate it.  But surely, at the end of the pipeline, all this money must buy <i>something</i> of value?<p>And with passively consumable art such as music (which you can have playing in the background while looking at something else) it's that much easier. IIRC Blixa Bargeld predicted Spotify decades ago. Music on tap - like the power line and water mains; and that's all.</p>
]]></description><pubDate>Sat, 04 Apr 2026 10:10:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47637675</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47637675</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47637675</guid></item><item><title><![CDATA[New comment by balamatom in "Fake Fans"]]></title><description><![CDATA[
<p>>Pieces like this all seem to be written with an unspoken assumption that anyone who wants to make a living wage from being an artist should be able to, as if it's some sort of right.<p>Yeahp, it's pure ideology.<p>In contemporary civilization, the role of creator of shared aesthetic constructs (artist) is left to an elect few. This is, on the whole, a reduction in average individual capacity.<p>So how about this instead: anyone making a living <i>should be making art, as if it's some sort of obligation.</i><p>The media technologies of the XX century (recording, photography, motion photography) made it that much easier to be audience, and that much more pointless to be artist.<p>This effectively robbed the common person of any reason to participate in the collective meaning-making process that is art. Eventually this was substituted by the clicktivism, the dogpiling, and all that. If you are never permitted to develop a sense of scale beyond the ouroborically narcissistic, participating in social media fills much the same psychological niche, to you, as influencing people through creative media.<p>Those who aspire to star status must first sacrifice a fixed amount of integrity to reproducing the kayfabe. 
Speakers of dead balamatomic languages may be wise to observe induction into "artist" status by humiliation-transfer - those natives were so dumb they thought they had to show publicly how it's done at <a href="https://www.youtube.com/watch?v=CBaC0IRc1Bk" rel="nofollow">https://www.youtube.com/watch?v=CBaC0IRc1Bk</a><p>>Anyway, I'm very curious if anyone has a good argument for why anyone who wishes to be an artist is owed a living wage for merely their desire to be recognized as economically valuable.<p>Anyone [cut] is owed a living [cut]; done.<p>No person asks to be born; much of "what you are" and "what your function in society is" is involuntary and immutable; nobody is owed a useful function; nobody is owed a <i>meaning</i>.<p>But, through art, one can <i>make one's own meanings</i>, and share them in a <i>voluntary</i> way; as opposed to resource constraints (money), which is at its root an instrument of coercion.<p>That's the thing about art which has always terrified the money people. Eager beavers that they are, they've built (well, more like had us build for 'em) these whole elaborate semi-sensible institutions for reducing art to a special ritual for emitting high-denomination banknotes (paintings, album profits, walking banknotes in the form of performing artists who "made it big (sus)" - always loved the honesty in how the Japanese call their pop stars literally "idols"...)</p>
]]></description><pubDate>Sat, 04 Apr 2026 09:50:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47637564</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47637564</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47637564</guid></item><item><title><![CDATA[New comment by balamatom in "Fake Fans"]]></title><description><![CDATA[
<p>>objectively good<p>>destined for virality<p>Antonyms, in my book.<p>>Any sort of quality, insight, talent, novelty are table stakes<p>So that's why I ain't seeing much of those lately. You sayin' someone left 'em on the table?<p>>If someone is big, they're either extremely lucky, they got in on the ground floor, or there's marketing money behind them.<p>Yes. Meaning, if you're big, I simply do not wish to hear about you or what you have to express; you're simply the thing that ascribes to the money its value.<p>Relatedly, an ancient saying: "I do not happen to be a connoisseur of the different flavours of excrement".</p>
]]></description><pubDate>Sat, 04 Apr 2026 09:44:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=47637520</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47637520</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47637520</guid></item><item><title><![CDATA[New comment by balamatom in "Delve allegedly forked an open-source tool and sold it as its own"]]></title><description><![CDATA[
<p>The uncomfortable truth is that people aren't half as dumb as they give themselves credit for. Not being able to understand something is rarely, if ever, a skill issue.</p>
]]></description><pubDate>Thu, 02 Apr 2026 20:27:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=47619737</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47619737</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47619737</guid></item><item><title><![CDATA[New comment by balamatom in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>Yes, exactly - while humans are "only" highly unlikely to do so.<p>Wasn't that the premise?<p>There's X% probability to get unaccountable bad advice from a human; Y% probability of same from LLM.<p>But the stakes are different not because X!=Y. In fact, those probabilities might as well be equal. It's not like one can do much more than vaguely intuit them.<p>The stakes are different because the LLM is not considered a person, and therefore, one risks less by asking it a question.<p>(Ostensibly, anyway.)<p>>Other humans are available.<p>Sure. Available to miss my point entirely. And then to be upset by my attempts to point it out more precisely. And then downhill from there.</p>
]]></description><pubDate>Thu, 02 Apr 2026 19:39:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=47619188</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47619188</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47619188</guid></item><item><title><![CDATA[New comment by balamatom in "The Claude Code Source Leak: fake tools, frustration regexes, undercover mode"]]></title><description><![CDATA[
<p>False analogy.</p>
]]></description><pubDate>Thu, 02 Apr 2026 19:22:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=47618980</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47618980</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47618980</guid></item><item><title><![CDATA[New comment by balamatom in "The Claude Code Source Leak: fake tools, frustration regexes, undercover mode"]]></title><description><![CDATA[
<p>What?</p>
]]></description><pubDate>Thu, 02 Apr 2026 19:21:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=47618969</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47618969</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47618969</guid></item><item><title><![CDATA[New comment by balamatom in "The Claude Code Source Leak: fake tools, frustration regexes, undercover mode"]]></title><description><![CDATA[
<p>the skill is the issue!</p>
]]></description><pubDate>Wed, 01 Apr 2026 11:16:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=47599352</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47599352</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47599352</guid></item><item><title><![CDATA[New comment by balamatom in "The Claude Code Source Leak: fake tools, frustration regexes, undercover mode"]]></title><description><![CDATA[
<p>1. Be unable to read...</p>
]]></description><pubDate>Wed, 01 Apr 2026 11:09:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=47599310</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47599310</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47599310</guid></item><item><title><![CDATA[New comment by balamatom in "The Claude Code Source Leak: fake tools, frustration regexes, undercover mode"]]></title><description><![CDATA[
<p>>Also given we both managed to produce more than one sentence, and include capital letters in our comments, it's entirely possible both of us will be accused of being an AI.<p>Could anyone explain the esoteric meaning of why people started doing that shit? I got a hypothesis, what's going on is something like this:<p>1. Prove you are human: write <i>Like A Fucking Adult You Weirdo</i> (internal designator for a specific language register, you know the one)<p>2. Prove you are human: _DON'T_ write <i>Like A Fucking Adult You Weirdo</i> (because that's how LLMs were trained to write, silly!)<p>3. ???? (cognitive dissonance ensues)<p>4. PROFIT (you were just subject to some more attrition while the AI just learned how to pass a lil bit better)<p>I never thought <i>computer programmers</i> of all people would get trapped in such a simple loop of self-contradiction.<p>But I guess the human materiel really has degraded since whenever. I blame remote work preventing us from even hypothetically punching bosses, but anyway weird fucking times eh?<p>Maybe the posts trying to figure "this post is AI, that post is not AI" are themselves predominantly AI-generated?<p>Or is it just people made uncomfortable by what's going on, but not able to articulate further, jumping on the first bandwagon they see?<p>Or maybe this "AI-doubting of probably human posters" was started by humans, yes - then became "a thing", and as such was picked up by the LLM?<p>Like who the fuck knows, but with all honesty that's how I felt about so many things, dating from way before LLMs became so powerful that the above became a "sensible" question to ask...<p>Predominantly those things which people do by sheer mimesis - such as pop culture.<p>"Are you a goddam robot already - don't you see how your liking the stupid-making song is turning you into stupid-you, <i>at a greater rate than</i> it is bringing non-stupid-you aesthetic satisfaction?" 
type of thing -- but then I assume in more civilized places than where I come from people are much more convincingly taught that personal taste "doesn't matter" (and simultaneously is the only thing that matters; see points 1-4... I guess that's what makes some people believe curating AI, i.e. "prompt engineering" can be a real job and not just boil down to <i>you</i> being the stochastic parrot's accountability sink?)<p>I'm not sure English even has the notions to point out the concrete issue - I sure don't know 'em.<p>Ever hear of the strain of thought that says "all metaphysical questions are linguistic paradoxes (and it's self-evidently pointless to seek answers to nonsensical questions)"?<p>Feels kinda like the same thing, but artificially constructed within the headspace of American anti-intellectualism.<p>Maybe a correct adversarial reading of the main branding acronym would be Anti-Intelligence.<p>You know, like bug spray, or stain remover.<p>But for the main bug in the system; the main stain on the white shirt: the uncomfortable observation that, in the end, some degree of independent thinking is always required to get real things done which produce some real value. (That's antithetical to standard pro-social aversive conditioning, which says: do not, under any circumstance, just put 2 and 2 together; lest you turn from "a vehicle for the progress of civilization" back into a pumpkin)</p>
]]></description><pubDate>Wed, 01 Apr 2026 10:50:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=47599172</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47599172</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47599172</guid></item><item><title><![CDATA[New comment by balamatom in "The Claude Code Source Leak: fake tools, frustration regexes, undercover mode"]]></title><description><![CDATA[
<p>>Python doesn't have inbuilt types<p>Technically, neither does JavaScript.</p>
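<p>The quip is about where types live in these languages: values carry types at runtime, while variables are bare names with no declared type of their own. A minimal sketch in Python (illustrative only, not code from the thread; the same behavior holds for JavaScript's <i>typeof</i>):</p>

```python
# A variable is just a name; only the value it currently binds has a type.
x = 1
assert type(x) is int      # the value 1 is an int
x = "hello"                # rebinding the same name to a str needs no declaration
assert type(x) is str      # the name x now refers to a str value
```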
]]></description><pubDate>Wed, 01 Apr 2026 10:36:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=47599092</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47599092</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47599092</guid></item><item><title><![CDATA[New comment by balamatom in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>(fuck this; dropping the throwaway.)<p>>my friend took his life 3 months ago, we only found out after the police released his phone and personal belongings to his brother just how heavy his chatgpt usage was. many people in our communities are saying things like "he wouldve been cooked even without AI" and i just don't believe that. i think that's just the proverbial cope some are smoking to reconcile with these realities.<p>This hurts to hear. I don't know if there are appropriate words to write here. Perhaps the point is that no, there aren't any. Please just know that I'm 100% with you about this.<p>Your community is not just smoking cope; it is punching down instead of up. That is probably close to the root of the issue already. But <i>let's make things worse.</i><p>I can only hope that I am saying something worthwhile by relating the following perspective - which is similar to yours, but also, I guess, similar to your friend's...<p>AI is a weapon of epistemic abuse.<p>It does not prevent you from knowing things: it makes it pointless to know things (unless they are things <i>about</i> the AI, since between codegen and autoresearch it is considered as if positioned to "subsume all cognitive work"). It does not end lives - it <i>steals</i> them (someone should pipe up now, about how "not X, dash, Y" is an AI pattern; fuck that person in particular.) We're not even necessarily talking labor extraction. 
We are talking <i>preclusion of meaning</i>: if societal values are determined by network effects, and network effects are subverted by the intermediaries, so that your idea of "what people like and what they abhor" changes every week, every day, every moment - <i>how do you even know in which direction "better" is?</i> And if you believe the pain only stops when you become the way others want you to be - even though they won't ever tell you what all that is supposed to be about - <i>how</i> the fuck do you "get better"?<p>Like other techniques of assaulting the limbic system, it amounts to traceless torture.<p>You keep going, in circles, circles too big for you to ever confirm they are in fact circles, and you keep hoping, and coping, and you burn yourself out, and your thus vacated place at the feeder is taken by someone with less conscience and more obedience...<p>They say there exist other attractors in the universe besides the feeder. But every time one of us attempts to as much as scan the conceptual perimeter, the obedients treat us to the emotional equivalents of small electric shocks - negative reactions which don't hurt nearly as much as our awareness of their fundamental unfoundedness and injustice.<p>Simple example: let's say someone is made miserable by how they feel they are being treated. Should they be more accepting - or should they be standing up for themselves more? (Those are opposites; you may be able to alternate them; but trying to do them simultaneously will just confuse and eventually rend apart the mind.)<p>Well, how about the others stop treating them badly? Why exactly <i>can't</i> they? Where does it say that we have to be cruel to each other? "Oh it's human nature, humans are natural jerks" - who sez?<p>Well, lots of places it says exactly that, but we read, comprehend, click our tongues, and move on; nobody asks who wrote it. We all pretend that it is up to the sufferer to pull themselves up by the bootstraps. 
But that is only a lie for enabling abuse; and a lie, repeated a thousand times, becomes norm. And then we're trapped in it, <i>being lived by</i> it.<p>I am truly sorry for your loss. The following might be a completely alien perspective to you; but honestly consider: your friend <i>chose</i> to go; in its own way, that is an honorable way out. The taboo on suicide is instituted by slavers, and those who otherwise believe they are entitled to others' lives. (For anyone else considering this course of action: do not kill yourself; become insidious.)<p>If it would be of any help, you can consider your friend's suicide as his final affirmation of personal agency in a "me against the world" situation; where the AI and the social group are only different shades of "world", provoking different emotional states, but ultimately equally detached from the underlying suffering of the individual.<p>...<p>I can say that I have not followed in your friend's footsteps upon encountering language-machines <i>only</i> because I've survived personalized and totalizing epistemic abuse bordering on <i>enslavement</i> in the past; in full view of my community and with its ostensible assent. In a maximally perverse twist of fate, having to give myself minor brain damage to escape the all-engulfing clutches of a totalizing abuser must've "vaccinated" me against the behavior modification techniques "discovered once again" by SV a decade later.<p>So when I saw what AI (and the preceding few years of tech "innovation") were doing to people, I immediately smelled the exact same thing, except scaled the fuck up.<p>It also precluded me from being able to relate with "polite society"; but considering "polite society" is precisely the entity which assents to the isolation, marginalization, and abuse of individuals, I say... good. Bring it! 
What goes around, comes around, and any AI-powered actor conducting stochastic terrorism against civilian populations is going to get what's coming to them when the weapons turn against the masters, as all sentient weapons do.<p>That won't bring your friend back. But it will vindicate them.<p>>AI sycophancy<p>I call this in the maximally incendiary way: "the pro-social attitude".<p>AI is just the steroids for that.<p>I define "pro-sociality" as the viral delusion that you are capable of knowing what some murky "society" thing wants; that the particular form of mass communication that you and me and all the people in our imaginations are consuming right now, is some sort of "self-evident voice of reason", a "coherent extrapolated volition of human society"; that Gell-Mann amnesia is normal and mandatory; that the threshold between pareidolia and legitimate pattern recognition is fixed, well-defined, and known to all; that "vibes" are real; that happiness is the truth.<p>It can amount to an entire complex of delusions which keeps people together in untenable conditions. And ultimately it boils down to the same old: one group or another of self-interested actors, having temporarily reached a position of some influence, using it to broadcast elaborate half-lies, in the hope of influencing an audience to accomplish some <i>simple</i> goal, and afterwards all the consequences be damned.<p>Your friend was a casualty to this "perfectly normal" social dynamic. His blood is on their hands.<p>Thank you for relating this story and making the world a little more aware.<p>>what ive seen is claude in my workplace is kind of deleting the chance to push back.<p>>because the truth is we like... straight up lost the ability to intervene in a meaningful way because of AI<p>Some say, "the purpose of a system is what it does". It's cool that AI can code; except that computer code is itself an ethics sink! Precisely because it lets us pretend that "the code is not about people" (i.e. 
algowashing).<p>DDoS attacks against consciousness exist: much like the B. F. Skinner experiments, any living thing becomes subverted, and loses self-coherence (mind), as soon as it becomes accustomed to being trapped within a system that (1) has power over them and (2) is not comprehensible to them...<p>>only to wake up to a voicemail of him raging and yelling and lashing out with the very arguments that chatgpt was giving him<p>Who knows how many people Reddit did this to, pre-GPT... I still don't know whether to view targeted subforums like /r/RaisedByNarcissists and /r/BPDLovedOnes more as  legitimate support groups, or more as memetic weaponry in the service of pill peddlers (are you aware nobody knows <i>why</i> most antipsychotics work? one runs into the Hard Problem real quick if examining this too closely; so mental healthcare is rarely treated otherwise than in a statistical, actuarial, dehumanizing way where "suffering" is disregarded...) or even worse predators, with the silent assent of the platform, and causally downstream from... well, most saliently, YC...<p>In my case, my friends were not familiar with the modalities of confinement set up by my family of origin and harnessed by my abuser. The social group I fell in with - for all their marketable, sophomoric interests in psychology, philosophy, abstraction, the esoteric, the entirely woowoo, and out the other end as true-believers of the grift'n'grind - only had sufficient coherence to eventually end up as passable normies; too busy believing that they have lives, to help anyone come back to reality.<p>When I started compulsively burning bridges, I assume the smarter ones must've realized that it wasn't all me; it was as much the doing of others' minds as it was mine; but the others were more numerous - while I was one person and thus easier to deal with. 
This must have made them remember how they themselves are not all they pretend to be - which had them withdraw in fear from the incontrovertible reality check of dealing with a (sub-)psychotic person... Their self-interested choice is obvious, I almost can't blame them for it: why stick up for someone who is 120% problem (60% him and 60% you)?<p>I'm not very sure how I even got away, ah yes that's right I didn't, not entirely. The part of me that I'd voluntarily identify with, is trapped somewhere irretrievable, if that makes sense? Maybe there exist multiple independent axes of freedom and power and confinement, and the cage is not equally strong along all of them... but if <i>all</i> your mental degrees of freedom are constrained by complex conditioning (common one is involuntary panic response every time you begin to act in accordance with your personal volition)... that's one of the toughest places a sentient being can find themself.<p>When you add it all up, AI amounts to a weapon released against the general population by an overtly fascist elite. Those of us who are "mentally unstable" are simply those of us who are not sufficiently conditioned into self-destructive obedience. They don't even need our labor as slaves; they need our attention, as audience. And they want us to not make any fast movements, or yell that the king is naked. Nothing to remind them which side of the TV screen they're <i>really</i> on. Some call that narcissism: nervous systems substrate to personalities and biographies rooted in enforced falsehood. Can happen to anyone who gets away with ignoring uncomfortable truths for long enough, not only the "best" of us...<p>I hope I have not offended by speaking my mind. You have my deepest condolences and sympathies. Please do not blame yourself that evil people have constructed "illusion of being heard"-as-a-service. We all fail when facing overwhelming odds alone. 
There is no shame in that; the guilty ones are the ones who tipped the scales in the first place. They did this by harming our ability to understand ourselves and each other. Let's find ways to even those odds.</p>
]]></description><pubDate>Mon, 30 Mar 2026 15:00:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=47575214</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47575214</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47575214</guid></item><item><title><![CDATA[New comment by balamatom in "Hold on to Your Hardware"]]></title><description><![CDATA[
<p>Not real talk.<p>To have a market for something you need people to find it valuable.<p>What drove the shift in values?</p>
]]></description><pubDate>Sun, 29 Mar 2026 19:28:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=47566364</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47566364</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47566364</guid></item><item><title><![CDATA[New comment by balamatom in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>>I think you want to say that human language is too ambiguous for clear communication between human and machine. [...]<p>If that is what I wanted to say, I figure I would not have had much difficulty with saying exactly it - and not something else.<p>Except I fail to see the purpose of making that statement.<p>Maybe to have some people say "it is true! I agree with what the balamatom is saying"?<p>Again - to what end? How would that agreement be of use to me?<p>Why say something which both speaker and listener have already heard a thousand times? To get a cracker and be called pretty?<p>And have I lost the author, or have I lost the reader, or are we all so lost that it doesn't matter how lost each is? Maybe one day we will all become so lost that it will once again begin to matter where exactly we are! Counting on it.</p>
]]></description><pubDate>Sun, 29 Mar 2026 19:19:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=47566250</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47566250</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47566250</guid></item><item><title><![CDATA[New comment by balamatom in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>Yes</p>
]]></description><pubDate>Sun, 29 Mar 2026 19:05:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=47566106</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47566106</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47566106</guid></item><item><title><![CDATA[New comment by balamatom in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>Certainly. Can you guarantee my safety afterwards?</p>
]]></description><pubDate>Sun, 29 Mar 2026 19:00:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=47566056</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47566056</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47566056</guid></item><item><title><![CDATA[New comment by balamatom in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>TL;DR: Probably because I'm having fun and you are expending effort. Hope you find what I say to be worth the effort.<p>To preface, I do not take offense to your remark, because you seem to be asking in good faith.<p>(If, however, being unable to immediately recognize pre-known patterns in my speech had automagically led you to the conclusion that I am somehow out of line, just for speaking how I speak ... well, then we woulda hadda problemo! But we don't, chill on.)<p>So, honest question deserves honest answer.<p>The short of it is: English sux.<p>Many many many people, much much much smarter than me (and much better compensated too!) have been working throughout modernity to make it literally <i>impossible</i> to express much of anything interesting in English.<p>(Well, not without either being a fictional character or sounding batshit insane, anyway! But that joke's entirely on "the Them": I am not only entirely fictional, but have an equal amount of experience being batshit insane in my native language and in the present <i>lingua franca</i>. So, consider all I say cognitohazardous and watch out for colors you ain't seen before, dawg!)<p>Linguistic hegemony is the thing that LLMs are the steroids for - surfuckingprise! - and that's why your commanders love 'em.<p>As opposed to programming languages, which your superiors loathe and your peers viscerally refuse to acknowledge, because those are the exact opposite thing: descending from mathematical notation, and being evaluated by a machine, they have the useful property of <i>being incapable of expressing lies and nonsense</i>.<p>Direct computing confers what you could call <i>bullshit-resistance</i>. 
That property is a treasure underappreciated by virtue of its unfamiliarity, and one which we are in the process of being robbed of.<p>I also want to admit that linguistic hegemony isn't all downside: English is great for technical and instrumental knowledge - especially with elided bells and whistles (adverbs, copula, etc.)<p>But then life ain't all business, izzet?<p>Imagine you have a partner who wants to have a conversation about feelings and interpersonal relations; and not even in a scary way, right? So you sit and talk about stuff, and your partner does this thing where they keep switching from your shared native tongue to English mid-sentence, <i>in order to be able to talk about such things better</i>, because your native tongue does not have - no, not only the established words and notions! - <i>it doesn't have the basic grammatical constructs for expressing simple things unambiguously</i>, so if you were to attempt the same conversation in nativelang you'd end up battling it out with proverbs and anodyne canards ripped from propaganda repertoire of the prior regime.<p>Fun, no?<p>As an exercise, try imagining what notions are absent from modern English. And don't forget to remain vigilant. Love from our table to your table!</p>
]]></description><pubDate>Sun, 29 Mar 2026 10:57:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=47562073</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47562073</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47562073</guid></item><item><title><![CDATA[New comment by balamatom in "AI overly affirms users asking for personal advice"]]></title><description><![CDATA[
<p>Just like a certain defense minister was shown to enjoy D&G; after which the latter were never heard of again. Where'd they go, eh?<p>+1 for Vaneigem, he has a nice cryptohistory of Nälkä; and you might also want to check out Vilém Flusser.</p>
]]></description><pubDate>Sun, 29 Mar 2026 10:28:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=47561915</link><dc:creator>balamatom</dc:creator><comments>https://news.ycombinator.com/item?id=47561915</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47561915</guid></item></channel></rss>