<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: pardon_me</title><link>https://news.ycombinator.com/user?id=pardon_me</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sat, 04 Apr 2026 00:48:18 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=pardon_me" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by pardon_me in "I beg you to follow Crocker's Rules, even if you will be rude to me"]]></title><description><![CDATA[
<p>> Being able to be brutally honest with each other about our misunderstandings<p>Being specific to misunderstandings is the element that's overlooked.<p>This advice tends to be taken on board (often to extremes) by those who treat it as a free pass to say whatever comes to mind, whenever they like, without explaining how they arrived there: any excuse to avoid putting in the effort to be understood, or to stay conscious of the fact that human beings have emotions.<p>We are not robots.<p>I'm glad commenters here are aware of this, as HN sentiment is getting close to the point of treating each other as machines, whilst we train bots to have better communication skills, such as empathetic reflection, and allow them more creativity and freedom.<p>Some people are more patient and sympathetic towards computers making mistakes, not following commands perfectly, or being too verbose than we are with our fellow human beings.</p>
]]></description><pubDate>Sat, 14 Mar 2026 15:58:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=47377955</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47377955</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47377955</guid></item><item><title><![CDATA[New comment by pardon_me in "I beg you to follow Crocker's Rules, even if you will be rude to me"]]></title><description><![CDATA[
<p>Aside from the poor tone of this style of writing, short declarative statements don't convey the same information and leave a confusing message.<p>Without knowing how you arrived at "the point", you are pushing all the work onto the recipient (or worse, onto every reader of your comment on HN) to verify what you say and decide how much they can trust you. That could involve researching the claim, checking your credentials, or putting in effort to understand or overlook the emotional tone.<p>"This is the answer. I have the answer"-style dumping of information is a poor form of human-to-human communication, unless you are directly answering a closed-ended question.</p>
]]></description><pubDate>Sat, 14 Mar 2026 15:41:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=47377792</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47377792</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47377792</guid></item><item><title><![CDATA[New comment by pardon_me in "Digg is gone again"]]></title><description><![CDATA[
<p>The ban reason and the moderator's name were public on Something Awful, which allowed the community to respond (actively or passively) and more senior moderators/admins to take public action against rogue moderators. The transparent audit trail somewhat countered the incentive to ban, though a lot of people also treated getting banned as a game.</p>
]]></description><pubDate>Sat, 14 Mar 2026 15:07:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=47377446</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47377446</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47377446</guid></item><item><title><![CDATA[New comment by pardon_me in "The Wyden Siren Goes Off Again: We’ll Be “Stunned” By What the NSA Is Doing"]]></title><description><![CDATA[
<p>Locks on bathroom doors are for privacy, not security.</p>
]]></description><pubDate>Fri, 13 Mar 2026 22:32:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=47370879</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47370879</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47370879</guid></item><item><title><![CDATA[New comment by pardon_me in "Don't post generated/AI-edited comments. HN is for conversation between humans"]]></title><description><![CDATA[
<p>First we would run into the spam-filter problem, no different to email. Then we have to choose: do we concede to viewing the world through the lens of WhatEverAI, or train it locally on our own thoughts and views of the world, and hope that model is never compromised?</p>
]]></description><pubDate>Thu, 12 Mar 2026 18:14:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=47354974</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47354974</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47354974</guid></item><item><title><![CDATA[New comment by pardon_me in "Whistleblower claims ex-DOGE member says he took Social Security data to new job"]]></title><description><![CDATA[
<p>We break eggs into the known confines of a pan. We don't spray egg all over the place unless we want to end up with it on our face.<p>Even if it did make sense to "move fast and break things" inside working critical systems, doing so should surely be within the law and without going against the most basic of known security measures.</p>
]]></description><pubDate>Thu, 12 Mar 2026 16:02:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47352822</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47352822</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47352822</guid></item><item><title><![CDATA[New comment by pardon_me in "Claude Sonnet 4.6"]]></title><description><![CDATA[
<p>The smug, non-informative, confidently wrong tone these LLMs have learned from such comments drives me mad.</p>
]]></description><pubDate>Tue, 17 Feb 2026 22:51:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=47054565</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47054565</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47054565</guid></item><item><title><![CDATA[New comment by pardon_me in "Tesla 'Robotaxi' adds 5 more crashes in Austin in a month – 4x worse than humans"]]></title><description><![CDATA[
<p>You could say the same in reverse about HN.</p>
]]></description><pubDate>Tue, 17 Feb 2026 22:44:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=47054505</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47054505</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47054505</guid></item><item><title><![CDATA[New comment by pardon_me in "Tesla 'Robotaxi' adds 5 more crashes in Austin in a month – 4x worse than humans"]]></title><description><![CDATA[
<p>I'm almost certain they aren't comparing crash statistics to the equivalent human taxi context, "<i>professional</i>" drivers.</p>
]]></description><pubDate>Tue, 17 Feb 2026 22:38:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=47054447</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=47054447</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47054447</guid></item><item><title><![CDATA[New comment by pardon_me in "Amazon Ring's lost dog ad sparks backlash amid fears of mass surveillance"]]></title><description><![CDATA[
<p>Do they just store <i>everything</i>? How long is that sustainable?</p>
]]></description><pubDate>Fri, 13 Feb 2026 01:04:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=46997611</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46997611</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46997611</guid></item><item><title><![CDATA[New comment by pardon_me in "TikTok's 'addictive design' found to be illegal in Europe"]]></title><description><![CDATA[
<p>It's been going on since forever. The first people the British enslaved were their own kind; they just managed to create a society where citizens enjoyed the authority, and naturally the fruits of pillaging half the world did trickle down back then.<p>If you think about PFI etc. and how those contracts were crafted, it's no different to what happened to the UK's oil: unlike Norway's, it never eventually went to the citizens. Every last bit of the UK is being extracted now.</p>
]]></description><pubDate>Fri, 06 Feb 2026 21:23:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=46918332</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46918332</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46918332</guid></item><item><title><![CDATA[New comment by pardon_me in "TikTok's 'addictive design' found to be illegal in Europe"]]></title><description><![CDATA[
<p>Manufactured consent, planned and controlled economies, imbalances of wealth and power, tariffs, subsidies, tax breaks, lobbying, ad networks, tracking, algorithmic content delivery, AI generation, asymmetric access to information, social effects, requirements to live despite inaccessible resources for basic needs, government control, private property with no free land available, and international trade laws are a few things that come to mind which very much go against the idea that we are living in anything like the model of capitalism we learn about in school.<p>2026 is not based on wants and needs, except in isolated situations. We are at the hypernormal point of manufacturing problems in order to sell solutions, because there is very little rent or work left to extract from assets. Lives of excess are maintained by depriving others of necessities. The intense control and misdirection required to keep this somewhat stable is starting to be felt.</p>
]]></description><pubDate>Fri, 06 Feb 2026 21:16:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=46918247</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46918247</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46918247</guid></item><item><title><![CDATA[New comment by pardon_me in "Systems Thinking"]]></title><description><![CDATA[
<p>This is the point we are at now with wide-scale societal technologies: the need for network effects combined with the product being the prototype, and no option but to work on the system live.<p>Some projects have been forced this far by diverting resources (either public funding or not-yet-profitable VC money), but these efforts have not proven self-sustaining. Humans will be perpetually stuck where we are as a species if we cannot integrate the currently opposed ideas of up-front planning and "move fast and break things".<p>Society is slowly realizing the step-change in difficulty between projects in controlled conditions, which admit simplified models, and these irreducibly complex systems. Western doctors face an interesting parallel, becoming more aware that human beings must be treated the same way: we emerge from parts which can be simplified and understood individually, but which could never describe the overall system behavior. We are good examples of the intrinsic fault-tolerance required for such systems to remain stable.</p>
]]></description><pubDate>Fri, 06 Feb 2026 18:03:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=46916032</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46916032</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46916032</guid></item><item><title><![CDATA[New comment by pardon_me in "CIA to Sunset the World Factbook"]]></title><description><![CDATA[
<p>AIs that were trained on data obtained through naughty channels actively avoid citing sources and full passages of reference text, otherwise they'd give the game away. This seems to increase the chance of them entirely hallucinating sources too.</p>
]]></description><pubDate>Thu, 05 Feb 2026 15:37:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=46900817</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46900817</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46900817</guid></item><item><title><![CDATA[New comment by pardon_me in "Amazon cuts 16k jobs"]]></title><description><![CDATA[
<p>It turned out the revolution was always being televised, only it was us thinking what we saw was a reflection of society, not that it was being delivered to us disguised as our own ideas.</p>
]]></description><pubDate>Fri, 30 Jan 2026 17:51:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=46827501</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46827501</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46827501</guid></item><item><title><![CDATA[New comment by pardon_me in "Tesla’s autonomous vehicles are crashing at a rate much higher tha human drivers"]]></title><description><![CDATA[
<p>Following your logic (which comes from the company's marketing), why not remove GPS from the car in case it goes wrong? As humans, we don't need GPS. Cameras could go wrong too... then what happens?<p>Humans hear car and road noises, along with potential screams from outside or shouts from passengers inside; we sense vibrations, respond to hand signals from pedestrians and other drivers, and constantly predict hazards through these perceptions. How is the car doing all of this, if not through additional sensors and processing?<p>You can land a plane by eye, but what happens when there's fog? The situation in cars is exactly analogous. LIDAR can provide extra sensory data where the cameras absolutely fail, just like our own eyes.<p>Knowing there's a solution to this, are we just to accept that the car will fail where humans would? That's progress? Why <i>wouldn't</i> you want that extra data for such a small relative cost? LIDAR was already used on cars as a safety-only front-collision-avoidance system (that's how cheap it is to install).<p>In a properly designed system, adding data which is useful and cannot otherwise be inferred makes complete sense.<p>Given these cars are supposed to be so good that they will work autonomously for you and pay themselves off in a year or two, the idea that LIDAR etc. is unnecessary and too expensive and will lead to <i>actively worse</i> performance is just insane logic for an "engineering" discussion.</p>
]]></description><pubDate>Fri, 30 Jan 2026 17:43:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=46827374</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46827374</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46827374</guid></item><item><title><![CDATA[New comment by pardon_me in "Tesla’s autonomous vehicles are crashing at a rate much higher tha human drivers"]]></title><description><![CDATA[
<p>How do we know everyone's better for it? Massive amounts of public and private funding went into it, and we still haven't seen the outcome; it's too soon.<p>Consider how many car manufacturers are backtracking, that China will now win the EV race, that utility energy prices have skyrocketed, and the damage done not only to the brand but to people's view of EVs in general.<p>Forcing this play happened too soon.</p>
]]></description><pubDate>Fri, 30 Jan 2026 17:23:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=46827135</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46827135</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46827135</guid></item><item><title><![CDATA[New comment by pardon_me in "Tesla’s autonomous vehicles are crashing at a rate much higher tha human drivers"]]></title><description><![CDATA[
<p>Lapse in focus is such a great point. If we looked at the number of accidents caused by distinctly human errors, such as "lapse in focus" and "sudden medical event", which we would 100% expect to go away when offloading tasks to any computer, the accidents that remain become the bare minimum that computer-based automated driving must achieve.<p>This is compounded by system mistakes likely being hard errors. A computer hard error vs. a human lapse of judgement is potentially the difference between the vehicle slowly crushing a small child as they scream and beg for help and a human stopping as soon as they felt or heard something. Context matters.<p>Compared error rates must consider whether the error could have been avoided or mitigated, the near misses, the human vs. computer type of error, and how hard errors may lead to horrifying scenarios.</p>
]]></description><pubDate>Fri, 30 Jan 2026 17:18:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=46827066</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46827066</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46827066</guid></item><item><title><![CDATA[New comment by pardon_me in "Tesla’s autonomous vehicles are crashing at a rate much higher tha human drivers"]]></title><description><![CDATA[
<p>If rider and pedestrian safety is the main concern, the automated assistance and safety systems that car manufacturers were already developing make the most sense. They warn or intervene in situations where the human may not realize they are in danger and/or cannot respond in time. Developing these solves the harder problems first; automation is easy in comparison.<p>The idea of mostly automating the system because it's statistically better than humans, while requiring human assistance to monitor and respond in exactly these situations, was flawed logic to begin with. Comparisons of statistics should be made like for like, given these are scenarios we can easily control.<p>For example, robotic taxis should <i>at least</i> be compared to professional drivers on similar routes, roads, vehicles, and times of day, not by comparing "all drivers in all vehicles in all scenarios over time" with private company data that cherry-picks "automated driving" miles on highways etc. (where existing assistance systems could already achieve near-perfect results).<p>Companies testing autonomy on the public should be forced to upload all crash data to investigators as part of their licensing. The vehicles already collect extremely detailed sensor and video data in order to operate. The fact that we have no verified data to compare to existing human statistics is damning. It's a farce.</p>
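<p>The like-for-like comparison argued for above can be sketched in a few lines. All fleet names, context labels, and figures below are invented for illustration; the point is only that rates should be compared per matching context, not pooled into one headline number.</p>

```python
# Hypothetical sketch: compare crash rates per matching driving context
# (road type, time of day) instead of pooling "all miles" together.
# Every number here is made up for illustration.

def crash_rate_per_million_miles(crashes, miles):
    """Crashes per million vehicle miles travelled."""
    return crashes / miles * 1_000_000

# (context) -> (crashes, miles driven); invented figures
robotaxi = {
    ("urban", "day"):   (12, 4_000_000),
    ("highway", "day"): (1, 6_000_000),
}
pro_drivers = {
    ("urban", "day"):   (30, 20_000_000),
    ("highway", "day"): (5, 40_000_000),
}

# Compare only the contexts both fleets actually drove in.
for context in sorted(set(robotaxi) & set(pro_drivers)):
    r = crash_rate_per_million_miles(*robotaxi[context])
    h = crash_rate_per_million_miles(*pro_drivers[context])
    print(f"{context}: robotaxi {r:.2f} vs professional {h:.2f} per 1M miles")
```

<p>Even with invented data, stratifying this way can flip the conclusion: a fleet that looks safer on pooled miles can be worse in the urban context where the risk actually lives.</p>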
]]></description><pubDate>Fri, 30 Jan 2026 17:04:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=46826920</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46826920</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46826920</guid></item><item><title><![CDATA[New comment by pardon_me in "Tesla’s autonomous vehicles are crashing at a rate much higher tha human drivers"]]></title><description><![CDATA[
<p>I used to be an adventurer like you, then I took a roundhouse kick to the head. Never let your humanoid robot watch TV!</p>
]]></description><pubDate>Fri, 30 Jan 2026 16:42:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=46826574</link><dc:creator>pardon_me</dc:creator><comments>https://news.ycombinator.com/item?id=46826574</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46826574</guid></item></channel></rss>