<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: yongjik</title><link>https://news.ycombinator.com/user?id=yongjik</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 14 May 2026 20:27:36 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=yongjik" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by yongjik in "Princeton mandates proctoring for in-person exams, upending 133 year precedent"]]></title><description><![CDATA[
<p>"Culture" works by having a system that collectively punishes cheaters, so that people learn from their own (or others') experiences and internalize that cheating is bad and won't pay off in the long term.<p>That's how you get a culture against cheating.  You ensure that cheating doesn't pay, and eventually people learn that cheating doesn't pay.  The enforcement is part of the culture.</p>
]]></description><pubDate>Wed, 13 May 2026 21:51:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=48128106</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=48128106</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48128106</guid></item><item><title><![CDATA[New comment by yongjik in "People Hate AI Art"]]></title><description><![CDATA[
<p>(Disclaimer: I'm working on AI-related stuff, so I'm not exactly impartial.)<p>Not saying AI is blameless, but I'm seeing a trend where the problem is clearly more about social media and how it enables every one of us to live in our own algorithmically fed bubble.  Like, look at this:<p>> If your initial reaction to reading that and seeing [an AI image] is some variation of "ughhh" or rolling your eyes or "fuck this guy" congrats. You are normal.<p>> If it wasn't I cannot stress to you enough that you are an outlier. Whenever you pick key art for a presentation or blog, your business, or whatever - if you use AI art you give a clear signal that you have low social literacy. You immediately associate yourself with a huge bundle of negative emotions because people, largely, hate this shit.<p>See how <i>confident</i> the author is that their own view is the normal, socially acceptable one, and that they "cannot stress enough" that any other view is a social outlier.  This has all the same "If you don't care about what's happening in Gaza you're not normal and nobody likes you" energy, except at least people are actually dying in Gaza.<p>And of course it should make perfect sense to the author because, unironically, everywhere they go online they will see people talking and thinking like that!  Largely thanks to those profit-driven corporations and their massive data centers.<p>I don't know what the solution is, but this can't go on forever...</p>
]]></description><pubDate>Sat, 09 May 2026 03:38:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=48071589</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=48071589</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48071589</guid></item><item><title><![CDATA[New comment by yongjik in "Canvas online again as ShinyHunters threatens to leak schools’ data"]]></title><description><![CDATA[
<p>Reminds me of the incident last year when the South Korean government's server room caught fire.  It housed the government's equivalent of Google Drive, and the only backup was <i>in the same room</i>, so they all burnt down together.<p>Some data was permanently lost, and officials then told reporters that multi-regional backup had not yet been built because it was too hard at such a massive scale... of 858 TB.</p>
]]></description><pubDate>Fri, 08 May 2026 03:56:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=48058362</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=48058362</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48058362</guid></item><item><title><![CDATA[New comment by yongjik in "US launches new strikes on Iran"]]></title><description><![CDATA[
<p>I wonder how much what Trump <i>wants</i> matters anymore w.r.t. Iran.  Trump kicked a hornets' nest.  Now it's the hornets' turn to decide when (or if) they'll let him walk away.<p>I mean, what's Trump going to do?  Murder Iranian leaders harder?</p>
]]></description><pubDate>Fri, 08 May 2026 00:46:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=48057062</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=48057062</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48057062</guid></item><item><title><![CDATA[New comment by yongjik in "Newton's law of gravity passes its biggest test"]]></title><description><![CDATA[
<p>If it had failed, it would've been top science news.  Every scientist dreams about the day they're mentioned as "(...)'s experiment puts Newton's law of gravity in jeopardy."</p>
]]></description><pubDate>Mon, 04 May 2026 19:46:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=48013976</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=48013976</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48013976</guid></item><item><title><![CDATA[New comment by yongjik in "Rumor: Disney to Remove Star Wars Sequel Trilogy from Timeline"]]></title><description><![CDATA[
<p>It would also be a more coherent story than what we got in the sequels.</p>
]]></description><pubDate>Mon, 04 May 2026 19:34:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=48013828</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=48013828</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48013828</guid></item><item><title><![CDATA[New comment by yongjik in "LLMs Are Not a Higher Level of Abstraction"]]></title><description><![CDATA[
<p>It's orthogonal to whether LLMs can be a useful abstraction layer, but ...<p>I have a feeling that if LLMs were built on a deterministic technology, a lot of the current AI-is-not-intelligent crowd would be saying "These LLMs can only generate one answer given a question, which means they lack human creativity and they'll never be intelligent!"</p>
]]></description><pubDate>Sun, 03 May 2026 23:56:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=48002921</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=48002921</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=48002921</guid></item><item><title><![CDATA[New comment by yongjik in "The Claude Delusion: Richard Dawkins believes his AI chatbot is conscious"]]></title><description><![CDATA[
<p>He's also older than Trump.  His mind is likely not as sharp as it used to be.</p>
]]></description><pubDate>Sun, 03 May 2026 03:49:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=47993155</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47993155</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47993155</guid></item><item><title><![CDATA[New comment by yongjik in "U.S. to Withdraw 5k Troops from Germany, Pentagon Says"]]></title><description><![CDATA[
<p>This may not be the end of the days of America the Superpower... but it may be the beginning of the end.</p>
]]></description><pubDate>Sat, 02 May 2026 01:42:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=47982495</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47982495</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47982495</guid></item><item><title><![CDATA[New comment by yongjik in "AI uses less water than the public thinks"]]></title><description><![CDATA[
<p>> The farming water usage already exists. The data centers do not. Adding more on top of what farming is using is not going to help. We can prevent the data centers, so that's where the push back is.<p>Well, to me, this sounds basically like "Jeff Bezos already exists, this school does not.  Increasing the government budget to build a school here is not going to help our finances, so that's where we will push back."<p>(I don't think Jeff Bezos should lose all his money, but he could definitely pay more tax.)</p>
]]></description><pubDate>Fri, 01 May 2026 20:23:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=47979790</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47979790</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47979790</guid></item><item><title><![CDATA[New comment by yongjik in "An AI agent deleted our production database. The agent's confession is below"]]></title><description><![CDATA[
<p>I think I've already explained my position, and I don't have any deeper insight than that, so I'd only be repeating myself.  But to repeat one more time: when talking about probability, there's a category of "not mathematically zero, but the probability is so low that we can assume it will just never happen."<p>And it's good that we can think that way, because we too follow the rules of statistical and quantum physics, which are inherently probabilistic.  So, basically, you can say the same things about people.  There's a nonzero (but extremely small) probability that I'll suddenly go mad and stab the next person I meet.  There's a nonzero (but even smaller) probability that I'll spontaneously erupt into a cloud of lethal pathogens that will destroy humanity.  Yada yada.<p>Yet nobody builds houses under the assumption that one of the occupants might transform into a lethal cloud, and for good reason.<p>Yes, it does sound a bit more absurd when we apply it to humans.  But the underlying principle is very similar.<p>(I think this will be my last comment here because I'm just repeating myself.)</p>
]]></description><pubDate>Sun, 26 Apr 2026 22:59:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=47915701</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47915701</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47915701</guid></item><item><title><![CDATA[New comment by yongjik in "An AI agent deleted our production database. The agent's confession is below"]]></title><description><![CDATA[
<p>As I said, I believe statistical physics is a very good intuitive guide.  Molecules move randomly.  That does not mean a cup of water will spontaneously boil itself.  Sometimes the probability of something happening is so low that even if it's not mathematically zero it does not matter, because you'll never observe it in the known universe.<p>An LLM generating each token probabilistically does not mean there's a realistic chance of generating <i>any</i> random stuff, where we can define "realistic" as "if we transform the whole known universe into data centers and run this model until the heat death of the universe, we will encounter it at least once."<p>Of course that does not mean LLMs are infallible.  They fail all the time!  But you can't explain that as a fundamental shortcoming of a probabilistic structure: that's not a logical argument.<p>Or, back to the original discussion, the fact that this one particular LLM generated a command to delete the database is <i>not</i> a fundamental shortcoming of LLM architecture.  It's just a shortcoming of the LLMs we currently have.</p>
]]></description><pubDate>Sun, 26 Apr 2026 21:27:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=47914733</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47914733</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47914733</guid></item><item><title><![CDATA[New comment by yongjik in "An AI agent deleted our production database. The agent's confession is below"]]></title><description><![CDATA[
<p>Mostly, I agree with you.  My complaint is that, when a ceiling fails, nobody says "Duh, ceilings are supposed to fail, that's basic physics."  Because that (1) helps nobody, and (2) betrays a fundamental misunderstanding of physics.<p>And I do think it's stupid to wire an LLM to a production database.  Modern LLMs aren't that reliable (at least not yet), and the cost-benefit tradeoff does not make sense.  (What do you even gain by doing that?)<p>However, you can't just look at that and say "Duh, this setup is bound to fail, because LLMs can generate any arbitrary sequence of tokens."  That's a wrong explanation, and it shows a misunderstanding of how LLMs (and probability) work.</p>
]]></description><pubDate>Sun, 26 Apr 2026 21:05:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=47914458</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47914458</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47914458</guid></item><item><title><![CDATA[New comment by yongjik in "An AI agent deleted our production database. The agent's confession is below"]]></title><description><![CDATA[
<p>> It is fundamental to language modeling that every sequence of tokens is possible.<p>This is so trivially wrong that I don't understand why people repeat it.  There are many valid criticisms of LLMs (especially the LLMs we currently have), but this isn't one of them.<p>It's akin to saying that every molecule behaves randomly according to statistical physics, so you should expect your ceiling to spontaneously disintegrate any day, and if you find yourself under the rubble one day, that's just a consequence of basic physics.</p>
]]></description><pubDate>Sun, 26 Apr 2026 19:56:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=47913580</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47913580</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47913580</guid></item><item><title><![CDATA[New comment by yongjik in "South Korea police arrest man for posting AI photo of runaway wolf"]]></title><description><![CDATA[
<p>I swear, some commenters here think "the government is incompetent" is an axiom and work backward from there to fill in the details.</p>
]]></description><pubDate>Sat, 25 Apr 2026 03:47:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=47898461</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47898461</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47898461</guid></item><item><title><![CDATA[New comment by yongjik in "The blame game: The Trump coalition is fracturing as Iran operation stalls"]]></title><description><![CDATA[
<p>It might have started that way (I dunno), but these days it feels like just Trump being senile and incapable of admitting he made a mistake.<p>After all, MAGA supports Trump no matter what.  ICE has murdered two US citizens already.  Trump could shoot a baby in Times Square and I don't think his support would ever go below 30%.<p>But once you raise gas prices, that affects everyone's cost of living.  I don't think whatever video Epstein made could ever match that.  (And these days MAGA will wave it away with "guh, it's clearly AI, you libtards."  You can't do that at the gas station.)</p>
]]></description><pubDate>Sat, 25 Apr 2026 03:34:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47898393</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47898393</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47898393</guid></item><item><title><![CDATA[New comment by yongjik in "If America's so rich, how'd it get so sad?"]]></title><description><![CDATA[
<p>Sovereign citizen?  On <i>my</i> Hacker News?<p>...It's more likely than you think!!</p>
]]></description><pubDate>Fri, 24 Apr 2026 04:32:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=47885547</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47885547</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47885547</guid></item><item><title><![CDATA[New comment by yongjik in "California has more money than projected after admin miscalculated state budget"]]></title><description><![CDATA[
<p>> have states had the discipline not to raid these coffers in the boom years?<p>"Let's give the money back to voters because they will like that, and we'll figure out something else in tough years" is, like, the quintessential example of "raiding these coffers."<p>It's basically like big tech companies turning profits into stock dividends because investors love it and the CEO will be handsomely rewarded, and who cares about long-term R&D.  When big companies do that, we blame MBAs and capitalism.</p>
]]></description><pubDate>Wed, 22 Apr 2026 02:13:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=47857951</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47857951</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47857951</guid></item><item><title><![CDATA[New comment by yongjik in "Why Japan has such good railways"]]></title><description><![CDATA[
<p>At the same time, the same government, mysteriously, has no problem building a vast network of roads reaching everywhere and spanning the whole country.<p>If the US government neglected a section of highway until a city became unreachable by road, there would be riots.  The same city losing its train service?  Totally expected, trains are supposed to suck.<p>The sorry state of American public transport is a self-fulfilling prophecy: everybody <i>knows</i> that public transportation sucks, and therefore nothing is done to improve it, because that would be a waste of resources.</p>
]]></description><pubDate>Sun, 19 Apr 2026 00:24:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=47820743</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47820743</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47820743</guid></item><item><title><![CDATA[New comment by yongjik in "George Orwell Predicted the Rise of "AI Slop" in Nineteen Eighty-Four"]]></title><description><![CDATA[
<p>That's great, but I think Herbert had a vision of this incredible galaxy teeming with truthsayers, human computers, and space travel, and just needed a convenient excuse to explain away the total lack of computing devices.</p>
]]></description><pubDate>Fri, 17 Apr 2026 04:34:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=47802477</link><dc:creator>yongjik</dc:creator><comments>https://news.ycombinator.com/item?id=47802477</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47802477</guid></item></channel></rss>