<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: rglullis</title><link>https://news.ycombinator.com/user?id=rglullis</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 08 Apr 2026 11:28:41 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=rglullis" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by rglullis in "The threat is comfortable drift toward not understanding what you're doing"]]></title><description><![CDATA[
<p>And the fact that you only went silent after being asked what you are willing to put on the line tells me how devoid of meaning your argument is, <i>no matter what it is</i>.</p>
]]></description><pubDate>Tue, 07 Apr 2026 16:36:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=47677937</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47677937</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47677937</guid></item><item><title><![CDATA[New comment by rglullis in "Sam Altman may control our future – can he be trusted?"]]></title><description><![CDATA[
<p>> I cannot find a single action of anyone that cannot be construed as an attempt to get them power/money/influence<p>Try the other way around, via negativa. We definitely <i>can</i> find plenty of examples of people stepping out of positions of power, deciding <i>not</i> to do something because of moral conflict, etc. Is there any case of such an action from Sam?<p>Fuck, anyone with any semblance of moral fortitude would refuse to take money from the Saudis. But he had no problem doing it.<p>> joined an AI research organisation at a time when everybody thought the big advances were much further away than they turned out to be.<p>No, this is selection bias. What he did was put himself in a position where he could have his fingers in any and every possible pie, and then, when one of these things turned out to be something <i>believed to be valuable</i> by people with money, he maneuvered himself into the driver's seat.</p>
]]></description><pubDate>Tue, 07 Apr 2026 15:45:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=47677119</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47677119</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47677119</guid></item><item><title><![CDATA[New comment by rglullis in "The threat is comfortable drift toward not understanding what you're doing"]]></title><description><![CDATA[
<p>All I'm saying is "I do not know what the real upside is of abandoning current practices in academia and education in favor of 'let the LLM guide you'". If it were my ass on the line, I would apply the precautionary principle and wouldn't take any significant bets on my future around this.<p>You, on the other hand, are the one "being sure" that everything will be fine, and of course there is no way for you to bring <i>actual evidence</i> because all you have is conjecture. So, since there is no way for you to back up your argument with evidence, the next best thing you can do is put some Skin In The Game: can you back up your beliefs with <i>actions</i>? Are you willing to take any substantial risk in case your bet doesn't pay off?</p>
]]></description><pubDate>Tue, 07 Apr 2026 14:58:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=47676442</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47676442</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47676442</guid></item><item><title><![CDATA[New comment by rglullis in "Sam Altman may control our future – can he be trusted?"]]></title><description><![CDATA[
<p>> what information would significantly change your views<p>Quite simple: show me any single action taken by Sam Altman which cannot be construed as an attempt to get him more power/money/influence. You can't find one.<p>The difference between what he <i>claims to believe</i> and what he <i>actually does</i> is a textbook example of sociopathy.</p>
]]></description><pubDate>Tue, 07 Apr 2026 10:21:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=47672981</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47672981</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47672981</guid></item><item><title><![CDATA[New comment by rglullis in "Adobe modifies hosts file to detect whether Creative Cloud is installed"]]></title><description><![CDATA[
<p>> In which case, how else would you propose doing it?<p>- Registering a URL handler?<p>- Asking the user?</p>
]]></description><pubDate>Mon, 06 Apr 2026 18:02:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=47664518</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47664518</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47664518</guid></item><item><title><![CDATA[Adobe modifies hosts file to detect whether Creative Cloud is installed]]></title><description><![CDATA[
<p>Article URL: <a href="https://www.osnews.com/story/144737/adobe-secretly-modifies-your-hosts-file-for-the-stupidest-reason/">https://www.osnews.com/story/144737/adobe-secretly-modifies-your-hosts-file-for-the-stupidest-reason/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47664205">https://news.ycombinator.com/item?id=47664205</a></p>
<p>Points: 331</p>
<p># Comments: 166</p>
]]></description><pubDate>Mon, 06 Apr 2026 17:38:30 +0000</pubDate><link>https://www.osnews.com/story/144737/adobe-secretly-modifies-your-hosts-file-for-the-stupidest-reason/</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47664205</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47664205</guid></item><item><title><![CDATA[New comment by rglullis in "The threat is comfortable drift toward not understanding what you're doing"]]></title><description><![CDATA[
<p>> The answer is “vet your sources, don’t trust unsourced claims.”<p>This was already a problem for Wikipedia (articles being written that, upon further investigation, were based on nothing but Wikipedia itself). With LLMs <i>themselves</i> facilitating AI slop and plagiarism, this problem reaches a scale at which it becomes impossible to control.<p>> I’m sure the students will manage.<p>The problem with your hubris is that you are not going to be the only one facing the fallout when this blows up.</p>
]]></description><pubDate>Mon, 06 Apr 2026 15:46:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=47662422</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47662422</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47662422</guid></item><item><title><![CDATA[New comment by rglullis in "The threat is comfortable drift toward not understanding what you're doing"]]></title><description><![CDATA[
<p>The problem I have with your logic is that you are hedging your arguments so much that the whole point becomes meaningless. If you are trying to argue that young aspiring scientists will be able to use LLMs to learn new concepts <i>instead</i> of doing the hard work themselves, then you also need to explain how they will develop the skills to analyze and "run more thorough verification" INDEPENDENTLY of LLMs.</p>
]]></description><pubDate>Mon, 06 Apr 2026 09:47:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=47658783</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47658783</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47658783</guid></item><item><title><![CDATA[New comment by rglullis in "The threat is comfortable drift toward not understanding what you're doing"]]></title><description><![CDATA[
<p>> I’ve now ostensibly understood what a derivative does and what it’s used for, yet I have zero idea how to mathematically do it. Does that make any results I gain from this intuitive understanding any less valuable?<p>From a <i>science</i> standpoint, I'd say whatever "results" you got are completely worthless.<p>> I’ll generally have some sort of hypothesis of what kind of result I’m expecting, given that my understanding is correct<p>And how do you know whether your understanding is correct, if you only take what the LLM gives you and are not able to verify it independently?<p>> Science is what happens when you expect something, test something, and get a result.<p>Right, but has any LLM come up with a hypothesis on its own? Has any AI said "given all this literature that I read, I'd expect <insert something completely out of the training data space>"?</p>
]]></description><pubDate>Sun, 05 Apr 2026 22:13:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=47654486</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47654486</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47654486</guid></item><item><title><![CDATA[New comment by rglullis in "The threat is comfortable drift toward not understanding what you're doing"]]></title><description><![CDATA[
<p>>  so they can produce other, more useful, results<p>But to even <i>know</i> what is more useful, it is crucial to have walked the walk. Otherwise we will all end up with a bunch of people trying to reinvent the wheel, over and over again, like JavaScript "developers" who keep reinventing frameworks every six months.<p>> which nobody would buy for any other tool<p>I don't know about you, but I wasn't allowed to use calculators in my calculus classes <i>precisely</i> so I would learn the concepts properly. "Calculators are for those who know how to do it by hand" was something I heard a lot from my professors.</p>
]]></description><pubDate>Sun, 05 Apr 2026 12:20:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=47648614</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47648614</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47648614</guid></item><item><title><![CDATA[New comment by rglullis in "Subscription bombing and how to mitigate it"]]></title><description><![CDATA[
<p>So that explains the handful of random sign-ups I am getting on Communick. The pattern fits exactly what I am seeing as well (3-4 signups in an hour, weird usernames, and gmail/hotmail addresses with lots of "." that are usually ignored). At least in my case, the mitigation comes from my obsession with <i>not</i> collecting any unnecessary data. Email addresses are optional and only used if you are already a paying subscriber.<p>Maybe I should just remove the field from the sign-up form altogether and use it as a honeypot.</p>
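<p>A minimal sketch of how that honeypot idea could work (the field names and the server-side shape are hypothetical, not Communick's actual code): hide the now-unused email input from real users, and treat any submission that fills it as a bot. Normalizing the Gmail "dot trick" also helps spot duplicate signups, since Gmail ignores dots in the local part of an address.</p>

```python
def is_bot_signup(form: dict) -> bool:
    """Flag a signup as bot-generated if the hidden honeypot field is filled.

    The visible form no longer asks for an email, but a hidden input named
    "email" is still rendered; humans never see it, while bots that
    auto-fill every field reveal themselves by populating it.
    """
    return bool(form.get("email", "").strip())


def normalize_gmail(address: str) -> str:
    """Collapse Gmail's dot trick so 'j.o.h.n@gmail.com' and
    'john@gmail.com' map to the same canonical address, making
    duplicate bot signups easier to detect."""
    local, _, domain = address.lower().partition("@")
    if domain in ("gmail.com", "googlemail.com"):
        local = local.replace(".", "")
    return f"{local}@{domain}"
```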
]]></description><pubDate>Thu, 02 Apr 2026 12:59:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=47613876</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47613876</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47613876</guid></item><item><title><![CDATA[New comment by rglullis in "Show HN: Dull – Instagram Without Reels, YouTube Without Shorts (iOS)"]]></title><description><![CDATA[
<p>> What's your basis for thinking this will work long term?<p>Even if <i>this</i> approach doesn't work long term, the important thing is to establish product-market fit and to get enough people committed to the idea that <i>your</i> product is their gateway out of the closed platforms.<p>I can think of at least three different ways to set up a system that can get around the API restrictions and re-serve the data to a different client that the user can control. But if I go and implement any of those, someone will try it themselves and give up on my product until <i>that</i> approach gets shut down.<p>By selling lifetime subscriptions, the <i>users</i> get invested in the success of the product as well, and they will be more willing to fight alongside you against the restrictions that the companies impose.</p>
]]></description><pubDate>Thu, 02 Apr 2026 10:37:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=47612513</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47612513</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47612513</guid></item><item><title><![CDATA[New comment by rglullis in "OkCupid gave 3M dating-app photos to facial recognition firm, FTC says"]]></title><description><![CDATA[
<p>"23andMe", you mean? They were not free, but they were not building their product on open standards either, were they? So they don't pass my filter as well.</p>
]]></description><pubDate>Wed, 01 Apr 2026 04:19:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=47596773</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47596773</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47596773</guid></item><item><title><![CDATA[New comment by rglullis in "OkCupid gave 3M dating-app photos to facial recognition firm, FTC says"]]></title><description><![CDATA[
<p>> every online service<p>This deserves a few qualifiers. I think this should apply to any service that is<p>- "free" or "freemium"<p>- wrapped as a black box that gives customers no way out.<p>There are plenty of companies out there providing services based on FOSS, but we collectively shy away from paying them because it seems "silly" to pay for software that people can run for free.</p>
]]></description><pubDate>Tue, 31 Mar 2026 23:28:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=47594832</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47594832</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47594832</guid></item><item><title><![CDATA[New comment by rglullis in "OpenAI closes funding round at an $852B valuation"]]></title><description><![CDATA[
<p>They got a very sweet deal from the Pentagon, it seems.</p>
]]></description><pubDate>Tue, 31 Mar 2026 21:10:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=47593511</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47593511</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47593511</guid></item><item><title><![CDATA[New comment by rglullis in "ChatGPT won't let you type until Cloudflare reads your React state"]]></title><description><![CDATA[
<p>I shouldn't be giving ideas to your boss, but I bet he would be interested in making ChatGPT available only to paying customers, or free for those who get their eyes scanned by The Orb. Give 30 days of raised limits and we're all set to live in the dystopia he wants.</p>
]]></description><pubDate>Sun, 29 Mar 2026 22:25:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=47568057</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47568057</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47568057</guid></item><item><title><![CDATA[New comment by rglullis in "South Korea Mandates Solar Panels for Public Parking Lots"]]></title><description><![CDATA[
<p>But then what are they going to do with the Gorillas? Are winters in Korea that cold?</p>
]]></description><pubDate>Sun, 29 Mar 2026 02:39:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=47559961</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47559961</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47559961</guid></item><item><title><![CDATA[Open Source Gave Me Everything Until I Had Nothing Left to Give]]></title><description><![CDATA[
<p>Article URL: <a href="https://kennethreitz.org/essays/2026-03-18-open_source_gave_me_everything_until_i_had_nothing_left_to_give">https://kennethreitz.org/essays/2026-03-18-open_source_gave_me_everything_until_i_had_nothing_left_to_give</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47540626">https://news.ycombinator.com/item?id=47540626</a></p>
<p>Points: 19</p>
<p># Comments: 0</p>
]]></description><pubDate>Fri, 27 Mar 2026 09:38:10 +0000</pubDate><link>https://kennethreitz.org/essays/2026-03-18-open_source_gave_me_everything_until_i_had_nothing_left_to_give</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47540626</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47540626</guid></item><item><title><![CDATA[New comment by rglullis in "Moving from GitHub to Codeberg, for lazy people"]]></title><description><![CDATA[
<p>Even the most minimal protection would stop that.</p>
]]></description><pubDate>Thu, 26 Mar 2026 17:40:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47533401</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47533401</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47533401</guid></item><item><title><![CDATA[New comment by rglullis in "Moving from GitHub to Codeberg, for lazy people"]]></title><description><![CDATA[
<p>Either I am very lucky or what I am doing has zero value to bots, because I've been running servers online for at least 15 years and have never had an issue that couldn't be solved with basic security hygiene. I use Cloudflare as my DNS for some servers, but I always disable their paid features. They could go out of business tomorrow and my servers would keep chugging along just fine.</p>
]]></description><pubDate>Thu, 26 Mar 2026 15:47:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=47531941</link><dc:creator>rglullis</dc:creator><comments>https://news.ycombinator.com/item?id=47531941</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47531941</guid></item></channel></rss>