<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: anonytrary</title><link>https://news.ycombinator.com/user?id=anonytrary</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Tue, 14 Apr 2026 22:24:05 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=anonytrary" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by anonytrary in "All You Need Is 4x 4090 GPUs to Train Your Own Model"]]></title><description><![CDATA[
<p>Thanks for sharing. Have you prodded the model with various inputs and written an article that shows example outputs? I'd love to get an idea of what sort of "end product" a 4x4090 setup is capable of producing.</p>
]]></description><pubDate>Sun, 29 Dec 2024 00:13:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=42536142</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=42536142</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42536142</guid></item><item><title><![CDATA[New comment by anonytrary in "That's not an abstraction, that's a layer of indirection"]]></title><description><![CDATA[
<p>I'm not sure what it's called (abstraction vs. indirection), but I dislike when everything needs a class/object with some odd combination of curried functions. Some programming languages force this on you more than others, I think. As a contrived example, "StringManager.SlicingManager.sliceStringMaker(0)(24)(myStr)": I've seen code that reminds me of this, and I wonder why anyone uses a language where this is not only an acceptable idiom but a preferred one.</p>
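A minimal sketch of the contrived idiom above, in JavaScript. The names (StringManager, SlicingManager, sliceStringMaker) are hypothetical, taken from the comment's made-up example, not a real API:

```javascript
// A curried slicing helper in the style the comment describes.
// All names here are hypothetical, from the contrived example above.
const StringManager = {
  SlicingManager: {
    // Each call returns another function until all arguments have arrived.
    sliceStringMaker: (start) => (end) => (str) => str.slice(start, end),
  },
};

const myStr = "curried functions can obscure a simple slice operation";

// The curried, indirected form:
console.log(StringManager.SlicingManager.sliceStringMaker(0)(24)(myStr));

// Versus the direct, uncurried equivalent:
console.log(myStr.slice(0, 24));
```

Both calls produce the same substring; the curried form just adds two extra layers of function application and one of object nesting to get there.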
]]></description><pubDate>Sat, 28 Dec 2024 07:11:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=42529206</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=42529206</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42529206</guid></item><item><title><![CDATA[New comment by anonytrary in "Strategic Altruism"]]></title><description><![CDATA[
<p>Surprised the author didn't coin "fauxltruism".</p>
]]></description><pubDate>Wed, 22 May 2024 03:46:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=40437116</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=40437116</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40437116</guid></item><item><title><![CDATA[New comment by anonytrary in "BrainGPT turns thoughts into text"]]></title><description><![CDATA[
<p>Using EEG to predict thought is like looking at the clouds in Mumbai to predict the clouds in Austin. The electrical signals from individual neurons are lost in a sea of large-scale oscillations, which are further blurred by the layers of bone, muscle, and tissue that separate the device from the brain. The bitrate is something like 1 bit per second, completely insufficient for most use cases.</p>
]]></description><pubDate>Sun, 17 Dec 2023 23:26:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=38677478</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=38677478</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38677478</guid></item><item><title><![CDATA[New comment by anonytrary in "OpenAI's board has fired Sam Altman"]]></title><description><![CDATA[
<p>Very excited to see what Sam & Greg are up to in the coming months! Guys like this don't just run away with their tails between their legs. They will be back.</p>
]]></description><pubDate>Sat, 18 Nov 2023 02:19:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=38314163</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=38314163</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38314163</guid></item><item><title><![CDATA[New comment by anonytrary in "Computationally optimal arrangements of barbell plates"]]></title><description><![CDATA[
<p>Agree, moving the weights inefficiently will end up working muscles you don't necessarily target in your main workout. It's less efficient, but probably gives a (very slightly) more well-rounded workout. If you're pressed for time, it could make sense to have a basic understanding of how to avoid dilly-dallying during your workout. If you go to a public gym, there will be other factors that affect your total time much more, like having to share equipment, which introduces a lot of uncertainty. Maybe these micro-optimizations are worthwhile if you have a private gym.<p>From a well-being/philosophical standpoint, maybe it's better to live life relaxed, rather than micro-managing every minute of your day to squeeze out every last ounce of efficiency. That sounds like a horrible lifestyle to me, but I guess to each their own :)</p>
]]></description><pubDate>Sun, 09 Jul 2023 22:20:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=36659560</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=36659560</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36659560</guid></item><item><title><![CDATA[New comment by anonytrary in "India ruling party's IT cell used AI to show smile on arrested protesters' faces"]]></title><description><![CDATA[
<p>Deep fakes were always a huge concern for me in AI. That's just one way AI can be weaponized, and this example makes the damage very clear. Thankfully, Twitter quickly corrected the issue, but only because it went viral. Sadly, Twitter won't be able to correct all of the little cases of AI misuse that don't go viral.</p>
]]></description><pubDate>Mon, 29 May 2023 14:57:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=36114630</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=36114630</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36114630</guid></item><item><title><![CDATA[New comment by anonytrary in "Dishwasher Salmon"]]></title><description><![CDATA[
<p>Neat, and it probably works on most dishwasher models, but there's not enough temperature precision for my comfort level. I wouldn't trust it. I'd rather just pan-fry or bake the salmon myself. I don't have a sous vide, but I imagine this is not a replacement for one, since the whole point of sous vide is precision cooking.</p>
]]></description><pubDate>Sun, 16 Apr 2023 03:37:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=35587048</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=35587048</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35587048</guid></item><item><title><![CDATA[New comment by anonytrary in "How to be a -10x Engineer"]]></title><description><![CDATA[
<p>An engineer who implements correct, comprehensible code but doesn't manage ticket statuses is more valuable than one who manages ticket statuses but litters the codebase with technical debt and confusing abstractions. If "better communication" means spending an extra 10 hours with the latter dev to correct and re-teach them, then yes, communication is the problem. The most time I've lost at work has been correcting and teaching engineers who eventually got let go due to low performance.</p>
]]></description><pubDate>Tue, 04 Apr 2023 20:14:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=35445293</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=35445293</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35445293</guid></item><item><title><![CDATA[New comment by anonytrary in "GPT-4 performs significantly worse on coding problems not in its training data"]]></title><description><![CDATA[
<p>Vindicated and excited. Gradient descent is likely not enough. I love it when we get closer to something but are still missing the answer. I would be very happy if "add more parameters and compute" isn't enough to get us to AGI. It means you need talent to get there, and money alone will not suffice. Bad news for OpenAI and other big firms, good news for science and the curious.<p>I imagine physicists got very excited about things like the ultraviolet catastrophe and the irreconcilable nature of quantum mechanics and general relativity. It's these mysteries that keep the world exciting.</p>
]]></description><pubDate>Sat, 25 Mar 2023 07:16:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=35300287</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=35300287</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35300287</guid></item><item><title><![CDATA[New comment by anonytrary in "Microsoft lays off one of its responsible AI teams"]]></title><description><![CDATA[
<p>That's not how it goes down in practice, where a Chief Diversity Officer's job is to <i>control</i> diversity, rather than to let it flourish. The only "rules and arrangements" that trickle down from the top are the biases, actions, and behaviors of senior leadership. They are the ones who really set the examples. A company with a Chief Culture Officer sounds like one where the leaders are not interested in culture. That's why they hired someone else to "handle" it.</p>
]]></description><pubDate>Tue, 14 Mar 2023 06:16:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=35147992</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=35147992</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35147992</guid></item><item><title><![CDATA[New comment by anonytrary in "Microsoft lays off one of its responsible AI teams"]]></title><description><![CDATA[
<p>Having a Chief Diversity Officer to fix a lack of diversity in my mind is no different than having a Chief Culture Officer to fix a poor company culture. Those sorts of problems can only be solved as the collective sum of every individual employee's actions. For example, if a large team of male engineers are all saying no to equally talented female candidates, then you've just hired shitty people. A diversity figurehead isn't going to change the fact that your team doesn't like or respect women.</p>
]]></description><pubDate>Tue, 14 Mar 2023 03:40:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=35147156</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=35147156</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35147156</guid></item><item><title><![CDATA[New comment by anonytrary in "Joint statement by the Department of the Treasury, Federal Reserve, and FDIC"]]></title><description><![CDATA[
<p>Vivek Ramaswamy? Is that you? He's currently on Mario Nawfal's Twitter space saying this exact thing lmao.</p>
]]></description><pubDate>Sun, 12 Mar 2023 23:06:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=35127759</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=35127759</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35127759</guid></item><item><title><![CDATA[New comment by anonytrary in "Draw SVG rope using JavaScript"]]></title><description><![CDATA[
<p>Breaks down at high width and low thickness.</p>
]]></description><pubDate>Sun, 01 Jan 2023 08:03:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=34204563</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=34204563</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34204563</guid></item><item><title><![CDATA[New comment by anonytrary in "AI Voice Generator: Text to Speech Software"]]></title><description><![CDATA[
<p>I entered a paragraph from the beginning of this article: <a href="https://en.wikipedia.org/wiki/Hilbert_space" rel="nofollow">https://en.wikipedia.org/wiki/Hilbert_space</a><p>I selected several different voices, but each only generated between 2 and 11 seconds of audio. It only got through the first sentence...</p>
]]></description><pubDate>Sun, 01 Jan 2023 08:02:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=34204558</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=34204558</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34204558</guid></item><item><title><![CDATA[New comment by anonytrary in "AI chatbots are not a replacement for search engines"]]></title><description><![CDATA[
<p>1. Does Google inform you that you have asked a bad or unproductive question?
2. Does Google guarantee the links it provides you with contain high quality <i>and</i> relevant information?
3. Does Google ensure that the #1 link contains the correct answer?<p>I think we are forgetting that Google suffers from these problems too. A human who asks bad questions will continue to ask bad questions. AI should in general be trained to do what the human wants. If the human wants something stupid, that's on the human. Google also isn't guaranteed to find the right information you're looking for, especially if it is on an obscure topic or if the query is highly overloaded or being taken over by current trends.</p>
]]></description><pubDate>Mon, 26 Dec 2022 10:01:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=34136272</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=34136272</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34136272</guid></item><item><title><![CDATA[New comment by anonytrary in "AI chatbots are not a replacement for search engines"]]></title><description><![CDATA[
<p>This is a great way to formulate the difference. In my mind, Google does search, which is not the same as synthesis. ChatGPT is a synthesis engine. I think there is room for engines like ChatGPT to also return links to the training documents that were most relevant to producing an answer. At that point, it would be doing search and synthesis at the same time.<p>I am not sure if neural networks have been trained to do this yet, but it would be very cool to see a network that produces both generated output and the most relevant input data that led to that output, by somehow keeping track of the influence certain inputs have on various learned features and internal structure of the network. I think of this as an index.<p>You make a good point about SEO. SEO is going to be obsolete in the near future. The SEO game is basically an adversarial attack on Google, which Google itself actually promotes and encourages website builders to do! In the future, we are going to look back on the cat-and-mouse game of SEO and see it as primitive and antithetical to the goal of search.</p>
]]></description><pubDate>Mon, 26 Dec 2022 09:41:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=34136161</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=34136161</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34136161</guid></item><item><title><![CDATA[New comment by anonytrary in "AI chatbots are not a replacement for search engines"]]></title><description><![CDATA[
<p>The key thing to understand about Google (the search product) is that it does search somewhat well but is extremely bad at synthesis. When it comes to big data, synthesis is just as important as search. The UX with traditional 2000s-era search engines involves the user being given a library's worth of information rabbit holes to dig through. With synthesis engines, the UX is completely different, and they might even be solving different user problems.<p>As per your example, nobody currently uses Google as a way to draft letters, but rather as a way to <i>learn</i> how to draft letters. I think the distinction is pretty key to understanding the difference between the two problem spaces. I would think that "write me a letter" is a problem that isn't in Google's domain. I do not think synthesis engines will necessarily replace search engines; the two will both be useful.<p>The premise of Google's interaction design is that you will be taken to an external resource. Google in recent years has started adding widgets and blurbs at the top of the search results for common things like stocks, COVID-case charts, weather, etc., but this synthesized content isn't their primary focus and is likely hard-coded to a large extent.</p>
]]></description><pubDate>Mon, 26 Dec 2022 09:32:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=34136113</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=34136113</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34136113</guid></item><item><title><![CDATA[New comment by anonytrary in "AI chatbots are not a replacement for search engines"]]></title><description><![CDATA[
<p>I don't think anyone who has played enough with ChatGPT is championing the notion that ChatGPT itself will replace Google. They are more so steel-manning the argument that Google might not be the best interface for search.</p>
]]></description><pubDate>Mon, 26 Dec 2022 09:24:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=34136086</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=34136086</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34136086</guid></item><item><title><![CDATA[New comment by anonytrary in "For some with ADHD, the low rumble of brown noise quiets the brain"]]></title><description><![CDATA[
<p>This is why I sleep with a box fan on low at night. It does wonders for sleep and tranquility... People tend to lump all sorts of noise into "white" noise, including fans, but fans are much, much closer to the more tolerable brown noise. True white noise is terrible and high-pitched. I absolutely cannot stand it...</p>
]]></description><pubDate>Mon, 21 Nov 2022 05:51:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=33689298</link><dc:creator>anonytrary</dc:creator><comments>https://news.ycombinator.com/item?id=33689298</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=33689298</guid></item></channel></rss>