<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: musebox35</title><link>https://news.ycombinator.com/user?id=musebox35</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 23 Apr 2026 10:37:21 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=musebox35" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by musebox35 in "TUI Studio – visual terminal UI design tool"]]></title><description><![CDATA[
<p>My ancient boxed copy of Visual Basic for DOS 1.0 that supported mouse clicks on TUI buttons would have found your viewpoint quite offensive if it had any AI in it ;-) Oh boy, good old days.</p>
]]></description><pubDate>Fri, 13 Mar 2026 15:31:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=47365821</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=47365821</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47365821</guid></item><item><title><![CDATA[New comment by musebox35 in "Consistency diffusion language models: Up to 14x faster, no quality loss"]]></title><description><![CDATA[
<p>Similar trend in open text-to-image models: Flux.1 was 12B, but now we have 6B models with much better quality. Qwen Image went from 20B to 7B while merging the edit line and improving quality. Now that the cost of spot H200s with 140GB has come down to A100 levels, you can finally try larger-scale fine-tuning/distillation/RL with these models. A very promising direction for open tools and models if the trend continues.</p>
]]></description><pubDate>Fri, 20 Feb 2026 12:50:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47087376</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=47087376</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47087376</guid></item><item><title><![CDATA[New comment by musebox35 in "The suck is why we're here"]]></title><description><![CDATA[
<p>I guess the sense of accomplishment is very person-dependent. I enjoy programming a lot, but it is easy to find people who would challenge themselves to scale said website to a million users/X views per day. I don't know why; probably there is no fixed meaning to existence, and nature likes diversity.<p>For me, the fun in programming also depends a lot on the task. Recently, I wanted Python configuration classes that can serialize to YAML, but I also wanted to automatically create an ArgumentParser that fills some of the fields. `hydra` from Meta does that, but I wanted something simpler. I asked an agent for a design, but I did not like the convoluted parsing logic it created. I finally designed something by hand by abusing the metadata fields of the `dataclasses.field` calls. It was deeply satisfying to get it to work the way I wanted.<p>But after that, do I really want to create every config class and fill every field by myself for the several scripts/classes that I planned to use? Once the initial template was there, I was happy to just guide the agent to fill in the boilerplate.<p>I agree that we should keep the fun in programming/art, but how we do that depends on the what, the who, and the when.</p>
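<p>A minimal sketch of the field-metadata trick described above. This is not the commenter's actual code; the class name, the "cli" metadata key, and the field names are hypothetical, chosen only to illustrate how `dataclasses.field(metadata=...)` can drive an auto-generated ArgumentParser:

```python
from argparse import ArgumentParser
from dataclasses import dataclass, field, fields

@dataclass
class TrainConfig:
    # The metadata mapping marks which fields should become CLI flags
    # (hypothetical schema; dataclasses carry metadata through untouched).
    lr: float = field(default=1e-4, metadata={"cli": True, "help": "learning rate"})
    epochs: int = field(default=10, metadata={"cli": True, "help": "training epochs"})
    run_name: str = "debug"  # no metadata: config-file only, no CLI flag

def build_parser(cls) -> ArgumentParser:
    """Create an ArgumentParser from the dataclass fields marked with "cli"."""
    parser = ArgumentParser()
    for f in fields(cls):
        if f.metadata.get("cli"):
            parser.add_argument(f"--{f.name}", type=f.type, default=f.default,
                                help=f.metadata.get("help", ""))
    return parser

args = build_parser(TrainConfig).parse_args(["--lr", "3e-4"])
cfg = TrainConfig(**vars(args))  # fields without a flag keep their defaults
print(cfg)  # TrainConfig(lr=0.0003, epochs=10, run_name='debug')
```

<p>The appeal of this approach is that `metadata` is an arbitrary read-only mapping that dataclasses ignore, so the CLI schema lives next to each field without any extra parsing framework.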
]]></description><pubDate>Sun, 04 Jan 2026 08:57:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=46486248</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46486248</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46486248</guid></item><item><title><![CDATA[New comment by musebox35 in "Go Gray, Not Cray: Why You Should Grayscale Your Phone"]]></title><description><![CDATA[
<p>That is likely. Another factor that came to mind is the GPU using less power due to simpler computations. You can store less data for grayscale, so you need to go over less pixel data to apply effects, etc. Whether accessibility controls achieve this or not would be implementation-dependent, I guess.</p>
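<p>A back-of-the-envelope sketch of the theoretical storage gap (whether an OS accessibility filter actually realizes it is, as noted above, implementation-dependent; many simply apply a color filter over an unchanged RGBA framebuffer):

```python
# Pixel-buffer sizes for a single 1920x1080 frame at 8 bits per channel.
width, height = 1920, 1080

rgba_mib = width * height * 4 / 2**20  # RGBA: 4 bytes per pixel
gray_mib = width * height * 1 / 2**20  # grayscale: 1 byte per pixel

print(round(rgba_mib, 2), round(gray_mib, 2))  # 7.91 1.98
```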
]]></description><pubDate>Sun, 28 Dec 2025 05:39:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=46408750</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46408750</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46408750</guid></item><item><title><![CDATA[New comment by musebox35 in "Coursera to combine with Udemy"]]></title><description><![CDATA[
<p>The bitter lesson here is that if you want to control a business, you cannot avoid or outsource marketing. It is a huge part of any trade, and you have to bear the marketing cost.
I totally understand the desire to avoid it and concentrate on the craft and on creating. I tried and failed at that numerous times. I have decided that I will not start a business unless I have partners who understand sales and marketing and are willing to engage in them.</p>
]]></description><pubDate>Thu, 18 Dec 2025 05:53:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=46309322</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46309322</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46309322</guid></item><item><title><![CDATA[New comment by musebox35 in "Everyone in Seattle hates AI"]]></title><description><![CDATA[
<p>If you dig up old ml/vision papers, you will see that formulation-wise they actually did, but they lacked the data, the compute, and the mechanistic machinery provided by the transformer architecture. The wheels of progress turn slowly and require many rotations to finally reach somewhere.</p>
]]></description><pubDate>Thu, 04 Dec 2025 04:50:31 +0000</pubDate><link>https://news.ycombinator.com/item?id=46143918</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46143918</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46143918</guid></item><item><title><![CDATA[New comment by musebox35 in "Everyone in Seattle hates AI"]]></title><description><![CDATA[
<p>I think this is what gets blunted by mass education and most textbooks. We need to discover it again if we want to enjoy our profession amid all the signals flowing from social media about the great things other people are achieving. Staying stupid and hungry really helps.</p>
]]></description><pubDate>Thu, 04 Dec 2025 04:43:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=46143880</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46143880</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46143880</guid></item><item><title><![CDATA[New comment by musebox35 in "Everyone in Seattle hates AI"]]></title><description><![CDATA[
<p>I think this is more of a mechanistic-understanding vs. fundamental-insight kind of situation. The linear algebra picture is currently very mechanistic, since it only tells us what the computations are. There are research groups trying to go beyond that, but the insight from these efforts is currently very limited.
However, the probabilistic view is much clearer. You can have many explorable insights, both potentially true and false, just by understanding the loss functions, what the model is sampling from, what the marginal or conditional distributions are, and so on. Generative AI models are beautiful at that level. It is truly mind-blowing that in 2025 we are able to sample from megapixel image distributions conditioned on natural-language text prompts.</p>
]]></description><pubDate>Thu, 04 Dec 2025 04:36:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=46143850</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46143850</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46143850</guid></item><item><title><![CDATA[New comment by musebox35 in "OpenAI declares 'code red' as Google catches up in AI race"]]></title><description><![CDATA[
<p>I have deep respect for CUDA and Nvidia engineering. However, the arguments above seem to totally ignore Google's search indexing and query software stack. They are the kings of distributed software, and also of hardware that scales. That is why TPUs are a thing now and why Google can compete with Nvidia where AMD failed. Distributed software has been the bread and butter of Google, with multi-decade investment from day zero out of necessity. When you have to update an index over an evolving set of billions of documents daily, and do that online while keeping subsecond query capability across the globe, that teaches you a few things about deep software stacks.</p>
]]></description><pubDate>Wed, 03 Dec 2025 16:44:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=46136656</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46136656</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46136656</guid></item><item><title><![CDATA[New comment by musebox35 in "All it takes is for one to work out"]]></title><description><![CDATA[
<p>That is insightful. Courage to take risks means higher standard deviation in outcomes: more visible successes, but also more hard failures. Risk-averse cultures have more stable outcomes, with no big successes but also fewer financially crippling failures. A personal or social safety net may or may not make you risk-averse. Taking semi-calculated risks seems like a skill that needs to be learned for successful entrepreneurship.</p>
]]></description><pubDate>Sun, 30 Nov 2025 07:20:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=46094568</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46094568</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46094568</guid></item><item><title><![CDATA[New comment by musebox35 in "Is Matrix Multiplication Ugly?"]]></title><description><![CDATA[
<p>The computations in transformers are actually generalized tensor-tensor contractions implemented as matrix multiplications. Their efficient implementation on GPU hardware involves many algebraic gems and is a work of art. You can get a taste of the complexity involved in their design in this YouTube video: <a href="https://www.youtube.com/live/ufa4pmBOBT8" rel="nofollow">https://www.youtube.com/live/ufa4pmBOBT8</a></p>
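<p>As an illustrative sketch (not taken from the video): the attention-score computation is one such contraction, written here once with `einsum` and once as the batched matrix multiplication it lowers to:

```python
import numpy as np

rng = np.random.default_rng(0)
B, H, S, D = 2, 4, 8, 16                     # batch, heads, sequence, head dim
q = rng.standard_normal((B, H, S, D))
k = rng.standard_normal((B, H, S, D))

# Generalized tensor-tensor contraction over the head dimension d:
scores_einsum = np.einsum("bhsd,bhtd->bhst", q, k)

# The same contraction expressed as a batched matmul over the last two axes:
scores_matmul = q @ k.transpose(0, 1, 3, 2)  # (B, H, S, D) @ (B, H, D, S)

assert np.allclose(scores_einsum, scores_matmul)
```

<p>Batching over the leading `(B, H)` axes is what lets GPU matmul kernels do all the heavy lifting for what is, on paper, a four-index contraction.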
]]></description><pubDate>Sat, 22 Nov 2025 06:40:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=46012667</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=46012667</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46012667</guid></item><item><title><![CDATA[New comment by musebox35 in "IDEmacs: A Visual Studio Code clone for Emacs"]]></title><description><![CDATA[
<p>As a 15+ year Emacs user, the only item on my wishlist is a client-server remote editing mode similar to that of VS Code. Then I could go back to using Emacs on cloud VMs. Does anyone know a solution for this that works as well as VS Code even when your latency is high? Hopefully, I will get pissed off enough with all the weird configuration flags of VS Code to write one myself ;-) To be fair, its Python integration is quite good, at least for the usual stuff.</p>
]]></description><pubDate>Sun, 16 Nov 2025 14:39:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=45945454</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45945454</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45945454</guid></item><item><title><![CDATA[New comment by musebox35 in "AMD GPUs Go Brrr"]]></title><description><![CDATA[
<p>In some commercial contexts, with the savings from that 20% you can buy a lot of freedom, and with the freedom you bought you can make more free things :)</p>
]]></description><pubDate>Sat, 15 Nov 2025 16:12:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=45938389</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45938389</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45938389</guid></item><item><title><![CDATA[New comment by musebox35 in "AMD GPUs Go Brrr"]]></title><description><![CDATA[
<p>In certain contexts 20% is a lot of bucks; leaving that on the plate would be very wasteful ;-)</p>
]]></description><pubDate>Sat, 15 Nov 2025 13:59:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=45937523</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45937523</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45937523</guid></item><item><title><![CDATA[New comment by musebox35 in "Yann LeCun to depart Meta and launch AI startup focused on 'world models'"]]></title><description><![CDATA[
<p>Google DeepMind is the closest lab to that idea, because Google is the only entity big enough to get close to the scale of AT&T. I was skeptical that the DeepMind and Google Brain merger would be successful, but it seems to have worked surprisingly well. They are killing it with LLMs and image-editing models. They are also backing the fastest-growing cloud business in the world and collecting Nobel prizes along the way.</p>
]]></description><pubDate>Wed, 12 Nov 2025 14:55:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=45900995</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45900995</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45900995</guid></item><item><title><![CDATA[New comment by musebox35 in "Computer science courses that don't exist, but should (2015)"]]></title><description><![CDATA[
<p>I agree, it is another important factor. Pandemic pay and hiring rates certainly accentuated this.</p>
]]></description><pubDate>Fri, 24 Oct 2025 13:51:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=45694643</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45694643</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45694643</guid></item><item><title><![CDATA[New comment by musebox35 in "Computer science courses that don't exist, but should (2015)"]]></title><description><![CDATA[
<p>Sadly this naturally happens in any field that expands due to its own success. Suddenly, new practitioners outnumber competent educators. I think it is a fundamental human-resources problem with no easy fix. Maybe LLMs will help with this, but they seem to reinforce convergence to the mean in many cases, as those being educated are not in a position to ask the deeper questions.</p>
]]></description><pubDate>Fri, 24 Oct 2025 05:29:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=45691104</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45691104</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45691104</guid></item><item><title><![CDATA[New comment by musebox35 in "Examples are the best documentation"]]></title><description><![CDATA[
<p>I totally agree with your assessment. I just wanted to highlight that “implementation agnostic” is both a blessing and a curse. You can always apply the principles of Diataxis, but it provides near-zero guidance on how to actually build the documentation for a <i>specific</i> project. This does not reduce its conceptual value, but I wish there were another framework with complementary practical value.</p>
]]></description><pubDate>Sun, 19 Oct 2025 14:53:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=45634593</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45634593</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45634593</guid></item><item><title><![CDATA[New comment by musebox35 in "Andrej Karpathy – It will take a decade to work through the issues with agents"]]></title><description><![CDATA[
<p>Waymo already has a driverless taxi service in a major US city and is expanding; Tesla is in the process. Again, this holds only if they cover the last 5%. Scalability arguments won't matter if they cannot launch such a service. And no, CMOS cameras are close but are not better than the human eye in low light, unless you have an IR camera and can flood everywhere with active IR lights; they are certainly inferior in dynamic range. I have been doing vision for more than two decades, and I would not be comfortable in a camera-only robotaxi at high speed. Certainly not at night or under adverse weather conditions. But this is all speculation, of course. Considering that fully autonomous driving at scale has been a major unrealised promise for the past 10 years, I stand by my assessment until I see a major advancement in camera technology or affordable active sensors.</p>
]]></description><pubDate>Sat, 18 Oct 2025 17:01:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=45628747</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45628747</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45628747</guid></item><item><title><![CDATA[New comment by musebox35 in "Andrej Karpathy – It will take a decade to work through the issues with agents"]]></title><description><![CDATA[
<p>And that last 5% is the toughest nut to crack. There is a reason Waymo is way ahead, even if they cannot scale. Cameras are passive devices with relatively poor dynamic range and low-light behavior. They are nowhere near a match for, or replacement of, the human eye. Just try to take a picture of a 5-year-old at dusk or indoors, and what you see will not be what you get.</p>
]]></description><pubDate>Sat, 18 Oct 2025 14:50:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=45627841</link><dc:creator>musebox35</dc:creator><comments>https://news.ycombinator.com/item?id=45627841</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45627841</guid></item></channel></rss>