<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: linguae</title><link>https://news.ycombinator.com/user?id=linguae</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Mon, 13 Apr 2026 20:57:44 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=linguae" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by linguae in "Show HN: Oberon System 3 runs natively on Raspberry Pi 3 (with ready SD card)"]]></title><description><![CDATA[
<p>As a professor who studies operating systems and programming languages, I'm quite familiar with Project Oberon.  Even though this is Hacker News, where many of us know the project, I'm not surprised that many readers are unfamiliar with it: Oberon has nowhere near the user base of more popular programming languages and operating systems, and it's not even covered in many undergraduate courses on those topics.  Most undergraduate OS courses are Unix-focused, centering on Linux, Minix, or xv6.  The Oberon OS is certainly not Unix.  Programming languages and compiler courses tend to vary, but I haven't seen one that uses Oberon.</p>
]]></description><pubDate>Mon, 13 Apr 2026 15:17:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=47753260</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47753260</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47753260</guid></item><item><title><![CDATA[New comment by linguae in "Apple's accidental moat: How the "AI Loser" may end up winning"]]></title><description><![CDATA[
<p>I don’t think it’s AI slop.  Even before modern generative AI, I had noticed a decline in Apple’s software quality.<p>Rather, I feel that Apple has forgotten its roots.  The Mac was “the computer for the rest of us,” and there were usability guidelines backed by research.  What made the Mac stand out against Windows, at a time when Windows had 95%+ market share, was the Mac’s ease of use.  The Mac really stood out in the 2000s, with Panther and Tiger being compelling alternatives to Windows XP.<p>I think Apple is less perfectionistic about its software than it was 15-20 years ago.  I don’t know what caused this change, but I have a few hunches:<p>0.  There’s no Steve Jobs.<p>1.  When the competition is Windows and Android, and there are no other commercial competitors, there’s a temptation to be merely marginally better than Windows/Android rather than the absolute best.  Windows’ shooting itself in the foot doesn’t help matters.<p>2.  The amazing performance and energy efficiency of Apple Silicon is carrying the Mac.<p>3.  Many of the people who shaped the culture of Apple’s software from the 1980s to the 2000s have retired or even passed away.  Additionally, not many young software developers have heard of people like Larry Tesler, Bill Atkinson, Bruce Tognazzini, Don Norman, and others who shaped Apple’s UI/UX principles.<p>4.  Speaking of Bruce Tognazzini and Don Norman, I am reminded of this 2015 article (<a href="https://www.fastcompany.com/3053406/how-apple-is-giving-design-a-bad-name" rel="nofollow">https://www.fastcompany.com/3053406/how-apple-is-giving-desi...</a>) where they criticized Apple’s design as focused on form over function.  It’s only gotten worse since 2015.  The saving grace for Apple is that the rest of the industry has gone even further in reducing usability.<p>I think what it will take for Apple to readopt its perfectionism is competition forcing it to.</p>
]]></description><pubDate>Mon, 13 Apr 2026 04:37:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=47747660</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47747660</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47747660</guid></item><item><title><![CDATA[New comment by linguae in "Apple Silicon and Virtual Machines: Beating the 2 VM Limit (2023)"]]></title><description><![CDATA[
<p>I have a personal Framework 13 and a work-issued MacBook Pro.  I love Framework’s mission of providing user-serviceable hardware; we need upgradable, serviceable machines.  However, the battery life on my MacBook Pro is dramatically better than on my Framework.  Moreover, Apple Silicon offers excellent performance on top of its energy efficiency.  While I use Windows 11 on my Framework, I prefer macOS.<p>Additionally, today’s sky-high RAM and SSD prices have caused an unexpected situation: Apple’s inflated prices for RAM and SSD upgrades don’t look that bad compared to paying market prices for DIMMs and NVMe SSDs.  Yes, the Framework has the advantage of being upgradable, meaning that if RAM and SSD prices decrease, upgrades will be cheaper in the future, whereas with a Mac you can’t (easily) upgrade the RAM and storage after purchase.  However, for someone who needs a computer right now and is willing to purchase another one in a few years, a new Mac looks appealing, especially considering the benefits of Apple Silicon.</p>
]]></description><pubDate>Sun, 12 Apr 2026 01:41:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=47735483</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47735483</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47735483</guid></item><item><title><![CDATA[New comment by linguae in "LibreOffice – Let's put an end to the speculation"]]></title><description><![CDATA[
<p>Apart from LibreOffice and OpenOffice, none of the tools you mentioned is free-as-in-freedom, and if you’re using Linux on the desktop, Microsoft Office and the Apple iWork suite are unavailable as desktop applications.</p>
]]></description><pubDate>Sun, 05 Apr 2026 21:38:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=47654126</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47654126</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47654126</guid></item><item><title><![CDATA[New comment by linguae in "U.S. stocks are set to deliver their worst quarter in nearly four years"]]></title><description><![CDATA[
<p>I interpreted the clause “two poor alternatives in a row” as Biden + Harris in the 2024 presidential election, not Clinton + Harris, since Clinton was the 2016 nominee and Harris was the 2024 nominee after Biden dropped out; the 2020 nominee was Biden, who did successfully defeat Trump that year.<p>In my opinion, Clinton’s and Harris’ losses had less to do with their gender and more to do with the candidates themselves:<p>1.  Clinton was facing strong anti-establishment headwinds, and Clinton is a very establishment politician.  Many people in 2016 were hopping mad at establishment politicians.  Trump was able to win the GOP nomination on a platform of “draining the swamp” and pursuing an aggressively right-wing agenda compared to more moderate Republicans, and Sanders, who also had an anti-establishment platform, proved to be a formidable opponent to Clinton.  Despite her loss, Clinton still won the popular vote.  Perhaps had there been less anti-establishment sentiment, it would have been a Clinton vs Jeb Bush election, and I believe Clinton would have won that race.<p>2.  Harris never won a presidential primary.  The only reason she became the nominee is that Biden dropped out of the race after his disastrous debate performance against Trump, which occurred after the primaries.  Since it was too late to have the voters decide on a replacement for Biden, the Democratic Party selected one: Harris.  She had only a few months to campaign, whereas Trump had been campaigning for virtually his entire time out of office.<p>3.  Let’s not forget the Trump factor in 2024.  During Biden’s entire presidency, Trump was able to consolidate his hold on the GOP and his voting base, and in some ways he even expanded that base.  The conservative media was filled with defenses of January 6, and Trump was able to convince enough Americans that he and his supporters were persecuted in the aftermath of the 2020 election and January 6.</p>
]]></description><pubDate>Tue, 31 Mar 2026 14:53:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=47588237</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47588237</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47588237</guid></item><item><title><![CDATA[New comment by linguae in "U.S. stocks are set to deliver their worst quarter in nearly four years"]]></title><description><![CDATA[
<p>I believe Trump would have won in 2020 had the COVID pandemic not happened.  Things were very chaotic in 2020 America.  Biden and his extensive experience in the federal government looked reassuring to a lot of Americans.  Biden would have had a tougher time against Trump had 2020 been more like 2019.  I also believe Biden would have had a tougher time against Bernie Sanders in the primaries had COVID not happened, though a counterargument is that Super Tuesday happened on March 3, before shelter-in-place policies were in effect in California.<p>A big reason for Trump's success despite his polarizing nature is the polarizing platforms of our two parties, which distinguish themselves on "culture war" issues such as abortion, gun rights, immigration, LGBT+ rights, and race relations.  There are many Americans who love the MAGA agenda, and there are also many Americans who are not in 100% agreement with MAGA but who'd never vote for a Democrat, since they feel that a candidate with the opposite cultural views is anathema.  If third parties were more viable in America, the latter group of voters could vote for a candidate more suited to their temperament instead of voting for whoever the GOP nominee happens to be.</p>
]]></description><pubDate>Tue, 31 Mar 2026 13:31:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=47587117</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47587117</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47587117</guid></item><item><title><![CDATA[New comment by linguae in "Apple discontinues the Mac Pro"]]></title><description><![CDATA[
<p>As much as I love alluring designs such as the NeXT Cube (which I have), the Power Mac G4 Cube (which I wish I had), and the 2013 Mac Pro (which I also have), sometimes a person needs a big, hulking box of computational power with room for internal expansion, and from the first Quadra tower in the early 1990s until the 2012 Mac Pro was discontinued, and again from 2019 until today, Apple delivered this.<p>Even so, the ARM Mac Pro felt more like a halo car than a workhorse.  It might have been more compelling had it supported GPUs.  Without that support, the price premium of the Mac Pro over the Mac Studio was too great for many people to justify, unless they absolutely needed internal expansion.<p>I’d love a user-upgradable Mac like my 2013 Mac Pro, but it’s clear that Apple has long moved on with its ARM Macs.  I’ve moved on to the PC ecosystem.  On one hand, ARM Macs are quite powerful and energy-efficient; on the other hand, they’re very expensive in non-base RAM and storage configurations, though with today’s crazy prices for DDR5 RAM and NVMe SSDs, Apple’s upgrade prices don’t look that bad by comparison.</p>
]]></description><pubDate>Fri, 27 Mar 2026 04:13:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=47539043</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47539043</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47539043</guid></item><item><title><![CDATA[New comment by linguae in "Apple Confirms Mac Pro Is Dead, No Future Models Planned"]]></title><description><![CDATA[
<p>I believe this is the first time since the introduction of the Macintosh II in 1987 that there are no Macs in Apple's lineup offering some combination of upgradeable RAM, upgradeable storage, and internal expansion slots.  The 2013 Mac Pro lacked internal expansion slots but still had DIMM slots and an SSD slot.  The 2019 Mac Pro brought back expansion slots, though the 2023 Mac Pro took away DIMM slots in favor of the unified memory architecture found in all ARM Macs.<p>I have mixed feelings about this.  On one hand, I miss being able to upgrade RAM at a later date without having to pay up-front for all of the RAM I'm expected to use for the lifetime of the machine.  This is especially painful in 2026 with today's sky-high RAM prices caused by intense demand.  On the other hand, the memory bandwidth in Apple's ARM Macs is tremendous, especially in higher-end Macs, due to the tight integration of the design.  This matters greatly in memory-intensive applications such as generative AI.  I feel less bad about non-expandable RAM given the tradeoffs, though it still makes for quite expensive computing, especially at 2026 RAM prices.<p>I guess Apple has finally achieved Steve Jobs' original Macintosh vision of closed-off appliances, though (thankfully) the NeXT Cube and the NeXTstation were not like that.  RIP to Jean-Louis Gassée's vision of expandable, upgradeable Macs, starting with the Macintosh II in 1987 and leading to other fine Macs such as the Macintosh IIfx, the Quadra lineup, high-end Power Macs (8100, 8500, 9500, 8600, 9600, G3, G4, G5), and the Mac Pro.</p>
]]></description><pubDate>Thu, 26 Mar 2026 22:35:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=47536697</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47536697</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47536697</guid></item><item><title><![CDATA[New comment by linguae in "An unsolicited guide to being a researcher [pdf]"]]></title><description><![CDATA[
<p>Indeed.  It seems, at least in America (I’m less familiar with the situation abroad), that computer science researchers who want to do longer-term work are getting squeezed.  Less funding means fewer research positions in academia.  Industry has many opportunities, especially in AI, but industry tends to favor shorter-term, product-focused research over longer-term work with fewer immediate prospects for productization.  This is a great environment for many researchers, but researchers who want to work on longer-term, “blue-skies” projects might not find a suitable position in industry these days.</p>
]]></description><pubDate>Mon, 23 Mar 2026 17:54:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=47492861</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47492861</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47492861</guid></item><item><title><![CDATA[New comment by linguae in "Why craft-lovers are losing their craft"]]></title><description><![CDATA[
<p>I wholeheartedly agree.  Computing professions such as software engineering used to feel like, "Wow, they're <i>paying</i> me to do this!"  Yes, there was real work involved, but for many of us it never felt like drudgery, and we produced, shipped, and made our customers, managers, and other stakeholders happy.  I remember a time (roughly 20 years ago) when zealous enthusiasts would proudly profess that they'd work for companies like Apple or Google for <i>free</i> if they could work on their dream projects.<p>Times have changed.  The field has become much more serious about making money; fantasies about volunteering at Apple have been replaced with fantasies about very large salaries and RSU grants.  Simultaneously (and I don't think coincidentally), the field has become less fun.  I recognize how privileged it sounds to talk about "fun", given that for most of humanity, work isn't about having fun and personal fulfillment, but about making the money required to house, feed, and clothe themselves and their loved ones.  Even with its drudgery, corporate life beats the working conditions and the abuse that many other occupations endure.<p>Still, let's pour one out for a time when the interests and passions of computing enthusiasts did line up with the interests of the corporate world.</p>
]]></description><pubDate>Sun, 22 Mar 2026 04:03:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=47474340</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47474340</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47474340</guid></item><item><title><![CDATA[New comment by linguae in "Why craft-lovers are losing their craft"]]></title><description><![CDATA[
<p>My take is that there used to be a significant overlap between hobbyist-style exploration/coding and what industry wanted, especially during the PC revolution, when companies like Apple and Microsoft were started by hobbyists selling their creations to other people.  This continued through the 1990s and the 2000s; we know the story of how Mark Zuckerberg started Facebook from his Harvard dorm room.  I am a 90s kid who was inspired by the stories of Steve Jobs and Bill Gates to pursue a computing career.  I was also inspired by Bell Labs and Xerox PARC researchers.<p>The “hacker-friendliness” of software industry employment has been eroding for the past decade or so, and generative AI is another factor strengthening the position of business owners and managers.  Perhaps this is the maturing of the software development field.  Back when computers were new and few people were skilled in computing, employment was more favorable to hobbyists.  Over time the frontiers of computing have been settled, which reduced the need for explorers, and so explorers have been sidelined in favor of different types of workers.  LLMs are another step; while I’m not sure that LLMs could do academic research in computer science, they are already capable of doing software engineering tasks that undergraduates and interns could do.<p>I think what some of us are mourning is the closing of a frontier, of our figurative pastures being turned into suburban subdivisions.  It’s bigger than generative AI; it’s a field that is less dependent on hobbyists for its future.<p>There will always be other frontiers, and even in computing there are still interesting areas of research and areas where hobbyists can contribute.  But I think much of the software industry has moved in a direction where its ethos differs from the ethos of enthusiasts.</p>
]]></description><pubDate>Sun, 22 Mar 2026 02:38:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=47473917</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47473917</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47473917</guid></item><item><title><![CDATA[New comment by linguae in "Why craft-lovers are losing their craft"]]></title><description><![CDATA[
<p>I’ve come to the same conclusion, though my line of work was research rather than software engineering.  “He who pays the piper calls the tune.”  It was fun as long as I enjoyed the tunes being called, but the tunes changed, and I became less interested in playing.<p>I am now a tenure-track community college professor.  I’m evaluated entirely on my teaching and service.  While teaching a full course load is intense, and while my salary is nowhere near what a FAANG engineer makes, I get three months of summer break and one month of winter break every year to rejuvenate and to work on personal projects, with nobody telling me what research projects to work on, how frequently I should publish, or how fast I should ship code.<p>This quote from J. J. Thomson resonates with me, and it’s more than 100 years old:<p>"Granting the importance of this pioneering research, how can it best be promoted? The method of direct endowment will not work, for if you pay a man a salary for doing research, he and you will want to have something to point to at the end of the year to show that the money has not been wasted. In promising work of the highest class, however, results do not come in this regular fashion, in fact years may pass without any tangible results being obtained, and the position of the paid worker would be very embarrassing and he would naturally take to work on a lower, or at any rate a different plane where he could be sure of getting year by year tangible results which would justify his salary. The position is this: You want this kind of research, but, if you pay a man to do it, it will drive him to research of a different kind. The only thing to do is to pay him for doing something else and give him enough leisure to do research for the love of it." (from <a href="https://archive.org/details/b29932208/page/198/mode/2up" rel="nofollow">https://archive.org/details/b29932208/page/198/mode/2up</a>).</p>
]]></description><pubDate>Sun, 22 Mar 2026 02:17:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47473790</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47473790</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47473790</guid></item><item><title><![CDATA[New comment by linguae in "Wayland set the Linux Desktop back by 10 years?"]]></title><description><![CDATA[
<p>I remember first learning about GNUstep in 2004, when I was in high school.  It's a shame GNUstep never took off; we could have had an ecosystem of applications running on both macOS and Linux with native GUIs.<p>With that said, the dream is not dead.  There's a project named Gershwin (<a href="https://github.com/gershwin-desktop/gershwin-desktop" rel="nofollow">https://github.com/gershwin-desktop/gershwin-desktop</a>), a Mac-like desktop environment built on top of GNUstep.  Gershwin appears to be heavily inspired by Apple Rhapsody (<a href="https://en.wikipedia.org/wiki/Rhapsody_(operating_system)" rel="nofollow">https://en.wikipedia.org/wiki/Rhapsody_(operating_system)</a>) with some modern touches.</p>
]]></description><pubDate>Fri, 20 Mar 2026 01:39:31 +0000</pubDate><link>https://news.ycombinator.com/item?id=47449299</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47449299</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47449299</guid></item><item><title><![CDATA[New comment by linguae in "Ask HN: What is it like being in a CS major program these days?"]]></title><description><![CDATA[
<p>I'd like to give my perspective as a computer science professor at Ohlone College, a two-year community college located in Silicon Valley.  I used to work as an AI researcher in industry (but not in large language models) before becoming a tenure-track instructor in Fall 2024.<p>Our core computer science curriculum consists of five courses: (1) an introductory programming course taught in a procedural subset of C++ (there's a small sketch of what I mean by that at the end of this comment), (2) an object-oriented programming course taught in C++, (3) a data structures and algorithms course taught in C++, (4) a discrete mathematics course, and (5) an assembly language course that also covers basic computer architecture.  Students who pass all five courses are prepared to transfer to a four-year university to complete their undergraduate computer science programs.  The majority of our students transfer to either San Jose State University or California State University East Bay, though many transfer to University of California campuses, typically UC Davis, UC Santa Cruz, UC Merced, and UC Irvine.<p>Because I teach introductory freshman- and sophomore-level courses, I feel it is vital for students to build a strong foundation in basic programming and basic computer science before using generative AI tools, and thus I do not accept programming assignments completed with generative AI tools.  I admit that I'd have a different, more nuanced stance if I were teaching upper-division or graduate-level computer science courses.  I have found that students who rely on generative AI for programming tend to struggle more on exams, and they also tend to lack an understanding of the programming language constructs the generated program used.<p>With that said, I recognize that generative AI tools are likely to become more powerful and cheaper over time.  As much as I don't like this brave new world where students can cheat with even less friction, we professors need to stay on top of things, and so I will be spending the entire month of June (one third of my summer break) getting up to speed with large language models, both from a user's point of view and from an AI research point of view.<p>Whenever my students wonder whether it's worth studying computer science in light of the current job market and anxieties about AI replacing programmers, I tell them two things.  The first is that computers and computation are very interesting things to study in their own right.  Even if AI dramatically reduces the number of software engineering jobs, there will still be a need for people who understand how computers and computation work.<p>The second is that economic conditions are not always permanent.  I was a freshman at Cal Poly San Luis Obispo in 2005, when computer science enrollment bottomed out in the United States.  In high school, well-meaning counselors and teachers warned me about the post-dot-com-bust job market and about outsourcing to India and other countries.  I was an avid Slashdot reader, and the piece of advice I kept reading was to forgo computer science and earn a business degree.  However, I was a nerd who loved computers and who started programming at nine years old.  I even wrote an essay in high school saying that I'd move to India if that's where all of the jobs were going to end up.  The only other things I could imagine majoring in at the time were mathematics and linguistics, and neither major was known for excellent job prospects.  Thus, I decided to major in computer science.<p>A funny thing happened while I was at Cal Poly.  Web 2.0, smartphones, cloud computing, and big data took off during my undergraduate years.  My classmates and I were able to get internships at prestigious companies, even during the economic crisis of 2008-09.  Upon graduation, I ended up doing an internship in Japan at a major Japanese tech company and then started a PhD program at UC Santa Cruz, but many of my classmates ended up at companies like Microsoft, Apple, and Google, just in time for the tech industry to enter an extended gold rush, from roughly 2012, when Facebook went public, until 2022, when interest rates started to go up.  Many of my classmates made out like bandits financially.  Me?  I made different choices, going down a research/academic path; I still live in an apartment, and I have no stock to my name.  I have no regrets, except maybe for not getting into Bitcoin in 2011 when I first heard about it....  Though I'm not "Silicon Valley successful", I'm living a much better life today than I was in high school, when my parents' low income qualified me for Pell Grants and subsidized student loans to help pay for my Cal Poly education.<p>I still believe in the beauty of an undergraduate curriculum that encourages critical thinking and develops problem-solving skills, as opposed to merely teaching the industry topics du jour.  Specific tools often come and go; my 2005 Linux system administration knowledge didn't cover systemd and Wayland since they didn't exist at the time, but my copies of <i>Introduction to Algorithms</i> by Cormen et al. and my Knuth volumes remain relevant.</p>
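<p>To clarify what I mean by a "procedural subset of C++": functions, arrays, loops, and stream I/O, but no user-defined classes.  Here's a minimal sketch of the flavor of program our intro students write (this is purely illustrative, not an actual assignment of ours):</p><pre><code>#include <iostream>

// Compute the arithmetic mean of the first `count` exam scores.
double mean(const double scores[], int count) {
    double total = 0.0;
    for (int i = 0; i < count; ++i) {
        total += scores[i];
    }
    return (count > 0) ? total / count : 0.0;
}

int main() {
    const int MAX_SCORES = 100;
    double scores[MAX_SCORES];
    int count = 0;
    double value;

    // Read scores until end-of-input or until the array is full.
    while (count < MAX_SCORES && std::cin >> value) {
        scores[count] = value;
        ++count;
    }

    std::cout << "Read " << count << " scores; average: "
              << mean(scores, count) << std::endl;
    return 0;
}
</code></pre>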
]]></description><pubDate>Mon, 16 Mar 2026 19:04:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47403289</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47403289</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47403289</guid></item><item><title><![CDATA[New comment by linguae in "Intel Demos Chip to Compute with Encrypted Data"]]></title><description><![CDATA[
<p>I was just thinking about this a few days ago, but not just for the CPU (for which we have RISC-V and OpenPOWER): I'd want an entire open system, including the GPU, audio, disk controllers, networking, etc.  I think a great target would be mid-2000s graphics and networking; I could go back to a 2006 Mac Pro without too much hardship.  Having a fully open equivalent to mid-2000s hardware would be a boon for open computing.</p>
]]></description><pubDate>Tue, 10 Mar 2026 18:30:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=47327067</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47327067</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47327067</guid></item><item><title><![CDATA[New comment by linguae in "FreeBSD 14.4-Release Announcement"]]></title><description><![CDATA[
<p>I love the BSDs; I have the most experience with FreeBSD, I regularly use macOS, and lately I’ve been learning NetBSD because of its rump kernels.<p>With that said, with the decline of commercial Unix and the dominance of Linux, POSIX, in my opinion, has become less important, and in its place Linux seems to be the standard.  I prefer the BSDs to Linux for their design and documentation, but Linux has better hardware support, and the FOSS ecosystem, especially the desktop, is increasingly embracing Linuxisms such as Wayland and systemd.  The FOSS BSD ecosystems are too small to counter the Linuxization of the Unix ecosystem, and I feel that Apple does not pay much attention to the BSD side of macOS these days.<p>I don’t expect the BSDs to die, but I do believe they’ll need to find ways to adapt to an increasingly Linux-dominated FOSS ecosystem.</p>
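<p>For anyone unfamiliar with rump kernels: they let you boot NetBSD kernel components inside an ordinary userspace process and then make system calls against that in-process kernel.  Here's a rough sketch of what a client looks like, written from memory; the exact headers, the RUMP_O_* flag names, and the list of libraries to link are my recollection rather than gospel, so consult the NetBSD rump(3) man pages before trusting them:</p><pre><code>// Sketch of a rump kernel client.  Build on NetBSD, roughly:
//   c++ rumpdemo.cpp -lrumpvfs -lrump -lrumpuser -lpthread
// (Library list and RUMP_O_* names are from memory; see rump(3).)
#include <rump/rump.h>
#include <rump/rump_syscalls.h>

#include <cstdio>
#include <cstring>

int main() {
    // Boot a NetBSD rump kernel inside this very process; no root
    // privileges or virtual machine needed.
    if (rump_init() != 0) {
        std::fprintf(stderr, "rump_init failed\n");
        return 1;
    }

    // Talk to the in-process kernel through rump_sys_* system calls;
    // it serves a small in-memory root file system by default.
    int fd = rump_sys_open("/hello.txt", RUMP_O_CREAT | RUMP_O_RDWR, 0644);
    if (fd < 0) {
        std::fprintf(stderr, "rump_sys_open failed\n");
        return 1;
    }

    const char msg[] = "written to a file inside a userspace kernel\n";
    rump_sys_write(fd, msg, std::strlen(msg));
    rump_sys_close(fd);
    return 0;
}
</code></pre>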
]]></description><pubDate>Tue, 10 Mar 2026 14:39:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=47323922</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47323922</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47323922</guid></item><item><title><![CDATA[New comment by linguae in "FrameBook"]]></title><description><![CDATA[
<p>That’s what happened to my 2006 Core Duo MacBook after about three or four years of use.  It was an excellent laptop that was quite user-serviceable (I upgraded the RAM and hard drive), but I did have problems with the palmrests, and the Ethernet port stopped working after four years.<p>It was my first Apple laptop and I have fond memories of using it during my college years.</p>
]]></description><pubDate>Sun, 08 Mar 2026 15:53:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=47298307</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47298307</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47298307</guid></item><item><title><![CDATA[New comment by linguae in "On the Design of Programming Languages (1974) [pdf]"]]></title><description><![CDATA[
<p>There is a lot of good computer science, but the computer science community today is vastly larger than it was in the 1960s and 1970s, when Dijkstra, Knuth, Wirth, and others became legends.  There are so many subfields of CS, each with its own deep literature and legendary figures.  These factors make it difficult to be a modern Dijkstra or Knuth, though to be fair, it was an impressive feat for Dijkstra to be Dijkstra and for Knuth to be Knuth even in their heydays.  It’s just easier to get famous in an upstart field than in a mature one.<p>I think there are two typical paths to widespread visibility across CS subfields: (1) publishing a widely adopted textbook, and (2) writing commonly used software.  For example, many computer scientists know about Patterson and Hennessy through their famous computer architecture textbooks, and many know about people like Jeff Dean through their software.<p>Reading more academically oriented literature such as the ACM’s monthly periodical “Communications of the ACM” is also a good way to stay acquainted with the latest developments in computer science.</p>
]]></description><pubDate>Wed, 04 Mar 2026 17:08:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=47250578</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47250578</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47250578</guid></item><item><title><![CDATA[New comment by linguae in "LibreOffice hits back at critics, says its UI is better than Microsoft Office's"]]></title><description><![CDATA[
<p>I prefer the classic Microsoft Office toolbar/menu interface to the ribbon, but I grew up on Office 97.  A thought that dawned on me: the ribbon has existed for nearly 20 years, having debuted in Microsoft Office 2007.  There is now an entire generation of computer users who have never used pre-ribbon versions of Microsoft Office.<p>I don't know what it's like to use modern Microsoft Office with no experience of traditional toolbars and menus and then switch to LibreOffice, which still uses them.<p>I prefer traditional toolbars and menus, but I remember Microsoft doing user studies when developing the original Office 2007 ribbon, and those studies showed that the ribbon was more productive for beginners and casual users.  Given that many office suite users are casual users of word processors, spreadsheets, and presentation tools, Microsoft Office may be more productive for them than LibreOffice.  It would be good if LibreOffice did user studies of its own.</p>
]]></description><pubDate>Tue, 03 Mar 2026 01:51:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=47226913</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47226913</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47226913</guid></item><item><title><![CDATA[New comment by linguae in "The Windows 95 user interface: A case study in usability engineering (1996)"]]></title><description><![CDATA[
<p>Steve Jobs is famous for his 1996 quote about Microsoft not having taste (<a href="https://www.youtube.com/watch?v=UiOzGI4MqSU" rel="nofollow">https://www.youtube.com/watch?v=UiOzGI4MqSU</a>).  I disagree; as much as I love the classic Mac OS and Jobs-era Mac OS X, and despite my feelings about Microsoft's monopolistic behavior, Microsoft's 1995-2000 user interfaces were quite tasteful, in my opinion, and this was Microsoft's most tasteful period.  I have fond memories of Windows 95/NT 4/98/2000, Office 97, and Visual Basic 6.  I even liked Internet Explorer 5.  These were well-made products when it came to the user interface.  Yes, Windows 95 crashed a lot, but so did Macintosh System 7.<p>Things started going downhill, in my opinion, with the Windows XP "Fisher-Price" Luna interface and the Microsoft Office 2007 ribbon.</p>
]]></description><pubDate>Sat, 28 Feb 2026 23:42:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=47201788</link><dc:creator>linguae</dc:creator><comments>https://news.ycombinator.com/item?id=47201788</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47201788</guid></item></channel></rss>