<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: millstone</title><link>https://news.ycombinator.com/user?id=millstone</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Tue, 07 Apr 2026 05:38:27 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=millstone" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by millstone in "GNU Autotools: A Tutorial [pdf]"]]></title><description><![CDATA[
<p>I think it is simply that make is too crufty to extend. It has perma-broken behavior around space handling, which nobody dares touch [1]. But you can't replace it because it is entrenched, so Makefiles must be wrapped, by configure or CMake or whatever. And these in turn have their own cruft, so must be wrapped, by autoconf and...well someone please wrap CMake!<p>The C and C++ ecosystem is crying out for a new build system, but there is no mechanism to converge on one. It's bad.<p>1: <a href="http://savannah.gnu.org/bugs/?712" rel="nofollow">http://savannah.gnu.org/bugs/?712</a></p>
]]></description><pubDate>Wed, 31 Mar 2021 05:04:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=26643712</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26643712</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26643712</guid></item><item><title><![CDATA[New comment by millstone in "GNU Autotools: A Tutorial [pdf]"]]></title><description><![CDATA[
<p>What does "get it all" mean? All of what?</p>
]]></description><pubDate>Wed, 31 Mar 2021 04:50:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=26643638</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26643638</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26643638</guid></item><item><title><![CDATA[New comment by millstone in "Ubiquiti starts serving ads in their management interface"]]></title><description><![CDATA[
<p>What did you buy instead?</p>
]]></description><pubDate>Tue, 30 Mar 2021 06:28:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=26631712</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26631712</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26631712</guid></item><item><title><![CDATA[New comment by millstone in "Ubiquiti starts serving ads in their management interface"]]></title><description><![CDATA[
<p>Please tell us more.</p>
]]></description><pubDate>Tue, 30 Mar 2021 06:10:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=26631642</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26631642</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26631642</guid></item><item><title><![CDATA[New comment by millstone in "Ubiquiti starts serving ads in their management interface"]]></title><description><![CDATA[
<p>Pitch to a VP with "we will do a tiny trial of 1% of customers, if it doesn't work it's GONE, no harm done."<p>Gets the greenlight, everyone who worked on it is invested and motivated to make the results look good, and it builds from there.</p>
]]></description><pubDate>Tue, 30 Mar 2021 05:55:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=26631569</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26631569</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26631569</guid></item><item><title><![CDATA[New comment by millstone in "Ubiquiti starts serving ads in their management interface"]]></title><description><![CDATA[
<p>Can you say more? What happens if you try?</p>
]]></description><pubDate>Tue, 30 Mar 2021 05:43:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=26631524</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26631524</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26631524</guid></item><item><title><![CDATA[New comment by millstone in "Does Amazon make more from ads than AWS?"]]></title><description><![CDATA[
<p>Amazon ads are contrary to the company's mission statement of being the "most customer centric company." They could improve the customer experience at one stroke by eliminating pay-for-placement.</p>
]]></description><pubDate>Sun, 28 Mar 2021 05:11:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=26608143</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26608143</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26608143</guid></item><item><title><![CDATA[New comment by millstone in "The Pursuit of Appiness (2020)"]]></title><description><![CDATA[
<p>What a sad vision. Per the article, the most important thing about software is that it is safe. The role of an OS is to protect against abuse. The web is inherently safe; therefore we should use web apps instead of native apps. This is profound technological pessimism.<p>A different vision: computers empower people. An OS provides APIs and UI conventions, and apps can use them to build upon each other, so that users can bring their knowledge from one app to another. Users invest in learning advanced techniques because it's worth it, because all our software participates (think keyboard shortcuts or Unix pipes).<p>The web has none of this. Websites are not cultivating a new UI vocabulary. Most websites don't support even basic interactions beyond click and tap. They may even actively parasitize the old: Google Docs supports cmd-Z but not Edit->Undo; we are being taught to distrust our UI, and the OS itself is being eroded.<p>> users who cannot perceive or experience the web delivering great experiences<p>Then build those great experiences and Apple will change its tune. Seriously.<p>> The web was a lifeboat away from native apps for Windows XP users. That lifeboat won't come for iPhone owners because Apple won't allow it.<p>iPhone owners are happy and Apple knows it. On the iPhone, web apps are not a lifeboat; instead the web is displacing high-quality native apps with alien-feeling lowest-common-denominator stuff. No thanks.</p>
]]></description><pubDate>Sat, 27 Mar 2021 07:06:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=26600550</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26600550</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26600550</guid></item><item><title><![CDATA[New comment by millstone in "Why OO Sucks by Joe Armstrong (2000)"]]></title><description><![CDATA[
<p>It does not. -XOverloadedStrings unlocks `fromString :: String -> a`. That is not a polymorphic string interface; it's just syntax sugar for making something else from a string literal.</p>
]]></description><pubDate>Fri, 26 Mar 2021 06:28:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=26589309</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26589309</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26589309</guid></item><item><title><![CDATA[New comment by millstone in "Why OO Sucks by Joe Armstrong (2000)"]]></title><description><![CDATA[
<p>Right. But OOP makes it central, and builds around it, while FP de-emphasizes it in preference to ADTs.<p>Strings are a good illustration. Instead of an abstract polymorphic String type, Haskell provided a concrete String type as an ADT. This proved too inflexible, which is why we have Text, lazy Text, ShortText, etc. Compare to NSString which also has multiple representations but hides them behind a single polymorphic interface.</p>
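<p>As a sketch of that contrast, here is a hypothetical Rust analogue of the NSString-style design (the `Str` trait and `ShortStr` type are invented for illustration): clients code against one polymorphic interface, while multiple concrete representations hide behind it.

```rust
// Hypothetical polymorphic string interface: one abstraction, many
// representations, in the spirit of NSString's class cluster.
trait Str {
    fn len_chars(&self) -> usize;
    fn to_owned_string(&self) -> String;
}

// Representation 1: an ordinary heap-allocated string.
impl Str for String {
    fn len_chars(&self) -> usize { self.chars().count() }
    fn to_owned_string(&self) -> String { self.clone() }
}

// Representation 2: a fixed inline buffer for short strings.
struct ShortStr {
    buf: [u8; 15],
    len: u8,
}

impl Str for ShortStr {
    fn len_chars(&self) -> usize {
        std::str::from_utf8(&self.buf[..self.len as usize])
            .map(|s| s.chars().count())
            .unwrap_or(0)
    }
    fn to_owned_string(&self) -> String {
        String::from_utf8_lossy(&self.buf[..self.len as usize]).into_owned()
    }
}

// Client code sees only the interface, so new representations can be
// added later without breaking it.
fn shout(s: &dyn Str) -> String {
    s.to_owned_string().to_uppercase()
}

fn main() {
    let heap = String::from("hello");
    let mut buf = [0u8; 15];
    buf[..2].copy_from_slice(b"hi");
    let short = ShortStr { buf, len: 2 };
    assert_eq!(heap.len_chars(), 5);
    assert_eq!(short.len_chars(), 2);
    assert_eq!(shout(&heap), "HELLO");
    assert_eq!(shout(&short), "HI");
}
```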
]]></description><pubDate>Fri, 26 Mar 2021 03:48:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=26588564</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26588564</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26588564</guid></item><item><title><![CDATA[New comment by millstone in "Why OO Sucks by Joe Armstrong (2000)"]]></title><description><![CDATA[
<p>I agree with that. If you create accessors as a matter of course, then you are mostly adding dead weight. But accessors can add a useful layer of abstraction at the interface layer, as you say.</p>
]]></description><pubDate>Fri, 26 Mar 2021 02:12:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=26588121</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26588121</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26588121</guid></item><item><title><![CDATA[New comment by millstone in "Why OO Sucks by Joe Armstrong (2000)"]]></title><description><![CDATA[
<p>OOP (as Alan Kay conceived it) was explicitly inspired by biology. Objects are cells and communicate through exchanging messages. State is local and hidden, and data itself disappears, which means that the program as written may be ignorant of how operations are performed.</p>
]]></description><pubDate>Fri, 26 Mar 2021 01:52:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=26587998</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26587998</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26587998</guid></item><item><title><![CDATA[New comment by millstone in "Why OO Sucks by Joe Armstrong (2000)"]]></title><description><![CDATA[
<p>"Safety" and "performance" are not always the most important considerations. For example, Apple uses OOP so that its frameworks can evolve without breaking client apps. NSDictionary is a dynamic object because it permits the implementation to be changed or replaced, and this comes at a cost of performance.</p>
]]></description><pubDate>Fri, 26 Mar 2021 01:30:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=26587877</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26587877</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26587877</guid></item><item><title><![CDATA[New comment by millstone in "Why OO Sucks by Joe Armstrong (2000)"]]></title><description><![CDATA[
<p>getters, setters, and factories allow the implementation to evolve without breaking client code.</p>
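<p>A minimal sketch of that point in Rust (the `Temperature` type is made up): because the field is private and clients go through the factory and accessors, the stored representation can change without breaking them.

```rust
// Hypothetical library type whose internal unit changed between versions.
pub struct Temperature {
    kelvin: f64, // was `celsius: f64` in an earlier version
}

impl Temperature {
    // Factory: clients construct through this, never by naming fields.
    pub fn from_celsius(c: f64) -> Self {
        Temperature { kelvin: c + 273.15 }
    }

    // Getter: same signature as before the representation change.
    pub fn celsius(&self) -> f64 {
        self.kelvin - 273.15
    }

    // Setter: also insulates clients from the internal unit.
    pub fn set_celsius(&mut self, c: f64) {
        self.kelvin = c + 273.15;
    }
}

fn main() {
    let mut t = Temperature::from_celsius(20.0);
    t.set_celsius(25.0);
    // Client code written against the old Celsius-backed version still works.
    assert!((t.celsius() - 25.0).abs() < 1e-9);
}
```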
]]></description><pubDate>Fri, 26 Mar 2021 00:00:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=26587262</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26587262</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26587262</guid></item><item><title><![CDATA[New comment by millstone in "Closing web browser windows doesn't close connections"]]></title><description><![CDATA[
<p>The behavior is disabled in private browsing. That suggests some acknowledgement of a privacy risk.<p>What is creepy here is that a random website which I closed ten minutes ago gets to know when I unplug my ethernet cable. I am happy to tell a website when I close it - that is its purview. But when I close a website I want it to be closed. Now I know it lingers.</p>
]]></description><pubDate>Wed, 24 Mar 2021 05:29:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=26563923</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26563923</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26563923</guid></item><item><title><![CDATA[New comment by millstone in "The End of AMP?"]]></title><description><![CDATA[
<p>Google could defuse some of the AMP pushback by offering users a setting to disable it. Why haven't they done so?</p>
]]></description><pubDate>Tue, 23 Mar 2021 22:47:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=26561330</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26561330</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26561330</guid></item><item><title><![CDATA[New comment by millstone in "Substack's UI and 1Password temporarily cost me $2k"]]></title><description><![CDATA[
<p>Why do browsers need to guess at autofill? Isn't it to everyone's advantage for forms to just tag their fields explicitly?</p>
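<p>For what it's worth, HTML does define such tagging: the autocomplete attribute takes standardized field tokens. A minimal sketch:

```html
<!-- The autocomplete attribute tells browsers and password managers
     exactly what each field holds, instead of making them guess. -->
<form>
  <input type="text"     name="name"  autocomplete="name">
  <input type="email"    name="email" autocomplete="email">
  <input type="password" name="pw"    autocomplete="new-password">
  <input type="text"     name="cc"    autocomplete="cc-number">
</form>
```

The problem in practice is adoption: nothing forces sites to annotate their forms, so autofill heuristics remain necessary.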
]]></description><pubDate>Tue, 23 Mar 2021 01:24:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=26549838</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26549838</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26549838</guid></item><item><title><![CDATA[New comment by millstone in "The Generic Dilemma (2009)"]]></title><description><![CDATA[
<p>> These should be compilable to a single asm routine, as long as there is an homogeneous way to copy them<p>That's a massive asterisk. In practice, copying values to the heap is complicated. Integer values may be memcpyed. Pointer values may require read and/or write barriers, or perhaps reference counting. Compound types may require some combination. Maybe you have copy-constructors and you need arbitrary callouts! These implementation details can't be hand-waved away.<p>> The difference between this case and the polymorphic case is that here the underlying structure (eg "compare" or "insert") isn't done in a homogeneous way (single routine) for every type: the polymorphism is ad-hoc and not parametric. The vtable solution is a form of boxing and that impacts runtime<p>Kindly, I think you have some confused terminology. Parametric vs ad-hoc polymorphism is a difference in the type system, and occurs long before the optimizer kicks in. Parametric polymorphism is a single function implementation parametrized by types. Ad-hoc polymorphism is multiple implementations, and the compiler chooses one based on types (e.g. C++ overloading or template specialization).<p>If the optimizer chooses to emit a specialization, that does NOT make it ad-hoc polymorphism. That's just a detail of the optimizer. And it is a distinct optimization from inlining, for good reasons, like not wanting to be tied to the inlining cost model.<p>> for efficient compilation of languages with generics you need to have an efficient compilation of parametrically polymorphic functions (by type erasure)<p>C++ and Rust both have efficient generics without using type erasure. Boxed generics are not necessary at all, as long as you accept static linking.</p>
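<p>To make the parametric-vs-boxed distinction concrete, a small Rust sketch (the names `largest`, `Describe`, and `describe_all` are invented for illustration): `largest` is one parametrically polymorphic implementation that the compiler monomorphizes into a separate routine per concrete type, while `describe_all` keeps a single routine and dispatches through a vtable at runtime.

```rust
// Parametric polymorphism: one source-level implementation; the compiler
// emits a specialized copy per concrete type it is used with.
fn largest<T: PartialOrd + Copy>(items: &[T]) -> T {
    let mut best = items[0];
    for &x in &items[1..] {
        if x > best {
            best = x;
        }
    }
    best
}

// The "boxed" alternative: one routine serves every type, paying for
// indirection (and here, heap allocation) on each use.
trait Describe {
    fn describe(&self) -> String;
}

impl Describe for i32 {
    fn describe(&self) -> String { format!("int {}", self) }
}

impl Describe for f64 {
    fn describe(&self) -> String { format!("float {}", self) }
}

fn describe_all(items: &[Box<dyn Describe>]) -> Vec<String> {
    items.iter().map(|x| x.describe()).collect()
}

fn main() {
    // Monomorphized once for i32, and separately for f64.
    assert_eq!(largest(&[3, 9, 2]), 9);
    assert_eq!(largest(&[1.5, 0.5]), 1.5);
    // Vtable dispatch: heterogeneous values behind one interface.
    let boxed: Vec<Box<dyn Describe>> = vec![Box::new(1_i32), Box::new(2.0_f64)];
    assert_eq!(describe_all(&boxed), vec!["int 1", "float 2"]);
}
```

Both functions typecheck the same way; only the compilation strategy differs, which is the point above about specialization being an optimizer detail rather than a change in the kind of polymorphism.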
]]></description><pubDate>Mon, 22 Mar 2021 04:18:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=26536812</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26536812</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26536812</guid></item><item><title><![CDATA[New comment by millstone in "The Generic Dilemma (2009)"]]></title><description><![CDATA[
<p>Swift has tried to address this by making specialization an optimization. When it compiles a generic function, it emits a single boxed implementation, like Java. But the optimizer may also choose to specialize it for concrete types, as C++ or Rust do.<p>It sounds nice, but there's a huge performance difference between boxed and specialized. When you make a function generic, the compiler boxes values, conservatively emits retain/release calls, and so on.<p>The difference is so large that it justifies giving the programmer levers to control specialization. The Swift team has added "@_specialize" as a hint. I think they should go further.<p>What I don't know is how specialized generics can participate in library evolution. If a <i>dynamic</i> library provides a sort routine, and you specialize it, what happens when the library changes? This is where a JIT or the .NET model (compile at install time) can pay off. Apple has laid some groundwork for this (bitcode, Rosetta 2), so they can certainly get there if they choose to.<p>Anyway, I think the real dilemma is whether generics are meant to be a performance optimization (C++, Rust) or just for typechecking (Java, C#). The cost of saying yes to performance is library evolution.</p>
]]></description><pubDate>Mon, 22 Mar 2021 02:28:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=26536227</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26536227</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26536227</guid></item><item><title><![CDATA[New comment by millstone in "Safari is now probably the influential wild card browser for user privacy"]]></title><description><![CDATA[
<p>I don't understand why Apple is holding anyone back. Just build an app that doesn't work on iOS. There are lots of apps exclusive to iOS and the Mac - why not exclusive to Android?<p>If the app really is a game changer, if it's that compelling, Apple will absolutely change its tune.</p>
]]></description><pubDate>Sat, 20 Mar 2021 06:10:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=26521150</link><dc:creator>millstone</dc:creator><comments>https://news.ycombinator.com/item?id=26521150</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=26521150</guid></item></channel></rss>