<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: mattdesl</title><link>https://news.ycombinator.com/user?id=mattdesl</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Fri, 10 Apr 2026 11:12:01 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=mattdesl" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by mattdesl in "From RGB to L*a*b* color space (2024)"]]></title><description><![CDATA[
<p>For those just learning about perceptual colour spaces, I’d recommend exploring OKLab, which is simpler to implement and overcomes some of the problems of CIELab.<p><a href="https://bottosson.github.io/posts/oklab/" rel="nofollow">https://bottosson.github.io/posts/oklab/</a></p>
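For a sense of how simple: the full sRGB to OKLab transform is a gamma decode, two small matrix multiplies, and a cube root in between (constants from Ottosson’s post linked above):

```javascript
// sRGB (components in 0..1) -> OKLab, using the matrices published
// in Ottosson's OKLab blog post.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function srgbToOklab(r, g, b) {
  const lr = srgbToLinear(r), lg = srgbToLinear(g), lb = srgbToLinear(b);
  // linear sRGB -> approximate cone (LMS) responses
  const l = 0.4122214708 * lr + 0.5363325363 * lg + 0.0514459929 * lb;
  const m = 0.2119034982 * lr + 0.6806995451 * lg + 0.1073969566 * lb;
  const s = 0.0883024619 * lr + 0.2817188376 * lg + 0.6299787005 * lb;
  // the nonlinearity is a plain cube root (no piecewise f(t) as in CIELab)
  const l_ = Math.cbrt(l), m_ = Math.cbrt(m), s_ = Math.cbrt(s);
  // LMS' -> [L, a, b]
  return [
    0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_,
    1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_,
    0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_,
  ];
}
```

White lands at roughly L = 1 with a = b = 0, and CIELab’s white-point handling disappears entirely.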
]]></description><pubDate>Sun, 08 Mar 2026 09:09:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=47295771</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=47295771</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47295771</guid></item><item><title><![CDATA[New comment by mattdesl in "Shades of Halftone"]]></title><description><![CDATA[
<p>Still going through the article but loving all the detail and interactive components! Nice writeup.<p>PS: worth mentioning the RGB to CMYK function credited to me is not my original work; I believe I got it off Stack Overflow or similar many years ago. A more robust way of doing this transformation would be with a color management system and an ICC profile; as it happens, I’ve done a bit of work on that! [1] Used it here [2].<p>Transforming with an ICC profile will give you a result that might be closer to how a screen printer would turn your digital image into a four colour print, but more advanced screen printing workflows these days tend to use “RIP” software that handles many layers (eg: 12 colors instead of 4) and stochastic screening [3], which produces quite different results from what most halftone shaders are doing.<p>[1] <a href="https://github.com/mattdesl/lcms-wasm" rel="nofollow">https://github.com/mattdesl/lcms-wasm</a><p>[2] <a href="https://sierra.mattdesl.com/" rel="nofollow">https://sierra.mattdesl.com/</a><p>[3] <a href="https://en.wikipedia.org/wiki/Stochastic_screening" rel="nofollow">https://en.wikipedia.org/wiki/Stochastic_screening</a></p>
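For context, the naive conversion in question is usually some variant of the following (a sketch of the common Stack Overflow formula, not necessarily the exact function used in the article):

```javascript
// The common "naive" RGB -> CMYK formula that circulates online:
// K absorbs the darkest component, C/M/Y carry the remainder.
// An ICC-based conversion via a CMS will generally match real
// print output much better than this.
function rgbToCmykNaive(r, g, b) { // r, g, b in [0, 1]
  const k = 1 - Math.max(r, g, b);
  if (k === 1) return [0, 0, 0, 1]; // pure black: avoid divide-by-zero
  return [
    (1 - r - k) / (1 - k),
    (1 - g - k) / (1 - k),
    (1 - b - k) / (1 - k),
    k,
  ];
}
```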
]]></description><pubDate>Sat, 14 Feb 2026 17:36:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=47016442</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=47016442</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47016442</guid></item><item><title><![CDATA[New comment by mattdesl in "Iterative image reconstruction using random cubic bézier strokes"]]></title><description><![CDATA[
<p>Nice work, the outputs look ethereal and quite beautiful. For some related work using genetic algorithms and evolution strategies, see [1].<p>Sketch synthesis is an area I'm pretty interested in lately; I'm currently exploring similar things with CLIP to guide fitness, a natural evolution strategy to optimize the rendered results, and an implicit neural representation to represent pen plotter paths (rather than a series of explicit curves/strokes) [2].<p>[1] <a href="https://es-clip.github.io/" rel="nofollow">https://es-clip.github.io/</a><p>[2] <a href="https://x.com/mattdesl/status/2011434166022476109" rel="nofollow">https://x.com/mattdesl/status/2011434166022476109</a></p>
]]></description><pubDate>Mon, 19 Jan 2026 18:06:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=46682391</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=46682391</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46682391</guid></item><item><title><![CDATA[New comment by mattdesl in "Bruno Simon – 3D Portfolio"]]></title><description><![CDATA[
<p>Amazing as always, Bruno is a wizard with ThreeJS.<p>There’s a surprising amount of stutter and lag on iOS, evident after the loading bar completes, when the app freezes for ~30 seconds. There’s also quite a bit of stuttering during gameplay. My guess is GPU texture uploads or shader compilation. Otherwise it was buttery smooth.</p>
]]></description><pubDate>Wed, 10 Dec 2025 09:23:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=46215750</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=46215750</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46215750</guid></item><item><title><![CDATA[New comment by mattdesl in "Everything that's wrong with Google Search in one image"]]></title><description><![CDATA[
<p>A similar thing happens when you search “Canada eTA” — a $7 (required) entry visa the government typically issues instantly. But on Google, several sponsored sites appear above the official government site; they charge $100+ for the same service, deliver it more slowly, and do god knows what with your passport details and personal data.<p>There are tons of other examples like this. It’s very easy to get tricked by Google ads if you aren’t expecting a scam.</p>
]]></description><pubDate>Thu, 25 Sep 2025 06:39:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=45369885</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=45369885</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45369885</guid></item><item><title><![CDATA[New comment by mattdesl in "What are OKLCH colors?"]]></title><description><![CDATA[
<p>Nice post. OKLCH is quite handy but for writing colors in CSS I hope eventually we’ll get some form of OKHSL/OKHSV[1] so users don’t need to worry about gamut boundaries.<p>[1] <a href="https://bottosson.github.io/posts/colorpicker/" rel="nofollow">https://bottosson.github.io/posts/colorpicker/</a></p>
]]></description><pubDate>Mon, 25 Aug 2025 07:09:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=45011139</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=45011139</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45011139</guid></item><item><title><![CDATA[New comment by mattdesl in "So what's the difference between plotted and printed artwork?"]]></title><description><![CDATA[
<p>A lot of it comes down to which pens you happen to have. I’ve had some success with Sakura Gelly Rolls for white, and more recently I’ve been enjoying Sharpie Creative acrylic markers, which have a moderately opaque white ink. I’ve also had some really frustrating experiences with other pens and instruments!</p>
]]></description><pubDate>Wed, 13 Aug 2025 15:50:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=44890052</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44890052</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44890052</guid></item><item><title><![CDATA[New comment by mattdesl in "HyAB k-means for color quantization"]]></title><description><![CDATA[
<p>Yes, it's a matmul; many color models just boil down to simple math. For example, look at Li and Luo's 2024 "simple color appearance model"[1], which is <i>very</i> similar to OKLab (just matmul!), and created for many of the same reasons (just an approximation!). Like OKLab, it also improves upon CAM16-UCS hue linearity issues in blue. Ironically, Luo was one of the authors who proposed CAM16-UCS in 2017. And although it certainly improves upon CAM16-UCS for many applications, I'm not yet convinced it is superior to OKLab (you can see my implementation here: [2]).<p>And I think you might be misremembering Ottosson's original blog post; he demonstrates a gradient between white and blue, not blue and yellow.<p>[1] <a href="https://opg.optica.org/oe/fulltext.cfm?uri=oe-32-3-3100" rel="nofollow">https://opg.optica.org/oe/fulltext.cfm?uri=oe-32-3-3100</a><p>[2] <a href="https://github.com/texel-org/color/blob/main/test/spaces/simple-ucs.js">https://github.com/texel-org/color/blob/main/test/spaces/sim...</a></p>
]]></description><pubDate>Thu, 10 Jul 2025 16:31:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=44522794</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44522794</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44522794</guid></item><item><title><![CDATA[New comment by mattdesl in "HyAB k-means for color quantization"]]></title><description><![CDATA[
<p>Obviously; but this doesn’t suggest that OKLab is not a perceptually uniform color space.<p>There is no “one true” UCS model - all of these are just approximations of various perception and color matching studies, and at some point CAM16-UCS will probably be made obsolete as well.</p>
]]></description><pubDate>Thu, 10 Jul 2025 13:46:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=44521032</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44521032</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44521032</guid></item><item><title><![CDATA[New comment by mattdesl in "HyAB k-means for color quantization"]]></title><description><![CDATA[
<p>> Oklab is not perceptually uniform<p>By what metric? If the target is parity with CAM16-UCS, OKLab comes closer than many color spaces also designed to be perceptually uniform.</p>
]]></description><pubDate>Thu, 10 Jul 2025 12:11:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=44520092</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44520092</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44520092</guid></item><item><title><![CDATA[New comment by mattdesl in "HyAB k-means for color quantization"]]></title><description><![CDATA[
<p>It does a pretty good job of emulating CAM16 with a fraction of the parameters and computational cost; it’s no wonder it was adopted by CSS.<p>I don’t know what you mean by “not being linked to any perceptual color space”; it is derived from CAM16 &amp; CIEDE2000, pretty similar in ethos to other spaces like ITP and the more recently published sUCS.<p>There’s also tons of discussion on the w3c GitHub about OKLab, and it’s evolved in many ways since the original blog post, such as improved matrices, a new lightness estimate, OKHSV/OKHSL, and very useful cusp &amp; gamut approximations.<p>I have a hard time seeing how it’s a nightmare in practice!</p>
]]></description><pubDate>Wed, 09 Jul 2025 23:07:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=44515623</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44515623</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44515623</guid></item><item><title><![CDATA[New comment by mattdesl in "HyAB k-means for color quantization"]]></title><description><![CDATA[
<p>I’ve done some color quantization tests with HyAB and OKLab on this same image. A couple of notes:<p>- What works well for this image might not work well for other images! I learned that the hard way after lots of testing on this image, only to find things that did not generalize well.<p>- Parametrizing the AB-plane weight is pretty useful for color quantization; I’ve found some images are best with more weight given to colour, and others need more weight given to tone. OKLab’s creator suggests a factor of 2 in deltaEOK [1], but again, this is something that should be adjustable IMHO.<p>- There’s another interesting and efficient pair of color spaces, the (poorly named) sUCS and sCAM [2], that boasts impressive results in their paper for tasks like this, although in my brief tests [3] I’ve found it not much better than OKLab for my needs (and note, both color spaces are derived using CIEDE2000).<p>[1] <a href="https://github.com/color-js/color.js/blob/9d812464aa318a9b474c52117f2abd5d055ee4b3/src/deltaE/deltaEOK2.js#L23">https://github.com/color-js/color.js/blob/9d812464aa318a9b47...</a><p>[2] <a href="https://opg.optica.org/oe/fulltext.cfm?uri=oe-32-3-3100&id=545619" rel="nofollow">https://opg.optica.org/oe/fulltext.cfm?uri=oe-32-3-3100&id=5...</a><p>[3] <a href="https://x.com/mattdesl/status/1902699888057446670" rel="nofollow">https://x.com/mattdesl/status/1902699888057446670</a></p>
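To make the weighting concrete, here is a sketch of a HyAB-style distance in OKLab with the chroma-plane contribution exposed as a parameter (hyabOK is a hypothetical helper name; w = 1 recovers plain HyAB):

```javascript
// HyAB distance in OKLab: absolute lightness difference plus Euclidean
// distance in the (a, b) plane. w scales the a/b (chroma) term, so
// w > 1 favours colour fidelity and w < 1 favours tone fidelity.
function hyabOK(c1, c2, w = 1) {
  const [L1, a1, b1] = c1, [L2, a2, b2] = c2;
  return Math.abs(L1 - L2) + w * Math.hypot(a1 - a2, b1 - b2);
}
```

In a k-means quantizer this drops in wherever the squared-Euclidean distance would normally go, with w tuned per image.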
]]></description><pubDate>Wed, 09 Jul 2025 22:48:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=44515496</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44515496</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44515496</guid></item><item><title><![CDATA[New comment by mattdesl in "HyAB k-means for color quantization"]]></title><description><![CDATA[
<p>Surely this would be even faster and potentially better with OKLab? Especially in the context of CIELab-based distance metrics like CIEDE2000, which are a bit heavy.<p>My own gripe with box cutting is that perceptual color spaces tend not to have cube-shaped volumes. But they are very fast algorithms.</p>
]]></description><pubDate>Wed, 09 Jul 2025 22:30:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=44515360</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44515360</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44515360</guid></item><item><title><![CDATA[New comment by mattdesl in "Compiling a Neural Net to C for a 1,744× speedup"]]></title><description><![CDATA[
<p>I think the techniques in “Weight Agnostic Neural Networks” should be applicable here too; I believe it uses a variant of NEAT. This would allow learning the topology and wiring rather than just the gates. But in practice it is probably pretty slow, and may not end up all that different from a pruned and optimized DLGN.<p><a href="https://weightagnostic.github.io/" rel="nofollow">https://weightagnostic.github.io/</a></p>
]]></description><pubDate>Wed, 28 May 2025 18:57:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=44119430</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44119430</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44119430</guid></item><item><title><![CDATA[New comment by mattdesl in "Questioning Representational Optimism in Deep Learning"]]></title><description><![CDATA[
<p>The full title of the paper is “Questioning Representational Optimism in Deep Learning: The Fractured Entangled Representation Hypothesis.”<p><a href="https://arxiv.org/abs/2505.11581" rel="nofollow">https://arxiv.org/abs/2505.11581</a></p>
]]></description><pubDate>Tue, 20 May 2025 08:41:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=44039236</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44039236</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44039236</guid></item><item><title><![CDATA[Questioning Representational Optimism in Deep Learning]]></title><description><![CDATA[
<p>Article URL: <a href="https://github.com/akarshkumar0101/fer">https://github.com/akarshkumar0101/fer</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=44038549">https://news.ycombinator.com/item?id=44038549</a></p>
<p>Points: 46</p>
<p># Comments: 7</p>
]]></description><pubDate>Tue, 20 May 2025 06:54:27 +0000</pubDate><link>https://github.com/akarshkumar0101/fer</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=44038549</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44038549</guid></item><item><title><![CDATA[New comment by mattdesl in "AlphaEvolve: A Gemini-powered coding agent for designing advanced algorithms"]]></title><description><![CDATA[
<p>There are also 'evolution strategy' algorithms that do not use the typical mutation and crossover, but instead use a population of candidates (search samples) to approximate the gradient landscape.</p>
]]></description><pubDate>Wed, 14 May 2025 16:10:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=43986222</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=43986222</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43986222</guid></item><item><title><![CDATA[New comment by mattdesl in "WebMonkeys: parallel GPU programming in JavaScript"]]></title><description><![CDATA[
<p>The author is working on a program synthesizer using interaction nets/calculus, which should be released soon. It sounds quite interesting:<p><a href="https://x.com/VictorTaelin/status/1907976343830106592" rel="nofollow">https://x.com/VictorTaelin/status/1907976343830106592</a></p>
]]></description><pubDate>Wed, 07 May 2025 10:01:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=43913862</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=43913862</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43913862</guid></item><item><title><![CDATA[New comment by mattdesl in "Towards the Cutest Neural Network"]]></title><description><![CDATA[
<p>I wonder how well BitNet (ternary weights) would work for this. It seems like a promising way forward for constrained hardware.<p><a href="https://arxiv.org/abs/2310.11453" rel="nofollow">https://arxiv.org/abs/2310.11453</a><p><a href="https://github.com/cpldcpu/BitNetMCU/blob/main/docs/documentation.md">https://github.com/cpldcpu/BitNetMCU/blob/main/docs/document...</a></p>
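For a rough idea of what ternary weights involve, here is a sketch of the absmean quantizer described in the BitNet b1.58 follow-up work (the papers quantize during training with a straight-through estimator; this post-hoc version is illustrative only, and the names are my own):

```javascript
// Absmean ternary quantization: scale weights by their mean absolute
// value, round, then clip to {-1, 0, +1}. Matmuls against the quantized
// weights then reduce to additions and subtractions.
function ternarize(weights) {
  const gamma =
    weights.reduce((s, w) => s + Math.abs(w), 0) / weights.length;
  const q = weights.map(w =>
    Math.max(-1, Math.min(1, Math.round(w / (gamma + 1e-8))))
  );
  return { q, scale: gamma }; // approximate reconstruction: q[i] * scale
}
```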
]]></description><pubDate>Mon, 05 May 2025 08:29:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=43892965</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=43892965</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43892965</guid></item><item><title><![CDATA[New comment by mattdesl in "Ask HN: What are you working on? (April 2025)"]]></title><description><![CDATA[
<p>Developing an open source library that simulates pigment mixing in the browser, inspired by mixbox[1].<p>[1] <a href="https://scrtwpns.com/mixbox.pdf" rel="nofollow">https://scrtwpns.com/mixbox.pdf</a></p>
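For a sense of the physics underneath (mixbox itself works in a learned latent space over Kubelka-Munk pigments, so this is not its actual algorithm), a classical single-constant Kubelka-Munk mix looks roughly like this; the helper names are my own:

```javascript
// Single-constant Kubelka-Munk: convert reflectance to an absorption/
// scattering ratio K/S, mix K/S linearly by concentration, convert back.
const toKS = R => (1 - R) ** 2 / (2 * R); // requires R in (0, 1]
const toR = ks => 1 + ks - Math.sqrt(ks * ks + 2 * ks);

// Mix two per-wavelength reflectance arrays with weight t in [0, 1].
function kmMix(r1, r2, t) {
  return r1.map((R, i) => toR((1 - t) * toKS(R) + t * toKS(r2[i])));
}
```

Mixing a light and a dark sample this way comes out darker than a linear average of the reflectances, which is part of why naive RGB lerps look wrong next to pigment models.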
]]></description><pubDate>Mon, 28 Apr 2025 11:15:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=43819946</link><dc:creator>mattdesl</dc:creator><comments>https://news.ycombinator.com/item?id=43819946</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43819946</guid></item></channel></rss>