<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: Uncorrelated</title><link>https://news.ycombinator.com/user?id=Uncorrelated</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Tue, 14 Apr 2026 17:17:35 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=Uncorrelated" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by Uncorrelated in "Starfling: A one-tap endless orbital slingshot game in a single HTML file"]]></title><description><![CDATA[
<p>I got 54, allegedly better than 99% of players. But I also note that a score of zero is apparently better than 32% of players.<p>Very fun.</p>
]]></description><pubDate>Sat, 11 Apr 2026 05:51:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=47727826</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=47727826</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47727826</guid></item><item><title><![CDATA[New comment by Uncorrelated in "Introduction to Nintendo DS Programming"]]></title><description><![CDATA[
<p>Seems like you’re describing something like Pico-8:
<a href="https://www.lexaloffle.com/pico-8.php" rel="nofollow">https://www.lexaloffle.com/pico-8.php</a><p>There are also open-source versions of the concept, like TIC-80.<p>The only thing missing is upgradability.</p>
]]></description><pubDate>Thu, 09 Apr 2026 18:14:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=47707335</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=47707335</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47707335</guid></item><item><title><![CDATA[New comment by Uncorrelated in "Learn Claude Code by doing, not reading"]]></title><description><![CDATA[
<p>I responded with a mix of mostly B and C answers and got “advanced.” Yet, as pointed out by another commenter, selecting all D answers (which would make you an expert!) gets you called a beginner.<p>I can only assume the quiz itself was vibe-coded and not tested. What an incredible time we live in.</p>
]]></description><pubDate>Mon, 30 Mar 2026 23:19:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=47580904</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=47580904</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47580904</guid></item><item><title><![CDATA[Murten Panorama Digital Twin Scanning Project]]></title><description><![CDATA[
<p>Article URL: <a href="https://paulbourke.net/panorama/MurtenStory/">https://paulbourke.net/panorama/MurtenStory/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=47508028">https://news.ycombinator.com/item?id=47508028</a></p>
<p>Points: 1</p>
<p># Comments: 0</p>
]]></description><pubDate>Tue, 24 Mar 2026 19:46:46 +0000</pubDate><link>https://paulbourke.net/panorama/MurtenStory/</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=47508028</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47508028</guid></item><item><title><![CDATA[New comment by Uncorrelated in "The Met releases high-def 3D scans of 140 famous art objects"]]></title><description><![CDATA[
<p>For anyone wondering, you can access this by tapping the button showing a 3D cube at the bottom left of the 3D viewer. The button may be cut off if you're viewing in a web view in another app, like I was.<p>The AR viewer runs at a much higher frame rate and you can get closer to the model. However, the lighting is significantly worse, which ruins the appeal. The in-browser viewer is choppy and I can feel my phone getting a little warm, but it looks a lot more like viewing the real artifacts.</p>
]]></description><pubDate>Thu, 12 Mar 2026 21:29:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47357393</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=47357393</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47357393</guid></item><item><title><![CDATA[New comment by Uncorrelated in "How far back in time can you understand English?"]]></title><description><![CDATA[
<p>You're thinking of The History of England podcast, not The History of English. The History of English Podcast does cover English history, often going deeper than is strictly necessary for tracing the evolution of English, but its primary focus is language. It's also very cozy, something you could listen to while sipping tea by a warm fire, and its consistency, clarity, and depth have made it my favorite podcast.</p>
]]></description><pubDate>Sun, 22 Feb 2026 04:47:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=47108274</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=47108274</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47108274</guid></item><item><title><![CDATA[New comment by Uncorrelated in "Ask HN: What are you working on? (January 2026)"]]></title><description><![CDATA[
<p>Thank you! I just released the new update on a slow rollout, and you can get it ahead of schedule by updating manually. I recommend trying out RAW photography with the Extended Dynamic Range setting set to on :)</p>
]]></description><pubDate>Mon, 12 Jan 2026 01:15:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=46582528</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=46582528</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46582528</guid></item><item><title><![CDATA[New comment by Uncorrelated in "Ask HN: What are you working on? (January 2026)"]]></title><description><![CDATA[
<p>At the time I created it there weren't any that did the end-to-end process with ProRAW in a way that I liked. And I got really tired of manually editing every photo I took, so I built the app for me.<p>Plus, having full control over the way photos look, I've customized the output to match my taste; I don't think there are any other camera apps that produce photos quite like mine.</p>
]]></description><pubDate>Sun, 11 Jan 2026 21:18:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=46580181</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=46580181</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46580181</guid></item><item><title><![CDATA[New comment by Uncorrelated in "Ask HN: What are you working on? (January 2026)"]]></title><description><![CDATA[
<p>I've been working on an iOS camera app to take natural-looking photos with reduced post-processing. The goal is to take photos that look like what you see.<p>I just updated the RAW pipeline and I'm really happy with how the resulting photos look, plus there's this cool "RAW+ProRAW" capture mode I introduced recently.<p><a href="https://apps.apple.com/us/app/unpro-camera/id6535677796">https://apps.apple.com/us/app/unpro-camera/id6535677796</a><p>I initially released it early last year and have been using it as my main camera app since, but I haven't mentioned it in one of these threads before. Unfortunately this post has come just a bit early for my most recent update to be approved; there are some nice improvements coming.</p>
]]></description><pubDate>Sun, 11 Jan 2026 17:32:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=46577669</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=46577669</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46577669</guid></item><item><title><![CDATA[New comment by Uncorrelated in "What an unprocessed photo looks like"]]></title><description><![CDATA[
<p>I found the article you wrote on processing Librem 5 photos:<p><a href="https://puri.sm/posts/librem-5-photo-processing-tutorial/" rel="nofollow">https://puri.sm/posts/librem-5-photo-processing-tutorial/</a><p>Which is a pleasant read, and I like the pictures. Has the Librem 5's automatic JPEG output improved since you wrote the post about photography in Croatia (<a href="https://dosowisko.net/l5/photos/" rel="nofollow">https://dosowisko.net/l5/photos/</a>)?</p>
]]></description><pubDate>Mon, 29 Dec 2025 07:24:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=46418239</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=46418239</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46418239</guid></item><item><title><![CDATA[New comment by Uncorrelated in "The iPhone 15 Pro’s Depth Maps"]]></title><description><![CDATA[
<p>I'm sorry for neglecting to respond until now. The app was called Portrait Effects Studio and later Portrait Effects Playground; I took it down because it didn't meet my quality standards. I don't have any public videos anymore, but it supported background replacement and filters like duotone, outline, difference-of-Gaussians, etc., all applied based on depth or the portrait effects matte. I can send you a TestFlight link if you're curious.<p>I looked at your apps, and it turns out I'm already familiar with some, like 65x24. I had to laugh -- internally, anyway -- at the unfortunate one-star review you received on Matte Viewer from a user that didn't appear to understand the purpose of the app.<p>One that really surprised me was Trichromy, because I independently came up with and prototyped the same concept! And, even more surprisingly, there's at least one other such app on the App Store. And I thought I was <i>so</i> creative coming up with the idea. I tried Trichromy; it's quite elegant, and fast.<p>Actually, I feel we have a similar spirit in terms of our approach to creative photography, though your development skills apparently surpass mine. I'm impressed by the polish on your websites, too. Cheers.</p>
]]></description><pubDate>Wed, 11 Jun 2025 04:41:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=44244257</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=44244257</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44244257</guid></item><item><title><![CDATA[New comment by Uncorrelated in "The iPhone 15 Pro’s Depth Maps"]]></title><description><![CDATA[
<p>Other commenters here are correct that the LIDAR is too low-resolution to be used as the primary source for the depth maps. In fact, iPhones use four-ish methods that I know of to capture depth data, depending on the model and camera used. Traditionally these depth maps were only captured for Portrait photos, but apparently recent iPhones capture them for standard photos as well.<p>1. The original method uses two cameras on the back, taking a picture from both simultaneously and using parallax to construct a depth map, similar to human vision. This was introduced on the iPhone 7 Plus, the first iPhone with two rear cameras (a 1x main camera and a 2x telephoto camera). Since the depth map depends on comparing the two images, it will naturally be limited to the field of view of the narrower lens.<p>2. A second method was later used on the iPhone XR, which has only a single rear camera, using focus pixels on the sensor to roughly gauge depth. The raw result is low-res and imprecise, so it's refined using machine learning. See: <a href="https://www.lux.camera/iphone-xr-a-deep-dive-into-depth/" rel="nofollow">https://www.lux.camera/iphone-xr-a-deep-dive-into-depth/</a><p>3. An extension of this method was used on an iPhone SE that didn't even have focus pixels, producing depth maps purely based on machine learning. As you would expect, such depth maps have the least correlation to reality, and the system could be fooled by taking a picture of a picture. See: <a href="https://www.lux.camera/iphone-se-the-one-eyed-king/" rel="nofollow">https://www.lux.camera/iphone-se-the-one-eyed-king/</a><p>4. The fourth method is used for selfies on iPhones with Face ID; it uses the TrueDepth camera's 3D scanning to produce a depth map. You can see this with the selfie in the article; it has a noticeably fuzzier and low-res look.<p>You can also see some other auxiliary images in the article, which use white to indicate the human subject, glasses, hair, and skin. Apple calls these portrait effects mattes; they are produced using machine learning.<p>I made an app that used the depth maps and portrait effects mattes from Portraits for some creative filters. It was pretty fun, but it's no longer available. There are a lot of novel artistic possibilities for depth maps.</p>
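<p>The parallax method in (1) can be sketched numerically: depth falls off inversely with the pixel disparity between the two views, Z = f·B/d. The focal length, baseline, and disparities below are made-up illustration values, not real iPhone camera parameters.

```python
# Toy stereo-parallax depth estimate: Z = f * B / d.
# focal_px (focal length in pixels) and baseline_m (camera separation
# in meters) are hypothetical values chosen only for illustration.

def depth_from_disparity(disparity_px, focal_px=2800.0, baseline_m=0.012):
    """Depth in meters from the pixel disparity between two cameras."""
    if disparity_px <= 0:
        return float("inf")  # no measurable parallax -> effectively at infinity
    return focal_px * baseline_m / disparity_px

for d in (120.0, 30.0, 5.0):
    print(f"disparity {d:5.1f}px -> depth {depth_from_disparity(d):.2f} m")
```

Nearby objects shift a lot between the two views (large disparity, small depth), while distant ones barely shift at all, which is also why the raw estimate gets noisy at long range.</p>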
]]></description><pubDate>Wed, 04 Jun 2025 20:13:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=44184980</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=44184980</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44184980</guid></item><item><title><![CDATA[New comment by Uncorrelated in "What Is HDR, Anyway?"]]></title><description><![CDATA[
<p>It's a JPEG + gain map format where the gain map is stored in the metadata. Same thing, as far as I can tell, that Halide is now using. It's what the industry is moving towards; it means that images display well on both SDR and HDR displays. I don't know what JPEG-XT is, aside from what I just skimmed on the Wikipedia page.<p>Not having before-and-after comparisons is mostly down to my being concerned about whether that would pass App Review; the guidelines indicate that the App Store images are supposed to be screenshots of the app, and I'm already pushing that rule with the example images for filters. I'm not sure a hubristic "here's how much better my photos are than Apple's" image would go over well. Maybe in my next update? I should at least have some comparisons on my website, but I've been bad at keeping that updated.<p>There's no Live Photo support, though I've been thinking about it. The reason is that my current iPhone 14 Pro Max does not support Live Photos while shooting in 48-megapixel mode; the capture process takes too long. I'd have to come up with a compromise such as only having video up to the moment of capture. That doesn't prevent me from implementing it for other iPhones/cameras/resolutions, but I don't like having features unevenly available.</p>
]]></description><pubDate>Wed, 14 May 2025 19:18:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=43988220</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=43988220</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43988220</guid></item><item><title><![CDATA[New comment by Uncorrelated in "What is HDR, anyway?"]]></title><description><![CDATA[
<p>Yes. For example, Lightroom and Camera Raw support HDR editing and export from RAW images, and Adobe published a good rundown on the feature when they introduced it.<p><a href="https://blog.adobe.com/en/publish/2023/10/10/hdr-explained" rel="nofollow">https://blog.adobe.com/en/publish/2023/10/10/hdr-explained</a><p>Greg Benz Photography maintains a list of software here:<p><a href="https://gregbenzphotography.com/hdr-display-photo-software/" rel="nofollow">https://gregbenzphotography.com/hdr-display-photo-software/</a><p>I'm not sure what FOSS options there are; it's difficult to search for given that "HDR" can mean three or four different things in common usage.</p>
]]></description><pubDate>Wed, 14 May 2025 17:38:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=43987214</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=43987214</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43987214</guid></item><item><title><![CDATA[New comment by Uncorrelated in "What is HDR, anyway?"]]></title><description><![CDATA[
<p>I find the default HDR (as in gain map) presentation of iPhone photos to look rather garish, rendering highlights too bright and distracting from the content of the images. The solution I came up with for my own camera app was to roll off and lower the highlights in the gain map, which results in final images that I find way more pleasing. This seems to be somewhat similar to what Halide is introducing with their "Standard" option for HDR.<p>Hopefully HN allows me to share an App Store link... this app works best on Pro iPhones, which support ProRAW, although I do some clever stuff on non-Pro iPhones to get a more natural look.<p><a href="https://apps.apple.com/us/app/unpro-camera/id6535677796">https://apps.apple.com/us/app/unpro-camera/id6535677796</a></p>
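<p>The highlight roll-off described above can be sketched as a saturating curve applied to gain-map values: gains below a knee pass through untouched, and gains above it are smoothly compressed toward a lower ceiling. This is only a minimal sketch of the general idea; the knee and ceiling numbers are illustrative assumptions, not any app's actual tuning.

```python
import math

# Hypothetical highlight roll-off for an HDR gain map, in stops.
# Values up to `knee` are unchanged; values above it asymptotically
# approach `ceiling`, so highlights stay bright but less garish.

def roll_off(gain_stops, knee=1.0, ceiling=2.0):
    """Compress gain above `knee` so it never exceeds `ceiling`."""
    if gain_stops <= knee:
        return gain_stops  # leave shadows and midtones untouched
    span = ceiling - knee
    # tanh gives a smooth curve with slope 1 at the knee (no visible seam)
    return knee + span * math.tanh((gain_stops - knee) / span)

for g in (0.5, 1.5, 3.0):
    print(f"gain {g:.1f} stops -> {roll_off(g):.2f} stops")
```

Because tanh has unit slope at zero, the curve joins the identity line at the knee with no kink, which matters for avoiding banding at the transition.</p>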
]]></description><pubDate>Wed, 14 May 2025 17:27:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=43987108</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=43987108</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43987108</guid></item><item><title><![CDATA[New comment by Uncorrelated in "Little Sisyphus A physics-based platformer for the NES"]]></title><description><![CDATA[
<p>I played some of this after I read the NESFab page posted about a week ago. It's an impressive NES game for any length of time spent on development, let alone a month. Now that I know that it's from the creator of NESFab, the polish makes sense -- obviously the creator is intimately familiar with both the hardware and their own development tools. Compliments must also be paid to the art and appropriately Sisyphean music.<p>I gave up at 35 souls.</p>
]]></description><pubDate>Wed, 19 Feb 2025 08:32:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=43099932</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=43099932</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43099932</guid></item><item><title><![CDATA[New comment by Uncorrelated in "Nikon reveals a lens that captures wide and telephoto images simultaneously"]]></title><description><![CDATA[
<p>iPhones can do this. They support taking photos simultaneously from the two or three cameras on the back; the cameras are hardware-synchronized and automatically match their settings to provide similar outputs. The catch is you need a third-party app to access it, and you'll end up with two or three separate photos per shot which you'll have to manage yourself. You also won't get manual controls over white balance, focus, or ISO, and you can't shoot in RAW or ProRAW.<p>There are probably a good number of camera apps that support this mode; two I know of are ProCam 8 and Camera M.</p>
]]></description><pubDate>Mon, 30 Dec 2024 19:24:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=42552544</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=42552544</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42552544</guid></item><item><title><![CDATA[New comment by Uncorrelated in "T * sin (t)' ≈ Ornamented Christmas Tree (2013)"]]></title><description><![CDATA[
<p>block_dagger was making a pun based on the sense of drill as a training exercise. A similar joke went over the heads of nearly everyone on a recent episode of Taskmaster:<p><a href="https://www.youtube.com/watch?v=6PJkA3o_Im0" rel="nofollow">https://www.youtube.com/watch?v=6PJkA3o_Im0</a></p>
]]></description><pubDate>Wed, 25 Dec 2024 06:35:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=42507291</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=42507291</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42507291</guid></item><item><title><![CDATA[Modern Displays Up Close]]></title><description><![CDATA[
<p>Article URL: <a href="https://www.uncorrelatedcontents.com/blog/modern-displays-up-close">https://www.uncorrelatedcontents.com/blog/modern-displays-up-close</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=42081787">https://news.ycombinator.com/item?id=42081787</a></p>
<p>Points: 2</p>
<p># Comments: 0</p>
]]></description><pubDate>Thu, 07 Nov 2024 22:25:22 +0000</pubDate><link>https://www.uncorrelatedcontents.com/blog/modern-displays-up-close</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=42081787</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42081787</guid></item><item><title><![CDATA[New comment by Uncorrelated in "How JPEG XL compares to other image codecs"]]></title><description><![CDATA[
<p>Articles about the merits of JPEG XL come up with some regularity on Hacker News, as if to ask, "why aren't we all using this yet?"<p>This one has a section on animation and cinemagraphs, saying that video formats like AV1 and HEVC are better suited, which makes sense. Here's my somewhat off-topic question: is there a video format that requires support for looping, like GIFs? GIF is a pretty shoddy format for video compared to a modern video codec, but if a GIF loops, you can expect it to loop seamlessly in any decent viewer.<p>With videos it seems you have to hope that the video player has an option to loop, and oftentimes there's a brief delay at the end of the video before playback resumes at the beginning. It would be nice if there were a video format that included seamless looping as part of the spec -- but as far as I can tell, there isn't one. Why not? Is it just assumed that anyone who wants looping video will configure their player to do it?</p>
]]></description><pubDate>Sun, 27 Oct 2024 02:22:46 +0000</pubDate><link>https://news.ycombinator.com/item?id=41959375</link><dc:creator>Uncorrelated</dc:creator><comments>https://news.ycombinator.com/item?id=41959375</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41959375</guid></item></channel></rss>