<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: alexbock</title><link>https://news.ycombinator.com/user?id=alexbock</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Tue, 28 Apr 2026 22:12:30 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=alexbock" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by alexbock in "Types of optical systems in a lens designer's toolbox (2020)"]]></title><description><![CDATA[
<p>I'm guessing you're referring to <a href="https://news.ycombinator.com/item?id=39309409">https://news.ycombinator.com/item?id=39309409</a></p>
]]></description><pubDate>Sat, 24 May 2025 00:45:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=44077935</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=44077935</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44077935</guid></item><item><title><![CDATA[New comment by alexbock in "Types of optical systems in a lens designer's toolbox (2020)"]]></title><description><![CDATA[
<p>Thanks. I did not use any frameworks/libraries/dependencies for this project. It's vanilla JavaScript/HTML/CSS from scratch. The general concept of a spreadsheet-like data editor next to a visual view is a standard paradigm in commercial lens design software like Quadoa/OSLO/CODE V.</p>
]]></description><pubDate>Fri, 23 May 2025 22:53:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=44077314</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=44077314</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44077314</guid></item><item><title><![CDATA[New comment by alexbock in "Types of optical systems in a lens designer's toolbox (2020)"]]></title><description><![CDATA[
<p>If you want to play with any of these lens descriptions (or look at code for simulating them), I made a free and open source visual web UI for lens design. The default project when you visit it is a double Gauss lens similar to the one shown in the article.<p><a href="https://alexbock.github.io/open-optical-designer/" rel="nofollow">https://alexbock.github.io/open-optical-designer/</a></p>
]]></description><pubDate>Fri, 23 May 2025 19:36:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=44075879</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=44075879</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44075879</guid></item><item><title><![CDATA[New comment by alexbock in "How hard would it be to display the contents of an image file on the screen?"]]></title><description><![CDATA[
<p>> (This is mathematically worse for a reason I don't remember; it causes any picture taken at a concert with blacklights or blue spotlights to look bad.<p>The iPhone camera sensor is prone to saturating and clipping the blue channel when strong light from a blue LED is in the frame. Once the blue channel clips at the maximum value, a typical HDR gain map won't do anything to restore nuance to it, because gain maps aren't designed to add high-frequency detail to a blob of clipped pixels with identical values in the base image.</p>
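<p>A minimal sketch of why a multiplicative gain map can't recover clipped detail (made-up pixel values, not Apple's actual HDR pipeline):</p>

```python
import numpy as np

# Hypothetical linear scene values for the blue channel, before clipping.
# Two neighboring pixels with genuinely different brightness:
scene = np.array([300.0, 500.0])

# An 8-bit base image clips both to the same maximum value:
base = np.clip(scene, 0, 255)

# A gain map multiplies the base image; identical inputs stay identical,
# so the original 300-vs-500 distinction is gone for good.
gain = 4.0
restored = base * gain

print(base)      # [255. 255.]
print(restored)  # [1020. 1020.] -- brighter, but still a flat blob
```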
]]></description><pubDate>Sun, 19 Jan 2025 23:56:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=42763430</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=42763430</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42763430</guid></item><item><title><![CDATA[New comment by alexbock in "Touchscreens are out, and tactile controls are back"]]></title><description><![CDATA[
<p>Tesla vehicles display error descriptions prominently whenever an error code is presented, and detailed error diagnostics are available for anyone to browse in the service mode menu on the touchscreen. (Service mode is publicly accessible but does require looking up online how to open it.)</p>
]]></description><pubDate>Sun, 03 Nov 2024 19:33:03 +0000</pubDate><link>https://news.ycombinator.com/item?id=42035499</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=42035499</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42035499</guid></item><item><title><![CDATA[New comment by alexbock in "Phone cameras can take in more light than the human eye"]]></title><description><![CDATA[
<p>> All the "affordable" consumer FLIR options suffer from low resolution and/or low response time.<p>When I experimented with connecting two thermal cameras to a VR headset for stereo thermal vision, I used two Seek CompactPRO FastFrame units. They're 320x240@15Hz for $400, which is a lot more usable than the typical 80x60@9Hz consumer thermal camera, and it's easy to integrate the Android model into custom applications. They also have a 320x240@25Hz model for $1000.<p>I'm still impatiently waiting for affordable 640x480 thermal cameras, but in my opinion 320x240 at a moderate frame rate is past the good-enough threshold to be legitimately useful for high-contrast situations like identifying warm-blooded life on the side of a rural road.<p>> I'd love some sort of setup that outputs to >=8" 1080p display attached to my dash.<p>The Tesla Cybertruck has an option to display the view from the front bumper camera on the 18.5" main screen, but front camera display is unfortunately not available in any of Tesla's other models. With the proliferation of large touchscreens and camera arrays, more vehicles may support this from the factory soon.</p>
]]></description><pubDate>Wed, 29 May 2024 18:30:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=40515166</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=40515166</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40515166</guid></item><item><title><![CDATA[New comment by alexbock in "Phone cameras can take in more light than the human eye"]]></title><description><![CDATA[
<p>I've always been fascinated by this topic as well. As a further experiment, you may be interested to know that these IR lights can pass straight through red wine that looks totally dark and opaque to the human eye. I took some photos to demonstrate this with a DSLR with the IR filter removed here [1], but you can test this yourself by using a smartphone to look at the IR light of a TV remote with a glass of red wine in between them.<p>[1] <a href="https://alexbock.github.io/blog/nir-water-red-wine-comparison/index.html" rel="nofollow">https://alexbock.github.io/blog/nir-water-red-wine-compariso...</a></p>
]]></description><pubDate>Wed, 29 May 2024 17:29:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=40514529</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=40514529</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40514529</guid></item><item><title><![CDATA[New comment by alexbock in "Wellington R. Burt "had one of the more unusual wills in American legal history""]]></title><description><![CDATA[
<p>I learned of the interesting "spite clause" described in the will section of this article while reading about the common law rule against perpetuities, which limits how far into the future wills can operate.<p>[The word "bizarre" in the title quote has been replaced with "unusual" as HN appears to automatically delete the former word from titles.]</p>
]]></description><pubDate>Fri, 09 Feb 2024 23:12:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=39321664</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=39321664</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39321664</guid></item><item><title><![CDATA[Wellington R. Burt "had one of the more unusual wills in American legal history"]]></title><description><![CDATA[
<p>Article URL: <a href="https://en.wikipedia.org/wiki/Wellington_R._Burt">https://en.wikipedia.org/wiki/Wellington_R._Burt</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=39321603">https://news.ycombinator.com/item?id=39321603</a></p>
<p>Points: 2</p>
<p># Comments: 1</p>
]]></description><pubDate>Fri, 09 Feb 2024 23:06:01 +0000</pubDate><link>https://en.wikipedia.org/wiki/Wellington_R._Burt</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=39321603</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39321603</guid></item><item><title><![CDATA[New comment by alexbock in "Salim Kara stole $2M in coins with a magnet and a car antenna (2022)"]]></title><description><![CDATA[
<p>It sounds like you're thinking of separating coins using eddy current braking [1]. This works even for non-magnetic coins because the effect is a function of the metal's electrical conductivity.<p>If you have a silver coin or a small piece of copper pipe and a large, strong neodymium magnet, then you can easily observe this effect at home by putting the metal sample on a table and quickly waving the magnet past it as close as you can without touching it. The metal will slide across the table following the magnet, despite the metal itself not being magnetic, because the moving magnet induces eddy currents which temporarily create a magnetic field like an electromagnet. Other metals besides silver and copper exhibit weaker responses due to higher electrical resistivity.<p>[1] <a href="https://en.wikipedia.org/wiki/Eddy_current_brake" rel="nofollow">https://en.wikipedia.org/wiki/Eddy_current_brake</a></p>
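<p>As a rough illustration, the strength of the induced eddy-current response scales with electrical conductivity (all else being equal), which is why silver and copper respond most visibly. A sketch using approximate textbook resistivity values:</p>

```python
# Approximate room-temperature resistivities in ohm-metres (textbook values).
resistivity = {
    "silver":   1.59e-8,
    "copper":   1.68e-8,
    "aluminum": 2.65e-8,
    "brass":    6.0e-8,
    "lead":     2.2e-7,
}

# Induced eddy currents (and hence the drag force) scale roughly with
# conductivity = 1 / resistivity, so rank the metals by expected response.
conductivity = {metal: 1.0 / r for metal, r in resistivity.items()}
ranked = sorted(conductivity, key=conductivity.get, reverse=True)
print(ranked)  # silver first (strongest response), lead last
```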
]]></description><pubDate>Tue, 02 Jan 2024 17:53:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=38844593</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=38844593</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38844593</guid></item><item><title><![CDATA[New comment by alexbock in "Draggable objects"]]></title><description><![CDATA[
<p>I've always been surprised that Apple added this functionality to the iOS home screen without having a solution to the reorder vs. nesting UI problem. Trying to move an app into a folder often results in the folder flying out of the way to let the app take its place, when you're really trying to drop the app onto the folder to insert it.</p>
]]></description><pubDate>Fri, 29 Sep 2023 22:54:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=37711048</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=37711048</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37711048</guid></item><item><title><![CDATA[New comment by alexbock in "Apple Vision Pro: Apple’s first spatial computer"]]></title><description><![CDATA[
<p>The single most important and most revealing aspect of this announcement is Apple's framing of the promotional images showing the face display on the front of the headset.<p>I think the large R&D investments tech companies are making in VR/AR headsets are ultimately centered on the idea that years in the future, an affordable, comfortable, and socially acceptable headset for all-day wear in daily life could hypothetically replace smartphones, tablets, laptops, and screens.<p>If a future daily-wear headset as a phone/computer replacement reached a critical mass of adoption, then control of the platform would provide the owner with a huge profit source in selling virtual goods that can be "worn" or "placed" in the real world and are seen equally by any other person wearing a headset. I won't speculate on exactly what would be popular, but it is conceivable that this could include things like: buying virtual wall posters of licensed characters, virtual landscaping objects placed outside a house, filters that virtually "repaint" the exterior or interior of a house, or personal adornments like virtual clothing or appearance modifiers. That is to say, a digital layer of adjustment on top of the real world, where everyone wearing a headset is automatically shown the virtual objects or adjustments that anyone else has made (within the scope of the latter's own appearance and owned spaces; random members of the public could not place publicly visible digital objects in the middle of a NYC street). Note that this virtual economy is a profit motive for a company to build AR, but not the selling point for a headset adopter.<p>Apple is not trying to sell this headset to consumers for $3500.
They're showing off future hardware that they believe represents the bare minimum for what average people might be willing to wear regularly, with the expectation that they will be able to produce essentially this same unit and sell it for perhaps a third of the current price in several years. The way it's presented is also an early form of reputation management for the product space, trying to influence public perception of how someone wearing a headset is viewed by those around them.<p>Standard see-through AR headset designs face fundamental implementation limits with the display technology that generally force accepting one of two unacceptable limitations: a display that projects an image over the real world but cannot render black or otherwise draw anything darker than the scene behind it, or a liquid crystal light modulator with a polarizer that permanently makes the glass tinted dark like sunglasses, even indoors. Apple is instead making a VR headset that is completely enclosed, displaying the world through pass-through cameras and drawing the wearer's eyes on a front display for everyone else to see.<p>The front lenticular OLED shows how Apple is approaching the social aspect of trying to market the acceptability of wearing a headset in the company of other people. In the long term, establishing a virtual economy for digital world overlays is fundamentally dependent on the social acceptance of wearing an AR device regularly. This announcement seems to be trying to thread that needle in advance of when this technology could eventually be priced and sold as a consumer product. That is, it establishes the image of an Apple headset free of the kind of negative associations Google Glass quickly garnered.<p>I have no idea whether AR will succeed in replacing smartphones years from now or fade into obscurity, but what I find interesting about this announcement is that it makes a timeline where AR does take off at least appear conceivable. They've only taken an early first step toward making it happen, but they haven't made a fatal mistake yet.</p>
]]></description><pubDate>Tue, 06 Jun 2023 02:37:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=36207380</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=36207380</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36207380</guid></item><item><title><![CDATA[New comment by alexbock in "Infra-Red, in Situ (Iris) Inspection of Silicon"]]></title><description><![CDATA[
<p>The center of the image it produces is perfectly sharp, but using a single achromatic doublet as a photographic lens at f/2 necessarily produces significant eighteenth-century softness as you move out to the edges. This image [1], taken to demonstrate that red wine (left) is clear and nearly as transparent as water (right) in the near infrared, provides a decent example of how images from this lens wide open soften outside the center much more quickly than those from modern commercial camera lenses, which typically don't lose sharpness at maximum aperture until the corners.<p>However, the sd Quattro is sensitive enough from 1000 nm - 1100 nm that I can take handheld shots outdoors on a sunny day while stopped down to f/5.6, and the smaller aperture gives more consistent sharpness across the frame. It also takes only a few seconds of exposure on a tripod to capture astrophotography of red giant stars that emit significant infrared, like Betelgeuse.<p>Incidentally, the original reason I wanted an infrared-sensitive camera and a 1000 nm long-pass filter was to photograph stars in the sky during the middle of the day, taking advantage of the quartic dependence on wavelength in Rayleigh scattering to remove the overpowering brightness of the sky.<p>[1] <a href="https://alexbock.github.io/blog/nir-examples/near-infrared-850nm-redwine-vs-water.jpg" rel="nofollow">https://alexbock.github.io/blog/nir-examples/near-infrared-8...</a> (note: this image used an 850 nm long-pass filter rather than 1000 nm but was taken with the same doublet described before)</p>
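<p>The Rayleigh-scattering payoff can be put into rough numbers: scattered intensity goes as 1/wavelength^4, so comparing blue daylight (~450 nm) against a 1000 nm band gives the suppression factor below. This is a back-of-the-envelope sketch that ignores atmospheric absorption and sensor response:</p>

```python
# Rayleigh scattering intensity scales as 1/wavelength**4, so the sky's
# scattered glow at 1000 nm is dimmer than at 450 nm by (1000/450)**4.
blue_nm = 450.0
nir_nm = 1000.0
suppression = (nir_nm / blue_nm) ** 4
print(f"sky brightness suppression factor: ~{suppression:.1f}x")
```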
]]></description><pubDate>Thu, 09 Mar 2023 00:01:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=35076492</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=35076492</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35076492</guid></item><item><title><![CDATA[New comment by alexbock in "Infra-Red, in Situ (Iris) Inspection of Silicon"]]></title><description><![CDATA[
<p>I enjoy monochrome infrared photography at wavelengths longer than 1000 nm using a Sigma sd Quattro (no Bayer filter, and the sensor IR filter is reversibly user-removable). To reduce exposure times as much as possible, I've been testing an AR-coated near infrared achromatic doublet (AC254-050-B-ML) with a 3D-printed housing to attach it to the camera mount with a manual focusing barrel. The next step is to design a fast double Gauss lens that can similarly be assembled from readily available NIR-coated stock lens elements.</p>
]]></description><pubDate>Wed, 08 Mar 2023 19:22:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=35073539</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=35073539</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35073539</guid></item><item><title><![CDATA[New comment by alexbock in "Show HN: Open Optical Designer, free and open source lens design software"]]></title><description><![CDATA[
<p>Producing a non-inverted image with only two lenses per eye and no additional prisms implies a Galilean telescope with a convex lens as the objective and a concave lens as the eyepiece. Zeiss makes distance magnifying eyeglasses using this design and calls them teleloupe spectacles:<p><a href="https://www.zeiss.com/vision-care/int/eye-care-professionals/other-products/magnifying-visual-devices/magnifying-visual-devices-for-professional.html" rel="nofollow">https://www.zeiss.com/vision-care/int/eye-care-professionals...</a><p>If you're going to make your own, keep in mind that there will be a tradeoff between magnification and field of view.</p>
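<p>For anyone sizing their own pair, the Galilean layout has a simple first-order thin-lens model: angular magnification is f_objective / |f_eyepiece|, and the afocal tube length is f_objective + f_eyepiece (the concave eyepiece's focal length being negative). A sketch with hypothetical focal lengths:</p>

```python
def galilean(f_objective_mm, f_eyepiece_mm):
    """First-order Galilean telescope: convex objective (f > 0), concave
    eyepiece (f < 0), afocal when the separation equals f_obj + f_eye."""
    assert f_objective_mm > 0 and f_eyepiece_mm < 0
    magnification = f_objective_mm / abs(f_eyepiece_mm)
    tube_length_mm = f_objective_mm + f_eyepiece_mm
    return magnification, tube_length_mm

# Hypothetical example: 100 mm objective paired with a -50 mm eyepiece
mag, length = galilean(100.0, -50.0)
print(mag, length)  # 2.0x magnification in a 50 mm tube
```

<p>Higher magnification means a longer objective focal length or a shorter (more strongly negative) eyepiece, and either choice shrinks the apparent field of view, which is the tradeoff mentioned above.</p>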
]]></description><pubDate>Thu, 02 Mar 2023 01:27:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=34990767</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=34990767</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34990767</guid></item><item><title><![CDATA[New comment by alexbock in "Show HN: Open Optical Designer, free and open source lens design software"]]></title><description><![CDATA[
<p>> Are there algorithms to select the lens elements for me given the properties I want?<p>I plan to add optimization features in the future to automatically tune design parameters to suit specified target criteria, as well as some application-oriented design generators for things like telescopes, binoculars, and camera lenses.<p>In the current version, the only features that automatically determine system specifications to achieve a specific result are autofocus for the distance from the last surface to the image, and a generator that produces a planoconvex singlet lens given a specified focal length and glass material.<p>> How does one buy lens elements? Are they custom manufactured or out of a catalog?<p>If I need some basic singlets or achromatic doublets for a visible-light use case that does not place strict requirements on the specific type of glass or the precise specifications of their AR coatings (if any), then I will typically order from Surplus Shed. They are significantly cheaper than other domestically shipped (US) sources for individual lens elements, but it is often necessary to design around the diameters and focal lengths that they have in stock rather than designing a finished system first and ordering exactly what it needs.<p>For specialized components like infrared and ultraviolet optics or for parts with formally documented specifications and product line consistency, I order from Thorlabs. They are more expensive.<p>Both Surplus Shed and Thorlabs will sell directly to individuals from their online stores. I have no affiliation with either company.</p>
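<p>The planoconvex generator mentioned above presumably reduces to the thin-lens lensmaker's equation: with the flat side giving R2 = infinity, 1/f = (n - 1)/R1, so R1 = (n - 1) * f. This is my own reconstruction of that step, not necessarily the tool's actual code:</p>

```python
def planoconvex_radius(focal_length_mm, n):
    """Thin-lens lensmaker's equation with a flat second surface (R2 = inf):
    1/f = (n - 1) * (1/R1), so R1 = (n - 1) * f."""
    return (n - 1.0) * focal_length_mm

# Example: a 100 mm focal length singlet in N-BK7 (n ~ 1.5168 at 587 nm)
r1 = planoconvex_radius(100.0, 1.5168)
print(f"convex surface radius: {r1:.2f} mm")  # ~51.68 mm
```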
]]></description><pubDate>Thu, 02 Mar 2023 00:27:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=34990353</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=34990353</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34990353</guid></item><item><title><![CDATA[Show HN: Open Optical Designer, free and open source lens design software]]></title><description><![CDATA[
<p>Article URL: <a href="https://alexbock.github.io/open-optical-designer/">https://alexbock.github.io/open-optical-designer/</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=34939954">https://news.ycombinator.com/item?id=34939954</a></p>
<p>Points: 59</p>
<p># Comments: 5</p>
]]></description><pubDate>Sat, 25 Feb 2023 20:08:44 +0000</pubDate><link>https://alexbock.github.io/open-optical-designer/</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=34939954</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34939954</guid></item><item><title><![CDATA[New comment by alexbock in "3D print and build a 164mm f/2.5 lens for less than $15"]]></title><description><![CDATA[
<p>If you want to work with 5,000 nm - 10,000 nm in that price range, you can use an affordable thermal camera with a microbolometer sensor. Surplus silicon and germanium lenses can sometimes be found cheap on eBay, and there are always a lot of very low-priced zinc selenide lenses for sale (intended for the CO2 laser cutter market).</p>
]]></description><pubDate>Thu, 25 Aug 2022 16:04:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=32595677</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=32595677</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=32595677</guid></item><item><title><![CDATA[New comment by alexbock in "3D print and build a 164mm f/2.5 lens for less than $15"]]></title><description><![CDATA[
<p>I originally heard about it from this page written by a group who silvered a 28-inch mirror this way: <a href="https://sites.google.com/site/spraysilveringtelescopemirrors/home/how-to-spray-silver-a-telescope-mirror" rel="nofollow">https://sites.google.com/site/spraysilveringtelescopemirrors...</a><p>It's still ultimately a silver nitrate reduction reaction, but it's extremely consistent. I tried reproducing the classic silver nitrate/glucose/sodium hydroxide/ammonia silvering baths astronomers used before the modern switch to aluminum vacuum coatings, and my success rate was only about one in five attempts. It's much more difficult to execute than the typical silver nitrate demonstrations where someone silvers the inside of a glass flask because you need the silver to produce a perfectly even layer on the outside surface of the glass for a telescope. The spray reaction has worked for me every time.</p>
]]></description><pubDate>Thu, 25 Aug 2022 14:47:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=32594369</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=32594369</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=32594369</guid></item><item><title><![CDATA[New comment by alexbock in "3D print and build a 164mm f/2.5 lens for less than $15"]]></title><description><![CDATA[
<p>I made several attempts at acetone vapor smoothing on mirrors printed in ABS and ASA, and the surfaces always came out very well polished at the micrometer scale but unacceptably warped, wrinkled, or patterned at the millimeter scale. The best finish I've gotten on a 3D-printed mirror came from sanding and polishing with a cotton ball soaked in cerium oxide slurry while the mirror blank spins on a pottery wheel.<p>Chemically depositing silver with the old-fashioned immersion setup is also very tricky, as the reaction is quite temperamental. I recommend the two-part spray process if you want to try silvering a telescope mirror.</p>
]]></description><pubDate>Thu, 25 Aug 2022 04:02:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=32589340</link><dc:creator>alexbock</dc:creator><comments>https://news.ycombinator.com/item?id=32589340</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=32589340</guid></item></channel></rss>