<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: otaviogood</title><link>https://news.ycombinator.com/user?id=otaviogood</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sat, 18 Apr 2026 08:12:49 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=otaviogood" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by otaviogood in "Ask HN: How to get started with robotics as a hobbyist?"]]></title><description><![CDATA[
<p>Like many have said, start with Arduino-like boards + servos. You can get servo driver boards that let you plug in multiple servos to your Arduino without power problems. Get those from Adafruit. Then you can graduate to the "bus" style servos like Feetech if you want to load up on servos.<p>If you want to learn to make PCBs, this tutorial shows you how to design them in open source software and then get them made cheaply (highly recommend):
<a href="https://www.youtube.com/watch?v=PlDOnSHkX2c" rel="nofollow">https://www.youtube.com/watch?v=PlDOnSHkX2c</a><p>Get a CAD program like Onshape (I love Onshape). Learn how to use it to design 3d printed components. Get a 3d printer. Maybe Bambu Lab. Now you should be able to make nice form factors for your robots. If you want to graduate from 3d printing to aluminum, then you can send your designs to a place like PCBWay or JLCPCB and get the aluminum parts reasonably cheaply.<p>At some point when you want to make a more "real" robot, read this guy's thesis and try to understand almost all of it:
A low cost modular actuator for dynamic robots
<a href="https://dspace.mit.edu/handle/1721.1/118671" rel="nofollow">https://dspace.mit.edu/handle/1721.1/118671</a>
It goes over so many things that are useful, like types of actuators, how they're designed, controlled, etc. I wish I had read that earlier. Then if you want to make something like that, I highly highly recommend MJBots.
<a href="https://mjbots.com/" rel="nofollow">https://mjbots.com/</a><p>Make something amazing!!!</p>
]]></description><pubDate>Sun, 15 Feb 2026 04:31:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=47021076</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=47021076</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47021076</guid></item><item><title><![CDATA[New comment by otaviogood in "DEDA – Tracking Dots Extraction, Decoding and Anonymisation Toolkit"]]></title><description><![CDATA[
<p>You might be reading into it too much. I think the originals were just random pieces of different kinds of paper. Graph paper, yellow lined paper, blank white paper... I don't remember exactly, but I think the copies could be special paper with a colored backside so they would know which way was up really easily for the scanning process.</p>
]]></description><pubDate>Wed, 02 Apr 2025 00:49:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=43552749</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=43552749</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43552749</guid></item><item><title><![CDATA[New comment by otaviogood in "DEDA – Tracking Dots Extraction, Decoding and Anonymisation Toolkit"]]></title><description><![CDATA[
<p>DARPA scanned the shreds. The funny thing is, they didn't want to shred the original paper, so first they photocopied the paper in a high quality color copier, shredded it, and scanned it. And that's where the little yellow dots came from. :D</p>
]]></description><pubDate>Wed, 02 Apr 2025 00:15:12 +0000</pubDate><link>https://news.ycombinator.com/item?id=43552586</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=43552586</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43552586</guid></item><item><title><![CDATA[New comment by otaviogood in "DEDA – Tracking Dots Extraction, Decoding and Anonymisation Toolkit"]]></title><description><![CDATA[
<p>My team and I used these yellow tracking dots to reconstruct shredded documents for a DARPA shredder challenge over a decade ago. You can see our program highlight the dots as we reconstruct the shredded docs. <a href="https://www.youtube.com/watch?v=uzZDhyrjdVo" rel="nofollow">https://www.youtube.com/watch?v=uzZDhyrjdVo</a>
Thanks to that, we were able to win by a large margin. :)</p>
]]></description><pubDate>Tue, 01 Apr 2025 22:39:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=43552025</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=43552025</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43552025</guid></item><item><title><![CDATA[New comment by otaviogood in "The boundary of neural network trainability is fractal"]]></title><description><![CDATA[
<p>This is <i>much</i> more interesting if you see the animations. <a href="https://x.com/jaschasd/status/1756930242965606582" rel="nofollow">https://x.com/jaschasd/status/1756930242965606582</a></p>
]]></description><pubDate>Mon, 19 Feb 2024 16:53:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=39431938</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=39431938</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39431938</guid></item><item><title><![CDATA[New comment by otaviogood in "OpenAI Tokenizer"]]></title><description><![CDATA[
<p>‘1984 is 1 token. 1884 is 2 tokens.’<p>I would be surprised if they still use this tokenization, as it’s not math-friendly.</p>
]]></description><pubDate>Wed, 05 Apr 2023 14:45:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=35455136</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=35455136</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35455136</guid></item><item><title><![CDATA[New comment by otaviogood in "Train CIFAR10 to 94% in under 10 seconds on a single A100"]]></title><description><![CDATA[
<p>Where do you see the room for improvement that gets you to much faster speeds? Also I love the single file to do everything. :)</p>
]]></description><pubDate>Mon, 30 Jan 2023 05:02:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=34575941</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=34575941</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34575941</guid></item><item><title><![CDATA[New comment by otaviogood in "How much boilerplate code you need to write a ray-tracer?"]]></title><description><![CDATA[
<p>The geometry in my raytracer can either be ray-traced primitives, like spheres or boxes, or it can be "signed distance functions" (SDFs), which let you define all kinds of crazy shapes. Inigo Quilez does a good job of explaining SDFs here: <a href="https://www.iquilezles.org/www/articles/distfunctions/distfunctions.htm" rel="nofollow">https://www.iquilezles.org/www/articles/distfunctions/distfu...</a>
The refraction math in my code is around line 799 or 809 depending on what you're looking for. There are a few errors in the refraction code in this version. :/ But caustics are handled well by my renderer.</p>
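This is not the commenter's shader, but the SDF ray-marching idea can be sketched in a few lines of plain Python (hypothetical scene: a single sphere; real shaders do this per pixel on the GPU):

```python
import math

def sdf_sphere(p, center, radius):
    # Signed distance from point p to a sphere: negative inside, positive outside.
    return math.dist(p, center) - radius

def ray_march(origin, direction, max_steps=100, eps=1e-4):
    # Sphere-trace: advance along the ray by the scene's distance value,
    # which is always a safe step size for an SDF.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf_sphere(p, (0.0, 0.0, 5.0), 1.0)
        if d < eps:
            return t  # hit: distance along the ray
        t += d
    return None  # miss

# A ray straight down +z hits the unit sphere centered at z=5 at distance ~4.
print(ray_march((0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
```

The same loop generalizes to any shape you can write a distance function for, which is why SDF scenes can look so varied with so little code.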
]]></description><pubDate>Fri, 18 Feb 2022 06:15:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=30382653</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=30382653</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=30382653</guid></item><item><title><![CDATA[New comment by otaviogood in "How much boilerplate code you need to write a ray-tracer?"]]></title><description><![CDATA[
<p>This article talks about a pretty full-featured ray tracer. If you just want to play around with some ideas and have fun, you can make different tradeoffs and still get very nice and fast renderings. Here's a ray tracer that I made that's GPU-accelerated and can make some nice looking images quickly in 1000 lines of code. <a href="https://www.shadertoy.com/view/4ddcRn" rel="nofollow">https://www.shadertoy.com/view/4ddcRn</a> The tradeoff is that it's all procedural graphics. So there are no triangle meshes and it would also be a bit tricky to implement a bidirectional ray tracer like this.</p>
]]></description><pubDate>Thu, 17 Feb 2022 23:00:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=30379893</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=30379893</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=30379893</guid></item><item><title><![CDATA[New comment by otaviogood in "The Feynman Lectures on Physics Audio Collection"]]></title><description><![CDATA[
<p>I thought it would be nice to listen to these with my podcast player on iPhone. Here's my too-complicated process that worked:<p>- In the Chrome inspector, look at the network tab when you click play on a Feynman lecture. Right-click the mp4 and do "copy as cURL".<p>- Go to the command line (unix style) and paste. Then append to that command something like "--output flp1.mp4". That will download the file locally with that file name.<p>- Put the file on Dropbox or something that will get it to your phone.<p>- From Dropbox on iPhone, share and export the file, then choose your podcast app. The podcast app that worked for me is "Pocket Casts".<p>- Now in Pocket Casts -> Profile -> Files, you should be able to play the mp4s with nice podcast-style controls and learn physics and be happy!</p>
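The download step ends up looking roughly like this (the URL and header here are placeholders; the real command is whatever "copy as cURL" gave you, which may include cookies or other headers the server requires):

```shell
# Paste the "copy as cURL" command from the network tab, then append --output.
# (example.com is a stand-in for the actual lecture mp4 URL.)
curl 'https://example.com/feynman/flp1.mp4' \
  -H 'User-Agent: Mozilla/5.0' \
  --output flp1.mp4
```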
]]></description><pubDate>Sat, 29 May 2021 05:53:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=27323333</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=27323333</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=27323333</guid></item><item><title><![CDATA[New comment by otaviogood in "Show HN: We made a proximity-based video chat for events"]]></title><description><![CDATA[
<p>Maybe we should have tried Daily. :) We tried Twilio video and unfortunately had a really hard time tracking down bugs. Wasted tons of time on that and then went back to doing our own stuff. :( I still don't know where the bug was. Could have been our code or Twilio's. But when we got rid of Twilio and used our own stuff, it resolved the bugs. Was a very frustrating process.
Also, the way Twilio video charges seems to assume lots of n-squared, high-res connections, which we don't have, so for an application like ours, rolling our own can cost less than a tenth as much.</p>
]]></description><pubDate>Wed, 05 May 2021 18:43:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=27053979</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=27053979</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=27053979</guid></item><item><title><![CDATA[New comment by otaviogood in "Show HN: We made a proximity-based video chat for events"]]></title><description><![CDATA[
<p>25-person Zoom calls aren't fun. So my friends and I made a virtual gathering space for people to host online events, meetups, social hours, etc.<p>Tech:
Frontend uses Svelte / Snowpack, which is great. The game view uses the DOM, which is questionable, but my webgl implementation wasn't so hot either.<p>Backend is Firebase for general stuff, a server written in Golang for the realtime game stuff, and another Golang server for video.<p>Main technical lesson learned so far: WebRTC sure is a pain to get right across everyone's devices, browsers, and connections.</p>
]]></description><pubDate>Wed, 05 May 2021 17:08:21 +0000</pubDate><link>https://news.ycombinator.com/item?id=27052606</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=27052606</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=27052606</guid></item><item><title><![CDATA[Show HN: We made a proximity-based video chat for events]]></title><description><![CDATA[
<p>Article URL: <a href="https://unnamed.chat">https://unnamed.chat</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=27052605">https://news.ycombinator.com/item?id=27052605</a></p>
<p>Points: 11</p>
<p># Comments: 3</p>
]]></description><pubDate>Wed, 05 May 2021 17:08:21 +0000</pubDate><link>https://unnamed.chat</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=27052605</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=27052605</guid></item><item><title><![CDATA[New comment by otaviogood in "Hacker's guide to Neural Networks (2012)"]]></title><description><![CDATA[
<p>I think this eventually turned into Andrej Karpathy's class at Stanford, CS231n. The class notes are here:
<a href="http://cs231n.github.io/" rel="nofollow">http://cs231n.github.io/</a>
The class is on YouTube.
If you like this hacker's guide, I think you'll definitely like the class and the notes.
edit:
A lot of the compute graph and backprop type stuff that is in the hacker's guide is covered in this specific class, starting about at this time: <a href="https://www.youtube.com/watch?v=i94OvYb6noo&t=207s" rel="nofollow">https://www.youtube.com/watch?v=i94OvYb6noo&t=207s</a></p>
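The compute-graph/backprop material the guide and class cover boils down to the chain rule over a graph of operations. A tiny hand-worked example (not the guide's code, just the idea) for f = (a + b) * a:

```python
# Forward pass: build the value bottom-up.
a, b = 2.0, 3.0
c = a + b          # intermediate node: c = 5.0
f = c * a          # output node:      f = 10.0

# Backward pass (chain rule): df/dc = a; a feeds f both directly and through c.
df_dc = a                  # 2.0
df_da = c + df_dc * 1.0    # direct path (c) + path through c (dc/da = 1): 7.0
df_db = df_dc * 1.0        # only path is through c (dc/db = 1): 2.0

print(f, df_da, df_db)     # 10.0 7.0 2.0
```

Autodiff frameworks do exactly this, just automatically and over much bigger graphs.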
]]></description><pubDate>Sun, 06 Jan 2019 23:22:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=18841410</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=18841410</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=18841410</guid></item><item><title><![CDATA[New comment by otaviogood in "Ask HN: How do I learn math/physics in my thirties?"]]></title><description><![CDATA[
<p>People say you have to "do" math to learn it. Usually they make it sound like you need to do the exercises in the books. I think that doing just that can be boring and demotivating.<p>I would suggest finding projects that can motivate you and help you exercise your math. Some suggestions of mathy things I regularly work on for fun:<p>1. Make a video game. If it's a 3d game, you'll have to do your matrices, dot products, trigonometry, etc.<p>2. shadertoy.com - This is a community site where people just program cool looking graphics for fun. All the code is open, so you can learn from it. Similar to game programming but without the mathless overhead. :)<p>3. Machine learning projects - I love writing various machine learning things, but the project that has been a great ML playground has been my self driving toy car. It gives me plenty of opportunities to explore many aspects of machine learning and that helps drive my math knowledge. My car repo is here: <a href="https://github.com/otaviogood/carputer" rel="nofollow">https://github.com/otaviogood/carputer</a> but a much easier project is donkeycar.com. ML will touch on linear algebra, calculus, probabilities/statistics, etc.<p>The most important thing for learning is to be inspired and have fun with what you're learning. :)</p>
]]></description><pubDate>Tue, 15 May 2018 18:01:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=17076124</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=17076124</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=17076124</guid></item><item><title><![CDATA[New comment by otaviogood in "Planet Shadertoy"]]></title><description><![CDATA[
<p>Books might not be the best resource for Shadertoy-type stuff. Almost all Shadertoy 3d shaders use a technique called ray-marching with signed distance functions. If you Google it, you should find good resources. Also, someone on Shadertoy made a very good tutorial using Shadertoy, which I think is kind of amazing... <a href="https://www.shadertoy.com/view/4dSfRc" rel="nofollow">https://www.shadertoy.com/view/4dSfRc</a>
There are other tutorial shaders on Shadertoy and I always try to make mine readable and heavily commented...
<a href="https://www.shadertoy.com/user/otaviogood" rel="nofollow">https://www.shadertoy.com/user/otaviogood</a></p>
]]></description><pubDate>Sun, 25 Feb 2018 06:53:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=16457574</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=16457574</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=16457574</guid></item><item><title><![CDATA[New comment by otaviogood in "Terrain rendering in fewer than 20 lines of code"]]></title><description><![CDATA[
<p>Shadertoy shaders _do_ include the render algorithm. That shader is completely ray-traced, so the only real input that is used for that shader is the x, y coordinates and time. It's not using any of the GPU's polygon rasterization. It's stateless and is generating the terrain while it ray-traces it.</p>
]]></description><pubDate>Sat, 25 Nov 2017 04:52:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=15774502</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=15774502</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=15774502</guid></item><item><title><![CDATA[New comment by otaviogood in "Sorting Two Tons of Lego, the Software Side"]]></title><description><![CDATA[
<p>It could be fun if you released your tagged data set of lego piece pictures so people in the ML community could try to write classifiers. Even untagged pics could be interesting.</p>
]]></description><pubDate>Sun, 07 May 2017 05:00:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=14283944</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=14283944</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=14283944</guid></item><item><title><![CDATA[New comment by otaviogood in "Ask HN: How do you get notified about newest research papers in your field?"]]></title><description><![CDATA[
<p><a href="http://www.arxiv-sanity.com" rel="nofollow">http://www.arxiv-sanity.com</a>
That helps sort through arxiv papers and get recommendations.</p>
]]></description><pubDate>Fri, 05 Aug 2016 16:19:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=12233503</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=12233503</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=12233503</guid></item><item><title><![CDATA[New comment by otaviogood in "Easy Scalable Text Rendering on the GPU"]]></title><description><![CDATA[
<p>The Skew language compiles to JavaScript, C#, and C++ (which is still a work in progress). So if you don't like that it's in Skew, you might have the option to switch it to one of those languages. It probably won't work for the more system-dependent parts of the code, but you might get something out of it.</p>
]]></description><pubDate>Wed, 06 Apr 2016 23:18:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=11443142</link><dc:creator>otaviogood</dc:creator><comments>https://news.ycombinator.com/item?id=11443142</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=11443142</guid></item></channel></rss>