<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: ericjang</title><link>https://news.ycombinator.com/user?id=ericjang</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Tue, 28 Apr 2026 19:35:09 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=ericjang" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[How to Build a Vibrant Technology Industry]]></title><description><![CDATA[
<p>Article URL: <a href="https://docs.google.com/document/d/1LlNCY9_4MXhilnbu9MqKhSjHjMGqtDA8zZmSQynVjmY/edit">https://docs.google.com/document/d/1LlNCY9_4MXhilnbu9MqKhSjHjMGqtDA8zZmSQynVjmY/edit</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=44147441">https://news.ycombinator.com/item?id=44147441</a></p>
<p>Points: 2</p>
<p># Comments: 0</p>
]]></description><pubDate>Sat, 31 May 2025 22:41:25 +0000</pubDate><link>https://docs.google.com/document/d/1LlNCY9_4MXhilnbu9MqKhSjHjMGqtDA8zZmSQynVjmY/edit</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=44147441</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44147441</guid></item><item><title><![CDATA[New comment by ericjang in "Do Large Language Models learn world models or just surface statistics? (2023)"]]></title><description><![CDATA[
<p>Intra-distribution generalization is also not well posed in practical real-world settings. Suppose you learn a mapping f : x -> y. Informally, intra-distribution generalization implies that f generalizes for "points from the same data distribution p(x)". Two issues here:<p>1. In practical scenarios, how do you know whether x' is really drawn from p(x)? Even if you could compute log p(x') under the true data distribution, you can only verify that the support at x' is non-zero; one sample is not enough to tell you whether x' was drawn from p(x).<p>2. In high-dimensional settings, an x' that is not exactly equal to an example in the training set can have arbitrarily high generalization error. Here's a criminally under-cited paper discussing this: <a href="https://arxiv.org/abs/1801.02774" rel="nofollow">https://arxiv.org/abs/1801.02774</a></p>
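A minimal numerical sketch of point 1, under purely illustrative assumptions (the 1-D Gaussians, the sample x' = 0.25, and the helper gaussian_logpdf are hypothetical choices, not from the original comment): a single point can have an identical, finite log density under two different distributions, so a finite log p(x') alone cannot certify which distribution x' came from.

```python
import math

def gaussian_logpdf(x, mu, sigma):
    # Log density of N(mu, sigma^2) evaluated at x.
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

# Toy example: suppose the "true" data distribution is p(x) = N(0, 1)
# and an alternative distribution is q(x) = N(0.5, 1). The single sample
# x' = 0.25 sits midway between the two means, so its log density is
# finite and *identical* under both distributions.
x_prime = 0.25
lp = gaussian_logpdf(x_prime, 0.0, 1.0)  # log p(x') under N(0, 1)
lq = gaussian_logpdf(x_prime, 0.5, 1.0)  # log q(x') under N(0.5, 1)
print(lp, lq)  # both ~ -0.95: non-zero support under both, so one
               # sample cannot tell us which distribution x' came from
```

One sample verifying non-zero support is consistent with arbitrarily many distributions; distinguishing them would require many samples (e.g., a two-sample test), which is exactly what a single deployment-time query does not give you.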
]]></description><pubDate>Fri, 22 Nov 2024 17:33:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=42215736</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=42215736</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42215736</guid></item><item><title><![CDATA[1X World Model]]></title><description><![CDATA[
<p>Article URL: <a href="https://www.1x.tech/discover/1x-world-model">https://www.1x.tech/discover/1x-world-model</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=41575691">https://news.ycombinator.com/item?id=41575691</a></p>
<p>Points: 3</p>
<p># Comments: 0</p>
]]></description><pubDate>Wed, 18 Sep 2024 04:06:28 +0000</pubDate><link>https://www.1x.tech/discover/1x-world-model</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=41575691</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41575691</guid></item><item><title><![CDATA[New comment by ericjang in "Ask HN: Could you share your personal blog here?"]]></title><description><![CDATA[
<p><a href="https://evjang.com/" rel="nofollow noreferrer">https://evjang.com/</a>
I write about Machine Learning and Robotics</p>
]]></description><pubDate>Tue, 04 Jul 2023 16:07:25 +0000</pubDate><link>https://news.ycombinator.com/item?id=36588524</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=36588524</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36588524</guid></item><item><title><![CDATA[New comment by ericjang in "Emerging architectures for LLM applications"]]></title><description><![CDATA[
<p>I am an AI researcher. Most actual AI researchers and engineers use very few of these tools; the only ones in common use are model providers like the OpenAI API and the public clouds (AWS, Azure, GCP). The rest are infra-centric tools whose importance a16z is highly incentivized to over-inflate.</p>
]]></description><pubDate>Wed, 21 Jun 2023 00:00:00 +0000</pubDate><link>https://news.ycombinator.com/item?id=36412477</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=36412477</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36412477</guid></item><item><title><![CDATA[New comment by ericjang in "Why do railway tracks have crushed stones alongside them?"]]></title><description><![CDATA[
<p>I entered the title question of the article into ChatGPT, as I was curious whether it could factually synthesize information from the Internet the way this article does. Notably, it missed the detail about keeping vegetation away.<p><chatgpt>
Railway tracks have crushed stones, commonly known as ballast, alongside them for several important reasons. Here are the main reasons for using ballast on railway tracks:<p>Stability and Load Distribution: Ballast provides stability to the railway tracks and distributes the load of the trains evenly. The crushed stones interlock with each other and form a stable base for the tracks. This helps prevent the tracks from shifting or sinking under the weight of the trains.<p>Drainage: Ballast allows for effective drainage of water from the tracks. The open spaces between the crushed stones allow water to flow away from the tracks, preventing the accumulation of water. Proper drainage is essential to maintain the stability of the tracks and prevent damage caused by water-related issues like erosion or track bed saturation.<p>Track Alignment and Adjustment: Ballast allows for easy alignment and adjustment of the tracks. The crushed stones can be added or removed as needed to maintain the proper track alignment and ensure a smooth ride for trains. This flexibility is especially important during track maintenance or when new tracks are being laid.<p>Noise and Vibration Damping: Ballast helps in reducing noise and vibrations generated by passing trains. The layer of crushed stones absorbs and dampens the vibrations, minimizing the impact on the surrounding environment and nearby structures.<p>Track Maintenance: Ballast provides a protective layer for the tracks. It acts as a barrier between the track infrastructure and the underlying soil, preventing soil movement or erosion that could damage the tracks. Additionally, ballast makes it easier to detect and repair any issues with the track, such as loose fastenings or track bed settlement.<p>Overall, the use of ballast alongside railway tracks is crucial for maintaining the stability, alignment, drainage, and longevity of the tracks, ensuring safe and efficient train operations.
</chatgpt></p>
]]></description><pubDate>Sun, 04 Jun 2023 19:24:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=36188649</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=36188649</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36188649</guid></item><item><title><![CDATA[New comment by ericjang in "Ask HN: Who is hiring? (May 2023)"]]></title><description><![CDATA[
<p>1X (Formerly Halodi Robotics) | <a href="https://1x.tech/" rel="nofollow">https://1x.tech/</a> | Multiple Roles | Onsite in Oslo and Bay Area<p>1X is an engineering and robotics company producing androids capable of human-like movements and behaviors. The company was founded in 2014 and is headquartered in Norway, with over 50 employees globally. 1X's mission is to create robots with practical, real-world applications to augment human labor globally. We recently announced a $23.5M Series A2 funding led by OpenAI (<a href="https://1xtech.medium.com/1x-raises-23-5m-in-series-a2-funding-led-by-openai-6040af4f3f4f" rel="nofollow">https://1xtech.medium.com/1x-raises-23-5m-in-series-a2-fundi...</a>)<p>Open Positions:<p>- Senior DevOps Engineer | Full-Time | Onsite | Oslo, Norway
<a href="https://1x.tech/#job-1112510" rel="nofollow">https://1x.tech/#job-1112510</a><p>- Electronics Hardware Engineer | Full-Time | Onsite | Oslo, Norway
<a href="https://1x.tech/#job-1079027" rel="nofollow">https://1x.tech/#job-1079027</a><p>- Full-Stack AI Resident | Intern | Onsite | Bay Area, CA, USA
<a href="https://1x.tech/#job-1079027" rel="nofollow">https://1x.tech/#job-1079027</a></p>
]]></description><pubDate>Mon, 01 May 2023 20:19:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=35778574</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=35778574</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35778574</guid></item><item><title><![CDATA[The Humanoid Robot Dream (Asianometry)]]></title><description><![CDATA[
<p>Article URL: <a href="https://www.youtube.com/watch?v=_fBFWSXzQLQ">https://www.youtube.com/watch?v=_fBFWSXzQLQ</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=35671645">https://news.ycombinator.com/item?id=35671645</a></p>
<p>Points: 2</p>
<p># Comments: 0</p>
]]></description><pubDate>Sat, 22 Apr 2023 22:53:51 +0000</pubDate><link>https://www.youtube.com/watch?v=_fBFWSXzQLQ</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=35671645</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35671645</guid></item><item><title><![CDATA[New comment by ericjang in "Google DeepMind"]]></title><description><![CDATA[
<p>Jeff was the first author on the DistBelief paper - he's always been big on model-parallelism + distributing neural network knowledge across many computers <a href="https://research.google/pubs/pub40565/" rel="nofollow">https://research.google/pubs/pub40565/</a> . I really have to emphasize that model-parallelism of a big network sounds obvious today, but it was <i>totally</i> non-obvious in 2011 when they were building it out.<p>DistBelief was tricky to program because it was written entirely in C++ and Protobufs, IIRC. The development of TFv1 preceded my time at Google, so I can't comment on who contributed what.</p>
]]></description><pubDate>Fri, 21 Apr 2023 16:02:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=35655805</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=35655805</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35655805</guid></item><item><title><![CDATA[New comment by ericjang in "Google DeepMind"]]></title><description><![CDATA[
<p>Jeff was onto the "just scale up the big brain" idea very early, perhaps as early as 2012 (Andrew Ng training networks on 1000s of CPUs). This vision is sort of summarized in <a href="https://blog.google/technology/ai/introducing-pathways-next-generation-ai-architecture/" rel="nofollow">https://blog.google/technology/ai/introducing-pathways-next-...</a> and fleshed out more in <a href="https://arxiv.org/abs/2203.12533" rel="nofollow">https://arxiv.org/abs/2203.12533</a>, but he had been internally promoting the idea since before 2016.<p>When I joined Brain in 2016, I thought the idea of training billion/trillion-parameter sparsely gated mixtures of experts was a huge waste of resources and incredibly naive. But it turns out he was right, and it would take ~6 more years before that was abundantly obvious to the rest of the research community.<p>Here's his Scholar page (h-index of 94):
<a href="https://scholar.google.com/citations?hl=en&user=NMS69lQAAAAJ&view_op=list_works" rel="nofollow">https://scholar.google.com/citations?hl=en&user=NMS69lQAAAAJ...</a><p>As a leader, he also managed the development of TensorFlow and the TPU. Consider the context and time frame: the year is 2014/2015, and a lot of academics still don't believe deep learning works. Jeff pivots a >100-person org to go all-in on deep learning, invests in an upgraded version of Theano (TF) and then gives it away to the community for free, and develops Google's own training chip to compete with Nvidia. These are highly non-obvious ideas that show much more spine & vision than most tech leaders have. Not to mention he designed & coded large parts of TF himself!<p>And before that, he was doing systems engineering on non-ML work. It's rare for a very senior engineer to pivot to a completely new field and then do what he did.<p>Jeff has certainly made mistakes as a leader (failing to translate Google Brain's numerous fundamental breakthroughs into more ambitious AI products, and failing to consolidate the redundant big-model efforts within Google Research), but I would consider his high-level directional bets to be incredibly prescient.</p>
]]></description><pubDate>Thu, 20 Apr 2023 22:17:40 +0000</pubDate><link>https://news.ycombinator.com/item?id=35647254</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=35647254</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35647254</guid></item><item><title><![CDATA[“Think about this step by step; the person giving you the problem is Yann LeCun”]]></title><description><![CDATA[
<p>Article URL: <a href="https://twitter.com/stanislavfort/status/1639731204307005443">https://twitter.com/stanislavfort/status/1639731204307005443</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=35311640">https://news.ycombinator.com/item?id=35311640</a></p>
<p>Points: 10</p>
<p># Comments: 0</p>
]]></description><pubDate>Sun, 26 Mar 2023 06:24:42 +0000</pubDate><link>https://twitter.com/stanislavfort/status/1639731204307005443</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=35311640</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35311640</guid></item><item><title><![CDATA[New comment by ericjang in "The sinister attempts to ‘decolonise’ mathematics"]]></title><description><![CDATA[
<p>Can you link the source? My Google search of "punctuality and logical objectivity were "white expectations" not to be applied to persons of color" didn't find any exact matches, so I'm curious which guide you're referring to.</p>
]]></description><pubDate>Tue, 15 Nov 2022 10:22:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=33607438</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=33607438</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=33607438</guid></item><item><title><![CDATA[New comment by ericjang in "Alaska snow crab season canceled after disappearance of an estimated 1B crabs"]]></title><description><![CDATA[
<p>Would it be simpler just to de-claw the crabs so that they are unable to poke / snip at each other?</p>
]]></description><pubDate>Sat, 15 Oct 2022 00:17:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=33210322</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=33210322</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=33210322</guid></item><item><title><![CDATA[New comment by ericjang in "South Korean prosecutors say Do Kwon 'obviously on the run', asks Interpol help"]]></title><description><![CDATA[
<p>s/substantially/sublimate ?</p>
]]></description><pubDate>Mon, 19 Sep 2022 16:29:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=32900506</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=32900506</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=32900506</guid></item><item><title><![CDATA[New comment by ericjang in "Search 5.8B images used to train popular AI art models"]]></title><description><![CDATA[
<p>It would be a stunning twist of irony if this website uploaded images to a proprietary image dataset used for training AI models, pitching it as "uncorrelated data".</p>
]]></description><pubDate>Thu, 15 Sep 2022 00:14:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=32845301</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=32845301</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=32845301</guid></item><item><title><![CDATA[How can we make robotics more like generative modeling?]]></title><description><![CDATA[
<p>Article URL: <a href="https://evjang.com/2022/07/23/robotics-generative.html">https://evjang.com/2022/07/23/robotics-generative.html</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=32287931">https://news.ycombinator.com/item?id=32287931</a></p>
<p>Points: 61</p>
<p># Comments: 3</p>
]]></description><pubDate>Sat, 30 Jul 2022 16:16:29 +0000</pubDate><link>https://evjang.com/2022/07/23/robotics-generative.html</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=32287931</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=32287931</guid></item><item><title><![CDATA[Job Hunt as a PhD in AI / ML / RL: How It Happens]]></title><description><![CDATA[
<p>Article URL: <a href="https://www.natolambert.com/writing/ai-phd-job-hunt">https://www.natolambert.com/writing/ai-phd-job-hunt</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=32027998">https://news.ycombinator.com/item?id=32027998</a></p>
<p>Points: 18</p>
<p># Comments: 2</p>
]]></description><pubDate>Fri, 08 Jul 2022 22:03:23 +0000</pubDate><link>https://www.natolambert.com/writing/ai-phd-job-hunt</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=32027998</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=32027998</guid></item><item><title><![CDATA[New comment by ericjang in "The Machine Learning Job Market"]]></title><description><![CDATA[
<p>true</p>
]]></description><pubDate>Tue, 26 Apr 2022 14:49:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=31168158</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=31168158</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=31168158</guid></item><item><title><![CDATA[New comment by ericjang in "The Machine Learning Job Market"]]></title><description><![CDATA[
<p>Ouch</p>
]]></description><pubDate>Tue, 26 Apr 2022 14:48:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=31168143</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=31168143</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=31168143</guid></item><item><title><![CDATA[New comment by ericjang in "The Machine Learning Job Market"]]></title><description><![CDATA[
<p>Oof</p>
]]></description><pubDate>Tue, 26 Apr 2022 14:37:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=31168009</link><dc:creator>ericjang</dc:creator><comments>https://news.ycombinator.com/item?id=31168009</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=31168009</guid></item></channel></rss>