<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: JHonaker</title><link>https://news.ycombinator.com/user?id=JHonaker</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 23 Apr 2026 10:37:34 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=JHonaker" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by JHonaker in "ggsql: A Grammar of Graphics for SQL"]]></title><description><![CDATA[
<p>Totally get that. I was mostly just long-windedly complaining that the one problem I have with it seems to be exacerbated, not fixed, by this. I was also hoping someone would say “oh it’s actually way easier than you think, see (amazing link).”<p>I really do think it’s a good idea to explore! Sometimes I feel crazy because I’m the only one in my department who prefers to just write SQL to deal with our DBs instead of fiddling with a Python/R connector that always has its own quirks.</p>
]]></description><pubDate>Tue, 21 Apr 2026 01:08:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=47843347</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=47843347</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47843347</guid></item><item><title><![CDATA[New comment by JHonaker in "ggsql: A Grammar of Graphics for SQL"]]></title><description><![CDATA[
<p>I applaud the project, and I completely agree that the concepts map nicely to SQL. The R equivalent of a WITH data-prep block followed by the VISUALIZE is pretty much how all my plotting code is structured.<p>However, I don't see what the benefits of this are (other than having a simple DSL, but that creates the yet-another-DSL problem) over ggplot2. What do I gain by using this over ggplot2 in R?<p>The only problem, and the only reason I ever leave ggplot2 for visualizations, is how difficult it is to do anything "non-standard" that hasn't already had a geom created for it in the ggplot ecosystem. When you want to do something "different" it's way easier to drop into the primitive drawing operations of whatever you're using than it is to try to write the ggplot-friendly adapter.<p>Even wrapping common "partial specifications" as a function (which should "just work" imo) is difficult depending on whether you're trying to wrap something that composes with the rest of the spec via `+` or via pipe (`|>`, the operator formerly known as `%>%`).</p>
]]></description><pubDate>Mon, 20 Apr 2026 18:16:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=47838395</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=47838395</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47838395</guid></item><item><title><![CDATA[New comment by JHonaker in "Kent Beck: Parkinson's"]]></title><description><![CDATA[
<p>Most of the memories I have of my grandmother are post-serious Parkinson's progression. She was able to live a very long life, but it seriously affected her. Good luck, hoping for the best for you.<p>Fuck Parkinson's.</p>
]]></description><pubDate>Fri, 17 Apr 2026 02:52:57 +0000</pubDate><link>https://news.ycombinator.com/item?id=47802009</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=47802009</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47802009</guid></item><item><title><![CDATA[New comment by JHonaker in "Bayesian statistics for confused data scientists"]]></title><description><![CDATA[
<p>I’m not sure what your professional experience is in, but as a counterpoint, I’ve never been in a situation where I hadn’t wished for a system I’m working with to already be in a Bayesian framework. Having said that, I only occasionally build things from scratch instead of modifying existing systems, so I’m not always lucky enough to be able to work with them.<p>The pain points around getting a sampler/model pairing working in a reasonable timeframe are definitely a valid complaint. In my experience, inference methods in Bayesian stats are much less forgiving of poorly specified models (or, said another way, don’t let you get away with ignoring important structural components of the phenomena of interest). A poorly performing model (in terms of sampler speed/mixing) is often a sign of a problem with the geometry of the parameter space. Frustratingly, this can sometimes be a result of conceptually equivalent but computationally different parameterizations (e.g. centered vs. non-centered multilevel effects).<p>The struggles are worth it IMO because they are helpful feedback that guides design, and the ease with which I can compute meaningful uncertainty bounds on pretty much any quantity of interest is invaluable.</p>
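<p>The centered vs. non-centered distinction above can be sketched in a few lines. This is a minimal NumPy illustration, not production inference code: mu and tau are fixed numbers here, whereas in a real hierarchical model they would be parameters being sampled jointly with theta.</p>

```python
import numpy as np

rng = np.random.default_rng(0)
mu, tau = 1.5, 0.3  # group-level mean and scale (fixed for illustration)
n = 100_000

# Centered parameterization: draw the group effect theta directly.
theta_centered = rng.normal(mu, tau, size=n)

# Non-centered parameterization: draw a standard-normal "raw" variable
# and shift/scale it. The implied distribution of theta is identical,
# but a sampler now explores an isotropic space, which sidesteps the
# funnel-shaped geometry that appears when tau is small.
z = rng.normal(0.0, 1.0, size=n)
theta_noncentered = mu + tau * z

print(theta_centered.mean(), theta_noncentered.mean())
```

<p>Both arrays have (up to Monte Carlo error) the same mean and spread; the difference only matters for how an MCMC sampler moves through the joint space when mu and tau are themselves unknown.</p>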
]]></description><pubDate>Sun, 22 Mar 2026 13:29:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=47477346</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=47477346</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47477346</guid></item><item><title><![CDATA[New comment by JHonaker in "Diode – Build, program, and simulate hardware"]]></title><description><![CDATA[
<p>I shorted it and it crashed the page. I feel like that was appropriate. :D</p>
]]></description><pubDate>Wed, 25 Feb 2026 15:23:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=47152756</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=47152756</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47152756</guid></item><item><title><![CDATA[New comment by JHonaker in "Bayesian Data Analysis, Third edition (2013) [pdf]"]]></title><description><![CDATA[
<p>My go-to for teaching statistics is Statistical Rethinking. It’s basically a course in how to actually think about modeling: what you’re really doing is analyzing a hypothesis, a model may be consistent with a number of hypotheses, and figuring out which hypotheses any given model implies is the hard/fun part; this book teaches you that. The only drawback is that it’s not free. (Although there are excellent lectures by the author available for free on YouTube. These are worth watching even if you don’t get the book.)<p>I also recommend Gelman’s (one of the authors of the linked book) Regression and Other Stories as a more approachable text for this content.<p>Think Bayes and Bayesian Methods for Hackers are introductory books for a beginner coming from a programming background.<p>If you want something more from the ML world that heavily emphasizes the benefits of probabilistic (Bayesian) methods, I highly recommend Kevin Murphy’s Probabilistic Machine Learning. I have only read the first edition, before he split it into two volumes and expanded it, but I’ve only heard good things about the new volumes too.</p>
]]></description><pubDate>Sun, 28 Sep 2025 19:53:36 +0000</pubDate><link>https://news.ycombinator.com/item?id=45407381</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=45407381</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45407381</guid></item><item><title><![CDATA[New comment by JHonaker in "The sisters “paradox” – counter-intuitive probability"]]></title><description><![CDATA[
<p>Any time you start conditioning on something, i.e. selecting subsets of data to analyze. You can fool yourself quite often if you do something seemingly innocuous like selecting "everyone with at least one X" and comparing the resulting conditional computations to what's true unconditionally (meaning not conditioning on anything, not "in all cases").</p>
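<p>A quick simulation makes the trap concrete. This is a minimal Python sketch of the classic two-child version: the unconditional chance of two girls is 1/4, but restricting to families with "at least one girl" shifts it to 1/3.</p>

```python
import random

random.seed(42)
# True means "girl"; each family has two independent 50/50 children.
families = [(random.random() < 0.5, random.random() < 0.5)
            for _ in range(200_000)]

# Unconditional: P(both girls) is about 1/4.
p_uncond = sum(a and b for a, b in families) / len(families)

# Condition on "at least one girl": P(both girls | >= 1 girl) is about 1/3,
# because the subset silently drops the boy-boy families.
with_girl = [(a, b) for a, b in families if a or b]
p_cond = sum(a and b for a, b in with_girl) / len(with_girl)

print(round(p_uncond, 3), round(p_cond, 3))
```

<p>Comparing p_uncond against p_cond as if they answered the same question is exactly the kind of apples-to-oranges comparison the selection step smuggles in.</p>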
]]></description><pubDate>Thu, 28 Aug 2025 17:08:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=45054540</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=45054540</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45054540</guid></item><item><title><![CDATA[New comment by JHonaker in "You Don't Need Monads"]]></title><description><![CDATA[
<p>This is beautiful whether you interpret it genuinely or as satire.</p>
]]></description><pubDate>Thu, 07 Aug 2025 14:47:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=44825192</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=44825192</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44825192</guid></item><item><title><![CDATA[New comment by JHonaker in "Machine Learning: The Native Language of Biology"]]></title><description><![CDATA[
<p>Agreed wholeheartedly. I have argued with the VP of our department about this paper quite a few times.<p>I feel like Breiman sets up a strawman that I've never encountered when I work with my colleagues who are trained in the statistics community. That doesn't mean it didn't exist 25 years ago when he wrote it. I concede that we are sometimes willing to make simplifying assumptions in order to state something particular, but it's almost like we've been culturally conditioned to steep everything we say in every caveat possible.<p>Meanwhile, I am constantly having to point out the poor feedback we've had about some of the XGBoost models, despite the fact that they're clearly the most "predictive" when evaluated naively.</p>
]]></description><pubDate>Fri, 06 Jun 2025 20:06:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=44204489</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=44204489</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44204489</guid></item><item><title><![CDATA[New comment by JHonaker in "Consider Knitting"]]></title><description><![CDATA[
<p>> I'm so slow it takes too much time for me to not think that it's a waste of time because I could have done something more meaningful instead.<p>That's the funny thing about the idea of meaningful things. It is solely determined by what you think is meaningful. Personally, just sitting and making something is an extremely meaningful activity to me.</p>
]]></description><pubDate>Wed, 04 Jun 2025 14:16:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=44180975</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=44180975</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44180975</guid></item><item><title><![CDATA[New comment by JHonaker in "What will happen if you do an entire woodworking project using a VR headset? [video]"]]></title><description><![CDATA[
<p>Sometimes even eye to brain to hand latency is too long...</p>
]]></description><pubDate>Tue, 18 Feb 2025 16:36:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=43091732</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=43091732</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43091732</guid></item><item><title><![CDATA[New comment by JHonaker in "Show HN: I made a tool that generates your Chinese name based on your name"]]></title><description><![CDATA[
<p>Very neat, but audio doesn't work on desktop either (Linux, Firefox). I had an old-school generated voice say "Chinese letter Chinese letter Chinese letter." It was definitely amusing, but I don't think that's what you were going for.</p>
]]></description><pubDate>Fri, 24 Jan 2025 21:26:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=42817066</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=42817066</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42817066</guid></item><item><title><![CDATA[New comment by JHonaker in "Ross Ulbricht granted a full pardon"]]></title><description><![CDATA[
<p>Wow. I thought you were being glib, but the average comment length is noticeably higher in the linked discussion. While length isn’t necessarily a valid proxy for meaningful conversation, this was definitely an eye-opening contrast to the current thread.</p>
]]></description><pubDate>Wed, 22 Jan 2025 01:42:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=42787748</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=42787748</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42787748</guid></item><item><title><![CDATA[New comment by JHonaker in "Zuckerberg approved training Llama on LibGen [pdf]"]]></title><description><![CDATA[
<p>Yes but your library still legally obtained those copies in the first place.</p>
]]></description><pubDate>Sun, 12 Jan 2025 17:46:06 +0000</pubDate><link>https://news.ycombinator.com/item?id=42675266</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=42675266</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42675266</guid></item><item><title><![CDATA[New comment by JHonaker in "Defining Statistical Models in Jax?"]]></title><description><![CDATA[
<p>Ah, I did not realize that the `realNVP` was a link! Thanks.</p>
]]></description><pubDate>Thu, 17 Oct 2024 14:26:47 +0000</pubDate><link>https://news.ycombinator.com/item?id=41870001</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=41870001</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41870001</guid></item><item><title><![CDATA[New comment by JHonaker in "Defining Statistical Models in Jax?"]]></title><description><![CDATA[
<p>Thanks! I've read the first one before. I'll take a look at the other two!</p>
]]></description><pubDate>Thu, 17 Oct 2024 14:06:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=41869811</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=41869811</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41869811</guid></item><item><title><![CDATA[New comment by JHonaker in "Defining Statistical Models in Jax?"]]></title><description><![CDATA[
<p>I'm very excited by the work being put in to make Bayesian inference more manageable. It's in a spot that feels very similar to deep learning circa the mid-2010s, when Caffe, Torch, and hand-written gradients were the options. We <i>can</i> do it, but doing anything more complicated than common model structures like hierarchical Gaussian linear models requires dropping out of the nice places and into the guts.<p>I've had a lot of success with Numpyro (a JAX library), and have used quite a lot of tools that are simpler interfaces to Stan. I've also had to write quite a few model-specific things from scratch by hand (more for sequential Monte Carlo than MCMC). I'm very excited for a world where PPLs become scalable and easier to use/customize.<p>> I think there is a good chance that normalizing flow-based variational inference will displace MCMC as the go-to method for Bayesian posterior inference as soon as everyone gets access to good GPUs.<p>Wow. This is incredibly surprising. I'm only tangentially aware of normalizing flows, but apparently I need to look at the intersection of them and Bayesian statistics now! Any sources from anyone would be most appreciated!</p>
]]></description><pubDate>Thu, 17 Oct 2024 13:28:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=41869481</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=41869481</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41869481</guid></item><item><title><![CDATA[New comment by JHonaker in "Damas-Hindley-Milner inference two ways"]]></title><description><![CDATA[
<p>There's no reason you still can't!</p>
]]></description><pubDate>Wed, 16 Oct 2024 17:25:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=41861529</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=41861529</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41861529</guid></item><item><title><![CDATA[New comment by JHonaker in "Sorry, GenAI is NOT going to 10x computer programming"]]></title><description><![CDATA[
<p>I've been looking for such a walkthrough for months. I can't find anyone actually showing them working and building something. There are plenty of, "look at this code this thing spit out that successfully builds," but I've not been able to find anything I would consider truly showing someone building something novel.<p>If anyone has a YT video or a detailed article, please show me!</p>
]]></description><pubDate>Tue, 01 Oct 2024 18:43:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=41712485</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=41712485</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41712485</guid></item><item><title><![CDATA[New comment by JHonaker in "TiddlyWiki5 – A self-contained JavaScript wiki for the browser"]]></title><description><![CDATA[
<p>I am not the author of the book!<p>As for the wiki: it's not hosted; it's just on my local machine. You basically drop the minified JS library in a tiddler, then create a custom widget type and a ViewTemplate to automatically display the widget when the content-type is the correct one.<p>I used <a href="http://whatfettle.com/2008/05/TiddlyProcessing/" rel="nofollow">http://whatfettle.com/2008/05/TiddlyProcessing/</a> as the basis (although it's old) and the main docs on creating a custom widget (<a href="https://tiddlywiki.com/dev/#Javascript%20Widget%20Tutorial" rel="nofollow">https://tiddlywiki.com/dev/#Javascript%20Widget%20Tutorial</a>).</p>
]]></description><pubDate>Mon, 19 Aug 2024 14:57:31 +0000</pubDate><link>https://news.ycombinator.com/item?id=41291607</link><dc:creator>JHonaker</dc:creator><comments>https://news.ycombinator.com/item?id=41291607</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41291607</guid></item></channel></rss>