<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: haensi</title><link>https://news.ycombinator.com/user?id=haensi</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Tue, 28 Apr 2026 19:39:13 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=haensi" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[Ask HN: What are the recommender systems papers from 2024-2025?]]></title><description><![CDATA[
<p>I’ve been keeping up with the classics (NCF, Wide & Deep, LightGCN), but the field seems to have shifted dramatically in the last 18–24 months toward LLM-based reasoning and graph-based retrieval at scale.<p>I’m looking for the "state of the art" in 2026. Specifically:<p>LLM4Rec: Beyond just using LLMs for feature engineering—who is doing generative recommendation well?<p>Retrieval vs. Ranking: Any new breakthroughs in the "Two-Tower" paradigm or vector database integration?<p>Real-world Scale: Papers that address the latency/cost trade-offs of these newer, heavier models.<p>What has been the most influential paper you’ve read recently that changed how you think about discovery?</p>
<hr>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=46692368">https://news.ycombinator.com/item?id=46692368</a></p>
<p>Points: 15</p>
<p># Comments: 1</p>
]]></description><pubDate>Tue, 20 Jan 2026 14:50:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=46692368</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=46692368</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46692368</guid></item><item><title><![CDATA[New comment by haensi in "TimeCapsuleLLM: LLM trained only on data from 1800-1875"]]></title><description><![CDATA[
<p>What kind of Christian books do you read? Jonathan Edwards, John Bunyan, J.C. Ryle, C.H. Spurgeon?</p>
]]></description><pubDate>Tue, 13 Jan 2026 13:39:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=46600789</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=46600789</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46600789</guid></item><item><title><![CDATA[New comment by haensi in "Ask HN: What did you read in 2025?"]]></title><description><![CDATA[
<p>I’m a Christian as well and spent a day in Oxford earlier this year. After spending some time at Magdalen College, I bought every book I could by C.S. Lewis and just finished Letters to Malcolm (on prayer) today.<p>His refreshingly honest take is very relatable, humorous, and encouraging.<p>I can highly recommend it if you’re interested in prayer life (and in how to use powerful formulations in letters).</p>
]]></description><pubDate>Fri, 26 Dec 2025 21:33:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=46396518</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=46396518</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46396518</guid></item><item><title><![CDATA[New comment by haensi in "OpenAI acquired AI training monitor Neptune"]]></title><description><![CDATA[
<p>Weights & Biases if they are serious and need performance and scale.</p>
]]></description><pubDate>Thu, 04 Dec 2025 14:42:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=46148177</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=46148177</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46148177</guid></item><item><title><![CDATA[New comment by haensi in "Translating My Grandfather's Biograpy"]]></title><description><![CDATA[
<p>Thanks for sharing this, I was not aware of it and I’m currently in the process of learning Hebrew, dealing with the intersection of (Jewish and Christian) culture and technology.<p>One example is a GPT version [1] fine tuned on texts of Sefaria [2].<p>The initiator was in direct contact with Sam Altman to kickstart it. (Personal communication).<p>He talks about this publicly [3]<p>[1]: <a href="https://github.com/Sefaria/AppliedAI">https://github.com/Sefaria/AppliedAI</a>
[2]: <a href="https://www.sefaria.org/texts" rel="nofollow">https://www.sefaria.org/texts</a>
[3]: <a href="https://m.youtube.com/watch?v=tZkJl2fk0rc" rel="nofollow">https://m.youtube.com/watch?v=tZkJl2fk0rc</a></p>
]]></description><pubDate>Sat, 23 Nov 2024 00:49:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=42218513</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=42218513</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42218513</guid></item><item><title><![CDATA[New comment by haensi in "Data Version Control"]]></title><description><![CDATA[
<p>There’s another thread from October 2022 on that topic.<p><a href="https://news.ycombinator.com/item?id=33047634">https://news.ycombinator.com/item?id=33047634</a><p>What makes DVC especially useful for MLOps? Aren’t MLflow or W&B solving that in a way that’s open source (the former) or that massively increases speed and scale (the latter)?<p>Disclaimer: I work at W&B.</p>
]]></description><pubDate>Sat, 19 Oct 2024 21:01:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=41890767</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=41890767</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41890767</guid></item><item><title><![CDATA[New comment by haensi in "Cerebras Inference: AI at Instant Speed"]]></title><description><![CDATA[
<p>Their CEO talks about this in the gradient dissent podcast [1]<p>[1]: <a href="https://m.youtube.com/watch?v=qNXebAQ6igs" rel="nofollow">https://m.youtube.com/watch?v=qNXebAQ6igs</a></p>
]]></description><pubDate>Tue, 27 Aug 2024 17:26:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=41370253</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=41370253</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41370253</guid></item><item><title><![CDATA[New comment by haensi in "Pharia-1-LLM: transparent and compliant"]]></title><description><![CDATA[
<p>Very interesting to see an LLM released with both its weights and its code base.
They also talk about tokenizer fertility in the HF model card [1]<p>[1]: <a href="https://huggingface.co/Aleph-Alpha/Pharia-1-LLM-7B-control" rel="nofollow">https://huggingface.co/Aleph-Alpha/Pharia-1-LLM-7B-control</a></p>
]]></description><pubDate>Mon, 26 Aug 2024 11:54:18 +0000</pubDate><link>https://news.ycombinator.com/item?id=41356267</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=41356267</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41356267</guid></item><item><title><![CDATA[New comment by haensi in "Ask HN: Book Recommendations in SF – Science for Kids and S for Adults?"]]></title><description><![CDATA[
<p>Thanks a lot, I will check those out. I love used books; I’ve already found so many treasures that were unknown to me before.</p>
]]></description><pubDate>Sun, 11 Aug 2024 18:36:28 +0000</pubDate><link>https://news.ycombinator.com/item?id=41218367</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=41218367</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41218367</guid></item><item><title><![CDATA[Ask HN: Book Recommendations in SF – Science for Kids and S for Adults?]]></title><description><![CDATA[
<p>Hi HN,<p>I’ll be visiting San Francisco in about 4 weeks and am hoping to get some book recommendations. Here’s what I’m looking for:<p>1. *For my kids (ages 3 & 5):* I’m particularly interested in science books for young children that are only available or easiest to find in the US. I’d love to bring back some unique and educational reads that might not be readily available in Germany.<p>2. *For myself:* I’m a machine learning engineer with interests in linguistics, philosophy, theology, natural history, and epistemology. If you know of any extraordinary books in these areas, especially those that are hard to find or are particularly renowned in the US, I’d love to hear your suggestions.<p>3. *Must-Reads:* Finally, if there are any other books—across any genre—that you consider a must-read, I’m all ears. Whether it’s fiction, non-fiction, or anything in between, I’m looking to expand my library with some great reads.<p>Also, any tips on where to buy these books in SF would be greatly appreciated!<p>Thanks in advance for your recommendations!</p>
<hr>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=41212701">https://news.ycombinator.com/item?id=41212701</a></p>
<p>Points: 1</p>
<p># Comments: 2</p>
]]></description><pubDate>Sat, 10 Aug 2024 22:27:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=41212701</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=41212701</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=41212701</guid></item><item><title><![CDATA[New comment by haensi in "An interview with AMD CEO Lisa Su about solving hard problems"]]></title><description><![CDATA[
<p>You made my day with woulda shoulda cuda.<p>I’m going to watch Finding NeMo now.</p>
]]></description><pubDate>Mon, 17 Jun 2024 17:54:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=40708772</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=40708772</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40708772</guid></item><item><title><![CDATA[New comment by haensi in "Ask HN: Any real-time human lang translation AI for video yet?"]]></title><description><![CDATA[
<p>You mean something like this?<p><a href="https://targum.video/" rel="nofollow">https://targum.video/</a></p>
]]></description><pubDate>Mon, 06 May 2024 20:53:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=40279389</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=40279389</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=40279389</guid></item><item><title><![CDATA[W&B Prompts]]></title><description><![CDATA[
<p>Article URL: <a href="https://wandb.ai/wandb/wb-announcements/reports/Introducing-W-B-Prompts--Vmlldzo0MTI4NjY5">https://wandb.ai/wandb/wb-announcements/reports/Introducing-W-B-Prompts--Vmlldzo0MTI4NjY5</a></p>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=35670591">https://news.ycombinator.com/item?id=35670591</a></p>
<p>Points: 1</p>
<p># Comments: 1</p>
]]></description><pubDate>Sat, 22 Apr 2023 20:25:10 +0000</pubDate><link>https://wandb.ai/wandb/wb-announcements/reports/Introducing-W-B-Prompts--Vmlldzo0MTI4NjY5</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=35670591</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35670591</guid></item><item><title><![CDATA[New comment by haensi in "Prompt engineering vs. blind prompting"]]></title><description><![CDATA[
<p>Thanks for this 101 article! The entire LLMOps field is developing so fast and is being defined as we speak.<p>Somehow, this time feels to me like the early days of computer science, when Don Knuth was barely known and a Turing Award was only known to Turing Award winners. I met Don Knuth in Palo Alto in March and we talked about LLMs. His take: “Vint Cerf told me he was underwhelmed when he asked the LLM to write a biography of Vinton Cerf.”<p>There are also tools being built and released for prompt engineering [1]. Full transparency: I work at W&B.<p>LangChain and other connecting elements will vastly increase the usability and combinations of different tools.<p>[1]: <a href="https://wandb.ai/wandb/wb-announcements/reports/Introducing-W-B-Prompts--Vmlldzo0MTI4NjY5" rel="nofollow">https://wandb.ai/wandb/wb-announcements/reports/Introducing-...</a></p>
]]></description><pubDate>Sat, 22 Apr 2023 20:21:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=35670558</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=35670558</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35670558</guid></item><item><title><![CDATA[Show HN: Track and Compare Audio Transcription with Whisper × W&B]]></title><description><![CDATA[
<p>Hi HN,<p>Based on Whisper [0] and Whisper.cpp [1], I created a comparison of transcription performance (quantitative metrics such as relative speed).<p>You can find the code in the Colab [2] and a blog post [3] containing a how-to guide and visualizations.<p>In the future, I'd love to add WER evaluation and visualizations based on ground-truth data.<p>Bonus: Normally you would log these results from Python to Weights & Biases, but there is a way to log even from C++ / the CLI by using `subprocess`.<p>Would love to know what you think of this comparison and what features / attributes you would like to see in a more sophisticated comparison.<p>Thanks!<p>[0]: <a href="https://news.ycombinator.com/item?id=32927360" rel="nofollow">https://news.ycombinator.com/item?id=32927360</a>
[1]: <a href="https://news.ycombinator.com/item?id=33877893" rel="nofollow">https://news.ycombinator.com/item?id=33877893</a>
[2]: <a href="https://colab.research.google.com/drive/1mXZUdIbvdNVOFRJaIhW-b8LfB17spOm1" rel="nofollow">https://colab.research.google.com/drive/1mXZUdIbvdNVOFRJaIhW...</a>
[3]: <a href="https://wandb.ai/hans-ramsl/gradient-dissent-transcription/reports/How-to-Track-and-Compare-Audio-Transcriptions-with-Whisper-and-Weights-Biases--VmlldzozNDc5OTg2" rel="nofollow">https://wandb.ai/hans-ramsl/gradient-dissent-transcription/r...</a></p>
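<p>The `subprocess` route mentioned above can be sketched roughly like this. A minimal Python sketch, assuming a local whisper.cpp binary called `./main`, an illustrative timing-line format, and a configured W&B account; the binary name, flags, and timing regex are assumptions, not the post's actual code.</p>

```python
import re
import subprocess

def parse_total_time_ms(output: str):
    """Extract a 'total time = 1234.56 ms' figure from CLI output.

    whisper.cpp's timing summary format varies by version, so this
    pattern is an illustrative assumption, not a stable contract.
    """
    m = re.search(r"total time\s*=\s*([\d.]+)\s*ms", output)
    return float(m.group(1)) if m else None

def transcribe_and_log(model_path: str, wav_path: str) -> None:
    """Run the whisper.cpp CLI and log its timing to Weights & Biases."""
    # Binary name and flags are assumptions; adjust to your build.
    proc = subprocess.run(
        ["./main", "-m", model_path, "-f", wav_path],
        capture_output=True, text=True,
    )
    total_ms = parse_total_time_ms(proc.stdout + proc.stderr)

    # Requires `pip install wandb` and `wandb login`; imported lazily
    # so the parsing helper stays usable without the dependency.
    import wandb
    run = wandb.init(project="whisper-cpp-comparison")
    run.log({"total_time_ms": total_ms})
    run.finish()
```

<p>Usage would look like `transcribe_and_log("models/ggml-base.en.bin", "samples/jfk.wav")` (hypothetical paths).</p>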
<hr>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=34789339">https://news.ycombinator.com/item?id=34789339</a></p>
<p>Points: 5</p>
<p># Comments: 0</p>
]]></description><pubDate>Tue, 14 Feb 2023 14:01:14 +0000</pubDate><link>https://wandb.ai/hans-ramsl/gradient-dissent-transcription/reports/How-to-Track-and-Compare-Audio-Transcriptions-with-Whisper-and-Weights-Biases--VmlldzozNDc5OTg2</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=34789339</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34789339</guid></item><item><title><![CDATA[New comment by haensi in "Ask HN: What's the best way to run whisper large on Apple Silicon?"]]></title><description><![CDATA[
<p>Do you mean something like Whisper.cpp? [0]
See the demo there.<p>[0]: <a href="https://github.com/ggerganov/whisper.cpp#whispercpp">https://github.com/ggerganov/whisper.cpp#whispercpp</a></p>
]]></description><pubDate>Tue, 14 Feb 2023 13:36:50 +0000</pubDate><link>https://news.ycombinator.com/item?id=34789014</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=34789014</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=34789014</guid></item><item><title><![CDATA[New comment by haensi in "Data Version Control"]]></title><description><![CDATA[
<p>Have you tested Weights & Biases Artifacts [1]?<p>It comes with a smart versioning approach: it detects the Δ between versions via checksums and can visualize the lineage.<p>You can also keep very large / sensitive data in your existing object store and link it by reference. [2]<p>Disclaimer: I work at W&B.<p>[1]: <a href="https://docs.wandb.ai/guides/data-and-model-versioning/model-versioning" rel="nofollow">https://docs.wandb.ai/guides/data-and-model-versioning/model...</a>
[2]: <a href="https://docs.wandb.ai/guides/artifacts/track-external-files#amazon-s3-gcs-references" rel="nofollow">https://docs.wandb.ai/guides/artifacts/track-external-files#...</a></p>
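<p>The checksum-based Δ detection mentioned above can be illustrated with a small stdlib sketch; this is a conceptual analogy only, not W&B's actual implementation.</p>

```python
import hashlib

def digest(data: bytes) -> str:
    """Content hash that serves as a file's version identity."""
    return hashlib.sha256(data).hexdigest()

def changed_files(old_manifest: dict, files: dict) -> list:
    """Return paths whose content checksum differs from the last version.

    Unchanged files keep their previous digest, so only the delta needs
    to be stored or uploaded; this mirrors the idea behind checksum-based
    artifact versioning.
    """
    return [
        path for path, data in files.items()
        if old_manifest.get(path) != digest(data)
    ]
```

<p>With the real library, the corresponding flow is roughly `artifact = wandb.Artifact(name, type="dataset")`, `artifact.add_file(path)`, `run.log_artifact(artifact)`, with unchanged files deduplicated per the checksum approach described in the comment.</p>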
]]></description><pubDate>Mon, 03 Oct 2022 06:59:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=33064329</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=33064329</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=33064329</guid></item><item><title><![CDATA[New comment by haensi in "The Two Cultures and The Scientific Revolution (1959) [pdf]"]]></title><description><![CDATA[
<p>Don Knuth further elaborates on this in his great book Things a Computer Scientist Rarely Talks About [0], [2]:<p>> The truth in fact is that C. P. Snow got it wrong by at least an order of magnitude — there are many more than two cultures. I think a lot of you know the Apple Macintosh ads telling us to “think different,” but people already do. From my own corner of the academic world, I know for example that physicists think different from mathematicians; mathematicians who do algebra think different from mathematicians who do geometry; both kinds of mathematicians think different from computer scientists who work on algorithms; and so on and so forth. People often decry this lack of unity in the knowledge of the world, but let’s face it: People are different. Vive la différence.<p>Fun fact: PG mentions the two cultures while writing about Knuth [1]<p>[0]: <a href="http://web.stanford.edu/group/cslipublications/cslipublications/pdf/1575863278.pdf" rel="nofollow">http://web.stanford.edu/group/cslipublications/cslipublicati...</a><p>[1]: <a href="http://www.paulgraham.com/knuth.html?viewfullsite=1" rel="nofollow">http://www.paulgraham.com/knuth.html?viewfullsite=1</a><p>[2]: <a href="https://en.wikipedia.org/wiki/Things_a_Computer_Scientist_Rarely_Talks_About" rel="nofollow">https://en.wikipedia.org/wiki/Things_a_Computer_Scientist_Ra...</a></p>
]]></description><pubDate>Sun, 06 Feb 2022 19:36:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=30236122</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=30236122</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=30236122</guid></item><item><title><![CDATA[New comment by haensi in "Ask HN: Those making $500/month on side projects in 2022 – Show and tell"]]></title><description><![CDATA[
<p>I currently teach computer science at a local high school. Topics range from algorithms and data structures to AI and societal impact.<p>I do it to encourage young people to use software to make a change in their own and others’ lives.<p>I currently make $500/month with about 90 min/week of effort. I already have the syllabus material ready, so I don’t necessarily have to invest additional time here.<p>Right now they’re writing an exam.</p>
]]></description><pubDate>Thu, 20 Jan 2022 07:35:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=30005909</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=30005909</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=30005909</guid></item><item><title><![CDATA[Ask HN: What cutting-edge technology do you use?]]></title><description><![CDATA[
<p>What technology do you use today that is far from mainstream? Think Tony Stark technology.
Examples are mobile phones / internet in the 80s, metal 3D printers, quantum ML, night vision devices, large language models (GPT-3) for personal communication, CRISPR, manned drones.</p>
<hr>
<p>Comments URL: <a href="https://news.ycombinator.com/item?id=29680957">https://news.ycombinator.com/item?id=29680957</a></p>
<p>Points: 50</p>
<p># Comments: 40</p>
]]></description><pubDate>Sat, 25 Dec 2021 06:21:11 +0000</pubDate><link>https://news.ycombinator.com/item?id=29680957</link><dc:creator>haensi</dc:creator><comments>https://news.ycombinator.com/item?id=29680957</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=29680957</guid></item></channel></rss>