<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: mingodad</title><link>https://news.ycombinator.com/user?id=mingodad</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 08 Apr 2026 19:18:55 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=mingodad" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by mingodad in "How to run Qwen 3.5 locally"]]></title><description><![CDATA[
<p>I'm still a bit confused, because it says "All uploads use Unsloth Dynamic 2.0", but then when looking at the available options, e.g. for 4 bits, there is:<p>IQ4_XS 5.17 GB, Q4_K_S 5.39 GB, IQ4_NL 5.37 GB, Q4_0 5.38 GB, Q4_1 5.84 GB, Q4_K_M 5.68 GB, UD-Q4_K_XL 5.97 GB<p>There is no explanation of what they are and what trade-offs they have, but the tutorial explicitly used Q4_K_XL with llama.cpp.<p>I'm using a Mac mini M4 with 16 GB, and so far my preferred model is Qwen3-4B-Instruct-2507-Q4_K_M, although it's a bit chatty; my test with Qwen3.5-4B-UD-Q4_K_XL shows it's a lot more chatty. I'm basically using it in chat mode for basic man-page-style questions.<p>I understand that each user has their own specific needs, but it would be nice to have a place with a list of typical models/hardware along with their common config parameters and memory usage.<p>Even on specific Reddit channels it's a bit of a nightmare: lots of talk but no concrete, clear config/usage examples.<p>I've been following this topic heavily for the last 3 months, and I see more confusion than clarification.<p>Right now I'm getting good cost/benefit results with the qwen CLI with a coder model in the cloud, and I'm watching constantly to see when a local model on affordable hardware with environment-friendly energy consumption arrives.</p>
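One rough way to compare these quant files yourself is bits per weight: file size in bits divided by parameter count. A minimal sketch, assuming a hypothetical ~9B-parameter model (the real parameter count depends on which model these files belong to, so the absolute numbers here are illustrative only):

```python
# Rough bits-per-weight comparison for quantized GGUF files.
# ASSUMPTION: 9e9 parameters is hypothetical, for illustration only.
N_PARAMS = 9e9

# File sizes (GB) as listed on the model page.
quants = {
    "IQ4_XS": 5.17,
    "Q4_K_S": 5.39,
    "IQ4_NL": 5.37,
    "Q4_0": 5.38,
    "Q4_1": 5.84,
    "Q4_K_M": 5.68,
    "UD-Q4_K_XL": 5.97,
}

for name, gb in sorted(quants.items(), key=lambda kv: kv[1]):
    bits = gb * 1e9 * 8        # file size in bits
    bpw = bits / N_PARAMS      # bits per weight
    print(f"{name:12s} {gb:5.2f} GB  ~{bpw:.2f} bits/weight")
```

More bits per weight generally means less quantization loss at the cost of memory; the "UD-" prefix appears to mark the Unsloth Dynamic variants, which is presumably why the tutorial singles them out.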
]]></description><pubDate>Sun, 08 Mar 2026 08:47:34 +0000</pubDate><link>https://news.ycombinator.com/item?id=47295692</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=47295692</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47295692</guid></item><item><title><![CDATA[New comment by mingodad in "Coccinelle: Source-to-source transformation tool"]]></title><description><![CDATA[
<p>There is also <a href="https://sparse.docs.kernel.org/" rel="nofollow">https://sparse.docs.kernel.org/</a>, which can be used to help refactor C code.</p>
]]></description><pubDate>Sun, 22 Feb 2026 11:41:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=47110227</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=47110227</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=47110227</guid></item><item><title><![CDATA[New comment by mingodad in "OpenAI declares 'code red' as Google catches up in AI race"]]></title><description><![CDATA[
<p>Interestingly enough, today I found that <a href="https://huggingface.co/openai/gpt-oss-20b" rel="nofollow">https://huggingface.co/openai/gpt-oss-20b</a>, when downloaded and run locally, answered as follows:<p>====<p>when have you been built?<p>I’m a GPT‑4‑based model that OpenAI released on March 14 2023.
The underlying training data goes up to early 2023, and the model was trained in the months leading up to that release. If you’re asking about the ChatGPT product that ships the model to users, it went live in November 2022 and has since received updates (GPT‑3.5, GPT‑4, etc.) that keep it current.<p>====<p>But when supposedly running it from <a href="https://huggingface.co/chat/models/openai/gpt-oss-20b" rel="nofollow">https://huggingface.co/chat/models/openai/gpt-oss-20b</a>:<p>====<p>when have you been built?<p>I’m a language model created by OpenAI. The current generation (GPT‑4) that powers this chat was first released in March 2023 and has been updated and fine‑tuned up through the end of 2024. My training data runs up to the beginning of June 2025, so I’m built on knowledge available up to that point.<p>====<p>And that makes me think that, although <a href="https://huggingface.co/chat" rel="nofollow">https://huggingface.co/chat</a> claims to be using the models available to the public at <a href="https://huggingface.co" rel="nofollow">https://huggingface.co</a>, it doesn't seem to be true, and I raised this question here <a href="https://huggingface.co/ggml-org/gpt-oss-20b-GGUF/discussions/4" rel="nofollow">https://huggingface.co/ggml-org/gpt-oss-20b-GGUF/discussions...</a>, <a href="https://github.com/huggingface/inference-playground/issues/102" rel="nofollow">https://github.com/huggingface/inference-playground/issues/1...</a> and <a href="https://github.com/ggml-org/llama.cpp/discussions/15396#discussioncomment-15136920" rel="nofollow">https://github.com/ggml-org/llama.cpp/discussions/15396#disc...</a>.</p>
]]></description><pubDate>Wed, 03 Dec 2025 02:17:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=46129528</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=46129528</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46129528</guid></item><item><title><![CDATA[New comment by mingodad in "So you wanna build a local RAG?"]]></title><description><![CDATA[
<p>While learning about LLMs and llama.cpp, I did an experiment: trying to create a Lua extension for the llama.cpp API to enhance LLMs with agent/RAG functionality written in Lua, with simple code to learn the basics. After more than 5 hours chatting with <a href="https://aistudio.google.com/prompts/new_chat?model=gemini-3-pro-preview" rel="nofollow">https://aistudio.google.com/prompts/new_chat?model=gemini-3-...</a> (see the scraped output of the whole session attached), I got quite far in terms of learning how to use an LLM to help develop/debug/learn about a topic (in this case agent/RAG with the llama.cpp API using Lua).<p>I'm posting it here in case it helps others to see it and comment on/improve it (it was using around 100K tokens at the end and started getting noticeably slow, but was still very helpful).<p>You can see the scraped text of the whole session here 
<a href="https://github.com/ggml-org/llama.cpp/discussions/17600" rel="nofollow">https://github.com/ggml-org/llama.cpp/discussions/17600</a></p>
]]></description><pubDate>Sat, 29 Nov 2025 19:23:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=46090034</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=46090034</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46090034</guid></item><item><title><![CDATA[New comment by mingodad in "AI documentation you can talk to, for every repo"]]></title><description><![CDATA[
<p>I asked it to index my project <a href="https://github.com/mingodad/parsertl-playground" rel="nofollow">https://github.com/mingodad/parsertl-playground</a>, and the result <a href="https://deepwiki.com/mingodad/parsertl-playground" rel="nofollow">https://deepwiki.com/mingodad/parsertl-playground</a> seems reasonably good (I'm still going through it in more detail, but overall it's impressive).</p>
]]></description><pubDate>Tue, 11 Nov 2025 11:41:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=45886236</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=45886236</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=45886236</guid></item><item><title><![CDATA[New comment by mingodad in "I write type-safe generic data structures in C"]]></title><description><![CDATA[
<p><a href="https://github.com/mingodad/cfront-3">https://github.com/mingodad/cfront-3</a></p>
]]></description><pubDate>Tue, 01 Jul 2025 08:50:48 +0000</pubDate><link>https://news.ycombinator.com/item?id=44431929</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=44431929</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44431929</guid></item><item><title><![CDATA[New comment by mingodad in "I write type-safe generic data structures in C"]]></title><description><![CDATA[
<p>Here I've got it working with recent compilers/OSs: <a href="https://github.com/mingodad/cfront-3">https://github.com/mingodad/cfront-3</a></p>
]]></description><pubDate>Tue, 01 Jul 2025 08:47:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=44431907</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=44431907</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44431907</guid></item><item><title><![CDATA[New comment by mingodad in "Ask HN: What are you working on? (May 2025)"]]></title><description><![CDATA[
<p>I'm building a collection of PEG grammars here <a href="https://mingodad.github.io/cpp-peglib" rel="nofollow">https://mingodad.github.io/cpp-peglib</a> and Yacc/Lex grammars here <a href="https://mingodad.github.io/parsertl-playground/playground" rel="nofollow">https://mingodad.github.io/parsertl-playground/playground</a>; both are wasm-based playgrounds to test/develop/debug grammars.<p>The idea is to improve the tooling for working with grammars, for example generating railroad diagrams, source, stats, state machines, traces, ...<p>On both of them, select a grammar from "Examples", then click "Parse" to see a parse tree or AST for the content in "Input source"; then edit the grammar/input to test new ideas.<p>There is also <a href="https://mingodad.github.io/plgh/json2ebnf.html" rel="nofollow">https://mingodad.github.io/plgh/json2ebnf.html</a> to generate EBNF for railroad diagram generation from tree-sitter grammars.<p>Any feedback or contribution is welcome!</p>
]]></description><pubDate>Sun, 25 May 2025 21:36:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=44091401</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=44091401</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44091401</guid></item><item><title><![CDATA[New comment by mingodad in "Show HN: SQLite JavaScript - extend your database with JavaScript"]]></title><description><![CDATA[
<p>There is also <a href="https://github.com/ricomariani/CG-SQL-author">https://github.com/ricomariani/CG-SQL-author</a>, which has powerful stored-procedure capabilities that can be transpiled to C/Lua/...; you can try it in your browser here <a href="https://mingodad.github.io/CG-SQL-Lua-playground" rel="nofollow">https://mingodad.github.io/CG-SQL-Lua-playground</a>.</p>
]]></description><pubDate>Thu, 22 May 2025 15:52:43 +0000</pubDate><link>https://news.ycombinator.com/item?id=44063278</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=44063278</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44063278</guid></item><item><title><![CDATA[New comment by mingodad in "Show HN: Vaev – A browser engine built from scratch (It renders google.com)"]]></title><description><![CDATA[
<p>There is also <a href="https://sciter.com/" rel="nofollow">https://sciter.com/</a>, whose author tried to raise funding to make it open source but couldn't find enough supporters.</p>
]]></description><pubDate>Mon, 19 May 2025 06:47:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=44027068</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=44027068</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=44027068</guid></item><item><title><![CDATA[New comment by mingodad in "Show HN: A Database Written in Golang"]]></title><description><![CDATA[
<p>I have a big collection of LALR(1) grammars to test/study/develop/document here <a href="https://mingodad.github.io/parsertl-playground/playground/" rel="nofollow">https://mingodad.github.io/parsertl-playground/playground/</a>, including sqlite, tidb, vitess, postgresql, mysql, ...</p>
]]></description><pubDate>Thu, 27 Feb 2025 09:36:58 +0000</pubDate><link>https://news.ycombinator.com/item?id=43192728</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=43192728</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43192728</guid></item><item><title><![CDATA[New comment by mingodad in "TinyCompiler: A compiler in a week-end"]]></title><description><![CDATA[
<p>Not exactly the same, but here <a href="https://github.com/robertoraggi/cplusplus">https://github.com/robertoraggi/cplusplus</a> there is one person's serious effort to create a C++23 compiler front end.</p>
]]></description><pubDate>Fri, 21 Feb 2025 16:56:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=43129874</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=43129874</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43129874</guid></item><item><title><![CDATA[New comment by mingodad in "TinyCompiler: A compiler in a week-end"]]></title><description><![CDATA[
<p>But if anyone otherwise wants to try with Yacc/Lex, I have an online interpreter/editor with more than 250 examples to try/study at <a href="https://mingodad.github.io/parsertl-playground/playground/" rel="nofollow">https://mingodad.github.io/parsertl-playground/playground/</a>, and I've just included a grammar for tinycompiler there (select "Tinycompiler parser" from "Examples", then click "Parse" to see a parse tree for the content in "Input source").</p>
]]></description><pubDate>Fri, 21 Feb 2025 12:17:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=43126656</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=43126656</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=43126656</guid></item><item><title><![CDATA[New comment by mingodad in "Servo's progress in 2024"]]></title><description><![CDATA[
<p>No one mentions Opera: before adopting WebKit/Blink it was lightweight and fast. Its source code even appeared in a few places after a leak.</p>
]]></description><pubDate>Thu, 06 Feb 2025 10:24:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=42960953</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=42960953</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42960953</guid></item><item><title><![CDATA[New comment by mingodad in "Supercharge SQLite with Ruby Functions"]]></title><description><![CDATA[
<p>Also available with Lua: <a href="http://lua.sqlite.org/index.cgi/doc/tip/doc/lsqlite3.wiki#db_create_function" rel="nofollow">http://lua.sqlite.org/index.cgi/doc/tip/doc/lsqlite3.wiki#db...</a></p>
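For comparison, the same pattern of registering a custom scalar SQL function exists in Python's stdlib sqlite3; the Lua lsqlite3 db:create_function call linked above is the analogous API. A minimal sketch:

```python
import sqlite3

# In-memory database for the demo.
con = sqlite3.connect(":memory:")

def reverse_text(s):
    """Scalar function callable from SQL."""
    return s[::-1]

# Register as SQL function "reverse" taking exactly 1 argument.
con.create_function("reverse", 1, reverse_text)

result = con.execute("SELECT reverse('hello')").fetchone()[0]
print(result)  # olleh
```

Once registered, the function can be used anywhere a built-in SQL function can, including in WHERE clauses and indexes on expressions.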
]]></description><pubDate>Mon, 27 Jan 2025 09:24:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=42839043</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=42839043</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42839043</guid></item><item><title><![CDATA[New comment by mingodad in "Some programming language ideas"]]></title><description><![CDATA[
<p>Using the same idea, there is <a href="https://datadraw.sourceforge.net/" rel="nofollow">https://datadraw.sourceforge.net/</a>, and <a href="https://github.com/google/rune">https://github.com/google/rune</a> uses it.<p>DataDraw is an ultra-fast persistent database for high performance programs written in C.  It's so fast that many programs keep all their data in a DataDraw database, even while being manipulated in inner loops of compute intensive applications.  Unlike slow SQL databases, DataDraw databases are compiled, and directly link into your C programs.  DataDraw databases are resident in memory, making data manipulation even faster than if they were stored in native C data structures (really).  Further, they can automatically support infinite undo/redo, greatly simplifying many applications.</p>
]]></description><pubDate>Thu, 09 Jan 2025 09:10:45 +0000</pubDate><link>https://news.ycombinator.com/item?id=42643344</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=42643344</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42643344</guid></item><item><title><![CDATA[New comment by mingodad in "Lua is so underrated"]]></title><description><![CDATA[
<p>But I implemented a warning/error for variable shadowing.</p>
]]></description><pubDate>Fri, 27 Dec 2024 13:42:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=42521975</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=42521975</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42521975</guid></item><item><title><![CDATA[New comment by mingodad in "Lua is so underrated"]]></title><description><![CDATA[
<p>No changes to "global by default", and no checks are made against type annotations.
The idea was to be able to convert any existing Lua code automatically and get it working as before, like I did with some non-trivial projects such as:<p>- <a href="https://github.com/mingodad/CorsixTH-ljs">https://github.com/mingodad/CorsixTH-ljs</a>
- <a href="https://github.com/mingodad/ZeroBraneStudioLJS">https://github.com/mingodad/ZeroBraneStudioLJS</a></p>
]]></description><pubDate>Fri, 27 Dec 2024 13:34:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=42521922</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=42521922</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42521922</guid></item><item><title><![CDATA[New comment by mingodad in "Lua is so underrated"]]></title><description><![CDATA[
<p>I've made changes to the Lua/LuaJIT lexer/parser to provide a C-like syntax, plus a program to convert Lua syntax to LJS syntax; you can see them in the repositories shown below. The C API and language semantics remain the same.<p>- <a href="https://github.com/mingodad/ljs">https://github.com/mingodad/ljs</a><p>- <a href="https://github.com/mingodad/ljs-5.4">https://github.com/mingodad/ljs-5.4</a><p>- <a href="https://github.com/mingodad/ljs-5.1">https://github.com/mingodad/ljs-5.1</a><p>- <a href="https://github.com/mingodad/ljsjit">https://github.com/mingodad/ljsjit</a><p>- <a href="https://github.com/mingodad/raptorjit-ljs">https://github.com/mingodad/raptorjit-ljs</a><p>- <a href="https://github.com/mingodad/CorsixTH-ljs">https://github.com/mingodad/CorsixTH-ljs</a><p>- <a href="https://github.com/mingodad/snabb-ljs">https://github.com/mingodad/snabb-ljs</a><p>- <a href="https://github.com/mingodad/ZeroBraneStudioLJS">https://github.com/mingodad/ZeroBraneStudioLJS</a></p>
]]></description><pubDate>Fri, 27 Dec 2024 13:14:52 +0000</pubDate><link>https://news.ycombinator.com/item?id=42521814</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=42521814</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42521814</guid></item><item><title><![CDATA[New comment by mingodad in "Removing global state from LLD, the LLVM linker"]]></title><description><![CDATA[
<p>Also, for C/C++ binaries with debug info, gdb is one of the tools that can show where and how many globals exist:<p>gdb -batch -ex "info variables" -ex quit --args binary-to-examine</p>
]]></description><pubDate>Thu, 21 Nov 2024 22:26:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=42209331</link><dc:creator>mingodad</dc:creator><comments>https://news.ycombinator.com/item?id=42209331</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=42209331</guid></item></channel></rss>