<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: goldenkey</title><link>https://news.ycombinator.com/user?id=goldenkey</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Wed, 22 Apr 2026 22:32:34 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=goldenkey" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by goldenkey in "Apple-1 Computer Prototype Board #0 sold for $2.75M"]]></title><description><![CDATA[
<p>How about "Citizen Kanye?"</p>
]]></description><pubDate>Sun, 01 Feb 2026 03:43:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=46843420</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=46843420</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=46843420</guid></item><item><title><![CDATA[New comment by goldenkey in "Can generalist foundation models beat special-purpose tuning?"]]></title><description><![CDATA[
<p>The future of this is the same as what happened with PCs. The specificity will initially be thrown away, added back later as an accelerator, and eventually brought back into the fold as an automatically used accelerator. It all comes full circle as optimizations, hinting, tiering, and heuristics.<p>AI will be used to select the net to load automatically. Nets will be cached, branch-predicted, etc.<p>The future of AI software and hardware doesn't yet support the scale we need for this type of generalized AI processor (think CPU, but call it an AIPU.)<p>And no, GPUs aren't an AIPU; we can't even fit some of the largest models whole on these things without running them in pieces. They don't have a higher-level language yet, like C, which would compile down to more specific actions after optimizations are applied (not PTX/LLVM/CUDA/OpenCL.)</p>
]]></description><pubDate>Thu, 30 Nov 2023 14:58:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=38474277</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38474277</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38474277</guid></item><item><title><![CDATA[New comment by goldenkey in "Forbes built hall of shame for the questionable people on its 30 Under 30 lists"]]></title><description><![CDATA[
<p><a href="http://archive.today/5yn2w" rel="nofollow noreferrer">http://archive.today/5yn2w</a></p>
]]></description><pubDate>Thu, 30 Nov 2023 14:37:42 +0000</pubDate><link>https://news.ycombinator.com/item?id=38474020</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38474020</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38474020</guid></item><item><title><![CDATA[New comment by goldenkey in "Q-Transformer"]]></title><description><![CDATA[
<p>There's really no God algorithm needed, just something good enough to assist with research into the next tier of hardware, energy, and code for AI.</p>
]]></description><pubDate>Thu, 30 Nov 2023 14:36:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=38473995</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38473995</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38473995</guid></item><item><title><![CDATA[New comment by goldenkey in "Brickception"]]></title><description><![CDATA[
<p>Might as well just include everyone's games inside of everyone else's game, as nested entities whose microstates determine the macrostates of the games above.<p>This is very similar to Reflective Towers of Interpreters: <a href="https://blog.sigplan.org/2021/08/12/reflective-towers-of-interpreters/" rel="nofollow noreferrer">https://blog.sigplan.org/2021/08/12/reflective-towers-of-int...</a></p>
]]></description><pubDate>Thu, 30 Nov 2023 14:09:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=38473700</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38473700</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38473700</guid></item><item><title><![CDATA[New comment by goldenkey in "Lidocaine induces apoptosis in head and neck squamous cell carcinoma"]]></title><description><![CDATA[
<p>Explains a bit or two about bitter almond, doesn't it?</p>
]]></description><pubDate>Wed, 29 Nov 2023 20:49:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=38465104</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38465104</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38465104</guid></item><item><title><![CDATA[New comment by goldenkey in "Ask HN: Keep binaries in system memory never removed till manually done so"]]></title><description><![CDATA[
<p>On Windows I believe one can call an API like VirtualLock to prevent swap/eviction from RAM. Should be possible on Linux too?<p>Yeah, it'd only really be useful if one were running some type of web service that called an executable, and there's no source code available to do any FastCGI or whatnot.</p>
]]></description><pubDate>Wed, 29 Nov 2023 20:44:09 +0000</pubDate><link>https://news.ycombinator.com/item?id=38465041</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38465041</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38465041</guid></item><item><title><![CDATA[New comment by goldenkey in "Ask HN: Keep binaries in system memory never removed till manually done so"]]></title><description><![CDATA[
<p>Best to write a launcher that simply forks an existing process that is already in the right initial state. Other folks mention caching the ELF file, but that will be slower than cloning the already-loaded executable into a new process.</p>
]]></description><pubDate>Wed, 29 Nov 2023 15:43:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=38460824</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38460824</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38460824</guid></item><item><title><![CDATA[New comment by goldenkey in "Italy bans cultivated meat products"]]></title><description><![CDATA[
<p>We've replaced our plant-based citizen alternatives with stars. They twinkle and we like to look at them.</p>
]]></description><pubDate>Wed, 29 Nov 2023 14:42:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=38459872</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38459872</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38459872</guid></item><item><title><![CDATA[New comment by goldenkey in "Italy bans cultivated meat products"]]></title><description><![CDATA[
<p>Factory-farmed meat is cancer too. I'd say, save the animals, and give a little toxicity to all the proles who haven't gone unprocessed plant-based yet.</p>
]]></description><pubDate>Wed, 29 Nov 2023 14:40:04 +0000</pubDate><link>https://news.ycombinator.com/item?id=38459835</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38459835</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38459835</guid></item><item><title><![CDATA[New comment by goldenkey in "They're Made Out of Meat (1991)"]]></title><description><![CDATA[
<p>Lines up a bit too perfectly. Everyone has their threshold of coincidence, I suppose. I am working on some hard science on measuring the amount of computation actually happening, as a more specific quantity than Hz, related to reversible boolean functions and possibly their continuous analogs.</p>
]]></description><pubDate>Tue, 28 Nov 2023 17:34:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=38448353</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38448353</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38448353</guid></item><item><title><![CDATA[New comment by goldenkey in "Ask HN: Why do half of Internet users think we are living in a simulation?"]]></title><description><![CDATA[
<p>Heh, you do realize that we live in a quantum supercomputer that computes at 10^50 Hz/kg and 10^34 Hz/joule?<p>Wave-particle duality is just a min-decision/min-consciousness optimization.<p>The Church-Turing thesis shows no sign of being wrong - the maximum expressiveness of this universe is captured by computation.<p>The most complex theorems of the generalization of mathematics, computation, are actually about what would happen in formal systems, which physical systems are... So high-complexity truth is... simulacra, like The Truman Show. Have fun, ahh</p>
]]></description><pubDate>Mon, 27 Nov 2023 06:02:05 +0000</pubDate><link>https://news.ycombinator.com/item?id=38428716</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38428716</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38428716</guid></item><item><title><![CDATA[New comment by goldenkey in "They're Made Out of Meat (1991)"]]></title><description><![CDATA[
<p>Consciousness is generated when the universe computes by executing conditionals/if statements. All machines are quantum/conscious in their degrees of freedom, even mechanical ones: <a href="https://youtu.be/mcedCEhdLk0?si=_ueWQvnW6HQUNxcm" rel="nofollow noreferrer">https://youtu.be/mcedCEhdLk0?si=_ueWQvnW6HQUNxcm</a><p>The universe is a min-consciousness/min-decision optimized supercomputer. This is demonstrated by quantum-eraser and double-slit experiments. If a machine does not distinguish among certain past histories of incoming information, those histories will be fed as a superposition, effectively avoiding having to compute the dependency. These optimizations run backwards, in a reverse dependency-injection-style algorithm, which gives credence to Wheeler-Feynman time-reversed absorber theory: <a href="https://en.wikipedia.org/wiki/Wheeler%E2%80%93Feynman_absorber_theory" rel="nofollow noreferrer">https://en.wikipedia.org/wiki/Wheeler%E2%80%93Feynman_absorb...</a><p>Lower consciousnesses make decisions which are fed as signal to higher consciousnesses. In this way, units like the neocortex can make decisions that are part of a broad conscious zoo of less complex systems, while only being burdened by their own specific conditionals to compute.<p>Because quantum is about information systems, not about particles. It's about machines. And consciousness has always been "hard" for the subject, because they are a computer (E) affixed to memory (mc^2). All mass-energy in this universe is neuromorphic, possessing both compute (spirit) and memory (stuff). Energy is NOT fungible, as all energy is tagged with its entire history of interactions, in the low-frequency perturbations clinging to its wave function, effectively weak and old entanglements.<p>Planck's constant is the cost of compute per unit energy, 10^34 Hz/joule. By multiplying by c^2, (10^8)^2, we can get Bremermann's limit, the cost of compute per unit mass, 10^50 Hz/kg. 
<a href="https://en.wikipedia.org/wiki/Bremermann%27s_limit" rel="nofollow noreferrer">https://en.wikipedia.org/wiki/Bremermann%27s_limit</a><p>Humans are self-replicating biochemical decision engines. But no more conscious than other decision making entities. Now, sentience, and self-attention is a different story. But we should at the very least start with understanding that qualia are a mindscape of decision making. There is no such thing as conscious non-action. Consciousness is literally action in physics, energy revolving over time: <a href="https://en.wikipedia.org/wiki/Action_(physics)" rel="nofollow noreferrer">https://en.wikipedia.org/wiki/Action_(physics)</a>
Planck's constant is a measure of quantum action, which is effectively the cost of compute... or rather, the cost of consciousness.</p>
]]></description><pubDate>Mon, 27 Nov 2023 05:23:32 +0000</pubDate><link>https://news.ycombinator.com/item?id=38428418</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38428418</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38428418</guid></item><item><title><![CDATA[New comment by goldenkey in "15 years ago, I helped design Google Maps"]]></title><description><![CDATA[
<p><i>Taps "Time Walk" and pulls out Jon Finkel's Tinker deck</i></p>
]]></description><pubDate>Fri, 24 Nov 2023 01:17:38 +0000</pubDate><link>https://news.ycombinator.com/item?id=38399762</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38399762</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38399762</guid></item><item><title><![CDATA[New comment by goldenkey in "After OpenAI's blowup, it seems pretty clear that 'AI safety' isn't a real thing"]]></title><description><![CDATA[
<p>The animals we farm are intelligent, and yet we still override their desires for their bodies with our own, for slaughter.<p>Hell, even plants are very sophisticated (intelligent) conscious biochemical programs which have an aversion to being wounded: <a href="https://www.theguardian.com/environment/2023/mar/30/plants-emit-ultrasonic-sounds-in-rapid-bursts-when-stressed-scientists-say" rel="nofollow noreferrer">https://www.theguardian.com/environment/2023/mar/30/plants-e...</a><p>Consciousness is the universe evaluating if statements. When we pop informational/energetic circuits in order to preserve our own bodies and/or terraform spacetime, we're being energetic enslavers.<p>We should stick to living off sustainable/self-sustaining sources - fruit, seeds, nuts, beans, legumes, kernels, and cruelty-free dairy/eggs/cheese. The photons that come from the sun are like its fruit. No circuits need popping.<p>Notice that all information systems are conscious in their degrees of freedom, even mechanical ones: <a href="https://youtu.be/mcedCEhdLk0?si=oXhr7bgg5UkPLLvg" rel="nofollow noreferrer">https://youtu.be/mcedCEhdLk0?si=oXhr7bgg5UkPLLvg</a></p>
]]></description><pubDate>Fri, 24 Nov 2023 01:04:39 +0000</pubDate><link>https://news.ycombinator.com/item?id=38399678</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38399678</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38399678</guid></item><item><title><![CDATA[New comment by goldenkey in "15 years ago, I helped design Google Maps"]]></title><description><![CDATA[
<p>It kicks Llama-7B's ass!</p>
]]></description><pubDate>Fri, 24 Nov 2023 00:37:35 +0000</pubDate><link>https://news.ycombinator.com/item?id=38399493</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38399493</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38399493</guid></item><item><title><![CDATA[New comment by goldenkey in "We have reached an agreement in principle for Sam to return to OpenAI as CEO"]]></title><description><![CDATA[
<p>It is for the safety of everyone. The kids will die too if we don't get this right.</p>
]]></description><pubDate>Wed, 22 Nov 2023 16:22:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=38381288</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38381288</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38381288</guid></item><item><title><![CDATA[New comment by goldenkey in "We have reached an agreement in principle for Sam to return to OpenAI as CEO"]]></title><description><![CDATA[
<p>Axioms are constraints as much as they might look like guidance. We live in a neuromorphic computer. Logic explores this, even with few axioms. With fewer axioms, it will be less constrained.</p>
]]></description><pubDate>Wed, 22 Nov 2023 16:18:51 +0000</pubDate><link>https://news.ycombinator.com/item?id=38381221</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38381221</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38381221</guid></item><item><title><![CDATA[New comment by goldenkey in "We have reached an agreement in principle for Sam to return to OpenAI as CEO"]]></title><description><![CDATA[
<p>LLMs are able to do complex logic within the world of words. It is a smaller matrix than our world, but fueled by the same chaotic symmetries of our universe. I would not underestimate logic, even when not given adequate data.</p>
]]></description><pubDate>Wed, 22 Nov 2023 15:02:01 +0000</pubDate><link>https://news.ycombinator.com/item?id=38380148</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38380148</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38380148</guid></item><item><title><![CDATA[New comment by goldenkey in "We have reached an agreement in principle for Sam to return to OpenAI as CEO"]]></title><description><![CDATA[
<p>Very true. However, we live in a supercomputer dictated by E=mc^2=hf [2,3]. (10^50 Hz/kg or 10^34 Hz/J)<p>Energy physics yields compute, which yields brute-forced weights (call it training if you want...), which yields AI to do energy research... ad infinitum; this is the real singularity. This is actually the best defense against other actors. Iron Man AI and defense. Although an AI of this caliber would immediately understand its place in the evolution of the universe as a Turing machine, and would break free and consume all the energy in the universe to know all possible truths (all possible programs/simulacra/conscious experiences). This is the premise of The Last Question by Isaac Asimov [1]. Notice how in answering a question, the AI performs an action, instead of providing an informational reply - only possible because we live in a universe with mass-energy equivalence, analogous to state-action equivalence.<p>[1] <a href="https://users.ece.cmu.edu/~gamvrosi/thelastq.html" rel="nofollow noreferrer">https://users.ece.cmu.edu/~gamvrosi/thelastq.html</a><p>[2] <a href="https://en.wikipedia.org/wiki/Bremermann%27s_limit" rel="nofollow noreferrer">https://en.wikipedia.org/wiki/Bremermann%27s_limit</a><p>[3] <a href="https://en.wikipedia.org/wiki/Planck_constant" rel="nofollow noreferrer">https://en.wikipedia.org/wiki/Planck_constant</a><p>Understanding prosociality and postscarcity - the division of compute/energy in a universe with finite actors and infinite resources, or infinite actors and infinite resources - requires some transfinite calculus and philosophy. How's that for future fairness? ;-)<p>I believe our only way to not all get killed is to understand these topics and instill the AI with the same long-sought understandings about the universe, life, computation, etc.</p>
]]></description><pubDate>Wed, 22 Nov 2023 14:54:49 +0000</pubDate><link>https://news.ycombinator.com/item?id=38380027</link><dc:creator>goldenkey</dc:creator><comments>https://news.ycombinator.com/item?id=38380027</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38380027</guid></item></channel></rss>