<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: dellamonica</title><link>https://news.ycombinator.com/user?id=dellamonica</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Thu, 07 May 2026 00:19:28 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=dellamonica" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by dellamonica in "How to optimally trap points in high-dimensional spaces inside ellipsoids"]]></title><description><![CDATA[
<p>The ellipse can also be encoded by just the lengths along each axis together with a rotation of R^n (an orthogonal matrix multiplication). So in essence, for the problem in the original post there are three pieces of information needed: a translation, a rotation, and the deformations of the unit ball along the axes.</p>
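A minimal 2-D sketch of that decomposition (pure Python; the function name and parameters are mine, purely illustrative): to test membership, undo the translation, undo the rotation, then compare against the axis-aligned deformations of the unit ball.

```python
import math

def inside_ellipse(x, y, cx, cy, theta, a, b):
    """Check whether (x, y) lies in the ellipse centered at (cx, cy),
    rotated by angle theta, with semi-axis lengths a and b."""
    # Undo the translation...
    dx, dy = x - cx, y - cy
    # ...then undo the rotation (multiply by the inverse rotation matrix)...
    u = math.cos(theta) * dx + math.sin(theta) * dy
    v = -math.sin(theta) * dx + math.cos(theta) * dy
    # ...and test against the axis-aligned ellipse u^2/a^2 + v^2/b^2 <= 1.
    return (u / a) ** 2 + (v / b) ** 2 <= 1.0
```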
]]></description><pubDate>Fri, 23 Feb 2024 10:38:02 +0000</pubDate><link>https://news.ycombinator.com/item?id=39479053</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=39479053</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39479053</guid></item><item><title><![CDATA[New comment by dellamonica in "How to optimally trap points in high-dimensional spaces inside ellipsoids"]]></title><description><![CDATA[
<p>Every ellipse can be encoded by the matrix A (and the geometric concept generalizes to arbitrary dimensions).<p>Not sure I follow the physics analogy though. A unit ball is the special case of an ellipse where A is the identity matrix. Perhaps the entries of A would be the atoms in this case, since they uniquely shape it?</p>
]]></description><pubDate>Fri, 23 Feb 2024 00:24:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=39475275</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=39475275</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39475275</guid></item><item><title><![CDATA[New comment by dellamonica in ".NET on Linux: What a Contrast"]]></title><description><![CDATA[
<p>Yes, if you don't use dotnet and are already used to Unix tooling then it is a tough sell, but otherwise the integration with dotnet is quite good.</p>
]]></description><pubDate>Sat, 10 Feb 2024 17:41:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=39328319</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=39328319</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39328319</guid></item><item><title><![CDATA[New comment by dellamonica in ".NET on Linux: What a Contrast"]]></title><description><![CDATA[
<p>I mean, the entire dotnet ecosystem is available, so you can do anything in PS, though obviously that is not always the smart call.<p>It has been very useful to me as a REPL for my own C# libraries: I can instantiate and use types from those libraries and interface with the file system and network on an ad hoc basis.</p>
]]></description><pubDate>Sat, 10 Feb 2024 17:26:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=39328159</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=39328159</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39328159</guid></item><item><title><![CDATA[New comment by dellamonica in ".NET on Linux: What a Contrast"]]></title><description><![CDATA[
<p>What is wrong with PowerShell Core?<p>*PS Core is the edition based on the modern versions of dotnet.</p>
]]></description><pubDate>Sun, 04 Feb 2024 12:15:15 +0000</pubDate><link>https://news.ycombinator.com/item?id=39249649</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=39249649</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=39249649</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>It's rather difficult to provide a good formalization, but let me give it a shot.<p>Suppose that mathematicians write papers with pen and paper in a subset of natural language without ambiguity (you wish!). What they write as proofs can be checked thoroughly for correctness by some other human (the referee), otherwise it will not be published (again, we wish!).<p>Now, for a paper with N tokens (or chars, or whatever measure of length), how much computation can the referee reasonably dedicate to the task of checking that proof? Unless you believe something special about the human brain, I'd claim that time polynomial in N is a reasonable assumption. This means that checking is poly(N) for human-checkable proofs of length N.<p>This is my argument that nondeterministic Turing machines would absolutely crunch through everything we can do in the current model of mathematical research.<p>The class of instances is that of theorems for which a human could write a handmade proof and have it verified by another human. Recall that we are discussing formalization and what AI could achieve in this space, so I think formulating what we <i>currently</i> can achieve is a nice contrast with what could come in the future.</p>
]]></description><pubDate>Mon, 11 Dec 2023 14:09:29 +0000</pubDate><link>https://news.ycombinator.com/item?id=38600748</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38600748</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38600748</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>First of all, thank you for a thorough response. I'll need to take time to read it (and the refs in your other reply) with the care it deserves.<p>Basically I'm talking about the subset of proofs that could be done with pen and paper and pass a careful refereeing process. And you got that interpretation in your reply. You say that it is not insightful, and that is of course an opinion you are entitled to.<p>To me it is an interesting observation that NP is a complexity class powerful enough to basically obsolete us. The context of this conversation/thread is using Lean to formalize a proof and whether, in the near future, integrating AI models would give us novel tools for discovering new mathematics. We routinely solve NP-hard problems in practice for many instances, so I think drawing this parallel here was relevant.<p>I do agree with you that there would still be out-of-reach proofs even with an efficient algorithm for NP but, as someone else replied, that would be a nice problem to have...</p>
]]></description><pubDate>Mon, 11 Dec 2023 13:57:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=38600663</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38600663</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38600663</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>No misunderstanding about NP here, for sure. As I said, this is about as much of a thesis as Church-Turing is about what can be computed.<p>I have no clue about CiC, Lean and whatnot. It was never my field, and I don't doubt there can be some very weird things you can do with fancy logic models that are rarely discussed by non-logicians.<p>What I'm claiming is that anything a human could prove without a computer could be done by a nondeterministic machine in polynomial time, under a very reasonable assumption about proof length. I'm baking in the assumption of proof length, so I can claim something about NP.<p>Let me put it this way: if you can write a paper with your proof and I can check it with only my brain, without a computer helping me, then a polynomial-time algorithm could also check that proof. Unless you believe something exotic about how the brain computes, how would that not be the case?</p>
]]></description><pubDate>Sun, 10 Dec 2023 13:41:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=38591460</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38591460</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38591460</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>Right, and this is also the current status of handmade mathematics. All we know is that we have not found a proof yet with everything that has been tried. This typically means that a problem is considered harder the more stuff has been thrown at it and failed.<p>A nondeterministic Turing machine would absolutely crunch through math theorems; I really don't know why there is so much pushback against what I am stating. Basically, if you had such a machine, there would essentially no longer be any room left for proving stuff the manual way: you would get completely outclassed.<p>The real consequence for me, though, is that this is strong evidence (call it faith) against P=NP.</p>
]]></description><pubDate>Sun, 10 Dec 2023 11:46:59 +0000</pubDate><link>https://news.ycombinator.com/item?id=38590914</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38590914</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38590914</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>Could you give me a reference? This is not something I'm familiar with.<p>Can you claim that this equivalence proof is not in NP without requiring this specific encoding? I would be very surprised to learn that there is no encoding in which such proofs can be checked efficiently.</p>
]]></description><pubDate>Sun, 10 Dec 2023 11:36:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=38590869</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38590869</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38590869</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>Then in that target language, found by a clever human, you could do the same type of enumeration...<p>My whole point is that humans simply cannot process/create by themselves any truly long proof (we can obviously create a process for that). Therefore enumeration puts everything achievable by humans inside the complexity class NP. Feel free to disagree; this is not so much a theorem as a thesis.</p>
]]></description><pubDate>Sat, 09 Dec 2023 12:40:16 +0000</pubDate><link>https://news.ycombinator.com/item?id=38581482</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38581482</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38581482</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>This is all very interesting, but it seems that we're just taking different views on what the instance size is. If it is the length of the theorem statement in some suitable encoding and the goal is to find a proof of any possible length, then yeah, this is way too hard.<p>I'm taking the view that the (max) length of the proof can be taken as a parameter for the complexity, because anything too long would have no chance of being found by a human. It may also not be trusted by mathematicians anyway... do you know whether the hardware is bug-free, the compiler is 100% correct, and no cosmic ray corrupted some part of your exponential-length proof? It's a tough sell.</p>
]]></description><pubDate>Fri, 08 Dec 2023 22:26:30 +0000</pubDate><link>https://news.ycombinator.com/item?id=38575488</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38575488</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38575488</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>Of course it would; you would enumerate lengths too. If the lengths need to grow beyond any polynomial bound, then we can be sure such a proof would never be found by a human anyway.</p>
]]></description><pubDate>Fri, 08 Dec 2023 21:23:37 +0000</pubDate><link>https://news.ycombinator.com/item?id=38574623</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38574623</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38574623</guid></item><item><title><![CDATA[New comment by dellamonica in "'A-team' of math proves a critical link between addition and sets"]]></title><description><![CDATA[
<p>It doesn't require anything like that.<p>Finding a bounded-length math proof is a problem in NP: given access to a nondeterministic Turing machine, you could enumerate all possible proofs up to a given length and check each one in polynomial time.<p>That does not say anything about LLMs though. Personally, I believe they could be quite helpful to mathematicians, in a way similar to Copilot for software programming.</p>
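A toy illustration of the guess-and-check structure (subset-sum stands in for "theorem" and a certificate for "proof"; all names here are mine, nothing standard): verification runs in polynomial time, and a deterministic machine can simulate the nondeterministic guess by brute-force enumeration.

```python
from itertools import product

def verify(nums, target, certificate):
    """Polynomial-time check: does the certificate (one bit per number)
    select a subset of nums summing to target?"""
    return sum(x for x, bit in zip(nums, certificate) if bit) == target

def exists_proof(nums, target):
    """Deterministic simulation of the nondeterministic machine:
    enumerate every candidate certificate and run the cheap check.
    A true NDTM would 'guess' a valid certificate in a single branch."""
    return any(verify(nums, target, cert)
               for cert in product((0, 1), repeat=len(nums)))
```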
]]></description><pubDate>Fri, 08 Dec 2023 20:46:19 +0000</pubDate><link>https://news.ycombinator.com/item?id=38574102</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38574102</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38574102</guid></item><item><title><![CDATA[New comment by dellamonica in "An easy-sounding problem yields numbers too big for our universe"]]></title><description><![CDATA[
<p>Without digging too much, I don't think such an argument could be made by this paper. A non-trivial lower bound on a concrete problem in a general model of computation would be a marvel on its own.<p>As another example of what I mean, take sorting. There is an Omega(n log n) lower bound that applies to comparison-based algorithms, but no lower bound beyond the trivial Omega(n) is known for general computation.<p>It really boils down to how you define your search space. If you encode your input in a way that is already exponential in some parameter n, then even a linear algorithm would take at least exp(n) time just to read the input.<p>Say you have an algorithm, encoded in L characters, that can produce these extremely long paths. A natural question is then: for any two vectors of n k-bit integers, what is the complexity, in terms of nk and L, of determining whether they are connected? Note that I don't need to generate/read a huge state graph; I could do something smart by understanding the algorithm.</p>
]]></description><pubDate>Tue, 05 Dec 2023 16:09:20 +0000</pubDate><link>https://news.ycombinator.com/item?id=38532704</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38532704</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38532704</guid></item><item><title><![CDATA[New comment by dellamonica in "An easy-sounding problem yields numbers too big for our universe"]]></title><description><![CDATA[
<p>It might be possible to compute whether the start and end states are connected without constructing the actual path. As usual, non-trivial lower bounds on computation are basically nonexistent.<p>As an example, we can determine whether a number is composite (not prime) without computing its factors.</p>
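For instance, a Miller-Rabin sketch in Python (the fixed base set here is an arbitrary choice for illustration, not a claim about the deterministic bounds) certifies compositeness without ever exhibiting a factor:

```python
def is_probably_composite(n, bases=(2, 3, 5, 7)):
    """Miller-Rabin: a witness proves n composite without yielding a factor.
    Never wrong about primes; may miss rare strong pseudoprimes."""
    if n < 4:
        return False
    if n % 2 == 0:
        return True
    # Write n - 1 = d * 2^r with d odd.
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for a in bases:
        if a % n == 0:
            continue
        x = pow(a, d, n)  # three-argument pow = fast modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return True  # base a witnesses that n is composite
    return False  # n is prime, or a strong pseudoprime to every base tried
```

Note that 561 = 3 * 11 * 17 is a Carmichael number, so the plain Fermat test fails on it, yet Miller-Rabin still flags it without factoring.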
]]></description><pubDate>Mon, 04 Dec 2023 23:33:23 +0000</pubDate><link>https://news.ycombinator.com/item?id=38524774</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=38524774</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=38524774</guid></item><item><title><![CDATA[New comment by dellamonica in "*@gmail.com"]]></title><description><![CDATA[
<p>It does though. My Gmail account has a dot. For some reason, someone with a name similar to mine must have believed his address was the non-dotted version of mine, and to this day I keep getting emails addressed to this other person... and yes, it is a real person whom I've managed to contact.<p>The point is that without the dots rule I'd never get those emails, and the senders would get their messages bounced back right away.</p>
]]></description><pubDate>Thu, 31 Aug 2023 13:36:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=37336941</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=37336941</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=37336941</guid></item><item><title><![CDATA[New comment by dellamonica in "Native AOT Overview"]]></title><description><![CDATA[
<p>There has been a push for using Source Generators to move stuff that relies on reflection to compile-time code generation. JSON serialization is (mostly) supported in this mode, with perf advantages as well.<p>This does not address pre-existing packages, obviously. To make them work with AOT you might need to include an XML file (rd.xml) that lists all the types that could be dynamically created by your code. It is quite brittle: AFAIK everything might appear to work until you get a runtime failure because something you did not include ends up being constructed.</p>
]]></description><pubDate>Fri, 26 May 2023 14:32:27 +0000</pubDate><link>https://news.ycombinator.com/item?id=36084678</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=36084678</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36084678</guid></item><item><title><![CDATA[New comment by dellamonica in "Native AOT Overview"]]></title><description><![CDATA[
<p>It can get really tricky: using reflection you could read a string from any input and create a generic type instantiation that never appears in the source code. How would the code for that type be present in an AOT scenario?<p>There are also several optimizations that can be made in AOT-compiled code that are not allowed when you enable unrestricted reflection or dynamically loaded code. As an example, suppose an interface is implemented by just one type in your code. If you AOT compile it, the compiler can safely assume that all calls to methods on that interface go to that single implementing type, so it can replace interface dispatches with direct calls.</p>
]]></description><pubDate>Fri, 26 May 2023 14:24:41 +0000</pubDate><link>https://news.ycombinator.com/item?id=36084575</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=36084575</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=36084575</guid></item><item><title><![CDATA[New comment by dellamonica in "How randomness can improve algorithms"]]></title><description><![CDATA[
<p>There are lots of techniques that use randomness to show the existence of objects with desired properties. Some of them rely on the "first moment" (expectation), which seems to be what you are describing. These tend to be the simpler ones.<p>A good book on the topic is The Probabilistic Method by Alon and Spencer.<p>To give you some hint of what randomness can get you, look at the most typical random graph model, which includes each possible edge independently with probability p. With overwhelming probability, this graph is <i>extremely</i> uniform, meaning that in every vertex set you look at (other than the very tiny ones) the density of edges is very close to p. Actually constructing such a graph, either symbolically or by an algorithm, is a very hard problem, and even the best known results don't get us close to the true random model.</p>
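A quick simulation of that concentration (the values of n, p, the seed, and the subset size are arbitrary choices for illustration):

```python
import random
from itertools import combinations

random.seed(0)
n, p = 200, 0.3

# Sample G(n, p): each of the C(n, 2) possible edges is included
# independently with probability p.
edges = {e for e in combinations(range(n), 2) if random.random() < p}

def density(S):
    """Fraction of the possible edges inside vertex set S that are present."""
    pairs = list(combinations(sorted(S), 2))
    return sum(e in edges for e in pairs) / len(pairs)

# With overwhelming probability, every large-ish subset has density close to p.
sample = random.sample(range(n), 100)
```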
]]></description><pubDate>Wed, 05 Apr 2023 12:22:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=35452883</link><dc:creator>dellamonica</dc:creator><comments>https://news.ycombinator.com/item?id=35452883</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=35452883</guid></item></channel></rss>