<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Hacker News: thw0rted</title><link>https://news.ycombinator.com/user?id=thw0rted</link><description>Hacker News RSS</description><docs>https://hnrss.org/</docs><generator>hnrss v2.1.1</generator><lastBuildDate>Sun, 12 Apr 2026 13:33:49 +0000</lastBuildDate><atom:link href="https://hnrss.org/user?id=thw0rted" rel="self" type="application/rss+xml"></atom:link><item><title><![CDATA[New comment by thw0rted in "Arm China Has Gone Rogue"]]></title><description><![CDATA[
<p>You know that's based on a comic from like half a century ago, right?</p>
]]></description><pubDate>Thu, 02 Sep 2021 12:54:54 +0000</pubDate><link>https://news.ycombinator.com/item?id=28391461</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28391461</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28391461</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>I'm not arguing against democracy: it's the worst system except all the other ones.  I didn't think much of Brexit; I think they really shafted themselves, but at the same time I get the strong impression that EU governance is hopelessly broken.<p>I don't have to bring a solution to notice that the system we have is not working.  (That's not to say I don't wish I had one, I just don't.)</p>
]]></description><pubDate>Wed, 11 Aug 2021 11:28:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=28140677</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28140677</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28140677</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>Maybe I'm just too jaded but I don't think "making voices heard" matters -- in the link I posted upthread, the overwhelming majority of voters did not want the Chat Control measure to pass, but it did anyway, "for the children".  (I can't even do that -- I'm an American living over here, I have no say in politics but am subject to a lot of their rules.)<p>Maybe we'll get lucky and the next vote will fail, or maybe if it passes there will be providers that refuse to comply.  I think if it happens, it's far more likely that most will cave, and a few will just pull the plug and stop offering service.</p>
]]></description><pubDate>Wed, 11 Aug 2021 08:15:10 +0000</pubDate><link>https://news.ycombinator.com/item?id=28139357</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28139357</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28139357</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>Your post I originally replied to said<p>> They've tried to do this for decades and have failed.... Let's see how voters like it.<p>My "point" is that I thought the same way you did -- look what a mess Clipper Chip was, they always want backdoors but surely a voice of reason will show up, etc -- but something has changed.  Couple the vote in the EU with the way the major tech companies reacted to GDPR (you'd be surprised how many sites simply block all of Europe rather than comply) and it's a wake-up call. There is a real chance of the bad guys winning here.</p>
]]></description><pubDate>Wed, 11 Aug 2021 07:53:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=28139261</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28139261</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28139261</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>If you think EU policy only impacts the EU, you didn't pay attention to what happened with GDPR.  Some companies might scan only EU-to-EU communications, some might scan communications where only one end is in the EU, and some might just scan everything because why build two completely separate systems rather than just doing whatever is compliant everywhere you operate?</p>
]]></description><pubDate>Tue, 10 Aug 2021 10:39:33 +0000</pubDate><link>https://news.ycombinator.com/item?id=28127162</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28127162</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28127162</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>If you care about the issue, this post[1] does a pretty good job explaining what people are worried about.<p>1: <a href="https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html" rel="nofollow">https://www.hackerfactor.com/blog/index.php?/archives/929-On...</a></p>
]]></description><pubDate>Tue, 10 Aug 2021 08:41:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126476</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126476</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126476</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>If you believe this, then you don't understand what their system can do.</p>
]]></description><pubDate>Tue, 10 Aug 2021 08:35:56 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126441</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126441</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126441</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>Just to be clear, the EU already voted to <i>allow</i> snooping, the law "currently in the works" is to <i>require</i> it:<p><a href="https://www.patrick-breyer.de/en/posts/message-screening/?lang=en" rel="nofollow">https://www.patrick-breyer.de/en/posts/message-screening/?la...</a></p>
]]></description><pubDate>Tue, 10 Aug 2021 08:32:13 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126412</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126412</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126412</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>Bad news, buddy: <a href="https://www.patrick-breyer.de/en/posts/message-screening/?lang=en" rel="nofollow">https://www.patrick-breyer.de/en/posts/message-screening/?la...</a><p>ETA: in short, about a month ago they did get the votes, at least in the EU, and it's now "allowed" for providers to scan all content.  In a little while, they're going to have a vote to change "allowed" to "required", and we have no reason to think it'll go differently.</p>
]]></description><pubDate>Tue, 10 Aug 2021 08:30:44 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126409</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126409</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126409</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>Does that 30k number include iCloud Photo data?  Do you have a citation for this?</p>
]]></description><pubDate>Tue, 10 Aug 2021 08:27:08 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126389</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126389</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126389</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>I posted more detail upthread but what I've found suggests that Apple does have a key to decrypt pictures but they claim to use it only to respond to a warrant.  (They could of course be lying about that, but I don't believe they are.)</p>
]]></description><pubDate>Tue, 10 Aug 2021 08:25:55 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126384</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126384</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126384</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>From everything that I've read, iCloud Photo Library is currently encrypted on the server, with a key that Apple only uses when presented with a warrant.  If I ran the company (disclaimer: I do not) I'd implement this with an airgapped system in a vault somewhere, where a very small number of people have access to bring encrypted images in on a CD-R under two-person control.<p>That being said, one of two things is true.  Either Apple does exactly what they say, in which case they are <i>not</i> able to perform server-side content / fingerprint scanning, or Apple is outright lying about only using their key on behalf of law enforcement.  This latter case would open them to all sorts of legal liabilities, like a suit from shareholders for false reports.  It would also require the silence of every Apple engineer who has ever been involved in at least their iCloud Photo program, and probably a bunch of their server-infrastructure people as well.  Additionally, they'd be legally obligated to report their scan results to the NCMEC but would have to do so in a way that doesn't give away that they're lying about how their systems work.</p>
]]></description><pubDate>Tue, 10 Aug 2021 08:23:14 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126369</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126369</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126369</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>...<i>if</i> a human actually gets the file, figures out what type it is, and examines it for themselves, they'd be obligated to report it.  With the number of Win10 devices in the world, how big would their security team have to be to hand-groom every automatically submitted "suspicious" sample?  (For that matter, why would a vanilla JPG get flagged as "suspicious" in the first place?)</p>
]]></description><pubDate>Tue, 10 Aug 2021 08:13:24 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126291</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126291</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126291</guid></item><item><title><![CDATA[New comment by thw0rted in "WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan"]]></title><description><![CDATA[
<p>Tweens, yes.  Teens can be opted into the feature but it only offers the (teen) user a warning before viewing the image, it never notifies the parent.</p>
]]></description><pubDate>Tue, 10 Aug 2021 08:08:17 +0000</pubDate><link>https://news.ycombinator.com/item?id=28126270</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28126270</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28126270</guid></item><item><title><![CDATA[New comment by thw0rted in "Expanded Protections for Children"]]></title><description><![CDATA[
<p>Well, since Talking About Statistics Is Hard, you have to look at the exact phrasing from the website: it "ensures less than a one in one trillion chance per year of incorrectly flagging a given account".  So, each of those billion accounts has a 1:1T chance of false positive.  If I remember my stat 101 correctly, that should translate to a 1:1000 chance of having at least 1 false positive on the planet during any given year.  (And remember, even a false positive just means that a human reviews your photos, not that you get reported to the police.)</p>
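<p>For what it's worth, that back-of-envelope figure holds up; here's a quick sketch of the arithmetic, assuming (as above) roughly a billion accounts and Apple's stated one-in-one-trillion per-account yearly rate, with false flags treated as independent across accounts:</p>

```python
# Hypothetical sanity check of the "1 in 1000" figure above.
# Assumptions (mine, not Apple's, beyond the stated rate): ~1 billion
# iCloud accounts, and an independent 1-in-1-trillion chance per year
# that any given account is incorrectly flagged.
p_account = 1e-12
n_accounts = 1_000_000_000

# P(at least one false positive anywhere) = 1 - P(no account is flagged)
p_any = 1 - (1 - p_account) ** n_accounts
print(p_any)  # about 0.001, i.e. roughly 1 in 1000 per year
```

<p>(Equivalently, with an expected count of 1e-12 * 1e9 = 0.001, the probability of at least one event is approximately that same 0.001, since the expected count is so small.)</p>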
]]></description><pubDate>Mon, 09 Aug 2021 14:36:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=28117406</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28117406</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28117406</guid></item><item><title><![CDATA[New comment by thw0rted in "Expanded Protections for Children"]]></title><description><![CDATA[
<p>I think your take is correct but doesn't answer the question about why this matching has to take place on the device, if it's only for photos that are going into iCloud, and the iCloud contents are already being stored unencrypted.<p>The only remotely plausible answer I've seen is that Apple wants to keep potentially-violating material out of their general storage, and flagged images are being sent to the review team <i>instead</i> of regular backup, but that's a pretty weak guess.</p>
]]></description><pubDate>Mon, 09 Aug 2021 14:15:07 +0000</pubDate><link>https://news.ycombinator.com/item?id=28117103</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28117103</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28117103</guid></item><item><title><![CDATA[New comment by thw0rted in "Expanded Protections for Children"]]></title><description><![CDATA[
<p>I still don't understand how there are people on HN who think that giving their kids <i>less</i> access to technology is somehow a virtuous position to take.  When I was the same age as my kids I could have gotten into all sorts of shit on a BBS or Compuserve forum -- my parents had no idea what was going on, but they'd given me a basic sense of right and wrong, and somebody to talk to if I was concerned.  You've got to educate them about the world, but cutting them off from it is not the way to do that.</p>
]]></description><pubDate>Mon, 09 Aug 2021 14:12:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=28117068</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28117068</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28117068</guid></item><item><title><![CDATA[New comment by thw0rted in "Expanded Protections for Children"]]></title><description><![CDATA[
<p>Dumb question: do we know from what they've released publicly if it will be possible for security researchers to snag a copy of the database, perform the same perceptual hash algorithm on a given image, and determine if there's a "hit", without violating some kind of license term?<p>Perhaps the community could run a crowdsourced "keep them honest" web service -- upload the latest illegal-in-China Winnie-the-Pooh meme, oh hey look at that, it's in the China-only version of the database, isn't that weird, etc etc.  (Obviously you wouldn't want people "testing" images that are in the database for the actual stated purpose...)</p>
]]></description><pubDate>Mon, 09 Aug 2021 13:57:26 +0000</pubDate><link>https://news.ycombinator.com/item?id=28116879</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28116879</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28116879</guid></item><item><title><![CDATA[New comment by thw0rted in "Expanded Protections for Children"]]></title><description><![CDATA[
<p>If the software is scanning everything, and they were already scanning uploaded content, why have a press release at all?  If you're deploying an unrestricted panopticon to all your devices, why on earth would you go to the trouble of <i>announcing</i> it in the first place?</p>
]]></description><pubDate>Mon, 09 Aug 2021 13:51:22 +0000</pubDate><link>https://news.ycombinator.com/item?id=28116805</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28116805</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28116805</guid></item><item><title><![CDATA[New comment by thw0rted in "Expanded Protections for Children"]]></title><description><![CDATA[
<p>"It is an effective deterrent" to <i>using this one specific platform to distribute</i> CSAM.  The problem with this solution is the exact same problem with the tired old "solution" to E2E encryption that gets trotted out every couple of months.  If you add monitoring to the tool that criminals are using -- especially, <i>especially</i> if the company loudly and publicly announces that they are adding monitoring! -- you will, at best, catch a few of the very dumbest possible criminals, while the rest move on to one of countless available non-monitored tools.</p>
]]></description><pubDate>Mon, 09 Aug 2021 13:46:53 +0000</pubDate><link>https://news.ycombinator.com/item?id=28116745</link><dc:creator>thw0rted</dc:creator><comments>https://news.ycombinator.com/item?id=28116745</comments><guid isPermaLink="false">https://news.ycombinator.com/item?id=28116745</guid></item></channel></rss>