
WhatsApp won't use Apple's child abuse image scanner


Just because Apple has a plan — and a forthcoming security feature — designed to combat the spread of child sex abuse images, that doesn't mean everyone's getting on board.

WhatsApp boss Will Cathcart joined the chorus of Apple critics on Friday, stating in no uncertain terms that the Facebook-owned messaging app won't be adopting this new feature once it launches. Cathcart then went on to lay out his concerns about the machine learning-driven system in a sprawling thread.

"This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control," Cathcart wrote midway through the thread. "Countries where iPhones are sold will have different definitions on what is acceptable."



While WhatsApp's position on the feature itself is clear enough, Cathcart's thread focuses mostly on raising hypothetical scenarios that suggest where things could go wrong with it. He wants to know if and how the system will be used in China, "what will happen when" spyware companies exploit it, and how error-proof it really is.

The thread amounts to an emotional appeal. It isn't terribly helpful for those who might be seeking information on why Apple's announcement raised eyebrows. Cathcart parrots some of the top-level talking points raised by critics, but the approach is more provocative than informative.


As Mashable reported on Thursday, one piece of the forthcoming security update uses a proprietary technology called NeuralHash to generate a hash — a signature, basically — for each image file and check it against the hashes of known Child Sex Abuse Material (CSAM). All of this happens before a photo gets stored in iCloud Photos, and Apple can't look at or do a thing unless the hash check sets off alarms.
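For readers who want a concrete sense of how this kind of hash matching works in general, here is a minimal, hypothetical sketch. It is not Apple's implementation: NeuralHash is a proprietary perceptual hash, so an ordinary SHA-256 digest stands in here purely to make the lookup flow runnable, and the folder name and hash list are made up.

```python
# Minimal sketch of hash-based matching, NOT Apple's implementation.
# NeuralHash is a proprietary perceptual hash; SHA-256 is only a stand-in
# so the lookup flow is runnable. All names and values are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical set of known-CSAM hashes (hex digests). In practice these
# come from clearinghouses, not from anything computed locally.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Return a hex digest of the file's bytes (stand-in for NeuralHash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_before_upload(path: Path) -> bool:
    """Flag a photo only if its hash matches a known entry; otherwise the
    content itself is never inspected, mirroring the flow described above."""
    return file_hash(path) in KNOWN_HASHES

if __name__ == "__main__":
    # Hypothetical local photo folder standing in for the pre-upload check.
    for photo in Path("photos").glob("*.jpg"):
        if flag_before_upload(photo):
            print(f"Match flagged before upload: {photo}")
```

The point of the sketch is simply that a match against a precompiled list, not an inspection of the image itself, is what triggers any further action.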

The hash check approach is fallible, of course. It won't catch CSAM that isn't already catalogued in a database, for one. Matthew Green, a cybersecurity expert and professor at Johns Hopkins University, also pointed to the risk of someone weaponizing a CSAM file hash inside a non-CSAM image file.

There's another piece to the security update as well. In addition to NeuralHash-powered hash checks, Apple will also introduce a parental control feature that scans images sent via iMessage to child accounts (meaning accounts that belong to minors, as designated by the account owners) for sexually explicit materials. Parents and guardians that activate the feature will be notified when Apple's content alarm trips.


The Electronic Frontier Foundation (EFF) released a statement critical of the forthcoming update shortly after Apple's announcement. It's an evidence-supported takedown of the plan that offers a much clearer sense of the issues Cathcart gestures at vaguely in his thread.

There's a reasonable discussion to be had about the merits and risks of Apple's plan. Further, WhatsApp is perfectly within its rights to raise objections and commit to not making use of the feature. But you, a user who might just want to better understand this thing before you form an opinion, have better options for digging up the info you want than a Facebook executive's Twitter thread.

Start with Apple's own explanation of what's coming. The EFF response is a great place to turn next, along with some of the supporting links shared in that write-up. It's not that voices like Cathcart and even Green have nothing to add to the conversation; it's more that you're going to get a fuller picture if you look beyond the 280-character limits of Twitter.

Topics: Apple, Cybersecurity, Privacy, Social Media, WhatsApp
