
Apple delays plan to check iPhones for child abuse images

Source: Global Perspective Monitoring | Editor: relaxation | Time: 2025-07-03 19:30:23

The pushback against Apple's plan to scan iPhone photos for child exploitation images was swift and apparently effective.

Apple said Friday that it is delaying the previously announced system that would scan iPhone users' photos for digital fingerprints that indicated the presence of known Child Sexual Abuse Material (CSAM). The change is in response to criticism from privacy advocates and public outcry against the idea.

"Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material," a September 3 update at the top of the original press release announcing the program reads. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."


Announced in August, the new feature for iOS 15 would have checked photos in an iPhone user's photo library — on the device before sending the photos to iCloud — against a database of known CSAM images. If the automated system found a match, the content would be sent to a human reviewer, and ultimately reported to child protection authorities.
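The matching step described above can be illustrated with a simplified sketch. Note this is an illustration only: Apple's actual system used a proprietary perceptual hash (NeuralHash) combined with cryptographic threshold techniques, not the plain exact-digest comparison shown here, and the function and threshold below are hypothetical stand-ins.

```python
import hashlib

def flag_matches(photos, known_csam_hashes, threshold=30):
    """Illustrative sketch of on-device hash matching.

    photos: list of image byte strings on the device.
    known_csam_hashes: set of hex digests of known flagged images.
    threshold: number of matches before anything is surfaced;
        Apple said roughly 30 matches were required before an
        account would be flagged for human review.

    Real perceptual hashing (e.g. NeuralHash) tolerates resizing and
    re-encoding; SHA-256 here only catches byte-identical files.
    """
    matches = [p for p in photos
               if hashlib.sha256(p).hexdigest() in known_csam_hashes]
    # Nothing is reported until the match count crosses the threshold.
    return matches if len(matches) >= threshold else []
```

In Apple's design, the matching and the threshold check happened on the device before upload, which is precisely what critics objected to; only threshold-crossing accounts would reach a human reviewer.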


The fact that the scanning happened on the device alarmed both experts and users. Beyond it being generally creepy that Apple would have the ability to view photos users hadn't even sent to the cloud yet, many criticized the move as hypocritical for a company that has leaned so heavily into privacy. Additionally, the Electronic Frontier Foundation criticized the ability as a "backdoor" that could eventually serve as a way for law enforcement or other government agencies to gain access to an individual's device.

"Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor," the EFF said at the time.

Experts who had criticized the move were generally pleased with the decision to do more research.

Others said the company should go further to protect users' privacy. The digital rights organization Fight for the Future said Apple should focus on strengthening encryption.

While other companies, like Google, already scan cloud-based photo libraries for CSAM, and the overall goal of protecting children is obviously a good one, Apple thoroughly bungled the rollout of this feature, with privacy concerns justifiably overshadowing its intended purpose. Better luck next time, folks.

Topics Apple Cybersecurity iPhone Privacy