Terrific reporting from the Financial Times here, including more circumstantial evidence that it was Apple who tipped off the State Department about these hacked phones in Uganda. Remarkably detailed for an operation that, quite obviously, was intended to be clandestine. Using this system as a “diplomatic calling card” - with that list of countries, which did not have official relations with Israel - is outrageous.

Monday, 20 December 2021

Notchmeister 1.0 ★

This holiday season we have a special gift for Mac users everywhere, especially ones with a new MacBook Pro and notch. We’re proud to announce the immediate availability of Notchmeister. Think of it as a fun way to spruce up your notch.

The Mac has a grand tradition of silly utilities - exquisitely well-crafted, but serving no purpose other than to be fun - and Notchmeister is a perfect example.

My thanks to Mux for once again sponsoring DF last week. Use their Video API to build video streaming into your application and make it play beautifully at scale on any device. A Mux stream is just one GET request away from magical-feeling features like automatic thumbnails, animated GIFs, and data-driven encoding decisions. Looking to understand if your videos are gaining traction? They’ve got that covered with Mux Data: get info about views, viewers, and playing time. You can also see whether viewers are getting errors or rebuffering, and whether you should be using Mux (trick question - yes).

Wednesday, 15 December 2021

Apple Updates ‘Child Safety’ Webpage to Remove Mention of CSAM Fingerprint Matching, But Feature May Still Be Forthcoming ★

Two of the three safety features, which released earlier this week with iOS 15.2, are still present on the page, which is titled “Expanded Protections for Children.” However, references to the more controversial CSAM detection, whose launch was delayed following backlash from privacy advocates, have been removed. When reached for comment, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September, when it first announced it would be delaying the launch of the CSAM detection. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company’s September statement read.

Crucially, Apple’s statement does not say the feature has been canceled entirely. Documents outlining how the functionality works are still live on Apple’s site.
Now that some of the new child safety features are shipping with this week’s iOS 15.2 update (machine-learning-based nude/sexually-explicit image detection in Messages, and “Expanded guidance in Siri, Spotlight, and Safari Search”), Apple has updated the page to state which features are currently shipping. I think the CSAM fingerprinting, in some form, is still forthcoming, because I suspect Apple wants to change iCloud Photos storage to use end-to-end encryption. Concede for the moment that CSAM identification needs to happen somewhere, for a large cloud service like iCloud.
If that identification takes place server-side, then the service cannot use E2E encryption - it can’t identify what it can’t decrypt. If the sync service does use E2E encryption - which I’d love to see iCloud Photos do - then such matching has to take place on the device side.
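The constraint described above can be sketched in a few lines of toy Python. Everything here is hypothetical and simplified: real fingerprint matching uses perceptual hashes (Apple’s proposal used NeuralHash with private set intersection, not SHA-256), and real E2E encryption uses an actual cipher rather than the throwaway one-time-pad stand-in below. The point is only the architectural one: on-device matching sees plaintext, while a server holding only ciphertext has nothing to match against.

```python
# Toy model of client-side matching vs. E2E encryption (illustrative only).
import hashlib
import secrets

# Hypothetical blocklist of fingerprints of known images.
KNOWN_FINGERPRINTS = set()

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint: a cryptographic hash of the raw bytes.
    (Real systems use perceptual hashes that survive resizing, etc.)"""
    return hashlib.sha256(image_bytes).hexdigest()

def client_side_check(image_bytes: bytes) -> bool:
    """On-device matching: runs before encryption, so it sees plaintext."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

def encrypt(image_bytes: bytes, key: bytes) -> bytes:
    """Toy one-time-pad stand-in for E2E encryption; the key never
    leaves the device, so the server cannot decrypt."""
    return bytes(b ^ k for b, k in zip(image_bytes, key))

# A photo the device is about to sync:
photo = b"example photo bytes"
KNOWN_FINGERPRINTS.add(fingerprint(b"known bad image"))

# 1. Device-side matching happens against the plaintext:
matched_on_device = client_side_check(photo)

# 2. The photo is then E2E-encrypted before upload. The server only
#    ever sees ciphertext, so matching it against the fingerprint
#    blocklist cannot succeed - it can't identify what it can't decrypt.
key = secrets.token_bytes(len(photo))
ciphertext = encrypt(photo, key)
matched_on_server = fingerprint(ciphertext) in KNOWN_FINGERPRINTS
```

Under this toy model, `matched_on_server` is effectively always `False`, which is exactly why a service committed to E2E encryption has to move any such matching onto the device.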