Apple’s nude image scanner comes to Germany

Having already launched in other countries, Apple’s nude image scanner has now been announced for Germany as well.

In the latest iOS 16 beta, Apple has activated the controversial nude image scanner for iMessage in Germany for the first time. It is intended to prevent children from sending or receiving nude pictures.

Already active in other countries

After the USA, Great Britain, Canada, Australia and New Zealand, Apple now apparently wants to roll out its nude image scanner in Germany and France. The feature’s activation in the current iOS 16 beta is no guarantee that it will make it into the final release. But chances are good that Apple will officially roll out the nude image scanner to all devices in September with the new iOS version.

Anyone worried that their iPhone will automatically scan their photos in the future can rest easy for now, at least if they put their trust in Apple. According to the company’s own statements, the feature is “off by default”, so users have to activate it explicitly.

Nude image scanner only warns the child

If parents activate the function on their child’s iPhone, all incoming and outgoing images are checked for nudity. If the nude image scanner detects a nude photo, it displays the image blurred and shows the child a warning message. It also offers the child the option to notify an adult. Apple had originally planned to notify parents automatically, but shelved that feature for the time being after heavy criticism.

It is still unclear exactly what the nude image scanner counts as a hit. According to Apple, however, visible genitals in a photo are enough to trigger the warning. Content is recognized by an AI model that Apple continuously refines and updates through operating system updates.
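Apple has since made a related on-device nudity classifier available to third-party apps via the SensitiveContentAnalysis framework (iOS 17 and later). Purely as an illustration of what such a local check looks like, here is a minimal sketch; nothing in Apple’s announcements confirms that iMessage uses this exact API, and calling it requires a special entitlement from Apple.

```swift
import SensitiveContentAnalysis // iOS 17+; requires the
// com.apple.developer.sensitivecontentanalysis.client entitlement
import Foundation

// Minimal sketch: ask the on-device classifier whether an image
// contains nudity. The analysis runs locally on the device.
func warnIfSensitive(imageAt url: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis only works if the user has opted in via the
    // Sensitive Content Warnings / Communication Safety settings.
    guard analyzer.analysisPolicy != .disabled else { return }

    do {
        let result = try await analyzer.analyzeImage(at: url)
        if result.isSensitive {
            // An app would blur the image and show a warning here,
            // similar to what Messages does for children.
            print("Image flagged as sensitive")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```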

Apple’s original plans went even further

Apple’s original goal was to make more extensive changes to its systems in order to curb the spread of abuse images via its own services; the nude image scanner was just a small part of that. As we reported last year, Apple is regarded as the largest platform for the distribution of child pornography.

Apple wanted to automatically detect and report images depicting child sexual abuse as soon as they were uploaded to iCloud. For this purpose, all photos on the iPhone were to be automatically compared against a database of known abuse images.
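Conceptually, such a comparison boils down to checking an image fingerprint against a set of known fingerprints. The following toy sketch uses a cryptographic hash purely as a stand-in; Apple’s announced system relied on a perceptual hash (“NeuralHash”) so that re-encoded or slightly edited copies would still match, and the database and matching function here are hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical fingerprint database of known abuse images.
// In Apple's design this would be derived from hashes supplied
// by child-safety organizations, not hard-coded values.
let knownFingerprints: Set<String> = []

// SHA-256 is only a stand-in here: a cryptographic hash changes
// completely if a single pixel changes, which is exactly why
// Apple planned a perceptual hash (NeuralHash) instead.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Returns true if the image's fingerprint appears in the database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```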

But data protection and security experts generated so much headwind in 2021 that Apple decided to put those plans on hold for the time being. Nevertheless, the nude image scanner is a step in the same direction, and could lead to the original plans being implemented sooner or later.
