Worried about an infection his young child had contracted, a father sent photos to the doctor. Two days later, Google suspended his account and notified the police as part of its crackdown on the possession and distribution of child pornography.
Credit: Arkan Perdana via Unsplash
The limits of artificial intelligence. A New York Times article reports that Google's AI is becoming overly aggressive in some situations, flagging content that is not child pornography as if it were. A father paid the price after taking pictures of his child's groin infection to send them to a doctor.
Google determined that the content of the photos violated its terms of service and was potentially illegal. Consequently, the US firm closed the user's Google account and alerted the authorities, prompting the opening of a police investigation against him.
The fight against child pornography creates interpretation problems for AI
In this case, it was a nurse who asked the father to send photos ahead of the video consultation so that the doctor could analyze them. The facts date back to February 2021, when some medical practices in the United States were no longer offering in-person consultations due to COVID-19. Two days after the photos were taken, the father received a notification from Google that his account had been suspended. He lost access to his emails, contacts, photos… and even his phone number, because he was a customer of Google Fi, Google's US-only virtual mobile operator.
A police investigation concluded a few months later that the user had not committed any crime, but in the process the victim of the misunderstanding had to let investigators access all of the data and content he had stored with Google. The surveillance systems set up by the GAFAM companies to detect illegal content have been heavily criticized by privacy advocates. Apple recently shelved a controversial child pornography detection system it had previously announced, in order to rework it.
Source: New York Times.