NYT: Parents Lose Google Accounts Over Abuse Image False Positives

At least two parents have reportedly lost access to their Google accounts after the company’s system for detecting child sexual abuse material (CSAM) mistakenly flagged medical photos they had taken of their children as depicting abuse.

The New York Times reports that one of the parents, a software engineer referred to only as Mark, took photos of his child’s groin because “his son’s penis looked swollen and was hurting him.” A nurse requested the photographs ahead of a video consult related to the issue.

The other parent said in a Quora post discovered by the Times that he’d lost access to his Google account in February 2021 because he “took pictures of [his] son’s infection in his intimal parts to send to his pediatrician who was following daily updates.”

Both instances appeared to involve parents with Android phones taking pictures of medical problems affecting their child’s groin. The photos were automatically backed up to Google Photos and sent to others (significant others and medical professionals) via a variety of services.

Because the images were backed up to Google Photos, a Google system designed to detect CSAM flagged them. Upon review, the company deactivated the accounts and referred the cases to the CyberTipline at the National Center for Missing and Exploited Children.

The Times confirmed that both parents were cleared of any suspicion of abuse by investigators. Yet they still don’t have access to their Google accounts, which in Mark’s case caused a variety of problems stemming from the sheer ubiquity of the company’s services:

“Not only did he lose emails, contact information for friends and former colleagues, and documentation of his son’s first years of life, his Google Fi account shut down, meaning he had to get a new phone number with another carrier. Without access to his old phone number and email address, he couldn’t get the security codes he needed to sign in to other internet accounts, locking him out of much of his digital life.”

Some, including former Google product manager and current Homebrew partner Hunter Walk, have argued that it’s better for a CSAM-detection system to produce false positives (flagging innocent material as abusive) than false negatives (treating abusive material as innocent).

But that doesn’t necessarily mean people should permanently lose access to their Google accounts—and, as Mark’s experience demonstrates, many other services—when their photos are mistakenly flagged as CSAM. A functioning appeals process should be in place.

It’s possible that Google’s appeals process is working as intended. The Times reports that Google based its decision not to restore Mark’s account on a video “of a young child lying in bed with an unclothed woman” that he had taken six months before the incident described above.

Mark told the Times that he “did not remember this video and no longer had access to it” but characterized it as “a private moment he would have been inspired to capture, not realizing it would ever be viewed or judged by anyone else,” as the report puts it.

Google appears to be the only one with access to that video. Mark could be mischaracterizing a video depicting potential abuse; Google could be using an intimate moment as an excuse not to restore Mark’s account. Right now there isn’t enough public information to pass judgment.

As it stands, however, both parents cited in the Times report were cleared of wrongdoing by investigators. It’s not clear why that isn’t enough for them to regain access to their Google accounts—or what else they’re supposed to do to prove their innocence to the company.

Google didn’t immediately respond to a request for comment.
