Messenger Kids, a “safe” chat app that Facebook recently introduced so that underage children could talk with their friends, has already made the news in a not-so-good way: a serious design “flaw” allowed adult pedophiles to join children’s “parent-approved” group chats.
According to reports, the “technical error,” as Facebook is calling it, allowed the “friends” of underage Facebook users to create group chats that included adult Facebook users who had never been pre-approved by the children’s parents. Facebook has since “turned off” the group chat feature and issued the following statement directly to one of its affected users:
“We found a technical error that allowed [CHILD]’s friend [FRIEND] to create a group chat with [CHILD] and one or more of [FRIEND]’s parent-approved friends. We want you to know that we’ve turned off this group chat and are making sure that group chats like this won’t be allowed in the future.”
Keep in mind that this was not a public announcement by Facebook, but rather a private message that was “sent to thousands of users in recent days.” Facebook claims the “glitch” only affected “a small number of group chats,” though the company offered no further specifics as to what caused the “technical error.”
“‘Thousands’ of group chats doesn’t sound like a small number to me, especially given that you have to multiply that vague figure by the number of kids involved in each one,” writes Stephen Green for PJ Media. “Worse, Facebook has a bad history of slow-rolling the truth when it comes to its many security issues.”
As explained by The Verge, the group chat feature in Messenger Kids failed to properly enforce user permissions: it allowed the approved invitees of one child participant to chat with the other child participants, even when those other children’s parents had not given their permission.
“Whoever launched the group could invite any user who was authorized to chat with them, even if that user wasn’t authorized to chat with the other children in the group,” writes Russell Brandom for The Verge. “As a result, thousands of children were left in chats with unauthorized users, a violation of the core promise of Messenger Kids.”
The group chat feature of Messenger Kids, it’s important to note, has been around since December 2017, which raises the question: How many child predators have been able to connect directly with their targets, thanks to this Facebook “technical error”? An even more important question is: Was this really a glitch at all?
We bring this up because Facebook has been caught in multiple privacy scandals, perhaps most notably the Cambridge Analytica fiasco, which was also blamed on a “software bug.”
Facebook has since been ordered to pay about $5 billion in fines – a drop in the bucket compared to the multibillion-dollar profits the company takes in annually. But Mark Zuckerberg, Facebook’s infamous CEO, has yet to face any personal liability for his social media company’s major data breaches.
But that could change, thanks to the Parent Coalition for Student Privacy, along with 16 other public health advocacy groups, which are seeking to hold Facebook and Zuckerberg accountable. They recently filed a complaint with the Federal Trade Commission (FTC) alleging that Facebook hasn’t done nearly enough to ensure that its Messenger Kids app functions as claimed and truly keeps children safe.
“How many kids were, um, exposed?” asks Green about Facebook’s Messenger Kids “glitch.” “I don’t trust Facebook’s initial admission, and given all that you’ve just read, neither should you.”
Sources for this article include: