Facebook Admits Scanning Messenger Content Amid Data Misuse Scandal
by Titan007
In the wake of the massive unauthorized data use scandal, Facebook has dropped another bombshell: it admits to reviewing the messages people exchange on Messenger, including text, photos, and links, claiming this is done to ensure users “play by the rules.”
The first hints came from Mark Zuckerberg, who said in an interview that Facebook played a role during last year’s crisis in Myanmar, where hate speech spread across the platform as more than half a million Rohingya refugees fled to Bangladesh. According to him, Facebook’s systems detect such messages, including attempts to send sensational content via Messenger.
Other executives later confirmed that Messenger conversations are analyzed in much the same way Facebook reviews public posts, with the stated goal of preventing abuse.
“For example, in Messenger, when you send a photo, our system automatically scans it to check for child exploitation; when you send a link, it’s checked for malware. We designed these automated tools so we can stop inappropriate and violent behavior online at any moment,” the company says.
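Facebook hasn’t published the internals of these tools, but such pipelines are commonly built from two pieces: matching photos against a database of hashes of known abusive images (Microsoft’s PhotoDNA is the widely cited example) and checking link destinations against a malware blocklist. A minimal Python sketch of the idea, with hypothetical hash and blocklist data and illustrative function names:

```python
import hashlib
from typing import Optional
from urllib.parse import urlparse

# Placeholder databases for illustration. Production systems match photos
# with perceptual hashes (e.g. PhotoDNA) that survive resizing and
# re-encoding, not exact cryptographic hashes as used here.
KNOWN_ABUSE_HASHES = {"e3b0c44298fc1c149afbf4c8996fb924"}  # hypothetical entries
MALWARE_DOMAINS = {"malware.example"}                       # hypothetical blocklist

def photo_is_flagged(image_bytes: bytes) -> bool:
    """Return True if the photo matches a known-abuse hash."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_ABUSE_HASHES

def link_is_flagged(url: str) -> bool:
    """Return True if the link's domain appears on the malware blocklist."""
    return urlparse(url).netloc.lower() in MALWARE_DOMAINS

def screen_message(photo: Optional[bytes] = None, url: Optional[str] = None) -> bool:
    """True means the message is held for review instead of being delivered."""
    if photo is not None and photo_is_flagged(photo):
        return True
    if url is not None and link_is_flagged(url):
        return True
    return False
```

The exact-hash lookup is a stand-in: real systems lean on perceptual hashing precisely because re-encoding or cropping an image changes its cryptographic hash but not its perceptual one.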
Many users believed these chats were completely private. Clearly, that’s not quite the case. Facebook insists the data viewed in this way isn’t sold to advertisers—but whether anyone believes that is another question.
Messenger was part of Facebook’s main app until it was split off into a stand-alone app in 2014. Facebook’s other major chat app, WhatsApp, uses end-to-end encryption, meaning even WhatsApp itself can’t read message contents, which makes it feel safer to users and harder for investigators seeking access. Messenger offers an opt-in equivalent, “Secret Conversations,” but users have to enable it for each chat.
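To see what end-to-end encryption changes, here is a minimal sketch using the PyNaCl library (an assumption for illustration; WhatsApp and Secret Conversations actually use the Signal Protocol, which adds forward secrecy through key ratcheting, but the core property is the same): the server relaying the message never holds a key that can decrypt it.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sender_box = Box(alice_sk, bob_sk.public_key)
ciphertext = sender_box.encrypt(b"meet at noon")

# The relay server sees only ciphertext and holds no key that can open it.
# Bob decrypts with his private key and Alice's public key.
receiver_box = Box(bob_sk, alice_sk.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet at noon"
```

In Messenger’s default mode, by contrast, messages are encrypted only in transit, so Facebook’s servers can read and scan the plaintext, which is what makes the reviews described above possible.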
At the same time, the company confirmed that personal data from as many as 87 million users had been improperly shared with Cambridge Analytica, far more than the previously assumed “only” 50 million. When the story first broke, Facebook claimed the privacy of roughly 270,000 users had been compromised.
