Mark Zuckerberg's big new vision for Facebook could throw oil on its burning safety issues — and he knows it

  • Facebook CEO Mark Zuckerberg wants to recast himself as the guardian of people's privacy — but it could come at a cost to safety on his platform.
  • The shift to encrypted messaging will make it much harder to detect the spread of hideous videos, like the one of the New Zealand mosque massacres last week.
  • Zuckerberg knows this, admitting encryption "removes some of the signal that you have to detect really terrible things some people try to do."
  • One lawmaker told Business Insider that Zuckerberg could be trying to absolve himself of responsibility for moderating the harmful content.
  • In putting out a fire over privacy, Zuckerberg needs to be careful that he doesn't throw oil over another burning issue.

It's almost a year to the day since the revelation that the data of millions of Facebook users made its way into the hands of Cambridge Analytica, which weaponized the information to Donald Trump's benefit in 2016.

The fallout from the scandal continues to plague Facebook, with evidence published in court documents on Thursday suggesting that some staff knew about Cambridge Analytica's dirty deeds months before The Guardian first reported on the issue in December 2015.

The consequences of Cambridge Analytica were so profound that it forced Mark Zuckerberg to completely rethink the philosophy of his company. The fruits of that period of introspection were published in his new manifesto for a "privacy-focused" Facebook earlier this month.

Gone is the man who once said, "They 'trust me.' Dumb f--ks," when referring to other people's data. Zuckerberg wants to recast himself and his company as the guardians of privacy. Specifically, this will involve a big lurch towards end-to-end encryption, which will be the backbone of newly interoperable messaging services WhatsApp, Messenger, and Instagram Direct Messages.

The plan to effectively create two Facebooks, a public "town square" and a private "living room," is divisive internally. It contributed to the departure of senior executives, including 13-year veteran and product boss Chris Cox. But Zuckerberg, having been accused of a spectacular failure of leadership over Cambridge Analytica, is convinced that this is the right way forward for his company.

"The future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won't stick around forever. This is the future I hope we will help bring about," he said in his blueprint for Facebook's future.

The problem with Facebook's privacy push

But while Facebook's embrace of encryption may help solve its privacy problem, it could come at a cost: it will become much harder to detect the spread of hideous videos like the one of the New Zealand mosque shootings last week, which has drawn international condemnation.

Much of the attention has focused on the video's spread on the public-facing Facebook — the "town square," to use Zuckerberg's parlance. Here, Facebook has removed 1.5 million versions of the footage, with both its AI and moderators creaking under the pressure of its virality.

What about its spread on WhatsApp, Facebook's already encrypted messaging service? Here, only the sender and receiver of a message can view its content, making it impossible for Facebook or law enforcement to detect harmful material. This is the Facebook "living room" that Zuckerberg imagines.
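The core property described above — that a relay in the middle sees only unreadable ciphertext — can be illustrated with a deliberately simplified sketch. This is not WhatsApp's actual protocol (which is built on authenticated public-key cryptography); a one-time pad stands in here purely to show why a server without the key cannot inspect what it forwards:

```python
# Toy illustration of the end-to-end principle: the relay server
# forwards only ciphertext and, lacking the key, cannot read it.
# (A real E2E system uses authenticated public-key cryptography;
# this XOR one-time pad is only a sketch of the idea.)
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each plaintext byte with a key byte of the same length.
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# Sender and receiver share the key; the server never sees it.
message = b"hello"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(key, message)          # all the server relays
assert ciphertext != message                # server sees only noise
assert decrypt(key, ciphertext) == message  # receiver recovers it
```

Because the key lives only on the two endpoints, any scanning for abusive content would have to happen on the devices themselves, not on Facebook's servers.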

A quick Twitter search shows people complaining about receiving copies of the Christchurch massacre footage via WhatsApp. "Oh my god.. just received the Christchurch mosque attack video in a family WhatsApp group," tweeted British journalist Umer Ali last week. It was also spotted by former Facebook product manager Antonio García Martínez, who said he had identified "a litany of complaints" from WhatsApp users.

It's not the first time WhatsApp has been abused by bad actors. Terrorists have used it to send guarded messages, as Khalid Masood did before he killed six people in an attack in Westminster, London, in 2017. And in India last year, WhatsApp was used to spread misinformation about child abduction, fuelling mob lynchings.

Facebook has already taken steps to limit the number of people a WhatsApp message can be forwarded to, in order to stem the spread of toxic content. Still, Zuckerberg is well aware that his privacy pivot could have disastrous drawbacks. In an interview with Wired earlier this month, he said (emphasis ours):

"There is just a clear trade-off here when you're building a messaging system between end-to-end encryption, which provides world-class privacy and the strongest security measures on the one hand, but removes some of the signal that you have to detect really terrible things some people try to do, whether it’s child exploitation or terrorism or extorting people."

Does Zuckerberg want to wash his hands of toxic content?

It has led some to question whether Zuckerberg has an ulterior motive for his privacy vision: to absolve himself of responsibility for moderating the harmful content on his platform, at a time when regulators are talking about leveling huge fines on tech firms for not dealing with the issue effectively.

"Is this just a way for Facebook to avoid any responsibility for what people share on the platform?" asked Damian Collins, the British lawmaker who has been investigating the Cambridge Analytica scandal for months.

He told Business Insider: "This becomes a charter for spreading disinformation and other harmful content if the platform is basically going to absolve itself of any responsibility to know people are sharing."

García Martínez has similar suspicions. "The dedication to encryption is Zuck's move to get out from under the content moderation onus, and simply write off dealing with the issue," he tweeted.

Concerns were also raised by Ben Horowitz, the cofounder of the influential Silicon Valley venture-capital firm Andreessen Horowitz. Horowitz's partner, Marc Andreessen, is a Facebook board member.

"If a social network is truly private via end-to-end encryption as Mark Zuckerberg specified, nobody including Facebook or the U.S. Government would be able to monitor it for hate speech and other violations. Essentially, Facebook would be flying right into the face of some of the current backlash against them," he said last week.

Zuckerberg has promised to publicly consult with experts around the world, including governments, law enforcement, regulators, and safety advocates, to address these issues. In putting out a fire over privacy, Zuckerberg will need to be careful that he doesn't throw oil over another burning issue.
