Facebook doesn't want to talk about its role in electing Trump

Mark Zuckerberg, founder and CEO of Facebook, posted a cute photo of himself watching the election results with his young daughter. It was accompanied by a message that refrained from attacking or supporting any particular candidate and ended with "feeling hopeful."

Zuckerberg never endorsed a candidate, but he never seemed like a fan of Trump either, and in fact drew the ire of one of Trump's spokespeople after making a "veiled criticism" of Trump's wall-building proposals. Other Facebook bigwigs were more public about their preferences: co-founder Dustin Moskovitz donated $20 million to defeat Trump, while board member Peter Thiel donated $1.25 million to support him.


Yet Facebook may have played a major role in determining the outcome of the election. It became a hub of fake news that told people with certain political leanings exactly what they wanted to hear, which they eagerly shared with like-minded friends and family, who shared it with theirs in turn. Sites on both the right and the left trafficked in this material, but pro-Trump and anti-Clinton fake stories and memes quickly proved far more popular, so many purveyors of false content went with them, according to reports from BuzzFeed, the Guardian, and The New York Times.


On top of that, there's the way Facebook's algorithms show you news (or "news") that your personal internet history suggests you'd rather read. The Wall Street Journal ran a feature a few months ago showing how wildly a Facebook user's news feed can vary depending on what the site believes their political preferences to be. A conservative may see only anti-Clinton stories from Breitbart; a liberal might find stories from Vox about how terrifying a candidate Trump is.

And Facebook's "trending" bar, once staffed by human editors but now in the hands of engineers and robots, has been known to put out many a fake story. The fifteenth anniversary of 9/11 was marked with a link to a British tabloid that claimed to have found new footage proving the terrorist attack was an inside job (the article, and the entire 9/11 topic, was removed after Facebook was contacted). The trending bar's tendency to broadcast fake news from dubious sources is a recurring problem that Facebook does not appear inclined to solve. In the days before the election, an article from an unabashedly pro-Trump website called "TruthFeed," which claimed that Gloria Allred was paying women to accuse Trump of sexual assault, was prominently placed in Facebook's trending bar. Facebook did not respond when Vocativ asked for comment, though the topic was removed from the trending bar.

As a business, Facebook's primary responsibility is to its shareholders. If fake, hyper-partisan news that drives massive user engagement, even as it misinforms the populace, is what brings in the money, then Facebook has no obligation to make things "better" by adding controls to ensure that, at the very least, the stories it tacitly endorses by placing them in its trending bar are actual news stories containing actual facts.

But if Zuckerberg truly believes, as he says, that "we are all blessed to have the ability to make the world better, and we have the responsibility to do it," then he is in a far better position than most to do exactly that, right now: by making sure that fake "news" stories don't bury legitimate ones on the platform he created, and by reducing the echo chamber effect that lets so many users comfortably reinforce their own beliefs without ever encountering anything that challenges them.

Especially since a full quarter of the world's population now uses his platform.

