By Matt Kapko
Nov. 16, 2016
The leaders of Facebook, the world's largest social network, which reaches 1.18 billion people every day, want to absolve the company of any responsibility for the proliferation of misinformation on its platform. As the dust settles following a polarized presidential election, Facebook is defending itself against claims of impropriety and unchecked influence on American politics.
The social giant is taking hits from all sides of the political spectrum and CEO Mark Zuckerberg is walking a challenging line as he tries to placate Facebook's critics. In May, he met with a group of conservative leaders to dismiss reports that Facebook's team of curators in charge of its Trending Topics had deliberately suppressed stories from right-leaning news outlets. Then, in a blog post published four days after the election, Zuckerberg defended the social network as a neutral party that doesn't bear the same responsibilities as a news source and said Facebook should be "extremely cautious about becoming arbiters of truth ourselves."
Facebook advocates influence, shirks responsibility
Politics aside, the contradictions in Zuckerberg's statements about the social network's influence and its potential impact on users could become a glaring problem. If the content, including any misinformation, that Facebook distributes to more than 1.79 billion people every month can't influence the outcome of an election, just how effective are the $6.8 billion in ads it sold during the third quarter of 2016?
Facebook proudly claims its highly targeted advertising can influence purchase decisions, and it accepts some responsibility for giving a voice to disenfranchised people who have organized uprisings in the Middle East and elsewhere around the world. Back home in the United States, however, Facebook tells a more selective story to protect the most important aspect of its success. There's no denying Facebook is the world's largest media platform, but the company is unwilling to deem itself a media company — and accept all the responsibilities that come with that distinction.
Zuckerberg publicly addressed Facebook's latest storm of criticism at a conference last week, but he dug in deeper in the blog post published Saturday. "Many people are asking whether fake news contributed to the [election] result, and what our responsibility is to prevent fake news from spreading," he wrote. "Of all the content on Facebook, more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other."