Facebook’s global growth has skyrocketed over the last ten years, beyond anyone’s wildest dreams. With such rapid adoption, it’s hard to imagine there was ever a time when Facebook wasn’t the biggest fish in the ocean. But there was; it was once even smaller than the now-outdated MySpace. Remember? Over that time, the social network turned media platform has had to define and enforce a set of behavioral and moral guidelines for users who span the globe. The platform went from 70 million users in 2008, when a 12-person “site integrity team” deliberated over behavioral norms in a single room, to 2.2 billion users today, monitored by some 16,000 people worldwide for improper content involving nudity, violence, and racial abuse, a nearly impossible task. In this disinformation age, the dangers of posting false images or information are growing. As Lai Mohammed, Nigeria’s minister of information, has put it: “In a multi-ethnic and multi-religious country like ours, fake news is a time bomb.”

Fake News Responsible for Nigerian Deaths

Enter Nigeria. For all the fact-checking and monitoring Facebook has in place today, it was surprising to learn that only four third-party fact-checkers review the platform, used by 24 million Nigerians, for false information. Yes, you read that correctly: four people. According to recent reports by the BBC, Facebook disabled the account of a man in the UK who was spreading misinformation to thousands in Nigeria, misinformation that has been linked to deadly outcomes. We’ll come back to that in just a moment.

On June 23, 2018, a series of horrifying images began to circulate on Facebook. One showed a baby with open machete wounds across his head and jaw. Another, viewed more than 11,000 times, showed a man’s skull hacked open. There were pictures of homes burnt to the ground, bloodied corpses dumped in mass graves, and children murdered in their beds. The Facebook users who posted the images claimed they showed a massacre underway in the Gashish district of Plateau State, Nigeria.

A massacre did, indeed, happen in Gashish that weekend. Somewhere between 86 and 238 Berom people were killed between June 22 and 24, according to estimates made by the police and local community leaders. But, in fact, many of the most unsettling images circulating at the time had nothing to do with the violence in Gashish. Nonetheless, the images landed in the Facebook feeds of young Berom men in the city of Jos, hours to the north of the rural district where the massacre was happening. These images then ignited a blaze of fear, anger, and calls for retribution against the Fulani—ultimately resulting in a fiery massacre that killed 11 people. Days later, their bodies were still being discovered across the city, dumped in ditches, behind houses, and along the roadsides. Many were burnt beyond recognition.

Hostility between the Fulani and the Berom predates the rise of Facebook; however, with the rise of rapid-fire misinformation and cyber warfare across the internet, fact-checking is more important than ever.

Debunking Fake News

The resource-constrained Plateau State law enforcement officials use Facebook to debunk false information whenever they find it on the platform. At times of crisis, they even use their personal accounts to quash rumors, and they call on community leaders to do the same. But the sheer volume of misinformation circulating in Plateau State is overwhelming their efforts to counteract it. Between September and October 2018, they debunked seven false stories on Facebook. In addition to monitoring the platform, they hold regular meetings with local imams, pastors, and politicians to raise awareness of threats.

Scaling Ethics Is Hard, Especially at Facebook

Back in 2008, Facebook began writing a document. It was a constitution of sorts, laying out what could and could not be posted on its platform. Back then, the rules were simple: no nudity, no gore. But today, the challenge is far more complex. How does one define hate speech, exactly? Where is the exact line between a joke and an attack? How much half-naked butt is too much butt? Facebook has worked to answer those questions with algorithms.

Imagine a team of people sitting in a room, poring over a set of “if then” statements, manually searching through thousands of questionable images or posts, possibly seeded by internet trolls, trying to determine the essence of “appropriate.” Ultimately, the team referenced the U.S. Constitution’s protections of free speech and civil rights guidelines to find its true north, and to keep up with questionable content flagged by users. Some would argue that the company’s history with tricky decisions shows a need for outside expertise.

What belongs on Facebook and what doesn’t? How should those content rules be enforced? These are central questions in our current reckoning over social media, and given the vastness of the company’s platform, they can be exceedingly difficult to answer. “Fake news is not your friend,” the company has said, yet you can still post as much of it as you want. Everyone seems to agree that terrorism does not belong on Facebook, though there’s still more of it there than you might expect.

But imagine you could start from the beginning. What would you rule in, and what would you rule out? Facebook’s content policy evolved from a single sheet of paper into 27 pages of comically specific rules about nudity, sex, violence, and more. Essentially, Facebook has taken the First Amendment, a high-minded principle of American law, and turned it into an engineering manual that can be executed every four seconds, for any piece of content posted anywhere on the globe.
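To make the “engineering manual” idea concrete, here is a minimal sketch of how written policy might be translated into executable checks. The rule names, labels, thresholds, and the moderate() function are hypothetical illustrations, not Facebook’s actual system.

```python
# A minimal, hypothetical sketch of rule-based content screening.
# None of these rule names, labels, or thresholds reflect Facebook's
# real system; they only illustrate how written policy pages can be
# turned into checks that run automatically against every post.

from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    user_reports: int = 0                      # how many users flagged it
    labels: set = field(default_factory=set)   # tags from upstream classifiers

# Each policy rule becomes an entry: (name, predicate, action).
RULES = [
    ("graphic-violence", lambda p: "graphic_violence" in p.labels, "remove"),
    ("nudity",           lambda p: "nudity" in p.labels,           "remove"),
    ("mass-reported",    lambda p: p.user_reports >= 100,          "human_review"),
]

def moderate(post: Post) -> str:
    """Return the action of the first matching rule, else 'allow'."""
    for name, predicate, action in RULES:
        if predicate(post):
            return action
    return "allow"

if __name__ == "__main__":
    flagged = Post(text="...", user_reports=250)
    print(moderate(flagged))  # -> "human_review"
```

In practice, the hard part is not the rule engine but the predicates: deciding, at machine speed and global scale, whether a given image or post actually constitutes nudity, gore, or hate speech is exactly where the 27 pages of rules, and the human moderators, come in.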

Zuck’s Dilemma: Media Literacy

Facebook’s young CEO, Mark Zuckerberg, last visited Nigeria in 2016, touring the country to promote the platform’s goal to “connect the world.” But in a connected world, it can sometimes be tough to get to the “truth.” Even with an ever-evolving set of policies and thousands of people monitoring and enforcing them, the one unpredictable factor is that human beings are making them. And human beings are fallible.

For millions of young people in Nigeria, the platform has become an integral part of everyday life. However, many have no idea how to report the upsetting or frightening images that keep appearing on their phones, often in posts that vilify their own people. The real issue is media literacy: without it, they are vulnerable to the kind of dangerous misinformation that reaches thousands on Facebook in this region.

Facebook has stated that it is addressing the issue of media literacy. The company said that, in addition to its fact-checking initiative and machine-learning tools, it has recently launched an “online safety and digital literacy youth program” with 140 Nigerian secondary schools.

The Bigger Question

This all raises a bigger question: what is Facebook? The world’s largest publisher? A technology company? An advertising agency? A news organization? The short answer: too many things at the same time. Facebook is no longer an online yearbook where users find and connect with college friends. It has become a media and advertising company that, in some people’s opinions, cares nothing for morality or privacy, only for profit and the need to stay out of legal trouble.

Corporate social responsibility has long been a consideration for global corporations, yet with the deep pockets and resources at its fingertips, Facebook has done very little unless a public outcry or governmental mandate forces the company to make, and enforce, policy changes. And for some people in Nigeria, those changes have come too late.
