Boogaloo symbols with a red line through them, indicating the Facebook ban

When Facebook announced Tuesday that it would ban violent, anti-government “boogaloo” groups, it joined a number of social media companies taking new steps to control hate speech and misinformation on their platforms.

Ordinary users of Facebook and other social media providers have been joined on the platforms by a rising number of extremists, along with accounts devoted to spreading misinformation. In the past few weeks these companies, including Twitter, Snapchat, and others, have been accelerating their efforts to clamp down on hate speech, potentially violent movements, and conspiracy theories.

Facebook, in particular, has been facing increasing pressure from advertisers, users, and its own employees to do more to control harmful content. Some advocacy groups say they’re optimistic that new moves by Facebook, Reddit, and YouTube represent a higher level of commitment by social media companies across the board to address the problem.

On Tuesday, Facebook removed 220 Facebook accounts, 95 Instagram accounts, 28 individual pages, and 106 groups that made up an informal network of boogaloo users, as well as over 400 other groups and 100 pages that were hosting content the company says violated its Dangerous Individuals and Organizations policy.


Facebook’s ban comes a month after members of the movement had allegedly used the platform to plan the May 29 shooting death of a federal security officer in Oakland, Calif. The officer was killed while guarding a federal courthouse during a Black Lives Matter protest.

Boogaloo is a loosely organized, far-right, anti-government movement that coalesced on a variety of social media platforms. It built an identity based on shared memes and symbols, such as loud Hawaiian shirts and igloos, that play on references to the 1980s break-dancing film “Breakin’ 2: Electric Boogaloo.” Adherents include a range of gun-rights extremists, anti-Semites, and white supremacists, according to organizations that study hate groups. Boogaloo supporters anticipate a civil war against what they see as a tyrannical government.

In announcing its ban, Facebook promised to remain vigilant. “So long as violent movements operate in the physical world, they will seek to exploit digital platforms,” the announcement says, in part. “As we’ve seen following other designations, we expect to see adversarial behavior from this network including people trying to return to using our platform and adopting new terminology.”

What Other Companies Are Doing

Pressure on Facebook to toughen its content moderation practices increased in late May when Twitter tagged a post by President Donald Trump that appeared to advocate violent action against protesters, saying that it glorified violence.

When Facebook resisted calls to take similar action, a number of the company’s longtime staffers wrote an open letter to CEO Mark Zuckerberg, calling the company’s refusal to fact-check or label political speech “cowardly.” Employees later staged a virtual walkout to demand stronger action against objectionable content.

In addition, a movement called #StopHateForProfit started calling for an ad boycott of Facebook and Instagram. A long list of businesses, including The North Face, Ben & Jerry’s, Verizon, Honda, Best Buy, Pfizer, Unilever, Ford, Target, Coca-Cola, Starbucks, and Microsoft, have stopped advertising on Facebook and Instagram. 

(Consumer Reports has also paused its paid advertising on Facebook and Instagram, saying that it is joining “a growing number of nonprofits and corporations that are demanding meaningful action to stop the spread of misinformation and hate speech” on the platforms.)

Critics of Facebook are hesitant to give the company much credit for banning the boogaloo groups. “In some ways it’s remarkable, but it’s also too little too late,” says Shireen Mitchell, founder of Digital Sistas and Stop Online Violence Against Women, groups that have long monitored the platform’s moderation policies. “For years, whenever there were threats of violence [against people of color] they did not do anything.”

Melissa Ryan, CEO of Card Strategies, a consulting firm that specializes in combating online disinformation, says that Facebook is just catching up to other companies. “Of all the tech companies, Facebook seems the least interested in making fundamental changes in their business model to make the platform a safer place,” she says. “But we seem to have reached a tipping point, and the ad boycott has so much to do with that.”

Facebook denies that Tuesday’s actions were related to the boycott, saying that the ban on the groups resulted from months of investigation. The company added that over the years it has banned more than 250 white supremacist organizations and individuals, including David Duke, American Renaissance, and Richard Spencer.

Other social media companies have also become more active in removing, flagging, or taking actions to repress content for violating standards on hate speech or advocating violence. 

  • YouTube banned well-trafficked channels and users that featured white supremacist content this week. They included the accounts of the far-right figures Stefan Molyneux, David Duke, and Richard Spencer. 

  • Reddit banned about 2,000 of its online communities this week in an effort to enforce an update to the company’s content policies. The most significant was The_Donald, a group with more than 800,000 subscribers that was notorious for racist memes and encouraging violence and harassment. It had no official connection to the president.

  • Twitch, a livestreaming service and subsidiary of Amazon, suspended Trump’s account this week for violating content policies after the account posted videos, including footage of a 2015 rally at which he accused Mexico of sending drugs, crime, and rapists to the U.S.
  • Snapchat stopped promoting Trump’s official account on the app’s Discover page in June. The company said his behavior on other platforms could incite racial violence and promote injustice, although Trump’s Snapchat posts hadn’t violated company policies. The president’s account is still available to users who subscribe to it or search for it. 
  • Twitter started adding information to some of Trump’s tweets in May. The company labeled tweets about mail-in ballots and fraud as “potentially misleading” and added a link to information about the subject. The company obscured his tweet about shooting looters with a message saying it glorified violence. The tweet was still visible if you clicked a View link.

In the wake of such actions, Trump has joined other conservatives in accusing the social platforms of liberal bias. A number of prominent conservatives have opened accounts on Parler, a 2-year-old social media platform popular with Republican politicians. 

How to Report Objectionable Content

Social media platforms provide their users with ways to report content that violates policies, including rules meant to protect against harassment, hate speech, and incitement to violence. 

Facebook maintains this guide with details on how to report abusive content as well as spam. To report a group or user, go to the appropriate page, click the three-dot icon below the cover photo, and select “Report group” or “Find support or report profile.” To report an individual post, click the three-dot icon at the top right of the post and select “Find support or report post” from the menu. 

Instagram has a similar page with instructions on how to report objectionable content. To report a post, tap on the post, and then on the three-dot icon to pull up a menu that includes “Report inappropriate.” Reporting a profile is similar. You can report a comment by tapping the icon next to the comment. You can also report problematic content using this form, even if you don’t have an Instagram account.
 
Twitter, YouTube, Reddit, Twitch, and Snapchat all have mechanisms for reporting objectionable content on their platforms.