
Domestic extremism often starts with online propaganda

Experts say teens are targeted for radicalization

TAMPA, Fla. — In the past few months, the I-Team has reported how hate incidents in Florida have risen to their highest level in decades in a series we call “State of Hate.”

We are now learning from experts about some factors behind this trend and what can be done to stop it.


A Jewish University of Central Florida student attacked by neo-Nazis in Orlando, a white man hurling racial epithets as he ran a Black driver off the road in Pinellas County, Nazi flags displayed outside the Tampa Convention Center and antisemitic flyers circulated in St. Petersburg are all recent examples of Florida’s growing state of hate.

“These things don’t happen in a vacuum,” said Jon Lewis, a research fellow at the Program on Extremism at George Washington University.


He says the “Unite the Right” rally in Charlottesville, Virginia, five years ago showed white supremacy groups becoming better organized.

More than two dozen Oath Keepers and Proud Boys from Florida were arrested in connection with the January 6, 2021, U.S. Capitol riot.

“The networks, the ecosystems, the online rhetoric continues to inspire and to mobilize individuals like that to violence,” Lewis said.

Radicalization for many starts in the teen years

Some perpetrators of racially motivated attacks became indoctrinated while in their teens.

An 18-year-old was charged last month with threatening attacks on New Jersey synagogues, posting online that he planned “bombings, shootings, and 'maybe' beheadings.”

Federal authorities reported he was a radical Muslim extremist.

The teenager who fatally shot 10 African Americans in a Buffalo grocery store studied a racist concept called “replacement theory” online.

“He was an 18-year-old who’d been sitting at home alone during COVID for two years, sitting on 4Chan every day just absorbing the hate, absorbing the rhetoric,” Lewis said.

The nonprofit research and policy group Everytown reports the shooter used YouTube “as his personal library of firearms and tactical information,” posting links to dozens of videos in an online manifesto.

The rabbit holes

“YouTube is one of the most dangerous places,” said Kelly McBride, vice president of the Poynter Institute for Media Studies.


“It has absolutely the potential to take you down that rabbit hole, depending on what you’re searching for and what you’re clicking on,” McBride said.

Those rabbit holes appear because big tech companies use algorithms to determine what pops up next based on your prior media consumption.

In 2018, Congress held hearings on terrorists using social media to recruit.

“Enemies of our way of life have sought to take advantage of our freedoms to advance hateful causes. Violent Islamic terrorist groups like ISIS have been particularly aggressive,” Sen. John Thune said at the time.

But nearly five years after those companies vowed to make changes, videos about QAnon, bombmaking and white nationalism are still easily accessible.

“They can figure out if you are going down a path toward a hateful ideology. They’re very reticent to talk publicly about what they do to counter that and whether they are doing enough,” McBride said.

First responders are parents and community members

“The genie’s already out of the bottle,” said Tampa psychiatrist Dr. Rahul Mehra.


He says young people who feel disconnected from their families and peers are vulnerable to online manipulation.

“Visual images are the starting point for our emotions. That is why people scroll. You’re seeing images and if those images fit the bias you have, it’s further intensifying the emotional state,” Mehra said.

Mehra created a series of YouTube videos called “Emotional Vaccines,” which deal with tough issues like suicide prevention, gun safety and the consequences of choices.

“What we’ve really got to do for solutions is think proactively and think upstream. So this is designed to help educate people in a public health policy approach,” Mehra said.

Mehra hopes to reach troubled teens before they choose violence or extremism, but he says we are the first line of defense.

“True first responders are parents, or families, or pediatricians, or coaches…teachers and other institutions that are more likely to spend time with children,” Mehra said.

“When you talk about where the threat emanates from, it is almost always gonna be from those online spaces that serve as incubators for hate,” Lewis said.

“We’re trying to explain how this happens so that people who are family and friends of a susceptible individual might be able to intervene,” said McBride.

We reached out to YouTube for a response to our experts' claims.

Spokesperson Elena Hernandez sent the following statement and links to additional materials:

“We’re always working to remove content that violates our Community Guidelines and connect viewers with authoritative, timely, and relevant videos. YouTube has long established policies against hate speech[support.google.com], harmful conspiracies and violent extremism[support.google.com], and we remove millions of videos for violating these policies each year, the vast majority of which receive less than 10 views. Additionally, our systems are trained to raise up content from authoritative sources like news outlets and medical experts in search results and recommendations.”

Additional information

On enforcement of our Community Guidelines following the hateful attack in Buffalo:

  • Following the abhorrent and hateful attack in Buffalo, we removed thousands of videos in accordance with our Community Guidelines [youtube.com], including content glorifying the perpetrator and reuploads of his manifesto.
  • Additionally, our Trust and Safety teams comprehensively reviewed the suspect’s Discord chat logs and removed 3 videos for linking [support.google.com] to websites that violate our Community Guidelines. 

On transparency

  • We share information about how our systems and policies work as openly as possible, and we’re always looking for new ways to deepen transparency. For example: 

On our recommendations system

  • A number of published papers suggest YouTube recommendations aren’t actually steering viewers towards extreme content. Instead, consumption of news and political content on YouTube more generally reflects personal preferences that can be seen across their online habits.
  • We recognize the importance of using recommendations and search results to surface authoritative information. We were the first company in the industry to incorporate authoritativeness into our search and discovery algorithms, meaning our systems prominently raise authoritative sources [blog.youtube] in search results and “watch next” panels. Today, millions of search queries are getting this treatment.
  • For topics prone to misinformation, we surface an information panel providing context from third-party sources at the top of search results and below related videos.
  • A 2020 study [arxiv.org] from researchers at Harvard and the University of Pennsylvania found no evidence that “echo chambers” are caused by YouTube recommendations.
  • Other researchers have found that YouTube’s recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content by favoring mainstream media and cable news content over independent YouTube channels (Ledwich, 2020 [firstmonday.org]). A 2022 ADL report [adl.org] is also consistent with these findings.
  • An April 2022 study [arxiv.org] by A. Chen, B. Nyhan, et al. found that “non-subscribers are rarely recommended videos from alternative and extremist channels.”

You can see our entire State of Hate series by clicking here.

The FBI has listed domestic terrorism as one of the greatest threats to our homeland.

To learn more, click here.

If you have a story you think the I-Team should cover, email us at adam@abcactionnews.com.