A floral tribute is seen on Linwood Avenue near the Linwood Masjid on March 15 in Christchurch, New Zealand. 49 people have been confirmed dead and more than 20 are injured following attacks at two mosques in Christchurch.

After the massacre of at least 49 people at two New Zealand mosques, many are focusing on the role of big tech companies in spreading hate speech and violence around the world.

Nearly 20 minutes of the massacre was live-streamed on social media. The horrifying video spread quickly, according to The New York Times’ Charlie Warzel.

Though platforms like Facebook, Twitter, and YouTube scrambled to take down the recording and an accompanying manifesto apparently from the gunman, they were no match for the speed of their users; new artificial-intelligence tools created to scrub such platforms of terrorist content could not defeat human cunning and the impulse to gawk. In minutes, the video was downloaded and mirrored onto additional platforms, where it ricocheted around the globe. Still frames of bodies were screenshotted and uploaded to sites like Reddit, 4chan and Twitter, where they were shared and reshared.

How can we interrogate online discourse amid trends like s—tposting? That’s when trolls “derail productive discussion and distract readers,” according to Bellingcat, and the screed the alleged killer posted on Twitter is full of it.

Some suggested combating these posts by further examining the Islamophobia behind the attack.

As Recode’s Peter Kafka noted, Facebook CEO Mark Zuckerberg responded to similar concerns over Russian interference in the 2016 election. At the time, he said, “[W]e don’t check what people say before they say it, and frankly, I don’t think society should want us to. Freedom means you don’t have to ask for permission first, and by default, you can say what you want.”

How should we respond in an online ecosystem that has weaponized virality? We talk about that question and more.

Guests

  • Kevin Roose Tech columnist, The New York Times; @kevinroose
  • Joan Donovan Director of the Technology and Social Change Research Project at Harvard Kennedy School’s Shorenstein Center on Media Politics and Public Policy; @BostonJoan
  • Qasim Rashid Human rights attorney; author, "Talk To Me: Changing the Narrative on Race, Religion and Education"; @MuslimIQ
  • Ben Collins Extremism/disinformation reporter, NBC News; @oneunderscore_
