Journalist: Gina Barton
Source Link: USA Today
September 22, 2020
Gaige Grosskreutz wasn’t even out of the hospital when his phone started blowing up. Shot point blank in the arm with an AR-15, he was the only person to survive a triple shooting at a protest condemning the shooting of Jacob Blake by Kenosha police in Wisconsin.
Weeks later, the messages haven’t stopped. Although some are encouraging, most are ugly, even threatening. In some corners of the internet, Grosskreutz, 26, has become the target of angry white supremacists who say he and others who support Black Lives Matter should be stopped by any means necessary – including homicide.
His family and friends – people who didn’t protest in Kenosha – got frightening messages, too. Strangers showed up at their homes to find out “what really happened” the night Grosskreutz was shot.
“And that’s the thing that affects me, seeing the people that I care about be upset for me, scared for me,” Grosskreutz said. “I just don’t understand the need to target people who weren’t even there.”
Kyle Rittenhouse, 17, who considered himself militia, was charged with five felonies in the shooting that wounded Grosskreutz and killed Joseph Rosenbaum and Anthony Huber.
Huber’s girlfriend, Hannah Gittings, also has received online threats, according to her friend Danielle Rasmussen, who sponsored an online fundraiser for her. On the fundraising site, people donated $5 to gain access and leave a nasty message, then got the money refunded, she said. They posted laughing emojis in reaction to posts about Huber’s death and sent mocking texts to Rasmussen’s husband.
“They’re doing that instead of being part of the solution,” she said. “Holding people accountable and doing the right thing, sometimes you have to have tough skin.”
Along with shining the spotlight on Wisconsin, a crucial state in the presidential election, the shootings laid bare the extent of online harassment and its effects. It’s a problem that makes victims of violence unwitting pawns in ideological arguments, forcing them to delete their social media accounts, change their phone numbers or even move. It’s almost impossible to stop, experts say, because of a combination of ineffective criminal laws, ignorant police agencies and an unregulated internet.
Over the past decade, incidents of police violence, mass shootings and high-profile crimes have been followed by online attacks – not just on survivors but their families, their attorneys and the journalists who cover their stories.
“It’s such a challenging time that we’re living through,” said Jessie Daniels, a sociology professor at Hunter College in New York. “On the one hand, people are using social media to galvanize people against white supremacists, in support of Black Lives Matter and to point out the brutality of police killings. At the same time, those very tools people are using for social justice can be turned on them in very pernicious ways.”
Sandy Hook was not a hoax
On Dec. 14, 2012, Leonard “Lenny” Pozner’s 6-year-old son, Noah, died at Sandy Hook Elementary School in Newtown, Connecticut. Twenty children and six adults were fatally shot that day.
“My life prior to that tragedy was a completely different life compared to everything that happened after that,” Pozner told the Milwaukee Journal Sentinel, part of the USA TODAY Network. “This is my new life: I am a parent of a murdered child who is part of an internet conspiracy.”
Before, Pozner was a father of three who worked in information technology and sometimes tuned in as Alex Jones spouted outlandish theories about 9/11 and the Kennedy assassination. After, Pozner was left to parent his two girls while being stalked and terrorized by people accusing him of being an anti-gun “crisis actor” who never had a son – and worse.
Pozner learned there was little he or anyone in law enforcement could do to stop them.
He contacted Jones, who called the mass shooting at Sandy Hook a hoax, to no avail. The grieving father joined a Facebook group of conspiracy theorists, making himself available to answer their questions. He changed at least one woman’s mind, he said. She had young children and couldn’t fathom they could be shot at school. After she left the group, he said, she became the target of online harassment, too.
Law enforcement often made the problem worse, Pozner said. Although he pleaded with authorities not to include his address in complaints, which are public record, they often did. As a result, he’s had to move over and over again. In one case, Pozner outed a stranger who filed baseless child abuse complaints against him, only to have a detective threaten him for harassing his tormentor, he said. Noah’s mother had to move as well.
Federal and state authorities were no better. In response to a records request, a state attorney general provided Pozner’s entire complaint – without blacking out his personal information as the law allows – to a conspiracy theorist who posted it online.
Nearly a decade after his son’s death, Pozner lives in hiding.
“There is nothing there to protect you when it comes to the internet unless you’re willing to fight like hell,” he said. “Most people will just give up. I didn’t think I had the option to. I was on a one-way track: It was just keep fighting this or die.”
History of online harassment
Coordinated online harassment and disinformation campaigns are bolstered by algorithms that elevate “content that’s hot, that’s hate-filled, that makes people angry and gets lots of reactions,” according to Daniels.
An early example of such a campaign was Gamergate, in which female video game developers were not only vilified online but driven from their jobs and forced to flee their homes.
“If Facebook and Twitter had really taken a hard stance against bigotry and harassment in the wake of Gamergate, if they had learned their lesson, then we would be in a very different position now,” said Whitney Phillips, an assistant professor of communication at Syracuse University. “But instead, they did nothing. They continued incentivizing or at least tolerating this kind of behavior.”
Coordinated antagonism based in identity, she said, ramped up during Donald Trump’s 2016 presidential campaign.
“And when Trump was elected, of course, he normalized them,” she said.
Daniels, author of the books “Cyber Racism” and “White Lies,” said, “White supremacists have felt so empowered lately because they’re getting their actions and their statements validated from the highest office in the land. And that’s pretty intoxicating.”
Foreign governments are engaged in disinformation campaigns designed to fuel divisions and elevate white supremacy, Daniels said. The problem is exacerbated when people who don’t necessarily espouse neo-Nazi beliefs repost content about being sick of partisan bickering or not trusting the media.
“To have a politics of social justice, it relies on … some sort of shared belief that there’s truth and there’s stuff that’s not true,” Daniels said. “The president has been an expert at fueling the idea that there is no shared belief, which erodes the ground beneath human rights and social justice.”
Counteracting online threats
Because of her work, Phillips said, she’s been threatened online. Police are of little help for several reasons, she said. One problem is that it takes time to track down the source of anonymous threats, especially if there are hundreds or thousands of them coming in. If a police department has limited resources, such threats often aren’t a priority.
Another issue is the legal definition of what constitutes a threat. Saying, “I hope someone comes to your house and kills you” is protected speech under the First Amendment and can’t be prosecuted. Saying, “I’m going to come to your house and kill you” may be a crime.
“The kinds of utterances that actually would be actionable by law enforcement is really small compared to the kind of harassment that people receive,” Phillips said. “A lot of it is more diffuse and nebulous, and equally scary and equally threatening. But the law doesn’t see it as such.”
Society has been slow to realize online interactions can have real-world consequences, she said. It’s not enough to turn off the computer or log off social media.
“For too many years, and still some people believe – which is shocking, but they do – that there is somehow something fundamentally different between the offline world and the online world, and if something happens online, it’s not really real,” she said.
Online hatred morphed into murder in 2017 in Charlottesville, Virginia, when a member of a Facebook group of white supremacists, James Fields, intentionally drove his car into a group of protesters, killing Heather Heyer, 32.
Heyer was demonstrating in opposition to a “Unite the Right” rally organized by white nationalists. In the days after Heyer’s death, Facebook was roundly criticized for being slow to remove a post promoting the event. The following year, Facebook CEO Mark Zuckerberg told Congress hate groups are not allowed on Facebook.
After the triple shooting Aug. 25 in Kenosha, Facebook received similar backlash for not quickly removing an event called “Armed Citizens to Protect our Lives and Property.” The invitation, which was linked to a self-styled militia group known as the Kenosha Guard, was reposted by Jones’ website, Infowars.
“Any patriots willing to take up arms and defend the City tonight from the evil thugs?” one of the group’s posts read. “No doubt they are currently planning on the next part of the City to burn tonight!”
Other posts encouraged people to “lock and load” and to slash protesters’ tires, put sugar in their gas tanks and mark their cars with paintballs so they could be easily followed.
When people reported those posts, Facebook replied that they did not violate community standards. Grosskreutz said he received the same response when he reported threats to Facebook.
“The platform companies really have to take responsibility for creating these tools that make us all so vulnerable,” Daniels said. “They have really thrown gasoline on the fire of white supremacy in the United States and globally.”
Facebook did not respond to an email request for comment. During a company meeting, first reported by BuzzFeed, Zuckerberg described the shootings as “really, deeply troubling.” In a video of his remarks, which was publicly posted, he called the failure to remove the Kenosha Guard posting until after Huber and Rosenbaum were killed “largely an operational mistake.”
In the video, Zuckerberg said the posting violated a policy against dangerous organizations.
“This page and the militia, the Kenosha Guard page and event, violated this new policy we put in place a couple weeks ago against – that included QAnon and other militia groups that we worried could be trying to organize violence now, in this volatile period and especially as we get closer to the election – and after the election – when I think there’s a significant risk of civil unrest as well,” he said.
In the years since his son was killed at Sandy Hook, Pozner has filed numerous lawsuits. Perhaps his most important victory, he said, came when Jones admitted, under oath, that Noah was a real person who died in his classroom as a result of a mass shooting.
“I don’t need him to go away,” Pozner said of Jones and Infowars. “He has every right to scream with his last breath, as long as he’s not talking about me and pouring salt on my wound.”
Pozner won financial settlements against Jones and others who publicly accused him of lying about his son’s death or encouraged others to harass him. One woman who threatened him, Lucy Richards, was convicted of a federal crime and sentenced to five months in prison.
Pozner has made it his life’s work to help others whose lives are disrupted by online threats. In 2014, he founded the nonprofit HONR Network, which works to remove harmful posts and improve platforms’ anti-harassment policies. Its members advocate for more government regulation of the internet.
“I never considered that it was a choice to not do what I’m doing,” Pozner said. “My response doesn’t have a decision process with it. This is just the only way I could have responded.”
Working with about 300 volunteers worldwide, the network has gotten hundreds of thousands of pieces of content taken down, he said. The group assisted Maatje Benassi, a U.S. Army reservist falsely accused of starting the coronavirus pandemic. A YouTube channel featured 4,000 videos publicizing the lie, which led to death threats against Benassi and her family. HONR worked with YouTube to get the videos removed.
Because of the nonprofit group’s efforts, Facebook and YouTube updated their policies to better protect victims of violence, Pozner said.
Twitter, he said, has been less cooperative.
“They’re not like the other companies,” he said. “The volume of content they have to deal with is greater, but Twitter is responsible for the misinformation, hate and probably a lot of crimes that go on because that’s how the ideas spread.”
Twitter did not respond to an email request for comment.
The platform’s online help center says, “You can report Tweets, profiles, or Direct Messages directly to us. Twitter may take action on the threatening Tweet, Direct Message, and/or the responsible account.
“However, if someone has Tweeted or messaged a violent threat that you feel is credible or you fear for your own or someone else’s physical safety, you may want to contact your local law enforcement agency.”
Push for government regulation
Advocates of government regulation said it’s the only way to stop the barrage of online hate.
Pozner uses this analogy, inspired by a story in the Detroit News in 2015:
At the turn of the 20th century, America’s newest technology was the automobile. People bought cars without knowing how to use them safely. They drove wherever they wanted, in every direction, at every speed. They parked on lawns. People died in crashes. Children playing in the street were routinely hit by cars and killed.
Eventually, the government stepped in. Cities set speed limits and started doing traffic control, but that wasn’t enough. They put up stop signs and traffic lights, painted crosswalks and designated no-parking zones. Authorities set up rules of the road and required people to pass safety tests and get licenses to drive. Entire government agencies are dedicated to automobile safety.
In Pozner’s view, a similar evolution needs to take place when it comes to the internet.
“I don’t think this is going away, this hate,” he said. “I think it’s only going to get worse until the government steps in.”
Contributing: Rory Linnane
If you are being harassed or threatened online, contact the HONR network at https://honrnetwork.org/report-online-abuse.