People have been buzzing about the Online Harassment Summit during South by Southwest Interactive since it was announced back in October. The day-long series of discussions on Saturday was put together after a panel on harassment and online gaming scheduled for Interactive was canceled due to threats of violence. The ensuing controversy prompted SXSW organizers to schedule a more formal discussion of online harassment, and everyone remained on guard. We were reminded of the event’s code of conduct at every corner, and we experienced more thorough bag checks at the summit entrance than at most TSA security checkpoints.
But for all the talk of the Harassment Summit in the months leading up to SXSW, the event was sparsely attended, and the panelists were largely preaching to the choir. People who deal in harassment and threats on the Internet are not the audience for an event like this, and representatives from the networks where that harassment happens weren’t in attendance, either. The participating panelists brought thought-provoking ideas and the conversations were lively and informative, but at the end of the day, we had more questions than answers. Was SXSW the right festival for this summit? Was the decision to put the summit across the river from the Austin Convention Center a good one? Why didn’t companies like Twitter and Reddit send any representatives to take part in the discussions?
Here were our biggest takeaways from the SXSW Online Harassment Summit, which ended on a somewhat hopeful note despite the lack of resolution.
It’s the platforms, not the people
You can’t force people to be nice, especially not on the Internet. Not even Facebook, which requires people to use their real names, is immune to problems of harassment and bullying. Panelists agreed that it’s not people who will change, but the platforms they use.
“I think one of the most toxic things tech companies have done is seeing their users as customers,” said CUNY Graduate Center sociologist Katherine Cross, who spoke on a panel about harassment and online gaming. “Whether they like it or not, they were creating communities. By not having a community development level focus, they’ve allowed the most toxic tendencies to flourish.”
Social media companies intend for their communities to foster positive interactions, but many of the panelists agreed that these good intentions are just not cutting it anymore, especially on platforms like Reddit and the location-based Yik Yak, which allow users a degree of anonymity.
“I can’t say this clearly enough—Reddit is failing women in every marginalized community spectacularly,” said Giant Spacekat cofounder and game developer Brianna Wu, who spoke on a summit panel about online harassment against women. Reddit largely stays out of monitoring content, encouraging users to monitor themselves and each other instead. This has left some users feeling threatened and unsafe, with little or no support from platform leaders.
Like Reddit, Yik Yak’s policy is to take a step back from playing content police, shifting responsibility to their user base. “Communities don’t always know they have this power in their hands to take care of these things,” said Yik Yak cofounder Tyler Droll during a SXSW panel held separately from the Harassment Summit. “A part of that may be on us, but a part of that may be on the community too, of not knowing they can take care of this quickly, of standing by idly and letting that happen.”
But despite the constant name-dropping of Twitter, Reddit, and other platforms where harassment regularly occurs, those companies weren’t part of the conversation—not even Yik Yak, which didn’t participate in the summit even though both of its founders were in Austin that weekend. The panels were loaded with prominent academics, activists, policy makers, and others experienced in dealing with or studying online harassment, but only Facebook sent a representative.
What’s the solution?
While the people from the companies who can actually make changes weren’t around to hear suggestions, panelists floated a few ideas to curb online harassment.
Online Abuse Prevention Center founder Randi Lee Harper said during the gaming and harassment panel that she’s currently working with tech companies, who have to be convinced that harassment is a “quality of content problem” that will send users fleeing and affect their bottom line. Harper said she’s under non-disclosure agreements and couldn’t talk much about the improvements being made, but hinted that the companies she’s working with understand that harassment is a huge problem. She also pointed to the work of Civil Comments, which uses crowdsourced comment moderation, as a model for other platforms.
Platforms need more moderation, panelists suggested.
“Pay community managers well,” Cross said. “Hire lots of them.”
They need more granular privacy settings, too.
Caroline Sinders, an interaction designer for IBM Watson, had a few ideas for Twitter: “What if in moments of harassment you could turn off the comments? What if there was a way to flag a tweet so it couldn’t be embedded?”
Brianna Wu praised Twitter for making strides in the past year to crack down on harassment, most recently by launching Twitter’s Trust & Safety Council to help the company “strike the right balance between fighting abuse and speaking truth to power.”
Shireen Mitchell, founder of educational nonprofit Digital Sisters, which is geared towards helping women and children from underserved communities, suggested that including a broader spectrum of people on these harassment councils will open up the conversation. “We don’t see much intersection of gender and race at these tables, so we’re not hearing an equal representation of voices,” she said.
For Dr. Mary Anne Franks, a professor at the University of Miami School of Law who advocates against revenge porn, it’s not just about seeing reform at the social level, but at a wider legislative level, too. “We need to be able to say, ‘This isn’t freedom of speech,’” she said during the harassment against women panel. “We need to be able to say, ‘This is abuse, this is harassment… this is affecting my mental health, my lifestyle, and my career prospects.’”
One such proposal made news: Representative Katherine Clark, a Democrat from Massachusetts, announced the Cybercrime Enforcement Training Assistance Act during one of the panels. The legislation would establish a $20 million federal grant to train state and local law enforcement agencies on how to handle cybercrime investigations.