Curbing Online Abuse Isn't Impossible. Here's Where We Start
From Wired:
http://www.wired.com/2014/05/fighting-online-harassment/
"Fucking dumb bitch," the message began, then went on to detail the manner in which Jenny Haniver should be sexually assaulted and murdered. Haniver (her gaming name, not her real one) found it in the voicemail of her Xbox account, left by a male competitor in the online combat game Call of Duty: Modern Warfare 3. For Haniver, this was far from an isolated incident. In another match, after an opponent asked if she was menstruating and opined that girls played videogames only for attention, he left Haniver a voicemail that said, "I'm gonna impregnate you with triplets and then make you have a very late-term abortion." For three and a half years, Haniver has kept track of the invective heaped on her in multiplayer games, posting some 200 incidents on her blog so far.
Haniver, of course, is not alone; harassment on the Internet is ubiquitous, particularly for women. In a 2013 Pew Research survey, 23 percent of people ages 18 to 29 reported being stalked or harassed online; advocacy groups report that around 70 percent of the cases they deal with involve female victims, and one study of online gaming found players with female voices received three times as many negative responses as men.
Boasting more than 67 million active players each month, the battle-arena game League of Legends is perhaps the most popular videogame in the world. But two years ago its publisher, Riot Games, noticed that a significant number of players had quit the game and cited noxious behavior as the reason. In response, the company assembled a player behavior team, bringing together staff members with PhDs in psychology, cognitive science, and neuroscience to study the issue of harassment by building and analyzing behavioral profiles for tens of millions of users.
This process led them to a surprising insight, "one that shaped our entire approach to this problem," says Jeffrey Lin, Riot's lead designer of social systems, who spoke about the process at last year's Game Developers Conference. "If we remove all toxic players from the game, do we solve the player behavior problem? We don't." That is, if you think most online abuse is hurled by a small group of maladapted trolls, you're wrong. Riot found that persistently negative players were only responsible for roughly 13 percent of the game's bad behavior. The other 87 percent was coming from players whose presence, most of the time, seemed to be generally inoffensive or even positive. These gamers were lashing out only occasionally, in isolated incidents, but their outbursts often snowballed through the community. Banning the worst trolls wouldn't be enough to clean up League of Legends, Riot's player behavior team realized. Nothing less than community-wide reforms could succeed.
Some of the reforms Riot came up with were small but remarkably effective. Originally, for example, it was a default in the game that opposing teams could chat with each other during play, but this often spiraled into abusive taunting. So in one of its earliest experiments, Riot turned off that chat function but allowed players to turn it on if they wanted. The impact was immediate. A week before the change, players reported that more than 80 percent of chat between opponents was negative. But a week after switching the default, negative chat had decreased by more than 30 percent while positive chat increased nearly 35 percent. The takeaway? Creating a simple hurdle to abusive behavior makes it much less prevalent.
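The design change described above (flipping cross-team chat from on-by-default to off-by-default) can be sketched in a few lines. This is a hypothetical illustration of the opt-in pattern, not Riot's actual code; the class and function names here are invented for the example:

```python
# Hypothetical sketch of an opt-in default: cross-team chat starts
# disabled, and a player must actively choose to enable it.
from dataclasses import dataclass


@dataclass
class ChatSettings:
    # The key change: the default is False (opt-in), not True (opt-out).
    cross_team_chat_enabled: bool = False

    def enable_cross_team_chat(self) -> None:
        self.cross_team_chat_enabled = True


def can_message(sender: ChatSettings, recipient: ChatSettings) -> bool:
    # Cross-team messages flow only if BOTH players opted in, creating
    # the "simple hurdle" the article credits with reducing abuse.
    return sender.cross_team_chat_enabled and recipient.cross_team_chat_enabled


# By default no cross-team chat is delivered; willing players turn it on.
a, b = ChatSettings(), ChatSettings()
print(can_message(a, b))  # False
a.enable_cross_team_chat()
b.enable_cross_team_chat()
print(can_message(a, b))  # True
```

The point of the pattern is that nothing is forbidden: the same behavior remains available, but the small extra step of opting in is enough to filter out most casual abuse.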
The article is very interesting and it is encouraging to see a company take the issue seriously and assign a team of folk with good credentials to examine the problem and devise solutions. Perhaps some of what they are talking about could be applicable here?