
Can online gaming ditch its sexist ways?

(MENAFN - The Conversation) A huge online community has developed around the increasingly diverse world of video games. Online streaming systems like Twitch let people watch others play video games in real time, attracting large crowds of viewers. And women are increasingly finding success as gamers and as Twitch streamers.

Computational social scientists like me – the band of scholars working at the intersection of computing and social science – are attracted to online communities like Twitch because they help us study those social groups and society at large. Of particular interest are the community norms that develop. An internet-wide example is the shared understanding that typing in all caps reads as shouting. Individual communities also develop their own written and unwritten norms.

Like other online communities, Twitch has rules of conduct that its members are expected to follow. The site's managers recently suspended two streamers after both streamed gender-biased material that violated Twitch's rules. My own research, conducted with collaborators, has examined how women are treated in Twitch chats – including whether they are treated similarly to men or commonly identified as different, and even subjected to sexual objectification.

The paradox of online communities

Online, cultural norms present a paradox: The communities they apply to are online, open groups that anyone with an internet connection can join just by creating an account, which is usually free. But it takes time, effort and, especially, acceptance to become a true member. We sought to find out whether these norms involve gender stereotypes and sexism, excluding and mistreating women and girls.

In our research, we focused on Twitch chats, where viewers can comment on a broadcast while watching a video stream. Viewers can chat among themselves and interact with the streamer. We wanted to see whether chat language involved more objectification when the streamer was a woman.

A screenshot of the Twitch chat and video stream.

We analyzed several months' worth of chat transcripts using several data science tools, including natural language processing techniques – noting how often words were used and which words tended to appear together. We noticed major distinctions between the language commenters used on the top 100 most popular women-operated streams and on streams of similar popularity operated by men.
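The kind of analysis described above can be sketched in a few lines. This is a simplified illustration, not the study's actual pipeline: it assumes plain-text chat logs (one message per line), and the term lists are examples drawn from the article rather than the researchers' real lexicons.

```python
from collections import Counter
import re

# Illustrative term sets only; the study's actual word lists are not public here.
GAME_JARGON = {"points", "winner", "star"}
OBJECTIFYING = {"cute", "fat", "boobs"}

def word_counts(messages):
    """Count lowercase word frequencies across a list of chat messages."""
    counts = Counter()
    for msg in messages:
        counts.update(re.findall(r"[a-z']+", msg.lower()))
    return counts

def category_rate(counts, terms):
    """Fraction of all word occurrences that fall in a given term set."""
    total = sum(counts.values())
    return sum(counts[t] for t in terms) / total if total else 0.0

# Toy chat log standing in for months of real transcripts.
chat = ["gg winner!", "so cute lol", "10 points, nice star"]
counts = word_counts(chat)
print(category_rate(counts, GAME_JARGON))
print(category_rate(counts, OBJECTIFYING))
```

Comparing these rates between chats on women-operated and men-operated streams is one simple way to quantify the shift in tone the study describes.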

When watching a man stream, viewers typically talk about the game and try to engage with the streamer; game jargon (words like 'points,' 'winner' and 'star') and user nicknames are among the most important terms. But when watching a woman stream, the tone changes: Game jargon drops, and objectification language increases (words like 'cute,' 'fat' and 'boobs'). The difference is particularly striking when the streamer is popular, and less so when looking at comments on less-popular streamers' activity.

Most characteristic terms in the Twitch chat by gender and popularity of the streamer. Larger fonts denote words that are more distinctive of each group of streamers. Objectifying terms are represented in red.

Toward more open online communities

The trends our research identified suggest that objectification and harassment may be a problem for female streamers who want to become part of the online gaming community. Site owners and managers must face the fact that anyone can join the community: As online gaming becomes a more mainstream activity, some of its social norms – especially those related to gender stereotyping – are being called into question.

Other online communities are also trying to open up to a broader set of participants, but for different reasons. For example, Wikipedia is struggling to attract newcomers and transform them into active contributors, in part because of a hostile social environment that includes harassment of women.

Big communities like Wikipedia and Twitch – with millions of active users – are looking for ways technology can help inform these social changes. Twitch recently released a tool called AutoMod that watches for specific keywords and lets streamers identify and filter out their viewers' trolling, objectifying language and other forms of abuse. Similarly, Wikipedia has been developing automated tools to detect instances of harassment and incivility in contributors' discussions, marking them for humans to review for potential disciplinary action.
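At its simplest, a keyword-based filter of the kind described works by matching messages against a blocklist and holding flagged ones for review. The sketch below is a hypothetical illustration under that assumption – not Twitch's actual implementation – and the blocked terms are examples from the article.

```python
import re

def make_filter(blocked_terms):
    """Return a predicate that flags messages containing any blocked term
    as a whole word, case-insensitively."""
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, blocked_terms)) + r")\b",
        re.IGNORECASE,
    )
    return lambda message: bool(pattern.search(message))

# Hypothetical blocklist; real tools let streamers tune their own lists.
is_flagged = make_filter(["boobs", "fat"])

print(is_flagged("nice play!"))   # passes through unflagged
print(is_flagged("lol so FAT"))   # held for moderator review
```

Real systems go well beyond this – handling misspellings, leetspeak and context – which is exactly why, as noted below, simple filters tend to lag behind evolving abuse tactics.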

While these tools look promising, it's not yet clear how they will affect their communities' overall well-being. Automated filters carry a risk of bias that could unfairly target certain groups or classes of users. And abuse detection systems are often locked in an arms race with ever more sophisticated manipulation and abuse techniques.

Moreover, detecting abuse is only part of the issue. To improve openness and engage new users, these sites also need to foster positive interactions between newcomers and veteran community members. One example is Wikipedia's Teahouse, which emphasizes the word 'friendly' in its description of itself as a place for people to learn about 'Wikipedia culture.'

The organizations that operate Twitch and Wikipedia are just beginning to wrestle with the results of the online-community paradox. As new users join other sites, those communities' owners will also need to examine their cultural norms to drive out toxic standards that effectively silence entire groups.

