{"title":"Measuring Toxicity Toward Women in Game-Based Communities","authors":"Matthew Belskie, Hanlin Zhang, B. Hemminger","doi":"10.1123/jege.2022-0035","DOIUrl":null,"url":null,"abstract":"Prior research into gaming toxicity in game-specific Reddit communities nearly always considers toxicity in aggregate, and so provides very few clues for a valid coding scheme for isolating toxic language and triggers that specifically target women gamers. Existing research offers a starting place for devising valid methods for measuring and detecting toxic language and toxic triggers within specified data sets, but that research is less useful is its applicability to game-related forms of toxicity targeting women gamers. Where this research had originally hoped to develop an automated method for scoring, limitations with automated detection of toxicity discussed within the paper prompted a shift to what the authors identify as a key intermediate step—better accuracy in toxicity detection by automated means—that will contribute to future achievements in reducing toxicity toward women and other targeted groups in gaming communities. This paper is intended to aid projects that aim to incrementally improve our understanding of toxicity toward women in games and game communities and how to effectively measure it. The conclusion of this research ultimately hopes to contribute to providing information to inform policies that create a safer and more respectful gaming environment for all gamers.","PeriodicalId":266441,"journal":{"name":"Journal of Electronic Gaming and Esports","volume":"83 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Electronic Gaming and Esports","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1123/jege.2022-0035","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Prior research into gaming toxicity in game-specific Reddit communities nearly always considers toxicity in aggregate, and so provides few clues toward a valid coding scheme for isolating toxic language and triggers that specifically target women gamers. Existing research offers a starting place for devising valid methods of measuring and detecting toxic language and toxic triggers within specified data sets, but it is less useful in its applicability to game-related forms of toxicity targeting women gamers. Although this research originally hoped to develop an automated scoring method, the limitations of automated toxicity detection discussed in the paper prompted a shift to what the authors identify as a key intermediate step: improving the accuracy of automated toxicity detection, which will contribute to future progress in reducing toxicity toward women and other targeted groups in gaming communities. This paper is intended to aid projects that incrementally improve our understanding of toxicity toward women in games and game communities and of how to measure it effectively. Ultimately, this research hopes to provide information that informs policies creating a safer and more respectful gaming environment for all gamers.
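As a rough illustration of the kind of automated toxicity scoring the abstract refers to, the sketch below applies a general-purpose classifier to sample comments. This is a minimal sketch only: the Detoxify library, the example comments, and the 0.5 threshold are assumptions chosen for illustration, not the authors' coding scheme or scoring method, and general-purpose classifiers of this sort are precisely where the paper notes accuracy limitations for gendered, game-specific toxicity.

```python
# Minimal sketch of automated toxicity scoring over gaming-forum comments.
# Assumes the open-source Detoxify library (pip install detoxify); the
# comments and the threshold are illustrative, not from the paper.
from detoxify import Detoxify

comments = [
    "gg everyone, that was a fun match",
    "go back to the kitchen, girls can't play this game",
]

model = Detoxify("original")       # pretrained multi-label toxicity classifier
scores = model.predict(comments)   # dict: label -> list of scores, one per comment

THRESHOLD = 0.5                    # illustrative cut-off, not a validated value
for i, comment in enumerate(comments):
    flagged = {label: round(vals[i], 3)
               for label, vals in scores.items()
               if vals[i] >= THRESHOLD}
    print(f"{comment!r} -> {flagged or 'below threshold on all labels'}")
```

A pipeline like this scores each comment on labels such as toxicity, insult, and identity attack, but, as the abstract argues, such aggregate scores give little purchase on toxicity that specifically targets women gamers, which is why the authors treat improved detection accuracy as an intermediate step rather than a solved problem.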