Perceived Quality of Online Encyclopedias

Corinna Raith, Stefan Koch
DOI: 10.4018/ijsmoc.2019010104
Journal: International Journal of Social Media and Online Communities
Publication date: 2019-01-01
Citations: 1

Abstract

This study illustrates how different user groups perceive and evaluate the content quality of Wikipedia articles compared to entries in a traditional encyclopedia. An experimental set-up was used with blinded articles from different topic fields, drawn from the German Wikipedia and Brockhaus online, and evaluated by experts with different backgrounds (university vs. practice) and by students of the field. The findings showed that the quality of both encyclopedias was assessed similarly (intra-group evaluations), although more faults and mistakes were criticized in the Wikipedia sample. However, the inter-group comparison revealed differences in the groups' quality perceptions. This applied in part to the comparison between the expert groups, and especially to the comparison between expert and (non-expert) student evaluations. Students tended to give better ratings, especially within the Wikipedia sample. Most notably, they did not detect any content-related faults in either set, highlighting that further training is needed to improve their information literacy.