{"title":"Altmetric.com or PlumX: Does it matter?","authors":"Behrooz Rasuli, Majid Nabavi","doi":"10.1002/leap.1631","DOIUrl":null,"url":null,"abstract":"<p>Altmetric.com and PlumX are two prominent tools for collecting alternative metrics data. This study has two main objectives: first, to evaluate how the choice between Altmetric.com and PlumX affects the results of alternative metrics analysis, and second, to investigate the social impact of ‘hot papers’ through the alternative metrics data provided by these platforms. We employed a descriptive and exploratory approach, gathering common alternative metrics from 4236 hot papers using both Altmetric.com and PlumX. The data collected included various alternative metrics such as policy mentions, Mendeley readers, Wikipedia mentions, blog mentions, Facebook mentions, and news mentions, in addition to citation counts from Scopus. We conducted descriptive statistics and inferential analyses to examine the relationships between citations and alternative metrics, as well as to compare the data obtained from both platforms. Our findings indicate that PlumX has broader coverage of hot papers compared to Altmetric.com. While the mean and individual values of alternative metrics differ between the two platforms, the median and geometric mean are similar. Both Altmetric.com and PlumX demonstrate that publications with higher citation counts tend to receive more online attention. Notably, all alternative metrics for <i>Immunology</i> and <i>Chemistry</i> showed statistically significant differences between the two platforms, whereas in <i>Mathematics</i>, alternative metrics (with the exception of Mendeley readers) did not exhibit significant differences. The findings suggest that researchers should be aware of potential variations in data captured by different alternative metrics platforms. 
Additionally, interpreting alternative metrics data requires caution, considering the research fields and the specific platform used.</p>","PeriodicalId":51636,"journal":{"name":"Learned Publishing","volume":"37 4","pages":""},"PeriodicalIF":2.2000,"publicationDate":"2024-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/leap.1631","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Learned Publishing","FirstCategoryId":"91","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/leap.1631","RegionNum":3,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"INFORMATION SCIENCE & LIBRARY SCIENCE","Score":null,"Total":0}
Citations: 0
Abstract
Altmetric.com and PlumX are two prominent tools for collecting alternative metrics data. This study has two main objectives: first, to evaluate how the choice between Altmetric.com and PlumX affects the results of alternative metrics analysis, and second, to investigate the social impact of ‘hot papers’ through the alternative metrics data provided by these platforms. We employed a descriptive and exploratory approach, gathering common alternative metrics for 4236 hot papers from both Altmetric.com and PlumX. The data collected included various alternative metrics such as policy mentions, Mendeley readers, Wikipedia mentions, blog mentions, Facebook mentions, and news mentions, in addition to citation counts from Scopus. We conducted descriptive and inferential analyses to examine the relationships between citations and alternative metrics, and to compare the data obtained from the two platforms. Our findings indicate that PlumX covers more hot papers than Altmetric.com. While the mean and individual values of alternative metrics differ between the two platforms, the median and geometric mean are similar. Both Altmetric.com and PlumX demonstrate that publications with higher citation counts tend to receive more online attention. Notably, all alternative metrics for Immunology and Chemistry showed statistically significant differences between the two platforms, whereas in Mathematics, alternative metrics (with the exception of Mendeley readers) did not exhibit significant differences. The findings suggest that researchers should be aware of potential variations in the data captured by different alternative metrics platforms. Additionally, interpreting alternative metrics data requires caution, considering the research field and the specific platform used.
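The finding that means differ across platforms while medians and geometric means agree is what one would expect for heavily skewed count data, where a few viral papers dominate the arithmetic mean. The sketch below illustrates this with purely hypothetical mention counts (not data from the study); the +1 offset before the geometric mean is a common workaround for zero counts, and the summary statistics are computed with Python's standard `statistics` module.

```python
from statistics import mean, median, geometric_mean

# Hypothetical mention counts for the same set of papers as reported by
# two platforms. These values are illustrative only, not study data.
platform_a = [0, 1, 1, 2, 3, 5, 8, 120]  # one viral outlier inflates the mean
platform_b = [0, 1, 2, 2, 3, 4, 9, 40]

def summarize(counts):
    """Return the three location statistics discussed in the abstract.

    Altmetric counts are highly skewed, so the arithmetic mean is
    dominated by outliers, while the median and geometric mean are
    robust. geometric_mean() rejects zeros, hence the +1/-1 shift.
    """
    return {
        "mean": mean(counts),
        "median": median(counts),
        "geo_mean": geometric_mean(c + 1 for c in counts) - 1,
    }

print(summarize(platform_a))  # mean pulled up by the outlier
print(summarize(platform_b))  # median and geo_mean stay close to platform_a's
```

Here the two platforms have identical medians and similar geometric means even though their arithmetic means diverge by more than a factor of two, mirroring the pattern the study reports.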