Reflections on a mentored group peer review process

Harpreet Auby, Lorena S. Grundy, Sandra Huffman, Kaylla Cantilina, Samuel B. Gavitte, Sarah E. Kaczynski, Melissa Penyai, Milo D. Koretsky

Journal of Engineering Education, 113(4), 1110–1114. https://doi.org/10.1002/jee.20616
Peer review, an established part of research practice, is intended to ensure quality and engender trust among researchers. Reviewers with appropriate expertise evaluate a manuscript to ensure scholarly practices and engage in productive dialogue with the authors. However, significant concerns with peer review (Ware, 2008) have sparked proposals to improve the process. Biases can manifest through prestige, nationality, gender, and content (Lee et al., 2013). Because the peer review process is primarily maintained by unpaid labor, it burdens overworked scholars, leading to fewer and fewer willing peer reviewers and, at times, rushed reviews (Dance, 2023; Flaherty, 2022). Delays in peer review are an issue for junior faculty seeking promotion and graduate students entering the job market (Dance, 2023). Rushed peer reviews have been shown to miss errors, leading to erroneous publications that erode trust in the scientific community (Campbell, 2024; Lowe, 2010).

Previous JEE editorials have addressed the challenges of establishing a new peer review culture in STEM (Benson, 2019; Knight & Main, 2024) and the positive influences of the JEE Mentored Review Program on the identity of mentees (Jensen et al., 2021). Other solutions have been proposed to promote more equitable and efficient peer review. Some claim that crowdsourcing peer reviews using discipline-specific online forums has been fast and effective (List, 2017); however, this approach has yet to be widely explored. Artificial intelligence has also been used to aid the review process, potentially improving the quality of reviews and addressing the lack of reviewers; however, this remedy raises substantial concerns about bias, reliability, and appropriate use of data (Hosseini & Horbach, 2023).

Here, we provide reflections from a mentored peer review process within a single engineering education research group. We assert that engaging in this process offers a shared learning experience where emerging scholars can learn about an essential research practice. Furthermore, it has the potential to grow the number of qualified reviewers, improve paper quality, and increase reviewers' academic reading and writing confidence, all while providing quality feedback in a specific review.

This mentored review process can be viewed through a community of practice lens (Wenger, 1998), in which novices (e.g., graduate students new to manuscript writing and reviewing) engage as legitimate peripheral participants (Lave & Wenger, 1991) in authentic practice, interacting with more central participants and learning the sociotechnical practices of the community. The designers of the JEE Mentored Peer Review Program take this approach, giving junior faculty members experience by matching them with experienced mentors.
During these discussions, the faculty mentor was also present to answer questions about the review process and/or to raise other ideas beyond the review of the specific paper. After the group meeting, the facilitator drafted the review. After incorporating feedback from the other participants and passing a quality check by the faculty advisor, the facilitator submitted the review.

Through productive dialogue with experienced reviewers, novices felt that their understanding of the review process and their confidence in critical reading and academic writing grew. In particular, some novices benefited from the group's affirmation when they raised points of confusion in the manuscript they were reviewing; novices who were initially hesitant about the legitimacy of their critiques gained confidence in their ability to identify potential gaps or flaws in an argument. Experienced reviewers gained a structured opportunity to practice their skills both in reviewing journal manuscripts and in mentoring novice reviewers, and they felt this setting could inform future mentoring interactions. Both novice and experienced reviewers appreciated hearing insights on the manuscript from other perspectives. This not only exposed them to different viewpoints but also surfaced the perspectives peer reviewers may take when evaluating a paper, which can inform group members' future writing.

The mentored group peer review process supported the faculty mentor's goal of developing mentees' skills related to peer review and the publication process. In addition, the mentor felt that participating in this process reinforced a shared group culture that values sensemaking, respect, and collaboration. Finally, because of the flexibility of the process, mentees' individual strengths and experiences were highlighted. For the journal, this process offers an opportunity to invest in developing qualified peer reviewers. Moreover, participants in this process are likely to become better writers and thus submit higher-quality manuscripts to journals in the future, as peer review is thought to foster improved writing skills (Finkenstaedt-Quinn et al.).

Participating in this activity gives both novice and experienced reviewers opportunities to develop skills essential to becoming successful scholars. We believe that, through engagement in authentic practice, this process offers value across disciplines as a professional development tool for graduate students and postdoctoral researchers. In the following section, we present our recommendations for research groups interested in conducting group peer review.
{"title":"Reflections on a mentored group peer review process","authors":"Harpreet Auby, Lorena S. Grundy, Sandra Huffman, Kaylla Cantilina, Samuel B. Gavitte, Sarah E. Kaczynski, Melissa Penyai, Milo D. Koretsky","doi":"10.1002/jee.20616","DOIUrl":"https://doi.org/10.1002/jee.20616","url":null,"abstract":"<p>Peer review, an established part of research practice, is intended to ensure quality and engender trust among researchers. Reviewers with appropriate expertise evaluate a manuscript to ensure scholarly practices and engage in productive dialogue with the authors. However, significant concerns with peer review (Ware, <span>2008</span>) have sparked proposals to improve the process. Biases can manifest through prestige, nationality, gender, and content (Lee et al., <span>2013</span>). Because the peer review process is primarily maintained by unpaid labor, it burdens overworked scholars, leading to fewer and fewer willing peer reviewers and, at times, rushed reviews (Dance, <span>2023</span>; Flaherty, <span>2022</span>). Delays in peer review are an issue for junior faculty seeking promotion and graduate students entering the job market (Dance, <span>2023</span>). Rushed peer reviews have been shown to miss errors, leading to erroneous publications that erode trust in the scientific community (Campbell, <span>2024</span>; Lowe, <span>2010</span>).</p><p>Previous <i>JEE</i> editorials have addressed the challenges of establishing a new peer review culture in STEM (Benson, <span>2019</span>; Knight & Main, <span>2024</span>) and the positive influences of the <i>JEE</i> Mentored Review Program on the identity of mentees (Jensen et al., <span>2021</span>). Other solutions have been proposed to promote more equitable and efficient peer review. Some claim that crowdsourcing peer reviews using discipline-specific online forums has been fast and effective (List, <span>2017</span>); however, this approach has yet to be widely explored. Artificial intelligence has also been used to aid the review process by potentially improving the quality of reviews and addressing the lack of reviewers; however, this remedy raises substantial concerns about bias, reliability, and appropriate use of data (Hosseini & Horbach, <span>2023</span>).</p><p>Here, we provide reflections from a mentored peer review process within a single engineering education research group. We assert that engaging in this process offers a shared learning experience where emerging scholars can learn about an essential research practice. Furthermore, it has the potential to grow the number of qualified reviewers, improve paper quality, and increase reviewers' academic reading and writing confidence—all while providing quality feedback in a specific review.</p><p>This mentored review process can be viewed through a community of practice lens (Wenger, <span>1998</span>) where novices (e.g., graduate students new to manuscript writing and reviewing) engage as legitimate peripheral participants (Lave & Wenger, <span>1991</span>) in authentic practice to interact with more central participants and learn the sociotechnical practices of the community. 
The designers of the <i>JEE</i> Mentored Peer Review Program take this approach, giving junior faculty members experience by matching them w","PeriodicalId":50206,"journal":{"name":"Journal of Engineering Education","volume":"113 4","pages":"1110-1114"},"PeriodicalIF":3.9,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/jee.20616","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142540940","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}