Objective: The purpose of this study was to compare pharmacy students' ability to correctly answer drug information questions using Micromedex with Watson, Micromedex without Watson, or Google.
Methods: This multicenter randomized trial compared pharmacy student responses to drug information questions using Micromedex with Watson, Micromedex without Watson, or Google from January to March of 2020. First- to fourth-year pharmacy students at two institutions were included. The primary outcome was the number of correct answers. Secondary outcomes were the time taken to answer the questions and differences in number of correct answers by pharmacy student year and institution.
Results: The analysis included 162 participants: 52 students in the Micromedex group, 51 students in the Watson group, and 59 students in the Google group. There was a significant difference among groups in the total number of questions answered correctly (p=0.02). Post-hoc analysis revealed that participants in the Micromedex group answered more questions correctly than those in the Google group (p=0.015). There were no significant differences between Micromedex and Watson groups (p=0.52) or between Watson and Google groups (p=0.22). There was also no difference in time to complete the questions among groups (p=0.72).
Conclusion: Utilizing Google did not save students time and led to more incorrect answers. These findings suggest that health care educators and health sciences librarians should further reinforce training on the appropriate use of drug information resources.
Background: One-shot library sessions have numerous drawbacks; most notably, they rarely have a long-term impact on students' research behavior or skill sets. Library literature notes that when students interact with an embedded librarian, their skills improve. While close partnerships with subject faculty are important, librarians must also assess students' skill sets to determine the impact of these teaching efforts.
Case presentation: During the course, the embedded librarian used various activities and assignments to teach information-seeking skills, with the expected outcome of increased skill sets. This IRB-approved research project focused on measuring and assessing students' information-seeking abilities before and after interacting with the embedded nursing librarian. Changes in students' information fluency skills were measured using pre- and post-tests.
Conclusions: The study results provide evidence of the benefits of the embedded librarianship model. Continued measurement of students' skills acquisition is important to enable librarians and library administrators to show the positive impacts the library has on student learning and success.
Background: While writing a scoping review, we needed to update our search strategy. We wanted to capture articles generated by our additional search terms and articles published since our original search. Simultaneously, we strove to optimize project resources by not rescreening articles that had been captured in our original results.
Case presentation: In response, we created Open Update Re-run Deduplicate (OUR2D2), a computer application that allows the user to compare search results from a variety of library databases. OUR2D2 supports extensible markup language (XML) files from EndNote and comma-separated values (CSV) files, using article titles for comparisons. We conducted unit tests to ensure appropriate functionality as well as accurate data extraction and analysis. We tested OUR2D2 by comparing original and updated search results from PubMed, Embase, Clarivate Web of Science, CINAHL, Scopus, ProQuest Dissertations and Theses, and Lens, and estimated that this application saved twenty-one hours of work during the screening process.
Conclusions: OUR2D2 could be useful for individuals seeking to update literature review strategies across fields without rescreening articles from previous searches. Because the OUR2D2 source code is freely available with a permissive license, we recommend this application for researchers conducting literature reviews who need to update their search results over time, want a powerful and flexible analysis framework, and may not have access to paid subscription tools.
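The core operation described above, comparing two sets of search results by article title so that records already screened in the original search are excluded from the update, can be illustrated in a few lines. The sketch below is not OUR2D2's actual code; the CSV column name (`Title`) and the normalization rules are assumptions chosen for the example.

```python
import csv

def load_titles(csv_path, title_field="Title"):
    """Read article titles from a database-export CSV.
    The column name 'Title' is an assumption; exports vary by database."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return [row[title_field] for row in csv.DictReader(f)]

def normalize(title):
    """Lowercase and drop punctuation so near-identical titles match."""
    kept = "".join(ch for ch in title.lower() if ch.isalnum() or ch.isspace())
    return " ".join(kept.split())

def new_results(original_titles, updated_titles):
    """Return titles from the updated search that were not in the original,
    i.e., the only records that still need screening."""
    seen = {normalize(t) for t in original_titles}
    return [t for t in updated_titles if normalize(t) not in seen]
```

Title-only matching is deliberately conservative: it avoids rescreening exact duplicates while leaving genuinely new records (including retitled versions of old ones) in the screening queue.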
Background: Despite a strong research presence in Lancashire Teaching Hospitals National Health Service (NHS) Foundation Trust (LTHTR), allied health professionals from the organization are underrepresented in developing and publicizing research that is inspired by day-to-day clinical practice and staff experiences. Two LTHTR departments, Library and Knowledge Services (LKS) and Research and Innovation (R&I), came together to enable a group of staff to develop the knowledge and skills that they needed to access information and create new "home grown" research.
Case presentation: A clinical librarian and an academic research nurse created a research engagement program in the diagnostic radiography department at LTHTR, which included the development, delivery, and evaluation of 6 workshops. Sixteen individuals took part in these workshops, and data were collected on library usage, self-efficacy in information literacy, and research output before and after their delivery. Library membership increased by 50% among diagnostic radiography staff, literature search requests from this department increased by 133%, and all participants who attended at least 1 workshop reported an increased Information Literacy Self Efficacy Scale (ILSES) score. An increase in research activity and outputs was also attributed to the program.
Conclusions: This project has resulted in a set of freely available workshop plans and support resources that can be customized for other health care professionals and has won several awards for its innovative use of departmental collaboration. Through the evaluation of the program from workshop attendees and non-attenders, we have identified impacts, outputs, and barriers to engagement in order to continue to deliver this content to other departments and embed a home grown research culture at LTHTR.
For its fifteenth anniversary, the Jay Sexter Library at Touro University Nevada (TUN) sought ways to capture its institutional history by founding an archive. Among many challenges, the library struggled to convince the administration of the importance of an archive. To generate interest in TUN's history, a task force comprising library, executive administration, and advancement staff hosted and recorded a panel event with some of the university's original faculty, staff, and administration. By having this event, new TUN employees were able to experience the shared knowledge of TUN's early days, and the library was able to create and preserve its own institutional history.
Objective: Reproducibility of systematic reviews (SRs) can be hindered by the presence of citation bias. Citation bias may occur when authors of SRs hand-search the reference lists of included studies to identify additional studies. Such a practice may lead to exaggerated SR summary effects. The purpose of this paper is to examine the prevalence of hand-searching reference lists in otolaryngology SRs.
Methods: The authors searched for systematic reviews published in eight clinical otolaryngology journals using the Cochrane Library and PubMed, with the date parameter of January 1, 2008, to December 31, 2017. Two independent authors worked separately to extract data from each SR for the following elements: whether reference lists were hand-searched, other kinds of supplemental searching, PRISMA adherence, and funding source. Following extraction, the investigators met to review discrepancies and achieve consensus.
Results: A total of 539 systematic reviews, 502 from clinical journals and 37 from the Cochrane Library, were identified. Of those SRs, 72.4% (390/539) hand-searched reference lists, including 97.3% (36/37) of Cochrane reviews. For 228 (58.5%) of the SRs that hand-searched reference lists, no other supplemental search (e.g., a search of trial registries) was conducted.
Conclusions: These findings indicate that hand-searching reference lists is a common practice in otolaryngology SRs. Moreover, a majority of studies at risk of citation bias did not attempt to mitigate the bias by conducting additional supplemental searches. The implication is that summary effects in otolaryngology systematic reviews may be biased toward statistically significant findings.
Objective: There are concerns about nonscientific and/or unclear information on coronavirus disease 2019 (COVID-19) that is available on the Internet. Furthermore, people's ability to understand health information varies and depends on their skills in reading and interpreting information. This study aims to evaluate the readability and credibility of websites with COVID-19-related information.
Methods: The search terms "coronavirus," "COVID," and "COVID-19" were input into Google. The websites of the first thirty results for each search term were evaluated for credibility using the Health On the Net Foundation code of conduct (HONcode) and for readability using the Flesch-Kincaid Grade Level (FKGL), Simple Measure of Gobbledygook (SMOG), Gunning Fog, and Flesch Reading Ease (FRE) scales.
Results: The readability of COVID-19-related health information on websites was suitable for high school graduates or college students and, thus, was far above the recommended readability level. Most websites that were examined (87.2%) had not been officially certified by HONcode. There was no significant difference in the readability scores of websites with and without HONcode certification.
Conclusion: These results suggest that organizations should improve the readability of their websites and provide information that more people can understand. This could lead to greater health literacy, less health anxiety, and the provision of better preventive information about the disease.
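The readability scales named in the methods are defined by published formulas over sentence, word, and syllable counts. As an illustration, the sketch below computes two of them, the Flesch-Kincaid Grade Level and Flesch Reading Ease, using their standard formulas; the syllable counter is a crude vowel-group heuristic, so scores will differ somewhat from validated readability tools.

```python
import re

def count_syllables(word):
    """Naive heuristic: count vowel groups, dropping a trailing silent 'e'.
    Real tools use pronunciation dictionaries."""
    n = len(re.findall(r"[aeiouy]+", word.lower()))
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def readability(text):
    """Return (FKGL, FRE) scores for a passage of English text."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences   # average words per sentence
    spw = syllables / len(words)   # average syllables per word
    fkgl = 0.39 * wps + 11.8 * spw - 15.59
    fre = 206.835 - 1.015 * wps - 84.6 * spw
    return fkgl, fre
```

Both formulas reward short sentences and short words: FKGL maps a passage to a US school grade level, while FRE runs the opposite direction, with higher scores indicating easier text.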