Testing Assumptions—Does Enhancing Subject Terms Increase Use of Digital Library Content?
Todd Digby, Chelsea Dinsmore
DOI: https://doi.org/10.29242/lac.2018.49

In modern library systems, access to digital content depends heavily on effective metadata. The University of Florida (UF) Digital Collections (UFDC) are an actively growing, open-access digital library comprising over 500,000 records. As with any large-scale digital library project, a well-known challenge is the varying quality and quantity of legacy metadata available for each title. Inconsistent metadata makes digitized materials harder to find, and if users cannot find the content they are looking for, a great deal of human effort has been wasted and the investment in digital collections is not being realized. Subject terms are one of the most efficient access points for desired materials, and subject terms drawn from controlled vocabularies deliver the most consistent results. To date, applying and editing subject metadata has been a record-by-record, labor-intensive process, making retrospective projects cost-prohibitive. The UF team is investigating the capacity of research library staff to implement a machine-assisted indexing (MAI) system that automates the selection and application of subject terms, using a rule set combined with controlled vocabularies, across the metadata of a body of already digitized content. To execute the project, the Smathers Libraries team at UF is collaborating with consultants from Access Innovations (AI) to implement the system and mitigate the challenges described above.
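The rule-set-plus-controlled-vocabulary approach can be pictured as a set of trigger patterns that each map to a preferred subject term. The following is a minimal, hypothetical sketch of that idea in Python; the vocabulary, rule patterns, and record fields are invented for illustration and do not represent the UFDC metadata schema or the Access Innovations rule base.

```python
# Minimal illustration of rule-based machine-assisted indexing (MAI).
# The vocabulary, rules, and record structure are hypothetical examples,
# not the actual UFDC schema or the Access Innovations rule base.

import re

# Controlled vocabulary: preferred subject term -> trigger patterns.
# A production rule base would also encode exclusions, proximity, weights, etc.
RULES = {
    "Citrus fruit industry": [r"\bcitrus\b", r"\borange grove(s)?\b"],
    "Water quality":         [r"\bwater quality\b", r"\bpotable water\b"],
    "Hurricanes":            [r"\bhurricane(s)?\b", r"\btropical storm(s)?\b"],
}

def suggest_subjects(text, rules=RULES):
    """Return controlled-vocabulary terms whose rules match the text."""
    lowered = text.lower()
    return [term for term, patterns in rules.items()
            if any(re.search(p, lowered) for p in patterns)]

# Apply the rules to the free-text fields of an existing metadata record.
record = {
    "title": "Report on orange groves and water quality after the hurricane",
    "subjects": [],
}
record["subjects"] = suggest_subjects(record["title"])
print(record["subjects"])
# ['Citrus fruit industry', 'Water quality', 'Hurricanes']
```
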
{"title":"Testing Assumptions—Does Enhancing Subject Terms Increase Use of Digital Library Content?","authors":"Todd Digby, Chelsea Dinsmore","doi":"10.29242/lac.2018.49","DOIUrl":"https://doi.org/10.29242/lac.2018.49","url":null,"abstract":"In modern library systems, access to the digital content is heavily dependent on effective metadata. The University of Florida (UF) Digital Collections (UFDC) are an actively growing, open access, digital library comprising over 500,000 records. As with any large-scale digital library project, a well-known challenge is the varying quality and quantity of legacy metadata available for each title. Inconsistent metadata makes digitized materials harder to find. If users cannot find the content they are looking for, a great deal of human effort has been wasted and the investment in digital collections is not being realized. Subject terms can be one of the most efficient methods for accessing desired materials, and subject terms created from controlled vocabularies deliver the most consistent results. To date, applying and editing subject metadata has been a record-by-record, labor-intensive process, making the prospect of retrospective projects cost-prohibitive. The UF team is investigating the capacity of research library staff to implement a Machine Assisted Indexing (MAI) system to automate the process of selecting and applying subject terms, based on the use of a rule set combined with controlled vocabularies, to the metadata of a body of already digitized content. To execute the project, the Smathers Libraries team at UF is collaborating with Access Innovations (AI) consultants to implement a machine-assisted indexing system to mitigate the challenges discussed above.","PeriodicalId":193553,"journal":{"name":"Proceedings of the 2018 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment: December 5–7, 2018, Houston, TX","volume":"256 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117348812","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Developing Library Learning Outcomes: Reflecting on Instruction across the Library","authors":"Ashley McMullin, Jennifer Schwartz","doi":"10.29242/lac.2018.75","DOIUrl":"https://doi.org/10.29242/lac.2018.75","url":null,"abstract":"","PeriodicalId":193553,"journal":{"name":"Proceedings of the 2018 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment: December 5–7, 2018, Houston, TX","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127082155","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Comparing Apples to Oranges? Doing UX Work across Time and Space
A. Darby, Kineret Ben-Knaan
DOI: https://doi.org/10.29242/lac.2018.71

Introduction

Usability testing is, famously, an iterative process: you test something, you make changes based on the results, and you test again. The recent website redesign process for the University of Miami Libraries began with an extensive "Discovery and Content Analysis" phase, which became the foundation for over a year's worth of testing across the subsequent stages of the project. Each set of tests was designed to assess and/or improve the performance of specific features of the site and made use of a diverse set of methodologies and tools. While no static set of questions or tasks appeared in every test, tasks that performed well were removed from subsequent testing, while those that performed poorly carried over.
{"title":"Comparing Apples to Oranges? Doing UX Work across Time and Space","authors":"A. Darby, Kineret Ben-Knaan","doi":"10.29242/lac.2018.71","DOIUrl":"https://doi.org/10.29242/lac.2018.71","url":null,"abstract":"Introduction Usability testing is, famously, an iterative process. You test something, you make changes based upon the results, you test again. The recent website redesign process for the University of Miami Libraries began with an extensive “Discovery and Content Analysis” phase, which was the foundation for over a year’s worth of testing across the subsequent stages of the project. Each set of tests was designed to assess and/or improve the performance of specific features of the site and made use of a diverse set of methodologies and tools. While there was no static set of questions or tasks that appeared in all tests, those which performed well were removed from subsequent testing, while those which performed poorly continued on.","PeriodicalId":193553,"journal":{"name":"Proceedings of the 2018 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment: December 5–7, 2018, Houston, TX","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126584278","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Collection Assessment is Done…Now What?
K. Harker, Coby Condrey, L. Crawford
DOI: https://doi.org/10.29242/lac.2018.66

Purpose

About collection management

The methods used to manage collections have changed dramatically in the last half century, a phenomenon that has been well documented in the literature.1 This has been driven in part by the rapid growth of information resources,2 by the increased cost of acquiring these resources coupled with a shrinking share of institutional funding for libraries,3 and by the shift to digital formats, which has changed how these resources are made accessible to library patrons. Some in the field have gone so far as to suggest that collection management is undergoing a "paradigm shift."4 These changes have increased the need for information about the collections themselves, notably inputs (costs and needs), outputs (purchases/acquisitions, circulations, and uses), and outcomes (citations, student grades, and faculty grant successes).5
{"title":"The Collection Assessment is Done…Now What?","authors":"K. Harker, Coby Condrey, L. Crawford","doi":"10.29242/lac.2018.66","DOIUrl":"https://doi.org/10.29242/lac.2018.66","url":null,"abstract":"Purpose About collection management The methods used to manage collections have changed dramatically in the last half century, a phenomenon that has been well documented in the literature.1 This has been due in part to a reaction to the rapid growth of information resources,2 increased costs of acquiring these resources coupled with decreased share of institutional funding towards libraries,3 and the shift to digital formats, resulting in changes in methods of making these resources accessible to library patrons. Some in the field have gone so far as to suggest that collection management is undergoing a “paradigm shift.”4 These changes have increased the need for information about the collections themselves, notably inputs (costs and needs), outputs (purchases/acquisitions, circulations, and uses), and outcomes (citations, student grades, and faculty grant successes).5","PeriodicalId":193553,"journal":{"name":"Proceedings of the 2018 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment: December 5–7, 2018, Houston, TX","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129430115","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Knowing Our Users: Deriving Value from the Ithaka S+R Local Surveys at the University of Missouri
Jeannette E. Pierce, Shannon Cary, G. Gray, Caryn Scoville
DOI: https://doi.org/10.29242/lac.2018.9

LibQUAL+ Results Bring More Questions Than Answers
Kimberly Vardeman, Jingjing Wu
DOI: https://doi.org/10.29242/lac.2018.34

The Texas Tech University Libraries conducted the LibQUAL survey in 2017. After receiving the survey results, the libraries had many unanswered questions: What is the next step? What are the problem areas? Which problems should be addressed first? The website was identified as a topic that merited further study. The user experience (UX) department collaborated with the web librarian to outline projects to gather more evidence that would guide their action. They used a variety of research methods to assess the website: X/O tests to allocate valuable home page real estate to the services and features of most interest to users; card sorting to design a more understandable website navigation; usability testing to evaluate whether common tasks could be performed easily; heuristic evaluations of frequently used webpages to see whether they conformed to accepted usability principles; A/B tests to compare different design prototypes; and follow-up surveys to re-evaluate the modifications. By triangulating several data sources, they made informed decisions about how to improve the website. As an initial step, LibQUAL does not offer specific answers, but it suggests potential directions for further study. This paper describes ways to iteratively test the UX of a website using several complementary methods following an exploratory survey. These strategies extend the value of survey results, making assessments more effective and practical. The pattern can be applied not only to a website but to other services as well.

Introduction

In 2011, the Texas Tech University (TTU) Libraries conducted the LibQUAL survey. After receiving the results, the library dean commented, "LibQUAL results bring more questions than answers." At that time, the results were not well disseminated beyond administration, and limited action was taken in response to the survey. In 2017, under a different dean and with a newly formed user experience (UX) department, the TTU Libraries opted to conduct LibQUAL again. They used a census and received 3,631 valid surveys—a sizable increase over the 584 received in 2011, when they used a sampling method. Participants came from all the sub-groups defined by LibQUAL, and their subject areas covered all the disciplines the university offered. In addition, 1,433 participants shared comments and suggestions about the libraries' services and resources. The libraries wanted the LibQUAL findings to have a greater impact on services, resources, and spaces than they had in 2011; how to interpret the results, how to share them, and how to make improvements became a challenge for the libraries and the UX department.

Comment Coding and Data Revisualization

Reviewing best practices for interpreting LibQUAL results was a useful starting point, such as the report "Libraries Act on Their LibQUAL+ Findings: From Data to Action."1 A few presentations focused on the practical aspects of analysis, such as "Analysis and Interpretation of the LibQUAL R
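Among the methods listed above, A/B tests lend themselves to a straightforward quantitative check. The sketch below shows one conventional way to compare task-completion rates between two prototypes; the participant counts are invented for illustration and are not data from the TTU study.

```python
# Illustrative analysis of an A/B usability test: do two design prototypes
# differ in task success rate? The counts below are invented, not TTU data.

from scipy.stats import chi2_contingency

# Rows: prototype A, prototype B; columns: task completed, task failed.
observed = [
    [42, 8],   # prototype A: 42 of 50 participants completed the task
    [31, 19],  # prototype B: 31 of 50 participants completed the task
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
# A small p-value suggests the completion rates genuinely differ,
# supporting a decision to keep the better-performing prototype.
```
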
{"title":"LibQUAL+ Results Bring More Questions Than Answers","authors":"Kimberly Vardeman, Jingjing Wu","doi":"10.29242/lac.2018.34","DOIUrl":"https://doi.org/10.29242/lac.2018.34","url":null,"abstract":"The Texas Tech University Libraries conducted the LibQUAL survey in 2017. After receiving the survey results, the libraries had many unanswered questions—what is the next step? What are the problem areas? Which problems should be addressed first? The website was identified as a topic that merited further study. The user experience (UX) department collaborated with the web librarian to outline projects to gather more evidence that would guide their action. They used a variety of research methods to assess the website: X/O tests to allocate valuable home page real estate to the services and features of most interest to users; card sorting to design a more understandable website navigation; usability testing to evaluate whether common tasks could be performed easily; heuristic evaluations of frequently used webpages to see if they conformed to accepted usability principles; A/B tests to compare different design prototypes; and subsequent surveys to re-evaluate the modifications. By the triangulation of several data sources, they made informed decisions about how to improve the website. As an initial step, LibQUAL does not offer specific answers, but suggests potential directions for further study. This paper describes ways to iteratively test the UX of a website using several complementary methods following an exploratory survey. These strategies extend the value of survey results, making assessments more effective and practical. This pattern can be used not only for a website but for evaluating other services. Introduction In 2011, the Texas Tech University (TTU) Libraries conducted the LibQUAL survey. After receiving the results, the library dean made the comment, “LibQUAL results bring more questions than answers.” At that time, the results were not well disseminated beyond administration, and limited action was taken in response to the survey. In 2017, under a different dean and with a newly-formed user experience (UX) department, the TTU Libraries opted to conduct LibQUAL again. They used a census and received 3,631 valid surveys—a sizable increase over the 584 received in 2011 when they used a sampling method. Participants came from all the sub-groups defined by LibQUAL, and their subject areas covered all the disciplines the university offered. In addition, 1,433 participants shared comments and suggestions about the libraries’ services and resources. The libraries wanted the LibQUAL findings to have a greater impact on services, resources, and spaces than they had in 2011—how to interpret the results, how to share them, and how to make improvements became a challenge for the libraries and the UX department. 
Comment Coding and Data Revisualization Reviewing best practices for interpreting LibQUAL results was a useful starting point, such as the report, “Libraries Act on Their LibQUAL+ Findings: From Data to Action.”1 There were a few presentations that focused on the practical aspects of analysis: “Analysis and Interpretation of the LibQUAL R","PeriodicalId":193553,"journal":{"name":"Proceedings of the 2018 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment: December 5–7, 2018, Houston, TX","volume":"57 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122461533","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Taking AIM: Integrating Organization Development into the Creation of a Diversity, Equity, & Inclusion Audit
Kawanna Bright, Nikhat J. Ghouse
DOI: https://doi.org/10.29242/lac.2018.54

Introduction

Assessing diversity, equity, and inclusion (DEI) in US academic libraries is currently a non-standardized process. Often, efforts to assess DEI have focused on counting the number of librarians or staff of color working in the library, an artificial and limited measure that narrowly equates DEI with staffing. In 2012, the Association of College & Research Libraries (ACRL) published a list of "Diversity Standards" that expanded the idea of what DEI entailed in libraries, while also increasing the complexity of measuring DEI.1 ACRL's "Standards" suggested that, in addition to the diversity of the workforce, DEI competency in libraries also included areas such as delivery of services, organizational dynamics, the development of collections, professional development, and research.2 In a similar vein, the American Library Association (ALA) included DEI as one of eight "Key Action Areas" for the Association.3 While workforce diversity remained prominent within the goals and strategies of this strategic area, there was also recognition of the importance of DEI within LIS education, professional development, and research endeavors.4
{"title":"Taking AIM: Integrating Organization Development into the Creation of a Diversity, Equity, & Inclusion Audit","authors":"Kawanna Bright, Nikhat J. Ghouse","doi":"10.29242/lac.2018.54","DOIUrl":"https://doi.org/10.29242/lac.2018.54","url":null,"abstract":"Introduction Assessing diversity, equity, and inclusion (DEI) in US academic libraries is currently a non-standardized process. Often, efforts to assess DEI have focused on counting the number of librarians or staff of color working in the library, an artificial and limited measure that narrowly equates DEI with staffing. In 2012, the Association of College & Research Libraries (ACRL) published a list of “Diversity Standards” that expanded the idea of what DEI entailed in libraries, while also increasing the complexity of measuring DEI.1 ACRL’s “Standards” suggested that, in addition to the diversity of the workforce, DEI competency in libraries also included areas such as delivery of services, organizational dynamics, the development of collections, professional development, and research.2 In a similar vein, the American Library Association (ALA) included DEI as one of eight “Key Action Areas” for the Association.3 While workforce diversity remained prominent within the goals and strategies of this strategic area, there was also recognition of the importance of DEI within LIS education, professional development, and research endeavors.4","PeriodicalId":193553,"journal":{"name":"Proceedings of the 2018 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment: December 5–7, 2018, Houston, TX","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126848465","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
1G Needs Are Student Needs: A Mixed-Methods Approach to Understanding the Experiences of First-Generation College Students
Emily Daly, Joyce Chapman, Arianne Hartsell-Gundy, Brenda Yang
DOI: https://doi.org/10.29242/lac.2018.61

{"title":"Engaging Graduate Students in Research and Scholarly Life Cycle Practices: Localized Modeling of Scholarly Communication for Alignment with Strategic Initiatives","authors":"Anjum Najmi, Scott Lancaster","doi":"10.29242/lac.2018.46","DOIUrl":"https://doi.org/10.29242/lac.2018.46","url":null,"abstract":"","PeriodicalId":193553,"journal":{"name":"Proceedings of the 2018 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment: December 5–7, 2018, Houston, TX","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114283052","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Finding Hidden Treasures in the Data
C. Dennison, Jan Sung
DOI: https://doi.org/10.29242/lac.2018.3

Introduction

Librarians rely on statistics to capture who their users are and what resources and services they use. For an academic campus, the numbers of students, faculty, and researchers provide a sense of the potential users of library services and resources. But who actually uses the services and resources, and what do they use? Library staff can spend a lot of time tracking, consolidating, and analyzing the volume of materials that circulate, usage of e-resources, instructional sessions taught, questions asked at service points, and consultations in order to identify resources and services that need to be supported and continued. Those numbers can also help identify areas of user need that library staff could address.
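Much of that consolidation work is routine data wrangling. The sketch below illustrates one way it might be scripted with pandas; the service categories and monthly figures are hypothetical and not drawn from the authors' systems.

```python
# Hypothetical example of consolidating usage counts from several service
# points into one view; the categories and figures are invented for
# illustration and are not the authors' actual data.

import pandas as pd

# Each service point reports a simple monthly count (invented figures).
circulation = pd.DataFrame({"month": ["2018-01", "2018-02"], "count": [1520, 1390]})
eresources  = pd.DataFrame({"month": ["2018-01", "2018-02"], "count": [8240, 7985]})
reference   = pd.DataFrame({"month": ["2018-01", "2018-02"], "count": [310, 275]})

frames = {
    "circulation": circulation,
    "e-resources": eresources,
    "reference": reference,
}

# Combine into one long table (month, service, count), then pivot to a
# month-by-service summary to spot trends and gaps at a glance.
usage = pd.concat(
    [df.assign(service=name) for name, df in frames.items()],
    ignore_index=True,
)
summary = usage.pivot_table(index="month", columns="service",
                            values="count", aggfunc="sum")
print(summary)
```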
{"title":"Finding Hidden Treasures in the Data","authors":"C. Dennison, Jan Sung","doi":"10.29242/lac.2018.3","DOIUrl":"https://doi.org/10.29242/lac.2018.3","url":null,"abstract":"Introduction Librarians rely on statistics to capture who their users are and what resources and services they use. For an academic campus, the number of students, faculty, and researchers provide a sense of potential users of library services and resources. But who uses the services and resources and what do they use? Library staff can spend a lot of time tracking, consolidating, and analyzing the amount of materials that circulate, usage of e-resources, instructional sessions taught, questions asked at service points, and consultations in order to identify resources and services that need be supported and continued. Those numbers could also help identify areas of users’ needs that that the library staff could address.","PeriodicalId":193553,"journal":{"name":"Proceedings of the 2018 Library Assessment Conference: Building Effective, Sustainable, Practical Assessment: December 5–7, 2018, Houston, TX","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134139277","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}