{"title":"评估本科生信息素养随时间的变化","authors":"Arthur Brian Ault, Jessame E. Ferguson","doi":"10.1108/PMM-02-2019-0005","DOIUrl":null,"url":null,"abstract":"\nPurpose\nThe research project assessed information literacy skill changes in college students at two points in time, as entering first-year students in 2012 and as seniors in their senior seminar capstone courses in the 2015–2016 academic year. The paper aims to discuss this issue.\n\n\nDesign/methodology/approach\nThe Standardized Assessment of Information Literacy Skills (SAILS) individual test was the selected instrument. Version 1 of the test was used for first-year students and Version 2 was used for seniors. All testing was done in person in computer labs with a librarian or library staff member present to proctor the test. This resulted in obtaining 330 student results as first years and 307 as seniors, with 161 exact matches for both administrations of the test. Exact matching of student scores to demographic details pulled from the college’s student information systems were used in the analysis.\n\n\nFindings\nThe analysis shows that overall first-year students tested below the 70 percent proficiency benchmark in all eight skill sets, but by the time they were seniors they scored above 70 percent in three skill sets. Male students and students of color performed lower than their counterparts, but these groups did demonstrate significant improvement in four skill sets by the time they were seniors. Students in the Honors program, those who took longer to complete the test as seniors, those with higher GPAs, those in Humanities majors, and those who had upper level course exposures to librarian information literacy instruction had higher performance on the test. 
There were no statistically significant results for students who were first generation, Pell Grant eligible, or were in-state or out-of-state residents.\n\n\nOriginality/value\nThere are few published studies that utilized the SAILS test for longitudinal institution-wide assessment. The majority of institutions that utilized the individual version of SAILS did so to determine change within a selected course, or set of courses, in the same semester and very few are published.\n","PeriodicalId":44583,"journal":{"name":"Performance Measurement and Metrics","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2019-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1108/PMM-02-2019-0005","citationCount":"1","resultStr":"{\"title\":\"Assessing undergraduate information literacy change over time\",\"authors\":\"Arthur Brian Ault, Jessame E. Ferguson\",\"doi\":\"10.1108/PMM-02-2019-0005\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\nPurpose\\nThe research project assessed information literacy skill changes in college students at two points in time, as entering first-year students in 2012 and as seniors in their senior seminar capstone courses in the 2015–2016 academic year. The paper aims to discuss this issue.\\n\\n\\nDesign/methodology/approach\\nThe Standardized Assessment of Information Literacy Skills (SAILS) individual test was the selected instrument. Version 1 of the test was used for first-year students and Version 2 was used for seniors. All testing was done in person in computer labs with a librarian or library staff member present to proctor the test. This resulted in obtaining 330 student results as first years and 307 as seniors, with 161 exact matches for both administrations of the test. 
Exact matching of student scores to demographic details pulled from the college’s student information systems were used in the analysis.\\n\\n\\nFindings\\nThe analysis shows that overall first-year students tested below the 70 percent proficiency benchmark in all eight skill sets, but by the time they were seniors they scored above 70 percent in three skill sets. Male students and students of color performed lower than their counterparts, but these groups did demonstrate significant improvement in four skill sets by the time they were seniors. Students in the Honors program, those who took longer to complete the test as seniors, those with higher GPAs, those in Humanities majors, and those who had upper level course exposures to librarian information literacy instruction had higher performance on the test. There were no statistically significant results for students who were first generation, Pell Grant eligible, or were in-state or out-of-state residents.\\n\\n\\nOriginality/value\\nThere are few published studies that utilized the SAILS test for longitudinal institution-wide assessment. 
The majority of institutions that utilized the individual version of SAILS did so to determine change within a selected course, or set of courses, in the same semester and very few are published.\\n\",\"PeriodicalId\":44583,\"journal\":{\"name\":\"Performance Measurement and Metrics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2019-07-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1108/PMM-02-2019-0005\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Performance Measurement and Metrics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1108/PMM-02-2019-0005\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"INFORMATION SCIENCE & LIBRARY SCIENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Performance Measurement and Metrics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1108/PMM-02-2019-0005","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"INFORMATION SCIENCE & LIBRARY SCIENCE","Score":null,"Total":0}
Assessing undergraduate information literacy change over time
Purpose
The research project assessed information literacy skill changes in college students at two points in time: as incoming first-year students in 2012 and as seniors in their capstone senior seminar courses in the 2015–2016 academic year. The paper aims to discuss this issue.
Design/methodology/approach
The Standardized Assessment of Information Literacy Skills (SAILS) individual test was the selected instrument: Version 1 of the test was used for first-year students and Version 2 for seniors. All testing was done in person in computer labs, with a librarian or library staff member present to proctor the test. This yielded 330 student results for first-years and 307 for seniors, with 161 exact matches across both administrations of the test. Exact matching of student scores to demographic details pulled from the college's student information systems was used in the analysis.
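The study does not publish its analysis code; the following is a minimal, hypothetical sketch of the matching step it describes — pairing first-year and senior SAILS results by student ID and attaching demographic fields from a student information system export. All IDs, field names, and scores here are invented for illustration.

```python
# Illustrative data: percent-correct scores keyed by student ID.
first_year = {101: 62.0, 102: 55.5, 103: 71.0, 104: 48.0}
senior = {101: 74.0, 103: 69.5, 105: 80.0}

# Demographic details as they might come from a student information system.
demographics = {
    101: {"gpa": 3.4, "major": "Humanities"},
    102: {"gpa": 2.9, "major": "Biology"},
    103: {"gpa": 3.8, "major": "History"},
    104: {"gpa": 3.1, "major": "Business"},
    105: {"gpa": 3.6, "major": "Chemistry"},
}

# Keep only "exact matches": students with results from both administrations.
matched = {
    sid: {"fy": first_year[sid], "sr": senior[sid], **demographics[sid]}
    for sid in first_year.keys() & senior.keys()
}
print(len(matched))  # number of students tested at both points in time
```

The intersection of the two ID sets is what shrinks 330 first-year and 307 senior results down to the matched cohort used in the analysis.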
Findings
The analysis shows that, overall, first-year students tested below the 70 percent proficiency benchmark in all eight skill sets, but by the time they were seniors they scored above 70 percent in three skill sets. Male students and students of color performed lower than their counterparts, but these groups demonstrated significant improvement in four skill sets by their senior year. Students in the Honors program, those who took longer to complete the test as seniors, those with higher GPAs, those in Humanities majors, and those with upper-level course exposure to librarian-led information literacy instruction performed better on the test. There were no statistically significant differences for students who were first-generation, Pell Grant eligible, or in-state versus out-of-state residents.
Originality/value
Few published studies have used the SAILS test for longitudinal, institution-wide assessment. The majority of institutions that used the individual version of SAILS did so to measure change within a selected course, or set of courses, within the same semester, and few of those studies have been published.
Journal description:
■Quantitative and qualitative analysis ■Benchmarking ■The measurement and role of information in enhancing organizational effectiveness ■Quality techniques and quality improvement ■Training and education ■Methods for performance measurement and metrics ■Standard assessment tools ■Using emerging technologies ■Setting standards for service quality