Assessment: Growing Up Is a Many-Splendored Thing
Terrel L. Rhodes
Pub Date: 2015-10-01 | DOI: 10.5325/JASSEINSTEFFE.5.2.0101 | Pages: 101–116
This overview examines the current state of assessment and what is needed for student learning and success for graduates. In particular, it examines current reductionist pressures and the focus on limited, disconnected measures of learning that do not reflect demonstrated student achievement, as well as the emergence of promising alternatives such as direct assessment (e.g., VALUE rubrics), collaborative efforts (e.g., the Multi-State Collaborative), and e-portfolios that privilege student authorship and creativity. The past twenty years highlight the importance of formative and summative assessment focused on improvement through project-based signature work, which serves pedagogical improvement, curricular reform, and accountability.
{"title":"Assessment: Growing Up Is a Many-Splendored Thing","authors":"Terrel L. Rhodes","doi":"10.5325/JASSEINSTEFFE.5.2.0101","DOIUrl":"https://doi.org/10.5325/JASSEINSTEFFE.5.2.0101","url":null,"abstract":"This overview examines the current state of assessment and what is needed for student learning and success for graduates. In particular, an examination of current reductionist pressures and the focus on limited/disconnected measures related to learning that do not reflect demonstrated student achievement, as well as the emergence of promising alternatives, such as direct assessment (e.g., VALUE rubrics), collaborative efforts (e.g., the Multi-State Collaborative), and e-portfolios that privilege student authorship and creativity. The past twenty years highlight the importance of formative and summative assessment focused on improvement through project-based signature work useful for pedagogical improvements, curricular reform, and accountability.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"20 1","pages":"101 - 116"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90949491","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Revising and Reflecting: How Assessment of APA Style Evolved Over Two Assessment Cycles in an Undergraduate Communication Program
Amy L. Housley Gaffney
Pub Date: 2015-10-01 | DOI: 10.5325/jasseinsteffe.5.2.148
This study presents one undergraduate program's analysis of a writing-centered learning outcome, operationalized in terms of American Psychological Association (APA) style, using a standardized rubric. The analysis of data from year one revealed several problematic areas. The rubric was revised, and the same outcome was analyzed again the following year. Data from year two demonstrated improvement but also revealed a different set of concerns. Ultimately, the process proved useful for assessing student writing and for making revisions based on the results.
{"title":"Revising and Reflecting: How Assessment of APA Style Evolved Over Two Assessment Cycles in an Undergraduate Communication Program","authors":"Amy L. Housley Gaffney","doi":"10.5325/jasseinsteffe.5.2.148","DOIUrl":"https://doi.org/10.5325/jasseinsteffe.5.2.148","url":null,"abstract":"\u0000 This study presents one undergraduate program's analysis of a writing-centered learning outcome, operationalized in terms of American Psychological Association (APA) style, using a standardized rubric. The analysis of data from year one revealed several problematic areas. The rubric was revised and the same outcome was analyzed again the following year. Data from year two demonstrated improvement, but also revealed a different set of concerns. Ultimately, the process proved useful for assessing student writing and making revisions as a result.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91131711","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Some Perspectives on Assessment of Student Learning
S. Weisler
Pub Date: 2015-10-01 | DOI: 10.5325/JASSEINSTEFFE.5.2.0117 | Pages: 117–130
The last twenty years have seen considerable accomplishments in the assessment of student learning. I discuss some of the most significant of these achievements while also considering open questions that have arisen over the years and remain in need of further analysis and resolution. The remarks are structured into three sections: Past, Present, and Future. Past chronicles the progress made toward developing a culture of assessment; Present focuses on the results of current research on student learning of interest to the assessment community; and Future considers several current issues that demand our collective attention.
{"title":"Some Perspectives on Assessment of Student Learning","authors":"S. Weisler","doi":"10.5325/JASSEINSTEFFE.5.2.0117","DOIUrl":"https://doi.org/10.5325/JASSEINSTEFFE.5.2.0117","url":null,"abstract":"The last twenty years have seen considerable accomplishments in the area of the assessment of student learning. I discuss some of the most significant of these achievements while also considering certain open questions that have arisen over the years and that remain in need of further analysis and resolution. Remarks are structured into three sections: Past, Present, and Future. Past chronicles the progress made toward developing a culture of assessment; Present focuses on the results of current research on student learning of interest to the assessment community; and Future considers several current issues that demand our collective attention.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"34 1","pages":"117 - 130"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73239384","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Overlooked but Not Unimportant: Changes in the University Landscape and Assessment Results
A. Cole
Pub Date: 2015-10-01 | DOI: 10.5325/JASSEINSTEFFE.5.2.0131 | Pages: 131–147
Much is made of the various challenges of assessment, but little is said about the changing landscape of the American university system and whether such changes have made their way into assessment results. A case study based on assessment work undertaken over a ten-year period illustrates how outside factors may influence assessment results yet go uncaptured by regular assessment processes. Based on the lessons of this case study, we suggest that assessment should not only present student learning outcomes but also seek to interpret them in the context of educational change over time.
{"title":"Overlooked but Not Unimportant: Changes in the University Landscape and Assessment Results","authors":"A. Cole","doi":"10.5325/JASSEINSTEFFE.5.2.0131","DOIUrl":"https://doi.org/10.5325/JASSEINSTEFFE.5.2.0131","url":null,"abstract":"Much is made about the various challenges of assessment but little is made about the changing landscape of the American university system and whether such changes have made their way into assessment results. A case study based upon assessment work undertaken over a ten-year period is used to illustrate how outside factors may influence assessment results but may not be caught in regular assessment processes. Based upon the lessons of this case study, we suggest that assessment should not only present student-learning outcomes, but seek to interpret student learning outcomes in the context of educational change over time.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"33 1","pages":"131 - 147"},"PeriodicalIF":0.0,"publicationDate":"2015-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78335595","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Editor’s Note
George Anthony Peffer
Pub Date: 2015-05-20 | DOI: 10.1163/15685306-12341652 | Pages: vii–viii
{"title":"Editor’s Note","authors":"George Anthony Peffer","doi":"10.1163/15685306-12341652","DOIUrl":"https://doi.org/10.1163/15685306-12341652","url":null,"abstract":"","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"4 1","pages":"vii - viii"},"PeriodicalIF":0.0,"publicationDate":"2015-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"78352573","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Comprehensive Five-Year Program Assessment Study
T. Crowell, Elizabeth G. Calamidas
Pub Date: 2015-04-01 | DOI: 10.5325/JASSEINSTEFFE.5.1.0001 | Pages: 1–33
This comprehensive five-year program assessment study illustrates the full circle of the assessment cycle: development, implementation, feedback, and reshaping of curriculum. First, the study provides specific information on the development and implementation of multiple measures of program assessment for both core and track goals and objectives. Second, it presents five years of quantitative and qualitative data illustrating the use of internship e-portfolios and presentations to assess students’ proficiency in 13 specific core competencies and numerous track competencies. Data collected from student self-ratings, faculty ratings, and site supervisors’ assessments evaluate students’ performance and preparedness to enter the public health field; these data indicate high levels of student proficiency and support the conclusion that program and track goals and objectives are being met. Qualitative data corroborate these statistics and provide insight into program needs that are not being met; based on both sets of data, program solutions are identified and implemented, creating an assessment feedback loop. Finally, audience feedback from internship presentations provides a final measure of program assessment and supports the numerous benefits students gain from attending these events. Results of all these measures provide valuable insight into future program and course curricula, along with teaching strategies and techniques to increase student learning.
{"title":"Comprehensive Five-Year Program Assessment Study","authors":"T. Crowell, Elizabeth G. Calamidas","doi":"10.5325/JASSEINSTEFFE.5.1.0001","DOIUrl":"https://doi.org/10.5325/JASSEINSTEFFE.5.1.0001","url":null,"abstract":"This comprehensive five-year program assessment study illustrates the full circle of the assessment cycle: development, implementation, feedback, and reshaping of curriculum. First, this study provides specific information on the development and implementation of multiple measures of program assessment for both core and track goals and objectives. Second, it provides five years of quantitative and qualitative data illustrating the use of internship e-portfolios and presentations in assessing students’ proficiencies on 13 specific core competencies and numerous track competencies. Data collected from students’ self-rating, faculty’s rating, and Site supervisors’ assessment provide evaluations of students’ performance and levels of preparedness to enter the public health field and indicate high levels of proficiencies for students and support that program and track goals and objectives are being met. Qualitative data support these statistics and provide insight into program needs that are not being met; based on both sets of data, program solutions are identified and implemented creating an assessment feedback loop. Finally, audience feedback from internship presentations provides a final measure of program assessment and also supports the numerous benefits of students attending these events. Results of all these measures provide valuable insight into future program and course curriculum, along with teaching strategies and techniques in order to increase student learning.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"4 1","pages":"1 - 33"},"PeriodicalIF":0.0,"publicationDate":"2015-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81841140","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Evaluating and Redesigning a College Assessment System to Close the Loop
S. Hamill
Pub Date: 2015-04-01 | DOI: 10.5325/JASSEINSTEFFE.5.1.0034 | Pages: 34–57
Although “best practice” institutions provide information on conducting assessment and closing the loop, their approaches may not generalize to other campuses, as each institution has its own unique structure and culture. Building on Walvoord’s (2010) approach to diagramming an assessment structure, this article describes an adaptive process for evaluating and redesigning an assessment system. Through a case study of a public institution, it provides a step-by-step guide for institutions that want to create a simple, meaningful, and sustainable assessment system reflective of their own campus and its unique culture.
{"title":"Evaluating and Redesigning a College Assessment System to Close the Loop","authors":"S. Hamill","doi":"10.5325/JASSEINSTEFFE.5.1.0034","DOIUrl":"https://doi.org/10.5325/JASSEINSTEFFE.5.1.0034","url":null,"abstract":"Whereas “best practice” institutions provide information on conducting assessment and closing the loop, their approach may not generalize to other campuses as each institution has its own unique structure and culture. Using Walvoord’s (2010) approach to diagramming an assessment structure, this article extends this work by describing an adaptive process for evaluating and redesigning an assessment system. Through a case study of a public institution, a step-by-step guide is provided for institutions that want to create a simple, meaningful, and sustainable assessment system reflective of their own campus and its unique culture.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"7 1","pages":"34 - 57"},"PeriodicalIF":0.0,"publicationDate":"2015-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88113407","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Evaluability Assessment in Higher Education: Supporting Continuous Improvement, Accountability, and a Culture of Assessment
Tamara Walser
Pub Date: 2015-04-01 | DOI: 10.5325/JASSEINSTEFFE.5.1.0058 | Pages: 58–77
The purpose of this article is to describe the use of evaluability assessment, an approach to program evaluation, to inform continuous improvement efforts, support accountability requirements, and facilitate a culture of assessment in higher education. Two evaluability assessments conducted in a college of education are discussed, including the key activities carried out for each component, the findings and recommendations from each assessment, and how the results have been used. The examples demonstrate the utility of evaluability assessment as part of a higher education assessment system. They further highlight the value of stakeholder involvement, of assessment initiated by leadership based on need, and of the use of results in supporting a culture of assessment. Future research is needed to better understand the utility of evaluability assessment, and of other program evaluation approaches, in diverse higher education contexts.
{"title":"Evaluability Assessment in Higher Education: Supporting Continuous Improvement, Accountability, and a Culture of Assessment","authors":"Tamara Walser","doi":"10.5325/JASSEINSTEFFE.5.1.0058","DOIUrl":"https://doi.org/10.5325/JASSEINSTEFFE.5.1.0058","url":null,"abstract":"The purpose of this article is to describe the use of evaluability assessment, an approach to program evaluation, to inform continuous improvement efforts, support accountability requirements, and facilitate a culture of assessment in higher education. Examples of two evaluability assessments conducted in a college of education are discussed, including the key activities carried out for each component of the evaluability assessments, findings and recommendations from each evaluability assessment, and how results of the evaluation work have been used. The examples demonstrate the utility of evaluability assessment as part of a higher education assessment system. They further highlight the value of stakeholder involvement, initiation of assessment by leadership based on need, and use of results in supporting a culture of assessment in higher education. Future research is needed to better understand the utility of evaluability assessment in other program evaluation approaches in diverse higher education contexts.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"94 2 1","pages":"58 - 77"},"PeriodicalIF":0.0,"publicationDate":"2015-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87671281","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

The Five Essential Roles of Assessment Practitioners
Natasha A. Jankowski, Ruth C. Slotnick
Pub Date: 2015-04-01 | DOI: 10.5325/JASSEINSTEFFE.5.1.0078 | Pages: 78–100
Assessment practitioners are tasked with a range of responsibilities, from enhancing teaching and learning to improving institutional effectiveness and providing quality assurance, yet little is known about the roles and related skill sets needed to undertake these tasks. Through an examination of job postings, a review of the current literature, one-on-one interviews with four leaders in the field of assessment, and an exploration of our own professional experience, this paper proposes a framework of five essential roles for assessment practitioners: assessment/method expert, narrator/translator, facilitator/guide, political navigator, and visionary/believer.
{"title":"The Five Essential Roles of Assessment Practitioners","authors":"Natasha A. Jankowski, Ruth C. Slotnick","doi":"10.5325/JASSEINSTEFFE.5.1.0078","DOIUrl":"https://doi.org/10.5325/JASSEINSTEFFE.5.1.0078","url":null,"abstract":"Assessment practitioners are tasked with a range of responsibilities from enhancing teaching and learning to improving institutional effectiveness and providing quality assurance, yet little is known about the roles and related skill sets needed to undertake these tasks. Through an examination of job postings coupled with a review of the current literature, one-on-one interviews with four leaders in the field of assessment and an exploration of our own professional experience, this paper proposes a framework of five essential roles for assessment practitioners including assessment/method expert, narrator/translator, facilitator/guide, political navigator and visionary/believer.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"18 1","pages":"100 - 78"},"PeriodicalIF":0.0,"publicationDate":"2015-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87323691","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Doctoral Programs Outcomes Assessment: An Approach to Assessing Program Inputs, Learning Objectives, and Postgraduation Outcomes
Shani D. Carter
Pub Date: 2014-10-01 | DOI: 10.5325/JASSEINSTEFFE.4.2.0160 | Pages: 160–179
Student learning outcomes assessment is conducted at the undergraduate level so that academic departments can determine whether students are meeting learning goals. These assessments do not rely on course grades; rather, they use measures such as standardized tests or committee ratings of student coursework conducted outside the course. Increasingly, there has been a movement to conduct outcomes assessment in graduate programs as well. This article presents an assessment methodology suited to the special conditions of doctoral programs, which are generally structured very differently from undergraduate and master’s-level programs and therefore require different methods of outcomes assessment.
{"title":"Doctoral Programs Outcomes Assessment: An Approach to Assessing Program Inputs, Learning Objectives, and Postgraduation Outcomes","authors":"Shani D. Carter","doi":"10.5325/JASSEINSTEFFE.4.2.0160","DOIUrl":"https://doi.org/10.5325/JASSEINSTEFFE.4.2.0160","url":null,"abstract":"Student learning outcomes assessment is conducted at the undergraduate level for academic departments to determine whether students are meeting learning goals. The assessments do not include course grades; rather, they include measures such as standardized tests or student coursework being rated outside the course by committees. Increasingly, there has been a movement to conduct outcomes assessment on graduate programs as well. This article presents an assessment methodology that can be applied to the special conditions of doctoral programs, which are generally structured very differently than undergraduate and masters’ level programs, and which therefore require different methods of outcomes assessment.","PeriodicalId":56185,"journal":{"name":"Journal of Assessment and Institutional Effectiveness","volume":"95 1","pages":"160 - 179"},"PeriodicalIF":0.0,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74207670","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}