Assessment of Student Learning Outcomes for Assurance of Learning at Qatar University
S. Al-Thani, Ali Abdelmoneim, Khaled Daoud, Adel Cherif, D. Moukarzel
Pub Date: 2014-10-01; DOI: 10.5325/JASSEINSTEFFE.4.2.0116; pp. 116–136
From 2006 to 2012, Qatar University transitioned from conducting no program-level outcomes-based assessment of student learning to implementing a robust, effective, and institutionally pervasive Student Learning Outcomes Assessment System (SLOAS) characterized by a high level of compliance and meaningful improvements to both learning and assessment processes. Keys to the success of the implementation have been support from campus leadership, the creation of structures and processes that support assessment at all levels, and an intensive program of faculty development and faculty incentives. A unique feature of the system is the auditing of annual program assessment reports by external experts. Comparison of results from the fourth and fifth years of the implementation suggests the following trends: a relatively high and increasing tendency to identify learning improvements involving revisions of curriculum and courses, a low and decreasing tendency to identify learning improvements that cost money, and a high and increasing tendency to make changes to assessment processes that make them more meaningful and more manageable.
Assessment of Operational Effectiveness for Education Program Providers
Dale Carpenter, Renee Corbin, Nancy Luke
Pub Date: 2014-10-01; DOI: 10.5325/JASSEINSTEFFE.4.2.0103; pp. 103–115
In the current program-evaluation and institutional-effectiveness climate for educator preparation programs, the focus on demonstrating that an institution's graduates make an impact on P-12 student learning includes, but sometimes overshadows, the need to demonstrate the effectiveness of the operations implemented by programs and institutions. The authors propose a definition for the assessment of operational effectiveness and identify assessments that measure it. The use of operational effectiveness assessments is explained from the point of view of one institution, with a discussion of the outcomes and changes, driven by the data, that enhanced daily operational effectiveness.
Testing a New Measure of Perceived Professional Development Engagement for Undergraduates
G. Blau, Corrine M. Snell, D. Campbell, K. Viswanathan, Lynne M. Andersson, Andrea B. Lopez
Pub Date: 2014-10-01; DOI: 10.5325/JASSEINSTEFFE.4.2.0137; pp. 137–159
Professional development engagement (PDE) is the level of undergraduate engagement in professional development (PD), defined as “activities designed to help students prepare for a successful college-to-work transition.” This study tested a new 12-item measure of PDE on a complete-data sample of 437 undergraduate business students. The “did not use” response to an activity for each of the 10 CPDC items resulted in a surprisingly high aggregated loss of respondents. Results indicated that students who either never joined a student professional organization or never lived on or near Main Campus had a higher “did not use” response percentage.
Promoting Learning Outcomes Assessment in Higher Education: Factors of Success
Abdou Ndoye
Pub Date: 2014-04-08; DOI: 10.1353/AIE.2013.0008; pp. 157–175
This article presents the results of a study on factors that contribute to successful learning outcomes assessment practices. The qualitative study uses the case study method to analyze factors of success in higher education programs. Interviews with faculty members reveal that communication, implementing assessment as a change initiative, and using a learning-community approach are the main facilitators of success identified by study participants.
Professional Development Engagement
G. Blau, Corinne M. Snell, D. Campbell, K. Viswanathan, W. Aaronson, Satyajit Karnik
Pub Date: 2014-01-01; DOI: 10.5325/JASSEINSTEFFE.4.1.0001; pp. 1–26
Professional development engagement (PDE) is defined as “the level of perceived undergraduate engagement in professional development activities.” Current measures of student engagement do not adequately measure PDE. A promising PDE scale was administered to a sample of senior-level business undergraduates. After controlling for sets of student background/precollege variables and college-related variables, an organization-related variable set, followed by a motivation-related variable set, explained significant incremental variance in PDE. Specific variables with a significant positive relationship to PDE were joining a student professional organization, motivation to attend the business school, and the ease of access to, and service quality of, the career development center.
Collecting Dust or Creating Change:
Mark E. Engberg, M. Manderino, K. Dollard
Pub Date: 2014-01-01; DOI: 10.5325/jasseinsteffe.4.1.27
This study examines a group of institutions that recently participated in a national survey focused on student learning. It investigates how institutions decide to use an externally developed instrument, how campuses use the results to improve their practices, and what barriers and challenges they face in translating survey results into actionable strategies for change. The study highlights issues surrounding assessment use and presents a set of recommendations that might serve as a guide for campuses concerned about “closing the loop” in their assessment practices.
Creating a Culture of Faculty Participation in Assessment: Factors that Promote and Impede Satisfaction
Christopher A. Mccullough, Elizabeth A. Jones
Pub Date: 2014-01-01; DOI: 10.5325/JASSEINSTEFFE.4.1.0085; pp. 85–101
Given the importance of assessment in higher education, it is critical to understand how to promote faculty engagement in assessment initiatives. This qualitative study identified factors associated with positive faculty satisfaction with assessment endeavors and factors that reduce that satisfaction. The data revealed that faculty satisfaction varied across academic programs. Factors that promoted assessment included assessment methodologies, resources, support, participation, and effective leadership. Factors that reduced faculty satisfaction included the lack of comparative data across institutions, increased workload, and continuous change in assessment plans.
Outcomes-Based Assessment in Writing:
Slotnick, Cratsley, Consalvo, Lerch
Pub Date: 2014-01-01; DOI: 10.5325/JASSEINSTEFFE.4.1.0052
Two community colleges and two state universities in central Massachusetts developed a collaborative partnership of faculty assessment teams that used institutionally developed rubrics and the LEAP VALUE Written Communication rubric to compare scoring results and record perceptions of the scoring process itself. Qualitative analysis revealed that, while differences in interpretations of terminology affected assessor confidence and voice when applying both the national and local rubrics to score student work, the process of explicating what goes into selecting a score was central to judging student artifacts. Despite the differences in interpretation of language, quantitative data demonstrated that the LEAP VALUE rubric, in its original form or slightly modified, allowed assessors to detect significant differences between freshman and sophomore writing samples. By creating a shared partnership for assessment using a mixed-methods approach, faculty were able to discuss the requisite level of proficiency in written communication for successful transfer. Translating this knowledge into the types of assignment prompts and assessments needed to measure and communicate a student's proficiency may help maximize transfer success for academically at-risk students—indeed, for all students.
Editor's Note
George Anthony Peffer
Pub Date: 2013-08-19; DOI: 10.2307/j.ctv1bd4n7x.4; pp. vii–viii
Contributors
George Anthony Peffer, L. L. Chrystal, A. Gansemer-Topf, F. S. Laanan, K. Royal, J. Gregg, Sarah F. Rosaen, R. A. Hayes, Marcus Paroske, D. M. De La Mare, Charles Powell
Pub Date: 2013-08-19; DOI: 10.1515/9783110679830-001
Many four-year institutions are experiencing increasing enrollment of students transferring from two-year institutions. While many institutions collect quantitative data on the enrollment, retention, and graduation rates of transfer students, little is known about the transfer-student transition experience. For this qualitative assessment, 22 traditional-age students who transferred from a two-year community college to a four-year institution were interviewed. Specifically, the assessment examined why students first enrolled at the community college, the mechanics of the transfer process, and academic and social integration. Student responses provide insight into how institutions can better support the transition and success of transfer students.