HOPE Scholarship Status of Students in a Large General Education Course
Eleanore C. T. Heaton, D. Ciancio, R. L. Williams
Pub Date: 2016-10-01 · DOI: 10.5325/JASSEINSTEFFE.6.2.0099 · Journal of Assessment and Institutional Effectiveness, pp. 99–122
We examined the extent to which socioeconomic status (SES), precollege academic variables, non-course collegiate measures, and in-course collegiate measures predicted receipt and/or retention of a Helping Outstanding Pupils Educationally (HOPE) scholarship at a major southeastern state university. Students (N = 181) enrolled in seven sections of a 200-level general education course participated in the study. Logistic regression analyses revealed that SES and precollege academic models significantly predicted HOPE receipt. Within these models, high school grade point average proved to be the most consistent predictor of HOPE receipt. SES, in-course collegiate, and college grade models were the strongest predictors of HOPE retention.
A Diagnostic Mechanism for Assessing Respondent Burden: Sensitive Item Nonresponse Bias in Student Surveys
V. Porterfield, M. Weiner, Paul C. Siracusa
Pub Date: 2016-10-01 · DOI: 10.5325/JASSEINSTEFFE.6.2.0165 · Journal of Assessment and Institutional Effectiveness, pp. 165–190
This study evaluates unit and, more narrowly, sensitive item nonresponse to surveys in the university setting. Subgroups within the responding sample of a student survey at a large, public university in the United States are probed for patterns of differential nonresponse, with a focus on assessing sensitive item nonresponse. The standout result is that international students are significantly more likely than domestic students to decline to answer items involving sexual orientation. This result aligns with the literature on cultural differences between domestic and international students in US universities. Additionally, the study found nonsignificant nonresponse to other generally accepted sensitive items.
Using a “Messy” Problem as a Departmental Assessment of Undergraduates’ Ability to Think Like Psychologists
P. Dibartolo, A. Rudnitsky, L. Duncan, Minh Ly
Pub Date: 2016-10-01 · DOI: 10.5325/JASSEINSTEFFE.6.2.0191 · Journal of Assessment and Institutional Effectiveness, pp. 191–211
This article presents a case study of faculty members in a psychology department whose shared questions about pedagogy and learning informed a data-driven curricular review and revision using an open-ended assessment that privileged deep learning. The authors describe the development of this assessment and how its results across the arc of the major led to a revision of the department’s curriculum, including the creation of new courses focused on developing students’ abilities to “think like psychologists.” The study indicates that faculty intuitions of potential problems in student learning can be successfully assessed and then addressed through curricular changes.
Connecting Student Engagement to Student Satisfaction: A Case Study at East Carolina University
Christopher D. Pelletier, Jaya Rose, Mona Russell, D. Guberman, Kanchan Das, Joseph Bland, Heidi S. Bonner, C. Chambers
Pub Date: 2016-10-01 · DOI: 10.5325/JASSEINSTEFFE.6.2.0123 · Journal of Assessment and Institutional Effectiveness, pp. 123–141
The present study examines whether student engagement, as measured at the class level via the Classroom Level Survey of Student Engagement (CLASSE), is associated with higher levels of student satisfaction. To examine this question, we administered the CLASSE survey to 370 first-year, sophomore, junior, and senior students across nine classrooms. There were no statistically significant relationships between student engagement and overall satisfaction. When we analyzed student responses to questions about their satisfaction in the classroom, we found no variation in student satisfaction by race, gender, or class level. Implications for future research and practice are discussed.
Editor’s Note
George Anthony Peffer
Pub Date: 2016-07-06 · DOI: 10.5406/15549399.56.2.02 · Journal of Assessment and Institutional Effectiveness, pp. iv–v
Correlates of Business Undergraduates’ Perceived Favorability of Online Compared to Face-to-Face Courses
Gary Blau, Darin Kapanjie
Pub Date: 2016-04-01 · DOI: 10.5325/JASSEINSTEFFE.6.1.0050 · Journal of Assessment and Institutional Effectiveness, pp. 50–66
This study used two different samples of undergraduate business students taking both online and face-to-face courses to measure the perceived favorability of online courses. The Fall 2014 sample consisted of 237 respondents with complete data, while the Spring 2015 sample consisted of 114 respondents with complete data. A new, reliable four-item measure of perceived favorability of online versus face-to-face courses was used. Across both samples, two correlates, satisfaction with course tools and satisfaction with instructor response time, were each positively related to perceived favorability of online courses, after controlling for background and behavioral variables.
Institutional Cultures of Assessment: A Qualitative Study of Administrator Perspectives
Peggy C. Holzweiss, Rebecca M. Bustamante, M. Fuller
Pub Date: 2016-04-01 · DOI: 10.5325/JASSEINSTEFFE.6.1.0001 · Journal of Assessment and Institutional Effectiveness, pp. 1–27
In this study, results are presented from a rigorous content analysis of responses to two open-ended questions included in the Administrators’ Survey of Assessment Culture. A sample of 302 US higher education administrators provided 566 narrative responses addressing (1) the primary reason they conducted assessment on campus, and (2) how they would characterize their campus assessment cultures. Analysis revealed two meta-themes: “Institutional Structures,” including procedures, data usage, and accountability; and “Organizational Culture,” administrators’ descriptions of rituals, artifacts, discourse, values, and change related to assessment. Implications are shared for reframing and cultivating notions of institutional cultures of assessment.
Assessing Teacher Education through Mathematics Pupil Performance: A Case Study about Implementation in Response to External Pressure
Mary E. Yakimowski, Mary Truxal
Pub Date: 2016-04-01 · DOI: 10.5325/JASSEINSTEFFE.6.1.0028 · Journal of Assessment and Institutional Effectiveness, pp. 28–49
Policymakers and accreditation agencies are now seeking to gauge the effectiveness of teacher preparation programs by following teacher candidates into their professional practice and, further, by linking to their pupils’ academic performance. However, the task of gathering and analyzing such data is complex, especially within states that have not received federal funding to link the pupil test database to individual teachers and the higher education institutions they attended. In this case study, researchers examine mathematics pupil performance in grades 3–8, as measured by the state-mandated assessment, and make connections to a specific university teacher education program. The results of this longitudinal study of pupil performance are shared in order to evaluate the specific teacher preparation program and provide a model for those who investigate the impact of teacher preparation programs. Additionally, the obstacles and challenges such a quantitative study poses for a higher education institution are shared.
Faculty Perceptions of a Seven-Year Accreditation Process
Ron W. Germaine, L. Spencer
Pub Date: 2016-04-01 · DOI: 10.5325/JASSEINSTEFFE.6.1.0067 · Journal of Assessment and Institutional Effectiveness, pp. 67–98
This article describes the context, purpose, methodology, findings, and recommendations from a survey conducted over a seven-year period to identify faculty perceptions of an accreditation process. The survey, which used both closed- and open-ended items, was administered annually to the same population in the Sanford College of Education. Findings show that faculty saw the accreditation process as good professional development, that it improved programs, and that it strengthened collaboration. Based on our findings, we offer recommendations to overcome barriers in the accreditation process and thus maximize its benefits to the faculty, programs, and schools involved in accreditation work.
Revising and Reflecting: How Assessment of APA Style Evolved Over Two Assessment Cycles in an Undergraduate Communication Program
A. Gaffney
Pub Date: 2016-03-04 · DOI: 10.5325/JASSEINSTEFFE.5.2.0148 · Journal of Assessment and Institutional Effectiveness, pp. 148–167
This study presents one undergraduate program’s analysis of a writing-centered learning outcome, operationalized in terms of American Psychological Association (APA) style, using a standardized rubric. The analysis of data from year one revealed several problematic areas. The rubric was revised, and the same outcome was analyzed again the following year. Data from year two demonstrated improvement but also revealed a different set of concerns. Ultimately, the process proved useful for assessing student writing and making revisions as a result.