
Latest publications in the Journal of Assessment and Institutional Effectiveness

Evaluating Academic Rigor, Part II: An Investigation of Student Ratings, Course Grades, and Course Level
Q4 Social Sciences | Pub Date: 2020-07-23 | DOI: 10.5325/jasseinsteffe.9.1-2.0049
James E. Johnson, James A. Jones, T. Weidner, Allison K. Manwell
Abstract: Part I of this project (Johnson, Weidner, Jones, & Manwell, 2018) confirmed the definition of course rigor, as well as the development of the questions used to assess rigor. This paper, part II of the project, assessed the student ratings rigor questions to investigate course rigor relative to instructor ratings, course ratings, course grades, enrollment, and course level. A total of 203 courses (2,720 students) participated during a three-year period. Results indicated that course rigor is strongly related to instructor and course ratings, but only minimally to course grades. Lower-level courses were also found to have significantly lower rigor than upper-level courses. These results contradict the theory of retributional bias and suggest that faculty are more likely to receive high student ratings if perceived rigor is high. This study also provides a foundation from which course rigor can be further evaluated in different academic contexts.
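The correlational pattern the abstract reports can be illustrated with a minimal sketch on simulated course-level data. The study's dataset is not reproduced here; the effect sizes below are hypothetical stand-ins chosen only to mirror the reported "strong with ratings, weak with grades" pattern.

```python
import numpy as np

# Hypothetical course-level data: 203 courses, as in the study, but
# with made-up effect sizes chosen to mirror the reported pattern.
rng = np.random.default_rng(4)
rigor = rng.normal(size=203)
instructor_rating = 0.8 * rigor + rng.normal(scale=0.4, size=203)
mean_grade = 0.1 * rigor + rng.normal(scale=1.0, size=203)

# Pearson correlations of rigor with ratings and with grades.
r_rating = np.corrcoef(rigor, instructor_rating)[0, 1]
r_grade = np.corrcoef(rigor, mean_grade)[0, 1]
print(round(r_rating, 2), round(r_grade, 2))
```

On data of this shape, the rigor-rating correlation comes out strong while the rigor-grade correlation stays near zero.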
Citations: 1
Reexamining Three Held Assumptions about Creating Classroom Assignments That Can Be Used for Institutional Assessment
Q4 Social Sciences | Pub Date: 2020-07-23 | DOI: 10.5325/jasseinsteffe.9.1-2.0029
Mark C. Nicholas, Barbara C. Storandt, E. Atwood
Abstract: This article empirically examines three assumptions that emerged from the literature on using classroom assignments for institutional assessment. The potential misalignment between the source of evidence (classroom assignments) and the assessment method (an institutional rubric) is a serious threat to validity when using course-embedded assessment models. Findings revealed that approaches to faculty development in assignment design were drawing on approaches designed for using assignments in the classroom, without examining the implications for institutional assessment. Findings can inform the practice of individual faculty, approaches used for professional development in assignment design, and the accountability movement focused on using course-embedded assignments.
Citations: 1
Student Achievement Factors in a College Introductory Computer Course
Q4 Social Sciences | Pub Date: 2020-04-01 | DOI: 10.5325/jasseinsteffe.10.1-2.0001
Willis L. Boughton
Abstract: Custom software was used to collect data for 416 students over multiple semesters of a college introductory computer transfer course. The objective was to quantitatively identify achievement factors and corresponding actions that could be taken to improve student achievement. Student activity data were exam review time, time spent on assignments, attendance, and Student Response System (SRS) use. Data on specific student skills were obtained from answers to assessment questions. Regression analysis shows that attendance, SRS use, time spent on assignments, and time spent on exam reviews do not significantly affect achievement. Scores on assessment questions requiring basic math skill, and on questions requiring skill in observing and explaining classroom demonstrations in writing, do significantly affect achievement. So do scores on questions repeated from one exam to the next, but contrary to intuition, the overall effect of these repeated questions is to lower student achievement. A regression model using only these three factors plus the percentage of assignments completed predicts student achievement to within half a letter grade. Improving student skill in basic math provides the greatest opportunity to improve student achievement. Successful and unsuccessful students are mutually exclusive groups; unsuccessful students are not "partially" successful.
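As a rough illustration of the kind of regression model the abstract describes, the sketch below fits ordinary least squares on simulated data. The predictor names, scales, and effect sizes are hypothetical, chosen only to mirror the reported signs (including the counterintuitive negative sign on repeated questions).

```python
import numpy as np

# Hypothetical data standing in for the study's measures. Columns:
# basic-math score, demonstration-explanation score, repeated-question
# score, percentage of assignments completed; target is a 0-100 grade.
rng = np.random.default_rng(0)
n = 200
X = rng.uniform(0, 100, size=(n, 4))
# Simulate the qualitative finding: the repeated-question score enters
# with a negative sign; the magnitudes are invented.
grade = (0.4 * X[:, 0] + 0.3 * X[:, 1] - 0.1 * X[:, 2]
         + 0.2 * X[:, 3] + rng.normal(0, 5, n))

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, grade, rcond=None)
mean_abs_err = np.mean(np.abs(A @ coef - grade))
print(coef.round(2), round(float(mean_abs_err), 2))
```

With signal of this strength, the fit recovers the signs of the simulated effects and predicts within a few points on the 100-point scale.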
Citations: 0
Teaching Students About the World of Work: A Challenge to Postsecondary Educators ed. by Nancy Hoffman and Michael Lawrence Collins (review)
Q4 Social Sciences | Pub Date: 2020-04-01 | DOI: 10.5325/jasseinsteffe.10.1-2.0114
Marcy L. Brown
Citations: 1
Challenges Assessing the Impact of Project-Based Learning on Critical Thinking Skills
Q4 Social Sciences | Pub Date: 2020-04-01 | DOI: 10.5325/jasseinsteffe.10.1-2.0033
J. Henderson, Scott C. Marley, M. Wilcox, Natalie Nailor, Stephanie Sowl, Kevin Close
Abstract: The lack of a precise definition of critical thinking makes it difficult for educators to agree on how, specifically, critical thinking should be assessed. This study assesses the efficacy of an innovative university initiative designed to promote critical thinking through project-based learning (PBL). Over 400 students participated in a 2 × 2 factorial experiment that included critical thinking items from both a widely used inventory and a new assessment designed to measure critical thinking in a more practical fashion. The authors' novel critical thinking assessment breaks critical thinking down into construction and critique components, with results indicating that critique is more challenging for students regardless of experimental condition. However, students specifically prompted for critique demonstrated more attempts at critique than their counterparts who did not receive a critique prompt. The results indicate a paucity of critical thought in general, suggesting multiple challenges for both the teaching and the assessment of critical thinking skills.
Citations: 0
Formative Assessment in the Disciplines: Framing a Continuum of Professional Learning by Margaret Heritage and E. Caroline Wylie (review)
Q4 Social Sciences | Pub Date: 2020-04-01 | DOI: 10.5325/jasseinsteffe.10.1-2.0112
K. Daugherty
Citations: 0
Comparison of NSSE Data Obtained via Computer Versus Mobile Devices
Q4 Social Sciences | Pub Date: 2020-04-01 | DOI: 10.5325/jasseinsteffe.10.1-2.0061
Jihee Hwang, Felix Wao
Abstract: Institutional surveys are an important means of assessing student learning experiences and outcomes in higher education. With the widespread ownership of smartphones and tablets, a growing number of students use mobile devices to complete institutional surveys. Using National Survey of Student Engagement data collected at a large four-year research university, this study examines how survey response patterns and data quality differ between computer (i.e., laptop, desktop) and mobile device responses. The findings indicate that mobile respondents are likely to take longer to complete the survey and have higher item nonresponse rates. In examining engagement indicator subscales, first-year students who used mobile devices reported significantly lower internal consistency reliability on all academic challenge measures compared to computer respondents. Additionally, controlling for student demographics and precollege traits, the adjusted means of the academic challenge and supportive environment subscales were significantly lower for first-year students responding on mobile devices.
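Since the first-year findings turn on internal consistency reliability, a minimal Cronbach's alpha computation shows how such subscale reliability is typically calculated. The response data below are hypothetical, not drawn from the NSSE dataset.

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items array of scale responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical subscale responses: items sharing a latent trait give
# high alpha; unrelated noise items give alpha near zero.
rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 1))
consistent = latent + rng.normal(scale=0.5, size=(500, 4))
noisy = rng.normal(size=(500, 4))
a_hi = cronbach_alpha(consistent)
a_lo = cronbach_alpha(noisy)
print(round(a_hi, 2), round(a_lo, 2))
```

A drop in alpha of the kind the study reports for mobile respondents would show up directly in this statistic.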
Citations: 0
Course Grade Reliability
Q4 Social Sciences | Pub Date: 2020-04-01 | DOI: 10.5325/jasseinsteffe.10.1-2.0085
D. Eubanks, A. Good, Megan Schramm-Possinger
Abstract: This study analyzes the reliability of approximately 800,000 college grades from three higher education institutions that vary in type and size. Comparisons of intraclass correlation coefficients (ICCs) reveal patterns among institutions and academic disciplines. Results from this study suggest that there are styles of grading associated with academic disciplines. The ICC of individual grade assignment is comparable to that of rubric-derived learning assessments at one institution, and both are arguably too low to be used for decision making at that level. A reliability lift calculation suggests that grade averages over eight (or so) courses per student have enough reliability to be used as outcome measures. We discuss how grade statistics can complement efforts to assess program fairness, rigor, and comparability, as well as the complexity of a curriculum. R code and statistical notes are included to facilitate use by assessment and institutional research offices.
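The paper ships its own R code; as a minimal sketch of the statistic it compares, the Python below computes a one-way random-effects ICC(1) on hypothetical grade data. The section sizes, scale, and variance components are invented for illustration.

```python
import numpy as np

def icc1(data):
    """One-way random-effects ICC(1) for a groups x observations
    array, e.g. the grades assigned within each course section."""
    g, k = data.shape
    grand = data.mean()
    msb = k * ((data.mean(axis=1) - grand) ** 2).sum() / (g - 1)
    msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (g * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical grades on a 4-point scale: 20 sections of 30 grades.
# Distinct section-level grading styles push the ICC up; sections
# grading indistinguishably keep it near zero.
rng = np.random.default_rng(2)
section_means = rng.normal(3.0, 0.6, size=20)
styled = section_means[:, None] + rng.normal(0, 0.3, size=(20, 30))
uniform = rng.normal(3.0, 0.3, size=(20, 30))
icc_styled = icc1(styled)
icc_uniform = icc1(uniform)
print(round(icc_styled, 2), round(icc_uniform, 2))
```

The contrast between the two simulated conditions is the kind of signal the study reads as discipline-specific "styles of grading."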
Citations: 4
Evaluating Academic Course Rigor, Part 1: Defining a Nebulous Construct
Q4 Social Sciences | Pub Date: 2019-12-03 | DOI: 10.5325/jasseinsteffe.8.1-2.0086
J. Johnson, T. Weidner, James A. Jones, Allison K. Manwell
Abstract: A widely accepted definition of academic course rigor remains elusive within higher education. Although many conceptualizations of course rigor have been identified, both empirically and anecdotally, operationally defining and investigating course rigor is necessary given contemporary attacks on the quality of higher education. This article, part 1 of a two-part study, describes the three-phase process by which academic course and instructor rigor, and corresponding rigor questions, were defined and validated. Results revealed that five components are critical to a definition of course rigor: critical thinking; challenge; mastering complex material; time and labor intensity; and production of credible work. These components were used to create questions distributed in 264 courses (2,557 students). The final phase of part 1 used factor analysis to confirm a strong one-factor solution, confirming that the operational definition and corresponding rigor questions were acceptable for empirically evaluating course and instructor rigor.

Keywords: rigor, course ratings
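One common way to check for the strong one-factor solution the abstract mentions is to inspect the eigenvalues of the item correlation matrix. The sketch below does so on simulated single-factor responses; the item count, loadings, and noise level are invented, and the actual survey items are not reproduced here.

```python
import numpy as np

# Simulated responses to five rigor items driven by one latent factor.
rng = np.random.default_rng(3)
n_students, n_items = 1000, 5
rigor = rng.normal(size=(n_students, 1))              # latent rigor
loadings = rng.uniform(0.6, 0.9, size=n_items)
responses = rigor * loadings + rng.normal(scale=0.5,
                                          size=(n_students, n_items))

# A strong one-factor solution appears as a single dominant eigenvalue
# of the item correlation matrix (Kaiser criterion: only one above 1).
corr = np.corrcoef(responses, rowvar=False)
eig = np.sort(np.linalg.eigvalsh(corr))[::-1]
n_factors = int((eig > 1.0).sum())
print(eig.round(2), n_factors)
```

On data generated this way, the first eigenvalue dwarfs the rest, which is the signature a confirmatory one-factor analysis looks for.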
Citations: 2
Malleable and Immutable Student Characteristics: Incoming Profiles and Experiences on Campus
Q4 Social Sciences | Pub Date: 2019-12-03 | DOI: 10.5325/jasseinsteffe.8.1-2.0022
Michael Ben-Avie, Brian D. Darrow
Abstract: Our predictive models of student success provide evidence that students' incoming profiles do not define their destiny. We have found that the learning and developmental experiences that they have after enrollment are far more important in predicting persistence, academic achievement, and graduation. In contrast to immutable student demographic characteristics, we have found that malleable characteristics among students (such as academic habits of mind, sense of belonging, and future orientation) predict student success. Paying attention to students' development does not detract from their learning. In fact, promoting the highest levels of development among students seems to be what helps them reach high academic goals.

Keywords: predictive modeling, student success, longitudinal, cohort study, malleable characteristics, learning and development
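As a hedged sketch of predictive modeling in this vein (not the authors' actual models or data), the code below fits a small logistic regression in which a malleable characteristic, but not an immutable demographic flag, drives simulated persistence.

```python
import numpy as np

# Hypothetical sketch: persistence (0/1) driven, by construction, by a
# malleable trait (sense of belonging) and not by a demographic flag.
rng = np.random.default_rng(5)
n = 2000
belonging = rng.normal(size=n)
demographic = rng.integers(0, 2, size=n).astype(float)
p_true = 1 / (1 + np.exp(-1.2 * belonging))
persist = (rng.random(n) < p_true).astype(float)

# Logistic regression fit by plain gradient descent.
X = np.column_stack([np.ones(n), belonging, demographic])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - persist) / n
print(w.round(2))
```

The fitted weights attribute persistence to the malleable trait and leave the demographic coefficient near zero, the pattern the abstract describes.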
Citations: 2