Latest Publications in Learning and Performance Assessment

Designing a Cloud-Based Assessment Model
Pub Date : 1900-01-01 DOI: 10.4018/978-1-7998-0420-8.ch020
T. Rickards, A. Steele
A cloud-based assessment learning environment exists when the collaborative sharing features of cloud computing tools (e.g. Google Docs) are utilised for continuous assessment of student learning activity over an extended period of time. This chapter describes a New Zealand Polytechnic success story that utilised a multi-method approach to investigate student perceptions of a cloud assessment learning environment. The learning environment factors examined in this chapter include progress monitoring, cloud tools (i.e. Google Docs), feedback, cloud storage, technology preference, student achievement, and student engagement. This chapter not only describes this unique learning environment; it also provides clear insight into student perceptions of the cloud assessment learning environment. In concluding, the chapter provides some outcomes that may be utilised to improve pedagogy and student outcomes in a STEM-based multimedia learning environment.
Citations: 0
Understanding Your Learner
Pub Date : 1900-01-01 DOI: 10.4018/978-1-7998-0420-8.ch072
T. Souders
Now more than ever before, health care educators are being challenged to meet the complex and dynamic needs of an expanding health care workforce. Continuing education requirements as well as graduate and undergraduate programs are striving to keep pace with the demands for more highly skilled health care professionals. Likewise, technology and related instructional media have been evolving at an exponential pace. The confluence of these variables requires health care educators to be knowledgeable about the options and tools available to design and deliver instruction using a variety of platforms in more diverse settings. In order to ensure that instruction achieves its intended goals, it is imperative to fully assess the learner characteristics of the target audience. The purpose of this chapter is to discuss the rationale for conducting a learner analysis and utilizing learner characteristics in designing effective instruction.
Citations: 0
Serious Games for Students' E-Assessment Literacy in Higher Education
Pub Date : 1900-01-01 DOI: 10.4018/978-1-5225-0531-0.CH014
María Soledad Ibarra-Sáiz, Gregorio Rodríguez-Gómez
This chapter presents partial results from the DevalS Project (Developing Sustainable Assessment – Improving Students' Assessment Competence through Virtual Simulations), financed by the Spanish Ministry of Economy and Competitiveness (Ref. EDU2012-31804). The results focus on the use and usefulness of serious games for e-assessment literacy from the students' point of view. First, the chapter introduces the project. Second, it reviews the serious games that have been developed and implemented in different undergraduate courses. Finally, it presents the results and conclusions of surveys completed by students.
Citations: 3
Student Participation in Assessment Processes
Pub Date : 1900-01-01 DOI: 10.4018/978-1-7998-0420-8.ch058
Victoria Quesada, Eduardo García-Jiménez, M. Gómez-Ruiz
The participation of students in higher education assessment processes has been proven to have many benefits. However, there is a diverse range of techniques and options when implementing participative assessment, each offering new possibilities. This chapter focuses on student participation in assessment processes and explores the main stages at which it can be developed: in design, during implementation, and in grading. The chapter also considers the different modalities that can be used, especially self-assessment, peer assessment, and co-assessment, and the three stages that characterise them. Finally, it analyses three experiences of student participation in higher education assessment, highlighting their strengths and weaknesses. These experiences show how participative assessment can be developed in everyday classes, in groups, or individually, and how it can occur in different class settings. They also demonstrate the importance of design and assessment literacy, as well as some difficulties that might appear during the process.
Citations: 0
Designing Assessment, Assessing Instructional Design
Pub Date : 1900-01-01 DOI: 10.4018/978-1-5225-9279-2.ch047
Stefanie Panke
Assessment plays a vital role in delivering, evaluating, monitoring, improving, and shaping learning experiences on the Web, at the desk, and in the classroom. In the process of orchestrating educational technologies, instructional designers are often confronted with the challenge of designing or deploying creative and authentic assessment techniques. For an instructional designer, the focus of assessment can be on individual learning, organizational improvement, or the evaluation of educational technologies. A common question across these domains is how to translate pedagogical concepts such as authenticity and creativity into concrete practical applications and metrics. Educational technologies can support creative processes and offer connections to authentic contexts, just as they can curtail creativity and foster standardized testing routines. The chapter discusses theoretical frameworks and provides examples of the conceptual development and implementation of assessment approaches in three different areas: needs assessment, impact assessment, and classroom assessment.
Citations: 0
Authentic Assessment
Pub Date : 1900-01-01 DOI: 10.4018/978-1-7998-0420-8.ch054
Simona Iftimescu, Romiță Iucu, Elena Marin, Mihaela Stingu
The purpose of this chapter is to analyze and discuss the concept of authentic assessment at the Master's degree level. Firstly, the chapter attempts to provide a better understanding of the Master's program within the context of the Bologna system by offering a short historical perspective on the evolution of the Bologna process, as well as trying to identify its true beneficiaries. The chapter also addresses some of the challenges of the assessment process around two main themes: the types and the aim of assessment. Furthermore, the authors focus on the role of authentic assessment at the Master's degree level, as reflected in students' perceptions and correlated with its intended purpose. Drawing on the findings, the authors attempt to shape a description of what authentic assessment is and what it should be at the Master's degree level.
Citations: 0
Formative Assessment and Preservice Elementary Teachers' Mathematical Justification
Pub Date : 1900-01-01 DOI: 10.4018/978-1-7998-0420-8.ch018
Alden J. Edson, D. Rogers, C. Browning
The focus of this chapter is on elementary preservice teachers' (PSTs') use of justification in problem-solving contexts, based on a semester-long algebra course designed for elementary education mathematics minors. Formative assessment and digital tools facilitated the development of PSTs' understanding and use of justification in algebraic topics. The instructional model used includes the following components: negotiating a "taken-as-shared" justification rubric criteria; engaging in problem solving; preparing, digitally recording, and posting justification videos to the cloud; and finally, listening to and sharing descriptive feedback on the posted videos. VoiceThread was the digital venue for the preservice teachers to listen to their peers' justifications and post descriptive feedback. Findings from an analysis of one group focus on the PSTs' peer- and self-feedback as it developed through the semester, and on the PSTs' ability to provide a range of descriptive feedback with the potential to promote growth in the understanding and use of mathematical justification.
Citations: 0
Examining the Psychometric Properties of the Standards Assessment Inventory
Pub Date : 1900-01-01 DOI: 10.4018/IJTEPD.2018010106
William R. Merchant, K. Ciampa, Zora Wolfe
The purpose of this article is to assess the psychometric properties of the Standards Assessment Inventory (SAI) in order to confirm its construct validity using modern statistical procedures. The SAI is a 50-item assessment designed to measure the degree to which professional development programs align with seven factors related to “high quality” teacher learning (Learning Forward, 2011). These seven factors are Learning Communities, Leadership, Resources, Data, Learning Design, Implementation, and Outcomes. In their original evaluation of the factor structure of the SAI, Learning Forward (2011) tested one model containing all 50 items loading onto a single factor, and seven individual factor models, each containing one of the seven standards of professional development. To date there has been no published report related to the psychometric properties of a seven-factor model, which allows each of the seven standards to covary. The initial test of this model produced a poor fit, after which a series of modifications were attempted to improve the functioning of the SAI. After all meaningful modifications were added, the overall fit of the SAI was still outside of a range that would suggest a statistically valid measurement model. Suggestions for SAI modification and use are made as they relate to these findings.
Citations: 1
A Case Study of Peer Assessment in a Composition MOOC
Pub Date : 1900-01-01 DOI: 10.4018/978-1-7998-0420-8.ch043
L. Vu
The large enrollments of many thousands of students in MOOCs seem to exceed the assessment capacity of instructors; the inability of instructors to grade so many papers is likely responsible for MOOCs turning to peer assessment. However, there has been little empirical research about peer assessment in MOOCs, especially composition MOOCs. This study aimed to address issues in peer assessment in a composition MOOC, particularly students' perceptions and peer-grading scores versus instructor-grading scores. The findings provided evidence that peer assessment was well received by the majority of students, although many students also expressed negative feelings about the activity. Statistical analysis shows significant differences between the grades given by students and those given by the instructors: the grades students awarded to their peers tended to be higher than the instructor-assigned grades. Based on the results, the study concludes with suggestions for implementing peer assessment in a composition MOOC context.
Citations: 0
Competency-Based Assessment
Pub Date : 1900-01-01 DOI: 10.4018/978-1-7998-0420-8.CH006
M. K. Idrissi, Meriem Hnida, S. Bennani
Competency-based Assessment (CBA) is the measurement of a student's competency against a standard of performance. It is a process of collecting evidence to analyze the student's progress and achievement. In higher education, competency-based assessment puts the focus on learning outcomes to continuously improve academic programs and meet labor market demands. To date, competencies are described using natural language but rarely used in e-learning systems, and the common-sense idea is that the way competency is defined shapes the way it is conceptualized, implemented, and assessed. The main objective of this chapter is to introduce and discuss competency-based assessment from methodological and technical perspectives. More specifically, the objective is to highlight ongoing issues regarding competency assessment in higher education in the 21st century, to emphasize the benefits of its implementation, and finally to discuss some competency modeling and assessment techniques.
Citations: 9