
Studies in Language Assessment: Latest Publications

Language assessment literacy for language learning-oriented assessment
Q4 LINGUISTICS Pub Date: 2017-01-01 DOI: 10.58379/lixl1198
L. Hamp-Lyons
This paper reflects on the findings of a small-scale exploratory study which investigated whether and how learning-oriented assessment opportunities might be revealed in, or inserted into, formal speaking tests in order to provide language assessment literacy opportunities for language teachers teaching in test preparation courses, as well as for teachers training to become speaking test raters. Hamp-Lyons and Green (2014) closely studied a set of authentic speaking test video samples from the Cambridge English: First (First Certificate in English) speaking test, in order to learn whether, and where, learning-oriented behaviours could be encouraged or added to interlocutors’ behaviours without disrupting the required reliability and validity of the test. We paid particular attention to some basic components of effective interaction that we would want an examiner or interlocutor to exhibit if they seek to encourage interactive responses from test candidates: body language (in particular eye contact), intonation, pacing and pausing; management of turn-taking; and elicitation of candidate-candidate interaction. We call this shift in focus, in which tests are viewed as learning opportunities, learning-oriented language assessment (LOLA).
Citations: 2
Developing assessment literacy of teachers of languages: A conceptual and interpretive challenge
Q4 LINGUISTICS Pub Date: 2017-01-01 DOI: 10.58379/rxyj7968
A. Scarino
The teaching and learning of (foreign) languages in the context of globalisation is at a juncture in Australian education where fundamental changes in the field present distinctive challenges for teachers. These changes necessitate a reconceptualisation of the constructs and alter the very nature of assessment: the conceptualisation of what it is that is to be assessed, the processes used to elicit evidence of student learning, and the frames of reference that provide the context for making judgments about students’ language learning. In this paper I discuss the shift from communicative language teaching towards an intercultural orientation in language learning. Based on data from a three-year study that investigated teacher assessment of language learning from an intercultural perspective in a range of specific languages in the K–12 context, I discuss the nature of the challenge for teachers as they develop their assessment practices. This challenge is characterised as both conceptual and interpretive. I conclude by drawing implications for developing the assessment literacy of teachers of languages.
Citations: 19
Read, J. (Ed.) Post-admission Language Assessment of University Students
Q4 LINGUISTICS Pub Date: 2016-01-01 DOI: 10.58379/lhrm6354
M. Czajkowski
n/a
Citations: 0
The Construct and Predictive Validity of a Self-Assessment Scale
Q4 LINGUISTICS Pub Date: 2016-01-01 DOI: 10.58379/jdlz9308
Jason Fan
Guided by the theory of interpretive validity argument, this study investigated the plausibility and accuracy of five sets of warrants which were deemed crucial to the validity of a self-assessment (SA) scale designed and used in a local EFL context. Methodologically, this study utilized both the Rasch measurement theory and structural equation modeling (SEM) to examine the five warrants and their respective rebuttals. Results from Rasch analysis indicated that the scale could reliably distinguish students at different proficiency levels. Among the 26 can-do statements in the SA scale, only one statement failed to fit the expectations of the Rasch model. Furthermore, each category was found to function as intended, though the first category was somewhat underused. Confirmatory factor analysis of the SA data supported the tenability of the Higher-Order Factor model which is consistent with the current view of L2 ability. Structural regression analysis revealed that the association between students’ self-assessments and their scores on a standardized proficiency test was moderately strong. The multiple strands of evidence generated by various quantitative analyses of the SA data generally supported the validity of the SA scale. Future research, however, is warranted to examine other inferences in the validity argument structure, particularly in relation to the utility of the SA scale in English teaching and learning.
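For orientation only: the abstract above does not state which Rasch parameterisation was used for the polytomous can-do items (in practice a rating-scale or partial-credit extension with category thresholds would apply), but the item-fit and category diagnostics it reports rest on the basic Rasch model, a minimal sketch of which, in the dichotomous case, is

$$P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}$$

where $\theta_n$ is the ability of person $n$ and $\delta_i$ the difficulty of item $i$; fit statistics then compare observed responses against these model-expected probabilities.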
Citations: 10
Testing measurement invariance of an EAP listening placement test across undergraduate and graduate students
Q4 LINGUISTICS Pub Date: 2016-01-01 DOI: 10.58379/tznp6615
S. Youn, Seongah Im
The increasing number of international undergraduates enrolled in English-medium universities creates challenges for an existing EAP (English for Academic Purposes) placement test, especially when the validity of the existing test is not examined with incoming undergraduate examinees. As an attempt to address this issue from a measurement perspective, this study tested measurement invariance in a listening placement test across undergraduate and graduate examinees to investigate whether the test measures the same trait dimension across qualitatively distinct groups of examinees. Using 590 students’ listening placement test results, the best-fitting baseline model was identified first, and then competing models with a series of increasingly restrictive hypotheses were compared to test measurement and structural invariance of the target test across the undergraduate and graduate examinees. Measurement invariance across the undergraduate and graduate examinees held, indicating invariant factors, equal factor loadings for each item, and equal error variances. However, structural invariance was not completely established, especially for the factor means across the two groups, which may suggest different score interpretations and uses depending on examinees’ academic status.
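The nested-model logic behind such invariance testing is typically a chi-square difference (likelihood-ratio) test between a configural model (same factor structure in both groups, parameters free) and a more restrictive model (e.g. factor loadings constrained equal across groups). The abstract does not report the software or fit statistics used, so the sketch below is only an illustration of that comparison, using scipy and purely hypothetical fit values.

```python
from scipy.stats import chi2


def chi_square_difference(chisq_restricted, df_restricted, chisq_free, df_free):
    """Chi-square difference (likelihood-ratio) test between two nested
    multi-group CFA models, e.g. metric (restricted) vs. configural (free)."""
    delta_chisq = chisq_restricted - chisq_free
    delta_df = df_restricted - df_free
    p_value = chi2.sf(delta_chisq, delta_df)  # upper-tail probability
    return delta_chisq, delta_df, p_value


# Hypothetical fit values, NOT taken from the study, purely for illustration:
# configural model: no equality constraints across undergraduate/graduate groups
# metric model:     factor loadings constrained equal across the two groups
d_chisq, d_df, p = chi_square_difference(
    chisq_restricted=312.4, df_restricted=158,   # metric model
    chisq_free=298.7, df_free=148,               # configural model
)
print(f"delta chi2 = {d_chisq:.1f}, delta df = {d_df}, p = {p:.3f}")
# A non-significant difference (p > .05) is consistent with loading invariance.
```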
Citations: 1
Negotiating the boundary between achievement and proficiency: An evaluation of the exit standard of an academic English pathway program
Q4 LINGUISTICS Pub Date: 2016-01-01 DOI: 10.58379/krdu8216
Susy Macqueen, S. O'Hagan, B. Hughes
Academic English programs are popular pathways into English-medium university courses across the world. A typical program design hinges on an established university entrance standard, e.g. IELTS 6.5, and extrapolates the timing and structure of the pathway stages in relation to the test standard. The general principle is that the course assessments substitute for the test standard so that successful completion of the course is considered equivalent to achieving the minimum test standard for university entrance. This study reports on an evaluation of such course assessments at a major Australian university. The evaluation sought to determine the appropriateness of the exit standard in relation to an independent measure of academic English ability. It also explored the suitability of the course final assessments used to produce measures in relation to that standard, by investigating the robustness of the processes and instruments and their appropriateness in relation to the course and the target academic domain. The evaluation revealed the difficult relationship between best practice in achievement testing in academic English pathway programs and external proficiency test standards. Using the sociological concept of ‘boundary object’ worlds (Star & Griesemer, 1989), we suggest that program evaluations that arise from a specific institutional concern for meeting adequate language standards can be informative about interactions between assessments in use.
Citations: 0
Evaluating the achievements and challenges in reforming a national language exam: The reform team’s perspective
Q4 LINGUISTICS Pub Date: 2016-01-01 DOI: 10.58379/qfjy5510
C. Spöttl, B. Kremmel, F. Holzknecht, J. Alderson
This paper outlines the reform of the national school-leaving exam in Austria from a teacher-designed exam to a professionally developed and standardized exam for the foreign languages English, French, Italian and Spanish, evaluating the unexpected challenges met along the way from the project team’s perspective. It describes the assessment context prior to the reform to illustrate the perceived need for change and outlines the steps taken to address this need. The paper explains how key features of the exam reform project were implemented step-by-step to raise awareness with stakeholders and convince authorities to support and adopt the new approach. Reporting on the various stages of the project, it evaluates its success in introducing one standardized CEFR-based test for all students nationwide. The paper in particular highlights the unexpected political, technical and practical challenges faced, and how these were addressed, overcome or endured and with what consequences. The paper concludes with reflections and recommendations on how comparable test development projects may be approached.
Citations: 7
Introduction to Special Issue
Q4 LINGUISTICS Pub Date: 2016-01-01 DOI: 10.58379/ydms1439
C. Elder
n/a
Citations: 0
Defining assessment standards for a new national tertiary-level qualification
Q4 LINGUISTICS Pub Date: 2016-01-01 DOI: 10.58379/cthx3423
J. Read
In this era of public accountability, defining levels of performance for assessment purposes has become a major consideration for educational institutions. It was certainly true of the development by the national qualifications authority of the New Zealand Certificates of English Language (NZCEL), a five-level sequence of awards for learners of English as an additional language at the post-secondary level implemented in 2014. The process of defining the five levels involved benchmarking of standards both nationally and internationally, particularly in relation to the Common European Framework of Reference (CEFR). This paper presents an outsider’s view of the definition of standards for the NZCEL, based on information provided by key participants at the national and local levels. The process has involved taking account of not only the CEFR but also the New Zealand Qualifications Framework (NZQF) and the band score levels of the International English Language Testing System (IELTS). The paper focuses in particular on the issue of establishing the equivalence of NZCEL 4 (Academic) to other recognised measures of English language proficiency as an admission requirement to undergraduate study for international students. The benchmarking process was both multi-faceted and open-ended, in that several issues remain unresolved as implementation of programmes leading to the NZCEL 4 (Academic) has proceeded. At the time of writing, the NZCEL qualifications are scheduled for a formal review and the paper concludes with a discussion of the issues that ideally should be addressed in evaluating the qualification to date.
Citations: 1
Kunnan, A. J. (Ed.) Talking about language assessment: The LAQ interviews
Q4 LINGUISTICS Pub Date: 2016-01-01 DOI: 10.58379/jjli5881
Paul Gruba
n/a
Citations: 0