The gap between communicative ability measurements: General-purpose English speaking tests and linguistic laypersons’ judgments. Takanori Sato. Studies in Language Assessment, 2018. DOI: https://doi.org/10.58379/ngnh8496

The assessment criteria for general-purpose speaking tests are normally derived from test developers’ intuition or from models of communicative competence. The ideal second language (L2) communication reflected in such tests therefore represents language specialists’ perspectives. However, neglecting the views of non-language specialists (i.e., linguistic laypersons) on communication is problematic, since these laypersons are interlocutors in many real-world situations. This study (a) investigated whether L2 speakers’ results on general-purpose speaking tests align with linguistic laypersons’ judgments of L2 communicative ability and (b) explored the performance features that affect these judgments. Twenty-six postgraduate students from non-linguistic disciplines rated 13 speakers’ communicative ability on general-purpose speaking tests and verbally explained the performance features that influenced their ratings. Their ratings were compared with the speakers’ test results, and the features that determined their ratings were examined. Although the laypersons’ ratings were not completely different from the test results, some speakers’ test results did not align with their ratings. The linguistic laypersons’ judgments were affected not only by features that the general-proficiency tests assessed but also by other factors. The findings deepen our understanding of real-world interlocutors’ views on communication and contribute to the development of authentic criteria for general-purpose speaking tests.
{"title":"L. Cheng & J. Fox. Assessment in the Language Classroom: Teachers Supporting Student Learning.","authors":"Lyn May","doi":"10.58379/icxk5618","DOIUrl":"https://doi.org/10.58379/icxk5618","url":null,"abstract":"<jats:p>n/a</jats:p>","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":"14 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88058589","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Learning-oriented Language Test Preparation Materials: A contradiction in terms? A. Green. Studies in Language Assessment, 2017. DOI: https://doi.org/10.58379/sfun3846

The impact of assessment use on teaching and learning is increasingly regarded as a key concern in evaluating how assessments are used. Realising intended forms of impact relies on more than the design of an assessment: account must also be taken of the ways in which teachers, learners and others understand the demands of the assessment and incorporate these into their practice. The measures that testing agencies take to present and explicate their tests to teachers and other stakeholders therefore play an important role in promoting intended impact and mitigating unintended, negative impact. Materials that support teachers in preparing learners to take tests (such as descriptions of the test, preparation materials and teacher training resources) are central to communicating the test providers’ intentions. This study analyses a selection of such support materials provided to teachers by Cambridge English Language Assessment for the Speaking component of a major international test of general English proficiency, Cambridge English: First. The study addresses how these materials might embody or reflect the learning-oriented assessment principles of task authenticity, learner engagement and feedback within a coherent systemic theory of action, reconciling formative and summative assessment functions to the benefit of learning.
{"title":"Solano-Flores, G. Assessing English Language Learners: Theory and Practice","authors":"Yangting Wang","doi":"10.58379/tsdf2267","DOIUrl":"https://doi.org/10.58379/tsdf2267","url":null,"abstract":"<jats:p>n/a</jats:p>","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":"59 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2017-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74285132","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Language Teacher Assessment Literacy – scoping the territory","authors":"Kathryn Hill","doi":"10.58379/kgyb3160","DOIUrl":"https://doi.org/10.58379/kgyb3160","url":null,"abstract":"<jats:p>n/a</jats:p>","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":"29 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2017-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72526364","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Read, J. Assessing English proficiency for university study","authors":"Naoki Ikeda","doi":"10.58379/biyf3236","DOIUrl":"https://doi.org/10.58379/biyf3236","url":null,"abstract":"<jats:p>n/a</jats:p>","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":"33 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2017-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81419407","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
University English teacher assessment literacy: A survey-test report from China. Yueting Xu & Gavin T. L. Brown. Studies in Language Assessment, 2017. DOI: https://doi.org/10.58379/uzon5145

Assessment literacy (AL) is central to the quality of education because competence in assessing student learning leads to informed decisions. While the AL of university English teachers in China is particularly crucial, as they teach the largest group of adult English language learners in the world, it has remained largely unexplored. The present study subjected an adapted version of the Teacher Assessment Literacy Questionnaire to rigorous psychometric analysis and used it to investigate the AL level of Chinese university English teachers (N = 891) and the effects of their demographic characteristics on AL performance. Findings reveal a basic level of AL in certain dimensions, with limited influence from demographic characteristics. The discussion centers on validation of the AL instrument, causes of limited AL competence, and the key factors that have shaped AL. The study concludes with a reflection on constructing contextually grounded AL measures and implications for the principles, policy and practice of teacher assessment education.
Assessment Literacy of Foreign Language Teachers around Europe: Research, Challenges and Future Prospects. Dina Tsagari & Karin Vogt. Studies in Language Assessment, 2017. DOI: https://doi.org/10.58379/uhix9883

The concept of literacy has expanded significantly over the last decade to include notions such as computer literacy and science literacy in academic and public discourse. The recent focus of attention on classroom-based language assessment has brought language assessment literacy (LAL) to the forefront, with foreign language teachers constituting one key group of stakeholders for whom professionalization in this area is important. The current paper reports findings from part of a large-scale study undertaken in seven European countries to shed light on foreign language teachers’ perceived LAL levels and their training needs in language testing and assessment. The focus is on the qualitative part of the mixed-methods study, which examines foreign language school teachers’ perceptions in three educational contexts: Cyprus, Germany and Greece. The paper closes with an outline of the implications for teacher development and directions for future research.
Understanding classroom-based assessment practices: A precondition for teacher assessment literacy. Kathryn Hill. Studies in Language Assessment, 2017. DOI: https://doi.org/10.58379/yiwz4710

The research suggests that while language teachers recognise the importance of developing their assessment literacy, they often have difficulty articulating and prioritising their needs (Fulcher, 2012; Tsagari & Vogt, this issue). One explanation might be that they lack the means to reflect in any systematic way on the nature of their classroom-based assessment (CBA) practices. In other words, teachers need to develop a better understanding of what they already do before they can start to think about which aspects of their CBA practices could be developed. This paper describes a framework designed to help teachers identify and analyse their existing CBA practices as a precursor to reflecting on their professional development needs. Discussion of language teacher assessment literacy (TAL) has tended to focus on the more planned and formal types of assessment. However, it is now widely recognised that classroom teachers are involved in a more or less continuous process of appraisal and, moreover, that the feedback provided in these more incidental and embedded forms of assessment can have a powerful effect on learning (Hattie, 2009). Hence, the framework described in this article attempts to present a view of CBA which encompasses the full spectrum of assessment practices, including the types of assessment which occur spontaneously in the course of routine classroom interactions (Leung, 2005; Purpura, Liu, Tsutagawa & Woodson, 2014). The starting point was an existing framework, based on an ethnographic study of CBA in language classrooms, designed to help researchers identify and make sense of observed CBA practices (Hill, 2012; Hill & McNamara, 2012). This framework was extended and elaborated with reference to principles of TAL, as well as to the research on CBA more generally, and reframed as a tool to help teachers make sense of their own assessment practices, which, it is argued, is an essential precondition for developing assessment literacy.
Developing assessment literacy in Singapore: How teachers broaden English language learning by expanding assessment constructs. Rajenthiran Sellan. Studies in Language Assessment, 2017. DOI: https://doi.org/10.58379/xgnu8346

This paper explains how teachers working in a distinctive educational policy context in Singapore expanded their assessment practices to broaden English language learning in their classrooms. The policy context, known as the Integrated Programmes, has been implemented since 2004 in selected schools to de-emphasise the influence of examinations and to promote teacher autonomy in support of students’ learning. More specifically, the paper discusses the findings of a study in which a small group of high school teachers expanded their language learning and assessment constructs beyond those addressed in the national examination and in mainstream school practices. They did this by (1) paying greater attention to culture, (2) building on an extended understanding of genres, (3) giving increased importance to content knowledge, and (4) placing a stronger emphasis on higher-order thinking, learning, and communicating in authentic contexts. While these four areas are drawn from the distinctive educational context of the Integrated Programmes, they illustrate how English language teachers more generally can develop their assessment literacy and move beyond the constructs examined in high-stakes tests to increase students’ learning in their local contexts.