{"title":"Using computerised comparative judgement to assess translation","authors":"Chao Han, Bei Hu, Qin Fan, Jing Duan, Xi Li","doi":"10.1556/084.2022.00001","DOIUrl":null,"url":null,"abstract":"\n Translation assessment represents a productive line of research in Translation Studies. An array of methods has been trialled to assess translation quality, ranging from intuitive assessment to error analysis and from rubric scoring to item-based assessment. In this article, we introduce a lesser-known approach to translation assessment called comparative judgement. Rooted in psychophysical analysis, comparative judgement grounds itself on the assumption that humans tend to be more accurate in making relative judgements than in making absolute judgements. We conducted an experiment, as both a methodological exploration and a feasibility investigation, in which novice and experienced judges were recruited to assess English-Chinese translation, using a computerised comparative judgement platform. The collected data were analysed to shed light on the validity and reliability of assessment results and the judges’ perceptions. Our analysis shows that (1) overall, comparative judgement produced valid measures and facilitated judgement reliability, although such results seemed to be affected by translation directionality and judges’ experience, and (2) the judges were generally confident about their decisions, despite some emergent factors undermining the validity of their decision making. Finally, we discuss the use of comparative judgement as a possible method in translation assessment and its implications for future practice and research.","PeriodicalId":44202,"journal":{"name":"Across Languages and Cultures","volume":" ","pages":""},"PeriodicalIF":1.0000,"publicationDate":"2022-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Across Languages and Cultures","FirstCategoryId":"98","ListUrlMain":"https://doi.org/10.1556/084.2022.00001","RegionNum":3,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"0","JCRName":"LANGUAGE & LINGUISTICS","Score":null,"Total":0}
Citations: 0
Abstract
Translation assessment represents a productive line of research in Translation Studies. An array of methods has been trialled to assess translation quality, ranging from intuitive assessment to error analysis and from rubric scoring to item-based assessment. In this article, we introduce a lesser-known approach to translation assessment called comparative judgement. Rooted in psychophysical analysis, comparative judgement grounds itself on the assumption that humans tend to be more accurate in making relative judgements than in making absolute judgements. We conducted an experiment, as both a methodological exploration and a feasibility investigation, in which novice and experienced judges were recruited to assess English-Chinese translation, using a computerised comparative judgement platform. The collected data were analysed to shed light on the validity and reliability of assessment results and the judges’ perceptions. Our analysis shows that (1) overall, comparative judgement produced valid measures and facilitated judgement reliability, although such results seemed to be affected by translation directionality and judges’ experience, and (2) the judges were generally confident about their decisions, despite some emergent factors undermining the validity of their decision making. Finally, we discuss the use of comparative judgement as a possible method in translation assessment and its implications for future practice and research.
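To make the method concrete: in comparative judgement, judges repeatedly choose the better of two translations, and the resulting win/loss data are converted into a quality scale by a pairwise-comparison model. The abstract does not specify which model the platform used, so the sketch below is only an assumption for illustration; it fits a simple Bradley-Terry model (a common choice on computerised comparative judgement platforms) by gradient ascent on simulated judgements.

```python
# Minimal sketch (assumed, not the study's actual model): estimating latent
# translation quality from pairwise comparative judgements with a
# Bradley-Terry model fitted by gradient ascent on the log-likelihood.
import math
import random


def fit_bradley_terry(judgements, n_items, lr=0.05, epochs=2000):
    """judgements: list of (winner, loser) index pairs produced by judges."""
    theta = [0.0] * n_items  # latent quality parameter per translation
    for _ in range(epochs):
        grad = [0.0] * n_items
        for winner, loser in judgements:
            # P(winner beats loser) is logistic in the quality difference
            p = 1.0 / (1.0 + math.exp(theta[loser] - theta[winner]))
            grad[winner] += 1.0 - p
            grad[loser] -= 1.0 - p
        theta = [t + lr * g for t, g in zip(theta, grad)]
        mean = sum(theta) / n_items  # centre the scale for identifiability
        theta = [t - mean for t in theta]
    return theta


# Toy usage: five translations with known "true" quality, 300 simulated judgements
random.seed(0)
true_quality = [0.0, 0.5, 1.0, 1.5, 2.0]
judgements = []
for _ in range(300):
    a, b = random.sample(range(5), 2)
    p_a = 1.0 / (1.0 + math.exp(true_quality[b] - true_quality[a]))
    judgements.append((a, b) if random.random() < p_a else (b, a))

print([round(t, 2) for t in fit_bradley_terry(judgements, 5)])
```

The recovered scale should reproduce the ordering of the simulated qualities; in practice, the separation reliability of such a scale (and judges' agreement) is what studies like this one examine.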
About the journal
Across Languages and Cultures publishes original articles and reviews on all sub-disciplines of Translation and Interpreting (T/I) Studies: general T/I theory, descriptive T/I studies and applied T/I studies. Special emphasis is placed on questions of multilingualism, language policy and translation policy. Publications on new research methods and models are encouraged. The journal also publishes book reviews, news, announcements and advertisements.