{"title":"A structural equation investigation of linguistic features as indices of writing quality in assessed secondary-level EMI learners’ scientific reports","authors":"Jack Pun , Wangyin Kenneth Li","doi":"10.1016/j.asw.2024.100897","DOIUrl":null,"url":null,"abstract":"<div><div>While inquiry into the relationship between linguistic features and L2 writing quality has been a long-standing line of research, little scholarly attention has been drawn to the predictive value of linguistic features in assessing the writing quality of English-medium scientific report writing. This study adds to the existing literature by examining the relation of lexical and syntactic complexity to writing quality, based on 106 scientific reports composed by Hong Kong Chinese learners of English in EMI secondary schools. Natural language processing tools were employed to extract computational indices of linguistic complexity features, followed by the use of a structural equation modeling (SEM) approach to investigate their predictive power. The validity of the anticipated construct was confirmed based upon several goodness-of-fit criteria. The SEM analysis indicated that writing quality was predicted by lexical sophistication (i.e., text-based complexity: word range and academic words; psycholinguistic complexity: word familiarity and age-of-acquisition ratings), lexical diversity (i.e., MTLD and VocD), and syntactic complexity (i.e., mean length of sentence and dependent clauses per T-unit). However, the relation of lexical diversity and syntactic complexity to writing quality was mediated by lexical sophistication. 
Implications for scientific report writing assessment and pedagogy in EMI educational contexts are discussed.</div></div>","PeriodicalId":46865,"journal":{"name":"Assessing Writing","volume":"62 ","pages":"Article 100897"},"PeriodicalIF":4.2000,"publicationDate":"2024-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Assessing Writing","FirstCategoryId":"98","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1075293524000904","RegionNum":1,"RegionCategory":"文学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0
Abstract
While the relationship between linguistic features and L2 writing quality has been a long-standing line of research, little scholarly attention has been paid to the predictive value of linguistic features in assessing the quality of English-medium scientific report writing. This study adds to the existing literature by examining the relation of lexical and syntactic complexity to writing quality in 106 scientific reports composed by Hong Kong Chinese learners of English in EMI secondary schools. Natural language processing tools were employed to extract computational indices of linguistic complexity, and a structural equation modeling (SEM) approach was then used to investigate their predictive power. The validity of the hypothesized construct was confirmed against several goodness-of-fit criteria. The SEM analysis indicated that writing quality was predicted by lexical sophistication (text-based complexity: word range and academic words; psycholinguistic complexity: word familiarity and age-of-acquisition ratings), lexical diversity (MTLD and VocD), and syntactic complexity (mean length of sentence and dependent clauses per T-unit). However, the relation of lexical diversity and syntactic complexity to writing quality was mediated by lexical sophistication. Implications for scientific report writing assessment and pedagogy in EMI educational contexts are discussed.
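Among the lexical diversity indices the abstract names, MTLD (Measure of Textual Lexical Diversity) is computed by counting how many stretches of text ("factors") it takes for the running type-token ratio to fall to a threshold (conventionally 0.72). The sketch below is a minimal illustration of that standard algorithm, not the authors' actual tooling, which relied on existing NLP pipelines:

```python
def _mtld_one_direction(tokens, threshold=0.72):
    """One directional pass of MTLD: count 'factors', i.e. stretches of
    text over which the running type-token ratio (TTR) falls to the threshold."""
    factors = 0.0
    types, count = set(), 0
    for tok in tokens:
        count += 1
        types.add(tok.lower())
        if len(types) / count <= threshold:
            factors += 1.0          # full factor completed; reset the window
            types, count = set(), 0
    if count > 0:                   # partial factor for the leftover stretch
        ttr = len(types) / count
        factors += (1.0 - ttr) / (1.0 - threshold)
    if factors == 0.0:              # TTR never fell to the threshold
        return float(len(tokens))
    return len(tokens) / factors


def mtld(tokens, threshold=0.72):
    """MTLD: mean of a forward and a backward pass over the token list."""
    if not tokens:
        return 0.0
    forward = _mtld_one_direction(tokens, threshold)
    backward = _mtld_one_direction(list(reversed(tokens)), threshold)
    return (forward + backward) / 2.0
```

Higher MTLD values mean the TTR stays high over longer stretches, i.e. a more diverse vocabulary; repetitive text completes factors quickly and scores low.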
Journal description:
Assessing Writing is a refereed international journal providing a forum for ideas, research and practice on the assessment of written language. Assessing Writing publishes articles, book reviews, conference reports, and academic exchanges concerning writing assessments of all kinds, including traditional (direct and standardised forms of) testing of writing, alternative performance assessments (such as portfolios), workplace sampling and classroom assessment. The journal covers all stages of the writing assessment process, including needs evaluation, assessment creation, implementation, validation, and test development.