DYPA
Shuhan Zhong, Sizhe Song, Tianhao Tang, Fei Nie, Xinrui Zhou, Yankun Zhao, Yizhe Zhao, Kuen Fung Sin, S.-H. Gary Chan
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, published 2023-09-27. DOI: 10.1145/3610908
Identifying a person with dyslexia, a learning disorder that affects reading and writing, early is critical for effective treatment. Because accredited specialists for the clinical diagnosis of dyslexia are costly and in short supply, we research and develop a computer-assisted approach to efficiently prescreen Chinese children for dyslexia so that resources can be channelled in a timely manner to those at higher risk. Previous work in this area mostly targets English and other alphabetic languages, is tailored narrowly to the reading disorder, or requires costly specialized equipment. To overcome these limitations, we present DYPA, a novel DYslexia Prescreening mobile Application for Chinese children. DYPA collects multimodal data from children through a set of specially designed interactive reading and writing tests in Chinese, and comprehensively analyzes their cognitive-linguistic skills with machine learning. To better account for dyslexia-associated features in handwritten characters, DYPA employs a deep-learning-based multilevel Chinese handwriting analysis framework that extracts features at the stroke, radical, and character levels. We have implemented DYPA on tablets, and our extensive trials with more than 200 pupils in Hong Kong validate its high predictive accuracy (81.14%), sensitivity (74.27%), and specificity (82.71%).
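For readers unfamiliar with the reported screening metrics, the sketch below shows how predictive accuracy, sensitivity, and specificity are conventionally derived from a binary confusion matrix (at-risk vs. not at-risk). This is a minimal illustration of the standard definitions only; the function name, label encoding, and example counts are hypothetical and are not taken from the DYPA trial data.

```python
import numpy as np


def screening_metrics(y_true, y_pred):
    """Accuracy, sensitivity, and specificity for a binary screening task.

    y_true, y_pred: arrays of 0/1 labels, where 1 marks a child flagged
    as at risk of dyslexia (hypothetical encoding for illustration).
    """
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)

    tp = np.sum((y_true == 1) & (y_pred == 1))  # correctly flagged as at risk
    tn = np.sum((y_true == 0) & (y_pred == 0))  # correctly passed
    fp = np.sum((y_true == 0) & (y_pred == 1))  # flagged but not at risk
    fn = np.sum((y_true == 1) & (y_pred == 0))  # at-risk child missed

    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)  # true positive rate
    specificity = tn / (tn + fp)  # true negative rate
    return accuracy, sensitivity, specificity


# Toy example with made-up labels (not DYPA trial results):
acc, sens, spec = screening_metrics([1, 0, 1, 1, 0, 0], [1, 0, 0, 1, 0, 1])
print(f"accuracy={acc:.2f}, sensitivity={sens:.2f}, specificity={spec:.2f}")
```

In a prescreening setting, sensitivity reflects how many genuinely at-risk children the tool flags for follow-up, while specificity reflects how many children it correctly does not flag; both are reported alongside overall accuracy in the abstract above.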