Math Word Problem Generation via Disentangled Memory Retrieval

Wei Qin, Xiaowei Wang, Zhenzhen Hu, Lei Wang, Yunshi Lan, Richang Hong

ACM Transactions on Knowledge Discovery from Data (TKDD), Journal Article, published 2024-01-26. DOI: 10.1145/3639569. Impact Factor 4.0; JCR Q1, Computer Science, Information Systems.

Abstract:
The task of math word problem (MWP) generation, which produces an MWP given an equation and relevant topic words, has attracted increasing attention from researchers. In this work, we introduce a simple memory retrieval module that searches for related training MWPs, which are then used to augment generation. To retrieve more relevant training data, we further propose a disentangled memory retrieval module built on top of the simple one. To this end, we first disentangle each training MWP into a logical description and a scenario description and record them in separate memory modules. We then use the given equation and topic words as queries to retrieve relevant logical descriptions and scenario descriptions from the corresponding memory modules. The retrieved results are used to complement the MWP generation process. Extensive experiments and ablation studies verify the superior performance of our method and the effectiveness of each proposed module. The code is available at https://github.com/mwp-g/MWPG-DMR.
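To make the disentangled retrieve-then-generate idea concrete, here is a minimal, self-contained sketch. It is not the authors' implementation (that lives in the linked repository); the Memory class, the bag-of-words cosine retriever, and the prompt-style generate_mwp helper are illustrative assumptions only.

```python
"""Minimal sketch of disentangled memory retrieval for MWP generation.

Assumptions, not the paper's code: a toy bag-of-words retriever stands in
for the learned retrieval module, and generate_mwp only assembles a prompt
instead of calling a trained generator.
"""
from collections import Counter
from dataclasses import dataclass
from math import sqrt
from typing import List


@dataclass
class MemoryEntry:
    key: str    # text used for matching (an equation or topic words)
    value: str  # stored description returned on retrieval


def _cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


class Memory:
    """A flat memory module: stores (key, value) pairs, retrieves by key similarity."""

    def __init__(self) -> None:
        self.entries: List[MemoryEntry] = []

    def add(self, key: str, value: str) -> None:
        self.entries.append(MemoryEntry(key, value))

    def retrieve(self, query: str, k: int = 2) -> List[str]:
        q = Counter(query.lower().split())
        scored = [(_cosine(q, Counter(e.key.lower().split())), e.value) for e in self.entries]
        scored.sort(key=lambda s: s[0], reverse=True)
        return [value for _, value in scored[:k]]


# Two disentangled memories: logical descriptions keyed by equations,
# scenario descriptions keyed by topic words.
logic_memory = Memory()
scene_memory = Memory()

# Record disentangled training MWPs (toy examples).
logic_memory.add("x = 3 * 4", "the total equals three groups of four items each")
scene_memory.add("apples basket", "Tom puts apples into baskets at the market")


def generate_mwp(equation: str, topic_words: str) -> str:
    """Assemble retrieved logical and scenario descriptions into generation context.
    A real system would feed this context to a trained sequence-to-sequence generator."""
    logic_ctx = logic_memory.retrieve(equation)
    scene_ctx = scene_memory.retrieve(topic_words)
    return (f"Equation: {equation}\nTopics: {topic_words}\n"
            f"Logic hints: {logic_ctx}\nScene hints: {scene_ctx}")


if __name__ == "__main__":
    print(generate_mwp("x = 3 * 4", "apples basket"))
```

Keeping the two memories separate is the point of the disentanglement: logical descriptions indexed by equations and scenario descriptions indexed by topic words can be retrieved and recombined independently, so a retrieved equation structure can be paired with a fresh scenario at generation time.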
Journal Introduction:
TKDD welcomes papers on a full range of research in the knowledge discovery and analysis of diverse forms of data. Such subjects include, but are not limited to: scalable and effective algorithms for data mining and big data analysis, mining brain networks, mining data streams, mining multi-media data, mining high-dimensional data, mining text, Web, and semi-structured data, mining spatial and temporal data, data mining for community generation, social network analysis, and graph structured data, security and privacy issues in data mining, visual, interactive and online data mining, pre-processing and post-processing for data mining, robust and scalable statistical methods, data mining languages, foundations of data mining, KDD framework and process, and novel applications and infrastructures exploiting data mining technology including massively parallel processing and cloud computing platforms. TKDD encourages papers that explore the above subjects in the context of large distributed networks of computers, parallel or multiprocessing computers, or new data devices. TKDD also encourages papers that describe emerging data mining applications that cannot be satisfied by the current data mining technology.