{"title":"现代法国诗歌的生成与RoBERTa和GPT-2","authors":"Mika Hämäläinen, Khalid Alnajjar, T. Poibeau","doi":"10.48550/arXiv.2212.02911","DOIUrl":null,"url":null,"abstract":"We present a novel neural model for modern poetry gen- eration in French. The model consists of two pretrained neural models that are fine-tuned for the poem gener- ation task. The encoder of the model is a RoBERTa based one while the decoder is based on GPT-2. This way the model can benefit from the superior natural language understanding performance of RoBERTa and the good natural language generation performance of GPT-2. Our evaluation shows that the model can cre- ate French poetry successfully. On a 5 point scale, the lowest score of 3.57 was given by human judges to typ- icality and emotionality of the output poetry while the best score of 3.79 was given to understandability .","PeriodicalId":13714,"journal":{"name":"Intech","volume":"7 1","pages":"12-16"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Modern French Poetry Generation with RoBERTa and GPT-2\",\"authors\":\"Mika Hämäläinen, Khalid Alnajjar, T. Poibeau\",\"doi\":\"10.48550/arXiv.2212.02911\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present a novel neural model for modern poetry gen- eration in French. The model consists of two pretrained neural models that are fine-tuned for the poem gener- ation task. The encoder of the model is a RoBERTa based one while the decoder is based on GPT-2. This way the model can benefit from the superior natural language understanding performance of RoBERTa and the good natural language generation performance of GPT-2. Our evaluation shows that the model can cre- ate French poetry successfully. On a 5 point scale, the lowest score of 3.57 was given by human judges to typ- icality and emotionality of the output poetry while the best score of 3.79 was given to understandability .\",\"PeriodicalId\":13714,\"journal\":{\"name\":\"Intech\",\"volume\":\"7 1\",\"pages\":\"12-16\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Intech\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.48550/arXiv.2212.02911\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intech","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.48550/arXiv.2212.02911","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Modern French Poetry Generation with RoBERTa and GPT-2
We present a novel neural model for modern poetry generation in French. The model consists of two pretrained neural models that are fine-tuned for the poem generation task. The encoder is based on RoBERTa, while the decoder is based on GPT-2. This way the model can benefit from the superior natural language understanding performance of RoBERTa and the strong natural language generation performance of GPT-2. Our evaluation shows that the model can generate French poetry successfully. On a 5-point scale, human judges gave the lowest score of 3.57 to the typicality and emotionality of the output poetry, while the best score of 3.79 was given to understandability.
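To make the encoder-decoder setup concrete, below is a minimal sketch (not the authors' released code) of how a RoBERTa encoder can be wired to a GPT-2 decoder using Hugging Face Transformers' EncoderDecoderModel. The checkpoint names, prompt, and target verse are illustrative assumptions only; the paper does not necessarily use these exact models or this training loop.

```python
# Sketch: RoBERTa encoder + GPT-2 decoder for poem generation.
# Checkpoint names are assumptions: "camembert-base" is a French RoBERTa
# variant and "asi/gpt-fr-cased-small" a French GPT-2; they stand in for
# whatever pretrained French models a real implementation would use.
from transformers import AutoTokenizer, EncoderDecoderModel

encoder_name = "camembert-base"          # French RoBERTa (assumed)
decoder_name = "asi/gpt-fr-cased-small"  # French GPT-2 (assumed)

enc_tok = AutoTokenizer.from_pretrained(encoder_name)
dec_tok = AutoTokenizer.from_pretrained(decoder_name)

# from_encoder_decoder_pretrained adds cross-attention layers to the decoder
# so the GPT-2 stack can attend to the RoBERTa encoder's hidden states.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(encoder_name, decoder_name)

# Special tokens the seq2seq wrapper needs for training and generation.
model.config.decoder_start_token_id = dec_tok.bos_token_id or dec_tok.eos_token_id
model.config.pad_token_id = enc_tok.pad_token_id
model.config.eos_token_id = dec_tok.eos_token_id

# One schematic fine-tuning step: encode a conditioning prompt, score a target verse.
prompt = "l'automne et la pluie"                        # hypothetical conditioning input
target = "La pluie d'automne tombe sur les toits gris"  # hypothetical target verse

inputs = enc_tok(prompt, return_tensors="pt")
labels = dec_tok(target, return_tensors="pt").input_ids

outputs = model(input_ids=inputs.input_ids,
                attention_mask=inputs.attention_mask,
                labels=labels)
outputs.loss.backward()  # in practice, plug into an optimizer or the Trainer API

# Generation after fine-tuning: sample a poem continuation from the prompt.
generated = model.generate(inputs.input_ids, max_new_tokens=40,
                           do_sample=True, top_p=0.9)
print(dec_tok.decode(generated[0], skip_special_tokens=True))
```

The design choice this illustrates is the one the abstract describes: the encoder contributes RoBERTa's language-understanding strength when reading the conditioning input, while the GPT-2 decoder, attending to the encoder through cross-attention, handles the fluent generation of the poem text.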