Construction of Domain-specified Japanese Large Language Model for Finance through Continual Pre-training

Masanori Hirano, Kentaro Imajo
{"title":"通过持续预训练构建领域指定的日语金融大语言模型","authors":"Masanori Hirano, Kentaro Imajo","doi":"arxiv-2404.10555","DOIUrl":null,"url":null,"abstract":"Large language models (LLMs) are now widely used in various fields, including\nfinance. However, Japanese financial-specific LLMs have not been proposed yet.\nHence, this study aims to construct a Japanese financial-specific LLM through\ncontinual pre-training. Before tuning, we constructed Japanese\nfinancial-focused datasets for continual pre-training. As a base model, we\nemployed a Japanese LLM that achieved state-of-the-art performance on Japanese\nfinancial benchmarks among the 10-billion-class parameter models. After\ncontinual pre-training using the datasets and the base model, the tuned model\nperformed better than the original model on the Japanese financial benchmarks.\nMoreover, the outputs comparison results reveal that the tuned model's outputs\ntend to be better than the original model's outputs in terms of the quality and\nlength of the answers. These findings indicate that domain-specific continual\npre-training is also effective for LLMs. The tuned model is publicly available\non Hugging Face.","PeriodicalId":501294,"journal":{"name":"arXiv - QuantFin - Computational Finance","volume":"214 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Construction of Domain-specified Japanese Large Language Model for Finance through Continual Pre-training\",\"authors\":\"Masanori Hirano, Kentaro Imajo\",\"doi\":\"arxiv-2404.10555\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Large language models (LLMs) are now widely used in various fields, including\\nfinance. However, Japanese financial-specific LLMs have not been proposed yet.\\nHence, this study aims to construct a Japanese financial-specific LLM through\\ncontinual pre-training. Before tuning, we constructed Japanese\\nfinancial-focused datasets for continual pre-training. As a base model, we\\nemployed a Japanese LLM that achieved state-of-the-art performance on Japanese\\nfinancial benchmarks among the 10-billion-class parameter models. After\\ncontinual pre-training using the datasets and the base model, the tuned model\\nperformed better than the original model on the Japanese financial benchmarks.\\nMoreover, the outputs comparison results reveal that the tuned model's outputs\\ntend to be better than the original model's outputs in terms of the quality and\\nlength of the answers. These findings indicate that domain-specific continual\\npre-training is also effective for LLMs. 
The tuned model is publicly available\\non Hugging Face.\",\"PeriodicalId\":501294,\"journal\":{\"name\":\"arXiv - QuantFin - Computational Finance\",\"volume\":\"214 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-04-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - QuantFin - Computational Finance\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2404.10555\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuantFin - Computational Finance","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2404.10555","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Large language models (LLMs) are now widely used in various fields, including finance. However, Japanese financial-specific LLMs have not been proposed yet. Hence, this study aims to construct a Japanese financial-specific LLM through continual pre-training. Before tuning, we constructed Japanese financial-focused datasets for continual pre-training. As a base model, we employed a Japanese LLM that achieved state-of-the-art performance on Japanese financial benchmarks among the 10-billion-class parameter models. After continual pre-training using the datasets and the base model, the tuned model performed better than the original model on the Japanese financial benchmarks. Moreover, the output comparison results reveal that the tuned model's outputs tend to be better than the original model's outputs in terms of the quality and length of the answers. These findings indicate that domain-specific continual pre-training is also effective for LLMs. The tuned model is publicly available on Hugging Face.
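As a concrete illustration of the procedure described above, the sketch below shows how continual pre-training of a causal language model on a domain corpus can be set up with the Hugging Face Transformers Trainer. It is a minimal sketch only: the base-model checkpoint name, corpus file, and hyperparameters are placeholders and assumptions, not the settings used in the paper.

```python
# Minimal sketch of domain-specific continual pre-training with Hugging Face
# Transformers. The checkpoint and corpus file names are placeholders, not the
# ones used in the paper.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "example-org/japanese-llm-10b"   # placeholder for the Japanese base LLM
CORPUS_FILE = "japanese_finance_corpus.txt"   # placeholder for the financial text corpus

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # needed for dynamic padding

# Load the raw financial text and tokenize it (long lines are truncated).
raw = load_dataset("text", data_files={"train": CORPUS_FILE})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

train_ds = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard causal-LM objective: the collator builds next-token-prediction labels.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="finance-continual-pretrain",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=16,
    learning_rate=1e-5,          # small LR to limit forgetting of general ability
    num_train_epochs=1,
    bf16=True,
    logging_steps=50,
    save_strategy="epoch",
)

Trainer(model=model, args=args, train_dataset=train_ds, data_collator=collator).train()
```

Keeping the learning rate small and making few passes over the domain corpus is a common way to reduce catastrophic forgetting of the base model's general knowledge; the abstract does not specify the exact settings used in this study.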