Space semantic aware loss function for embedding creation in case of transaction data

M. Vatkin, D. A. Vorobey
DOI: 10.33581/2520-6508-2022-1-97-102
Journal: Zhurnal Belorusskogo Gosudarstvennogo Universiteta. Matematika. Informatika (Q4, Mathematics)
Published: 2022-04-14 (Journal Article)
Citations: 0

Abstract

Transaction data are among the most common data types in the banking domain; they are often represented as sparse vectors with a large number of features. Using sparse vectors in deep learning tasks is computationally inefficient and may lead to overfitting. Autoencoders are widely applied to extract new, useful features in a lower-dimensional space. In this paper we propose a novel loss function based on a metric that estimates how well the semantic structure of the original tabular data is mapped into the embedded space. The proposed loss function preserves the item-relation structure of the original space during the dimension-reduction transformation. The obtained results show improved embedding properties when the new loss function is combined with the traditional mean squared error loss.
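The abstract does not specify the exact semantic-structure metric used in the paper. As an illustration only, the sketch below shows one common way such a combined objective can be computed: an MSE reconstruction term plus a structure-preservation term that penalizes discrepancies between the normalized pairwise-distance matrices of the input batch and its embeddings. The function names and the weighting parameter `alpha` are assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def pairwise_distances(X):
    # Euclidean distance matrix between the rows of X.
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def combined_loss(x, x_hat, z, alpha=0.5):
    """MSE reconstruction loss plus a structure-preservation term.

    x     : original batch, shape (n, d)
    x_hat : autoencoder reconstruction, shape (n, d)
    z     : embeddings (bottleneck output), shape (n, k)
    alpha : weight of the structure term (illustrative choice)
    """
    mse = ((x - x_hat) ** 2).mean()
    d_x = pairwise_distances(x)
    d_z = pairwise_distances(z)
    # Normalize both distance matrices so they are comparable in scale.
    d_x = d_x / (d_x.max() + 1e-12)
    d_z = d_z / (d_z.max() + 1e-12)
    struct = ((d_x - d_z) ** 2).mean()
    return mse + alpha * struct
```

When the reconstruction is perfect and the embedding preserves all pairwise distances exactly, both terms vanish and the loss is zero; any distortion of the item-relation structure in the embedded space raises the second term.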
Source journal
CiteScore: 0.50
Self-citation rate: 0.00%
Articles published: 21
Review time: 16 weeks
Latest articles from this journal
Algorithm for solving the knapsack problem with certain properties of Pareto layers
Numerical study of the relative equilibrium of a droplet with a simply connected free surface on a rotating plane
On the Hosoya polynomial of the third type of the chain hex-derived network
Algebraic equations and polynomials over the ring of p-complex numbers
On the theory of operator interpolation in spaces of rectangular matrixes