BeautyNet: A Makeup Activity Recognition Framework using Wrist-worn Sensor

Fatimah Albargi, Naima Khan, Indrajeet Ghosh, Ahana Roy
DOI: 10.1109/SMARTCOMP58114.2023.00072 (https://doi.org/10.1109/SMARTCOMP58114.2023.00072)
Published in: 2023 IEEE International Conference on Smart Computing (SMARTCOMP), June 2023
Citations: 0

Abstract

The practice of enhancing facial features has grown increasingly popular across all groups of people, bringing a surge in makeup activities. The makeup market is one of the most profitable and foundational sectors of the fashion industry; it involves product retailing and demands user training. Makeup activities involve exceptionally delicate hand movements and require considerable training and practice to perfect. However, the only available options for learning makeup are hands-on workshops led by professional instructors or, at most, video-based visual instructions. Neither offers much benefit to beginners or to visually impaired people. A learner can watch and listen to the best of their ability, but to practice precisely, perform well, and reach a satisfying result, recognition from an IoT (Internet-of-Things) device that provides results and feedback would be the strongest support. In this work, we propose a makeup activity recognition framework, BeautyNet, which detects different makeup activities from wrist-worn sensor data collected from ten participants of different age groups in two experimental setups. Our framework employs an LSTM-autoencoder-based classifier to extract features from the sensor data and classifies five makeup activities (applying cream, lipstick, blusher, eyeshadow, and mascara) in controlled and uncontrolled environments. Empirical results indicate that BeautyNet achieves 95% and 93% accuracy for makeup activity detection in controlled and uncontrolled settings, respectively. In addition, we evaluate BeautyNet against various traditional machine learning algorithms on our in-house dataset and observe an accuracy improvement of ≈ 4-7%.
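
The paper itself does not include an implementation, but the architecture it names, an LSTM autoencoder whose latent features feed an activity classifier, can be illustrated with a short sketch. The following Python/PyTorch code is a minimal, hypothetical example: the window length (128 samples), the six sensor channels (assumed 3-axis accelerometer plus 3-axis gyroscope), the latent size, and the class name `MakeupLSTMAutoencoder` are illustrative assumptions rather than the authors' published configuration.

```python
# Hypothetical sketch of an LSTM-autoencoder classifier for makeup activities.
# Window length, channel count, and layer sizes are assumptions, not the
# authors' published architecture.
import torch
import torch.nn as nn

NUM_CLASSES = 5      # applying cream, lipstick, blusher, eyeshadow, mascara
NUM_CHANNELS = 6     # assumed: 3-axis accelerometer + 3-axis gyroscope
WINDOW_LEN = 128     # assumed samples per sliding window of wrist-sensor data
LATENT_DIM = 64      # assumed size of the learned feature vector

class MakeupLSTMAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder compresses each sensor window into a latent feature vector.
        self.encoder = nn.LSTM(NUM_CHANNELS, LATENT_DIM, batch_first=True)
        # Decoder reconstructs the window from the repeated latent vector.
        self.decoder = nn.LSTM(LATENT_DIM, NUM_CHANNELS, batch_first=True)
        # Classifier head maps the latent features to the five activities.
        self.classifier = nn.Linear(LATENT_DIM, NUM_CLASSES)

    def forward(self, x):
        # x: (batch, WINDOW_LEN, NUM_CHANNELS)
        _, (h_n, _) = self.encoder(x)
        latent = h_n[-1]                                  # (batch, LATENT_DIM)
        repeated = latent.unsqueeze(1).repeat(1, x.size(1), 1)
        reconstruction, _ = self.decoder(repeated)        # (batch, WINDOW_LEN, NUM_CHANNELS)
        logits = self.classifier(latent)                  # (batch, NUM_CLASSES)
        return reconstruction, logits

model = MakeupLSTMAutoencoder()
windows = torch.randn(8, WINDOW_LEN, NUM_CHANNELS)        # stand-in for real wrist-sensor windows
labels = torch.randint(0, NUM_CLASSES, (8,))
recon, logits = model(windows)
# Joint objective: reconstruct the input window and classify the activity.
loss = nn.functional.mse_loss(recon, windows) + nn.functional.cross_entropy(logits, labels)
loss.backward()
```

The reconstruction branch lets the network learn features of the delicate hand motions without labels, while the classifier head separates the five activities; jointly weighting the two losses, as above, is one common way to realize such a design, though the paper may train the autoencoder and classifier differently.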