Human-Robot Cooperative Piano Playing with Learning-Based Real-Time Music Accompaniment

Huijiang Wang, Xiaoping Zhang, Fumiya Iida
Journal: arXiv - CS - Robotics
DOI: arxiv-2409.11952 (https://doi.org/arxiv-2409.11952)
Published: 2024-09-18 (Journal Article)
Citations: 0

Abstract

Recent advances in machine learning have paved the way for the development of musical and entertainment robots. However, human-robot cooperative instrument playing remains a challenge, particularly due to the intricate motor coordination and temporal synchronization it requires. In this paper, we propose a theoretical framework for human-robot cooperative piano playing based on non-verbal cues. First, we present a music improvisation model that employs a recurrent neural network (RNN) to predict appropriate chord progressions from the human's melodic input. Second, we propose a behavior-adaptive controller to facilitate seamless temporal synchronization, allowing the cobot to generate harmonious acoustics. The collaboration takes into account the bidirectional information flow between the human and the robot. We have developed an entropy-based system to assess the quality of cooperation by analyzing the impact of different communication modalities during human-robot collaboration. Experiments demonstrate that our RNN-based improvisation achieves a 93% accuracy rate. Meanwhile, with the model-predictive-control (MPC) adaptive controller, the robot responds to its human teammate in homophonic performances with real-time accompaniment. Our framework has been validated as effective in allowing humans and robots to work collaboratively on the artistic piano-playing task.
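To make the improvisation idea concrete, the abstract's chord-prediction step can be sketched as a vanilla RNN that consumes a human melody (as pitch-class tokens) and emits a chord class. Everything below — the layer sizes, the 24-triad chord vocabulary, and the function name — is an illustrative assumption, not the authors' actual model or training setup:

```python
import numpy as np

# Toy sketch: a recurrent network mapping a melody (pitch-class tokens 0-11)
# to one of 24 chord classes (e.g. 12 major + 12 minor triads).
# Weights are random and untrained; this only shows the data flow.
rng = np.random.default_rng(0)
N_PITCH, HID, N_CHORD = 12, 16, 24

W_xh = rng.normal(0, 0.1, (HID, N_PITCH))  # input -> hidden
W_hh = rng.normal(0, 0.1, (HID, HID))      # hidden -> hidden (recurrence)
W_hy = rng.normal(0, 0.1, (N_CHORD, HID))  # hidden -> chord logits

def predict_chord(melody):
    """Run a vanilla RNN over the melody and return the predicted chord index."""
    h = np.zeros(HID)
    for pitch in melody:
        x = np.zeros(N_PITCH)
        x[pitch % 12] = 1.0                # one-hot encode the pitch class
        h = np.tanh(W_xh @ x + W_hh @ h)   # recurrent state update
    return int(np.argmax(W_hy @ h))        # most likely chord class

chord = predict_chord([0, 4, 7, 4])        # a C-major-ish motif
print(0 <= chord < N_CHORD)
```

In a real system the weights would be trained on melody-chord pairs, and the predicted chord would feed the adaptive controller that times the robot's keystrokes.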