Pub Date: 2025-10-15 | DOI: 10.1126/scirobotics.adv1383
Max Linnander, Dustin Goetz, Gregory Reardon, Vijay Kumar, Elliot Hawkes, Yon Visell
Tactile displays that lend tangible form to digital content could transform computing interactions. However, achieving the resolution, speed, and dynamic range needed for perceptual fidelity remains challenging. We present a dynamic tactile display that directly converts projected light into visible and tactile patterns via a photomechanical surface populated with millimeter-scale optotactile pixels. The pixels transduce incident light into mechanical displacements through photostimulated thermal gas expansion, yielding millimeter-scale displacements with response times of 2 to 100 milliseconds. The use of projected light for power transmission and addressing renders these displays highly scalable. We demonstrate optically driven displays with up to 1511 addressable pixels, several times more pixels than prior tactile displays attaining comparable performance. Perceptual studies confirm that these displays can reproduce diverse spatiotemporal tactile patterns with high fidelity. This research establishes a foundation for practical and versatile high-resolution tactile displays driven by light.
{"title":"Tactile displays driven by projected light","authors":"Max Linnander, Dustin Goetz, Gregory Reardon, Vijay Kumar, Elliot Hawkes, Yon Visell","doi":"10.1126/scirobotics.adv1383","DOIUrl":"10.1126/scirobotics.adv1383","url":null,"abstract":"<div >Tactile displays that lend tangible form to digital content could transform computing interactions. However, achieving the resolution, speed, and dynamic range needed for perceptual fidelity remains challenging. We present a dynamic tactile display that directly converts projected light into visible and tactile patterns via a photomechanical surface populated with millimeter-scale optotactile pixels. The pixels transduce incident light into mechanical displacements through photostimulated thermal gas expansion, yielding millimeter-scale displacements with response times of 2 to 100 milliseconds. The use of projected light for power transmission and addressing renders these displays highly scalable. We demonstrate optically driven displays with up to 1511 addressable pixels, several times more pixels than prior tactile displays attaining comparable performance. Perceptual studies confirm that these displays can reproduce diverse spatiotemporal tactile patterns with high fidelity. This research establishes a foundation for practical and versatile high-resolution tactile displays driven by light.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 107","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145295091","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-10-15 | DOI: 10.1126/scirobotics.adv4408
Xiangxiao Liu, Matthew D. Loring, Luca Zunino, Kaitlyn E. Fouke, François A. Longchamp, Alexandre Bernardino, Auke J. Ijspeert, Eva A. Naumann
Brains evolve within specific sensory and physical environments, yet neuroscience has traditionally focused on studying neural circuits in isolation. Understanding their function requires integrative brain-body testing in realistic contexts. To investigate the neural and biomechanical mechanisms of sensorimotor transformations, we constructed realistic neuromechanical simulations (simZFish) of the zebrafish optomotor response, a visual stabilization behavior. By computationally reproducing the body mechanics, physical body-water interactions, hydrodynamics, visual environments, and experimentally derived neural network architectures, we closely replicated the behavior of real larval zebrafish. Through systematic manipulation of physiological and circuit connectivity features, impossible in biological experiments, we demonstrate how embodiment shapes neural activity, circuit architecture, and behavior. Changing lens properties and retinal connectivity revealed why the lower posterior visual field drives optimal optomotor responses in the simZFish, explaining receptive field properties observed in real zebrafish. When challenged with novel visual stimuli, the simZFish predicted previously unknown neuronal response types, which we identified via two-photon calcium imaging in the live brains of real zebrafish and incorporated to update the simZFish neural network. In virtual rivers, the simZFish performed rheotaxis autonomously by using current-induced optic flow patterns as navigational cues, compensating for the simulated water flow. Last, experiments with a physical robot (ZBot) validated the role of embodied sensorimotor circuits in maintaining position in a real river with complex fluid dynamics and visual environments. By iterating between simulations, behavioral observations, neural imaging, and robotic testing, we demonstrate the power of integrative approaches to investigating sensorimotor processing, providing insights into embodied neural circuit functions.
{"title":"Artificial embodied circuits uncover neural architectures of vertebrate visuomotor behaviors","authors":"Xiangxiao Liu, Matthew D. Loring, Luca Zunino, Kaitlyn E. Fouke, François A. Longchamp, Alexandre Bernardino, Auke J. Ijspeert, Eva A. Naumann","doi":"10.1126/scirobotics.adv4408","DOIUrl":"10.1126/scirobotics.adv4408","url":null,"abstract":"<div >Brains evolve within specific sensory and physical environments, yet neuroscience has traditionally focused on studying neural circuits in isolation. Understanding of their function requires integrative brain-body testing in realistic contexts. To investigate the neural and biomechanical mechanisms of sensorimotor transformations, we constructed realistic neuromechanical simulations (simZFish) of the zebrafish optomotor response, a visual stabilization behavior. By computationally reproducing the body mechanics, physical body-water interactions, hydrodynamics, visual environments, and experimentally derived neural network architectures, we closely replicated the behavior of real larval zebrafish. Through systematic manipulation of physiological and circuit connectivity features, impossible in biological experiments, we demonstrate how embodiment shapes neural activity, circuit architecture, and behavior. Changing lens properties and retinal connectivity revealed why the lower posterior visual field drives optimal optomotor responses in the simZFish, explaining receptive field properties observed in real zebrafish. When challenged with novel visual stimuli, the simZFish predicted previously unknown neuronal response types, which we identified via two-photon calcium imaging in the live brains of real zebrafish and incorporated to update the simZFish neural network. In virtual rivers, the simZFish performed rheotaxis autonomously by using current-induced optic flow patterns as navigational cues, compensating for the simulated water flow. 
Last, experiments with a physical robot (ZBot) validated the role of embodied sensorimotor circuits in maintaining position in a real river with complex fluid dynamics and visual environments. By iterating between simulations, behavioral observations, neural imaging, and robotic testing, we demonstrate the power of integrative approaches to investigating sensorimotor processing, providing insights into embodied neural circuit functions.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 107","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145295966","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-09-24 | DOI: 10.1126/scirobotics.adv4049
Amanda Prorok
The current trend toward generalist robot behaviors with monolithic artificial intelligence (AI) models is unsustainable. I advocate for a paradigm shift that embraces distributed architectures for collective robotic intelligence. A modular “mixture-of-robots” approach with specialized interdependent components can achieve superlinear gains, offering benefits in scalability, adaptability, and learning complex interactive skills.
{"title":"Extending robot minds through collective learning","authors":"Amanda Prorok","doi":"10.1126/scirobotics.adv4049","DOIUrl":"10.1126/scirobotics.adv4049","url":null,"abstract":"<div >The current trend toward generalist robot behaviors with monolithic artificial intelligence (AI) models is unsustainable. I advocate for a paradigm shift that embraces distributed architectures for collective robotic intelligence. A modular “mixture-of-robots” approach with specialized interdependent components can achieve superlinear gains, offering benefits in scalability, adaptability, and learning complex interactive skills.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.adv4049","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145133624","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-09-24 | DOI: 10.1126/scirobotics.adv7932
Andrew I. Cooper, Patrick Courtney, Kourosh Darvish, Moritz Eckhoff, Hatem Fakhruldeen, Andrea Gabrielli, Animesh Garg, Sami Haddadin, Kanako Harada, Jason Hein, Maria Hübner, Dennis Knobbe, Gabriella Pizzuto, Florian Shkurti, Ruja Shrestha, Kerstin Thurow, Rafael Vescovi, Birgit Vogel-Heuser, Ádám Wolf, Naruki Yoshikawa, Yan Zeng, Zhengxue Zhou, Henning Zwirnmann
Science laboratory automation enables accelerated discovery in life sciences and materials. However, it requires interdisciplinary collaboration to address challenges such as robust and flexible autonomy, reproducibility, throughput, standardization, the role of human scientists, and ethics. This article highlights these issues, reflecting perspectives from leading experts in laboratory automation across different disciplines of the natural sciences.
{"title":"Accelerating discovery in natural science laboratories with AI and robotics: Perspectives and challenges","authors":"Andrew I. Cooper, Patrick Courtney, Kourosh Darvish, Moritz Eckhoff, Hatem Fakhruldeen, Andrea Gabrielli, Animesh Garg, Sami Haddadin, Kanako Harada, Jason Hein, Maria Hübner, Dennis Knobbe, Gabriella Pizzuto, Florian Shkurti, Ruja Shrestha, Kerstin Thurow, Rafael Vescovi, Birgit Vogel-Heuser, Ádám Wolf, Naruki Yoshikawa, Yan Zeng, Zhengxue Zhou, Henning Zwirnmann","doi":"10.1126/scirobotics.adv7932","DOIUrl":"10.1126/scirobotics.adv7932","url":null,"abstract":"<div >Science laboratory automation enables accelerated discovery in life sciences and materials. However, it requires interdisciplinary collaboration to address challenges such as robust and flexible autonomy, reproducibility, throughput, standardization, the role of human scientists, and ethics. This article highlights these issues, reflecting perspectives from leading experts in laboratory automation across different disciplines of the natural sciences.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145133623","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-09-24 | DOI: 10.1126/scirobotics.aeb6685
Robin R. Murphy
The novel Flybot gives a science-forward view of the challenges in building a fully autonomous robot fly.
{"title":"Lowly fly or impressive miniature robot?","authors":"Robin R. Murphy","doi":"10.1126/scirobotics.aeb6685","DOIUrl":"10.1126/scirobotics.aeb6685","url":null,"abstract":"<div >The novel <i>Flybot</i> gives a science-forward view of the challenges in building a fully autonomous robot fly.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145133973","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-09-17 | DOI: 10.1126/scirobotics.adu4851
Xurui Liu, Hanchuan Tang, Na Li, Linjie He, Ye Tian, Bo Hao, Junnan Xue, Chaoyu Yang, Joseph Jao Yiu Sung, Li Zhang, Jianfeng Zang
Intelligent miniature systems capable of wireless sensing and manipulation hold considerable promise for advancing biomedical applications. However, the development of these systems has been substantially hindered by sensing-actuation incompatibility at small scales. To overcome this challenge, we propose a robotic sensing approach that integrates embedded ultrasonic soft sensors (EUSSs) with magnetic actuators, resulting in a wireless sensor-integrated miniature machine with seamless integration and minimal interference between fields. The EUSS, with its compact dimensions (1.3 millimeters by 1.3 millimeters by 1.6 millimeters), softness (98 kilopascals), and lightweight design (4.6 milligrams), is compatible with both soft and rigid components in terms of deformability and size. By engineering onboard transducers and using passive ultrasound communication along with external magnetic fields, we could wirelessly detect and regulate environmental parameters such as force, vibration, viscosity, and temperature. Demonstrations in rabbit and porcine models show the potential for robotic feedback control, accurate drug dosing, and in situ physiological monitoring, paving the way for real-world applications of intelligent miniature machines.
{"title":"Miniature magneto-ultrasonic machines for wireless robotic sensing and manipulation","authors":"Xurui Liu, Hanchuan Tang, Na Li, Linjie He, Ye Tian, Bo Hao, Junnan Xue, Chaoyu Yang, Joseph Jao Yiu Sung, Li Zhang, Jianfeng Zang","doi":"10.1126/scirobotics.adu4851","DOIUrl":"10.1126/scirobotics.adu4851","url":null,"abstract":"<div >Intelligent miniature systems capable of wireless sensing and manipulation hold considerable promise for advancing biomedical applications. However, the development of these systems has been substantially hindered by sensing-actuation incompatibility at small scales. To overcome this challenge, we propose a robotic sensing approach that integrates embedded ultrasonic soft sensors (EUSSs) with magnetic actuators, resulting in a wireless sensor-integrated miniature machine with seamless integration and minimal interference between fields. The EUSS, with its compact dimensions (1.3 millimeters by 1.3 millimeters by 1.6 millimeters), softness (98 kilopascals), and lightweight design (4.6 milligrams), is compatible with both soft and rigid components in terms of deformability and size. By engineering onboard transducers and using passive ultrasound communication along with external magnetic fields, we could wirelessly detect and regulate environmental parameters such as force, vibration, viscosity, and temperature. 
Demonstrations in rabbit and porcine models show the potential for robotic feedback control, accurate drug dosing, and in situ physiological monitoring, paving the way for real-world applications of intelligent miniature machines.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145077510","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-09-17 | DOI: 10.1126/scirobotics.aea9377
Xuhui Hu, Aiguo Song, Min Xu
Innovations in sensing and control technology helped an arm prosthesis novice win a global assistive robotics competition.
{"title":"Arm prosthesis with dexterous control and sensory feedback delivers winning performance at Cybathlon","authors":"Xuhui Hu, Aiguo Song, Min Xu","doi":"10.1126/scirobotics.aea9377","DOIUrl":"10.1126/scirobotics.aea9377","url":null,"abstract":"<div >Innovations in sensing and control technology helped an arm prosthesis novice win a global assistive robotics competition.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.aea9377","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145078231","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-09-17 | DOI: 10.1126/scirobotics.aeb1340
Garrett Kryt, Rory Dougall, Jaimie Borisoff
An extending, articulating powered wheelchair competed and won the wheelchair race at Cybathlon 2024.
{"title":"BCIT’s BEAST wheelchair takes on Cybathlon with power, precision, and pilot-led design","authors":"Garrett Kryt, Rory Dougall, Jaimie Borisoff","doi":"10.1126/scirobotics.aeb1340","DOIUrl":"10.1126/scirobotics.aeb1340","url":null,"abstract":"<div >An extending, articulating powered wheelchair competed and won the wheelchair race at Cybathlon 2024.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.aeb1340","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145078357","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2025-09-17 | DOI: 10.1126/scirobotics.adt1497
Bo Ai, Stephen Tian, Haochen Shi, Yixuan Wang, Tobias Pfaff, Cheston Tan, Henrik I. Christensen, Hao Su, Jiajun Wu, Yunzhu Li
Dynamics models that predict the effects of physical interactions are essential for planning and control in robotic manipulation. Although models based on physical principles often generalize well, they typically require full-state information, which can be difficult or impossible to extract from perception data in complex, real-world scenarios. Learning-based dynamics models provide an alternative by deriving state transition functions purely from perceived interaction data, enabling the capture of complex, hard-to-model factors and predictive uncertainty and accelerating simulations that are often too slow for real-time control. Recent successes in this field have demonstrated notable advancements in robot capabilities, including long-horizon manipulation of deformable objects, granular materials, and complex multiobject interactions such as stowing and packing. A crucial aspect of these investigations is the choice of state representation, which determines the inductive biases in the learning system for reduced-order modeling of scene dynamics. This article provides a timely and comprehensive review of current techniques and trade-offs in designing learned dynamics models, highlighting their role in advancing robot capabilities through integration with state estimation and control and identifying critical research gaps for future exploration.
{"title":"A review of learning-based dynamics models for robotic manipulation","authors":"Bo Ai, Stephen Tian, Haochen Shi, Yixuan Wang, Tobias Pfaff, Cheston Tan, Henrik I. Christensen, Hao Su, Jiajun Wu, Yunzhu Li","doi":"10.1126/scirobotics.adt1497","DOIUrl":"10.1126/scirobotics.adt1497","url":null,"abstract":"<div >Dynamics models that predict the effects of physical interactions are essential for planning and control in robotic manipulation. Although models based on physical principles often generalize well, they typically require full-state information, which can be difficult or impossible to extract from perception data in complex, real-world scenarios. Learning-based dynamics models provide an alternative by deriving state transition functions purely from perceived interaction data, enabling the capture of complex, hard-to-model factors and predictive uncertainty and accelerating simulations that are often too slow for real-time control. Recent successes in this field have demonstrated notable advancements in robot capabilities, including long-horizon manipulation of deformable objects, granular materials, and complex multiobject interactions such as stowing and packing. A crucial aspect of these investigations is the choice of state representation, which determines the inductive biases in the learning system for reduced-order modeling of scene dynamics. 
This article provides a timely and comprehensive review of current techniques and trade-offs in designing learned dynamics models, highlighting their role in advancing robot capabilities through integration with state estimation and control and identifying critical research gaps for future exploration.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145077693","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
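The core object this review surveys — a state-transition function fit from interaction data and rolled forward for planning — can be sketched in a few lines. This is a toy linear least-squares model on synthetic data, purely illustrative; the systems the review covers use far richer learned representations:

```python
# Minimal learned dynamics model: collect (state, action, next state) tuples,
# fit s' = f(s, a), then query the learned model for prediction/planning.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ground-truth dynamics s' = A s + B a (unknown to the learner)
A_true = np.array([[1.0, 0.1], [0.0, 0.9]])
B_true = np.array([[0.0], [0.5]])

# Interaction data
S = rng.normal(size=(500, 2))       # states
U = rng.normal(size=(500, 1))       # actions
S_next = S @ A_true.T + U @ B_true.T

# Learn the transition function by least squares on [s, a] -> s'
X = np.hstack([S, U])
W, *_ = np.linalg.lstsq(X, S_next, rcond=None)

# Roll the learned model forward from a known state-action pair
s, a = np.array([1.0, 0.0]), np.array([0.2])
s_pred = np.concatenate([s, a]) @ W
print(s_pred)
```

The review's central design question — choosing a state representation with the right inductive biases — corresponds here to the (trivially chosen) feature vector `[s, a]`; for deformable objects or granular media, that choice is where most of the modeling difficulty lives.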
Pub Date: 2025-09-10 | DOI: 10.1126/scirobotics.adu5771
Lauren L. Wright, Pooja Vegesna, Joseph E. Michaelis, Bilge Mutlu, Sarah Sebo
Reading fluency is a vital building block for developing literacy, yet the best way to practice fluency—reading aloud—can cause anxiety severe enough to inhibit literacy development in ways that can have an adverse effect on students through adulthood. One promising intervention to mitigate oral reading anxiety is to have children read aloud to a robot. Although observations in prior work have suggested that people likely feel more comfortable in the presence of a robot instead of a human, few studies have empirically demonstrated that people feel less anxious performing in front of a robot compared with a human or used objective physiological indicators to identify decreased anxiety. To investigate whether a robotic reading companion could reduce reading anxiety felt by children, we conducted a within-subjects study where children aged 8 to 11 years (n = 52) read aloud to a human and a robot individually while being monitored for physiological responses associated with anxiety. We found that children exhibited fewer physiological indicators of anxiety, specifically vocal jitter and heart rate variability, when reading to the robot compared with reading to a person. This paper provides strong evidence that a robot’s presence has an effect on the anxiety a person experiences while doing a task, offering justification for the use of robots in a wide-reaching array of social interactions that may be anxiety inducing.
{"title":"Robotic reading companions can mitigate oral reading anxiety in children","authors":"Lauren L. Wright, Pooja Vegesna, Joseph E. Michaelis, Bilge Mutlu, Sarah Sebo","doi":"10.1126/scirobotics.adu5771","DOIUrl":"10.1126/scirobotics.adu5771","url":null,"abstract":"<div >Reading fluency is a vital building block for developing literacy, yet the best way to practice fluency—reading aloud—can cause anxiety severe enough to inhibit literacy development in ways that can have an adverse effect on students through adulthood. One promising intervention to mitigate oral reading anxiety is to have children read aloud to a robot. Although observations in prior work have suggested that people likely feel more comfortable in the presence of a robot instead of a human, few studies have empirically demonstrated that people feel less anxious performing in front of a robot compared with a human or used objective physiological indicators to identify decreased anxiety. To investigate whether a robotic reading companion could reduce reading anxiety felt by children, we conducted a within-subjects study where children aged 8 to 11 years (<i>n</i> = 52) read aloud to a human and a robot individually while being monitored for physiological responses associated with anxiety. We found that children exhibited fewer physiological indicators of anxiety, specifically vocal jitter and heart rate variability, when reading to the robot compared with reading to a person. 
This paper provides strong evidence that a robot’s presence has an effect on the anxiety a person experiences while doing a task, offering justification for the use of robots in a wide-reaching array of social interactions that may be anxiety inducing.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145028467","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
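One of the physiological anxiety indicators the study monitored is heart rate variability. A standard time-domain HRV metric is RMSSD (root mean square of successive differences between interbeat intervals); whether the authors used RMSSD specifically is an assumption here, and the interval values below are synthetic:

```python
import math

def rmssd(rr_intervals_ms):
    """RMSSD over a sequence of RR (interbeat) intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [810, 790, 805, 800, 815, 795]  # synthetic interbeat intervals, ms
print(round(rmssd(rr), 1))  # → 16.0
```

Lower short-term variability of this kind is commonly read as higher sympathetic arousal, which is why HRV serves as an objective counterpart to self-reported anxiety in studies like this one.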