Spring-loaded DNA origami arrays as energy-supplied hardware for modular nanorobots
Pub Date: 2025-10-22 | DOI: 10.1126/scirobotics.adu3679
Martina Pfeiffer, Fiona Cole, Dongfang Wang, Yonggang Ke, Philip Tinnefeld
DNA origami nanorobots allow for the rational design of nanomachines that respond to environmental stimuli with preprogrammed tasks. To date, this has mostly been achieved by constructing two-state switches that change their conformation upon activation, thereby performing an operation. Because such switches are intrinsically two-state systems, their applicability is often limited to a single, specific stimulus-output combination, which makes expanding them further challenging. Here, we addressed this limitation by introducing reconfigurable DNA origami arrays as networks of coupled two-state systems. This universal design strategy enables the integration of various operational units into any two-state system within the nanorobot, allowing it to process multiple stimuli, compute responses using multilevel Boolean logic, and execute a range of operations with controlled order, timing, and spatial position. We anticipate that this strategy will be instrumental in further developing DNA origami nanorobots for applications in various technological fields.
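As a rough, hypothetical illustration of what "networks of coupled two-state systems" computing multilevel Boolean logic can mean (the actual strand-level design is in the paper and not reproduced here), the sketch below models each array unit as a binary switch whose state is a Boolean function of upstream units; all unit names and the wiring are invented for illustration only.

# Illustrative only: abstract model of coupled two-state units evaluating
# layered Boolean logic; unit names and wiring are hypothetical.
from typing import Callable, Dict, List

class TwoStateUnit:
    def __init__(self, name: str, logic: Callable[[List[bool]], bool], inputs: List[str]):
        self.name = name          # identifier of this switch in the array
        self.logic = logic        # Boolean function applied to upstream states
        self.inputs = inputs      # names of upstream units or external stimuli
        self.state = False        # closed (False) or open (True) conformation

def evaluate(units: List[TwoStateUnit], stimuli: Dict[str, bool]) -> Dict[str, bool]:
    """Propagate external stimuli through the coupled network, layer by layer."""
    states: Dict[str, bool] = dict(stimuli)
    for unit in units:                      # units listed in dependency order
        unit.state = unit.logic([states[i] for i in unit.inputs])
        states[unit.name] = unit.state
    return states

# Example: the output operation fires only if (stimulus A AND stimulus B) OR stimulus C.
network = [
    TwoStateUnit("gate_AB", lambda s: all(s), ["A", "B"]),
    TwoStateUnit("actuator", lambda s: any(s), ["gate_AB", "C"]),
]
print(evaluate(network, {"A": True, "B": True, "C": False}))  # actuator -> True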
{"title":"Spring-loaded DNA origami arrays as energy-supplied hardware for modular nanorobots","authors":"Martina Pfeiffer, Fiona Cole, Dongfang Wang, Yonggang Ke, Philip Tinnefeld","doi":"10.1126/scirobotics.adu3679","DOIUrl":"10.1126/scirobotics.adu3679","url":null,"abstract":"<div >DNA origami nanorobots allow for the rational design of nanomachines that respond to environmental stimuli with preprogrammed tasks. To date, this mostly is achieved by constructing two-state switches that, upon activation, change their conformation, resulting in the performance of an operation. Their applicability is often limited to a single, specific stimulus-output combination because of their intrinsic properties as two-state systems only. This makes expanding them further challenging. Here, we addressed this limitation by introducing reconfigurable DNA origami arrays as networks of coupled two-state systems. This universal design strategy enables the integration of various operational units into any two-state system within the nanorobot, allowing it to process multiple stimuli, compute responses using multilevel Boolean logic, and execute a range of operations with controlled order, timing, and spatial position. We anticipate that this strategy will be instrumental in further developing DNA origami nanorobots for applications in various technological fields.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 107","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145339482","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Bioinspired photoresponsive soft robotic lens
Pub Date: 2025-10-22 | DOI: 10.1126/scirobotics.adw8905
Corey Zheng, Shu Jia
Vision is a critical sensory function for humans, animals, and engineered systems, enabling environmental perception essential for imaging and autonomous operation. Although bioinspired, tunable optical systems have advanced adaptability and performance, challenges remain in achieving biocompatibility, robust yet flexible construction, and specialized multifunctionality. Here, we present a photoresponsive hydrogel soft lens (PHySL) that combines optical tunability, an all-solid configuration, and high resolution. PHySL leverages a dynamic hydrogel actuator that autonomously harnesses optical energy, enabling substantial focal tuning through all-optical control. Beyond mimicking biological vision, the system achieves advanced functionalities, including focus control, wavefront engineering, and optical steering by responding to spatiotemporal light stimuli. PHySL highlights the potential of optically powered soft robotics for soft vision systems, autonomous soft robots, adaptive medical devices, and next-generation wearable systems.
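As a back-of-envelope illustration of focal tuning (not the paper's optical model), the thin-lens lensmaker's equation for a plano-convex surface in air, 1/f = (n - 1)/R, shows how changing surface curvature shifts focal length; the refractive index and radii below are assumed values for illustration only.

# Back-of-envelope sketch, not the paper's model: focal length of a plano-convex
# hydrogel lens as its curvature changes. Index and radii are assumed values.

def plano_convex_focal_length(radius_mm: float, n_lens: float = 1.35) -> float:
    """Thin-lens approximation: 1/f = (n - 1) / R for a plano-convex lens in air."""
    return radius_mm / (n_lens - 1.0)

for radius in (2.0, 3.0, 5.0):  # assumed curvature radii in millimeters
    print(f"R = {radius:.1f} mm -> f ~ {plano_convex_focal_length(radius):.1f} mm")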
{"title":"Bioinspired photoresponsive soft robotic lens","authors":"Corey Zheng, Shu Jia","doi":"10.1126/scirobotics.adw8905","DOIUrl":"10.1126/scirobotics.adw8905","url":null,"abstract":"<div >Vision is a critical sensory function for humans, animals, and engineered systems, enabling environmental perception essential for imaging and autonomous operation. Although bioinspired, tunable optical systems have advanced adaptability and performance, challenges remain in achieving biocompatibility, robust yet flexible construction, and specialized multifunctionality. Here, we present a photoresponsive hydrogel soft lens (PHySL) that combines optical tunability, an all-solid configuration, and high resolution. PHySL leverages a dynamic hydrogel actuator that autonomously harnesses optical energy, enabling substantial focal tuning through all-optical control. Beyond mimicking biological vision, the system achieves advanced functionalities, including focus control, wavefront engineering, and optical steering by responding to spatiotemporal light stimuli. PHySL highlights the potential of optically powered soft robotics applied in soft vision systems, autonomous soft robots, adaptive medical devices, and next-generation wearable systems.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 107","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145339465","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An ingestible capsule for luminance-based diagnosis of mesenteric ischemia
Pub Date: 2025-10-22 | DOI: 10.1126/scirobotics.adx1367
J. Chen, A. Alexiev, A. Sergnese, N. Fabian, A. Pettinari, Y. Cai, V. Perepelook, K. Schmidt, A. Hayward, A. Guevara, B. Laidlaw, I. Moon, B. Markowitz, I. Ballinger, Z. Yang, C. Rosen, N. Shalabi, S. Owyang, G. Traverso
Acute mesenteric ischemia (AMI) results from insufficient blood flow to the intestines, leading to tissue necrosis with high morbidity and mortality. Diagnosis is often delayed because of nonspecific symptoms that mimic common gastrointestinal conditions. Current diagnostic methods, such as computed tomography and mesenteric angiography, are complex, costly, and invasive, highlighting the need for a rapid, accessible, and minimally invasive alternative. Here, we present FIREFLI (finding ischemia via reflectance of light), a bioinspired, ingestible capsule designed for luminance-based diagnosis of AMI. Upon ingestion, the device activates in the small intestine’s pH environment, emitting pulses from three radially spaced white light-emitting diodes and measuring reflected light across 10 wavelengths. FIREFLI then computes a tissue luminance biomarker, which outperforms color-change biomarkers because of superior intrasubject consistency. The diagnosis is processed onboard and wirelessly transmitted to an external mobile device. In vivo studies in swine (n = 9) demonstrated a diagnostic accuracy of 90%, with a sensitivity of 98% and specificity of 85%. By providing a noninvasive, real-time diagnostic solution, FIREFLI has the potential to facilitate earlier detection and treatment of AMI, ultimately improving patient outcomes.
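The abstract does not give FIREFLI's actual biomarker formula or decision threshold; as a purely hypothetical sketch of a luminance-based classifier over a 10-wavelength reflectance measurement, the code below averages the spectrum and applies an assumed threshold (both the threshold value and the direction of the comparison are invented for illustration).

# Hypothetical sketch only: reduce a 10-wavelength reflectance spectrum to one
# luminance value and threshold it. Not FIREFLI's actual biomarker or cutoff.
from statistics import mean

ISCHEMIA_THRESHOLD = 0.42  # assumed value, not from the paper

def luminance_biomarker(reflectance: list[float]) -> float:
    """Collapse the 10-wavelength reflectance measurement into a single luminance value."""
    assert len(reflectance) == 10, "FIREFLI measures reflected light across 10 wavelengths"
    return mean(reflectance)

def classify(reflectance: list[float]) -> str:
    # Direction of the comparison is an assumption made for this illustration.
    return "ischemic" if luminance_biomarker(reflectance) < ISCHEMIA_THRESHOLD else "perfused"

print(classify([0.31, 0.33, 0.30, 0.35, 0.32, 0.29, 0.34, 0.30, 0.33, 0.31]))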
{"title":"An ingestible capsule for luminance-based diagnosis of mesenteric ischemia","authors":"J. Chen, A. Alexiev, A. Sergnese, N. Fabian, A. Pettinari, Y. Cai, V. Perepelook, K. Schmidt, A. Hayward, A. Guevara, B. Laidlaw, I. Moon, B. Markowitz, I. Ballinger, Z. Yang, C. Rosen, N. Shalabi, S. Owyang, G. Traverso","doi":"10.1126/scirobotics.adx1367","DOIUrl":"10.1126/scirobotics.adx1367","url":null,"abstract":"<div >Acute mesenteric ischemia (AMI) results from insufficient blood flow to the intestines, leading to tissue necrosis with high morbidity and mortality. Diagnosis is often delayed because of nonspecific symptoms that mimic common gastrointestinal conditions. Current diagnostic methods, such as computed tomography and mesenteric angiography, are complex, costly, and invasive, highlighting the need for a rapid, accessible, and minimally invasive alternative. Here, we present FIREFLI (finding ischemia via reflectance of light), a bioinspired, ingestible capsule designed for luminance-based diagnosis of AMI. Upon ingestion, the device activates in the small intestine’s pH environment, emitting pulses from three radially spaced white light-emitting diodes and measuring reflected light across 10 wavelengths. FIREFLI then computes a tissue luminance biomarker, which outperforms color-change biomarkers because of superior intrasubject consistency. The diagnosis is processed onboard and wirelessly transmitted to an external mobile device. In vivo studies in swine (<i>n</i> = 9) demonstrated a diagnostic accuracy of 90%, with a sensitivity of 98% and specificity of 85%. By providing a noninvasive, real-time diagnostic solution, FIREFLI has the potential to facilitate earlier detection and treatment of AMI, ultimately improving patient outcomes.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 107","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145339092","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Tactile displays driven by projected light
Pub Date: 2025-10-15 | DOI: 10.1126/scirobotics.adv1383
Max Linnander, Dustin Goetz, Gregory Reardon, Vijay Kumar, Elliot Hawkes, Yon Visell
Tactile displays that lend tangible form to digital content could transform computing interactions. However, achieving the resolution, speed, and dynamic range needed for perceptual fidelity remains challenging. We present a dynamic tactile display that directly converts projected light into visible and tactile patterns via a photomechanical surface populated with millimeter-scale optotactile pixels. The pixels transduce incident light into mechanical displacements through photostimulated thermal gas expansion, yielding millimeter-scale displacements with response times of 2 to 100 milliseconds. The use of projected light for power transmission and addressing renders these displays highly scalable. We demonstrate optically driven displays with up to 1511 addressable pixels, several times more pixels than prior tactile displays attaining comparable performance. Perceptual studies confirm that these displays can reproduce diverse spatiotemporal tactile patterns with high fidelity. This research establishes a foundation for practical and versatile high-resolution tactile displays driven by light.
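As an order-of-magnitude sketch of why photostimulated thermal gas expansion can yield millimeter-scale displacements (this is not the paper's model), isobaric ideal-gas expansion in a membrane-capped cavity gives dh = h * dT / T0; the chamber height and temperature rises below are assumed values chosen only to illustrate the scale.

# Order-of-magnitude sketch, not the paper's model: isobaric ideal-gas expansion
# in a sealed, membrane-capped cylindrical cavity. Dimensions and heating are assumed.

def displacement_mm(chamber_height_mm: float, delta_T_K: float, T0_K: float = 293.0) -> float:
    """Constant-pressure expansion: dV/V = dT/T0, so dh = h * dT/T0 for a straight cavity."""
    return chamber_height_mm * delta_T_K / T0_K

for dT in (10.0, 30.0, 60.0):  # assumed light-induced temperature rises in kelvin
    print(f"dT = {dT:4.0f} K -> displacement ~ {displacement_mm(5.0, dT):.2f} mm")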
{"title":"Tactile displays driven by projected light","authors":"Max Linnander, Dustin Goetz, Gregory Reardon, Vijay Kumar, Elliot Hawkes, Yon Visell","doi":"10.1126/scirobotics.adv1383","DOIUrl":"10.1126/scirobotics.adv1383","url":null,"abstract":"<div >Tactile displays that lend tangible form to digital content could transform computing interactions. However, achieving the resolution, speed, and dynamic range needed for perceptual fidelity remains challenging. We present a dynamic tactile display that directly converts projected light into visible and tactile patterns via a photomechanical surface populated with millimeter-scale optotactile pixels. The pixels transduce incident light into mechanical displacements through photostimulated thermal gas expansion, yielding millimeter-scale displacements with response times of 2 to 100 milliseconds. The use of projected light for power transmission and addressing renders these displays highly scalable. We demonstrate optically driven displays with up to 1511 addressable pixels, several times more pixels than prior tactile displays attaining comparable performance. Perceptual studies confirm that these displays can reproduce diverse spatiotemporal tactile patterns with high fidelity. This research establishes a foundation for practical and versatile high-resolution tactile displays driven by light.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 107","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145295091","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Artificial embodied circuits uncover neural architectures of vertebrate visuomotor behaviors
Pub Date: 2025-10-15 | DOI: 10.1126/scirobotics.adv4408
Xiangxiao Liu, Matthew D. Loring, Luca Zunino, Kaitlyn E. Fouke, François A. Longchamp, Alexandre Bernardino, Auke J. Ijspeert, Eva A. Naumann
Brains evolve within specific sensory and physical environments, yet neuroscience has traditionally focused on studying neural circuits in isolation. Understanding their function requires integrative brain-body testing in realistic contexts. To investigate the neural and biomechanical mechanisms of sensorimotor transformations, we constructed realistic neuromechanical simulations (simZFish) of the zebrafish optomotor response, a visual stabilization behavior. By computationally reproducing the body mechanics, physical body-water interactions, hydrodynamics, visual environments, and experimentally derived neural network architectures, we closely replicated the behavior of real larval zebrafish. Through systematic manipulation of physiological and circuit connectivity features, impossible in biological experiments, we demonstrate how embodiment shapes neural activity, circuit architecture, and behavior. Changing lens properties and retinal connectivity revealed why the lower posterior visual field drives optimal optomotor responses in the simZFish, explaining receptive field properties observed in real zebrafish. When challenged with novel visual stimuli, the simZFish predicted previously unknown neuronal response types, which we identified via two-photon calcium imaging in the live brains of real zebrafish and incorporated to update the simZFish neural network. In virtual rivers, the simZFish performed rheotaxis autonomously by using current-induced optic flow patterns as navigational cues, compensating for the simulated water flow. Last, experiments with a physical robot (ZBot) validated the role of embodied sensorimotor circuits in maintaining position in a real river with complex fluid dynamics and visual environments. By iterating between simulations, behavioral observations, neural imaging, and robotic testing, we demonstrate the power of integrative approaches to investigating sensorimotor processing, providing insights into embodied neural circuit functions.
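As a schematic, hypothetical illustration of the kind of closed sensorimotor loop an optomotor response implements (the real simZFish couples experimentally derived neural circuits to body and fluid mechanics and is far richer), the sketch below turns water-flow-induced visual drift into a swim command that partially cancels it; the gain, drift, and one-dimensional dynamics are toy values invented for illustration.

# Schematic, hypothetical optomotor loop: drift, gain, and dynamics are toy values.

def optomotor_step(position: float, water_flow: float, gain: float = 0.8) -> float:
    """One control step: perceived optic flow ~ relative drift; swim to cancel it."""
    optic_flow = -water_flow            # backward drift appears as forward-moving visual flow
    swim_command = gain * optic_flow    # thrust proportional to perceived flow
    return position + water_flow + swim_command  # new position after drift plus correction

pos = 0.0
for t in range(5):
    pos = optomotor_step(pos, water_flow=-1.0)  # current pushes the fish backward each step
    print(f"step {t}: position = {pos:.2f}")    # with gain < 1, residual drift remains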
{"title":"Artificial embodied circuits uncover neural architectures of vertebrate visuomotor behaviors","authors":"Xiangxiao Liu, Matthew D. Loring, Luca Zunino, Kaitlyn E. Fouke, François A. Longchamp, Alexandre Bernardino, Auke J. Ijspeert, Eva A. Naumann","doi":"10.1126/scirobotics.adv4408","DOIUrl":"10.1126/scirobotics.adv4408","url":null,"abstract":"<div >Brains evolve within specific sensory and physical environments, yet neuroscience has traditionally focused on studying neural circuits in isolation. Understanding of their function requires integrative brain-body testing in realistic contexts. To investigate the neural and biomechanical mechanisms of sensorimotor transformations, we constructed realistic neuromechanical simulations (simZFish) of the zebrafish optomotor response, a visual stabilization behavior. By computationally reproducing the body mechanics, physical body-water interactions, hydrodynamics, visual environments, and experimentally derived neural network architectures, we closely replicated the behavior of real larval zebrafish. Through systematic manipulation of physiological and circuit connectivity features, impossible in biological experiments, we demonstrate how embodiment shapes neural activity, circuit architecture, and behavior. Changing lens properties and retinal connectivity revealed why the lower posterior visual field drives optimal optomotor responses in the simZFish, explaining receptive field properties observed in real zebrafish. When challenged with novel visual stimuli, the simZFish predicted previously unknown neuronal response types, which we identified via two-photon calcium imaging in the live brains of real zebrafish and incorporated to update the simZFish neural network. In virtual rivers, the simZFish performed rheotaxis autonomously by using current-induced optic flow patterns as navigational cues, compensating for the simulated water flow. Last, experiments with a physical robot (ZBot) validated the role of embodied sensorimotor circuits in maintaining position in a real river with complex fluid dynamics and visual environments. By iterating between simulations, behavioral observations, neural imaging, and robotic testing, we demonstrate the power of integrative approaches to investigating sensorimotor processing, providing insights into embodied neural circuit functions.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 107","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145295966","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Extending robot minds through collective learning
Pub Date: 2025-09-24 | DOI: 10.1126/scirobotics.adv4049
Amanda Prorok
The current trend toward generalist robot behaviors with monolithic artificial intelligence (AI) models is unsustainable. I advocate for a paradigm shift that embraces distributed architectures for collective robotic intelligence. A modular “mixture-of-robots” approach with specialized interdependent components can achieve superlinear gains, offering benefits in scalability, adaptability, and learning complex interactive skills.
{"title":"Extending robot minds through collective learning","authors":"Amanda Prorok","doi":"10.1126/scirobotics.adv4049","DOIUrl":"10.1126/scirobotics.adv4049","url":null,"abstract":"<div >The current trend toward generalist robot behaviors with monolithic artificial intelligence (AI) models is unsustainable. I advocate for a paradigm shift that embraces distributed architectures for collective robotic intelligence. A modular “mixture-of-robots” approach with specialized interdependent components can achieve superlinear gains, offering benefits in scalability, adaptability, and learning complex interactive skills.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.adv4049","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145133624","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Accelerating discovery in natural science laboratories with AI and robotics: Perspectives and challenges
Pub Date: 2025-09-24 | DOI: 10.1126/scirobotics.adv7932
Andrew I. Cooper, Patrick Courtney, Kourosh Darvish, Moritz Eckhoff, Hatem Fakhruldeen, Andrea Gabrielli, Animesh Garg, Sami Haddadin, Kanako Harada, Jason Hein, Maria Hübner, Dennis Knobbe, Gabriella Pizzuto, Florian Shkurti, Ruja Shrestha, Kerstin Thurow, Rafael Vescovi, Birgit Vogel-Heuser, Ádám Wolf, Naruki Yoshikawa, Yan Zeng, Zhengxue Zhou, Henning Zwirnmann
Science laboratory automation enables accelerated discovery in the life sciences and materials science. However, it requires interdisciplinary collaboration to address challenges such as robust and flexible autonomy, reproducibility, throughput, standardization, the role of human scientists, and ethics. This article highlights these issues, reflecting perspectives from leading experts in laboratory automation across different disciplines of the natural sciences.
{"title":"Accelerating discovery in natural science laboratories with AI and robotics: Perspectives and challenges","authors":"Andrew I. Cooper, Patrick Courtney, Kourosh Darvish, Moritz Eckhoff, Hatem Fakhruldeen, Andrea Gabrielli, Animesh Garg, Sami Haddadin, Kanako Harada, Jason Hein, Maria Hübner, Dennis Knobbe, Gabriella Pizzuto, Florian Shkurti, Ruja Shrestha, Kerstin Thurow, Rafael Vescovi, Birgit Vogel-Heuser, Ádám Wolf, Naruki Yoshikawa, Yan Zeng, Zhengxue Zhou, Henning Zwirnmann","doi":"10.1126/scirobotics.adv7932","DOIUrl":"10.1126/scirobotics.adv7932","url":null,"abstract":"<div >Science laboratory automation enables accelerated discovery in life sciences and materials. However, it requires interdisciplinary collaboration to address challenges such as robust and flexible autonomy, reproducibility, throughput, standardization, the role of human scientists, and ethics. This article highlights these issues, reflecting perspectives from leading experts in laboratory automation across different disciplines of the natural sciences.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145133623","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Lowly fly or impressive miniature robot?
Pub Date: 2025-09-24 | DOI: 10.1126/scirobotics.aeb6685
Robin R. Murphy
The novel Flybot gives a science-forward view of the challenges in building a fully autonomous robot fly.
{"title":"Lowly fly or impressive miniature robot?","authors":"Robin R. Murphy","doi":"10.1126/scirobotics.aeb6685","DOIUrl":"10.1126/scirobotics.aeb6685","url":null,"abstract":"<div >The novel <i>Flybot</i> gives a science-forward view of the challenges in building a fully autonomous robot fly.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145133973","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Miniature magneto-ultrasonic machines for wireless robotic sensing and manipulation
Pub Date: 2025-09-17 | DOI: 10.1126/scirobotics.adu4851
Xurui Liu, Hanchuan Tang, Na Li, Linjie He, Ye Tian, Bo Hao, Junnan Xue, Chaoyu Yang, Joseph Jao Yiu Sung, Li Zhang, Jianfeng Zang
Intelligent miniature systems capable of wireless sensing and manipulation hold considerable promise for advancing biomedical applications. However, the development of these systems has been substantially hindered by sensing-actuation incompatibility at small scales. To overcome this challenge, we propose a robotic sensing approach that integrates embedded ultrasonic soft sensors (EUSSs) with magnetic actuators, resulting in a wireless sensor-integrated miniature machine with seamless integration and minimal interference between fields. The EUSS, with its compact dimensions (1.3 millimeters by 1.3 millimeters by 1.6 millimeters), softness (98 kilopascals), and lightweight design (4.6 milligrams), is compatible with both soft and rigid components in terms of deformability and size. By engineering onboard transducers and using passive ultrasound communication along with external magnetic fields, we could wirelessly detect and regulate environmental parameters such as force, vibration, viscosity, and temperature. Demonstrations in rabbit and porcine models show the potential for robotic feedback control, accurate drug dosing, and in situ physiological monitoring, paving the way for real-world applications of intelligent miniature machines.
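The abstract does not specify how EUSS echo features map to the sensed quantities; as a generic, hypothetical readout sketch, one common pattern is to interpolate a pre-measured calibration table, which is what the code below does. The echo-feature values, the choice of resonance frequency as the feature, and the temperature mapping are all invented for illustration.

# Generic, hypothetical readout sketch: interpolate a pre-measured calibration
# table (echo resonance frequency vs. temperature; all values are invented).
import numpy as np

calib_freq_mhz = np.array([4.80, 4.85, 4.90, 4.95, 5.00])  # assumed echo features
calib_temp_c   = np.array([25.0, 30.0, 35.0, 40.0, 45.0])  # assumed ground truth

def temperature_from_echo(freq_mhz: float) -> float:
    """Look up temperature by linear interpolation over the calibration curve."""
    return float(np.interp(freq_mhz, calib_freq_mhz, calib_temp_c))

print(temperature_from_echo(4.87))  # ~32.0 degrees Celsius with the assumed table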
{"title":"Miniature magneto-ultrasonic machines for wireless robotic sensing and manipulation","authors":"Xurui Liu, Hanchuan Tang, Na Li, Linjie He, Ye Tian, Bo Hao, Junnan Xue, Chaoyu Yang, Joseph Jao Yiu Sung, Li Zhang, Jianfeng Zang","doi":"10.1126/scirobotics.adu4851","DOIUrl":"10.1126/scirobotics.adu4851","url":null,"abstract":"<div >Intelligent miniature systems capable of wireless sensing and manipulation hold considerable promise for advancing biomedical applications. However, the development of these systems has been substantially hindered by sensing-actuation incompatibility at small scales. To overcome this challenge, we propose a robotic sensing approach that integrates embedded ultrasonic soft sensors (EUSSs) with magnetic actuators, resulting in a wireless sensor-integrated miniature machine with seamless integration and minimal interference between fields. The EUSS, with its compact dimensions (1.3 millimeters by 1.3 millimeters by 1.6 millimeters), softness (98 kilopascals), and lightweight design (4.6 milligrams), is compatible with both soft and rigid components in terms of deformability and size. By engineering onboard transducers and using passive ultrasound communication along with external magnetic fields, we could wirelessly detect and regulate environmental parameters such as force, vibration, viscosity, and temperature. Demonstrations in rabbit and porcine models show the potential for robotic feedback control, accurate drug dosing, and in situ physiological monitoring, paving the way for real-world applications of intelligent miniature machines.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145077510","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Arm prosthesis with dexterous control and sensory feedback delivers winning performance at Cybathlon
Pub Date: 2025-09-17 | DOI: 10.1126/scirobotics.aea9377
Xuhui Hu, Aiguo Song, Min Xu
Innovations in sensing and control technology helped an arm prosthesis novice win a global assistive robotics competition.
{"title":"Arm prosthesis with dexterous control and sensory feedback delivers winning performance at Cybathlon","authors":"Xuhui Hu, Aiguo Song, Min Xu","doi":"10.1126/scirobotics.aea9377","DOIUrl":"10.1126/scirobotics.aea9377","url":null,"abstract":"<div >Innovations in sensing and control technology helped an arm prosthesis novice win a global assistive robotics competition.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 106","pages":""},"PeriodicalIF":27.5,"publicationDate":"2025-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.aea9377","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"145078231","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}