Laser-Based Signal-Injection Attack on Piezoresistive MEMS Pressure Sensors
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967044
Tatsuki Tanaka, T. Sugawara
As more and more information systems rely on sensors for their critical decisions, there is a growing threat of injecting false signals into sensors in the analog domain. In particular, LightCommands showed that MEMS microphones are susceptible to light, through the photoacoustic and photoelectric effects, enabling an attacker to silently inject voice commands into smart speakers. Understanding such unexpected transduction mechanisms is essential for designing secure and reliable MEMS sensors. Is there any other transduction mechanism enabling laser-induced attacks? We answer the question positively by experimentally evaluating two commercial piezoresistive MEMS pressure sensors. By shining laser light at the piezoresistors through an air hole in the sensor package, the pressure reading changes by ±1000 hPa with only 0.5 mW of laser power. This phenomenon can be explained by the photoelectric effect at the piezoresistors, which increases the number of carriers and decreases the resistance. Finally, we show that an attacker can induce a target signal in the sensor reading by shining an amplitude-modulated laser.
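As a rough illustration of the final attack step, the sketch below (Python, not from the paper) modulates the laser power so that an assumed linear sensitivity of ±1000 hPa per 0.5 mW reproduces a target pressure waveform; the sample rate, bias power, and clipping are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the authors' procedure): amplitude-modulate laser power so the
# photoelectrically induced reading tracks a target pressure waveform, assuming a linear
# response of roughly +/-1000 hPa per 0.5 mW as quoted in the abstract.

fs = 10_000                                     # waveform sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)                   # 1 s of signal
target_hpa = 500 * np.sin(2 * np.pi * 2 * t)    # desired false pressure signal, hPa

SENSITIVITY_HPA_PER_MW = 1000 / 0.5             # rough linear gain implied by the abstract
bias_mw = 0.25                                  # DC laser power so modulation stays non-negative

# Laser power command: bias plus the target signal scaled by the inverse sensitivity.
laser_mw = bias_mw + target_hpa / SENSITIVITY_HPA_PER_MW
laser_mw = np.clip(laser_mw, 0.0, 0.5)          # stay within the 0.5 mW power budget

# Predicted (idealised) sensor reading induced by the light alone.
induced_hpa = (laser_mw - bias_mw) * SENSITIVITY_HPA_PER_MW
print(f"peak induced reading: {induced_hpa.max():.0f} hPa")
```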
{"title":"Laser-Based Signal-Injection Attack on Piezoresistive MEMS Pressure Sensors","authors":"Tatsuki Tanaka, T. Sugawara","doi":"10.1109/SENSORS52175.2022.9967044","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967044","url":null,"abstract":"As more and more information systems rely sen-sors for their critical decisions, there is a growing threat of injecting false signals to sensors in the analog domain. In particular, LightCommands showed that MEMS microphones are susceptible to light, through the photoacoustic and photoelectric effects, enabling an attacker to silently inject voice commands to smart speakers. Understanding such unexpected transduction mechanisms is essential for designing secure and reliable MEMS sensors. Is there any other transduction mechanism enabling laser-induced attacks? We positively answer the question by experimentally evaluating two commercial piezoresistive MEMS pressure sensors. By shining a laser light at the piezoresistors through an air hole on the sensor package, the pressure reading changes by ±1000 hPa with 0.5 mW laser power. This phenomenon can be explained by the photoelectric effect at the piezoresistors, which increases the number of carriers and decreases the resistance. We finally show that an attacker can induce the target signal at the sensor reading by shining an amplitude-modulated laser light.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129225966","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
3D Sonar on Mars
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967191
Jaime Aru, Erik Verreycken, D. Laurijssen, J. Steckel
In recent decades, our never-ending desire for space exploration has grown exponentially. In this endeavour, one of the major points of interest is the red planet Mars. To autonomously navigate Martian terrain, the latest NASA Perseverance rover uses a combination of optical sensors (LiDAR, camera). However, the harsh Martian climate and dust storms can significantly impair the accuracy of these sensors because they use light as a medium. By utilising a 3D sonar sensor, which is not affected by bad visibility, we can attempt to reduce navigation issues. However, because of the many differences between Earth and Mars (e.g. temperature, atmospheric pressure), a degradation in performance can be expected for a 3D sonar sensor compared to its performance on Earth. We developed a simulation that estimates the performance differences between a 3D sonar on Earth and one on Mars. This simulation is then used to assess performance in different realistic scenarios, such as high winds and component failure. A Martian sonar would have reduced range compared to its terrestrial counterpart, but we believe it to be a worthwhile addition to current Mars rovers' navigation methods.
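To make the expected range degradation concrete, here is a back-of-the-envelope sonar-equation sketch; the source level, detection threshold, target strength, and the two absorption coefficients are illustrative placeholders, not values from the authors' simulation.

```python
import numpy as np

# Rough sketch (not from the paper): how atmospheric absorption and spherical spreading
# bound the usable range of an ultrasonic sonar pulse. Absorption values are placeholders.

def max_range(source_level_db, detection_threshold_db, alpha_db_per_m, target_strength_db=-20):
    """Largest range r where SL - 2*(20*log10(r) + alpha*r) + TS >= threshold."""
    r = np.linspace(0.1, 50.0, 5000)
    echo_level = source_level_db - 2 * (20 * np.log10(r) + alpha_db_per_m * r) + target_strength_db
    usable = r[echo_level >= detection_threshold_db]
    return usable.max() if usable.size else 0.0

# Same transmitter, two assumed absorption regimes (Earth-like vs. a much lossier atmosphere).
print("Earth-like :", round(max_range(100, 20, alpha_db_per_m=1.0), 1), "m")
print("Mars-like  :", round(max_range(100, 20, alpha_db_per_m=5.0), 1), "m")
```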
{"title":"3D Sonar on Mars","authors":"Jaime Aru, Erik Verreycken, D. Laurijssen, J. Steckel","doi":"10.1109/SENSORS52175.2022.9967191","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967191","url":null,"abstract":"In the last decades our never-ending desire for space exploration has grown exponentially. In this endeavour, one of the major points of interest is the red planet Mars. To autonomously navigate the Martian terrains a combination of optical sensors (LiDAR, Camera) are used in the latest NASA Perseverance Rover. However, the harsh Martian climate and dust storms can significantly impair the accuracy of these sensors due to their use of light as a medium. By utilising a 3D sonar sensor, which is not affected by bad visibility, we can attempt to reduce navigation issues. However, because of the many differences between the Earth and Mars (e.g. temperature, atmospheric pressure…), a degradation in performance can be expected for the 3D sonar sensor in comparison to its performance on Earth. We developed a simulation which can give us an estimate of the performance differences between a 3D sonar on Earth and one on Mars. This simulation is then used to asses performance in different realistic scenarios, like high winds and component failure. A Martian sonar would have reduced range compared to its terrestrial counterpart, but we believe it to be a worthwhile addition to current Mars rover's navigation methods.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128526179","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Mouse Oocyte Characterization by Electrical Impedance Spectroscopy
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967210
Yuan Cao, J. Floehr, D. Azarkh, U. Schnakenberg
Artificial fertilization depends on oocyte quality, especially on the zona pellucida. This gelatinous outer layer softens before and hardens after sperm penetration. Here, we propose a setup that characterizes the stiffness of the zona pellucida of mouse oocytes by electrical impedance spectroscopy. Single oocytes are hydrodynamically trapped at an aperture located between two ring-shaped electrodes. When weak negative pressures are applied to the cell trap, the electrical impedance correlates with the stiffness of the zona pellucida.
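For readers unfamiliar with the measurement principle, below is a minimal lock-in style sketch of extracting a complex impedance at one excitation frequency; the circuit model and numbers are synthetic, and the authors' instrument will differ.

```python
import numpy as np

# Minimal sketch (assumptions, not the authors' setup): estimate the complex impedance at
# one excitation frequency from sampled voltage and current waveforms.

def impedance_at(f_hz, t, v, i):
    ref = np.exp(-2j * np.pi * f_hz * t)   # complex reference at the test frequency
    V = 2 * np.mean(v * ref)               # complex voltage amplitude
    I = 2 * np.mean(i * ref)               # complex current amplitude
    return V / I

# Synthetic check: 1 kOhm resistor in series with a 10 nF capacitor, excited at 10 kHz.
fs, f0 = 1_000_000, 10_000
t = np.arange(0, 0.01, 1 / fs)
R, C = 1_000.0, 10e-9
Z_true = R + 1 / (2j * np.pi * f0 * C)
i = 1e-3 * np.cos(2 * np.pi * f0 * t)                                   # 1 mA drive current
v = 1e-3 * abs(Z_true) * np.cos(2 * np.pi * f0 * t + np.angle(Z_true))  # resulting voltage
Z_est = impedance_at(f0, t, v, i)
print(f"|Z| = {abs(Z_est):.0f} Ohm, phase = {np.degrees(np.angle(Z_est)):.1f} deg")
```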
{"title":"Mouse Oocyte Characterization by Electrical Impedance Spectroscopy","authors":"Yuan Cao, J. Floehr, D. Azarkh, U. Schnakenberg","doi":"10.1109/SENSORS52175.2022.9967210","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967210","url":null,"abstract":"Artificial fertilization depends on the oocyte quality, especially on the zona pellucida. This gelatinous outer layer becomes soft before and hardens after sperm penetration. Here, we propose a setup that characterizes the stiffness of the zona pellucida of mouse oocytes by electrical impedance spectroscopy. Single oocytes are hydrodynamically trapped at an aperture, which is located between two ring-shaped electrodes. By applying weak negative pressures to the cell trap, the electrical impedance correlates with the stiffness of the zona pellucida.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"92 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115965007","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Motor Imagery Brain Activity Recognition through Data Augmentation using DC-GANs and Mu-Sigma
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967231
Abhishek Khoyani, Harshdeep Kaur, Marzieh Amini, H. Sadreazami
A brain-computer interface is a technology that allows a machine to connect with the human brain and operate based on commands derived from the brain's thoughts and activity. Electrodes are placed on the scalp, and the changes in the electric waves produced by the brain are recorded as electroencephalography (EEG) signals. In this work, we propose the use of generative adversarial networks and mu-sigma methods to augment the EEG signals. Existing deep learning methods for EEG signal classification, such as convolutional and recurrent neural networks, are implemented and their classification performance is examined with and without data augmentation. It is shown that data augmentation can improve the performance of EEG signal classification with deep learning models to a considerable extent.
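One plausible reading of the mu-sigma augmentation (the exact recipe may differ from the paper's) is to perturb each epoch with noise scaled by its own per-channel standard deviation, as sketched below; the array shapes and noise scale are assumptions.

```python
import numpy as np

# Hedged sketch of a "mu-sigma" style augmentation: perturb each EEG epoch with Gaussian
# noise scaled by that epoch's per-channel standard deviation, so augmented copies stay in
# a plausible amplitude range.

def mu_sigma_augment(epochs, noise_scale=0.1, n_copies=2, seed=0):
    """epochs: array of shape (n_epochs, n_channels, n_samples). Returns originals + copies."""
    rng = np.random.default_rng(seed)
    sigma = epochs.std(axis=-1, keepdims=True)     # per-epoch, per-channel standard deviation
    copies = [epochs + rng.standard_normal(epochs.shape) * sigma * noise_scale
              for _ in range(n_copies)]
    return np.concatenate([epochs] + copies, axis=0)

# Example: 8 epochs, 22 channels, 500 samples -> 24 epochs after augmentation.
x = np.random.default_rng(1).standard_normal((8, 22, 500))
print(mu_sigma_augment(x).shape)                   # (24, 22, 500)
```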
{"title":"Motor Imagery Brain Activity Recognition through Data Augmentation using DC-GANs and Mu-Sigma","authors":"Abhishek Khoyani, Harshdeep Kaur, Marzieh Amini, H. Sadreazami","doi":"10.1109/SENSORS52175.2022.9967231","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967231","url":null,"abstract":"The brain-computer interface is a technology that allows a machine to connect with the human brain and work based on the commands released by thoughts and activities of the brain. Electrodes are placed on the scalp and the changes in electric waves released by the brain are recorded as Electroencephalography (EEG) signals. In this work, we propose the use of generative adversarial networks and musigma methods to augment the EEG signals. Some of the existing deep learning methods such as convolutional neural network and recurrent neural network for classification of the EEG signals are implemented and their classification performance is examined with and without data augmentation. It is shown that the use of data augmentation can improve the performance of the EEG signal classification with deep learning models to a considerable extend.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122494003","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Deep-Learned Air-Coupled Ultrasonic Sonar Image Enhancement and Object Localization
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967244
Stefan Schulte, Gianni Allevato, Christoph Haugwitz, M. Kupnik
Air-coupled ultrasonic phased arrays are a complement to existing lidar-, camera- and radar-based sensors for object detection and spatial imaging. These in-air sonar systems typically use conventional beamforming (CBF) for high-frame-rate image formation. Consequently, in real-world multi-target environments, uniquely identifying reflectors is a challenging task due to the array-specific point spread function (PSF). Therefore, we present a neural auto-encoder network based on Xception for removing the PSF characteristics from CBF images and estimating the number of reflectors. Based on this information, the reflector coordinates are extracted by Gaussian mixture model clustering. We train and test the architecture on simulated and randomized multi-target CBF images. The performance is evaluated in terms of localization precision, reflector count error and the angular resolution obtained. The preliminary results show a low mean localization error (-0.61°, -3 mm) and an accuracy of 83% for the reflector count estimation. The angular resolution of the given array can be improved from 14° to 2°. Overall, we highlight the potential of state-of-the-art auto-encoder networks, typically used for optical images, for CBF image enhancement, and their combination with clustering for target localization.
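The clustering stage can be sketched as follows, assuming the enhanced image and the reflector count are already provided by the network; the intensity threshold and the resampling step are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Sketch of the clustering step only: fit a Gaussian mixture to bright pixels of an enhanced
# sonar image and read reflector coordinates off the component means.

def localize_reflectors(image, n_reflectors, intensity_floor=0.5):
    """image: 2D array of enhanced intensities in [0, 1]. Returns (n_reflectors, 2) coordinates."""
    ys, xs = np.nonzero(image > intensity_floor)
    weights = image[ys, xs]
    # Resample pixel coordinates proportionally to intensity so bright pixels dominate the fit.
    idx = np.random.default_rng(0).choice(len(xs), size=2000, p=weights / weights.sum())
    points = np.column_stack([xs[idx], ys[idx]]).astype(float)
    gmm = GaussianMixture(n_components=n_reflectors, random_state=0).fit(points)
    return gmm.means_                                   # one (x, y) estimate per reflector

# Toy image with two bright blobs.
img = np.zeros((64, 64))
img[20:24, 10:14] = 1.0
img[40:44, 50:54] = 0.8
print(localize_reflectors(img, n_reflectors=2))
```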
{"title":"Deep-Learned Air-Coupled Ultrasonic Sonar Image Enhancement and Object Localization","authors":"Stefan Schulte, Gianni Allevato, Christoph Haugwitz, M. Kupnik","doi":"10.1109/SENSORS52175.2022.9967244","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967244","url":null,"abstract":"Air-coupled ultrasonic phased arrays are a complement to existing lidar-, camera- and radar-based sensors for object detection and spatial imaging. These in-air sonar systems typically use conventional beamforming (CBF) for high-frame rate image formation. Consequently, in real-world multi-target environments, the unique identification of reflectors is a challenging task due to the array-specific point spread function (PSF). Therefore, we present a neural auto-encoder network based on Xception for removing the PSF characteristics from CBF images and estimating the number of reflectors. Based on this information, the reflector coordinates are extracted by Gaussian mixture model clustering. We train and test the architecture on simulated and randomized multi-target CBF images. The performance is evaluated in terms of the localization precision, reflector count error and the angular resolution obtained. The preliminary results show a low mean error for the localization (-0.61°, -3 mm) and an accuracy of 83% for the reflector count estimation. The angular resolution of the given array can be improved from 14° to 2°. Overall, we highlight the potential of state-of-the-art auto-encoder networks, typically used for optical images, for CBF image enhancement and the combination with clustering for target localization.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121375287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Fabrication of Multimodal Image Sensor Capable of Simultaneous Measurement of Pressure and pH
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967109
Mizuki Odaira, Yukihiro Tatsumi, Kensuke Murakami, Ken Ogasahara, Satoshi Shimizu, Yong-Joon Choi, Kazuhiro Takahashi, T. Noda, K. Sawada
In this study, we propose a pressure-ion image sensor that can simultaneously measure pressure and ion distribution using a patterned piezoelectric film on a pH image sensor. By patterning the film using photolithography and lift-off, a structure with alternating pressure- and pH-sensing areas was achieved, and the fabricated pressure-ion image sensor successfully visualized pH and pressure distributions simultaneously.
{"title":"Fabrication of Multimodal Image Sensor Capable of Simultaneous Measurement of Pressure and pH","authors":"Mizuki Odaira, Yukihiro Tatsumi, Kensuke Murakami, Ken Ogasahara, Satoshi Shimizu, Yong-Joon Choi, Kazuhiro Takahashi, T. Noda, K. Sawada","doi":"10.1109/SENSORS52175.2022.9967109","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967109","url":null,"abstract":"In this study, we propose a pressure-ion image sensor that can simultaneously measure pressure and ion distribution with a patterned piezoelectric film on a pH image sensor. By patterning using photolithography and lift-off, a structure with alternating pressure and pH sensing areas was achieved, and the fabricated pressure-ion image sensor successfully visualized pH and pressure distribution simultaneously.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121166433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Ex Vivo Blood Viscosity Monitoring with Piezoelectric MEMS Resonators
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967277
M. Schneider, Júlia Santasusagna, Ingrid Anna Maria Magnet, U. Schmid
This exploratory work demonstrates the potential of plate-type piezoelectric MEMS resonators for measuring the dynamic viscosity of human blood. These micromachined silicon sensors are operated in roof-tile-shaped vibrational modes, featuring high quality factors in liquids. The quality factor of the 17 vibrational mode is used in combination with a sensor calibration procedure based on viscosity standards to monitor this fluidic material parameter. We demonstrate that the MEMS sensor can provide real-time viscosity data over extended periods of time, which may be of high interest in cardiovascular medicine and medical applications such as extracorporeal membrane oxygenation (ECMO).
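A minimal sketch of such a viscosity-standard calibration, assuming a simple power-law relation between quality factor and viscosity (the actual calibration model is not specified in the abstract):

```python
import numpy as np

# Illustrative calibration sketch (model form assumed): fit a power law Q = a * eta**b
# against viscosity standards, then invert it to read viscosity from a measured quality factor.

def fit_power_law(eta_standards, q_measured):
    """Least-squares fit of log(Q) = log(a) + b*log(eta); returns (a, b)."""
    b, log_a = np.polyfit(np.log(eta_standards), np.log(q_measured), 1)
    return np.exp(log_a), b

def viscosity_from_q(q, a, b):
    return (q / a) ** (1.0 / b)

# Synthetic calibration points (mPa*s, Q) roughly following Q ~ eta**-0.5.
eta_std = np.array([1.0, 2.0, 5.0, 10.0])
q_std = 300.0 / np.sqrt(eta_std)
a, b = fit_power_law(eta_std, q_std)
print(round(viscosity_from_q(150.0, a, b), 2))   # ~4.0 mPa*s for a measured Q of 150
```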
{"title":"Ex Vivo Blood Viscosity Monitoring with Piezoelectric MEMS Resonators","authors":"M. Schneider, Júlia Santasusagna, Ingrid Anna Maria Magnet, U. Schmid","doi":"10.1109/SENSORS52175.2022.9967277","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967277","url":null,"abstract":"This exploratory work demonstrates the potential of plate-type piezoelectric MEMS resonators for measuring the dynamic viscosity of human blood. These micromachined silicon sensors are operated in roof-tile shaped vibrational modes, featuring high quality factors in liquids. The quality factor of the 17 vibrational mode is used in combination with a sensor calibration procedure which is based on viscosity standards to monitor this fluidic material parameter. We demonstrate, that the MEMS sensor can provide real-time viscosity data over extended periods of time, which may be of high interest in cardiovascular medicine and medical applications such as extracorporal membrane oxygenation (ECMO).","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121225253","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Miniaturized Passive Bio-mechanical Valve for Hydrocephalus Treatment
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967194
Yuna Jung, Daniel W. Gulick, J. Christen
Hydrocephalus is an accumulation of excess pressure in the brain due to malfunction of its fluid drainage system, the arachnoid granulations. Standard treatment uses a shunt to drain excess cerebrospinal fluid to the abdomen. Conventional shunts suffer high failure rates over time. To reduce failure, we propose replacing the shunt with a miniaturized valve placed in the intracranial space. Our current prototype uses a duckbill valve design with a 1 mm outlet width. The valve leaflets are silicone (PDMS), with the fluid channel defined using photolithography. In benchtop pressure-versus-flow testing, the silicone duckbill valve achieved the target cracking pressure range of 5 to 15 cmH2O with no cycling degradation or reverse-flow leakage. Upcoming studies will monitor long-term degradation and test valve performance in vivo.
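A small sketch of how a cracking pressure could be read off a quasi-static pressure-flow ramp; the flow threshold and the synthetic valve curve are assumptions, not the reported test data.

```python
import numpy as np

# Sketch (test setup and threshold assumed): the cracking pressure is taken as the first
# pressure in a slow ramp at which forward flow exceeds a small threshold.

def cracking_pressure(pressure_cmh2o, flow_ml_min, flow_threshold=0.1):
    opened = np.flatnonzero(flow_ml_min > flow_threshold)
    return pressure_cmh2o[opened[0]] if opened.size else None

pressure = np.linspace(0, 20, 41)                              # 0..20 cmH2O ramp, 0.5 cmH2O steps
flow = np.where(pressure > 9.0, 2.0 * (pressure - 9.0), 0.0)   # synthetic valve opening near 9 cmH2O
print(cracking_pressure(pressure, flow), "cmH2O")              # first sample above threshold: 9.5 cmH2O
```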
{"title":"Miniaturized Passive Bio-mechanical Valve for Hydrocephalus Treatment","authors":"Yuna Jung, Daniel W. Gulick, J. Christen","doi":"10.1109/SENSORS52175.2022.9967194","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967194","url":null,"abstract":"Hydrocephalus is an accumulation of excess pressure in the brain due to malfunction of the fluid drainage system, arachnoid granulations. Standard treatment uses a shunt to drain excess cerebrospinal fluid to the abdomen. Conventional shunts suffer high failure rates over time. To reduce failure, we propose replacing the shunt with a miniaturized valve placed in the intracranial space. Our current prototype uses a duckbill valve design with 1 mm outlet width. The valve leaflets are silicone (PDMS), with the fluid channel defined using photolithography. In bench top pressure vs. flow testing, the silicone duckbill valve achieved the target cracking pressure range of 5 to 15 cmH2O with no cycling degradation or reverse flow leakage. Upcoming studies will monitor long-term degradation and test valve performance in vivo.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116348960","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Optical Grasping Force Sensor for Minimally Invasive Surgical Robotic Forceps
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967143
Kazutaka Sato, Shuichi Morizane, Atsushi Takenaka, M. Ueki, T. Matsunaga, Sang-seok Lee
The purpose of this study is to realize grasping-force measurement in forceps for minimally invasive surgical robots. The minimally invasive surgical robots currently in widespread use limit the surgeon's direct sense of touch and require a high level of control skill and proficiency to use. To address these problems, grasping-force sensing for robotic forceps is needed. However, considering practical aspects such as miniaturization and productivity, it is not easy to attach a conventional force sensor to robotic forceps. In this study, we propose a novel grasping-force sensing method using optical interference between ultra-thin optical fibers. We demonstrate the working principle of the grasp sensor through sensor fabrication and evaluation.
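A conceptual sketch of such a sensing chain, assuming a simple two-beam fringe model and a linear jaw stiffness; the wavelength, visibility, and stiffness values are placeholders, not the authors' design parameters.

```python
import numpy as np

# Conceptual sketch (geometry and constants assumed): a two-beam interference fringe model
# maps fiber-gap changes to intensity, and an assumed linear stiffness maps gap change to force.

WAVELENGTH_NM = 1550.0        # assumed source wavelength
VISIBILITY = 0.8              # assumed fringe visibility
STIFFNESS_N_PER_UM = 0.05     # assumed jaw compliance: force per micron of gap change

def fringe_intensity(gap_nm, i0=1.0):
    """Two-beam interference intensity for a given fiber-to-reflector gap."""
    return i0 * (1.0 + VISIBILITY * np.cos(4.0 * np.pi * gap_nm / WAVELENGTH_NM))

def force_from_gap_change(delta_gap_nm):
    return STIFFNESS_N_PER_UM * (delta_gap_nm / 1000.0)

gaps = np.array([10_000.0, 10_200.0, 10_400.0])                        # gap positions, nm
print(np.round(fringe_intensity(gaps), 3))                             # detected intensities
print([round(force_from_gap_change(g - gaps[0]), 3) for g in gaps])    # inferred forces, N
```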
{"title":"An Optical Grasping Force Sensor for Minimally Invasive Surgical Robotic Forceps","authors":"Kazutaka Sato, Shuichi Morizane, Atsushi Takenaka, M. Ueki, T. Matsunaga, Sang-seok Lee","doi":"10.1109/SENSORS52175.2022.9967143","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967143","url":null,"abstract":"The purpose of this study is to realize the grasp force measurement of forceps for minimally invasive surgical robots. The minimally invasive surgical robots currently in widespread use limit the direct sensation of the surgeon, and require a high level of control skills and proficiency to use. To solve those problems, the grasping force sensing of the robotic forceps is needed. However, considering the practical aspects such as miniaturization and productivity, it is not easy to attach the conventional force sensor to the robotic forceps. In this study, we propose a novel grasping force sensing method using optical interference between ultra-thin optical fibers. We demonstrate the working principle of the grasp sensor through the sensor fabrication and evaluation.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126652848","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Stencil Printing of Low-Cost Carbon-Based Stretchable Strain Sensors
Pub Date: 2022-10-30 | DOI: 10.1109/SENSORS52175.2022.9967200
Visva Moorthy, P. Kassanos, E. Burdet, E. Yeatman
The rapidly growing field of stretchable sensors has recently produced a myriad of sensing devices. However, these are often realized using expensive materials and complex manufacturing techniques. This work demonstrates stretchable stencil-printed strain sensors, fabricated using either graphite or carbon black as a conductive filler in a polydimethylsiloxane (PDMS) matrix, which also serves as the device substrate. The strain sensors demonstrated highly linear responses (R² = 0.95 and R² = 0.98) and sensitivities (gauge factors of 2.77 and 1.50, respectively) that are comparable to other published sensors manufactured using similar or more complex processes.
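For reference, the two figures of merit quoted above can be computed from a strain sweep as sketched below; the data are synthetic, not the paper's measurements.

```python
import numpy as np

# Worked sketch: gauge factor is the slope of (dR/R0) vs. strain, and R^2 measures the
# linearity of that fit.

def gauge_factor_and_r2(strain, resistance, r0):
    dr_over_r0 = (resistance - r0) / r0
    gf, intercept = np.polyfit(strain, dr_over_r0, 1)
    fit = gf * strain + intercept
    ss_res = np.sum((dr_over_r0 - fit) ** 2)
    ss_tot = np.sum((dr_over_r0 - dr_over_r0.mean()) ** 2)
    return gf, 1.0 - ss_res / ss_tot

strain = np.linspace(0.0, 0.2, 11)                       # 0..20 % strain
r0 = 10_000.0                                            # unstrained resistance, Ohm
resistance = r0 * (1 + 2.77 * strain) + np.random.default_rng(0).normal(0, 50, strain.size)
gf, r2 = gauge_factor_and_r2(strain, resistance, r0)
print(f"GF = {gf:.2f}, R^2 = {r2:.3f}")
```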
{"title":"Stencil Printing of Low-Cost Carbon-Based Stretchable Strain Sensors","authors":"Visva Moorthy, P. Kassanos, E. Burdet, E. Yeatman","doi":"10.1109/SENSORS52175.2022.9967200","DOIUrl":"https://doi.org/10.1109/SENSORS52175.2022.9967200","url":null,"abstract":"The rapidly growing field of stretchable sensors has recently produced a myriad of sensing devices. However, these are often realized using expensive materials and complex manufacturing techniques. This work demonstrates stretchable stencil printed strain sensors, fabricated using either graphite or carbon black as a conductive filler in a polydimethylsiloxane (PDMS) matrix, that was also used as the device substrate. The strain sensors demonstrated highly linear responses and sensitivities (R2 = 0.95 and R2 = 0.98, and gauge factors of 2.77 and 1.50, respectively) that are comparable to other published sensors manufactured using similar or more complex processes.","PeriodicalId":120357,"journal":{"name":"2022 IEEE Sensors","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126699687","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}