Face Tracking Strategy Based on Manipulability of a 7-DOF Robot Arm and Head Motion Intention Ellipsoids
Shuai Zhang, Bo Ouyang, Xian He, Xin Yuan, Shanlin Yang
2022 IEEE International Conference on Real-time Computing and Robotics (RCAR), published 2022-07-17
DOI: 10.1109/RCAR54675.2022.9872298
Citations: 3
Abstract
Nurses monitor a patient’s condition in the intensive care unit (ICU), such as pain, agitation, and delirium, by observing facial expressions and eye motions. However, no instrument records facial expressions or eye motions as accurately as an ECG monitor records cardiac activity. To tackle this issue, we develop a face tracking strategy using a 7-DOF robot arm with a camera mounted on its end-effector. First, we constrain the linear and angular velocities of head motion intention to ellipsoids determined by the patient’s head pose and the geometry of the hospital bed, named head motion intention ellipsoids (HMIEs). Moreover, we define manipulability ellipsoids (MEs) of the 7-DOF robot arm based on the Jacobian matrix, and adjust them in the null space during tracking. We calculate the optimal camera configuration using feedback of the head configuration while minimizing the difference between the HMIEs and MEs. Simulation results verify that the proposed face tracking strategy outperforms a visual servoing control strategy based only on the pseudo-inverse of the Jacobian matrix.
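To make the two ingredients named in the abstract concrete, the sketch below illustrates (a) a velocity manipulability ellipsoid derived from the Jacobian as the singular-value decomposition of J, and (b) a pseudo-inverse tracking command with a secondary joint velocity projected into the null space of J, which is how a redundant 7-DOF arm can reshape its ellipsoid without disturbing the camera task. This is a minimal illustration, not the authors' implementation: the random 6x7 Jacobian, the function names, and the sample task twist are all placeholders, and the paper's actual HMIE-vs-ME matching objective is not reproduced here.

```python
import numpy as np


def manipulability_ellipsoid(J):
    """Axes (columns of U) and semi-axis lengths (singular values of J)
    of the velocity manipulability ellipsoid {v = J*qdot : ||qdot|| <= 1}."""
    U, s, _ = np.linalg.svd(J)
    return U, s


def redundant_velocity_command(J, v_task, qdot_secondary):
    """Joint velocities: pseudo-inverse tracking of the task twist plus a
    secondary term projected into the null space of J (redundancy resolution)."""
    J_pinv = np.linalg.pinv(J)
    N = np.eye(J.shape[1]) - J_pinv @ J          # null-space projector of J
    return J_pinv @ v_task + N @ qdot_secondary


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    J = rng.standard_normal((6, 7))              # placeholder Jacobian of a 7-DOF arm
    axes, lengths = manipulability_ellipsoid(J)
    v_task = np.array([0.05, 0.0, 0.0, 0.0, 0.0, 0.1])   # hypothetical camera twist
    qdot = redundant_velocity_command(J, v_task,
                                      qdot_secondary=0.1 * rng.standard_normal(7))
    print("ME semi-axis lengths:", np.round(lengths, 3))
    print("joint velocity command:", np.round(qdot, 3))
```

In the paper's setting, the secondary joint velocity would be chosen to drive the ME toward the patient-specific HMIE rather than drawn at random as it is in this sketch.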