
ROBOMECH Journal: Latest Publications

Short-range Lidar SLAM utilizing localization data of monocular localization
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-10-26 DOI: 10.1186/s40648-021-00211-7
Sousuke Nakamura, Shunsuke Muto, Daichi Takahashi
{"title":"Short-range Lidar SLAM utilizing localization data of monocular localization","authors":"Sousuke Nakamura, Shunsuke Muto, Daichi Takahashi","doi":"10.1186/s40648-021-00211-7","DOIUrl":"https://doi.org/10.1186/s40648-021-00211-7","url":null,"abstract":"","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"8 1","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"65734095","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Stability analysis of multi-serial-link mechanism driven by antagonistic multiarticular artificial muscles
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-10-25 DOI: 10.21203/rs.3.rs-968693/v1
Y. Ishikawa, Hiroyuki Nabae, G. Endo, K. Suzumori
Artificial multiarticular musculoskeletal systems, often inspired by vertebrates, consist of serially connected links driven by monoarticular and multiarticular muscles and enable robots to produce dynamic, elegant, and flexible movements. However, serial links driven by multiarticular muscles can exhibit unstable motion (e.g., buckling). The stability of musculoskeletal mechanisms driven by antagonistic multiarticular muscles depends on the muscle configuration, the origins/insertions of the muscles, their spring constants, their contracting forces, and other factors. We analyze the stability of a multi-serial-link mechanism driven by antagonistic multiarticular muscles with the aim of avoiding buckling and other undesired motions. We theoretically derive the potential energy of the system and the stability condition at the target point, and validate the results through dynamic simulations and experiments. This paper presents static stability criteria for serially linked robots that are redundantly driven by monoarticular and multiarticular muscles, yielding design and control guidelines for such robots.
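For orientation, the kind of static stability criterion the abstract refers to can be sketched in generic terms: a target configuration q* of a conservative multi-link system is statically stable when the total potential energy U(q) (elastic energy stored in the muscles plus gravity terms, minus the work done by the contracting forces) is stationary and locally convex there. The following is only an illustrative restatement of standard mechanics, not the paper's exact derivation:

```latex
% Generic static stability condition at a target configuration q*;
% U(q) is the system's total potential energy over the joint angles q.
% Illustrative only -- the paper's own criterion may contain further terms.
\nabla_q U(q^\ast) = 0,
\qquad
\left.\frac{\partial^2 U}{\partial q\, \partial q^{\mathsf{T}}}\right|_{q = q^\ast} \succ 0 .
```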
{"title":"Stability analysis of multi-serial-link mechanism driven by antagonistic multiarticular artificial muscles","authors":"Y. Ishikawa, Hiroyuki Nabae, G. Endo, K. Suzumori","doi":"10.21203/rs.3.rs-968693/v1","DOIUrl":"https://doi.org/10.21203/rs.3.rs-968693/v1","url":null,"abstract":"Artificial multiarticular musculoskeletal systems consisting of serially connected links driven by monoarticular and multiarticular muscles, which are often inspired by vertebrates, enable robots to elicit dynamic, elegant, and flexible movements. However, serial links driven by multiarticular muscles can cause unstable motion (e.g., buckling). The stability of musculoskeletal mechanisms driven by antagonistic multiarticular muscles depends on the muscle configuration, origin/insertion of muscles, spring constants of muscles, contracting force of muscles, and other factors. We analyze the stability of a multi-serial-link mechanism driven by antagonistic multiarticular muscles aiming to avoid buckling and other undesired motions. We theoretically derive the potential energy of the system and the stable condition at the target point, and validate the results through dynamic simulations and experiments. This paper presents the static stability criteria of serially linked robots, which are redundantly driven by monoarticular and multiarticular muscles, resulting in the design and control guidelines for those robots.","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"9 1","pages":"1-12"},"PeriodicalIF":1.4,"publicationDate":"2021-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41973448","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
A framework of physically interactive parameter estimation based on active environmental groping for safe disaster response work
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-10-02 DOI: 10.1186/s40648-021-00209-1
Mitsuhiro Kamezaki, Yusuke Uehara, Kohga Azuma, Shigeki Sugano
Disaster response robots are expected to perform complicated tasks such as traveling over unstable terrain, climbing slippery steps, and removing heavy debris. To complete such tasks safely, the robots must obtain not only visual-perceptual information (VPI), such as surface shape, but also haptic-perceptual information (HPI), such as the surface friction of objects in the environment. VPI can be obtained from laser sensors and cameras. In contrast, HPI can essentially be obtained only from the results of physical interaction with the environment, e.g., reaction force and deformation. However, current robots do not have a function to estimate HPI. In this study, we propose a framework to estimate such physically interactive parameters (PIPs), including hardness, friction, and weight, which are vital parameters for safe robot-environment interaction. For effective estimation, we define the ground groping mode (GGM) and the object groping mode (OGM). The endpoint of the robot arm, which has a force sensor, actively touches, pushes, rubs, and lifts objects in the environment under hybrid position/force control, and the three kinds of PIPs are estimated from the measured reaction force and displacement of the arm endpoint. The robot finally judges the accident risk (e.g., safe, attentional, or dangerous) based on the estimated PIPs. We prepared environments that had the same surface shape but different hardness, friction, and weight. The experimental results indicated that the proposed framework could estimate PIPs adequately and was useful for judging risk and planning tasks safely.
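As a rough illustration of how such parameters can be recovered from endpoint force and displacement logs (a sketch under assumed signal names and thresholds, not the authors' implementation), hardness can be read off as the slope of normal force versus indentation depth while pushing, the friction coefficient as the tangential-to-normal force ratio while rubbing, and weight from the steady vertical force while lifting:

```python
# Illustrative sketch (not the authors' implementation): estimating the three
# physically interactive parameters (PIPs) named in the abstract from endpoint
# force/displacement logs, then mapping them to a coarse risk label.
# All function names and numeric thresholds here are hypothetical.
import numpy as np

def estimate_hardness(push_depth_m, normal_force_n):
    """Hardness as the slope of normal force vs. indentation depth (N/m)."""
    # Least-squares fit of F = k * x + b over the pushing-phase samples.
    k, _ = np.polyfit(np.asarray(push_depth_m), np.asarray(normal_force_n), 1)
    return float(k)

def estimate_friction(tangential_force_n, normal_force_n):
    """Coulomb friction coefficient from rubbing: mu ~ |Ft| / |Fn|."""
    ft = np.abs(np.asarray(tangential_force_n))
    fn = np.abs(np.asarray(normal_force_n))
    contact = fn > 1.0                 # ignore samples with negligible contact force
    return float(np.mean(ft[contact] / fn[contact]))

def estimate_weight(vertical_force_n, gravity=9.81):
    """Object mass (kg) from the steady vertical force measured while lifting."""
    return float(np.mean(vertical_force_n)) / gravity

def judge_risk(hardness, friction, mass,
               k_min=2e4, mu_min=0.3, m_max=20.0):
    """Map estimated PIPs to safe / attentional / dangerous (thresholds hypothetical)."""
    if hardness < k_min or friction < mu_min or mass > m_max:
        return "dangerous"
    if hardness < 2 * k_min or friction < 2 * mu_min or mass > 0.5 * m_max:
        return "attentional"
    return "safe"
```

In such a scheme, the ground groping mode (GGM) and object groping mode (OGM) mentioned in the abstract would determine which probing motions, and hence which of these estimators, are executed.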
{"title":"A framework of physically interactive parameter estimation based on active environmental groping for safe disaster response work","authors":"Kamezaki, Mitsuhiro, Uehara, Yusuke, Azuma, Kohga, Sugano, Shigeki","doi":"10.1186/s40648-021-00209-1","DOIUrl":"https://doi.org/10.1186/s40648-021-00209-1","url":null,"abstract":"Disaster response robots are expected to perform complicated tasks such as traveling over unstable terrain, climbing slippery steps, and removing heavy debris. To complete such tasks safely, the robots must obtain not only visual-perceptual information (VPI) such as surface shape but also the haptic-perceptual information (HPI) such as surface friction of objects in the environments. VPI can be obtained from laser sensors and cameras. In contrast, HPI can be basically obtained from only the results of physical interaction with the environments, e.g., reaction force and deformation. However, current robots do not have a function to estimate the HPI. In this study, we propose a framework to estimate such physically interactive parameters (PIPs), including hardness, friction, and weight, which are vital parameters for safe robot-environment interaction. For effective estimation, we define the ground (GGM) and object groping modes (OGM). The endpoint of the robot arm, which has a force sensor, actively touches, pushes, rubs, and lifts objects in the environment with a hybrid position/force control, and three kinds of PIPs are estimated from the measured reaction force and displacement of the arm endpoint. The robot finally judges the accident risk based on estimated PIPs, e.g., safe, attentional, or dangerous. We prepared environments that had the same surface shape but different hardness, friction, and weight. The experimental results indicated that the proposed framework could estimate PIPs adequately and was useful to judge the risk and safely plan tasks.","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"46 1","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138527533","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Force display control system for simultaneous 3-axis translational motion in surgical training simulator for chiseling operation
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-09-15 DOI: 10.1186/s40648-021-00208-2
Kentaro Masuyama, Y. Noda, Y. Ito, Y. Kagiyama, Koichi Ueki
{"title":"Force display control system for simultaneous 3-axis translational motion in surgical training simulator for chiseling operation","authors":"Kentaro Masuyama, Y. Noda, Y. Ito, Y. Kagiyama, Koichi Ueki","doi":"10.1186/s40648-021-00208-2","DOIUrl":"https://doi.org/10.1186/s40648-021-00208-2","url":null,"abstract":"","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"8 1","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"65734045","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
Self-excited air flow passage changing device for periodic pressurization of soft robot
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-08-26 DOI: 10.1186/s40648-021-00207-3
Toshio Takayama, Yusuke Sumi
Recently, pneumatically driven soft robots have been widely developed. Their operating principle is usually the inflation and deflation of elastic chambers by air pressure. Some soft robots need rapid, periodic inflation and deflation of their air chambers to generate continuous motion such as forward or rotational motion. However, if the soft robot needs to operate far from the air pressure source, long air tubes are required to supply air pressure to its chambers. As a result, there is a large delay in supplying air pressure to the chambers, and the motion of the robot slows down. In this paper, we propose a compact device that changes its airflow passages through self-excited motion generated by a continuous airflow supply. The device is 20 mm in diameter and 50 mm long and can be driven inside a small pipe. Our proposed in-pipe mobile robot is connected to the device and can move through a small pipe by dragging the device along with it. To apply the device widely to other soft robots, we also discuss a method of adjusting the output pressure and motion frequency.
{"title":"Self-excited air flow passage changing device for periodic pressurization of soft robot","authors":"Takayama, Toshio, Sumi, Yusuke","doi":"10.1186/s40648-021-00207-3","DOIUrl":"https://doi.org/10.1186/s40648-021-00207-3","url":null,"abstract":"Recently pneumatic-driven soft robots have been widely developed. Usually, the operating principle of this robot is the inflation and deflation of elastic inflatable chambers by air pressure. Some soft robots need rapid and periodic inflation and deflation of their air chambers to generate continuous motion such as progress motion or rotational motion. However, if the soft robot needs to operate far from the air pressure source, long air tubes are required to supply air pressure to its air chambers. As a result, there is a large delay in supplying air pressure to the air chamber, and the motion of the robot slows down. In this paper, we propose a compact device that changes its airflow passages by self-excited motion generated by a supply of continuous airflow. The diameter and the length of the device are 20 and 50 mm, respectively, and can be driven in a small pipe. Our proposed in-pipe mobile robot is connected to the device and can move in a small pipe by dragging the device into it. To apply the device widely to other soft robots, we also discuss a method of adjusting the output pressure and motion frequency.","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"43 7","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138527525","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
Recalling of multiple grasping methods from an object image with a convolutional neural network
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-07-06 DOI: 10.1186/s40648-021-00206-4
M. Sanada, T. Matsuo, N. Shimada, Y. Shirai
{"title":"Recalling of multiple grasping methods from an object image with a convolutional neural network","authors":"M. Sanada, T. Matsuo, N. Shimada, Y. Shirai","doi":"10.1186/s40648-021-00206-4","DOIUrl":"https://doi.org/10.1186/s40648-021-00206-4","url":null,"abstract":"","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s40648-021-00206-4","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44615887","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
3D pointing gestures as target selection tools: guiding monocular UAVs during window selection in an outdoor environment
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-04-16 DOI: 10.1186/s40648-021-00200-w
A. Medeiros, P. Ratsamee, J. Orlosky, Yuki Uranishi, Manabu Higashida, H. Takemura
{"title":"3D pointing gestures as target selection tools: guiding monocular UAVs during window selection in an outdoor environment","authors":"A. Medeiros, P. Ratsamee, J. Orlosky, Yuki Uranishi, Manabu Higashida, H. Takemura","doi":"10.1186/s40648-021-00200-w","DOIUrl":"https://doi.org/10.1186/s40648-021-00200-w","url":null,"abstract":"","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"8 1","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s40648-021-00200-w","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"65733861","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 9
Toward mission-dependent long robotic arm enhancement: design method of flying watch attachment allocation based on thrust drivability
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-03-20 DOI: 10.1186/s40648-021-00198-1
Siyi Pan, G. Endo
{"title":"Toward mission-dependent long robotic arm enhancement: design method of flying watch attachment allocation based on thrust drivability","authors":"Siyi Pan, G. Endo","doi":"10.1186/s40648-021-00198-1","DOIUrl":"https://doi.org/10.1186/s40648-021-00198-1","url":null,"abstract":"","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"8 1","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"65733586","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
EKF-based self-attitude estimation with DNN learning landscape information
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-03-06 DOI: 10.1186/s40648-021-00196-3
Ryota Ozaki, Y. Kuroda
{"title":"EKF-based self-attitude estimation with DNN learning landscape information","authors":"Ryota Ozaki, Y. Kuroda","doi":"10.1186/s40648-021-00196-3","DOIUrl":"https://doi.org/10.1186/s40648-021-00196-3","url":null,"abstract":"","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"8 1","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-03-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s40648-021-00196-3","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"65733533","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Projection-mapping-based object pointing using a high-frame-rate camera-projector system
IF 1.4 Q3 INSTRUMENTS & INSTRUMENTATION Pub Date : 2021-03-01 DOI: 10.1186/s40648-021-00197-2
Deepak Kumar, Sushil Raut, Kohei Shimasaki, T. Senoo, I. Ishii
{"title":"Projection-mapping-based object pointing using a high-frame-rate camera-projector system","authors":"Deepak Kumar, Sushil Raut, Kohei Shimasaki, T. Senoo, I. Ishii","doi":"10.1186/s40648-021-00197-2","DOIUrl":"https://doi.org/10.1186/s40648-021-00197-2","url":null,"abstract":"","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":" ","pages":""},"PeriodicalIF":1.4,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1186/s40648-021-00197-2","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47859465","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0