TY - GEN
T1 - Multisensor fusion in Reconfigurable Robots using Adaptive Transformation
AU - Povendhan, A. P.
AU - Hayat, A. A.
AU - Yi, Lim
AU - Hoong, J. C. C.
AU - Elara, M. R.
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - Multisensor fusion is essential for various applications such as accurate autonomous navigation, object detection, and perception. The calibration approach for obtaining accurate and meaningful results relies on the intrinsic and extrinsic calibration of sensors, under the assumption that the sensors are fixed relative to each other. Robots with the ability to change their morphology are referred to as reconfigurable robots. The reconfigurable robot named Panthera is taken as a use case to showcase the need for, and a practical application of, calibration where a relative change in the position of the sensors occurs. The proposed approach accounts for the adaptive transformation, which captures the change in the relative pose of the sensors in real time. The presented scheme is demonstrated using multiple stereo cameras and three-dimensional LiDARs. Moreover, results showing the effectiveness of the adaptive transformation in two cases are presented and discussed. The framework is generic and suitable for multisensor fusion where each sensor is subject to a change in relative pose.
AB - Multisensor fusion is essential for various applications such as accurate autonomous navigation, object detection, and perception. The calibration approach for obtaining accurate and meaningful results relies on the intrinsic and extrinsic calibration of sensors, under the assumption that the sensors are fixed relative to each other. Robots with the ability to change their morphology are referred to as reconfigurable robots. The reconfigurable robot named Panthera is taken as a use case to showcase the need for, and a practical application of, calibration where a relative change in the position of the sensors occurs. The proposed approach accounts for the adaptive transformation, which captures the change in the relative pose of the sensors in real time. The presented scheme is demonstrated using multiple stereo cameras and three-dimensional LiDARs. Moreover, results showing the effectiveness of the adaptive transformation in two cases are presented and discussed. The framework is generic and suitable for multisensor fusion where each sensor is subject to a change in relative pose.
UR - http://www.scopus.com/inward/record.url?scp=85124691887&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85124691887&partnerID=8YFLogxK
U2 - 10.1109/NIR52917.2021.9666081
DO - 10.1109/NIR52917.2021.9666081
M3 - Conference contribution
AN - SCOPUS:85124691887
T3 - 2021 International Conference "Nonlinearity, Information and Robotics", NIR 2021
BT - 2021 International Conference "Nonlinearity, Information and Robotics", NIR 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 International Conference "Nonlinearity, Information and Robotics", NIR 2021
Y2 - 26 August 2021 through 29 August 2021
ER -