TY - JOUR
T1 - Implementation of a flexible and lightweight depth-based visual servoing solution for feature detection and tracing of large, spatially-varying manufacturing workpieces
AU - Clift, Lee
AU - Tiwari, Divya
AU - Scraggs, Chris
AU - Hutabarat, Windo
AU - Tinkler, Lloyd
AU - Aitken, Jonathan M.
AU - Tiwari, Ashutosh
PY - 2022/2/11
N2 - This work proposes a novel solution for detecting and tracing the spatially varying edges of large manufacturing workpieces using a consumer-grade RGB-D camera, with only a partial view of the workpiece and no prior knowledge. The proposed system can visually detect and trace edges across a wide range of angles to an accuracy of 15 mm or better, without any prior information, setup or planning. Its effectiveness is demonstrated through a combination of physical experiments on the setup and more complex simulated experiments, carried out on both acute and obtuse edges as well as typical aerospace structures made from a variety of materials, with dimensions ranging from 400 mm to 600 mm. Simulated results show that, with artificial noise added, the solution can detect aerospace structures to an accuracy of 40 mm or better, depending on the amount of noise present, while physical aerospace-inspired structures can be traced with a consistent accuracy of 5 mm regardless of cardinal direction. Compared with current industrial solutions, the absence of required planning and the robustness of the edge detection should allow tasks to be completed more quickly and easily than the current standard, at a lower financial and computational cost than the techniques currently in use.
KW - kinematics
KW - visual servoing
KW - edge detection
KW - robotic vision
KW - inverse kinematics
KW - edge tracing
KW - digital manufacturing
DO - 10.3390/robotics11010025
M3 - Article
SN - 2218-6581
VL - 11
JO - Robotics
JF - Robotics
IS - 1
M1 - 25
ER -