
Spacecraft pose detection for robotic arm grasping (面向机械臂抓取的航天器位姿检测)


【Author】 HUANG Cheng; SU Junfei; XU Jiazhong (College of Automation, Harbin University of Science and Technology)

【Corresponding Author】 HUANG Cheng

【Affiliation】 College of Automation, Harbin University of Science and Technology

【Abstract】 An improved YOLOv8n-based target pose detection algorithm is proposed to address the low efficiency of target detection in non-cooperative robotic arm grasping tasks. First, Large Separable Kernel Attention (LSKA) is integrated into the Spatial Pyramid Pooling Fusion (SPPF) layer to enhance the model's multi-scale feature aggregation capability. Second, a novel lightweight module, RGCSPELAN, is designed to reduce the computational cost and runtime of the model. Then, the Mean Pairwise Distance Intersection over Union (MPDIoU) is restructured with the inner-IoU idea and combined with the weighted intersection over union (Wise-IoU) to form a new loss function, Wise-MPDIoU^inner, which improves both the training efficiency and detection performance of the model. Finally, using the target's position detection and depth information, a real-time target coordinate system is constructed to obtain its 3D spatial pose, enabling the robotic arm to complete the grasping task. Experimental results show that the proposed algorithm achieves a precision of 96.5% and an mAP@0.5 of 96.7%, with a 16% reduction in parameters and a 33% improvement in inference speed, striking a balance between detection accuracy and lightweight design while meeting the real-time requirements of the UR5 robotic arm in non-cooperative grasping tasks.
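
As an illustration of the loss-function idea summarized in the abstract, the sketch below computes MPDIoU on "inner" auxiliary boxes shrunk about their centres, which is the general shape of the inner-IoU modification. The function name inner_mpd_iou, the ratio parameter, and the tensor layout are assumptions for illustration only, and the Wise-IoU dynamic weighting that the paper layers on top is not reproduced here.

```python
import torch

def inner_mpd_iou(pred, target, img_w, img_h, ratio=0.75, eps=1e-7):
    """Illustrative sketch (not the authors' exact formulation): MPDIoU
    evaluated on "inner" auxiliary boxes shrunk about their centres.

    pred, target: (N, 4) tensors in (x1, y1, x2, y2) format.
    ratio:        inner-IoU scale factor (<1 shrinks both boxes).
    img_w, img_h: input-image size used by the MPDIoU corner penalty.
    """
    def shrink(box):
        # Scale a box about its centre by `ratio` (the inner-IoU idea).
        cx, cy = (box[:, 0] + box[:, 2]) / 2, (box[:, 1] + box[:, 3]) / 2
        w = (box[:, 2] - box[:, 0]) * ratio
        h = (box[:, 3] - box[:, 1]) * ratio
        return torch.stack([cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2], dim=1)

    p, t = shrink(pred), shrink(target)

    # Plain IoU of the shrunken boxes.
    lt = torch.max(p[:, :2], t[:, :2])
    rb = torch.min(p[:, 2:], t[:, 2:])
    wh = (rb - lt).clamp(min=0)
    inter = wh[:, 0] * wh[:, 1]
    area_p = (p[:, 2] - p[:, 0]) * (p[:, 3] - p[:, 1])
    area_t = (t[:, 2] - t[:, 0]) * (t[:, 3] - t[:, 1])
    iou = inter / (area_p + area_t - inter + eps)

    # MPDIoU penalty: squared distances between matching corners,
    # normalised by the squared image diagonal.
    d1 = (p[:, 0] - t[:, 0]) ** 2 + (p[:, 1] - t[:, 1]) ** 2
    d2 = (p[:, 2] - t[:, 2]) ** 2 + (p[:, 3] - t[:, 3]) ** 2
    diag2 = img_w ** 2 + img_h ** 2
    return iou - d1 / diag2 - d2 / diag2
```

A bounding-box regression loss built on this quantity would then be taken as 1 - inner_mpd_iou(pred, target, img_w, img_h).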

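The final step described in the abstract, recovering a 3D grasp target from the 2D detection and depth information, typically reduces to back-projecting a pixel through the camera intrinsics with the pinhole model. A minimal sketch under that assumption follows; the function name, intrinsics, box coordinates, and depth value are all hypothetical.

```python
import numpy as np

def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into camera coordinates
    using the pinhole model; fx, fy, cx, cy are the camera intrinsics."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Hypothetical example: centre of a detected bounding box plus the depth
# sampled at that pixel from an aligned depth image.
box = (420.0, 180.0, 560.0, 330.0)                    # (x1, y1, x2, y2) from the detector
u, v = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
p_cam = pixel_to_camera_xyz(u, v, depth_m=0.85,
                            fx=615.0, fy=615.0, cx=320.0, cy=240.0)
# p_cam would then be transformed into the robot base frame via the
# hand-eye calibration before being sent to the UR5 as a grasp target.
```
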
【Fund】 National Natural Science Foundation of China (No. 52102455)
  • 【Source】 Optics and Precision Engineering (光学精密工程), 2024, No. 21
  • 【CLC Number】 TP241; V44
  • 【Downloads】 17