Marine Object Detection Using LiDAR on an Unmanned Surface Vehicle
| Field | Value |
| --- | --- |
| Authors | Yvan Eustache · Cédric Seguin · Antoine Pecout · Alexandre Foucher · Johann Laurent · Dominique Heller |
| Journal | IEEE Access |
| Published | January 2025 |
| Technical category | Energy storage system technology |
| Technical tags | Energy storage systems |
| Relevance score | ★★★★ 4.0 / 5.0 |
| Keywords | Marine object detection · 3D object detection · Unmanned surface vehicles · LiDAR data · Late fusion strategy |
Chinese Abstract (translated)
This paper proposes a LiDAR-only 3D object detection method designed for small unmanned surface vehicles (USVs) and optimized for energy-efficiency and computational constraints. Contributions include a point cloud dataset collected from a 2-meter autonomous USV, augmented with data from a hardware-in-the-loop simulation environment. The PointPillars network is trained and evaluated on this data, and the LiDAR-only method is compared against a multimodal LiDAR-camera approach. The core innovation is a late fusion strategy in which each sensor performs detection independently before the results are integrated, consuming significantly fewer resources than early fusion. The approach suits deployment on compact, low-power surface drones and advances practical, scalable marine perception systems.
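The abstract names PointPillars as the detection network, chosen for its efficiency on point clouds. Its first stage groups raw LiDAR points into vertical "pillars" on an x-y grid before a lightweight network encodes them. A minimal sketch of that pillarization idea follows; the grid ranges, pillar size, and per-pillar point cap are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def pillarize(points, x_range=(0.0, 64.0), y_range=(-32.0, 32.0),
              pillar_size=0.16, max_points=32):
    """Group a LiDAR point cloud of shape (N, 3) into vertical pillars
    on an x-y grid, the first stage of PointPillars-style processing.

    Returns a dict mapping (ix, iy) grid index -> (max_points, 3) array,
    zero-padded or truncated to max_points points per pillar.
    All range/size parameters here are illustrative defaults.
    """
    pillars = {}
    for p in points:
        x, y = p[0], p[1]
        # Discard points outside the detection area
        if not (x_range[0] <= x < x_range[1] and y_range[0] <= y < y_range[1]):
            continue
        ix = int((x - x_range[0]) / pillar_size)
        iy = int((y - y_range[0]) / pillar_size)
        pillars.setdefault((ix, iy), []).append(p)

    out = {}
    for key, pts in pillars.items():
        arr = np.zeros((max_points, 3), dtype=np.float32)
        kept = np.asarray(pts[:max_points], dtype=np.float32)
        arr[: len(kept)] = kept
        out[key] = arr
    return out
```

In the full network, each fixed-size pillar tensor is fed through a shared encoder and scattered back onto the 2D grid, turning the sparse cloud into a pseudo-image that standard 2D convolutions can process efficiently.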
English Abstract
Marine object detection plays a crucial role in various applications such as collision avoidance and autonomous navigation in maritime environments. While most existing datasets focus on 2D object detection, this research introduces a novel 3D object detection approach that relies exclusively on LiDAR (Light Detection And Ranging) data, specifically tailored for small Unmanned Surface Vehicles (USVs), where energy efficiency and computational constraints are key challenges. This study contributes a new point cloud dataset collected from a 2-meter autonomous USV and augmented through a hardware-in-the-loop simulation environment. The PointPillars network, chosen for its efficiency in processing LiDAR data, was trained and evaluated in this maritime context. A comparative analysis was also conducted between the proposed LiDAR-only method and a multimodal (LiDAR-camera) approach. The core innovation of this work is a late fusion strategy, where object detection is performed independently across sensors before integration. This results in a significantly less resource-intensive solution compared to early fusion methods. Consequently, the LiDAR-only approach is highly suitable for deployment on compact, low-power autonomous surface drones, marking a step forward in practical and scalable marine perception systems.
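The late fusion strategy described above runs a detector per sensor and only then merges the resulting object lists, which is far cheaper than fusing raw data. A minimal sketch of such a merge step follows, assuming detections from both sensors have already been projected into a shared bird's-eye-view frame; the IoU matching and score-averaging rules are illustrative assumptions, not the paper's exact method.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    box: tuple    # axis-aligned box (x1, y1, x2, y2) in a shared frame
    score: float  # detector confidence in [0, 1]
    label: str    # object class

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

def late_fuse(lidar_dets, camera_dets, iou_thresh=0.5):
    """Merge per-sensor detection lists: matched same-label pairs keep
    the higher-confidence geometry with an averaged score; unmatched
    detections from either sensor pass through unchanged."""
    fused, matched = [], set()
    for ld in lidar_dets:
        best, best_iou = None, iou_thresh
        for j, cd in enumerate(camera_dets):
            if j in matched or cd.label != ld.label:
                continue
            v = iou(ld.box, cd.box)
            if v >= best_iou:
                best, best_iou = j, v
        if best is not None:
            matched.add(best)
            cd = camera_dets[best]
            keep = ld if ld.score >= cd.score else cd
            fused.append(Detection(keep.box, (ld.score + cd.score) / 2, keep.label))
        else:
            fused.append(ld)
    fused.extend(cd for j, cd in enumerate(camera_dets) if j not in matched)
    return fused
```

Because each branch only exchanges compact box lists rather than full point clouds or images, the fusion step itself adds negligible compute, which is what makes this style of fusion attractive on low-power USV hardware.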
SunView In-Depth Interpretation
This low-power object detection technique is relevant to the intelligent operations and maintenance of Sungrow energy storage systems. Sungrow's iSolarCloud platform already uses drones and sensor fusion for PV plant inspection, and the late fusion strategy of this LiDAR approach could reduce edge-computing power consumption. Combined with the edge-AI capability of Sungrow's smart inverters, the technique could optimize intelligent inspection and fault diagnosis of PV plants, improving O&M efficiency and lowering LCOE.