Spacecraft pose estimation is crucial for autonomous in-space operations such as rendezvous, docking, and on-orbit servicing. Vision-based pose estimation methods, which typically employ RGB imaging sensors, are a compelling solution for spacecraft pose estimation, but they are challenged by harsh lighting conditions that produce imaging artifacts such as glare, over-exposure, blooming, and lens flare. Due to their much higher dynamic range, neuromorphic or event sensors are more resilient to extreme lighting conditions. However, event sensors generally have lower spatial resolution and suffer from reduced signal-to-noise ratio during periods of low relative motion. This work addresses these individual sensor limitations by introducing a sensor fusion approach combining RGB and event sensors. A beam-splitter prism was employed to achieve precise optical and temporal alignment. A RANSAC-based technique was then developed to fuse information from the RGB and event channels, yielding pose estimates that leverage the strengths of both modalities. The pipeline was complemented by dropout-based uncertainty estimation to detect extreme conditions that affect either channel. To benchmark the performance of the proposed event-RGB fusion method, we collected a comprehensive real dataset of RGB and event data for satellite pose estimation in a laboratory setting under a variety of challenging illumination conditions. Encouraging results on the dataset demonstrate the efficacy of our event-RGB fusion approach and further support the use of event sensors for spacecraft pose estimation. To support community research on this topic, our dataset has been released publicly.
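As a rough illustration of the kind of RANSAC-based fusion the abstract describes, the sketch below pools 2D-3D keypoint correspondences predicted from the RGB and event channels and solves a single RANSAC PnP problem over the combined set, gating each channel by its dropout-based uncertainty. This is not the paper's implementation: the pooling strategy, the scalar uncertainty threshold `unc_thresh`, and the RANSAC parameters are assumptions introduced here for illustration.

```python
# Minimal sketch (assumed, not the authors' implementation) of RANSAC-based
# fusion of RGB and event correspondences for pose estimation.
import numpy as np
import cv2


def fuse_and_estimate_pose(rgb_corr, event_corr, rgb_unc, event_unc,
                           K, unc_thresh=0.5):
    """rgb_corr / event_corr: (pts3d Nx3, pts2d Nx2) correspondences per channel.
    rgb_unc / event_unc: scalar dropout-based uncertainty per channel (assumed form).
    Channels whose uncertainty exceeds unc_thresh are dropped before fusion."""
    pts3d, pts2d = [], []
    for (p3, p2), unc in ((rgb_corr, rgb_unc), (event_corr, event_unc)):
        if unc < unc_thresh:                 # keep only confident channels
            pts3d.append(p3)
            pts2d.append(p2)
    if not pts3d:
        return None                          # both channels deemed unreliable
    pts3d = np.vstack(pts3d).astype(np.float64)
    pts2d = np.vstack(pts2d).astype(np.float64)
    # Single RANSAC PnP over the pooled correspondences; outliers from either
    # modality are rejected jointly.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts3d, pts2d, K, distCoeffs=None,
        iterationsCount=200, reprojectionError=4.0,
        flags=cv2.SOLVEPNP_EPNP)
    return (rvec, tvec, inliers) if ok else None
```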