Event cameras, when combined with inertial sensors, show significant potential for motion estimation in challenging scenarios such as high-speed maneuvers and low-light environments. Many methods address this estimation problem, but most reduce it to a synchronous, discrete-time fusion problem. However, the asynchronous nature of event cameras and their unique fusion mechanism with inertial sensors remain underexplored. In this paper, we introduce a monocular event-inertial odometry method called AsynEIO, designed to fuse asynchronous event and inertial data within a unified Gaussian Process (GP) regression framework. Our approach incorporates an event-driven frontend that tracks feature trajectories directly from raw event streams at high temporal resolution. These tracked feature trajectories, along with various inertial factors, are integrated into the same GP regression framework to enable asynchronous fusion. By deriving analytical residual Jacobians and noise models, our method constructs a factor graph that is iteratively optimized and pruned by a sliding-window optimizer. Comparative assessments highlight the performance of different inertial fusion strategies, suggesting optimal choices for varying conditions. Experimental results on both public datasets and our own event-inertial sequences indicate that AsynEIO outperforms existing methods, especially in high-speed and low-illumination scenarios.
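For intuition on how a GP regression framework enables asynchronous fusion, the sketch below shows the standard GP trajectory interpolation used in continuous-time estimation: the mean of the trajectory at an arbitrary query time is expressed as a combination of the two neighboring estimation states. This is a minimal 1-D illustration under an assumed white-noise-on-acceleration (constant-velocity) prior, not the paper's implementation, which operates on full poses with inertial factors; the names `gp_interpolate` and `qc` are illustrative only.

```python
import numpy as np

def Phi(dt):
    # State transition over dt for state [position, velocity]
    # under a constant-velocity (white-noise-on-acceleration) prior.
    return np.array([[1.0, dt],
                     [0.0, 1.0]])

def Q(dt, qc):
    # Accumulated process-noise covariance over dt for the same prior,
    # with scalar power spectral density qc.
    return qc * np.array([[dt**3 / 3.0, dt**2 / 2.0],
                          [dt**2 / 2.0, dt]])

def gp_interpolate(x1, x2, t1, t2, tau, qc):
    # Posterior mean of the GP trajectory at query time t1 < tau < t2,
    # given the estimation states x1 at t1 and x2 at t2.
    Psi = Q(tau - t1, qc) @ Phi(t2 - tau).T @ np.linalg.inv(Q(t2 - t1, qc))
    Lam = Phi(tau - t1) - Psi @ Phi(t2 - t1)
    return Lam @ x1 + Psi @ x2

# Two estimation states [position, velocity] spaced 0.1 s apart ...
x1 = np.array([0.00, 1.0])
x2 = np.array([0.12, 1.3])
# ... queried at an asynchronous event timestamp between them.
x_tau = gp_interpolate(x1, x2, t1=0.0, t2=0.1, tau=0.037, qc=1.0)
print(x_tau)  # interpolated [position, velocity] at the event time
```

Because the query time can be any event or inertial timestamp, measurements need not arrive at the estimation-state times; residuals at arbitrary timestamps can be tied back to a small set of states, which is the property that allows asynchronous event and inertial data to enter one factor graph.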