Point cloud downsampling is a crucial pre-processing step that reduces the number of points in order to, among other goals, unify data size and lower computational cost. Recent research on point cloud downsampling, which concentrates on learning to sample in a task-aware way, has achieved great success. However, existing learnable samplers cannot directly perform arbitrary-size downsampling and assume a fixed input size. In this paper, we introduce AS-PD, a novel task-aware sampling framework that directly downsamples point clouds to any smaller size based on a sample-to-refine strategy. Given an input point cloud of arbitrary size, we first perform a task-agnostic pre-sampling of the input to a specified sample size. We then obtain the sampled set by refining the pre-sampled set to make it task-aware, driven by downstream task losses. The refinement adds a small offset to each pre-sampled point, predicted by point-wise multi-layer perceptrons (MLPs). With density encoding and a proper training scheme, the framework learns to adaptively downsample point clouds of different input sizes to arbitrary sample sizes. We evaluate the sampled results on classification and registration tasks. The proposed AS-PD surpasses the state-of-the-art method in terms of downstream performance. Further experiments also show that AS-PD exhibits better generality to unseen task models, implying that the proposed sampler is optimized to the task rather than to a specific task model.
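The sample-to-refine strategy described above can be sketched in a few lines. The following is a minimal, hypothetical illustration only: the pre-sampling here is a random subset (the task-agnostic stage could equally be farthest-point sampling), the MLP weights are random placeholders rather than parameters trained against downstream task losses, and the layer sizes and offset scaling are assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_presample(points, m):
    # Task-agnostic pre-sampling placeholder: pick m points uniformly at
    # random (a real pipeline might use farthest-point sampling instead).
    idx = rng.choice(points.shape[0], size=m, replace=False)
    return points[idx]

def pointwise_mlp_offsets(points, w1, b1, w2, b2):
    # A tiny shared MLP applied to each point independently. In the actual
    # framework these weights would be learned from downstream task losses;
    # here they are random, so the offsets are meaningless but shape-correct.
    h = np.maximum(points @ w1 + b1, 0.0)          # ReLU hidden layer
    offsets = 0.01 * np.tanh(h @ w2 + b2)          # small, bounded offsets
    return points + offsets

cloud = rng.random((1024, 3))                      # input cloud of N = 1024 points
pre = random_presample(cloud, 256)                 # arbitrary sample size M = 256

w1, b1 = rng.standard_normal((3, 64)), np.zeros(64)  # placeholder weights
w2, b2 = rng.standard_normal((64, 3)), np.zeros(3)
refined = pointwise_mlp_offsets(pre, w1, b1, w2, b2)
print(refined.shape)  # (256, 3): same count as the pre-sampled set
```

Because the refinement only perturbs each pre-sampled point by a small offset, the sampled set keeps exactly the requested size M while the (trained) MLP can nudge points toward task-relevant locations.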