PPF Algorithm Explained

Notes on the PPF (Point Pair Feature) algorithm

A survey of PPF-related work

  1. Point pair features built from two points and their normals

  2. Adding RGB features

  3. Edge-based PPF

  4. 3D object surface recognition

  5. Enhanced pose retrieval

  6. PPF-MEAM

The paper: "Point Pair Feature-Based Pose Estimation with Multiple Edge Appearance Models (PPF-MEAM) for Robotic Bin Picking"

The classic PPF algorithm

Application: 6D pose estimation of objects in 3D point clouds

Key idea: model globally, match locally ("Model Globally, Match Locally")

Global model description (Model Globally)

Build the PPF model offline

Extract the set of PPF features from the model point cloud.

The PPF feature formula

The feature comprises the point-pair distance, the angles between each normal and the pair vector, and the angle between the two normals: for a pair (m1, m2) with normals (n1, n2) and d = m2 - m1, F(m1, m2) = (||d||, ∠(n1, d), ∠(n2, d), ∠(n1, n2)).
After sampling enough point pairs on the model surface, the quantized features are written into a hash table, with the feature as the key and the point pair(s) as the value.

Local matching (Match Locally)

Once the global model description (Global Model Description) is defined, local matching can proceed.

Basis:

Local coordinates (Local Coordinates)

Approach:

Generalized Hough voting (Voting Scheme)

With local coordinates defined, all that remains is a method to find the optimal local coordinates, those that place the largest number of scene points on the model surface; the object pose then follows directly.

Implementation:

Voting-speed optimization (Efficient Voting Loop)

Pose clustering (Pose Clustering)

Reference code and commentary

OpenCV implementation

OpenCV implementation of the classic PPF

Documentation

Key code, explained

  1. Operator usage
// pc is the loaded point cloud of the model
// (Nx6) and pcTest is a loaded point cloud of
// the scene (Mx6)
ppf_match_3d::PPF3DDetector detector(0.03, 0.05);
detector.trainModel(pc);
vector<Pose3DPtr> results;
detector.match(pcTest, results, 1.0/10.0, 0.05);
cout << "Poses: " << endl;
// print the poses
for (size_t i=0; i<results.size(); i++)
{
  Pose3DPtr pose = results[i];
  cout << "Pose Result " << i << endl;
  pose->printPose();
}
  2. A look at the PPF3DDetector class

Initialization

PPF3DDetector::PPF3DDetector()
{
  sampling_step_relative = 0.05;     // relative sampling step for the model
  distance_step_relative = 0.05;     // relative distance discretization step
  scene_sample_step = (int)(1/0.04); // scene sampling step
  angle_step_relative = 30;          // number of angle subdivisions
  angle_step_radians = (360.0/angle_step_relative)*M_PI/180.0;
  angle_step = angle_step_radians;
  trained = false;

  hash_table = NULL;
  hash_nodes = NULL;

  setSearchParams();
}

PPF3DDetector::PPF3DDetector(const double RelativeSamplingStep, const double RelativeDistanceStep, const double NumAngles)
{
  sampling_step_relative = RelativeSamplingStep;
  distance_step_relative = RelativeDistanceStep;
  angle_step_relative = NumAngles;
  angle_step_radians = (360.0/angle_step_relative)*M_PI/180.0;
  //SceneSampleStep = 1.0/RelativeSceneSampleStep;
  angle_step = angle_step_radians;
  trained = false;

  hash_table = NULL;
  hash_nodes = NULL;

  setSearchParams();
}

void PPF3DDetector::setSearchParams(const double positionThreshold, const double rotationThreshold, const bool useWeightedClustering)
{
  if (positionThreshold<0)
    position_threshold = sampling_step_relative;
  else
    position_threshold = positionThreshold;

  if (rotationThreshold<0)
    rotation_threshold = ((360/angle_step) / 180.0 * M_PI);
  else
    rotation_threshold = rotationThreshold;

  use_weighted_avg = useWeightedClustering;
}

Parameter notes

  1. sampling_step_relative
sampling_step_relative // relative sampling step (RelativeSamplingStep)
// If RelativeSamplingStep is set to 0.05 and the model's diameter is 1 m (1000 mm),
// the points sampled from the object surface will be roughly 50 mm apart.
// Seen another way, a relative sampling step of 0.05 yields at most 20x20x20 = 8000
// model points (depending on how the model fills its volume),
// and therefore at most 8000x8000 point pairs.
// In practice the model is unevenly distributed, so there will be fewer points.
// Decreasing this value yields more model points and thus a more accurate representation.
// Values in the 0.025-0.05 range are usually sufficient for most applications; the default is 0.03.
  2. Outlier filtering on the model

It is highly advisable to remove outliers from the model and prepare a clean model from the start, because outliers feed directly into the feature computations and degrade matching accuracy.
  3. scene_sample_step (scene sampling step)

// The scene point cloud is also subsampled, controlled by RelativeSceneSampleStep,
// where scene_sample_step = (int)(1.0/RelativeSceneSampleStep).
// In other words, if RelativeSceneSampleStep = 1.0/5.0, the subsampled scene is thinned uniformly again to 1/5 of its points.
// The maximum value of this parameter is 1. Increasing it improves stability but reduces speed. Because the scene has already been sampled independently beforehand, fine-tuning this parameter is not critical.
  4. RelativeDistanceStep, the discretization step of the hash table

Point pair features are mapped into hash-table buckets. Adjusting RelativeDistanceStep changes the rate of hash collisions; in principle, more collisions mean lower precision.

Decreasing this parameter makes the discretization finer, so quantization effects grow and similar point pairs may land in different bins; increasing it merges dissimilar point pairs into the same bin and weakens the ability to discriminate between features.

In general, since the model points chosen in the sampling stage are spaced consistently, with their distance controlled by RelativeSamplingStep, RelativeDistanceStep should be set equal to that value.

Again, values in the 0.025-0.05 range are reasonable, but this time decreasing the value is not recommended when the model is dense. For noisy scenes, increasing the value improves the robustness of the matching against noisy points.

OpenCV source header (ppf_match_3d.hpp)

#ifndef __OPENCV_SURFACE_MATCHING_PPF_MATCH_3D_HPP__
#define __OPENCV_SURFACE_MATCHING_PPF_MATCH_3D_HPP__

#include <opencv2/core.hpp>

#include <vector>
#include "pose_3d.hpp"
#include "t_hash_int.hpp"

namespace cv
{
namespace ppf_match_3d
{

//! @addtogroup surface_matching
//! @{

/**
* @brief Struct, holding a node in the hashtable
*/
typedef struct THash
{
  int id;
  int i, ppfInd;
} THash;

/**
* @brief Class, allowing the load and matching 3D models.
* Typical Use:
* @code
* // Train a model
* ppf_match_3d::PPF3DDetector detector(0.05, 0.05);
* detector.trainModel(pc);
* // Search the model in a given scene
* vector<Pose3DPtr> results;
* detector.match(pcTest, results, 1.0/5.0,0.05);
* @endcode
*/
class CV_EXPORTS_W PPF3DDetector
{
public:

/**
* \brief Empty constructor. Sets default arguments
*/
CV_WRAP PPF3DDetector();

/**
* Constructor with arguments
* @param [in] relativeSamplingStep Sampling distance relative to the object's diameter. Models are first sampled uniformly in order to improve efficiency. Decreasing this value leads to a denser model, and a more accurate pose estimation but the larger the model, the slower the training. Increasing the value leads to a less accurate pose computation but a smaller model and faster model generation and matching. Beware of the memory consumption when using small values.

* @param [in] relativeDistanceStep The discretization distance of the point pair distance relative to the model's diameter. This value has a direct impact on the hashtable. Using small values would lead to too fine discretization, and thus ambiguity in the bins of hashtable. Too large values would lead to no discrimination over the feature vectors and different point pair features would be assigned to the same bin. This argument defaults to the value of RelativeSamplingStep. For noisy scenes, the value can be increased to improve the robustness of the matching against noisy points.

* @param [in] numAngles Set the discretization of the point pair orientation as the number of subdivisions of the angle. This value is the equivalent of RelativeDistanceStep for the orientations. Increasing the value increases the precision of the matching but decreases the robustness against incorrect normal directions. Decreasing the value decreases the precision of the matching but increases the robustness against incorrect normal directions. For very noisy scenes where the normal directions can not be computed accurately, the value can be set to 25 or 20.
*/
CV_WRAP PPF3DDetector(const double relativeSamplingStep, const double relativeDistanceStep=0.05, const double numAngles=30);

virtual ~PPF3DDetector();

/**
* Set the parameters for the search
* @param [in] positionThreshold Position threshold controlling the similarity of translations. Depends on the units of calibration/model.

* @param [in] rotationThreshold Rotation threshold controlling the similarity of rotations. This parameter can be perceived as a threshold over the difference of angles
* @param [in] useWeightedClustering The algorithm by default clusters the poses without weighting. A non-zero value would indicate that the pose clustering should take into account the number of votes as the weights and perform a weighted averaging instead of a simple one.
*/
void setSearchParams(const double positionThreshold=-1, const double rotationThreshold=-1, const bool useWeightedClustering=false);

/**
* \brief Trains a new model.
*
* @param [in] Model The input point cloud with normals (Nx6)
*
* \details Uses the parameters set in the constructor to downsample and learn a new model. When the model is learnt, the instance gets ready for calling "match".
*/
CV_WRAP void trainModel(const Mat& Model);

/**
* \brief Matches a trained model across a provided scene.
*
* @param [in] scene Point cloud for the scene
* @param [out] results List of output poses
* @param [in] relativeSceneSampleStep The ratio of scene points to be used for the matching after sampling with relativeSceneDistance. For example, if this value is set to 1.0/5.0, every 5th point from the scene is used for pose estimation. This parameter allows an easy trade-off between speed and accuracy of the matching. Increasing the value leads to less points being used and in turn to a faster but less accurate pose computation. Decreasing the value has the inverse effect.
* @param [in] relativeSceneDistance Set the distance threshold relative to the diameter of the model. This parameter is equivalent to relativeSamplingStep in the training stage. This parameter acts like a prior sampling with the relativeSceneSampleStep parameter.
*/
CV_WRAP void match(const Mat& scene, CV_OUT std::vector<Pose3DPtr> &results, const double relativeSceneSampleStep=1.0/5.0, const double relativeSceneDistance=0.03);

void read(const FileNode& fn);
void write(FileStorage& fs) const;

protected:

double angle_step, angle_step_radians, distance_step;
double sampling_step_relative, angle_step_relative, distance_step_relative;
Mat sampled_pc, ppf;
int num_ref_points;
hashtable_int* hash_table;
THash* hash_nodes;

double position_threshold, rotation_threshold;
bool use_weighted_avg;

int scene_sample_step;

void clearTrainingModels();

private:
void computePPFFeatures(const Vec3d& p1, const Vec3d& n1,
const Vec3d& p2, const Vec3d& n2,
Vec4d& f);

bool matchPose(const Pose3D& sourcePose, const Pose3D& targetPose);

void clusterPoses(std::vector<Pose3DPtr>& poseList, int numPoses, std::vector<Pose3DPtr> &finalPoses);

bool trained;
};

//! @}

} // namespace ppf_match_3d

} // namespace cv
#endif

  • Copyright notice: unless otherwise stated, the articles on this blog are the author's own work; please credit the source when reposting.
  • Copyrights © 2020-2023 cyg