How to Make a Flying Robot

A blog collecting the technologies needed to build robots, with occasional thoughts on where robots are headed.

Studying Open3D (Part 2)

What I want to do

I want to freely manipulate data captured with a depth sensor using Open3D.

f:id:robonchu:20180224200259p:plain

Textbook

Open3D: A Modern Library for 3D Data Processing — Open3D 0.1 dev documentation

RGBD Odometry

[Park2017]: J. Park, Q.-Y. Zhou, and V. Koltun, Colored Point Cloud Registration Revisited, ICCV, 2017. https://pdfs.semanticscholar.org/e668/cdf876884106948c7151601785023f04ed40.pdf

The implementation described in this paper can be used as follows:

    import numpy as np
    from py3d import *

    # source_rgbd_image, target_rgbd_image and pinhole_camera_intrinsic are
    # created earlier in the tutorial from RGBD images and camera parameters.
    option = OdometryOption()
    odo_init = np.identity(4)
    print(option)

    [success_hybrid_term, trans_hybrid_term, info] = compute_rgbd_odometry(
            source_rgbd_image, target_rgbd_image,
            pinhole_camera_intrinsic, odo_init,
            RGBDOdometryJacobianFromHybridTerm(), option)


There is another implementation (a different Jacobian term) as well, but this hybrid one seems to be the recommended choice.
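For reference, here is a minimal sketch of calling the other variant, assuming it is the color-term Jacobian (RGBDOdometryJacobianFromColorTerm) and reusing the same variables as in the snippet above:

    # Sketch: swap in the photometric (color-term) Jacobian instead of the
    # hybrid one; all other arguments stay the same as in the snippet above.
    [success_color_term, trans_color_term, info_color] = compute_rgbd_odometry(
            source_rgbd_image, target_rgbd_image,
            pinhole_camera_intrinsic, odo_init,
            RGBDOdometryJacobianFromColorTerm(), option)

The block below goes back to visualizing the hybrid result.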

if success_hybrid_term:
    print("Using Hybrid RGB-D Odometry")
    print(trans_hybrid_term)
    # Re-project the source RGBD image to a point cloud and move it with the
    # estimated transform; target_pcd is built from target_rgbd_image earlier.
    source_pcd_hybrid_term = create_point_cloud_from_rgbd_image(
            source_rgbd_image, pinhole_camera_intrinsic)
    source_pcd_hybrid_term.transform(trans_hybrid_term)
    draw_geometries([target_pcd, source_pcd_hybrid_term])

The return value trans_hybrid_term is shown below.

It is a 4x4 homogeneous transformation matrix for 3D coordinates:

Using Hybrid RGB-D Odometry
[[  9.99994666e-01  -1.00290715e-03  -3.10826763e-03  -3.75410348e-03]
 [  9.64492959e-04   9.99923448e-01  -1.23356675e-02   2.54977516e-02]
 [  3.12040122e-03   1.23326038e-02   9.99919082e-01   1.88139799e-03]
 [  0.00000000e+00   0.00000000e+00   0.00000000e+00   1.00000000e+00]]
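Since it is a homogeneous transform, the rotation and translation can be pulled out with plain numpy slicing. A minimal sketch, assuming trans_hybrid_term from above:

import numpy as np

T = np.asarray(trans_hybrid_term)
R = T[:3, :3]   # upper-left 3x3 block: rotation matrix
t = T[:3, 3]    # last column: translation vector
print("rotation:")
print(R)
print("translation:", t)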


Check the details of compute_rgbd_odometry:

help(compute_rgbd_odometry)

Result:

Help on built-in function compute_rgbd_odometry in module py3d:

compute_rgbd_odometry(...)
    compute_rgbd_odometry(rgbd_source: py3d.RGBDImage, rgbd_target: py3d.RGBDImage, pinhole_camera_intrinsic: py3d.PinholeCameraIntrinsic=PinholeCameraIntrinsic with width = -1 and height = -1.
    Access intrinsics with intrinsic_matrix., odo_init: numpy.ndarray[float64[4, 4]]=array([[1., 0., 0., 0.],
           [0., 1., 0., 0.],
           [0., 0., 1., 0.],
           [0., 0., 0., 1.]]), jacobian: py3d.RGBDOdometryJacobian=RGBDOdometryJacobianFromHybridTerm, option: py3d.OdometryOption=OdometryOption class.
    minimum_correspondence_ratio = 0.100000
    iteration_number_per_pyramid_level = [ 20, 10, 5, ] 
    max_depth_diff = 0.030000
    min_depth = 0.000000
    max_depth = 4.000000) 

-> Tuple[bool, numpy.ndarray[float64[4, 4]], numpy.ndarray[float64[6, 6]]]
    
    Function to estimate 6D rigid motion from two RGBD image pairs

KD tree

We pick the 1500-th point as the anchor point and paint it red.

# src/Python/Tutorial/Basic/kdtree.py

import sys
import numpy as np
sys.path.append("../..")
from py3d import *

if __name__ == "__main__":

    print("Testing kdtree in py3d ...")
    print("Load a point cloud and paint it gray.")
    pcd = read_point_cloud("../../TestData/Feature/cloud_bin_0.pcd")
    pcd.paint_uniform_color([0.5, 0.5, 0.5])
    pcd_tree = KDTreeFlann(pcd)

    print("Paint the 1500th point red.")
    pcd.colors[1500] = [1, 0, 0]

    print("Find its 200 nearest neighbors, paint blue.")
    [k, idx, _] = pcd_tree.search_knn_vector_3d(pcd.points[1500], 200)
    np.asarray(pcd.colors)[idx[1:], :] = [0, 0, 1]

    print("Find its neighbors with distance less than 0.2, paint green.")
    [k, idx, _] = pcd_tree.search_radius_vector_3d(pcd.points[1500], 0.2)
    np.asarray(pcd.colors)[idx[1:], :] = [0, 1, 0]

    print("Visualize the point cloud.")
    draw_geometries([pcd])
    print("")

The neighbor search result looks like this:

f:id:robonchu:20180225112642p:plain
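The same KDTreeFlann can also be built from your own numpy points by wrapping them in a PointCloud first. A minimal sketch with random data (the query point and k are arbitrary):

import numpy as np
from py3d import *

# Wrap a random (N, 3) numpy array in a PointCloud.
xyz = np.random.rand(1000, 3)
pcd = PointCloud()
pcd.points = Vector3dVector(xyz)

# Build the KD tree and query the 10 nearest neighbors of an arbitrary point.
pcd_tree = KDTreeFlann(pcd)
[k, idx, dist2] = pcd_tree.search_knn_vector_3d(np.array([0.5, 0.5, 0.5]), 10)
print("neighbor indices:", list(idx))
print("squared distances:", list(dist2))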

Iterative Closest Point

What ICP is: a method for aligning (registering) two point clouds.

It seems the initial transformation is important for getting a good fit.

And apparently a good initial value can be obtained with a global registration algorithm.

The initial alignment is usually obtained by a global registration algorithm.

See Global registration for examples (http://www.open3d.org/docs/tutorial/Advanced/global_registration.html#global-registration).


The Open3D tutorial uses both "point-to-point ICP" and "point-to-plane ICP".

The input are two point clouds and an initial transformation that roughly aligns the source point cloud to the target point cloud.

The output is a refined transformation that tightly aligns the two point clouds.

# src/Python/Tutorial/Basic/icp_registration.py

import sys, copy
import numpy as np
sys.path.append("../..")
from py3d import *

def draw_registration_result(source, target, transformation):
    source_temp = copy.deepcopy(source)
    target_temp = copy.deepcopy(target)
    source_temp.paint_uniform_color([1, 0.706, 0])
    target_temp.paint_uniform_color([0, 0.651, 0.929])
    source_temp.transform(transformation)
    draw_geometries([source_temp, target_temp])

if __name__ == "__main__":
    source = read_point_cloud("../../TestData/ICP/cloud_bin_0.pcd")
    target = read_point_cloud("../../TestData/ICP/cloud_bin_1.pcd")
    threshold = 0.02
    trans_init = np.asarray(
                [[0.862, 0.011, -0.507,  0.5],
                [-0.139, 0.967, -0.215,  0.7],
                [0.487, 0.255,  0.835, -1.4],
                [0.0, 0.0, 0.0, 1.0]])
    draw_registration_result(source, target, trans_init)
    print("Initial alignment")
    evaluation = evaluate_registration(source, target,
            threshold, trans_init)
    print(evaluation)

    print("Apply point-to-point ICP")
    reg_p2p = registration_icp(source, target, threshold, trans_init,
            TransformationEstimationPointToPoint())
    print(reg_p2p)
    print("Transformation is:")
    print(reg_p2p.transformation)
    print("")
    draw_registration_result(source, target, reg_p2p.transformation)

    print("Apply point-to-plane ICP")
    reg_p2l = registration_icp(source, target, threshold, trans_init,
            TransformationEstimationPointToPlane())
    print(reg_p2l)
    print("Transformation is:")
    print(reg_p2l.transformation)
    print("")
    draw_registration_result(source, target, reg_p2l.transformation)

f:id:robonchu:20180225115139p:plain

Point to Point ICP

In general, the ICP algorithm iterates over two steps:

・ Find correspondence set K = {(p, q)} from the target point cloud P and the source point cloud Q transformed with the current transformation matrix T.

・ Update the transformation T by minimizing an objective function E(T) defined over the correspondence set K.
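The point-to-point objective, as I understand it from the tutorial, is the sum of squared distances between corresponding points:

E(T) = \sum_{(p, q) \in K} \| p - T q \|^2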

f:id:robonchu:20180225115231p:plain

Increasing the number of iterations improves the fitness (how well the clouds match).
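If I remember the tutorial correctly, the iteration count can be raised through ICPConvergenceCriteria. A minimal sketch, assuming the same variables as in icp_registration.py above (the value 2000 is arbitrary):

reg_p2p_long = registration_icp(source, target, threshold, trans_init,
        TransformationEstimationPointToPoint(),
        ICPConvergenceCriteria(max_iteration = 2000))
print(reg_p2p_long)
print(reg_p2p_long.transformation)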

Point to Plane ICP

It apparently converges faster than point-to-point ICP:

[Rusinkiewicz2001] has shown that the point-to-plane ICP algorithm has a faster convergence speed than the point-to-point ICP algorithm.
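As I understand it, the point-to-plane objective measures the residual along the target point's normal n_p, which is why the target cloud needs normals:

E(T) = \sum_{(p, q) \in K} \big( (p - T q) \cdot n_p \big)^2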

f:id:robonchu:20180225115312p:plain

Working with numpy

numpy array -> Open3D types

Open3D types -> numpy array

Both conversions can be done easily.

# src/Python/Tutorial/Basic/working_with_numpy.py

import sys, copy
import numpy as np
sys.path.append("../..")
from py3d import *

if __name__ == "__main__":

    # generate a neat n x 3 matrix using a variant of the sinc function
    x = np.linspace(-3, 3, 401)
    mesh_x, mesh_y = np.meshgrid(x,x)
    z = np.sinc((np.power(mesh_x,2)+np.power(mesh_y,2)))
    xyz = np.zeros((np.size(mesh_x),3))
    xyz[:,0] = np.reshape(mesh_x,-1)
    xyz[:,1] = np.reshape(mesh_y,-1)
    xyz[:,2] = np.reshape(z,-1)
    print('xyz')
    print(xyz)

    # Pass xyz to Open3D.PointCloud and visualize
    pcd = PointCloud()
    pcd.points = Vector3dVector(xyz)
    write_point_cloud("../../TestData/sync.ply", pcd)

    # Load saved point cloud and transform it into NumPy array
    pcd_load = read_point_cloud("../../TestData/sync.ply")
    xyz_load = np.asarray(pcd_load.points)
    print('xyz_load')
    print(xyz_load)

    # visualization
    draw_geometries([pcd_load])

Output of the print statements:

xyz
[[-3.00000000e+00 -3.00000000e+00 -3.89817183e-17]
 [-2.98500000e+00 -3.00000000e+00 -4.94631078e-03]
 [-2.97000000e+00 -3.00000000e+00 -9.52804798e-03]
 ...
 [ 2.97000000e+00  3.00000000e+00 -9.52804798e-03]
 [ 2.98500000e+00  3.00000000e+00 -4.94631078e-03]
 [ 3.00000000e+00  3.00000000e+00 -3.89817183e-17]]
Writing PLY: [========================================] 100%
Reading PLY: [========================================] 100%
xyz_load
[[-3.00000000e+00 -3.00000000e+00 -3.89817183e-17]
 [-2.98500000e+00 -3.00000000e+00 -4.94631078e-03]
 [-2.97000000e+00 -3.00000000e+00 -9.52804798e-03]
 ...
 [ 2.97000000e+00  3.00000000e+00 -9.52804798e-03]
 [ 2.98500000e+00  3.00000000e+00 -4.94631078e-03]
 [ 3.00000000e+00  3.00000000e+00 -3.89817183e-17]]

f:id:robonchu:20180225120527p:plain
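The same Vector3dVector pattern should work for per-point colors as well. A minimal sketch with made-up random colors, reusing the sync.ply written above:

import numpy as np
from py3d import *

pcd = read_point_cloud("../../TestData/sync.ply")

# Attach random RGB colors (values in [0, 1]) to every point ...
rgb = np.random.rand(np.asarray(pcd.points).shape[0], 3)
pcd.colors = Vector3dVector(rgb)

# ... and read them back as a plain numpy array.
print(np.asarray(pcd.colors))
draw_geometries([pcd])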

Hmm, this is all really easy to understand!