opencv - trying to compute the essential matrix using skimage or cv2

Asked by c3frrgcw on 2023-03-03

I am trying to understand how to use cv2.findEssentialMat (or the skimage variant). I have a 3D model with known world coordinates and two calibrated cameras. I computed the system's true essential matrix, and my goal is to project the known 3D model points into each camera and then recover that known essential matrix with cv2 (or skimage).
Normally the essential matrix is computed from image correspondences because the 3D structure is unknown. But since the structure is known here, I can simply compute the relative pose between the cameras as ΔE = E_r @ E_l^-1 and then the essential matrix as E = T_x @ R, where @ denotes matrix multiplication, T_x is the skew-symmetric matrix associated with the normalized translation vector, and R is the rotation matrix.
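In symbols, writing E_l and E_r for the 4×4 left/right extrinsics and t_x, t_y, t_z for the components of the normalized relative translation, this is the standard construction just described:

\Delta E = E_r E_l^{-1}, \qquad E = [t]_\times R, \qquad [t]_\times = \begin{pmatrix} 0 & -t_z & t_y \\ t_z & 0 & -t_x \\ -t_y & t_x & 0 \end{pmatrix}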
After confirming that this computation is correct, I project the world points into each camera and compute the essential matrix from the image coordinates with cv2.findEssentialMat. I expected my essential matrix to match the one computed by cv2.findEssentialMat, but the two are very different. Does anyone know what I am missing?
Here is my setup:

import numpy as np
import cv2

def apply_rt(pts, rt):
    # Apply a 4x4 rigid transform (rotation + translation) to Nx3 points
    return (rt[:3, :3] @ pts.T).T + rt[:3, -1]

def project_points(xyzs, intrinsic, extrinsic, should_normalize=False):
    # Project Nx3 world points into the camera and dehomogenize
    projected_points_cr = (intrinsic @ apply_rt(xyzs, extrinsic).T).T
    divisors = projected_points_cr[:, 2:]
    projected_points_cr = projected_points_cr / divisors
    if should_normalize:
        # Undo the intrinsics to get normalized (calibrated) image coordinates,
        # so that an identity camera matrix can be passed to cv2.findEssentialMat
        projected_points_cr = (np.linalg.inv(intrinsic) @ projected_points_cr.T).T
    return projected_points_cr[:, :2]

xyzs_world = np.array([
       [73.008575, 52.755592, 39.83713 ],
       [72.075424, 47.25908 , 40.036087],
       [72.20231 , 45.843777, 40.351246],
       [73.43591 , 54.131153, 39.584625],
       [74.108826, 44.653885, 42.031307],
       [77.388   , 49.497707, 42.064713],
       [77.185585, 48.65105 , 42.162895],
       [77.388245, 43.62944 , 42.370674],
       [77.02157 , 53.728027, 41.259354],
       [74.88333 , 54.320747, 40.528275],
       [76.66945 , 43.3657  , 42.48002 ],
       [71.57631 , 51.054092, 38.390312],
       [71.905975, 52.52903 , 38.386307],
       [72.099724, 48.342022, 39.933678],
       [73.300545, 45.25413 , 41.518044],
       [75.36074 , 54.748974, 40.528008],
       [74.20519 , 53.809402, 40.38417 ],
       [74.10402 , 48.007614, 41.790726],
       [71.70616 , 48.58014 , 39.25619 ],
       [77.78094 , 45.657528, 42.211655],
       [74.894455, 44.303238, 42.315697],
       [72.35703 , 51.532913, 39.52483 ],
       [77.83151 , 53.58341 , 41.273026],
       [73.03928 , 51.10847 , 40.470455],
       [77.50123 , 48.578613, 42.125782],
       [74.307816, 55.139206, 39.728397],
       [74.199196, 48.10794 , 41.825573],
       [72.19932 , 50.435875, 39.648094],
       [71.919846, 45.61991 , 39.96143 ],
       [77.08149 , 43.011536, 42.4573  ],
       [76.36457 , 46.383766, 42.35271 ],
       [74.79315 , 48.619663, 41.99683 ],
       [74.39389 , 52.70309 , 40.95705 ],
       [73.83906 , 44.51386 , 41.910065],
       [72.639786, 44.48964 , 40.993587],
       [75.381966, 45.770576, 42.32749 ],
       [78.57214 , 47.213417, 41.966934],
       [77.5113  , 46.790035, 42.220528],
       [73.846436, 46.48003 , 41.783493],
       [74.31437 , 53.745804, 40.484417],
       [74.80585 , 44.500835, 42.28195 ],
       [76.932724, 42.753857, 42.496098],
       [74.27546 , 47.899445, 41.879692],
       [71.30465 , 45.965378, 38.79394 ],
       [78.21704 , 43.769154, 42.18486 ],
       [76.79267 , 52.646767, 41.56925 ],
       [72.67541 , 54.200233, 38.643562],
       [78.73034 , 50.631893, 41.681744],
       [74.487144, 51.408913, 41.39855 ],
       [74.22435 , 51.41863 , 41.266033]], dtype=np.float32)

intrinsics_right = np.array([
       [2549.682106, 0.000000, 1193.621053],
       [0.000000, 2588.379958, 632.521320],
       [0.000000, 0.000000, 1.000000]])
intrinsics_left = np.array([
       [2514.183473, 0.000000, 1421.766881],
       [0.000000, 2530.292937, 1017.599687],
       [0.000000, 0.000000, 1.000000]])

extrinsics_right = np.array([
        [0.908730, 0.031275, -0.416211, -52.430834],
        [-0.046736, -0.983293, -0.175928, 50.173038],
        [-0.414760, 0.179323, -0.892086, 179.569364],
        [0.000000, 0.000000, 0.000000, 1.000000]])

extrinsics_left = np.array([
        [0.908264, -0.017047, 0.418049, -66.914767],
        [-0.008166, -0.999702, -0.023024, 22.166049],
        [0.418317, 0.017498, -0.908133, 130.945395],
        [0.000000, 0.000000, 0.000000, 1.000000]])

rel_pose = extrinsics_right @ np.linalg.inv(extrinsics_left)
X_r = apply_rt(xyzs_world, extrinsics_right)
X_l = apply_rt(xyzs_world, extrinsics_left)

# Skew-symmetric matrix of the normalized relative translation
t = rel_pose[:3, 3] / np.linalg.norm(rel_pose[:3, 3])
T_x = np.array([[0.0, -t[2], t[1]],
                [t[2], 0.0, -t[0]],
                [-t[1], t[0], 0.0]])

essential_gt = T_x @ rel_pose[:3, :3]
# result is:
# np.array([[0.005759, -0.415231, -0.020870],
#       [-0.415659, -0.153303, 0.895284],
#       [0.059779, -0.896665, -0.147390]])

# Residual of the epipolar constraint X_r^T @ E @ X_l = 0 over all points
epipolar_constraint = np.linalg.norm(np.einsum('ij,ij->i', X_r, (essential_gt @ X_l.T).T))
print(epipolar_constraint)  # prints ~1e-11

The value of epipolar_constraint is about 1e-11, which suggests the ground-truth computation is correct. Next, I project the world points into each camera so that I have corresponding image points, and then call cv2.findEssentialMat. Following the OpenCV documentation, I normalize the image coordinates and pass the identity matrix in place of the camera matrix (intrinsics), since my two cameras are different.

rcs_l = project_points(xyzs_world, intrinsics_left, extrinsics_left, should_normalize=True)
rcs_r = project_points(xyzs_world, intrinsics_right, extrinsics_right, should_normalize=True)
# Identity camera matrix, since the coordinates are already normalized
essential, inliers = cv2.findEssentialMat(rcs_l, rcs_r, np.eye(3))
# result is:
# np.array([[-0.000108, 0.000701, -0.394863],
#       [-0.000634, 0.000407, -0.586585],
#       [0.423443, 0.566300, 0.000232]])

Answer #1 (by wwwo4jvm)

As it turns out, essential also satisfies the epipolar constraint, on both the image points and the 3D points, and the two matrices are in fact the same up to scale: they differ only in the singular values of their respective SVDs. Decomposing essential into its rotation and translation components, T_x, R = decompose(essential), and multiplying them back together gives np.linalg.norm(T_x @ R - essential_gt) ≈ 0.
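A minimal sketch of that check, assuming the essential and essential_gt arrays from the question; the skew() helper is introduced here for illustration, and cv2.decomposeEssentialMat stands in for the decompose() step above:

def skew(t):
    # Skew-symmetric matrix [t]_x such that skew(t) @ v == np.cross(t, v)
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

R1, R2, t = cv2.decomposeEssentialMat(essential)
t = t.ravel()

# Four candidate reconstructions (two rotations, two translation signs);
# compare each against the ground truth after scaling both to unit Frobenius norm
gt = essential_gt / np.linalg.norm(essential_gt)
residuals = [np.linalg.norm(skew(s * t) @ R / np.linalg.norm(skew(s * t) @ R) - gt)
             for R in (R1, R2) for s in (1.0, -1.0)]
print(min(residuals))  # expected to be ~0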
