Journal of System Simulation ›› 2021, Vol. 33 ›› Issue (7): 1638-1646.doi: 10.16182/j.issn1004731x.joss.20-0218


Position and Attitude Estimation Based on Combination Matching in Calibration Area

Cai Peng1,2, Shen Chaoping1,2, Li Hongyan1,2   

  1. Aeronautical Engineering Institute, Jiangsu Aviation Technical College, Zhenjiang 212134, China;
    2. Zhenjiang Key Laboratory of UAV Application Technology, Jiangsu Aviation Technical College, Zhenjiang 212134, China
  • Received:2020-04-28 Revised:2020-06-03 Online:2021-07-18 Published:2021-07-20

Abstract: Scene-matching visual navigation normally requires hardware to measure the camera's distance and attitude. A method of position and attitude estimation based on a combination of feature points in a calibration area is proposed. By selecting the best set of Scale-Invariant Feature Transform (SIFT) feature matching points between the real-time image and the calibration area of the reference image, computing the local ground coordinates of the matching points by linear interpolation inside the calibration triangle, and then applying space resection, the method obtains the camera position and attitude of the real-time image in software alone. This avoids the need for hardware measurement of the camera's distance and attitude and broadens the applicability of scene-matching visual navigation. Experimental results show that the position and attitude computed for the real-time image are close to the true values.
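The interpolation step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each calibration point has known image (pixel) coordinates and known local ground coordinates, and uses barycentric weights as the linear interpolation inside the triangle; the function names are hypothetical.

```python
# Hypothetical sketch of triangle-interior linear interpolation:
# given three calibration points with known image and ground coordinates,
# estimate the ground coordinates of a matched feature point inside the triangle.

def barycentric_weights(p, a, b, c):
    """Barycentric weights of point p w.r.t. triangle (a, b, c) in the image plane."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)  # twice the signed triangle area
    w_a = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    w_b = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    w_c = 1.0 - w_a - w_b
    return w_a, w_b, w_c

def interpolate_ground(p_img, tri_img, tri_ground):
    """Linearly interpolate the local ground coordinates of p_img.

    tri_img:    three image-plane points (the calibration triangle's vertices).
    tri_ground: the corresponding local ground coordinates of those vertices.
    """
    w = barycentric_weights(p_img, *tri_img)
    n_dims = len(tri_ground[0])
    return tuple(
        sum(wi * g[k] for wi, g in zip(w, tri_ground)) for k in range(n_dims)
    )

# Example: the midpoint of edge b-c in the image maps to the midpoint of the
# corresponding ground edge.
ground = interpolate_ground(
    (5.0, 5.0),
    ((0.0, 0.0), (10.0, 0.0), (0.0, 10.0)),
    ((0.0, 0.0, 0.0), (100.0, 0.0, 0.0), (0.0, 100.0, 0.0)),
)
# ground == (50.0, 50.0, 0.0)
```

With ground coordinates assigned to enough matched points this way, the camera pose can then be recovered by space resection (e.g. a perspective-n-point solver), as the abstract outlines.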

Key words: position and attitude estimation, Scale-Invariant Feature Transform(SIFT), feature matching, calibration area, combination matching of feature points, space resection
