Certificate of Registration Media number Эл #ФС77-53688 of 17 April 2013. ISSN 2308-6033. DOI 10.18698/2308-6033

Signature method for determination of the lunar lander position by video image

Published: 23.05.2023

Authors: Bobkov A.V., Xu Yang

Published in issue: #5(137)/2023

DOI: 10.18698/2308-6033-2023-5-2278

Category: Aviation and Rocket-Space Engineering | Chapter: Aircraft Dynamics, Ballistics, Motion Control

The paper addresses the development of a visual navigation system for determining the position of a lunar lander. It proposes a new method for matching the observed image frame against a vector map of the Moon by comparing signatures. Experiments show that the proposed method runs in real time; is robust to lighting conditions, small changes in camera angle, scale, and noise; and tolerates a large number of missed detections and false positives from the crater detector. The proposed method can be used in current national and international lunar exploration programs to ensure a soft, high-precision, safe landing of the lunar lander in a designated area of the Moon.
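To illustrate the general idea of signature-based crater matching (not the authors' exact algorithm, whose details are in the full paper), the sketch below builds a rotation- and scale-invariant signature for each crater from the ratios of distances to its nearest neighbours, then matches observed craters to map craters by nearest signature. All function names and the tolerance parameter are illustrative assumptions.

```python
import numpy as np

def crater_signature(craters, idx, k=4):
    """Illustrative rotation- and scale-invariant signature for crater `idx`:
    sorted distances to its k nearest neighbours, normalised by the nearest one."""
    d = np.linalg.norm(craters - craters[idx], axis=1)
    d = np.sort(d)[1:k + 1]      # drop the zero self-distance
    return d / d[0]              # normalising removes scale dependence

def match_craters(observed, map_craters, k=4, tol=0.05):
    """Match each observed crater to the map crater with the closest
    signature; reject candidates whose signature error exceeds `tol`."""
    map_sigs = np.array([crater_signature(map_craters, j, k)
                         for j in range(len(map_craters))])
    matches = {}
    for i in range(len(observed)):
        err = np.linalg.norm(map_sigs - crater_signature(observed, i, k), axis=1)
        j = int(np.argmin(err))
        if err[j] < tol:         # gate against false matches
            matches[i] = j
    return matches
```

Because the signature depends only on distance ratios between neighbouring craters, it is unchanged by translation, rotation, and uniform scaling of the camera frame; the tolerance gate is one simple way to discard false positives from a crater detector.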

[1] Salamuniccar G., Loncaric S. Open framework for objective evaluation of crater detection algorithms with first test-field subsystem based on MOLA data. Advances in Space Research, 2008, vol. 42 (1), pp. 6–19. DOI: 10.1016/j.asr.2007.04.028
[2] Carr J.R., Sobek J.S. Digital scene matching area correlator (DSMAC). In: Image Processing for Missile Guidance, Proceedings of the Society of Photo-Optical Instrumentation Engineers, 1980, 238, pp. 36–41.
[3] Hu Tao, He Liang. Review of planetary crater detection algorithms (in Chinese). Manned Spaceflight, 2020, vol. 26 (5).
[4] Feng Junhua, Cui Hutao. Autonomous crater detection and matching on planetary surface (in Chinese). Acta Aeronautica et Astronautica Sinica, 2010, vol. 31 (9), pp. 1858–1863.
[5] Li J.F., Cui W., Baoyin H.X. A survey of autonomous navigation for deep space exploration (in Chinese). Mech. Eng., 2012, vol. 34, pp. 1–9.
[6] Johnson A., Ansar A., Matthies L., Trawny N., Mourikis A.I., Roumeliotis S.I. A general approach to terrain relative navigation for planetary landing. In: Proc. 2007 AIAA Infotech at Aerospace Conference, 2007, May, pp. 7–10.
[7] Woicke S., et al. Comparison of crater-detection algorithms for terrain-relative navigation. In: 2018 AIAA Guidance, Navigation, and Control Conference, 2018, pp. 1601.
[8] Ansar A. 2004 small body GN&C research report: Feature recognition algorithms. Small Body Guidance Navigation and Control FY 2004 RTD Annual Report (Internal Document). Pasadena, CA, Jet Propulsion Laboratory, no. D-30282 / D-30714, 2004, pp. 151–171.
[9] Singh L., Lim S. On lunar on-orbit vision-based navigation: Terrain mapping, feature tracking driven EKF. In: AIAA Guidance, Navigation and Control Conference and Exhibit, 2008, pp. 6834.
[10] Syryamkin V.I., Shidlovsky V.S. Korrelyatsionno-ekstremalnye radionavigatsionnye sistemy [Correlation-extreme radio navigation systems]. Tomsk, Tomsk University Publ., 2010, 316 p.
[11] Lowe D.G. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 2004, vol. 60 (2), pp. 91–110.
[12] Maass B., Krüger H., Theil S. An edge-free, scale-, pose- and illumination-invariant approach to crater detection for spacecraft navigation. In: 2011 7th International Symposium on Image and Signal Processing and Analysis (ISPA). Dubrovnik, Croatia, 2011, pp. 603–608.
[13] Frome A., Huber D., Kolluri R., Bulow T., Malik J. Recognizing objects in range data using regional point descriptors. In: Proc. European Conf. on Computer Vision (ECCV), 2004.
[14] Johnson A., Hebert M. Using spin images for efficient multiple model recognition in cluttered 3D scenes. In: IEEE Transactions on Pattern Analysis and Machine Intelligence, 1999, vol. 21 (5), pp. 433–449.
[15] Johnson A.E., Montgomery J.F. Overview of terrain relative navigation approaches for precise lunar landing. In: 2008 IEEE Aerospace Conference, 2008, pp. 1–10.
[16] Cheng Y., Ansar A. Landmark based position estimation for pinpoint landing on Mars. In: Proc. IEEE Int’l Conf. on Robotics and Automation (ICRA), 2005, pp. 4470–4475.
[17] Cheng Y., Johnson A., Olson C., Matthies L. Optical landmark detection for spacecraft navigation. In: Proc. 13th Annual AAS/AIAA Space Flight Mechanics Meeting. American Astronautical Society, 2003.
[18] Malinnikov V.A., Uchaev D.V., Oberst Yu. Metodika avtomatizirovannogo obnaruzheniya kraterov na poverkhnosti nebesnykh tel po ikh opticheskim izobrazheniyam [Technique for automated detection of craters on the surface of celestial bodies from their optical images]. Izvestiya vysshikh uchebnykh zavedeniy. Geodeziya i aerofotosyemka — Izvestia Vuzov. Geodesy and Aerophotosurveying, 2012, no. 6, pp. 12–18.
[19] Gaskell R. Automated landmark identification for spacecraft navigation. In: Proc. AAS/AIAA Astrodynamics Specialists Conf., AAS Paper no. 01-422. American Astronautical Society, 2001.
[20] Gaskell R. Determination of landmark topography from imaging data. In: Proc. AAS/AIAA Astrodynamics Specialists Conf. American Astronautical Society, 2002, Paper no. AAS 02-021.
[21] Wetzler P.G., Honda R., Enke B., Merline W.J., Chapman C.R., Burl M.C. Learning to detect small impact craters. In: Proc. 7th IEEE Wrksp. on Application of Computer Vision, 2005.
[22] Kamarudin N., et al. An overview of crater analyses, tests and various methods of crater detection algorithm. Frontiers in Environmental Engineering, 2012, vol. 1 (1), pp. 1–7.
[23] Wang Dong, Xing Shuai. A planetary image based automatic impact crater extraction method (in Chinese). Journal of Astronautics, 2015, vol. 36 (10).