Terrain relative navigation enhanced with SAR for Moon's shadowed regions

Moeko Hidaka, Masaki Takahashi, Takayuki Ishida, Seisuke Fukuda

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This study proposes a terrain relative navigation (TRN) method that estimates spacecraft position with high precision in the shadowed regions of the Moon by combining camera images with synthetic aperture radar (SAR) data. TRN, which estimates the position of a spacecraft by matching observed terrain against preliminary terrain information stored in a database, is an effective method for correcting the drift errors of inertial measurement units (IMUs). Optical cameras are the primary TRN sensors for observing terrain; however, images of shadowed terrain lack sufficient feature points to match against the database. SAR is a form of radar that can generate high-resolution, photo-like images regardless of lighting conditions. In this study, we propose using SAR as a navigation sensor and demonstrate its effectiveness.
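As background, the database-matching step that TRN relies on can be illustrated with a normalized cross-correlation search: an observed image patch is slid over a stored reference terrain map, and the offset with the highest correlation is taken as the position estimate. The sketch below is a toy illustration of this general idea, not the paper's actual algorithm; the synthetic random terrain stands in for real SAR or camera data.

```python
import numpy as np

def trn_match(reference_map, observed_patch):
    """Toy TRN matcher: exhaustive normalized cross-correlation of an
    observed patch against a stored reference map. Returns the (row, col)
    offset with the highest correlation score."""
    H, W = reference_map.shape
    h, w = observed_patch.shape
    p = observed_patch - observed_patch.mean()
    p /= (np.linalg.norm(p) + 1e-12)
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            win = reference_map[i:i + h, j:j + w]
            q = win - win.mean()
            q /= (np.linalg.norm(q) + 1e-12)
            score = float((p * q).sum())  # cosine similarity of the two patches
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos, best_score

# Synthetic example: extract a patch at a known offset and recover it.
rng = np.random.default_rng(0)
terrain = rng.standard_normal((64, 64))   # stand-in for a reference terrain map
patch = terrain[20:36, 33:49]             # "observation" taken at offset (20, 33)
pos, score = trn_match(terrain, patch)
print(pos)  # → (20, 33): exact crop correlates perfectly at the true offset
```

In practice the correlation would be computed efficiently (e.g. via FFT) and the match fused with IMU propagation in a filter, but the brute-force search above captures the core position-fixing idea.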

Original language: English
Title of host publication: AIAA Scitech 2020 Forum
Publisher: American Institute of Aeronautics and Astronautics Inc. (AIAA)
ISBN (Print): 9781624105951
Publication status: Published - 2020
Event: AIAA Scitech Forum, 2020 - Orlando, United States
Duration: 2020 Jan 6 – 2020 Jan 10

Publication series

Name: AIAA Scitech 2020 Forum
Volume: 1 PartF

Conference

Conference: AIAA Scitech Forum, 2020
Country: United States
City: Orlando
Period: 20/1/6 – 20/1/10

ASJC Scopus subject areas

  • Aerospace Engineering
