Optical Navigation Thought Experiment

Imagine yourself floating, with your eyes closed, somewhere in the general neighborhood of the Earth, the Sun, and the Moon. You have no idea which way you’re facing, and no idea where you are.

You open your eyes, and move your head around to look all around yourself. You’re wearing sunglasses (the Sun is awfully bright in space), so the only things that you can see are the disks of the Earth, the Sun, and the Moon. You can’t see any of the features on any of those celestial bodies (you can’t tell, for instance, if you’re looking at North America or Indonesia), and you can’t see any of the dimmer stars through your sunglasses. All you see are three disks of light.

In this situation, the only information that you have is the following:

  1. You know the angular size of each of the disks
  2. You know the angular separation between each pair of disks (a number between 0 and π radians)
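The angular sizes alone already pin down your distance to each body, because the physical diameters of the Earth, Sun, and Moon are well known. A minimal sketch of that relationship, assuming the disks are measured as angular diameters in radians (the function and constant names here are illustrative, not the mission's actual code):

```python
import math

# Approximate physical diameters in kilometers (well-known values).
DIAMETERS_KM = {"earth": 12742.0, "sun": 1391400.0, "moon": 3474.8}

def distance_from_angular_size(body, angular_diameter_rad):
    """Distance to a spherical body whose disk subtends the given angle.

    A sphere of diameter D seen from distance d subtends
    theta = 2 * atan((D / 2) / d), so d = (D / 2) / tan(theta / 2).
    """
    diameter = DIAMETERS_KM[body]
    return (diameter / 2.0) / math.tan(angular_diameter_rad / 2.0)

# Sanity check: the Moon's disk spans about 0.518 degrees as seen from
# Earth, which should recover roughly the 384,400 km mean distance.
print(distance_from_angular_size("moon", math.radians(0.518)))
```

The smaller the disk appears, the farther away you are; three disks therefore give you three distances.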

You also have your cell phone, and you can use it to check the time and to call Earth. You call NASA, and they tell you the locations of the Earth, Moon, and Sun in their own frame of reference at the time you specify. They can’t tell you anything about the locations in your frame of reference, because they have no idea which way you’re oriented. You can use their information, however, to compute the side lengths of the triangle that the Earth, the Sun, and the Moon form in space.

Given the known geometry of that triangle, and your own measurements of the angular sizes of the disks and their separations, you can narrow your position down to one of two possible locations in space. These are the only two locations that agree with your view of the Earth, Sun, and Moon: one lies above the plane formed by the three celestial bodies, and the other lies below it, mirrored across that plane.
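One way to see where the two candidates come from is sphere intersection (trilateration): each angular-size measurement fixes your distance to one body, and three spheres centered on the bodies generically meet at exactly two points, mirrored across the plane the bodies define. A sketch using standard trilateration algebra, assuming you already have the three body positions from NASA and your three measured distances (this is an illustration, not the mission's actual solver):

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Intersect three spheres (centers p1..p3, radii r1..r3).

    Returns the two candidate points, which mirror each other
    across the plane through the three centers.
    """
    # Build an orthonormal frame with p1 at the origin.
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = ex.dot(p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = ey.dot(p3 - p1)
    # In-plane coordinates of the solution.
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    # Out-of-plane offset; assumes the measurements are consistent.
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    base = p1 + x * ex + y * ey
    return base + z * ez, base - z * ez

# Toy example: three "bodies" and a true observer at (3, 4, 5).
bodies = [np.array(p, float) for p in [(0, 0, 0), (10, 0, 0), (0, 10, 0)]]
truth = np.array([3.0, 4.0, 5.0])
radii = [np.linalg.norm(truth - p) for p in bodies]
above, below = trilaterate(*bodies, *radii)
print(above, below)  # one candidate at the truth, the other mirrored below
```

With real measurement noise the spheres may not intersect exactly, which is one reason a filter helps later on; the angular separations between the disks give additional constraints to check the solution against.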

NASA then flies someone out to meet you, and that person tells you your orientation relative to the Earth. This additional information lets you narrow the possible positions from two to one, because you and someone at NASA can now agree on a direction to face. You could both agree to turn so that your outstretched arms are parallel to the line connecting the Earth and the Moon, with your eyes facing in the direction that keeps the Sun in front of you (assuming a nonsingular configuration of the celestial bodies).

Before changing your orientation, you knew only that you were at one of two locations. Now that you have aligned yourself this way, you have one additional piece of information: whether you must tilt your chin down or up to look at the Sun. If you tilt your chin down, you are at the position above the plane formed by the Earth, Sun, and Moon. If you tilt it up, you are below.
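In vector terms, the chin-up/chin-down test is just a direction check: compare the Sun direction you actually measure (expressed in the now-shared frame) against the Sun direction each candidate position predicts, and keep the candidate that matches. A sketch with illustrative names (`pick_candidate` is hypothetical, not the mission's code):

```python
import numpy as np

def pick_candidate(candidates, sun_pos, measured_sun_dir):
    """Keep the candidate whose predicted Sun direction best matches
    the unit vector actually measured toward the Sun."""
    def predicted(p):
        v = sun_pos - p
        return v / np.linalg.norm(v)
    return max(candidates,
               key=lambda p: float(predicted(p).dot(measured_sun_dir)))

# Toy example: two mirror candidates, Sun far out along +x.
sun = np.array([1000.0, 0.0, 0.0])
above = np.array([3.0, 4.0, 5.0])
below = np.array([3.0, 4.0, -5.0])
# Pretend we are actually at `above` and measure the true Sun direction.
measured = (sun - above) / np.linalg.norm(sun - above)
print(pick_candidate([above, below], sun, measured))
```

The dot product rewards the candidate whose line of sight to the Sun tilts the same way yours does, which is exactly the chin test.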

This is precisely how the Cislunar Explorers navigate. Each spacecraft carries several Raspberry Pi camera modules that it uses to peer into the surrounding darkness. From this view, together with an onboard ephemeris table for the Earth, Sun, and Moon, the spacecraft can locate itself to within tens of kilometers. By folding some dynamics into an Extended Kalman Filter, it can do even better.

The team has validated this method of navigation in simulation, and is in the process of implementing the general logic described above on flight hardware. In the coming months, a hardware test will be performed with simulated Earth, Sun, and Moon disks of light.