CAVU Aerospace UK

Onboard Image Processing on Ingenuity: Pioneering flight on a world beyond Earth. Case study: Lost Navigation Image on Flight Six

Ingenuity must fly autonomously on Mars, because the signal delay from Earth is 3–22 minutes each way. This requires the helicopter to determine its position, velocity, and orientation in real time using onboard sensors. The navigation camera (Navcam) is the primary source of visual information for this process.


  1. Vision-Based Navigation: Visual Odometry
  • Navcam Role:
    • Downward-facing black-and-white camera captures high-contrast images of the terrain below.
    • The camera has a wide field of view to maximize ground feature detection.
  • Feature Detection:
    • The onboard computer identifies distinct points on the ground (rocks, shadows, surface textures) called “features.”
    • These features are tracked across successive frames.
  • Motion Estimation:
    • By comparing the displacement of features from one frame to the next, Ingenuity estimates its relative movement across the surface.
    • This process is called visual odometry and yields a 2D displacement and velocity estimate relative to the Martian terrain (a minimal sketch follows this list).
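The core of this loop can be illustrated in a few lines of Python using OpenCV's corner detector and Lucas-Kanade tracker. This is a minimal sketch, not Ingenuity's flight software (which is written in C++); the focal length, frame interval, and tuning parameters are assumed values for illustration.

```python
import cv2
import numpy as np

def estimate_displacement(prev_gray, curr_gray, altitude_m,
                          focal_px=300.0, dt_s=0.033):
    """Estimate 2D ground displacement and velocity between two frames.

    focal_px and dt_s are illustrative values, not the real Navcam's
    intrinsics or frame rate.
    """
    # Detect distinct, high-contrast "features" (rocks, shadows, textures).
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                  qualityLevel=0.01, minDistance=10)
    if pts is None:
        return None
    # Track the same features into the next frame (Lucas-Kanade optical flow).
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                 pts, None)
    ok = status.flatten() == 1
    if ok.sum() < 8:
        return None  # too few tracked features for a trustworthy estimate
    # Median pixel displacement is robust to a handful of bad tracks.
    flow_px = np.median(nxt[ok] - pts[ok], axis=0).flatten()
    # Pinhole projection: pixel motion -> metres of ground motion at altitude.
    disp_m = flow_px * altitude_m / focal_px
    return disp_m, disp_m / dt_s  # displacement (m) and velocity (m/s)
```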


  2. Sensor Fusion: Visual-Inertial Odometry
  • Inertial Measurement Unit (IMU):
    • Measures angular rates (gyroscope) and accelerations (accelerometer) at very high frequency (up to 1000 Hz).
  • Data Fusion:
    • IMU data provides rapid, high-frequency motion information but drifts over time.
    • Navcam visual data arrives more slowly (tens of frames per second) but anchors the estimate to tracked ground features, bounding the IMU drift.
    • An Extended Kalman Filter (EKF) or similar estimator fuses the IMU and camera data into accurate real-time estimates of position, velocity, and attitude (a toy fusion filter is sketched below).
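The division of labour between the two sensors can be shown with a deliberately simplified one-dimensional Kalman filter: the IMU drives high-rate prediction, and occasional visual-odometry measurements correct the accumulated drift. This is a toy under stated assumptions (scalar motion, velocity-only vision measurements, made-up noise values), not Ingenuity's actual EKF.

```python
import numpy as np

class VioFilter1D:
    """Minimal 1D Kalman filter illustrating visual-inertial fusion."""

    def __init__(self):
        self.x = np.zeros(2)              # state: [position m, velocity m/s]
        self.P = np.eye(2)                # state covariance
        self.Q = np.diag([1e-4, 1e-3])    # process noise (IMU drift)
        self.R = np.array([[0.01]])       # vision velocity measurement noise

    def predict(self, accel, dt):
        """IMU step, e.g. at 500-1000 Hz: integrate measured acceleration."""
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt**2, dt])
        self.x = F @ self.x + B * accel
        self.P = F @ self.P @ F.T + self.Q

    def update_vision(self, vel_meas):
        """Navcam step, e.g. tens of Hz: correct drift with a VO velocity."""
        H = np.array([[0.0, 1.0]])        # we observe velocity only
        y = vel_meas - H @ self.x         # innovation
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).flatten()
        self.P = (np.eye(2) - K @ H) @ self.P
```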


  3. Altitude and Orientation
  • LIDAR Altimeter: Measures height above the surface.
  • Inclination Sensors: Provide initial tilt information to align visual tracking.
  • These inputs are combined with visual odometry to ensure stable hovering, ascent, descent, and navigation along a planned path (a small geometry sketch follows this list).
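One detail worth making concrete: a body-fixed laser altimeter measures slant range, so attitude information is needed to recover true vertical height. The function below is a geometry sketch under that assumption, not a description of Ingenuity's actual altimetry processing.

```python
import math

def height_above_ground(slant_range_m, roll_rad, pitch_rad):
    """Convert a body-fixed laser altimeter's slant range to vertical height.

    Geometry sketch only: when the vehicle tilts, the downward beam hits
    the ground at a slant, so the vertical height is the measured range
    scaled by the cosine of the total tilt.
    """
    return slant_range_m * math.cos(roll_rad) * math.cos(pitch_rad)
```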


  4. Real-Time Control Integration
  • Processed position and velocity estimates feed directly into flight control algorithms:
    • Adjust rotor speeds for stability.
    • Correct drift during lateral motion.
    • Maintain desired trajectory to the landing site.
  • All calculations happen onboard within milliseconds, enabling Ingenuity to fly safely even over unpredictable terrain (a toy control law follows this list).
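As a flavour of how the estimates feed control, the toy proportional-derivative law below maps position and velocity error to a bounded tilt command. The gains, the tilt limit, and the structure itself are illustrative assumptions, not Ingenuity's controller.

```python
def lateral_correction(pos_err_m, vel_err_mps,
                       kp=0.8, kd=0.4, max_tilt_rad=0.35):
    """Toy PD law mapping position/velocity error to a tilt command.

    Gains and the tilt limit are illustrative; the real controller is
    far more elaborate and must also command rotor speed and blade
    pitch in Mars's thin atmosphere.
    """
    tilt_cmd = kp * pos_err_m + kd * vel_err_mps
    # Saturate the command so a bad estimate cannot demand an extreme tilt.
    return max(-max_tilt_rad, min(max_tilt_rad, tilt_cmd))
```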


Key Advantages of Onboard Image Processing

  • Autonomy: No real-time commands from Earth needed.
  • Precision: Navigation accurate to a few cm during short flights.
  • Robustness: Works in low-light, dusty, or high-contrast environments on Mars.
  • Safety: Avoids hazards without human intervention, critical in the thin atmosphere where small errors can be catastrophic.


Ingenuity’s Lost Navigation Image on Flight Six

During Ingenuity's sixth flight on Mars (May 22, 2021), the helicopter experienced an in-flight anomaly caused by a glitch in its navigation image pipeline. About 54 seconds into the flight, a single image from the downward-looking navigation camera was lost, and every subsequent image was delivered to the navigation software with an inaccurate timestamp.

Ingenuity uses navigation images, paired with precise timing, to track surface features and estimate its position, velocity, and orientation. When the timestamps became inaccurate, the navigation algorithm began applying corrections based on wrong timing information. The state estimator was continuously "correcting" phantom errors, which built up into large oscillations in pitch and roll as the flight progressed (the toy demonstration below shows the mechanism).
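The mechanism can be demonstrated with the toy VioFilter1D sketched in the sensor-fusion section above (all values illustrative): a measurement that is stale, but treated as current, makes the filter "correct" an error that never happened.

```python
# Toy demonstration reusing VioFilter1D from the sensor-fusion sketch.
# The vehicle is genuinely accelerating, but a stale visual measurement
# is applied as if it were current:
f = VioFilter1D()
for _ in range(100):                 # 0.1 s of IMU prediction at 1 kHz
    f.predict(accel=1.0, dt=0.001)   # true velocity is now ~0.1 m/s
f.update_vision(0.0)                 # old zero-velocity frame, wrong time
# The filter now "corrects" a phantom error: it believes the vehicle is
# slower than it really is, so the controller commands a spurious
# acceleration. Repeated every frame, such corrections drive oscillations.
```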

Despite the unexpected behaviour, Ingenuity's design provided enough flight stability margin that it was able to ride out the anomaly and land safely near its intended touchdown location. The helicopter's programmed behaviour of ignoring visual navigation images during the final descent phase also helped it stabilise and complete the landing successfully.

NASA engineers analysed the data after the flight and concluded that the glitch exposed a timing vulnerability in the navigation pipeline: a mismatch between when an image was actually taken and when the system believed it was taken. This issue provided valuable information for hardening future autonomous flight systems for Mars.
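As a purely hypothetical illustration of the kind of safeguard this suggests (not the fix NASA actually flew), a pipeline can sanity-check each image's timestamp against the expected frame interval before using it, so a dropped frame cannot silently shift the timing of every later image. The frame period and tolerance below are assumed values.

```python
def accept_image(stamp_s, last_stamp_s, frame_dt_s=0.033, tol_s=0.005):
    """Hypothetical timestamp sanity check, sketched for illustration.

    Rejects an image whose timestamp is not close to a whole number of
    frame periods after the previous one. Frame period and tolerance
    are assumed values, not Ingenuity's.
    """
    gap = stamp_s - last_stamp_s
    n_frames = round(gap / frame_dt_s)
    # n_frames == 0 would mean a duplicate or out-of-order image.
    if n_frames < 1 or abs(gap - n_frames * frame_dt_s) > tol_s:
        return False  # inconsistent timing: skip this vision update
    return True
```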