Enhancing Feature Tracking Reliability for Visual Navigation using Real-Time Safety Filter
🚀 Making Visual Navigation More Reliable with a Real-Time Safety Filter
In environments without GPS, many robots rely on visual SLAM — using a camera to track their position by detecting and following visual features (like corners or edges). But here’s the catch:
🔍 If the number of visible features suddenly drops, SLAM can fail dramatically, causing tracking loss, localization errors, or even crashes.
Our work addresses this with a simple question:
Can we make a robot proactively protect its feature visibility — before it’s too late?
🎯 Key Idea: A Real-Time Safety Filter for Features
We propose a real-time safety filter that runs alongside the robot’s control system. Instead of letting the robot blindly follow its planned velocity, the filter slightly modifies that velocity to keep enough visual features in view.
✨ Think of it like:
A safety assistant that whispers to the robot:
“Maybe slow down a bit so you don’t lose track of those corners.”
🔧 How It Works (In Simple Terms)
Here’s what happens at each time step:
- 🧭 A controller gives a reference velocity (v_ref) to the robot.
- 👁 The camera detects visual features and computes an information score, a measure of how “rich” the current features are.
- 🧮 A quadratic program (QP), sketched in the code below, solves for a new velocity (v_filtered) that:
  - Stays as close as possible to v_ref
  - Keeps the information score above a safety threshold
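For intuition, here is a minimal Python sketch of that per-step filtering QP. It assumes the rate of change of the information score is roughly linear in the commanded velocity, so the threshold can be enforced with a control-barrier-function-style inequality; the names (info_score, info_grad, I_min, alpha) and the cvxpy formulation are illustrative assumptions, not the paper’s exact formulation.

```python
# Minimal sketch of the per-step safety-filter QP (illustrative only).
# Assumption: the information score I changes approximately linearly with
# the commanded velocity v, so the constraint
#     info_grad . v >= -alpha * (I - I_min)
# keeps I above the threshold I_min while staying close to v_ref.
import numpy as np
import cvxpy as cp

def filter_velocity(v_ref, info_score, info_grad, I_min=0.5, alpha=1.0):
    """Return a velocity close to v_ref that keeps the information score above I_min."""
    v = cp.Variable(len(v_ref))
    objective = cp.Minimize(cp.sum_squares(v - v_ref))               # minimal intervention
    constraints = [info_grad @ v >= -alpha * (info_score - I_min)]   # feature-information constraint
    cp.Problem(objective, constraints).solve()
    return v.value

# Example: a reference command that would sweep features out of view.
v_ref = np.array([1.0, 0.0, 0.4])         # e.g. forward speed and yaw rate
info_grad = np.array([-0.2, 0.0, -0.9])   # assumed sensitivity of the score to v
v_filtered = filter_velocity(v_ref, info_score=0.55, info_grad=info_grad)
print("reference:", v_ref, "filtered:", v_filtered)
```

When the reference command already satisfies the constraint, the QP simply returns it unchanged, which is why the filter only intervenes when feature visibility is actually at risk.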
📊 Suggested Graphic:
A diagram showing:
- Original velocity vector
- Modified velocity vector
- Feature score threshold
- Features moving out of FOV
🧪 Experiments
✅ Simulation:
- We tested in a simulated indoor environment with sudden changes in feature visibility.
- Without our filter, the robot frequently lost track of features.
- With our filter, it slowed or adjusted direction to maintain trackability.

Figure: Simulation result, baseline vs. proposed.
✅ Real-world Deployment:
- We mounted a monocular camera on a wheeled robot.
- The robot navigated safely even when entering textureless areas like blank walls or glass.
📷 Suggested Graphic:
Side-by-side screenshots of:
- Feature tracking over time (with and without filter)
- Trajectories diverging due to tracking failure
📈 Why It Matters
- 🎯 SLAM safety: Prevents catastrophic failures due to feature loss.
- ⚡ Real-time: The filter runs fast enough for real robots.
- 🧠 Minimal intervention: Only adjusts motion when necessary — keeps original behavior otherwise.
🧩 Future Ideas
We’re excited to:
- Combine this with active vision to move toward richer feature regions
- Extend it to multi-sensor fusion systems
- Use it in autonomous drones, where losing features mid-flight can be fatal
Bibtex
@article{kim2025enhancing,
title={Enhancing Feature Tracking Reliability for Visual Navigation using Real-Time Safety Filter},
author={Kim, Dabin and Jang, Inkyu and Han, Youngsoo and Hwang, Sunwoo and Kim, H Jin},
journal={arXiv preprint arXiv:2502.01092},
year={2025}
}