Modular Autonomous Driving Stack Prototype is a personal exploration into autonomous vehicle systems, featuring modular implementations of the four core pillars: Perception, Planning, Control, and Localization. Each component functions independently to support rapid testing, debugging, and future stack-level integration.
- Semantic segmentation + object detection pipeline
- Identifies: vehicles, pedestrians, traffic lights, roads, crosswalks, speed-limit signs
- Occupancy grid generation for spatial awareness
- Behavior logic: lane following, red-light stopping, rule-based responses
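The occupancy-grid bullet above can be illustrated with a minimal sketch: obstacle points (e.g. from the detection pipeline) are binned into a 2D grid centered on the ego vehicle. This is only an assumed illustration, not the project's implementation; the function name, grid size, and resolution are placeholders.

```python
import numpy as np

def build_occupancy_grid(points_xy, grid_size=50, resolution=0.5):
    """Mark grid cells containing obstacle points as occupied.
    points_xy: (N, 2) array of obstacle positions in the ego frame (meters).
    """
    grid = np.zeros((grid_size, grid_size), dtype=np.uint8)
    half = grid_size * resolution / 2.0
    # Shift coordinates so the ego vehicle sits at the grid center.
    idx = np.floor((points_xy + half) / resolution).astype(int)
    # Keep only points that fall inside the grid bounds.
    in_bounds = np.all((idx >= 0) & (idx < grid_size), axis=1)
    ix, iy = idx[in_bounds].T
    grid[iy, ix] = 1
    return grid

# Example: two obstacle points ahead of the vehicle
obstacles = np.array([[2.0, 5.0], [-3.0, 8.0]])
grid = build_occupancy_grid(obstacles)
```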
Demo captures: Object Detection, Semantic Segmentation, All Together, Occupancy Grid, Behavior Planning (stopping at a red light)
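As a rough illustration of rule-based behavior logic such as the red-light stop shown above, a minimal decision rule might look like the following. The function name, the 3 m/s² comfort deceleration, and the stop margin are assumptions for the sketch, not values from the project.

```python
def behavior_decision(light_state, dist_to_light_m, speed_mps, stop_margin_m=2.0):
    """Rule-based behavior: stop for a red light once it is within braking range,
    otherwise keep following the lane. Assumes a constant 3 m/s^2 comfortable
    deceleration for the braking-distance estimate."""
    braking_dist = speed_mps ** 2 / (2 * 3.0)
    if light_state == "red" and dist_to_light_m - stop_margin_m <= braking_dist:
        return "stop"
    return "lane_follow"
```

A real behavior planner would also handle yellow lights, occluded detections, and hysteresis so the decision does not flicker between states.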
- Path planning using the A* algorithm
- Supports waypoint navigation and dynamic mission flow
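A compact grid-based A* sketch of the kind of planner described above (4-connected grid, Manhattan heuristic, unit step cost). The project's actual state space and cost function may well differ; this only demonstrates the algorithm itself.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 2D grid: grid[r][c] == 1 means blocked; 4-connected moves.
    Returns the path from start to goal as a list of (row, col), or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), start)]  # priority queue ordered by f = g + h
    came_from = {}
    g_cost = {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:  # walk parents back to the start
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g_cost[cur] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None
```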
- Longitudinal control via PID
- Lateral controller switching based on speed:
  - Stanley controller (low speed)
  - Pure Pursuit controller (medium/high speed, > 20 km/h)
Demo captures: Global Path Marking, Path Plot
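The control layer above can be sketched roughly as a PID on speed error plus a speed-thresholded selector for the lateral controller. The gains and time step here are placeholders, and the actual Stanley and Pure Pursuit steering laws are omitted for brevity; only the switching rule mirrors the description.

```python
class PID:
    """Simple longitudinal PID acting on speed error."""
    def __init__(self, kp, ki, kd, dt=0.05):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, target_speed, current_speed):
        err = target_speed - current_speed
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        # Positive output -> throttle, negative -> brake (sign convention assumed).
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def pick_lateral_controller(speed_kmh, threshold_kmh=20.0):
    """Mirror the switching rule: Stanley at low speed, Pure Pursuit above 20 km/h."""
    return "stanley" if speed_kmh <= threshold_kmh else "pure_pursuit"
```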
- Sensor fusion using GNSS, IMU, and LiDAR
- Error-State Kalman Filter (ESKF) for reliable position and orientation estimation
Demo captures: Estimated vs. Ground Truth Localization, Error Plots
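A full ESKF over 3D position, velocity, and orientation is beyond a snippet, but the core predict/inject cycle can be shown in 1D: the nominal state integrates the IMU, a small linear filter tracks the error state, and a GNSS fix injects the estimated error back into the nominal state. All noise values and the function itself are illustrative assumptions, not the project's filter.

```python
import numpy as np

def eskf_1d_step(nominal_pos, nominal_vel, accel_meas, gnss_pos, P, dt=0.01,
                 accel_var=0.1, gnss_var=1.0):
    """One predict+correct cycle of a 1D error-state Kalman filter."""
    # Predict: propagate the nominal state by integrating the IMU measurement.
    nominal_pos += nominal_vel * dt + 0.5 * accel_meas * dt ** 2
    nominal_vel += accel_meas * dt
    F = np.array([[1.0, dt], [0.0, 1.0]])              # error-state transition
    Q = np.array([[0.25 * dt**4, 0.5 * dt**3],
                  [0.5 * dt**3,  dt**2]]) * accel_var  # process noise from accel
    P = F @ P @ F.T + Q
    # Correct: GNSS observes position; estimate the error state.
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + gnss_var
    K = P @ H.T / S
    innovation = gnss_pos - nominal_pos
    dx = (K * innovation).flatten()
    # Inject the error back into the nominal state and reset the error state.
    nominal_pos += dx[0]
    nominal_vel += dx[1]
    P = (np.eye(2) - K @ H) @ P
    return nominal_pos, nominal_vel, P
```

The same pattern extends to the full filter, where the orientation error lives on the rotation manifold and LiDAR provides an additional correction source.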
- Vehicle spawning independent of simulation constraints
🛠️ Built for learning, experimentation, and stepping closer to full-stack autonomy.