
SensorFusionNanoDegree [NEW LINK]

This repository has been moved to GitLAB-SensorFusion. Find the latest updates there.


SensorFusionNanoDegree

Learn to detect obstacles in lidar point clouds through clustering and segmentation, apply thresholds and filters to radar data in order to accurately track objects, and augment your perception by projecting camera images into three dimensions and fusing these projections with other sensor data. Combine this sensor data with Kalman filters to perceive the world around a vehicle and track objects over time.

PREREQUISITE KNOWLEDGE

You should have intermediate C++ knowledge, and be familiar with calculus, probability, and linear algebra. See detailed requirements.

LIDAR

Process raw lidar data with filtering, segmentation, and clustering to detect other vehicles on the road.
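
As a rough sketch of what such a pipeline can look like, the snippet below downsamples a point cloud with a voxel grid, removes the road plane with RANSAC, and groups the remaining points with Euclidean clustering. PCL as the dependency and all parameter values are assumptions chosen for illustration, not the project's actual settings.

```cpp
// Illustrative lidar obstacle-detection pipeline using PCL (assumed dependency).
// Leaf size, distance threshold, and cluster limits are placeholder values.
#include <pcl/ModelCoefficients.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/extract_indices.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/search/kdtree.h>
#include <pcl/segmentation/extract_clusters.h>
#include <pcl/segmentation/sac_segmentation.h>
#include <vector>

using PointT = pcl::PointXYZ;

std::vector<pcl::PointIndices> detectObstacles(pcl::PointCloud<PointT>::Ptr cloud)
{
    // 1. Filtering: downsample with a voxel grid to reduce the point count.
    pcl::VoxelGrid<PointT> vg;
    vg.setInputCloud(cloud);
    vg.setLeafSize(0.2f, 0.2f, 0.2f);
    pcl::PointCloud<PointT>::Ptr filtered(new pcl::PointCloud<PointT>);
    vg.filter(*filtered);

    // 2. Segmentation: fit the road plane with RANSAC and drop its inliers.
    pcl::SACSegmentation<PointT> seg;
    seg.setModelType(pcl::SACMODEL_PLANE);
    seg.setMethodType(pcl::SAC_RANSAC);
    seg.setDistanceThreshold(0.3);
    seg.setInputCloud(filtered);
    pcl::PointIndices::Ptr roadInliers(new pcl::PointIndices);
    pcl::ModelCoefficients::Ptr coefficients(new pcl::ModelCoefficients);
    seg.segment(*roadInliers, *coefficients);

    pcl::ExtractIndices<PointT> extract;
    extract.setInputCloud(filtered);
    extract.setIndices(roadInliers);
    extract.setNegative(true);  // keep everything that is NOT the road plane
    pcl::PointCloud<PointT>::Ptr obstacles(new pcl::PointCloud<PointT>);
    extract.filter(*obstacles);

    // 3. Clustering: group the remaining points into individual obstacles.
    pcl::search::KdTree<PointT>::Ptr tree(new pcl::search::KdTree<PointT>);
    tree->setInputCloud(obstacles);
    std::vector<pcl::PointIndices> clusters;
    pcl::EuclideanClusterExtraction<PointT> ec;
    ec.setClusterTolerance(0.5);
    ec.setMinClusterSize(10);
    ec.setMaxClusterSize(5000);
    ec.setSearchMethod(tree);
    ec.setInputCloud(obstacles);
    ec.extract(clusters);
    return clusters;
}
```

Each returned index set corresponds to one detected obstacle, which can then be wrapped in a bounding box for visualization and tracking.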

CAMERAS

Fuse camera images with lidar point cloud data. You'll extract object features, classify objects, and project the camera image into three dimensions to fuse with lidar data.
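
The sketch below shows one common form of the feature-extraction step: detecting and matching keypoints between two camera frames. OpenCV, the ORB detector, and the brute-force matcher are assumptions chosen for illustration; other detectors, descriptors, and matchers work the same way.

```cpp
// Illustrative feature extraction and matching with OpenCV (assumed dependency).
#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <vector>

std::vector<cv::DMatch> matchFeatures(const cv::Mat& prevFrame, const cv::Mat& currFrame)
{
    // Detect keypoints and compute binary descriptors in both frames.
    cv::Ptr<cv::ORB> orb = cv::ORB::create();
    std::vector<cv::KeyPoint> kptsPrev, kptsCurr;
    cv::Mat descPrev, descCurr;
    orb->detectAndCompute(prevFrame, cv::noArray(), kptsPrev, descPrev);
    orb->detectAndCompute(currFrame, cv::noArray(), kptsCurr, descCurr);

    // Match descriptors between frames; the resulting correspondences can later
    // be associated with lidar points projected into the image plane.
    cv::BFMatcher matcher(cv::NORM_HAMMING, /*crossCheck=*/true);
    std::vector<cv::DMatch> matches;
    matcher.match(descPrev, descCurr, matches);
    return matches;
}
```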

RADAR

Analyze radar signatures to detect and track objects. Calculate velocity and orientation by correcting for radial velocity distortions, noise, and occlusions.
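
A minimal sketch of the FMCW relationships behind those calculations, with made-up chirp parameters and measured frequencies: range follows from the beat frequency of the mixed transmit/receive signal, and radial velocity from the Doppler shift.

```cpp
// Illustrative FMCW radar range and radial-velocity equations.
// All numeric values below are assumptions, not the project's parameters.
#include <iostream>

int main()
{
    const double c = 3.0e8;             // speed of light [m/s]
    const double bandwidth = 150.0e6;   // chirp bandwidth B [Hz] (assumed)
    const double chirpTime = 7.33e-6;   // chirp duration Tchirp [s] (assumed)
    const double fc = 77.0e9;           // carrier frequency [Hz]

    // Range from the beat frequency: R = c * f_beat / (2 * slope), slope = B / Tchirp
    const double beatFreq = 2.0e6;      // measured beat frequency [Hz] (assumed)
    const double slope = bandwidth / chirpTime;
    const double range = c * beatFreq / (2.0 * slope);

    // Radial velocity from the Doppler shift: v = f_doppler * lambda / 2
    const double dopplerShift = 1.0e3;  // measured Doppler shift [Hz] (assumed)
    const double lambda = c / fc;
    const double radialVelocity = dopplerShift * lambda / 2.0;

    std::cout << "range ~ " << range << " m, radial velocity ~ "
              << radialVelocity << " m/s\n";
    return 0;
}
```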

KALMAN FILTERS

Fuse data from multiple sources using Kalman filters, and build extended and unscented Kalman filters for tracking nonlinear movement.
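
A minimal sketch of the linear predict/update cycle is shown below, with Eigen as an assumed dependency; the F, Q, H, and R matrices depend on the chosen motion and measurement models and are left to the caller. Extended and unscented filters replace these linear steps with linearized or sigma-point versions to handle nonlinear motion and measurements.

```cpp
// Illustrative linear Kalman filter using Eigen (assumed dependency).
#include <Eigen/Dense>

struct KalmanFilter
{
    Eigen::VectorXd x;  // state estimate
    Eigen::MatrixXd P;  // state covariance
    Eigen::MatrixXd F;  // state transition model
    Eigen::MatrixXd Q;  // process noise covariance
    Eigen::MatrixXd H;  // measurement model
    Eigen::MatrixXd R;  // measurement noise covariance

    void predict()
    {
        x = F * x;
        P = F * P * F.transpose() + Q;
    }

    void update(const Eigen::VectorXd& z)
    {
        const Eigen::VectorXd y = z - H * x;                       // innovation
        const Eigen::MatrixXd S = H * P * H.transpose() + R;       // innovation covariance
        const Eigen::MatrixXd K = P * H.transpose() * S.inverse(); // Kalman gain
        x = x + K * y;
        const long n = x.size();
        P = (Eigen::MatrixXd::Identity(n, n) - K * H) * P;
    }
};
```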
