You can find me on Twitter @bhutanisanyam1, or connect with me on LinkedIn here. Here and here are two articles on my learning path to Self-Driving Cars.

You can find the Markdown File Here

You can find the Lecture 1 Notes here. Lecture 3 Notes can be found here, Lecture 4 Notes here, and Lecture 5 Notes here.

These are the Lecture 2 notes for the MIT 6.S094: Deep Learning for Self-Driving Cars Course (2018), Taught by Lex Fridman.

All images are from the Lecture slides.

Self Driving Cars

(or Driverless Cars or Autonomous cars or Robocars)

Utopian View:

  1. About 1.3M people die in car crashes across the world every year.
  2. 35,000–40,000 of those deaths occur in the US alone.
  3. Opportunity: to design AI systems that save lives.
  4. Autonomous vehicles can take away drunk, drugged, distracted, and drowsy driving.
  5. Increased shared mobility.
  6. Saves money.
  7. Reduced costs make these vehicles more accessible, cutting travel costs by an order of magnitude.

Dystopian View:

An Engineer’s view: We want to find the best possible ways to transform our society and improve lives.

Skepticism

  1. >2032: Driverless taxi services in major US cities with arbitrary pickup and drop-off points, not just in geo-restricted areas.
  2. >2045: A majority of US cities will mandate driverless vehicles.

Plot of Technology adoption rate Vs the number of years:

Overview:

Different Approaches to Automation

Levels of Autonomy (SAE J3016):

Beyond Traditional Levels: Two Types of AI Systems.

A1 — Human-Centered Autonomy:

  1. How often is the system available?
  2. Is it sensor-based?
  3. How many seconds is the driver given to take over? Currently it is almost 0 seconds (time to wake up and take control).
  4. Control can also be taken over remotely, with support from a human who is not inside the car. In human-centered autonomy, the human remains responsible: the assumption is that the system will fail and humans will be needed to take over control.

A2 — Full Autonomy:

  1. No teleoperation is involved.
  2. There is no "10 seconds to take over" rule; that is not good enough.
  3. The vehicle must be able to find a safe harbor on its own.
  4. The human may take over by choice; the AI only forces an override when danger (e.g. a crash) is imminent.

L0: starting point. L1–L3: A1 (Human-Centered). L4–L5: A2 (Full Autonomy).

Human-Centered Approach — Criticism: when humans are given the system, they will 'overtrust' it. The better the system becomes, the less attention humans pay.

Public Perception of What happens inside an Autonomous vehicle:

Engineer’s perception:

MIT-AVT Naturalistic Driving Dataset

Tesla Auto-Pilot:

  1. To trust the system, let it reveal its flaws: know where it works and where it doesn't.
  2. Test the system at its limits, in challenging environments.

Self Driving Cars: Robotics View.

Sensors

Source of Raw Data that can be processed.

Ultrasonic:

  1. Works well in proximity.
  2. Cheap.
  3. Sensor size can be tiny.
  4. Works in bad weather, visibility.
  5. Range is very short.
  6. Terrible resolution.
  7. Cannot Detect speed.

RADAR:

  1. Commonly available in vehicles with some degree of autonomy.
  2. Cheap — both radar and ultrasonic sensors are inexpensive.
  3. Performs well in challenging weather.
  4. Low resolution; no texture or color information.
  5. The most reliable and most widely used ranging sensor today.
  6. Offers all the advantages of ultrasonics, plus the ability to detect speed.

LIDAR:

  1. Range is good, not great.
  2. Works in dark and bright lighting conditions.
  3. Not effective in Bad weather conditions.
  4. No information about color, texture.
  5. Able to detect speed.
  6. Huge Sensor Size.
  7. Expensive.
  8. Not effective at very close range, where ultrasonics work better.

Camera:

Of all the sensors, cameras have the highest range; ultrasonics have high resolution but a very short range.

  1. Cheap.
  2. Small sensor size.
  3. Poor performance in close proximity.
  4. Highest range.
  5. Works well in bright lighting, though it can be sensitive to lighting conditions.
  6. Does not work in dark conditions.
  7. Not effective in bad visibility (bad weather).
  8. Provides rich textural information (required for deep learning).

Sensor Fusion:

Cheap Sensors: Ultrasonics + Cameras + RADAR.
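One classic way to combine overlapping range readings from cheap sensors is inverse-variance weighting: trust each sensor in proportion to how certain it is. This is a minimal illustrative sketch, not the specific fusion method used by any system in the lecture; the sensor readings and variances below are hypothetical.

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of independent range estimates.

    estimates: list of (measurement, variance) pairs, one per sensor.
    Returns (fused_measurement, fused_variance).
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(w * m for (m, _), w in zip(estimates, weights)) / total
    return fused, 1.0 / total

# Hypothetical distance readings (metres) to the same obstacle:
readings = [
    (10.2, 0.25),  # camera depth estimate: moderately noisy
    (10.0, 0.04),  # radar: accurate range, low variance
    (9.9,  1.00),  # ultrasonic near its range limit: high variance
]
distance, variance = fuse_estimates(readings)
```

The fused estimate lands close to the radar reading (the most certain sensor) and has lower variance than any single sensor, which is the point of fusing cheap sensors.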

Future of Sensors

Cameras Vs LIDAR

Cheap Sensor Fusion Vs LIDAR

Companies

Waymo

Uber

Tesla

Audi A8 System (To be released end of 2018):

Notable Mentions.

Opportunities for AI and Deep Learning.

Localisation and Mapping:

Being able to localise itself in space.
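The core idea of localisation can be sketched with a textbook 1-D histogram (Bayes) filter: the car keeps a belief over where it is, sharpens it when a sensor reading matches the map, and shifts it when the car moves. This is a toy illustration under assumed landmark probabilities (`p_hit`, `p_miss` are made up), not the course's method.

```python
def normalize(p):
    s = sum(p)
    return [x / s for x in p]

def predict(belief, shift):
    """Motion update: shift the belief by `shift` cells (cyclic world)."""
    n = len(belief)
    return [belief[(i - shift) % n] for i in range(n)]

def update(belief, world, measurement, p_hit=0.9, p_miss=0.1):
    """Measurement update: up-weight cells whose landmark matches the reading."""
    posterior = [b * (p_hit if world[i] == measurement else p_miss)
                 for i, b in enumerate(belief)]
    return normalize(posterior)

# Hypothetical corridor map with door ('D') / wall ('W') landmarks:
world = ['D', 'W', 'W', 'D', 'W']
belief = [0.2] * 5                    # uniform prior: position unknown
belief = update(belief, world, 'D')   # sensor sees a door
belief = predict(belief, 1)           # car moves one cell forward
belief = update(belief, world, 'W')   # sensor now sees a wall
```

After a door followed by a wall, the belief concentrates on the cells consistent with that sequence; with more observations the car localises itself in the map.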

Visual Odometry:

Traditional Approach:

Deep learning approach — End to End Method:
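In the traditional pipeline, frame-to-frame ego-motion is estimated from image features and then integrated into a global pose. The integration step can be sketched in a few lines; the (distance, heading-change) steps below are hypothetical stand-ins for what a feature-matching front end would produce.

```python
import math

def integrate_odometry(steps, x=0.0, y=0.0, theta=0.0):
    """Accumulate per-frame ego-motion (distance travelled, heading change)
    into a global 2-D pose (x, y, theta). This is only the final
    dead-reckoning step of a visual odometry pipeline."""
    for dist, dtheta in steps:
        theta += dtheta          # rotate first...
        x += dist * math.cos(theta)  # ...then translate along the new heading
        y += dist * math.sin(theta)
    return x, y, theta

# Drive 1 m forward, turn 90 degrees left, drive 1 m: ends near (1, 1).
pose = integrate_odometry([(1.0, 0.0), (1.0, math.pi / 2)])
```

Because each step's error compounds, drift accumulates over time, which is why odometry is usually combined with mapping or loop closure; the end-to-end deep learning approach instead learns the motion estimate directly from image pairs.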

Scene Understanding

Movement Planning

After understanding the scene, how do we get from A to B?
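A minimal version of the A-to-B problem is shortest-path search over an occupancy grid. This sketch uses plain breadth-first search on a hypothetical grid ('.' free, '#' obstacle); real planners layer kinematics, costs, and dynamic obstacles on top of this idea.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a 4-connected occupancy grid.
    grid: list of strings, '.' = free cell, '#' = obstacle.
    Returns the list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}    # doubles as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:         # reconstruct the path by walking back
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == '.' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # goal unreachable

grid = [
    "..#.",
    "..#.",
    "....",
]
path = shortest_path(grid, (0, 0), (0, 3))  # must route around the wall
```

BFS guarantees the shortest path in moves on an unweighted grid; swapping the queue for a priority queue with a heuristic turns this into A*, the more common choice in practice.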

Driver State

Re: Two Paths to an Autonomous Future:

Argument: A1 systems are the more favourable path in the near term.

Challenges for A2 Systems:


Subscribe to my Newsletter for a Weekly curated list of Deep learning, Computer Vision Articles