Illuri Sandeep
8 min read · Apr 1, 2021


Self-Driving Cars Intro

What is an Autonomous Car?

An autonomous car is a vehicle capable of sensing its environment and operating without human involvement. A human passenger is not required to take control of the vehicle at any time, nor is a human passenger required to be present in the vehicle at all. An autonomous car can go anywhere a traditional car goes and do everything that an experienced human driver does.

Over a million years ago, humans started walking; over 4,000 years ago, we invented horse-drawn carriages; over 100 years ago, we invented the modern automobile. Now we are entering a new era of automobiles: autonomous vehicles.

But why do we need self-driving cars?

  1. Our roads will become much safer.
  2. We’ll be more productive in transportation than ever before.
  3. We can save a lot of money.
  4. Reduced traffic congestion.
  5. Environmental gains.
  6. Free up parking lots for other uses.

Self-driving cars vs. regular cars

What Are the Levels of Automated Driving?

Level 0: No Automation. The driver is completely responsible for controlling the vehicle, performing tasks like steering, braking, accelerating or slowing down. Level 0 vehicles can have safety features such as backup cameras, blind spot warnings and collision warnings. Even automatic emergency braking, which applies aggressive braking in the event of an imminent collision, is classified as Level 0 because it does not act over a sustained period.

Level 1: Driver Assistance. At this level, the automated systems start to take control of the vehicle in specific situations, but do not fully take over. An example of Level 1 automation is adaptive cruise control, which controls acceleration and braking, typically in highway driving. Depending on the functionality, drivers are able to take their feet off the pedals.

Level 2: Partial Automation. At this level, the vehicle can perform more complex functions that pair steering (lateral control) with acceleration and braking (longitudinal control), thanks to a greater awareness of its surroundings.

Level 2+: Advanced Partial Automation. While Level 2+ is not one of the officially recognized SAE levels, it represents an important category that delivers advanced performance at a price consumers can afford. Level 2+ includes functions where the vehicle systems are essentially driving, but the driver is still required to monitor the vehicle and be ready to step in if needed. (By contrast, Level 3 represents a significant technology leap, as it is the first level at which drivers can disengage from the act of driving — often referred to as “mind off.” At Level 3, the vehicle must be able to safely stop in the event of a failure, requiring much more advanced software and hardware.) Examples of Level 2+ include highway assistance or traffic jam assistance. The ability for drivers to take their hands off the wheel and glance away from the road ahead for a few moments makes for a much more relaxing and enjoyable experience, so there is strong consumer interest.

Level 3: Conditional Automation. At Level 3, drivers can fully disengage from the act of driving, but only in specific situations. Conditions could be limited to certain vehicle speeds, road types and weather conditions. But because drivers can apply their focus to some other task — such as looking at a phone or newspaper — this is generally considered the initial entry point into autonomous driving. For example, features such as traffic jam pilot mean that drivers can sit back and relax while the system handles it all — acceleration, steering and braking. In stop-and-go traffic, the vehicle sends an alert to the driver to regain control when the vehicle gets through the traffic jam and vehicle speed increases. The vehicle must also monitor the driver’s state to ensure that the driver resumes control, and be able to come to a safe stop if the driver does not.

Level 4: High Automation. At this level, the vehicle’s autonomous driving system is fully capable of monitoring the driving environment and handling all driving functions for routine routes and conditions. However, depending on the operational design domain (ODD) of the vehicle, the system may on rare occasions need a driver to step in. In those cases, the vehicle can alert the driver that there is, say, an environmental condition that requires a human in control, such as heavy snow.

Level 5: Full Automation. Level 5-capable vehicles are fully autonomous. No driver is required behind the wheel at all. In fact, Level 5 vehicles might not even have a steering wheel or gas/brake pedals. Level 5 vehicles could have “smart cabins” so that passengers can issue voice commands to choose a destination or set cabin conditions such as temperature or choice of media.

History of self-driving cars

Experiments have been conducted on self-driving cars since at least the 1920s; promising trials took place in the 1950s, and work has proceeded since then. The first self-sufficient and truly autonomous cars appeared in the 1980s, with Carnegie Mellon University’s Navlab and ALV projects in 1984 and Mercedes-Benz and Bundeswehr University Munich’s Eureka Prometheus Project in 1987. Since then, numerous major companies and research organizations have developed working autonomous vehicles, including Mercedes-Benz, General Motors, Continental Automotive Systems, Autoliv Inc., Bosch, Nissan, Toyota, Audi, Volvo, Vislab from the University of Parma, Oxford University, and Google. In July 2013, Vislab demonstrated BRAiVE, a vehicle that moved autonomously on a mixed traffic route open to public traffic. As of now, many tech companies and startups are busy building Level 5 autonomous vehicles. Some of these companies are shown below, but there are more.

How do autonomous cars work?

Autonomous cars rely on sensors, actuators, complex algorithms, machine learning systems, and powerful processors to execute software. Autonomous cars create and maintain a map of their surroundings based on a variety of sensors situated in different parts of the vehicle. Radar sensors monitor the position of nearby vehicles. Video cameras detect traffic lights, read road signs, track other vehicles, and look for pedestrians. Lidar (light detection and ranging) sensors bounce pulses of light off the car’s surroundings to measure distances, detect road edges, and identify lane markings. Ultrasonic sensors in the wheels detect curbs and other vehicles when parking. Sophisticated software then processes all this sensory input, plots a path, and sends instructions to the car’s actuators, which control acceleration, braking, and steering. Hard-coded rules, obstacle avoidance algorithms, predictive modeling, and object recognition help the software follow traffic rules and navigate obstacles.
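The sense-perceive-plan-act cycle described above can be sketched as a simple control loop. Everything here (the field names, the 5-meter braking threshold, the target speed) is illustrative, not taken from any production stack:

```python
from dataclasses import dataclass

@dataclass
class Command:
    throttle: float  # 0.0 to 1.0
    brake: float     # 0.0 to 1.0
    steering: float  # -1.0 (full left) to 1.0 (full right)

def control_step(sensor_frame: dict) -> Command:
    """One iteration of the sense -> perceive -> plan -> act loop."""
    # 1. Perceive: fuse raw sensor data into a simple world model.
    obstacle_distance = min(sensor_frame.get("radar", [float("inf")]))
    # 2. Plan: pick a target speed based on the nearest obstacle.
    target_speed = 0.0 if obstacle_distance < 5.0 else 15.0
    current_speed = sensor_frame["speed"]
    # 3. Act: translate the plan into actuator commands.
    if current_speed < target_speed:
        return Command(throttle=0.3, brake=0.0, steering=0.0)
    return Command(throttle=0.0, brake=0.5, steering=0.0)

cmd = control_step({"radar": [3.2, 40.0], "speed": 12.0})
print(cmd)  # obstacle at 3.2 m -> the planner commands braking
```

A real vehicle runs loops like this many times per second, with far richer world models, but the sense-plan-act shape stays the same.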

Let’s take a look at the hardware of self-driving cars

1. CAN (Controller Area Network)

CAN is the vehicle’s internal communication network. The CAN card is how the computer system connects to the car’s internal network to send signals for acceleration, braking, and steering. This is what a CAN card looks like.
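On the wire, a classic CAN message is a compact binary frame: an 11-bit identifier plus up to 8 data bytes. A minimal sketch of packing one (simplified — real CAN framing also includes a CRC, an ACK slot, and bit stuffing at the physical layer, and the message ID and payload below are hypothetical):

```python
import struct

def encode_can_frame(can_id: int, data: bytes) -> bytes:
    """Pack a simplified classic CAN frame: 11-bit ID, DLC, up to 8 data bytes."""
    if not 0 <= can_id <= 0x7FF:
        raise ValueError("standard CAN IDs are 11 bits")
    if len(data) > 8:
        raise ValueError("classic CAN payload is at most 8 bytes")
    # ">HB" = big-endian: 2-byte ID, 1-byte data length code (DLC)
    return struct.pack(">HB", can_id, len(data)) + data

# Hypothetical "target throttle" message on ID 0x120: 25% as a scaled byte (64/255)
frame = encode_can_frame(0x120, bytes([64]))
print(frame.hex())  # 01200140
```

The small fixed payload is why CAN messages need agreed-upon encodings (a "DBC" database in practice) so every module interprets the bytes the same way.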

2. GPS (Global Positioning System)

The GPS receiver picks up signals from satellites circling the Earth; these signals let the car determine its location. This is what the GPS hardware looks like in an autonomous car.
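Many GPS receivers report their fix as NMEA 0183 text sentences. A sketch of extracting latitude and longitude from a GGA sentence (the example sentence is the standard textbook one, not from a real vehicle):

```python
def parse_gga(sentence: str) -> tuple[float, float]:
    """Extract (latitude, longitude) in decimal degrees from an NMEA GGA sentence."""
    fields = sentence.split(",")
    lat_raw, lat_hem = fields[2], fields[3]   # ddmm.mmmm, N/S
    lon_raw, lon_hem = fields[4], fields[5]   # dddmm.mmmm, E/W
    lat = int(lat_raw[:2]) + float(lat_raw[2:]) / 60.0
    lon = int(lon_raw[:3]) + float(lon_raw[3:]) / 60.0
    if lat_hem == "S":
        lat = -lat
    if lon_hem == "W":
        lon = -lon
    return lat, lon

lat, lon = parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
print(round(lat, 4), round(lon, 4))  # 48.1173 11.5167
```

Raw GPS alone is only accurate to a few meters, which is why it is fused with the IMU and map data described next.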

3. IMU (Inertial Measurement Unit)

The IMU measures the vehicle’s movement by tracking its position, velocity, acceleration, and other factors.
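The core idea behind using an IMU is dead reckoning: integrating acceleration once gives velocity, and integrating again gives position. A one-dimensional sketch (a real pipeline also uses the gyroscope for orientation and fuses with GPS to correct drift):

```python
def dead_reckon(accel_samples, dt):
    """Integrate acceleration (m/s^2) twice to estimate velocity (m/s) and position (m)."""
    velocity = 0.0
    position = 0.0
    for a in accel_samples:
        velocity += a * dt          # first integration: accel -> velocity
        position += velocity * dt   # second integration: velocity -> position
    return velocity, position

# One second of constant 2 m/s^2 acceleration, sampled at 10 Hz
v, x = dead_reckon([2.0] * 10, dt=0.1)
print(round(v, 2), round(x, 2))  # 2.0 1.1
```

Because small sensor errors get integrated twice, the position estimate drifts quickly — which is exactly why the IMU is combined with GPS rather than used alone.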

4. LIDAR (Light Detection and Ranging)

Lidar is an array of pulsed lasers that scans 360 degrees around the vehicle. The reflections of the laser beams build a point cloud that the software can use to understand the environment.
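Each lidar return is a range measured at a known beam angle; converting those polar measurements to Cartesian coordinates is what produces the point cloud. A 2-D sketch of one sweep (real lidars add a vertical angle for a 3-D cloud):

```python
import math

def polar_to_points(ranges, angle_step_deg=1.0):
    """Convert one 2-D lidar sweep (one range per beam angle) into (x, y) points."""
    points = []
    for i, r in enumerate(ranges):
        theta = math.radians(i * angle_step_deg)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Four beams at 0, 90, 180, and 270 degrees, all hitting objects 2 m away
cloud = polar_to_points([2.0, 2.0, 2.0, 2.0], angle_step_deg=90.0)
print([(round(x, 2), round(y, 2)) for x, y in cloud])
```

A real sweep yields tens of thousands of such points per rotation, which downstream perception software clusters into obstacles.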

5. Camera

The camera captures image data. We can use computer vision to extract the content of these images and understand the environment around the car.
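As a toy illustration of extracting content from image data, here is a brightness threshold over one grayscale image row to locate painted lane markings. Real pipelines use convolutional neural networks or classical techniques such as edge detection plus a Hough transform; the pixel values below are made up:

```python
def find_bright_columns(gray_row, threshold=200):
    """Return the column indices in one grayscale image row (0-255 values)
    whose intensity meets a threshold - a crude lane-marking detector."""
    return [i for i, pixel in enumerate(gray_row) if pixel >= threshold]

# A dark road surface with two bright painted lines
row = [30, 32, 220, 235, 31, 29, 33, 228, 240, 30]
print(find_bright_columns(row))  # [2, 3, 7, 8]
```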

6. RADAR (Radio Detection and Ranging)

Radar is also used to detect obstacles. Radar has low resolution, which makes it difficult to tell what kind of obstacle has been detected, but it has the advantages of being very economical and working in all weather and lighting conditions. It is also used to measure the speed of other vehicles.
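Radar measures the speed of other vehicles via the Doppler effect: a target moving toward the radar shifts the reflected frequency, and the radial speed follows from v = shift × c / (2 × f_transmit). A sketch using the 77 GHz band common in automotive radar (the shift value is illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_speed(f_transmit_hz, f_shift_hz):
    """Radial speed of a target from the Doppler frequency shift.
    A positive shift means the target is approaching."""
    return f_shift_hz * C / (2.0 * f_transmit_hz)

# A 77 GHz automotive radar observing a ~5.14 kHz Doppler shift
v = doppler_speed(77e9, 5140.0)
print(round(v, 2), "m/s")  # roughly 10 m/s closing speed
```

The factor of 2 appears because the wave is shifted twice — once on the way to the target and once on the reflection back.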

This is what the complete architecture looks like

Now let’s take a look at the software stack of self-driving cars

This software stack is based on the Baidu Apollo project; it may vary for other companies. The software layer is divided into three sublayers.

1. Real-Time Operating System (RTOS)

The RTOS guarantees that a given task will be completed within a fixed time. After the car’s sensors collect data from the outside world, the real-time operating system can produce timely calculations and analysis and execute the corresponding actions within a short, bounded time.
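To make the deadline idea concrete, here is a sketch that runs a task and checks whether it finished in time. Note the key difference: a true RTOS guarantees the deadline through its scheduler, while a general-purpose OS (as in this Python sketch) can only measure it after the fact:

```python
import time

def run_with_deadline(task, deadline_s):
    """Run a task and report whether it finished within its deadline."""
    start = time.perf_counter()
    result = task()
    elapsed = time.perf_counter() - start
    return result, elapsed <= deadline_s

# A toy "perception" task that must finish within 50 ms
result, on_time = run_with_deadline(lambda: sum(range(1000)), deadline_s=0.05)
print(result, on_time)
```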

2. Runtime Framework

The runtime framework is the operating environment of the car, which is a customized version of ROS. Although ROS stands for Robot Operating System, it is actually a software framework that runs on top of an operating system. ROS has a long history in the robotics industry, and there are currently more than 3,000 basic libraries that support the rapid development of applications. ROS divides the autonomous system into multiple modules based on function; each module is responsible for receiving, processing, and publishing its own messages.
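The receive/process/publish pattern is the publish-subscribe model at the heart of ROS. A minimal in-process sketch of the idea (illustrative only — real ROS nodes run as separate processes and communicate over the network, and the topic name below is made up):

```python
from collections import defaultdict

class MessageBus:
    """Minimal publish/subscribe bus in the spirit of ROS topics."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
detections = []
# The planning module subscribes to what the perception module publishes.
bus.subscribe("/perception/obstacles", detections.append)
bus.publish("/perception/obstacles", {"type": "pedestrian", "distance_m": 12.5})
print(detections)
```

Decoupling modules through topics like this is what lets perception, planning, and control be developed and replaced independently.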

3. Application Module Layer

The application modules are the core of the software stack. They include the map engine, localization, perception, planning, control, end-to-end driving, and the human-machine interface (HMI). Each module has its own algorithms, and the relationships between the modules are complex.

We will discuss self-driving cars in more detail in the next blog.
