Autonomous vehicle technology has been a major research and development topic in the automotive industry during the last decade. The second generation of driver assistance systems was introduced around the 1990s, based on sensors that measure the external state of the vehicle, with the focus of providing information and warnings to the driver.

Figure 1: Evolution of Advanced Driver Assist Systems

The following sensors, which measure conditions outside the vehicle and the vehicle's position relative to its environment, are essential in driving assist systems and autonomous vehicle technologies: vision, LIDAR, RADAR, ultrasonic range, GPS, and inter-vehicle communication. Prominent sensors such as LIDAR, GPS, radar, vision, ultrasonic, and the inertial measurement unit, their working principles, and their usage are discussed in the following sections.

LIDAR transmits a pulsed beam of light from a rotating mirror; part of that light reflects back to the sensor, allowing it to detect any non-absorbing object or surface (Figure 4). Using the calculated distance, the sensor constructs a 3D map of the world, including the objects around it.
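The distance calculation underlying LIDAR is time of flight: the range to a reflecting surface is half the round-trip travel time of the pulse multiplied by the speed of light. A minimal sketch of this relation (the function name and example numbers are illustrative, not from the original text):

```python
# Speed of light in meters per second
C = 299_792_458.0

def lidar_distance(round_trip_time_s: float) -> float:
    """Range to a target from the round-trip time of a laser pulse.

    The pulse travels out to the target and back, so the one-way
    distance is half the total path length covered at light speed.
    """
    return C * round_trip_time_s / 2.0

d = lidar_distance(1e-6)  # a 1 microsecond echo corresponds to roughly 150 m
```

Rotating the mirror and repeating this measurement at many azimuth and elevation angles is what lets the sensor assemble the 3D point map described above.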
LIDAR uses infrared (IR), visible, or ultraviolet (UV) waves of the electromagnetic spectrum for different applications and comes in various 1D, 2D, and 3D configurations. A high-end LIDAR sensor can measure multiple distances per laser pulse, which helps it see through dust and rain, mostly transparent surfaces such as glass windows, and porous objects such as wire fences.

Active safety and driver assistance systems that intervene to avoid and/or mitigate an emergency situation and then immediately disengage are also not included in the various levels of automation. The latest generation of Driver Assist Systems (DAS), also called Advanced Driver Assist Systems (ADAS), defines and controls trajectories beyond the current request of the driver, i.e. overriding driver commands to avoid a collision. Some current ADAS examples include traffic jam assist, collision avoidance assist, and pedestrian and oncoming traffic detection systems.

The short-term goal is to automate driving in select, well-defined situations, as implemented by some technology companies, for example the testing of self-driving taxi cabs in well-defined suburbs. Despite the heavy investment in technology development, it will still take a few decades before an entirely autonomous self-driving vehicle navigates itself through national highways and congested urban cities [1]. The technologies developed in the automotive industry also have direct applications in construction, mining, and agricultural equipment, seaborne shipping vessels, and unmanned aerial vehicles (UAVs).

The embedded control software development (which includes data fusion from sensors, inter-vehicular communication, and real-time cloud computing support) is the key technical challenge. Data fusion strategies that combine real-time information from these multiple sensors are an important part of the embedded control software. Actuation technologies, i.e.
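As one simple illustration of what such a fusion strategy does (a textbook inverse-variance weighting, not a method claimed by this text), independent range estimates from two sensors, say RADAR and LIDAR, can be combined so that the fused estimate has lower variance than either sensor alone:

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of (value, variance) pairs.

    Each measurement is assumed independent and unbiased; the fused
    value weights each reading by its precision (1/variance), and the
    fused variance is the reciprocal of the summed precisions.
    """
    total_precision = sum(1.0 / var for _, var in measurements)
    value = sum(v / var for v, var in measurements) / total_precision
    return value, 1.0 / total_precision

# Hypothetical readings: RADAR reports 25.0 m (variance 0.4),
# LIDAR reports 24.6 m (variance 0.1).
dist, var = fuse([(25.0, 0.4), (24.6, 0.1)])
```

The fused estimate lands closer to the more precise LIDAR reading, and its variance (0.08) is smaller than either input's; production systems layer timing alignment, outlier rejection, and tracking filters on top of this basic idea.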
computer-controlled steering, throttle, transmission, and braking, are mature and do not present any R&D challenges. The echoes a RADAR senses vary in frequency depending on the speed of the reflecting object.
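This frequency dependence is the Doppler effect: for a monostatic radar the echo from a target closing at radial speed v is shifted by f_d = 2·v·f0/c, so the speed can be recovered from the measured shift. A small sketch (the 77 GHz carrier and the shift value are illustrative figures for an automotive radar, not taken from the original text):

```python
C = 299_792_458.0  # speed of light, m/s

def radial_speed_from_doppler(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial speed of a reflecting target from the measured Doppler shift.

    For a monostatic radar the two-way path doubles the shift:
    f_d = 2 * v * f0 / c, hence v = f_d * c / (2 * f0).
    Positive values indicate a closing target.
    """
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A 77 GHz automotive radar observing a 5.134 kHz shift
# corresponds to a closing speed of roughly 10 m/s.
v = radial_speed_from_doppler(5134.0, 77e9)
```

This is why RADAR delivers relative speed directly per echo, whereas LIDAR and vision must differentiate successive position measurements to estimate it.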