



Autonomous vehicle technology has been a major research and development topic in the automotive industry during the last decade. The technologies developed in the automotive industry also have direct applications in construction, mining, agricultural equipment, seaborne shipping vessels, and unmanned aerial vehicles (UAVs). Significant R&D activities in this area date back three decades. Despite heavy investment in technology development, it will still take a few decades before an entirely autonomous self-driving vehicle can navigate itself through national highways and congested urban cities [1]. People have been trying to build self-driving cars since the invention of the automobile.
The past, present, and potential future of driver assistance systems are reviewed by Bengler et al. [8] and summarized in Figure 1. Early driver assistance systems were based on sensors that measure the internal status of the vehicle. These sensors enable the control of vehicle dynamics so that the trajectory requested by the driver is followed in the best way possible. In 1995, additional dynamic driving control systems such as electronic stability control (ESC) were introduced. The second generation of driver assistance systems was introduced around the 1990s, based on sensors that measure the external state of the vehicle, with the focus on providing information and warnings to the driver.
Figure 1: Evolution of Advanced Driver Assist Systems
The following sensors, which measure conditions outside the vehicle and the vehicle's position relative to its environment, are essential in driver assist systems and autonomous vehicle technologies: vision, LIDAR, RADAR, ultrasonic range, GPS, and inter-vehicle communication. The latest generation of Driver Assist Systems (DAS), also called Advanced Driver Assist Systems (ADAS), defines and controls trajectories beyond the current request of the driver, i.e., overriding driver commands to avoid a collision. All these sensors have overlapping and complementary capabilities. Data fusion strategies that combine real-time information from these multiple sensors are an important part of the embedded control software. Actuation technologies, i.e., computer-controlled steering, throttle, transmission, and braking, are mature and do not present any R&D challenges. Development of the embedded control software (which includes data fusion from sensors, inter-vehicle communication, and real-time cloud computing support) is the key technical challenge.
On-road vehicle automation requires a standard set of regulations and terminology, with a taxonomy and definitions. Some regulations and standards have been released [9]. The new standard J3016 from SAE International simplifies communication and facilitates collaboration within technical and policy domains. According to the standard, as shown in Figure 2, the levels of driving automation can be divided into Conditional, High, and Full Automation. The standard does not provide complete definitions applicable to the lower levels of automation (No Automation, Assisted, or Partial Automation). Active safety and driver assistance systems that intervene to avoid and/or mitigate an emergency situation and then immediately disengage are also not included in the levels of automation.
The short-term goal is to automate driving in select well-defined situations, as implemented by some technology companies, for example, the testing of self-driving taxi cabs in well-defined suburbs. The long-term goal is to achieve door-to-door automated driving in any situation. Some current ADAS examples include traffic jam assist, collision avoidance assist, and pedestrian and oncoming traffic detection systems.
2. Sensors: Principles & Limitations
Different sensors and systems are used for navigation and control of the autonomous vehicle, as shown in Figure 3. The working principles and usage of prominent sensors such as LIDAR, GPS, radar, vision, ultrasonic, and inertial measurement units are discussed in the following sections.
2.1. LIDAR
LIDAR transmits light pulses from a rotating mirror; part of that light
will reflect back to the sensor to detect any non-absorbing object or surface (Figure 4).
The target distance is calculated as the speed of light times the measured round-trip time of the pulse, divided by two, at multiple angles. It scans the field of view in 3D with a finite spatial resolution and
frequency. Using the calculated distance, the sensor constructs a 3D map of the world
including the objects around it. LIDAR uses infrared (IR), visible or ultraviolet (UV)
waves of the electromagnetic spectrum for different applications and comes in various
1D, 2D, and 3D configurations. They are the second most valued sensors, after vision, for detecting objects such as vehicles, pedestrians, and obstacles, as shown in a typical application to self-driving cars (Figure 4). Their performance is poor in rain, snow, dust, and foggy environments. A high-end LIDAR sensor can measure multiple distances per laser pulse, which helps it see through dust, rain, mostly transparent surfaces such as glass windows, and porous objects like wire fences. To improve the signal-to-noise ratio, a higher-power laser is desired, but in order to prevent damage to the human eye, a laser wavelength of 905 nm is used to achieve the desired range with a low duty cycle [10]. The current cost of LIDAR sensors is relatively high [11, 12], and there are some issues with the long-term reliability of their mechanical scanning mechanisms. They have been used heavily in research applications, but were not widely used in automotive OEM safety systems until recently.
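To make the time-of-flight geometry above concrete, the sketch below converts simulated LIDAR returns (round-trip time plus beam angles) into Cartesian points of a 3D map. It is only an illustrative sketch; the function names and the flat list-of-returns layout are assumptions, not any vendor's data format.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def return_to_point(round_trip_time_s, azimuth_rad, elevation_rad):
    """Convert one LIDAR return into a Cartesian (x, y, z) point.

    Range follows the time-of-flight relation: r = c * t / 2.
    """
    r = C * round_trip_time_s / 2.0
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)

def build_point_cloud(returns):
    """returns: iterable of (round_trip_time_s, azimuth_rad, elevation_rad) tuples."""
    return [return_to_point(t, az, el) for (t, az, el) in returns]

# Example: a single return 0.2 microseconds after the pulse, straight ahead
print(build_point_cloud([(0.2e-6, 0.0, 0.0)]))  # roughly [(30.0, 0.0, 0.0)] meters
```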
2.2. RADAR
Radar transmits electromagnetic pulses and senses the echoes to detect and track
objects. The frequency of the sensed echoes varies with the speed of the object (the Doppler effect). Radar can measure the relative distance, velocity, and orientation of the object [13, 14]. In the case of monostatic radars, in which the transmitter and receiver are located at the same place, the range of the target is measured as the round-trip travel time of a pulse times the speed of light, divided by two. They are typically available for short (≈ 30 m), medium (≈ 60 m), and long-range (≈ 200 m) distances, and range from 3 MHz (very long range) to 100+ GHz (short range) frequencies (Figure 4). Radar requires fewer computing resources than vision or LIDAR. Typical applications include lane keeping, adaptive cruise control, and object detection [15].
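As a minimal numeric sketch of the monostatic range equation above and the Doppler relation it relies on, the snippet below assumes an idealized pulse radar; the 77 GHz carrier is only an example value, not something specified in the text.

```python
C = 299_792_458.0  # speed of light, m/s

def monostatic_range(round_trip_time_s):
    """Range = round-trip time * c / 2 (transmitter and receiver co-located)."""
    return round_trip_time_s * C / 2.0

def doppler_relative_speed(transmit_freq_hz, echo_freq_hz):
    """Approximate relative radial speed from the Doppler shift of the echo.

    For a monostatic radar the two-way shift is f_d = 2 * v * f_t / c,
    so v = f_d * c / (2 * f_t). A positive result means a closing target.
    """
    doppler_shift = echo_freq_hz - transmit_freq_hz
    return doppler_shift * C / (2.0 * transmit_freq_hz)

# Example: a 1 microsecond round trip is about 150 m of range,
# and a 10 kHz shift on a 77 GHz carrier is about 19.5 m/s of closing speed.
print(monostatic_range(1.0e-6))
print(doppler_relative_speed(77e9, 77e9 + 10_000))
```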
2.3. GPS
GPS is used to locate the position of a GPS receiver (x, y, z coordinates) using four or more GPS satellites. These satellites follow precisely known orbits relative to the earth and broadcast reference signals. A GPS receiver on earth computes its actual location to within a few meters of accuracy. However, the so-called "differential GPS" can pinpoint a location within centimeter accuracy, which is necessary for the navigation of autonomous vehicles. GPS consists of three segments: space, control, and user. The space segment includes the satellites, the control segment manages and controls them, and the user segment is related to the development of user equipment for both military and civil purposes [16]. GPS-based navigation applications are widely used to estimate the vehicle's location with respect to a map, which is known as localization. However, GPS-based navigation can be inaccurate and can lead to ghosting phenomena.
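As a rough sketch of how a position is obtained from four or more satellites, the snippet below solves the standard single-point pseudorange equations with a few Gauss-Newton iterations. It assumes satellite positions are known in a common Cartesian frame (e.g., ECEF) and ignores atmospheric and other error sources that differential GPS corrects; it is an illustration, not the text's algorithm or a production solver.

```python
import numpy as np

def solve_gps_fix(sat_positions, pseudoranges, iterations=10):
    """Estimate receiver position (x, y, z) and clock bias from >= 4 pseudoranges.

    sat_positions: (N, 3) array of satellite positions in meters.
    pseudoranges:  (N,) array of measured ranges in meters (include clock error).
    """
    sats = np.asarray(sat_positions, dtype=float)
    rho = np.asarray(pseudoranges, dtype=float)
    x = np.zeros(4)  # [x, y, z, clock_bias_m], starting at the Earth's center
    for _ in range(iterations):
        diff = x[:3] - sats                    # (N, 3) receiver-to-satellite vectors
        dist = np.linalg.norm(diff, axis=1)    # geometric ranges
        residual = rho - (dist + x[3])         # measured minus predicted
        J = np.hstack([diff / dist[:, None], np.ones((len(sats), 1))])
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x += dx
    return x[:3], x[3]

# Usage (hypothetical data): position, clock_bias = solve_gps_fix(sat_xyz, ranges)
```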
2.4. Vision
A vision system is composed of a camera and an image processing unit (Figure 5). A typical camera is a combination of a focusing lens and an array of photo-detectors, one for each pixel in the field of view (FOV). The array of photo-detectors sends pixel information to the image processing unit. This unit processes the information based on certain algorithms to detect the desired objects. Vision sensors capture more visual information and hence track the surrounding environment more effectively than other sensors. They are categorized into mono and stereo types. Mono camera systems are often used for lane marking and lane edge detection, basic object detection, road sign detection, and localization. Multiple or stereo camera systems provide depth for object detection. Their primary advantages are their low cost, off-the-shelf components, and software-based implementation. Their primary disadvantage is handling the full range of ambient and uncontrolled conditions, such as lighting, shadowing, reflections, weather, dust, and smoke. Also, processing data from these sensors in real time requires a large amount of computational resources. Even with these limitations, they are extensively used in autonomous vehicles.
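Since the text notes that stereo camera systems provide depth, here is a minimal sketch of the standard rectified-stereo disparity relation used to recover it; the numeric values are examples only, not parameters from the text.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Depth of a point seen by a rectified stereo pair: Z = f * B / d.

    focal_length_px: focal length expressed in pixels,
    baseline_m:      distance between the two camera centers in meters,
    disparity_px:    horizontal pixel shift of the same point between images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_length_px * baseline_m / disparity_px

# Example: 700-pixel focal length, 12 cm baseline, 20-pixel disparity -> 4.2 m
print(depth_from_disparity(700.0, 0.12, 20.0))
```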
2.5. IMU
A moving vehicle can experience linear and rotational motions along the x, y, and z axes (lateral, longitudinal, and vertical), which can be measured by an inertial measurement unit (IMU) as linear accelerations and angular rates. This information is used to improve GPS measurements. An IMU includes accelerometers and gyroscopes.
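As a toy illustration of how IMU readings are integrated, for example to bridge gaps between GPS fixes, the sketch below dead-reckons a planar trajectory from longitudinal acceleration and yaw rate. It deliberately ignores sensor bias, noise, and 3D motion, which real systems must estimate; names and values are assumptions for illustration.

```python
import math

def dead_reckon(samples, dt, x=0.0, y=0.0, heading=0.0, speed=0.0):
    """Integrate (longitudinal_accel_mps2, yaw_rate_radps) IMU samples.

    Returns the final planar pose (x, y, heading) and speed.
    """
    for accel, yaw_rate in samples:
        speed += accel * dt          # integrate acceleration to speed
        heading += yaw_rate * dt     # integrate yaw rate to heading
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y, heading, speed

# Example: accelerate at 1 m/s^2 while turning gently for 5 s (100 Hz samples)
print(dead_reckon([(1.0, 0.05)] * 500, dt=0.01))
```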
2.6. Ultrasonic Range
Ultrasonic sensors are short-range sensors (typically ≈ 2 m) which send an ultrasonic pulse wave and detect echoes returned from obstacles using a transmitter-receiver pair. They are mainly used in relatively low-speed ADAS modules, such as parking space detection and assistance [17] and obstacle detection during congested traffic conditions [18]. They detect obstacles reliably under weather conditions such as rain, snow, and wind. However, they have a range limitation which restricts their use as general OEM vehicle sensors.
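The same echo-timing idea used by radar and LIDAR applies here, but with the speed of sound, which is what keeps the practical range short. The sketch below is illustrative; the timeout handling and the 2 m default are assumptions tied to the typical range quoted above.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def ultrasonic_range(echo_time_s, max_range_m=2.0):
    """Distance to an obstacle from an ultrasonic echo: d = v_sound * t / 2.

    Returns None when no echo arrives within the time window corresponding
    to max_range_m (the short-range limit of a typical sensor).
    """
    if echo_time_s is None or echo_time_s > 2.0 * max_range_m / SPEED_OF_SOUND:
        return None
    return SPEED_OF_SOUND * echo_time_s / 2.0

# Example: an echo arriving after 6 ms corresponds to roughly 1.03 m
print(ultrasonic_range(0.006))
```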
3. Sensor Fusion
In sensor fusion techniques, raw sensor data is received first. After receiving the data, feature extraction, clustering, and object detection hypotheses are conducted. These hypotheses are then associated with tracks representing the state estimates of the detected objects. The associated information is used by state estimators, such as Bayesian filters, to predict the current states [19]. The order in which sensor measurements become available can differ from the order in which the raw data was acquired by the sensors. Buffering the information until all the data is available results in an unneeded dead time, which can degrade the performance of the control system. To avoid performance degradation, pseudo-measurements that are aligned in time, or asynchronous tracking systems that employ every measurement upon availability, can be used.
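To illustrate the Bayesian-filter and asynchronous-measurement ideas above, here is a minimal one-dimensional constant-velocity Kalman filter that applies each position fix at its own timestamp instead of waiting for a synchronized batch. The noise values are arbitrary placeholders, and a real tracker would be multi-dimensional and handle out-of-sequence arrivals explicitly; this is a sketch, not the method of reference [19].

```python
import numpy as np

def fuse_position_measurements(measurements, q=0.5, r=1.0):
    """Minimal 1-D constant-velocity Kalman filter over timestamped measurements.

    measurements: iterable of (timestamp_s, position_m), possibly from
    different sensors; they are applied one at a time in timestamp order.
    q is the process-noise intensity, r the measurement-noise variance.
    """
    x = np.zeros(2)            # state: [position, velocity]
    P = np.eye(2) * 100.0      # large initial uncertainty
    H = np.array([[1.0, 0.0]])  # we observe position only
    last_t = None
    for t, z in sorted(measurements):
        dt = 0.0 if last_t is None else t - last_t
        last_t = t
        # Predict forward by the actual time gap between measurements.
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = q * np.array([[dt**3 / 3.0, dt**2 / 2.0],
                          [dt**2 / 2.0, dt]])
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with this measurement as soon as it is available.
        y = z - (H @ x)[0]
        S = (H @ P @ H.T)[0, 0] + r
        K = (P @ H.T)[:, 0] / S
        x = x + K * y
        P = (np.eye(2) - np.outer(K, H[0])) @ P
        yield t, x.copy()

# Example: position fixes from two sensors arriving at uneven times
obs = [(0.00, 0.0), (0.08, 0.9), (0.05, 0.5), (0.12, 1.3)]
for t, state in fuse_position_measurements(obs):
    print(t, state)
```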



