
Search Results

Technical Paper

C-V2X LiDAR-Based Non-Line of Sight Object Detection and Localization for Valet Parking Applications

2024-04-09
2024-01-2040
Cellular Vehicle-to-Everything (C-V2X) is considered an enabler for fully automated driving. It can provide information about traffic situations and road users ahead of time, unlike onboard sensors, which are limited to line-of-sight detections. This work investigates the effectiveness of utilizing C-V2X technology for a valet parking collision mitigation feature. For this study, a LiDAR and a C-V2X roadside unit were mounted at a hidden intersection in the FEV North America parking lot. The roadside unit processed the LiDAR point cloud and transmitted information on the detected objects to an onboard C-V2X unit. The received data was provided as input to the path planning and control algorithms so that the onboard controller could make the right decision while approaching the hidden intersection. FEV’s Smart Vehicle Demonstrator was utilized to test the C-V2X setup and the developed algorithms.
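As an illustration only (not taken from the paper), the kind of per-object payload a roadside unit might relay, plus a toy time-to-intersection conflict check on the receiving vehicle, could be sketched as follows; all field names, values, and the 2 s conflict window are assumptions:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """Hypothetical payload an RSU could relay for one LiDAR-detected object."""
    obj_id: int
    x_m: float          # position in a shared map frame
    y_m: float
    speed_mps: float
    heading_rad: float

def should_yield(ego_dist_m, obj: DetectedObject, obj_dist_m, ego_speed_mps):
    """Toy NLOS check: yield if the hidden object and the ego vehicle would
    reach the intersection within a 2 s window of each other (assumed)."""
    ego_eta = ego_dist_m / max(ego_speed_mps, 0.1)
    obj_eta = obj_dist_m / max(obj.speed_mps, 0.1)
    return abs(ego_eta - obj_eta) < 2.0

# A pedestrian hidden behind the corner, 3 m from the intersection
ped = DetectedObject(obj_id=1, x_m=12.0, y_m=3.0, speed_mps=1.5, heading_rad=0.0)
print(should_yield(ego_dist_m=10.0, obj=ped, obj_dist_m=3.0, ego_speed_mps=5.0))
```

A real deployment would use a standardized message set and a full prediction model; this only shows the shape of the data flow from RSU to onboard controller.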
Technical Paper

Drivable Area Estimation for Autonomous Agriculture Applications

2023-04-11
2023-01-0054
Autonomous farming has gained vast interest due to the need for increased farming efficiency and productivity as well as reduced operating cost. Technological advancement has enabled the development of Autonomous Driving (AD) features in unstructured environments such as farms. This paper discusses an approach that utilizes satellite images to estimate the drivable areas of agricultural fields, with the aid of LiDAR sensor data, to provide the information the vehicle needs to navigate autonomously. The images are used to detect the field boundaries, while the LiDAR sensor detects the obstacles the vehicle encounters during autonomous driving as well as their types. These detections are fused with the information from the satellite images to help the path planning and control algorithms make safe maneuvers. The image and point cloud processing algorithms were developed in MATLAB®/C++ and implemented within the Robot Operating System (ROS) middleware.
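A minimal sketch of the fusion idea described above, under assumptions not taken from the paper: the satellite image yields a field-boundary polygon, LiDAR yields obstacle positions, and a point is drivable when it lies inside the boundary and clear of obstacles. The polygon test, clearance value, and coordinates are all illustrative:

```python
import math

def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test for a simple polygon."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def is_drivable(pt, field_boundary, obstacles, clearance=1.0):
    """Drivable if inside the satellite-derived boundary and at least
    `clearance` metres from every LiDAR-detected obstacle."""
    if not point_in_polygon(pt, field_boundary):
        return False
    return all(math.dist(pt, ob) > clearance for ob in obstacles)

field = [(0, 0), (100, 0), (100, 50), (0, 50)]   # boundary from satellite image
rocks = [(40, 25)]                                # obstacle from LiDAR
print(is_drivable((10, 10), field, rocks))        # clear interior point
print(is_drivable((40, 25.5), field, rocks))      # too close to the rock
```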
Technical Paper

Operational Design Domain Feature Optimization Route Planning Tool for Automated Vehicle Open Road Testing

2023-04-11
2023-01-0686
Autonomous vehicles must be able to function safely in complex contexts involving unpredictable situations and interactions. To ensure this, the system must be tested at various stages as described by the V-model. This process iteratively tests and validates distinct parts of the system, starting with small components and progressing to system-level assessment. However, this framework presents challenges when adapted to the testing problems that autonomous vehicles face. Open road testing is an effective way to expose the system to real-world scenarios in combination with the specific driving situations described by the Operational Design Domain (ODD). Finding a path between two points that maximizes ODD exposure is not trivial, not to mention that in most cases developers must design routes in unfamiliar regions. This represents a significant effort and consumption of resources, which makes it important to optimize this task.
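The route-selection problem above can be illustrated with a toy example, not drawn from the paper: edges of a small road graph are tagged with the ODD features they exercise, and a brute-force search over simple paths picks the route covering the most distinct features. Graph, feature labels, and the exhaustive search are all assumptions for illustration; a practical tool would need a scalable optimizer:

```python
def best_odd_route(graph, features, start, goal):
    """Brute-force DFS over simple paths, scoring each route by how many
    distinct ODD features (e.g. roundabout, merge) its edges cover."""
    best = (set(), [])
    def dfs(node, path, covered):
        nonlocal best
        if node == goal:
            if len(covered) > len(best[0]):
                best = (covered, path[:])
            return
        for nxt in graph.get(node, []):
            if nxt not in path:   # keep the path simple (no revisits)
                dfs(nxt, path + [nxt], covered | features.get((node, nxt), set()))
    dfs(start, [start], set())
    return best

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
features = {("A", "B"): {"roundabout"}, ("B", "D"): {"merge"},
            ("A", "C"): {"merge"}}
covered, route = best_odd_route(graph, features, "A", "D")
print(sorted(covered), route)   # the A-B-D route covers both features
```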
Technical Paper

LiDAR-Based Fail-Safe Emergency Maneuver for Autonomous Vehicles

2023-04-11
2023-01-0578
Although SAE Level 5 autonomous vehicles are not yet commercially available, they will need to be the most intelligent, secure, and safe autonomous vehicles, with the highest level of automation. The vehicle will be able to drive itself in all lighting and weather conditions, at all times of the day, on all types of roads, and in any traffic scenario. Human intervention in Level 5 vehicles will be limited to passenger voice commands, which means Level 5 autonomous vehicles need to be safe and fail-operational, recovering with no intervention from the driver to guarantee maximum safety for the passengers. In this paper, a LiDAR-based fail-safe emergency maneuver system is proposed for implementation in Level 5 autonomous vehicles.
Technical Paper

Higher Accuracy and Lower Computational Perception Environment Based Upon a Real-time Dynamic Region of Interest

2022-03-29
2022-01-0078
Robust sensor fusion is a key technology for enabling the safe operation of automated vehicles. Sensor fusion typically takes inputs from cameras, radars, LiDAR, inertial measurement units, and global navigation satellite systems, processes them, and then outputs object detection or positioning data. This paper focuses on sensor fusion between the camera, radar, and vehicle wheel speed sensors, which is a critical need for near-term realization of sensor fusion benefits. The camera is an off-the-shelf computer vision product from MobilEye and the radar is a Delphi/Aptive electronically scanning radar (ESR), both of which are connected to a drive-by-wire capable vehicle platform. We utilize the MobilEye and wheel speed sensors to create a dynamic region of interest (DROI) of the drivable region that changes as the vehicle moves through the environment.
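One plausible way a speed-dependent region of interest could be sized, offered purely as an illustration and not as the paper's method: the forward extent grows with reaction and braking distance from the wheel-speed signal, while the lateral extent follows the camera-detected lane width. The formula and parameter values are assumptions:

```python
def dynamic_roi(speed_mps, lane_width_m, reaction_time_s=1.5, decel_mps2=4.0):
    """Size a forward region of interest from wheel speed and lane geometry.
    Length ~ reaction distance + braking distance; width ~ detected lane."""
    length = speed_mps * reaction_time_s + speed_mps ** 2 / (2 * decel_mps2)
    return {"length_m": round(length, 1), "width_m": lane_width_m}

# At 20 m/s in a 3.7 m lane the ROI reaches 80 m ahead under these assumptions
roi = dynamic_roi(speed_mps=20.0, lane_width_m=3.7)
print(roi)
```

Restricting detection and fusion to such a region is what reduces the computational load while keeping the relevant drivable area covered.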
Technical Paper

HD-Map Based Ground Truth to Test Automated Vehicles

2022-03-29
2022-01-0097
Over the past decade there has been significant development in Automated Driving (AD), with continuous evolution towards higher levels of automation. Higher levels of autonomy increase the vehicle's Dynamic Driving Task (DDT) responsibility, from certain predefined Operational Design Domains (ODD, SAE Levels 3 and 4) to an unlimited ODD (SAE Level 5). The AD system should not only be sophisticated enough to operate under any given condition but also be reliable and safe. Hence, Automated Vehicles (AV) need to undergo extensive open road testing that traverses a wide variety of roadway features and challenging real-world scenarios. Accurate Ground Truth (GT) is needed to locate the various roadway features, which helps in evaluating the perception performance of the AV at any given condition. The results from open road testing provide a feedback loop for achieving a mature AD system.
Technical Paper

Interactive Lane Change with Adaptive Vehicle Speed

2021-04-06
2021-01-0094
Advanced Driver Assistance Systems (ADAS) have gained enormous interest in the past decade, with growing complexity in system software and hardware. One of the most challenging ADAS features to develop is lane change, as it requires full awareness of the objects surrounding the Ego vehicle as well as the ability to perform safe and convenient maneuvers. This paper discusses a camera-based lane change approach that is designed to improve the driver's safety and comfort with the help of LiDAR object detection. The forward-facing camera is capable of detecting the Ego and adjacent lane lines as well as the moving objects in the camera's field of view. A Graphical User Interface (GUI) was also developed that lets the driver interact with the lane change feature by visualizing the sensor data and optionally requesting a lane change when the system suggests that it is safe to do so.
Technical Paper

LiDAR-Based Urban Autonomous Platooning Simulation

2020-04-14
2020-01-0717
The technological advancements of Advanced Driver Assistance Systems (ADAS) sensors make it possible to achieve autonomous vehicle platooning, increase the capacity of road lanes, and reduce traffic. This article focuses on developing urban autonomous platooning using LiDAR and GPS/IMU sensors in a simulation environment. Gazebo is utilized to simulate the sensors, vehicles, and testing environment. Two vehicles are used in this study: a Lead vehicle that follows a preplanned trajectory, while the other vehicle (the Follower) uses LiDAR object detection and tracking information to mimic the Lead vehicle. The LiDAR object detection is handled in multiple stages: point cloud frame transformation, filtering and down-sampling, ground segmentation, and clustering. The tracking algorithm uses the clustering information to provide the position and velocity of the Lead vehicle, which allows for vehicle platooning.
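The last two detection stages named above, ground segmentation and clustering, can be sketched in simplified form. This is an illustration under stated assumptions, not the article's implementation: ground removal here is a plain height threshold (real pipelines typically fit a ground plane, e.g. with RANSAC), and clustering is a greedy Euclidean grouping:

```python
import math

def remove_ground(points, z_thresh=0.2):
    """Crude ground segmentation: drop points below a height threshold."""
    return [p for p in points if p[2] > z_thresh]

def euclidean_cluster(points, radius=1.0):
    """Greedy BFS clustering: points within `radius` of each other
    end up in the same cluster."""
    unvisited = list(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= radius]
            for j in near:
                unvisited.remove(j)
            cluster.extend(near)
            frontier.extend(near)
        clusters.append([points[i] for i in cluster])
    return clusters

# One ground return plus two objects: a two-point cluster and a lone point
cloud = [(0, 0, 0.05), (5, 0, 0.5), (5.2, 0, 0.6), (20, 0, 0.5)]
objects = euclidean_cluster(remove_ground(cloud))
print(len(objects))  # 2
```

The Follower's tracker would then take the centroid of the Lead vehicle's cluster over successive frames to estimate its position and velocity.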
Technical Paper

LiDAR-Based Predictive Cruise Control

2020-04-14
2020-01-0080
Advanced Driver Assistance Systems (ADAS) enable safer driving by relying on inputs from various sensors, including Radar, Camera, and LiDAR. One of the newly emerging ADAS features is Predictive Cruise Control (PCC), which aims to optimize the vehicle's speed profile and fuel efficiency. This paper presents a novel approach to developing a PCC feature using the point cloud of a LiDAR sensor. The raw point cloud is utilized to detect objects in the vehicle's surrounding environment, estimate the grade of the road, and plan the route in drivable areas. This information is critical for the PCC to define the optimal speed profile of the vehicle while following the planned path. The paper also discusses the developed LiDAR data processing algorithms and the PCC controller. These algorithms were tested on FEV's Smart Vehicle Demonstrator platform.
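To make the grade-to-speed-profile idea concrete, here is a deliberately simple rule, not the paper's controller: carry extra speed into an upcoming climb and shed speed before a descent, clamped to a comfort band. The gain and limits are assumed values:

```python
def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def pcc_speed_profile(upcoming_grades_pct, v_set=25.0, gain=0.4,
                      v_min=15.0, v_max=30.0):
    """Toy predictive cruise rule: target speed rises ahead of an uphill
    and drops ahead of a downhill (speeds in m/s, grades in percent)."""
    return [clamp(v_set + gain * g, v_min, v_max) for g in upcoming_grades_pct]

# Grade preview from the LiDAR-estimated road profile: flat, 5% climb, 5% descent
profile = pcc_speed_profile([0.0, 5.0, -5.0])
print(profile)  # [25.0, 27.0, 23.0]
```

An actual PCC would solve an optimization over the horizon (fuel model, powertrain limits, traffic) rather than apply a per-point linear rule.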
Technical Paper

Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment

2020-04-14
2020-01-1029
Autonomous driving in unstructured environments is a significant challenge due to the inconsistency of information important for localization, such as lane markings. To reduce the uncertainty of vehicle localization in such environments, sensor fusion of LiDAR, Radar, Camera, GPS/IMU, and Odometry sensors is utilized. This paper discusses a hybrid localization technique developed using LiDAR-based Simultaneous Localization and Mapping (SLAM), GPS/IMU and Odometry data, and object lists from Radar, LiDAR, and Camera sensors. An Extended Kalman Filter (EKF) is utilized to fuse the data from all sensors in two phases. In the preliminary stage, the SLAM-based vehicle coordinates are fused with the GPS-based positioning. The output of this stage is then fused with the object-based localization. This approach was successfully tested on FEV's Smart Vehicle Demonstrator at FEV's headquarters, which represented a complicated test environment with dynamic and static objects.
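The two-phase fusion can be illustrated with a scalar Kalman measurement update, reduced to one dimension for clarity; the paper's EKF operates on a full vehicle state, and every number below is an assumed example:

```python
def kf_update(mean, var, meas, meas_var):
    """Scalar Kalman measurement update: fuse a prior with one measurement."""
    k = var / (var + meas_var)              # Kalman gain
    return mean + k * (meas - mean), (1 - k) * var

# Stage 1: fuse the SLAM-based position with the GPS-based positioning
x, p = kf_update(mean=10.0, var=4.0, meas=10.8, meas_var=4.0)
# Stage 2: fuse that result with a (more precise) object-based fix
x, p = kf_update(x, p, meas=10.2, meas_var=1.0)
print(round(x, 2), round(p, 2))  # 10.27 0.67
```

Each update pulls the estimate toward the more trusted source and shrinks the variance, which is why chaining the object-based stage after the GPS stage tightens the localization.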
Technical Paper

Autonomous Driving Development Rapid Prototyping Using ROS and Simulink

2019-04-02
2019-01-0695
Recent years have witnessed increasing interest in Advanced Driver Assistance Systems (ADAS) and Autonomous Driving (AD) development, motivating the growth of new sensor technologies and control platforms. However, to keep pace with this acceleration and to evaluate system performance, a cost- and time-effective software development and testing framework is required. This paper presents an overview of utilizing the Robot Operating System (ROS) middleware and the MATLAB/Simulink® Robotics System Toolbox to achieve these goals. As an example of employing this framework for autonomous development and testing, this article utilizes the FEV Smart Vehicle Demonstrator. The demonstrator is a reconfigurable and modular platform highlighting the power and flexibility of using ROS and MATLAB/Simulink® for AD rapid prototyping. High-level autonomous path following and braking are presented as two case studies.
Technical Paper

Cost Effective Automotive Platform for ADAS and Autonomous Development

2018-04-03
2018-01-0588
This paper presents a cost-effective development platform, named FEV-Driver, for Advanced Driver Assistance Systems (ADAS) and Autonomous Driving (AD). The FEV-Driver platform is an electric go-kart that was converted into an x-by-wire vehicle and represents the behavior of a full-scale electric vehicle. FEV-Driver has the advantage of being a small-scale vehicle that can be used with significantly lower safety risk compared to full-sized vehicles. The ADAS/AD algorithms for this platform were developed in Simulink and C++ and implemented within the Robot Operating System (ROS) middleware. Besides the description of the platform, Lane Keep Assist (LKA) and Automatic Emergency Braking (AEB) algorithms are discussed, followed by a path planning algorithm that enables the vehicle to drive autonomously after a manually controlled training lap. The modular system architecture allows for complete controller exchange or adaptation to different vehicles.
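One common way a "drive the training lap, then replay it" planner picks its steering target, shown purely as a hedged sketch (the paper does not specify this method): the waypoints logged during the manual lap are searched for the first point at least one lookahead distance ahead of the current pose, pure-pursuit style. Waypoints, pose, and lookahead are invented example values:

```python
import math

def next_target(waypoints, pose, lookahead=2.0):
    """Pick the recorded waypoint roughly one lookahead distance ahead of
    the current pose, starting from the nearest recorded point."""
    dists = [math.dist(pose, wp) for wp in waypoints]
    nearest = dists.index(min(dists))
    for i in range(nearest, len(waypoints)):
        if math.dist(pose, waypoints[i]) >= lookahead:
            return waypoints[i]
    return waypoints[-1]          # end of the recorded lap

lap = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)]   # waypoints from the training lap
print(next_target(lap, pose=(0.5, 0.0)))          # (3, 0)
```

A steering controller would then aim the kart at the returned waypoint each cycle, advancing along the recorded lap.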