The first self-driving vehicle was presented almost 80 years ago. Over the past decades, automated vehicles have been developed further, and many new use cases have been introduced in the form of prototypes. While most of these prototypes consider the user as a passenger inside the vehicle cabin, compelling use cases also exist in which the user is outside the vehicle, engaged in valuable activities such as commercial services. This work defines a new use case of fully automated vehicles in urban scenarios called Assistance Vehicles (ASVs). ASVs work in tandem with the user and handle all aspects of driving while the user is temporarily outside the vehicle, engaged in secondary activities such as logistical, social, or maintenance services. The Assistance Vehicle adjusts and synchronizes its position and speed to the movements of the external user, thereby increasing the safety, comfort, and efficiency of the service.
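The synchronization of position and speed to an external user can be illustrated with a minimal sketch. The model below is an assumption for illustration only (the names `sync_speed`, `simulate`, the lateral offset, gain, and speed cap are all hypothetical, not taken from this work): the vehicle tracks the user's longitudinal progress along the curb with a proportional speed controller.

```python
# Minimal illustrative sketch (assumed model, not the thesis's method):
# the ASV follows the external user's longitudinal position with a
# proportional speed controller, holding a fixed offset behind the user
# and capping its speed at roughly walking pace.

def sync_speed(user_pos: float, vehicle_pos: float,
               offset: float = 2.0, gain: float = 0.8,
               v_max: float = 2.0) -> float:
    """Return a commanded speed (m/s) that closes the gap to the user."""
    error = (user_pos - offset) - vehicle_pos    # desired minus actual position
    v_cmd = gain * error                         # proportional control law
    return max(-v_max, min(v_max, v_cmd))        # clamp to walking speed

def simulate(user_positions, dt: float = 0.1) -> list:
    """Step the vehicle alongside a sequence of sampled user positions."""
    pos, trace = 0.0, []
    for user_pos in user_positions:
        pos += sync_speed(user_pos, pos) * dt
        trace.append(pos)
    return trace
```

With a stationary user at 10 m and a 2 m offset, the simulated vehicle converges to 8 m while never exceeding the speed cap; a real system would of course add obstacle checks and smoother dynamics.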
Existing ASV concepts cannot cope with the dynamics of the service and its environment. The service zones are dynamic, with unpredictable events and spontaneous changes in the service area. The service zone environment is likewise dynamic, consisting of other road users who share the drivable space with the ASV. Their intentions must be understood and a corresponding reaction planned in order to avoid deadlocks and collisions. A new ASV concept is therefore required that can handle these dynamics.
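The kind of intention understanding and deadlock avoidance described above can be sketched with a deliberately crude heuristic. Everything here is an illustrative assumption (the functions `intention` and `reaction`, the bearing threshold, and the priority rule are hypothetical, not the thesis's algorithm): the point is only that a symmetric "both wait" situation needs a tie-breaking rule to avoid a deadlock.

```python
# Hedged sketch (assumed heuristic, not the thesis's method): estimate
# another road user's intention from their relative bearing and speed,
# then pick a reaction. A priority rule breaks the symmetric case where
# both agents would otherwise wait for each other indefinitely.

def intention(rel_bearing_deg: float, speed: float) -> str:
    """Crude intention estimate from relative bearing (deg) and speed (m/s)."""
    if speed < 0.2:
        return "waiting"
    return "crossing" if abs(rel_bearing_deg) < 45 else "passing"

def reaction(intent: str, ego_has_priority: bool) -> str:
    """React to the estimated intention; break symmetric waits by priority."""
    if intent == "crossing":
        return "yield"                 # never drive into a crossing user
    if intent == "waiting":
        # Two mutually waiting agents would deadlock; the priority
        # holder proceeds, the other keeps yielding.
        return "proceed" if ego_has_priority else "yield"
    return "proceed"                   # passing traffic does not conflict
```

A real behavior planner would fuse far richer perception cues, but even this toy version shows why intention estimation and an explicit conflict-resolution rule belong in the concept.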
This work presents a new concept, the Collaborative Assistance Vehicle (CAV), an extended ASV with enhanced interaction capabilities. It proposes a functional architecture centered on an interaction framework, along with algorithmic solutions for the core functional modules: perception, command extraction, and behavior planning. All of these modules are implemented and evaluated.
Conception and Development of an Interaction Framework for a Collaborative Assistance Vehicle
This work presents a new concept of a Collaborative Assistance Vehicle with high interaction capabilities for collaborating with users outside the vehicle. It proposes a functional architecture for level 4 automated driving centered on an interaction framework, along with algorithmic solutions for the core functional modules: perception, command extraction, and behavior planning. All of these modules are implemented and evaluated.