5 - 7 June 2018
Messe Stuttgart, Germany

Preliminary Conference Programme

Day 1: Tuesday 5 June

Keynote Opening Session

How to get from expectations to experience in autonomous driving

Christoffer Kopp
User experience concept leader - autonomous drive
Volvo Car Corporation
The presentation will discuss how we conduct early experience tests of autonomous driving in order to guide the technology development in the right direction. It will focus on which elements are most important for creating trust in the autonomous car. It will also cover the most important aspects to take into consideration to make the user feel a sense of freedom when in autonomous mode.

Autonomous driving non-driving secondary activities: a literature review

Sibashis Parida
R&D unit, seating department
BMW Group
Autonomous driving will not only impact the automotive sector, but will also revolutionise the way people spend time in a personalised vehicle. In fully autonomous driving mode, the vehicle will no longer be used as a means of transport, but will be used as a third living space after home and the workplace. With the advancement of autonomous driving, users will no longer need to concentrate on driving and driving-related activities, but will use their newly gained free time for non-driving secondary activities. The paper aims to identify and evaluate specific activities that users want to carry out during autonomous driving.

Benefits of augmented reality head-up displays for automated driving

Bettina Leuchtenberg
Expert HMI, ergonomics and usability
Continental Automotive GmbH
Dr Thomas Vöhringer-Kuhnt
Head of HMI user experience and design
Continental Automotive GmbH
Augmented reality head-up displays (AR HUD) have been proven to support drivers during manual driving. The presentation shows that drivers also benefit from AR HUD content during autonomous driving situations. Based on an expert assessment, relevant use cases and AR HUD elements have been elaborated. In a driving simulator study with 24 users, an interaction concept with AR HUD content was compared with an interaction concept without AR HUD content. The study shows that most users prefer the AR HUD elements to information presentation in an instrument cluster. Takeover quality benefits from the use of AR elements.

New challenges for HMI in the age of autonomous driving

Rashmi Rao
Senior director, advanced engineering, CoC user experience
The age of the connected car has clearly demonstrated a need to raise the user experience to a new level. To ensure that autonomous driving is not one day equated with monotony, UX design is becoming the decisive brand factor for OEMs. The presentation will outline what a future-orientated HMI design including AR and VR applications has to look like, and explain why the user experience (UX) will be one of the decisive factors in autonomous or semi-autonomous driving enabling a wide variety of interactions between man and machine.

Surf & Curve – HMI ready for SAE 3

Sören Lemcke
Head of advanced human interface solutions
Within Surf & Curve, ZF-TRW has developed user-centred HMI concepts targeting automated driving of SAE Level 3. Together with the strategic partners FKA mbH and the Institute for Automotive Engineering of RWTH Aachen University, revolutionary HMI concepts were not only developed along several usability studies, but also tested and validated in the high-fidelity and static driving simulator. Among other features, Surf & Curve offers a unique way of lateral control with drivesticks, mirror replacement and comfort control ready for automated driving of SAE Level 3.

Panel Discussion - Re-writing the rule book on car interior design

Rashmi Rao
Senior director, advanced engineering, CoC user experience
Christoffer Kopp
User experience concept leader - autonomous drive
Volvo Car Corporation
Sibashis Parida
R&D unit, seating department
BMW Group
Can current thought processes for seating, HMI, UX and interface in conventional cars be consigned to the history books? In this discussion we explore the exciting opportunities that AVs present and how we could be rewriting the rule book on car interior design.

Afternoon Session

The automotive cocoon

Dr Dominique Massonié
Product manager - HMI
Elektrobit Automotive
Impacted by rapidly evolving consumer electronics technology and new automotive business models, the car is now a connected object that is set to become an extension of someone’s home and office. This raises an interesting question: what factors need to be taken into consideration when creating automotive interiors? This presentation will provide insights into emerging challenges for automotive user interface design and then focus on how in-car interfaces can be designed to deliver experiences that are specific to the user; the tools one needs to create these interfaces; and the roles usage-, user- and context-specific information will play.

Will parallel industries and non-traditional OEMs drive the autonomous cars of the future?

John Tighe
Design director transport
JPA Design
Through several years of experience and understanding of customer behaviour within aircraft and many other transport fields, JPA tells the story of Person X's life in 10-15 years: what their needs will be, what will be available to them, what is relevant, and how this will influence the autonomous vehicle interior design of the future. We will discuss the package and seating layout, which is one of the most important factors when designing successfully for autonomous cars. What we've witnessed coming from automotive brands often seems to be blinkered and anchored in the past, which brings the risk of being leapfrogged by those outside the traditional automotive industry.

Real-estate design opportunities of autonomous cars

Richard Seale
Lead automotive designer
The advent of autonomous technologies will have a huge effect on the rule book that designers have to play with, significantly throwing open the functional and aesthetic potential for our future vehicles and travel. This shift will be largely led by both the change in safety regulations, and the switch from a driver-based model to a passenger-based model – bringing unparalleled opportunities to customise our vehicles. The seismic shifts that autonomy will bring will open Pandora’s box, and nowhere will this be more obvious than in the interiors of our future cars. We will explore original concepts in this presentation.

Mobile Livingroom 2.0

Moritz von Grotthuss
Gestigon - a Valeo brand
Most experts expect the car to become the third place of living, alongside our home and our office. Longer commuting distances, more traffic and more time in the car are already creating this reality today. But how can we avoid this development turning into our personal mobile dystopia? How do we create a place to be, and a place where we like to be? HMI, AR, the interior cocoon and new perceptual safety features are core to creating a Mobile Livingroom 2.0. These concepts will be introduced, explained and discussed in this presentation.

Highly accurate reference system for validation of driver monitoring

Heiko Ruth
Head of system department
CMORE Automotive
Due to upcoming functions on HMI interfaces such as augmented reality, 3D displays and autonomous driving at SAE Level 3, driver monitoring is becoming increasingly important. Through C.REF – a reference and ground truth block set for the test and validation of several KPIs – CMORE has gained wide experience with forward-looking ADAS sensors. With the new C.REF Gaze and C.REF Head block sets, two new functions in the reference block sets for interior observation have been developed. These block sets give the development team of a DMS system a new dimension of accuracy throughout the whole development process.

Lounges on wheels?

Dr Cyriel Diels
Academic director, National Transport Design Centre
Coventry University
Shared and automated mobility may make our journeys more pleasurable and productive. Future vehicles are envisioned as 'lounges on wheels', sporting flexible seating arrangements with interiors dotted with screens. This presentation will explore these proposals from a human-centred design perspective and discuss the necessity to respect basic human requirements to realise the potential benefits such shared and automated vehicles may be able to provide.

Day 2: Wednesday 6 June

Morning Session

Understanding occupants' visual behaviour – AI inside autonomous vehicles

Modar Alaoui
This session will cover the latest embedded vision AI technologies that enable visual behaviour understanding for the driver and passengers of autonomous and highly automated vehicles (HAVs). Critical to ensuring safety and comfort for all occupants, the intelligent software uses standard cameras to provide emotion recognition from facial micro-expressions, 30+ face analytics, body pose tracking, action and activity recognition. In the second half of this session, we will cover a number of metrics that trigger the activation of support systems, and others that enhance the overall ridership experience.

A visual aid against motion sickness

Dr Paolo Pretto
Research team leader
Max Planck Institute for Biological Cybernetics
Anecdotal evidence indicates that car passengers develop motion sickness when they experience unexpected and/or sustained discrepancies between physical and visual motion cues (e.g. while reading during the travel). This issue is likely to become more deleterious in autonomous vehicles, where passengers may be facing away from the driving direction and engaged in non-driving activities. We are testing the efficacy of in-car solutions that will constantly display to all passengers additional, non-invasive visual information on the current and imminent vehicle body motion. The goal is to minimise the perceived sensory mismatch and increase passengers’ situational awareness, thus improving safety and comfort.

Autonomous mobility 2050: four potential transformative outcomes for vehicle interiors

Carlo Budtke
Senior consultant
P3 Group
Carlo Budtke will explore the extremes between different approaches in the industry using four examples, each considering potential future market dominance by one of the following: automotive OEMs, shared mobility, tech giants and disruptive startups. Based on the scenarios explored at the Autonomous Vehicle Interior Design & Technology Symposium in 2017, new approaches and expanded concepts, including relevant examples, will be presented. The conclusion drawn is that different markets will see different market shares for these four examples, depending on demographic characteristics. This will strongly impact the user interaction with the vehicle interior and the configurations of each extreme.

User-centred design of autonomous driving systems in trucks

Arnd Engeln
Professor for market and advertising research, traffic and transport psychology
Stuttgart Media University
In the TANGO scientific research project, sponsored by the German BMWi and in cooperation with Bosch, MAN, VW and Stuttgart University, we developed an activity and vigilance management system for truck drivers using a user-centred design process (UCD). We first observed and interviewed truck drivers in their daily work to find out their needs for assistance. Based on this, we conceptualised innovative ideas to keep drivers active and vigilant in phases of automatic driving on Level 2 and 3. In our further UCD process, these ideas are developed and evaluated in an iterative process. Interim results will be shown.

Personalising the autonomous vehicle interior with flexible plastic LCDs

Simon Jones
Commercial director
With the gradual transition to autonomous cars, the HMI will have to address the entertainment, information and communication needs of the passengers. As displays become more numerous and larger, it is increasingly difficult to accommodate flat displays in an ergonomically optimised interior design where every other surface is curved. A new type of low-cost flexible display technology, plastic OLCD, now provides a viable solution for future cars where large-area, high-brightness and long-lifetime displays are needed. OLCD uses LCD technology, which is already qualified for automotive displays, with the added benefit that it can be conformed and shaped.

Inducing trust: AI-powered assistants as spokespeople for the autonomous vehicle

Dr Nils Lenke
Director corporate research
Nuance Communications
AI-powered automotive assistants (AAs) play an increasing role for today’s drivers, enabling them to interact with their vehicles while driving. An interesting question is how this will change with the advent of the autonomous vehicle (AV). One view is that drivers, as passengers in AVs, will no longer need to use speech. This paper argues the opposite. Drawing from the literature and from new usability studies and research, it shows that AAs can play a crucial role in building up the necessary trust among users of AVs. It will also show that in AVs there is a practical need to interact with AAs.

Afternoon Session

3D sensing infrastructure for next-generation in-cabin applications

Dr Gregor O Novak
Managing director
Becom Bluetechnix GmbH
For today’s advanced driver assistance systems and for the next step in autonomous driving, it is important to know what is going on inside the vehicle. A depth-sensing infrastructure based around time-of-flight 3D sensors and VCSEL illuminations combined with embedded processing provides the necessary sensor data to recognise driver and passengers. Based on the driving mode, increased safety through correct pose detection and a broad array of comfort functions for all users can be realised. Increased context and user awareness of the system enable a more intuitive user experience and novel HMIs.

Individual Differences in Trust, Perception and Usage of Automated Systems

Dr Tanja Schweiger
Manager, Automotive Solutions
J.D. Power Europe GmbH
Recent J.D. Power international studies performed in the USA, China and Germany show many cultural differences concerning individuals’ trust in automated vehicles. A deeper look shows that, besides cultural differences, there are also many age and gender differences that should be considered while developing automated systems. These differences concern not only trust in these systems, but also the perception and usage of them, including many interior and HMI aspects.

Occupant restraint dilemma for autonomous vehicles

Dr Gopal Krishnan Chinnaswamy
Senior project engineer
Virtual Engineering Centre, University of Liverpool
The Industry 4.0 revolution is enabling the development of automated driving without interference (autonomy). Connected vehicles are expected to be more common in the coming years, and fully autonomous (L5) vehicles are expected within a decade or two. Autonomous vehicles will require sufficient features to ensure the safety of the occupants even when their positions are not structured as at present. This paper addresses the issues brought about by the autonomy revolution and particularly Level 5 autonomy. It also assesses the need for, and proposes possible solutions to, the occupant safety conundrum.

Creation and evaluation of a Level 5 driverless pod design

Joscha Wasser
Coventry University & Horiba MIRA
Driverless last-mile mobility vehicles, also known as pods, are on the verge of becoming a reality accessible to the wider public as part of public transport systems. However, little is currently known about the passenger requirements for these vehicles. We therefore proposed a comfort model for driverless pods, presented at the Autonomous Vehicle Interior Design & Technology Symposium 2017, which was then used as a basis for the design of a four-seater driverless pod. A digital prototype and an ergonomic buck were then used to conduct participant-led ergonomic evaluations of the interior and to validate our proposed model.

Virtual prototyping to virtually test passenger comfort and safety

Caroline Borot
Business development industry solutions
This paper investigates how virtual interior prototyping and digital human models can be used to find the right car interior design, optimising the comfort and safety of occupants in a completely new, innovative interior layout. After a brief description of ESI's digital thermal human models and virtual seat model, it will show, through different industrial use cases, how the (dis)comfort of the passenger can be virtually tested and optimised at the early stages of car interior design.

Day 3: Thursday 7 June

Morning Session

VI-DAS HMI for user-centred automation

Paul Schouten
UX designer
The presentation will outline an iterative user-centred design study to find solutions for how to allow the VI-DAS user to anticipate what’s ahead using contextual sensor data, to support a safe and comfortable user-vehicle collaboration for different levels of vehicle automation.

UX 2025: investigating the user experience in the connected cockpit of 2025

Patrice Reilhac
Innovation and collaboration research director
A clear vision of megatrends and technological progress is fundamental for envisioning the future cockpit. Automation Levels 3 and 4, 5G connectivity and AI-enabled adaptive and personalised content allow the future cockpit to be highly focused on the human. Valeo presents investigations on its Cockpit 2025 concept for intuitive driving and travelling of future highly digital users.

Human and automation as team members: the AutoMate project

Dr Andreas Lüdtke
Group manager
The presentation describes the concept, process and findings of the AutoMate project. The vision of AutoMate is a novel driver-automation interaction and cooperation concept to ensure that highly automated driving systems will reach their full potential and can be commercially exploited. This concept is based on viewing and designing the automation as the driver’s transparent and comprehensible cooperative companion or teammate. This kind of system can enhance safety and comfort by using the strength of both the automation and the human driver in a dynamic way.

VI-DAS Project – a novel approach to next-generation vehicle interaction

Dr Oihana Otaegui
Head of ITS and engineering department
VI-DAS will progress the design of next-gen 720° connected ADAS (scene analysis, driver status). Advances in sensors, data fusion, machine learning and user feedback provide the capability to better understand driver, vehicle and scene context, facilitating a significant step along the road towards truly semi-autonomous vehicles. Predictions on outcomes in a scene will be created to determine the best reaction to feed to a personalised HMI component that proposes optimal behaviour for safety, efficiency and comfort.

HMI for intuitive and adaptive transitions

Dr Alessia Knauss
Research specialist
Autoliv Research
Transitions between automated and manual driving represent one of the major challenges in autonomous driving. In order to be useful to the driver, these transitions should be intuitive and adaptive to the driver's state, the environmental situation, and the driver's personal preferences and characteristics. This talk will focus on the HMI as an element to make transitions more intuitive and adaptive. Application examples based on different HMI elements (e.g. a smart steering wheel) will be presented.

Driver state-based HMI in automated driving: the ADAS&ME approach

Stella Nikolaou
The introduction of automated functions in vehicles brings a new set of possibilities, but also several challenges. It is no longer just the driver using the vehicle as a tool for transportation; instead, the driver and the vehicle work together as a team. ADAS&ME is an EU-funded project targeting the development of adaptive ADAS, able to decide when and how the vehicle needs to take over or recover control based on driver’s state and the environmental context. The approach uses adaptive HMI strategies to assist the driver with automation when needed, and to achieve smooth transitions between automated and manual driving.

Afternoon Session

Feeling your car: HMIs as envisioned by the VI-DAS project

Dr Margarita Anastassova
Research engineer
The VI-DAS vision of future in-vehicle HMIs for autonomous and semi-autonomous driving has a special focus on HMIs providing multi-sensory alerts and improving drivers' situation awareness. An approach to integrating such HMIs in regular driver activity when driving and being driven will be presented.

How can I help my autonomous vehicle?

Elisa Landini
Programme manager
This presentation describes the innovative interaction paradigm between drivers and highly automated vehicles developed in the AutoMate EU project. This new interaction modality is based on cooperation, i.e. mutual support in perception and action between the driver and the car. The cooperation aims to exploit and make concrete the complementarity of the human and the automation as parts of a team. The tool that enables this cooperation is the HMI. In the project, when a human limit needs to be compensated for, a negotiation-based HMI is implemented to cooperate with the driver, increasing comfort and acceptability.
Please Note: This conference programme may be subject to change


Autonomous Vehicle International magazine