Summary

While Autonomous Vehicles (AVs) are currently being tested in carefully designed test areas, these tests cannot fully capture how real people feel and behave when they drive. The challenge is to equip vehicles with the intelligence they need to understand humans and make decisions that align with human preferences. Current autonomous vehicles deliver the driving task through shared autonomy, a collaboration between the human driver and the vehicle. Effective shared autonomy requires a clear understanding of the driver's behavior, which is shaped by multiple psychophysiological and environmental variables. Naturalistic Driving Studies (NDS) have proven to be an effective approach to understanding the driver's state and behavior in real-world scenarios. However, due to limited technological and computing capabilities, earlier NDS focused only on vision-based approaches, ignoring important psychophysiological factors such as cognition and emotion. In this project, we introduce HARMONY, a novel human-centered multimodal driving study framework. Through HARMONY, we collect, analyze, and understand drivers' states and behaviors in real-world naturalistic settings through multiple physical and virtual sensors. The main goal of this project is to leverage both human and machine advantages to humanize autonomy by combining the beneficial nuances of human behavior, emotions, and trust with the technological and safety benefits of AVs. Behavior-guided AVs will bring human factors such as emotions, behaviors, and trust into the autonomous loop, where AVs can enhance passenger experience, safety, and comfort. To this end, we are building models to predict driver behavior and emotional changes in response to different environmental conditions, and to automatically infer driver preferences.


Framework

Using HARMONY, we can better contextualize driving scenarios by bringing together multiple measures of a driver's emotions, cognition, and attention, along with environmental attributes. The HARMONY framework consists of physical and virtual devices, including cameras, smart wearables, and APIs, each of which provides a piece of the driving context.
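
As a rough illustration of what "a piece of the driving context" means in practice, the sketch below shows one possible per-time-window record combining these streams. The field names are illustrative assumptions, not HARMONY's actual schema.

```python
# A minimal, hypothetical sketch of one multimodal "driving context" record;
# field names are illustrative assumptions, not HARMONY's actual schema.
from dataclasses import dataclass

@dataclass
class DrivingContext:
    timestamp: float          # Unix time of the observation window
    heart_rate_bpm: float     # physiology from the smart wearable
    gaze_direction: str       # e.g., "road" or "mirror", from in-cabin video
    vehicle_speed_kmh: float  # from the camera's built-in GPS
    noise_level_db: float     # audio amplitude recorded by the watch
    weather: str              # environmental attribute from a virtual sensor (API)
    music_playing: bool       # from the music API
```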

HARMONY 1.0 Sensors and Devices

HARMONY 1.0 is the current version of our driving sensing platform. It collects data from three major sources: (1) a camera, (2) a smartwatch, and (3) a music API.

Our platform uses a BlackVue DR750S-2CH, a commercially available dual dash camera that records the inside and outside of the cabin simultaneously. The DR750S-2CH supports up to 256 GB of SD card memory, which makes it suitable for long-term data collection. The camera has a built-in GPS receiver that acquires a satellite fix when the car turns on, syncing the device clock with global GPS time so that the camera's timestamps are always current. The camera has no LCD, which reduces both the chance of driver distraction and the participant's feeling of being monitored, and audio recording can be disabled. In addition, BlackVue records the car's speed from the built-in GPS and overlays it on the recorded video. The camera records at 30 fps in Full HD resolution, in 3-minute segments; each 3-minute segment of driving is saved as a joint video of the inside and outside views. The camera also has cloud capability that can send GPS, speed, and video to a cloud server; we have not used this option yet but will explore it in the future for additional data storage. Lastly, the camera turns on and off automatically with the car's engine.
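
Because each trip is stored as a series of 3-minute clips, a common first processing step is to rebuild a continuous trip timeline from the segment files. The sketch below shows one way to do this with OpenCV, assuming each segment file name encodes its start timestamp; the naming pattern shown is a hypothetical assumption, not BlackVue's documented format.

```python
# Minimal sketch: rebuild a continuous trip timeline from 3-minute dashcam
# segments. Assumes OpenCV (pip install opencv-python) and that each segment
# file name encodes its start time, e.g. "20190601_120000.mp4" (hypothetical).
from datetime import datetime, timedelta
from pathlib import Path

import cv2

def trip_timeline(segment_dir: str):
    """Yield (segment_path, start_time, end_time) for each 3-minute clip."""
    for path in sorted(Path(segment_dir).glob("*.mp4")):
        start = datetime.strptime(path.stem[:15], "%Y%m%d_%H%M%S")
        cap = cv2.VideoCapture(str(path))
        fps = cap.get(cv2.CAP_PROP_FPS) or 30.0      # camera records at 30 fps
        n_frames = cap.get(cv2.CAP_PROP_FRAME_COUNT)
        cap.release()
        yield path, start, start + timedelta(seconds=n_frames / fps)

for path, start, end in trip_timeline("trips/2019-06-01"):
    print(f"{path.name}: {start} -> {end}")
```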

Our platform uses an Android smartwatch equipped with "SWEAR," an in-house app for long-term data collection from smartwatches (A). The app was designed by Professor Boukhechba, who is also one of the LinkLab members. SWEAR streamlines data collection by letting each sensor's sampling frequency be set independently. It records heart rate, hand acceleration, audio amplitude (noise level), light intensity, location, and gyroscope data. Participants are required to start and stop data collection for every driving session (B). The watch saves each segment of driving data locally and shows the participant's ID and the number of saved files on the status tab (C). Every participant is required to sync the watch with their phone; the watches are all registered on the university wireless network, and through the participant's smartphone and Wi-Fi the watch clock stays synced to the current time. Every two weeks, the participant transfers their data to our system by tapping the upload icon on the settings tab (D); the data are then recorded in our system under the participant's ID.
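
Because each sensor can run at its own sampling frequency, the streams must be aligned on a common timeline before analysis. The sketch below shows one way to do this with pandas; the file and column names are hypothetical assumptions, as SWEAR's actual export format may differ.

```python
# Minimal sketch: align smartwatch streams sampled at different frequencies
# onto one timeline with pandas. File and column names are hypothetical.
import pandas as pd

def load_stream(csv_path: str, value_cols: list[str]) -> pd.DataFrame:
    """Read one sensor stream, indexed by its Unix-millisecond timestamps."""
    df = pd.read_csv(csv_path)
    df["time"] = pd.to_datetime(df["timestamp_ms"], unit="ms")
    return df.set_index("time")[value_cols]

hr = load_stream("watch/heart_rate.csv", ["bpm"])
acc = load_stream("watch/acceleration.csv", ["ax", "ay", "az"])

# Resample both streams onto a common 1-second grid: mean heart rate and
# mean acceleration magnitude per window.
acc["mag"] = (acc[["ax", "ay", "az"]] ** 2).sum(axis=1) ** 0.5
aligned = pd.concat(
    [hr["bpm"].resample("1s").mean(), acc["mag"].resample("1s").mean()],
    axis=1,
)
print(aligned.head())
```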

Where have our participants driven so far?

Data have been collected mostly from the eastern and midwestern United States, including Virginia, Pennsylvania, Delaware, West Virginia, Indiana, Illinois, Ohio, Vermont, New Hampshire, Maine, and New York, over the period from June 2019 to the present. The most recent map of the collected data is shown below; it is updated gradually as data collection is ongoing.

In the video below, we summarize HARMONY's design and goals and showcase an example application of HARMONY.

Additionally, parts of the HARMONY data, stored on OSF under the name HARMONY, will be released gradually. The repository is updated as we analyze more sections of the data.

Presentations

Tavakoli, A., Heydarian, A., & Balali, V. (2019). How Do Environmental Factors Affect Driver's Gaze Direction and Head Movements? A Long-Term Naturalistic Driving Study. Transportation Research Board Annual Meeting 2020.

Publications (Under Review)

Tavakoli, A., Kumar, S., Boukhechba, M., & Heydarian, A. (2021). Can Smartwatches Reveal Driver's Activity, Driving Events, and Outdoor Environment Attributes in the Wild? Under review at the IEEE Intelligent Vehicles Symposium 2021.

Publications

Tavakoli, A., Boukhechba, M., & Heydarian, A. (2021). Leveraging Ubiquitous Computing for Empathetic Routing. Accepted to the ACM CHI Conference on Human Factors in Computing Systems (CHI) 2021.

Tavakoli, A., Kumar, S., Guo, X., Balali, V., Boukhechba, M., & Heydarian, A. (2021). HARMONY: A Human-centered Multimodal Driving Study in the Wild. IEEE Access.

Tavakoli, A., Boukhechba, M., & Heydarian, A. (2020, July). Personalized Driver State Profiles: A Naturalistic Data-Driven Study. In International Conference on Applied Human Factors and Ergonomics (pp. 32-39). Springer, Cham.

Tavakoli, A., Balali, V., & Heydarian, A. (2019). A Multimodal Approach for Monitoring Driving Behavior and Emotions (No. 19-05204).

Empathetic Routing

How does the environment affect the driver's state?

Reason-Aware Annotation for Contextualizing Driving Scenarios

Contextualizing Driving Scenarios

ORIEL: An Open-Source Multi-modal Driver's Activity Recognition Classifier in the Wild

Team Members

Arsalan Heydarian

Assistant Professor

Principal Investigator

Arash Tavakoli

PhD Student

Graduate Research Assistant

Xiang Guo

PhD Student

Graduate Research Assistant