While Autonomous Vehicles (AVs) are currently being tested on carefully mapped test routes, these tests cannot fully capture how real people behave when they drive. The challenge is to equip vehicles with the intelligence they need to understand humans and make decisions that align with human preferences. Current autonomous vehicles accomplish the driving task through shared autonomy, a collaboration between the human driver and the vehicle. Effective shared autonomy requires a clear understanding of the driver's behavior, which is shaped by multiple psychophysiological and environmental variables. Naturalistic Driving Studies (NDS) have proven to be an effective approach to understanding the driver's state and behavior in real-world scenarios. However, due to limited technological and computing capabilities, earlier NDS focused only on vision-based approaches, ignoring important psychophysiological factors such as cognition and emotion. In this project, we introduce HARMONY, a novel human-centered multimodal driving study framework. Through HARMONY, we collect, analyze, and interpret drivers' states and behaviors in real-world naturalistic settings through multiple physical and virtual sensors. The main goal of this project is to leverage the strengths of both humans and machines to humanize autonomy by combining the beneficial nuances of human behavior, emotion, and trust with the technological and safety benefits of AVs. Behavior-guided AVs will bring human factors such as emotions, behaviors, and trust into the autonomous loop, enabling AVs to enhance passenger experience, safety, and comfort. To that end, we are building models that predict driver behavior and emotional changes in response to different environmental conditions, and that automatically infer driver preferences.