-ORCL Wiki-

The Omni-Reality & Cognition Lab (ORCL) brings together a team with expertise in transportation infrastructure design, traveler behavior, intelligent transportation systems, human-centered design, immersive virtual environments, and user interaction with the built environment. The lab's mission is to improve transportation infrastructure and technology for non-motorized travelers through the use of alternative reality technologies (virtual, augmented, etc.).

-History-

Motivation

The ORCL was envisioned as a way to solve many of the inherent problems with traditional infrastructure studies. Traditional methods of researching alternative infrastructure designs are costly and time-consuming. The ORCL instead lets us study the interactions between people and infrastructure while eliminating the cost and time of physical construction, the non-replicable environmental factors that can alter decision making, and the safety risks of real-world testing.

Scope of Lab

The Omni-Reality & Cognition Lab is dedicated to the education and research of human interaction with infrastructure systems and environments, utilizing alternative (virtual, mixed, and augmented) reality technologies.

Multidisciplinary Approach

The ORCL offers a space where students and faculty from multiple disciplines can work together to solve complex, modern-day problems. The projects tackled within the lab space span multiple fields of research, including civil engineering, computer science, systems engineering, psychology, and more.

Goals

1: Demonstrate the feasibility of utilizing virtual reality as a tool for conducting real-world experimentation and understanding human behavior.
2: Understand pedestrian preferences (both stated and observed), behavior, and physiological responses to alternative infrastructure technology and design at midblock crosswalks.

-Lab Space/Location-

ORCL is located at the University of Virginia in Thornton Hall Room D107.

To create our virtual reality environments, we replicated Water Street in Charlottesville, Virginia between 2nd St. SW and 4th St. SE. For more information, visit the “Affiliated Research” section below.

-Equipment/Hardware-

Physical Equipment

Bicycle Simulator

Wahoo Kickr Smart Trainer
Power measurement accurate to within +/- 2%, enabling realistic resistance feedback.
Adaptive, real-time resistance based on road grade (a sketch of this grade-based feedback follows the bicycle specifications below).

Wahoo Kickr Climb
Adaptive, real-time indoor grade simulator attached to the front fork; it raises or lowers the front end of the bicycle to match the simulated road grade. Capable of simulating roadway grades from -10% to +20%.

Wahoo Kickr Headwind
Adaptive, real-time variable speed vortex fan capable of reaching wind speeds experienced by cyclists on the road, providing tactile feedback based on cyclist speed.

Trek Bicycle
Trek Verve 1
Size: Large
21 Speed
Drivetrain details:
Shifters: Shimano Altus EF500 7 Speed
Front Derailleur: Shimano Tourney TY51
Rear Derailleur: Shimano Altus M310
Crank: Forged Alloy, 48/38/28 w/chainguard
Bottom Bracket: Sealed cartridge
Cassette: SunRace freewheel, 14-34, 7 speed
Chain: KMC Z51
Pedals: Wellgo nylon platform
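
Together, the Kickr trainer, Climb, and Headwind translate the simulated road grade and rider speed into physical feedback. As a rough illustration only (not the lab's actual control code), the sketch below estimates the power a rider must produce at a given speed and grade using a standard cycling power model; the rolling-resistance and drag-area constants are assumed typical values, not measured lab parameters.

```python
import math

# Assumed typical constants -- not measured lab parameters.
GRAVITY = 9.81               # m/s^2
AIR_DENSITY = 1.225          # kg/m^3 at sea level
ROLLING_RESISTANCE = 0.005   # typical road-tire coefficient
DRAG_AREA = 0.5              # CdA in m^2, upright-rider estimate

def target_power(speed_mps: float, grade_percent: float, total_mass_kg: float) -> float:
    """Power (W) a rider must produce to hold `speed_mps` on `grade_percent`.

    A trainer could use this as its resistance target; a grade simulator would
    tilt the bike to atan(grade) and a fan would scale wind with `speed_mps`.
    """
    theta = math.atan(grade_percent / 100.0)
    f_gravity = total_mass_kg * GRAVITY * math.sin(theta)                     # grade force
    f_rolling = total_mass_kg * GRAVITY * ROLLING_RESISTANCE * math.cos(theta)
    f_aero = 0.5 * AIR_DENSITY * DRAG_AREA * speed_mps ** 2                   # air drag
    return max((f_gravity + f_rolling + f_aero) * speed_mps, 0.0)

# Example: 85 kg rider + bike at 5 m/s on an 8% climb.
print(round(target_power(5.0, 8.0, 85.0), 1), "W")
```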

HTC Vive Pro Eye Headset

Two identical HTC Vive Pro Eye headsets with their accompanying controllers will be used during experimentation. The HTC Vive Pro Eye has a resolution of 2880 x 1600 pixels (615 PPI), a refresh rate of 90 Hz, and runs on SteamVR. The maximum tracked area with a wired headset is 100 m². The headsets have built-in headphones with in-line amplifiers and a field of view of 110 degrees. Movement is tracked with an accelerometer, gyroscope, Lighthouse 2.0 laser tracking system, and dual front-facing cameras. The headsets have been equipped with HTC Vive Pro Wireless Adapters, which support a 6 x 6 m space for accurate tracking and operate over a near-zero-latency wireless link.
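
SteamVR reports each tracked device's pose as a 3x4 matrix whose last column holds the position in meters, with y pointing up. The hypothetical snippet below (not the lab's experiment code) extracts the headset position from such a matrix and checks that it remains within the 6 x 6 m area supported by the wireless adapter.

```python
import numpy as np

PLAY_AREA_HALF_WIDTH = 3.0   # half of the 6 x 6 m wireless tracking area, meters

def headset_position(pose_3x4: np.ndarray) -> np.ndarray:
    """Extract the translation (x, y, z in meters) from a SteamVR-style
    3x4 pose matrix; the last column holds the position."""
    return pose_3x4[:, 3]

def inside_play_area(pose_3x4: np.ndarray) -> bool:
    """True if the headset is within the 6 x 6 m tracked floor area.
    SteamVR uses a y-up coordinate system, so x and z span the floor."""
    x, _, z = headset_position(pose_3x4)
    return abs(x) <= PLAY_AREA_HALF_WIDTH and abs(z) <= PLAY_AREA_HALF_WIDTH

# Example pose: identity rotation, headset 1.2 m left, 1.7 m up, 2.5 m forward.
pose = np.array([[1.0, 0.0, 0.0, -1.2],
                 [0.0, 1.0, 0.0,  1.7],
                 [0.0, 0.0, 1.0,  2.5]])
print(inside_play_area(pose))  # True
```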

Microsoft HoloLens 2

The Microsoft HoloLens 2 is a pair of mixed reality smart glasses developed and manufactured by Microsoft. Compared with the first-generation HoloLens, the HoloLens 2 allows a new level of context and human understanding within the holographic experience by giving developers access to information about what the user is looking at. The HoloLens 2 Eye Tracking API provides a single eye-gaze ray (gaze origin and direction) at approximately 30 FPS (30 Hz). The HoloLens 2 can be used to collect eye tracking data during outdoor benchmark testing, providing output similar to the HTC Vive Pro Eye headset in the VR environment.
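
A gaze ray alone gives only an origin and a direction; determining what a participant is looking at requires intersecting that ray with scene geometry. The sketch below is a minimal, hypothetical example (not the lab's analysis code) that intersects an eye-gaze ray with a horizontal ground plane, e.g., to locate the point on a crosswalk that a pedestrian is fixating.

```python
import numpy as np

def gaze_point_on_ground(origin, direction, ground_y: float = 0.0):
    """Intersect an eye-gaze ray with the horizontal plane y = ground_y.

    `origin` and `direction` are 3-vectors (x, y, z) in meters, matching the
    gaze origin/direction pair an eye tracker reports each frame.
    Returns the intersection point, or None if the ray never reaches the plane.
    """
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    if abs(direction[1]) < 1e-9:          # gazing parallel to the ground
        return None
    t = (ground_y - origin[1]) / direction[1]
    if t <= 0:                            # plane is behind the viewer
        return None
    return origin + t * direction

# Example: eyes 1.6 m above the ground, looking slightly downward and ahead.
print(gaze_point_on_ground([0.0, 1.6, 0.0], [0.0, -0.2, 1.0]))
# -> [0. 0. 8.]  (participant is looking at a point 8 m ahead)
```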

Smart Watch

Participants wear Huawei smartwatches to measure heart rate. Our platform uses an Android smartwatch equipped with “SWEAR”, an in-house app for collecting long-term data from smartwatches. SWEAR streamlines data collection by allowing each sensor's sampling frequency to be set to a desired value. SWEAR records heart rate, hand acceleration, audio amplitude (noise level), light intensity, location, and gyroscope data.
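
The SWEAR export format is not documented here, so the sketch below assumes a simple timestamped CSV log (hypothetical columns `timestamp` and `heart_rate`) and shows how such an irregular stream could be resampled to a uniform 1 Hz heart rate series for analysis.

```python
import io
import pandas as pd

# Hypothetical SWEAR export: epoch timestamps in ms and heart rate in bpm.
raw = io.StringIO(
    "timestamp,heart_rate\n"
    "1700000000000,72\n"
    "1700000000800,74\n"
    "1700000002100,77\n"
    "1700000003500,81\n"
)

df = pd.read_csv(raw)
df["timestamp"] = pd.to_datetime(df["timestamp"], unit="ms")
df = df.set_index("timestamp")

# Resample the irregular sensor stream to a uniform 1 Hz series,
# interpolating between readings so every second has a value.
hr_1hz = df["heart_rate"].resample("1s").mean().interpolate()
print(hr_1hz)
```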

Computing Equipment

Computer Specs

Unity 3.14f

ANT Wireless Protocol

Wireless communications protocol stack developed by ANT Wireless (a division of Garmin Canada). A frame-format sketch follows the specification list below.
Topology: Point-to-point, star, tree, mesh
Band: 2.4 GHz ISM Band
Range: 30 meters at 0 dBm
Max Data Rate: Broadcast – 12.8 kbit/s, burst – 20 kbit/s, advanced burst – 60 kbit/s
Application throughput: 0.5 Hz to 200 Hz (8 bytes data)
Max nodes in piconet: 65533 per shared channel (8 shared channels)
Security: AES-128 and 64-bit key
Modulation: GFSK
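
ANT carries application data in fixed 8-byte payloads, which is why the broadcast rates above are quoted per 8-byte message. As an illustration based on the publicly documented ANT serial framing (not anything specific to this lab's setup), the sketch below assembles a broadcast-data frame: sync byte, payload length, message ID, channel number, eight data bytes, and an XOR checksum over all preceding bytes.

```python
SYNC_BYTE = 0xA4           # start of every ANT serial frame
MSG_BROADCAST_DATA = 0x4E  # broadcast-data message ID

def ant_broadcast_frame(channel: int, payload: bytes) -> bytes:
    """Build an ANT broadcast-data frame for an 8-byte payload.

    Frame layout: SYNC, length, message ID, channel, 8 payload bytes,
    then a checksum computed as the XOR of every preceding byte.
    """
    if len(payload) != 8:
        raise ValueError("ANT broadcast payloads are exactly 8 bytes")
    body = bytes([SYNC_BYTE, 1 + len(payload), MSG_BROADCAST_DATA, channel]) + payload
    checksum = 0
    for b in body:
        checksum ^= b
    return body + bytes([checksum])

# Example: send eight arbitrary sensor bytes on channel 0.
frame = ant_broadcast_frame(0, bytes([0x10, 0x20, 0x30, 0x40, 0x50, 0x60, 0x70, 0x80]))
print(frame.hex(" "))
```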

-Affiliated Research-

For more information on related projects, visit the official ORCL website:

https://engineering.virginia.edu/omni-reality-and-cognition-lab/projects

-FAQ-

[To be implemented]

-Contacts-

Omni-Reality and Cognition Lab

Thornton D107/108

Phone: N/A

Email: orcl@virginia.edu

-References-

[To be implemented]