By now, we’ve all seen a picture of a highly automated (shorthand: “driverless”) vehicle from the outside: hockey-puck- or canister-shaped lidar sensors, 360-degree camera coverage, advanced radar systems or, until recently, even a “friendly” redesign. As sensors become smaller, processors more powerful and algorithms further refined, these sensor suites are likely to shrink or disappear into the surface of the vehicle, moving from the steampunk-style vehicles on roads today to the sleek prototypes on stage at auto shows.

While many have focused on the outside of these vehicles, relatively little attention has been paid to how the evolving needs of vehicle users and shifting hardware requirements will change the interior of driverless cars. When designers have been charged with reimagining the interior of a vehicle, one feature has been consistent: backward-facing front seats. This restructuring of the seating configuration demonstrates the possibilities available when those in the front seat no longer need to watch the road, but, much as with our iPhones and Android devices, the game-changing factor will be the software we use to interact with the machine.

In addition to leveraging data and analytics resources to help cities and autonomous vehicle operators better plan for, execute and refine driverless vehicle deployments, INRIX has begun reimagining how riders will interact with a vehicle. In partnership with Renovo’s AWare, the first OS built specifically for automated mobility on demand, INRIX will deploy its OpenCar platform to improve the performance of shared-use, on-demand highly automated vehicles.

INRIX OpenCar re-envisions how riders will interact with highly automated vehicles

Two themes are central to this design: letting the user oversee and interact with the ride as desired, and providing access to contextual content that makes the ride more enjoyable and productive. In the OpenCar AV system to be unveiled at CES, INRIX shows how a rider in a driverless vehicle can interact with the vehicle, select a route that meets their needs, use contextual information and services to make smarter decisions on the go, communicate with friends, and access content that makes their time in the vehicle more enjoyable. Powering these interactions is INRIX real-time, predictive and historical data from more than 300 million sources, including commercial fleets, GPS, cell towers, mobile devices and cameras, combined with content drawn from a range of third-party services made available across manufacturers to any user of an OpenCar-powered system. The result is a system built for the different attention requirements of a highly automated vehicle, one that uses INRIX big data and machine learning to make the user experience safer, more efficient and more enjoyable.

2018 will likely be remembered as the year highly automated vehicles went from pilot demonstrations to consumer-ready deployments. With the hardware and processing ready for prime time, it’s time for the innovation focus to expand to the user-facing software that guides how people interact with these vehicles. INRIX looks forward to jumpstarting this conversation at CES with our OpenCar and Renovo partnership.