Case Studies

INRIX I-95 VPP Data Summary Validation

INRIX: sole contractor to the I-95 Corridor Coalition for the Vehicle Probe Project (VPP)

This Case Study summarizes the traffic data quality validations of INRIX data, performed by the University of Maryland on behalf of the I-95 Corridor Coalition during the initial Vehicle Probe Project (VPP).

The first VPP spanned six years, from Summer 2008 to Summer 2014, and established numerous industry firsts, several of which are highlighted in the next section. Among these industry firsts is establishing and conducting a process to evaluate the quality of the traffic data provided. Following a consistent methodology, 42 site tests were conducted in 11 states over six years to determine and monitor the quality of data INRIX provided under the VPP. All documentation related to quality requirements, validation testing methodology, and site test results is publicly available on the Coalition’s web site.

VPP Data Quality Summary

Throughout the VPP, INRIX data was subjected to the world’s largest ongoing validation of freeway data accuracy. Over more than five years, 42 validation tests were conducted in 11 states across a full range of scenarios—urban/rural, overnight work zones, beach and holiday weekends, snowstorms, tunnels, parkways without commercial vehicle traffic, and more.

Table 1 summarizes all 42 site tests in a single table (see page 3, with links to each published report included in the table). In total during the VPP, INRIX data was tested over five years, on more than 900 miles of freeways in over 430 discrete segments, from as far south as Ft. Lauderdale, FL to as far north as Providence, RI (Figure 1 shows the site test locations across the corridor). Over 600,000 data comparisons were made across all site tests. Figure 2 groups validations into the calendar ranges of 2008-09, 2010-11, and 2012-13 and shows the Average Absolute Speed Error (AASE). In the same time ranges, the summary Speed Error Bias (SEB) was -1.5 MPH in 2008-09, -0.7 MPH in 2010-11, and 0.2 MPH in 2012-13.

Table 1: VPP Data Quality Summary


INRIX was selected from multiple proposers by UMD and the Coalition as the sole contractor for the project in December 2007, was awarded the master contract in February 2008, and began supplying real-time data and associated services on July 1, 2008 for roughly 1,950 centerline freeway miles and 1,000 centerline arterial miles across six states, from New Jersey to North Carolina, including the District of Columbia.

The baseline term for the VPP was three years (2008-2011), which was later extended an additional three years (2011-2014). The VPP enabled Coalition member states to add road coverage based on desire and funding availability. At its conclusion in Summer 2014, live data was being provided for over 40,000 centerline miles of roadways in ten states, including nearly 8,000 miles of freeways.

The VPP was groundbreaking in many ways, with many industry firsts:
  • 1st comprehensive requirements for privately sourced real-time traffic data, with a world-leading validation program to test and ensure data quality.
  • 1st pay-for-performance contract, tying payments to validated data quality and availability.
  • 1st consistent and publicly available data use agreement giving maximum flexibility to agencies to use, store and re-use data for their purposes.
  • 1st use of private data to establish statewide travel times on dynamic message signs.
  • 1st statewide map of real-time traffic data on a state DOT’s 511/travel information web site.
  • 1st intelligent fusion of private data and available roadside sensor data on a statewide basis.
  • 1st use of private data in a state DOT branded 511/mobile app.
  • 1st high-definition displays of real-time traffic in shopping malls.
  • 1st online performance monitoring and analysis tools, enabling large scale analysis with only a browser and credentials required for access.
  • 1st corridor-wide multi-state traffic monitoring web site, including user-reported incidents, with only a browser and credentials required for access.

VPP Quality Requirements

The VPP Contract establishes two metrics for measuring the quality of speed data provided, Average Absolute Speed Error (AASE) and Speed Error Bias (SEB), in effect for vehicle flows exceeding 500 vehicles per hour in a direction of travel. Since road segments used in validation can vary in length, quality requirements based on speed rather than travel time were used to normalize the effect of varying segment lengths.

Average Absolute Speed Error (AASE)
  • AASE measures the overall error in reported speeds compared to actual conditions.
  • The contractually required AASE was below 10 MPH in each of the following speed ranges: 0-30 MPH, 30-45 MPH, 45-60 MPH, and > 60 MPH.
Speed Error Bias (SEB)
  • SEB measures the tendency for reported speeds to be systematically faster or slower than actual conditions.
  • The contractually required SEB was a maximum average error of +/- 5 MPH in each of the following speed ranges: 0-30 MPH, 30-45 MPH, 45-60 MPH, and > 60 MPH.
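The two metrics can be sketched directly from their definitions: bin each comparison sample by its ground-truth speed range, then take the mean absolute error (AASE) and the mean signed error (SEB) within each bin. The function and variable names below are illustrative, not taken from the VPP contract or UMD's actual code.

```python
# Sketch of the two VPP quality metrics, AASE and SEB, computed per
# ground-truth speed range. Names are illustrative assumptions.

SPEED_RANGES = [(0, 30), (30, 45), (45, 60), (60, float("inf"))]

def bin_for(speed):
    """Return the index of the speed range containing a ground-truth speed."""
    for i, (lo, hi) in enumerate(SPEED_RANGES):
        if lo <= speed < hi:
            return i
    raise ValueError(f"negative speed: {speed}")

def quality_metrics(samples):
    """samples: list of (ground_truth_mph, reported_mph) pairs.
    Returns one (AASE, SEB) tuple per speed range, or None for empty ranges."""
    bins = [[] for _ in SPEED_RANGES]
    for truth, reported in samples:
        bins[bin_for(truth)].append(reported - truth)
    results = []
    for errors in bins:
        if not errors:
            results.append(None)
            continue
        aase = sum(abs(e) for e in errors) / len(errors)  # error magnitude
        seb = sum(errors) / len(errors)                   # signed bias
        results.append((aase, seb))
    return results
```

Note that a segment can have a large AASE yet an SEB near zero when fast and slow errors cancel, which is why the contract required both metrics.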

Payment Formula Method

In early 2009, once the initial validations were completed (using the process defined in the next section), the Coalition established a Payment Formula Method, which would reduce payments for data failing to meet the AASE and SEB requirements shown above. Each month since Spring 2009, payments were calculated using the rolling average of the last three completed site validations. Each of the four speed ranges received equal weight, so data performance in circumstances when ground truth traffic was below 45 MPH counted for 50% of each monthly payment; the formula was clearly designed to reward or penalize data quality under congested conditions. The VPP RFP and subsequent contract placed these data quality requirements only on freeways, so the official site tests, results, and payment calculations were based only on freeways. During the VPP, data was also collected on interchanges and arterials at a small number of sites.
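The pay-for-performance idea can be illustrated as follows. The actual VPP reduction schedule is not reproduced here; this sketch simply pays the full 25% share for each speed range whose rolling-average AASE and SEB (over the last three completed site validations) meet the contractual limits. All names and the pass/fail simplification are assumptions.

```python
# Illustrative sketch of the Payment Formula Method: equal weight per
# speed range, rolling average over the last three site validations.
# The real VPP reduction formula is not reproduced; this is a
# simplified pass/fail version for illustration only.

AASE_LIMIT = 10.0  # MPH, per speed range (contractual limit)
SEB_LIMIT = 5.0    # MPH absolute, per speed range (contractual limit)

def payment_fraction(site_tests):
    """site_tests: chronological list of site-test results, each a list of
    four (aase, seb) tuples, one per speed range. Returns the fraction of
    the monthly payment earned, based on the last three tests."""
    recent = site_tests[-3:]
    n_ranges = 4
    fraction = 0.0
    for r in range(n_ranges):
        # rolling average over the recent site tests for this speed range
        avg_aase = sum(t[r][0] for t in recent) / len(recent)
        avg_seb = sum(t[r][1] for t in recent) / len(recent)
        if avg_aase <= AASE_LIMIT and abs(avg_seb) <= SEB_LIMIT:
            fraction += 1.0 / n_ranges  # each range worth 25% of payment
    return fraction
```

Because the 0-30 MPH and 30-45 MPH ranges together carry half the weight, sustained failures under congested conditions cut the payment at least as hard as failures at free-flow speeds.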

VPP Site Test Overview

The University of Maryland designed, implemented, and conducted the VPP Validation Program. The methodology was developed and proven from July through September 2008 with initial validation tests in four states: Virginia, Maryland, Delaware, and New Jersey. Key to the VPP Validation Program was the use of portable Bluetooth data collection. Now commonly used for both temporary and permanent data collection, Bluetooth monitoring was far from mainstream in 2008. UMD and the Coalition determined that Bluetooth provided the most cost-effective method for statistically significant ground truth data collection.

The basic concept was to deploy portable Bluetooth readers as closely as possible to the start and end of INRIX reporting segments (during the VPP, INRIX utilized TMC codes as reporting segments) so that ground truth as determined via Bluetooth could be compared with INRIX real-time speed/travel time data. Figure 3 illustrates the conceptual approach for Bluetooth data collection. A typical site test would collect data on 10-20 segments for 10-14 days, with segments usually ranging from one to three miles.
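The core matching step behind this concept can be sketched as follows: a device detected at the upstream reader and again at the downstream reader yields one travel-time observation. The function name, data shapes, and the one-hour matching window are illustrative assumptions; real deployments also handle repeated detections and outlier filtering.

```python
# Minimal sketch of deriving segment travel times by matching anonymous
# Bluetooth device detections at the upstream and downstream readers.
# Names and the matching window are assumptions for illustration.

def travel_times(upstream, downstream, max_seconds=3600):
    """upstream/downstream: dicts mapping device_id -> detection time (s).
    Returns travel times (s) for devices seen at both readers, in order,
    within max_seconds of each other."""
    times = []
    for dev, t_up in upstream.items():
        t_down = downstream.get(dev)
        if t_down is not None and 0 < t_down - t_up <= max_seconds:
            times.append(t_down - t_up)
    return times
```

Each travel time then converts to a segment speed as segment length divided by travel time, which is how the Bluetooth observations enter the dayplots described later.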


Each site test was planned and executed in collaboration with the appropriate Coalition member agency using the process shown in Figure 4, with results published to the Coalition web site upon report completion. During the VPP, 42 site tests were completed in 11 Coalition states between July 2008 and February 2013.

Each Bluetooth sample segment could yield up to 288 comparison samples per day. In each site test, for every five-minute increment in which a deployed Bluetooth segment had sufficient samples (UMD determined that three or more valid Bluetooth readings traversing a reporting segment in a five-minute period provided sufficient ground truth representation), the average Bluetooth speed/travel time was compared to the average INRIX-provided speed/travel time.
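The five-minute comparison step above can be sketched as binning Bluetooth readings into 300-second windows (86,400 / 300 = 288 per day), keeping only bins with at least three readings, and pairing each bin mean with the INRIX mean for the same window. Names and data shapes are assumptions for illustration.

```python
# Sketch of the five-minute comparison step: group Bluetooth traversal
# speeds into 5-minute bins, keep bins with >= 3 valid readings, and pair
# each bin mean with the INRIX mean for the same period. Names assumed.

MIN_SAMPLES = 3    # UMD's threshold for sufficient ground truth
BIN_SECONDS = 300  # 5 minutes; 86400 / 300 = 288 bins per day

def five_minute_comparisons(bt_obs, inrix_speed_for_bin):
    """bt_obs: list of (timestamp_seconds, speed_mph) Bluetooth readings.
    inrix_speed_for_bin: dict mapping bin index -> INRIX mean speed (MPH).
    Returns (bin_index, bluetooth_mean, inrix_mean) comparison samples."""
    bins = {}
    for ts, speed in bt_obs:
        bins.setdefault(int(ts) // BIN_SECONDS, []).append(speed)
    comparisons = []
    for b, speeds in sorted(bins.items()):
        # only bins with sufficient samples and a matching INRIX value count
        if len(speeds) >= MIN_SAMPLES and b in inrix_speed_for_bin:
            bt_mean = sum(speeds) / len(speeds)
            comparisons.append((b, bt_mean, inrix_speed_for_bin[b]))
    return comparisons
```

Bins with fewer than three readings are simply dropped, which is why an INRIX value appears on a dayplot only where a valid Bluetooth sample exists.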

In cases where a single Bluetooth segment spanned multiple INRIX TMC reporting segments, UMD defined and applied a methodology for proper comparison. Samples were collected 24×7 for the duration of the site test (typically two weeks, as noted above). As illustrated in the next section, this approach generated huge amounts of ground truth data on each segment and enabled far greater evaluation of data quality over all hours of the day and days of the week, as well as far better apples-to-apples comparison data, than the traditional alternative of drive testing. In the process established by UMD, one output of each site test was a set of numerous “dayplots” comparing individual and aggregated Bluetooth traversals to INRIX reported travel times, with each dayplot representing a specific segment on a specific calendar day. These dayplots were made available to each member agency for review and are available from UMD upon request.
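UMD's exact multi-TMC comparison methodology is documented separately and is not reproduced here. A natural stand-in, shown purely as an assumption, is to sum per-TMC travel times (length divided by speed) over the Bluetooth segment and convert back to a length-weighted harmonic mean speed.

```python
# Hypothetical combination of several TMC segments into one speed for a
# spanning Bluetooth segment: sum the per-TMC travel times, then divide
# total length by total time. This is an illustrative assumption, not
# UMD's documented methodology.

def combined_speed(tmcs):
    """tmcs: list of (length_miles, speed_mph) per TMC segment.
    Returns the length-weighted harmonic mean speed in MPH."""
    total_length = sum(length for length, _ in tmcs)
    total_hours = sum(length / speed for length, speed in tmcs)
    return total_length / total_hours
```

A harmonic (travel-time based) mean is the natural choice here because the Bluetooth measurement itself is a travel time over the whole segment; a simple arithmetic mean of TMC speeds would overweight fast sub-segments.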

A Guide to the Elements of the Day Plot
  • The title of each dayplot provides the location as well as meta-data about the site test and the specific segment.
  • The horizontal axis shows time of day, from midnight to midnight.
  • The vertical axis displays speed in MPH (Bluetooth traversal times are converted to speed across the segment).
  • Each individual Bluetooth traversal is noted by a blue “x”; a blue “x” with a dot indicates a reading UMD determined in post-processing to be an outlier, which is excluded from further analysis.
  • For each five minute period with three or more valid Bluetooth traversals, the mean is calculated and represented on the solid line as the ground truth mean for that time period.
  • To account for the statistical uncertainty in whether the Bluetooth sample mean represents the true mean of the full traffic stream, UMD applied the standard error of the mean methodology, represented in the dayplot by the “Band low” and “Band high” dashed lines.
  • INRIX data for the five minute period is noted by a red diamond (note that while INRIX reports continuously, an INRIX data point is only present on the dayplot if there is a valid Bluetooth sample).
  • To determine error, the INRIX value is compared to the nearest band edge when it falls outside the area between the high and low bands; when it falls inside the band, zero error is reported.
  • The average Bluetooth speed for a given sample determines which of the four defined speed ranges the INRIX error result populates for that time period.
Figure 5: Dayplot from December 8, 2011 Florida Site Test
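The error-band rule described in the dayplot guide can be sketched as follows: build a 1.96-standard-error band around the Bluetooth mean for a five-minute period, then score the INRIX value as zero error inside the band or as the signed distance to the nearest band edge outside it. Function names are illustrative.

```python
# Sketch of the SEM error-band scoring used in the dayplots: zero error
# inside the 1.96-standard-error band, signed distance to the nearest
# band edge outside it. Names are illustrative assumptions.
import math

def band_error(bt_speeds, inrix_speed):
    """bt_speeds: valid Bluetooth speeds (MPH) in one 5-minute period
    (the VPP required at least 3, so n >= 2 is assumed here).
    Returns the signed error of inrix_speed against the SEM band."""
    n = len(bt_speeds)
    mean = sum(bt_speeds) / n
    # sample standard deviation, then standard error of the mean
    var = sum((s - mean) ** 2 for s in bt_speeds) / (n - 1)
    se = math.sqrt(var / n)
    band_low, band_high = mean - 1.96 * se, mean + 1.96 * se
    if inrix_speed > band_high:
        return inrix_speed - band_high   # positive: reported too fast
    if inrix_speed < band_low:
        return inrix_speed - band_low    # negative: reported too slow
    return 0.0                           # inside the band: zero error
```

The sign convention matters: positive errors accumulate into a positive SEB (reported speeds systematically too fast), negative errors into a negative SEB, while AASE uses the magnitude either way.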

Each sample from each site test is rolled up first into overall segment summaries and finally into an overall result. Exhibit 1, extracted from the September 2011 Pennsylvania site test (performed near Harrisburg, PA), illustrates segment summaries; Exhibit 2 shows the overall summary result for this site test. The format of this table is used across all VPP site tests to summarize results. The SEM Band results (shown as the 1.96 SE Band in Exhibit 1) are the contractually defined data quality results and are used throughout this report. In this site test, 514 observations occurred when ground truth speeds were below 30 MPH, and in those instances INRIX data’s Average Absolute Speed Error (AASE) was 3.5 MPH.

Exhibit 1

Data quality measures for individual freeway validation segments greater than one mile in the state of Pennsylvania.

Exhibit 2

Data quality measures for INRIX speed data with a score higher than 25 on freeway segments greater than one mile in Pennsylvania.


More Case Studies