ITS 3C Summit, Mobile, Alabama, September 15, 2014
Chester H. Chandler, Terry Hensley, Derrick A. Lue
Agenda: Introduction; Data Sources; Methodology; Performance Measure Calculations; Confirmed Results; Test Bed; Lessons Learned; Acknowledgement
Introduction: When did performance measures reporting commence? FDOT District Seven, Tampa Bay Area. At the end of 2012, Florida Department of Transportation (FDOT) District Seven's upper management directed its SunGuide℠ ITS Section to develop and publish a recurring progress report. The report was envisioned to detail ITS deployment projects in various phases of development, as well as the overall health of the SunGuide℠ ITS program. Data collection began in March 2013, and three months later the first progress report was published.
Introduction: Who did we turn to for help creating our report? FDOT Central Office (Doug McLeod) for guidance. Cambridge Systematics for critical review and commentary. Central Office, ITS Section, for securing the services of the University of Maryland's Center for Advanced Transportation Technology (CATT) Laboratory's Regional Integrated Transportation Information System (RITIS) for pre-processing of raw data. PB for creating the algorithms, spreadsheets, and data reduction techniques that allow quick synthesis and publishing of the progress reports.
Introduction: How many performance measures are reported in the Quarterly Progress Report? A total of 26 performance measures: 2 Quantity, 4 Capacity Utilization, 4 Quality, 11 Operations, 3 Infrastructure/Maintenance, and 2 ITS Development & Deployment.
Introduction: Who will use our performance measures data? District Seven management: budgeting, work programming, and planning. Florida Transportation Commission: tracking FDOT's overall performance goals. Hillsborough and Pinellas Counties MPOs: assisting with congestion management program compliance. USDOT/FHWA: MAP-21 performance measures reporting compliance.
Introduction: Who will use our performance measures data? Tampa Bay SunGuide℠ Regional Transportation Management Center traffic managers: real-time traffic incident detection; real-time construction work zone management; real-time ramp metering operations; real-time dynamic toll changes on express toll lanes; and future autonomous vehicle operations support.
Data Sources: Microwave Vehicle Detector System (MVDS) (EIS/ISS, Wavetronix); Non-Invasive Magnetic Loop (NIML); Intelligent Transportation Infrastructure Program (ITIP); SunGuide℠; RITIS; CCTV cameras (for verification sampling)
Methodology: Select freeway segments for reporting, with a detector at the midpoint of each subsegment between adjacent interchanges (31 subsegments). Monthly directional data are obtained in 15-minute increments from RITIS. RITIS calculates some performance measures directly: Average Travel Speed, Travel Time Index, and Buffer Time Index (used to calculate the Planning Time Index).
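For reference, these indices follow the standard travel-time reliability definitions; the exact RITIS computation (e.g., its choice of free-flow speed) is not stated in this presentation and may differ:

```latex
\mathrm{TTI} = \frac{\bar{t}}{t_{\mathrm{ff}}}, \qquad
\mathrm{BTI} = \frac{t_{95} - \bar{t}}{\bar{t}}, \qquad
\mathrm{PTI} = \frac{t_{95}}{t_{\mathrm{ff}}} = \mathrm{TTI}\,(1 + \mathrm{BTI})
% \bar{t}   : average travel time over the segment
% t_{95}    : 95th-percentile travel time
% t_{ff}    : free-flow travel time
```

The last identity is why the Buffer Time Index suffices to derive the Planning Time Index once the Travel Time Index is known.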
Methodology: Download detector polling data in .csv format from the RITIS data fusion engine. District Seven SunGuide℠ traffic management software collects detector data and exports the data to RITIS. RITIS cross-compares, error-checks, and filters the detector data. Perform further reasonableness tests on the RITIS data, adjusting or discarding records as applicable. Utilize a customized Visual Basic program to calculate the final performance measures. Report monthly performance measures in graphical format for three-month periods and cumulatively for the year.
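The district's production tooling is the Visual Basic program noted above; purely as an illustration, the following Python sketch shows the kind of post-RITIS reasonableness tests described, with assumed column names and thresholds (the actual District Seven criteria are not stated in this presentation):

```python
import pandas as pd

# Hypothetical bounds -- the presentation does not state District Seven's
# actual thresholds, so these values are illustrative assumptions.
MAX_SPEED_MPH = 100            # speeds above this treated as sensor error
MAX_VOL_PER_LANE_15MIN = 700   # ~2,800 veh/hr/lane cap on 15-minute flow
MAX_OCCUPANCY_PCT = 100

def apply_reasonableness_tests(df: pd.DataFrame) -> pd.DataFrame:
    """Drop 15-minute detector records that fail basic sanity checks.

    Expects 'speed' (mph), 'volume' (veh/15 min/lane), and 'occupancy' (%)
    columns; the column names are assumptions about the RITIS CSV layout.
    """
    ok = (
        df["speed"].between(0, MAX_SPEED_MPH)
        & df["volume"].between(0, MAX_VOL_PER_LANE_15MIN)
        & df["occupancy"].between(0, MAX_OCCUPANCY_PCT)
        # zero speed with nonzero volume is physically inconsistent
        & ~((df["speed"] == 0) & (df["volume"] > 0))
    )
    return df[ok]

records = pd.read_csv("ritis_export.csv")  # file name is illustrative
clean = apply_reasonableness_tests(records)
```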
Freeway Segments Reported: I-275 from 54th Avenue North to SR 60, 13 miles; I-275 from Ashley Drive to Livingston Avenue, 12 miles; I-75 from Bloomingdale Avenue to I-4, 8 miles; I-75 from I-4 to Fowler Avenue, 4 miles; I-4 from I-275 to Park Road, 22 miles
Quantity Measures (all days in month): Monthly Average Daily Traffic (MADT); Monthly Vehicle Miles Traveled (MVMT)
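As a quick illustration of the two quantity measures, the sketch below derives MADT and MVMT from cleaned 15-minute detector volumes; the column name and the midpoint-detector aggregation are assumptions, not the district's actual code:

```python
import pandas as pd

def quantity_measures(df: pd.DataFrame, segment_length_mi: float,
                      days_in_month: int) -> tuple[float, float]:
    """MADT and MVMT for one directional segment from 15-minute volumes.

    Expects a 'volume' column (all lanes combined at the midpoint
    detector); the column name is an assumption.
    """
    monthly_volume = df["volume"].sum()        # vehicles counted in the month
    madt = monthly_volume / days_in_month      # Monthly Average Daily Traffic
    mvmt = monthly_volume * segment_length_mi  # Monthly Vehicle Miles Traveled
    return madt, mvmt

# Example: the 13-mile I-275 segment for March (31 days)
df = pd.read_csv("i275_march_clean.csv")       # file name is illustrative
madt, mvmt = quantity_measures(df, segment_length_mi=13.0, days_in_month=31)
```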
Capacity Utilization Measures (weekdays only): Percent of Miles Heavily Congested During Peak Hour; Percent of Travel Heavily Congested During Peak Period; Average Peak Hour Density; Duration of Congestion
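Average peak-hour density is derivable from the detector data through the fundamental flow relationship q = k·v; a minimal sketch follows, with the "heavily congested" threshold treated as an assumption since the presentation does not define it:

```python
def peak_hour_density(flow_vphpl: float, speed_mph: float) -> float:
    """Density (veh/mi/lane) from the fundamental relation q = k * v.

    flow_vphpl: peak-hour flow in vehicles per hour per lane
    speed_mph:  average travel speed during the peak hour
    """
    if speed_mph <= 0:
        raise ValueError("speed must be positive")
    return flow_vphpl / speed_mph

# Example: 1,900 veh/hr/lane at 38 mph -> 50 veh/mi/lane, which would
# typically fall in LOS F territory (HCM density breakpoint ~45 pc/mi/ln).
print(peak_hour_density(1900, 38))

# Duration of congestion can then be tallied as the number of 15-minute
# intervals whose density (or speed) crosses the chosen threshold.
```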
Quality Measures (weekdays only, peak period): Percent of Travel < 45 MPH; Average Travel Speed; Travel Time Index; Planning Time Index
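A minimal sketch of the speed-threshold quality measure, assuming it is computed as the VMT-weighted share of peak-period travel at speeds under 45 mph (the precise weighting used in the district's report is not stated here):

```python
import pandas as pd

THRESHOLD_MPH = 45.0

def pct_travel_below_threshold(df: pd.DataFrame,
                               segment_length_mi: float) -> float:
    """Percent of vehicle miles traveled at speeds below the threshold.

    Expects peak-period 15-minute records with 'volume' and 'speed'
    columns (names are assumptions about the processed RITIS data).
    """
    vmt = df["volume"] * segment_length_mi  # VMT contributed per interval
    slow = df["speed"] < THRESHOLD_MPH
    return 100.0 * vmt[slow].sum() / vmt.sum()
```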
Operations Measures: Number of Road Ranger Assists by County; Type of Road Ranger Service Performed; Number of Incidents by Type; Incident Clearance Duration; Number of 511 Calls/Web Visits/Twitter Followers; Number of Rapid Incident Scene Clearances (RISCs) Executed; Single Point of Contact (SPOC)
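Incident clearance duration is typically derived from event timestamps in the management-center logs; a minimal sketch, with the timestamp format and the choice of start event (detection vs. first-responder arrival) as assumptions, since agencies define the clearance window differently:

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S"  # assumed log timestamp format

def clearance_minutes(detected: str, cleared: str) -> float:
    """Clearance duration in minutes between two event timestamps."""
    t0 = datetime.strptime(detected, FMT)
    t1 = datetime.strptime(cleared, FMT)
    return (t1 - t0).total_seconds() / 60.0

# Example: detected 07:10, all lanes cleared 07:52 -> 42.0 minutes
print(clearance_minutes("2014-03-05 07:10:00", "2014-03-05 07:52:00"))
```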
ITS Infrastructure/Maintenance Measures: Number of Centerline Miles Deployed/In Development/Future; Number of Active Field Devices and Uptime Percentage; Network Uptime (i.e., availability)
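Device and network uptime reduce to the same availability ratio; a minimal sketch (the district's outage-logging conventions are not described in this presentation):

```python
def uptime_percentage(scheduled_hours: float, downtime_hours: float) -> float:
    """Availability as a percent of total scheduled hours."""
    if scheduled_hours <= 0:
        raise ValueError("scheduled_hours must be positive")
    return 100.0 * (scheduled_hours - downtime_hours) / scheduled_hours

# Example: a detector down 36 hours in a 720-hour month -> 95.0% uptime
print(uptime_percentage(720.0, 36.0))
```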
ITS Development and Deployment Measures: Number of Projects by Phase; Construction Budget and Budget Spent
Confirmed Results: SunGuide℠ Program Progress Reports, published quarterly and annually
Lessons Learned (Sources of Inconsistencies, Data Fusion Related): Lane 1 in a given direction is not always the inside travel lane; zone data may differ from the sum of the lane data comprising the zone; unconsolidated data directory structure, e.g., some data in the District Seven folder and some in the Tampa folder
Lessons Learned (Deployment): Once a primary set of detectors is identified, it is prudent to identify a secondary set of detectors for back-up data; choose segment lengths carefully to reflect what users experience; anticipate construction effects on devices
Lessons Learned (Sources of Inconsistencies, Data Collection Device Related):
1. Multi-path interference, e.g., tri-chord trusses
2. Metal objects, e.g., guardrail
3. Concrete barriers
4. Concrete walls
5. Frequency drift
6. End of life cycle
7. High winds
8. Vibration, e.g., fast-moving commercial vehicles, thunder
9. Mounting bracket fixtures
10. Sudden loss of communication
11. Catastrophic events, e.g., direct/indirect lightning strikes, commercial power plant under/over voltage
12. Digi (port/terminal server) failure/grounding issues
13. Construction/work zone temporary or constant shifting of travel lanes
14. Very slow-moving traffic streams, e.g., heavy congestion
15. Incorrect/incomplete calibration/re-calibration
16. Vegetation growth around poles
Test Bed: Why a Test Bed? Transportation Systems Management and Operations (TSM&O) challenged the ITS industry to determine how best to deploy field devices to reduce long-term maintenance costs while adding value to daily operations and traffic management. The test bed will develop data and concepts to accomplish these goals and will recommend optimal future deployment of MVDS units. It will test different forms of detection, gathering data from multiple sources for evaluation and consideration in future traffic management strategies such as ramp metering or managed lanes.
Test Bed: Where is the Test Bed? Seven southbound locations along I-275 in St. Petersburg, FL: I-275-31.2 SB; I-275-30.5 SB; I-275-30.0 SB; I-275-28.4 SB; I-275-28.2 SB; I-275-27.9 SB; I-275-27.0 SB
Test Bed: Test Bed Site Devices: MVDS (including Wavetronix SS125 HD and Econolite G4); pneumatic count tubes; Non-Invasive Magnetic Loops (NIMLs, commonly referred to as Micro Loops; GTT Canoga Micro Loop); pre-existing FDOT Central Office count stations (cut loops); Video Identification System (VIDS); short-duration video recording for field count verification
Test Bed: Site-Specific Evaluations: drift on a 30-day cycle; per-lane accuracy (including speed, volume, classification, and occupancy)
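A minimal sketch of the per-lane accuracy comparison, assuming ground-truth counts come from the pneumatic tubes or short-duration video (the test bed's actual scoring method is not detailed in this presentation):

```python
def per_lane_volume_error(detector_counts: list[int],
                          ground_truth_counts: list[int]) -> list[float]:
    """Percent volume error per lane: detector vs. ground-truth counts.

    Positive values mean the detector over-counts. Lane pairing should be
    verified first, since lane 1 is not always the inside travel lane.
    """
    return [
        100.0 * (det - truth) / truth
        for det, truth in zip(detector_counts, ground_truth_counts, strict=True)
    ]

# Example: MVDS vs. tube counts for three lanes over one hour
print(per_lane_volume_error([812, 905, 640], [800, 920, 650]))
# -> approximately [1.5, -1.63, -1.54] percent
```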
Acknowledgement: FDOT: Doug McLeod, Chester Chandler, Terry Hensley, Greg Reynolds, Romona Burke, Waddah Farah, Mark Hall. Cambridge Systematics: Anita Vandervalk, Kenneth Voorhies. TransCore: Drew Young. Atkins: Clay Packard, Kelli Moser. Lucent Group: Jared Ruso. RITIS Team: Michael Pack, Walter Lucman, Drew Lund. PB Team: Derrick Lue, Kathryn Ortega, Aaron Zhou, Praba Prabaharan, Bharathi Chigurupati, Cathy Casteleiro.