Traffic Records Assessor Training: What's in it for you?

1 Traffic Records Assessor Training: What's in it for you? Joan Vecchi, Lang Spell, Jack Benac. 39th International Forum on Traffic Records & Highway Information Systems, Crowne Plaza St. Paul Riverfront, St. Paul, Minnesota, Monday, October 28, 2013.

2 Why Assess State TR Systems? BETTER DATA makes BETTER DECISIONS possible. BETTER DATA → BETTER PROBLEM ID → BETTER INTERVENTION → SAVED LIVES. The TR assessment helps States improve their data collection, maintenance, and analysis capabilities.

3 Data's Role in Decision-making: IDENTIFY the causes and outcomes of crashes; DEVELOP effective interventions; IMPLEMENT countermeasures to prevent crashes and improve crash outcomes; UPDATE traffic safety programs, systems, and policies; EVALUATE progress in reducing crash frequency and severity.

4 Old Assessment Process
-6 Months: State requests an assessment.
-1 Month: Pre-assessment site visit; States complete questionnaire.
Assessment: One-week on-site visit.
+1 Month: State corrects factual errors.
+2 Months: Assessor delivers final assessment report.

5 Feedback. GAO reviewed States' reporting on Section 408 progress and found TR assessments were: incomplete; inconsistent; too brief. TR assessment participants reported their experience was: resource intensive; logistically challenging; varied in expectations.

6 Changes to Assessment (Old Process → New Process)
On-site assessment → On-site kickoff followed by a remote assessment
Limited pool of assessors → Expanded pool of assessors
Assessors interview State staff → Assessors rate written responses
Single subject matter expert per section → Multiple subject matter experts per section
Subjective narrative → Qualitative ratings
Live report out to small audience → Live/webinar report out
State funded → NHTSA funded

7 Comparing Assessments. [Diagram] Old assessment information flow: assessors (A) and State staff (S) exchange information directly. New assessment information flow: States (S) and assessors (A) exchange information through STRAP.

8 Improvements for All: better resource management (time, money, expertise); improved services for the States; qualitative ratings that can be quantified; leverages existing data and knowledge; provides NHTSA with an evolving picture of traffic records nationwide.

9 Features of the New Assessment.
New Assessors: expanded pool of subject matter experts; iterative cycle of Q&As to seek clarification from the State.
New Technology: the State Traffic Records Assessment Program (STRAP) IT system will route and track assessment questions between the State and assessors.
New Capabilities: data from assessments will be aggregated and analyzed to identify national trends in traffic safety systems and to provide States with feedback on the capabilities of their traffic records system compared to a national average.

10 What & Where are the Criteria? The Traffic Records Program Assessment Advisory: provides guidance on the necessary contents, capabilities, and data quality measurements for a comprehensive traffic records system; describes an ideal traffic records system, one that supports high-quality decisions that enable cost-effective improvements to highway and traffic safety; poses a uniform set of questions that reveals the performance of the State traffic records system relative to the ideal.

11 Data Quality Attributes: TIMELINESS, ACCURACY, COMPLETENESS, UNIFORMITY, INTEGRATION, ACCESSIBILITY

12 Scope of the Assessment: TRCC Management; Strategic Planning; Data Use & Integration; and six core data systems: CRASH, DRIVER, VEHICLE, ROADWAY, CITATION/ADJUDICATION, INJURY SURVEILLANCE.

13 Improving a State's TR Data. In comparing a State's traffic records system to the ideal outlined in the Advisory, the assessment will: identify strengths and challenge areas; rank questions to help prioritize investment; supply brief recommendations for improvement.

14 Assessment Questions
TRCC Management: 19
Strategic Planning: 16
Crash: 44
Driver: 38
Vehicle: 45
Roadway: 38
Citation/Adjudication: 54
Injury Surveillance: 123*
Data Use & Integration: 13
Total: 391
* Injury Surveillance now includes sub-sections on EMS, Emergency Room, Hospital Discharge, Trauma Registry, and Vital Records.

15 Standards of Evidence. The Advisory supplies a standard of evidence for each question: it describes the information needed to support State assertions that a traffic records system possesses the specific capability referred to in the question, and was determined by subject matter expert panels with State, Federal, academic, and other representation.

16 Who Answers the Questions? The State Assessment Coordinator assigns respondents to each question. Respondents receive tokens via email enabling them to access the online interface. Questions can be assigned to multiple respondents to capture multiple perspectives. Once assigned, questions can be: Answered (provide requested information); Referred (answer & forward on); Deferred (do not answer & forward on); Declined (do not answer).
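The four question-handling options above form a simple routing model. The sketch below illustrates it in Python; the class and field names are hypothetical, not STRAP's actual data model.

    # Illustrative sketch of question assignment and routing (hypothetical names).
    from dataclasses import dataclass, field
    from enum import Enum
    from typing import Optional

    class Action(Enum):
        ANSWERED = "answered"    # provide requested information
        REFERRED = "referred"    # answer & forward on
        DEFERRED = "deferred"    # do not answer & forward on
        DECLINED = "declined"    # do not answer

    @dataclass
    class AssignedQuestion:
        question_id: int
        respondents: list[str]                                  # multiple respondents allowed
        actions: dict[str, Action] = field(default_factory=dict)

        def record(self, respondent: str, action: Action,
                   forward_to: Optional[str] = None) -> None:
            # Log the respondent's action; referred/deferred questions gain a new respondent.
            self.actions[respondent] = action
            if action in (Action.REFERRED, Action.DEFERRED) and forward_to:
                self.respondents.append(forward_to)

    q = AssignedQuestion(question_id=101, respondents=["DMV analyst"])
    q.record("DMV analyst", Action.DEFERRED, forward_to="crash records manager")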

17 How do you fit in? THE REVIEW PROCESS

18 Process Flow

19 New Assessment Schedule: a 3-month assessment process.
At least 6 months before kickoff meeting: State requests traffic records assessment.
2 months prior to kickoff meeting: NHTSA Traffic Records Team hosts pre-assessment conference call.
1 month prior to kickoff meeting: State assessment coordinator assigns draft list of respondents.
1 week prior to kickoff meeting: State assessment coordinator completes State assessment library of documents.
1 week prior to Q&A cycles: on-site kickoff meeting.
Weeks 1-3: 1st Q&A cycle; State answers standardized assessment questions.
Week 4: assessors review State answers and rate the responses; if needed, they request necessary clarifications.
Weeks 5-7: 2nd Q&A cycle; State responds to the assessors' initial ratings and requests for more information and clarification.
Week 8: assessors review additional information from the State and, if needed, adjust initial ratings for each question.
Weeks 9-11: 3rd Q&A cycle; State provides final response to the assessors' ratings for each question.
Week 12: assessors make final ratings.
Week 13: facilitator prepares final report.
Week 14: NHTSA delivers final report to State and Region.
Week 15: NHTSA hosts webinar to debrief State participants.
After week 15: State requests GO Team for targeted technical assistance (optional).

20 Assessment Ratings. Upon review, assessors will make a determination for each question that reflects how the State's traffic records systems are performing relative to the ideal detailed in the Advisory: MEETS the description of the ideal traffic records system; PARTIALLY MEETS the description; DOES NOT MEET the description.

21 Assessment Report: executive summary; strengths; opportunities; documented responses and ratings for each question; ability to compare State responses with the national average.

22 Description and Contents. Database custodian assigned? Central State repository? Reporting criteria outlined? Data used for traffic safety purposes? All appropriate data in the database? Authorized researchers have access? Local systems interoperable?

23 Description and Contents Specific to Data Components.
Crash data should be used to: identify risk factors, guide engineering/construction projects, prioritize LE activities, and evaluate countermeasure programs.
Vehicle data should be available to LE officers at time of field contact; registration documents should be barcoded.
Driver data should support analysis of driver behavior.
Roadway data should incorporate both state and local roads; there should be a common location reference system.
Cit/Adj data should provide a record of trends in citation issuance, prosecution, and dispositions.

24 Applicable Guidelines
Crash: MMUCC; ANSI D-16 (crash classification); ANSI D-20 (driver and vehicle system data elements); DPPA; FARS Manual.
Vehicle: NMVTIS (National Motor Vehicle Title Information System); Title Brands recommended by AAMVA; PRISM (Performance and Registration Information Systems Management).

25 Applicable Guidelines
Driver: ANSI D-20 (driver/vehicle system data elements); AAMVA Code Dictionary, used for conviction data exchanges; NDR, PDPS, CDLIS compatibility.
Roadway: MIRE (Model Inventory of Roadway Elements) Fundamental Data Elements.

26 Applicable Guidelines
Citation/Adjudication: NCIC; UCR (crime reporting); NIBRS (crime reporting); NLETS (telecommunications); LEIN (Law Enforcement Information Network).
EMS/Injury Surveillance: HIPAA; NEMSIS (EMS); UB-04 (ED/HD); NTDS (Trauma Registry).

27 Applicable Guidelines
EMS/Injury Surveillance (continued)
Vital Records: US Standard Death Certificate; ICD-10 (cause-of-death coding).
Injury Scoring: AIS, ISS (injury severity); Glasgow Coma Scale (neurologic injury).

28 Data Dictionaries. Is there a complete data dictionary for the system? It should: include all data elements and derived variables, edit checks, and validation rules; be up to date; be consistent with field manuals; and list data elements populated through links with other systems.
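To make these requirements concrete, here is a minimal sketch of what a single data dictionary entry might capture; the element name, values, and rules are hypothetical illustrations, not drawn from any State's actual dictionary.

    # Hypothetical data dictionary entry for one crash data element (KABCO severity scale).
    crash_severity_entry = {
        "element": "CrashSeverity",
        "definition": "Most severe injury in the crash, KABCO scale",
        "values": ["K", "A", "B", "C", "O"],       # fatal ... property damage only
        "derived": False,                           # collected in the field, not computed
        "edit_checks": ["value must be one of K, A, B, C, O"],
        "validation_rules": ["if value is 'K', fatality count must be >= 1"],
        "field_manual_section": "4.2",              # keeps dictionary consistent with the manual
        "populated_via_links": [],                  # elements filled via links to other systems
    }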

29 Procedures and Process Flows. Diagrams for key processes that include the amount of time required for each step. These should: outline procedures and policies; reflect interactions with other data systems; include processes for managing errors; and show both manual and electronic processes. Step-by-step written process descriptions may be used in lieu of flow charts.

30 Procedures and Process Flows: Key Processes
Crash: reporting, approval, submission, correction.
Vehicle: registration, title, branding transactions.
Driver: collection, reporting, posting of convictions; driver license issuance, education, sanctions, status.
System security.

31 Procedures and Process Flows: Key Processes
Roadway: steps for adding data elements; updating roadway information; archiving data; procedures for local collection and submission.
Citation and Adjudication: steps from issuance of the ticket by the officer to court disposition and posting to the driver file; DUI records and exchange of data; security protocols.

32 Procedures and Process Flows: Key Processes
Injury Surveillance System: collection, management, analysis, and linkage of data; management of the aggregate database for research and linkage.

33 System Interface. What links are established to support critical business processes? Systems are connected at all times; electronic uploads to repositories; electronic posting to individual records; uploads to federal databases (FARS, SafetyNet).

34 System Interface
Crash: Driver, Vehicle, Roadway, Citation/Adjudication, ISS.
Vehicle: Driver, Crash.
Driver: Crash, Vehicle, Citation/Adjudication, ISS.

35 System Interface
Roadway: Crash; local systems; location reference systems in the state; between various systems within the Roadway system.
Citation/Adjudication: Cit/Crash; Cit/Driver/Vehicle; Adj/Driver/Vehicle.
Injury Surveillance: EMS/ED and HD; EMS/Trauma; HD/VS.

36 Data Quality Control: automated edits and validation rules/cross-field edits; limited State-level correction authority; performance measures; numeric goals; high-frequency errors used for training and to update manuals or generate form revisions; feedback for users; DQ management reports to the TRCC.
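As an illustration of the automated cross-field edits mentioned above, the sketch below checks two hypothetical rules on an incoming crash record; the field names and rules are examples, not any State's actual edit set.

    from collections import Counter

    # Hypothetical cross-field edit checks; dates are ISO yyyy-mm-dd strings.
    def cross_field_edits(record: dict) -> list[str]:
        errors = []
        # A fatal crash must report at least one fatality.
        if record.get("severity") == "K" and record.get("fatal_count", 0) < 1:
            errors.append("severity K requires fatal_count >= 1")
        # The crash cannot postdate the report that describes it.
        if record.get("crash_date", "") > record.get("report_date", ""):
            errors.append("crash_date is after report_date")
        return errors

    # Tallying errors across submissions surfaces the high-frequency ones
    # that drive training, manual updates, or form revisions.
    error_counts = Counter()
    for rec in [{"severity": "K", "fatal_count": 0,
                 "crash_date": "2013-10-01", "report_date": "2013-09-30"}]:
        error_counts.update(cross_field_edits(rec))
    print(error_counts.most_common())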

37 Data Quality Control Performance Measures. Attributes: Timeliness, Accuracy, Completeness, Uniformity, Integration, Accessibility.
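For example, a timeliness measure is commonly defined as the average lag from the crash event to database entry, compared against a numeric goal. A minimal sketch, with hypothetical field names and an illustrative goal:

    from datetime import date

    # Hypothetical timeliness measure: mean days from crash to database entry.
    def avg_entry_lag_days(records: list[dict]) -> float:
        lags = [(date.fromisoformat(r["entered"]) - date.fromisoformat(r["crash"])).days
                for r in records]
        return sum(lags) / len(lags)

    GOAL_DAYS = 30    # illustrative numeric goal, not an Advisory requirement
    sample = [{"crash": "2013-09-01", "entered": "2013-09-20"},
              {"crash": "2013-09-05", "entered": "2013-10-15"}]
    print(avg_entry_lag_days(sample) <= GOAL_DAYS)   # True: 19 and 40 days average to 29.5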

38 Assessing: RATING STATE RESPONSES

39 Guidelines for Assessor Ratings
Each response is rated Meets, Partially Meets, or Does Not Meet the description of the ideal. Guidance by response type:
No answer provided.
Positive answer, no evidence.
Positive answer, non-Advisory evidence provided: the rating will depend upon the quality of the evidence and whether the alternative evidence provides sufficient proof of the statement. If verified, the rating will reflect as if Advisory evidence were provided; if evidence is lacking, clarification will be requested.
System under development but not yet implemented: the assessment cannot reflect future developments, only what has been established.
Positive answer, but evidence cannot be obtained: partial credit will be given with verifiable, supporting evidence; request clarification or alternative evidence.
Positive answer with insufficient information or explanation: rate only on the evidence provided; request clarification.
Questions asking whether all of something is done, answered positively with exceptions: to be consistent, and understanding that States will need to process a minimal number of paper reports, the following rules apply: State populations over 6 million require 99% electronic capture & submission; populations of 2-6 million require 98%; populations under 2 million require 95%.
Partial answer: request additional information and clarification.
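The population-banded electronic-capture rule above reduces to a simple lookup; a minimal sketch (the function name is illustrative):

    # Required electronic capture & submission rate per the assessor guidelines above.
    def required_electronic_rate(population: int) -> float:
        if population > 6_000_000:
            return 0.99
        if population >= 2_000_000:
            return 0.98
        return 0.95

    print(required_electronic_rate(5_400_000))   # 0.98, the 2-6 million band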

40 Assessing. State respondents provide answers and standards of evidence to support those answers within STRAP. Assessor tokens are sent on a specific date, giving you access to STRAP. You will see all responses in your assigned modules plus supporting evidence/documentation.

41 Assessing (continued). In STRAP, you will see a ballot window for your rating and explanation. You will assess whether the State meets, partially meets, or does not meet the standard (these choices appear in a drop-down option set). You will identify why you rated a question as you did, especially if the answer is borderline or your assessment is based on documentation that explains the answer. All explanations must stand alone: the final report will not include State responses, so your explanation must incorporate the pertinent information.

42 Module Leaders. You will work independently but will coordinate with another assessor; one of you will be a Module Leader. Once you have both rated a section, the Module Leader will review all ratings and resolve any differences between assessors. If a response leaves you confused or information is missing, you may request clarification of the response.

43 Assessor/Module Leader. You must review the Assessor Guidelines when rating responses. When you have rated all questions and entered the status into STRAP, your Round One work is complete. In Round Two, you will have an opportunity to upgrade Round One ratings based on additional information or documentation from the State.

44 Assessor/Module Leader. The finding must stand alone; that is, anyone who reads the finding must be able to discern what is being rated and why.

45 Assessor/Module Leader. The Module Leader, who may or may not be one of the assessors (as with ISS), then reads the ballots and ratings, correlates the answers from both assessors, and seeks consensus if necessary. If the two ballots have the same rating, the Module Leader makes a finding and writes it into the STRAP system; this is the finding that will go into the final report, so it must be complete and understandable. If the Assessor and Module Leader cannot reach consensus, the Facilitator will assist in resolution.
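The Module Leader's consolidation step can be summarized as a small decision rule; the sketch below is illustrative only, not STRAP's actual logic.

    # Illustrative consolidation of two assessor ballots for one question.
    def consolidate(rating_a: str, rating_b: str) -> str:
        if rating_a == rating_b:
            return f"write finding: {rating_a}"    # Module Leader records the finding in STRAP
        return "seek consensus; Facilitator assists if unresolved"

    print(consolidate("Partially Meets", "Partially Meets"))
    print(consolidate("Meets", "Does Not Meet"))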

46 Response/Rating Rounds. The answering of questions by State respondents and the rating of responses by assessors is done in three rounds.

47 Ratings/Rounds. The assessor who is not the Module Leader should assess and rate all questions first. The Module Leader will complete his or her own ratings prior to reviewing the other assessor's ratings. The Module Leader will then compare ratings, determine whether there is agreement, and coordinate with the assessor to draft the rating to be used or the wording of the clarification request, if needed.

48 Rounds 2 and 3. Round 2 allows the State to respond to clarification requests or to questions declined by the assigned respondent in Round 1. Round 3 is the wrap-up phase for the assessors: prior to Round 3, all findings are saved as pending; in Round 3, findings can be marked as final. The Module Leader writes the Module Summary for the final report to the State. The Module Summary addresses the State's strengths first, along with a general description of the state of the data system or function, then lists opportunities for improvement.

49 Working in STRAP

50 Working in STRAP. Responses from the State are usually more extensive. The STRAP screen adjusts to accommodate the replies. Each link or completed screen leads to another. The magnifying glass icon invokes the document screen.

51 Working in STRAP. When the Module Leader needs to view the Assessor responses, STRAP presents both. Each screen provides references to the question and evidence required, plus portions of the work completed to that point.

52 What would you do? CLASS EXERCISES

53 Case Studies
Question 101. Can the vehicle system data be used to verify and validate the vehicle information during initial creation of a citation or a crash report?
Answers:
Respondent One: No.
Respondent Two: Yes, the law enforcement officer can use the barcode on the registration to pre-populate the form.

54 Case Studies. You have conflicting answers: which is correct, and how do you rate? Do you ask for clarification of the responses? A clarification request may be made in lieu of a rating in Round One. In Round Two, you may request clarification but must also provide a rating; this gives the State the opportunity to provide clarification based on your rating and explanation.

55 Case Studies. The rating for this question could be: the State does not meet the standard because the live system is not used for citation and crash creation, only the barcoded information, which may have been revised in the system since the vehicle's last registration.

56 Case Studies. Or the rating could be this: the State meets the standard because the officer creating the report can pre-populate information about the vehicle from the barcode. Or this (however, this rating is inadequate): does not meet the standard because evidence was not sufficient.

57 Case Studies. Question 73: Is there performance reporting that provides specific timeliness, accuracy, and completeness feedback to each law enforcement agency? States that have implemented electronic crash reporting often have small and/or remote agencies that submit paper reports. If feedback does not go to those agencies, what is the rating? What if the responses are in conflict?

58 Case Studies. Question 74: Is the detection of high-frequency errors used to generate updates to training content and data collection manuals, update the validation rules, and prompt form revisions? If the State provides evidence for all but prompting form revisions, how would you rate?

59 Are you ready? Questions? If you are convinced this is a job you would like, please complete the form and return it to one of us, either in person or via email. This is an opportunity to increase your traffic records knowledge and expertise by hearing what other states are doing and how their problems have been solved. You will also be involved in meaningful discussion and deliberation with other experienced professionals.

60 Joan Vecchi (720) ; Lang Spell; Jack Benac. THANK YOU