Automating the Billing Compliance Program


1 Automating the Billing Compliance Program Martha R. Weiner Senior Director, Office of Billing Quality Assurance The Johns Hopkins University School of Medicine Baltimore, Maryland HCCA Physician Practice Compliance Conference September 8, 2005

2 Outline Johns Hopkins University School of Medicine and Clinical Practice Association Snapshot Office of Billing Quality Assurance Overview Original Audit Model Risk-based Audit Model Automating the Audit Process Effectiveness Measures Q & A

3 Johns Hopkins University and School of Medicine Johns Hopkins largest private employer in Maryland Leader in academics, research and patient care Johns Hopkins University and Johns Hopkins Health System separate legal, corporate entities with separate Compliance Offices FY 05: 67,294 admissions (Johns Hopkins Hospital and Bayview Medical Center) and nearly 700,000 outpatient physician visits

4 FY 2005 Clinical Practice Snapshot
- 20 clinical departments
- 1,500 physicians and other clinical providers
- $ million in physician charges
- $ million in fee-for-service revenue
- $74 million of revenue from federal programs
- 2.6 million billing transactions
- 40+ payer contracts

5 Background and Overview
- 1996: Billing Compliance Program established; each department appoints a physician and an administrative compliance liaison
- 1997: Moved into the Clinical Practice Association
- November February 2003: PATH Audit
- January 2002: Increased leadership and financial commitment; from "compliance cops" to "compliance colleagues"
- June 2003: Revised Program and individual departmental Compliance Action Plans approved

6 Billing Quality Assurance Objectives Protect the clinical revenue and reputation of the School of Medicine and its faculty through: Commitment and oversight Goals and principles Individual department clinical responsibility Education Internal review and monitoring Communication of standards and procedures

7 Scope of Activities
Training, documentation reviews, internal resources for coding and billing advice, revenue opportunities, resolving operational issues, special projects, investigations, and coordination with JHHS Compliance and various JHH, JHBMC, and SOM committees. Staffing: 15 FTEs. Budget: $1.5 million (0.6% of clinical revenue), funded by the clinical departments.

8 Original Audit Model
Annual review for everyone, 5 services each; nearly impossible to accomplish. Departmental statistics, but no individual or aggregate benchmarks.

9 Introduction of Scoring Methodology (FY 2001)
Physicians asked for a comparative measurement: competitive nature, data driven. An adaptation of Georgetown's point system.

10 Provider Scoring
A score of 12 or more indicates significant improvement is needed to meet compliance standards, and a meeting will be scheduled with the provider to review the results. Scored findings:
- No documentation for services provided, or services billed not adequately supported
- Insufficient documentation of the attending physician's presence and participation in the service (Teaching Physician regulations not met)
- E/M service overcoded two or more levels
- E/M service billed under the wrong category
- Error attributable to computer-assisted documentation or coding
- CPT procedure code changed
- CPT bundling/unbundling errors attributable to a provider
- E/M service documentation undercoded two or more levels
- CPT procedure code or E/M code added
- Missing or incorrect use of a modifier selected by the provider
- E/M service documentation undercoded one level
- E/M service documentation overcoded one level
- Agree that documentation meets all documentation/coding requirements
Aim for a low score, just like golf! A single audit case is capped at 12 points even though more points may have been assessed for that service.
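The two numeric rules on this slide, the 12-point cap per case and the 12-point meeting threshold, can be sketched as follows. This is a minimal illustration: the per-finding point values shown are placeholders (the slide's point column is not reproduced here), and the function names are ours, not MDaudit's.

```python
CASE_CAP = 12           # one audited service can contribute at most 12 points
MEETING_THRESHOLD = 12  # a score of 12 or more triggers a provider meeting

def score_case(finding_points):
    """Sum the points assessed for one audited service, capped at 12."""
    return min(sum(finding_points), CASE_CAP)

def needs_meeting(score):
    """A score of 12+ means a meeting is scheduled with the provider."""
    return score >= MEETING_THRESHOLD

# A case assessed 10 + 5 = 15 points still counts as only 12:
capped = score_case([10, 5])   # 12
```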

11 Administrative Scoring
Scored findings:
- Insufficient documentation / wrong units billed
- Medical record not provided by the department within 10 working days following a written request
- Coding and keystroke errors attributable to billing or other departmental staff
- CPT code bundling/unbundling errors that should have been prevented by the billing office (by Medicode, TES, or charge entry staff)
- A billing error due to the billing voucher not having the current CPT or ICD-9 codes, or the voucher having no space for the provider to write a narrative description
- Error attributable to computer-assisted documentation or coding
- A service billed under the name of a provider who did not document or provide the service (except for incident-to services)
- Misuse of a modifier appended to the service by billing or other departmental staff
When a department has an average score of 12 or more administrative points, or when a systemic pattern has been identified, a meeting with the Office of Billing Quality Assurance (OBQA) staff is required. The OBQA staff assists in developing an action plan and recommendations for resolving the identified issue(s).

12 Preparing for the Risk-Based Audit Model
Reviewed 5 services for everyone in less than one year; established baseline scores for individuals and departments; slotted individuals into quarterly audit cycles. In parallel, selected and implemented a new audit software tool, MDaudit.

13 Risk-Based Provider Audit Matrix (as of July 2003)
PRIOR SCORE -> FY AUDIT FREQUENCY
- No prior review (new provider) -> reviewed within the first quarter of clinical activity
- <=5 -> bi-annual review cycle
- 6-11 -> semi-annual until less than 6, or quarterly if the score reaches 12 or more
- >=12 -> quarterly until 11 or less (semi-annual) or 5 or less (bi-annual)
Some departments have asked that all providers be reviewed annually, even if their scores qualify for a bi-annual review.
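The matrix above is essentially a score-to-cycle lookup, which can be sketched like this. The function name, flag names, and return labels are illustrative, not fields from the actual system.

```python
def audit_frequency(prior_score, is_new_provider=False, dept_wants_annual=False):
    """Map a provider's prior audit score to a review cycle per the matrix."""
    if is_new_provider:
        # no prior review: audit within the first quarter of clinical activity
        return "first-quarter review"
    if prior_score >= 12:
        return "quarterly"      # until the score drops to 11 or less
    if prior_score >= 6:
        return "semi-annual"    # until the score drops below 6
    if dept_wants_annual:
        return "annual"         # some departments review all providers annually
    return "bi-annual"          # score of 5 or less
```

For example, `audit_frequency(7)` yields `"semi-annual"`, while a department that opts in to annual reviews gets `"annual"` even for a score of 5 or less.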

14 Documentation Review Process 10 services selected from prior quarter s billing Scope of documentation reviews Adequate support for code(s) billed Most appropriate code(s) selected Teaching Physician rules met Mid-level Practitioner rules met All services provided were billed (missed revenue) Administrative vs. provider errors

15 Documentation Review Process
Packet to the physician: a cover letter tailored to the score; a summary of audit findings (the point system quantifies the findings, and a meeting is required if the score is 12 or more points); the auditor's worksheet; and a copy of the medical record note.

16 Automating the Audit Process: Implementation
Formed a team of specialists, support staff, representatives from the clinical departments, our IDX/IT group, and the vendor. Defined the new risk-based audit process and workflow that MDaudit would support. Built a database of providers from the prior year's audits and the IDX provider dictionary. Slotted audit frequency based on prior audit score. Designated an MDaudit super-user.

17 Automating the Audit Process Preparing the Audit Billing activity for prior quarter now available in <5 minutes Type and volume of services quantified for each provider Prior audit problems can be highlighted and referenced 10 cases selected in 2 minutes or less Record request reports and/or labels produced, eliminating time-consuming handwritten forms Audit worksheets printed, eliminating handwriting of patient / physician information and billed code(s) Records received and logged in

18 Automating the Audit Process Performing the Audit CPT / ICD-9 / Teaching Physician review Results for each case entered into MDaudit Standardized finding comments selected Free text supplemental comments, as needed On-line worksheet coming soon

19 Automating the Audit Process Reporting the Audit Results Individual provider audit summary and score Key errors highlighted using color Cover letter customized to audit score Multi-tiered statistics and reports High-level roll-up by division and/or department Detail at provider level Aggregate by type of findings, with provider level detail Department, division and individual risks readily identified Entire School of Medicine roll-up

20 Automating the Audit Process Management Reports Weekly Progress Reports Auditor productivity Track date report is mailed and/or meeting is held Ad hoc data queries

21 Automating the Audit Process: Comparison of Time to Complete One Provider Audit (in Hours)
[Chart: hours to build the audit sample, new vs. old process.]

22 Automating the Audit Process: Comparison of Time to Complete One Provider Audit (in Hours)
[Chart: new vs. old hours for requesting records, preparing audit worksheets*, auditing, audit findings for provider / statistical results, and preparing reports**.]
* Time will be further reduced when the on-line worksheet is implemented.
** Printing time only.

23 Automating the Audit Process: Comparison of Time to Complete Progress Reports (in Days)
[Chart: days to produce weekly progress reports, new vs. old. Old: each specialist, x6. New: one person for all audits.]

24 Results Reporting and Corrective Action
Quarterly meeting with each Department Chair, Administrator, and the Compliance Liaisons. The report package spans high-level data down to individual provider detail. Trends, issues, and corrective actions are discussed. Chairs are actively engaged and personally follow up with faculty.

25 Results Reporting and Corrective Action
Quarterly meeting with the Dean, the Vice-Dean for Clinical Practice, the Executive Director/Compliance Officer, and the General Counsel. The Dean follows up with Department Chairs as needed. Annual presentation to the University Trustees.

26 Quarterly and FY Score Cards
By department: # reviewed, # failed, average score, and grade. Presented to the Dean, the Compliance Committee, and the CPA Board of Governors. The competitive nature continues: "How are we doing compared to?" "What did Department X do to improve so much?" Volume of reviews: FY 04 reviewed 11,526 CPT codes for 809 providers; FY 05 reviewed 13,409 codes for 954 providers.

27 Handling the Outliers
- October 2003: The Dean said, "Reduce their salary."
- November 2003 to January 2004: A workgroup of the General Counsel, three administrators, and myself wrote a draft policy in keeping with the Faculty Rules.
- February 2004: Endorsed by the Dean and taken to the Board of Governors for approval; however, Department Chairs requested an alternate, earlier intervention model.

28 Pre-Billing Review Process Effective 7/1/2004 If two consecutive failures, all charges are held and coded by Office of Billing Quality Assurance Intensive training and weekly meetings with the provider Must pass post-training review or pre-bill status continues Physician is responsible for cost of the reviews After passing, reviewed again in 30 days If passing score is sustained, placed in regular audit cycle If failing score, returned to pre-billing review Entire process managed in MDaudit
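The pre-billing review flow above reads naturally as a small state machine. The sketch below is our reading of the slide, with hypothetical state names; the actual process is managed in MDaudit, and the two-consecutive-failure trigger that puts a provider into pre-bill status is tracked separately.

```python
def next_state(state, passed):
    """Advance a provider through the pre-billing review cycle."""
    if state == "pre-bill":
        # must pass the post-training review, or pre-bill status continues
        return "30-day recheck" if passed else "pre-bill"
    if state == "30-day recheck":
        # a sustained passing score returns the provider to the regular cycle
        return "regular cycle" if passed else "pre-bill"
    return "regular cycle"  # regular cycle is governed by the risk-based matrix

# A provider who passes post-training but fails the 30-day recheck
# returns to pre-billing review:
state = next_state("pre-bill", passed=True)   # "30-day recheck"
state = next_state(state, passed=False)       # "pre-bill"
```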

29 Impact of the Pre-billing Review Process First failure is taken very seriously Providers and departments are more actively engaged Number of first failures that pass the next review has grown significantly

30 Evaluating the Effectiveness of the Risk-based Audit Model
Resources are devoted to the providers or issues where help is most needed. Risk areas are addressed more quickly. The quarterly review and meeting cycles keep this program front and center. This model, coupled with strong support from the Dean and Board of Governors, has given the Billing Quality Assurance Program even greater credibility.

31 Evaluating the Effectiveness of the Risk-based Audit Model
The database will be used to analyze:
- # and % of first-time failures that converted to semi-annual or bi-annual frequency
- # and % of bi-annual providers that stayed or changed to a semi-annual or quarterly frequency
- # of first-timers who are slotted into quarterly, semi-annual, or bi-annual frequency
Achieved our FY 05 management goal of reducing by 10% the number of providers who do not pass. Automating the process was essential to implementing and managing the risk-based model.

32 Questions?