
Targeted DRG Reviews to Optimize Documentation, Coding, Metrics, and Reimbursement
Erica E. Remer, MD, FACEP, CCDS
President and Founder, Erica Remer, MD, Inc., Beachwood, OH

Learning Objectives
At the completion of this educational activity, the learner will be able to:
- Select DRGs for targeted review
- Identify opportunity by performing targeted reviews
- Create and implement an action plan based on findings
- Improve documentation, quality metrics, and reimbursement in targeted areas

Why Me?
- Emergency physician for 25 years
- Clinical documentation integrity physician advisor for a multihospital system for 4 years
- Consultant in documentation, CDI, and ICD-10 since July 2016
- I am not an informaticist or analyst

How Did I Settle on This Process?

What Did I Find?
- Opportunity for improved provider documentation
- Coding opportunity (ICD-10)
- CDI opportunity
- Compliance opportunity (H&P, APP attestation)
- Opportunity for education

Data Used in This Presentation
- The data is ever evolving, but the numbers don't change drastically year to year
- Relative weights from the CMS MS-DRG list, version 34 (2016)
- DRG distribution from the 2015 CMS data set (MedPAR)

Targeted DRG Review
What: analysis of
- a specific diagnosis or procedure
- a specific DRG
- a specific service line
Why:
- Identify opportunity in coding and in documentation
- Optimize quality metrics, CMI, and reimbursement

Which DRGs?

How to Select a Target
- They come to you for help
  - Ranking
  - Quality metrics
  - $
  - Skeptical of CMI
- New provider
- Service line (e.g., trauma)
- Residency program

How to Select a Target
- Recurrent queries for the same comorbidities
  - Provider problem vs. service line problem
  - Process problem
- Current practice or literature identifies targets
  - DRG shifts
  - Clinical validation denials
  - OIG targets
- High yield expected
  - High volume: C-section without CC/MCC
  - High likelihood: vasculopaths undergoing extremity bypass; deceased without CC/MCC
  - High yield: surgical DRGs with a high weight differential between the no-CC/MCC and with-CC/MCC tiers

How to Select a Target
- Fishing expedition
  - Deviation from benchmarks (national or state average, benchmark organizations like Vizient, competitors)
  - No CC/MCC (in either two- or three-tiered DRG sets)

What Is Wrong With CMI as Sole Metric?

No CC/MCC Percentage
[No CC/MCC] / ([No CC/MCC] + [CC] + [MCC])
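The no-CC/MCC percentage can be computed directly from tier counts; a minimal sketch in Python (the counts below are illustrative, chosen only to match the percentages shown later, not taken from any real data set):

```python
def no_cc_mcc_pct(no_cc_mcc: int, cc: int, mcc: int) -> float:
    """No CC/MCC percentage = [No CC/MCC] / ([No CC/MCC] + [CC] + [MCC])."""
    total = no_cc_mcc + cc + mcc
    return 100.0 * no_cc_mcc / total if total else 0.0

# Illustrative counts for a three-tiered set such as 193/194/195
print(f"{no_cc_mcc_pct(3136, 3392, 3472):.2f}%")  # 31.36%
```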

Three-Tiered DRG
- Simple Pneumonia and Pleurisy: 193 With MCC; 194 With CC; 195 No CC/MCC
- Respiratory Infections and Inflammations: 177 With MCC; 178 With CC; 179 No CC/MCC

Two-Tiered DRG
- Viral Meningitis: 75 With CC/MCC; 76 No CC/MCC
- Epistaxis: 150 With MCC; 151 No MCC

Comorbidity Opportunity: Simple Pneumonia and Pleurisy
                 Benchmk    Your inst.
193 With MCC     41.90%     34.72%
194 With CC      42.61%     33.92%
195 No CC/MCC    15.49%     31.36%


Benchmarking and Results
Premises:
- Looking for educational opportunity
- Lessons learned from one DRG set are transferable to other DRG sets
- Aggregate benchmarks are suspect
- [No CC/MCC] will never reach zero
- A prior time period will give an approximate baseline for your organization, but CMI is multifactorial
- Shifts are not guaranteed to stay in the same DRG set

Targeted DRG Review
- Process
- Reports
- What to do with findings

Step 1
1. Select your targeted DRG(s)
   - How many charts to review? (All or a subset?)
   - No CC/MCC (most productive review)
   - Why not a CC → MCC review?

Step 2 (step 1 as above)
2. Run a report with:
   - Patient and provider identifiers
   - Dates of service (calculate the actual LOS)
   - Current DRG (MS? APR?)
   - Relative weight, ALOS, SOI, ROM, $

Step 3 (steps 1–2 as above)
3. Make a spreadsheet by adding to the report:
   - Post-review numbers: DRG2, RW2, ALOS2, SOI2, ROM2, $2
   - Pre-review → post-review change: ΔRW, ΔALOS, ΔSOI, ΔROM, Δ$
   - CDI review (Y/N; initials of the CDIS; opportunity; no response from the HCP?)
   - Your findings of opportunities (columns for common ones; categories; notes for feedback)
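The change columns in Step 3 are just the post-review value minus the pre-review value; a minimal sketch, treating each spreadsheet row as a dict (the field names and the relative-weight/LOS/SOI/ROM values are my own illustrations; only the dollar figures come from the valve-replacement example later in the deck):

```python
def add_deltas(row: dict) -> dict:
    """Append change columns: post-review value minus pre-review value."""
    for field in ("rw", "alos", "soi", "rom", "dollars"):
        row[f"delta_{field}"] = row[f"{field}2"] - row[field]
    return row

# One illustrative encounter
enc = add_deltas({"rw": 5.0, "rw2": 6.5, "alos": 4.0, "alos2": 6.0,
                  "soi": 2, "soi2": 3, "rom": 2, "rom2": 3,
                  "dollars": 48540, "dollars2": 62950})
print(enc["delta_dollars"])  # 14410
```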

Step 4 (steps 1–3 as above)
4. Reverse order from newest to oldest
   - What is your rebilling window?
   - What is your time threshold for retrospective queries?

Step 5 (steps 1–4 as above)
5. If volume mandates selection, look at encounters where the actual LOS exceeds the ALOS (by a number of days? by a percentage?). Sort by longest LOS to shortest.

Step 6 (steps 1–5 as above)
6. Review the MR
   - Refer to the coding abstract
   - Previous encounters
   - Labs/reports/consults, etc.
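Step 5's triage rule (keep charts whose actual LOS exceeds the ALOS, longest first) is easy to script; a minimal sketch, with each encounter again a dict (MRNs and LOS values invented for illustration):

```python
def select_for_review(encounters, min_excess_days=0):
    """Keep encounters whose actual LOS exceeds the DRG's ALOS by more
    than min_excess_days, sorted longest actual LOS first."""
    over = [e for e in encounters
            if e["actual_los"] - e["alos"] > min_excess_days]
    return sorted(over, key=lambda e: e["actual_los"], reverse=True)

charts = [{"mrn": "A", "actual_los": 9, "alos": 4.6},
          {"mrn": "B", "actual_los": 3, "alos": 4.6},
          {"mrn": "C", "actual_los": 6, "alos": 4.6}]
print([e["mrn"] for e in select_for_review(charts)])  # ['A', 'C']
```

Raising `min_excess_days` (or switching the comparison to a percentage of ALOS) tightens the selection when volume is still too high.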

What Are You Looking For?
- In the right DRG? Right principal diagnosis, right principal procedure?
- Are there CC/MCCs that were documented but not coded?
- Are there conditions intimated but not documented adequately for coding (CC/MCC/HCCs)? Did a CDIS review it concurrently and catch them?
- Are all ICD-10-PCS procedures captured?
- Note all opportunities, even if they don't impact the metrics (you may catch patterns or trends; feedback on documentation for HCPs)
- PSI/HAC, medical necessity, quality issues?
- Any compliance issues? (e.g., critical care billed but inadequate documentation; no preoperative H&P in the encounter)

Demographics (sample spreadsheet columns)
Pt name: Doe, John | MRN | MD: ER | Date adm: 1/3/16 | Date d/c'd: 1/12/16 | Actual LOS [= d/c date − adm date]
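The Actual LOS column above is just the discharge date minus the admission date; using the sample dates from the slide:

```python
from datetime import date

def actual_los(adm: date, dc: date) -> int:
    """Actual LOS in days = discharge date minus admission date."""
    return (dc - adm).days

# Sample encounter from the slide: admitted 1/3/16, discharged 1/12/16
print(actual_los(date(2016, 1, 3), date(2016, 1, 12)))  # 9
```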

DRG Numbers
Columns: MS-DRG | MS RW | MS ALOS | SOI | ROM | $
Example: $48,540 (Endovascular Cardiac Valve Replacement without MCC)

CDI Review
Columns: CDIS Y/N | CDIS | Query | Notes
Example: Y | JR | Y | Queried HF; no response

DRG Numbers (post-review)
Columns: MS-DRG2 | MS RW2 | MS ALOS2 | SOI2 | ROM2 | $2
Example: $62,950 (Endovascular Cardiac Valve Replacement with MCC)

Δ in DRG Numbers
Columns: ΔMS RW | ΔMS ALOS | ΔSOI | ΔROM | Δ$
Example: Δ$ = $14,410
Use a consistent Δ$ calculation (base rate; encoder output)

Opportunity Findings
Columns: Coding opp | Coding opp note | Doc opp (Ac HF, Ac resp fail, Enceph) | Documentation opportunity note
Example row: 3 | NA | Y | "with ADHF, EF 32%; BNP 8145; fluid overloaded, diuresis of 2 L, symptomatic improvement"
Coding opp key: Y w/Δ = 1; Y wo/Δ = 2; N = 3
Columns for the most common conditions for that DRG: Y = 1, N = 0 (can tally)

What Are You Looking For?
Sort findings by:
- Actual recouped $/improved metrics (coding opportunities)
- Queries generated; if agreed, recouped $/improved metrics (provider documentation opportunities)
- Queries indicated but not done concurrently (CDIS opportunities)
- Potential recouped $/improved metrics (likely vs. difficult to tell)
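With the 1/2/3 and 0/1 coding scheme above, the tallies fall out of simple sums; a sketch with three invented rows (the column names are my own shorthand for the slide's columns):

```python
from collections import Counter

# Coding opp key: 1 = yes with DRG change, 2 = yes without change, 3 = none
rows = [
    {"coding_opp": 3, "ac_hf": 1, "ac_resp_fail": 0, "enceph": 0},
    {"coding_opp": 1, "ac_hf": 1, "ac_resp_fail": 1, "enceph": 0},
    {"coding_opp": 2, "ac_hf": 0, "ac_resp_fail": 0, "enceph": 1},
]

coding_tally = Counter(r["coding_opp"] for r in rows)
hf_tally = sum(r["ac_hf"] for r in rows)  # condition columns are 0/1
print(coding_tally[1], coding_tally[2], hf_tally)  # 1 1 2
```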

Reporting
- The administration/provider leader will want numbers and findings in a digestible format
  - Spreadsheet
  - Report
  - Action items

Educational Opportunities
- Feedback to coders, CDISs, and HCPs
- Sum up findings for more general dissemination
  - Educational presentations, blasts
  - Tip cards
  - CDI tips

Ongoing Reviews
- Identify DRGs that warrant ongoing second-level review
- Continued feedback and repeat education as needed

Permanent Impact
Impossible to quantify exactly.

Comorbidity Opportunity: Simple Pneumonia and Pleurisy
                 Benchmk    CO 1       CO 2
193 With MCC     41.90%     34.72%     37.88%
194 With CC      42.61%     33.92%     39.84%
195 No CC/MCC    15.49%     31.36%     22.28%

Comorbidity Opportunity: Simple Pneumonia and Pleurisy
                 Benchmk    Your Benchmk    CO 2
193 With MCC     41.90%     39.34%          37.88%
194 With CC      42.61%     42.01%          39.84%
195 No CC/MCC    15.49%     18.65%          22.28%

[Chart: CC/MCC distribution (No CC/MCC, CC, MCC) across Benchmark, 1st Review, True Benchmark, and 2nd Review]

Examples of Findings: Neurosurgery DRG 27
- Not homogeneous; adult brain excisions with significant opportunity
- Coding accuracy was 97%
- 27.4% with 1 or more opportunities (26/95)
- 22/36 of opportunities were for cerebral edema and/or brain compression
- CMI would have gone from ___ to ___, with additional revenue of > $200,000; actually recouped ~$40,000

Examples of Findings: Neurosurgery DRG 27
- Need the preoperative H&P in THIS encounter
- Review and capture diagnoses from the imaging that elicited the surgery
- Encephalopathy
- Acute respiratory failure
- Seizure DISORDER
- Pathology
- Malnutrition

Examples of Findings: CORS
- Looked at 331, 334, 346 [CORS cases with no CC/MCC] for 6 months
- As the actual LOS approaches the ALOS, there is less opportunity
- Coding accuracy was 92.2%
- 41.2% with 1 or more opportunities (24/51)
- CMI would have gone from ___ to ___, with additional revenue of almost $200,000; actually recouped > $55,000

Examples of Findings: CORS
- Lower-acuity community hospitals with less opportunity
- Missing chronic illnesses found in the preop workup but not brought into the current encounter
- Vascular disorders not documented in a codable format ["dead," "nonviable," "dusky" without ischemia or gangrene]
- ABLA
- AKI/CKD
- Pathology results (especially lymph node mets)
- Malnutrition

Obstetrics

Learning Objectives
At the completion of this educational activity, the learner will be able to:
- Select DRGs for targeted review
- Identify opportunity by performing targeted reviews
- Create and implement an action plan based on findings
- Improve documentation, quality metrics, and reimbursement in targeted areas

Thank you. Questions?

In order to receive your continuing education certificate(s) for this program, you must complete the online evaluation. The link can be found in the continuing education section at the front of the program guide.