Transcription of Using FPSC Benchmark Data to Understand Academic Radiation Oncology

Using FPSC Benchmark Data to Understand Academic Radiation Oncology
Robert C. Browne, Director, UHC-AAMC FPSC
March 21, 2010

The FPSC in Brief: Participating Institutions
• Began as UHC CPT Database in 1995
• FPSC Advisory Group created in 2000
• FPSC created in 2001
• 87 participating institutions nationwide
• 65,000+ participating physicians
• 100+ unique subspecialties
• 200+ million records, 40 gigabytes of data
• Hundreds of performance measures

UHC-AAMC FPSC Participants
Albany Medical Center, Saint Louis University, University of Massachusetts, Baystate Health System, Stanford University, University of Miami, Beth Israel-Deaconess, SUNY at Stony Brook, University of Michigan, Brigham & Women's, SUNY Downstate, University of Minnesota, Cedars-Sinai Medical Center, SUNY Upstate, University of Mississippi, Clarian Health Partners, The Emory Clinic, University of Missouri-Columbia, Columbia University, The Methodist Hospital Physician Organization, University of Missouri-Kansas City, Denver Health, University of Nebraska, Duke University, The Ohio State University, Thomas Jefferson University, University of New Mexico, East Carolina University, University of North Carolina, Georgetown University, Tufts Medical Center, Tulane University Medical Group, University of Oklahoma (OU Physicians), Howard University, Indiana University, University of Alabama, University of Arizona, University of Pennsylvania, Johns Hopkins University, University of Rochester, Kansas University Physicians, University of Arkansas, University of California-Davis, University of South Florida, LifeBridge Health, University of California-Irvine, UTMB Galveston, Loyola University, University of California-Los Angeles, University of Tennessee, LSU Healthcare Network, Massachusetts General, University of California-San Diego, University of Texas San Antonio, Medical College of Georgia, University of California-San Francisco, University of Toledo Physicians, Medical College of Wisconsin, University of Chicago, University of Utah, Medical University of South Carolina, University of Cincinnati, University of Vermont, Montefiore Medical Center, University of Colorado, University of Virginia, Morehouse Medical Associates, University of Connecticut, University of Washington, Mt. Sinai Faculty Practice Associates, University of Florida, University of Wisconsin, NLSU Health System, University of Illinois, Vanderbilt University, Northwestern University, University of Iowa, VCU School of Medicine/MCV, Oregon Health and Science University, University of Kentucky Physicians, Rush Medical College, University of Louisville, Wake Forest University Physicians, University of Maryland, West Virginia University, Weill Cornell Physician Organization, Yale University

FPSC Benchmark Development Process: Key Goals
• Maximize sample size (both number of MDs and number of institutions represented)
• Ensure that the sample reflects a population of clinically active faculty
• Generate a stable distribution (i.e., eliminate outliers)
• Identify relevant subpopulations

FPSC Benchmark Process Overview
• By participants: billing data transmitted to FPSC; clinical effort reported for selected MDs
• By FPSC: candidate physicians identified for the benchmark pool; RVUs calculated; clinically active MDs selected for inclusion in the benchmarking pool; specialty-specific benchmark measures calculated
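The slides do not spell out the inclusion and exclusion rules behind "clinically active MDs" and "stable distribution," so the following is only a minimal sketch of that pool-selection step. The activity floor (min_annual_wrvus) and the percentile trim are illustrative assumptions, not the FPSC's actual criteria.

```python
# Minimal sketch of the benchmark-pool selection step described above.
# The activity floor and the percentile trim are illustrative assumptions,
# not the FPSC's actual inclusion/exclusion rules.
from statistics import median, quantiles

def build_benchmark_pool(work_rvus_by_md, min_annual_wrvus=100.0):
    """Keep clinically active MDs, then drop extreme outliers to stabilize the distribution."""
    # 1. Keep only physicians above a (hypothetical) clinical-activity floor.
    active = [wrvu for wrvu in work_rvus_by_md.values() if wrvu >= min_annual_wrvus]
    if len(active) < 3:
        return active
    # 2. Trim the extreme tails so a single atypical MD cannot skew the benchmark.
    cuts = quantiles(active, n=100)      # ~1st ... ~99th percentile cut points
    lo, hi = cuts[0], cuts[-1]
    return [wrvu for wrvu in active if lo <= wrvu <= hi]

pool = build_benchmark_pool({
    "md_001": 8200.0, "md_002": 150.0, "md_003": 11800.0,
    "md_004": 40.0, "md_005": 9600.0, "md_006": 31000.0,
})
print(f"pool size = {len(pool)}, median work RVUs = {median(pool):.0f}")
```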

Automated Electronic Transfer Allows Efficient Data Capture
• Participants send physician-level billing data to the FPSC.
• Data is electronically extracted and sent from the billing office.
• Data in (at the procedure level): Total Billings for each Procedure, Site of Service for each Procedure, CPT Code for the Procedure, Payer Class for each Procedure, CPT Code Modifiers, ICD-9 Codes (first four), Frequency of Billed Procedure, Patient MRN, Patient Demographic Data (age, sex, race, zip code).
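Purely for illustration, the procedure-level "data in" fields above could be carried in a record like the one below. The field names and types are assumptions made for this sketch, not the FPSC's actual file layout.

```python
# Illustrative procedure-level billing record mirroring the "data in" fields above.
# Field names and types are assumptions for this sketch, not the FPSC's file layout.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcedureBillingRecord:
    physician_id: str            # submitting physician (data is physician-level)
    cpt_code: str                # CPT code for the procedure
    cpt_modifiers: List[str] = field(default_factory=list)  # CPT code modifiers
    icd9_codes: List[str] = field(default_factory=list)     # ICD-9 codes (first four)
    frequency: int = 1           # frequency of billed procedure
    total_billings: float = 0.0  # total billings for the procedure
    site_of_service: str = ""    # site of service
    payer_class: str = ""        # payer class
    patient_mrn: str = ""        # patient MRN
    patient_age: int = 0         # demographics: age
    patient_sex: str = ""        # demographics: sex
    patient_race: str = ""       # demographics: race
    patient_zip: str = ""        # demographics: zip code

rec = ProcedureBillingRecord(physician_id="md_017", cpt_code="77263",
                             icd9_codes=["174.9"], frequency=3, total_billings=1250.0,
                             site_of_service="11", payer_class="Medicare",
                             patient_mrn="000123", patient_age=64, patient_sex="F")
print(rec.cpt_code, rec.frequency)
```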

FPSC Applies Multi-Stage Validation and Standard Approach to Calculating RVUs
• FPSC cleans, scrubs, and validates the data and converts CPT frequencies into RVUs using a standard methodology.
• Data out: Work RVUs, Total RVUs, Clinical Fingerprint, Coding Distributions.

RVU Source Data
• Data sources: Medicare RBRVS Fee Schedule (period specific); The Complete RBRVS, Relative Value Studies, Inc.
• Gap filling: a local charge:RVU ratio at the specialty level gives RVU credit to physicians performing unlisted procedures.

What Does CFTE Mean to You?
Clinical Full-Time Equivalent, OR Constantly Fighting about Time and Effort.
The academic conundrum: since faculty time is spread among clinical, research, teaching, and administrative activities, time and effort (T&E) must be normalized when benchmarking.
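Tying these pages together, a work-RVUs-per-CFTE figure can be sketched as: look up each billed CPT code in an RVU schedule, credit unlisted codes through a charge:RVU ratio, sum, and divide by the physician's CFTE. The schedule values, the charge-to-RVU ratio, and the helper names below are hypothetical; only the overall arithmetic follows the slides.

```python
# Sketch of converting CPT frequencies into work RVUs and normalizing per CFTE.
# The RVU schedule values and the specialty charge:RVU ratio are made-up numbers;
# the FPSC draws RVUs from the Medicare RBRVS fee schedule and The Complete RBRVS.
WORK_RVU_SCHEDULE = {"77263": 3.14, "77290": 2.03}   # hypothetical per-unit work RVUs
SPECIALTY_CHARGE_PER_WRVU = 95.0                      # hypothetical local charge:RVU ratio

def work_rvus_per_cfte(billed_lines, cfte):
    """billed_lines: (cpt_code, frequency, total_billings) tuples; cfte: clinical FTE in (0, 1]."""
    total_wrvus = 0.0
    for cpt, freq, billings in billed_lines:
        if cpt in WORK_RVU_SCHEDULE:
            total_wrvus += WORK_RVU_SCHEDULE[cpt] * freq
        else:
            # Gap filling: an unlisted procedure gets RVU credit via the charge:RVU ratio.
            total_wrvus += billings / SPECIALTY_CHARGE_PER_WRVU
    return total_wrvus / cfte   # normalize raw output to a 1.0-CFTE basis

lines = [("77263", 400, 0.0), ("77290", 350, 0.0), ("77299", 10, 4750.0)]  # 77299 = unlisted
print(f"{work_rvus_per_cfte(lines, cfte=0.82):,.0f} work RVUs per 1.0 CFTE")
```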

Among Approaches to Account for Faculty T&E, 3 Methodologies Most Common
• Time/schedule-based
• Self-reported via survey
• Salary-based

MDs in the 2009 FPSC Radiation Oncology Benchmark Have an Average CFTE of 82%.

FPSC Designed to Address Common Pitfalls in Benchmarking Data
• Pitfall: existing comparative data not reflective of AHC faculty. FPSC approach: numerous faculty groups participating, broad scope of specialties, and continuous feedback and refinement through member involvement.
• Pitfall: inaccuracies of survey data. FPSC approach: data submitted electronically.
• Pitfall: missing or misclassified data. FPSC approach: consistent methodology in RVU calculation.
• Pitfall: significant year-to-year variability in existing comparative data. FPSC approach: individual MD detail allows exclusion of outliers and analysis of coding behaviors.

What Benchmark Measures Does the FPSC Provide?
• Work RVUs, Total RVUs, and Billed Charges per CFTE
• Evaluation and Management (E&M) Coding Distributions
• Scope and Mix of Services (Clinical Fingerprint)
• Charge Lag Analysis
• Charge Summary Statistics
• Revenue Cycle Performance: Collections, Denials, AR
• Payment Forecasting
• Custom Peer Cohort Benchmarking
• Others

Clinical Activity Highly Variable: Sample Departments vs. 2009 FPSC Benchmarks

Differential Diagnosis for Variable Clinical Activity
• Operational barriers: lack of space, aging infrastructure
• Variable operational support and resources: clinical and non-clinical support staff shortages
• New practice ramp-up
• Patient no-shows
• Visit mix and practice composition: new vs. established patients, procedures vs. E&M work, faculty with part-time practices
• Inconsistent coding and billing: under-coding, incorrect modifier use, unbilled services and procedures
• Inefficiencies: training, clinical processes

Percent New Patient Visits* Can Impact Productivity and Access: Sample Departments vs. FPSC Benchmarks
* Percent New Patients = (Count of 99201-99205 + 99241-99245) / (Count of 99201-99205 + 99211-99215 + 99241-99245)

Key Benefits of Focusing on Access for New Specialty Patients
• Improvement in payer mix and collections per unit of service by reducing access barriers that alienate favorably insured patients
• More work RVUs and total RVUs per unit of specialist time expended, and therefore increased revenue
• Greater volume of procedures per patient encounter through successful screening and work-up of new patients
• Greater downstream professional fee and facility revenues from broadening the patient base served
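The new-patient share defined in the footnote above can be computed directly from CPT counts. A small sketch follows; the visit counts are hypothetical.

```python
# Percent New Patients, per the footnote above:
# (count of 99201-99205 + 99241-99245) / (count of 99201-99205 + 99211-99215 + 99241-99245)
def percent_new_patients(counts_by_cpt):
    new_codes = [f"992{n:02d}" for n in range(1, 6)]     # 99201-99205: new patient visits
    est_codes = [f"992{n}" for n in range(11, 16)]       # 99211-99215: established visits
    consult_codes = [f"992{n}" for n in range(41, 46)]   # 99241-99245: consults
    new = sum(counts_by_cpt.get(c, 0) for c in new_codes + consult_codes)
    total = new + sum(counts_by_cpt.get(c, 0) for c in est_codes)
    return new / total if total else 0.0

# Hypothetical visit counts for one department
counts = {"99203": 120, "99204": 80, "99244": 40, "99213": 300, "99214": 260}
print(f"Percent new patients: {percent_new_patients(counts):.1%}")
```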

Practice Composition: Distribution of Services by CPT Code Is a Key Driver of Variability
Faculty Practice Solutions Center Clinical Fingerprint: Work RVUs per 1.0 CFTE

CPT Code Family              Dept A Mean   Dept B Mean   FPSC Mean
Surgery                               49            27          66
Radiology                         10,931         7,811       9,189
Pathology & Laboratory                 5             0
Medicine                             109            16
Evaluation & Management              838         1,217       1,243
All CPT Ranges/Codes              11,822         9,165      10,514

Distribution of Services by CPT Code: Work RVUs per CFTE, Radiation Oncology Codes

Radiation Oncology CPT Codes                          Dept A Mean   Dept B Mean   FPSC Mean
77261-77263  Radiation therapy planning                       831           610         694
77280        Set radiation therapy field, simple              196           113         102
77285        Set radiation therapy field, intermediate          4
77290        Set radiation therapy field, complex              318           322         350
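The "Clinical Fingerprint" rows above are, in effect, work RVUs summed within CPT code families and normalized to 1.0 CFTE. Below is a sketch of that roll-up; the family boundaries follow the standard CPT sections and the data are hypothetical, since the FPSC's exact groupings are not given in the slides.

```python
# Sketch: roll up work RVUs by CPT code family and normalize to 1.0 CFTE.
# The family ranges follow the standard CPT sections but are an assumption here;
# the FPSC's exact groupings are not specified in the slides.
from collections import defaultdict

CPT_FAMILIES = [                      # (family name, low code, high code)
    ("Surgery", 10004, 69990),
    ("Radiology", 70010, 79999),
    ("Pathology & Laboratory", 80047, 89398),
    ("Medicine", 90281, 99199),
    ("Evaluation & Management", 99201, 99499),
]

def family_of(cpt_code):
    code = int(cpt_code)
    for name, lo, hi in CPT_FAMILIES:
        if lo <= code <= hi:
            return name
    return "Other"

def clinical_fingerprint(work_rvus_by_cpt, cfte):
    """work_rvus_by_cpt: raw work RVUs keyed by CPT code; returns wRVUs per 1.0 CFTE by family."""
    fingerprint = defaultdict(float)
    for cpt, wrvus in work_rvus_by_cpt.items():
        fingerprint[family_of(cpt)] += wrvus / cfte
    return dict(fingerprint)

# Hypothetical department totals (radiation oncology codes fall in the Radiology section)
print(clinical_fingerprint({"77263": 560.0, "77290": 260.0, "99214": 690.0}, cfte=0.82))
```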

