Physician Report Cards

Our facility is investigating creating "Physician Report Cards" to share SOI/ROM/quality indicator information with service lines, group practices, and individual physicians. Would anyone please share the software tools you are using to report this information, and your processes? We currently have 3M, Midas, and Midas Statit. Also, how has this been received?

Linda Rhodes RN, BSN, CCDS
Manager Clinical Documentation Improvement
New Hanover Regional Medical Center
Wilmington, North Carolina
Office # 910-815-5544
Cell # 910-777-8344
E-mail: linda.rhodes@nhrmc.org

Comments

  • edited May 2016
    I just recently learned our organization uses the "CRIMSON Initiative" for physician performance and profiling although I do not have access to the data. Is anyone familiar with that program and does it contain information that would be beneficial to a CDI program?
    Thank you,
    Linnea Thennes, RN, BS, CCDS
    Supervisor, Clinical Documentation Improvement
    Centegra Health System
815.759.8193
    lthennes@centegra.com
  • I work very closely with CRIMSON here. We implemented CRIMSON a little less than a year ago. It is very helpful to me with regard to physician buy-in related to documentation. Basically, our CRIMSON team tries to meet with most of the MDs individually or by service line. I also attend each meeting and generally go over a few cases beforehand. I then show them ways they can improve their data based on documentation. I find it a really helpful way to make them see the impact of their documentation. I have also gone over the basics with the CRIMSON team, and they are quick to recommend improved documentation to improve data even when I am not present.
    One of the problems that has come up with CRIMSON is that the doctors are quick to blame coding for problems. So, even though I am not a coder, I spend a lot of time going over the coding and making sure that the patient was accurately captured. If there are even minor coding issues (which I'm sure every facility has), the docs are quick to dismiss the data as a coding problem. Our coding manager is often overwhelmed with requests for coding review based on CRIMSON. The expectation of coding has changed dramatically due to the combined implementation of CDI and CRIMSON.
  • I would say the only way to answer such concerns is to audit a valid
    sample of the cases to confirm the coding reflects the documentation.
    I was told in the past that the 'coding' was flawed because our
    facility did not report SIRS in patients with elevated lactic acid,
    WBC with left shift, and vitals 'supportive' of SIRS. A review of the
    cases in question found that septicemia/SIRS was not DOCUMENTED. Many
    do not know or understand the very explicit compliance rules to which
    coding is held. That is not to say coding is perfect; sometimes it is
    part of the problem.


    Paul Evans, RHIA, CCS, CCS-P
    Supervisor, Clinical Documentation Integrity, Quality Department
    California Pacific Medical Center
    2351 Clay #243
    San Francisco, CA 94115
    Cell: 415.637.9002
    Fax: 415.600.1325
    Ofc: 415.600.3739
  • Right. We often get questioned on things that are obvious to me and/or coding but that physicians do not understand. Sure, we notice coding errors too, but we try to keep the focus on documentation. We just find that physicians will grasp at anything they can to "blame" any data that appears less than perfect.
    I have not worked in this area for long (one year), but my understanding is that in the past it was rare for a physician to look at the hospital billing on a given patient. What we are finding with CRIMSON is that physicians (as well as CRIMSON staff) now regularly drill down in the system to the individual patient billing level. If they see anything they do not understand (a primary dx that is not what they think it should be, a complication code, low SOI/ROM, etc.), they question us (CDI and coding) on it. We then spend a great deal of time going through the record, ensuring that everything is accurate, and then explaining it to the MD or CRIMSON staff. I like that the MDs are looking at the data, but it does put additional pressure on the coding staff. If we do find a problem, we lose credibility with the physicians. If the problem cannot be fixed (rebilled), then we have an angry MD.
    In the end, I find it overwhelmingly helpful for us because it gets the MD's aware and involved in their data. But, it has been time intensive for us because of how actively involved we are in the program.

    Katy
  • Katy: I agree with you.

    Any time we can get the Medical Staff to 'pay attention' to
    documentation and subsequent coding and quality metrics, it is a 'win'.
    As you stated, it takes a lot of time to drill down to the account level
    in order to comprehend any reasons for 'dissonance'.

    I have found that many of the physicians seem to have taken quite a bit
    of coding advice from totally unqualified sources, making this entire
    endeavor more difficult. (One does not learn coding by taking a few
    'easy' weekend classes at the local junior college - I see
    advertisements for adult education classes advocating this very idea.
    There are many 'layers' of qualification and education among coders.)

    Clinicians often become frustrated when charted signs, symptoms, and test
    results obviously equal condition "X" and we ask them to explicitly
    state the same. However, these are the rules, and the rules are valid.

    Paul Evans, RHIA, CCS, CCS-P
    Supervisor, Clinical Documentation Integrity, Quality Department
    California Pacific Medical Center
    2351 Clay #243
    San Francisco, CA 94115
    Cell: 415.637.9002
    Fax: 415.600.1325
    Ofc: 415.600.3739
  • Paul,
    Exactly. My other complaint is that although CRIMSON (as far as I know) gathers all its data from coding, its liaisons are not coders. Not that they should all be coders, but you would think they would need some background.
    I am currently in an ongoing argument about our intensivists' O/E mortality ratios, and I am slowly getting the idea that our CRIMSON physician liaison knows very little about coding. So he is giving our CRIMSON staff explanations for the low O/E ratio that seem ridiculous to our coding manager and me.
    For example, he is questioning our use of DRGs 207 and 208 (respiratory diagnosis with ventilator) when we have a patient with a respiratory diagnosis (like pneumonia) who was on a ventilator. I have explained numerous times that we cannot simply not code the vent (nor would we want to). But he simply does not like these DRGs and continues to tell me they appear to be nonspecific and that we should be coding the underlying condition. We ARE coding the underlying PNA; that is the Pdx. But when you add the vent, you end up in 207, which is where we want to be for the relative weight anyway, to capture the vent care. The fact that he does not understand how DRGs and a Pdx are assigned is concerning. He is an MD, but clearly he could use some coding knowledge.

    Katy
  • Katy: Not much time to respond - yes, I have similar stories as well.
    Our only recourse is to code as accurately and compliantly as allowed
    per the Official Guidelines. Given that the guidelines now exceed 100
    pages, there are many who do not understand how codes are used - but
    it seems you are seeking to do just that.

    Paul

    Paul Evans, RHIA, CCS, CCS-P
    Supervisor, Clinical Documentation Integrity, Quality Department
    California Pacific Medical Center
    2351 Clay #243
    San Francisco, CA 94115
    Cell: 415.637.9002
    Fax: 415.600.1325
    Ofc: 415.600.3739
  • edited May 2016
    Could anyone provide a link to CRIMSON? A quick Google search pulls up something from the Advisory Board that might be what is referenced. It looks like the Continuum of Care portion might be what folks are referring to.

    Is this accurate, or is there something else I've missed?

    Does this tool/suite also provide direct ROM/SOI-type profiling data? That's what it sounds like to me from folks' comments.

    Thanks,
    Don
  • edited May 2016
    I agree, it would seem that the resources (and basic knowledge for the liaison from CRIMSON) really would need to include a reasonable amount of coding/DRG understanding. Discussing, explaining, advising, and motivating medical staff would absolutely require that knowledge base.

    Has that feedback been provided back to CRIMSON, and what was their response?

    It sounds like Katy's organization is stepping up and attempting to provide/explain those perspectives. I have had similar conversations recently in the setting of expected mortality models.

    Don
  • Don, yes, it is an Advisory Board product. It provides individual physician data as well as group data for many metrics, including basic demographics, quality data, SOI/ROM, LOS, mortality rates, etc. The MDs can log into the system themselves and access their current data (up to date within about 1-3 months for our institution) at any time. They can then take their data or their group data (e.g., gen surg or critical care) and compare it to other physicians and/or "like" hospitals in the nation.

    Paul,
    I have not been in direct contact with CRIMSON. I am being asked questions by our CRIMSON staff and then responding to them. She then forwards my responses to him, and he responds to her. We are supposed to be setting up a conference call in the near future.
    I personally think this is a data-attribution issue. Until recently, our hospital assigned attending MDs based solely on who discharged the patient. This does not work in a facility like ours, which uses a team model for MDs. One MD may see a patient for 7 days of an 8-day stay, but if another MD comes on the last day and discharges, he/she would be tagged as attending even though they saw the patient 1 of 8 days. This is especially important in critical care. We rarely discharge directly from the ICU (mainly only ETOH/OD issues); most patients are eventually transferred to the floor or die in the unit. Therefore, the attending MD for almost all the patients the ICU treats would be one of our hospitalists after the transfer to the floor. CRIMSON attributes data to the attending MD. So, with the way we were assigning "attending," our intensivists have very few patients where they are the attending MD, and almost all of those cases are either low SOI/ROM or deaths. Now we are seeing that our intensivists have higher-than-expected mortality rates. I believe this is due to our sample: they have few cases, and many are deaths; the ones who don't die and are discharged directly from the ICU have low SOI/ROM. Given that scenario, it seems obvious that their O/E ratios would be off. We have since changed the attending attribution, so this should improve.
    However, the CRIMSON rep is "sure" this is a coding issue. He is also focusing solely on the patients who died as the reason for the unusual O/E. I'm pretty much positive it is not a coding issue. We have a unique process for coding death charts in which the coder codes the chart and then sends it to the coding manager; the manager and I both review the chart for documentation and coding accuracy before it is dropped. Our SOI/ROM is higher than average due to this process. Also, everything I have read or been told suggests that in order to truly impact O/E ratios you have to look at all charts, not just deaths, because expected mortality rates are determined by the SOI/ROM of ALL patients, not just those who died.


    ARGGG! As you can see I'm frustrated. I feel like if CRIMSON is providing this kind of data, they should understand how this all works. I am certainly not an expert on this stuff, but I feel pretty strongly that they are on the wrong track. Please, if anyone sees something I'm missing in this process, please let me know.

    Sorry for the rant....


    Katy
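    Katy's point about expected mortality can be made concrete with a small sketch. The numbers below are hypothetical, not from CRIMSON or any real system: the expected-death count E is the sum of each patient's risk-adjusted mortality probability over ALL discharges, so better documentation on the survivors raises E and lowers O/E even though the observed deaths O are unchanged.

    ```python
    # Hypothetical sketch of an O/E mortality ratio calculation.
    # Each record: (died, expected mortality probability derived from SOI/ROM).
    patients = [
        (True, 0.60),   # expired pt with well-documented high ROM
        (True, 0.10),   # expired pt whose ROM was under-documented
        (False, 0.02),  # three survivors with low documented ROM
        (False, 0.02),
        (False, 0.02),
    ]

    observed = sum(1 for died, _ in patients if died)   # O = count of deaths
    expected = sum(p for _, p in patients)              # E = sum over ALL pts
    print(f"O = {observed}, E = {expected:.2f}, O/E = {observed / expected:.2f}")
    # O = 2, E = 0.76, O/E = 2.63

    # Better documentation on the SURVIVORS (raising each to a hypothetical
    # 0.10) raises E and lowers O/E, with nothing about the deaths changed.
    patients_better = [(d, p if d else 0.10) for d, p in patients]
    expected_better = sum(p for _, p in patients_better)
    print(f"E' = {expected_better:.2f}, O/E' = {observed / expected_better:.2f}")
    # E' = 1.00, O/E' = 2.00
    ```

    This is why reviewing only the death charts cannot move the ratio much: the deaths contribute to O either way, while most of E lives in the surviving population.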
  • edited May 2016
    Katy --
    Thanks, I know our organization has some work with the Advisory Board, but I don't believe Crimson is in use here.

    You've made a couple of excellent points.

    You're spot on with the elements about the flow and process of care (especially regarding intensivists). We have a similar pattern with cardiac pts. There is a dedicated hospitalist service that works closely with the cardiologists. As would be expected, attributing to the discharging provider skews the O/E data significantly -- the sickest pts tend to stay with the cardiologists (especially the ones who die in the CICU), and thus their O/E is quite a bit higher. I think you're on the right track in sorting that out.

    You're also exactly correct about focusing on ALL pts to affect the O/E -- it's not the pts who've died; rather, raising the E part just a little bit on ALL pts is the most successful strategy. Doing that in conjunction with the type of review you've alluded to for expired pts strikes me as best practice. There was a presentation at the 2010 ACDIS conference from MUSC (Cheryl Erickson) that discusses that type of expired-pt review.

    However, I will counter that there is value in looking at expired pts as a starting point -- one is more likely to pick up some of the easier under-documented items in that small number, which can then guide efforts as you move on to looking at all patients. It's a good place to start if a CDI program or organization is making a conscious, focused push to affect mortality and SOI profiling.

    Don
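    The attribution pattern Katy and Don describe can also be sketched quickly. The provider roles and day counts below are made up for illustration; the point is simply that attributing by discharging provider versus by who rounded on the patient most days gives different answers for the same stay.

    ```python
    from collections import Counter

    # One hypothetical 8-day stay: an intensivist rounds for 7 days in the
    # ICU, then a hospitalist takes over and discharges on day 8.
    daily_provider = ["intensivist"] * 7 + ["hospitalist"]

    # Old rule: the discharging provider is tagged as attending.
    by_discharge = daily_provider[-1]

    # Alternative rule: attribute to whoever saw the patient most days.
    by_most_days = Counter(daily_provider).most_common(1)[0][0]

    print(by_discharge)   # hospitalist
    print(by_most_days)   # intensivist
    ```

    Under the discharge rule, the intensivist's attributed population shrinks to ICU deaths and the rare direct ICU discharges, which is exactly the skewed sample Katy describes.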
  • Don,
    I agree 100%. Since we already do extensive multi-tiered reviews of all in-hospital deaths, I am fairly confident that the issues do not stem from our actual deaths. This does not mean that I think we should ignore the death charts; the first thing I did when we started this program was set up a death-chart review process. I think reviewing death charts is a great starting point for improving O/E, I just don't think we can look solely at deaths.

    Katy