Data Plan Part 2: Registry Operations Manager View ...
Video: Data Plan Part 2
Video Transcription
Well, good afternoon, everyone. We're excited that you've chosen to join us today for this webinar being offered through the TCAA. My name is Amber Kyle, the program manager at the University of Mississippi Medical Center in Jackson, and I have the opportunity today to introduce Robin Schrader. Robin is the operations manager for the registry at VCU Medical Center. She's been there for a few years, and she's got a wealth of information and knowledge about the registry and about data that we're excited to hear today. If you have any questions, please put those in the chat box. I'll be watching the chat throughout the session, and we'll wrap up by answering some of those questions. We're going to hand it over to Robin. All right, thank you, Amber. Welcome, everybody. I'm glad that you have chosen to spend an hour with us today talking about the data quality plan, looking at it from the registry view. So we will get started. There we go. All right. So the TCAA educational statement: this is letting you know that the TCAA is approved by the California Board of Registered Nursing, along with a little disclaimer from them. The objectives today are to describe data integrity and to describe the key points of a data quality plan. We all know that when the gray book was introduced, everybody said, oh my gosh, what are we going to do? We need a data quality plan. What is it, and how are we going to do this? Well, the data quality plan in the book, Resources for the Optimal Care of the Injured Patient, 2022, says all trauma centers must have a written data quality plan and demonstrate compliance with that plan. At a minimum, the plan must require quarterly review of data.
So again, looking through and trying to interpret what they're telling us: we must have a plan that tells us how we are ensuring that the data we are abstracting and pulling out of our registry is accurate, and that we have data integrity as well. Data integrity and data quality go hand in hand. It must include continuous monitoring. This is not just, hey, we're going to do this once a year. This is continuous. The plan should allow for a continuous process that measures, monitors, identifies, and corrects data quality issues and ensures the fitness of data for use. So, continuous validation. As you close out a month, you're going to validate your data. You've got monthly reports, and I'm going to go through what those monthly reports are and how to do this, but there are so many different reports that you run, and when you're running them, you should be analyzing that data and saying, whoa, this doesn't make sense; now I need to dig in and look at this, or take a deeper dive and look at that. We're going to get into each of these points as we go. So data integrity is consistent, complete, accurate, quality data maintained over time. The principles of data integrity, ALCOA, apply to the registry. The data must be attributable: it needs to be linked to the person who created it, with a timestamp of when they created it. As we look through patient charts, we have a date and a time that providers did their notes. We have a date and a time that a radiology procedure was done. We have a date and a time when they went to the OR, or whatever their procedure was, whatever they had done. So we know who did it and when they did it. Legible.
Obviously, our new electronic medical records have changed things. When I first started, we had to actually go up and look at the charts on the floor and abstract from those charts, and you had to interpret what in the world some of those physicians were putting in them. You'd come to learn the physicians you saw all the time. What's nice now is that a lot of the legibility problem has been removed by the electronic medical record. There are still some things that get scanned in, but for the most part it's legible. Next, contemporaneous: the data is entered into whatever your EMR is at the time the care was performed, not retrospectively. Here we have Epic, and as the patient comes into the bay, you have a documenting nurse on the computer, and she or he is documenting what's going on at the same time that it's taking place. Not like years ago, when they could write it on a napkin and then go in and enter it later, or whatever the case may be. It's a little bit different now. Original: it should be the original electronic medical record, or the original scanned document, whatever it is. And then accurate: it needs to be truthful, complete, and free of errors. Now, we all know that we have found dictated notes that have some weird things in them. But when you are going through a record and you find, for instance, a CT scan or an x-ray of the right femur, and the impression says there's a fracture of the left femur, well, that's an error, and it should go back to your radiology department to be fixed. And that's what we do here when we find things that are inconsistent or in error.
You're going to send that back, because we need it to be accurate, and they need it to be accurate. So data quality is a measure of a data set's condition, and it's based on data integrity. When you're looking at data quality, it's going to tie back to your integrity; you can't have one without the other. We've all heard that garbage in is garbage out. If you're putting bad data in your registry, well, you're getting bad data out. It's super important, and we're going to go through the reasons. Why is it so important that our data is quality data, and why did the ACS determine that we need to have a data quality plan? It also ties back to how many charts a full-time registrar should be doing, and why they should only be doing a limited number. It all ties into data quality. You've heard the churn and burn: get the charts done, get the charts done. Well, that leads to poor data quality; I can tell you those go hand in hand. So I think that's one of the reasons the data quality plan was added. And I can tell you, just having been through a site survey, they will hold you accountable for how many registrars you have and the quantity of charts that works out to. Every single thing that goes into your database counts as a trauma contact. It's not just NTDS patients or admitted patients; whatever you put into your registry counts as one. I would make sure that you are really within that range, because if you're not, you will get a deficiency. We did not, but we have more registrars than we need for that reason. All right. So why do we need a data quality plan? Why did they start this? First of all, performance improvement: your PIPS plan at your hospital, how can we improve our process and patient care?
Our PI tells us where we have problems, where we might be doing really great, and where we really may need some help. That might be your unplanned intubations; it might be your DVT rate. And then looking at, well, where's our prophylaxis? If we have a high DVT rate, are we getting prophylaxis started within 24 hours? Are doses being held? Maybe they're holding doses because the patient is supposed to go to the OR and then gets pushed off; now we held a dose and we didn't need to. So looking at our protocols and our practice management guidelines, it all falls together. If we're not picking these data points up and putting valid information into our system, then when we pull reports for our process improvement and performance improvement, guess what? We may very well not be getting accurate data. Then there are our admission stats. We need admission stats for staffing, and that touches all aspects of your program: your trauma registry, your PI clinicians, whatever you call them at your center, even your trauma surgeons and your advanced practice providers. Everybody in the trauma program is involved. We can even pull out of our registry, by shift: how many trauma alerts are happening on this shift, how many on that shift, so that we know how to staff each shift accordingly as well. Then research. I'm at an academic institution, as most level ones are, and we have a ton of research going. It can be multi-institutional or just our own; it can be national, state, or regional, whatever it is. But you want to provide accurate data. That's so important, because a lot of that research may go toward programs to help address whatever they're looking at. Maybe that's gun violence; it could be motorcycle crashes or bicycle crashes.
Whatever it happens to be, you really need to have accurate data. And sometimes, I think, new registrars are thrown in and they don't really understand where the data goes or why they're putting it in. They don't understand its importance. I know when I started, I did not understand all of those things, and I started asking more questions, like, what exactly are we doing here? So if you're not sure, it's important to ask. Then there's benchmarking. If you're a TQIP center, you may also have state or local benchmarking, whatever that is, but you need to have accurate data, and you also need to be following the definitions. If one facility is not following the definition for a DVT and the others are, guess what? That benchmarking is null and void, because we're not following the same rules, and that's really important. All right. Injury and violence prevention is a huge place where our registry data goes. We have an enormous injury and violence prevention program here that employs 30 or 40 people. It is huge, but it is mostly grant funded, and so I pull a lot of data throughout the year for our IVPP team so that they can get these grants. Those grants can be based on violence prevention, so they want to know how many violent injuries we have: gunshot wounds, stabbings, assaults, abuse. We are also a level one pediatric center, so we do see a lot of abuse, and it could be elder abuse, family violence, or child abuse. All of those things are very important, and you're going to pull that out of the registry. Falls: we look at our falls overall and then we look at our geriatric falls. Then your motor vehicle crashes: we want to look at seat belt use.
If you're a center that abstracts cell phone involvement, whether the crash had to do with cell phone usage in the vehicle, which is very hard to determine, that's another one. For peds, car seat usage: was the patient in a car seat? Were they strapped in properly? Was the car seat strapped in properly? All of those things are very important. And again, airbag deployment. We're looking at all of these, and what we're doing with this is setting up the programs our communities need. That's all coming out of your registry. So again, data quality is super important in so many ways. Then look at any national, state, and regional injury and violence prevention programs. Are there any initiatives you'd like to participate in and send data to? And then our accreditation, whether that's the ACS or the state; we're also an American Burn Association verified burn center. We need all of this data so that we can pull what we need when they're coming, and we know, having just done it, that we can pull all of that information out of our system and that it's correct and accurate. All right, and then again, your TQIP benchmarking. You don't want to send in bad data. Quality data matters, and it makes a big difference. Looking at the two types of validation that you can do, there's internal and there's external. Data quality checks and validation should occur at multiple points during data abstraction and reporting. For us, it's the trauma registry manager, but it could be your manager, a coordinator, a team lead, whatever you call it at your facility, who randomly selects a minimum of 10% of cases for a 100% re-abstraction review. Right now, we're running at about 18% annually that we actually 100% re-abstract. But again, the recommendation is 10%.
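The random selection just described — pull at least 10% of the month's closed cases for 100% re-abstraction — can be sketched in a few lines. This is a hedged illustration only; the record IDs and the `select_for_reabstraction` helper are hypothetical, not part of any registry vendor's software:

```python
import random

def select_for_reabstraction(record_ids, rate=0.10, seed=None):
    """Randomly select at least `rate` (default 10%) of closed records
    for a 100% re-abstraction review."""
    rng = random.Random(seed)
    n = max(1, round(len(record_ids) * rate))  # always pull at least one chart
    return sorted(rng.sample(record_ids, n))

# Example: 40 charts closed this month -> 4 pulled for full re-abstraction
monthly_closures = [f"TR-{i:04d}" for i in range(1, 41)]
sample = select_for_reabstraction(monthly_closures, rate=0.10, seed=7)
print(len(sample))  # 4
```

Fixing a seed makes a given month's pull reproducible for audit; omitting it gives a fresh random draw each run.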
And then I run a whole bunch of monthly validation reviews, and they are scrubbed. Time to antibiotics for open fractures: I look at that and scrub through it to make sure that we have accurate times. Any that are over 60 minutes are definitely going to be looked at. Many times I find that a patient who was a transfer in and had the antibiotic at the outside hospital gets missed. That then becomes an education initiative for my registrars: we need to talk about that and go over it. Time to craniotomy: I'm going to look at those, ensure that the patients recorded as having a craniotomy actually had a craniotomy, and verify the times. Emergent angioembolization: something new in our 2022 standards is making sure that the time from order to stick time is 60 minutes or less. So we are now pulling any and all of those emergent angioembolizations, and I'm reviewing them to ensure that we have accurate data. And any time things are found to not be accurate, and it's not just one person I can have an educational talk with but the whole team, we go back, add it to our registry meeting, and take the educational opportunity. Time to spinal decompression and/or stabilization: again, something new in our standards that we needed to add. These are reports that I really go over with a fine-tooth comb. Time to fixation for fractures; our splenic and liver injuries, operative versus non-operative versus embolization; our trauma attending response time to our alerts; time to VTE prophylaxis; our ETOH and tox screens and our SBIRTs being completed. There's a ton that we go over every month, and all of those are reviewed. Those are our audit reports. My validation report is actually a humongous spreadsheet, and it changes about every six months. What I do is pull various data.
One of those things is looking at what our injury details say: what E-code did they use, and then what was our internal mechanism code, and does everything match? It's not that one says they were in a motorcycle crash and then the E-code says that they fell. It's trying to make sure that everything makes sense. I have different data points all the way through, and some of them change: if for six months there hasn't been one error in a certain field, then I may remove it and add something that I found through our peer review or through scrubbing reports that I think we need to look at. Again, you should be performing monthly IRR and peer review. This is part of your data quality plan; all of these things are. I think sometimes people don't understand that all of these things apply to your data quality plan, so take credit for the things that you do that include validation of your data. That's part of your plan. Monthly IRR and peer review are super important, and the way I always stress it is that they are non-punitive; this is educational only. This is a way for all of us to get together and ensure that we're on the same page. Your TQIP Data Center report: all of your TQIP benchmark reports and any other reporting, you want to go through with a fine-tooth comb and make sure that everything makes sense. And then our registrars run the validator software before they close a record. In ours, it's a state validation and then the TQIP validation, and running those tells them if there are errors. If you have errors, you shouldn't be closing that record; you should be fixing them first. All right, external review. If you submit to NTDB and TQIP, you have a validation summary report.
Any time that you submit data, you can get this report back, and it will tell you about your record submission and whether you have any errors. Now, level one and two errors will kick your submission out; you can't even submit with level one or two errors, it's not going to accept it. We had a couple of level three errors there, and so what happens is I go back and double-check those level three errors. Most of the time, they have to do with either an excessively high ETOH value, or maybe somebody who came in with vitals that were really, really low, or zeros, and they didn't die, so they weren't a DOA. You're going to want to go through this report and review it, and if there is something that ended up being incorrect that you needed to fix, then you should resubmit right away as well. That's why I always say, don't wait until the last day to submit your data to TQIP and NTDB. Make sure that you're submitting it a week out, so if you have errors or anything else comes up, you have time to resubmit. The other external report that's super important is your submission frequency report. This is another report that you download after you submit. What I do is go through it and look for "not known/not recorded." I only grabbed a small piece to fit on here, but if you look at this, your ICD-10 external cause code: not known/not recorded is zero. If I had a number in there, that's a problem, because how do I not know what the external cause code is? That's something I would want to look at. What did we have that didn't qualify for being submitted? This is just your place of occurrence code, but the report goes through demographic information and injury information; it kind of goes by section, but it's really important.
I like to look at our comorbidities and just see what our numbers look like each time we submit. Is there something that looks off, like every patient we submitted has COPD? That's a mapping issue somewhere. So when you look at this, the other thing to check is whether your mapping is working. Is there a problem somewhere? That's the other reason to look this over and make sure everything looks right: you'll pick up mapping errors here and things you may need to fix, especially after your annual updates or any other updates that you do. All right, back to the internal pieces. The registrars run their TQIP and state validator prior to closing a record. Monthly, I have a large validation spreadsheet; when we close out a month, I run it. If I am super busy, I have two senior registrars who are able to do this for me, and I'll assign one of them: hey, can you take this month and validate it for me? We check the E-code versus the injury details versus the mechanism or cause code, which is our internal code. Do our admit service and admit attending match? If we have trauma as the admitting service and an orthopedic surgeon as the admitting physician, they don't match, and we need to fix that. We review our ICD-10 diagnosis and procedure coding and pre-existing conditions. Anybody who's 65 or older and has none listed, guess what? I'm going to review them. Anybody who's 80 or older and doesn't have functional dependence, I'm going to review, and I can tell you that functional dependence is missed a lot. Anticoagulant use, we look for ages 55 and above. Those are things that we like to run through our data and look at. These are things that I'll do for a while.
And if I haven't had an issue for, say, six months, then I'll lay off for a couple of months where I don't double-check it, but then I'll start again, sporadically, depending on what I'm finding. Your GCS and your GCS qualifier: this is one I find often. If you do not have a GCS recorded, you cannot have a qualifier. Look at your data dictionary; it'll tell you specifically. And if you have a GCS, you have to have a qualifier. They go hand in hand with each other. The TQIP data elements: I found that we needed to do some education on TBI monitors because choices were being selected that were not correct. So those are things to look at. And then the ISS score: does it seem too high? Does it seem too low? Every single patient with an ISS of 75 gets completely re-abstracted, to ensure that it was coded correctly. And the majority of anything 50 and above, myself or one of the senior registrars, we try to get all of those reviewed. We review all of our mortalities as well. You have to make this work for your facility, whatever those things are. If you're just starting to do this, start with a few things and see how those go, and if things are good, move on and keep going through your entire system. Again, internal: your monthly peer review and IRR. We have an abstraction tool; ours is based on our registry software's tabs and what is in each tab, and of course we add more if there's more. But again, that's non-punitive, it's educational. We have a round table, and we do a quarterly IRR where we all look at the same chart to abstract. Usually I ask, hey, what do you all want to focus on? A spine, a head, a crush injury, whatever kinds of injuries they want to look at.
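The cross-field rules just described (GCS and its qualifier travel together, ISS thresholds that trigger review, age-based comorbidity plausibility) can be written as simple checks. This is a sketch under assumptions: the record is treated as a plain dict, and every field name here is hypothetical, not any registry vendor's schema:

```python
def validation_flags(rec):
    """Flag a record for manual review using the cross-field rules
    described above. Field names are illustrative only."""
    flags = []
    # GCS and its qualifier must travel together
    if rec.get("gcs_total") is None and rec.get("gcs_qualifier"):
        flags.append("qualifier recorded without a GCS")
    if rec.get("gcs_total") is not None and not rec.get("gcs_qualifier"):
        flags.append("GCS recorded without a qualifier")
    # ISS review thresholds
    iss = rec.get("iss", 0)
    if iss == 75:
        flags.append("ISS 75: full re-abstraction")
    elif iss >= 50:
        flags.append("ISS >= 50: senior registrar review")
    # Age-based plausibility checks on pre-existing conditions
    age = rec.get("age", 0)
    if age >= 65 and not rec.get("comorbidities"):
        flags.append("age 65+ with no pre-existing conditions listed")
    if age >= 80 and "functional dependence" not in rec.get("comorbidities", []):
        flags.append("age 80+ without functional dependence: review")
    return flags

rec = {"gcs_total": 14, "iss": 75, "age": 82, "comorbidities": []}
print(validation_flags(rec))
```

Running this over a month's closed records and sorting by flag count gives exactly the kind of worklist the monthly validation spreadsheet produces by hand.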
But again, I say to ensure that these are non-punitive. This is a discussion. These are not scored in any way; they are truly just for peer review. Then, moving on, part of our data quality plan is also that our registrars are held to an accuracy rate. Ours is 95% or better. Overall accuracy, plus diagnosis coding, procedure coding, E-coding, and pre-existing conditions, each must be 95%. So the overall has to be, but so do each of the coding pieces, because those are super important. If rates fall below that and are continually below that, that would require an action plan for that specific registry professional. You can develop your own tool; that's important. This is an example of our scoring tool. It follows the tabs that we have and how we have them laid out, counting up each possible point and scoring from there. All of the registrars get a scorecard annually so that we know where their accuracy lies. We also have a visibility board. When I talk about this, it is one FTE for 400 to 600 charts per year, depending on responsibilities other than chart abstraction. If you have a registry team where you all share reporting and other responsibilities, you can't be expected to do 600 charts a year, because you have those other responsibilities. If someone is just doing abstraction and nothing more, then you could hold them to the 600. But I will tell you, just going through our visit, they were looking at 500 per registrar. That's where they were looking, and that's where you should be, because of other duties and other projects they may have. So I would ensure that you are within the 600 or less, not over. And again, this is our registrar visibility board. They get this usually twice a year.
Everyone gets their own, and it tells us, for each of them: how many records did they close per month? How many were transfers? We pretty much don't transfer out, but about 30% of our patients are transfers in. How many deaths? What was the average ISS? What was the average length of stay? Total hospital days. And then it figures out approximate hours per record closed, based on how many they closed. The IRR percent actually comes from their accuracy scorecard. I have a pre-made report in our database that pulls out the majority of this; I just have to add in the IRR from their score. All right. Percent of cases closed within 60 days of discharge: this should also be part of your data quality plan. This is something else that I pull to tell us where we were; I do it each month, as the 60-day mark from the last day of the month comes, and I pull our rate of what we closed. All right. Your internal minimum of 10% of all cases are 100% re-abstracted. They recommend five to 10%, but 10% is a pretty good amount. For employees in training and orientation, all cases are re-abstracted; for us, all cases are completely re-abstracted as we are training and educating. Patterns found in data validation are used for educational purposes with the team or individual registrars as needed. That should be part of your data quality plan, because, as we talked about in the beginning, how do we then correct the data quality issues that we found? Well, that's one of the ways right there: we are going to educate the registry team, and if it falls on someone else that we need to educate, we can do that as well. Our PI team validates all of our hospital events prior to submission, making sure that they have been abstracted adequately. Then we have our PIPS committee meetings and discussions, and then our next level, our medical audit committee discussion, which is peer review as well.
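The 60-day closure rate described above is simple arithmetic over discharge and close dates. This is a minimal sketch with hypothetical field names, not any registry's built-in report:

```python
from datetime import date

def pct_closed_within_60_days(records):
    """Percent of records closed within 60 days of discharge."""
    on_time = sum(
        1 for r in records
        if (r["closed"] - r["discharge"]).days <= 60
    )
    return round(100 * on_time / len(records), 1)

# Hypothetical month: two charts closed on time, one late
records = [
    {"discharge": date(2024, 1, 5), "closed": date(2024, 2, 20)},   # 46 days
    {"discharge": date(2024, 1, 10), "closed": date(2024, 4, 1)},   # 82 days
    {"discharge": date(2024, 2, 1), "closed": date(2024, 3, 15)},   # 43 days
]
print(pct_closed_within_60_days(records))  # 66.7
```

Tracking this number month over month is what makes the 60-day target continuous rather than a once-a-year check.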
All right, so again, why is this data so important? Incomplete, inaccurate, and/or inconsistent data can lead to missed opportunities to identify and improve our patient care, errors in research results and in the development of protocols, increased costs, and a misunderstanding of our trauma population. Accurate data is a reflection of the care your hospital gives. As we say, garbage in, garbage out: we could be giving phenomenal care to our patients, and if we don't have quality data, guess what? It may look like we're giving really poor care, or vice versa: it could look like we're giving phenomenal care and really we're not. So it's really, really important. And everybody makes mistakes; nobody's perfect. You've got phone calls, emails, messages, distractions right in the office, or, as I know many registry professionals are, working at home. We all know that you can have interruptions and lose your place. So it's really important, I always say, to ensure that you're focused and not having to read the same thing four or five times because you've been interrupted. But it happens. I tell them: before you close out a record, go through each page, make sure everything makes sense, it looks right, nothing is missing, et cetera. And really important: make sure that you're validating your data, because you don't want a call from your trauma program manager, your trauma medical director, or worse, administration, saying, what is going on? This just does not look right. And then to go in and find that it's a mess and you didn't know. Super important. Who should validate your data? Everybody. The data quality plan really is the responsibility of everybody in your trauma program. Here, it's not just my job, and it's not just my trauma director's job.
It's everybody's job to ensure that we are providing quality data, which equals quality care, because we can use that data to make things better. The registrars should be validating, the PI clinicians, everybody, and of course data validation is a never-ending process; maintaining data quality is a never-ending process. What am I looking for? You should obviously be looking for blank or missing data, especially in your diagnoses and procedures. Procedures shouldn't have a blank operative time or date; you should know when a patient was in the OR, and that should definitely be documented. Your length of stay or ISS: does it seem too high or too low? Was the length of stay over a year? I don't think we had a patient here that long. So look at it and make sure that someone didn't put in a wrong date. It happens, especially when we change years; it's so easy to still be entering the wrong year. Again, an ISS of 75: you want to make sure that you look at those and ensure that they were coded right. Inconsistent data, same thing: looking for things like the patient's admitting floor says ICU but the admit service says not admitted, or it says they went to the ICU but ICU days says not applicable, things like that. I find those things, and usually it's somebody going too fast; they're not validating what they're putting in. Typos: there are definitely typos, so look at your MRNs and your account numbers, and look through your procedures and your hospital events. One thing to focus on is when the new year brings new data. We know that for 2025 we're going to have some changes. Last year we really didn't have changes to the NTDS, but this year there are going to be some. So it's ensuring that new fields or changes in definitions are being followed appropriately.
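A few of the blank-field and consistency checks just listed (blank operative dates, wrong-year length-of-stay typos, ICU fields that contradict each other) sketched as code; every field name is hypothetical, and real registry software has its own validators:

```python
from datetime import date

def consistency_flags(rec):
    """Sanity checks for the inconsistencies described above:
    blank procedure dates, an implausible length of stay (often
    a wrong-year typo), and ICU fields that contradict each other."""
    flags = []
    if any(p.get("or_date") is None for p in rec.get("procedures", [])):
        flags.append("procedure with blank operative date")
    los = (rec["discharge_date"] - rec["arrival_date"]).days
    if los < 0 or los > 365:
        flags.append(f"implausible length of stay ({los} days): check for year typo")
    if rec.get("admit_unit") == "ICU" and not rec.get("icu_days"):
        flags.append("admitted to ICU but ICU days blank/not applicable")
    return flags

rec = {
    "arrival_date": date(2025, 1, 3),
    "discharge_date": date(2024, 1, 10),  # wrong-year typo gives negative LOS
    "admit_unit": "ICU",
    "icu_days": None,
    "procedures": [{"name": "ORIF femur", "or_date": None}],
}
print(consistency_flags(rec))
```

One deliberately broken record here trips all three rules, which is exactly the "somebody going too fast" pattern the manual review is hunting for.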
I can tell you, I was looking for data quality when our advanced directives definition changed so that you only apply it as a comorbidity or a pre-existing condition if it is used. That was a complete change from what it was, and it took a little bit of education and reminding to really hammer home when to apply it, and that it's now less likely you're going to apply it than it used to be. So always, anything new, you should definitely be validating. That's part of our data quality plan: when new data fields or definitions change, we ensure that we are following them. All right, finding the errors. You've got the TQIP Data Center validation summary. You've got your submission frequency report. If you submit to a vendor aggregator prior to your submission to build your file, you get the errors back first; that is another way, and that's what we use. So before I even have to submit anything, I get it there and get those errors back. Then your ad hoc reports from your own trauma registry database: you have monthly validation reports, and then you've got all your audit reports that maybe trigger an, ooh, I wonder if. Or you hear one of your registrars say to someone else, well, that's not how I do it, or, no, I do it this way. When I hear that, my ears turn red, usually, and I'm like, oh, wait, wait, wait, timeout, and that triggers me: okay, let me run a report, let me see what's going on. Especially if you have a larger registry team, where this person always asks this person and that person always asks that person, even though we do have a group, and that's how they should be asking, in the group, so that we're all on the same page and all doing it the same way. It's so important.
So when you hear those words, that triggers you to say, ooh, I'd better run some reports, look at this data, and make sure it's correct, because that's usually a sign it may not be. And again, daily, your registrars are going to validate their charts before they close them. You've got your IRR, your monthly reports; develop a checklist of reports and set reminders for yourself. Your chart reviews monitor the quality of your individual registrars' data. In those chart reviews: deaths, ISS greater than 24 with no reported hospital events, ISS greater than 50, lengths of stay greater than 15 days with no reported hospital events, and age greater than 64 or 65 with no comorbidities. Those are just examples of things you can do. Then, bringing up a couple of sample reports: on my monthly validation review right now, I have my registrar, and then I've got our MRNs, our account numbers, arrival and discharge dates, transfer-in and transfer-out information, and the type of alert. That's another one I look at: if trauma is the admitting service and there's no alert, that's not quality data, because it was at least a consult, right? If trauma admitted them, it was at least a consult if it wasn't an alert. Again, your E-code, your injury details, your cause code matches. Your ICD-10 and AIS codes are matching. Your procedure codes, ages, comorbidities, your DC disposition and destination, your hospital events, any of your own PI filters. I'll look at ED vitals, your GCS and qualifier, and then your ETOH screen and value, and the TQIP data. Depending on your center, you'll have to decide what works; this is just a sample, and mine obviously change. Then I have a list here, basically a sample of non-discretionary indicators. These are mandated by the ACS or our state that we need to look at, so your over/under triage.
We do Cribari and NFTI, your trauma attending response times, ED documentation, and your SBIRT. Then this one has your open fracture time to antibiotics, and our neurosurgery performance improvement information, which is your craniotomy. We also have ortho PIPS, and those involve a lot of information. And then looking at your IR response times. We also have our discretionary reports, which are more our own: what are we looking at? For us, ED thoracotomies and REBOA, we're going to look at those. We want to look at our ICU lengths of stay, unplanned intubations, unplanned extubations, things like that. So again, make a list of what reports you run and when, including validation reports, so that you know, "I have to run this each month," and you can check it off when you run it. Schedule monthly registry meetings. If you don't already have one, it's important to have a registry meeting at least once a month. Some people, I know, have weekly huddles; however that works. But set it up for the same day and time each month, because it should be continuous; you don't want to miss it. And then we assign peer charts to the registrars each month with a day they're due back, and early on the morning they're due I send a reminder that they're due by the end of that day. Complete a review of 5 to 10% of all charts, and then use other reports to validate the remaining charts, right? So you're validating your data. Communicate. Communicate to your team, positive and negative, right?
It shouldn't always be, "Hey, we're doing this wrong." Give some positive feedback too: "Hey, overall we're doing really, really well." I let my registrars know when they truly are doing a phenomenal job. It's really important to let them know, especially because sometimes they get left behind the scenes and nobody really notices all the work they're doing, and it's not easy. And then establish your goals, right? Create and provide a registry report. Create a visibility board. Track and trend accuracy. If someone's struggling, why are they struggling? Help them. All right? Develop a data dictionary for your specific facility, or a data clarification document for your EMR, okay? Include the hierarchies. Review it at least annually. We do a trauma registry retreat, two days in January and a mini one in July, and we spend that time literally just looking at our data dictionary, discussing any and all questions that everybody has, and making sure we're all on the same page. If you have a cheat sheet in your registry, make sure you're updating it at least twice annually, when the ICD-10 updates come out in April and October. Super important, because things change, as we all know, pretty much daily, and you want to make sure you're using the correct codes. Also look at your TQIP code set; actually, we just got our fall one today, so look at that TQIP code set and see if anything has changed. All right? And then make sure your staff can run basic reports so they can do some validation of their own, right? That's important too, for them to be able to look at maybe their own little spreadsheet and know. And if possible, import data from the EMR into the registry if you have that ability. It does help. It still needs to be checked, but it does help. All right, just a little cartoon.
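The "review 5 to 10% of all charts" practice mentioned above amounts to drawing a random peer-review sample each month. A minimal sketch, with made-up chart identifiers, might look like this:

```python
import random

def peer_review_sample(chart_ids, pct=0.10, seed=None):
    """Randomly pick pct of charts for monthly peer re-abstraction."""
    rng = random.Random(seed)  # a fixed seed makes the pull reproducible
    n = max(1, round(len(chart_ids) * pct))  # always review at least one chart
    return sorted(rng.sample(chart_ids, n))

# e.g. 300 charts entered this month; pull a 10% peer-review sample
charts = [f"MRN{i:04d}" for i in range(1, 301)]
sample = peer_review_sample(charts, pct=0.10, seed=2025)
print(len(sample))  # 30 charts
```

Seeding the random generator is a deliberate choice here: it lets you regenerate the exact same sample list later, for example when documenting for a site visit which charts were re-abstracted.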
So if nothing else, your peer review and validation should result in a Hawthorne effect, right? When you're validating these charts and my team knows I'm going to validate when they're done, they'll double-check themselves, knowing that their work is being checked. If they don't think anybody's looking at it, it sometimes seems like they hurry through, whereas when they know it's going to be checked, they're a little bit better. But the data quality plan overall is really all of these things in a nutshell. In that data quality plan, you just need to put in: hey, these are the things we are going to use to validate that our data is quality data. It's correct, okay? We have data integrity. We have data quality. When I run a report or when I submit my data, I am 99.9% sure that what I am submitting is accurate, or as accurate as it could possibly be based on documentation, all right? All right, we have a couple of questions, a little bit of a quiz. Oh, she's got it up, good. So, according to the American College of Surgeons, what percentage of charts should be validated annually: 5%, 10%, 15%, or 20%? I can't see the poll or when to move on. Amber, you can't see it either, can you? Yeah. OK, she accidentally did all the questions at once, so we'll just have to go on this way. All right. The answer is, let me see, there it is. Actually, you could have done A or B, because it's 5% to 10%. Oh, there it popped up. Great, 10%. I always say it should be 10%; I think 5% is a little bit low, unless, of course, you just don't have the ability to do it. But it actually is 5% to 10%, so if you did A or B, you're good. All right, next question.
Internal validation includes all except: IRR and peer review, the TQIP submission frequency report, monthly validation reports, or none of the above are internal validation. Let's see what we got. The answer is B; the TQIP submission frequency report is an external validation tool. Good job. All right, three. Data integrity is defined as: consistent, accurate, quality data maintained over time; accurate data maintained over time; consistent, complete, accurate, quality data maintained over time; or none of the above. All right. The answer is C. Great job, guys, 83% of you. Data integrity is defined as consistent, complete, accurate, quality data maintained over time. All right. Who's responsible for data quality? The TPM, the TMD, everyone, or the trauma registry manager? And 100% got it: everyone. Good job. All right, moving on here. I don't know what happened. There we go. Number five. What should you look for when doing validation of data for quality? Blank data, high ISS, inconsistent data, all of the above, or none of the above. And it looks like 100% of you got the answer, D, all of the above. All right. And now, any questions? Thanks, Robin, for all of that. A few people have put some questions into the chat box, and we've answered a few of those. But, Adina, I have a couple of questions. Yes. In this last little bit of time we have: what about trauma centers that are low-volume trauma centers, where they don't see a whole lot of patients? Their registrar wears many, many hats, and they say, "Validation? We don't have time to do that." Or you've got trauma centers that are understaffed. How would you suggest addressing those issues?
So the first thing I would say is, if you are a small center, maybe a level three or a level four, we know that in those centers the trauma registrar sometimes wears many hats. It's totally different from when you get into the ones and twos. The first thing I would ask is: do you have a sister hospital with another program? One of the really good things to do, if you are in a system, is to set up peer review across the system, and that's where I would get help with peer review. Now, you may have some differences in how you abstract things at a sister hospital, because a level three or four usually has some nuances that a level one or two doesn't; things might be a little bit different. But overall, the coding of your procedures, your diagnoses, your external cause codes, your comorbidities, all of those things would be the same no matter what. That would be one way to do it. The other way, if you are a small center with low volume, is: do you have a PI nurse, or do you have your program manager, where you can kind of validate each other? That's what I would recommend, working together to do it. If there's just one registrar, that's the hard part; either you have somebody else in your system, or you'd have to do it with one of your other teams. You've described the purpose and a lot of the ways we can validate, but say, in a one-month period, how much time do you think should be spent monthly on all components of validation, whether it's identifying which patients you're going to do data validation for, training for new audit filters, or even reviewing the results?
How much time in a month, just as a guesstimate? Let's say a hospital has 300 patients a month. How much time? Oh, 300. Okay. All right, 300 patients a month. If I really put together all the time that I spend, I'd probably spend two weeks out of a whole month on it. Now, I have probably about 400 patients. Repeat that one more time; I just didn't hear you. Sorry, I probably have about 400 patients a month, and I'm going to say it's probably 80 hours that is spent on the validation, the running of the reports, scrubbing the reports, and then education. Easily two full weeks, easily. And then, grabbing a question from the chat box, I'll read it: do you, as the program manager, document the registrars' education for data improvement? And it goes into things like the time to antibiotics for open fractures and the training that goes with that. And the last question I have for you is: who do you report your information up to? How do you use the results for staffing, and do you report this up any further? Sure, so, documentation of registrar education. Yes. One of my senior registrars does a monthly educational piece we call the anatomy corner as part of our monthly meeting. So in our monthly meeting, when we do IRR and those things, that is reported in our minutes, along with our peer review, and our anatomy corner is the education. All of that is reported there, and when I have to do individual or group education, that is also documented, either way. Everything is documented, along with the time we spend doing it.
So that, obviously, when the ACS comes, if they ask me for any of that information, it's all there in one file folder and documented. And I report anything to my trauma program director that I think is a true issue; beyond that, we talk all the time, so I keep her informed: "Oh, crud, this past month, I don't know what happened, but it was really messy; they had a lot of data to send back," versus a month where, oh my gosh, the data was great; it was a great month. Well, thank you, Robin. We appreciate your expertise and your time to help educate us on this very valuable process, for sure. It supports our entire programs across the country. I do want to quickly say two things. As soon as the meeting's over, you should be able to claim your CME for the webinar. And secondly, this is part two, and some of you are thinking, well, where's part one? Due to some things going on that were beyond my control, we had to postpone my part, and I'll be doing part one. We're actually going to do that on October the 31st, so reach out to Deb or go to the TCAA website, or you may get an email, and if you're interested, we would love to have you. It's going to be from the perspective of the program manager: what does this mean to us, and how can we use this along the way? So I appreciate you again, Robin. Christy, thank you for setting this up, and I hope everyone has a great rest of your week. Thank you.
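As a companion to the IRR and peer-review practices discussed in the webinar, one simple way to track and trend registrar accuracy is field-level percent agreement between the original abstraction and the peer re-abstraction. This is plain agreement, not a chance-corrected statistic like Cohen's kappa, and the field names and values below are hypothetical.

```python
def percent_agreement(original: dict, reabstract: dict, fields) -> float:
    """Share of audited fields where the two abstractions agree, as a percent."""
    matches = sum(1 for f in fields if original.get(f) == reabstract.get(f))
    return 100.0 * matches / len(fields)

# Hypothetical audited fields and two abstractions of the same chart
audited = ["iss", "ed_gcs", "etoh_screen", "discharge_disposition"]
first_pass = {"iss": 17, "ed_gcs": 14, "etoh_screen": "yes",
              "discharge_disposition": "home"}
peer_review = {"iss": 17, "ed_gcs": 15, "etoh_screen": "yes",
               "discharge_disposition": "home"}
print(percent_agreement(first_pass, peer_review, audited))  # 75.0
```

Tracked per registrar and per month, a number like this makes it easy to spot who is struggling and where education should focus.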
Video Summary
The webinar outlined the importance of data integrity and quality in trauma registry management, presented by Robin Schrader, Operations Manager at VCU Medical Center. Key aspects discussed included the requirement for a written data quality plan as mandated in the "Resources for the Optimal Care of Injured Patients 2022," which stipulates trauma centers must conduct quarterly data reviews to ensure accuracy. Schrader explained that data integrity entails consistent, complete, and accurate information, emphasizing the use of the ALCOA principles—Attributable, Legible, Contemporaneous, Original, and Accurate—in data handling.

The session highlighted why quality data is crucial for performance improvement, research validity, resource allocation, injury prevention programs, and maintaining hospital accreditations. Furthermore, it explored strategies for internal and external data validation, including random re-abstraction of records, monthly reporting, and reviews of specific data points like comorbidities, coding accuracy, and discharge dispositions. The need for transparent communication and consistent validation efforts among the trauma team to maintain and improve data quality was underscored. Practical insights were also offered on managing validation processes in low-volume or understaffed trauma centers, ensuring all team members participate in maintaining data accuracy. The webinar concluded with references to establishing robust reporting and review systems to uphold data standards.
Keywords
data integrity
trauma registry
data quality plan
ALCOA principles
performance improvement
data validation
trauma centers
Robin Schrader
VCU Medical Center