How Accurate Doppler Ultrasound Can Reduce Health Care Costs and Increase Patient Safety
James C. Conti, Ph.D. and Elaine R. Strope, Ph.D.
A crisis is clearly developing in the American health care system. On the one hand, those of us involved in health care delivery want the highest-quality system the world can offer. On the other hand, as administrators we must address the ever-spiraling costs that threaten to bankrupt not only hospitals and clinics but also the country. There is good news, however, for hospital, radiology, and imaging center administrators: there are ways to reduce health care costs while maintaining quality of service and improving patient safety. Can this be true? The answer is a resounding yes. We present one area in which the adoption of sound quality assurance (QA) practices in imaging facilities accomplishes all of these goals. Our cost-saving discussion will focus on how adopting procedures to verify the accuracy of Doppler ultrasound diagnoses can eliminate unnecessary follow-up procedures and dramatically decrease the cost of a hospital stay.
To fully appreciate the current technology, one needs to step back and review the evolutionary process that has produced the present generation of clinical Doppler ultrasound machines. The initial description of the Doppler effect began this process back in 1842.1 Coupled with the introduction into the scientific world of the concept of acoustic wave generation,2 the transmission of ultrasound signals and analysis of the return signals through the Doppler equations allowed scientists to measure the motion of objects at a distance from the transducer. Ultrasound imaging was first used on humans in 1954;3 Doppler ultrasound was first incorporated into medical research in 1956.4 Non-invasive ultrasound imaging continued to progress through the 1970s, when the first commercial instruments became available. At the same time, improvements in technology allowed scientists to incorporate Doppler measurements into non-invasive diagnostic instruments, and the first commercially available Doppler ultrasound machines were introduced in 1976.5 It wasn't until the 1980s that ultrasound imaging, coupled with the use of appropriate calibration systems, finally produced quantitatively accurate images. We have now entered a stage in this evolution where, with proper calibration systems, Doppler ultrasound can generate highly accurate blood flow data. When evaluating the various diagnostic techniques available today, it is apparent that ultrasound is truly the technique of the future: it is safe, effective, and the lowest in cost of all the imaging modalities.
So, if this is such a wonderful technology, why does its use so often require follow-up techniques that introduce unnecessary costs into the medical system? The problem is one of quantitation. If one reviews the results from a number of clinical ultrasound machines, it is very common to find disparity in the quantitative numbers. In addition, different modalities can show different results; for example, catheterization and Doppler ultrasound can yield discordant findings. For those clinicians who understand the pitfalls of inaccurate Doppler measurements, standard procedure is to use other techniques for verification when critically important diagnostic decisions must be made. For example, a patient may be sent to a peripheral vascular laboratory for an initial Doppler ultrasound analysis of a suspect carotid artery. If the peak blood velocities from the vessel in question indicate a pathological condition or do not agree with clinical signs, it is common for the attending physician to request a follow-up angiogram to verify the Doppler ultrasound numbers. In our discussions with clinicians, it is clear that these follow-up procedures are done primarily because the Doppler results are considered preliminary. A cost savings could therefore be realized if a QA program were instituted to verify the accuracy of the Doppler results; in many instances follow-up angiography would not be indicated. It appears that as much as 85% of all follow-up diagnostic procedures used to verify peripheral vascular flow could be eliminated if the initial Doppler ultrasound analysis could be verified as accurate. To get an idea of how much costs can be reduced by this particular quality assurance step, consider the example of an elderly patient with a blocked internal carotid artery.
A local hospital would need to spend approximately $10,000 on the initial ultrasound, follow-up testing, operative procedures, and final imaging necessary to eliminate an occlusion that could result in a stroke. Of this $10,000, nearly $2,000 could be the cost of the follow-up angiogram. Costs of procedures around the country vary a great deal, but it is not unreasonable to assume that nearly 20% of the total cost of this carotid endarterectomy could be eliminated if the Doppler ultrasound analyses could be considered accurate and reliable. In addition, follow-up imaging, critically important to the long-term efficacy of the carotid endarterectomy, could also be done solely by Doppler ultrasound. At present, an unverified or questionable Doppler ultrasound analysis simply must be checked with other technology to ensure that restenosis does not threaten the long-term health of the patient. Keep in mind that this example is purely financial in nature and does not address the potential dangers that angiography poses to the patient. These dangers, of course, are not present in ultrasonic techniques.
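The arithmetic above generalizes to a quick back-of-the-envelope estimate. In the sketch below, the procedure costs are the illustrative figures from this example; the annual case volume is a purely hypothetical assumption, and the 85% eliminable fraction is taken from the peripheral vascular discussion earlier in this article:

```python
# Illustrative figures from the carotid endarterectomy example above.
# The annual case volume is hypothetical, for illustration only.
total_procedure_cost = 10_000      # full workup and procedure ($)
followup_angiogram_cost = 2_000    # follow-up angiogram ($)
eliminable_fraction = 0.85         # follow-ups avoidable with verified Doppler
cases_per_year = 100               # hypothetical annual case volume

savings = cases_per_year * followup_angiogram_cost * eliminable_fraction
share = followup_angiogram_cost / total_procedure_cost
print(f"${savings:,.0f} saved/year; angiogram is {share:.0%} of each case")
```

A facility can substitute its own case volume and local procedure costs to estimate its potential savings.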
Another area on which the radiology administrator can focus QA attention is the cardiac laboratory. It is unlikely, however, that the dramatic reduction in follow-up angiograms achievable in the peripheral vascular laboratory will ever be matched in the cardiac laboratory, because many follow-up angiograms there address issues other than Doppler verification. Nonetheless, there are situations in the cardiac laboratory where verifiably accurate Doppler could save the hospital money. For example, recent findings have shown that replacing an implanted prosthetic heart valve at the proper time results in much lower operative mortality. Ultrasound can currently be used to monitor prosthetic heart valves, but at some point in the aging of an implanted device the cardiologist will need accurate data to recommend the proper reimplantation time. At present, that determination must be made with angiography unless the cardiologist can be assured of the accuracy of the ultrasound information. Accurate Doppler could delay the costly follow-up angiogram until it is really necessary.
It must be realized that an appropriate QA process cannot eliminate the potential for incompetence by the user, nor eliminate anatomical situations that make it difficult to obtain an accurate Doppler analysis. However, it is obvious that an inaccurate Doppler ultrasound machine can never generate accurate Doppler data, no matter how competent the user nor perfect the anatomy of the patient. An accurately calibrated machine is a necessary, but not sufficient, prerequisite for dependable data.
So how does a hospital administrator institute a QA program to reduce health care costs and improve patient safety while contributing to the continuous quality improvement (CQI) process? The answer is really a matter of timing and depends upon the institution and personnel considerations. The following discussion of calibration/verification will be broken up into long term, intermediate term, and short term procedures.
Long Term Calibration Procedures
Long term calibration means that an instrument or instrument/probe combination is checked on a monthly or perhaps quarterly basis to determine whether the machine is producing reliable results. Because of the infrequency of this quality control step, a much more comprehensive set of procedures needs to be instituted, since the nature of the error in any measuring instrument depends upon exactly where in the instrument the problem arises. A baseline error means that, no matter what number the instrument is recording, it is off by a certain constant amount: instead of 1 meter per second the instrument reads 1.25 meters per second, and instead of 2 meters per second it reads 2.25 meters per second. Data affected by this type of constant baseline error are easy to correct; in the above example, one subtracts 0.25 meters per second from every number measured. Other errors are not so simply addressed. For example, an instrument can be off by a constant percentage: instead of 1 meter per second it might show 1.25, and when it should read 2 meters per second it shows 2.5. Although the nature of this error is different, it is still fairly easy to deal with: every number is 25% too high. A third form of error is a sensitivity problem with the ultrasound machine; insensitive probes can effectively miss the highest velocities. So, in addition to any constant baseline error or constant percentage error, a technician needs to address accuracy versus sensitivity. Again, this can be done, but it complicates the actual calibration/verification step. A fourth error is associated with angle correction; a properly designed calibration system allows a user to adjust for any angle-correction inaccuracies.
As you can see, a quarterly verification procedure that initially appears simple and infrequent in fact requires a rather daunting amount of information. All of these data need to be available and considered when generating a number the clinician can use to make a medical decision about a patient.
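The constant baseline and constant percentage errors described above are both special cases of a simple linear error model, indicated = a + b × true. A minimal sketch (hypothetical, not tied to any particular instrument) shows how two calibration points suffice to recover both error terms and correct subsequent readings:

```python
def fit_linear_error(cal_points):
    """Fit indicated = a + b * true from two calibration points.

    cal_points: list of (true_velocity, indicated_velocity) pairs, in m/s.
    Returns (a, b): a is the constant baseline error, b the scale factor.
    """
    (t1, i1), (t2, i2) = cal_points
    b = (i2 - i1) / (t2 - t1)   # scale (constant percentage) error
    a = i1 - b * t1             # constant baseline error
    return a, b

def correct(indicated, a, b):
    """Invert the error model to recover the true velocity."""
    return (indicated - a) / b

# Pure baseline error from the text: every reading 0.25 m/s high.
a, b = fit_linear_error([(1.0, 1.25), (2.0, 2.25)])
assert (a, b) == (0.25, 1.0)
assert correct(2.25, a, b) == 2.0

# Pure percentage error from the text: every reading 25% high.
a, b = fit_linear_error([(1.0, 1.25), (2.0, 2.5)])
assert (a, b) == (0.0, 1.25)
assert correct(2.5, a, b) == 2.0
```

Note that this linear model covers only the first two error types; sensitivity and angle-correction errors still require the fuller quarterly procedure described above.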
Intermediate Calibration Procedures
In this discussion, an intermediate period would be approximately one week. The obvious improvement with this calibration frequency is that errors would be limited to one week’s worth of diagnostic information. When coupled with a long term calibration program, such as once a month, certain parameters of the ultrasound machine do not need to be checked each week. Angle correction and sensitivity are factors that are less likely to go awry over a short period of time. As a result, weekly calibration procedures could focus on the accuracy of a particular ultrasound instrument/probe combination at preassigned known peak velocities. We find 1, 2, and 4 meters per second are velocities that produce a fairly reliable level of confidence for peripheral and cardiac instruments. The weekly calibration program still needs to address the nature of the errors and will require correction of the measurements from historical calibration data. For example, a diagnostic procedure carried out on Thursday will need to be referenced back to the calibration data generated at the beginning of the week.
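Referencing a mid-week measurement back to the calibration data generated at the beginning of the week can be as simple as interpolating between the stored calibration points. The sketch below is purely illustrative; the calibration table values are hypothetical:

```python
# Hypothetical Monday calibration table for one instrument/probe
# combination: true velocity -> velocity the machine indicated (m/s),
# taken at the preassigned check points of 1, 2, and 4 m/s.
weekly_cal = {1.0: 1.05, 2.0: 2.12, 4.0: 4.30}

def correct_reading(indicated):
    """Map an indicated velocity back to a true velocity by linear
    interpolation between the bracketing calibration points."""
    pts = sorted((ind, true) for true, ind in weekly_cal.items())
    inds = [p[0] for p in pts]
    trues = [p[1] for p in pts]
    if indicated <= inds[0]:
        lo, hi = 0, 1
    elif indicated >= inds[-1]:
        lo, hi = len(pts) - 2, len(pts) - 1
    else:
        hi = next(k for k, v in enumerate(inds) if v >= indicated)
        lo = hi - 1
    frac = (indicated - inds[lo]) / (inds[hi] - inds[lo])
    return trues[lo] + frac * (trues[hi] - trues[lo])

# A Thursday reading of 2.12 m/s maps back to a true 2.0 m/s.
print(round(correct_reading(2.12), 3))
```

Readings outside the calibrated range are extrapolated from the nearest two points, which is one more reason to choose check velocities that bracket the clinically relevant range.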
Short Term Calibration Procedures
Clearly, as the frequency of calibration or verification checks increases, so does the degree of assurance. There are two short term calibration procedures an institution can utilize; each will be addressed separately. In a recent comprehensive ultrasound text by Evans et al., daily certification of medical ultrasound machines is recommended.6 A technician could be assigned the daily task of quickly checking a pre-assigned series of peak velocities on each of the instruments in a facility. This provides a great deal of assurance that the numbers generated during that day are reliable. Although the assurance with daily verification checks is very high, hospital personnel must still go back and correct the patient data with the calibration data. Of course, if all the instruments checked are accurate to within a pre-assigned tolerance, for instance 90%-95% accuracy, data correction would not be necessary.
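A daily spot check like the one described above reduces to a simple pass/fail script. The sketch below is a minimal illustration; the machine names, readings, and 5% tolerance policy are all hypothetical assumptions, not figures from this article:

```python
# Hypothetical daily QA log: machine -> indicated readings (m/s) at the
# preassigned true velocities of 1, 2, and 4 m/s.
TRUE_VELOCITIES = [1.0, 2.0, 4.0]
TOLERANCE = 0.05  # pass if within 5% of true velocity (assumed policy)

def daily_check(readings):
    """Return True if every reading is within TOLERANCE of its target."""
    return all(
        abs(ind - true) / true <= TOLERANCE
        for true, ind in zip(TRUE_VELOCITIES, readings)
    )

logs = {
    "room_1_scanner": [1.02, 2.03, 4.10],   # all within 5%: passes
    "room_2_scanner": [1.10, 2.25, 4.60],   # all >5% high: flagged
}
flagged = [name for name, r in logs.items() if not daily_check(r)]
print(flagged)
```

Machines on the flagged list would have that day's patient data corrected against the calibration readings, or be pulled for service.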
The most frequent verification would be a process that occurred with each patient, another recommendation of Evans et al.6 Surprisingly, in the clinic it can be one of the fastest and simplest ways to verify the accuracy of a particular investigation. For example, during an ultrasound analysis of a partially blocked carotid artery, the ultrasonographer or clinician would determine the maximum peak velocity of blood moving through the stenotic area; from this, the effective cross-sectional area of the artery, and therefore the percent occlusivity of the vessel, could be calculated. At the end of the analysis, the settings on the ultrasound instrument are kept exactly as they were when the peak velocities were found. The user then places the probe on an appropriate calibration system, sets the calibration signal height to match that just obtained from the patient, reads the true peak velocity from the calibration system, and enters that number on the patient's permanent record. These data could then be used by the clinician, at that time or in the future, to accurately assess the blood flow characteristics of that particular patient. A properly designed calibration step with an appropriately designed calibration device would require less than one minute of the ultrasonographer's time. Sources and types of error would not need to be addressed, nor would concerns about sensitivity. An ideal calibration device should allow the depth and acoustic properties of blood and tissue, as well as the angle of the vessel, to be mimicked at the time of the verification step. This is the most accurate, and at the same time simplest, procedure that can be used to verify clinical Doppler readings.
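The percent occlusivity mentioned above is commonly estimated from the continuity principle: volumetric flow (area × velocity) is conserved along the vessel, so blood must speed up where the lumen narrows. The sketch below uses that simplified, steady-flow model with hypothetical velocities; actual stenosis grading in the clinic relies on validated velocity criteria:

```python
def percent_area_reduction(v_proximal, v_stenosis):
    """Estimate percent reduction in cross-sectional area from the
    continuity principle A1*v1 = A2*v2 (simplified steady-flow model)."""
    return 100.0 * (1.0 - v_proximal / v_stenosis)

# Hypothetical example: blood accelerates from 0.5 m/s in the normal
# segment to 2.0 m/s through the stenosis.
print(percent_area_reduction(0.5, 2.0))  # 75.0
```

The sensitivity of this estimate to the measured peak velocity is exactly why the per-patient verification step described above is worthwhile: a 25% velocity error propagates directly into the occlusivity figure.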
Calibration/Verification Instruments (Phantoms)
A great deal of discussion, time, and effort has been spent on the characteristics of an ideal Doppler calibration/verification system. Although there is disagreement in the field as to how complicated such an instrument should be, at a minimum it should contain flowing liquid that mimics the acoustic properties of blood, generate a highly accurate and reliable flow, and be easy to use in the clinical environment. This last point is a very important consideration, since many earlier designs on the market are appropriate for research laboratories but inappropriate for the clinical environment. A hospital administrator needs to ask the following questions:
1. Is it easy to use?
2. Is it accurate?
3. Has it been independently calibrated?
The above issues are obvious to anyone who needs to institute a QA program. An instrument that is too complicated to be used by either your QA technicians or your ultrasonographers will either go unused or be used incorrectly, and in neither case will it support a quality assurance program. The question of the accuracy of the calibration system itself is not as simple as it might first seem. One reason is that certain calibration systems are built in such a manner that they can go out of calibration without any indication to the user that this has occurred. As an example, we have seen string phantoms whose indicated speed was derived from the rotational velocity of a primary pulley. One such phantom had a stretched string running over the pulley, and the string was slipping, which meant that the indicated velocity was not the actual velocity of the string.
The inherent advantage of an independently calibrated system is familiar to all scientists involved with calibration procedures. It does not make sense to use a calibration system to verify or calibrate an ultrasound machine if that calibration system was itself calibrated with an ultrasound machine in the first place; how could one determine that the ultrasound machine used to check the calibrator was accurate? To avoid this kind of circular argument, scientists use an independent, better-defined technology to check the accuracy of calibration systems. This is done by use of primary standards or National Bureau of Standards (NBS) traceable instruments. An example of a primary standard is a mercury manometer. Although extraordinarily simple in design (a characteristic common to many superior scientific designs), a mercury manometer gives a very accurate pressure determination; because its function depends upon gravity, it is considered a primary standard. Other examples of primary standards are the atomic clock and the speed of light through a vacuum, although these last two are difficult to incorporate into an ultrasound laboratory. An NBS-traceable calibration is somewhat different: it requires a calibration instrument that has itself been calibrated against a source at the NBS (now the National Institute of Standards and Technology, NIST), and such standards need to be rechecked periodically. A calibration system used by a hospital to verify the accuracy of one of its diagnostic instruments has to be accurate, and this accuracy must be verifiable in an independent manner.
Administrators have the ability to eliminate a number of costly and unnecessary diagnostic procedures simply by instituting a new QA program. Not only will this program save the hospital money and increase patient safety, but it will also fulfill the need to contribute to the CQI process. A procedure instituted in your facility that verifies the accuracy of Doppler measurements will save money by reducing the need for additional diagnostic procedures that are costly and dangerous. It may also allow you to retain older instruments that have lost some accuracy over time.
A final area the radiology administrator needs to address in the QA process is ultrasonographer training. We mentioned earlier that even reliable QA programs can only assure the accuracy of the hardware and software part of the equation; they will not eliminate anatomical problems, nor will they eliminate user incompetence. However, a properly designed calibration system offers the chance to monitor and/or train the users of your ultrasound equipment. Ultrasound machines can have the equivalent computing power of twenty-one personal computers, and it is very difficult to train on such a complex instrument when the signal source is a squirming, moving, breathing, constantly varying human patient. A stable, reproducible signal allows a sonographer to train for hours, knowing that changes in the observed signals are the result of changes made to the various settings of the instrument. This substantially increases the effectiveness of training.
Cost reduction, increased patient safety, improved ultrasonographer training, successful CQI programs, and improved public relations are all benefits of a proper Doppler ultrasound QA program. As a well-informed administrator, you have a unique opportunity to save your facility money without sacrificing healthcare delivery. We are in an environment in which this is desperately needed. We urge you to pursue this course of action, for you and your facility are the potential winners.
1) Doppler, C.J., Über das farbige Licht der Doppelsterne. Abhandlungen der Königlichen Böhmischen Gesellschaft der Wissenschaften, 11:465, 1842.
2) Weld, P.E., Early History of Echocardiography. Journal of Cardiovascular Ultrasonography, 5:169, 1986.
3) Edler, I., Hertz, C.H., The Use of Ultrasonic Reflectoscope for Continuous Recording of Movements of Heart Walls. Kungl. Fysiogr. Sällsk. Lund Förhandl., 25:5, 1954.
4) Satomura, S., A Study on Examining the Heart With Ultrasonics. Japanese Circulation Journal, 20:277, 1956.
5) Advanced Technology Laboratories, 1976.
6) Evans, D.H., McDicken, W.N., Skidmore, R., Woodcock, J.P., Doppler Ultrasound, John Wiley and Sons, New York, 1989.