Why Should Radiologists Use Medical Diagnostic Displays to Read X-Rays?

We get this question all the time: why should I pay thousands of dollars for a medical-grade monitor to interpret medical images when I can buy a very nice-looking commercial off-the-shelf (COTS) monitor at the local computer store? We’ve boiled the answer down to six important reasons that are hopefully simple to understand and to explain to fellow radiologists or administrators who have little technical background.

A commercial-grade monitor doesn't show all of the critical anatomical information radiologists need to diagnose medical conditions with the greatest accuracy and confidence.   

As the name implies, COTS monitors are intended for office automation: displaying documents so they appear like a printed page. Performance attributes are therefore weighted heavily toward being as bright as possible, so that text is easily resolved with minimal eyestrain.

Commercial displays attain maximum luminance long before the graphics card input reaches its maximum value. Remember that a typical graphics card outputs 256 different input values, each representing a distinct piece of valuable diagnostic information. COTS monitors have been observed to max out on brightness at an input value as low as 200, which means that values 201 through 255 all map to the same luminance. As a result, roughly 20 percent of the data is clipped, simply eliminated.
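To make that concrete, here’s a minimal sketch of the arithmetic in Python, using the saturation point from the example above (the numbers are illustrative, not measurements of any particular monitor):

    # Saturation clipping: every input value at or above the saturation
    # point lands on the same luminance, so the values above it collapse.
    SATURATION_INPUT = 200   # example figure from the text
    TOTAL_LEVELS = 256       # 8-bit graphics pipeline

    clipped_levels = TOTAL_LEVELS - 1 - SATURATION_INPUT  # values 201..255
    fraction_lost = clipped_levels / TOTAL_LEVELS

    print(f"{clipped_levels} of {TOTAL_LEVELS} grayscale values collapse "
          f"({fraction_lost:.0%} of the range)")
    # -> 55 of 256 grayscale values collapse (21% of the range)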

By contrast, medical-grade monitors are calibrated to map each distinct pixel value to a luminance you can detect, rather than following the display’s native response. Unfortunately, it’s normal for the native COTS monitor response, uncorrected to the Digital Imaging and Communications in Medicine (DICOM) standard, to yield the same measured luminance for multiple sequential input values. Flat spots in the response curve are especially obvious in the low range of input values, such as across the first 160 of 256 values.

What’s the impact of a flat response? Take, as an example on a COTS monitor, the pixel values 101, 102, 103, 104, and 105. These could all be mapped to a single luminance on the screen. If a patient has a subtle nodule that would otherwise be revealed by the difference between input values 102 and 105, it will disappear on the COTS monitor because there is no distinction between those values on the screen. Note: since the majority of the clinical information from imaging modalities sits in the lower 50 percent of the luminance range, these flat spots fall in exactly the region where the ability to resolve pixels at different luminance values matters most.
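Here’s a small Python sketch of that failure mode; the luminance table is invented for illustration, not measured from a real display:

    # A flat spot hides detail: inputs 101-105 all measure the same
    # luminance (cd/m2) on this illustrative response curve.
    measured = {100: 12.1, 101: 12.4, 102: 12.4, 103: 12.4,
                104: 12.4, 105: 12.4, 106: 12.7}

    def distinguishable(lut, a, b):
        """Two pixel values are only visible as different if the display
        maps them to different luminance levels."""
        return lut[a] != lut[b]

    # The nodule in the example differs from its background by pixel
    # values 102 vs. 105 -- on this curve, the difference vanishes.
    print(distinguishable(measured, 102, 105))  # False: the nodule disappears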

In conclusion, the potential to miss critical diagnostic information, both at high luminance and in the flat spots of the response, is the most important reason not to even consider a COTS monitor. The first requirement for a diagnostic display is a monitor calibrated to the DICOM standard, which truly maps each distinct pixel value to a luminance on the screen that the human eye can detect as noticeably different. It’s best to have the manufacturer calibrate the monitor to enable optimal mapping of the three RGB channels onto the DICOM-compliant curve.
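For the curious, the DICOM calibration target is the Grayscale Standard Display Function (GSDF) defined in DICOM PS3.14, which specifies a target luminance for each just-noticeable-difference (JND) index from 1 to 1023. A minimal Python sketch of evaluating it, using the coefficients published in the standard:

    import math

    # DICOM GSDF (PS3.14): target luminance L in cd/m2 as a function of
    # the JND index j, valid for 1 <= j <= 1023.
    A, B, C = -1.3011877, -2.5840191e-2, 8.0242636e-2
    D, E, F = -1.0320229e-1, 1.3646699e-1, 2.8745620e-2
    G, H, K = -2.5468404e-2, -3.1978977e-3, 1.2992634e-4
    M = 1.3635334e-3

    def gsdf(j):
        x = math.log(j)
        num = A + C*x + E*x**2 + G*x**3 + M*x**4
        den = 1 + B*x + D*x**2 + F*x**3 + H*x**4 + K*x**5
        return 10 ** (num / den)

    # Calibration assigns each pixel value a step of one or more JNDs,
    # so consecutive values remain perceptually distinct.
    print(f"j=1: {gsdf(1):.3f} cd/m2, j=1023: {gsdf(1023):.0f} cd/m2")

Calibration software measures the display’s actual response and builds a lookup table that forces it onto this curve.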

Many COTS monitors don’t have the required dynamic range of brightness.

The maximum light output of a monitor is specified in candelas per square meter (cd/m2). A good-quality commercial display can achieve 300 cd/m2, sometimes more if you’re lucky. That maximum of 300 cd/m2 is at the low end of what any medical-grade monitor can achieve; medical-grade monitors might go up to 2,000 cd/m2 or higher.

Why do we need this much luminance? When a display is calibrated to DICOM, a percentage of the native response is lost in the mapping process. Starting at 300 cd/m2, the maximum luminance after applying DICOM corrections can be expected to drop by about 10 to 20 percent, leaving roughly 240 to 270 cd/m2.

The human eye can resolve roughly a 250:1 contrast ratio under the ambient conditions of the viewing environment. Assuming the commercial display were made DICOM compliant with aftermarket software, its luminance ratio would come very close to that of the eye. However, ambient light detracts from the ability to see low-contrast information, because light reflected off the screen raises the effective black level. A display like the one in this example would need a low-light room to achieve a 250:1 luminance ratio once ambient light is accounted for.
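A back-of-the-envelope Python sketch of that effect; all of the numbers are illustrative assumptions, not measurements:

    # Ambient light adds to both the brightest and darkest luminance the
    # eye receives, compressing the effective contrast ratio.
    L_max = 300 * 0.85   # 300 cd/m2 panel, ~15% lost to DICOM correction
    L_min = 1.0          # panel black level, cd/m2
    ambient = 1.5        # luminance reflected off the screen, cd/m2

    effective_ratio = (L_max + ambient) / (L_min + ambient)
    print(f"effective luminance ratio: {effective_ratio:.0f}:1")
    # With this much reflected light the ratio is about 103:1, well short
    # of 250:1; only a darker room or a brighter panel restores it.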

Diagnostic displays operate between 400 and 600 cd/m2, as corrected to DICOM, with reserve luminance for an extended life at those levels. Even on a calibrated monitor, if there aren’t enough distinguishable luminance steps onto which to map the pixel data, you clip off part of the information. For example, if you want to map 256 grayscale pixel values but the display’s luminance range spans only 200 distinguishable steps, some of those values must collapse together and the information is lost.
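One way to make “enough points” concrete: count how many JNDs fit between a display’s darkest and brightest calibrated levels, reusing the gsdf() function from the sketch above. Mapping 256 pixel values needs at least 256 JNDs. The luminance figures below are illustrative:

    # Count the just-noticeable luminance steps a display can span;
    # requires the gsdf() function defined in the earlier sketch.
    def jnd_index(luminance):
        """Smallest JND index (1..1023) whose GSDF luminance reaches the target."""
        for j in range(1, 1024):
            if gsdf(j) >= luminance:
                return j
        return 1023

    def available_levels(l_min, l_max):
        return jnd_index(l_max) - jnd_index(l_min) + 1

    dark_room = available_levels(1.0, 250.0)     # dim panel, dark room
    bright_room = available_levels(16.0, 265.0)  # same panel + ambient light
    print(dark_room, bright_room)  # ambient light always shrinks the count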

The required dynamic range depends on where you use the monitor. The brighter the room, the more information you’ll lose in the dark end of the scale, because you simply won’t be able to distinguish details there. There’s a simple fix for this: calibration takes the room light into account and maps the lowest pixel value to something you can still detect. The whole range is shifted upward, which is important when the monitor is used in a bright area such as an ER or ICU.

It’s also beneficial to have some slack in the dynamic range, which medical monitors provide, because the monitor’s light source dims over time. Therefore, the maximum brightness needed to map the whole data range should be about 350 cd/m2, assuming you use the monitor in a dark environment. If you use it in a bright area, or if you want slack to absorb the decline in monitor output over a period of several years, you might want to opt for a maximum luminance of 450 to 500 cd/m2.

A medical-grade monitor typically adjusts light output to compensate for start-up variations in output.

With COTS monitors, light output varies until the temperature stabilizes, which takes about 30 to 60 minutes after the equipment’s turned on. You can work around this by leaving the monitor on day and night, or by switching it on automatically one hour before use. Either method, however, will drastically reduce the lifetime of the monitor.

High-quality medical-grade monitors typically have a built-in feedback mechanism that measures light output and adjusts the electrical current to the light source to keep the output constant.
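In spirit, the mechanism is a simple closed loop. Here’s a toy Python sketch; the sensor model, gain, and warm-up curve are all invented for illustration:

    # Toy backlight stabilizer: a sensor reads light output each cycle
    # and a proportional controller nudges the drive toward the target.
    TARGET = 450.0   # calibrated luminance target, cd/m2
    GAIN = 0.5       # proportional gain (illustrative)

    def read_sensor(drive, warmup_factor):
        """Pretend sensor: output depends on drive level and warm-up state."""
        return drive * warmup_factor

    drive = 450.0
    for minute, warmup in enumerate([0.80, 0.85, 0.90, 0.95, 0.99, 1.00]):
        measured = read_sensor(drive, warmup)
        drive += GAIN * (TARGET - measured)  # push output back toward target
        print(f"t={minute} min: measured {measured:.0f} cd/m2 -> drive {drive:.0f}")

The real loop runs continuously in hardware, which is why a stabilized monitor is usable within moments of power-on rather than after an hour of warm-up.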

Therefore, the third requirement is to have a medical-grade monitor with a light output stabilizer.

A medical-grade monitor can perform automatic calibration, QA, and maintain calibration records.

Why are these records so critical? Two words: Liability reduction.

Using a medical-grade monitor, and as a result being able to continually measure and adjust the monitor’s values and record those calibration events, helps you do your job properly on an ongoing basis and provides an umbrella of protection over your professional career. Using a COTS monitor simply exposes you, your livelihood, and the money you’ve invested in your medical education and career to too much risk.

In addition, you need regular access to these records to ensure that the monitor is still operating within the acceptable range. Many radiologists seem to replace their monitors after five years or so; if a monitor still calibrates within tolerance, there’s no reason to do that.
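As a sketch of what “retrievable records” might look like in practice, here’s a minimal Python example that logs calibrations to a JSON file; the file name, fields, and tolerance are hypothetical:

    import json, datetime, pathlib

    LOG = pathlib.Path("calibration_log.json")  # hypothetical log location

    def record_calibration(monitor_id, max_luminance, gsdf_error_pct):
        """Append one calibration event to the log."""
        entries = json.loads(LOG.read_text()) if LOG.exists() else []
        entries.append({
            "monitor": monitor_id,
            "date": datetime.date.today().isoformat(),
            "max_luminance_cd_m2": max_luminance,
            "gsdf_error_pct": gsdf_error_pct,  # deviation from the DICOM curve
        })
        LOG.write_text(json.dumps(entries, indent=2))

    def still_in_tolerance(monitor_id, max_error_pct=10.0):
        """A monitor that still calibrates within tolerance needn't be replaced."""
        entries = json.loads(LOG.read_text())
        latest = [e for e in entries if e["monitor"] == monitor_id][-1]
        return latest["gsdf_error_pct"] <= max_error_pct

    record_calibration("reading-room-1", 448.0, 4.2)
    print(still_in_tolerance("reading-room-1"))  # True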

Therefore, the fourth requirement for a medical-grade monitor is to make sure that you can retrieve and store calibration records.

A medical-grade monitor is typically certified.

The American College of Radiology (ACR) publishes recommendations for medical monitors that serve as a standard and baseline for image-display performance. They are somewhat technical and, in our opinion, not worded strongly enough.

Also, most medical-grade monitors are FDA approved, which is actually only a requirement if you’re reading digital mammography. If your monitor meets the ACR recommendations referenced above, you should be OK, but FDA approval doesn’t hurt. You can check the FDA website to confirm whether a manufacturer’s monitors have been approved. All medical displays offered by Monitors.com are FDA approved for medical use.

The fifth (optional) requirement is to be FDA approved.

Your monitor needs to have the right spatial resolution to see individual details.

In addition to being able to see all of the grayscale, which is characterized by contrast resolution, you also need to be able to distinguish individual pixels, which is characterized by spatial resolution.

For example, let’s take a typical computed radiography (CR) chest image, which may have a matrix size of 2,000 x 2,500 pixels. That comes to five million pixels, or 5MP. The standard configuration for a diagnostic monitor used to read such images is 3MP, because the physician can zoom or use an electronic loupe to see a one-to-one mapping of each image pixel on the screen.
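The quick pixel arithmetic, sketched in Python; the 3MP geometry below (1536 x 2048) is a common diagnostic panel layout, though exact dimensions vary by model:

    # How a 5MP CR image relates to a 3MP diagnostic display.
    cr_image = (2000, 2500)     # CR chest matrix from the example
    display_3mp = (1536, 2048)  # a common 3MP portrait panel

    image_px = cr_image[0] * cr_image[1]
    display_px = display_3mp[0] * display_3mp[1]
    fit_scale = min(display_3mp[0] / cr_image[0], display_3mp[1] / cr_image[1])

    print(f"image: {image_px / 1e6:.1f} MP, display: {display_px / 1e6:.1f} MP")
    print(f"downscale factor to fit the whole image: {fit_scale:.2f}")
    # Fitting the full image needs only modest downscaling (~0.77x);
    # a one-to-one view of any region is a quick zoom away.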

One could argue that you can use a 2MP monitor as well. Yes, that’s correct as long as you realize that it will take more time to interpret images and make a diagnosis. If you’re cost-sensitive, a 2MP configuration will do. But, assuming your time is more valuable than the cost of the monitor, the sixth requirement is to have a 3MP monitor configuration for general radiology reading where CR cases are being interpreted.

Yes, diagnostic displays of at least 3MP cost more than their lower-resolution counterparts, and certainly more than commercial displays. Even if you’re cost-sensitive about monitors, look at it this way:

As you know, radiologists get paid per case they read. Diagnostic monitors of at least 3MP give you more detail within the image and reduce zooming, panning, and time spent focusing on a particular area. If you can read faster and more comfortably, and with more confidence, how much more money can you be earning? How many more cases could you read with the extra time you save each day? Isn’t it possible that the diagnostic display monitor—the superior albeit more expensive technology—pays for itself pretty quickly? Is it worth it to spend more on the best-quality image?

Simply put, a medical-grade monitor is a professional tool that delivers professional results. Chief among these results are improved diagnostic accuracy and timeliness, the things your reputation is built on. But being able to increase your earning potential is certainly attractive, too.

Conclusion

Does this mean that you can’t use a COTS monitor? It depends.

  • Are you willing to manually calibrate the monitor on a regular basis, running a calibration check and making sure the resulting corrections can actually be applied by the monitor?
  • Will you take care of warm-up time?
  • Do you have a monitor that meets minimum brightness requirements?
  • Do you have a means of keeping your calibration records?
  • Are you certain that, in the event of a legal dispute, the plaintiff won’t have the expertise to challenge you over using sub-standard components that may have impacted patient care?

It’s up to you.

But we would think twice about it. Compared with the overall cost of a picture archiving and communication system (PACS), the price difference between a good-quality medical-grade monitor and a commercial-grade one simply isn’t great enough to justify the risks.

Interested in buying a diagnostic display monitor? Now that you understand why you need this superior technology, please review the wide selection of general radiology and mammography displays from Monitors.com.
