In other words, since it would be three years before Hubble’s faulty optics were repaired during the 1993 space shuttle servicing mission, data cleansing allowed astrophysicists to make good use of Hubble despite the poor quality of its early images.
So, data cleansing algorithms saved Hubble’s fuzzy images – but how did this data cleansing actually save lives?
“Turns out,” Tyson explained, “maximizing the amount of information that could be extracted from a blurry astronomical image is technically identical to maximizing the amount of information that can be extracted from a mammogram. Soon the new techniques came into common use for detecting early signs of breast cancer.”
“But that’s only part of the story. In 1997, for Hubble’s second servicing mission, shuttle astronauts swapped in a brand-new, high-resolution digital detector—designed to the demanding specifications of astrophysicists whose careers are based on being able to see small, dim things in the cosmos. That technology is now incorporated in a minimally invasive, low-cost system for doing breast biopsies, the next stage after mammograms in the early diagnosis of cancer.”
Even though defect prevention eventually resolved the data quality issues in Hubble’s images of outer space, those interim data cleansing algorithms are still used today to help save countless human lives here on Earth.
So, at least in this particular instance, we have to admit that data cleansing is a necessary good.
Jim Harris is an independent consultant, speaker and freelance writer. He is blogger-in-chief at Obsessive-Compulsive Data Quality, a blog offering a vendor-neutral perspective on data quality and related disciplines.
This posting appeared on Information Management, a sister publication to Health Data Management.