Colorado research initiative moves to the cloud

After the University of Colorado Denver ran into problems making an on-campus data warehouse work effectively, the program achieved efficiencies and saved money by moving to a cloud-based repository.

Problems making an on-premises data warehouse work effectively led a University of Colorado Denver initiative to the cloud to get a stalled project moving again.

The move appears to have resolved the technical issues that plagued the warehouse and has brought new efficiencies to the program's analytics efforts.

University of Colorado Denver executives say the switch has given new legs to the analytics push, called Health Data Compass, which is part of the Colorado Center for Personalized Medicine. The use of the cloud by University of Colorado Denver highlights new choices facing healthcare organizations that are looking for options in holding vast amounts of data that can then be used for research.

Health Data Compass is an ambitious multi-pronged analytics initiative intended to support the development of breakthrough medical technologies at University of Colorado Health and Children’s Hospital Colorado. The University of Colorado Denver is providing technical support for the joint initiative. The warehouse is headquartered on the University of Colorado Anschutz Medical Campus and is jointly sponsored by the University of Colorado School of Medicine, University Physicians, University of Colorado Health and Children’s Hospital.

UCHealth and Children’s Hospital are separate organizations; they both use the Epic electronic health record system, and both will continue to conduct their own analytics programs, says Michael Ames, associate director of Health Data Compass.

The warehouse contains vast amounts of data. For example, in mid-July Health Data Compass reported more than 369 million patient lab orders and results from UCHealth, and an additional 124 million lab orders and results from Children’s Hospital.

Compass’s first attempt to develop a traditional, on-premises data warehouse ran into problems. The technology loaded data slowly and was prone to crashing, and the vendor’s promised technical fixes never materialized, Ames adds.

Beyond the frustration with the lack of progress on the project, dealing with the technical issues was expensive for the organizations, Ames says. “We were spending $1 million a year in basic maintenance, keeping the car running but not going anywhere with it.”

The solution was to bring the data together in a new enterprise data warehouse, developed in-house and running on the Google Cloud Platform.


So far, moving from the legacy warehouse to the cloud platform has reduced operational infrastructure costs by 60 percent. In addition, the average time for data preparation for analytics has decreased by 50 percent, according to Ames. Running the daily master patient index, which identifies duplicate records (such as those of patients treated as children and later as adults) and links records from the two hospitals, has gone from eight hours to 15 minutes.
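The daily master patient index job Ames describes matches the same patient across two hospital systems and across childhood and adult records. As a rough illustration of the idea, the sketch below performs a simple deterministic linkage on a normalized name-plus-birthdate key; the record fields, key choice, and matching logic are hypothetical, since the article does not describe Compass's actual algorithm:

```python
from collections import defaultdict

def normalize(name: str, dob: str) -> tuple:
    """Build a simple match key from a name and date of birth."""
    return (name.strip().lower(), dob.strip())

def link_records(uchealth: list, childrens: list) -> dict:
    """Group records from both hospitals under one master key.

    Each record is a dict with 'name', 'dob' and 'mrn' (the
    hospital-local medical record number). Returns a mapping from
    match key to the records that share it -- a crude master
    patient index.
    """
    index = defaultdict(list)
    for source, records in (("uchealth", uchealth), ("childrens", childrens)):
        for rec in records:
            key = normalize(rec["name"], rec["dob"])
            index[key].append({**rec, "source": source})
    return dict(index)

# A patient seen at Children's Hospital as a child and at UCHealth
# as an adult collapses to a single entry under the shared key.
adult = [{"name": "Ana Ruiz", "dob": "1990-04-02", "mrn": "U-771"}]
child = [{"name": "ana ruiz ", "dob": "1990-04-02", "mrn": "C-103"}]
mpi = link_records(adult, child)
```

Production matching systems typically add probabilistic scoring and blocking to handle typos and missing fields, which is part of why the full run over hundreds of millions of records is computationally heavy.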

In using a cloud-based warehouse approach, protecting patient data was of utmost importance. Ames and others working on the project were confident that the Google platform was secure, but the ways to configure and manage data securely on the platform were unfamiliar to the group. “Google gave us Fort Knox, but we needed help understanding how to keep the doors locked,” Ames explains.

So project leaders got some outside help, selecting cloud consulting firm Tectonic to provide advisory services and develop the security design documentation. The firm also developed training programs for working on the Google platform and best practices for the cloud environment.
