How providers can get better results from data efforts
Over the last several years, the growing primacy of information in the healthcare industry has produced both project successes and failures, as health insurers and other healthcare stakeholders race to develop ways to extract actionable information from their data stores.
While healthcare data and analytics hold promise to bring increased effectiveness and efficiency to the industry, the route to achieving what analytics promises—proactive healthcare, better customer service, process improvements, better claims modeling, increased profitability and more—has been littered with failed attempts. That raises the question: is there an approach to healthcare data and analytics that offers a reasonable hope of success?
Through trial and error, a successful pattern for implementing analytics solutions has emerged, centered on conferring closely with the business customer. When business analysts and data scientists code their own solutions, and IT is employed to standardize, optimize, scale and automate those solutions, the success rate in delivering them is correspondingly much higher. The idea behind this was, and is, that because the business built the original prototype, however inelegant it may have been, it had a clear understanding of what to expect.
To examine this approach more closely, consider two tracks of work conducted in widely different ways by two different teams. Each track was defined by the business and developed by IT, but one was successfully implemented, while the other was plagued with bugs and ultimately shelved by the business pending further analysis.
A brief retrospective of each track, outlining the best practices for conducting and coordinating projects between IT and business, should illustrate the difference in approaches, and therefore the difference in potential success. The focus here is on healthcare data and analytics and how IT can help facilitate actionable analytics in a continuous delivery and reporting cycle.
Let’s explore two short contrasting case studies that outline how this suggested approach could be applied and its benefits:
The first project's goal was to develop a master person index using the conformance layer of an enterprise data warehouse project. The driver and business case for the project was to have all outgoing extracts use the same person-matching logic to generate data; in the current state at the time, each individual extract had its own custom matching algorithm, and the development of a new track required re-coding this matching in each component.
There were two primary functional requirements: the existing extracts had to use this new generic matching logic, and any new system or process had to be able to integrate easily with the matched person data.
For non-functional requirements, the major consideration in architecting a solution was ensuring that the core logic of the matching algorithm was easily modifiable, so that it could be refined over time as more was learned about how to properly match disparate person records.
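To make that modifiability concrete, a matching core like this is often expressed as a set of weighted, swappable rules kept in one place. The sketch below is hypothetical—the article does not describe the actual algorithm, and the field names, weights and threshold are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Callable

# A matching rule compares two person records and returns a similarity score (0.0-1.0).
Rule = Callable[[dict, dict], float]

def exact(field: str) -> Rule:
    """Rule factory: 1.0 if the given field matches exactly (case-insensitive), else 0.0."""
    def rule(a: dict, b: dict) -> float:
        va, vb = a.get(field, "").lower(), b.get(field, "").lower()
        return 1.0 if va and va == vb else 0.0
    return rule

@dataclass
class PersonMatcher:
    """The rule list, weights and threshold live in one place, so the algorithm
    can be refined over time without touching the extracts that consume it."""
    rules: list[tuple[float, Rule]]  # (weight, rule) pairs
    threshold: float = 0.8

    def score(self, a: dict, b: dict) -> float:
        total = sum(w for w, _ in self.rules)
        return sum(w * r(a, b) for w, r in self.rules) / total

    def is_match(self, a: dict, b: dict) -> bool:
        return self.score(a, b) >= self.threshold

# Hypothetical configuration: refine by adjusting weights or swapping in fuzzier rules.
matcher = PersonMatcher(rules=[
    (0.4, exact("ssn")),
    (0.3, exact("last_name")),
    (0.3, exact("dob")),
])
```

Because every extract calls the same `matcher`, a tuning change propagates everywhere at once, which is exactly the property the project's non-functional requirement asked for.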
The end status of this work track was that it did not pass acceptance testing, and the final product was shelved pending an analysis by the business. That analysis surfaced several lessons, but the core issues were that the business resources did not have enough involvement in the design process, and IT did not push hard enough for that involvement.
The business resources had essentially outlined the problem domain and asked IT to provide a solution, and IT complied by doing an analysis, determining an architecture and developing a solution. The problem was that the business resources had no clear idea, aside from design artifacts, of what the final deliverable should look like. Because the analytics team could not conceptualize the solution well, the integration was never fully accepted, and the business shelved the project pending further research and development.
This track of work only spanned a single sprint before it was pushed back to planning and leadership, but the retrospective results challenged IT’s ability to work properly with the business. It was clear that a change in approach would be required for the next project.
In the second project, the analytics team wanted a dimensional model built from the conformed data in the enterprise data warehouse. The final result was to be a framework on which dimensions and fact tables could be quickly modeled and then moved to production. The primary use case was to build Claims and Membership facts across more than 16 dimensions.
The functional requirement of the project was that the solution deliver business-defined fact tables and their associated dimension tables. For non-functional requirements, the design had to support new business needs: new facts and dimensions, once identified, should be easy to add to the existing integration.
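One common way to meet that "easy to add" requirement is to make facts and dimensions declarative, so a new table is a few lines of definition rather than hand-written DDL. This is a sketch under assumptions—the table names, attributes and column types below are hypothetical, not the project's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Dimension:
    """A declaratively defined dimension table in a star schema."""
    name: str
    attributes: list[str]

    def ddl(self) -> str:
        cols = ", ".join(f"{a} VARCHAR" for a in self.attributes)
        return f"CREATE TABLE dim_{self.name} ({self.name}_key INT PRIMARY KEY, {cols});"

@dataclass
class Fact:
    """A fact table: foreign keys to its dimensions plus numeric measures."""
    name: str
    dimensions: list[Dimension]
    measures: list[str]

    def ddl(self) -> str:
        keys = ", ".join(f"{d.name}_key INT" for d in self.dimensions)
        meas = ", ".join(f"{m} DECIMAL" for m in self.measures)
        return f"CREATE TABLE fact_{self.name} ({keys}, {meas});"

# Hypothetical example: a claims fact over two of its many dimensions.
member = Dimension("member", ["first_name", "last_name", "plan_id"])
provider = Dimension("provider", ["npi", "specialty"])
claims = Fact("claims", [member, provider], ["billed_amount", "paid_amount"])
```

Adding a new dimension then means declaring one more `Dimension` and appending it to the fact's list, which is the kind of incremental growth the following months of the project relied on.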
The end result of this work track was successful Quality Assurance (QA) and User Acceptance Testing (UAT) and a limited launch to the analytics team for a “beta acceptance.” Additional facts and dimensions were added in the following months as the data scientist performed further discovery.
This track of work followed a different requirements-gathering phase. The need for greater participation and accountability was communicated upfront to the business resources assigned to the systems being built by IT. The business analyst assigned to this work was responsible for drafting the data models and coding the final expected output for each fact and dimension. The models were then refined by the architect and technical leads, and the developers had a business-built, business-accepted model, along with the expected output to test against.
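Testing against business-coded expected output can be as simple as comparing the integration's result set to the analyst's hand-built file, row for row. The sketch below assumes CSV output and hypothetical column names; the project's actual format is not described in the article:

```python
import csv
import io

def load_rows(text: str) -> list[dict]:
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def matches_expected(actual: str, expected: str) -> bool:
    """True if IT's output contains exactly the rows the business analyst
    coded as the expected result, ignoring row order."""
    key = lambda r: sorted(r.items())
    return sorted(load_rows(actual), key=key) == sorted(load_rows(expected), key=key)

# Hypothetical example: the analyst's hand-coded expected output for one fact...
expected_csv = "member_key,paid_amount\n1,100.00\n2,250.00\n"
# ...and what the developers' integration actually produced.
actual_csv = "member_key,paid_amount\n2,250.00\n1,100.00\n"
```

A check like `matches_expected(actual_csv, expected_csv)` gives developers an unambiguous acceptance target before QA ever begins, which is the practical payoff of having the business code the expected output.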
Another success factor was limiting the number of high-impact decisions. Issues and redesigns still arose during the development and QA phases, but the overall design stayed the same, and the solution was delivered to the business on time and met its expectations.
This method of providing requirements and defining the final results so that IT can set proper expectations was ultimately successful. Developers are very good at (and primarily responsible for) delivering well-structured, well-performing solutions to business problems, but they often lack the intuition needed to dissect a business problem thoroughly.
Of course, business involvement in the development process is not a new concept; the Agile methodology encourages and depends on it. The result is a modus operandi designed to increase business involvement and remove much of the guesswork that IT tends to do in these situations.
While not foolproof, the suggested approach significantly increased the likelihood of project success. The business analysts intuitively knew how the data should be queried and used and what the output should look like. Once they provided that to IT, the assumptions and guesswork were eliminated, and IT could focus on what it does best: preparing the new dimensional model for production. This should serve as a sharp reminder that the best business technology outcomes happen when business and IT resources find the right way to work together.