Why healthcare is moving to real-time analytics

Faster analysis may help providers address patient safety and readmission issues.


A majority of companies in the retail, technology, banking, healthcare and life science sectors (92%) are investing in real-time analysis technology for human and machine-generated data, according to a new report from OpsClarity, a provider of intelligent monitoring products for data and streaming applications.

Fast data and streaming analytics, which enable organizations to deliver near-instantaneous personalization and make highly informed decisions, are a top priority for both consumer-facing and internal big data analytics tasks, the survey found.

Most of those participating in the survey said they want analytics transactions completed within five minutes. In addition, some 27 percent said streaming processing systems must deliver insights within 30 seconds of the initial query.



Some 32 percent said their organizations plan to use the technology to power applications to serve the needs of their consumers, while 29 percent are likely to focus on internal analytics to optimize business practices. And 39 percent said these analytics efforts will involve both areas.

“With new fast data technologies, companies can make real-time decisions about customer intentions and provide instant and highly personalized offers, rather than sending an offline offer in an email a week later,” said Dhruv Jain, CEO and co-founder of OpsClarity. “It also allows companies to almost instantaneously detect fraud and intrusions, rather than waiting to collect all the data and processing it after it’s too late.”

Within healthcare, real-time analytics could deliver personalized insights that affect areas such as patient safety, clinical risk assessment, operational optimization, readmission reduction and even patient engagement. “The industry is experiencing remarkable growth in both awareness and adoption of real-time high-velocity data processing platforms,” the survey concludes.

The survey, fielded among software developers, architects and DevOps professionals, found that 79 percent of organizations plan to reduce or eliminate investment in batch processing, and 44 percent cite a lack of expertise with new data frameworks as a barrier to analyzing data pipeline failures.

“The ability to harness the power of real-time data analysis gives businesses a competitive edge in today’s digital economy by enabling them to become more agile and rapidly innovative,” Jain said. “However, as the underlying stream processing data frameworks and applications are relatively new and heterogeneous in nature, end-to-end monitoring and visibility into these fast data applications is critical for organizations to accelerate development and reliably operate their business-critical functions.”
