Beating the 4 top challenges of database management in the cloud

A transition to the cloud forces database managers to deal with complex data integrations spanning multiple environments.


According to a recent survey from KeyBanc Capital Markets, 32 percent of CIOs say they plan to use multiple vendors to create internal private cloud systems, while 27 percent plan hybrid cloud arrangements.

It’s no secret why. When making the leap to the cloud, organizations have two goals in mind—achieving economies of scale not typically available to smaller organizations and obtaining greater access to resources that free up their own.

However, attaining these highly sought-after benefits isn’t always easy. As with any innovation, a move to the cloud introduces a new set of challenges, and for database managers, dealing with complex data integrations spanning multiple environments can be a messy and complicated scenario.

What follows are the most common issues database managers experience and the best practices to manage them.

Challenge 1: Resetting your thinking
Cloud environments are far more fluid and dynamic than the typical server room, so IT executives must adjust the thinking they developed working with on-premises technologies. Any single server may suddenly disappear for any reason, at any time, and there’s a good chance it will never come back. In the cloud, however, it’s much easier to get an identical server up and running almost instantly. To keep even the simplest application fault tolerant and highly available, IT executives need to embrace distributed computing principles.

Unlike static, immovable infrastructure, the cloud can’t just be set and forgotten; IT executives must consistently review and revise their cloud setups. Given the example of disappearing servers, they must make sure data lives in multiple locations, and they should understand and apply cloud-specific accommodations for high availability, persistent storage of data and active monitoring of systems.
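To make the "data in multiple locations" point concrete, here is a minimal sketch in Python. It simulates separate storage locations as local directories and acknowledges a write only when a majority of locations hold the record; the location names and the save_to_location function are hypothetical stand-ins for a provider’s real storage API, not any particular vendor’s interface.

```python
import os

# Hypothetical "locations" simulated as local directories; in practice each
# would be a separate region or provider endpoint (an object-storage bucket,
# a database replica, and so on).
LOCATIONS = ["loc_us_east", "loc_us_west", "loc_eu_west"]

def save_to_location(location: str, key: str, value: bytes) -> bool:
    """Stand-in for a provider-specific write."""
    os.makedirs(location, exist_ok=True)
    with open(os.path.join(location, key), "wb") as f:
        f.write(value)
    return True

def replicated_write(key: str, value: bytes) -> bool:
    successes = 0
    for location in LOCATIONS:
        try:
            successes += save_to_location(location, key, value)
        except OSError:
            continue  # one location failing is expected, not fatal
    # Acknowledge only once a majority of locations hold the record,
    # so losing any single location never loses the data.
    return successes > len(LOCATIONS) // 2

if __name__ == "__main__":
    print(replicated_write("record-42.json", b'{"status": "ok"}'))
```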

Challenge 2: Automating the cloud
With many servers, networks and machines in play, anything can go on the fritz and cause a healthcare organization problems at any time. The common fix for anything that isn’t working, “Have you tried turning it off and on again?”, has an interesting cloud version. In the cloud, everything is disposable: when something isn’t working, teams simply wipe the machine and start from a clean state. This shift in thinking requires an organization to employ automation strategies that make it easy to deploy, update, monitor and manage servers, and to rebuild them from scratch whenever necessary.
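As one hedged illustration of the wipe-and-rebuild approach, the sketch below uses AWS’s boto3 SDK to throw away a misbehaving instance and launch a replacement from a known-good image. The region, instance type and image ID are placeholder assumptions, and other clouds have equivalent calls; this is a sketch of the pattern, not a prescription.

```python
import boto3

# "Replace, don't repair": instead of debugging a broken server,
# terminate it and launch a fresh one from a known-good image.
ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

def rebuild_instance(bad_instance_id: str, golden_image_id: str) -> str:
    # Throw away the misbehaving machine entirely...
    ec2.terminate_instances(InstanceIds=[bad_instance_id])
    # ...and start again from a clean, known state.
    response = ec2.run_instances(
        ImageId=golden_image_id,   # placeholder AMI built by your pipeline
        InstanceType="t3.medium",  # placeholder size
        MinCount=1,
        MaxCount=1,
    )
    return response["Instances"][0]["InstanceId"]
```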

One area where clients have shown an increased need for (and knowledge of) automation is deploying a new release to production. In the cloud, such an action typically requires a single Docker pull command, a deceptively simple operation that hides enormous complexity behind it. Before the cloud, IT staff would log onto the server, stop the application, update the binaries and start the application again. Now, they package the application as a Docker image, push it to a registry and deploy it; everything else is managed for them.
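For a rough picture of what that pull-and-redeploy step can look like, here is a sketch using the Docker SDK for Python. The registry URL, image name, tag, container name and port are all hypothetical.

```python
import docker

# A sketch of the "single Docker pull" release: fetch the new image,
# replace the running container, and let the registry and runtime
# handle the rest.
IMAGE = "registry.example.com/ehr-api"  # hypothetical image
TAG = "2.4.1"                            # hypothetical release tag

client = docker.from_env()

# Pull the new release from the registry.
client.images.pull(IMAGE, tag=TAG)

# Stop and remove the old container, if one is running; containers
# are disposable, so nothing on them needs to be preserved.
try:
    old = client.containers.get("ehr-api")
    old.stop()
    old.remove()
except docker.errors.NotFound:
    pass

# Start the new release.
client.containers.run(
    f"{IMAGE}:{TAG}",
    name="ehr-api",
    detach=True,
    ports={"8080/tcp": 8080},  # hypothetical port mapping
)
```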

Challenge 3: Migrating data
Moving an organization’s assets to the cloud is a long, gradual process; migrating on-premises systems one day and having them in the cloud the next is unheard of. The process usually takes years to complete, which is why many organizations run hybrid environments instead.

Depending on an organization’s systems and infrastructure, navigating a hybrid system can be as easy as setting up a VPN tunnel between the two locations and expanding the pre-existing distributed architecture to the new one. But if a system presumes a single source of truth (SSOT), the organization will run into issues once it has multiple sources. In most organizations, this is handled with extract, transform and load (ETL) and data flow processes, which require the data team to carefully delineate the ownership of each data item and its flow throughout the organization before it can be successfully integrated into the new cloud environment.
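The sketch below illustrates the ETL idea with the ownership rule made explicit. The sample records, and the assumption that the on-premises system is the single source of truth for patient data, are invented for illustration; a real pipeline would read from and write to actual databases.

```python
# Minimal ETL sketch for a hybrid environment. In-memory structures stand
# in for the on-premises source and the cloud target.

on_prem_patients = [  # assumed source of truth for patient records
    {"id": 1, "name": "a. jones", "updated": "2024-05-01"},
    {"id": 2, "name": "b. smith", "updated": "2024-05-03"},
]
cloud_patients = {2: {"id": 2, "name": "B. SMITH", "updated": "2024-04-20"}}

def extract():
    # Pull rows from the system designated as the owner of this data item.
    return list(on_prem_patients)

def transform(rows):
    # Normalize to the cloud schema (here: title-case the names).
    return [{**row, "name": row["name"].title()} for row in rows]

def load(rows):
    # Upsert into the cloud store. Because on-prem owns patient records,
    # its version always overwrites the cloud copy, avoiding two
    # conflicting "truths."
    for row in rows:
        cloud_patients[row["id"]] = row

load(transform(extract()))
print(cloud_patients)
```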

Challenge 4: Securing the cloud (beyond your provider)
When primary servers and databases are locked in a vault deep beneath an organization’s offices, the organization enjoys a certain assurance about the physical side of computer security. In the cloud, systems are typically much safer still, thanks to the ongoing intrusion detection, distributed denial of service (DDoS) prevention and active monitoring built into the cloud infrastructure.

While cloud providers take care of many security issues, plenty can still arise. For example, if a server is made public without following the appropriate security protocols, nothing prevents a hacker from waltzing into an organization’s innermost systems and wreaking havoc. Over the past few years, there have been countless examples of millions of people’s records being stolen and ransomed back to the organizations that held them, simply because a database was left open when it went into production.
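One simple way to catch an accidentally exposed database is to test whether its port answers from the public internet, as in the Python sketch below. The hostname and port are placeholders; run something like this from an outside network against your own endpoints.

```python
import socket

HOST = "db.example.com"  # hypothetical public hostname of your database
PORT = 27017             # placeholder; a common default database port

def is_publicly_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if the port accepts connections from this network."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True   # the port answered: anyone on the internet can knock
    except OSError:
        return False      # connection refused or timed out

if __name__ == "__main__":
    if is_publicly_reachable(HOST, PORT):
        print("WARNING: database port is exposed to the internet")
    else:
        print("Port not reachable from this network")
```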

Over the last few years, as threats have developed and evolved, security has been the bane of many organizations. Properly securing and protecting a database is easy to forget. But you don’t want to learn to lock the door the hard way, after the data is gone and your organization’s name is in all the headlines.
