“The goal is to turn data into information, and information into insights.” - Carly Fiorina (former CEO, HP)
It is an intriguing quote, and these words touch on one of the more important elements of data collection. Even before “big data” became a buzzword, industry leaders recognized that value didn’t lie solely in the quantity of data, but rather in the quality of the insights and analysis those numbers could provide.
“Algorithmic business,” an advanced application of analytics, has emerged as the next big thing. Analytics strengthen an organization’s understanding of its customers’ needs, likes, and dislikes. To fuel these studies, organizations are collecting data from numerous physical and virtual sensors through initiatives like digital transformation and the Internet of Things (IoT). The collected data is then analyzed to generate meaningful insights for companies. This paradigm shift has affected the entire market.
Capgemini’s World Quality Report 2016-17 illustrates how organizations are embracing both digital transformation and IoT as tools to analyze and engage with customers in an efficient way, having a strong impact on the processes, procedures and tools adopted by the IT industry.
In short, companies amass data from various physical and virtual sources. Turning this data into decisions is a step-by-step progression, and each stage yields new information that helps organizations make strategic decisions.
The first step, known as Big Data, comprises the collection of raw data that is not yet suitable for human interpretation.
This raw data, when summarized in a certain way (counts, likes, averages, sums, and so on), generates meaningful information for interpretation. This second stage is called descriptive analysis; examples include social media likes, financial reports, and inventory details.
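As a toy illustration of descriptive analysis (the records and field names here are invented, not from any real dataset), raw event data can be boiled down to counts and averages:

```python
from collections import Counter
from statistics import mean

# Hypothetical raw "big data": one record per customer interaction.
events = [
    {"user": "a", "action": "like", "order_value": 0},
    {"user": "b", "action": "purchase", "order_value": 40},
    {"user": "a", "action": "purchase", "order_value": 60},
    {"user": "c", "action": "like", "order_value": 0},
]

# Descriptive analysis: simple counts and averages over the raw records.
action_counts = Counter(e["action"] for e in events)
purchases = [e["order_value"] for e in events if e["action"] == "purchase"]
avg_order_value = mean(purchases)

print(action_counts["like"], action_counts["purchase"])  # 2 2
print(avg_order_value)                                   # 50
```

The raw records mean little on their own; the summaries (two likes, two purchases, an average order of 50) are the interpretable information the article describes.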
With the third step, known as predictive analytics, the data is processed further to estimate probable outcomes using data modeling and machine learning techniques. When applied, these analytics can generate sales forecasts for the next year or calculate the probability of customers making future credit payments on time. Companies are designing applications and business models that can respond quickly to customer needs and apply automation to ensure a timely response.
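A minimal sketch of the predictive idea, assuming some made-up yearly sales figures: fit a linear trend to historical data and extrapolate one year ahead as a naive sales forecast.

```python
# Hypothetical yearly sales figures (arbitrary units, invented data).
years = [2013, 2014, 2015, 2016]
sales = [100.0, 110.0, 125.0, 135.0]

# Ordinary least-squares fit of sales = slope * year + intercept.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(sales) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, sales)) \
        / sum((x - mean_x) ** 2 for x in years)
intercept = mean_y - slope * mean_x

# Predictive step: extrapolate the fitted trend one year forward.
forecast_2017 = slope * 2017 + intercept
print(round(forecast_2017, 1))  # 147.5
```

Real predictive models are far richer (seasonality, many features, machine-learned estimators), but the structure is the same: learn a pattern from past data, then project it onto unseen cases.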
Furthermore, various algorithms are being embedded in business applications to make relevant decisions in the stage known as prescriptive analytics. These models are equipped with a feedback loop and a system, linked to a particular algorithm, that responds to end users automatically. Businesses adopting this algorithmic approach are called Algorithmic Businesses.
As Gartner explained, “Algorithmic business involves the industrialized use of complex mathematical algorithms pivotal to driving improved business decisions or process automation for competitive differentiation.”
The building blocks of any system following algorithmic business logic are input, an algorithmic function, output, and a feedback loop. The input is analyzed by an algorithmic function, built with techniques such as decision trees or fuzzy logic and tools such as Excel, SAS, SAP, TIBCO Software, MathWorks, or Ayata. Based on that analysis, the output function produces results in forms such as prices, inventory details, or patient statistics, and the feedback loop feeds those results back to adjust subsequent inputs. The model below shows a high-level view of the Algorithmic Business Approach.
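One way to picture the input/algorithm/output/feedback cycle is a toy dynamic-pricing loop. The rule and numbers below are purely illustrative, not any vendor's actual algorithm: observed demand is the input, a pricing rule is the algorithmic function, the new price is the output, and that price feeds back into the next cycle.

```python
def pricing_algorithm(price: float, units_sold: int) -> float:
    """Algorithmic function: decide the next price from observed demand.

    Made-up rule: raise the price 5% when demand is strong, cut it 5%
    when demand is weak, otherwise leave it unchanged.
    """
    if units_sold > 100:          # strong demand -> raise price
        return round(price * 1.05, 2)
    if units_sold < 50:           # weak demand -> cut price
        return round(price * 0.95, 2)
    return price                  # demand in range -> keep price

price = 20.00
# Each iteration: input (units sold) -> algorithm -> output (new price),
# and the output is fed back as the starting price of the next cycle.
for units_sold in [120, 110, 40]:
    price = pricing_algorithm(price, units_sold)
    print(price)  # 21.0, then 22.05, then 20.95
```

The feedback loop is what distinguishes this from a one-shot calculation: the system's own output continuously becomes part of its next input, exactly the property that complicates conventional testing later in the article.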
To better illustrate this practice, let’s take the example of Amazon.com. Amazon has a recommendation engine that suggests products to online buyers based on what they have viewed, purchased, or kept in their virtual cart. This engine is very dynamic, changing its suggestions based on the customer’s recent searches and purchases. Amazon once reported that this approach helped it increase sales by 29%.
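Amazon's actual engine is proprietary, but the general item-to-item flavor of recommendation can be sketched with simple co-occurrence counts over hypothetical order data ("customers who bought X also bought Y"):

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical order history: each order is a set of product IDs.
orders = [
    {"book", "lamp"},
    {"book", "lamp", "pen"},
    {"book", "pen"},
    {"lamp", "mug"},
]

# Count how often each pair of products appears in the same order.
co_counts = defaultdict(lambda: defaultdict(int))
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def recommend(product: str) -> list:
    """Products most often bought together with `product`."""
    others = co_counts[product]
    return sorted(others, key=others.get, reverse=True)

print(recommend("book"))  # "lamp" and "pen" co-occur with "book"
```

Because new orders simply update the counts, the suggestions shift as purchasing behavior shifts, which is the dynamic, feedback-driven quality the article attributes to Amazon's engine.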
Many other industries are using an algorithmic approach to better target their customers.
In algorithm-based applications, feedback loops continuously change inputs and outputs, limiting the use of standard testing techniques. Model-based testing techniques allow us to model a desired business process using tools such as Conformiq, Reactis Tester, TVG, and TorX.
These model-based testing tools generate both test cases and test scripts automatically, auto-execute them, and analyze the test results. Every time the feedback loop alters the input, the models change and the test cases are automatically updated. The most important part of model-based testing is designing a model for automated test case generation. This requires testers who are able to apply model-based thinking to test case generation based on the needs of an algorithmic business.
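As a rough sketch of the idea (not how any of the tools above actually work internally), a business process can be modelled as a finite state machine, and test cases derived automatically by covering every transition. When the model changes, regenerating the cases updates the test suite, which is the property the article highlights.

```python
# Toy checkout flow modelled as a state machine: state -> {event: next_state}.
# The states and events are invented for illustration.
model = {
    "cart":      {"checkout": "payment", "clear": "cart"},
    "payment":   {"pay": "confirmed", "cancel": "cart"},
    "confirmed": {},
}

def generate_test_cases(model: dict, start: str) -> list:
    """Derive one test case per transition reachable from `start`.

    Breadth-first walk; each test case is the path of
    (state, event, next_state) steps ending at the covered transition.
    """
    cases, visited, frontier = [], {start}, [(start, [])]
    while frontier:
        state, path = frontier.pop(0)
        for event, nxt in model[state].items():
            cases.append(path + [(state, event, nxt)])
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, path + [(state, event, nxt)]))
    return cases

for case in generate_test_cases(model, "cart"):
    print(" -> ".join(f"{s} --{e}--> {n}" for s, e, n in case))
```

Adding a new event to the model (say, a refund path) and re-running the generator yields the extra test cases automatically; no hand-written test needs editing, which is the "models change and the test cases are automatically updated" behavior described above.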
The benefits of model-based testing include the following:
- Enabled dynamic testing: Testing for algorithmic business involves multiple iterations of data in order to auto-generate test cases and executions.
- Improved time to market: Automation is applied not only in test execution but also in test case generation.
- Reuse potential: Using models enables higher reuse for similar applications in algorithmic business.
Increased competition has forced businesses to innovate and to develop customer intimacy. The use of algorithms has helped businesses to study customer behavior and buying patterns in a dynamic digital business environment. Such businesses, known as algorithmic businesses, also need to apply models to their testing practice, as model-based testing tools and methodologies have helped reduce time to market and provide both cost- and performance-related efficiencies.