Stalled BI programs are intriguing to me. Having been involved in many business intelligence projects over the years, I’ve seen how they can sputter when business takes a back seat to IT. IT often wants a well-articulated set of requirements that can be implemented in top-down fashion. Business, on the other hand, generally prefers a more agile, evolving requirements methodology, especially for processes that are information- rather than transaction-driven – something along the lines of “show me something and I’ll react to it.” Where IT desires predictable, transaction-focused requirements, business often prefers learn-as-you-go information discovery. At times the two approaches are at loggerheads.
Indeed, it’s important to appreciate the differences between transaction-driven and information-driven project initiatives. A few years back, a prospect sent OpenBI a spreadsheet with what they said were comprehensive requirements for a planned BI application and asked for a “fixed-bid” consulting fee proposal to deploy a solution. After reviewing the requirements, we agreed to submit such a proposal, with the proviso that changes and out-of-scope requests would be handled by a control process adjudicated by managers from both the customer and OpenBI.
The first stage of the work was a two-week roadmap and requirements confirmation study. By the end of that short investigation, the requirements list was quite different from the original, and both parties agreed that managing to such an in-motion, “fixed” scope would be too onerous. We then settled on a tightly controlled time-and-materials payment basis in lieu of the fixed bid.
It turns out the T&M compromise was best for both sides, as requirements continued to accrue and shift over the three-month engagement. In fact, it was only when OpenBI began to show the customer first-cut reports, OLAP cubes and visuals with real data – sparking “aha” analytics moments – that requirements and expectations began to solidify. At the time, I recalled the wisdom of a mentor from 25 years ago: “The decision support project actually starts when the key stakeholders first see their data.” The engagement ended as a big success, extending just two weeks beyond the initial plan, even with 75 percent of the delivered functionality differing from what was first articulated, and many “extra” features deferred to Phase 2.
An interesting article in the January-February 2013 Harvard Business Review, “Why IT Fumbles Analytics,” elaborates on the IT/transaction versus business/information project divide and offers advice to incipient BI/analytics programs. For authors Donald Marchand and Joe Peppard, a primary reason that analytics initiatives fail is that companies “treat their big data and analytics projects the same way they treat all IT projects, not realizing that the two are completely different animals.” The mistake that many organizations make with analytics is to employ “the conventional approach to designing and installing IT systems, which focuses on building and deploying the technology on time, to plan, and within budget.”
Indeed, “a big data project can’t be treated like a conventional, large IT project, with its defined outcomes, required tasks, and detailed plans for carrying them out ... Commissioned to address a problem or opportunity that someone has sensed, such a project frames questions to which data might provide answers, develops hypotheses, and then iteratively experiments to gain knowledge and understanding.”
The authors provide an informative juxtaposition of traditional IT and analytics initiatives. A typical IT project would be installation of an ERP system to improve efficiency, while an analytics initiative might be to develop an understanding of customer behavior for outcome prediction in service of strategic growth. The IT planning process is top-down and detailed; the analytics process, on the other hand, is discovery-driven, fueled by “science of business” theories, hypotheses, experiments – and refinement. Business-knowledgeable practitioners and technology professionals are keys to both IT and analytics, but analytics requires data and social scientists as well. Success for IT projects is delivery on time to plan; analytics success revolves around new insights and the adoption of an evidence-based decision-making culture.
Marchand and Peppard propose five guidelines to promote successful analytics deployments:
- Obsess on Information Users – “challenge how they ... use data in reaching conclusions and making decisions, urging them to rely on formal analysis instead of gut feel.”
- Obsess on Information Use
- Deploy Social Scientists on Analytics Project Teams – better yet, quantitative social scientists.
- Focus on Learning – i.e. the science of business
- Obsess on Solving Business Problems – rather than, like IT, the risks of deploying technology
I can’t help but compare the divide of traditional IT and BI project management with the dialectic of “Planners vs Searchers.” IT Planners implement top-down solutions to technical engineering problems. Analytics Searchers, on the other hand, acknowledge upfront that they don’t have answers, but are intent on finding them through evidence-based discovery. New analytics deployments could well benefit from appreciating this distinction.
Steve Miller is co-founder of OpenBI, LLC, a Chicago-based business intelligence (BI) services firm that specializes in delivering analytic solutions with both open source and commercial technologies. Miller has more than 30 years of experience in intelligence and analytics, having migrated from health care program evaluation, to database consulting with Oracle Corporation, to running a fast-growing BI services business at Braun Consulting. Advances in technology over that time have fundamentally enabled the use of quantitative methods for business differentiation. OpenBI, LLC, is all about helping customers attain that differentiation. Steve blogs frequently on Stats Man's Corner at miller.openbi.com. You can reach him at firstname.lastname@example.org.
This blog originally appeared on Information Management.