
It's Easy to Get Lost When Creating an Enterprise Analytic Strategy

By Nicholas Marko, Chief Data Officer, Geisinger Health System


A few years ago I was driving through Detroit with my wife and some friends on our way to a weekend getaway. My attention was split between their conversations, the directions being dictated from the in-dash GPS, and the few feet of dark road in front of me lined with construction barrels and temporary signage. Suddenly, I became aware of a significant widening of the road and a bit of a traffic jam ahead. Upon refocusing my attention on the GPS, I realized that it had directed me to the entrance of the Ambassador Bridge. It seems the GPS had re-routed me around a construction delay via Canada. We did not have our passports on hand, we had no intention of going to Canada that evening, and there was no lawful way to reverse direction in the crowded border crossing queue. When my turn at the immigration booth arrived, the only bewildered comment I could muster for the border agent was, “But I don’t want to go to Canada…” Fortunately he was mildly amused, and a few illegal (but sanctioned) U-turns kept me from becoming the center of a minor international incident. I drove away with a story that is much more enjoyable in retrospect and an appreciation for the fact that even high-tech roadmaps aren’t always as reliable as they seem.

An organization’s analytic strategy can go astray just as easily as my near-miss with Canada that night. Just as Jill (my GPS) failed to appreciate that a diversion around traffic was not a good idea if it required even temporarily crossing an international border, so too can well-intentioned and carefully constructed analytic strategies encounter fatal problems if they have been built on a conceptual foundation that incorporates some common and fundamentally flawed assumptions. Let’s focus on three key principles which, if ignored, can derail an organization’s attempts at building a solid analytic strategy.

Principle #1: “Enterprise Data Strategy” does not equal “Analytic Strategy”.

Once an organization decides that its data represents a valuable resource, there is typically some degree of commitment made to developing an Enterprise Data Strategy (EDS). Unless the organization has an executive leader with a background in data strategy leading these discussions, management frequently gravitates first toward issues of data infrastructure and data governance. This is not unreasonable, as both are important domains of EDS and because most organizations recognize that improvements are needed in both domains. Unfortunately, because these two issues tend to consume considerable time and resources, discussions of additional components of a comprehensive enterprise data strategy are often deferred. Subsequently, an unstated assumption tends to emerge that once the infrastructure and governance are established, the data will somehow be “clean and ready for analysis” and that this analysis will follow seamlessly as a result of these preparatory efforts.

In reality the opposite is true. Data has no inherent value as a commodity; it is valuable only when used. “Analysis” describes the process of utilizing and interpreting data to extract knowledge, so developing a strategy that outlines the intended use of data (the analysis) must be an early step on the road to developing a comprehensive EDS. In this context, analytic strategy is one component of a multi-dimensional Enterprise Data Strategy. A successful EDS is the result of a balanced approach that addresses all of these components in parallel. Conversely, when EDS development is approached from the top down, asymmetric efforts naturally emerge and the analytic strategy component tends to be among the most under-emphasized. The distinction between analytic strategy and EDS is, therefore, more than just semantic. A solid analytic strategy is a strong pillar supporting a comprehensive EDS that has been developed from the ground up, whereas a top-down approach to building an EDS will almost universally fail to produce a strong analytic strategy.

Principle #2: Business Intelligence and Analytics are two different domains.

A common, specious statement heard during early discussions of analytic strategy goes something like this: “Most of our analytics are done using reporting and dashboarding.” Flat reporting is primarily an exercise in summarizing data, and dashboarding is a simple form of data visualization. While an analyst can certainly use the information presented in reports and dashboards to conduct an analysis, the actual analytic portion of this process lies in the analyst’s actions, not in the construction of the report or the dashboard. For this reason, I consider reporting and dashboarding to be two approaches to the domain of Business Intelligence (BI), not the domain of data analytics. Both BI and analytics are valuable to enterprise operations, but they each use different tools to generate different outputs, and their products are consumed in different capacities for different purposes.

Because BI and analytics are both valuable domains, yet are also fundamentally different from each other, they require different strategies in order to be applied successfully. Analytic strategy and BI strategy should, therefore, be addressed separately by experts with different skill sets, with different end-user needs in mind, and with different strategies for implementation and for consumption of their outputs. Unfortunately, many enterprises believe that they have an analytic strategy when what they really have is a BI strategy. This simple mistake generally leaves significant data-related value unrealized, as the entire domain of analytics receives insufficient attention and data utilization opportunities are lost.

Principle #3: Start by considering large-scale value propositions, not by selecting tools and use cases.

Once an organization has progressed to the point of developing a strategy explicitly for data analytics, it often finds that skill sets and technical resources for real analytics are sparse. Many existing personnel tend to have skills in data management, data architecture, and various BI tasks, but real analytic experts are few and far between. This leads many organizational leaders to look to an obvious place for advice: vendors. Vendors will offer whitepapers from various sectors describing industry standards, discuss the robust set of analytic tools that they offer, and suggest working collaboratively to identify specific analytic “use cases” that they can help the organization address. This approach seems logical, and many data strategists fail to recognize that it is primarily a sales tactic. This is not to suggest that the vendors are somehow being deceptive or have ill intent; this model has simply become a de facto way that many organizations are comfortable building analytic strategies and operations. Taking this approach will likely get an organization exactly what it expects: a series of technology products intended to solve a handful of specific problems that have some tie to value. While this is not necessarily bad, it is a suboptimal approach to building a robust analytic strategy, and it leads to a series of one-off solutions that are difficult to generalize. Upon realizing this, many organizations repeat the same process again and again, leading to significant spending on temporary value generators that are never really part of a cohesive analytic strategy.

I contend that organizations can do better than this by adjusting their initial approach in two ways. First, leadership must have an honest discussion that involves business strategists, data specialists with domain expertise, and technical IT resources. Together this group can determine what the organization really hopes to gain from its analytic enterprise and which parts of this ideal future state are practically achievable. Until these goals are defined, it is impossible to construct an analytic strategy. During this phase the enterprise should focus on big ideas and large-scale value propositions. Specific use cases should be selected only after this broader vision has been outlined and only once the key domains where analytics will be used to realize value have been defined. Leading with use cases results in one-off solutions, whereas leading with ideas and vision leads to generalizable infrastructures.

Once broad goals have been set and initial use cases have been selected as representative instances of these goals, the second step is for the organization to assess its current capacity to produce the requisite analytic results. From here some form of gap analysis can be used to identify the exact resources (human and technical) that are already in the system and those that need to be acquired. Indeed, a plan for acquiring the missing resources is often the first part of a well-formed analytic strategy. If an organization lacks the appropriate strategic personnel to conduct this assessment (i.e., it does not know what it does not know), then this may be the time to seek the assistance of carefully selected data strategy consultants (not vendors).

Only after these two steps have been completed should the organization engage vendors and hire experts. Now the enterprise will be seeking specific tools and solutions to explicit problems from an informed perspective, and as these resources fall into place the development of the analytic strategy can continue.

These three principles address the most common misunderstandings that occur during the early phases of creating an enterprise data strategy. Organizations without a data strategy expert on hand can easily fail to appreciate one or more of these subtle but critical points. The results can be disappointing and the process can be expensive. Careful attention to these principles will help an organization construct a solid foundation upon which a valuable analytic strategy can be built. This, in turn, can be one part of a more comprehensive EDS that guides an organization along the path to effective utilization of its data without taking a detour to Canada.