Less Is More: Streamlining Your Analytics and BI Tool Portfolio


Published: 23 July 2018
ID: G
Analyst(s): James Laurence Richardson

Many organizations use multiple analytics and BI products. Data and analytics leaders must review and work to reduce their portfolio of tools by identifying areas of redundant practice and technology overlap. This will free them to meet evolving analytics needs and realize economies of scale.

Key Challenges

- Many organizations are using multiple analytics and business intelligence (BI) technologies inefficiently.
- Few companies have a plan for retiring older analytics and BI technologies.
- The analytics and BI tool legacy that organizations carry can inhibit their ability to innovate with analytics.

Recommendations

To gain efficiencies in how they use tools within their analytics and BI strategies, data and analytics leaders should:

- Realign how their organization views analytics and BI tools by actively managing them as a portfolio.
- Scope the situation by conducting an initial assessment of the life cycle of analytics and BI tools currently in use.
- Define the future shape of their portfolio by undertaking a full audit of analytics and BI tools with all business stakeholders.
- Make streamlining the analytics and BI technology estate a repeatable exercise by establishing a portfolio review period.
- Drive change by setting prioritization rules in order to deliver the desired future portfolio.

Table of Contents

Introduction
Analysis
  Realign How Your Organization Views Analytics and BI Tools by Actively Managing Them as a Portfolio
  Conduct an Initial Assessment of the Life Cycle of Tools in Use
  Undertake a Full Audit of Your Tool Portfolio to Help Define Its Future Shape
    Approach 1: Apply Application Portfolio Management Techniques
    Approach 2: Use Analytics Blocks to Map Portfolio Tools to Outcomes
  Share the Results and Establish a Portfolio Review Period
  Set Prioritization Rules for the Desired Future Portfolio to Guide Portfolio Pruning
Gartner Recommended Reading

List of Tables
  Table 1. Initial Analytics and BI Portfolio Mapping
  Table 2. Some Common Analytics and BI Portfolio Issues
  Table 3. Example of Guiding Principles/Rules of Engagement

List of Figures
  Figure 1. Is Your Analytics and Business Intelligence Vendor an Enterprise Standard?
  Figure 2. Representative Components of a Modern Analytics and BI Architecture
  Figure 3. Boston Consulting Group (BCG) Growth-Share Matrix
  Figure 4. Example of an Analytic Capability Portfolio Map
  Figure 5. Technology Adoption S-Curves

Introduction

Gartner survey data shows that most large organizations have more than one "enterprise standard" BI tool, with 41% actively identifying that they have multiple standards in place (see Figure 1).[1]

Figure 1. Is Your Analytics and Business Intelligence Vendor an Enterprise Standard?

Number of respondents = 694; enterprises with more than 1,000 employees using vendors covered in the 2018 Magic Quadrant for analytics and BI platforms.
Source: Gartner (July 2018)

This is no surprise; for almost a decade, Gartner's advice has been that organizations make use of a set of analytics and BI tools. While things have changed during that time (notably in the advance and scope of the technologies that now comprise an analytics and BI platform), the reasons for, and benefits from, using a range of tools from different vendors remain consistent:

- No single vendor or tool offers everything at the same level of functionality. Although the level of differentiation between platforms is lessening in some areas (such as analytic dashboarding), no single vendor provides the best across all of the 15 capabilities that Gartner defines as critical in a modern analytics and BI platform (see "Critical Capabilities for Analytics and Business Intelligence Platforms"). As such, there is an ongoing need to use software from a variety of vendors in order to maximize the value returned from data by fully supporting a wide range of analytics needs. For a representation of the components of a modern analytics and BI architecture, see Figure 2.

- Departments have their own needs and pressing priorities that must be met in terms of using data to support decision making. Often, the operational applications they use bundle technologies for reporting or analysis purposes, or they require other analytics and BI tools to service particular analytic tasks.

- Innovation happens at different rates among vendors, and new capabilities may become available that are not supported by incumbent software providers. This means that space for newer, disruptive technologies needs to be made in order to stop the analytics offered from becoming outmoded.

- Also, enterprises rarely achieve a steady state. Merger and acquisition activity, for example, affects the analytics and BI environment as much as any other technology area, usually by bringing other tools into the wider organization.

Figure 2. Representative Components of a Modern Analytics and BI Architecture

BI = business intelligence; DB = database; DW = data warehouse; OT = operational technology; OLAP = online analytical processing
Source: Gartner (July 2018)

It is clear that using multiple analytics and BI technologies is both widespread and beneficial. However, based on inquiries with Gartner customers, it is also evident that the actions and processes required when running a portfolio are often not in place, or are informal or immature. This note addresses the main implications of using a technology portfolio that data and analytics leaders need to understand, and act on, to get the most value from their portfolio of tools.

Analysis

Realign How Your Organization Views Analytics and BI Tools by Actively Managing Them as a Portfolio

A portfolio is a set of managed products, and the key word here is "managed." Often, organizations (through accident rather than design) find themselves using and supporting a group of analytics and BI software products. The tools were added over time, with no explicit plan as to how they fit into the overall set or reference architecture. This is not a portfolio, just a collection of technologies, and it is characterized by overlapping functional capabilities, user confusion over which tool to use for what, and process inefficiency.

Organizations should aim to have a portfolio with the minimum viable elements to meet the needs expressed. In analytics and BI, less is more, but one technology is not enough, and managing a portfolio is as much about what's not included as what is. When managing a set of tools and capabilities:

- Portfolio elements are periodically assessed to check their value as a matter of routine, perhaps once a year.

- The life cycle of portfolio elements is monitored, with a view to minimizing the use of aging technologies and eventually retiring them.

While there are clear benefits to using multiple tools, there are also consequences that must be recognized and that necessitate ongoing portfolio management:

- Multiple skills in multiple tools need to be established and maintained in both the central team and among users. If moving from a centralized BI competency center (BICC) to a hybrid analytics center of excellence (ACE) organizational model in order to deliver more self-service, the requirement for training power users in more depth is further extended. This is because self-service users need to be more proficient than users who just consume BI outputs as reports or dashboards.

- Multiple data connections and metadata definitions have to be built, supported and managed. The work of connecting multiple analytics and BI tools to single sources, such as the enterprise data warehouse (EDW), or to diverse sources is multiplied by the number of tools used in the portfolio. Furthermore, as these platforms are proprietary (there is no industry standard for metadata), the task of populating and aligning definitions in different tools is significant and ongoing.

- Economies of scale are less realized. Every tool that's added reduces the likelihood of gaining benefits through the reuse of skills and shared approaches, as well as removing the savings that could be made from volume purchasing of licenses or subscriptions.

Further, ongoing demand from business users for new analytic outputs means it's all too easy to bloat a portfolio. Add too many tools, and the consequences may outweigh the benefits; so how do you know what is the right number? The answer depends on several variables, such as functional capability, team size, resourcing and internal politics. A useful rule of thumb is that functional overlap is to be avoided unless there's a clearly understood and stated reason to allow it.

Resourcing the necessary support for lots of tools may reduce the ability of the central analytics and BI team to innovate. Gartner commonly talks about a "run, grow, transform" division of activities in IT. A portfolio has to leave some space and time for transformative, lab-style experimental analytic activities. An overloaded portfolio will rarely foster innovation, because the vast majority, if not all, of the effort and budget will go into simply running the set of tools.

Data and analytics leaders need to maintain a balance between the benefits of a broad set of analytic capabilities and the consequences of using multiple technologies. To do this, they need to establish policies and processes to model the usage of the platforms that comprise their analytics and BI portfolio.

Conduct an Initial Assessment of the Life Cycle of Tools in Use

Many of you will be familiar with the Boston Consulting Group (BCG) matrix (the growth-share matrix; see Figure 3). This approach to product categorization is a useful analogy and can be applied to quickly assess analytics and BI tool portfolios. It has the additional benefit of prompting analytics and BI teams to think about their organization as having internal buyers (users), with them acting as sellers in an internal market for information products.

Figure 3. Boston Consulting Group (BCG) Growth-Share Matrix

The original version of this matrix was created by the Boston Consulting Group.
Source: Adapted from Boston Consulting Group

Begin with a simple tabular mapping exercise to document current portfolio elements. See Table 1 for a suggested model, and amend it to fit as needed.

Table 1. Initial Analytics and BI Portfolio Mapping

Columns in the mapping:

- Domain Category: either information portal, analytic workbench or data science lab (see Figure 2).
- Analytic Capability: analytic dashboards or reporting, for example (see Figure 2 for a list of common capabilities).
- Vendor/Product: name of the tool used.
- No. of Users: count of total users of the tool.
- No. of Artifacts: approximate count of reports, dashboards, etc. built using the tool.
- Usage Trajectory: either growing, static or declining.
- Derived BCG Category: high users/artifacts + growing usage = Star; low users/artifacts + growing usage = Question Mark; high users/artifacts + static usage = Cash Cow; low users/artifacts + static or declining usage = Dog.
- Overlapping Products: other installed products that deliver the same analytic capability.

Example row: Information Portal | KPI dashboard | Vendor X | 1,000 users across enterprise | 100 dashboards (five semantic models) | Static | Cash Cow | Vendor Y. Add rows as needed.

BCG = Boston Consulting Group
Source: Gartner (July 2018)

This exercise provides a quick assessment of the current situation. Completing this step can bring issues to the fore and should provide enough input for initial discussions with internal stakeholders

about the likely next steps toward rationalization. Table 2 shows some common issues in analytics and BI portfolios.

Table 2. Some Common Analytics and BI Portfolio Issues

- Issue: A constellation of Stars (fewer overlaps would deliver economies of scale). Impact: Multiple overlapping tools (that is, tools serving the same combination of domain category and analytic capability). Often a result of "land and expand" departmental purchasing of modern analytics and BI platforms (for example, companies using Qlik and Tableau and others, and seeing Microsoft Power BI appear).

- Issue: Greedy Cash Cows (replace with newer deliverables to future-proof). Impact: Commonly long-standing, on-premises enterprise reporting tools delivering core KPIs as highly formatted output, going back perhaps decades (often with hundreds, maybe thousands, of reports and dashboards). Widely used, but of little analytic value and consuming considerable IT resource or annual maintenance fees. Prone to risk of losing support.

- Issue: Too few (or no!) Question Marks (lab work needed). Impact: An inability to identify the niche emerging tools being tested or used points to a lack of innovation in analytics and BI, often as a result of portfolio overload in other areas, or a lack of connection with groups doing innovative work.

- Issue: Old Dogs that won't die (change management needed). Impact: Over time, Cash Cows can morph into remaining islands of older analytics and BI tooling usage. These will persist for a long time if programs are not in place to change behavior (see the section on setting prioritization rules below).

Source: Gartner (July 2018)

Undertake a Full Audit of Your Tool Portfolio to Help Define Its Future Shape

Build a fuller model using one of the two approaches outlined below. The aim is to educate stakeholders in the business value of the tools identified during the quick assessment, and to test the feasibility of retiring ageing or poorly used elements based on user adoption, cost to support and future plans.
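The quick-assessment logic above lends itself to a simple script. The sketch below derives the BCG quadrant from Table 1's rules and flags the functional overlaps highlighted in Table 2; the row fields, the 500-user high/low threshold, and the treatment of high-usage declining tools as Cash Cows are illustrative assumptions, not Gartner-defined values.

```python
from collections import defaultdict

# Hypothetical portfolio rows in the shape of Table 1:
# (domain category, analytic capability, product, users, trajectory)
rows = [
    ("Information Portal", "KPI dashboard", "Vendor X", 1000, "static"),
    ("Information Portal", "KPI dashboard", "Vendor Y", 150, "growing"),
    ("Data Science Lab", "Predictive modelling", "Vendor Z", 30, "growing"),
]

def bcg_category(users: int, trajectory: str, high: int = 500) -> str:
    """Apply Table 1's 'Derived BCG Category' rules.

    The high/low cutoff is an assumed threshold; adjust to your estate.
    """
    if trajectory == "growing":
        return "Star" if users >= high else "Question Mark"
    # static or declining usage (high + declining treated as Cash Cow here)
    return "Cash Cow" if users >= high else "Dog"

def overlaps(mapping):
    """Group by (domain, capability); more than one product = redundancy."""
    groups = defaultdict(list)
    for domain, capability, product, _, _ in mapping:
        groups[(domain, capability)].append(product)
    return {key: tools for key, tools in groups.items() if len(tools) > 1}

for domain, capability, product, users, trajectory in rows:
    print(product, "->", bcg_category(users, trajectory))
print(overlaps(rows))
```

Run against the example rows, Vendor X comes out as a Cash Cow, Vendors Y and Z as Question Marks, and the two KPI-dashboard tools are reported as an overlap, the "constellation of Stars" pattern from Table 2.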
Approach 1: Apply Application Portfolio Management Techniques

Use this approach if your organization is already using application portfolio management (APM) in other areas. There are well-established techniques for APM that can, and should, be applied to analytics as much as to any other type of application. In fact, Gartner research specifically calls out extending APM work to include apps falling into the analytical classification, which means "any application with a primary objective of capturing, storing and manipulating data for query and analysis." (For more details, see "Defining 'Application' for APM.") In the main, APM has not been routinely applied in the case of analytics and BI products, for two likely reasons:

- Many (but not all) analytic tasks are loosely coupled to business process, and are therefore less likely to be reviewed as part of process review and modelling.

- The majority of organizations have simply lacked a strategy for analytics and BI beyond delivering reports, dashboards and visualization tools to end users.

Central to applying APM is the conducting of application fitness assessments: answering the question, "How well does this application fit current and emerging business needs and technical requirements?" In the case of analytics and BI, the use case includes all the supported apps and is used to build a domain assessment, drawing on fitness and value reviews to assess the application portfolio. The outcome of this assessment is used as input for defining roadmaps and prioritizing application portfolio improvement initiatives.

An application fitness review evaluation process involves three steps:

1. Collect: Gather relevant fitness and performance information (including metrics, surveys, and expert/user anecdotes and opinions).

2. Assess and analyze: Evaluate collected and other relevant information for trends, known problems or issues, unexplained errors, and so on.

3. Rate: Assign a rating to each indicator.

These three evaluative steps are completed for each aspect of application performance (business, operational, technical and cost) and collectively determine an overall application health rating. Full details on how to conduct the process can be found in "How to Assess Your Current Application Portfolio Using Fitness and Value Review Processes" and in "Toolkit: Application Fitness and Value Review."

Approach 2: Use Analytics Blocks to Map Portfolio Tools to Outcomes

Analytics blocks are part of Gartner's analytic evolution framework (for detail on using this approach, see "Gartner Analytics Evolution Framework").
Analytic blocks are a practical way of understanding the set of people, processes and technologies needed to deliver a particular business outcome from analytics. This technique was intended to be used when starting new projects, but it can be retrospectively applied to existing, even long-standing, activities (such as reporting) as a means of understanding the relative life cycle stages of the tools in the portfolio. In this context (for portfolio review purposes), it would be a case of documenting the analytics blocks that represent the bulk of the outcomes delivered by analytics and BI, and adding a time dimension: for example, identifying how many people use the analytic capability (the technology) enabling the block, how old it is, and whether usage is increasing, flat or decreasing. The outcome would be a map like that shown in Figure 4.
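The time dimension described above can be captured in a few lines of code. In this sketch, each record names a block, its enabling tool, its age, its user count and its usage trajectory; the field names and the age/trajectory retirement heuristic are assumptions for illustration, not part of Gartner's framework.

```python
# Hypothetical analytics-block records with a time dimension added.
blocks = [
    {"block": "Enterprise KPI reporting", "tool": "Vendor X",
     "age_years": 12, "users": 1000, "trajectory": "static"},
    {"block": "IoT sensor analysis", "tool": "Vendor Z",
     "age_years": 1, "users": 40, "trajectory": "growing"},
]

def retirement_candidates(records, max_age: int = 10):
    """Flag long-standing blocks whose usage is no longer growing.

    The 10-year cutoff is an assumed policy value, not a Gartner figure.
    """
    return [r["block"] for r in records
            if r["age_years"] >= max_age and r["trajectory"] != "growing"]

print(retirement_candidates(blocks))  # ['Enterprise KPI reporting']
```

A summary like this is a starting point for the portfolio map in Figure 4, not a substitute for the stakeholder discussion it should prompt.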

Figure 4. Example of an Analytic Capability Portfolio Map

BA = business applications; EDW = enterprise data warehouse; IoT = Internet of Things; OLAP = online analytical processing
Source: Gartner (July 2018)

Furthermore, using other components of the evolution framework (analytics clusters and hubs) can identify dependencies and synergies within the portfolio.

Share the Results and Establish a Portfolio Review Period

Whichever approach is used to model the portfolio (APM, analytic blocks or another method), it needs to be fully collaborative and open, involving business and IT stakeholders. As such, it should be sanctioned by, and fall under the aegis of, the analytics and BI steering committee in its role of overseeing and governing BI usage. The outcome of the portfolio modelling activity must be documented and communicated to the core team and the self-service user community, so that they can understand which tools to use and when (working within the guidance given). This is critical for those users that will need to use multiple tools. The data and analytics steering group should establish an agreed schedule for conducting an update, in order to make portfolio analysis a repeatable exercise. This update cycle is usually tied to the timing of the strategy and planning cycle.

Set Prioritization Rules for the Desired Future Portfolio to Guide Portfolio Pruning

This is where pruning the portfolio is enacted. But in most cases, rather than the draconian removal of tools, it will be a case of weaning the organization off older technologies. This can be done through the gradual replacement of outputs in a new form that uses other tools in the portfolio. If not actively managed out, older analytics and BI tools will continue to be used, even to the point where they are no longer supported by the software publisher. The concept of an overlapping S-curve, where technologies become obsolete and are abandoned naturally, is at worst a myth and at best too slow to free up resources (see Figure 5).

Figure 5. Technology Adoption S-Curves

Source: Gartner (July 2018)

Where "long tail" usage persists, it may reflect tools that offer functional capabilities not offered by others in the portfolio. If this is the case, continued use is justified, and this will be clear in the portfolio modelling exercise (as outlined above). For the most part, though, this is not the case, and ongoing usage is driven by habitual familiarity and a lack of clarity and communication about the options available to users. It can also be the case that slow or absent migration of BI content from an older to a newer technology footprint means that users have no choice. Very often, the central analytics and BI team doesn't have any rules of engagement to guide its approach, so users continue to use older technology options, thereby exacerbating the problem of persistent usage.

The output of the portfolio review should be combined with the documented usage guidelines in order to steer usage toward the desired future state of the portfolio, taking account of where tools are in the life cycle. Data and analytics leaders should establish a set of prioritization rules to rebalance the portfolio in order to maintain the optimal mix of tools. An explicitly stated and simple set of rules is needed to drive change management. This prioritization is required to start actively reducing the use of products identified as overlapping (see examples in Table 3).

Table 3. Example of Guiding Principles/Rules of Engagement

Example 1:

1. Our aim is to drive the cost of reporting down and increase the value of analytics.
2. We will not develop new BI outputs in <reporting tool> if we can avoid it.
3. We will establish and teach self-service as an approach to analytics.
4. We will gradually replace <reporting tool> content with dashboards and data stories.

Example 2:

1. We currently use four modern analytics and BI platforms to deliver dashboards across different lines of business.
2. This is not cost-effective or scalable. As such, all new projects will use either <Product A> or <Product B>, based on their capabilities and more widespread adoption.
3. Existing dashboard content in <Product C> and <Product D> will be redeveloped over time.
4. Cross-training will be offered to migrate users.

Source: Gartner (July 2018)

Acronym Key and Glossary Terms

ACE = analytics center of excellence
APM = application portfolio management
BI = business intelligence
BICC = business intelligence competency center

Gartner Recommended Reading

Some documents may not be available as part of your current Gartner subscription.

- "How to Assess Your Current Application Portfolio Using Fitness and Value Review Processes"
- "Select the Right Business Intelligence and Analytics Tool for the Right User"
- "Gartner Analytics Evolution Framework"
- "Don't Waste Time on an 'Application' Definition; Start Identifying Application Boundaries Instead"

Evidence

[1] An online survey was developed and hosted by Gartner as part of its research for the 2018 Magic Quadrant for Analytics and Business Intelligence Platforms. Vendor-provided reference customers (end-user customers and OEMs) and respondents from the prior year's survey provided responses.

The survey was conducted from 8 September 2017 through 5 October 2017. The survey results used in this document derive from 1,526 responses.
