Alliance Data


2 Speaker Biography
Brock Dietrich has been at Alliance Data for 19 years, with his first 10 years in the Credit Risk Analytics organization before moving into a variety of Information Technology roles. This unique background has allowed him to synthesize business needs with technical capabilities to deliver trusted solutions for the organization.
Renee Fronk has been at Alliance Data for ten years as an IT technical application manager. She supports MicroStrategy, SAS, Microsoft BI, and Tableau. She has a total of 14 years of experience using MicroStrategy across three different companies, beginning with MicroStrategy 7.

3 Who is Alliance Data?
Marketing and loyalty experts. As providers of branded credit cards for top companies, it's easy to assume we're simply a bank, but the truth is, we're so much more than that. Focused on growing our partners' sales (rather than building our own portfolio), our team of experts is on the cutting edge of marketing, loyalty, and data-driven insights, making us relationship builders who drive results and create connections. That's deliberately different, and that's exactly who we are.

4 Agenda
- Problem Statement
- Scope
- Committee Makeup
- Interviews / Requirements
- Numeric Ranking
- Evaluation Scorecard / Vendor Demonstrations
- Total Cost of Ownership
- Proof of Concept / Final Selection

5 Problem Statement
- Excessive resources needed to support multiple tools: MicroStrategy, Tableau, SQL Server BI, Cognos, and Business Objects
- Inability to share insights across business units
- Lack of governance and consistency of key metrics
- Lack of training / best practices

6 Capability Scope

7 Vendor Scope
Limited to the 26 vendor platforms covered in the 2017 Gartner Magic Quadrant for Business Intelligence and Analytics Platforms report.

8 Committee Makeup
- The Enterprise Data Governance council (10 members) created a sub-committee to address the problem
- Each member nominated a delegate from their business unit
- Open committee: 25 members in total, with the 10 delegates holding voting rights

9 Interviews
Producers
- Interviewed 130 subject matter experts
- Probed for use cases, data sources, tools, and processes
- Forward-looking topics, such as wish-list items and features not yet available
Consumers
- Focus groups consisting of approximately 50 managers and directors
- Business decisions being made
- Timeliness and trustworthiness of their data
- Desired functionality

10 Requirements
- Ability to provide enterprise data models with agreed-upon metric calculations (one version of the truth)
- Ability for end users to ingest their own data and create content without IT involvement (data discovery)
- Process for migrating user-created data from a personal space to a shared community space, and eventually to an enterprise IT-managed space
- Governance and security paramount due to a heavily regulated industry
The overarching theme from the interviews was that a governed self-service solution was required.

11 Example Use Case
- Credit applications are the lifeblood of our business: 150+ client relationships
- Client relationship managers closely monitor the approval rate metric
- They receive an alert when the approval rate falls outside one standard deviation of the mean (a minimal sketch of this rule follows below)
- Ability to view historical trends on a mobile device and drill into attributes to quickly identify the cause of variance
- Allows the relationship manager to engage the right resources to resolve a potential issue
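The alert rule above can be made concrete with a minimal sketch, assuming daily approval rates per client relationship and a simple population standard deviation; the function name and sample figures are illustrative, not the production implementation.

```python
from statistics import mean, pstdev

def approval_rate_alert(history: list[float], current: float) -> bool:
    """Return True when the current approval rate falls outside one
    standard deviation of the historical mean (illustrative rule only)."""
    mu = mean(history)
    sigma = pstdev(history)  # population std dev; the real metric definition may differ
    return abs(current - mu) > sigma

# Hypothetical daily approval rates for one client relationship
history = [0.72, 0.74, 0.71, 0.73, 0.75, 0.72, 0.74]
print(approval_rate_alert(history, current=0.62))  # True: alert the relationship manager
```

In the described workflow, such an alert drives a mobile notification, and the manager then drills into attributes to identify the cause of the variance.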

12 Critical Capabilities
Aligned use cases to the March 2017 Gartner Critical Capabilities for Business Intelligence and Analytics Platforms report:
- Admin, Security and Architecture
- Data Source Connectivity
- Cloud BI
- Self-Contained ETL and Data Storage
- Self-Service Data Preparation
- Metadata Management
- Embedded Advanced Analytics
- Smart Data Discovery
- Interactive Visual Exploration
- Analytic Dashboards
- Mobile Exploration and Authoring
- Embed Analytic Content
- Publish, Share and Collaborate
- Platform and Workflow Integration
- Ease of Use and Visual Appeal

13 Numeric Ranking

14 Critical Capabilities Weighting
- 8% Admin, Security and Architecture
- 15% Data Source Connectivity
- 1% Cloud BI
- 4% Self-Contained ETL and Data Storage
- 7% Self-Service Data Preparation
- 10% Metadata Management
- 4% Embedded Advanced Analytics
- 2% Smart Data Discovery
- 6% Interactive Visual Exploration
- 5% Analytic Dashboards
- 4% Mobile Exploration and Authoring
- 5% Embed Analytic Content
- 15% Publish, Share and Collaborate
- 4% Platform and Workflow Integration
- 10% Ease of Use and Visual Appeal
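To show how these weights (which sum to 100%) roll up into the numeric ranking, here is a minimal sketch under the assumption that each capability is rated on a simple numeric scale; the example ratings are hypothetical, since the deck does not publish per-vendor capability scores.

```python
# Critical-capability weights from the slide above (they sum to 100%).
WEIGHTS = {
    "Admin, Security and Architecture": 0.08,
    "Data Source Connectivity": 0.15,
    "Cloud BI": 0.01,
    "Self-Contained ETL and Data Storage": 0.04,
    "Self-Service Data Preparation": 0.07,
    "Metadata Management": 0.10,
    "Embedded Advanced Analytics": 0.04,
    "Smart Data Discovery": 0.02,
    "Interactive Visual Exploration": 0.06,
    "Analytic Dashboards": 0.05,
    "Mobile Exploration and Authoring": 0.04,
    "Embed Analytic Content": 0.05,
    "Publish, Share and Collaborate": 0.15,
    "Platform and Workflow Integration": 0.04,
    "Ease of Use and Visual Appeal": 0.10,
}

def weighted_score(ratings: dict[str, float]) -> float:
    """Weighted sum of per-capability ratings (e.g. on a 1-5 scale)."""
    return sum(WEIGHTS[cap] * rating for cap, rating in ratings.items())

# Hypothetical ratings for one vendor; the real scores came from the committee's evaluation.
example = {cap: 3.5 for cap in WEIGHTS}
print(round(weighted_score(example), 2))  # 3.5 when every capability is rated 3.5
```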

15 Elimination Process
The sub-committee considered the numeric ranking and the following criteria:
1. Tools for which ADS has already made significant investments in BI asset development (MicroStrategy, Tableau, Microsoft, SAS, and Oracle)
2. New tools that were highly ranked in the Gartner MQ (ClearStory Data, TIBCO Software, Logi Analytics, Birst, Pyramid Analytics, and Information Builders)
3. Substantial enterprise platform market presence (Qlik)
We selected six finalists to conduct an in-person demonstration of their capabilities: MicroStrategy, ClearStory Data, TIBCO Software, Tableau, Microsoft, and Qlik.

16 Vendor Demo
One hour to execute a standard demo script demonstrating the steps from raw data ingestion to final output:
- Connect to the Northwind database
- Create a sales report dashboard: KPI value with trend, vertical bar chart, global map, data table (see the query sketch below)
- Share the report
- End user views the report
- Schedule refresh / distribution
- View on a mobile device
- Answer a business question
- Ad-hoc question
- Free-form deep dive highlighting unique capabilities of their platform
- Non-functional requirements, licensing and user models, training, support, administration, and infrastructure
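As context for the dashboard step, the sketch below computes the sort of monthly sales KPI-with-trend series that each vendor was asked to visualize, assuming a local SQLite copy of the public Northwind sample (the file name northwind.db and ISO-formatted OrderDate values are assumptions); during the demos, each vendor built the equivalent inside their own tool.

```python
import sqlite3

# Assumes a local SQLite export of the public Northwind sample database.
conn = sqlite3.connect("northwind.db")

# Monthly net sales: the KPI-with-trend series behind the dashboard's bar chart.
query = """
SELECT strftime('%Y-%m', o.OrderDate) AS month,
       ROUND(SUM(d.UnitPrice * d.Quantity * (1 - d.Discount)), 2) AS net_sales
FROM Orders o
JOIN "Order Details" d ON d.OrderID = o.OrderID
GROUP BY month
ORDER BY month
"""

for month, net_sales in conn.execute(query):
    print(month, net_sales)
conn.close()
```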

17 Demo Scorecard
During the demonstrations, every sub-committee member completed a scorecard for each vendor, rating it 1-10 on a set of summary categories. The detailed interview use case requirements, aligned to the critical capabilities, had been collapsed into 8 summary categories.

18 Final Vendor Scorecard
Following the vendor demonstrations, the group met to agree upon a final score in each category for each vendor; delegates voted on the score. Based on the clustering of scorecard results, the sub-committee narrowed the list to 3 vendors: MicroStrategy, Qlik, and TIBCO.

19 5 Year TCO
Entered into detailed pricing negotiations with the 3 vendors to calculate a 5-year TCO. The overall score was weighted 60% functionality score and 40% cost score.
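A worked example of the 60/40 blend follows, with made-up finalist names, dollar figures, and functionality scores; the cost normalization (cheapest vendor scores 100) is an assumption, since the slide only states the weighting.

```python
# Hypothetical finalists: (functionality score out of 100, 5-year TCO in dollars).
vendors = {
    "Vendor A": (88, 2_400_000),
    "Vendor B": (80, 1_800_000),
    "Vendor C": (75, 1_500_000),
}

cheapest = min(tco for _, tco in vendors.values())

for name, (functionality, tco) in vendors.items():
    cost_score = 100 * cheapest / tco          # cheapest vendor scores 100 (assumed normalization)
    overall = 0.6 * functionality + 0.4 * cost_score
    print(f"{name}: overall = {overall:.1f}")
```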

20 Scoring Results
Matrixed the functionality scorecard against the 5-year TCO. Selected Qlik to conduct a POC, hands-on in our environment with our data.

21 Proof of Concept Approach
- Install vendor software in our environment
- Document test cases
- Vendor conducts training with users
- Connect the vendor solution to the required data sources
- Business users execute the test cases
- Show-stopper failures determine whether an additional POC is necessary

22 Example Test Case
- Scorecard Category: Reporting
- Scorecard Item: Schedule with data event triggers
- Use Case: Schedule/distribute daily field reporting
- Functionality Requirement: Ability to send internal and external subscriptions and export to Excel/PDF
- Criticality (scale): Critical, Useful, Nice to Have
- Ease (scale): Easier, Same, More Difficult
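For illustration, a test case like the one above could be captured as a small record using the criticality and ease scales from the slide; the class and field names are hypothetical, and the team's actual tracking artifact is not described in the deck.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Criticality(Enum):
    CRITICAL = "Critical"
    USEFUL = "Useful"
    NICE_TO_HAVE = "Nice to Have"

class Ease(Enum):
    EASIER = "Easier"
    SAME = "Same"
    MORE_DIFFICULT = "More Difficult"

@dataclass
class TestCase:
    scorecard_category: str
    scorecard_item: str
    use_case: str
    functionality_requirement: str
    criticality: Criticality
    ease: Optional[Ease] = None     # rated by the business user after executing the test
    passed: Optional[bool] = None   # show-stopper failures would trigger a follow-up POC

example = TestCase(
    scorecard_category="Reporting",
    scorecard_item="Schedule with data event triggers",
    use_case="Schedule/distribute daily field reporting",
    functionality_requirement="Send internal and external subscriptions and export to Excel/PDF",
    criticality=Criticality.CRITICAL,
)
```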

23 Results
Integration between Qlik Sense and Qlik NPrinting was insufficient to accomplish the governed self-service requirement, so we executed a 2nd POC with MicroStrategy:
- Governance through the schema layer
- Dossier collaboration
- Ability to promote a Dossier to a Document
- Fine-grained security controls allow for a governance lifecycle (personal, community, enterprise)
- Mobile capabilities including white label, responsive design, and embedding

24 Final Recommendation
MicroStrategy was selected due to its ability to accomplish all of our POC test cases, and it provided an effective platform to implement a governed self-service model.
Technology: Infrastructure design, Build, Software install and configuration, Metadata upgrade, User security setup
Process: Contract, Center of Excellence, Communication plan, Define policies, Training curriculum
People: Roles and responsibilities, Change management, User adoption, User groups, Training

25 Five key takeaways
- Structured process (transparent, fair, leverages industry knowledge)
- Include all stakeholders (especially vocal critics)
- Interview the business (clearly defined requirements, scorecard)
- Define a scripted vendor demonstration (compare/contrast build effort and consumption experience)
- Conduct POCs in your environment (real business data and real use cases)

26 Questions?