Machine-generated data: creating new opportunities for utilities, mobile and broadcast networks


APPLICATION BRIEF

Electronic devices generate data every millisecond they are in operation. This data is vast, complex and contains a wealth of useful information. Network-based service providers such as utility companies, mobile providers and broadcast networks are capturing, storing and analyzing machine-generated data to help them measure and improve customer experience, provide proactive support, predict and prevent service outages, and drive impactful product roadmaps.

For utility companies, sensor data from smart meters provides environmental information and corresponding resource usage for every point on the grid. For mobile providers, call detail records (CDRs) contain the details of each call or event that passes through a switch. As networks and devices continue generating more data at shorter intervals, and as user bases continue to grow, harnessing the volume, variety and velocity of this data presents a challenge.

Traditional Ways of Analyzing Customer Usage and Experience are Slow and Cumbersome

Network-based service providers have traditionally relied on two mechanisms to understand customer usage and to identify areas for improvement:

1. Direct customer feedback: Usually offered through routine surveys or client outreach for support, this data is valuable yet often skewed because participation is self-selected. Example: If a cable company receives several calls from clients enrolled in the family package who want to add a sports channel to their plan, the company might notice a trend and decide to offer both sports and family channels in a single, bundled package. But it may be missing other cross-sell opportunities if those clients haven't made calls to customer support.

2. Data collected from machines, typically on a weekly or monthly basis: Utilities and mobile or broadcast network providers must measure network usage, primarily to ensure correct billing. Example: Utility companies measure energy consumption at customer accounts by checking each meter on a monthly basis. However, if there was an error on the customer's device in Week One of the billing cycle, it might not be identified for another three weeks, resulting in the utility provider losing revenue because of an inability to accurately bill for energy consumption during the entirety of that month.

Application Summary
Improve customer experience and deliver better products to market by collecting and analyzing machine-generated data from customers' devices

Relevant Industries
Network-based service providers
Utilities
Mobile communications
Broadcast and cable

Key Challenges
Goal to capture and analyze streaming, machine-generated data
Expensive to store exponentially growing data volumes
Data sampling and aggregations required for analysis limit the complete view of the data

Key Benefits of Apache Hadoop
Schema on read enables real-time data loads of unstructured data
Commodity hardware delivers petabyte-scale storage at low cost
Fast, flexible analysis of massive unstructured data enables deeper insights

From a technology perspective, network-based service providers have traditionally relied largely on online analytical processing (OLAP) using relational database management systems (RDBMS) to collect and analyze this information. Today, operators increasingly face demands to capture more machine-generated information at a higher granularity and with greater frequency than before. This demand is driven by business needs to better understand system operation as well as customers' behaviors and actions. Herein lies the challenge: traditional data warehouse environments struggle to capture and analyze machine-generated data with such high volume, velocity and variety.

Common Challenges

The data generated by machines is vast, complex and comes in many varieties. As these data sets quickly expand into the petabyte range, often with billions of files each containing millions of records, extracting valuable information using traditional tools becomes practically impossible. Challenges typically center around four key areas:

1. Ingestion of large data volumes leads to input/output (I/O) bottlenecks that affect service level agreements (SLAs); the data generated outweighs the system's ability to accommodate it.
2. Storage and protection of Big Data is expensive.
3. Processing and analysis of large, complex data sets is difficult, forcing trade-offs between data set size and detail of analysis.
4. Integration of this data into business processes requires constant upkeep of data formats and schemas within the requirements of operational SLAs.

Leveraging Hadoop to Ingest, Store and Analyze Data

Ingest and store more data, more affordably

Apache Hadoop is a linearly scalable, grid-based data management platform designed to run on commodity hardware. Every node added to the Hadoop cluster increases storage capacity, network bandwidth and processing power. Huge volumes of data can be ingested and stored more quickly than with traditional storage area network (SAN) and network attached storage (NAS) solutions, which require data to be funneled through a single system head. Also, scaling into the petabyte range with Hadoop is cost effective because the platform is open source and clusters can be built on low-cost servers. Hadoop's cost-efficient scalability is especially valuable to network-based service providers attempting to capture their machine-generated data, which is especially voluminous and must be ingested in near real time.

Perform deeper, more flexible analytics on larger data sets

Hadoop was designed to manage massive quantities of complex data, eliminating trade-offs between the size of the data set and time-to-answer during analysis. Because Hadoop is built on a highly scalable and flexible file system, any type of data can be loaded without altering its format, preserving data integrity and delivering complete analytic flexibility. Hadoop implements a schema-on-read approach, allowing the context for the data to be set when the question is asked. Data no longer needs to be transferred using time-consuming extract, transform and load (ETL) processes from storage over a network to the database or analytic platform where computation takes place. Any type of data can be mined in its original format and combined with other data types to paint a more comprehensive picture. And the amount of time required to perform deep analytics and mass transformations on very large quantities of complex data is dramatically reduced, for example from 6-10 hours down to 5-10 minutes.
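To make the "load first, apply the schema later" idea concrete, here is a minimal sketch, not taken from the brief, that lands a day of raw CDR files in HDFS unchanged and leaves any parsing to the jobs that later read them. The directory paths and class name are hypothetical.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Illustrative only: copies a day's raw CDR files into HDFS as-is.
    // No schema is applied at load time; jobs parse the records when they read them.
    public class RawCdrLoader {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();               // picks up core-site.xml / hdfs-site.xml
            FileSystem fs = FileSystem.get(conf);

            Path localCdrs = new Path("/var/spool/cdrs/2012-06-01/");   // hypothetical source directory
            Path hdfsLanding = new Path("/data/raw/cdrs/2012-06-01/");  // hypothetical landing directory

            fs.mkdirs(hdfsLanding);
            fs.copyFromLocalFile(false, true, localCdrs, hdfsLanding);  // keep source, overwrite target
            System.out.println("Raw CDRs landed in " + hdfsLanding);
        }
    }

Because nothing is transformed at load time, the same raw files can later be read by MapReduce, Hive or any other engine with whatever schema the question at hand requires.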
As a result of this flexibility, users can get more reliable results by running queries against comprehensive data sets, rather than relying on sampling or on aggregations of a limited window of historical data.

In Summary

Hadoop allows network-based service providers to capture, store and analyze all of their data, both machine-generated and otherwise, for more flexible, insightful and ad hoc analysis at petabyte scale.

Success Story

The Customer: Opower, a customer engagement platform for the utility industry

Challenge: Needed to capture, store, manage and analyze ever-increasing utility data streams from large smart meter deployments, receiving terabytes of advanced metering infrastructure (AMI) data along with data generated from smart appliances, interactive user applications, sensors, and social media

Solution: Deployed the CDH platform of Apache Hadoop, HBase, Hive, and Sqoop to store, query and transform all time series and social data

Results: Analysts and product managers have gained a 360-degree view into customer energy usage patterns, facilitating proactive customer support. Utility providers can now foster better end user relationships by offering advice and feedback based on individual usage patterns.

2012 Cloudera, Inc. All Rights Reserved. Cloudera and the Cloudera logo are trademarks or registered trademarks of Cloudera Inc. in the USA and other countries. All other trademarks are the property of their respective owners.

Effectively Leveraging the Power of Apache Hadoop: Shuffle and Snappy

Shuffle 101

Many of Hadoop's core mechanics come into play in the Shuffle phase. The Shuffle phase is the intermediary step between Map and Reduce and often raises many questions. Several parameters can be adjusted to help make this step run more smoothly.

When a MapReduce job is run, the Mapper generates key/value pairs, but what actually happens on disk? This is a simplified explanation of what happens while the mapper is operating:

Key/value pairs are first written to a buffer whose size is set by io.sort.mb. This buffer governs the intermediate performance of a job and is typically where the largest delays in a MapReduce job are encountered. When the buffer fills beyond the threshold set by io.sort.spill.pct, a spill thread begins writing the buffered keys and values to disk. If the buffer fills completely before the spill is finished, the mapper blocks until the spill completes. The spill is deemed complete when the buffer has been completely flushed. After this, the mapper continues to fill the buffer until another spill begins, and this loop continues until the mapper has emitted all of its key/value pairs. Setting a larger value for io.sort.mb allows more key/value pairs to fit in memory, yielding fewer spills. Additionally, raising io.sort.spill.pct gives the buffer a higher fill tolerance before spilling begins, resulting in fewer blocked mappers.

How much room is required for accounting information? To reduce spills caused by accounting information, the parameter io.sort.record.percent should be addressed. The amount of room required for accounting information is a function of the number of records, not of the record size; a higher number of records therefore needs more accounting room to avoid spills. This formula can help determine the fraction of the buffer to reserve: 16/(16 + R), where R is the average record size in bytes. Example: if the average map output record is 16 bytes, 16/(16 + 16) = 0.50. MAPREDUCE-64 now enables a job to derive the percentage automatically by using all available memory (up to io.sort.mb) for either the data or the accounting information; it can still be set manually per job.

What happens when your cluster is required to merge multiple spill files into a single output file for the reducer? One very important parameter to keep in mind when running larger MapReduce jobs, whether over large data sets or on a large cluster, is io.sort.factor. When there are more spill files than can be merged in a single pass, multiple merge rounds are needed, effectively making several iterations over the intermediate data set and invoking the combiner each time. If io.sort.factor is increased (depending on the size of the cluster and job), the number of merge rounds required to produce the reducer input can be decreased. This cuts the number of spills and the number of times the combiner is called, ideally resulting in only one full pass through the data set. So io.sort.factor is very important! Note: io.sort.factor defaults to 10, which leads to too many spills and merges when an organization begins to scale; it can be increased to 100 or more on clusters of more than 50 nodes or when processing extremely large data sets.

When it comes down to getting maximum performance out of MapReduce jobs that are crunching extreme data sets or operating on a large cluster, enabling the Shuffle phase to run more efficiently will vastly improve overall performance.
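For illustration only, here is one way these knobs might be set for a single job. This is a sketch rather than a recommendation: the values are placeholders to be tuned per cluster and job, and the spill threshold is shown under the name io.sort.spill.percent, which is how it commonly appears in Hadoop 1.x configurations.

    import org.apache.hadoop.conf.Configuration;

    // Hypothetical per-job shuffle tuning for an MR1-era cluster; the values are examples only.
    public class ShuffleTuning {
        public static Configuration tunedConf() {
            Configuration conf = new Configuration();
            conf.setInt("io.sort.mb", 256);                  // larger map-side sort buffer -> fewer spills
            conf.set("io.sort.spill.percent", "0.90");       // let the buffer fill further before spilling starts
            conf.setInt("io.sort.factor", 100);              // merge more spill files per pass on large jobs
            conf.setFloat("io.sort.record.percent", 0.16f);  // accounting space, sized via 16/(16 + R)
            return conf;
        }
    }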

Snappy Compression

The Hadoop community often counteracts slow disk performance by leveraging a compression library. This can reduce the size of the data footprint on disk, yielding faster reads. But there is an I/O versus CPU trade-off: compression consumes CPU cycles. While compression reduces the storage footprint, the key is being able to leverage compressed files in computations. The traditional compression library deployed in Hadoop is LZO; however, the system administrator had to address the challenges of managing the compression libraries and other quirks that would result from running LZO on Hadoop. Then came Snappy, an open-source compression library that compresses files at around 250 MB/s. Snappy keeps compression speed at a premium, with a small sacrifice in compressed storage size compared to other compression algorithms.

Snappy was a major addition to Cloudera's Distribution for Hadoop (CDH) because of its outstanding performance and its ability to work with distributed file systems. Any Hadoop user understands that additions to the Hadoop stack can complicate things; for example, LZO can be very difficult to manage when added to a Hadoop stack. Snappy, however, is extremely easy to use with Hadoop. Hadoop users now have the ability to leverage an extremely fast compression algorithm to store and process their data, as evidenced in this comparison:

Compression   Original File Size   Compressed Output   Compression Pct.   Compression Speed   Decompression Speed
Snappy        15.2MB               9.09MB              59.8%              -                   354.1 MB/s
ZLIB          15.2MB               5.44MB              35.8%              12.5 MB/s           -
LZO           15.2MB               8.27MB              54.4%              -                   -
LibLZF        15.2MB               8.29MB              54.6%              -                   -
QuickLZ       15.2MB               8.34MB              54.9%              92.9 MB/s           67.9 MB/s
FastLZ        15.2MB               8.54MB              56.2%              51.3 MB/s           -

Full comparison of Snappy vs. LZO, ZLIB, QuickLZ, FastLZ, & LibLZF.

To use Snappy with a specific Hadoop job, you just add the compression codec settings to the job configuration:

    ...
    Configuration conf = new Configuration();
    // Compress map output
    conf.set("mapred.compress.map.output", "true");
    conf.set("mapred.map.output.compression.codec", "org.apache.hadoop.io.compress.SnappyCodec");
    // Compress MapReduce (job) output
    conf.set("mapred.output.compress", "true");
    conf.set("mapred.output.compression.codec", "org.apache.hadoop.io.compress.SnappyCodec");
    ...

Turning a Hadoop cluster into a high-performance distributed computing machine becomes much easier with Snappy compression, and for those using CDH3u1 or higher, the installation is already done!
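A related usage sketch, not taken from the brief: Snappy is commonly paired with block-compressed SequenceFiles so that job output stays splittable (a plain Snappy-compressed text file is not). Assuming the newer org.apache.hadoop.mapreduce API, the configuration might look like this:

    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.compress.SnappyCodec;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

    // Sketch: write a job's output as block-compressed SequenceFiles using the Snappy codec.
    public class SnappySequenceFileOutput {
        public static void configure(Job job) {
            job.setOutputFormatClass(SequenceFileOutputFormat.class);
            FileOutputFormat.setCompressOutput(job, true);
            FileOutputFormat.setOutputCompressorClass(job, SnappyCodec.class);
            SequenceFileOutputFormat.setOutputCompressionType(job, SequenceFile.CompressionType.BLOCK);
        }
    }

Block compression groups many records per compressed block, which keeps the output both compact and readable in parallel by downstream jobs.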

Ensure Network Performance With Apache Hadoop

Companies across the globe rely heavily on their networks for ensuring business continuity, performance and expansion. IT departments must ensure that the network stays accessible and secure. Companies are using Hadoop to detect threats or fraudulent activity and identify potential trouble areas. This brief will highlight how real companies are using Hadoop to ensure network performance is maintained at all times.

Analyzing Network Data to Predict Failure

Utilities run big, expensive and complicated systems to generate power. Monitoring the health of the entire grid requires the capture and analysis of data from every utility, and even from every generator, in the grid. A power company built a Hadoop cluster to capture and store the data streaming off of all of the sensors in the network. As a result, the power company can see and react to long-term trends and emerging problems in the grid that are not apparent in the instantaneous performance of any particular generator. Combining all of that data into a single repository and analyzing it together can help IT organizations better understand their infrastructure and improve efficiencies across the network. A simplified sketch of this kind of per-generator trend analysis follows at the end of this page.

Hadoop is a powerful platform for dealing with fraudulent and criminal activity. It is flexible enough to store all of the data that matters: message content, relationships among people and computers, and patterns of activity. It is powerful enough to run sophisticated detection and prevention algorithms and to create complex models from historical data to monitor real-time activity.

Threat Analysis

A global developer of software and services to protect against computer viruses has amassed an enormous library of malware indexed by virus signatures. The vendor uses MapReduce to compare instances of malware to one another and to build higher-level models of the threats that the different pieces of malware pose. The ability to examine all the data comprehensively allows the company to build more robust tools for detecting known and emerging threats.
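The following is a minimal sketch of the per-generator trend analysis referenced above, not the power company's actual code. It assumes a hypothetical comma-separated sensor feed of the form generatorId,timestamp,temperature and computes an average reading per generator over whatever input window it is run against:

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.DoubleWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Sketch: average sensor reading per generator from raw lines of the
    // (hypothetical) form "generatorId,timestamp,temperature".
    public class GeneratorAverage {

        public static class SensorMapper extends Mapper<LongWritable, Text, Text, DoubleWritable> {
            @Override
            protected void map(LongWritable offset, Text line, Context ctx)
                    throws IOException, InterruptedException {
                String[] fields = line.toString().split(",");
                if (fields.length < 3) return;                 // skip malformed records
                try {
                    double reading = Double.parseDouble(fields[2]);
                    ctx.write(new Text(fields[0]), new DoubleWritable(reading));
                } catch (NumberFormatException ignored) {
                    // skip header lines or corrupt readings
                }
            }
        }

        public static class AverageReducer extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
            @Override
            protected void reduce(Text generator, Iterable<DoubleWritable> readings, Context ctx)
                    throws IOException, InterruptedException {
                double sum = 0;
                long count = 0;
                for (DoubleWritable r : readings) { sum += r.get(); count++; }
                ctx.write(generator, new DoubleWritable(sum / count));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "generator-average");
            job.setJarByClass(GeneratorAverage.class);
            job.setMapperClass(SensorMapper.class);
            job.setReducerClass(AverageReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(DoubleWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Run over a rolling window of input directories, the same job surfaces the long-term drift that is invisible in any single instantaneous reading.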

Threat Analysis (cont.)

Online retailers are particularly vulnerable to fraud and theft. Many use web logs to monitor user behavior on the site. By tracking that activity, tracking IP addresses and using knowledge of the location of individual visitors, these sites are able to recognize and prevent fraudulent activity.

Why Cloudera

Cloudera is the leading provider of Hadoop-based software and services. Our open source software offering, Cloudera's Distribution for Apache Hadoop (CDH), is the industry's most popular means of deploying Hadoop. CDH is a platform for data management that combines the leading Hadoop software and related projects and provides them as an integrated whole with common packaging, patching and documentation. Cloudera's professional services team is experienced at delivering high-value services to thousands of users, supporting hundreds of implementations over a range of industries. Our customers use Cloudera's products and services to store, manage, and analyze data on large Hadoop implementations.

Increase Revenue With Apache Hadoop

Large and successful companies are using Hadoop to perform powerful analyses of the data they collect. With the ability to store any kind of data from any source, inexpensively and at very large scale, Hadoop makes it possible to conduct the types of analysis that would be impossible or impractical using any other database or data warehouse. Hadoop lowers costs and extracts more value from data. Companies looking to increase revenue opportunities can use Hadoop to analyze a wide variety of data to solve several business challenges, including determining how customers are lost, predicting customer preferences, targeting ads more effectively, and creating better promotional offers. This brief will highlight how real companies are using Hadoop to solve these challenges.

Customer Churn Analysis

A large mobile carrier needed to analyze multiple data sources to understand how and why customers decided to terminate their service contracts. What issues were important, and how could the provider improve satisfaction and retain customers? The company used Hadoop to combine traditional transactional and event data with social network data. By examining call logs to see who spoke with whom, creating a graph of that social network, and analyzing it, the company was able to show that if people in a customer's social network were to leave, then the customer was more likely to depart, too. Combining data in this way gave the provider a much better measure of the risk that a customer would leave and improved planning for new products and network investments to improve customer satisfaction. A simplified sketch of this kind of call-graph construction appears at the end of this page.

Recommendation Engine

A leading online dating service has to measure compatibility between individual members so that it can suggest good matches for potential relationships. The company combined survey information with demographic and web activity data to build a comprehensive picture of its customers. The data included a mix of complex and structured information, and the analytical system had to evolve continually to provide better recommendations over time. Hadoop allowed the company to incorporate more data over time, improving the compatibility score that customers see. Hadoop's built-in parallelism and incremental scalability mean that the company can size its system to meet the needs of its customer base, and that it can grow easily as new customers join.

Hadoop makes an exceptional staging area for an enterprise data warehouse. It provides a place for users to capture and store new data sets, or data sets that have not yet been placed in the enterprise data warehouse. Hadoop can store all types of data and makes it easy for analysts to pose questions, develop hypotheses and explore the data for meaningful relationships and value.
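The call-graph step in the churn analysis above can be pictured with a minimal sketch like the one below. It is illustrative only: the CDR layout (callerNumber,calleeNumber as the first two fields) is a hypothetical assumption, and a real deployment would add filtering, normalization and time windows.

    import java.io.IOException;
    import java.util.HashSet;
    import java.util.Set;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Sketch: build a caller -> distinct-contacts adjacency list from CDR lines
    // assumed (hypothetically) to start with "callerNumber,calleeNumber,...".
    public class CallGraphBuilder {

        public static class EdgeMapper extends Mapper<LongWritable, Text, Text, Text> {
            @Override
            protected void map(LongWritable offset, Text line, Context ctx)
                    throws IOException, InterruptedException {
                String[] f = line.toString().split(",");
                if (f.length < 2) return;                    // skip malformed records
                ctx.write(new Text(f[0]), new Text(f[1]));   // one edge per call record
            }
        }

        public static class AdjacencyReducer extends Reducer<Text, Text, Text, Text> {
            @Override
            protected void reduce(Text caller, Iterable<Text> callees, Context ctx)
                    throws IOException, InterruptedException {
                Set<String> contacts = new HashSet<>();
                for (Text callee : callees) contacts.add(callee.toString());
                ctx.write(caller, new Text(String.join(",", contacts)));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "call-graph");
            job.setJarByClass(CallGraphBuilder.class);
            job.setMapperClass(EdgeMapper.class);
            job.setReducerClass(AdjacencyReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The resulting adjacency lists are the raw material for the kind of social-graph churn scoring the carrier performed.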

Recommendation Engine (cont.)

A large content publisher and aggregator uses Hadoop to determine the most relevant content for each visitor. Many online retailers, and even manufacturers, rely on Hadoop to store and digest user purchase behavior and to produce recommendations for products that a visitor might buy. In each of these instances, Hadoop combines log, transaction and other data to produce recommendations.

Ad Targeting

Leading advertising networks select the ads best suited to a particular visitor. Ad targeting systems must understand user preference and behavior, estimate how interested a given user will be, and choose the ad that maximizes revenue. Optimization requires examining both the relevance of a given advertisement to a particular user and the collection of bids by different advertisers who want to reach that visitor. One advertising exchange uses Hadoop to collect and analyze the stream of user activity coming off of its servers. Business analysts at the exchange are able to see reports on the performance of individual ads and adjust the system to improve relevance and increase revenues immediately. A second exchange builds sophisticated models of user behavior in order to choose the right ad for a given visitor in real time. By steadily refining those models, Hadoop helps the exchange deliver much better targeted advertisements.

Point of Sale Transaction Analysis

A large retailer doing Point-of-Sale (PoS) transaction analysis needed to combine large quantities of PoS transaction data with new and interesting data sources to forecast demand and improve the return it got on its promotional campaigns. The retailer built analytic applications on Hive, the SQL system for Hadoop, to perform the same analysis that it had done on its data warehouse system, but over much larger quantities of data and at much lower cost.
