Thursday, May 15, 2014

10 things you shouldn't expect big data to do (TechRepublic)

By Mary E. Shacklett | May 12, 2014, 11:36 AM PST

Many business leaders have embraced big data initiatives expecting miracles, only to discover that big data introduces new complexities -- and that reaping the benefits requires a lot more effort than they anticipated. 

Every organization pursues big data with high hopes that it can answer long-standing business questions that will make the company more competitive in its markets and better at delivering its products and services. Yet in the midst of this enthusiasm, it's easy to build false expectations for big data -- benefits that will never materialize unless you give it the right amount of "help." Here are 10 key things that big data in itself won't do for you unless you take the right steps to optimize its value.

1: Solve your business problems

Big data doesn't solve business problems. People do. Only those organizations that sit down and decide what they want to get out of their big data before they start working with it are going to reap the calibre of business intelligence they're looking for.

2: Help your data management

IBM claims that 2.5 quintillion bytes of data are being generated daily. Most of this is big data. Unsurprisingly, the amount of data under management in companies around the world has grown exponentially, too. As the data piles up without clear-cut data retention and usage policies (especially for big data), organizations are struggling to manage it.

3: Ease your security worries

For many companies, determining security access for big data is still an open item. This is because security practices for big data aren't as defined as they are for data that belongs to systems of record. We are at a point where IT should be working with end users to determine who gets access to which levels of big data and its corresponding analytics.

4: Address critical IT skill areas

Big data database management, server management, software development, and business analysis skills are all in short supply, and these new demands compound a shortage of core IT skills that many IT departments already face.

5: Diminish the value of legacy systems

If anything, legacy systems of record are more valuable than ever with big data. Often, it is these legacy systems that offer critical clues as to how to best dissect big data for analytics that can answer important business questions.

6: Simplify your data center

Big data requires parallel processing compute clusters and a different style of system management from traditional IT transaction and data warehouse systems. This means that energy consumption, cooling, software, hardware, and the systems skills needed to run these new systems will also be different.

7: Improve your data quality

The beauty of traditional transaction systems is that there are fixed data field lengths and comprehensive edits and validations on data that help get it into relatively clean form. Not so with big data, which is unstructured and can come in almost any form. This makes big data quality a major headache. Data quality is critical. If you don't have it, you can't trust the results of your data queries.
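Where transaction systems enforce structure at the point of entry, quality checks on big data usually have to happen after the data arrives. A minimal sketch of that idea in Python, with made-up field names and rules, might look like this:

    # Minimal sketch of after-the-fact quality checks on loosely structured records.
    # The field names and rules are hypothetical, not taken from the article.

    from datetime import datetime

    REQUIRED_FIELDS = {"customer_id", "event", "timestamp"}

    def problems_in(record):
        """Return a list of data-quality problems found in one record."""
        problems = []
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            problems.append("missing fields: " + ", ".join(sorted(missing)))
        ts = record.get("timestamp")
        if ts is not None:
            try:
                datetime.fromisoformat(ts)
            except (TypeError, ValueError):
                problems.append("unparseable timestamp: %r" % ts)
        return problems

    records = [
        {"customer_id": "C100", "event": "login", "timestamp": "2014-05-12T11:36:00"},
        {"customer_id": "C101", "event": "purchase", "timestamp": "yesterday"},
        {"event": "logout"},
    ]

    clean = [r for r in records if not problems_in(r)]
    print("%d of %d records passed the checks" % (len(clean), len(records)))

With transaction systems, those checks would have rejected the bad records before they were ever stored; with big data, the cleanup is your job.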

8: Validate current ROI metrics

The most common way to measure return on investment from systems of record is to monitor speed of transactions and then extrapolate what this means in terms of captured revenues (like how many new hotel reservations you can capture per minute). Speed of transactions is not a good metric for big data processing, which can take hours and even days to crank through a large cache of data and to run analytics. Instead, the best metric for gauging the effectiveness of big data processing is utilization, which should be above 90% on a regular basis (contrast this with transaction systems, which might run at only 20% of capacity). It's important to develop these new ROI metrics for big data, because you still have to sell the CFO and other business leaders on big data investments.
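As a rough, back-of-the-envelope illustration of what a utilization metric looks like (all of the figures below are invented, not from any real deployment):

    # Back-of-the-envelope utilization metric for a batch analytics cluster.
    # All of the figures below are invented for illustration.

    node_count = 20                       # nodes in the cluster
    hours_in_week = 24 * 7
    available_node_hours = node_count * hours_in_week   # 3,360 node-hours

    busy_node_hours = 3090                # node-hours actually spent running jobs

    utilization = busy_node_hours / available_node_hours
    print("Cluster utilization this week: {:.0%}".format(utilization))   # ~92%

    # A transaction system measured the same way might sit near 20% of capacity,
    # which is why transactions-per-minute is the more natural metric there.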

9: Create less "noise"

Ninety-five percent of big data is "noise" that contributes little or nothing to business intelligence. Sifting through this data to get to the nuggets of intelligence that will truly help the business can be daunting.
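To make the sifting concrete, here is a toy sketch in Python; the event types and the rule for what counts as "signal" are invented for illustration:

    # Toy sketch of sifting a raw event stream down to the records that carry
    # business meaning. Event types and the keep-rule are invented.

    events = [
        {"type": "heartbeat"},
        {"type": "debug_log"},
        {"type": "purchase", "amount": 42.50},
        {"type": "heartbeat"},
        {"type": "support_ticket", "severity": "high"},
    ]

    SIGNAL_TYPES = {"purchase", "support_ticket"}

    signal = [e for e in events if e["type"] in SIGNAL_TYPES]
    noise_share = 1 - len(signal) / len(events)
    print("{:.0%} noise; {} events kept for analysis".format(noise_share, len(signal)))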

10: Work every time

For years, universities and research centers ran big data experiments to derive elusive answers on genomes, drug research, and whether there was life on other planets. While some of these algorithms and queries yielded results, many more were inconclusive. There is tolerance for inconclusiveness in university and research environments, but not in corporate settings. This is where IT and other key decision makers need to manage expectations.

Mary E. Shacklett is president of Transworld Data, a technology research and market development firm. Prior to founding the company, Mary was Senior Vice President of Marketing and Technology at TCCU, Inc., a financial services firm.
