Avoid 5 Deadly Big Data Mistakes

Published December 17, 2013 | Sohini Bagchi

Today, many companies are using Big Data to turn their data into actionable insights. In reality, however, they often find that Big Data does not translate into easy success. In an exclusive interaction with CXOtoday, Adrian DeLuca, Chief Technology Officer, Hitachi Data Systems APAC, states that many companies throw themselves into Big Data projects, only to fall into common traps and end up with little to show for it; many initiatives get scrapped altogether. DeLuca points out the five most common yet deadly mistakes companies make with Big Data and offers his recommendations on how to avoid them.

Mistake #1: Collecting lots of random data

With data growing exponentially, organizations already have more data, both structured and unstructured, than they can handle. The challenge for most firms is how to use it efficiently and in real time so they can react to ever-changing market forces. Yet many organizations that cannot manage the data they already have still set out to collect more of it at random. The key lies in how they use the information to make profitable business decisions, says DeLuca.

Mistake #2: Forgetting the small data

Big Data needs constant experimentation. With advanced analytics techniques, Big Data can be mined for useful correlations based on what consumers have already done: what they read, bought or searched for, and so on. But Big Data is always one step behind; it tells us what people did in the past. The biggest mistake organizations make is assuming that this history predicts the future. As market conditions and customer preferences change, firms that rely only on the old story without experimenting in real time will face tremendous challenges. Small data, by contrast, tells us why: it explains the past so that we can anticipate a changing future based on a better understanding of what is causing the change.
