As big data volumes continue to explode, businesses are challenged to quickly extract rich insight from the mountain of machine-generated data streaming in from devices, sensors, smart meters, operational equipment, social-media platforms and other sources.
According to a recent IDC report, worldwide installed raw storage will climb from 2,596 exabytes (EB), or roughly 2.6 trillion gigabytes, in 2012 to a staggering 7,235 EB in 2017. As IDC analyst David Reinsel notes, “The desire to store more data is insatiable.” But storage is moot without the ability to retrieve the meaning in that data, the insight that can actually move the needle for a business.
In today’s big data world, a one-size-fits-all approach no longer works. The data management stack has fragmented into multiple specialized stacks, and the analytics stack has had to respond with purpose-built tools for each workload, be it operational analytics, investigative analytics or predictive analytics. Big data has created pockets of specialization: some databases are great for warehousing, while others excel at analytics.
Companies are also challenged by evolving infrastructure and the proliferation of data centers, data warehouses and data marts. Not only is the infrastructure used to deliver information changing; the data coming in from myriad new devices is changing dramatically as well.