7 Rules of Big Data in Banking

Published September 16, 2013   |   Andy Hirst

As we move from an era of automating transactions to the era of data, the challenges banks face are increasingly about understanding information and creating intelligence from it. Areas like regulatory compliance, customer insight, and real-time offer management are fundamentally data problems. The next wave of value creation is likely to come from deriving more understanding, insight, and wisdom from this data, whether to better protect the bank or to find new customer segments that drive revenues.

Here are my 7 rules to follow when embarking on a new data project:

  • Good data is key. For all the data we collect, having less but cleaner data is the critical bedrock of any analysis. Flawed data leads to flawed decisions, whereas clean data leads to confident decisions. Banks need to invest in data infrastructure that provides reliable information to risk, compliance, and customer applications.

  • The faster you analyze bank data, the better the predictive value. Because banking data has a time value, the industry is moving from batch to real-time processing. Value is created by making new offers in real time, and risk exposure is better predicted and avoided when analyzed in real time.
  • Maintain one copy of your data. Data becomes less reliable and accurate the more you copy and move it. Further, the cost of IT infrastructure rises when multiple copies are held for different applications.
