How Big Data Analytics is Aiding Search for Flight 370

Published March 17, 2014 | Chris Preimesberger

As the hours and days go by following the sudden and mysterious disappearance of Malaysia Airlines Flight 370 somewhere in Southeast Asia, more people and organizations are joining the search party. They are using every tool at their disposal, not least big data analytics, to try to locate the Boeing 777, which carried 239 people and whose disappearance has left thousands of family members and friends heartsick.

Daniel Hardman, chief architect of Adaptive Computing in Provo, Utah, is a supervisor of sorts in this search. Adaptive is best known for its Moab data analytics platform, which is used by enterprises around the world, including Oak Ridge National Laboratory, the University of Cambridge, and The Weather Channel.

Adaptive Computing’s Moab HPC Suite and Cloud Suite are an integral part of the company’s Big Workflow data center package, which the company claims unifies all data center resources, optimizes the analysis process and guarantees services. Big Workflow derives its name from its ability to solve big data problems by streamlining workflows to deliver insights from massive quantities of data across multiple platforms, environments and locations.
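Adaptive has not published the scheduling logic behind Big Workflow, so the sketch below is only a hypothetical illustration of the core idea the company describes: prioritized analysis jobs placed onto whichever resource pool, on-premise HPC or burst-to-cloud capacity, can currently hold them. All job names, pool names, and numbers are invented for illustration; this is not Moab's actual API.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    cores: int      # cores requested
    priority: int   # higher-priority jobs are placed first

class ResourcePool:
    """One schedulable pool of compute, e.g. an HPC cluster or a cloud burst tier."""
    def __init__(self, name: str, total_cores: int):
        self.name = name
        self.free_cores = total_cores

    def fits(self, job: Job) -> bool:
        return job.cores <= self.free_cores

    def place(self, job: Job) -> None:
        self.free_cores -= job.cores
        print(f"{job.name}: running on {self.name} "
              f"({job.cores} cores, {self.free_cores} left)")

def schedule(jobs: list[Job], pools: list[ResourcePool]) -> None:
    # Highest-priority work first; each job goes to the pool
    # with the most free cores that can still hold it.
    for job in sorted(jobs, key=lambda j: j.priority, reverse=True):
        candidates = [p for p in pools if p.fits(job)]
        if candidates:
            max(candidates, key=lambda p: p.free_cores).place(job)
        else:
            print(f"{job.name}: queued, no pool has {job.cores} free cores")

# Hypothetical search-related workloads spread across two resource pools.
pools = [ResourcePool("onprem-hpc", 64), ResourcePool("cloud-burst", 128)]
jobs = [
    Job("sonar-image-match", 48, priority=9),
    Job("ocean-drift-model", 64, priority=7),
    Job("satellite-telemetry-etl", 16, priority=5),
]
schedule(jobs, pools)
```

Greedy most-free-cores placement is just one simple policy; a production scheduler such as Moab layers features like fairshare, advance reservations, and backfill on top of this basic matching step.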
