Not since the emergence of cloud computing has an information technology trend attracted as much attention as big data. And as with cloud computing, it can sometimes be hard to decipher what big data really means.

Former SGI chief scientist John Mashey originally popularized the phrase, using it to describe the relentlessly expanding boundaries of computing — bigger processors, bigger memory, bigger networks and so on. Today, big data refers more specifically to harnessing value, through analytics, from the vast amounts of data flooding businesses and government agencies.
 
Data is everywhere

Analytics and big data rank among the top 10 strategies cited by CIOs in Gartner's 2013 CIO survey. Data is arriving in greater volume, at greater velocity and in far greater variety than ever before.

While data traditionally came from carefully structured business systems, it now pours in from online shopping, email, social media and countless other aspects of our increasingly digital lives. It arrives from a multitude of digital sensors in cars, factories, shipping containers and roadways, as well as from video surveillance and many other machine-generated sources.
And it is largely “unstructured,” meaning it doesn’t fit neatly into the rows and columns of a conventional relational database.

Analytics lead to insights

Another reason for big data’s popularity is the rapid development of a new breed of data analytics tools that enable organizations to broadly mine the data, gain insights and capture value.

For example, marketing can uncover patterns in buying behavior to gain more customers and increase sales to existing ones. Manufacturing can better understand product performance to improve quality. Finance can more readily identify fraud to prevent loss. And physicians can utilize genetic profiles to treat patients more effectively.

From achieving competitive advantage to growing top-line revenue to saving lives, the potential gains from big data are far-reaching. They also present a new challenge: how to derive value at greater speed, scale and efficiency.

Big computing for big imaginations

The magic behind big data lies in the computing environment, and much like solving big technical computing problems, big data analytics requires multiple computer processors — tens, hundreds or even thousands — all working in parallel. This is the world of high performance computing and, when leveraged effectively, it opens a world of possibilities.

In traditional HPC environments we start with a hypothesis, generate data and gain knowledge. Using HPC for big data analytics, we start with the data, then form the hypothesis and gain insight. The most common form of analytics utilizes a cluster of computers to essentially search across a broad data set for specific information. It’s like looking for a needle in a haystack.
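The needle-in-a-haystack pattern described above can be sketched in a few lines. This is a minimal illustration, not SGI's actual software: it assumes the data set has already been split into partitions, and it uses worker processes on one machine to stand in for the nodes of a cluster, each scanning its own partition in parallel before the matches are merged.

```python
from multiprocessing import Pool

def search_chunk(args):
    """Scan one partition of the data set for records containing the query."""
    chunk, needle = args
    return [record for record in chunk if needle in record]

def parallel_search(records, needle, workers=4):
    """Split the records across worker processes (standing in for cluster
    nodes), search every partition in parallel, then merge the matches."""
    size = max(1, len(records) // workers)
    chunks = [records[i:i + size] for i in range(0, len(records), size)]
    with Pool(workers) as pool:
        per_chunk = pool.map(search_chunk, [(c, needle) for c in chunks])
    return [match for part in per_chunk for match in part]

if __name__ == "__main__":
    haystack = ["log entry %d" % i for i in range(10000)] + ["the needle"]
    print(parallel_search(haystack, "needle"))
```

Because each partition is searched independently, adding more workers (or more cluster nodes) shortens the scan roughly in proportion — which is why this style of analytics maps so naturally onto HPC clusters.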

There is also a form of big data analytics whose objective is not search but discovery. Here we look for relationships within the data in order to understand what it is telling us. This form requires a computer system with massive amounts of shared memory, so the entire data set can be held and explored in place. Such a system can also deliver results in microseconds.
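To make the distinction concrete, a discovery workload builds a structure over the whole data set rather than scanning for one value. The toy sketch below (an assumption for illustration, not a production technique) keeps a co-occurrence graph of retail transactions entirely in memory, so any pairwise relationship can then be read back instantly — the property that large shared-memory systems provide at scale.

```python
from collections import defaultdict
from itertools import combinations

def cooccurrence_graph(transactions):
    """Count how often each pair of items appears in the same transaction,
    holding the whole relationship graph in memory for instant lookup."""
    graph = defaultdict(int)
    for items in transactions:
        # Sort so each unordered pair is always keyed the same way.
        for a, b in combinations(sorted(set(items)), 2):
            graph[(a, b)] += 1
    return graph

transactions = [
    ["milk", "bread", "eggs"],
    ["milk", "bread"],
    ["bread", "butter"],
]
graph = cooccurrence_graph(transactions)
print(graph[("bread", "milk")])  # bread and milk were bought together twice
```

The point of the sketch is the access pattern: once the graph is resident in memory, answering "how are these two things related?" is a single lookup rather than another pass over the data.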

Once organizations recognize the power of big data analytics, the boundaries of what it can do will rapidly expand. And with imagination it can not only transform the way business operates but also reshape the world we live in — which explains why big data has suddenly become such a big deal.

Jorge Titinger is president and CEO of SGI. Learn more about SGI Big Data Solutions at www.sgi.com/bigdata.

Learn more about SGI at:

Facebook: www.facebook.com/sgiglobal
Twitter: @sgi_corp

Published in Columnist