Every day we create 2.5 quintillion bytes of data, and 90% of the data in the world today has been created in the last two years.  This data is a combination of everything you could imagine: readings from the Large Hadron Collider, posts to Facebook, pictures on Imgur, YouTube videos, Amazon transaction records, and much, much more.  When so much data gets collected, the question becomes: what do you do with it all?  There is simply too much data for conventional methods to keep track of, or for any one person to monitor.  Gartner analyst Doug Laney defined the growth of our data as being three-dimensional.  The 3Vs model, as Laney calls it, describes big data as increasing in volume (the amount of data), velocity (the speed of data in and out), and variety (the range of data types and sources).  IBM even added a fourth dimension, veracity, which covers the trustworthiness of the data.  So how will we deal with big data in the future as it grows even more?  How do we turn all this information into opportunity?

Big Data provides lots of opportunity for companies, allowing them to use sources of information that were previously untapped.  Learning about customer needs and complaints, current device ownership, and vast amounts of other information can lead a company down the right path to growing its business.  That information may lead to partnering with another vendor, releasing new upgrades, or even fixing problems you didn't know existed.  Big Data mining can also help with fraud analysis.  Although financial services companies don't need big data to uncover fraud, incorporating data from other sources can help them find problems more quickly.  The same goes for law enforcement.

Analyzing, storing, and moving data in a cost-effective manner becomes an important issue.  Experts state that nothing is more cost-effective and time-saving than analyzing the data as a stream in real time, instead of after you move and store it.  This avoids storing useless information for long periods of time, but you have to be careful not to discard information too quickly.  After all, data that isn't important today may be what you base your decisions on tomorrow.
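To make the streaming idea concrete, here is a minimal sketch of analyzing records as they arrive, keeping only constant-size running aggregates (count, mean, max) per key rather than storing every record.  The event stream and field names are hypothetical placeholders, not from any particular system:

```python
# Sketch of stream-style analysis: update running aggregates per key
# as each record arrives, instead of storing the raw data first.
from collections import defaultdict

class RunningStats:
    """Constant-memory aggregates updated one record at a time."""
    def __init__(self):
        self.count = 0
        self.mean = 0.0
        self.max = float("-inf")

    def update(self, value):
        self.count += 1
        # Incremental mean: no need to keep past values around.
        self.mean += (value - self.mean) / self.count
        self.max = max(self.max, value)

def analyze_stream(events):
    """Consume (key, value) events one at a time; never store them."""
    stats = defaultdict(RunningStats)
    for key, value in events:
        stats[key].update(value)
    return stats

# Usage: a toy stream of transaction amounts per store.
events = [("store_a", 10.0), ("store_b", 4.0), ("store_a", 30.0)]
result = analyze_stream(events)
# result["store_a"] now holds count=2, mean=20.0, max=30.0
```

The trade-off the paragraph above describes shows up directly here: memory use stays constant no matter how many events flow through, but once an event is processed it is gone, so any question you didn't think to aggregate in advance can't be answered later.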

Some companies have built infrastructures around big data, focusing on collecting it, analyzing it with software and teams of talented personnel, and ultimately trying to use what they learn to improve their business.  The problem is that there are just too few experts out there who can turn that mined data into company gold.  I'm not surprised; it's a tough job.  Imagine looking at the beach from a helicopter and having to pick out the prettiest seashells.  They are all there waiting for you to grab, but finding the right ones and using them to your advantage is the difficult part.  Some feel analytics software is the answer; others feel it's not how you obtain the data but where you get it from.  Applications are becoming a favorite way to pull data from users, with a majority of Americans now owning smartphones or tablets with internet access.

When the Sloan Digital Sky Survey started collecting data in the year 2000, it amassed more data in its first few weeks than had been collected in the entire history of astronomy (about 200 GB a night, 140 TB in total since 2000).  When its successor, the Large Synoptic Survey Telescope, takes over in 2016, it will acquire that amount (140 TB) every five days.  In 2010, the Large Hadron Collider produced 13 petabytes (13,000 terabytes) of data.  My point?  Big Data is only going to get bigger, and as time goes on we will have to find better ways to analyze it and sort through it if we want it to be useful.