Wednesday, January 23, 2013

Data Proliferation

According to IBM, every day we create 2.5 quintillion (2.5×10¹⁸) bytes of data.  Experts project the world’s data will expand 50 times in the next decade.  Sounds overwhelming!  How are marketers going to conquer data proliferation?

Read On:
Mobile technology is contributing its fair share of data to the runaway world of Big Data.  As smartphone penetration continues to grow (currently 53% of all mobile subscribers, according to Nielsen), so does the amount of mobile data consumption, as users watch more video, play more games, access social networks, etc.  Mobile Marketer Daily reported that mobile data traffic is currently 3.89 trillion megabytes and forecasts it will grow ten-fold to approximately 40 trillion megabytes by 2016.  Please note, these numbers do not reflect the mobile data consumption of tablets.  Currently there are approximately 55 million U.S. tablet users.
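The figures above are easier to grasp with a quick back-of-envelope calculation.  The sketch below simply restates the numbers quoted in this post; the variable names are mine:

```python
# Back-of-envelope check of the figures cited above (values as quoted in the post).
daily_bytes = 2.5e18           # IBM: 2.5 quintillion bytes created per day
print(daily_bytes / 1e18)      # expressed in exabytes: 2.5 EB per day

mobile_traffic_mb = 3.89e12    # Mobile Marketer Daily: current mobile traffic, in megabytes
projected_mb = mobile_traffic_mb * 10   # forecast ten-fold growth by 2016
print(f"{projected_mb:.2e}")   # ~3.9e13 MB, i.e. "approximately 40 trillion"
```

In other words, the forecast "40 trillion" is just the current 3.89 trillion megabytes multiplied by ten and rounded up.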

IBM just released the results of a worldwide study it conducted back in October, revealing that the primary Big Data objective for companies moving forward will be to employ the numerous tools at their disposal to achieve customer centricity; 49 percent of respondents named it their top goal.  However, other studies indicated that companies were experiencing difficulty mining Big Data for consumer insight.

Here are my recommendations for working with data proliferation:

Baby Steps!  Start small, think big. 

      1.    Select a market or two. 
      2.    Examine the different points of available data capture. 
      3.    Analyze. 
      4.    Summarize key learning. 
      5.    Develop potential indicated actions.
      6.    Execute!  Execute!  Execute!
      7.    Repeat the above process; select a geographic region.
      8.    Repeat the above process; go global!

Are you prepared to master data proliferation?   


  1. Too many companies, I fear, will try to use too much data, losing their strategy in the weeds. Your suggestions are valid, though in today's (perceived) cutthroat competitive marketplace, most companies will be pushed toward quick reactions. Smart ones will use technology to sort the wheat from the chaff and make bread.

  2. Execute, Test, Execute-with-learning, Test, Execute with scale... I think the need for feedback loops during execution is still a good thing.

    On the technology side, it's not ready for prime time yet, but big-data analytics tools that let you process larger and larger data sets, without having to be a data scientist, are eventually coming.

  3. It sounds like we are going to have to come up with some new language to talk about this data being created. 2.5 quintillion is such a massive number I have to imagine that most people, and companies, can't comprehend it.

    Also, I am wondering if anyone has figured out how much "new" data is actually being created and how much is duplication. Additionally, videos, music, websites, etc. are becoming massive data files. A current-day Blu-ray disc can hold at least 50GB of data, but the movie that plays is still the same hour and a half it has always been. Websites that were mere kilobytes in size 20 years ago are now tens or hundreds of gigabytes, possibly terabytes, large.

    I guess I am wondering, are more bytes being created because we have more to communicate, or are we just adding depth to what is being said? I guess the challenge is how to get the data to make sense without losing sight of the end objective, which is ultimately to understand the consumer.

    Thanks, Jimmy