From the Author of The Agile Architecture Revolution

Jason Bloomberg




Don’t Jump in the Data Lake

32. 47. 19. 7. 85.

Congratulations! I just gave you five very important, valuable numbers. Or did I?

If they were tomorrow's winning Powerball numbers, then certainly. But maybe they're monthly income numbers. Or sports scores. Or temperatures. Who knows?

Such is the problem of context. Without the appropriate context, data are inherently worthless. Separate data from their metadata, and you've just killed the Golden Data Goose.
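The point about context can be made concrete with a toy sketch. Here the bare numbers are given a home by attaching metadata to them; the field names (`unit`, `measure`, `source`) are illustrative assumptions, not from any particular standard:

```python
# Raw values: without context, they could be lottery picks, scores, or temperatures.
raw = [32, 47, 19, 7, 85]

# The same values with their metadata attached.
described = {
    "values": [32, 47, 19, 7, 85],
    "unit": "degrees Fahrenheit",
    "measure": "daily high temperature",
    "source": "hypothetical weather station",
}

def interpret(record):
    """Only data that carry their metadata can be interpreted."""
    if isinstance(record, dict) and "unit" in record and "measure" in record:
        return f"{record['measure']} in {record['unit']}: {record['values']}"
    raise ValueError("context lost: values alone cannot be interpreted")
```

Calling `interpret(described)` yields a meaningful statement; calling `interpret(raw)` fails, because the bare list has been separated from the metadata that gave it meaning.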

Scale up this example, and we shine a light on the core challenge of data lakes. There are a few common definitions of the term, but perhaps the most straightforward is "a large object-based storage repository that holds data in its native format until it is needed," or alternatively, "a massive, easily accessible, centralized repository of large volumes of structured and unstructured data."

True, there may be metadata in a data lake, thrown in along with the data they describe - but there is no commonality among such metadata, and furthermore, the context of the information in the lake is likely to be lost, just as a bucket of water poured into a real lake loses its identity.

If data lakes present such challenges, then why are we talking about them, and worse, actually implementing them?
The main reason: because we can.

With today's data collection and storage technologies, and in particular Hadoop and the Hadoop Distributed File System (HDFS), we now have the ability to collect and retain vast swaths of diverse data sets in their raw, as-is formats, in hopes that someone will find value in them down the road. Any necessary processing and analytics then take place "just-in-time" - at the moment of need.
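This "just-in-time" style is often called schema-on-read: records go into the store untouched, and structure is imposed only when someone queries them. A minimal sketch, with illustrative field names standing in for whatever raw records a real lake might hold:

```python
import json

# "Just-in-time": records are stored raw, exactly as they arrived.
raw_store = [
    '{"ts": "2015-06-01", "temp": "32"}',
    '{"ts": "2015-06-02", "temp": "47"}',
]

def read_with_schema(lines):
    """Schema-on-read: parsing and typing happen at query time,
    not at ingest time. The field names here are illustrative."""
    for line in lines:
        rec = json.loads(line)
        yield {"ts": rec["ts"], "temp": int(rec["temp"])}

# The schema (temp is an integer) is applied only now, at the moment of need.
temps = [r["temp"] for r in read_with_schema(raw_store)]
```

Note what the sketch quietly assumes: that whoever reads the data later still knows what "temp" means and what unit it is in. That is exactly the context a data lake can lose.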

This era of data abundance is relatively new. Only a handful of years ago, we had no choice but to transform and summarize diverse data sets ahead of time in order to populate our data marts and data warehouses.

Today, in contrast, we can simply store everything, ostensibly without caring about what such data are good for or how they are organized, on the off chance that someone will come along and find a good use for them.

Yet as with the numbers in the example above, data by themselves may not be useful at all. Simply collecting them without proper care may not only lead to large quantities of useless information, but might in fact take information that may have been useful and strip any potential usefulness from it.

The Dark Underbelly of Big Data

This dumbing down of the information we collect is the dark underbelly of the big data movement. In our mad rush to maximize the quantity of data we can collect and analyze, we risk forgoing the quality of those data, in hopes that some new analytics engine will magically restore that quality.

We may think of big data analytics as analogous to mining for gold, separating the rare bits of precious metal from vast quantities of dross. But we'll never find our paydirt if we strip away the value during the processes of data collection and analysis.

Perhaps we should go back to the Online Analytical Processing (OLAP) days, when we carefully processed and organized our information ahead of time in order to facilitate subsequent analysis. Even with today's big data technologies, there are reasons to remain with such a "just-in-case" approach to data management, rather than the just-in-time perspective of data lake proponents.

In reality, however, this choice between just-in-case and just-in-time approaches to data management is a false dichotomy. The best approach is a combination of these extremes, favoring one or the other depending on the nature of the data in question and the purpose that people intend to put them toward.
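The hybrid approach can be sketched in a few lines: curate ahead of time the data you already know you will need, while keeping the raw records around for ad hoc questions. The event structure below is an illustrative assumption:

```python
# Hybrid sketch: "just-in-case" curation plus "just-in-time" raw access.
raw_events = [
    {"day": "mon", "sales": 100},
    {"day": "mon", "sales": 50},
    {"day": "tue", "sales": 70},
]

# Just-in-case: a pre-computed summary for the routine reporting
# we know will be needed, built at ingest time.
daily_totals = {}
for e in raw_events:
    daily_totals[e["day"]] = daily_totals.get(e["day"], 0) + e["sales"]

# Just-in-time: ad hoc questions nobody anticipated still go
# against the raw events, with structure applied on the spot.
big_orders = [e for e in raw_events if e["sales"] > 60]
```

The design choice is simply where the processing cost lands: the summary pays it once at ingest for predictable questions, while the raw store defers it to query time for unpredictable ones.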

More Stories By Jason Bloomberg

Jason Bloomberg is the leading expert on architecting agility for the enterprise. As president of Intellyx, Mr. Bloomberg brings his years of thought leadership in the areas of Cloud Computing, Enterprise Architecture, and Service-Oriented Architecture to a global clientele of business executives, architects, software vendors, and Cloud service providers looking to achieve technology-enabled business agility across their organizations and for their customers. His latest book, The Agile Architecture Revolution (John Wiley & Sons, 2013), sets the stage for Mr. Bloomberg’s groundbreaking Agile Architecture vision.

Mr. Bloomberg is perhaps best known for his twelve years at ZapThink, where he created and delivered the Licensed ZapThink Architect (LZA) SOA course and associated credential, certifying over 1,700 professionals worldwide. He is one of the original Managing Partners of ZapThink LLC, the leading SOA advisory and analysis firm, which was acquired by Dovel Technologies in 2011. He now runs the successor to the LZA program, the Bloomberg Agile Architecture Course, around the world.

Mr. Bloomberg is a frequent conference speaker and prolific writer. He has published over 500 articles, spoken at over 300 conferences, Webinars, and other events, and has been quoted in the press over 1,400 times as the leading expert on agile approaches to architecture in the enterprise.

Mr. Bloomberg’s previous book, Service Orient or Be Doomed! How Service Orientation Will Change Your Business (John Wiley & Sons, 2006, coauthored with Ron Schmelzer), is recognized as the leading business book on Service Orientation. He also co-authored the books XML and Web Services Unleashed (SAMS Publishing, 2002), and Web Page Scripting Techniques (Hayden Books, 1996).

Prior to ZapThink, Mr. Bloomberg built a diverse background in eBusiness technology management and industry analysis, including serving as a senior analyst in IDC’s eBusiness Advisory group, as well as holding eBusiness management positions at USWeb/CKS (later marchFIRST) and WaveBend Solutions (now Hitachi Consulting).