

Data: Are we going back to the future?



If, like me, you follow Data Warehousing and Big Data developments, you will have noticed some interesting research and commentary around the Hadoop and Big Data ecosystems of late. A couple of recent articles, on Hadoop adoption by Gartner and on data silos, continue the trend. It seems that adoption has been much slower than the industry anticipated, the key reasons being a shortage of skilled resources, the risk of creating yet more siloed data, and the difficulty of crossing the chasm from science project to enterprise solution. Sound familiar?

If you’ve been in this space since the late ’90s and early 2000s, then you’ll be getting a strong sense of déjà vu right about now. Back then the problems were similar, even though the data was smaller.

  • Business people couldn’t access information in a timely manner
  • General lack of confidence in reports/analysis due to quality issues
  • No single source of truth

And what were the technical reasons behind this?

  • Data siloed in various databases
  • Not enough skilled resources
  • Slow, hand-cranked data integration
  • No standardised architecture based on best practice

Simply put: siloed data diminishes value, integration is hard, we don’t have the resources to meet our needs, and we’re worried our competition might be doing this better and faster than us!

This is the problem that the Data Warehouse, a consolidated store of information organised for analysis (sound like a Data Hub?), was designed to address: to make it easier to report on and analyse accurate information, to adhere to best practices, and to allow analytical tools to access and present accurate information, simplifying things for business people.

Unfortunately, back then, as an industry we missed a trick and lost our way with our business stakeholders. Either we went for the quick fix, threw a reporting tool over whatever data there was and hoped the business got some value, or we spent years planning an Enterprise Data Warehouse and they got sick of waiting. We made two critical mistakes:

  • We sacrificed time to value and ROI, whether we chose the quick fix or the slow one.
  • We failed to recognise that successful and rapid data integration is the key to the kingdom.

Ultimately we failed to deliver clean integrated data that reflected complete business processes and provided true competitive insight at the right time and cost.

Now, back in 2015, enter Big Data: the problems of silos, scarce resources, limited time and immature architecture have just got bigger, exponentially so. The thing is, Big Data is here, it’s valuable and it’s going to be a game changer for the businesses that can cross the adoption chasm. To succeed in the Big Data world we need to ensure that we learn from our mistakes and recognise the value of the last 20+ years of shared learning. We must implement technologies that break down data silos, are easy to program and support, implement best practice and provide clean, quality information at the right time and cost. We have to solve the data integration bugbear once and for all, on time and within budget, using the limited talent pool we have available.

It’s a little like being Marty McFly, jumping into the old DMC-12, and heading back in time to avert a less than satisfactory future.

So how do we do this? It won’t be easy. Why? Because if it were, this would have been solved years ago: data integration is hard, it’s messy and it’s time-consuming. But by doing the hard work on our data, cleaning it and adding value through transformation, we’ll get the results we’re after from our data, both big and small.

To truly get ahead of the curve and make the promise of our data a reality we need to work “smarter not harder”: we need tools that make the task easier, that implement frameworks for us and that automate code generation through the capture of metadata. To solve our data problem we need software that can apply the benefits of automation to data integration.
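To make the idea of metadata-driven code generation concrete, here is a toy sketch (not WhereScape's actual engine; the table names, column mappings and transforms are invented for illustration). The point is that once a source-to-target mapping is captured as metadata, the loading SQL can be generated from it rather than hand-cranked for every table:

```python
# Hypothetical metadata describing a staging-to-warehouse column mapping.
# In a real tool this would be captured in a repository, not hard-coded.
mapping = {
    "target": "dim_customer",
    "source": "stage_customer",
    "columns": [
        {"name": "customer_id", "source": "cust_id", "transform": None},
        {"name": "full_name",   "source": "name",    "transform": "UPPER({})"},
        {"name": "country",     "source": "country", "transform": "COALESCE({}, 'UNKNOWN')"},
    ],
}

def generate_load_sql(m):
    """Generate an INSERT ... SELECT load statement from a mapping dict."""
    select_exprs = []
    for col in m["columns"]:
        expr = col["source"]
        if col["transform"]:
            # Apply the declared transformation around the source column.
            expr = col["transform"].format(expr)
        select_exprs.append(f"{expr} AS {col['name']}")
    target_cols = ", ".join(c["name"] for c in m["columns"])
    return (
        f"INSERT INTO {m['target']} ({target_cols})\n"
        f"SELECT {', '.join(select_exprs)}\n"
        f"FROM {m['source']};"
    )

print(generate_load_sql(mapping))
```

Change the metadata and the code regenerates itself; that is the leverage automation offers over hand-written integration scripts.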

At WhereScape that’s what we do. We understand that the only criterion that matters to a business stakeholder is time to value, and we understand that analytical database solutions must be built on flexible frameworks that simplify development and support for IT. It’s the fundamental reason why we built our 3D and RED products, and it’s baked into our DNA.

We’ve got our customers’ backs on Data Integration Automation, be the data big or small, and we’re making sure they won’t need a DeLorean time machine to address their data-rich future.

To find out more, download our guide to building useful data warehouses here.

Topics: WhereScape, Data Warehousing, Big data
