In the first part of this blog series, I discussed the traditional data warehouse and some of the issues that can occur in the period following the initial project.
I concluded that post by outlining 5 common challenges that I hear from customers.
- INFLEXIBILITY
- COMPLEX ARCHITECTURE
- SLOW PERFORMANCE
- OLD TECHNOLOGY
- LACK OF GOVERNANCE
This blog will focus on the first of these issues: INFLEXIBILITY.
For the avoidance of any doubt, let’s begin with the definition of INFLEXIBILITY:
“not permitting change or variation; unalterable”
It is clear that we are talking about a solution that does not readily lend itself to being changed or varied.
Let’s begin by considering the 3 most common reasons why we may need to change our existing data warehouse.
- Request to amend an existing data model due to evolving requirements
- Something has changed in the source system/s
- Requirements to add new source data set/s or to add new subject areas
Each of these reasons brings its own particular set of challenges that should NOT be underestimated. Some requested changes may seem innocuous at first but can end up requiring a substantial change to the underlying data structures, or the acquisition of new data sources.
When the original data models / star schemas / cubes were designed it is likely that they were derived from a set of reporting requirements.
Good practice dictates the use of some kind of report decomposition matrix (RDM). The RDM catalogues all the data items from the required reports and then collates them into subject-oriented fact tables with corresponding conforming dimensions.
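To make the collation step concrete, here is a minimal sketch of how an RDM's data items might be grouped into fact and conforming-dimension candidates. All report names, column names, and the classification scheme below are hypothetical examples, not a prescribed method:

```python
from collections import defaultdict

# Each report is decomposed into its data items, each classified as a
# measure (fact-table candidate) or an attribute (dimension candidate).
rdm = [
    {"report": "Monthly Sales", "item": "sales_amount",    "kind": "measure"},
    {"report": "Monthly Sales", "item": "customer_region", "kind": "attribute"},
    {"report": "Stock Levels",  "item": "units_on_hand",   "kind": "measure"},
    {"report": "Stock Levels",  "item": "customer_region", "kind": "attribute"},
]

# Collate: measures group into subject-oriented fact tables, while
# attributes shared across reports suggest conforming dimensions.
measures = defaultdict(set)
attributes = defaultdict(set)
for row in rdm:
    target = measures if row["kind"] == "measure" else attributes
    target[row["item"]].add(row["report"])

# An attribute used by more than one report is a conforming-dimension candidate.
conforming = {item for item, reports in attributes.items() if len(reports) > 1}
print(conforming)  # {'customer_region'}
```

The point of the exercise is exactly this cross-report view: an attribute such as `customer_region` that appears in several reports should be modelled once, as a conforming dimension, rather than duplicated per subject area.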
Ideally, the analysts involved in the requirements workshops grasped the organisation's big picture and looked beyond the initial reporting needs, and they used modelling tools that supported quick changes, good version control and proper governance, including impact/dependency analysis. Unfortunately, this is often not the case.
What if the analysis was not complete or thorough enough?
What if something was missed?
What if the business has a new or shifting focus?
What if the new data required isn't in a database, but is held by external parties outside our organisation? What if the data isn't conventionally structured at all?
And of course there is always a chance that the business users were simply not able to see the big picture of the organisation’s analytical needs and cannot articulate the requirements clearly.
Over the years, techniques for designing and developing data warehouses have also evolved. In the past, slow-moving structured transactional databases were common, and we predominantly delivered reporting and drill-down / slice-and-dice analysis against reasonably static datasets.
In today’s world, businesses evolve quickly and market conditions are volatile. Rapid decisions are often required to ensure a business’s survival and competitiveness. With the exponential growth of data, traditional technologies and mindsets are no longer adequate for organisations to harness the full potential of their data assets. Organisations struggle to cope with the sheer volume and variety of data, both transactional and newer types such as social media and IoT data. Big Data is no longer something that organisations can afford to ignore or delay adopting.
The users’ expectation of data and reporting agility will, however, be hard to fulfil given the complexity of a traditional data warehouse implementation.
The diagram above illustrates the components of a typical traditional data warehouse implementation from the data sources to the Business Intelligence visualisation layer.
Changes to the Business Intelligence requirements could easily mean that each of these components needs to be individually assessed and amended. This is rarely a simple task, potentially involving consultants with varying skill sets and tools. The problem is further exacerbated because the internal data warehouse team is likely to be preoccupied with “business as usual” (BAU) activities.
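The ripple effect of a single change can be pictured as a walk over a dependency graph spanning the warehouse layers. The sketch below uses entirely hypothetical component names to show why one amended source column can touch staging, ETL, the warehouse model, and the reporting layer all at once:

```python
# Toy dependency graph across warehouse layers (all names hypothetical):
# each key maps a component to the components that consume it downstream.
deps = {
    "src.orders.amount":   ["staging.orders"],
    "staging.orders":      ["etl.load_sales_fact"],
    "etl.load_sales_fact": ["dw.fact_sales"],
    "dw.fact_sales":       ["cube.sales", "report.monthly_sales"],
}

def impacted(node, graph):
    """Return every downstream component affected by a change to `node`."""
    seen, stack = set(), [node]
    while stack:
        for child in graph.get(stack.pop(), []):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

print(sorted(impacted("src.orders.amount", deps)))
# ['cube.sales', 'dw.fact_sales', 'etl.load_sales_fact',
#  'report.monthly_sales', 'staging.orders']
```

Even in this toy example, a one-column change in the source fans out to five components, each potentially owned by a different team or tool, which is precisely why such assessments are slow in practice.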
Due to prolonged frustration and slow service levels, business users tend to form a perception that their only path to insight is a concrete tunnel that always leads to the same destination.
So what’s the answer?
Can we provide a streamlined, efficient and ultimately more agile data warehouse within our organisations?
Can we address changing Business Intelligence and Analytics requirements without getting lost in an endless cycle of legacy code inspection and the resulting regression testing?
The answer: YES! (Of course…).
The Modern Data Platform
The Modern Data Platform delivers a foundation for agile modelling, delivery and iteration that significantly reduces the lead time to deliver changes, lowering the total cost of ownership of your Business Intelligence and Analytics solutions.
Remember to follow the rest of the posts in this blog series, where we will explore the four remaining common challenges of traditional data warehousing in detail.