A recent project we delivered helped me understand just how critical data-driven solutions are in the realm of operations. An Oil & Gas client of ours has a large project underway to transform their maintenance operations and system landscape. What spearheaded this was a dashboard solution we delivered using Power BI that provides insights into their work management process for each facility. The key focus of this dashboard was to drive operational excellence. The dashboard provides total transparency into their maintenance operations and processes, so they can easily identify issues and drill into the detail to action them.
I’ve worked in the analytics industry for a long time and have seen many great solutions, but this one was special. It’s the first time I’ve seen Power BI take on an enterprise-level problem and handle complex data from an ERP system such as SAP. It’s also the first time I’ve seen a business put exception-based management of their metrics into action.
For a long time, I have found myself wondering if Power BI was fit for medium to large enterprises, particularly those with complex system landscapes including SAP ERP, Oracle JDE, IBM Maximo, or Dynamics 365. As you probably know, these enterprise landscapes often have complicated data, with multiple systems having to talk to each other (or not talk at all in some cases), leaving quite a herd of data to muster.
I am now extremely confident of Power BI’s capabilities in this space if certain steps are taken. I would like to share my observations as to why it worked so well on this project.
The project was driven by experts in the Asset Management field who have decades of experience and apply industry best practices to their metrics. Again, this ties into my number one driver of success for any BI project: it must be business-driven!
The business knows their data, knows their processes, and knows what to look for and what to prioritize. They just don’t know the best technical application to get at their data and action it; this is where IT folks are meant to fill the gap and assist them on this journey. It is a journey because it’s never done perfectly from day dot — sorry, correction: it will evolve, change, and grow as the business uses it and wants more.
Now, you might consider me lucky to have been surrounded by such expertise and I would agree. However, I highly doubt a company would survive without hard-working employees with intimate knowledge of their business. These are the employees you need on your project from day one – so start building a relationship with them.
Power BI is the ideal tool for achieving a reduced time-to-market solution, but only if you have a simple dataset to work with. The best part is you can always design it to be simple.
Once all your datasets are ready, producing a dashboard is very intuitive and straightforward. The caveat is to leave the calculations to the database rather than DAX; push them as close to the database as you can.
Power BI, or any BI tool for that matter, should be left only to handle presentation and interactivity. Don’t make it another database to crunch numbers. The database will always be the ideal place for any complex calculations or transformations.
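To make that concrete, here is a minimal sketch of pushing a calculation into the database as a view, using SQLite and a hypothetical `work_orders` table (the table and column names are illustrative, not the client’s schema). The BI tool then only has to present the view’s result:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE work_orders (
    order_id    INTEGER PRIMARY KEY,
    status      TEXT,   -- 'OPEN' or 'CLOSED'
    due_date    TEXT,   -- ISO date
    closed_date TEXT
);
-- Push the metric calculation into the database as a view,
-- so the dashboard just reads a precomputed number.
CREATE VIEW vw_overdue_open_orders AS
SELECT COUNT(*) AS overdue_open
FROM work_orders
WHERE status = 'OPEN' AND due_date < DATE('now');
""")
conn.executemany(
    "INSERT INTO work_orders VALUES (?, ?, ?, ?)",
    [(1, "OPEN", "2000-01-01", None),      # overdue
     (2, "OPEN", "2999-01-01", None),      # not yet due
     (3, "CLOSED", "2000-01-01", "2000-01-02")],
)
print(conn.execute("SELECT overdue_open FROM vw_overdue_open_orders").fetchone()[0])  # → 1
```

The dashboard never sees the filtering or counting logic; it binds to `vw_overdue_open_orders` and stays purely presentational.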
Also, dashboards can appear simple in look and feel but still be intricate. Just don’t get too carried away: present the metrics or data points in a manner that makes sense and, more importantly, reflects how a user will actually use them.
I mentioned earlier that keeping your dataset simple and pushing the calculations down to the database made this project a success. Well, we were only able to achieve this with a Data Warehouse. Even with SAP HANA at our disposal, we still opted to use a SQL Server Data Warehouse.
A Data Warehouse is the ideal mechanism for this type of analysis because it’s architected to answer business questions. In this case, those questions included “How many Work Requests are outstanding?”, “How many Work Orders are compliant with their PM plan for the week?” and “Did we utilize our capacity as we expected?”. All these answers were calculated before the data hit the dashboard, using an ETL tool and a Data Warehouse with Dimensional Models.
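As a sketch of what a dimensional model buys you, the following uses SQLite with a hypothetical `fact_work_request` fact table and `dim_facility` dimension (names and rows are invented for illustration). Once the ETL has precomputed an `is_outstanding` flag, the first question above collapses to a one-join aggregate:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical star schema: one fact table keyed to a facility dimension.
conn.executescript("""
CREATE TABLE dim_facility (
    facility_key  INTEGER PRIMARY KEY,
    facility_name TEXT
);
CREATE TABLE fact_work_request (
    request_id     INTEGER PRIMARY KEY,
    facility_key   INTEGER REFERENCES dim_facility(facility_key),
    is_outstanding INTEGER  -- precomputed by the ETL: 1 = still open
);
""")
conn.executemany("INSERT INTO dim_facility VALUES (?, ?)",
                 [(1, "Plant A"), (2, "Plant B")])
conn.executemany("INSERT INTO fact_work_request VALUES (?, ?, ?)",
                 [(10, 1, 1), (11, 1, 0), (12, 2, 1), (13, 2, 1)])

# "How many Work Requests are outstanding?" per facility:
rows = conn.execute("""
    SELECT f.facility_name, SUM(w.is_outstanding) AS outstanding
    FROM fact_work_request w
    JOIN dim_facility f USING (facility_key)
    GROUP BY f.facility_name
    ORDER BY f.facility_name
""").fetchall()
print(rows)  # → [('Plant A', 1), ('Plant B', 2)]
```

Contrast that with joining twenty raw ERP tables at query time: the dimensional model answers the business question directly because the hard work already happened in the ETL.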
The other consideration is disparate systems. In our client’s case, they had a resource scheduling tool we needed data from for resource capacity and loading. Since it’s a separate dataset, you need to extract it, then integrate and transform it with the CMMS data to get your answers.
Doing this outside the SAP HANA databases was optimal because we avoided competing with the SAP ERP system for resources, plus the data was already calculated before a user hits refresh.
Why not a Data Lake, you may ask? Because a Data Lake is not formatted into Dimensional Models, which are built for the primary purpose of supporting query and analysis. It’s just a copy of the source data structures, and in most cases that source is an RDBMS. If you have worked with SAP, you will know that extracting the data to analyze a single module means acquiring 20 tables at a minimum. That alone blows your performance out the window. You must architect this data to make it suitable for answering questions.
A Data Lake is not ideal for this purpose. A Data Warehouse with Dimensional Models is.
A great way to drive action from your dashboard or reporting is to employ an exception-based approach. This is where you design the metrics and dashboard to prioritize issues and drive the user straight to them to take action.
If you are using the dashboard, this allows you and your team to focus only on the issues at hand and not get lost in the myriad of data points or other metrics. This becomes even more powerful when dealing with leading indicators, where you can preempt those issues before they become alerts on your dashboard. Phase 2 of this project will focus on designing and acquiring the data points to support the leading indicators.
A great design tip for exception-based management is the traffic light system. This makes it visually obvious to a user what needs attention now. It’s a tried-and-true method that’s been around for a long time but is still highly effective with analytics.
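A minimal sketch of how traffic-light statuses might be assigned, assuming illustrative cutoffs (green at or above target, amber within 10% of it, red otherwise — real thresholds would come from the business):

```python
def traffic_light(actual: float, target: float, amber_tolerance: float = 0.10) -> str:
    """Classify a metric against its target.

    Cutoffs are illustrative: green when the target is met,
    amber when within `amber_tolerance` of it, red otherwise.
    """
    if actual >= target:
        return "green"
    if actual >= target * (1 - amber_tolerance):
        return "amber"
    return "red"

# e.g. PM compliance for the week against a hypothetical 95% target:
print(traffic_light(0.97, 0.95))  # → green
print(traffic_light(0.90, 0.95))  # → amber (close, worth watching)
print(traffic_light(0.70, 0.95))  # → red   (needs action now)
```

In Power BI the same idea would typically live in a conditional-formatting rule or a small measure, but the point is the same: the classification is explicit and agreed with the business, not eyeballed per chart.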
For this project, we were able to leverage Power BI Dataflows to pre-cache data for the dashboard. I highly recommend this if you have the licensing, as it provides lightning-fast performance for a dashboard plus the added benefit of populating data based on SQL-like queries.
Given the dataset was based on a Star Schema data model from the Data Warehouse, we were able to use Power BI Dataflows to run the query supporting the Summary, plus the individual queries supporting the detail. So, rather than having to build each dataset separately, or worse, one big monolithic one, we used Dataflows to produce the data we needed in the desired format. That data is then loaded into a cache on a daily or weekly schedule, so it’s prepopulated for your dashboard.
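The Dataflow itself is authored in Power Query rather than Python, but the pre-caching idea can be sketched in plain code: aggregate the detail rows into a summary on a schedule, so the dashboard reads a prepopulated result instead of aggregating at view time. All names and rows here are hypothetical:

```python
from collections import Counter

# Hypothetical detail rows, as a Dataflow might land them from the warehouse.
detail = [
    {"facility": "Plant A", "metric": "Work Requests Outstanding", "value": 1},
    {"facility": "Plant B", "metric": "Work Requests Outstanding", "value": 2},
    {"facility": "Plant A", "metric": "Overdue Work Orders", "value": 3},
]

def build_summary(rows):
    """Pre-aggregate detail rows into the summary-level cache.

    In Power BI this shaping runs inside the Dataflow on its refresh
    schedule; the dashboard then reads the cache, not the raw detail.
    """
    totals = Counter()
    for row in rows:
        totals[row["metric"]] += row["value"]
    return dict(totals)

summary_cache = build_summary(detail)  # refreshed daily/weekly, not per view
print(summary_cache)  # → {'Work Requests Outstanding': 3, 'Overdue Work Orders': 3}
```

The summary and the detail queries each produce exactly the shape their page needs, which is what saved us from building one big monolithic dataset.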
This is by far the most optimal approach for achieving summary and drill-down with your Power BI dashboard. The whole point of BI & Analytics is to get instant answers to your questions. Users will not accept, and shouldn’t accept, anything sub-optimal – it’s just not good practice.
In summary, Power BI is a fantastic tool for producing exception-based, actionable insights for your operations. If you have the business leading the way, keep your solution simple, and leverage a Data Warehouse, you will produce an invaluable asset for your operations department. If you are interested in learning more about Power BI and building data-driven solutions – let’s chat.