Today’s business environment is more challenging than ever.
Few aspects of a company have been left unchanged by the challenges of a global pandemic, tightening liquidity due to rising inflation, and political unrest in regions around the world. As companies navigate these dynamics toward whatever the “new normal” turns out to be, one thing is certain: a new set of challenges will arise, and new strategies will be needed to maintain comparative advantages and stay competitive. In short, the only constant is change.
Change is occurring at an increasingly rapid pace with each technological advancement, keeping management teams on their heels as they improvise, adapt, and overcome. Fortunately, the last 20 years have given rise to the prevalence and importance of business intelligence (BI), and to tools companies can leverage to “punch back” at the challenges of today while preparing for the issues of tomorrow.
With the rise of BI tools like Tableau, PowerBI, and Looker, organizations now face the challenge of understanding the mountains of data they create daily and synthesizing that information into intelligence. While this is not a new problem, it does pose a question: who defines what gets measured and, ultimately, managed?
To understand why this question is so important, let’s look at how business intelligence is typically performed today: data moves through pipelines into a data warehouse, where transformation tools build fact, dimension, and reporting tables that BI tools query to produce dashboards and reports.
While this process has benefited businesses, it leaves them underserved. The aggregations and calculations it performs are the metrics organizations use to manage their performance, and these metric definitions must be either a) aggregated from reporting tables or b) calculated ad hoc from various dimension tables.
Think about the last time you tried to understand profitability by customer. To break down this analysis, you must first define what profitability means and what constitutes a customer. Then you need to aggregate these measures over a specific time period to understand how individual customers’ behavior changes over time. Only then are you in a position to explain how your customer-profitability metric is performing.
Reporting tables are built with transformation tools, enabling the consistent definition and generation of these metrics; however, there is a practical limit to how far they should be relied upon. For example, to aggregate a metric such as inventory stock ledger movement over the last six months, the data must be rolled up by repeatedly computing the ins and outs of inventory across sales orders, purchase orders, and intercompany transfers. While this can be done, the compute-heavy approach is neither scalable nor sustainable, because these “rollups” must be repeated for every time dimension you intend to analyze over. Creating an individual reporting table for each time frame is cumbersome and tedious. As a result, reporting tables are often best reserved for metrics with a fixed grain, like monthly recurring revenue.
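To see why per-window rollups multiply quickly, consider a minimal sketch with illustrative data. Each trailing window re-aggregates the raw inventory movements from scratch; a precomputed reporting table would need a separate rollup for every (as-of date, window) pair you intend to analyze:

```python
from datetime import date, timedelta

# Illustrative inventory movements (positive = receipts, negative = shipments).
# Field names and values are assumptions for the sketch, not a real ledger.
movements = {
    date(2023, 6, 1): 50,
    date(2023, 6, 10): -20,
    date(2023, 7, 5): -15,
    date(2023, 8, 1): 30,
}

def trailing_net(as_of, days):
    """Re-aggregates raw movements for one trailing window; a precomputed
    reporting table would need one rollup per (as_of, window) pair."""
    start = as_of - timedelta(days=days)
    return sum(qty for d, qty in movements.items() if start < d <= as_of)

as_of = date(2023, 8, 15)
for window in (30, 90, 180):  # every extra window multiplies the rollup work
    print(window, trailing_net(as_of, window))
```

The loop at the bottom is the crux: each additional window (or as-of date) is another full pass over the source data, which is exactly the compute cost that makes precomputing every combination untenable.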
Even then, answering a question such as “What is monthly recurring revenue by customer segment?” becomes an increasingly unwieldy task to precompute. First, the definition of revenue must be universally agreed upon by the organization for the resulting figure to support meaningful decisions. Second, that revenue must be filtered to include only recurring revenue before it can be aggregated by month over a specific period. Lastly, customer segment logic must be applied to slice the monthly recurring revenue by the appropriate criteria. When examples like this are multiplied by the number of metrics your organization analyzes, the tenability of the approach comes into question. Who validates that the metric logic is correct? How many different departments are leveraging this information?
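As a concrete illustration of those three steps, here is a minimal Python sketch, with hypothetical field names, that applies a revenue definition, filters to recurring revenue, and aggregates by month and customer segment:

```python
from collections import defaultdict

# Hypothetical transaction records; every field name here is an
# illustrative assumption, not a real schema.
transactions = [
    {"customer": "A", "segment": "Enterprise", "month": "2023-01", "amount": 100.0, "recurring": True},
    {"customer": "B", "segment": "SMB",        "month": "2023-01", "amount": 40.0,  "recurring": True},
    {"customer": "A", "segment": "Enterprise", "month": "2023-01", "amount": 25.0,  "recurring": False},  # one-time fee, excluded
    {"customer": "B", "segment": "SMB",        "month": "2023-02", "amount": 40.0,  "recurring": True},
]

def mrr_by_segment(rows):
    """Step 1: apply the agreed revenue definition (here, the `amount` field).
    Step 2: filter to recurring revenue only.
    Step 3: aggregate by month and customer segment."""
    totals = defaultdict(float)
    for r in rows:
        if r["recurring"]:
            totals[(r["month"], r["segment"])] += r["amount"]
    return dict(totals)

print(mrr_by_segment(transactions))
```

Note that every choice in this function, which field counts as revenue, what “recurring” means, how segments are assigned, is a metric definition. When each analyst writes their own version of this function, those definitions silently diverge.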
Without a reporting table to draw from, data and BI analysts are left to aggregate these metrics based on dimension tables, leaving the interpretation and calculation of this metric up to the analyst, not the business.
This may seem like a communication problem, but as your business grows and departments become increasingly siloed, managing these metric definitions quickly becomes unsustainable. For example, Sales may interpret revenue as the dollar amount sold as reported in your CRM, whereas Finance counts only what has been recognized in your general ledger.
How do you reconcile this discrepancy?
The metric layer: a centralized repository for all metric definitions and calculations, consumed by every reporting tool and entity in your business.
The metric layer is the newest addition to the modern data stack, bringing flexible, consistent analysis to your business’s most important KPIs.
With proper implementation, a metric layer solves this common business problem. Existing BI practices force companies to push metric definitions into their BI tools with little room for governance. In other words, the metric definitions referenced by end users and analysts are calculated based on each person’s interpretation of the metric in question, rather than being computed automatically from agreed-upon definitions in a central repository. Without that repository, metric calculations remain disjointed across your reporting stack, buried in hidden dashboards and workbooks, and are ultimately recreated and reused with little to no organizational oversight.
The metric layer sits as a semantic layer on top of your data warehouse, applying business logic so that your metrics are calculated and aggregated consistently across your entire reporting suite. This architecture lets the whole business leverage the same metric calculations while enabling you to create, edit, or delete those calculations in one central location, automatically propagating the changes across your enterprise.
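In miniature, that architecture can be sketched as a single registry of definitions that every consumer calls into. The names and structure below are illustrative assumptions, not any particular vendor’s API:

```python
# Minimal sketch of a metric layer: one registry of definitions that every
# dashboard, notebook, or report consumes, so a change here propagates
# everywhere at once. All names are illustrative.

METRICS = {
    "recurring_revenue": {
        "filter": lambda row: row["recurring"],   # only recurring rows count
        "value": lambda row: row["amount"],       # agreed revenue definition
    },
    "gross_revenue": {
        "filter": lambda row: True,               # all rows count
        "value": lambda row: row["amount"],
    },
}

def compute(metric_name, rows):
    """Every consumer calls this one function instead of re-deriving
    the metric, so there is a single source of truth for the logic."""
    m = METRICS[metric_name]
    return sum(m["value"](r) for r in rows if m["filter"](r))

rows = [
    {"amount": 100.0, "recurring": True},
    {"amount": 25.0, "recurring": False},
]
print(compute("recurring_revenue", rows))  # 100.0
print(compute("gross_revenue", rows))      # 125.0
```

The design point is that the definitions live in data (the registry), not in each consumer’s code: editing one entry in `METRICS` changes the result for every tool that calls `compute`, which is the governance property the metric layer provides.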
The outcome: a single source of truth that governs your entire reporting infrastructure, empowering you and your business to define, measure, and manage what matters. No more second-guessing the validity of metric calculations, the kind of doubt that tears at the fabric of strategic decision-making. Decisions should be based on intelligence, not information.
The questions of today may not predict the questions of tomorrow. That realization gave rise to the world of BI and data science that companies are now trying to leverage at scale. With a metric layer, your entire organization can redefine and monitor the KPIs that inform your strategic guidance without engineering new reporting, dimension, and fact tables for each scenario you encounter, saving time, fostering trust in your strategic vision, and helping you maintain your competitive edge.
The introduction of a metric layer creates a richer, more robust ecosystem for your BI teams and tools to live within. With data modeling and governance, both critical for long-term success, largely automated, these teams can shift their focus to delivering comprehensive analysis with better user experiences and deeper insights.
This shift is ultimately the greatest benefit of the metric layer. BI teams are empowered to understand changes over time, providing a pivotal link between exploratory data science methodologies and the financial and operational reporting that guides strategic decision-making. The answers are within reach; it takes a concerted effort to make your data accessible and unlock its value in this ever-expanding business environment.
A data pipeline is a set of data processing elements connected in series, where the output of one element is the input of the next.
A data warehouse is a large store of data accumulated from a wide range of sources within a company and used to guide management decisions.
Metrics are calculations and aggregations of data used to measure key performance indicators over time.
Fact tables house the events from which metrics are constructed. They contain measurements and facts about a particular business process, e.g., a table containing invoice summary information like invoice number, date, and customer name.
Dimension tables accompany fact tables and contain the attributes that provide additional granular information to the fact table, e.g., a table containing invoice detail information like quantity, invoice lines, and SKU information.
Reporting tables are built from fact and dimension tables to provide aggregations of specific metrics (e.g., trailing twelve-month revenue).
Transformation tools update and reformat data from one state into another.
They can prepare data for a wide variety of use cases, from cleaning it for analysis and aggregating it for consumption to restructuring it to load into operational tools like machine learning platforms or marketing automation software.
Business intelligence comprises the strategies and technologies used by enterprises for the data analysis and management of business information.
BI tools access and analyze data sets and present analytical findings in reports, summaries, dashboards, graphs, charts, and maps to provide users with detailed intelligence about the state of the business, e.g., Tableau, PowerBI, Looker, Qlik, and Domo.