Analytics Growth Framework
Introduction
The Analytics Growth Framework is a practical, structured guide for anyone seeking to use analytics to grow their business.
The purpose of the Analytics Growth Framework is to:
Provide a simple but powerful framework to help companies build their analytics roadmap.
Focus on actionable steps that align data with strategic goals.
Create a systematic process for continuous improvement.
Align analytics efforts with business objectives.
Standardise measurement across the organisation.
Create accountability through clear metrics.
Target Audience of the Analytics Growth Framework
Analytics Growth Framework Overview (MAXA)
Measure provides the baseline data foundation; Analyse turns that data into insight, which either informs the next Experiment to test new ideas or feeds directly into Activate.
Together, these four stages ensure businesses make informed, efficient, and innovative decisions.
Measure 📏
Measure's Objective
Accurately define, track, collect, store, and analyse data to provide a clear understanding of business performance across key metrics.
Focus on building a reliable data foundation by ensuring the quality, accuracy, and relevance of collected data to inform decisions.
Measurement Framework
Create your Measurement Framework to map your Objectives to their corresponding KPIs and Key Driver Metrics. This is done by:
Mapping the Objective to one or more Key Result Areas (KRAs).
Mapping each Key Result Area to one Key Performance Indicator (KPI).
Mapping each Key Performance Indicator to Key Driver Metrics.
An Objective is a high-level goal of the business or organisation such as "Achieve USD 10M Annual Revenue".
A Key Result Area (KRA) is an area of activity that contributes to achieving the Objective, with a measurable outcome (its Key Performance Indicator). The KPI should be a non-gameable metric: one that cannot be hit while the KRA itself goes unachieved.
Key Driver Metrics are metrics that act as proxies for the KPIs. They are ideally operational or supporting metrics that teams can influence day to day.
In summary, the Measurement Framework is built as follows:
One Objective -> Multiple KRAs
One KRA -> One KPI
One KPI -> Multiple Key Driver Metrics
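To make this structure concrete, here is a minimal sketch of how a Measurement Framework could be captured in code; all objective, KRA, and metric names below are hypothetical examples, not part of the framework itself.

```python
from dataclasses import dataclass, field

@dataclass
class KRA:
    """A Key Result Area: one KPI, multiple Key Driver Metrics."""
    name: str
    kpi: str
    key_driver_metrics: list[str] = field(default_factory=list)

@dataclass
class Objective:
    """A high-level business goal mapped to multiple KRAs."""
    name: str
    kras: list[KRA] = field(default_factory=list)

# Hypothetical example values for illustration only.
objective = Objective(
    name="Achieve USD 10M Annual Revenue",
    kras=[
        KRA(
            name="Grow eCommerce sales",
            kpi="Online revenue",
            key_driver_metrics=["Sessions", "Conversion rate", "Average order value"],
        ),
        KRA(
            name="Retain existing customers",
            kpi="Repeat purchase revenue",
            key_driver_metrics=["Repeat purchase rate", "Churn rate"],
        ),
    ],
)
```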
Data Infrastructure
Once the list of KPIs and Key Driver Metrics is outlined, they have to be tracked and collected from the various relevant measurement sources.
Measurement sources can be Site Analytics, Social Ad Platforms, Programmatic Display Ad Platforms, Viewability vendors, eCommerce backend systems, Marketplace Analytics, etc.
Some of the measurement sources require implementation by adding JavaScript code or an SDK. This is where a Digital Tracking Plan is required.
Develop a comprehensive Tracking Plan that maps digital KPIs and supporting metrics to specific events, variables, and platforms (GA4, Ads platforms, etc.), including data layer specifications and versioning (see the sketch after this list).
Build Data Infrastructure including collection (tag management/API integrations), storage (data warehouse), transformation (cleaning, joining datasets), and QA processes.
Establish Data Governance including ownership, update frequency, data dictionary, and quality control processes.
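A Tracking Plan is typically maintained as a spec document; below is a minimal sketch of a single entry expressed as data, assuming a GA4-style event with data layer parameters. The KPI, platform list, and parameter definitions are hypothetical examples.

```python
# One hypothetical Tracking Plan entry: which KPI an event supports,
# how the event is named, what parameters it carries, and where it is sent.
tracking_plan_entry = {
    "version": "1.2",
    "kpi": "Online revenue",
    "event_name": "purchase",
    "platforms": ["GA4", "Meta Ads"],
    "data_layer_parameters": {
        "transaction_id": "string, required, unique per order",
        "value": "number, required, order total in USD",
        "currency": "string, required, ISO 4217 code",
        "items": "array, required, one entry per product",
    },
    "owner": "Analytics team",
    "qa_status": "verified",
}
```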
Reporting & Analytics Systems
Implement Reporting System with automated dashboards, standardised reports, and self-service analytics capabilities.
Dashboards: Monitor trends and establish baselines for performance.
Reports: More detailed versions of the dashboards with self-service capabilities.
Alerts: Preconfigured checks on KPIs and Key Driver Metrics that flag changes as well as Quality Assurance issues. These are based on the Data Dictionary settings (for example, alerting when a metric falls outside its expected range or stops updating at its defined frequency); see the sketch after this list.
Notebooks: More advanced spaces where data analysts and scientists are able to apply scripts and advanced analysis to dig deeper into the data.
Activations: Configurations of data flows that push data into marketing and operational platforms (covered in the Activate stage).
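As an illustration of the Alerts item above, a minimal sketch of a threshold check driven by Data Dictionary settings. The metric names, ranges, and the `latest_values` feed are all hypothetical.

```python
# Hypothetical Data Dictionary settings: the expected range per metric.
data_dictionary = {
    "conversion_rate": {"min": 0.01, "max": 0.10},
    "sessions": {"min": 5_000, "max": 50_000},
}

def check_alerts(latest_values: dict[str, float]) -> list[str]:
    """Return an alert message for every metric outside its expected range."""
    alerts = []
    for metric, bounds in data_dictionary.items():
        value = latest_values.get(metric)
        if value is None:
            alerts.append(f"{metric}: no data received (QA alert)")
        elif not bounds["min"] <= value <= bounds["max"]:
            alerts.append(f"{metric}: {value} outside [{bounds['min']}, {bounds['max']}]")
    return alerts

print(check_alerts({"conversion_rate": 0.002, "sessions": 12_000}))
# -> ['conversion_rate: 0.002 outside [0.01, 0.1]']
```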
Analyse ⚙️
Analyse's Objective
Analyse data across the four analysis types: Descriptive, Diagnostic, Predictive, and Prescriptive.
Focus on leveraging insights from measurement to make incremental improvements to performance.
What happened (Descriptive Analytics)
The objective of Descriptive Analytics is to understand historical performance by identifying patterns, trends, and changes in KPIs and supporting metrics to establish WHAT has occurred without examining causality.
Descriptive analysis areas include:
Performance Monitoring: Tracking KPIs and Key Driver Metrics against targets, baselines, and historical data.
Distribution Analysis: Showing how KPIs and Key Driver Metrics are distributed across segments.
Trend Analysis: Comparing period-over-period changes, analysing seasonality patterns and anomalies.
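For example, a minimal period-over-period sketch using pandas, assuming a daily KPI series; the column names and figures are hypothetical.

```python
import pandas as pd

# Hypothetical daily KPI data.
df = pd.DataFrame(
    {"date": pd.date_range("2024-01-01", periods=60, freq="D"),
     "sessions": range(1000, 1060)}
).set_index("date")

weekly = df["sessions"].resample("W").sum()   # aggregate days into weeks
wow_change = weekly.pct_change()              # week-over-week % change
baseline = weekly.rolling(4).mean()           # 4-week rolling baseline

print(pd.DataFrame({"weekly": weekly, "wow_change": wow_change, "baseline": baseline}).tail())
```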
Why it happened (Diagnostic Analytics)
The objective of Diagnostic Analytics is to investigate relationships between metrics and external factors to understand WHY changes occurred. This is also known as Root Cause Analysis.
Diagnostic analysis areas include:
Correlation Analysis: Examining relationships between KPIs, Key Driver Metrics, and other metrics. Example: bounce rate being strongly correlated to page loading time.
Cross-Dataset Analysis: Joining different datasets together to uncover deeper insights. Example: combining website actions with Customer Stage (First Time Customer/Repeat Customer/etc.) via a common User ID.
Cohort Analysis: Comparing different user groups over time. Example: First time customers that bought with a promo code vs first time customers that didn't.
Path Analysis: Studying user journey variations. Example: comparing the navigation paths of users who purchase against those who abandon.
Funnel Analysis: Analysing step-by-step conversion stages. Example: Breaking up eCommerce Conversion Rate into View Items Per Session, Add To Carts Per View Item, Checkout Initiations Per Add To Cart, Payment Page Per Checkout Initiation, and Purchases Per Payment Page.
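The funnel decomposition in the example above multiplies back out to the overall conversion rate; here is a minimal sketch with hypothetical step counts.

```python
# Hypothetical step counts for one period.
sessions = 100_000
view_item = 60_000
add_to_cart = 12_000
checkout = 6_000
payment_page = 4_800
purchases = 3_600

steps = {
    "View Items / Session": view_item / sessions,
    "Add To Cart / View Item": add_to_cart / view_item,
    "Checkout / Add To Cart": checkout / add_to_cart,
    "Payment Page / Checkout": payment_page / checkout,
    "Purchase / Payment Page": purchases / payment_page,
}
for step, rate in steps.items():
    print(f"{step}: {rate:.1%}")

# The step rates multiply back to the overall conversion rate.
overall = 1.0
for rate in steps.values():
    overall *= rate
assert abs(overall - purchases / sessions) < 1e-9
print(f"Overall eCommerce CR: {overall:.1%}")  # 3.6%
```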
What could happen (Predictive Analytics)
The objective of Predictive Analytics is to forecast WHAT COULD happen by projecting KPIs and Key Driver Metrics forward from historical patterns.
Predictive analysis areas include:
Forecasting: Projecting KPIs and Key Driver Metrics against targets using historical trends and seasonality.
Propensity Modelling: Estimating the likelihood of user-level outcomes such as purchase or churn.
Surface predicted drivers and features that significantly impact KPIs, feeding back to the Measure and Experiment stages.
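A minimal forecasting sketch, assuming a weekly KPI series in pandas. This is a naive drift model for illustration only; real forecasting work would use proper time-series methods, and all figures below are hypothetical.

```python
import pandas as pd

# Hypothetical weekly KPI history.
history = pd.Series(
    [100, 110, 105, 120, 115, 125, 118, 130],
    index=pd.date_range("2024-01-07", periods=8, freq="W"),
)

# Naive forecast: last value plus the average weekly change (linear drift).
drift = history.diff().mean()
future_index = pd.date_range(history.index[-1] + pd.Timedelta(weeks=1),
                             periods=4, freq="W")
forecast = pd.Series(
    [history.iloc[-1] + drift * (i + 1) for i in range(4)],
    index=future_index,
)
print(forecast)
```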
What should be done (Prescriptive Analytics)
Prioritising opportunities based on potential impact, effort required, and technical feasibility.
Break down opportunities into fixes, automation options within Activate, or potential experiments as part of Experiment.
Surface New Metrics and Features that significantly impact KPIs, feeding back to the Measure stage.
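One common way to do the prioritisation above, as a sketch: an ICE-style score combining impact, effort, and feasibility. The opportunities and scores below are hypothetical.

```python
# Hypothetical opportunities scored 1-10 on each dimension.
opportunities = [
    {"name": "Fix slow product pages", "impact": 8, "effort": 3, "feasibility": 9},
    {"name": "Automate bid adjustments", "impact": 6, "effort": 5, "feasibility": 7},
    {"name": "Test new checkout flow", "impact": 9, "effort": 8, "feasibility": 6},
]

def priority_score(opp: dict) -> float:
    # Higher impact and feasibility raise the score; higher effort lowers it.
    return opp["impact"] * opp["feasibility"] / opp["effort"]

for opp in sorted(opportunities, key=priority_score, reverse=True):
    print(f"{opp['name']}: {priority_score(opp):.1f}")
```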
Experiment 🧪
Experiment's Objective
Test new ideas, hypotheses, or strategies to discover innovative ways to achieve goals or solve problems.
Focus on controlled testing to determine causality and validate assumptions before scaling.
Experiments Roadmap
Create an Experiments Roadmap based on insights from the Analyse stage, prioritising tests by expected impact and implementation complexity.
Develop an Experiment Framework including hypothesis formation, test design, sample size calculation, and success metric definition.
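For the sample size calculation, a minimal sketch for a two-proportion A/B test using the standard normal-approximation formula. The baseline rate and minimum detectable effect are hypothetical inputs.

```python
from statistics import NormalDist

def sample_size_per_variant(p_baseline: float, mde_abs: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate sample size per variant for a two-sided two-proportion test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p1, p2 = p_baseline, p_baseline + mde_abs
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / mde_abs ** 2) + 1

# Hypothetical: 3% baseline conversion rate, detect a +0.5pp absolute lift.
print(sample_size_per_variant(0.03, 0.005))  # roughly 20k users per variant
```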
Experiments Implementation
Implement the Technical Experiments Infrastructure, including A/B testing platforms, feature flags, and monitoring systems.
Execute Experiment Cycle from hypothesis to conclusion, with clear documentation of methodology, results, and learnings.
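As a sketch of the assignment piece of such an infrastructure: deterministic, hash-based bucketing so a given user always sees the same variant. The experiment name and traffic split are hypothetical.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, treatment_share: float = 0.5) -> str:
    """Deterministically bucket a user into 'control' or 'treatment'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform value in [0, 1]
    return "treatment" if bucket < treatment_share else "control"

# The same user always lands in the same bucket for a given experiment.
print(assign_variant("user-123", "new_checkout_flow"))
```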
Cross-Stage Feedback
Feed test results back into the Measure and Analyse stages, distributing insights to inform future strategies and create a continuous improvement loop.
Activate 🚀
Activate's Objective
Implement changes in existing processes, campaigns, and systems to maximise efficiency and effectiveness in achieving goals.
Roll out successful Experiment results to become new standards.
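As a sketch of one activation data flow, assuming a pandas DataFrame of scored users pulled from the warehouse; the scoring column and threshold are hypothetical, and the exported file would then be uploaded to the relevant ad or CRM platform.

```python
import pandas as pd

# Hypothetical scored users pulled from the data warehouse.
users = pd.DataFrame({
    "user_id": ["u1", "u2", "u3", "u4"],
    "churn_risk": [0.91, 0.12, 0.78, 0.40],
})

# Select the high-risk segment for a retention campaign.
audience = users.loc[users["churn_risk"] >= 0.7, ["user_id"]]
audience.to_csv("retention_audience.csv", index=False)
print(f"Exported {len(audience)} users for activation.")
```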