Agency · 14 October 2025 · 8 min read

Digital Marketing ROI: How to Actually Measure It

Why most reported ROI is wrong, the MER approach (total revenue / total ad spend), how we baseline restaurants before running ads, and what we give clients vs what we use internally.

By Jay

Most reported digital marketing ROI is wrong. Not falsified, just structurally flawed in ways that make the numbers useless for decision-making. Understanding where the measurement fails helps you ask better questions of your agency and make better decisions with your budget.

Here is how attribution actually works, where the models break down, and the approach we use with clients to get numbers that are genuinely actionable.

Why Most Reported ROI Is Wrong

The standard approach to ROI measurement counts every sale that touched an ad as a sale caused by the ad. This is the attribution model problem, and it affects every platform's reporting.

Meta reports conversions based on its default 7-day click, 1-day view attribution window. A customer who clicks your ad and buys a week later counts as a Meta conversion. So does a customer who saw your ad, did nothing, searched for your brand on Google three days later, and bought through the organic search result. Meta claims both.

Google counts the same customer through the Google Ads attribution model. If that customer also clicked a Google ad at any point in the path, Google claims credit for the conversion too.

The result is that if you add up the conversions claimed by Meta, Google, and any other paid channel, the total is often larger than your actual sales volume. Two channels claiming the same customer is not unusual. Three is not rare. The platforms are not lying exactly. They are each reporting within their own attribution logic. But the combined picture is misleading.

The MER Approach

Media Efficiency Ratio (MER) cuts through this problem by stepping back from platform-level attribution entirely.

MER is: total revenue divided by total marketing spend.

If a restaurant does $40,000 in weekly revenue and spends $1,800 on ads, the MER is 22.2. Every dollar of ad spend produced $22.20 in revenue.
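The calculation is simple enough to sketch in a few lines of Python. This is just the definition above applied to the restaurant figures; the function name is my own:

```python
def mer(total_revenue: float, total_ad_spend: float) -> float:
    """Media Efficiency Ratio: total revenue / total marketing spend."""
    if total_ad_spend <= 0:
        raise ValueError("ad spend must be positive")
    return total_revenue / total_ad_spend

# The restaurant example from above: $40,000 weekly revenue, $1,800 ad spend.
print(round(mer(40_000, 1_800), 1))  # → 22.2
```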

This number does not require tracking individual conversions. It does not depend on any platform's attribution window. It does not get inflated by double-counting. It is the actual ratio of output to input across the whole business.

MER is not perfect. It does not tell you which channel is working better than another, which campaign drove the growth, or whether some of the revenue would have come anyway without the ads. Those questions require more granular analysis. But MER is honest in a way that in-platform ROAS reporting often is not, and for most businesses it is the right first number to track.

How We Baseline Restaurants Before Running Ads

The baseline methodology is central to how we measure campaign impact at Adelaide Socials, especially for our Accelerator restaurant clients.

Before any campaign launches, we access the restaurant's POS data for the prior 8 to 12 weeks. We calculate weekly revenue across that period, excluding obvious outliers, such as a weekend inflated by a private event. The result is a baseline: what the restaurant earns in a normal week without our involvement.

Once campaigns are live, we compare weekly revenue to that baseline. The delta above baseline is the revenue we attribute to the campaign effort. We calculate MER on that delta rather than on total revenue, which gives a more conservative and more honest picture of what the campaigns contributed.
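A minimal sketch of that arithmetic, assuming a simple mean baseline with outlier weeks dropped by index. The revenue figures and function names are illustrative, not client data:

```python
from statistics import mean

def baseline(weekly_revenue: list[float], outliers: set[int] = frozenset()) -> float:
    """Average weekly revenue over the pre-campaign window, skipping
    outlier weeks (e.g. a weekend inflated by a private event)."""
    kept = [r for i, r in enumerate(weekly_revenue) if i not in outliers]
    return mean(kept)

def delta_mer(week_revenue: float, base: float, ad_spend: float) -> float:
    """MER computed on the above-baseline delta only, which is more
    conservative than MER on total revenue."""
    return max(week_revenue - base, 0.0) / ad_spend

# Hypothetical 8-week pre-campaign window; week index 5 had a private event.
pre = [31_000, 29_500, 30_200, 30_800, 29_900, 41_000, 30_400, 30_100]
base = baseline(pre, outliers={5})
print(round(base))                                # baseline weekly revenue
print(round(delta_mer(40_000, base, 1_800), 2))   # MER on the delta only
```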

Some revenue growth after campaign launch is attributable to seasonal factors, word of mouth, or organic growth that would have happened anyway. The baseline methodology cannot eliminate this entirely. It does control for it as well as any non-experimental measurement approach can.

Incremental ROAS vs Blended ROAS

Blended ROAS is total revenue attributed to a channel divided by spend in that channel. It is the number Meta and Google show in their dashboards.

Incremental ROAS asks a harder question: how much of that revenue would have happened anyway without the ad? The difference between these two numbers is the real marketing contribution.
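The gap between the two numbers can be made concrete. In this sketch, the baseline conversion share is an estimate you would get from a holdout test or lift study; the 40% figure is a made-up illustration, not a benchmark:

```python
def blended_roas(attributed_revenue: float, spend: float) -> float:
    """What the platform dashboard shows: attributed revenue / spend."""
    return attributed_revenue / spend

def incremental_roas(attributed_revenue: float, spend: float,
                     baseline_conversion_share: float) -> float:
    """Strip out the share of revenue that would have converted anyway
    (estimated from a holdout test or lift study)."""
    incremental_revenue = attributed_revenue * (1 - baseline_conversion_share)
    return incremental_revenue / spend

# Hypothetical: the dashboard claims $18,000 from $1,800 spend,
# but a holdout test suggests 40% of that revenue was not incremental.
print(blended_roas(18_000, 1_800))            # → 10.0
print(incremental_roas(18_000, 1_800, 0.40))  # → 6.0
```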

True incrementality measurement requires controlled experiments: running A/B holdout tests at the market level, or using Meta's own Conversion Lift studies. These are not technically complex to run, but they require discipline and a willingness to withhold ads from the holdout group, which most businesses find uncomfortable.

For clients without the volume to run reliable holdout tests, we use MER trends over time as a proxy. If MER improves consistently when campaign spend is active and deteriorates when it is not, the campaigns are contributing incrementally to revenue. If MER is stable regardless of spend, the campaigns may be capturing intent that would convert anyway through other channels.
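One crude version of that proxy check, comparing average weekly revenue in spend-active weeks against spend-off weeks. A ratio well above 1.0 is consistent with incremental contribution; a ratio near 1.0 suggests the ads may be capturing demand that converts anyway. The data and threshold are illustrative assumptions:

```python
from statistics import mean

def revenue_lift_signal(weeks: list[tuple[float, float]]) -> float:
    """Crude incrementality proxy from (revenue, ad_spend) weekly pairs:
    average revenue in spend-active weeks / average revenue in spend-off
    weeks. Not a controlled experiment, just a directional signal."""
    on = [rev for rev, spend in weeks if spend > 0]
    off = [rev for rev, spend in weeks if spend == 0]
    if not on or not off:
        raise ValueError("need both active and inactive weeks to compare")
    return mean(on) / mean(off)

# Hypothetical mix of campaign-on and campaign-off weeks.
weeks = [(40_000, 1_800), (41_000, 1_800), (30_500, 0), (31_000, 0)]
print(round(revenue_lift_signal(weeks), 2))  # → 1.32
```

This ignores seasonality and trend, which is why the article treats it as a proxy rather than a measurement.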

What We Report to Clients vs What We Use Internally

The reporting we give clients is deliberately simple. Weekly revenue, ad spend, MER, and the above-baseline delta for Accelerator clients. That is the core table. It fits on one page and answers the only question that matters: is this working?

We also include campaign-level performance: reach, click-through rates, cost per click, and landing page conversion rates. These are the operational metrics that tell us whether specific campaigns need adjustment. But we do not present these as the primary measure of success to clients, because they are not. A campaign can have a high CTR and produce no bookings. A campaign can have a low CTR and pack a restaurant. The in-platform metrics tell us what to adjust. Revenue tells us whether the adjustment mattered.

Internally, we look at a broader set of signals. Platform attribution by channel, by campaign, and by audience segment helps us understand which levers to pull. Creative performance across formats and hooks informs the brief for the next shoot. Search term reports and quality scores shape the Google Ads structure. But these are operational tools, not client-facing proof of performance.

The One-Number Summary

If someone asks how your digital marketing is performing and you have to give them one number, make it MER. Revenue divided by spend. Track it weekly. Compare it to the period before campaigns launched.

If it improves over the first 8 to 12 weeks of a campaign, the work is contributing something real. If it does not move, something in the campaign, the offer, the landing page, or the attribution baseline needs to be examined.

The question "is my marketing working?" deserves a better answer than a platform dashboard screenshot. MER gives you one. Get in touch if you want to talk through how we measure performance for your business, or see how the Accelerator tracks results for restaurant clients.

Tags: ROI, MER, attribution, measurement, digital marketing