Your CRO tool might be grading its own homework

Updated April 3, 2026 · 5 min read

Your CRO tool might actually grade its own homework. The cost-per-order pricing model and the "risk-free, pay only for results" framing sound great until the moment you realize that you don't set the rules. The vendor decides what counts as a conversion. The vendor sets the attribution window. The vendor builds the dashboard you look at. And the vendor charges you a percentage of whatever number ends up on that dashboard. The structural incentive points in exactly one direction — toward making the number bigger — and the merchant is the only party in the arrangement with any reason to question whether the number is honest.

This is a sibling argument to the broader piece on attribution math, which works through what attributed revenue actually proves and how holdout tests reveal the gap between attributed and incremental. This piece is narrower. It is about the incentive misalignment that shows up specifically when the vendor of the tool also sets the rules for measuring the tool. The broader piece argues that attributed revenue and incremental revenue are not the same number. This piece argues that when the vendor controls both the definition of attributed and the commission tied to it, the gap between attributed and incremental is the vendor's main P&L lever, and the vendor will always pull it in the direction the merchant doesn't want.

The magician's misdirection

The mental image that fits is a magician inviting you to look at the brilliant green numbers going up, while quietly collecting inflated commissions built on irrationally long attribution windows behind the curtain. The dashboard shows you the part you were meant to see — the topline attributed revenue, the trend chart, the percentage lift over the prior period. The mechanics underneath that number — what counted as a click, how long the cookie lived, what other touchpoints got overwritten by the last-touch credit, what kind of orders got included or excluded — those mechanics are usually a footnote in the contract, sometimes not even written down, almost never visible in the dashboard interface itself.

The merchant's relationship to the dashboard is one of trust. The merchant doesn't have access to the raw event data that produced the number, or if they technically do, they don't have the analytics infrastructure to reproduce the calculation independently. So the merchant looks at the number, sees that it's growing, and pays the commission. The vendor, knowing exactly which knobs in the calculation produce the most growth in the dashboard number, has every reason to keep those knobs turned in the direction that maximizes their commission. There is no malice in this. It is just where the incentive points when the vendor sets the rules and the merchant pays the commission.

A concrete scenario

A visitor comes to a store, browses a product page, doesn't buy. The recovery page surfaces, the visitor clicks one of the recommended products, browses for another minute, doesn't buy, leaves. Two weeks later, the same visitor sees an Instagram ad from the same brand, clicks through, buys a gift card for a friend, completes checkout. The 30-day attribution cookie from the original recommendation click is still active. The recovery tool's dashboard credits the gift card purchase as an attributed order, claims a percentage of the gift card revenue as commission, and the merchant pays.

Nothing in that scenario is technically dishonest. The cookie was real. The click did happen. The visitor was the same person across both sessions. Each individual link in the chain is auditable. What is dishonest is the implication that the recommendation click two weeks earlier caused the gift card purchase. The Instagram ad caused it. The gift card was for a friend, which means it had nothing to do with whatever product the visitor had clicked on the recovery page. The connection between the two events is the same person being on the internet, which is not a meaningful causal connection when the question is "did this tool drive incremental revenue."

The honest accounting of that order would credit the Instagram ad and call the recovery tool's contribution close to zero. The vendor's accounting credits the recovery tool because the cookie was still alive and the cookie window is the rule the vendor set. The commission is collected on the order. If the merchant were running this calculation themselves, with control over the attribution window, they would set the window much shorter — most considered purchases that are causally tied to a click happen within hours, not weeks — and the dashboard number would shrink dramatically. The vendor knows this and does not set the window short, because a shorter window means less commission collected every month.
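The window rule driving that scenario is small enough to sketch. Here is a minimal, hypothetical version of the check — the function name and timestamps are illustrative, not any vendor's actual implementation — showing how the same click-and-order pair flips from credited to not credited purely on the window the vendor chose:

```python
from datetime import datetime, timedelta

def is_attributed(click_time: datetime, order_time: datetime,
                  window: timedelta) -> bool:
    """Credit the order to the tool if it falls inside the cookie window."""
    return timedelta(0) <= order_time - click_time <= window

# The scenario above: a recommendation click, then a gift-card
# purchase driven by an Instagram ad two weeks later.
click = datetime(2026, 3, 1, 14, 0)
order = click + timedelta(days=14)

# Vendor's rule: 30-day window -> the order is credited and commissioned.
print(is_attributed(click, order, timedelta(days=30)))   # True

# A merchant-set rule: 24-hour window -> the same order is not credited.
print(is_attributed(click, order, timedelta(hours=24)))  # False
```

Nothing else about the data changes between the two calls; only the window does, and the window is the one parameter the merchant never gets to set.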

A second scenario, even more obvious

A different visitor sees the recovery page, doesn't click anything, leaves. A week later, the visitor types the brand's name directly into Google, clicks the first organic result, and buys a product they had been considering for a month. The recovery tool, depending on how it counts impressions versus clicks, may or may not credit this order. In some implementations, the impression alone is enough to attribute the order to the recovery page if the order falls within the cookie window. In other implementations, only a click counts. Both implementations are valid choices in the abstract; both produce dashboard numbers that don't reflect the truth of what caused the purchase.
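The impression-versus-click distinction is likewise a single toggle in the attribution logic. A hypothetical sketch (the field names and rule are illustrative) makes the divergence concrete:

```python
from datetime import datetime, timedelta

def credits_order(events, order_time, window, require_click=True):
    """Credit the order if any qualifying touchpoint is inside the window."""
    kinds = {"click"} if require_click else {"click", "impression"}
    return any(
        e["kind"] in kinds and timedelta(0) <= order_time - e["time"] <= window
        for e in events
    )

# The scenario above: the visitor only saw the recovery page (an
# impression, no click), then bought a week later via branded search.
events = [{"kind": "impression", "time": datetime(2026, 3, 1, 10, 0)}]
order_time = datetime(2026, 3, 8, 10, 0)
window = timedelta(days=30)

# Impression-based rule: the order is credited (and commissioned).
print(credits_order(events, order_time, window, require_click=False))  # True

# Click-only rule: the same order is not credited.
print(credits_order(events, order_time, window, require_click=True))   # False
```

Same visitor, same order, two different dashboard numbers — the choice of `require_click` is the vendor's, and it is worth a real percentage of every month's invoice.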

The order in this scenario was caused by branded organic search, which is itself one of the most-discussed examples of attribution overcounting — Google reports a healthy ROAS on branded keywords because the visitors who would have arrived through organic instead arrive through the paid ad, and the ad gets credit for revenue that would have happened either way. When a recovery tool layers on top of that, claiming credit for orders driven by branded search that happened to land within the recovery cookie window, the attribution becomes second-order overcounting on top of an already-overcounted channel. The merchant ends up paying commission on revenue that two different vendors are independently claiming, neither of which actually moved the conversion.

Nobody is technically lying

The funny thing about all of this is that nobody is technically lying. The cookie was real. The click did happen. The order did happen. The two events fell within the window. The dashboard reports the resulting number accurately given the rules the vendor set. Each vendor in the chain is following the rules of their own tool, the rules are written down somewhere, the merchant signed a contract that incorporated those rules. From inside the system, every number is defensible.

What the merchant gets to do, if the merchant is paying attention, is notice that the rules add up to a calculation the merchant would never have set up if the merchant had been the one writing them. The cookie window is longer than any honest causal claim could support. The impression-attribution rules are generous. The exclusion criteria — orders that wouldn't have been counted under a stricter definition — are loose. Each individual rule is defensible in isolation; the combination is the vendor's commission lever, and the lever points in one direction. So next time you see "attributed revenue" on a dashboard, the question worth asking is who defined attribution, and who gets paid more when that number is higher. The answer is almost always the same.

What a fair model looks like

The fairest pricing arrangement is the one where the dashboard number doesn't change the vendor's revenue. A flat monthly subscription, set at a price the merchant evaluates against the value of the tool, removes the incentive to inflate the dashboard. The vendor still has every reason to make the tool work — if it doesn't work, the merchant cancels — but the vendor doesn't have an incentive to game the attribution rules because the attribution number isn't what gets billed. The merchant can read the dashboard with the assumption that the number is calibrated to be useful for understanding the tool's contribution, not to be useful for justifying the next invoice.

The conservative attribution choices that go with flat pricing tend to be deliberately undersized. Click-based rather than impression-based. Short windows rather than long ones. No overwriting of other channels' attribution if the visitor had a more recent touchpoint elsewhere. The dashboard number that comes out of those choices is smaller than what an aggressive commission-based vendor would report on the same traffic. That's the point — the small number is closer to the incremental number, and the merchant can plan unit economics against it without having to mentally discount the dashboard before doing the math.
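Those conservative choices compose into a rule that is easy to state in code. This is a hypothetical sketch — channel names, the touchpoint log, and the 24-hour window are illustrative assumptions, not a real product's logic — showing a last-touch check that declines credit when another channel had a more recent touchpoint:

```python
from datetime import datetime, timedelta

# Hypothetical touchpoint log for one visitor; channel names are illustrative.
touchpoints = [
    {"channel": "recovery_page", "kind": "click", "time": datetime(2026, 3, 1, 14, 0)},
    {"channel": "instagram_ad",  "kind": "click", "time": datetime(2026, 3, 15, 9, 30)},
]
order_time = datetime(2026, 3, 15, 9, 40)

def conservative_credit(touchpoints, order_time,
                        tool="recovery_page",
                        window=timedelta(hours=24)) -> bool:
    """Credit the tool only if its click is the most recent touchpoint
    and falls within a short window before the order."""
    clicks = [t for t in touchpoints
              if t["kind"] == "click" and t["time"] <= order_time]
    if not clicks:
        return False
    last = max(clicks, key=lambda t: t["time"])
    return last["channel"] == tool and order_time - last["time"] <= window

# The gift-card order from earlier: the Instagram ad was the last touch,
# so the conservative rule declines credit.
print(conservative_credit(touchpoints, order_time))  # False
```

An aggressive 30-day cookie would claim this order; the conservative rule hands it to the channel that actually moved the conversion, which is exactly why the resulting dashboard number is smaller and closer to incremental.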

The trade is that the smaller dashboard number doesn't impress as much in vendor sales conversations. A vendor with a $50,000 attributed revenue number will look better in a side-by-side than a vendor with a $20,000 number on the same store, even if the holdout test would reveal the $20,000 number to be closer to the truth. The merchants who get this trade right are the ones who run the holdout test or who at least mentally separate "the number on the dashboard" from "the revenue I would actually lose if I uninstalled the tool." Those two numbers are never the same, and the gap between them is mostly the vendor's pricing strategy.

The closing read

The pattern to look for, when you're evaluating any CRO tool that prices on commission of attributed revenue, is the relationship between who makes the rules and who pays the bill. If the same party does both, the rules will drift in the direction of the bill, slowly and with full plausible deniability, and the merchant will be the last to notice. The honest version of the arrangement separates the two. The vendor can still build the dashboard, but the vendor's revenue should not move when the dashboard number does. Once you remove that incentive, attribution becomes a measurement question instead of a pricing lever, and the conversation about what counts as a conversion gets a lot more boring and a lot more useful at the same time.

Recover missed product discovery.

Free Starter plan. 7-day trial on paid plans. No credit card. Native theme integration. Honest attribution.