
Why Your Dashboard Numbers Lie (And How to Fix Reports)
I have built a lot of marketing reports over the years. Most of them were useless. They looked great on the surface — full of colorful charts, trend lines going in the right direction, professional formatting that made them look important. But nobody made better decisions because of them.

I know this because I asked the people who received them. I sat down with the CEO and the marketing director and asked a simple question: “Did last month’s report help you decide anything? Did you make any change to your strategy or your budget or your priorities based on what you saw in that report?” The answer was always no.

The reports contained plenty of data — pageviews, social media impressions, email open rates, time on site, bounce rate, and a dozen other numbers — but zero actionable insights. They reported activity without connecting it to outcomes. They made people feel informed without actually informing them. That was the moment I realized I was building dashboards for the wrong reason.
The Three Questions Every Dashboard Must Answer
I threw out my old dashboards and redesigned everything around three simple questions. Are we getting more traffic than we did last month? Are we converting a higher percentage of those visitors into customers or leads? Are we generating more revenue as a direct result of our marketing efforts? If your dashboard cannot answer these three questions clearly and immediately — if someone has to dig through sub-reports or calculate percentages manually — then your dashboard is not doing its primary job. Everything else is noise dressed up as insight. I realized that most of what I was reporting was what I call “activity metrics.” These are numbers that tell you something happened but not whether that something mattered.
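Each of the three questions reduces to a month-over-month comparison. A minimal sketch of that check — every figure and field name below is a made-up placeholder, not data from any real report:

```python
# Answer the three dashboard questions with month-over-month comparisons.
# All numbers here are illustrative placeholders.

def pct_change(current, previous):
    """Percentage change from the previous period."""
    return (current - previous) / previous * 100

last_month = {"traffic": 42_000, "conversion_rate": 2.1, "revenue": 88_000}
this_month = {"traffic": 46_200, "conversion_rate": 2.4, "revenue": 97_500}

for metric in ("traffic", "conversion_rate", "revenue"):
    delta = pct_change(this_month[metric], last_month[metric])
    direction = "up" if delta >= 0 else "down"
    print(f"{metric}: {this_month[metric]} ({direction} {abs(delta):.1f}% vs last month)")
```

If someone has to do this arithmetic by hand every time they open the report, the report is failing the test described above.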
The Vanity Metrics Trap
Activity metrics are easy to collect and look impressive on a dashboard. Total pageviews went up 15 percent. Social media impressions reached 2 million. Email open rates hit 38 percent. These numbers feel good to report and they feel good to hear. But they are dangerously misleading because they do not correlate to business outcomes in any reliable way. I worked with a team that was proudly celebrating 2 million social media impressions per month. It was the first number on their dashboard, highlighted in green with an upward arrow. When I asked how many of those 2 million impressions turned into actual website visits, the number was under 5,000 — a conversion rate of 0.25 percent from impression to visit. When I asked how many of those visits turned into customers, the number was under 50. Two million impressions produced fewer than 50 customers. That is not a success story. It is a story about measuring the wrong metric and building a dashboard that reinforces that mistake.
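The funnel arithmetic in that story is worth making explicit, because it is the calculation that vanity dashboards skip. A short sketch using the numbers from the example above:

```python
# Walk the funnel from the example: impressions -> visits -> customers.
funnel = [
    ("impressions", 2_000_000),
    ("visits", 5_000),
    ("customers", 50),
]

# Step-by-step conversion rate between adjacent stages.
for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    rate = next_count / count * 100
    print(f"{stage} -> {next_stage}: {rate:.2f}%")

# End-to-end: what fraction of impressions became customers.
end_to_end = funnel[-1][1] / funnel[0][1] * 100
print(f"impressions -> customers: {end_to_end:.4f}%")
```

Two million impressions shrinking to fifty customers is a 0.0025 percent end-to-end rate — a number no one would highlight in green with an upward arrow.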
The problem with vanity metrics is that they create a false sense of progress. When the team sees impressions going up, they feel like their strategy is working. They invest more time and money into the channels that generate the most impressions, even though those channels are not actually producing results. The dashboard is actively leading them in the wrong direction. I have seen this pattern in dozens of companies, and it almost always leads to wasted budget and missed opportunities.
My Current Dashboard: Five Numbers
After years of building bad dashboards, I now use exactly five metrics on every dashboard I build. Sessions, which tells me if our overall traffic is growing and whether our reach is expanding over time. Conversion rate, which tells me if our messaging, user experience, and calls to action are effective at turning visitors into customers. Cost per acquisition, which tells me how efficiently we are spending money to acquire each new customer. Revenue, which is the actual business outcome we are all working toward. And return on investment, which tells me whether the money we are spending on marketing is generating more value than it costs. That is it. Five numbers. Everything else — social media followers, email open rates, pageviews by channel, time on page — is a supporting detail. These secondary metrics are useful for diagnosing why something went wrong, but they do not belong on the main dashboard.
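All five numbers can be derived from a handful of raw inputs using standard formulas: conversion rate is conversions divided by sessions, cost per acquisition is spend divided by new customers, and ROI is profit over spend. A minimal sketch — the input figures are hypothetical:

```python
# Compute the five core dashboard metrics from raw monthly inputs.
# Formulas: conversion rate = conversions / sessions
#           CPA = marketing spend / new customers
#           ROI = (revenue - spend) / spend
# All input numbers below are hypothetical.

sessions = 48_000
conversions = 1_150       # visitors who became customers or leads
marketing_spend = 24_000
revenue = 92_000

conversion_rate = conversions / sessions * 100
cpa = marketing_spend / conversions
roi = (revenue - marketing_spend) / marketing_spend * 100

print(f"Sessions:        {sessions:,}")
print(f"Conversion rate: {conversion_rate:.2f}%")
print(f"Cost per acq.:   ${cpa:.2f}")
print(f"Revenue:         ${revenue:,}")
print(f"ROI:             {roi:.0f}%")
```

Everything on the main dashboard is either one of these five outputs or one of the four raw inputs that produce them; anything else is a drill-down detail.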
If your dashboard has more than ten metrics, you are including vanity numbers that make you feel busy without telling you anything useful. I recommend applying the “so what” test to every metric on your dashboard. Imagine someone says to you: “Sessions increased by 20 percent this month.” If your natural response is “so what?” — meaning you cannot immediately connect that increase to a specific action, decision, or business outcome — that metric does not belong on your primary dashboard. It might belong in a drill-down report for deeper analysis, but it should not be one of the first numbers someone sees when they look at your reporting. Removing those vanity metrics is the single fastest way to improve the usefulness of your dashboard.
How Often to Report
Different decisions need different reporting cadences. I use three time frames. Weekly, I check the five core metrics and look for anomalies. If something is significantly up or down compared to the previous week, I investigate. Maybe a campaign launched, a competitor changed their pricing, or a seasonal trend started earlier than expected. Monthly, I do a deeper analysis of channel performance — which channels are improving, which are declining, and whether the trends from last month are continuing or reversing. Quarterly, I do a full strategy review including competitive analysis, goal setting for the next quarter, and a reassessment of our overall marketing priorities based on everything we learned over the previous three months.
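The weekly anomaly check can be made mechanical: flag any core metric whose week-over-week change exceeds a threshold, and investigate only those. A sketch of that check — the 15 percent threshold and all figures are my illustrative assumptions, not values from this article:

```python
# Flag metrics whose week-over-week change exceeds a threshold.
# The 15% threshold and all figures are illustrative assumptions.

THRESHOLD_PCT = 15.0

last_week = {"sessions": 11_000, "conversion_rate": 2.2, "cpa": 21.0,
             "revenue": 23_000, "roi": 280.0}
this_week = {"sessions": 13_500, "conversion_rate": 2.1, "cpa": 20.5,
             "revenue": 22_500, "roi": 275.0}

def flag_anomalies(current, previous, threshold=THRESHOLD_PCT):
    """Return the metrics that moved more than `threshold` percent."""
    flagged = {}
    for name, value in current.items():
        change = (value - previous[name]) / previous[name] * 100
        if abs(change) > threshold:
            flagged[name] = round(change, 1)
    return flagged

print(flag_anomalies(this_week, last_week))  # → {'sessions': 22.7}
```

Here only sessions would trigger an investigation, which is exactly the point: the routine check takes seconds, and attention goes to the one metric that actually moved.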
I also learned that the format of the report matters as much as the content. I used to spend hours every month creating a twenty-page PDF report with detailed charts, analysis, and recommendations. Nobody read it. I know this because I would send it out and get zero questions or comments. Now I send a five-bullet email every Monday morning. Each bullet contains one metric, the current number, the percentage change from the previous period, and one sentence explaining what it means and whether it is a concern or a positive sign. The CEO comments on it almost every week because it takes thirty seconds to read and directly informs the decisions they are making. Simple formats get read and acted on. Complex formats get ignored, regardless of how much effort went into creating them.
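The five-bullet Monday email is simple enough to generate automatically: one metric per bullet, the current value, the change from the previous period, and a one-sentence note. A sketch — the values and notes are placeholders I invented for illustration:

```python
# Build the weekly five-bullet summary: metric, current value,
# change vs the previous period, and a one-sentence note.
# All values and notes below are invented placeholders.

metrics = [
    ("Sessions", 13_500, 11_000, "Spike from Thursday's campaign launch."),
    ("Conversion rate", 2.1, 2.2, "Within normal range; no concern."),
    ("Cost per acquisition", 20.5, 21.0, "Slightly more efficient spend."),
    ("Revenue", 22_500, 23_000, "Small dip, consistent with seasonality."),
    ("ROI", 275.0, 280.0, "Still well above our target floor."),
]

lines = []
for name, current, previous, note in metrics:
    change = (current - previous) / previous * 100
    sign = "+" if change >= 0 else ""  # negatives already carry a minus sign
    lines.append(f"- {name}: {current} ({sign}{change:.1f}% vs last week). {note}")

print("\n".join(lines))
```

The whole email is five lines, each one readable in a few seconds, which is what keeps it read and commented on.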
If you have not looked at your own dashboard recently with a critical eye, I encourage you to do it right now. Open your analytics tool, look at the default dashboard, and ask yourself honestly: does this help me make better decisions? Does it answer the three questions about traffic, conversion, and revenue? If the answer is no, start removing metrics and adding the ones that actually matter. The first time you look at a dashboard that shows only the numbers that drive your business, you will wonder why you ever tolerated all the noise.