There is no denying that the appeal of digital marketing is the metrics. In many cases the data is as near to immediate as any manager needs. Good data lets you make decisions more easily and scale your efforts as needed.
Every wealth comes with a downside, though. It is possible to be too reliant on data, to become effectively data blind. I mention reliance first because it is an issue of focus: all one sees is the data, not what the data fails to show.
An over-reliance on the immediate, accessible data in front of your face occludes your view of the fields before you.
But isn’t data good?
I won’t argue that data is useless; far from it. As an analyst, I rely on data heavily to diagnose problems and to forecast. But data is only as good as your focus, because data itself is neutral: it doesn’t do anything on its own.
You can use data to make good decisions but data can’t make good decisions for you.
As an industry, and in marketing ourselves, we don’t do ourselves any favors either. Awards and self-congratulation lean on the nerd/geek aspect of the job. We play up the tables, graphs, and reports, heavily emphasizing the data-driven side while neglecting the thinking side.
In effect, the data becomes the answer to everything. This portrayal can hurt you in the long term when it comes to any complex problem: naive observers will attribute your inability to solve it to an inability to access data and do your job. It is easy to buy into this ourselves. By tapping into the data-driven mythos, we double down on the data itself rather than treating it as the tool it is.
This leads to two opposing sides. One side of the spectrum latches on too heavily to a single data point; the opposing side continually seeks more data and reports to compensate for a lack of focus. I’d imagine every individual tends to lean one way or the other. Knowing which is at least half the battle; the other half is keeping your perspective centered.
Too Singularly Focused
You will get nowhere by focusing too heavily on a single metric or group of metrics. Don’t take this as a reason to neglect KPIs; identifying the metrics that define success is one of the best paths to it. On the other hand, the wealth of data makes it easy to rely on a single metric, KPI or not, at the expense of others. It’s much simpler to blame your problems on CTR when the actual problem is your sales process and its impact on your conversion rate. Availability, accessibility, and ease of use can lead you astray.
This singular focus is often a symptom of micromanaging. Rather than ask, “What are we trying to accomplish?” the manager picks something from the wealth of data and starts analyzing it instead. Rather than focusing on growing revenue, alarms start popping up, such as “CPCs are down 3.5% week over week” or “CTR has dipped to 1.28% after sitting steady at 1.34% over the last 30 days,” followed by a rally to address the threat.
Of course, either of these instances could indicate something; who am I to tell you they don’t? You’ll get lucky now and then, but when this becomes the day-to-day focus, account health ultimately suffers. Instead of attention going to the metrics that matter, questions keep surfacing about these incidental fluctuations. Since the questions aren’t grounded in any hypothesis or reasoning, they often end up as time sinks, unanswerable from the data.
Sometimes CTRs vary; some days you get 4 conversions, and some days you get 6. There will always be variance in your data. Performance marketing does not mean you must manage the details on a day-to-day basis; the samples are usually just too small.
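To put some numbers behind that, here is a minimal sketch of a two-proportion z-test applied to the CTR dip from the earlier example. The impression and click counts are invented for illustration; the point is that at typical volumes, a move from 1.34% to 1.28% is statistically indistinguishable from noise.

```python
import math

# Hypothetical volumes, invented for illustration:
# ~20,000 impressions per period with CTRs of 1.34% and 1.28%.
impressions_a, clicks_a = 20_000, 268   # 1.34% CTR (the "steady" period)
impressions_b, clicks_b = 20_000, 256   # 1.28% CTR (the "dip")

p_a = clicks_a / impressions_a
p_b = clicks_b / impressions_b

# Two-proportion z-test: is the dip distinguishable from noise?
p_pool = (clicks_a + clicks_b) / (impressions_a + impressions_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
z = (p_a - p_b) / se

# Two-sided p-value from the normal CDF
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"CTR A: {p_a:.2%}  CTR B: {p_b:.2%}")
print(f"z = {z:.2f}, p = {p_value:.2f}")
# z ≈ 0.53, p ≈ 0.60 — nowhere near significant, so rallying to
# "address the threat" would be chasing ordinary variance.
```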
Focusing Too Broadly
The other end of the spectrum is marked by a lack of cohesiveness. While the singular focus results in myopic blindness, this end’s focus is so diffuse as to be ineffective. There are two root causes. One is a perversion of the singular focus: “I’ll pull every report until I figure out what X means.” The other is “I have all this data; the answer is in here somewhere.”
Both are similar in their own ways, but the first circles the issue, attempting to steel oneself against the threat. As with the problems in the previous section, this leads to lots of wasted time with nothing to show for it. The second form is a combination of naiveté and lack of perspective: overconfidence in certain reports paired with little sense of the reasoning behind them. It is the old aphorism at work: when all you have is a hammer, every problem looks like a nail.
Rather than home in on what the issue might be, it can be more comfortable to pull reports that have worked in the past. I don’t mean pulling old data, but running the same types of reports one has seen work before. For instance, performance is off, and immediately the geographic, auction insights, and site flow reports are pulled. Each of these is useful, but when it comes to defining a problem they often do little beyond burning time and creating the illusion of investigation.
Take the geographic report: unless you rely heavily on local business, geographic data often means little when it comes to driving real results. Sure, you might see a slightly higher CPA in Alabama, but that is almost never the actual problem in your account. You could adjust modifiers state by state and see, at best, incremental gains. In many cases segments like these make up such a small share of your total data that continually poring over them for an answer will lead you nowhere. The busywork serves as cover for the actual problem: it takes time but has little long-term impact.
The Middle Path
By now I’m sure you’ve realized that this, like any other dichotomy, will advocate some form of middle path. How you reach that path is the problem, and unfortunately I don’t have a clear answer for you either. Part of it relies on experience, part on the account, and the rest on your system of analysis.
If there were one definite lesson here, it would be that data itself is neutral: it is an indicator, not a cause. Tying this back to the industry, that is why you often hear doubts about the utility of quality score, and why Google themselves advise against optimizing for it.
Of course a high quality score is good and correlates with performance, but it has no causal link to performance. By managing an account to optimize quality score, you are accounting for a secondary indicator while ignoring the primary causes of poor performance. You’re taking stabs at the symptoms while not addressing the root cause.
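To make the indicator-versus-cause distinction concrete, here is a toy simulation. The relevance model and every number in it are invented for illustration: one hidden factor drives both quality score and conversion rate, so the two correlate, yet forcing the score up changes nothing.

```python
import random

random.seed(42)

# Toy model: underlying ad relevance drives BOTH the quality score
# and the conversion rate, so they correlate without either causing
# the other.
ads = []
for _ in range(1_000):
    relevance = random.random()                     # hidden root cause
    ads.append({
        "quality_score": round(1 + 9 * relevance),  # 1-10 scale
        "conv_rate": 0.01 + 0.04 * relevance,       # 1% to 5%
    })

# "Optimizing" the secondary indicator directly:
for ad in ads:
    ad["quality_score"] = 10    # the indicator is maxed out...

avg_conv = sum(ad["conv_rate"] for ad in ads) / len(ads)
print(f"Average conversion rate after maxing QS: {avg_conv:.2%}")
# Still ~3% — untouched, because relevance (the cause) never changed.
```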
If we can’t rely on data alone, what are we to do? Easy: shift your focus to perspective.
What is the strategy, what are the goals, and what tests are you running?
With a wealth of options and tasks you could take on, it is very easy to fall into a productivity trap. You can continually work through an ever-growing number of tasks and questions, or you can focus on effectiveness instead.
Sometimes you have to step back and think before you act. Stepping back and thinking through a problem isn’t a nice-to-have; it is part of the job. Our field offers us many levers, but PPC wisdom, as pretentious as that sounds, is learning that not every lever needs to be pulled.