As a PPC analyst, I have become more and more focused on tool development and reporting projects. However, as we continue to sign more clients, hire more Account Managers, and take on larger, more complex accounts, it becomes increasingly important to build scalability into these projects.

Now you might be wondering: what exactly does “scalability” mean? In this case, our definition is “able to be used across a wide range of account sizes and needs”. Specifically, the same tool should work just as well for a small business account as for an extra-large account with thousands of keywords.

So, What is an “Unscalable” Tool?

Many of our existing tools and templates rely on Google Sheets and a Supermetrics connector. While that setup is great in theory, there are some problems:

  • Data refreshing – Data accuracy and completeness are the lifeblood of any data analysis tool. Whatever method you choose to import data, you must be able to rely on it to function consistently. Manually downloading data from ad interfaces is an option, but it is tedious, time-consuming, and prone to human error (see the sketch after this list for one way an automated pull might look).
  • Data size – Google Sheets and Excel have limits on how much data they can hold and manipulate with ease. Referring back to our earlier definition of scalability, we want to be able to use a tool with an account of any size.
  • Time-consuming – How much time the user has to spend actually using the tool is an important factor. The entire point of creating a tool is to either save time or provide an insight that would be difficult or impossible to find manually. For example, if we have a tool that runs SQRs (search query reports) but requires 10 steps, a manual data refresh, and some error troubleshooting from the user every time, that’s not scalable long term. Time is our most valuable resource; let’s use it wisely!
  • Robustness – A tool that is constantly breaking (whether due to human error or computer error) makes an analyst sad. The very nature of Google Sheets & Excel makes it easy for the end-user to make changes (intentionally or not) and cause things to break.
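
To make the data-refresh point concrete, here’s a minimal sketch of what an automated pull might look like in Python. The reporting endpoint, API key, and column names are hypothetical placeholders, not a real service; the idea is simply that a script can fetch and sanity-check data on a schedule instead of relying on manual downloads.

```python
import pandas as pd
import requests

# Hypothetical reporting endpoint and API key - placeholders, not a real service.
REPORT_URL = "https://api.example-ads-platform.com/v1/reports/search_terms"
API_KEY = "YOUR_API_KEY"

def pull_report(start_date: str, end_date: str) -> pd.DataFrame:
    """Fetch a search term report as a DataFrame instead of downloading it by hand."""
    response = requests.get(
        REPORT_URL,
        params={"start_date": start_date, "end_date": end_date},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=60,
    )
    response.raise_for_status()
    return pd.DataFrame(response.json()["rows"])

def check_completeness(df: pd.DataFrame, start_date: str, end_date: str) -> None:
    """Warn if any day in the requested range is missing from the pull."""
    expected = pd.date_range(start_date, end_date, freq="D")
    actual = pd.to_datetime(df["date"]).dt.normalize().unique()
    missing = expected.difference(actual)
    if len(missing):
        print(f"Warning: missing data for {list(missing.strftime('%Y-%m-%d'))}")

if __name__ == "__main__":
    report = pull_report("2021-05-01", "2021-05-31")
    check_completeness(report, "2021-05-01", "2021-05-31")
    report.to_csv("search_terms_2021-05.csv", index=False)
```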

Project Planning for Success

To build scalable solutions, we must start at the very beginning: the planning stage.

We won’t get too in-depth in this post; if you want more general planning tips, you can see my post on the topic here.

Some things to think about during the planning stage that will impact scalability include the following:

How is the PPC tool going to be built?

Your go-to may be Excel or Sheets, but with the limitations discussed previously, now might be the time to start learning (or implementing) some code! Whether it’s Python, R, JavaScript, or some other language of your choice, it might just be a more stable, scalable solution.

Another often-overlooked yet very important aspect is how the tool works with multiple users. Our Google Sheets tools are copied for each instance where the user needs one (for each client, for example). The problem is that we end up with old, outdated versions floating out there in the ether that is Google Drive, and Account Managers end up inadvertently using broken copies. When a tool is coded and hosted online, updates are pushed live and there’s no chance of a user having an old copy.
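
As a rough illustration of the “one hosted copy” idea, here’s a minimal Streamlit sketch (the file, column names, and click threshold are just assumptions for the example): deploy it once, and every Account Manager hits the same, current version in the browser, with no spreadsheet to copy and let go stale.

```python
# app.py - a minimal hosted SQR explorer sketch (column names are assumptions).
import pandas as pd
import streamlit as st

st.title("Search Query Explorer")

uploaded = st.file_uploader("Upload a search query report (CSV)", type="csv")
if uploaded is not None:
    sqr = pd.read_csv(uploaded)
    min_clicks = st.slider("Minimum clicks", min_value=0, max_value=100, value=10)
    st.dataframe(sqr[sqr["Clicks"] >= min_clicks].sort_values("Cost", ascending=False))
```

Run it locally with `streamlit run app.py` or deploy it once; when you push an update, every user gets the new version on their next page load.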

Building with code also handles the problem of data size nicely: a few lines of code can churn through thousands of rows of data more quickly than Excel can.
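
For a sense of what that looks like, the sketch below (file and column names are assumed) rolls a large keyword-level export up into campaign-level totals; pandas handles this in seconds, well past the point where a Sheets formula would start to grind.

```python
import pandas as pd

# Assumed export with columns: campaign, keyword, clicks, impressions, cost, conversions
keywords = pd.read_csv("keyword_export.csv")

campaign_summary = keywords.groupby("campaign", as_index=False).agg(
    clicks=("clicks", "sum"),
    impressions=("impressions", "sum"),
    cost=("cost", "sum"),
    conversions=("conversions", "sum"),
)
campaign_summary["cpa"] = campaign_summary["cost"] / campaign_summary["conversions"]
campaign_summary["ctr"] = campaign_summary["clicks"] / campaign_summary["impressions"]
print(campaign_summary.sort_values("cost", ascending=False).head(10))
```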

Investing in Data Storage Solutions

One of the biggest challenges we have faced is data storage. BigQuery is an inexpensive solution, and many businesses also use services like Funnel to pipe in and store data.
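
If you go the BigQuery route, the official google-cloud-bigquery Python client makes both loading and querying fairly painless. A rough sketch follows; the project, dataset, and table names are placeholders, and authentication via a service account key is assumed.

```python
import pandas as pd
from google.cloud import bigquery

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service account key file.
client = bigquery.Client(project="my-ppc-project")  # placeholder project ID

# Load a local export into a table (schema is inferred from the DataFrame).
df = pd.read_csv("keyword_export.csv")
client.load_table_from_dataframe(df, "ppc_data.keyword_export").result()

# Later, pull an aggregated slice back out for a tool or report.
query = """
    SELECT campaign, SUM(cost) AS cost, SUM(conversions) AS conversions
    FROM `my-ppc-project.ppc_data.keyword_export`
    GROUP BY campaign
    ORDER BY cost DESC
"""
summary = client.query(query).to_dataframe()
print(summary.head())
```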

It may help to think about a tool in different functional pieces: data pipeline, storage, manipulation, and visualization. This way you can tackle each piece separately and the project becomes more manageable.
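
That separation can be mirrored directly in how the code is organized. A bare-bones skeleton might look like the following, where each piece can be swapped out on its own (the function bodies, file names, and column names are purely illustrative):

```python
import pandas as pd

def extract() -> pd.DataFrame:
    """Data pipeline: pull raw data from an API, export, or warehouse."""
    return pd.read_csv("raw_export.csv")  # placeholder source

def store(df: pd.DataFrame) -> None:
    """Storage: persist the raw pull so downstream steps never re-fetch it."""
    df.to_csv("raw_snapshot.csv", index=False)  # could just as easily be BigQuery

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Manipulation: custom metrics, segmentation, filtering."""
    out = df.copy()
    out["cpa"] = out["cost"] / out["conversions"]  # assumed column names
    return out

def visualize(df: pd.DataFrame) -> None:
    """Visualization: hand a clean table to Data Studio, a chart, or a dashboard."""
    df.to_csv("report_ready.csv", index=False)

if __name__ == "__main__":
    raw = extract()
    store(raw)
    visualize(transform(raw))
```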

Scaling an Existing Tool or Report

Sometimes you don’t have the time or resources to completely rebuild a tool or report, but you still need something functional. In that case, there are likely some improvements to be made!

Step 1: Locate any slow-downs or breakages

Do things freeze up due to complex formulas or large data sets? Personally, my workflow generally entails writing some long, complicated formulas to get the data to do what I want, and then going back later to streamline things where possible.

Step 2: Reduce Opportunity for Human Errors

I have found that the more direct interaction the user has with a report based in Google Sheets, the more room there is for human error. Accidentally deleting formulas, overwriting cells, and changing formatting are all very common.

A short-term fix can be as simple as locking cells for editing (or showing a warning when they’re edited). Swapping out complicated formulas for simpler ones is also a great idea here.
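
For the cell-locking fix, the Google Sheets API exposes protected ranges directly. Below is a rough sketch using the official Python client; the spreadsheet ID, sheet ID, row range, and service account key file are all placeholders you would swap for your own.

```python
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

# Placeholder service account key with access to the spreadsheet.
creds = Credentials.from_service_account_file(
    "service_account.json",
    scopes=["https://www.googleapis.com/auth/spreadsheets"],
)
service = build("sheets", "v4", credentials=creds)

request_body = {
    "requests": [
        {
            "addProtectedRange": {
                "protectedRange": {
                    "range": {
                        "sheetId": 0,          # first tab (placeholder)
                        "startRowIndex": 0,
                        "endRowIndex": 50,     # rows holding formulas (placeholder)
                    },
                    "description": "Formula block - edit with care",
                    "warningOnly": True,       # warn instead of hard-locking
                }
            }
        }
    ]
}

service.spreadsheets().batchUpdate(
    spreadsheetId="YOUR_SPREADSHEET_ID", body=request_body
).execute()
```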

Step 3: Look into optimizing one of the following: data import, storage, manipulation, visualization

Pick one step of the process where there is room for improvement. As an example, we’ve recently been building a new Data Studio report for a client. They are running on multiple platforms and accounts, and need to segment by different product lines. Rather than pulling the raw data into Data Studio and building custom metrics and segmentation there, we decided to do that work in the data file before pulling it into Data Studio. This lets Data Studio function as the visualization layer, which is where it shines, without the extra frustration of wrangling the data into a usable structure.
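
As a hedged sketch of that pre-processing step (the file names, column names, and product-line mapping are all assumptions), the heavy lifting happens in pandas, and Data Studio simply reads the finished table:

```python
import pandas as pd

# Assumed per-platform exports sharing columns: campaign, cost, conversions, revenue.
google_ads = pd.read_csv("google_ads_export.csv")
microsoft_ads = pd.read_csv("microsoft_ads_export.csv")
google_ads["platform"] = "Google Ads"
microsoft_ads["platform"] = "Microsoft Ads"
combined = pd.concat([google_ads, microsoft_ads], ignore_index=True)

# Segment by product line based on campaign naming (the mapping is an assumption).
def product_line(campaign_name: str) -> str:
    name = campaign_name.lower()
    if "widget" in name:
        return "Widgets"
    if "gadget" in name:
        return "Gadgets"
    return "Other"

combined["product_line"] = combined["campaign"].apply(product_line)

# Custom metrics are computed once here instead of being rebuilt inside Data Studio.
report = combined.groupby(["product_line", "platform"], as_index=False).agg(
    cost=("cost", "sum"),
    conversions=("conversions", "sum"),
    revenue=("revenue", "sum"),
)
report["cpa"] = report["cost"] / report["conversions"]
report["roas"] = report["revenue"] / report["cost"]

# Data Studio then points at this clean, pre-segmented file (or a BigQuery table).
report.to_csv("datastudio_source.csv", index=False)
```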