Case Study: Rockerbox Marketing Attribution
Thinking beyond spreadsheets to offer true insights and recommendations
Rockerbox offered the most advanced marketing data, but in the same old format. After talking with users, researching competitors, and analyzing our data, I discovered our customers weren’t after raw numbers dumped into a vast sea of table cells, but the key insights and specific recommendations only Rockerbox could provide.
Rockerbox retrieves marketing data from third-party advertising platforms such as Facebook, Google, and TikTok, aggregates that data, processes it to remove redundancy, and then presents various reports meant to help customers understand how best to allocate their marketing budgets. Instead of compiling marketing data across multiple platforms and generating their own reports, users can view all their marketing data within a single interface.
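To make that pipeline concrete, here’s a minimal sketch of the aggregation and de-duplication step. Everything here is hypothetical (the field names, IDs, and data shape are invented); the real Rockerbox pipeline is far more involved.

```python
from collections import defaultdict

def aggregate_spend(platform_rows):
    """Merge per-platform ad rows into one view keyed by (date, channel),
    dropping rows reported redundantly by more than one platform."""
    seen_ids = set()
    totals = defaultdict(lambda: {"spend": 0.0, "conversions": 0})
    for row in platform_rows:
        if row["id"] in seen_ids:  # skip redundant records
            continue
        seen_ids.add(row["id"])
        key = (row["date"], row["channel"])
        totals[key]["spend"] += row["spend"]
        totals[key]["conversions"] += row["conversions"]
    return dict(totals)

rows = [
    {"id": "fb-1", "date": "2023-04-01", "channel": "Facebook", "spend": 120.0, "conversions": 6},
    {"id": "gg-1", "date": "2023-04-01", "channel": "Google",   "spend": 90.0,  "conversions": 4},
    {"id": "fb-1", "date": "2023-04-01", "channel": "Facebook", "spend": 120.0, "conversions": 6},  # duplicate
]
print(aggregate_spend(rows))
```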
Rockerbox: The Spreadsheet Emulator
The Rockerbox UI delivered marketing reports in a traditional tabular format reminiscent of spreadsheets, enhanced with complex settings and filters to help users winnow the data down to something relevant to their needs. When Rockerbox hired me as their first designer, management’s directive was to streamline the filter interface and make it easier to build and save custom reports. But after talking with customers and our customer success team to better understand the filters, I discovered end users were often at a loss when it came to interpreting all the data in front of them.
User Research: Expectations vs. Reality
Our initial user research efforts consisted of remote contextual inquiry—observing users and asking questions as they navigated the Rockerbox UI while performing everyday tasks relevant to their jobs. Users shared their screens and their thoughts, while we used Grain to record the calls, create transcripts, and generate AI-driven summaries to help us identify and catalog key takeaways.
What did we find out? Broadly:
- Users were unclear which report to consult to answer specific questions
- Users often had to manually compile data from multiple reports to accomplish their goals
- Even with the right report and advanced filters, drilling down to a meaningful dataset and getting to the “why” was challenging
- There was a disconnect between what customers expected the product to do and what the product actually did
That final bullet proved the most telling: some customers felt the sales process had pitched Rockerbox as providing not just a window into their data, but insights with a clear path forward, something the product simply didn’t deliver.
Supplementing Raw Data with Insight
Rockerbox had a traditional analytics UI: pre-built reports consisting of tabular data, graphs, and filters. The user was responsible for configuring the filters to drill down to relevant datasets and derive their own insights. This was no different from most other analytics platforms.
One of the first questions I asked the product team in my early days concerned ad spend recommendations. If we have access to all this data and can draw conclusions from it, why don’t we provide those conclusions to customers in the form of insights? Their response seemed obvious once I’d heard it: liability. They didn’t want to be on the hook for making a recommendation that ended up losing money for the customer.
I understood their concerns, but I believed strongly that the product’s value wasn’t in the raw data itself but in the insights surfaced from that data. So I set about incorporating that thinking into my designs, starting with my first large project: the Channel Overlap report.
The Channel Overlap view was initially specced out as a typical Rockerbox report that would display reams of numerical data in tabular fashion, but I pushed to include prominent insights based on each customer’s own data. Nothing fancy, just simple comparisons to help our customers understand how people interacting with their ads across multiple platforms impacted their revenue.
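For a sense of what those comparisons looked like, here’s a hypothetical sketch in the spirit of the Channel Overlap insights: comparing conversion rates for shoppers who saw ads on one channel versus several. The data shape and copy are invented for illustration.

```python
def overlap_insight(users):
    """Build a plain-language comparison of single- vs. multi-channel conversion."""
    single = [u for u in users if len(u["channels"]) == 1]
    multi  = [u for u in users if len(u["channels"]) > 1]

    def rate(group):
        return sum(u["converted"] for u in group) / len(group) if group else 0.0

    return (f"Shoppers who saw your ads on multiple channels converted at "
            f"{rate(multi):.0%}, vs. {rate(single):.0%} for a single channel.")

users = [
    {"channels": {"Facebook"},           "converted": False},
    {"channels": {"Facebook", "TikTok"}, "converted": True},
    {"channels": {"Google"},             "converted": True},
    {"channels": {"Google", "Facebook"}, "converted": True},
]
print(overlap_insight(users))
```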
In our brief user testing prior to release, we heard exactly what I was hoping to hear: “This is the kind of thing I’m looking for from Rockerbox. Help me understand the data. More of this please.”
So we gave them more.
Pushing Beyond Insight with Goals & Recommendations
Our success with Channel Overlap gave us the confidence to step outside the Rockerbox comfort zone and pursue even more useful insights, which came in the form of Goals & Recommendations.
I discovered that to appease and retain larger customers, our customer success team drafted custom reports and recommendations based on the customers’ raw data. I examined these custom reports and saw what looked like fairly formulaic recommendations: increase spending on Facebook Ads by 25%, decrease Google Ads by 25%, etc.
I believed if we could automate such recommendations and present them clearly in the UI, we’d unlock a powerful tool customers would come to rely upon in both short- and long-term budget planning. One challenge was making Rockerbox management comfortable with direct recommendations to customers regarding how they should spend their money.
Working with Product, we defined a framework for our recommendations that minimized our exposure while still delivering value to customers. We decided automated spend changes shouldn’t go beyond 25% in either direction; the math might justify more aggressive increases or decreases in spend, but we’d stick to more reasonable numbers and let the customer’s risk tolerance guide them beyond that range.
Users are prompted to set goals for CPA (cost per acquisition) or ROAS (return on ad spend), and the tabular data is supplemented with columns containing recommendations to optimize their ad spend, along with content to help them understand where our recommendations come from.
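As a rough sketch of how a recommendation could work under the framework above (a simplified heuristic with made-up numbers, not Rockerbox’s actual model): scale each channel’s spend by how far its CPA sits from the goal, then clamp the change to the agreed ±25% band.

```python
MAX_CHANGE = 0.25  # cap automated recommendations at ±25%

def recommend_spend(current_spend, current_cpa, goal_cpa):
    """Suggest a new spend level for one channel given a CPA goal.

    Simplified heuristic: channels beating the goal CPA get more budget,
    channels missing it get less, with the change clamped to ±25%.
    """
    raw_multiplier = goal_cpa / current_cpa  # <1 means overspending vs. goal
    clamped = max(1 - MAX_CHANGE, min(1 + MAX_CHANGE, raw_multiplier))
    return round(current_spend * clamped, 2)

# Facebook beats the $40 CPA goal, so spend scales up (capped at +25%);
# Google misses it, so spend scales down (capped at -25%).
print(recommend_spend(10_000, current_cpa=25.0, goal_cpa=40.0))  # 12500.0
print(recommend_spend(8_000,  current_cpa=60.0, goal_cpa=40.0))  # 6000.0
```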
Advanced Recommendations with Marketing Mix Modeling
Marketing Mix Modeling, or MMM, is a long-term forecasting technique that uses prior marketing performance to predict future revenue. Engineering work on our MMM product was already underway while we built Goals & Recommendations, but the proof-of-concept UI was still focused on displaying data without insight. When I was asked to get MMM ready for our customers, I pushed hard for us to take what we’d learned from Channel Overlap and Goals & Recommendations and incorporate it into what would be our most powerful recommendation engine yet, aimed at our biggest customers.
In addition to plain-language insights, customers could adjust their target ROAS and see ad spend recommendations for each platform, moving the target along the model’s response curve to match their risk tolerance. They were also given an interactive worksheet where they could adjust spend for each platform and see how it would impact revenue; customers could stick with the value recommended by the model or enter a custom value to see the effects.
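To illustrate the mechanics, here’s a toy version of the worksheet’s underlying model: a diminishing-returns curve per channel, re-evaluated as the user edits spend. The curve shape and coefficients are invented; a real MMM fits them (along with adstock, seasonality, and more) to historical data.

```python
import math

# Toy diminishing-returns curve per channel: revenue = a * ln(1 + spend / b).
# Coefficients are made up for illustration only.
CURVES = {
    "Facebook": {"a": 50_000, "b": 8_000},
    "Google":   {"a": 40_000, "b": 6_000},
    "TikTok":   {"a": 15_000, "b": 3_000},
}

def predicted_revenue(plan):
    """Evaluate the worksheet: total predicted revenue for a spend plan."""
    return sum(
        CURVES[ch]["a"] * math.log1p(spend / CURVES[ch]["b"])
        for ch, spend in plan.items()
    )

# The user tweaks one channel's spend and immediately sees the effect on ROAS.
baseline = {"Facebook": 10_000, "Google": 7_000, "TikTok": 2_000}
adjusted = {**baseline, "TikTok": 5_000}
print(f"baseline ROAS: {predicted_revenue(baseline) / sum(baseline.values()):.2f}")
print(f"adjusted ROAS: {predicted_revenue(adjusted) / sum(adjusted.values()):.2f}")
```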
Conclusion
My strategy of incorporating helpful insights into the Channel Overlap report, paired with our subsequent research and testing, demonstrated our customers’ desire for more meaningful reports and paved the way for Goals & Recommendations and MMM to become not just more collections of data, but practical tools for increasing marketing revenue.