
How Google Analytics Kills Great Blogs


If you remember the Animalz blog for just one thing, let it be this: The oversimplification of content marketing is a drag on the entire industry. A nuanced understanding of content marketing strategy, creation, and measurement is the thing that separates great blogs from the rest.

We’ve previously discussed oversimplification in the context of growth constraints. Content is most often used to increase top-of-funnel traffic—even if that isn’t the constraint on growth. We’ve also covered it in the context of timing, i.e., how do you write for topics that are trending up vs. down?

In these cases (and dozens of others), success is determined long before a single word is written, optimized, or shared. Content “works” only if the strategy is created to solve a real problem—one whose solution is easily tracked and measured. An oversimplified approach results in noise (and wasted budget) every single time.

In my nearly 10 years in content marketing, I’ve spent countless hours in Google Analytics trying to fine-tune a report that uses multichannel attribution to “prove” once and for all that content marketing is worth the spend. I’ve never managed to do it.

Google Analytics encourages an oversimplified approach to measuring content.

This isn’t really about the problems with Google Analytics; it’s about the desire for near-term ROI, which pushes marketers to fiddle with strategies that would work given enough time. We need a better way to measure content marketing.

To really drive this point home, let’s rewind the clock about 70 years. Content marketers, meet George Katona.

What Content Marketers Can Learn From the Consumer Sentiment Index

In 1946, the Federal Reserve approached Dr. George Katona, an economics professor at the University of Michigan, and asked him to help create a survey to measure Americans’ assets.

He took the job but proposed a twist: A trained psychologist, Katona believed that emotion played a key role in the economy, and he was keen to collect data to prove his theory. If he could find a way to measure the way people felt about money, he might be able to predict the way they spent it.

The Fed rejected his idea. After all, how do you measure feelings on a mass scale? But Katona found a way to get the data he needed from the survey anyway. Instead of asking people how they felt about their personal finances or the economy, he probed with broad, general questions like, “Are you better or worse off financially than you were a year ago?” and “Do you think you’ll be better or worse off in a year?” Optimism (or pessimism) turned out to be a really good predictor of where the economy was heading.

Katona collected this data for years before anyone cared about his “subjective” measurement. But when the idea caught on in the 1970s, he already had decades of data to map against the stock market. So was born the Consumer Sentiment Index.

Amazingly, consumer sentiment dropped—i.e., pessimism increased—just before each recession. The opposite was true, too. As optimism increased, GDP rose. The model worked.

[Chart: Consumer Sentiment Index plotted against U.S. recessions. Source: Investing.com]

Katona, of course, wasn’t surprised to find that consumers made decisions based on noneconomic motives. The markets rose and fell based on what economist John Maynard Keynes called “animal spirits.” In other words, pure emotion.

In his book Essays on Behavioral Economics (PDF), Katona concluded that

(a) changes in consumer attitudes and expectations are measurable; (b) attitudes and expectations represent intervening variables modifying overt behavior; (c) changes in optimism and pessimism and therefore in willingness to buy have a great impact on discretionary expenditures; and (d) changes in consumer expenditures on durable goods have a substantial influence on general economic trends.

Not only did Katona prove that emotion was a driver of economic trends, but he also figured out how to measure it. Decades ago, Katona solved a problem that we in the SaaS marketing world are still wrestling with today. A blog doesn’t grow (or decline) because of raw emotion, but success (or failure) cannot be measured without a holistic set of data points. Google Analytics does not, for example, tell you how people feel about your content and your brand—and that is far more important than page views and bounce rates.

A Better Way to Measure Content Marketing

A few years ago, Moz founder Rand Fishkin published a slide deck that explained Why Content Marketing Fails. He apparently struck a chord—it’s been viewed more than 4.4 million times. The first reason he presents is that stakeholders want to draw a direct line between pageviews and sales. And as any content manager knows, it’s never that simple.

This is exactly the same line of thinking that Katona fought against. He wrote that “traditional economic analysis, not making use of [psychological] survey data, had at its disposal only aggregate data on consumer expenditures . . . and no quantitative data at all on economic motives or expectations.”


Economists before him measured the data they had—and it was incomplete. Marketers today follow the same path. They analyze the data they have, most often Google Analytics, without considering the metrics they haven’t yet measured.

No one sums up this problem better than Statuspage cofounder Danny Olinsky:

During every conversation with prospective customers, we always ask the question, “How did you hear about us?” Many responses go something like this – “I think I may have seen a blog post of yours,” or “You guys wrote that post on reaching $5k in MRR, right?” and “I honestly can’t remember, but I may have seen a post of yours or one of your customer’s status pages.” Ironically, it seems like the less people actually remember how they heard about you, the better job you’ve done at content marketing.

Our marketing director, Devin Bramhall, points out that nonpaid marketing campaigns are inherently difficult to measure. There isn’t a simple input/output equation that results in perfect data every time. Treating organic campaigns as if they were paid can and will ruin them. Instead, Devin suggests an index of metrics—not unlike the Consumer Sentiment Index—that paints a more comprehensive picture and helps you tune into some of the hard-to-measure benefits of content marketing.

Here are the numbers she suggests measuring:

1. Traffic

You have to measure traffic—we’ll never suggest otherwise. But you have to keep it in perspective. “In the early days,” explains Devin, “traffic helps you track your growth. Later, it becomes more of a general health metric. It’s good to look at on a monthly basis at a high level, and also as a way to measure campaigns designed to drive traffic to individual posts.”

Here are the two things she suggests all SaaS blogs should be keeping track of:

Other than that, look for specific metrics that measure your blog’s success. If, for example, you run calls to action alongside content, monitor clicks and conversions. If you use guest posts to build links, keep an eye on search traffic to the pages you’re linking back to. Find small, specific numbers, and then view them alongside longer-term traffic numbers—this is how you incentivize healthy growth.

2. Email Open Rate and Click-Through Rate

Measuring the same content across different channels is important because each channel tells you something different about your readers.

When people reach your site via organic search, they are seeking information. They want answers and don’t care about your brand or reputation. When people opt into an email list, they are affirming that they like your brand and want more.

Email metrics tell an interesting story about readers’ continued interest. A drop in opens and clicks should be investigated. An increase should be celebrated. Perhaps more importantly, this data should be analyzed in conjunction with traffic. Some content will perform better in search, some will thrive in email, so measure accordingly. An SEO-driven post is unlikely to drive tons of interest in email, and a thought-leadership post that resonates in a newsletter may not drive sustained traffic.

Lastly, keep an eye out for replies. Readers do respond to emails with feedback and questions that can be indicative of broader trends.

3. Organic Social Engagement

Social media is a channel where you get granular feedback on content. If you share the same post more than once, you can see what angles resonate the most—is the post’s headline enough to drive clicks, or do people get excited about a quote or statistic that you’ve highlighted?

Not only do you get the chance to measure content on a third channel, but you also get the most accurate feel for whether or not people enjoy it. Social is where readers will do the most replying, commenting, re-sharing, etc. This data is hard to quantify, but it tells volumes about the effect your content has on people.

Anecdotal reporting—i.e., screenshots of tweets, comments, email replies, etc.—absolutely has a place in your measurement. Stakeholders want to see how real people are reacting to the content. Put this alongside your charts and graphs with context about who the people are and why their reaction is meaningful—it’s just as powerful as hard data.
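Taken together, these three channels can feed the kind of index Devin describes. Below is a minimal sketch, in Python, of how such an index might be computed. The metric names, baseline values, and weights are illustrative assumptions, not a prescribed formula; swap in whatever signals matter for your blog.

```python
# Minimal sketch of a composite "content health index," loosely modeled on
# the Consumer Sentiment Index. All names, baselines, and weights below are
# illustrative assumptions; tune them to your own blog and goals.

# This month's raw numbers (hypothetical values).
this_month = {
    "organic_sessions": 48_000,   # from your analytics tool
    "email_open_rate": 0.41,      # opens / delivered
    "email_ctr": 0.062,           # clicks / delivered
    "social_engagements": 950,    # replies + comments + reshares
}

# A baseline to normalize against, e.g., a trailing six-month average.
baseline = {
    "organic_sessions": 42_000,
    "email_open_rate": 0.38,
    "email_ctr": 0.058,
    "social_engagements": 700,
}

# How much each signal contributes to the index (weights sum to 1.0).
weights = {
    "organic_sessions": 0.35,
    "email_open_rate": 0.20,
    "email_ctr": 0.20,
    "social_engagements": 0.25,
}

def content_index(current: dict, base: dict, w: dict) -> float:
    """Weighted average of each metric relative to its baseline.
    100 means on par with the baseline; above 100 means improvement."""
    return round(100 * sum(w[k] * current[k] / base[k] for k in w), 1)

print(content_index(this_month, baseline, weights))  # 116.9 for these numbers
```

Tracked month over month, a single number like this gives stakeholders a trend line, while the underlying metrics and the anecdotes above keep the story honest.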

Keep Google Analytics in Perspective

Google Analytics simply cannot measure the success or failure of your content marketing. Use it as a tool, but keep it in its place. Whether you use the metrics we suggested above or come up with your own, find a way to create an index that tells the whole story about content efforts. Hard data alone just doesn’t cut it.
