In 2017 Google announced the Ad Experience Report in Search Console. I remember the announcement like it was yesterday. The overarching idea was brilliant: provide a set of reports that let site owners know when their advertising situation yielded a terrible user experience. Google also announced that if a site failed a review, Chrome would stop showing ads on that site. It was bold, and I loved that from Google.
This was something that was so needed it wasn’t even funny.
Why? I have documented for a very long time how aggressive, disruptive, or deceptive ads could be seriously problematic from a major algorithm update standpoint (with a focus on broad core updates). I have documented where Google has explained this over the years, I have provided case studies, I have explained the connection with Navboost, and more.
Here is what the report looks like in GSC:
Google has warned about aggressive and disruptive advertising in several places: the quality rater guidelines have contained information about aggressive ads, Google’s blog post about broad core updates explains more about ad aggressiveness, and Google’s helpful content documentation covers aggressive and disruptive ads.
In addition, Google’s Elizabeth Tucker explained in her post about the March core update that Google refined its systems to better understand if webpages contained unhelpful content, had a poor user experience, etc. And even the Chrome team weighed in, publishing a post on the Chrome dev blog that explained how to run ads without disrupting the user experience. That was very interesting timing considering the September HCU(X) had literally just rolled out and poor UX played a role in that update…
Enforcement Begins:
Back to 2017… Google announced it would provide a report in GSC notifying site owners when their ad situation was failing a review based on the Better Ads Standards. I couldn’t wait to dig in, especially considering how many sites had reached out to me after getting hit hard by broad core updates where aggressive and disruptive advertising was part of the equation.
Google began enforcing violations in February of 2018 and I began analyzing that situation right away. By the way, if a site failed the ad experience review, all ads would be stripped from the site in Chrome. Yep, no ads for you. I loved this idea. What an incredible way to get site owners to tone down the ad situation. I documented how this was working in several blog posts in 2018 (both on my site and on Search Engine Roundtable).
Sites violating the Better Ads Standards did indeed have their ads stripped in Chrome. It was wild to see, although only the worst of the worst situations were impacted. For example, here are some screenshots of ads being removed:
“Mountain View, we have a problem…” Great concept, but it never told the full (aggressive ads) story.
As time went on, I quickly started to see issues with how the system was working. First, the ad situation would only be based on the Better Ads Standards. The standards cover the worst ad formats you can think of. For example, autoplaying video ads with sound, large sticky ads, flashing animated ads, full screen rollover ads, and more. Don’t get me wrong, those are terrible formats, but it still leaves sites passing a review that run way too many ads on a page, have ads that still annoy the heck out of users, layer ads, and more. Enforcement was only for the worst of the worst based on what I was seeing. And that gave site owners a false sense of security. I’ll cover more about that soon.
Here are some of the formats covered by the Better Ads Standards:
In other words, sites could still be algorithmically impacted based on the contribution of aggressive advertising and poor UX situations beyond what the Better Ads Standards covers. So, the Ad Experience report was providing a serious false sense of security for site owners. Note, I said “contribution” to being algorithmically impacted above. That’s because Google is evaluating many factors with broad core updates. There’s never one smoking gun. There’s usually a battery of them.
Second, there were many, many sites that had never been reviewed in Search Console. And if a site wasn’t reviewed, then it couldn’t fail the review and enforcement could never happen. So ads kept running on sites that either violated the Better Ads Standards, or just had a terrible ad situation without violating the Better Ads Standards. Again, this gave site owners a false sense of security when it came to major algorithm updates like broad core updates.
As a quick example, here is the Ad Experience Report for a site pummeled by the September helpful content update (HCUX), where in my opinion super aggressive ads absolutely contributed to the classifier being applied. The site has never been reviewed. So, the site owner could wrongly believe everything was fine from an advertising standpoint. If there was an early warning sign in GSC, maybe the site owner could have toned things down.
Here are two sites impacted by the HCU(X) that had super aggressive ads. One passed the review and the other was never reviewed.
Third, and this is pretty obvious, the report is now in the ‘Legacy tools and reports’ menu in GSC. That’s not a good sign and it could be deprecated at some point. So this entire situation could be moving in the opposite direction from where it should be headed! In other words, Google might be providing less information for site owners about their ad situation, not more…
That said, I exported the latest list of violating sites via the Ad Experience API and there was still enforcement going on recently. For example, you can see enforcement starting in March and then May of 2024 for several of the sites.
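If you want to run this kind of export yourself, the Ad Experience Report API exposes a violatingSites.list method that returns the sites currently failing a review. Below is a minimal sketch of filtering that list by enforcement start date. The field names mirror Google’s published response shape, but the API key, the sample payload, and the helper function are placeholders/assumptions for illustration, not data from my export:

```python
import json
from datetime import datetime, timezone

# Placeholder endpoint for the Ad Experience Report API's violatingSites.list
# method. YOUR_API_KEY is a stand-in; the API must be enabled in your project.
API_URL = "https://adexperiencereport.googleapis.com/v1/violatingSites?key=YOUR_API_KEY"

def enforcement_after(payload: dict, cutoff: datetime) -> list:
    """Return reviewed sites whose desktop or mobile enforcement began on/after cutoff."""
    hits = []
    for site in payload.get("violatingSites", []):
        for platform in ("desktopSummary", "mobileSummary"):
            ts = site.get(platform, {}).get("enforcementTime")
            if ts and datetime.fromisoformat(ts.replace("Z", "+00:00")) >= cutoff:
                hits.append(site["reviewedSite"])
                break  # count each site once, even if both platforms failed
    return hits

# To fetch live data instead of using the sample below:
#   import urllib.request
#   payload = json.load(urllib.request.urlopen(API_URL))

# Illustrative payload (made-up sites) mirroring the documented response shape:
sample = {
    "violatingSites": [
        {"reviewedSite": "example-a.com",
         "mobileSummary": {"betterAdsStatus": "FAILING",
                           "enforcementTime": "2024-03-12T00:00:00Z"}},
        {"reviewedSite": "example-b.com",
         "desktopSummary": {"betterAdsStatus": "FAILING",
                            "enforcementTime": "2023-11-02T00:00:00Z"}},
    ]
}

cutoff = datetime(2024, 1, 1, tzinfo=timezone.utc)
print(enforcement_after(sample, cutoff))  # prints ['example-a.com']
```

With a real API key, you could run the same filter against the live response to spot recent enforcement waves, like the March and May 2024 examples mentioned above.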
The “But we pass…” argument and a GIANT false sense of security.
Even today when I explain to site owners that their ad situation is probably contributing to their drop based on broad core updates (or other algorithm updates), some come back and say, “But the Ad Experience Report in GSC says our ads are fine!” Ugh, then I need to explain everything I’ve been explaining here in this post. For example, Google’s review is just based on the Better Ads Standards and doesn’t cover many aggressive, disruptive, or deceptive ad situations on the web.
So in a weird twist, the report that’s supposed to be helping site owners tone down aggressive ads is actually giving them a false sense of security and driving MORE aggressive ads.
For example, in recent presentations about major algorithm updates, I showed two different ad situations to explain what aggressive and disruptive ads look like. Based on the Better Ads Standards, both sites either passed a review or they weren’t reviewed at all.
Here’s one of the slides from my deck. Go ahead and try and find the content:
What the Ad Experience Report could have been (or maybe what it still could be):
Based on the latest Google antitrust trial, we learned more about Navboost. It’s an important system Google uses for tracking 13 months of user interaction signals, which can impact rankings. I covered more about this in my post about “highly visible AND low quality” and why that’s a most dangerous combination.
Well, imagine if Google could provide some warning signs in GSC when user happiness levels drop based on Navboost data. Google would be helping site owners help themselves and avoid disaster from major algorithm updates. As Google has explained, “quality” is not just about content. It’s about the user experience, the ad situation, how things are presented, and more.
Once again, here’s a great slide about Navboost from the Google antitrust trial:
And here’s a slide from one of my presentations covering John Mueller’s comments about “quality” being more than just content. Notice he specifically says “UX issues” and “ads on the page”.
So there would have to be some threshold with Navboost that could trigger a warning in GSC in the Ad Experience Reporting (or maybe change the name to the User Experience Report instead).
But that brings up a big issue with people trying to game Google…
Unfortunately, spammers are why we can’t have nice things. If there was a Navboost threshold that site owners could understand, then spammers could test how close they can come to the ‘danger line’ without being impacted. And then that could yield many sites pushing the limits across the web with aggressive ads, aggressive affiliate setups, greedy UX, and more.
By the way, this is also why we don’t receive all inbound links in GSC, why nofollowed links are included there, why we can’t export all pages from the coverage reporting, and more. Google can’t provide too much data or that data could be abused by spammers.
Therefore, I think it will be very hard for Google to tell site owners that users are not happy (other than a major update rolling out and a site tanking). Google can provide hints and signs all over the place (which they are already doing), but I highly doubt they will provide hard data in GSC that ads are a problem (or that Navboost data is signaling big problems from a user experience standpoint).
Just thinking out loud, maybe Google could provide data once per X months or something so it could be harder to understand the threshold, but still help site owners that might be tipping the scales in the wrong direction.
Moving forward: It was a good concept, but it’s not effective.
When you boil it down, you cannot rely on the Ad Experience Report in GSC for letting you know when your ad situation is too aggressive, disruptive, or deceptive. It’s only going to flag the worst of the worst, and that’s IF your site even gets reviewed. Again, most aren’t reviewed.
But that doesn’t mean you should ignore user frustration and user ‘unhappiness’. In my posts and presentations about broad core updates, I’ve always said, “Hell hath no fury like a user scorned.” If you bombard users with aggressive, disruptive, or deceptive ads, you will find out when that yields a terrible user experience at scale. It’s called a broad core update. And remember, there’s never one smoking gun with broad core updates. Google is evaluating many factors using machine learning. But the ad situation, and what that does to the user experience, is absolutely a factor.
Just remember Navboost… It’s an important system that tracks 13 months of user interaction signals and can impact rankings. That’s a great way for Google to understand a growing, and creeping, ad experience problem. And if you tip the scales in the wrong direction, look out.
User Studies: An early warning Google tsunami detection system.
I get questions all the time about how to know if aggressive ads are a problem. Well, you can receive early warning signs before a Google tsunami rolls through by running user studies through the lens of broad core updates. It’s a great way to understand how objective third-party users feel about your site, your content, the user experience, the AD EXPERIENCE, and more. You can read my post about the topic for more information about the power of user studies. The great part is that you can set one up quickly and start gaining important feedback.
Unfortunately, many site owners love the idea of doing this, but most don’t actually run the studies. I just covered this in my latest SMX presentation. It’s maddening. So just set up the study and run it. The results could be enlightening.
Summary: Yes, aggressive, disruptive, and deceptive ads can cause problems SEO-wise.
But no, the Ad Experience Report in its current form won’t really help most site owners understand when they are getting dangerously close to the edge… Each site is different, there are many signals being evaluated by Google with broad core updates, and Navboost is continually measuring user happiness levels (with Google tracking 13 months of user interaction data). Ignore those signals at your own peril. And again, have your team run a user study now to find out what real people think about your site, your content, your ads, your UX, and more. The results could drive change. And change might be exactly what your site needs.
GG