How SEOs can deal with unwanted *****-intent traffic


SEO for ***** sites is a fiercely competitive space – yet pervasive and unwanted *****-intent traffic remains a big challenge for enterprises, ecommerce sites and marketplaces. 

Here is why this is a problem and what can be done about it.

When non-***** sites rank for ***** searches

It’s important to understand that “*****-intent traffic” and “***** content” differ. 

Any amount of ****** content can cause Google to label a website as “*****” and limit its exposure for most queries. 

It’s good practice to label any ***** content as such using the <meta name="rating" content="*****"> tag, which signals to Google that the content should be filtered out by SafeSearch.

Whenever practical, ****** content should be separated from the main site by moving it to a subdomain.

*****-intent traffic, on the other hand, describes the intent behind the search query, regardless of the content of the page it lands on.

How SafeSearch influences Google’s results

If SafeSearch is on, most explicit and ***** content will be filtered out of the results, which effectively means a ban on sexually exploitative or sexually suggestive content and nudity. 

Websites that Google explicitly labels as being pornographic only show up for certain queries. Google prevents *****-themed content from triggering rich snippets or appearing in Discover.

Ironically, this means that safe websites and platforms that monitor and remove explicit content (for example, mainstream news sites or educational platforms) are more likely to appear for *****-oriented search queries in Google when SafeSearch is on.

General information

For many queries with ***** intent, Google might return results that offer more general information about the topic or non-explicit references. 

For instance, a search for an ***** film star might return a Wikipedia page or a news article about them rather than their explicit content.

Vague queries

Many search queries can be interpreted in multiple ways, both innocent and *****. With SafeSearch on, Google is likely to favor a non-explicit interpretation. 

For example, searching for “breast” might prioritize results about breast cancer, chicken breast recipes, or anatomy over more *****-themed results.

While we don’t know what percentage of all Google searches is ***** in intent, we know that many authoritative, established sites and global marketplaces capture much of this traffic, even if no matching ***** content is found on the site.

It is not uncommon for *****-intent searches to account for 20-40% of all SEO visits. This share can be even higher in some geographies.

Isn’t all traffic good traffic? Unpacking the *****-intent dilemma

For publisher sites that can monetize pageviews through programmatic advertising, a click is a click, and the intent of the traffic might not be the key determining factor for CPM. 

For ad arbitrage sites, capturing ***** intent visits may even be desirable.

However, this can be problematic for online businesses, platforms, or marketplaces that are conversion-oriented and non-*****.

Analytical noise

When organic search visits are going up, it’s tempting to deem the SEO strategy a success. But what if a large portion of those visits are non-converting ***** clicks? 

An uptick in visits could simply mean that a key competitor or another large website has scaled up its *****-traffic blocking efforts. 

Not having the right level of insight or ability to isolate valuable visitor segments from noise can lead to:

  • Analytical mistakes.
  • Misplaced investment of time and resources.
  • Failure to tie SEO performance to business outcomes.

It’s expensive

What is the ROI of *****-intent traffic for a non-***** site? 

If non-converting ***** queries make up the lion’s share of all visits, it may be time to examine the costs associated with serving this traffic and start scaling back.

Quantifying *****-intent queries: Navigating your traffic data

*****-intent traffic is easy to spot but difficult to quantify. 

Sadly, no magic tool will provide all the SEO keyword data and determine what portion is ***** in intent.

The bigger the site, the higher the risk. 

Established sites that do not restrict indexing of search results pages or marketplaces that leverage user-generated content (UGC) run the risk of amassing an enormous amount of long-tail traffic through low-quality URLs that rank for the most obscure ***** terms.

Google Search Console

GSC is a great place to start looking. While it does not provide complete keyword data, it offers enough insights to gauge the magnitude of the problem by examining a relatively small sample of top keywords.
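
If you have API access, a short script can pull that sample programmatically. Below is a minimal sketch assuming the google-api-python-client library and already-configured OAuth credentials; the property URL and date range are placeholders:

```python
# Minimal sketch: pull a sample of top organic queries from the Search Console API
# so their intent can be reviewed manually or matched against a keyword blacklist.
# Assumes google-api-python-client is installed and OAuth credentials are set up;
# the siteUrl and date range below are placeholders.
from googleapiclient.discovery import build

creds = None  # replace with your OAuth credentials (google-auth / google-auth-oauthlib)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["query"],
        "rowLimit": 1000,  # a relatively small sample of top keywords
    },
).execute()

# Print the top queries by clicks for a quick manual intent review
rows = sorted(response.get("rows", []), key=lambda r: r["clicks"], reverse=True)
for row in rows[:100]:
    print(f'{row["clicks"]:>6.0f} clicks  {row["keys"][0]}')
```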

Google Analytics

GA (and most other web analytics tools) can help you get more granular by analyzing the URLs of top organic landing pages for ***** terms or phrases that could be interpreted as *****. 

This is especially relevant for marketplaces and for sites that index internal search results or leverage UGC for SEO. 

As a bonus, GA makes it easier to understand the business impact of ***** traffic by cross-referencing it with available engagement and conversion data.
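
As a rough illustration, here is a minimal sketch that flags organic landing-page URLs against a keyword blacklist and compares their engagement and conversions with everything else; the CSV filename and column names are assumptions to adapt to your own export:

```python
# Minimal sketch: flag organic landing-page URLs that match a keyword blacklist
# and compare engagement/conversions against the rest of the traffic.
# The CSV filename and column names are assumptions -- adapt them to your export.
import re
import pandas as pd

BLACKLIST = ["term1", "term2"]  # placeholder terms; use your own blacklist
pattern = re.compile("|".join(map(re.escape, BLACKLIST)), re.IGNORECASE)

df = pd.read_csv("organic_landing_pages.csv")  # hypothetical analytics export
df["flagged"] = df["landing_page"].str.contains(pattern)

# Compare flagged vs. unflagged traffic on engagement and conversion
summary = df.groupby("flagged")[["sessions", "conversions"]].sum()
summary["conversion_rate"] = summary["conversions"] / summary["sessions"]
print(summary)
```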

Ahrefs

Ahrefs is a fantastic tool that can analyze massive lists of keywords and their ranking fluctuations. 

With a bit of regex magic or AI help, it’s possible to determine which keywords have ***** intent and estimate the overall share of traffic they represent. 
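
Here is a hedged sketch of that regex approach, assuming an exported Ahrefs organic keywords report; the filename and the "Keyword" and "Traffic" column names are assumptions:

```python
# Minimal sketch: classify an exported keyword list with a regex built from a
# blacklist and estimate the share of estimated traffic those keywords represent.
# Filename and column names ("Keyword", "Traffic") are assumptions -- adjust to your export.
import re
import pandas as pd

BLACKLIST = ["term1", "term2"]  # placeholder terms; use your own blacklist
pattern = re.compile(r"\b(?:" + "|".join(map(re.escape, BLACKLIST)) + r")\b", re.IGNORECASE)

kw = pd.read_csv("ahrefs_organic_keywords.csv")
kw["flagged"] = kw["Keyword"].str.contains(pattern)

share = kw.loc[kw["flagged"], "Traffic"].sum() / kw["Traffic"].sum()
print(f"Estimated share of organic traffic from flagged keywords: {share:.1%}")
print(kw.loc[kw["flagged"]].nlargest(20, "Traffic")[["Keyword", "Traffic"]])
```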

The best part? Competitive intelligence. 

Ahrefs makes it easy to analyze where competitors stand with respect to ***** traffic and glean additional insights into their SEO reach and performance.

It’s well worth segmenting traffic data for further detail. Do some geographies, days of the week, times of day, or device types stand out more than others? 

Understanding behavioral and usage patterns can make isolating and addressing unwanted traffic easier.




How to minimize and manage unwanted *****-intent traffic

While no solution will be perfect, here are several ways to reduce *****-intent traffic.

Examine URL slugs and on-page keywords

Frequently, a partial keyword or phrase match in URLs or on-page copy can be enough for a perfectly innocent page to rank for one or more related ***** queries. 

Sometimes, updating URLs and on-page elements may be enough to drop unwanted rankings.

Remember that changing a URL will likely temporarily impact overall rankings and URL authority for the affected page.

Make use of blacklists

Paid search teams often use blacklists for *****, hateful or harmful keywords. These lists can also be useful for SEO.

Use them to restrict the crawling or indexing of URLs based on related keywords.

One of the most popular methods for this is robots.txt. It offers a simple, effective way to disallow problematic URL patterns at scale using wildcard rules (Google's implementation supports the * and $ wildcards, not full regular expressions). 
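
For illustration, here are a few hedged example rules; the path patterns below are purely hypothetical placeholders and should be replaced with patterns that match your own problematic URLs:

```
User-agent: *
# Hypothetical examples: block internal search results and tag pages
# containing a blacklisted term, at scale, using the * and $ wildcards.
Disallow: /search?*q=*blocked-term*
Disallow: /*/blocked-term-*
Disallow: /tag/blocked-term$
```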

One of the downsides to this approach is how public it is – it’s quite literally out there for the entire world to see. Another downside is that robots.txt does not allow for nuance.

Not every *****-intent search is equally problematic. In many cases, it may be enough to noindex a page while still allowing crawling and discovery of other linked content.

On the other hand, it might be desirable to apply 404 or even 410 response codes to URLs that consistently rank for extreme or illicit phrases. Websites with dynamic URL generation are especially susceptible to this.

Frequently, URLs that drive *****-intent traffic will only rank for one or a few closely related *****-intent terms, which makes disallowing, noindexing, or serving a 404 all viable options. 

In other cases, a blanket rule is not the best solution. Consider experimenting with conditional rules (see the sketch after this list) that: 

  • Target users, instead of bots.
  • Restrict access only to certain audiences, geographies, or device types that drive the unwanted traffic.
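
As a rough illustration of the idea, here is a minimal sketch of such conditional logic as a plain Python function. The URL patterns, country codes, and device types are hypothetical, and the actual wiring into a CDN rule, edge worker, or web framework is left out:

```python
# Minimal sketch of conditional handling for URLs that attract unwanted-intent traffic.
# Patterns, country codes, and device types below are hypothetical; wire the decision
# into your own CDN rule, edge worker, or web framework.
import re

EXTREME_PATTERNS = re.compile(r"/(blocked-term-a|blocked-term-b)", re.IGNORECASE)
FLAGGED_PATTERNS = re.compile(r"/(blocked-term-c)", re.IGNORECASE)
TARGET_GEOS = {"XX", "YY"}   # placeholder country codes that drive the unwanted traffic
TARGET_DEVICES = {"mobile"}  # placeholder device types

def decide(path: str, is_bot: bool, country: str, device: str) -> str:
    """Return 'gone' (410), 'restrict', or 'serve' for a given request."""
    if EXTREME_PATTERNS.search(path):
        return "gone"  # consistently extreme phrases: remove the URL outright
    if FLAGGED_PATTERNS.search(path) and not is_bot:
        # Target users, not bots: crawlers still see the page normally, while
        # only the audience segments driving unwanted visits are restricted.
        if country in TARGET_GEOS and device in TARGET_DEVICES:
            return "restrict"
    return "serve"

print(decide("/category/blocked-term-c", is_bot=False, country="XX", device="mobile"))
```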

Consider engagement-based indexing

An *****-intent user is unlikely to convert on a non-***** site. 

These low-quality visits will likely have exceptionally high bounce rates, low pageviews, and no conversions. 

A scalable approach for an enterprise site might include custom indexing logic that issues noindex directives based on user engagement and conversion signals.
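
Here is a hedged sketch of what that logic might look like as an offline job, assuming a per-URL export of organic engagement data; the filename, column names, and thresholds are all assumptions to tune for your own site:

```python
# Minimal sketch: derive noindex candidates from engagement and conversion signals.
# Filename, column names, and thresholds are assumptions -- tune them to your data,
# and feed the resulting list into whatever system sets your meta robots directives.
import pandas as pd

MIN_SESSIONS = 200            # ignore URLs without enough data to judge
MAX_BOUNCE_RATE = 0.95        # exceptionally high bounce rate
MAX_PAGES_PER_SESSION = 1.1   # barely any further pageviews

df = pd.read_csv("url_engagement.csv")  # hypothetical per-URL organic engagement export

noindex_candidates = df[
    (df["sessions"] >= MIN_SESSIONS)
    & (df["bounce_rate"] >= MAX_BOUNCE_RATE)
    & (df["pages_per_session"] <= MAX_PAGES_PER_SESSION)
    & (df["conversions"] == 0)
]

noindex_candidates["url"].to_csv("noindex_candidates.csv", index=False)
print(f"{len(noindex_candidates)} URLs flagged for a noindex directive")
```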

Manage *****-intent traffic to protect your SEO efforts

While *****-intent SEO traffic might increase the volume of visitors, the quality and relevance of this audience for non-***** sites are questionable. 

Businesses must recognize the nuance between traffic numbers and genuine user engagement. 

By effectively recognizing, segmenting, and acting against unwanted *****-intent traffic, enterprises can fine-tune their SEO strategies and ensure their content reaches the right audiences. 

After all, in the age of data-driven decision-making, it’s not just about attracting eyes – it’s about attracting the right ones.

Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.


