Diggity Marketing SEO News Roundup – February 2021


SEO’s best minds are clearly back at work. An impressive set of articles were published this month, and this roundup will take you through the best of them…

We’ll open with some case studies. First, you’ll learn what three separate case studies have to say about what went down with the December 2020 update. After that, you’ll learn how many URLs you can “request indexing” for before you hit a limit.

Some fresh guides are next. They’ll teach you about the latest best practices for PBNs, how to find keywords without help from historical data, and how to analyze SERPs to rank more effectively.

Finally, we’ll look at the news. There was a search ranking algorithm update on January 27th that you shouldn’t miss, plus a Google employee’s ruling on what does and doesn’t qualify as “duplicate content.”

Google’s December 2020 Broad Core Algorithm Update Part 2: Three Case Studies That Underscore The Complexity and Nuance of Broad Core Updates

https://www.gsqi.com/marketing-blog/december-2020-google-core-algorithm-update-part-two-case-studies/

Glenn Gabe brings us this look at what we can know about the latest core update from three different case studies.

The first case study covers the results of a news publisher who focuses on a highly-specific niche. This site was hit hard over 2020, even though it appeared to be doing most things right.

[Image: GSQI case study one, traffic drop in January]

As Glenn put it, the site had E-A-T galore. A core group of qualified authors produced the news stories, and their authorship was prominently displayed. Additionally, the site had over 2 million inbound links, including links from some of the most authoritative sites in the world.

None of this stopped the site from getting hit in the January update and again in the May update. Glenn recommended tackling the drop with a series of steps targeted at problems common to news sites.

By the time the December 2020 update rolled around, the site’s traffic had grown by more than 140%. It’s a story that may offer some options to other struggling news sources.

The second case study involved a site that did not welcome the December update. This affiliate site lost more than half of its traffic. Through several graphs, Glenn diagnosed what he believes to be the cause.

In this case, he thinks that low-quality content was allowed to overwhelm the core site content. It’s difficult to say for certain because this case study hasn’t concluded yet. Affiliate sites that have been hit may want to stay tuned.

[Image: GSQI December core update, case study 2]

The final case study looked at a site that had every reason to praise the December update, especially after it had been hit so hard by the May update. This site was large-scale, and it was operating in a tough niche.

The May update wiped out more than 40% of its traffic. Again, Glenn noticed that a growing pile of thin content had come to define the site. It also had problems with intrusive ads and mobile usability.

[Image: GSQI December core update, case study 3]

While these issues were steadily corrected, nothing changed until the December update, when the site regained 40% of its traffic—almost overnight.

These case studies can offer a lot of ideas to sites that were hit at any point in 2020. If you want to make some major changes to your own site, you may be interested in knowing how many URLs you can request indexing for at one time. Our next case study may have your answer.

How Many URLs Can You “Request Indexing” For in GSC? [Case Study]

https://nickleroy.com/blog-posts/request-indexing-gsc-limit/

Nick LeRoy brings us this quick look into how many indexing requests GSC will tolerate from you at one time.

[Image: GSC “Request Indexing” quota exceeded message]

The “request indexing” feature was completely missing from GSC for several months. Many SEOs were excited to see it come back, but they may not have noticed that it now works slightly differently. Nick’s case study helped to clarify some of those changes.

Before the tool was taken offline, the limit had been tracked at roughly 50 URLs per day. Nick tested the new limits with a site that launched with more than 500,000 new pages and ran into the “quota exceeded” message shown above.

These limits may be concerning for SEOs who rely on fast indexing or who might be launching new sites soon. Nick theorized that the new limits may be there to prevent the whole process from being automated.

It’s something to watch. For now, let’s move on to the guides. We’ll start with Rank Club’s look at the best PBN practices for 2021.

2021’s PBN Best Practice Guide [Backed by Data]

https://rankclub.io/2021-pbn-best-practice-guide

Rob Rok of Rank Club brings us this data-backed look at how to use PBNs right in 2021. He didn’t theorize about what might work. Instead, he tracked what his busiest customers were doing and turned it into a set of recommendations.

[Image: Rank Club data on how fast you can build PBN links]

He broke the guide down by Tier 1 and Tier 2 PBN links. Tier 1 links are the PBN links that you build directly to your site. Tier 2 links are the PBN links that you point toward your incoming links to increase their power. For each set, he tried to answer the biggest questions.

For Tier 1 links, he focused on questions like:

Q: How fast can I build PBN links to my site?

A: Approximately 3.29 per month.

Q. How many PBN links can I build to my website?

A: The average number built to one domain is 7.12.

Q: Do PBN links work on YT videos?

A: Isolated testing has shown they do, but as of yet, clients are not using them that way.

 

For Tier 2 links, he focused on questions like:

Q: How many Tier 2 links should be sent to a given URL?

A: The average is 2.63, though clients have been successful in building as many as 13 at a time.

Q: How many Tier 2 links can be built at a time?

A: The average number of links per order is 20.53.

Q: Should I use Tier 1 and Tier 2 links together?

A: Nearly 25% of all PBN users choose to use both of them together.

 

Nearly a dozen more questions and answers are covered across the full guide. Many of the questions and answers are reinforced with graphs, charts, and other helpful data representations.

Now that you’ve learned something you can do with links, let’s look at how you can improve your keywords. Moz has some advice on how to find keywords when you can’t rely on historical data.

Finding Keyword Opportunities Without Historical Data

https://moz.com/blog/find-keyword-opportunities-without-historical-data

Imogen Davies brings us this in-depth look at what options you have when researching a keyword with no historical data. As she points out in the introduction, Google has confirmed that 15% of daily queries are combinations that have never been searched before.

[Image: Google confirms 15% of daily queries are new]

A lot of opportunities are likely to be buried in those queries, but it’s hard to imagine successfully ranking for them when there’s no reference point for what works. Standard keyword tools aren’t going to be helpful here because they’re built around analyzing historical data.

Imogen recommends three alternative strategies. Two of them, mining “People Also Ask” and scraping Google’s autosuggest, are covered in more detail below.

For mining “People Also Ask,” Imogen suggests that you start by going large-scale. Use SERP API tools or repeated searches to capture all of the related questions that real people are asking.

Scraping autosuggest is the next recommendation, and it’s easily done with the URL query string she provides for you to paste right into the search bar. It returns a complete list of the suggested queries associated with your keyword.

[Image: Google autosuggest example]
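If you’d rather not paste query strings by hand, a short script can pull the same suggestions. This is a minimal sketch of my own, not something from Imogen’s guide: it assumes the unofficial suggestqueries.google.com endpoint (which Google may change or rate-limit at any time) and the Python requests library.

```python
# Minimal sketch (my own, not from Imogen's guide): pull Google autosuggest
# completions for a seed keyword via the unofficial suggest endpoint.
# Assumptions: suggestqueries.google.com still returns JSON for
# client=firefox, and the requests library is installed.
import requests

def autosuggest(seed: str, lang: str = "en") -> list[str]:
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "hl": lang, "q": seed},
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape: ["<seed>", ["suggestion 1", "suggestion 2", ...]]
    return resp.json()[1]

if __name__ == "__main__":
    for suggestion in autosuggest("how to"):
        print(suggestion)
```

Looping the same call over a handful of seed modifiers (“best”, “vs”, “near me”, and so on) is a cheap way to expand the list before you start grouping it.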

She recommends that you follow up on either of these strategies by grouping everything you find into topics and themes. This will help you plan your content and get ahead of competitors on unserved queries.
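How you group the queries is up to you, but even a crude pass helps. As a hypothetical illustration (not Imogen’s method), here is one way to bucket suggestions into rough themes keyed by the first word that follows the seed keyword:

```python
# Hypothetical illustration (not Imogen's method): bucket scraped queries
# into rough themes keyed by the first word that follows the seed keyword.
from collections import defaultdict

def group_by_modifier(queries: list[str], seed: str) -> dict[str, list[str]]:
    groups: dict[str, list[str]] = defaultdict(list)
    for query in queries:
        remainder = query.lower().replace(seed.lower(), "", 1).strip()
        modifier = remainder.split()[0] if remainder else "(exact match)"
        groups[modifier].append(query)
    return dict(groups)

# Example with made-up suggestions:
suggestions = ["how to tie a tie", "how to train a puppy", "how to tie shoelaces"]
for theme, items in group_by_modifier(suggestions, "how to").items():
    print(theme, "->", items)
```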

Our next piece has some more advice for you on how to serve your searchers better. This time, you’re going to do it by analyzing SERPs.

How to Analyze SERPs to Win Big in Rankings

https://cxl.com/blog/analyze-serps/

Adam Steele, writing for CXL, brings us this look at how you can analyze SERPs to win big in rankings.

He starts with a short history lesson on how SERPs have changed. He points out that SERP features are frequently the top result for most searches, and that nearly all searches are now heavily customized for intent.

[Image: how SERPs have changed since 2013]

This emphasis on intent has turned out to be a great thing for SEOs. Now, we have a simple, visual way to confirm what Google thinks a keyword means. All we need to do is perform a search and analyze the SERPs that appear.

Adam’s guide takes us through how we can use that analysis to confirm that a content plan will satisfy the keyword’s intent.

He starts with a clear example. He performs a search with the word “Apple” and shows what comes up.

There isn’t a single first-page result for the fruit. Instead, it’s all about the tech company. Good luck ranking for the word “apple” if what you’re selling comes by the bushel.

[Image: SERP for “apple” showing the tech company, not the fruit]

He also points out that making even small shifts in the query can change the intent significantly. As an example, he compares the terms “my SEO sucks” and “why does my SEO suck.”

The first one turns up an SEO agency by the same name and nothing else. The second one turns up informational articles and Quora questions. These examples point out why it’s crucial that you analyze SERPs rather than guessing intent.
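To make the idea concrete, here is a toy sketch of my own (not from Adam’s guide): given the titles you’ve collected from a query’s top results, tally rough intent signals and see which intent dominates before committing to a content plan. The signal word lists are illustrative assumptions, not a standard.

```python
# Toy sketch (my own, not from Adam's guide): tally rough intent signals in
# the titles collected from a query's top results. The keyword lists below
# are illustrative assumptions, not a standard.
INFORMATIONAL = ("how", "why", "what", "guide", "tips")
TRANSACTIONAL = ("buy", "price", "pricing", "services", "agency", "hire")

def dominant_intent(titles: list[str]) -> tuple[str, dict[str, int]]:
    scores = {"informational": 0, "transactional": 0}
    for title in titles:
        text = title.lower()
        if any(word in text for word in INFORMATIONAL):
            scores["informational"] += 1
        if any(word in text for word in TRANSACTIONAL):
            scores["transactional"] += 1
    return max(scores, key=scores.get), scores

# Hand-collected example titles for "why does my SEO suck":
titles = [
    "Why Does My SEO Suck? 7 Common Mistakes",
    "How to Fix a Site That Won't Rank",
    "My SEO Sucks: An SEO Agency",
]
print(dominant_intent(titles))
```

In practice you would look at result types (features, videos, product pages) as well as titles, but even a crude tally like this makes a mismatch between your planned content and the SERP obvious.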

After that, he gives us a list of tips to get more traffic by snagging different search features.

He closes with a reminder that these changes are happening at a rapid pace and urges us to keep a close eye on real-world SERPs rather than just ranking data.

Now, let’s look at the news. First, a small algorithm update may have dropped on the 27th of January.

Smaller Google Search Ranking Algorithm Update On January 27th

https://www.seroundtable.com/small-google-search-ranking-algorithm-update-30843.html

We reported in the last roundup that the core update had been announced as complete. Barry Schwartz brings us some evidence suggesting that may not have been the case.

He tracked a series of conversations that started around the 27th. Across the internet, SEOs were experiencing strange fluctuations and reporting them in forums. Shortly after these conversations began, the fluctuations started appearing in tracking tools.


Barry tracked them across RankRanger, SERPmetrics, SEMrush, and other tools. In each one, he identified spikes that occurred between January 26th and 27th.

While the tools all seem to agree that nothing has happened since then, the question of whether there may be more aftershocks of this update remains open.

As our final news item, we’ll be looking at what Google has to say about duplicate content.

Google: Same Content in Different Formats is Not Duplicate

https://www.searchenginejournal.com/google-same-content-in-different-formats-is-not-duplicate/393595/

We all know that duplicate content can be a problem, but it’s harder to say what “duplicate content” really means.

In a presentation this month, John Mueller helped to clarify the definition by declaring that identical content published in different formats is not considered duplicate content.

[Image: John Mueller on duplicate content]

For example, he said, if you repurpose a video as an article (or vice versa), Google doesn’t count it against you. The same holds even if both formats are word-for-word identical.

John went further, saying that Google has neither the technology nor the interest to run a full-text analysis of videos. They don’t try to compare the text of videos to other content.

He even argued that having the same content in different formats may serve searchers, and site owners should feel encouraged to do it. It sounds like good news for anyone who is trying to develop their content by putting good information in a new format.

 

As a final note for this month, I wanted to mark the passing of one of SEO’s important thinkers: Hamlet Batista.

I did not have the pleasure of knowing him personally, but it was impossible to miss his influence.

He was a constant source of new ideas and an inspirational figure to many of my close friends in the industry. My thoughts go out to them and his family.

 

Got Questions or Comments?

Join the discussion here on Facebook.


Matt is the founder of Diggity Marketing, LeadSpring, The Search Initiative, The Affiliate Lab, and the Chiang Mai SEO Conference. He actually does SEO too.

 



