
Rank Loss 2024 : Negative SEO : Broken Google : AI Content

 

Reversing Rank Loss 2024

by: Bob Sakayama
updated April 2024

What happened to Google's cached version of your site?

You used to be able to see the cached date of your indexed urls by doing a site:domain.com search and clicking a link in that result. No longer. Now you need to run a specific search using cache:https://www.domain.com (with the full url) to see the cached version. And Danny Sullivan says even that's going away soon. This is a big loss for those of us obsessed with Google, since the cached information is extremely useful for understanding how frequently your site is crawled and exactly how Google sees your pages. A very distant cached date is a warning that a page is not being refreshed within Google - sometimes fixable by making the content more robust. It may also indicate that your site is having trouble with its crawl budget - an index polluted with too many urls, or other problems you may be able to recover from if you can see the symptoms. So this change is basically Google becoming less transparent - taking away our ability to analyze performance. If you can, complain about this.
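For example (using a placeholder domain - substitute your own):

cache:https://www.example.com/   [view Google's cached copy, full url required]
site:example.com   [list your indexed urls]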

Think you're penalized?

What you're seeing is probably not strictly a Google penalty, where a manual action against your site appears in Search Console. With a manual action, you may not rank for anything. In our experience, manual actions are now rare. But the good thing about a manual action is that it comes with actionable information - the reasons for the penalty: pure spam, unnatural links to your site, cloaking and/or sneaky redirects, thin content with little or no added value, hidden text and/or keyword stuffing, spammy free hosts, user generated spam. You just need to fix the flagged items to restore your ranks.

But it's much more likely that your rank loss is algorithmic. Google's automation is flagging something it doesn't like, but unlike a manual action, you get no clues. The worst thing about it is the mystery - not knowing why things are collapsing is hugely stress-inducing because you're helpless to stop the decline.

The experts - your marketing team, seo consultant, and the people who control your website - may have already suggested fixes: rewriting and optimizing tags, adding entities to content, improving speed - and you've probably tried them all.

Some version of the above is very common. Many business owners feel cursed in search and have been stuck in a bad place for a long time. But in spite of how it feels, it's very unlikely that your site is cursed. A long-lived website representing a legitimate business that previously held productive ranks is unlikely to crash and burn without reason. Some of those reasons are relatively new and not yet on the radar of those keeping watch.

 

The Secret Handshakes

The increasing complexity of every aspect of an internet connection creates more opportunities for failure. Search performance depends on successfully implementing layers of instructions and code that dynamically impact the entire site, so a tiny code error can create a harmful misfire on an important attribute across every page of your site. Everyone makes mistakes. The fact that mistakes can scale is scary.

And these kinds of errors go undetected because no one knows to look for them. That would include any misconfiguration that doesn't reveal itself as broken.

The biggest sources of hidden errors are developers who are unaware of the search consequences of their code - especially true for theme developers. We've seen many instances where rank issues were the result of the automated misuse of tags, canonicals, redirection, parameters, robots.txt, etc., hidden in the CMS used to manage the site.
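Hidden errors like these can be surfaced with a quick scripted audit. Below is a minimal sketch in Python - it assumes the requests and beautifulsoup4 packages, and the urls are hypothetical placeholders - that fetches a few representative pages and flags canonical or robots tags that look wrong across the template.

    # Minimal sketch: audit representative pages for template-wide tag
    # misconfigurations. Assumes requests + beautifulsoup4; urls are
    # hypothetical placeholders.
    import requests
    from bs4 import BeautifulSoup

    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/products/widget",
        "https://www.example.com/blog/sample-post",
    ]

    for url in PAGES:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        canonical = soup.find("link", rel="canonical")
        robots = soup.find("meta", attrs={"name": "robots"})

        # A theme bug that points every canonical at the homepage, or
        # stamps noindex onto real pages, scales across the whole site.
        if canonical and canonical.get("href", "").rstrip("/") != url.rstrip("/"):
            print(url, "-> canonical points elsewhere:", canonical.get("href"))
        if robots and "noindex" in robots.get("content", "").lower():
            print(url, "-> carries a noindex directive")

Run it against a handful of page types (home, product, article) - if the same mistake shows up on every page type, the template, not the content, is the culprit.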

 

Out Of Your Hands

There have been significant changes in the way popular platforms handle seo - the responsibility is now in the hands of the theme developers, one degree of separation from the platform - think WordPress, Shopify. These themes generate additional urls, use canonical tags to prevent inappropriate indexing, and generally control how seo is deployed. Judging by the theme complaints, this has not worked out well.

One of the biggest issues with themes is with Core Web Vitals - checker: https://pagespeed.web.dev/ - basically a test of page load speed. Google wants to see a Largest Contentful Paint under 2.5 seconds. Yeah, right. We don't know of a single site that meets this metric.

If your site is failing the CWV test, test the theme's demo url. This will tell you whether the theme itself is the problem. We've seen this many times, so it's clearly widespread. In most cases the theme needs code fixes - changing the loading order of the called resources, the way canonicals are deployed, the automation handling important tags, the javascript controlling css, etc.
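One way to make that comparison concrete: the same Lighthouse data behind the checker above is available from the public PageSpeed Insights v5 API. A minimal sketch in Python, assuming the requests package and hypothetical urls:

    # Minimal sketch: compare Largest Contentful Paint for your page vs.
    # the theme vendor's demo via the PageSpeed Insights v5 API.
    import requests

    PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def lcp_seconds(url):
        data = requests.get(PSI, params={"url": url, "strategy": "mobile"},
                            timeout=120).json()
        audit = data["lighthouseResult"]["audits"]["largest-contentful-paint"]
        return audit["numericValue"] / 1000.0  # Lighthouse reports milliseconds

    # Hypothetical urls: your site vs. the theme's demo.
    for url in ["https://www.example.com/", "https://demo.example-theme.com/"]:
        print(url, "LCP:", round(lcp_seconds(url), 1), "s")

If the demo fails by roughly the same margin as your site, the theme itself is the bottleneck, not your content.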

 

Broken Google

Index Pollution / Crawl Budget

Broken Google can create an issue that harms ranks. Google tells us that a redirected url won't be indexed - same for a url returning a 404, or a url canonicaled to a different url. When Google doesn't honor these directives, it contributes to undesirable growth of its own index.

Google doesn't like sites that generate massive amounts of low quality content that clog up its index, hence the need for Google to set a crawl budget - a limit on the resources Google is willing to expend on your site. Polluting the index can definitely harm performance if Google has to slow the crawl of your site to deal with the excess urls.

We have improved search performance by cleaning up Google's index of our clients' sites.

We understand Google's need to enforce its rules, but in many cases the problem is Google's failure to respect the very directives that should have prevented it - the redirect and the canonical. A page with a redirect should not get indexed; it's the redirect target that gets crawled and indexed. Same for the canonical: the canonical url should get indexed, not the page that declares it. We know of many sites where urls that should have been excluded are indexed anyway.

We came up with a solution that is elegantly simple - a conditional noindex. It's a few lines of code you can add to your template so every page gets it. If certain conditions are met, it emits a noindex robots meta instruction. As the site gets crawled, inappropriately indexed pages get removed from Google's index. Run the tests below - if you see large numbers of urls that should not be indexed, this fix should work. It may not work on some AWS hosting accounts, where a redirection occurs in front of all other code so the noindex is never read. If you control the server you won't have a problem.
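The exact conditions depend on what the tests below turn up. As an illustration only - a sketch of the idea in Python/Flask, not our production code - the logic is the same in a php or liquid template: inspect the request, and emit the meta tag only when a condition matches.

    # Minimal sketch of a conditional noindex (Flask used for illustration).
    # should_noindex() encodes whatever conditions your index tests reveal.
    from flask import Flask, request

    app = Flask(__name__)

    def should_noindex():
        # Example condition: parameter urls that pollute the index.
        return "sort" in request.args or "page" in request.args

    @app.route("/")
    @app.route("/<path:path>")
    def page(path=""):
        meta = '<meta name="robots" content="noindex">' if should_noindex() else ""
        # In a real template the meta tag belongs in the <head> of every page.
        return f"<html><head>{meta}</head><body>page content</body></html>"

As Google recrawls, pages matching the condition drop out of the index; everything else is untouched.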

 

Test Your Site For Index Pollution

These searches show urls indexed by Google that meet conditions that might reveal issues (a scripted way to run the same checks follows the list):

site:domain.com -inurl:https   [not secured]
site:domain.com inurl:https   [secured]
site:domain.com inurl:http:   [not secured]

site:domain.com inurl:www   [www version]
site:domain.com -inurl:www   [non www version]

site:domain.com inurl:page   [possible pagination parameter]

site:domain.com inurl:sort   [possible sort parameter]
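If you'd rather script these checks, Google's Custom Search JSON API can run the same queries. A minimal sketch in Python, assuming you've created an API key and a Programmable Search Engine id (cx) configured to search the whole web - the counts are estimates, but large numbers where you expect zero are the red flag:

    # Minimal sketch: run the index pollution searches via the Custom
    # Search JSON API. API_KEY and CX are placeholders you supply.
    import requests

    API_KEY = "YOUR_API_KEY"
    CX = "YOUR_ENGINE_ID"
    DOMAIN = "domain.com"  # your domain

    QUERIES = [
        f"site:{DOMAIN} -inurl:https",  # not secured
        f"site:{DOMAIN} inurl:www",     # www version
        f"site:{DOMAIN} -inurl:www",    # non www version
        f"site:{DOMAIN} inurl:page",    # possible pagination parameter
        f"site:{DOMAIN} inurl:sort",    # possible sort parameter
    ]

    for q in QUERIES:
        resp = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": CX, "q": q},
            timeout=30,
        ).json()
        total = resp.get("searchInformation", {}).get("totalResults", "0")
        print(q, "-> ~" + str(total), "indexed urls")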

 

Wrong Url

A long-standing problem has been Google assigning a rank to the wrong url. We've seen this on many different sites. Some have healed on their own. In some cases we've had to intervene with canonicals or redirects to force the rank back to the intended url. These fixes have consequences, so they're always temporary.

The wrong url holding the rank is often a parent/child issue, where the search for a brand is held by a product url or vice versa. The confusion can be created by conflicting on-page semantics in the tags, filename, and/or content. There are instances where semantic fixes do not work and we need a code fix or a more drastic url change. The point being that Google is often not responsive to the simplest solutions.
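To illustrate the intervention (hypothetical urls, and a Flask sketch rather than anything platform-specific): a 301 from the url wrongly holding the rank to the intended url tells Google where the rank belongs; a canonical on the wrong url pointing at the right one is the softer version of the same signal.

    # Minimal sketch: force the rank back to the intended url with a 301.
    # These fixes have consequences - treat them as temporary.
    from flask import Flask, redirect

    app = Flask(__name__)

    # Hypothetical case: a product url is holding the brand's rank.
    @app.route("/products/acme-widget-deluxe")
    def misranked_product():
        return redirect("/brands/acme", code=301)  # 301 = permanent move

    # The canonical variant keeps the product page live but adds, in its <head>:
    #   <link rel="canonical" href="https://www.example.com/brands/acme">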

 

Negative SEO

https://www.theastronomycafe.net/tools/bobsakayama.html

You can pay someone to take down a competitor - see my post on third party attacks - which means that you can likewise be targeted. These are unethical services that have weaponized Google's search results.

We got a solicitation through one of our sites that might be a wakeup call. For $34.93, this service claims it will destroy the ranks of a competitor - weaponizing 3rd party SEO against your competitors.

 

This is the Google translation from Russian:


We greet you!
We are ready to offer the best "killer" runs for your competitors' sites. Total from 2 000 r.
- 100% result. Online sites will definitely "fall".
- The maximum possible number of negative feedbacks.
- Our special database - the most "deadly" sites out of 10,000,000 websites (viral, spam, porn, etc.). This works flawlessly.
- We do the run from 4 powerful servers at once.
- Continuous spamming of toxic links to the official email.
- Fulfillment of the order for 40-240 hours around the clock. Let's stretch it out as we please.
- Run with forbidden keywords.

 

[2 000 r. = $34.93 on 16 June]

 

Another very recent solicitation for SEO services emphasizing paid links has this url focused on negative seo:

https://www.strictlydigital.net/product/negative-seo-service/

 

 

Here is an example of an attack url's content. Note that it's not meant for humans - just wall-to-wall text jammed with links to the targets of the attack.

[screenshot: example attack url content]

This is one of thousands of urls, all from different referring domains, all designed to harm ranks by making the target look like it's running an overdone paid-text-link scheme - a noncompliant link profile. We have a procedure for finding and remedying this specific attack. Our bots are coded to identify attack urls by the nature and styling of the text. We then disavow all of those domains.

These negative SEO tactics supposedly trigger rank suppression of the target once Google sees a noncompliant link profile. Disavowing these links should signal that you are not responsible for them. But we've handled enough of these attacks to know there is some randomness in play - or other unknown issues. We've seen ranks begin to recover after submitting a disavowal, but also many instances of no change. We can't absolutely correlate disavowals with rank recovery, although sometimes, via date stamps, we can correlate the rank loss with the attack. Still, it's probably good to clean house and see what you find. It's almost always worse than you expected.
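To give a feel for the cleanup (a simplified sketch in Python, not our actual bots - the real detection keys on the nature and styling of the attack text): score each referring url from your backlink export, flag pages that are wall-to-wall links, and write the flagged domains in Google's disavow file format.

    # Minimal sketch: flag likely attack urls from a backlink export and
    # emit a disavow file. The heuristic (almost no text per anchor) is
    # illustrative; urls are hypothetical placeholders.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urlparse

    REFERRING_URLS = [
        "http://spam-example-one.xyz/page1.html",
        "http://spam-example-two.xyz/links.html",
    ]

    flagged = set()
    for url in REFERRING_URLS:
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        soup = BeautifulSoup(html, "html.parser")
        text = soup.get_text(" ", strip=True)
        links = soup.find_all("a")
        # Wall-to-wall links: very little text per anchor is the tell.
        if links and len(text) / len(links) < 50:
            flagged.add(urlparse(url).netloc)

    # Google's disavow format: one domain: entry per line, # for comments.
    with open("disavow.txt", "w") as f:
        f.write("# suspected negative seo attack domains\n")
        for domain in sorted(flagged):
            f.write("domain:" + domain + "\n")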


AI Content

