It's true. Don't think you're special just because you got penalized by Google. Everybody's doing it. Kind of part of the growth process for an online enterprise. So get used to it.
Learn. Read. Unwind.
It's a sad fact that the most severe Google penalties are triggered by SEOs. While many site owners inadvertently trigger rank issues, the real damage is done by the professionals. This is caused by a number of factors, among them the fact that what worked yesterday may trigger penalties today. In many respects the rules are changing faster than the ability of the SEOs to keep up.
The goal of this site is to provide insight on the possible causes and solutions for the myriad Google penalty environments we see daily. The successful management of a business that relies on search requires attention to the constantly rising standards demanded by Google of high ranking sites. A penalty is a wake up call to action that can be used as a learning experience. Even if you ultimately hire a professional to remediate your site, read this site and learn as much as you can. You have work to do.
A sudden loss of rank that persists is an existential threat to a business dependent on search traffic. So of course, the first step in the remediation process is to determine whether a penalty is involved. If you have a manual action notice in Webmaster Tools the question is moot - you are definitely penalized. But many times the rank loss is not accompanied by any messaging from Google. The automated updates like Penguin and Panda suppress ranks without communicating their actions. Fortunately, you can make an educated guess by lining up the release dates of the major updates with negative changes in your search traffic, and there are free tools that can help you do this.
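To make that educated guess concrete, here is a minimal Python sketch of the idea. The traffic numbers are illustrative only, the `correlate` helper is hypothetical, and the update date is approximate (Panda 4.0 rolled out around May 20, 2014) - the point is simply to flag sharp day-over-day traffic drops and match them against known update release dates:

```python
from datetime import date

# Illustrative daily organic-traffic counts keyed by date, as you might
# export them from your analytics package. Values are made up.
traffic = {
    date(2014, 5, 18): 1200,
    date(2014, 5, 19): 1150,
    date(2014, 5, 20): 610,   # sharp drop
    date(2014, 5, 21): 590,
}

# Known algorithm update release dates (Panda 4.0 was around 2014-05-20).
updates = {"Panda 4.0": date(2014, 5, 20)}

def drops(series, threshold=0.3):
    """Yield dates where traffic fell by more than `threshold` day-over-day."""
    days = sorted(series)
    for prev, cur in zip(days, days[1:]):
        if series[prev] and (series[prev] - series[cur]) / series[prev] > threshold:
            yield cur

def correlate(series, updates, window_days=3):
    """Match each traffic drop to any update released within `window_days` of it."""
    hits = []
    for d in drops(series):
        for name, released in updates.items():
            if abs((d - released).days) <= window_days:
                hits.append((name, d))
    return hits

print(correlate(traffic, updates))
```

A tight match between a drop and a release date is suggestive, not proof - but repeated across several updates it is about as close to a diagnosis as the automated penalties allow.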
If you received an unnatural links manual action notice, or if your rank loss closely correlates with a Penguin update, you probably need to clean up your link profile. If you see a tight correlation with a Panda update read this post.
But often there is no clear cut correlation pointing to a specific update. It is critical to be able to know with some confidence whether the ranks are harmed for a search compliance reason, a technical problem with the site implementation, a regulatory issue, a problem with Google's index, poor optimization, or something else entirely.
One very common problem for client sites is the presence of unknown or unexpected URLs indexed by Google. A careful review of Google's index of your site can reveal problems that are otherwise invisible. The simple site:domain.com search is incredibly valuable, but it comes with some surprising flaws, especially with the metrics and date stamps.
These searches can reveal important information regarding Google's index of your site.
site:domain.com
Reveals the URLs in Google's index. If you get no results, the site is either severely penalized with an exclusion penalty, or the rank loss is the result of something that prevented the site from being indexed. Make sure what you see comports with what should be indexed - check for the presence of every URL important to your search traffic.
One flaw in these results shows up in the count of indexed pages. The number you see will probably change when you reach the last page of the results. Also, files blocked by robots.txt may be counted in the total but won't have a cached record, so your actual numbers are often significantly lower.
site:domain.com keyword
Reveals the URLs in Google's index ordered by their strength on the word "keyword." The first result will be the URL that holds rank for that term. If the URL that should hold the rank is not #1, there is a problem.
Looks for the presence of "https" in the url string. If the site is serving https and http urls, you do not want the entire site indexed both ways. If you're only using https on certain pages, check this search to make sure https is confined to those known urls, and not accidentally being implemented across the entire site.
Looks for the presence of "http" in the url string. If the site is serving ONLY https urls (as Google is now encouraging), you do not want the entire site indexed both ways. Check this search to make sure http urls are NOT being indexed.
Because subdomains are often generated and indexed inadvertently, we strongly recommend that you run these next steps to make certain no unknown subdomains are currently indexed, creating duplicates of existing pages.
site:domain.com -inurl:www
Looks for URLs without "www" in the URL string. It can also reveal https URLs, but this is the first step we use to discover all the subdomains that are indexed. If this search reveals a subdomain, add it to the search (see the next example).
site:domain.com -inurl:www -inurl:subdomain1
Looks for URLs without "www" and without "subdomain1" in the URL string. If another subdomain is revealed, repeat the search, adding -inurl:subdomain2 to your search string. As you eliminate the visible subdomains, others will appear if they are indexed. When there are no longer any results, you have them all.
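The iterative exclusion process above can be sketched in a few lines of Python. The `next_query` helper is hypothetical - it just rebuilds the search string as each newly discovered subdomain is added to the exclusion list:

```python
def next_query(base_domain, found_subdomains):
    """Build the next site: query, excluding www and every subdomain found so far."""
    parts = ["site:" + base_domain, "-inurl:www"]
    parts += ["-inurl:" + sub for sub in found_subdomains]
    return " ".join(parts)

# Start with no known subdomains...
q = next_query("domain.com", [])
# ...then, after the first search surfaces "subdomain1", exclude it too:
q = next_query("domain.com", ["subdomain1"])
print(q)  # matches the example search above
```

You keep feeding each revealed subdomain back into the list until the search returns no results, at which point the exclusion list is your complete inventory of indexed subdomains.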
All websites, whether a small blog or a large e-commerce enterprise, are vulnerable to actions taken by search engines without warning. The web environment is clearly not without risk, and those risks may be outsized compared to our knowledge of them. For example, most penalized sites (excluding those harmed by SEOs) are not penalized for black hat tactics or intentional violations of the best practices guidelines published by Google. By far, most of the sites suppressed in natural search fall into non-compliance completely inadvertently, and sometimes because of the actions of third parties.
While Google has made large advances in its spam detection capabilities, it still requires human intervention to keep its results clean and relevant, because the number of players seeking to game the system is constantly growing and throwing new challenges its way. Many penalties result from hackers commandeering the ranks of existing sites, either by directly hacking them or by otherwise influencing Google's index.
And in some cases, Google is the cause of the problem. One case in point is links - either paid or unnatural. Google has been penalizing sites that flagrantly buy text links, yet the demand for those links continues to grow. This is because, in spite of clear messaging that paid links are black hat and not welcome, paid links continue to work. And there is a complication in penalizing sites for links: Google is unable to reliably distinguish a paid link from a naturally occurring one. This leads to all kinds of ethical issues when penalties are involved, especially when they are triggered by third parties.
The recent barrage of penalties resulting from Google's aggressive enforcement policies has generated huge demand for penalty experts, and they have suddenly appeared. Some are the same guys who blew up their clients' ranks a short time ago, now expert at removing the very links they were recently building. Many claim a 100% success rate. Most of the SEOs making this claim are addressing an unnatural link penalty - a manual action that has been one of the most common, and the most straightforward to remove: identify, remove, and disavow the offending links, and the manual action is revoked. If you are thorough, you'll probably succeed eventually. But the trigger behind a rank penalty can be much more complex. We have seen many sites penalized for technical reasons unrelated to content or links. The number and frequency of the recent algorithm updates indicate a desire by Google to improve the experience of searchers and eliminate loopholes that were being abused. These changes disrupt the search results and create winners and losers. The real experts understand that complex processes like the interaction of a website with Google's algorithm are not fully understood, even as we attempt to influence them and develop solutions for penalties. And from all we've seen so far, it's only going to get more complicated as Panda continues to update.
A major change in the evaluation of content began in 2011 with the advent of Panda, which put new emphasis on presentation and user experience. The most recent actions reward user engagement to a surprising degree, and as sites are rewarded for implementing more and better engagement, those not heeding the call will experience their rank loss as a penalty. This new order, created by ever higher standards for inclusion in the search results, is changing the web, as webmasters gradually figure out that without high quality engagement a site has little chance of ranking for anything. Google wants the pages it recommends to be valuable to the searcher: very user friendly, offering information that is useful and presented elegantly, with images, media, links to authority sites, choices unique to the page, and no dead ends. The old days when relevancy was all-important in determining the results have been giving way to higher presentation standards for a while now, and we expect the trend to continue. Businesses that rely on high ranks are scrambling to figure out how to compete in the new world of search, where the best content is not enough - and "engagement" has become the holy grail.