Following on from our blog post about the Google Panda update, I thought we’d go into a bit more detail about what can be done if your site literally vanishes from Google. The majority of these drastic drops in position are caused by one of the algorithm changes Google is almost constantly making, such as the Panda update. Quite often there are other things at work too, so let’s talk about what can be done to get your old positions back.

To get your positions back you need to work out what caused the penalty, address it, then beg Google for forgiveness (by submitting a reconsideration request).

Top 9 problems to solve when your website disappears from Google:

1. Check if your website went down

First off, double-check that the drop wasn’t caused by website downtime. If your website is offline when the search engine spiders come to re-crawl it, you are going to see a huge drop in positions. Google do not want their users to land on a website that is offline, as that’s about the worst result possible.

Unfortunately it’s then just a matter of waiting. Once the website is accessible again, its pages should start to show up as Google re-crawls and reprocesses them.
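
If you want to check this yourself rather than wait, a quick way (using example.com as a stand-in for your own domain) is to request the page headers from the command line and make sure the first line comes back as “HTTP/1.1 200 OK” (the different status codes are covered in point 3 below):

curl -I http://www.example.com/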

2. Check if your site breaks Google’s Quality Guidelines

Although Google is not open about everything they penalise websites for, they do publish a set of Quality Guidelines that explains some of the deceptive/manipulative tactics they may penalise you for. Have a read through these guidelines to make sure you’re not breaking any of the basics.

3. Check if Google can see your website

If Googlebot is denied access to your website by your robots.txt file or a meta tag, then Google simply isn’t going to index or rank it; you are essentially telling Google that you do not want them to list your website.

You can easily check whether Google can access your website in Google Webmaster Tools. Go to “Fetch as Googlebot” in the menu on the left; your website address should already be filled in, so just click “Fetch” and it should return “Success”:

Fetch Status: Success

Click “Success” and you should see a preview of how Google sees your website.

Fetch as Googlebot Results

The number you see at the top is the server response. If you do not get “HTTP/1.1 200 OK” at the top (or at least the 200) then something is very wrong.

200 Server Response

If you see a 404, that means that in Google’s eyes the page does not exist.

If you see a 301 or 302, the page is being redirected; the result should also show which page it is being redirected to.

Unfortunately, even if the page returns a 200 that doesn’t mean it can definitely be indexed: Google can still be blocked by meta tags and robots.txt.

Make sure your robots.txt does NOT disallow Googlebot

A robots.txt blocking all the search engine spiders looks like this: (You don’t want this)

User-agent: *
Disallow: /

A standard robots.txt looks like this: (You want this)

User-agent: *
Disallow:
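
Also watch out for rules that target Googlebot specifically. A robots.txt like this (shown purely as an illustration) would block Google while leaving other search engines alone: (You don’t want this either)

User-agent: Googlebot
Disallow: /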

Remove meta robots or meta googlebot

You should remove meta tags that look something like this:

<meta name="robots" content="noindex">

or

<meta name="googlebot" content="noindex">

The tags above block Google from indexing your site, and if a page is already in their index they will remove it the next time it is crawled.
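
If you would rather be explicit than simply remove the tag, a meta robots tag that allows indexing looks like this (shown purely as an illustration, as indexing and following links is the default behaviour anyway):

<meta name="robots" content="index, follow">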

4. Check for low-quality inbound links

1000 links for £9.99 may be an appealing offer, but the links will be extremely low quality and Google can penalise you for them. You should focus on getting natural one-way links to your site; quality is better than quantity, and good links can help you regain positions if you have been penalised.

5. Stop linking to other websites

This may sound like a strong heading, but who you link to does matter: if you link to a bad neighbourhood it can have a serious effect on your positions within Google. You should put rel="nofollow" on any links to external websites you do not 100% trust.
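
As an illustration, a nofollowed link looks something like this (example.com is just a placeholder for the site you’re linking to):

<a href="http://www.example.com/" rel="nofollow">Example Site</a>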

6. Stop stuffing your site with keywords

Unnatural Keyword Density Example

Unnaturally high keyword density is an extremely common cause of a drop in position for one particular keyword. Keywords should not be used excessively in your content or in the alt tags on images; for example, if a website mentioned “cars” too often it could stop ranking altogether for “cars”.

The content on your site should be natural; read it out loud and if it doesn’t sound right then it probably isn’t right to put on your website either.
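
To carry on the “cars” example (the alt text below is made up purely for illustration), a keyword-stuffed image alt tag looks like this: (You don’t want this)

<img src="car.jpg" alt="cars cheap cars buy cars used cars car sales cars">

A natural alt tag looks like this: (You want this)

<img src="car.jpg" alt="Red sports car parked outside our showroom">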

7. Remove Duplicate Content

A low amount of original content, and the same content being used on multiple pages, are two of the things the Google Panda update has been targeting. Each page of your website should have unique content; you shouldn’t be copying over 300 words of brilliantly worded marketing text onto every page just because it sounds great.

8. Improve your website

Addressing the points I have raised above is great, but you also have to remember the visitors to your website. Google has access to a huge amount of data from their browser, toolbar, operating system and so on. A website that gets 1000 visits, but where 90% of visitors immediately go back to the search results or only spend 20 seconds on the site, is not a good result. It’s unclear exactly which metrics Google takes into account, but it’s likely that the bounce rate, average time on site and the CTR from the search engines are all considered.

Improving the look/usability of your website and tweaking your meta descriptions can improve all of these and help show Google that visitors to your website do find it useful/interesting.
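
The meta description sits in the <head> of each page and is often what Google uses as the snippet under your title in the search results. A well-written one looks something like this (the wording is just a made-up example):

<meta name="description" content="Example Widgets Ltd hand-makes widgets in the UK, with free delivery on orders over £20 and a 30 day money back guarantee.">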

9. Have you been affected by the Panda Update?

Google’s “Panda” update was released in February 2011 and has been updated regularly since then. Its aim is basically to lower the rank of low-quality sites and return higher-quality sites nearer the top of the results. It’s a domain-level penalty, so it will affect your entire website; you can read more about the Google Panda update in our earlier post.

I’ve done all of the above, now what?

Once you have looked into all of the above issues, you are happy that your site does not break Google’s Quality Guidelines and you believe you are doing nothing wrong, you should submit a reconsideration request to Google.

If addressing the above issues didn’t work, you may have to buy a new domain name and start fresh: move your site over to the new domain, then start building up trust again with a user-friendly website, quality content and natural one-way links.

This blog post was written by Diana Esho – follow us on Twitter or Facebook for an inside look into the technical side of SEO.