There’s something uniquely jarring about logging into Google Analytics and realizing that your organic traffic has plummeted; more jarring than a drop in any other traffic source.
Maybe it’s because all that organic traffic is hard-earned; the result of months, even years, of optimization, content writing, and re-optimization, all to go from completely off the map to finally cracking page one. Every transaction it generates is all of that work finally paying major dividends.
Or maybe it’s because it can be frustrating as f*** figuring out what the hell happened.
Either way, time is a factor here, and there’s not a moment to lose. Where do we start, and what are we looking for?
Note: For the sake of this article, we’re going to assume that you’ve already ruled out all non-SEO issues. These include but are not limited to:
- Google Analytics tracking errors
- Seasonality of products or services
- Server issues (your site is partially or entirely down)
- Some sort of natural disaster
- A sexy new competitor who is clearly better than you and everybody knows it.
We’re also going to assume that you’ve already ruled out a manual penalty, since you are alerted to those in Google Webmaster Tools.
Robots.txt Issues
If the drop in traffic is sudden rather than gradual, you might be dealing with some sort of technical error on your site, and that means it’s time to pop over to Google Webmaster Tools. Navigate to the “Robots.txt Tester” (under “Crawl” in the left-hand nav) and take a look at your robots.txt file.
If you’re not sure what that is (although in this scenario you should be, because you made one already, ya dope), it’s basically a way for you to block crawlers from accessing certain pages on your site.
So for example, if you wanted to keep Google from crawling your shopping cart, you would use something like this:
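A minimal sketch of that rule, assuming your cart pages live under a hypothetical /cart/ path:

```
User-agent: *
Disallow: /cart/
```

The `User-agent: *` line applies the rule to all crawlers, and the `Disallow` line tells them to skip anything under /cart/.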
Where you might start seeing a problem is if you somehow managed to block pages that you want Google to index and rank. Or even worse, if you see this…
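That worst case is this two-line robots.txt, which tells every crawler to stay away from every page on your site:

```
User-agent: *
Disallow: /
```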
…ya done goofed, because you blocked all crawlers from your entire domain.
Luckily, Google Webmaster Tools has plenty of resources for editing, testing, and uploading your robots.txt file, so this one is an easy fix. If you know that there was work done on your site recently, check your robots.txt file and see if it was affected in the process.
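If you’d rather sanity-check your rules programmatically, Python’s standard-library robots.txt parser can tell you whether a given URL is blocked. The domain and paths below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Parse the same rules you'd serve at /robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /cart/",
])

# A disallowed path returns False, an allowed one returns True
print(rp.can_fetch("*", "https://example.com/cart/checkout"))    # False
print(rp.can_fetch("*", "https://example.com/products/widget"))  # True
```

Run this against your actual rules before uploading, and you can catch an accidental site-wide `Disallow: /` before Google does.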
Once you fix the issues, keep an eye on your rankings and traffic over the next week or so and see if that solves the problem.
XML Sitemap Issues
If it’s not a robots.txt problem, you might be having issues with your XML sitemap. Fortunately, this is another issue that can be diagnosed in Google Webmaster Tools. Head back under the Crawl option on the left nav bar, and click on Sitemaps.
The first thing you’ll want to do is to compare the number of URLs submitted and the number of URLs indexed (see below). The ratio should be pretty close to 1:1. If it’s not, then there’s either something wrong with your sitemap or you’re blocking Google from crawling certain pages (see robots.txt issues above).
When it comes to your sitemap, there are a number of things that can give you headaches. Things like:
Using the wrong URL format: If your preferred domain is https://yourdomain.com and all the URLs in your sitemap are written as https://www.yourdomain.com, Google will consider those to be two separate domains. Be sure to set up your preferred domain in GWT and make sure your sitemap matches it.
Not accounting for SSL encryption: If you’ve recently secured your site with SSL encryption (and good on you for doing so), you have to be sure to update your sitemap to reflect that. Even if your redirects are set up correctly (more on that later), it’s still important to include your new secured URLs in your sitemap. In other words, if the URLs in your sitemap start with http:// and the URLs on your site start with https://, it’s time to update your sitemap.
Sitemap serving a 404: If your sitemap is serving a 404 error, then you probably submitted the wrong sitemap URL, or else your sitemap just isn’t where you thought it was. Either way, ensure that you have the correct URL and resubmit it.
Using the wrong file format: For Google to read your sitemap, it must be in XML format. Most shopping carts or hosting services provide a way to generate an XML sitemap, but if yours doesn’t, there are many free options available.
These are just a few common issues, but there are many others that could arise.
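For reference, a bare-bones sitemap following the sitemaps.org XML protocol looks like this (the URL is a placeholder):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
</urlset>
```

Every URL you want indexed gets its own `<url>` entry, and each `<loc>` value should exactly match your preferred domain, protocol included.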
Incomplete HTTP to HTTPS Migration
Switching your site over to HTTPS is a pretty involved process that comes with a boatload of benefits, including higher rankings. But if not done correctly, or if not done completely, it can also bring a pantsload of problems for SEO managers, especially if the issues aren’t caught immediately.
If you’ve recently made the switch, you might be extra surprised to see your rankings and traffic take a dive. After all, it’s been almost three years since Google officially declared HTTPS a ranking signal, and its importance has only increased.
But alas, here we are. So what went wrong with the migration? In our experience, the problem usually comes down to bad or non-existent redirects.
You’d be amazed how often this gets overlooked. While many shopping carts or hosting providers will automatically redirect all non-secure URLs to their respective secured version, this is not always the case. And when that’s not taken care of, you end up with two versions of every single page on your website.
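If your host doesn’t handle the redirects for you, one common fix on an Apache server is a site-wide rule in .htaccess. This is a sketch, not a one-size-fits-all answer; the right approach depends on your server setup:

```
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The R=301 flag makes it a permanent redirect, which is what passes the old URL’s ranking signals along to the secure version.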
I know what you’re thinking. “But Dan,” you say, assuming that we’re on a first name basis, “I thought the duplicate content penalty was a myth.”
Technically, you’re right. There is no official penalty for duplicate content. BUT that doesn’t mean it has no negative effects. While Google will usually give priority to secured pages, it’s not guaranteed. And if there are two versions of the same page, it can be harder for Google to figure out which one it should rank. The problem gets worse when site owners neglect to update the links in their navigation, so non-secure URLs keep receiving internal links while their secure counterparts get none.
At the very least, this will make it take far longer for Google to start ranking the secure page. At the very worst, it can cause the rankings of both pages to drop.
And while we’ve already mentioned this once in this article, it bears repeating: For the love of all that is good and holy, update and resubmit your sitemap, and add your new secured domain to Google Webmaster Tools. If you don’t, your rankings WILL drop and you WILL lose traffic.
One final note: switching to HTTPS will also affect any canonical tags you may have in place, so be sure to update those as well.
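In practice, that means every canonical tag pointing at an http:// URL needs to point at its https:// equivalent instead (example.com is a placeholder):

```
<!-- Before the switch -->
<link rel="canonical" href="http://www.example.com/some-page/" />

<!-- After the switch -->
<link rel="canonical" href="https://www.example.com/some-page/" />
```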
Google Algorithm Update
Once everything else has been ruled out, it’s time to consider what many would call the worst case scenario: you’ve been negatively affected by a Google ranking algorithm update.
You’d think it would be a lot harder to tell when this happens now that Google no longer announces its updates, but luckily the SEO community is all over it. It doesn’t take long for major changes to get picked up by SEO publications, especially with access to tools like MozCast.
It usually just takes a quick Google news search to determine if and when (or at least a general idea of when) an algorithm update has dropped. Then it’s just a matter of comparing that to when your traffic/rankings disappeared. If the two line up, you may be onto something.
As of late, most (if not all) of Google’s updates have to do with one of three things: usability, security, or quality. The days of uber-optimizing low-quality content are coming to an end, and every update moves us further and further away. If your site is still behind the times, you’re going to feel it.
But each update is different, so do your research, make the necessary changes, and hope that it’s enough to get back into Google’s good graces.
Because the Goog giveth, and the Goog taketh away, but the SEO manager will taketh that s*** right back.
Learn more about Conversion Giant and our SEO services.