How I Lost 50% Website Traffic By Being Careless With Subdomains
I’ve seen and read a lot of blog posts recently about making mistakes — in particular, how it’s sometimes good to make mistakes, as they provide a platform for learning, improving oneself, and not repeating them in the future.
This post is exactly that: looking back on a decision that wasn’t quite right, and that ultimately had an impact on my business.
Everyone with a website wants traffic; everyone who wants that website to succeed needs it. Visitors are the backbone of a site, the reason we spend days, weeks and months optimising it to squeeze out every last search engine position we can.
I’ll admit that I’m quite the addict when it comes to statistics and analytics. I have the real-time analytics running in a tab all day and, much to my partner’s dismay, am constantly checking on my mobile when away from the office. It will come as no surprise, therefore, that I was shocked, and somewhat bemused, when the traffic to this website started to gradually decline with each week that passed.
Every website will no doubt experience a drop in traffic at some point in its life. This could be due to the search engines changing their algorithm, the site receiving a penalty for something or other, a change of marketing tactics, a redesign and so forth. Sometimes it can be easy to figure out the reason; other times there will be no obvious culprit.
After a few weeks of this slow and steady decrease, with traffic now at 50% of what it had been in previous months, it was time to investigate and dig a little deeper.
In the graphic above you can see that the traffic is decreasing at a constant rate, then increases when I find and resolve the issue.
The Culprit? Subdomains
I found the reason for the drop in traffic by checking which pages Google had indexed for my site.
The first thing I noticed was that the number of pages being indexed was much higher than the number of pages I actually had on my site. Scrolling through the list of results, I could also see pages being referenced that didn’t belong to the site itself. These pages actually belonged to development and staging sites I was hosting on subdomains.
When a site was ready to show to the customer prior to going live, I would place it on a public subdomain for them to preview and provide feedback. Given that some of these sites could sit in a review phase for a few weeks, even months, there was enough time for Google to index them — and for that duplicate content to start affecting the ranking of the primary site.
As with anything related to search engines, there was unfortunately no instantaneous fix. There were, however, steps I could take to resolve this as soon as possible and prevent it occurring again in the future:
Request URL removal
The ‘Remove URLs’ feature in Google Webmaster Tools provides a quick and easy way to request the removal of URLs from Google’s index.
Use a robots.txt
By uploading a robots.txt file to the root of the subdomains it’s possible to instruct search engine bots to not spider the site. Simply place this in the robots.txt file:
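A disallow-all robots.txt, which tells all compliant crawlers to skip the entire site, looks like this:

```
User-agent: *
Disallow: /
```

Bear in mind that robots.txt stops compliant bots from crawling the site; URLs that have already been discovered can still linger in the index until they’re removed, which is where the URL removal requests above come in.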
Set META tags
Similar to the above, we can tell bots not to index our pages by adding the following META tag to the <head> of each page:
<meta name="robots" content="noindex,nofollow" />
Note: If you’re using the subdomain as a development site, remember to remove the robots.txt file and/or META tags when the site eventually goes live.
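One way to make that last step harder to forget is to emit the tag conditionally, based on the hostname, rather than hard-coding it into the templates. Here’s a minimal sketch in Python — the subdomain prefixes and hostnames are hypothetical, so adapt them to your own setup:

```python
# Emit the noindex META tag only on staging/preview subdomains, so the tag
# disappears automatically when the same code serves the live domain.

STAGING_PREFIXES = ("dev.", "staging.", "preview.")  # hypothetical prefixes

def robots_meta_tag(hostname: str) -> str:
    """Return a noindex META tag for staging hosts, an empty string otherwise."""
    if hostname.lower().startswith(STAGING_PREFIXES):
        return '<meta name="robots" content="noindex,nofollow" />'
    return ""

# Example: the staging host gets the tag, the live host does not.
assert robots_meta_tag("dev.example.com").startswith("<meta")
assert robots_meta_tag("www.example.com") == ""
```

The same idea works for robots.txt: serve the disallow-all version only when the request comes in on a staging subdomain.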
After making the updates, the references to the subdomains started to disappear from Google about 7–10 days later. The traffic began to rise and, as of this last week, has returned to where it was previously. Needless to say, I’ll be a lot more strategic and careful when setting up subdomains as a means of previewing a website for a customer.