Learn from My Mistakes: Blog Visibility WordPress Option
Well it’s time for another installment of Learn from My Mistakes and this one is just a bit embarrassing since I’m an SEO by trade. Not only does it involve losing rankings, traffic, and eventually cached pages in search engines (never a good thing), but it was all caused and fixed by one simple click of the mouse.
Allow me to set the stage a bit. As some of you know, I recently launched a new MLB Rumors blog. It’s been going quite well and in fact ranked extremely quickly for most of its main terms. However, a couple of weeks ago I began to notice the search traffic was tailing off a bit. With that blog the traffic is pretty seasonal, and since it was the holidays and there wasn’t much in the way of news, I figured it was just a normal slowdown.
A few days later, I noticed that our rankings had dropped a couple of spots for a few terms. Obviously I’m never a fan of that so I checked out the site to make sure everything was ok. Nothing seemed out of the ordinary so I chalked it up to typical fluctuation in Google and didn’t worry about it. Several days after that, our rankings had dropped significantly and worse yet, we noticed that several pages of the site were no longer cached.
Finally I knew something was up and began investigating. I looked at the .htaccess file and everything checked out. We didn’t have a robots.txt file, so we knew that couldn’t be it. After looking at all the usual suspects and coming up empty, I realized we had created another installation of WordPress in the root folder of the site. While this normally wouldn’t have been an issue, somehow the Blog Visibility setting in the Privacy Options had been set to “I would like to block search engines, but allow normal visitors.” Bingo!
Now, I’m not quite sure why anyone would ever want to use that option, but I was darn sure I didn’t. As it turns out, that option makes WordPress serve its own virtual robots.txt file as long as there isn’t a real one already present. So, even though we had looked at the .htaccess and knew we didn’t have a robots file, the search engines were being told to turn around and go back to wherever they came from. Obviously that doesn’t do nice things to your rankings.
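For the curious, the robots.txt that WordPress generates in that mode is essentially a blanket “Disallow: /” for every crawler (the exact output may vary by version, so treat that as an assumption). Here’s a minimal Python sketch of how you could check a robots.txt body for that kind of total block; the `blocks_all_crawlers` function is a hypothetical helper, and the parsing is deliberately simplified (it only looks for a global `Disallow: /` under `User-agent: *`):

```python
# Minimal sketch: detect a blanket "block everything" robots.txt like the
# one WordPress serves when the privacy option blocks search engines.
# Simplified parsing: only checks for "Disallow: /" under "User-agent: *".

def blocks_all_crawlers(robots_txt: str) -> bool:
    """Return True if robots.txt disallows the whole site for all user agents."""
    current_agents = []
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            current_agents.append(value)
        elif field == "disallow":
            # A bare "Disallow: /" applying to "User-agent: *" blocks everything.
            if value == "/" and "*" in current_agents:
                return True
        else:
            current_agents = []  # any other field starts a new record group
    return False

# The virtual robots.txt WordPress generates in "block search engines" mode
# looks essentially like this:
wp_blocked = "User-agent: *\nDisallow: /\n"
print(blocks_all_crawlers(wp_blocked))                    # True
print(blocks_all_crawlers("User-agent: *\nDisallow:\n"))  # False (empty Disallow allows all)
```

In practice you’d pair something like this with a fetch of yoursite.com/robots.txt (via urllib or similar) as a quick sanity check after launching or reinstalling a blog.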
After changing that pesky setting, things quickly started returning to normal. The site was cached again, our search engine listings once again had descriptions to go with the titles, and about a week after solving the mystery, our rankings returned. We’re still a spot or two below where we had been but I’m confident we’ll climb back up without too much trouble.
There are a few lessons to learn on this one. First and foremost, always, ALWAYS double check to make sure your WordPress blog is set to allow search engines. I know it seems like a basic thing, but if I managed to screw it up, I’m guessing someone else has along the way. Next, you’d be well served to keep a close eye on your search engine rankings and organic traffic. Drops in either can alert you to a possible problem, and often, if you catch things quickly, you can avoid major damage.
Also, it would be a good idea to keep a log of changes you make to your blog. When you install a new plugin, keep track of the date and maybe a few of the changes it involved. Now obviously if you correct a typo you probably don’t need to keep track of that; however, don’t dismiss anything else as too minor. You never know when just a slight tweak to a setting is going to help or hurt your search engine rankings. If you have a log to look back on, you’ll be sure you don’t forget or miss any possible cause of the change.
And last but certainly not least, don’t perform tests on sites that make you money. While my MLB Rumors site has taken off quite well, it doesn’t generate much, if any, income yet. Had the same thing happened on this blog, I could very well have missed out on hitting my $1k goal for the month of December. Whether you have a site specifically for testing, or you just use one of your less popular blogs, ALWAYS protect your money makers whenever possible.