Teams of all sizes can make mistakes. Here are a few basic SEO points to check if your site falls off Google—or your traffic falls off a cliff.
Are You Being Indexed?
Searching your own name in Google is one way to see if you’re still in the Google index, but a more reliable method is to log into Google Search Console (formerly Webmaster Tools) and check whether Google is crawling your site. If so, how many pages is it indexing? Does that number sound correct to you? Is it roughly the same week over week? If not, you may have a robots.txt, sitemap, or server issue.
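Search Console is the authoritative place to look, but as a quick supplementary sanity check, a short script can confirm your key pages aren’t carrying an accidental noindex signal. A minimal Python sketch; the URLs below are placeholders to swap for your own pages:

import urllib.request

PAGES = [
    "https://www.examplewebsite.com/",       # placeholder URLs:
    "https://www.examplewebsite.com/about/", # substitute your own pages
]

for url in PAGES:
    with urllib.request.urlopen(url, timeout=10) as resp:
        # noindex can arrive via the X-Robots-Tag header...
        header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
        # ...or via a robots meta tag in the HTML. This is a crude
        # string heuristic; a thorough check would parse the HTML for
        # <meta name="robots" content="noindex">.
        body = resp.read().decode("utf-8", errors="replace").lower()
        meta_noindex = 'name="robots"' in body and "noindex" in body
        if header_noindex or meta_noindex:
            print(url, "-> carries a noindex signal")
        else:
            print(url, "-> looks indexable")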
Is Your robots.txt File Blocking Crawlers?
It’s an easy enough mistake to make: you push changes live from a staging server (which you don’t want indexed) and the staging robots.txt file comes along with it. The easiest way to check this off your troubleshooting list is to load the file directly by adding /robots.txt after your domain. For example: ExampleWebsite.com/robots.txt
If your robots.txt file contains:
User-agent: *
Disallow: /
those two lines are blocking every crawler from your entire site. Get in touch with your web developer or SEO resource to upload a new file that won’t block Google.
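If you’d rather verify programmatically, Python’s standard library ships a robots.txt parser. A minimal sketch, with the domain as a placeholder:

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.examplewebsite.com/robots.txt")  # placeholder domain
parser.read()  # downloads and parses the live file

# Ask the same question Google asks: may Googlebot fetch the home page?
if parser.can_fetch("Googlebot", "https://www.examplewebsite.com/"):
    print("Googlebot may crawl the home page.")
else:
    print("robots.txt is blocking Googlebot; time to upload a fixed file.")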
Check Your Sitemap
If your sitemap doesn’t reflect all the pages you want indexed, that’s a good place to start. Sitemaps most commonly live at /sitemap.xml, but the exact location depends on how your website is built; asking your web developer, or Googling ‘sitemap’ plus the platform your website was built with, will get you started.
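Once you’ve found the sitemap, a quick script can tell you how many URLs it actually lists, which you can compare against the number of pages you expect indexed. A hedged sketch assuming a standard single-file sitemap at the common /sitemap.xml location (a sitemap index file would need one extra level of fetching):

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.examplewebsite.com/sitemap.xml"  # placeholder
# Standard sitemap XML namespace
LOC_TAG = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    tree = ET.parse(resp)

# Collect every <loc> entry and report the total
urls = [loc.text for loc in tree.iter(LOC_TAG)]
print(f"{len(urls)} URLs listed in the sitemap")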
Server Issues
If your server is timing out or taking too long to serve pages and media, Google will skip over the sections that aren’t coming up quickly. Moving to a new server or streamlining your back end can help; asking your web developer about the timeout errors is a good place to start.
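Before that conversation, you can gather some evidence yourself by timing a handful of pages. A rough sketch; the URLs and the 3-second threshold are illustrative placeholders:

import time
import urllib.request
from urllib.error import URLError

PAGES = [
    "https://www.examplewebsite.com/",      # placeholder URLs:
    "https://www.examplewebsite.com/blog/", # substitute your own pages
]
THRESHOLD = 3.0  # seconds; tune to taste

for url in PAGES:
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()  # include download time, not just time-to-first-byte
        elapsed = time.perf_counter() - start
        flag = "SLOW" if elapsed > THRESHOLD else "ok"
        print(f"{url}: {elapsed:.2f}s [{flag}]")
    except URLError as err:
        print(f"{url}: failed ({err})")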
Hopefully these points will help you do a little research before sending out a ‘The Site is Down!’ email, because having a site fall out of the rankings isn’t the same as having the site go down. If you have persistent SEO issues you’re struggling with, get in touch; we’d be happy to help.