Check Your Robots.txt File

A robots.txt file is a plain text file that lives at the root of your site and tells search engine bots (like Googlebot) what content they are and are not allowed to crawl.

You can find your robots.txt file by visiting “YOUR DOMAIN NAME/robots.txt” in your browser. A page like this should show up…

Let’s review what this page is telling search engine bots:

  • User-Agent: * says that all of the following rules apply to every bot, not just Google’s
  • Disallow: /wp-admin/ says that bots are not allowed to crawl any page whose URL includes /wp-admin/. This is included so Google doesn’t rank the back end of your WordPress site (where you log in and make changes to your site)
  • Allow: /wp-admin/admin-ajax.php makes an exception so bots can still crawl that one URL. WordPress uses admin-ajax.php to load parts of your pages, so blocking it can prevent bots from rendering your pages correctly
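Putting those three directives together, a minimal WordPress robots.txt file looks like this:

```
User-Agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```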

Your robots.txt file should look like this at a minimum. Here are some things to check in your own file:

  1. Do you have a robots.txt file? If not, you should get one! Talk to your web developer or whoever built your site about adding one (you can learn more about building a robots.txt file here)
  2. Do you have any Disallow directives that you don’t recognize? If so, ask your web developer why they are there. Content that is disallowed can’t be crawled, so it will not rank in Google.
  3. Look out for “Disallow: /”. This instruction blocks every bot from crawling your entire site. No crawling means no ranking!
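If you’d like to test your rules programmatically rather than eyeball them, Python’s standard library ships a robots.txt parser. The sketch below uses the example rules from above against a hypothetical example.com domain; note that Python’s parser applies rules in the order they appear in the file (Google instead prefers the most specific matching rule), so the Allow line is listed before the Disallow line here.

```python
from urllib.robotparser import RobotFileParser

# The example rules from above, for a hypothetical example.com domain.
# Python's parser applies rules in file order, so Allow comes first here.
rules = """\
User-Agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())
# For a live site you would instead call:
#   rp.set_url("https://example.com/robots.txt"); rp.read()

print(rp.can_fetch("*", "https://example.com/wp-admin/options.php"))     # False: back end is blocked
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True: explicitly allowed
print(rp.can_fetch("*", "https://example.com/blog/my-post/"))            # True: no rule matches
```

This is also a quick way to confirm that a stray “Disallow: /” isn’t blocking pages you expect to rank.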

ACTION ITEM: Check Your Robots.txt File.