How do I optimise my robots.txt file within WordPress?

This article will detail how to optimise your WordPress robots.txt file.

What is a robots.txt file?

A robots.txt file tells search engine robots how to crawl your web pages, so they can index your content and follow its links, which in turn helps visitors find your site.

You can also use robots.txt to specify which pages search engine robots should NOT crawl, or to prevent specific robots from crawling your site at all.
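As a short illustration, a typical WordPress robots.txt might look like this (the paths shown are the common WordPress defaults, not values from your own site):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Here, * targets all bots, the Disallow line blocks the admin area, and the Allow line makes an exception for the admin-ajax.php file that many themes and plugins rely on.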

Where can I find my robots.txt file?

Your WordPress robots.txt file lives in your site's root directory, which you can access after connecting to your site via FTP. (If you don't see one, WordPress serves a virtual robots.txt by default; creating a physical robots.txt file in the root directory will override it.)

From there, you can manually edit your file to set which robots can crawl your pages, along with the pages they should and shouldn’t crawl. Details on how to do this can be seen in the section below.

How do I add/optimise my robots.txt file?

You can optimise your robots.txt file via a plugin or via FTP; both methods are outlined below.


Via a plugin

If you’re not particularly tech-savvy, you can install a WordPress plugin to manage your robots.txt file for you. To do this, simply follow the steps below:

Step 1 of 4

Start by installing and activating the plugin All in One SEO.


Step 2 of 4

From there, hover your cursor over All in One SEO in the left-hand menu and select Tools from the menu that appears.



Step 3 of 4

On the next page, you’ll be shown the Robots.txt Editor. Simply toggle on the Enable Custom Robots.txt option to enable the editor.



Step 4 of 4

From here, you can add your own rules by entering values into the provided text boxes:

  1. User Agent: the bots you wish to target. To target ALL bots, simply enter *
  2. Rule: whether to Allow or Disallow crawling
  3. Directory Path: the path the rule applies to

Once you’re happy, click Add Rule to insert your rule, followed by Save Changes.
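For example, entering * as the User Agent, Disallow as the rule, and /wp-admin/ as the directory path (illustrative values, not a recommendation for your site) would add this to your robots.txt:

```
User-agent: *
Disallow: /wp-admin/
```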




Via FTP

Step 1 of 4

Start by opening FileZilla.


Step 2 of 4

Enter your FTP details into the provided fields and click Quickconnect.



Step 3 of 4

Right-click your robots.txt file and select View/Edit.


Step 4 of 4

This will open the file in your default text editor (such as Notepad on Windows), where you can begin adding or editing your rules.

  1. User Agent: the bots you wish to target. To target ALL bots, simply enter *
  2. Disallow: the pages on your site you want robots to avoid

Be sure to save the file after making any changes; FileZilla will then prompt you to upload the edited file back to your server.
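Before uploading your edited rules, you can sanity-check them with Python's standard-library robots.txt parser. The rules below are examples only, not values read from your own site:

```python
# Check robots.txt rules with Python's standard-library parser.
# The rules here are illustrative, not values from your own file.
from urllib import robotparser

rules = """
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Any bot ("*") is blocked from the admin area, except admin-ajax.php.
print(parser.can_fetch("*", "/wp-admin/"))               # False
print(parser.can_fetch("*", "/wp-admin/admin-ajax.php")) # True
print(parser.can_fetch("*", "/blog/my-post/"))           # True
```

Note that Python's parser applies rules in order (first match wins), which is why the more specific Allow line comes before the broader Disallow here; Google's crawler instead gives precedence to the longest matching rule.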