robots.txt is a plain text file used to tell robots (most importantly, search engine crawlers) which parts of your website they should not crawl and index. Without a robots.txt file, bots have free rein over your website, which is rarely what you want.
By default WordPress does not come with a robots.txt file, so this is something you will need to add yourself, and it is well worth doing.
The whole point of restricting access to certain areas of your website is to focus crawlers on the content that is actually important.
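As a quick illustration, a robots.txt file is nothing more than a list of rules. A minimal sketch, assuming a hypothetical /private/ folder you want crawlers to skip, looks like this:

User-agent: *
Disallow: /private/

User-agent: * means the rules apply to all bots, and each Disallow line names a path they should stay out of.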
What happens when you don’t have a robots.txt file?
In short, anything and everything can be indexed.
For WordPress that means things like admin pages (wp-admin), core files (wp-includes), theme and plugin files, and cgi-bin scripts can all end up crawled and indexed.
As you can probably guess, none of the above have any real SEO relevance, and having them indexed can even hurt your site. Google allots only a finite amount of what is known as link juice, so it makes sense to focus our efforts on our content and pages.
How can we control the indexation of our site?
The solution is really very simple.
Create a text file using your favorite text editor, or, if you have cPanel, just create a new file in the file manager, and name it robots.txt.
Copy and paste the following.
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/themes
Disallow: /wp-content/plugins/
Allow: /wp-content/uploads
Allow: /feed*
Save the file; if you created it off the server, upload it to the root directory of your site. This is the same directory where wp-admin, wp-includes, and wp-content are stored.
That’s it. Over the next few days and weeks you should start to notice the irrelevant pages dropping out of the index.
If you would like to block other files or folders beyond the above, just add another Disallow rule.
Just remember that we omit the site name and write the path to the folder starting with a forward slash (/).
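For example, to block a hypothetical /old-stuff/ folder (the folder name here is only an illustration), you would append one more line to the file:

Disallow: /old-stuff/

Note that the rule is just the path, with no domain in front of it, exactly as in the rules above.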