WebSite X5 Pro
WebSite X5 generates the robots.txt file and adds it to the root directory of the website, so that search engines know which contents to exclude from indexing.
Robots are programs that scan the web automatically, for different reasons: search engines such as Google™, for example, use them to index website contents; spammers, on the other hand, use them to obtain e-mail addresses without authorization.
A website owner uses the robots.txt file to tell robot programs which parts of the website they may access. Robots (at least the trustworthy ones) check for the presence of a robots.txt file, and follow the instructions in it, before accessing a website.
So, the robots.txt file is simply a text file with a list of instructions that specify:
- the kind of robot that the rules apply to
- the URLs of the pages to be blocked.
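As a sketch of this structure, a rule block could look like the following (the folder name is only an illustration, not part of WebSite X5's default file):

```
User-agent: *
Disallow: /private
```

The first line states which robots the rules apply to (the wildcard * means all of them), and each Disallow line gives a URL path that those robots should not visit.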
WebSite X5 provides a default robots.txt file which tells all robots to ignore the contents of certain service subfolders, such as admin and captcha:
User-agent: *
Disallow: /admin
Disallow: /captcha
Disallow: /menu
Disallow: /imemail
You can change these basic instructions, according to your specific requirements.
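To check how a compliant robot would interpret your instructions before publishing them, you can test the rules with Python's standard urllib.robotparser module. This is an optional verification step, not part of WebSite X5; the domain below is a placeholder.

```python
from urllib import robotparser

# The default rules generated by WebSite X5.
rules = """User-agent: *
Disallow: /admin
Disallow: /captcha
Disallow: /menu
Disallow: /imemail
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A compliant robot may fetch ordinary pages...
print(rp.can_fetch("*", "https://example.com/index.html"))      # True
# ...but must skip the disallowed service folders.
print(rp.can_fetch("*", "https://example.com/admin/page.html"))  # False
```

If a URL you expect to be indexed comes back False, revise the Disallow lines before uploading the website.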
To edit and include the robots.txt file, you need to:
- Go to Step 1 - Website Settings > SEO.
- Open the Base section and select the Include the robots.txt file option.
- Use the field underneath to manually edit the instructions to include in the robots.txt file.
#tip - For more information on robots and on how to create a robots.txt file, see the official website http://www.robotstxt.org or consult Google™ Webmaster Central (Block or remove pages using a robots.txt file).