Robots.txt is a text file placed on your site (www.yourdomain.com/robots.txt) that tells search robots which pages you would prefer them not to visit. Your website may contain sensitive data that you don't want search engines to index. Listing those pages in the robots.txt file lets crawlers that read it know not to crawl them.
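As a minimal illustration of the format (the directory names below are placeholders, not taken from any real site), a robots.txt file is just a series of User-agent and Disallow directives:

    User-agent: *
    Disallow: /private/
    Disallow: /tmp/

    User-agent: Googlebot
    Disallow: /no-google/

The asterisk applies the rules to all crawlers, while a block naming a specific user agent (here Googlebot) applies only to that robot.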
Creating a robots.txt file by hand isn’t easy: you need to know the list of user agents and the files and directories to disallow. That is why we use a robots.txt generator. MN Robots.txt Generator is a simple web service that lets you create a robots.txt file easily. Just select the options from the template and click create. Once the robots.txt file is created, copy and paste it into your site's root directory.
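After uploading the file, you may want to confirm that crawlers will interpret it the way you intended. One way to check is with Python's standard urllib.robotparser module; this is only a sketch, and www.yourdomain.com and the paths below are placeholders rather than real endpoints:

    # Minimal sketch: check which URLs a published robots.txt allows or blocks.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://www.yourdomain.com/robots.txt")
    parser.read()  # fetches and parses the live robots.txt

    # can_fetch() returns True if the given user agent may crawl the URL
    print(parser.can_fetch("*", "https://www.yourdomain.com/private/report.html"))
    print(parser.can_fetch("Googlebot", "https://www.yourdomain.com/index.html"))

If a page you meant to block still shows as fetchable, revisit the Disallow rules generated for it.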