To influence what automated visitors see when they visit your website, you can use a robots.txt file. You'll need to be able to create a file at the root level of your web site, and you can use any standard text editor (e.g., TextEdit, Notepad) to make your robots.txt file.
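For instance, a minimal robots.txt might look like this (the `/private/` directory name is just a placeholder for illustration):

```
# Applies to all robots
User-agent: *
# Ask robots not to crawl this directory
Disallow: /private/
```

A blank `Disallow:` line, by contrast, tells robots they may crawl everything.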
You may want to check the syntax of your live robots.txt file (the one on your site now) using a robots.txt validator. Robots.txt validators can be very useful for verifying that you've set your robots.txt file up to do what you really want it to do. They help reduce human error, especially as the file gets larger.
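If you'd rather script a quick check yourself, Python's standard library ships a robots.txt parser. Here's a small sketch that parses some rules and asks whether a given path may be fetched (the rules and paths are made-up examples):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, inline for the example
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ask whether any robot ("*") may fetch these paths
print(parser.can_fetch("*", "/private/secret.html"))  # False
print(parser.can_fetch("*", "/public/page.html"))     # True
```

In practice you'd point `RobotFileParser` at your live file with `set_url(...)` and `read()`, then test the URLs you care about.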
Remember that while web robots (search engine spiders and such) often do obey the instructions they find in robots.txt files, a robots.txt file is not a security measure. A robots.txt file is a set of suggestions about what machines should and should not access; it won't control who or what can actually reach anything on your web site. To actually restrict access to your site (or just certain directories), you can often set up authentication, authorization, and access control using a .htaccess file.
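As a rough sketch, on an Apache server a .htaccess file that requires a login for a directory might look like the following (the file path and realm name here are hypothetical, and your host must allow .htaccess overrides):

```
# Require a valid username/password for this directory
AuthType Basic
AuthName "Restricted Area"
# Path to a password file created with the htpasswd tool (example path)
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Unlike robots.txt, this is enforced by the server itself, so it applies to robots and humans alike.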