There is much talk about this little file... robots.txt. Is it really useful? What should you put inside?
In fact, robots.txt implements the Robots Exclusion Protocol. Every spider/robot is expected to read it and apply the rules it contains. Its directives can be used to ask a spider/robot not to crawl a specific directory, a specific file, or, more generally, any resource on your hosting space/account.
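As a minimal sketch, here is what such a file might look like (the directory and file names are hypothetical examples, not recommendations):

```
# Rules for all crawlers
User-agent: *
# Block an entire directory
Disallow: /private/
# Block a single file
Disallow: /secret-page.html

# Rules for one specific crawler
User-agent: Googlebot
Disallow: /no-google/
```

The file must be named exactly robots.txt and placed at the root of the site (e.g. https://example.com/robots.txt); well-behaved crawlers fetch it before requesting other pages. Note that compliance is voluntary: the file is a convention, not an access control mechanism, so it should never be relied on to protect sensitive resources.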