1. If it does not already exist, create a robots.txt file in the root directory of the website.
2. Add these lines (as an example):
User-agent: * – the asterisk matches every bot, not just search-engine indexing spiders
Disallow: /directory/ – followed by the path to the directory to block. The trailing slash covers every file and sub-directory inside that directory; appending a wildcard (/directory/*) is equivalent, but the trailing slash alone is enough. Note that a bare Disallow: with no path blocks nothing and allows the whole site to be crawled.
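To check how crawlers that honor robots.txt will interpret these directives, Python's standard-library `urllib.robotparser` can parse the file and answer fetch queries. This is a minimal sketch; the `/private/` directory and the example.com URLs are hypothetical placeholders, not part of the original instructions.

```python
from urllib import robotparser

# Hypothetical robots.txt: block the /private/ directory for all bots.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Everything under /private/ is disallowed for any user agent...
print(rp.can_fetch("*", "https://example.com/private/file.html"))  # False

# ...while paths outside it remain allowed.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

Running this kind of check before deploying a new robots.txt helps catch mistakes such as the bare `Disallow:` directive, which blocks nothing.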