Customizing a site's robots.txt
A robots.txt file provides crawling guidance to search engines, such as the following:
- Which user agents may or may not crawl the site
- Which URLs to crawl or exclude from crawling
- The locations of sitemaps
- How frequently to crawl the site
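For example, a robots.txt file combining these directives might look like the following. The Disallow path and sitemap URL are illustrative, not Brightspot defaults.

User-agent: *
Disallow: /admin/
Crawl-delay: 10
Sitemap: https://www.example.com/sitemap.xml

Support for individual directives varies by crawler; Google, for instance, ignores Crawl-delay.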
Properly configuring your site's robots.txt enhances search-engine optimization.
Brightspot provides a default robots.txt with the following directives:

User-agent: *
Crawl-delay: 10

These directives apply to all user agents and request a delay of 10 seconds between crawl requests.
Brightspot creates the robots.txt file at the location in the Main > Default Site URL field. For example, if your site's default URL is https://brightspot.com, Brightspot creates the robots.txt file at https://brightspot.com/robots.txt.
To customize a site's robots.txt:
- Click Admin > Sites & Settings.
- In the Sites widget, select the site for which you are configuring robots.txt, or select Global to configure robots.txt for all sites.
- Click the search icon and type robots.txt.
- In the robots.txt field, enter directives for the search engine, as shown in the example following these steps. See the search engine's documentation for the list of honored directives.
- Click Save.
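For instance, to keep crawlers out of a section of the site while pointing them at your sitemap, you might enter directives such as the following in the robots.txt field. The Disallow path and sitemap URL are placeholders; substitute your own values.

User-agent: *
Disallow: /internal/
Sitemap: https://brightspot.com/sitemap.xml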