
---
title: Robots.txt file
linkTitle: Robots.txt
description: Hugo can generate a customized robots.txt in the same way as any other template.
categories: [templates]
keywords: [robots,search engines]
menu:
  docs:
    parent: templates
    weight: 230
weight: 230
aliases: [/extras/robots-txt/]
---

To generate a robots.txt file from a template, change the site configuration:

{{< code-toggle file=hugo >}}
enableRobotsTXT = true
{{< /code-toggle >}}

By default, Hugo generates robots.txt using an internal template.

```text
User-agent: *
```

Search engines that honor the Robots Exclusion Protocol will interpret this as permission to crawl everything on the site.

## robots.txt template lookup order

You may override the internal template with a custom template. Hugo selects the template using this lookup order:

  1. /layouts/robots.txt
  2. /themes/<THEME>/layouts/robots.txt
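To override the internal template, you could start by creating a project-level template at the first location in the lookup order (a minimal sketch using standard shell commands; the file contents are illustrative):

```shell
# Create a project-level robots.txt template, which takes
# precedence over any theme-provided template.
mkdir -p layouts
printf 'User-agent: *\n' > layouts/robots.txt
```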

## robots.txt template example

{{< code file=layouts/robots.txt >}}
User-agent: *
{{ range .Pages }}
Disallow: {{ .RelPermalink }}
{{ end }}
{{< /code >}}

This template creates a robots.txt file with a Disallow directive for each page on the site. Search engines that honor the Robots Exclusion Protocol will not crawl any page on the site.
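A variation on this idea, sketched here with a hypothetical `norobots` front matter parameter (not a Hugo built-in), would disallow only the pages that opt out of crawling while leaving the rest of the site crawlable:

```go-html-template
User-agent: *
{{ range .Pages }}
{{- if .Params.norobots }}
Disallow: {{ .RelPermalink }}
{{- end }}
{{- end }}
```

Pages would then set `norobots: true` in their front matter to be excluded.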

{{% note %}} To create a robots.txt file without using a template:

  1. Set `enableRobotsTXT` to `false` in the site configuration.
  2. Create a robots.txt file in the `static` directory.

Remember that Hugo copies everything in the `static` directory to the root of `publishDir` (typically `public`) when you build your site.

{{% /note %}}
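For example, a hand-maintained robots.txt in the `static` directory might look like this (the `/admin/` path is purely illustrative):

```text
User-agent: *
Disallow: /admin/
```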