Fix the robots.txt path

parent 1050d7a78f
commit dca195e9bd

2 changed files with 2 additions and 2 deletions
@@ -23,7 +23,7 @@ If you don't want your repository to be visible for search engines read further.
 ## Block search engines indexation using robots.txt
 
 To make Gitea serve a custom `robots.txt` (default: empty 404) for top level installations,
-create a file called `robots.txt` in the [`custom` folder or `CustomPath`](administration/customizing-gitea.md)
+create a file with path `public/robots.txt` in the [`custom` folder or `CustomPath`](administration/customizing-gitea.md)
 
 Examples on how to configure the `robots.txt` can be found at [https://moz.com/learn/seo/robotstxt](https://moz.com/learn/seo/robotstxt).
 
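For illustration only (not part of this commit), a minimal `robots.txt` placed at the corrected path `public/robots.txt` inside the `custom` folder might look like the following, which asks all crawlers not to index anything on the site:

```txt
# Applies to every crawler
User-agent: *
# Disallow the entire site from being indexed
Disallow: /
```

With such a file in place, Gitea would serve it at the top-level `/robots.txt` URL of the installation; the exact rules to use depend on what you want indexed, see the moz.com guide linked in the doc for more patterns.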
@@ -22,7 +22,7 @@ menu:
 
 ## 使用 robots.txt 阻止搜索引擎索引
 
-为了使 Gitea 为顶级安装提供自定义的`robots.txt`(默认为空的 404),请在[`custom`文件夹或`CustomPath`](administration/customizing-gitea.md)中创建一个名为 `robots.txt` 的文件。
+为了使 Gitea 为顶级安装提供自定义的`robots.txt`(默认为空的 404),请在 [`custom`文件夹或`CustomPath`](administration/customizing-gitea.md)中创建一个名为 `public/robots.txt` 的文件。
 
 有关如何配置 `robots.txt` 的示例,请参考 [https://moz.com/learn/seo/robotstxt](https://moz.com/learn/seo/robotstxt)。
 