Description
DB Robots.txt is an easy, automated solution for creating and managing a robots.txt file for your site. With it you can create a robots.txt file without FTP access.
If the plugin detects an existing XML sitemap, it is included in the generated robots.txt file.
It automatically includes the Host rule for Yandex.
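As an illustration, a generated robots.txt for a typical WordPress site might look like the following. The exact directives depend on the plugin version, and the sitemap URL is a placeholder:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```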
Installation
- Upload the bisteinoff-robots-txt folder to the /wp-content/plugins/ directory
- Activate the plugin through the ‘Plugins’ menu in WordPress
- Enjoy
FAQ
- Will it conflict with any existing robots.txt file?

If a physical robots.txt file exists on your site, WordPress won’t process any request for one, so there will be no conflict.
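For requests it does handle, WordPress builds robots.txt dynamically and lets plugins customize the output through the robots_txt filter. A minimal sketch of that mechanism, as an illustration only (not this plugin's actual code; the sitemap URL is a placeholder):

```php
<?php
// Illustration: append a Sitemap line to WordPress's dynamic robots.txt.
// A physical robots.txt file on disk bypasses this filter entirely.
add_filter( 'robots_txt', function ( $output, $public ) {
    // $public indicates whether the site allows search-engine indexing.
    $output .= "Sitemap: https://example.com/sitemap.xml\n"; // placeholder URL
    return $output;
}, 10, 2 );
```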
- Will this work for sub-folder installations of WordPress?

Out of the box, no. Because WordPress is in a sub-folder, it won’t “know” when someone requests the robots.txt file, which must be at the root of the site.
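One common workaround is a rewrite rule at the site root that forwards the request into the sub-folder. A sketch assuming Apache with mod_rewrite and WordPress installed in a hypothetical /blog/ folder:

```
# Hypothetical: serve requests for /robots.txt from the WordPress
# installation in /blog/, where the plugin generates the file.
RewriteEngine On
RewriteRule ^robots\.txt$ /blog/robots.txt [L]
```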
Contributors and developers
“DB Robots.txt” is open-source software. The following people have contributed to this plugin.
Changelog
2.3
- Tested with WordPress 6.2.
- Optimized the code.
- Added robots directives for the new image formats WebP and AVIF.
2.2
- Fixed the Sitemap option.
2.1
- Tested with WordPress 5.5.
- Added support for wp-sitemap.xml.
2.0
- Tested with WordPress 5.0.
- Removed the old Host directive, as it is no longer supported by Yandex.
- Improved and updated the robots directives.
- Added robots directives preventing indexing of duplicate links with UTM, Openstat, From, GCLID, YCLID, and YMCLID parameters.
1.0
- Initial release.