dokku robots.txt

Creates a robots.txt file for an app, which can be set to allow or disallow web crawlers. This is useful when deploying websites that you do not want indexed by search engines. For instance, you may want to deploy a production app with robots allowed and a staging app with robots disallowed.
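The allow/disallow toggle maps onto the standard robots exclusion format. The exact files this plugin writes are not shown here, but conventional allow-all and disallow-all robots.txt contents look like this (the mapping to each subcommand is an assumption):

```
# Presumed output of robots.txt:disallow — ask all crawlers to skip everything
User-agent: *
Disallow: /

# Presumed output of robots.txt:allow — an empty Disallow permits everything
User-agent: *
Disallow:
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it does not enforce access control.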

requirements

  • dokku 0.4.x+

installation

# on 0.4.x+
sudo dokku plugin:install https://notabug.org/candlewaster/dokku-robots.txt.git robots.txt

commands

robots.txt:disallow <app>   Discourages web crawlers from indexing this app
robots.txt:allow <app>      Doesn't discourage web crawlers from indexing this app

usage

# Discourage web crawlers from indexing myapp
dokku robots.txt:disallow myapp

# Don't discourage web crawlers from indexing myapp
dokku robots.txt:allow myapp

# View robots.txt for myapp
dokku robots.txt myapp