Disallow site indexing in EE4

On some servers, such as development servers, we don’t want search engines to index any of the hosted sites. We could manually add a robots.txt to every site, but that is tedious and hard to maintain.

To disallow site indexing on all sites – current and future – in EE4, we need to add the following block to /opt/easyengine/services/nginx-proxy/vhost.d/default:

location = /robots.txt { return 200 "User-agent: *\nDisallow: /\n"; }
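nginx expands the `\n` escapes inside the double-quoted `return` string, so crawlers receive a two-line body. As a quick sketch, `printf` reproduces the exact response body a crawler would see:

```shell
# Reproduce the robots.txt body that nginx's return directive emits:
# the \n escapes become real newlines, yielding two lines.
printf 'User-agent: *\nDisallow: /\n'
# prints:
# User-agent: *
# Disallow: /
```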

You can use the following one-liner to add it:

cat <<EOF | sudo tee /opt/easyengine/services/nginx-proxy/vhost.d/default
location = /robots.txt { return 200 "User-agent: *\nDisallow: /\n"; }
EOF

Note that tee overwrites the file; if it already contains configuration you want to keep, use tee -a to append instead.
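If you want to dry-run the heredoc before touching the live config, the same pattern can be pointed at a temporary file first. This is just a sketch; the `mktemp` path stands in for the real vhost.d/default:

```shell
# Dry-run: write the location block to a temp file (stand-in for
# /opt/easyengine/services/nginx-proxy/vhost.d/default) and confirm
# the line landed intact. The unquoted EOF delimiter leaves \n literal,
# which is what nginx expects in its config.
tmpfile=$(mktemp)
cat <<EOF | tee "$tmpfile" > /dev/null
location = /robots.txt { return 200 "User-agent: *\nDisallow: /\n"; }
EOF
grep -c 'Disallow' "$tmpfile"   # prints 1
rm -f "$tmpfile"
```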

Any config placed in /opt/easyengine/services/nginx-proxy/vhost.d/default gets included in nginx-proxy’s config for every site.

You need to reload the nginx-proxy config for the changes to take effect. Run the following command to reload nginx-proxy:

ee service reload nginx-proxy