March 25, 2025

Customize robots.txt rules by domain

If you're using multiple domains to target different audiences in Shopify Markets, you can now create custom crawling rules for each domain using the request.host object in your robots.txt.liquid template.

This update gives you more control over how search engines crawl your content across different markets.

To implement domain-specific rules, wrap the directives you want to vary in a condition on request.host.

For example, the following snippet disallows crawling of the /en/ folder on the example.fr domain while leaving it crawlable on other domains. The loop over robots.default_groups keeps Shopify's default rules and sitemap intact and only appends the extra directive.


{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if request.host == 'example.fr' -%}
    {{ 'Disallow: /en/' }}
  {%- endif -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
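
On example.fr, the group for user agent * would then render with the extra directive appended after Shopify's defaults. The default rules are managed by Shopify and vary by user agent, so the following output is only a rough sketch:

User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /en/
Sitemap: https://example.fr/sitemap.xml

To apply the same rule to several domains, the condition can be extended, for example {% if request.host == 'example.fr' or request.host == 'example.de' %} (example.de is a placeholder here).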

Learn more about customizing robots.txt in the Shopify Dev Docs.