Editing robots.txt.liquid

Search engines, such as Google, constantly crawl the internet in search of new data for their search results. The robots.txt file tells search engine bots, known as crawlers, which pages on your online store they can or can't request. All Shopify stores have a default robots.txt file that's optimal for Search Engine Optimization (SEO).

Search engines use your sitemap to index your online store and place it in their search results. Learn how to find and submit your sitemap.

Overview

The default robots.txt file works for most stores, but you can edit the file through the robots.txt.liquid theme template. You can make the following edits:

  • allow or disallow certain URLs from being crawled (see the sketch after this list)
  • add crawl-delay rules for certain crawlers
  • add extra sitemap URLs
  • block certain crawlers
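
For example, a customized robots.txt.liquid might look like the minimal sketch below. It is based on Shopify's documented robots Liquid object (robots.default_groups, with each group exposing user_agent, rules, and sitemap); the /internal-search path, the 10-second crawl delay, and the extra sitemap URL are illustrative placeholders rather than values from this page.

```liquid
{%- comment -%}
  Sketch of a customized robots.txt.liquid. It renders the default
  groups, then appends example rules to the catch-all (*) crawler group.
  The path, delay, and extra sitemap URL are placeholders.
{%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Example: extra rules for the catch-all crawler group {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /internal-search' }}
    {{ 'Crawl-delay: 10' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

{%- comment -%} Example: an extra sitemap URL appended after the default groups {%- endcomment -%}
Sitemap: https://example.com/extra-sitemap.xml
```

Because the template only appends to the default output, the rules Shopify generates for every store are preserved, and the custom Disallow and Crawl-delay lines apply only to the wildcard user agent rather than to every crawler group.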
