
Robots.txt Configuration (Anti-Crawler Settings)

Editing the Robots.txt file is an advanced SEO feature that gives sellers greater flexibility to meet store-specific SEO requirements.

The Robots.txt file tells search engine crawlers which pages of your site they may or may not crawl. Search engines check your website's Robots.txt file while crawling and indexing it. If you want finer control over the crawl requests made to your site, you can modify your Robots.txt file.
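To see how a crawler interprets these rules, you can parse a Robots.txt file with Python's standard `urllib.robotparser` module. The rules and URLs below are hypothetical, for illustration only:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical Robots.txt content: block /checkout, allow everything else
rules = """
User-agent: *
Disallow: /checkout
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A rule-abiding crawler may fetch the home page but not the checkout page
print(parser.can_fetch("*", "https://example.com/"))          # True
print(parser.can_fetch("*", "https://example.com/checkout"))  # False
```

This is also a convenient way to sanity-check your edited file before publishing it: paste the file's contents into `rules` and test the URLs you intend to allow or disallow.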


Applicable Scenarios for Robots.txt

The default Robots.txt file works for most stores. However, certain scenarios, such as preventing search engine crawlers like Googlebot from crawling particular URLs, require editing the Robots.txt file. These scenarios include:

  • Allowing or disallowing the crawling of particular URLs.
  • Adding crawl delay rules for specific crawlers.
  • Incorporating additional sitemap URLs.
  • Blocking certain crawlers.
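The scenarios above each map to a short block of Robots.txt directives. The paths, crawler names, and sitemap URL below are illustrative placeholders, not defaults from your store:

```
# Allow or disallow crawling of particular URLs
User-agent: *
Disallow: /cart
Allow: /

# Crawl-delay rule for a specific crawler
User-agent: Bingbot
Crawl-delay: 10

# Block a certain crawler entirely
User-agent: BadBot
Disallow: /

# Additional sitemap URL
Sitemap: https://example.com/sitemap.xml
```

Note that `Crawl-delay` is honored by some crawlers (such as Bingbot) but ignored by others, including Googlebot.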



Editing the Robots.txt

Before editing the Robots.txt file, you should be familiar with coding and SEO. For guidance on writing the file, refer to What is a Robots.txt file and How to write a Robots.txt file.

You can add or remove instructions within the file by following these steps:

  1. From your SHOPLINE admin panel, go to Channels > Online Store > Preferences.
  2. Locate Robots.txt Management.
  3. Click Go to settings.
  4. You will be redirected to the “Robots.txt editor”.
Note:
  • This feature is an optimized version of Anti-crawler settings. If you have entered anti-crawler pages in Anti-crawler settings, the system automatically appends the corresponding disallow rules at the end of the Robots.txt file.
  • This feature is currently in a gray (gradual) release, at 70% rollout as of January 18, 2024, and is expected to reach 100% on Monday, January 22, 2024.
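As a hypothetical illustration of how appended anti-crawler rules might look: if a page such as /members-only had been entered in Anti-crawler settings, the end of the generated Robots.txt file could contain lines like the following (the exact output depends on your store's configuration):

```
User-agent: *
Disallow: /members-only
```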