
Robots.txt Management for Advanced SEO

A robots.txt file is a text file that tells search engine crawlers which parts of your website they can access. It helps control which pages are crawled and, as a result, which pages can appear in search engine results.

By creating or modifying your robots.txt file, you can gain more control over how search engines crawl and index your website. This allows for greater flexibility in your SEO strategy.
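
For reference, a minimal robots.txt file might look like the sketch below. Crawlers read the file from top to bottom and apply the rules in the group that matches their user agent; the paths and sitemap URL here are placeholders, not values from your store:

  # Applies to all crawlers
  User-agent: *
  # Do not crawl cart or checkout pages (placeholder paths)
  Disallow: /cart
  Disallow: /checkout
  # Location of the XML sitemap (placeholder URL)
  Sitemap: https://www.example.com/sitemap.xml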

 

When to Edit Robots.txt

While the default robots.txt file works for most stores, you may need to edit it when you want finer control over how Google and other search engine crawlers access specific URLs on your store. Common scenarios include the following (see the sample directives after this list):

  • Allowing or disallowing the crawling of specific URLs
  • Adding crawl delay rules for specific crawlers
  • Incorporating additional sitemap URLs
  • Blocking certain crawlers
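
As a rough sketch, directives for these scenarios could look like the lines below; the paths, crawler names, delay value, and sitemap URL are placeholders rather than values from your store:

  # Allow or disallow crawling of specific URLs
  User-agent: *
  Disallow: /internal-search/
  Allow: /internal-search/featured

  # Add a crawl delay (in seconds) for a specific crawler
  User-agent: Bingbot
  Crawl-delay: 10

  # Incorporate an additional sitemap URL
  Sitemap: https://www.example.com/collections-sitemap.xml

  # Block a specific crawler from the entire site
  User-agent: BadBot
  Disallow: /

Note that Googlebot ignores the Crawl-delay directive, so delay rules only affect crawlers that honor it, such as Bingbot.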

 


 

Editing the Robots.txt File

Editing robots.txt requires a basic understanding of robots.txt syntax and SEO principles. To access and edit the file, follow these steps:

  1. In your SHOPLINE admin panel, go to Channels > Online Store > Preferences.
  2. Locate Robots.txt Management and click Go to settings.
  3. Use the Robots editor to make the necessary changes. For detailed instructions on Robots.txt, refer to the "Introduction to robots.txt" and "How to write and submit a robots.txt file" guides on Google Search Central.
Note: This feature is an optimized version of the Anti-crawler settings. If you have made edits in the Anti-crawler settings, the system automatically appends the corresponding Disallow rules to the end of the robots.txt file.
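
For example, if you had blocked a crawler through the Anti-crawler settings, the text appended to the end of the file might look something like the lines below (the crawler name is a placeholder, and the exact wording the system appends may differ):

  # Appended from Anti-crawler settings (illustrative only)
  User-agent: ExampleBlockedBot
  Disallow: /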

 
