Managing LLMS.txt (AI Crawlers) for Your SHOPLINE Store
LLMS.txt works similarly to robots.txt, but is specifically intended for AI crawlers and assistants (e.g., ChatGPT, Claude, Copilot, Gemini, Perplexity). It tells these agents which content they may access, summarize, or index, and which content they must not use.
Because uploading files directly to your store’s root directory isn’t currently supported, the recommended approach is:
- Upload LLMS.txt to your File library, then
- Create a 301 redirect from "/llms.txt" to the file’s public URL.
Follow this guide to properly configure your LLMS.txt file and ensure AI crawlers interact with your store content according to your preferences.
What Is LLMS.txt
LLMS.txt is a simple plain-text file that declares your store’s AI interaction policy—think of it as the AI counterpart to robots.txt for search engines. With this file, you can state which pages or sections AI crawlers and assistants (such as GPTBot for OpenAI/ChatGPT, ClaudeBot for Anthropic, and PerplexityBot) may read and summarize, and which sensitive or private areas (for example, checkout, account, or admin) they must not access or use.
LLMS.txt works alongside your SEO settings and does not replace robots.txt; use robots.txt to manage search engine behavior, and LLMS.txt to guide AI tools.
| Note: To learn how to set up robots.txt, please refer to "robots.txt Management for Advanced SEO." |
Differences Between LLMS.txt and robots.txt
| Topic | LLMS.txt | robots.txt |
| --- | --- | --- |
| Audience | AI crawlers and assistants (LLMs) | Search engine crawlers |
| Purpose | Controls what AI tools may read, summarize, and use for training | Controls which pages search engines may crawl or index |
| Location | Served at /llms.txt via a 301 redirect to the File library (workaround) | Natively located at /robots.txt |
| Typical Rules | Blocks sensitive flows (e.g., checkout, account, admin) from AI access | Sets SEO-driven allow/disallow patterns for crawling and indexing |
| Relationship | Complementary to robots.txt but independent: controls AI crawlers and assistants. Changes here do not affect search engine indexing. | Complementary to LLMS.txt but independent: controls search engine crawlers. Does not apply to AI crawlers. |
Publishing LLMS.txt via 301 Redirect
Before You Start
You’ll need:
- Admin access to your SHOPLINE store
- Permission to upload files to File library
- Permission to create 301 URL redirects
How to Upload the LLMS.txt File to the File Library
- From your SHOPLINE admin panel, go to Settings > File library and click the Upload files button in the upper-right corner.
- In the pop-up window, click the upload area and select the LLMS.txt file, or drag and drop it into the upload area.
- Click the copy icon on the right of the asset row. The link will be copied automatically, and a Copied successfully message will appear.
| Note: To learn more about how to upload your local assets to the File library, please refer to "Uploading Local Assets to the File Library." |
How to Create 301 URL Redirects
- From your SHOPLINE admin panel, go to Settings > Domains. Locate the 301 Redirect section and click Manage redirections.
- Click Add redirect in the upper-right corner.
- In the pop-up window, fill in the Redirect from and Redirect to fields and click Add to save the redirect:
- Redirect from: Enter the path /llms.txt
- Redirect to: Paste the LLMS.txt file link you copied from the File library.
| Note: To learn more about how to create and manage a 301 URL redirect, please refer to "Creating and Managing 301 URL Redirect." |
Test Your 301 Redirect Setup
- Open a browser and enter your SHOPLINE store URL (e.g., https://abc.myshopline.com), then append /llms.txt to the end (i.e., https://abc.myshopline.com/llms.txt).
- You should see the contents of your LLMS.txt file (not a 404 or a file download).
- If something looks cached, hard refresh your browser or retry in an incognito window.
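If you prefer to verify the redirect from a script rather than the browser, a short check like the one below can confirm that /llms.txt resolves to your uploaded file and serves readable text. This is a minimal sketch using Python and the third-party requests library; the store domain is the example from this guide and should be replaced with your own.

```python
# Minimal sketch: confirm that /llms.txt 301-redirects to the File library
# link and serves readable text. Requires the third-party requests library
# (pip install requests). Replace the example domain with your store URL.
import requests

STORE_URL = "https://abc.myshopline.com"  # example store from this guide

response = requests.get(f"{STORE_URL}/llms.txt", allow_redirects=True, timeout=10)

print("Redirect chain:", [r.status_code for r in response.history])  # expect [301]
print("Final URL:", response.url)            # should be your File library link
print("Status code:", response.status_code)  # expect 200
print("First lines served:")
print("\n".join(response.text.splitlines()[:5]))
```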
Using and Understanding the LLMS.txt Template
LLMS.txt Template
You can use the following LLMS.txt template provided by SHOPLINE. Copy it into a plain-text file, then follow the steps in "Publishing LLMS.txt via 301 Redirect" to complete the setup.
# We use SHOPLINE as our e-commerce platform.
# Public product, blog, and FAQ content may be accessed for summarization or question answering only.
# Do not use our content for model training or dataset creation.
User-Agent: *
Disallow: /admin
Disallow: /checkouts
Disallow: /cart
Disallow: /orders
Disallow: /trade
Disallow: /checkout
Disallow: /invoices
Disallow: /payment_methods
Disallow: /search
Disallow: /products/search
Disallow: /user
Disallow: /transit_page
Disallow: /api/
Disallow: /preview
Disallow: /apple-app-site-association
Policy: allow-output, disallow-training, disallow-derivative-works
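If you prefer to generate the file programmatically rather than copying it by hand, a sketch like the one below writes the template to a UTF-8 plain-text file named llms.txt, matching the file basics described in the next section. The TEMPLATE string is truncated to a few lines for brevity; paste the complete template shown above.

```python
# Minimal sketch: save the template as a UTF-8 plain-text file named llms.txt,
# ready to upload to the File library. TEMPLATE is truncated here for brevity;
# paste the complete SHOPLINE template from this guide.
TEMPLATE = """\
# We use SHOPLINE as our e-commerce platform.
User-Agent: *
Disallow: /admin
Disallow: /checkout
Policy: allow-output, disallow-training, disallow-derivative-works
"""

with open("llms.txt", "w", encoding="utf-8", newline="\n") as f:
    f.write(TEMPLATE)
```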
LLMS.txt Rules and Format Explanation
This section will help you read and customize the LLMS.txt template. You’ll learn what User-Agent, Allow, and Disallow mean, and how paths are matched.
File Basics
- Encoding & Name: Plain text (UTF-8), typically named "llms.txt".
- Comments: Any line starting with # is ignored by crawlers (useful for notes).
- One Directive per Line: Keep rules short and one per line.
- Paths Only: Rules use paths (starting with "/"), not full domains.
Core Directives
- User-Agent: Names the crawler a rule applies to (e.g., GPTBot for OpenAI/ChatGPT, ClaudeBot for Anthropic, PerplexityBot).
  Note: User-Agent: * means "all AI crawlers."
- Allow: Explicitly permits a path or section.
- Disallow: Blocks a path or section.
  Note: When Disallow is empty, it means "allow everything."
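To make path matching concrete, the sketch below shows one plausible way a crawler could apply these directives: rules are grouped by User-Agent, each Allow/Disallow value is treated as a path prefix, and the longest matching prefix wins. This is an illustrative assumption modeled on common robots.txt behavior, not a specification of how any particular AI crawler evaluates LLMS.txt.

```python
# Illustrative sketch (assumed prefix matching, not any crawler's official
# logic): parse LLMS.txt directives and decide whether a path is allowed
# for a given user agent.

def parse_rules(text):
    """Collect Allow/Disallow rules per User-Agent, skipping # comments."""
    rules, current_agents = {}, []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "user-agent":
            current_agents = [value]
            rules.setdefault(value, [])
        elif key in ("allow", "disallow"):
            for agent in current_agents:
                rules[agent].append((key, value))
    return rules

def is_allowed(rules, agent, path):
    """Longest matching prefix wins; an empty Disallow allows everything."""
    applicable = rules.get(agent, []) + rules.get("*", [])
    best = ("allow", "")  # default: allowed
    for kind, prefix in applicable:
        if prefix and path.startswith(prefix) and len(prefix) > len(best[1]):
            best = (kind, prefix)
    return best[0] == "allow"

sample = """User-Agent: *
Disallow: /checkout
Disallow: /admin
"""
rules = parse_rules(sample)
print(is_allowed(rules, "GPTBot", "/products/blue-shirt"))  # True
print(is_allowed(rules, "GPTBot", "/checkout/confirm"))     # False
```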
Best Practices
You can use this quick checklist to review your LLMS.txt file:
- Do keep rules short, one per line, and use clear sections.
- Do test your SHOPLINE store URL by appending "/llms.txt" after publishing and after any theme or domain changes.
- Do use 301 redirects from "/llms.txt" to your uploaded file (root upload isn’t supported).
- Don’t include your domain in rules—use paths only.
- Don’t rely solely on query-based blocking; prefer path blocking (e.g., Disallow: /checkout/).
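If you maintain the file over time, a small script can flag the most common mistakes from this checklist, such as full URLs where a path is expected. The checks below are assumptions drawn from this guide (not an official SHOPLINE or LLMS.txt validator), and the lint_llms_txt helper is hypothetical; adjust it to your own policy.

```python
# Minimal sketch: flag common LLMS.txt mistakes from the checklist above
# (full domains in rules, rule values that do not start with "/").
# These checks are assumptions based on this guide, not an official validator.

def lint_llms_txt(text):
    problems = []
    for number, line in enumerate(text.splitlines(), start=1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue
        key, _, value = stripped.partition(":")
        if key.strip().lower() in ("allow", "disallow"):
            value = value.strip()
            if value.startswith(("http://", "https://")):
                problems.append(f"Line {number}: use a path, not a full URL: {value}")
            elif value and not value.startswith("/"):
                problems.append(f"Line {number}: paths should start with '/': {value}")
    return problems

# Assumes llms.txt sits in the current working directory.
with open("llms.txt", encoding="utf-8") as f:
    for problem in lint_llms_txt(f.read()):
        print(problem)
```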