robotsTxt Reference
The robotsTxt function generates a robots.txt file for your Next.js application.
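For orientation, here is a minimal sketch of how the function might be wired into a Next.js `app/robots.ts` route. The import path is a placeholder, and the assumption that robotsTxt returns a `MetadataRoute.Robots` object is inferred from the parameter type below rather than stated by this reference:

```ts
// app/robots.ts
import type { MetadataRoute } from "next";
// Placeholder import path -- replace with the package that provides robotsTxt.
import { robotsTxt } from "robots-txt-library";

// Next.js serves the returned object as /robots.txt.
export default function robots(): MetadataRoute.Robots {
  return robotsTxt({
    rules: { userAgent: "*", allow: "/" },
  });
}
```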
Parameters
The robotsTxt function accepts one optional parameter: a configuration object.
robotsConfig
type: Partial<MetadataRoute.Robots>
A configuration object that can contain the following properties:
rules
type: object | object[]
Rules for web crawlers. Can be a single rule object or an array of rule objects. Each rule object can contain the following properties, as shown in the sketch after this list:
- userAgent (string | string[]): The user agent(s) the rule applies to (e.g., "*", "Googlebot").
- allow (string | string[]): Path(s) that are allowed to be crawled.
- disallow (string | string[]): Path(s) that are not allowed to be crawled.
- crawlDelay (number): Time (in seconds) the crawler should wait between requests.
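A sketch of both shapes, typed against Next.js's `MetadataRoute.Robots` (the paths and user agents are illustrative):

```ts
import type { MetadataRoute } from "next";

// A single rule object applying to all crawlers.
const singleRule: MetadataRoute.Robots["rules"] = {
  userAgent: "*",
  allow: "/",
  disallow: "/private/",
  crawlDelay: 10,
};

// An array of rule objects, one per crawler (or group of crawlers).
const multipleRules: MetadataRoute.Robots["rules"] = [
  { userAgent: "Googlebot", allow: "/" },
  { userAgent: ["Applebot", "Bingbot"], disallow: "/admin/" },
];
```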
sitemap
type: string | string[]
The URL(s) of your sitemap file(s).
host
type: string
The preferred domain for your site (e.g., "https://example.com").
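Putting the pieces together, a sketch of a complete configuration object that could be passed to robotsTxt; the URLs are illustrative:

```ts
import type { MetadataRoute } from "next";

// All three top-level properties together. Each is optional,
// since the parameter is typed Partial<MetadataRoute.Robots>.
const config: Partial<MetadataRoute.Robots> = {
  rules: [
    { userAgent: "*", allow: "/" },
    { userAgent: "Googlebot", disallow: "/drafts/" },
  ],
  // One or more sitemap URLs.
  sitemap: [
    "https://example.com/sitemap.xml",
    "https://example.com/blog/sitemap.xml",
  ],
  // The preferred domain for the site.
  host: "https://example.com",
};
```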