# robotsTxt Reference

The `robotsTxt` function generates a `robots.txt` file for your Next.js application.
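As a minimal sketch of typical usage, assuming the function is exported from the package root (import path inferred from the package name) and returns a `MetadataRoute.Robots` object, which is the shape Next.js expects from an `app/robots.ts` metadata route:

```ts
// app/robots.ts
import type { MetadataRoute } from "next";
// Import path assumed from the package name; adjust to your install.
import { robotsTxt } from "seo-in-nextjs";

// Next.js serves the returned object as /robots.txt.
export default function robots(): MetadataRoute.Robots {
  return robotsTxt();
}
```

Pass a `robotsConfig` object to customize the generated file, as described below.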

## Parameters

The `robotsTxt` function accepts a single optional parameter: the `robotsConfig` configuration object.

### `robotsConfig`

**type:** `Partial<MetadataRoute.Robots>`

**Added in:** `seo-in-nextjs@1.0.0`

A configuration object that can contain the following properties:

#### `rules`

**type:** `object | object[]`

**Added in:** `seo-in-nextjs@1.0.0`

Rules for web crawlers, given either as a single rule object or as an array of rule objects. Each rule object can contain the following fields (a sketch follows the list):

- **userAgent** (`string | string[]`): The user agent(s) the rule applies to (e.g., `"*"`, `"Googlebot"`).
- **allow** (`string | string[]`): Path(s) that are allowed to be crawled.
- **disallow** (`string | string[]`): Path(s) that are not allowed to be crawled.
- **crawlDelay** (`number`): Time (in seconds) the crawler should wait between requests.
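For example, a blanket rule for all crawlers plus a stricter rule for one specific bot might look like this (same assumed import path and return shape as above):

```ts
import { robotsTxt } from "seo-in-nextjs";

export default function robots() {
  return robotsTxt({
    rules: [
      // Applies to every crawler.
      { userAgent: "*", allow: "/", disallow: "/admin/" },
      // A more specific rule for Googlebot only.
      { userAgent: "Googlebot", disallow: ["/private/", "/tmp/"], crawlDelay: 10 },
    ],
  });
}
```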

#### `sitemap`

**type:** `string | string[]`

**Added in:** `seo-in-nextjs@1.0.0`

The absolute URL(s) of your sitemap file(s); `Sitemap:` directives in `robots.txt` must be fully qualified.
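Both a single string and an array are accepted, so multiple sitemaps can be listed. Assuming the configuration is forwarded to Next.js's robots metadata route, one `Sitemap:` line is emitted per URL:

```ts
import { robotsTxt } from "seo-in-nextjs";

export default function robots() {
  return robotsTxt({
    // One Sitemap: line should be emitted per URL.
    sitemap: [
      "https://example.com/sitemap.xml",
      "https://example.com/blog/sitemap.xml",
    ],
  });
}
```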

#### `host`

**type:** `string`

**Added in:** `seo-in-nextjs@1.0.0`

The preferred domain for your site (e.g., `"https://example.com"`).
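Putting the pieces together, a full configuration might look like the sketch below. The trailing comment shows roughly what Next.js serves for the equivalent `MetadataRoute.Robots` object (note that `Host:` is a non-standard directive, though it is still written to the file):

```ts
import { robotsTxt } from "seo-in-nextjs";

export default function robots() {
  return robotsTxt({
    rules: { userAgent: "*", allow: "/", disallow: "/admin/" },
    sitemap: "https://example.com/sitemap.xml",
    host: "https://example.com",
  });
}

// Served at /robots.txt as roughly:
//
// User-Agent: *
// Allow: /
// Disallow: /admin/
//
// Host: https://example.com
// Sitemap: https://example.com/sitemap.xml
```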