# Generate the robots.txt file for a site
seo-in-nextjs can generate the robots.txt file for your Next.js site with automatic configuration.
## Enabling robots.txt generation
1. Create a `robots.ts` or `robots.js` file in your project's `app` folder.

2. Import the `robotsTxt` function into the file:

   ```ts
   // app/robots.ts
   import type { MetadataRoute } from "next";
   import { robotsTxt } from "@dlcastillop/seo-in-nextjs";

   export default function robots(): MetadataRoute.Robots {
     return robotsTxt();
   }
   ```

   ```js
   // app/robots.js
   import { robotsTxt } from "@dlcastillop/seo-in-nextjs";

   export default function robots() {
     return robotsTxt();
   }
   ```

3. Open http://localhost:3000/robots.txt in your browser to see the generated robots file.
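Under the hood, Next.js serializes the `MetadataRoute.Robots` object returned by `robotsTxt()` into the plain-text response. The exact rules depend on the library's defaults, but the output follows the usual robots.txt format, for example:

```txt
User-Agent: *
Allow: /
```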
## Advanced robots configuration
By default, `robotsTxt()` generates a robots.txt file with sensible default rules that work well for most cases.
However, you can customize the configuration by passing an options object to the function to extend or override these defaults.
The options object lets you specify custom rules, user agents, and crawl delays.
```ts
// app/robots.ts
import type { MetadataRoute } from "next";
import { robotsTxt } from "@dlcastillop/seo-in-nextjs";

export default function robots(): MetadataRoute.Robots {
  return robotsTxt({
    rules: [
      {
        userAgent: "*",
        allow: "/",
        disallow: ["/admin", "/private"],
      },
      {
        userAgent: "Googlebot",
        allow: "/",
        crawlDelay: 10,
      },
    ],
  });
}
```

```js
// app/robots.js
import { robotsTxt } from "@dlcastillop/seo-in-nextjs";

export default function robots() {
  return robotsTxt({
    rules: [
      {
        userAgent: "*",
        allow: "/",
        disallow: ["/admin", "/private"],
      },
      {
        userAgent: "Googlebot",
        allow: "/",
        crawlDelay: 10,
      },
    ],
  });
}
```
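For reference, assuming standard Next.js metadata serialization, the configuration above should produce a robots.txt along these lines:

```txt
User-Agent: *
Allow: /
Disallow: /admin
Disallow: /private

User-Agent: Googlebot
Allow: /
Crawl-delay: 10
```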
## Related

View related API references.
- **robotsTxt**: API reference for the `robotsTxt` function.