Arfatur Rahman

Generating dynamic robots.txt and sitemap.xml in a Next.js App Router with TypeScript

Introduction

In modern web development, ensuring search engines properly index and crawl your website is crucial for SEO and visibility. Two essential files that help with this are sitemap.xml and robots.txt. If you're using Next.js with the App Router and TypeScript, you can dynamically generate these files to keep them updated as your content changes. In this guide, we'll walk through how to generate sitemap.xml and robots.txt in a Next.js project, why they are necessary, and how they improve your website's SEO.

What is a Sitemap?

A sitemap is an XML file that lists all the URLs of a website, providing search engines with a structured map of the site's content. This helps search engines like Google and Bing efficiently crawl and index pages, ensuring they appear in search results.

Why is a Sitemap Required?

  1. Better Indexing: It ensures that search engines discover all important pages, even ones that are not well linked from the rest of the site.
  2. Faster Crawling: A sitemap helps search engines prioritize and crawl new or updated content quickly.
  3. SEO Improvement: Providing metadata such as change frequency and priority helps search engines understand your content better (see the minimal example below).
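
To make those metadata fields concrete, here is a minimal static app/sitemap.ts sketch (the URL and values are illustrative, not part of this project). Next.js maps each entry's changeFrequency and priority to the <changefreq> and <priority> tags of the generated XML; the next section extends the same idea to database-driven entries.

// app/sitemap.ts — minimal static sketch with illustrative values
import type { MetadataRoute } from "next";

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    {
      url: "https://example.com/about",
      lastModified: new Date(),
      changeFrequency: "monthly", // rendered as <changefreq>monthly</changefreq>
      priority: 0.8, // rendered as <priority>0.8</priority>
    },
  ];
}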

Generating sitemap.xml in Next.js

To dynamically generate a sitemap in a Next.js project using TypeScript, we create a sitemap.ts file inside the app directory that exports a default function returning MetadataRoute.Sitemap. Below is a complete implementation:

import { db } from "@/lib/db";
import type { MetadataRoute } from "next";

// Revalidate the cached sitemap at most every 60 seconds
export const revalidate = 60;

// Build sitemap entries for every slug under a given route prefix
const getDynamicSitemaps = ({
  data,
  defaultRoute,
}: {
  data: { slug: string }[];
  defaultRoute: string;
}) => {
  return data.map((d) => ({
    url: `${process.env.NEXT_PUBLIC_WEBSITE_URL}/${defaultRoute}/${d.slug}`,
    lastModified: new Date(),
    changeFrequency: "yearly" as const,
    priority: 1,
  }));
};

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const allBlogs = await db.blog.findMany();
  const allProjects = await db.project.findMany();
  const allProducts = await db.product.findMany();

  const blogSitemaps = getDynamicSitemaps({
    data: allBlogs,
    defaultRoute: "blogs",
  });
  const projectSitemaps = getDynamicSitemaps({
    data: allProjects,
    defaultRoute: "projects",
  });
  const shopSitemaps = getDynamicSitemaps({
    data: allProducts,
    defaultRoute: "shop",
  });

  return [
    {
      url: `${process.env.NEXT_PUBLIC_WEBSITE_URL}/`,
      lastModified: new Date(),
      changeFrequency: "yearly",
      priority: 1,
    },
    {
      url: `${process.env.NEXT_PUBLIC_WEBSITE_URL}/blogs`,
      lastModified: new Date(),
      changeFrequency: "weekly",
      priority: 0.5,
    },
    ...blogSitemaps,
    ...shopSitemaps,
    ...projectSitemaps,
    {
      url: `${process.env.NEXT_PUBLIC_WEBSITE_URL}/contact`,
      lastModified: new Date(),
      changeFrequency: "monthly",
      priority: 0.8,
    },
  ];
}

For more details, refer to the Next.js sitemap documentation.
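
Next.js serves the output of app/sitemap.ts at /sitemap.xml, so make sure NEXT_PUBLIC_WEBSITE_URL is set to your production origin without a trailing slash. One refinement worth considering: the helper above stamps every entry with new Date(), so lastModified changes on every regeneration. If your models carry an updatedAt column (an assumption about your schema, not something shown in the code above), you can pass the real timestamp through instead, which gives crawlers a more accurate change signal:

// Sketch: assumes each row exposes slug and updatedAt (adjust to your schema)
import type { MetadataRoute } from "next";

const getDynamicSitemaps = ({
  data,
  defaultRoute,
}: {
  data: { slug: string; updatedAt: Date }[];
  defaultRoute: string;
}): MetadataRoute.Sitemap => {
  return data.map((d) => ({
    url: `${process.env.NEXT_PUBLIC_WEBSITE_URL}/${defaultRoute}/${d.slug}`,
    lastModified: d.updatedAt, // real change date instead of new Date()
    changeFrequency: "yearly" as const,
    priority: 1,
  }));
};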

What is robots.txt?

The robots.txt file is a text file that tells search engine crawlers which pages or sections of a website they can or cannot access. This is important for managing how search engines interact with your site.

Why is robots.txt Required?

  1. Control Crawling: Prevent search engines from indexing private or unnecessary pages.
  2. Reduce Server Load: Block bots from crawling pages that do not need to be indexed.
  3. Improve SEO: Ensure search engines focus on important content rather than unnecessary pages.

Generating robots.txt in Next.js

In Next.js, we can define a robots.ts file inside the app directory to generate robots.txt dynamically:

import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
      disallow: "/dashboard/",
    },
    sitemap: `${process.env.NEXT_PUBLIC_WEBSITE_URL}/sitemap.xml`,
  };
}

For more details, refer to the Next.js robots.txt documentation.
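
The rules field also accepts an array, which lets you apply different policies to different crawlers (the Next.js docs show this pattern). A sketch with illustrative bot names and paths, not taken from the project above:

import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      // Googlebot may crawl everything except the dashboard
      { userAgent: "Googlebot", allow: "/", disallow: "/dashboard/" },
      // Keep these bots out entirely
      { userAgent: ["Bingbot", "Applebot"], disallow: "/" },
    ],
    sitemap: `${process.env.NEXT_PUBLIC_WEBSITE_URL}/sitemap.xml`,
  };
}

Next.js serves the result at /robots.txt, alongside the sitemap route.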

Conclusion

Generating sitemap.xml and robots.txt dynamically in a Next.js App Router setup with TypeScript is a great way to keep your website optimized for search engines as your content changes. The sitemap helps search engines discover and index pages efficiently, while robots.txt controls which parts of the site crawlers may access. Once both files are in place, you can verify them by visiting /sitemap.xml and /robots.txt in your running app. By implementing these files, you improve the discoverability, indexing, and performance of your website in search engine rankings.

