How to let Lighthouse download your Next.js robots.txt file

by Dan Edwards, 29 January 2025

If you're using an app/robots.ts file to generate your robots.txt, you might encounter an issue where Google Lighthouse reports that it can't download the file, even though it's accessible from your browser.

The failing audit reads: "Lighthouse was unable to download a robots.txt file."

Since Lighthouse deducts a few SEO points when it can't find the file, this is frustrating if you're chasing a perfect score for your Next.js application.

The Problem

When using the App Router's metadata API with an app/robots.ts file like this:

app/robots.ts
TypeScript
import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
    },
  }
}

You might find that while the file is accessible at /robots.txt and displays correctly in your browser, Lighthouse still complains that it "was unable to download a robots.txt file."
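
For reference, this route handler serves a plain-text response along these lines when you request /robots.txt (the exact formatting can vary slightly between Next.js versions):

robots.txt
User-Agent: *
Allow: /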

Why This Happens

The issue occurs because Lighthouse doesn't navigate to /robots.txt directly. Instead, it fetches the file with an in-page request run from your site's root, which means the request is governed by your page's Content Security Policy (CSP). A restrictive connect-src directive, common in hardened Next.js configurations, blocks that request even though opening the URL in a browser tab works fine.
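
To make the failure mode concrete, here's a sketch of the kind of site-wide policy that triggers it. This is a hypothetical hardened configuration, not something Next.js ships by default:

next.config.ts
TypeScript
import type { NextConfig } from 'next'

const nextConfig: NextConfig = {
  async headers() {
    return [
      {
        // Matches every route, including /robots.txt
        source: '/(.*)',
        headers: [
          {
            key: 'Content-Security-Policy',
            // connect-src 'none' blocks in-page fetches, which is
            // exactly how Lighthouse tries to download robots.txt
            value: "default-src 'self'; connect-src 'none'",
          },
        ],
      },
    ]
  },
}

export default nextConfig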

The Solution

To fix this issue, modify your Next.js configuration to allow Lighthouse to fetch the robots.txt file. Here's how to do it in your next.config.ts:

next.config.ts
TypeScript
import type { NextConfig } from 'next'

const nextConfig: NextConfig = {
  // Your existing config options...
  output: 'standalone',

  // Add this headers configuration
  async headers() {
    return [
      {
        source: '/robots.txt',
        headers: [
          {
            key: 'Content-Security-Policy',
            value: "connect-src 'self'; script-src 'none'; object-src 'none'; frame-src 'none'",
          },
        ],
      },
    ]
  },
}

export default nextConfig

This configuration:

  1. Applies only to the robots.txt route, so your stricter CSP settings stay in place everywhere else
  2. Allows connect-src requests to your own origin ('self'), which is all Lighthouse needs
  3. Disables script, object, and frame sources, which this plain-text route never needs
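
If you already set CSP headers in middleware rather than in next.config.ts, you can scope the same policy there instead. Here's a minimal sketch of that alternative, assuming a standard middleware.ts at your project root; adapt it if your middleware already does other work:

middleware.ts
TypeScript
import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'

export function middleware(request: NextRequest) {
  const response = NextResponse.next()

  // The same relaxed-but-locked-down policy as the next.config.ts approach
  response.headers.set(
    'Content-Security-Policy',
    "connect-src 'self'; script-src 'none'; object-src 'none'; frame-src 'none'"
  )

  return response
}

// Run this middleware only for the robots.txt route
export const config = {
  matcher: '/robots.txt',
}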

Security Considerations

This solution is secure because:

  1. The relaxed policy applies only to the /robots.txt route; every other route keeps whatever CSP you already enforce
  2. connect-src is still limited to 'self', so the route permits no cross-origin connections
  3. script-src, object-src, and frame-src are all set to 'none', and robots.txt is a plain text file with no executable content in the first place

Verifying the Fix

After deploying these changes:

  1. Your robots.txt file should still be accessible in the browser
  2. Lighthouse should now be able to download and verify your robots.txt file (you can also confirm the header directly, as sketched below)
  3. Your Lighthouse scores should improve if this was affecting them
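
If you'd rather not wait for a full Lighthouse run, you can check the response header yourself. Here's a minimal sketch using Node 18+'s built-in fetch; the BASE_URL environment variable and the check-robots.ts filename are my own placeholders, so adjust them for your setup:

check-robots.ts
TypeScript
// Quick sanity check: fetch robots.txt and print its CSP header.
// Run with: npx tsx check-robots.ts

const BASE_URL = process.env.BASE_URL ?? 'http://localhost:3000'

async function main() {
  const response = await fetch(`${BASE_URL}/robots.txt`)

  console.log('Status:', response.status)
  console.log(
    'Content-Security-Policy:',
    response.headers.get('content-security-policy')
  )
  console.log('--- body ---')
  console.log(await response.text())
}

main().catch((error) => {
  console.error(error)
  process.exit(1)
})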

By implementing this fix, you maintain both security and optimal Lighthouse performance for your Next.js application.

Feel free to use my Next.js configuration template, too - it's much more comprehensive than the create-next-app default.