How to let Lighthouse download your Next.js robots.txt file
by Dan Edwards, 29 January 2025

If you're using an app/robots.ts file to generate your robots.txt file, you might encounter an issue where Google Lighthouse reports that it can't download your robots.txt file, even though it's accessible from your browser:

"Lighthouse was unable to download a robots.txt file."

Because Lighthouse knocks a few SEO points off when it can't find the file, this is frustrating if you're trying to achieve perfect scores for your Next.js application.
The Problem
When using the App Router's metadata API with an app/robots.ts file like this:

import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
    },
  }
}
You might find that while the file is accessible at /robots.txt and displays correctly in your browser, Lighthouse still complains that it "was unable to download a robots.txt file."

Why This Happens
The issue occurs because Lighthouse doesn't request robots.txt directly: it fetches the file with a script that runs in your page's context. Restrictive Content Security Policy (CSP) settings on your Next.js site can block that in-page request. Specifically, Lighthouse's audit can't work with a restrictive connect-src CSP directive, even though the file itself is served correctly.
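For context, a site-wide header configuration along the lines of the sketch below would reproduce the failure. The directive values are assumptions for illustration only, not taken from this article; the point is that a connect-src list which doesn't cover your own origin stops Lighthouse's in-page fetch of /robots.txt:

import type { NextConfig } from 'next'

// Hypothetical example: a strict CSP applied to every route.
// Because connect-src only allows an external API host, an in-page
// fetch('/robots.txt') - which is how Lighthouse retrieves the file -
// is blocked by the browser.
const nextConfig: NextConfig = {
  async headers() {
    return [
      {
        source: '/:path*',
        headers: [
          {
            key: 'Content-Security-Policy',
            value: "default-src 'self'; connect-src https://api.example.com",
          },
        ],
      },
    ]
  },
}

export default nextConfig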
The Solution
To fix this issue, modify your Next.js configuration to allow Lighthouse to fetch the robots.txt file. Here's how to do it in your next.config.ts:
import type { NextConfig } from 'next'

const nextConfig: NextConfig = {
  // Your existing config options...
  output: 'standalone',

  // Add this headers configuration
  async headers() {
    return [
      {
        source: '/robots.txt',
        headers: [
          {
            key: 'Content-Security-Policy',
            value: "connect-src 'self'; script-src 'none'; object-src 'none'; frame-src 'none'",
          },
        ],
      },
    ]
  },
}

export default nextConfig
This configuration:

- Only applies to the robots.txt route, maintaining strict CSP settings elsewhere
- Allows connect-src requests to your domain
- Disables unnecessary script, object, and frame sources for this route
Security Considerations
This solution is secure because:
- It only modifies the CSP for the robots.txt route
- It only allows connections to your own domain ('self')
- It explicitly disables other potentially risky sources
- Your main application maintains its original strict CSP settings

Verifying the Fix
After deploying these changes:
- Your robots.txt file should still be accessible in the browser
- Lighthouse should now be able to download and verify your robots.txt file
- Your Lighthouse scores should improve if this was affecting them
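If you'd like to confirm the header without running a full Lighthouse audit, a quick script like this sketch does the job. The URL is a placeholder for your own deployment; it simply checks that robots.txt responds with a 200 and carries the per-route Content-Security-Policy configured above (runs on Node 18+, e.g. with npx tsx check-robots.ts):

// check-robots.ts - hypothetical verification script
const url = 'https://example.com/robots.txt' // replace with your deployed site

async function main() {
  const response = await fetch(url)

  // The file should still download successfully...
  console.log('Status:', response.status)

  // ...and carry the CSP set for this route in next.config.ts.
  console.log('CSP header:', response.headers.get('content-security-policy'))

  // A readable body confirms the route still works in the browser too.
  console.log('Body:\n', await response.text())
}

main().catch((error) => {
  console.error('Request failed:', error)
  process.exit(1)
})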
By implementing this fix, you maintain both security and optimal Lighthouse performance for your Next.js application.
Feel free to use my Next.js configuration template, too - it's much more comprehensive than create-next-app.