Understanding Cloudflare Pages Headers for Security and Performance
This article explains how to configure security, privacy, and performance-related headers specifically for Cloudflare Pages using the _headers file.
Note: This configuration is for Cloudflare Pages deployments. The example includes commonly used tools like Google Fonts and Pagefind (WebAssembly-based search). Adjust the CSP directives based on your actual dependencies. Test your site thoroughly after applying these headers to ensure nothing breaks.
Create a file named _headers in your Hugo static/ folder with the following content:
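Here is one possible starting point for that _headers file. Treat it as a sketch, not a drop-in config: the CSP below only covers the dependencies mentioned in the note above (Google Fonts and Pagefind), and you will need to adjust it for your own scripts, styles, and third-party resources.

```
/*
  X-Content-Type-Options: nosniff
  X-Frame-Options: DENY
  Referrer-Policy: strict-origin-when-cross-origin
  Permissions-Policy: camera=(), microphone=(), geolocation=()
  Content-Security-Policy: default-src 'self'; script-src 'self' 'wasm-unsafe-eval'; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com; font-src https://fonts.gstatic.com; img-src 'self' data:; frame-ancestors 'none'
```

The 'wasm-unsafe-eval' source is there because Pagefind loads a WebAssembly module, which a plain script-src 'self' would block. The fonts.googleapis.com and fonts.gstatic.com entries cover Google Fonts stylesheets and font files respectively; remove them if you self-host your fonts.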
How to Configure Pagefind in Cloudflare Pages Build Settings
To ensure Pagefind runs automatically during your Cloudflare Pages deployment, update your project’s build configuration.
In Cloudflare Pages, navigate to:
Workers & Pages → Your Project → Settings → Build → Build configuration
Then set the build command to:
```
hugo && npx pagefind --site public --output-path public/pagefind
```
This command first generates your site with Hugo, placing the output in the public directory. Pagefind then scans that directory, builds the search index, and writes its files into public/pagefind, making the search functionality available on your deployed site.
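After a successful build, the output directory should look roughly like this (the file names under pagefind/ are illustrative; the exact bundle contents vary by Pagefind version):

```
public/
├── index.html
├── ...
└── pagefind/
    ├── pagefind.js
    ├── pagefind-ui.js
    └── ...
```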
Prevent SEO Issues on Cloudflare Pages with X-Robots-Tag
Cloudflare Pages serves your site on two domains: your custom domain (e.g., mysite.com) and a default mysite.pages.dev domain. If both get indexed, search engines may treat them as duplicate content, hurting your SEO.
To prevent this, add an X-Robots-Tag header to the mysite.pages.dev version of your site so crawlers don’t index it.
How to set it up
- In your site's build output directory (usually static or public), create a file named _headers.
- Add the following rule (replace mysite.pages.dev with your actual Cloudflare Pages domain):
```
https://mysite.pages.dev/*
  X-Robots-Tag: noindex
```
The first line matches all paths on your Cloudflare Pages domain.
The second line applies the X-Robots-Tag: noindex directive.
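Once the deployment is live, you can confirm the header is actually being served. One quick check from the command line (using the same example domain as above; substitute your own Pages domain):

```
curl -sI https://mysite.pages.dev/ | grep -i x-robots-tag
```

If the rule is working, the output should include the X-Robots-Tag: noindex line; if it prints nothing, the _headers file is likely not in the deployed build output.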