In part 1 of this series, we discussed the Cache-Control header and its directives. In this part of the series, we'll look at implementing cache control strategies in various React frameworks and on the server.
Remix
There are two ways to set headers in Remix, depending on the type of request. For page requests, you can export a headers function from the route module:
export const headers: HeadersFunction = () => ({
  "Cache-Control": "max-age=86400, s-maxage=604800, stale-while-revalidate=86400",
});
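The headers function also receives the headers produced by the route's loader and parent routes, so a route can forward a Cache-Control value computed at request time. A minimal plain-JavaScript sketch (the fallback value here is arbitrary, chosen for illustration):

```javascript
// Sketch: forward the loader's Cache-Control header if one was set,
// falling back to a default. In a Remix route module this would be
// `export const headers = ...`.
const headers = ({ loaderHeaders }) => ({
  "Cache-Control": loaderHeaders.get("Cache-Control") ?? "max-age=3600",
});
```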
For data requests, you can set headers directly on the response object:
export const action: ActionFunction = async ({ request }) => {
  const formData = await request.clone().formData();
  const data = await getData(formData);
  return json(data, {
    headers: {
      "Cache-Control": "max-age=0, must-revalidate, no-cache",
    },
  });
};
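Loaders can set caching headers the same way. Since Remix also accepts a standard Response returned from a loader, here is a plain-JavaScript sketch (getUser is a hypothetical data-fetching function, and the max-age values are just examples):

```javascript
// Hypothetical data fetcher, for illustration only.
async function getUser() {
  return { name: "Ada" };
}

// In a Remix route module this would be `export const loader = ...`.
const loader = async () => {
  const user = await getUser();
  return new Response(JSON.stringify(user), {
    headers: {
      "Content-Type": "application/json",
      // cache for a minute in the browser, five minutes in shared caches
      "Cache-Control": "max-age=60, s-maxage=300",
    },
  });
};
```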
Gatsby
Gatsby lets you configure headers in the gatsby-config.js file:
module.exports = {
  headers: [
    {
      source: "/about",
      headers: [
        {
          key: "cache-control",
          value: "max-age=86400",
        },
      ],
    },
  ],
};
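The source field also accepts wildcard paths, which is handy for giving hashed static assets a long, immutable lifetime. A sketch (the object is shown standalone here; in gatsby-config.js it would be the exported value):

```javascript
// Sketch: long-lived caching for hashed assets via a wildcard source path.
// In gatsby-config.js this object would be assigned to module.exports.
const config = {
  headers: [
    {
      source: "/static/*",
      headers: [
        {
          key: "cache-control",
          value: "max-age=31536000, immutable",
        },
      ],
    },
  ],
};
```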
Gatsby also has an adapter for Netlify, gatsby-adapter-netlify, that automatically takes care of various configurations, including HTTP headers.
For more information, see Gatsby's documentation on its recommended caching strategy.
Express
Express's default Cache-Control header for static assets is public, max-age=0.
You can change the default settings by passing an options object to the static middleware:
app.use(
  express.static("assets", {
    etag: true, // this is the default
    immutable: false, // this is the default
    lastModified: true, // this is the default
    maxAge: 1000 * 60, // in MILLISECONDS, default is 0
    setHeaders: (response, path) => {
      // set custom headers here
    },
  }),
);
To set headers for a route request, use the setHeader method:
app.get("/", (req, res) => {
  const page = renderPage("Home");
  res.setHeader("Cache-Control", "max-age=86400, must-revalidate");
  res.send(page);
});
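If several routes share the same caching policy, the header can be factored into a tiny middleware. This helper is not part of Express, just a sketch:

```javascript
// Returns middleware that sets a fixed Cache-Control value,
// keeping per-route caching policies in one place.
function cacheControl(value) {
  return (req, res, next) => {
    res.setHeader("Cache-Control", value);
    next();
  };
}

// Usage in an Express app (sketch):
// app.get("/", cacheControl("max-age=86400, must-revalidate"), (req, res) => {
//   res.send(renderPage("Home"));
// });
```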
Nginx
If you use Nginx to serve static content, use the add_header directive to set headers:
server {
  listen 80;
  listen [::]:80;
  server_name mywebsite.com;
  root /etc/nginx/dist;
  index index.html;

  location / {
    add_header Cache-Control "max-age=86400";
  }

  location /images/ {
    add_header Cache-Control "max-age=31536000, immutable";
  }
}
If you use Nginx to serve dynamic content, you should configure Nginx's cache first:
proxy_cache_path /path/to/cache levels=1:2 keys_zone=our_cache_name:10m max_size=4g inactive=180m use_temp_path=off;
Then, activate the cache:
proxy_cache our_cache_name;
proxy_cache_revalidate on;
proxy_cache instructs Nginx to use the specified cache zone for our requests. proxy_cache_revalidate tells Nginx to revalidate stale cache items with the origin server using conditional requests. If we don't set this directive, Nginx will request a full fresh response from the origin server whenever an item goes stale.
Here is a complete example of setting up a proxy cache:
proxy_cache_path /etc/nginx/cache levels=1:2 keys_zone=my_app_cache:10m max_size=4g inactive=180m use_temp_path=off;

server {
  listen 80;
  listen [::]:80;
  server_name dynamic_cache;

  proxy_cache my_app_cache;
  proxy_cache_revalidate on;

  location / {
    add_header X-Cache-Status $upstream_cache_status;
    proxy_pass http://my_app_upstream;
  }
}
Nginx cache status
Nginx provides the $upstream_cache_status embedded variable (via the ngx_http_upstream_module module), which holds the cache status of the response. The most common values are MISS (the response was not in the cache and was fetched from the origin), HIT (the response was served from the cache), EXPIRED (a stale item was replaced with a fresh response from the origin), and REVALIDATED (a stale item was successfully revalidated with the origin and served from the cache). Other possible values are BYPASS, STALE, and UPDATING.
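As a rough illustration (this is not Nginx source code), the relationship between a cached item's state and these status values can be sketched as plain logic:

```javascript
// Illustrative only: maps the state of a cached item at request time to
// the common $upstream_cache_status values.
function cacheStatus(item, revalidatedByOrigin) {
  if (!item) return "MISS"; // nothing cached: full fetch from upstream
  if (!item.stale) return "HIT"; // fresh copy served straight from the cache
  // Stale item: with proxy_cache_revalidate on, Nginx sends a conditional
  // request; a 304 Not Modified yields REVALIDATED, otherwise EXPIRED.
  return revalidatedByOrigin ? "REVALIDATED" : "EXPIRED";
}
```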
Thanks for reading! See you in the next article. 👋🏼