Consider an endpoint being hit as frequently as your heartbeat by thousands of users or more. How much do you trust that endpoint to serve every user without issues? If you have observability set up, you probably see the occasional timeout from it, and that timeout could hit one of your paying users. Even with robust backend infrastructure you might reach 99.9% uptime, but what about the other 0.1% of your user traffic?
Caching with Redis comes to the rescue! For a cache to make sense, the endpoint must not be real-time sensitive. A weather info endpoint is a good example, since weather won't fluctuate in a matter of seconds.
The example here is an Express app:
We have this controller that simply pipes the request through and handles errors when necessary.
import express from 'express';
import { weatherAPI } from './weatherAPI'; // adjust the path to wherever WeatherAPI lives

const app = express();
app.use(express.json()); // needed so req.body is populated for JSON payloads

app.post('/getStatus', async (req, res) => {
  try {
    const cityId = req.body.cityId;
    const response = await weatherAPI.cacheAwareGetStatus(cityId);
    return res.status(200).json(response);
  } catch (e: any) {
    console.log('Error calling API', e);
    res.sendStatus(e?.status ?? 500);
  }
});
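For reference, a hypothetical call against that endpoint could look like the sketch below; the port and cityId value are placeholders, not part of the original code.

// Hypothetical request to the /getStatus endpoint (assumes Node 18+ for the global fetch).
// The port and cityId value are placeholders.
async function fetchStatus() {
  const res = await fetch('http://localhost:3000/getStatus', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ cityId: 'nyc-001' }),
  });
  console.log(await res.json());
}

fetchStatus();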
Next comes the integration with the external weather API; the cache layer lives here.
import axios from 'axios';
import { redisClient } from './redisClient'; // adjust the path to wherever your connected Redis client lives

interface GetWeatherResponse {
  temperature: number;
  humidity: number | null;
  wind: number | null;
}

class WeatherAPI {
  public async getWeather(cityId: string): Promise<GetWeatherResponse> {
    const response = await axios({
      url: 'https://api.weather.com',
      method: 'post',
      data: { cityId },
    });
    return response.data;
  }

  public async cacheAwareGetStatus(cityId: string): Promise<GetWeatherResponse> {
    const apiTimeoutMs = 5000;
    const defaultResponseData: GetWeatherResponse = {
      temperature: 0,
      humidity: null,
      wind: null,
    };
    // Resolves with { timeout: true } if the API call takes longer than apiTimeoutMs.
    const timer = new Promise<{ timeout: true }>((resolve) => {
      setTimeout(() => resolve({ timeout: true }), apiTimeoutMs);
    });
    const raceResult = await Promise.race([this.getWeather(cityId), timer]);

    if ('timeout' in raceResult) {
      // The API was too slow: fall back to the cached response if we have one.
      try {
        const weatherInfoCache = await redisClient.get(cityId);
        if (weatherInfoCache) {
          return JSON.parse(weatherInfoCache);
        }
        console.log(`API timed out after ${apiTimeoutMs}ms and no cache entry was found`);
        return defaultResponseData;
      } catch (e) {
        console.log(`API timed out after ${apiTimeoutMs}ms and retrieving the cached response failed`);
        return defaultResponseData;
      }
    }

    // Fresh response: update the cache, but never let a cache failure break the request.
    try {
      await redisClient.set(cityId, JSON.stringify(raceResult));
    } catch (e) {
      console.log('Failed to set cache, proceeding to return the response');
    }
    return raceResult;
  }
}

export const weatherAPI = new WeatherAPI();
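The redisClient used above isn't shown in the original snippet. A minimal setup, assuming the node-redis v4 client and a local Redis instance, could look like this:

// Minimal sketch of a Redis client setup, assuming the node-redis v4 package.
// The connection URL is a placeholder for your own Redis instance.
import { createClient } from 'redis';

export const redisClient = createClient({ url: 'redis://localhost:6379' });

redisClient.on('error', (err) => console.log('Redis client error', err));

// Connect once at startup, before the endpoint starts serving traffic.
redisClient
  .connect()
  .catch((err) => console.log('Failed to connect to Redis', err));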
Promise.race([this.getWeather(cityId), timer])
in the code determines whether the timer won the race, which means the request was lost or has been in flight for too long. In that case we fetch the cached response from Redis and return it to the client. Otherwise, we return the latest response and update the cache in Redis. Sweet!
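The same race-against-a-timer pattern can be pulled out into a small reusable helper. This is just an illustrative sketch; withTimeout is a hypothetical name and not part of the code above:

// Illustrative sketch: races any promise against a timer and resolves with
// { timeout: true } if the timer wins.
function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T | { timeout: true }> {
  const timer = new Promise<{ timeout: true }>((resolve) => {
    setTimeout(() => resolve({ timeout: true }), ms);
  });
  return Promise.race([promise, timer]);
}

// Usage inside cacheAwareGetStatus:
// const raceResult = await withTimeout(this.getWeather(cityId), 5000);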