Where it All Started
Most web apps get their value from interacting with HTTP APIs.
This is generally done using HTTP clients like the native fetch function, Axios, or Angular's HttpClient.
Once you set up an HTTP client in a web app, sooner or later you will need to extend its capabilities to handle cross-cutting topics like user experience (e.g. a pending-requests indicator), performance (e.g. caching), resilience (e.g. automatic retry), and security (e.g. authentication). Luckily, most HTTP clients can be easily extended using interceptors, so you won't have to wrap them or implement your own client.
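The interceptor pattern itself is framework-free: each interceptor wraps the request pipeline and can act before or after the underlying call. Here is a minimal, hypothetical sketch in plain TypeScript (the `Handler`, `Interceptor`, `withRetry`, and `compose` names are illustrative, not from any library):

```ts
// A handler takes a request (here just a URL) and produces a response.
type Handler = (url: string) => Promise<string>;

// An interceptor wraps a handler and returns a new, extended handler.
type Interceptor = (next: Handler) => Handler;

// Resilience: retry once, naively, when the underlying call fails.
const withRetry: Interceptor = next => async url => {
  try {
    return await next(url);
  } catch {
    return next(url);
  }
};

// User experience: log before and after the underlying call.
const withLogging: Interceptor = next => async url => {
  console.log(`-> ${url}`);
  const response = await next(url);
  console.log(`<- ${url}`);
  return response;
};

// Wrap a base handler with interceptors, outermost first.
function compose(base: Handler, ...interceptors: Interceptor[]): Handler {
  return interceptors.reduceRight(
    (handler, interceptor) => interceptor(handler),
    base
  );
}
```

Real clients expose the same idea through their own APIs (Angular's `HTTP_INTERCEPTORS`, Axios's `interceptors.request.use`), but the edge cases hide in wrappers like these.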
Even though implementing an interceptor can sound quick and easy, handling edge cases, testing and maintenance can get expensive. Wouldn't it be better if someone else could handle these issues for us?
That's when my friend Edouard Bozon and I noticed the following facts:
- most apps, including ours, our clients' (companies, that is, not HTTP clients), and probably yours, need the same interceptors,
- implementing interceptors can be tricky with some HTTP clients if you are not familiar with some other concepts,
- implementations observed in tutorials or in our clients' codebases can be error-prone or miss a couple of important edge cases,
- implementing the same interceptor more than once in a lifetime is a boring waste of time.
The next thing I remember is that we decided to react by initiating an open-source library called Convoyr.
The Idea Behind Convoyr
While Convoyr is currently focused on extending Angular's HttpClient, it has been designed as a modular and framework-agnostic set of plugins.
We like to think of Convoyr as an infrastructure agnostic Service Mesh for web apps and JavaScript... even if we are not there yet.
The Network Latency Performance Problem
Today, in this blog post, we will focus on the performance topic and how to fix network latency issues using Convoyr.
In most cases, when a user navigates from one route to another in the same web app, the main thing preventing us from displaying the result instantly is the network latency related to fetching data from some remote service.
This can be problematic, especially when it comes to re-fetching data that we fetched just a few minutes ago and that hasn't changed since. We end up making the user wait only to display the same result they received before.
Imagine a list of products where the user clicks on a specific product to see its details before clicking the "back to the list" button. The latency due to re-fetching the products can cause friction.
Caching to the Rescue
One of the first solutions we can think of is caching. We can implement our own caching system or just set the right response headers and let the browser handle the HTTP caching as described by RFC7234.
The latter approach is generally the most appropriate as it is standard, generic, efficient, scalable, sharable, easy to set up, and cheap to maintain.
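For instance, the server can declare a response cacheable for five minutes with a single header. This is an illustrative response, not from any specific API:

```http
HTTP/1.1 200 OK
Cache-Control: max-age=300
Content-Type: application/json

{ "temperature": 21 }
```

Per RFC 7234, `max-age=300` tells the client that this response can be considered fresh for 300 seconds, so any matching request during that window can be served straight from the cache without hitting the network.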
The Freshness Problem
HTTP caching is relatively easy to set up, but it comes at a price: the price of freshness.
In fact, in order to cache a response, the server has to tell the client how long it can cache it or, in other words, how long the response should be considered fresh.
Choosing a freshness duration can be a challenging decision.
Too low and it would render the cache useless; too high and the web app would use the "expired" data returned by the cache.
Why make a choice?
Software development is all about tradeoffs, but what if we could skip this one?
Wouldn't it be nice if we could use the latest data from the cache while we are fetching the freshest one from the network?
We can imagine many ways of implementing this behavior but let's focus on Developer eXperience and find an approach that works globally without having to change all the HTTP calls in our apps.
Observable vs Promise
Unlike the native fetch function and Axios, Angular's HttpClient returns observables instead of promises.
Amongst other advantages, like making HTTP calls lazy and easily cancelable, observables offer an additional benefit: the ability to emit multiple values.
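To make that concrete, here is a minimal, framework-free sketch of a source that can push more than one value to its subscriber, first from a cache and then from the network. All names here (`cacheThenNetwork`, `Subscriber`) are illustrative, not Convoyr's or RxJS's API:

```ts
type Subscriber<T> = (value: T) => void;

// A multi-emission source: pushes the cached value immediately (if any),
// then pushes the fresh value once the network responds. A promise could
// only deliver one of the two.
function cacheThenNetwork<T>(
  readCache: () => T | undefined,
  fetchNetwork: () => Promise<T>,
  writeCache: (value: T) => void
) {
  return {
    subscribe(next: Subscriber<T>) {
      const cached = readCache();
      if (cached !== undefined) {
        next(cached); // 1st emission: possibly stale, but instant
      }
      fetchNetwork().then(fresh => {
        writeCache(fresh);
        next(fresh); // 2nd emission: fresh from the network
      });
    },
  };
}
```

With RxJS, the same behavior would typically be expressed by concatenating a cache observable with a network observable; the key point is simply that the subscriber may be called more than once.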
Emit both cached & network
As we can emit multiple values with observables, what about first emitting the data from the cache (if available) and then the data from the network?
This means that given the code below in our Angular component, we wouldn't have to change anything and it would first display the cached result and refresh it with the latest result from the network.
```ts
import { Component, OnInit } from '@angular/core';
import { HttpClient } from '@angular/common/http';

@Component({
  template: `{{ weather | json }}`
})
export class WeatherComponent implements OnInit {
  weather: Weather;

  constructor(private http: HttpClient) {}

  ngOnInit() {
    this.http
      .get<Weather>('/weather/lyon')
      .subscribe(weather => (this.weather = weather));
  }
}
```
or in a more reactive way:
```ts
import { Component } from '@angular/core';
import { HttpClient } from '@angular/common/http';

@Component({
  template: `{{ weather$ | async | json }}`
})
export class WeatherComponent {
  weather$ = this.http.get<Weather>('/weather/lyon');

  constructor(private http: HttpClient) {}
}
```
Convoyr cache plugin
Convoyr provides a cache plugin, @convoyr/plugin-cache, that extends the behavior of the HTTP client by first emitting the data from the cache (if available), then the data from the network, as described above.
Setup
It takes two steps to set up Convoyr's cache plugin.
- Installing Convoyr and the plugin:
```sh
npm install @convoyr/core @convoyr/angular @convoyr/plugin-cache
```
- Enabling the cache plugin in the AppModule:
```ts
import { ConvoyrModule } from '@convoyr/angular';
import { createCachePlugin } from '@convoyr/plugin-cache';

@NgModule({
  imports: [
    ...
    HttpClientModule,
    ConvoyrModule.forRoot({
      plugins: [createCachePlugin()],
    }),
  ],
  ...
})
export class AppModule {}
```
How to know if the data comes from cache
You will probably want to display the data differently when it comes from the cache or when it's all fresh from the network.
Convoyr's cache plugin can add some metadata to the emitted response when the addCacheMetadata option is set to true.
```ts
createCachePlugin({
  addCacheMetadata: true,
})
```
Be careful though as this will change the response type.
The code below:
```ts
http.get('/weather/lyon')
  .subscribe(data => console.log(data));
```
... will log the following data:
```ts
{
  data: {
    temperature: ...,
    ...
  },
  cacheMetadata: {
    createdAt: '2020-01-01T00:00:00.000Z',
    isFromCache: true
  }
}
```
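If you need to branch on that flag in several places, a small helper keeps components tidy. This is a hypothetical sketch typed after the shape shown above; the interfaces and the `freshnessLabel` function are mine, not exported by Convoyr:

```ts
// Types mirroring the response shape produced by addCacheMetadata: true.
interface CacheMetadata {
  createdAt: string;
  isFromCache: boolean;
}

interface WithCacheMetadata<T> {
  data: T;
  cacheMetadata: CacheMetadata;
}

// Turn the metadata flag into a display label for the UI.
function freshnessLabel<T>(response: WithCacheMetadata<T>): string {
  return response.cacheMetadata.isFromCache
    ? 'from cache'
    : 'fresh from network';
}
```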
Convoyr's cache plugin is progressive
The addCacheMetadata option can be very interesting, but it is also somewhat intrusive as it changes the response type. Enabling it globally on some apps could require a major refactoring.
In order to avoid the trouble and let you enable this option progressively, the cache plugin allows you to apply different configurations to different groups of requests using the shouldHandleRequest option.
```ts
import { and, matchOrigin, matchPath } from '@convoyr/core';

createCachePlugin({
  shouldHandleRequest: and(
    matchOrigin('marmicode.io'),
    matchPath('/weather')
  ),
})
```
Storage
By default, the cache plugin stores the 100 most recently used requests in memory.
You can override this behavior by providing your own storage, or by instantiating the MemoryStorage with the size of your choice using the maxSize option.
```ts
createCachePlugin({
  storage: new MemoryStorage({ maxSize: 2000 }), // 2000 requests
})
```
or
```ts
createCachePlugin({
  storage: new MemoryStorage({ maxSize: '2 mb' }), // 2MB
})
```
Upcoming Features
This is just the beginning and there is more to come so stay tuned.
Here is a list of some upcoming features:
- Handle ReSTful APIs (e.g. /items should populate /items/:itemId so we can instantly show partial data from list views in detail views).
- Use IndexedDB as storage.
Other plugins
- @convoyr/plugin-auth handles authentication both easily and securely.
- @convoyr/plugin-retry handles backoff (i.e. retries when things go wrong).