Introduction
In the world of cloud development, combining Serverless Framework with a Monorepo offers a smart way to handle and deploy multiple applications. This article breaks down the process of creating a CloudFormation stack using Serverless Framework. Each application, neatly packed within a Monorepo, gets its own Lambda function. Join us as we explore the easy integration of serverless tech and the practical advantages of using a Monorepo, uncovering the steps to set up a flexible and organized infrastructure.
Starting with a Monolith
In the early stages of a project, it's often a good idea to begin with a monolith—a single, straightforward architecture. This simplicity is beneficial when the project is just getting started because we might not have a clear picture of the system's boundaries, and predicting how it will grow can be tricky.
By starting with a monolith, developers can handle the uncertainties of the initial phase more easily. As the project advances and we add more features, there might come a time when it makes sense to break off certain functions into smaller pieces called microservices. This step allows for a more organized and scalable system that can adapt as the project evolves.
Keyword: Modularity
Building on the foundation of a monolith, the key to success lies in a modular code organization, where "bounded contexts" are well-contained and independent.
In an ideal world, picture each module neatly residing in its own folder within our monolith. Transforming a module into a new application should be a seamless transition; if it requires excessive effort, it's a clear sign that something needs adjustment.
Embracing modularity ensures flexibility as the project evolves, promoting a codebase that is both adaptable and comprehensible.
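To make this concrete, here is a minimal sketch of what a well-contained module can look like in TypeScript. The names (Customer, CustomerService, createCustomerService) are purely illustrative and not taken from the project; the point is that the rest of the codebase imports only from the module's single entry point.
// customers/index.ts — the only file other modules should import from (illustrative names).
export interface Customer {
  id: number
  name: string
}

export interface CustomerService {
  findCustomerById(id: number): Promise<Customer | undefined>
}

// A simple in-memory implementation; a real module would hide its storage details here.
export function createCustomerService(): CustomerService {
  const customers: Customer[] = [
    { id: 1, name: 'Johnny Walker' },
    { id: 2, name: 'Jack White' },
  ]

  return {
    async findCustomerById(id: number): Promise<Customer | undefined> {
      return customers.find((c) => c.id === id)
    },
  }
}
As long as other modules depend only on this entry point, promoting the folder to its own application later is mostly a matter of moving files.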
Monorepo Overview
This monorepo is organized around the 'apps' folder, containing two Fastify applications:
- Customers
- Orders
Each application within the 'apps' folder follows a consistent structure:
- src
  - handlers
    - app.ts (Lambda entry point)
  - routes
    - routeA.ts (Route definitions for specific functionalities)
    - routeB.ts (Additional route definitions)
  - app.ts (Application entry point)
  - server.ts (Server setup and configuration)
apps/customers/app.ts
import 'module-alias/register'
import fastify, { FastifyInstance, FastifyServerOptions } from 'fastify'
import autoload from '@fastify/autoload'
import { join } from 'path'

export default function createApp(
  opts?: FastifyServerOptions,
): FastifyInstance {
  const defaultOptions = {
    logger: true,
  }

  const app = fastify({ ...defaultOptions, ...opts })

  // Auto-register every route file in ./routes under the /customers prefix
  app.register(autoload, {
    dir: join(__dirname, 'routes'),
    options: { prefix: '/customers' },
  })

  return app
}
apps/customers/routes/getCustomers.ts
import { FastifyInstance } from 'fastify'

export default async function (app: FastifyInstance): Promise<void> {
  app.get('/', async () => {
    return [
      { id: 1, name: 'Johnny Walker' },
      { id: 2, name: 'Jack White' },
    ]
  })
}
apps/customers/handlers/app.ts
import awsLambdaFastify from '@fastify/aws-lambda'
import createApp from '../app'

const app = createApp()

// Wrap the Fastify app so it can be invoked as an AWS Lambda handler
const proxy = awsLambdaFastify(app)

export const handler = proxy
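As a quick local sanity check (this script is not part of the repository), the exported handler can be invoked directly with a minimal API Gateway-style event; when no callback is passed, @fastify/aws-lambda returns a promise with the proxied response.
// local-invoke.ts — hypothetical helper; event and context are stripped down to the essentials.
import { handler } from './app'

async function main(): Promise<void> {
  const event = {
    httpMethod: 'GET',
    path: '/customers/',
    headers: {},
    queryStringParameters: null,
    body: null,
    isBase64Encoded: false,
  }

  // Cast to any because this is a hand-built event, not a full APIGatewayProxyEvent
  const result = await handler(event as any, {} as any)
  console.log(result.statusCode, result.body)
}

main()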
apps/customers/server.ts
import createApp from './app'

const app = createApp()

app.ready().then(() => {
  app.listen({ port: 4001 }, (err) => {
    if (err != null) {
      app.log.error(err)
      process.exit(1)
    }
  })
})
(Order app has the same structure).
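Because createApp builds the application without binding it to a port or to Lambda, routes can also be exercised in-process with Fastify's inject(), which dispatches a simulated HTTP request without opening a socket. A small, hypothetical smoke test (adjust the import path to wherever createApp actually lives):
// smoke-test.ts — illustrative only, not part of the repository.
import createApp from './apps/customers/app'

async function smokeTest(): Promise<void> {
  const app = createApp({ logger: false })

  // In-process request: no server, no Lambda runtime required
  const res = await app.inject({ method: 'GET', url: '/customers/' })

  console.log(res.statusCode) // expected: 200
  console.log(res.json())     // expected: the customers list from getCustomers.ts

  await app.close()
}

smokeTest()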
Going Serverless
In our move to deploy the entire project on the cloud using Serverless Framework, we aim to avoid bundling all dependencies (as listed in package.json) for each Fastify application's lambda function. To achieve this, we'll create a shared layer that can be utilized by all applications in the monorepo. This layer approach streamlines deployment, ensuring efficient use of resources across multiple functions while simplifying maintenance.
Using the "serverless-layers" Plugin
Introducing the "serverless-layers" plugin (https://www.serverless.com/plugins/serverless-layers): a user-friendly tool that automates the creation of layers. By default, it utilizes the package.json found in the project's root, simplifying the layer setup process. This plugin streamlines the integration of shared dependencies across multiple functions, enhancing efficiency and reducing redundancy in your Serverless deployment.
serverless.yml
service: fastify-serverless-layer
frameworkVersion: '3'

provider:
  name: aws
  runtime: nodejs18.x
  region: eu-south-1
  deploymentBucket:
    name: fastify-serverless-layer-bucket
    serverSideEncryption: AES256

plugins:
  - serverless-plugin-typescript
  - serverless-tscpaths
  - serverless-layers
  - serverless-deployment-bucket
  - serverless-offline

custom:
  serverless-layers:
    functions: # optional
      - customers
      - orders
    dependenciesPath: ./package.json

functions:
  customers:
    handler: apps/customers/src/handlers/app.handler
    events:
      - http:
          path: /customers/
          method: ANY
      - http:
          path: /customers/{any+}
          method: ANY
  orders:
    handler: apps/orders/src/handlers/app.handler
    events:
      - http:
          path: /orders/
          method: ANY
      - http:
          path: /orders/{any+}
          method: ANY
In the serverless.yml above, each Lambda function ('customers' and 'orders') has a handler that points to its application inside the monorepo; for instance, the 'customers' function maps to apps/customers/src/handlers/app.handler. Each function is then attached to two API Gateway routes: the base path (e.g. /customers/) and a {any+} catch-all, so every sub-route is forwarded to the same Lambda and the Fastify app performs the actual routing internally. This keeps the API Gateway configuration minimal and lets each Fastify application own its routes.
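Once the stack is deployed (or run locally with serverless-offline), the two applications can be called through their API Gateway routes with any HTTP client. A minimal sketch, assuming serverless-offline's default port 3000 and Node 18's global fetch (depending on the plugin version, the stage, e.g. /dev, may also appear in the path):
// call-endpoints.ts — illustrative; adjust BASE_URL for your deployment or offline setup.
const baseUrl = process.env.BASE_URL ?? 'http://localhost:3000'

async function main(): Promise<void> {
  // Handled by the 'customers' Lambda, routed internally by its Fastify app
  const customersRes = await fetch(`${baseUrl}/customers/`)
  console.log('customers:', await customersRes.json())

  // Handled by the separate 'orders' Lambda
  const ordersRes = await fetch(`${baseUrl}/orders/`)
  console.log('orders:', await ordersRes.json())
}

main()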
Conclusions
The project code is in this GitHub repository: fastify-serverless-layer.
This journey took us from a CloudFormation stack built around a modular monolith to a setup ready to evolve into microservices, showing the flexibility of a modular design in practice.