Bridging the Gap Between Two Worlds
In the realm of software design, marrying Command Query Responsibility Segregation (CQRS) with low-code might seem like an odd match. However, beneath the surface, these two approaches complement each other surprisingly well. This article explores the synergy between CQRS and low-code, demonstrating how their coexistence bridges the gap between traditional and modern software architectures. Join us as we unravel the simplicity and effectiveness of this powerful combination for building robust applications.
The Central Role of Commands in CQRS
In the realm of CQRS, a "command" represents an imperative instruction, typically responsible for altering the state of an application. Commands carry the essence of change, from updating data to triggering specific actions. When it comes to integrating CQRS with a low-code pipeline, the brilliance lies in the ability to delegate command execution. Low-code platforms excel at handling the intricacies of command processing, offering a streamlined and visual approach to define, execute, and manage these imperative actions. This synergy not only simplifies the implementation of commands but also empowers developers to harness the efficiency of low-code for rapid and visual command orchestration.
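To make this concrete, a command in CQRS is typically just an immutable data object that names a single state change. The class below is an illustrative sketch; its name and fields are hypothetical and not part of the demo application that follows:

```typescript
// Illustrative sketch: a command is an immutable data object that
// names exactly one state change. The class name and fields here are
// hypothetical, not from the demo application below.
class RenameProductCommand {
  constructor(
    public readonly productId: string,
    public readonly newName: string,
  ) {}
}

// A command carries intent; a separate handler performs the change.
const command = new RenameProductCommand('prod-42', 'Deluxe Widget');
console.log(command.productId, command.newName);
```

Keeping commands as plain data is what makes it possible to hand their execution off to something else entirely, such as a low-code pipeline.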
Command Handling Simplified: The Low-Code Connection
In the world of low-code, simplicity meets power as it takes on the role of a command handler through the orchestration of a dedicated pipeline. Low-code platforms serve as adept command executors, effortlessly managing the flow of instructions through a visual pipeline. This approach not only abstracts the complexities of command execution but also empowers developers with an intuitive and efficient means to design, deploy, and monitor the entire process. By embracing low-code as a command handler, developers can leverage its agility to streamline the execution of imperative actions, unlocking a new level of efficiency in the CQRS paradigm.
Scenario
As a user, the task at hand is to upload product data from a CSV file into an Airtable table.
Flow
- User initiates upload:
  - The user triggers an upload action by making an API call to the backend, providing a CSV file containing product data.
- Controller dispatches command:
  - The controller prepares an 'ImportCsvCommand' encapsulating the user's request.
  - The command is dispatched onto the command bus.
- Command handling:
  - The command handler takes charge and performs the following operations:
    - Uploads the CSV file to an S3 bucket.
    - Invokes the low-code workflow through a webhook.
- Low-code workflow:
  - The low-code workflow follows a structured sequence:
    - Reads the CSV file from the S3 bucket.
    - Parses the CSV data.
    - Inserts the parsed data into the 'products' table in Airtable.
Demo Application
Our application is crafted with NestJS, following the principles of Domain-Driven Design (DDD) and the CQRS pattern. Leveraging NestJS's built-in support for CQRS, we combine these approaches for efficient task handling. This mix ensures our application is not just modern but also well organized, showcasing the synergy between developer-friendly frameworks and sound design principles.
src/app.ts
```typescript
export default async function bootstrap(): Promise<FastifyInstance> {
  const serverOptions = {
    logger: true,
  };
  const instance = fastify(serverOptions);
  const app = await NestFactory.create<NestFastifyApplication>(
    AppModule,
    new FastifyAdapter(instance),
  );
  app.useGlobalPipes(new ValidationPipe());
  app.enableCors();
  await app.register(FastifyMultipart);
  await app.init();
  return instance;
}
```
Product Controller
src/product/presentation/http/product.controller.ts
```typescript
@Controller('product')
export class ProductController {
  constructor(private readonly commandBus: CommandBus) {}

  @Post('/importCsv')
  @HttpCode(200)
  async uploadFile(
    @Req() req: FastifyRequest,
    @Res() res: FastifyReply<never>,
  ): Promise<void> {
    if (!req.isMultipart()) {
      res.send(new BadRequestException());
      return;
    }
    const csvData = await req.file();
    let now = new Date();
    now = new Date(now.getTime() - now.getTimezoneOffset() * 60 * 1000);
    const storageKey = `${now.toISOString()}-${csvData.filename}`;
    await this.commandBus.execute(new ImportCsvCommand(csvData, storageKey));
    res.send();
  }
}
```
ImportCsv Command & Command Handler
src/product/application/command/import-csv-command.ts
```typescript
export class ImportCsvCommand {
  constructor(
    public readonly data: MultipartFile,
    public readonly storageKey: string,
  ) {}
}
```
src/product/application/command-handler/import-csv-handler.ts
```typescript
@CommandHandler(ImportCsvCommand)
export class ImportCsvHandler implements ICommandHandler<ImportCsvCommand> {
  constructor(
    @Inject(IStorageService) private readonly storageService: IStorageService,
    private readonly httpService: HttpService,
    private readonly configService: ConfigService,
  ) {}

  async execute({ data, storageKey }: ImportCsvCommand) {
    await this.storageService.uploadFile(data, storageKey);
    await this.callLowCodePipeline(storageKey);
  }

  private async callLowCodePipeline(storageKey: string) {
    // invoke webhook
  }
}
```
Low-Code Flow
For our low-code workflow, we harnessed the power of the Make platform. This user-friendly tool allows us to design our flow through an intuitive drag-and-drop interface. By connecting pre-configured connectors, such as the S3 bucket and the Airtable table, we orchestrate the entire process. Make empowers us to visually design and configure the workflow, ensuring a smooth integration between the S3 bucket and the Airtable table, all in a few clicks and without the need for extensive coding.
Webhook: the flow entry point
Our flow begins with the webhook block, the entry point of the process. This essential component receives the call from the command handler and kicks off the low-code workflow, setting the stage for the actions that follow.
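Based on the handler code below, the webhook receives a small JSON payload identifying the uploaded file; the values here are illustrative:

```json
{
  "bucketName": "my-products-bucket",
  "bucketKey": "2024-01-15T10:30:00.000Z-product.csv"
}
```

With the bucket name and key in hand, the flow has everything it needs to fetch the file in its next step.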
src/product/application/command-handler/import-csv-handler.ts
```typescript
@CommandHandler(ImportCsvCommand)
export class ImportCsvHandler implements ICommandHandler<ImportCsvCommand> {
  constructor(
    @Inject(IStorageService) private readonly storageService: IStorageService,
    private readonly httpService: HttpService,
    private readonly configService: ConfigService,
  ) {}

  async execute({ data, storageKey }: ImportCsvCommand) {
    await this.storageService.uploadFile(data, storageKey);
    await this.callLowCodePipeline(storageKey);
  }

  private async callLowCodePipeline(storageKey: string) {
    const result = this.httpService
      .post(this.configService.get<string>('FLOW_IMPORT_PRODUCTS_WEBHOOK'), {
        bucketName: this.configService.get<string>('PRODUCTS_BUCKET_NAME'),
        bucketKey: storageKey,
      })
      .pipe(tap((res) => console.log(res)))
      .pipe(
        catchError(() => {
          throw new ForbiddenException('API not available');
        }),
      );
    await lastValueFrom(result);
  }
}
```
```shell
curl --location 'http://localhost:3000/product/importCsv' \
  --form 'csv=@"./sample-data/product.csv"'
```
sample-data/product.csv
```
1;First Product;10.50
2;Second Product;20.00
3;Third Product;30.00
```
After the execution:
From S3 Reading to CSV Parsing
Next, the flow moves from reading the file in the S3 bucket to the CSV parsing phase: the low-code system retrieves the CSV file and parses its contents.
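Outside of Make, this parsing step amounts to splitting each semicolon-delimited line into fields. A minimal TypeScript sketch of the same transformation, where the field names are assumptions based on the sample file:

```typescript
// Sketch of the parsing the low-code block performs: each line of the
// sample file is "id;name;price". The Product field names are assumptions.
interface Product {
  id: number;
  name: string;
  price: number;
}

function parseCsv(csv: string): Product[] {
  return csv
    .trim()
    .split('\n')
    .map((line) => {
      const [id, name, price] = line.split(';');
      return { id: Number(id), name, price: Number(price) };
    });
}

const products = parseCsv('1;First Product;10.50\n2;Second Product;20.00');
console.log(products[1].name); // "Second Product"
```

In Make this logic is configured visually on the CSV parser block rather than written by hand.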
Final Step: insert into Airtable
As the final step, our low-code workflow seamlessly inserts the parsed data into the designated Airtable table.
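For context, Airtable's REST API expects records wrapped in a `records` array, each with a `fields` object; the Make connector builds this request for us. A hedged sketch of that mapping, where the field names are assumptions matching the sample CSV:

```typescript
// Sketch of the record shape Airtable's REST API expects
// (POST /v0/{baseId}/{tableName} with a "records" array).
// The Product and fields names here are assumptions.
interface Product {
  id: number;
  name: string;
  price: number;
}

function toAirtablePayload(products: Product[]) {
  return {
    records: products.map((p) => ({
      fields: { Id: p.id, Name: p.name, Price: p.price },
    })),
  };
}

const payload = toAirtablePayload([{ id: 1, name: 'First Product', price: 10.5 }]);
console.log(JSON.stringify(payload));
```

In the Make flow, this shape is produced by mapping the parsed CSV columns onto the Airtable connector's fields in the visual editor.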
Conclusions
The project code is in this GitHub repository: cqrs-with-low-code.
Embracing a low-code approach for certain command handling in our application has proven advantageous, notably in expediting feedback loops. Delegating tasks to the low-code pipeline streamlines command execution, fostering a quicker response mechanism. This efficiency not only enhances our development process but also empowers us to adapt and iterate swiftly, ultimately contributing to a more responsive and agile system.