The Ask
I always wondered whether I could test cloud services locally, without the hassle and expense of provisioning them with one of the providers like AWS, Azure, or GCP. Then I found LocalStack, an open-source project that lets us emulate multiple AWS cloud services, such as SQS, EC2, and CloudFormation, on the local machine.
Introduction
LocalStack is a cloud service emulator that runs in a single container on your laptop or in your CI environment. With LocalStack, you can run your AWS applications or Lambdas entirely on your local machine without connecting to a remote cloud provider! Whether you are testing complex CDK applications or Terraform configurations, or just beginning to learn about AWS services, LocalStack helps speed up and simplify your testing and development workflow.
The required cloud service environment can be simulated using LocalStack, which supports a wide range of cloud services including S3, Lambda, SQS, SNS, SES, RDS, and DynamoDB.
Component testing allows you to test the behavior of individual components or modules separately and ensure each works as expected before integrating it into a larger system. In an event-driven architecture, component testing is crucial because the asynchronous, distributed nature of the system makes integration testing complex. Lambda functions are responsible for handling different events, and component testing can be utilised to isolate them, test them independently, verify Lambda-specific logic, and identify defects early.
Prerequisites
There are a few prerequisites to set up LocalStack for this walkthrough.
Install Docker — LocalStack emulates cloud services within a single Docker container, so Docker must be installed and the Docker daemon must be running on the machine.
Install AWS CLI — you may need to run AWS CLI commands to inspect the cloud services created inside the container.
Setup
First of all, use the commands below to create a directory app containing index.js, docker-compose.yml, and trust-policy.json.
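The original commands did not survive into this copy of the article; a minimal equivalent (only the file names are from the article, the rest is assumed) is:

```shell
# Create the project directory and the three files the article describes
mkdir -p app
cd app
touch index.js docker-compose.yml trust-policy.json
ls
```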
index.js — this file is the main entry point for the Lambda and contains the handler function. When the Lambda function is invoked, it runs this handler method, and you will see the console statement inside your terminal. The handler accepts three arguments: event, context, and callback. You can read more about the Lambda handler arguments in the How it Works section of the AWS docs.
docker-compose.yml — this file is used to start LocalStack inside a Docker container with some additional environment variables and configuration.
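A docker-compose.yml along these lines would do it. The service list, image tag, and Docker socket mount are assumptions based on LocalStack's documented configuration, not taken from this article:

```yaml
services:
  localstack:
    image: localstack/localstack:latest
    ports:
      - "4566:4566"           # LocalStack edge port shared by all services
    environment:
      - SERVICES=lambda,iam    # only start the services this walkthrough needs
      - DEBUG=1
    volumes:
      # LocalStack needs the Docker socket to spin up Lambda containers
      - "/var/run/docker.sock:/var/run/docker.sock"
```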
trust-policy.json — this file defines the trust policy for the IAM role, which grants the Lambda service permission to assume the role and access resources. You can read more about IAM roles in the AWS documentation.
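The standard Lambda trust policy looks like this. This is the stock AWS document, included as a reasonable assumption about what the article's file contained:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```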
You have now successfully deployed your Lambda function to LocalStack running inside a Docker container on your system.
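For reference, the deployment just summarised can be sketched with the AWS CLI pointed at LocalStack's edge port. The role name, function name, and runtime below are placeholders of my choosing, not values from the article:

```shell
# Run from the app directory with LocalStack up (docker-compose up).
# Zip the handler, create the role from trust-policy.json, then deploy.
zip function.zip index.js

aws --endpoint-url=http://localhost:4566 iam create-role \
  --role-name lambda-role \
  --assume-role-policy-document file://trust-policy.json

aws --endpoint-url=http://localhost:4566 lambda create-function \
  --function-name my-function \
  --runtime nodejs18.x \
  --handler index.handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::000000000000:role/lambda-role
```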
Execute the command below to invoke your Lambda.
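Assuming the function was deployed under the name my-function (a placeholder), the invocation looks like:

```shell
# --cli-binary-format is needed with AWS CLI v2 to pass a raw JSON payload
aws --endpoint-url=http://localhost:4566 lambda invoke \
  --function-name my-function \
  --cli-binary-format raw-in-base64-out \
  --payload '{"name": "LocalStack"}' \
  output.json

cat output.json
```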
Switch to the terminal where you ran ‘docker-compose up’ and you will see the Lambda's log output, confirming it was triggered.
Congratulations! You’ve successfully invoked your lambda function and it works as it would have in the AWS console.
The major upside of running LocalStack is that it enables you to:

- Explore other AWS services without worrying about incurring any cost by accident
- Test your production code locally (and thus crush some nasty bugs beforehand)
Furthermore, we can extend this to other AWS services such as DynamoDB.
NodeJS Local Examples with DynamoDB/Docker
Some samples to test DynamoDB locally through Docker
Note: recent LocalStack releases expose all services through the single edge port 4566 (the legacy per-service ports such as 4569 have been removed), so the commands and scripts below use it.

```shell
# Download & run LocalStack
docker pull localstack/localstack:latest
docker run -it -p 4566:4566 localstack/localstack

# Add some fake credentials locally (LocalStack does not validate them,
# but the AWS SDK needs something to sign requests with)
vi ~/.aws/credentials
```

Data to include (the SDK reads the default profile unless AWS_PROFILE says otherwise):

```ini
[default]
region = eu-west-1
aws_access_key_id = NOT_REAL
aws_secret_access_key = FAKE_UNUSED_CREDS
```

Then install the SDK and run the sample scripts:

```shell
npm i aws-sdk
node nodejs-dynamodb-create-table-local.js
node nodejs-dynamodb-populate-table-local.js
node nodejs-dynamodb-read-table-local.js
```
And you can also verify the new DynamoDB resource from the AWS CLI.
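For example, pointing the CLI at the edge port (this assumes LocalStack is still running and the Users table has been created by the scripts below):

```shell
aws --endpoint-url=http://localhost:4566 dynamodb list-tables

aws --endpoint-url=http://localhost:4566 dynamodb scan --table-name Users
```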
Sample scripts
nodejs-dynamodb-create-table-local.js
```javascript
const AWS = require("aws-sdk")

// URI and other properties could be loaded from env vars or a property file (.env)
AWS.config.update({
  region: "us-west-2",
  endpoint: "http://localhost:4566" // LocalStack edge port
})

const dynamodb = new AWS.DynamoDB()

// Create a Users table keyed on email
const params = {
  TableName: "Users",
  KeySchema: [{ AttributeName: "email", KeyType: "HASH" }],
  AttributeDefinitions: [{ AttributeName: "email", AttributeType: "S" }],
  ProvisionedThroughput: {
    ReadCapacityUnits: 5,
    WriteCapacityUnits: 5
  }
}

dynamodb.createTable(params, console.log)
```
nodejs-dynamodb-populate-table-local.js
```javascript
const AWS = require("aws-sdk")

// URI and other properties could be loaded from env vars or a property file (.env)
AWS.config.update({
  region: "us-west-2",
  endpoint: "http://localhost:4566" // LocalStack edge port
})

const dynamodb = new AWS.DynamoDB()

// Insert one item using the low-level attribute-value format
const params = {
  TableName: "Users",
  Item: {
    email: { S: "jon@doe.com" },
    fullname: { S: "Jon Doe" },
    role: { S: "Super Hero" }
  }
}

dynamodb.putItem(params, console.log)
```
nodejs-dynamodb-read-table-local.js
```javascript
const AWS = require("aws-sdk")

// URI and other properties could be loaded from env vars or a property file (.env)
AWS.config.update({
  region: "us-west-2",
  endpoint: "http://localhost:4566" // LocalStack edge port
})

// The DocumentClient works with plain JS values instead of attribute-value maps
const docClient = new AWS.DynamoDB.DocumentClient()

const email = process.env.EMAIL || "jon@doe.com"

// Query by the hash key; #email aliases the attribute name, :email its value
const params = {
  TableName: "Users",
  KeyConditionExpression: "#email = :email",
  ExpressionAttributeNames: {
    "#email": "email"
  },
  ExpressionAttributeValues: {
    ":email": email
  }
}

docClient.query(params, console.log)