Srishti Prasad
Develop and Test AWS S3 Applications Locally with Node.js and LocalStack

AWS S3 (Simple Storage Service)

Amazon S3 is a scalable, high-speed, web-based cloud storage service designed for online backup and archiving of data and applications.
It provides object storage, which means it stores data as objects within resources called buckets.

Managing files and images is a core part of modern web development, and cloud storage solutions such as Amazon S3 are a common choice for it.
However, developing and testing S3 interactions directly against AWS can be cumbersome and expensive.

This blog targets Node.js developers, cloud developers, and DevOps engineers.
The focus is on simulating AWS S3 locally using LocalStack for:

  • Efficient development and testing workflows
  • Cost-effective solutions

By the end of this blog, you will learn:

  • How to utilize Node.js and AWS SDK v3
  • Methods for uploading and fetching images
  • Strategies to ensure a seamless local development workflow

📚 What is LocalStack?

LocalStack is a tool that lets you mimic a cloud development environment by running local versions of various AWS services. This lets you develop and debug your code before deploying it to a live environment, which makes LocalStack useful for simulating important AWS services such as message queues and object storage.

Furthermore, LocalStack is a useful resource for learning how to set up and launch services in a Docker container without having to open an AWS account or hand over your credit card.
In this tutorial, we build a LocalStack container and implement the primary S3 service functions against it.

๐ŸŒNode Js & ๐Ÿ“LocalStack

As mentioned earlier, LocalStack emulates a local Amazon environment with some of the most popular services. This article walks you through creating a container from the LocalStack image, then shows how to use Node.js and AWS SDK v3 to create an S3 bucket and implement key functionality on top of it.

Prerequisites
Before you begin, please ensure that 🐳 Docker and Docker Desktop are installed.
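You can verify the installation from a terminal before continuing:

docker --version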

🚀 A Step-by-Step Guide:
1) Pull the LocalStack image from Docker Hub:

docker pull localstack/localstack

2) Run the container with the following command:

docker run -d --rm -it -p 4566:4566 -p 4510-4559:4510-4559 localstack/localstack
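If you prefer Docker Compose, a minimal compose file along these lines should also work (the file and service names here are just examples, not part of the original setup):

# docker-compose.yml
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"
      - "4510-4559:4510-4559"

Start it with docker compose up -d.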

3) To list the running containers in your Docker environment, run:

docker ps

Now copy the container ID (or name) of the LocalStack container from the output and use it in the next command.

4) Get a shell inside the container with the following command:

 docker exec -it <container-id-or-name> bash

5) The AWS CLI needs some kind of dummy configuration, so set default credentials:

 aws configure --profile default 

Enter the same access key ID and secret access key that you will use in the script; here I have used 'test' for both.
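If you prefer a non-interactive setup, the same dummy values can be written with aws configure set (us-east-1 matches the region used in the script later on):

 aws configure set aws_access_key_id test --profile default
 aws configure set aws_secret_access_key test --profile default
 aws configure set region us-east-1 --profile default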
6) Test that you are able to create S3 buckets:

 awslocal s3api create-bucket --bucket sample-bucket

Here the bucket name is sample-bucket; in my own setup I created one named 'banner', so that's the name I'll use in the script.

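awslocal is just a thin wrapper that points the standard AWS CLI at LocalStack, so the equivalent command run from your host machine would look roughly like this:

aws --endpoint-url=http://localhost:4566 s3api create-bucket --bucket sample-bucket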

7) To list your S3 buckets:

 awslocal s3api list-buckets

You now have an S3 bucket, and you can run operations against it straight from the CLI, as in the quick smoke test below, or from code.
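For example, a quick smoke test from inside the container could upload a throwaway file and list it back (hello.txt is just an example file name):

echo "hello" > hello.txt
awslocal s3 cp hello.txt s3://sample-bucket/hello.txt
awslocal s3 ls s3://sample-bucket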

Node.js script to generate a signed URL for every image in the S3 bucket
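Before running it, note that the script uses ES module imports, so your package.json needs "type": "module" (or save the file with an .mjs extension), and the three SDK packages it imports must be installed:

npm install @aws-sdk/client-s3 @aws-sdk/credential-providers @aws-sdk/s3-request-presigner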

import {
  S3Client,
  CreateBucketCommand,
  GetObjectCommand,
  HeadBucketCommand,
  waitUntilBucketExists,
  ListObjectsV2Command,
  PutObjectCommand
} from '@aws-sdk/client-s3';
import { fromEnv } from '@aws-sdk/credential-providers';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';
import fs from 'fs';
import path from 'path';

// Dummy credentials for LocalStack (it accepts any values; 'test' matches the profile configured earlier)
process.env.AWS_ACCESS_KEY_ID = 'test';
process.env.AWS_SECRET_ACCESS_KEY = 'test';

const s3Client = new S3Client({
  region: 'us-east-1',
  endpoint: 'http://localhost:4566',
  forcePathStyle: true,
  credentials: fromEnv()
});

const bucket = 'banner';

// Add the absolute path of the directory containing the images to upload
const directoryPath = '/Users/download/S3-localstack/images';

async function generateSignedUrl(key) {
  try {
    // ResponseContentDisposition belongs on the GetObjectCommand input rather than the presigner options
    const getObjectUrl = await getSignedUrl(s3Client, new GetObjectCommand({ Bucket: bucket, Key: key, ResponseContentDisposition: 'inline' }), { expiresIn: 3600 });
    return getObjectUrl;
  } catch (error) {
    console.error(`Error generating signed URL for ${key}:`, error);
    return null;
  }
}

async function uploadFiles() {
  const files = fs.readdirSync(directoryPath);
  const uploadPromises = files.map(file => {
    const filePath = path.join(directoryPath, file);
    const fileStream = fs.createReadStream(filePath);
    fileStream.on('error', function (err) {
      console.error('File Error', err);
    });

    const uploadParams = {
      Bucket: bucket,
      Key: file,
      Body: fileStream
    };

    const putObjectCommand = new PutObjectCommand(uploadParams);
    return s3Client.send(putObjectCommand).then(() => {
      console.log(`File ${file} successfully uploaded to bucket ${bucket}`);
    });
  });

  await Promise.all(uploadPromises);
}

async function main() {
  try {
    // Check if the bucket exists
    try {
      const headBucketCommand = new HeadBucketCommand({ Bucket: bucket });
      await s3Client.send(headBucketCommand);
      console.log(`Bucket ${bucket} already exists`);
    } catch (err) {
      if (err.name === 'NotFound') {
        const createBucketCommand = new CreateBucketCommand({ Bucket: bucket });
        await s3Client.send(createBucketCommand);
        console.log(`Bucket ${bucket} successfully created`);
      } else {
        throw err;
      }
    }

    // Wait until the bucket exists
    await waitUntilBucketExists({ client: s3Client, maxWaitTime: 20 }, { Bucket: bucket });
    console.log(`Bucket ${bucket} is ready`);

    // Upload files to the bucket
    await uploadFiles();

    // List objects in the bucket
    const listObjectsCommand = new ListObjectsV2Command({ Bucket: bucket });
    const data = await s3Client.send(listObjectsCommand);

    if (!data.Contents || data.Contents.length === 0) {
      console.log('No objects found in the bucket');
      return;
    }

    const imageKeys = data.Contents.map((object) => object.Key);
    console.log('Found keys:', imageKeys);

    // Generate signed URLs for each image
    const signedUrls = await Promise.all(imageKeys.map((key) => generateSignedUrl(key)));
    console.log('Signed URLs for the uploaded images:', signedUrls);

    return signedUrls;
  } catch (err) {
    console.error(`Failed to complete operations for bucket ${bucket}:`, err);
  }
}

// Call the main function
main();


When using AWS S3 in a real-world scenario, you typically fetch files or images directly from the S3 bucket rather than from a local directory.
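As a minimal sketch of that, assuming the same LocalStack endpoint and the banner bucket used above (the key newS3.jpeg is just an example), reading an object back out of S3 could look like this:

import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';

const s3Client = new S3Client({
  region: 'us-east-1',
  endpoint: 'http://localhost:4566',
  forcePathStyle: true,
  credentials: { accessKeyId: 'test', secretAccessKey: 'test' }
});

// Fetch an object and return its contents as a Buffer
async function downloadObject(bucket, key) {
  const { Body } = await s3Client.send(new GetObjectCommand({ Bucket: bucket, Key: key }));
  // In SDK v3 the Body is a stream; transformToByteArray() buffers it in memory
  return Buffer.from(await Body.transformToByteArray());
}

const image = await downloadObject('banner', 'newS3.jpeg');
console.log(`Downloaded ${image.length} bytes`);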

Output:
Run the above script with node <name_of_the_file>. You should see output similar to:

Bucket banner successfully created
Bucket banner is ready
File newS3.jpeg successfully uploaded to bucket banner
File s3.jpeg successfully uploaded to bucket banner
Found keys: [ 'newS3.jpeg', 's3.jpeg' ]
Signed URLs for the uploaded images: [
  'http://localhost:4566/banner/newS3.jpeg?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=test%2F20240713%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240713T145911Z&X-Amz-Expires=3600&X-Amz-Signature=e82c43d8017029c9e175df4ba4e7c317fecc4240c8a745291ed04696facc4da4&X-Amz-SignedHeaders=host&x-id=GetObject',

  'http://localhost:4566/banner/s3.jpeg?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=test%2F20240713%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240713T145911Z&X-Amz-Expires=3600&X-Amz-Signature=e4d173ae016a3d89d0355d92b0f59b0b008866d18d2a4cfda2a93fcda381fb0c&X-Amz-SignedHeaders=host&x-id=GetObject'
]

If you want to build an API that returns those URLs to the frontend, you can integrate the script with an Express server by creating and exposing a route that uses it, as sketched below.
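Here is a minimal sketch of such a route, assuming the script above lives in a file called s3.js and its main() function has an export keyword added (the file name, route path, and port are all just examples):

import express from 'express';
import { main as getSignedUrls } from './s3.js';

const app = express();

// GET /images returns the signed URLs generated by the script
app.get('/images', async (req, res) => {
  try {
    const urls = await getSignedUrls();
    res.json({ urls });
  } catch (err) {
    res.status(500).json({ error: 'Failed to generate signed URLs' });
  }
});

app.listen(3000, () => console.log('API listening on http://localhost:3000'));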

This approach not only demonstrates the practical use of the AWS SDK and LocalStack for local development, but also shows how you can extend the functionality to real-world applications by integrating it with backend services like Express.js.

For more detail on each method used in the script, see the AWS SDK for JavaScript v3 documentation.

Let me know in the comments if you need help exposing the API.

That's it for today. Thanks for reading this far; I hope you enjoyed it!

Feel free to engage in the comment section if you have any questions or wish to contribute further insights to this blog. Your feedback and discussions are valued contributions that enhance our shared knowledge.
