
Kunal Shah for AWS Community Builders


Shielding Your Data: Safeguarding AWS S3 via VPC Endpoints.


Security & Cost Optimization at the same time.

Project Overview —

The AWS VPC Endpoint project aims to architect a secure, cost-efficient, and scalable cloud environment. The project’s primary goal is to provide a standardized framework for setting up AWS VPC endpoints to access AWS services privately while adhering to AWS best practices and compliance requirements.

SOLUTIONS ARCHITECTURE OVERVIEW -

First, let’s understand the real-world use cases -

1. Securing Access to AWS Services: VPC endpoints let you access AWS services such as S3, DynamoDB, or Amazon RDS from within your VPC without exposing traffic to the public internet. This enhances security by reducing the attack surface and removes the need to configure public-facing security controls.

Example: A company’s application running in an AWS VPC needs to access data stored in Amazon S3 buckets securely. By using VPC endpoints for S3, the company ensures that data transfer between the application and S3 stays within the AWS network, reducing exposure to external threats.
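
To make this concrete, here is a minimal sketch of a bucket policy that denies S3 access unless requests arrive through a specific VPC endpoint. The bucket name and endpoint ID are placeholders, not values from this lab, and in practice you would carve out an exception for your admin role so you do not lock yourself out.

    # Placeholder bucket name and endpoint ID; deny all S3 actions on the bucket
    # unless the request arrives through the named VPC endpoint.
    aws s3api put-bucket-policy --bucket my-example-bucket --policy '{
      "Version": "2012-10-17",
      "Statement": [{
        "Sid": "DenyAccessOutsideVpcEndpoint",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [
          "arn:aws:s3:::my-example-bucket",
          "arn:aws:s3:::my-example-bucket/*"
        ],
        "Condition": {
          "StringNotEquals": { "aws:SourceVpce": "vpce-0123456789abcdef0" }
        }
      }]
    }'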

2. Cost Optimization: VPC endpoints can help optimize costs by reducing the data transfer charges incurred when accessing AWS services over the public internet. Since data transfer between your VPC and the endpoint service stays on the AWS network, you avoid the data transfer costs associated with internet traffic.

Example: An organization regularly transfers large volumes of data between its AWS VPC and Amazon S3 buckets for backup and storage purposes. By using VPC endpoints for S3, the organization can significantly reduce data transfer costs compared to accessing S3 over the internet.

3. Compliance Requirements: VPC endpoints can help meet regulatory and compliance requirements by ensuring that data transfer between your VPC and AWS services remains private and secure. This is particularly important for industries with strict data privacy and compliance requirements.

Example: A healthcare organization needs to ensure that patient data stored in Amazon DynamoDB remains protected and compliant with HIPAA regulations. By using VPC endpoints for DynamoDB, the organization ensures that data access is restricted to authorized sources within the VPC, helping meet compliance requirements.
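
As an illustration (not part of this lab), a restrictive policy can be attached to a DynamoDB gateway endpoint so that only specific actions on one table are allowed through it. The endpoint ID, account ID, and table name below are placeholders.

    # Placeholder endpoint ID, account ID, and table name; allow only basic
    # read/write actions on one table through the DynamoDB gateway endpoint.
    aws ec2 modify-vpc-endpoint \
        --vpc-endpoint-id vpce-0abc1234def567890 \
        --policy-document '{
          "Version": "2012-10-17",
          "Statement": [{
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:ap-south-1:111122223333:table/PatientRecords"
          }]
        }'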

4. Improved Network Performance: VPC endpoints can improve network performance by lowering latency and increasing throughput when accessing AWS services. Since data transfer takes place within the AWS network, communication between your VPC and the endpoint services is faster and more reliable.

Example: A gaming company’s multiplayer online game requires real-time access to Amazon DynamoDB for storing player profiles and game state. By using VPC endpoints for DynamoDB, the company ensures low-latency access to the database, providing a seamless gaming experience for players.

Overall, AWS VPC endpoints provide a secure, cost-effective, and efficient way to access AWS services from within your VPC, making them an essential component of many AWS architectures. By leveraging VPC endpoints, organizations can increase security, reduce costs, and improve performance for their AWS cloud workloads.

Prerequisite —

  • AWS Account with Admin Access.

AWS Services Usage —

  • AWS VPC, Endpoints, EC2, SSM, S3, CloudFormation and IAM

STEP BY STEP GUIDE -

STEP 1 : Clone the GitHub Repo

  • Navigate to the following GitHub repository: **s3-vpc-endpoint-lab**

  • Clone the repo to download the CloudFormation Template for this lab.

  • CloudFormation template name — endpoint-lab-cft.yml
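
If you prefer the terminal, cloning looks roughly like this (the URL is a placeholder; use the repository linked above):

    # Placeholder URL; substitute the actual repository location.
    git clone https://github.com/<your-github-user>/s3-vpc-endpoint-lab.git
    cd s3-vpc-endpoint-lab
    ls endpoint-lab-cft.yml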

STEP 2 : Creating AWS resources through the CloudFormation service.

  • Log in to your AWS account and navigate to the AWS CloudFormation service.

  • Switch the AWS console to the region where you want to deploy the resources (the default is ap-south-1).

  • If you want to deploy to any other region, you will have to modify the prefix list in the CloudFormation template.

  • Click Create Stack & upload the template downloaded in step 1.

  • Keep the rest of the settings as default & hit Create.

  • This stack will create a VPC, an EC2 instance, VPC endpoints, an instance profile, security groups, subnets, and route tables.
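
The same deployment can also be scripted. A rough CLI equivalent is sketched below; the stack name is an assumption, and the IAM capability flag is included because the template creates an instance profile.

    # Assumed stack name; run from the folder containing the cloned template.
    aws cloudformation create-stack \
        --stack-name s3-vpc-endpoint-lab \
        --template-body file://endpoint-lab-cft.yml \
        --capabilities CAPABILITY_NAMED_IAM \
        --region ap-south-1

    # Wait until the stack finishes creating.
    aws cloudformation wait stack-create-complete \
        --stack-name s3-vpc-endpoint-lab --region ap-south-1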

STEP 3 : Verify the CloudFormation deployment.

  • Check all the resources created/deployed through CloudFormation.

  • Verify that the EC2 instance’s security group has no inbound rules.

  • Verify & validate the endpoints’ security group, which allows only the VPC CIDR in its inbound rules.

  • Confirm that all endpoints (SSM, EC2, S3) are deployed.
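
One way to double-check from the CLI (the VPC ID is a placeholder; take it from the stack outputs or the VPC console):

    # List the endpoints in the lab VPC and confirm the interface (SSM/EC2)
    # and gateway (S3) endpoints are in the "available" state.
    aws ec2 describe-vpc-endpoints \
        --filters Name=vpc-id,Values=<your-vpc-id> \
        --query 'VpcEndpoints[].{Service:ServiceName,Type:VpcEndpointType,State:State}' \
        --output table --region ap-south-1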

STEP 4 : Creating two AWS S3 buckets.

  • Navigate to AWS S3 in the AWS console.

  • Create a bucket in the same region where CloudFormation is deployed.

  • This gives you a bucket in the same region as the S3 gateway endpoint.

  • Create another bucket in a different region to test the cross-region case.
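
If you would rather create the buckets from the CLI, a sketch along these lines works (bucket names are placeholders and must be globally unique):

    # Same-region bucket (matches the S3 gateway endpoint's region).
    aws s3 mb s3://my-endpoint-lab-bucket-aps1 --region ap-south-1

    # Different-region bucket for the cross-region test.
    aws s3 mb s3://my-endpoint-lab-bucket-use1 --region us-east-1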

STEP 5 : Connect to AWS EC2 Instance through SSM

  • As we have deployed the AWS SSM endpoints, we should be able to connect to the private EC2 instance through SSM Session Manager (a CLI sketch follows at the end of this step).

  • This connection is private & the traffic remains inside the isolated VPC.

  • Once connected to EC2, list S3 buckets with the usual AWS CLI command:

    aws s3 ls

  • You should get no response: the EC2 instance is private & has no route to the internet, so it cannot query s3.ap-south-1.amazonaws.com.
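
For reference, the Session Manager connection can also be opened from a local terminal, roughly as below. The instance ID is a placeholder, and the Session Manager plugin for the AWS CLI must be installed.

    # Placeholder instance ID; take the real one from the CloudFormation stack resources.
    aws ssm start-session --target i-0123456789abcdef0 --region ap-south-1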

STEP 6 : Accessing AWS S3 via AWS VPC Gateway endpoint

  • Now we will use the AWS VPC gateway endpoint to access our regional AWS S3 bucket.

  • From the EC2 instance (SSM session), run the list command again, this time specifying the same region as the gateway endpoint:

    aws s3 ls --region ap-south-1

  • List the contents of the same-region bucket via the S3 gateway endpoint:

    aws s3 ls s3://mybucket --region ap-south-1

  • List the contents of the different-region bucket via the same command:

  • This will give no response, because the gateway endpoint only routes traffic to S3 in its own region; there is no route for the us-east-1 S3 endpoint.

    aws s3 ls s3://mybucket --region us-east-1
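
To see why the same-region call works, you can inspect the route tables: the gateway endpoint shows up as a route whose destination is the regional S3 prefix list rather than an internet gateway. The VPC ID below is a placeholder.

    # Show each route's destination (CIDR or prefix list) and target.
    aws ec2 describe-route-tables \
        --filters Name=vpc-id,Values=<your-vpc-id> \
        --query 'RouteTables[].Routes[].[DestinationCidrBlock,DestinationPrefixListId,GatewayId]' \
        --output table --region ap-south-1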

STEP 7 : Accessing AWS S3 via PrivateLink

  • You can run the same command from another VPC or from on-premises, as long as it has connectivity to the current VPC, and access the bucket via PrivateLink (the S3 interface endpoint); see the sketch below.
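
When going through the S3 interface endpoint, the request must target the endpoint-specific DNS name. The VPCE DNS name and bucket name below are placeholders; copy the real DNS name from the endpoint's details and prefix it with "bucket." as shown.

    # Placeholder endpoint DNS name and bucket; the "bucket." prefix enables
    # virtual-hosted-style access through the interface endpoint.
    aws s3 ls s3://my-endpoint-lab-bucket-aps1 \
        --endpoint-url https://bucket.vpce-0123456789abcdef0-abcd1234.s3.ap-south-1.vpce.amazonaws.com \
        --region ap-south-1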

STEP 8 : Decommission

  • Delete the CloudFormation Stack to delete all the deployed resources.

  • Delete the S3 buckets created in step 4.
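
The cleanup can also be done from the CLI (the names below are the assumed values used earlier in this write-up):

    # Empty and remove both buckets, then tear down the stack.
    aws s3 rb s3://my-endpoint-lab-bucket-aps1 --force --region ap-south-1
    aws s3 rb s3://my-endpoint-lab-bucket-use1 --force --region us-east-1
    aws cloudformation delete-stack --stack-name s3-vpc-endpoint-lab --region ap-south-1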

STEP 9 : More to read -

Congrats! We have successfully completed the lab for Shielding Your Data: Safeguarding AWS S3 via VPC Endpoints.

I am Kunal Shah, AWS Certified Solutions Architect, helping clients achieve optimal solutions on the cloud. Cloud enabler by choice, DevOps practitioner with 8+ years of overall experience in the IT industry.

I love to talk about Cloud Technology, DevOps, Digital Transformation, Analytics, Infrastructure, Dev Tools, Operational efficiency, Serverless, Cost Optimization, Cloud Networking & Security.

#aws #community #builders #VPC #endpoints #privatelink #connectivity #cloudformation #cost #optimization #network #security #hybrid #prefixlist #isolated #solution #centralize #secure #access #performance #edge #locations #operations #infrastructure #scalable #reliable #highly #available #private #design #acloudguy

You can reach out to me @ acloudguy.in
