Day 14 - Amazon Macie - Security & Compliance
Tweet This Blog - 100 days of Cloud on GitHub - Read On iCTPro.co.nz
Data on 123 Million US Households Exposed Due to Misconfigured AWS S3 Bucket - TrendMicro
Insecure Amazon S3 bucket exposed personal data on 500,000 Ghanaian graduates - Portswigger
Leaky Buckets: 10 Worst Amazon S3 Breaches - Bitdefender
These are some of the biggest leaks of recent years, and the main cause behind the hacks was misconfigured S3 buckets. The best way to avoid misconfiguration is proper monitoring and logging, which AWS already provides.
👮 Policing your S3 infrastructure
Amazon Macie is a compliance and security service that automatically detects your unsecured data. It gives you monitoring, alerting, and protection for your S3 buckets. Macie audits S3 buckets to provide:
- Discovery of your sensitive data at scale
- Continual evaluation of your Amazon S3 environment
- Scalable on-demand and automated sensitive data discovery jobs
- Fully managed sensitive data types
- Custom-defined sensitive data types
- Multi-account support and integration with AWS Organizations
- ML-based detection of access patterns
- Business value assigned in the form of a risk score
- PII/PHI data protection
- Alerts on unauthorized ACL changes
- Help with compliance requirements (e.g. GDPR)
Check out this link to see availability in your region.
Macie can raise alerts on: anonymized access, config compliance, credential loss, data compliance (PII/PHI), file hosting (malware detection), identity enumeration, information loss, location anomaly, open permissions, privilege escalation, ransomware, service disruption, and suspicious access.
🛠️ Enable Amazon Macie
📝 Knowledge Requirement
- S3 buckets
- CloudTrail (make sure CloudTrail is enabled; see the sketch below if you still need to set it up)
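If CloudTrail is not enabled yet, here is a minimal AWS CLI sketch. The trail name and bucket name are placeholders, and the bucket must already have a policy that allows CloudTrail to write to it:

```bash
# Create a trail that delivers API activity logs to an existing S3 bucket
aws cloudtrail create-trail \
  --name macie-demo-trail \
  --s3-bucket-name my-cloudtrail-logs

# Start recording API activity
aws cloudtrail start-logging --name macie-demo-trail
```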
🤖 The Macie
- Go to your console and search for Amazon Macie
- Now click Get Started
- Click Enable Macie
That's it! Now wait until Amazon Macie analyzes your account.
After a few minutes it will give you a brief summary of your S3 buckets.
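If you prefer the CLI over the console, the same can be done with the `macie2` commands; a minimal sketch:

```bash
# Enable Amazon Macie for the current account and region
aws macie2 enable-macie

# Confirm that Macie is running
aws macie2 get-macie-session
```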
🪣 Configure your bucket
Set up custom identifiers
Select Custom data identifiers and click Create.
I am adding a custom data identifier for SIRET-NIC data, then clicking Submit. (A CLI equivalent is sketched below.)
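The same identifier can be created from the CLI. A minimal sketch; the regex below is a simplified placeholder for the 14-digit SIRET format (9-digit SIREN plus 5-digit NIC) and does not validate the check digit:

```bash
# Create a custom data identifier that matches SIRET-NIC numbers
aws macie2 create-custom-data-identifier \
  --name "SIRET-NIC" \
  --description "French SIRET: 9-digit SIREN + 5-digit NIC" \
  --regex "\b[0-9]{9} ?[0-9]{5}\b"
```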
Configure a Job
Now let's configure a bucket for Amazon Macie to police.
- On the dashboard, click S3 buckets, select your bucket, and click Create Job.
- Refine your scope; for this demo I am keeping it as a One-time job. Click Next.
- Now select your custom data identifier and click Next.
- Give it a name and description, then click Next.
- Review and click Submit. (A CLI equivalent is sketched below.)
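Here is a hedged CLI equivalent of the job above. The account ID, bucket name, and identifier ID are placeholders; you can look up the identifier ID with `aws macie2 list-custom-data-identifiers`:

```bash
# One-time sensitive data discovery job over a single bucket
aws macie2 create-classification-job \
  --job-type ONE_TIME \
  --name demo-macie-job \
  --s3-job-definition '{"bucketDefinitions":[{"accountId":"111122223333","buckets":["my-demo-bucket"]}]}' \
  --custom-data-identifier-ids "<custom-data-identifier-id>"
```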
If you want to store results for the long term, consider configuring an S3 bucket for your sensitive data discovery results. This can be done under Discovery results in the Settings menu.
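From the CLI this is a single call; a minimal sketch (the bucket name and KMS key ARN are placeholders, and Macie requires a KMS key for this destination):

```bash
# Store sensitive data discovery results in your own bucket
aws macie2 put-classification-export-configuration \
  --configuration '{"s3Destination":{"bucketName":"my-macie-results","kmsKeyArn":"arn:aws:kms:us-east-1:111122223333:key/<key-id>"}}'
```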
Overview of all buckets
- Select S3 buckets from the dashboard to see an overview (the same data is available from the CLI, sketched below)
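A quick way to pull the same statistics from the CLI:

```bash
# Metadata and security posture that Macie holds for your S3 buckets
aws macie2 describe-buckets
```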
Publication of findings
Macie automatically publishes all new and updated findings to Amazon EventBridge. You can configure Macie to publish findings to additional destinations, and specify how often to publish updates to all destinations.
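For example, you could route all Macie findings to your own tooling with an EventBridge rule; a minimal sketch (the rule name is a placeholder, and you would still attach a target with `aws events put-targets`):

```bash
# Match every event that Amazon Macie publishes
aws events put-rule \
  --name macie-findings \
  --event-pattern '{"source":["aws.macie"]}'
```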
Connect with me on Twitter
🤝🏽 Connect with me on LinkedIn
🧑🏼‍🤝‍🧑🏻 Read more posts on dev.to or iCTPro.co.nz
💻 Connect with me on GitHub
I believe it's worth mentioning a legacy solution, s3-inspector from Clario.
Here is a small tutorial in case you want to practice with it.
Create IAM user
- Go to IAM, click Users, and select Add user. Give the user a name, tick Programmatic access, and click Next.
- Now select Attach existing policies directly and add the AmazonS3ReadOnlyAccess policy, then review and create the user.
Copy your credentials to a notepad or download them.
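If you already have admin credentials configured, the same user can be created from the CLI; a minimal sketch (the user name is a placeholder):

```bash
# Read-only user for s3-inspector
aws iam create-user --user-name s3readonly

# Attach the AWS managed read-only policy for S3
aws iam attach-user-policy \
  --user-name s3readonly \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess

# Generate the programmatic access key pair
aws iam create-access-key --user-name s3readonly
```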
Configure AWS CLI
You created the s3readonly access user and its programmatic access key above. Now let's set up the AWS CLI. Go to your terminal (Linux) and enter:
aws configure
Enter your AWS Access Key ID and AWS Secret Access Key when prompted.
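The session looks like this (keys truncated; the region and output format are just examples):

```bash
$ aws configure
AWS Access Key ID [None]: AKIA................
AWS Secret Access Key [None]: ....................
Default region name [None]: us-east-1
Default output format [None]: json
```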
Get the code
Currently I have forked the code on GitHub.
For Linux (Debian):
wget https://raw.githubusercontent.com/anuvindhs/s3-inspector/master/s3inspector.py
Execution
python s3inspector.py
This script is known for false positives. Also, if you get an error during execution, try:
sudo apt install python-pip
pip install termcolor
Or you can grab the code and improve it yourself.