Keeping track of all the actions performed in your AWS account can be a real challenge, especially when you have multiple team members using the same account.
All actions in your account get recorded by AWS CloudTrail. This trail will likely have a huge number of entries, as it records pretty much everything that happens in your account. You can run queries on it, or download the logs and perform whatever processing you wish.
There are thousands of different API calls you can make in your AWS accounts by performing operations in the AWS Console, using the AWS Command Line Interface (CLI), or using the various AWS Software Development Kits (SDKs).
You are likely interested in knowing when some key actions occur in your accounts.
You may be interested from a security perspective: you want to know if someone is performing destructive operations or tampering with Identity and Access Management (IAM) entities, or even something as simple as someone creating or deleting IAM access keys or logging into the AWS console as the account root user.
Another case may be that you're running containers in your account using a service like the Elastic Container Service (ECS) and you want to know when tasks in your container clusters have completed or failed.
Since every API call is logged, you can build tools that react to any event and perform almost any operation with this information. A simple approach to start with is to notify interested parties when certain API calls occur, so they can decide whether something is important enough to investigate (for security-related events), or whether to kick off new operations to continue the flow of a larger application now that the previous tasks have completed.
This blog walks through an example GitHub repo containing a Serverless Application Model (SAM) project that allows users to receive emails when a subset of S3, IAM, and account login events occur in your AWS account, and that also sends messages to a Slack server with the details of those AWS account events.
Amazon EventBridge
One of the most important components in AWS for building Event-driven Architectures (EDAs), or any type of system where you trigger actions based on something happening, is Amazon EventBridge.
EventBridge contains a number of components for building EDAs, including event buses, a scheduler, and EventBridge Pipes.
Event buses are event routers that move events from sources to targets without coupling the two. Event producers create event objects and send them to an event bus, while event consumers register with EventBridge the patterns of events they are interested in reacting to. EventBridge takes care of routing matching events to the listeners that are interested in them.
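To make the producer side concrete, here is a minimal sketch of putting a custom event onto the default event bus with Boto3. The source, detail type, and detail values below are made-up illustration values; the AWS service events used in the rest of this post are published onto the default bus for you automatically.

import json
import boto3

events = boto3.client("events")

# Publish a single custom event to the default event bus.
events.put_events(
    Entries=[
        {
            "Source": "my.custom.app",        # hypothetical producer name
            "DetailType": "order.created",    # hypothetical event type
            "Detail": json.dumps({"orderId": "1234"}),
            "EventBusName": "default",
        }
    ]
)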
EventBridge Scheduler allows you to create recurring or one-time schedules that can invoke almost any AWS service, and it is built to handle very large numbers of scheduled actions.
EventBridge Pipes let you link event producers and consumers together without having to write extra function logic. You can enrich the information passed from producers to consumers without writing AWS Lambda code to facilitate it.
Here is a picture of the Amazon EventBridge AWS console page.
Amazon EventBridge default event bus
Each AWS account has a default Amazon EventBridge event bus configured. This event bus receives events for almost all actions performed in your AWS account, so every action you take in the AWS console, with the AWS CLI, or through AWS SDK code results in an event arriving on this default bus. By default no consumers are set up to do anything with these events, so EventBridge simply ignores them.
To set up a listener for a subset of these AWS API events in EventBridge you create EventBridge rules. A rule defines a filter, or event pattern, describing which types of events a consumer is interested in. The rule also defines a target that should be invoked when a matching event is seen.
Here is an example of an EventBridge rule event pattern matching three event types (DeleteBucket, DeleteBucketPolicy, and PutBucketPolicy) from the S3 service on the default event bus.
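Written out as an event pattern document, it is the same pattern the SAM template later in this post defines for its S3 rule, and looks roughly like this:

{
  "source": ["aws.s3"],
  "detail": {
    "eventName": ["DeleteBucket", "DeleteBucketPolicy", "PutBucketPolicy"],
    "eventSource": ["s3.amazonaws.com"]
  }
}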
For each EventBridge rule you can set up a number of targets to perform some action when a matching event is seen. Targets can be various AWS services, including AWS Lambda functions, SQS queues, ECS tasks, and many more. The full list of supported target types can be found here.
Here is an example of the target associated with the rule above for S3 events:
The target entry above says that the Lambda function named serverless-account-watcher-AccountEventHandler-Wq8ovJs581gi should be executed when events are received matching this rule.
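Stripped down, a target entry like that is essentially just a pointer to the function's Amazon Resource Name (ARN), roughly like the sketch below; the target Id is an arbitrary label, the region and account ID are placeholders, and SAM wires up both the target and the Lambda invoke permission for you.

{
  "Id": "AccountEventHandlerTarget",
  "Arn": "arn:aws:lambda:us-east-1:123456789012:function:serverless-account-watcher-AccountEventHandler-Wq8ovJs581gi"
}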
AWS Lambda Function to handle events
When an AWS Lambda function is triggered by an Amazon EventBridge rule, the function receives the details of the event (the AWS API call) that matched the rule. This event contains all the relevant information, including which IAM principal (user, role, etc.) triggered the action, the parameters of the request, and security information such as the IP address of the caller.
Here is an example of the event received for an API call that deletes an S3 bucket.
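The full event is fairly large, so here is a trimmed-down sketch of its shape, with placeholder account, IP, and ARN values. The pieces this project cares about are detail.eventName, detail.requestParameters, detail.userIdentity, and detail.sourceIPAddress:

{
  "version": "0",
  "detail-type": "AWS API Call via CloudTrail",
  "source": "aws.s3",
  "account": "123456789012",
  "region": "us-east-1",
  "detail": {
    "eventSource": "s3.amazonaws.com",
    "eventName": "DeleteBucket",
    "sourceIPAddress": "198.51.100.17",
    "userIdentity": {
      "type": "AssumedRole",
      "arn": "arn:aws:sts::123456789012:assumed-role/Admin/example-user"
    },
    "requestParameters": {
      "bucketName": "my-example-bucket"
    }
  }
}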
The AWS Lambda function that handles these events contains code to parse key details out of events like the one above, and then sends those details to a Slack server as well as emailing them to the subscribers of a Simple Notification Service (SNS) topic.
The main AWS Lambda code is below. It is written in Python using the AWS SDK for Python (Boto3) along with the very useful Powertools for AWS Lambda library. The full source, along with all the SAM Infrastructure as Code (IaC) files, can be found in the GitHub repo associated with this blog, which is here.
from aws_lambda_powertools import Logger, Tracer, Metrics
import boto3
import requests
import json
import os
import traceback
logger = Logger()
tracer = Tracer()
metrics = Metrics(namespace="AccountEventHandler")
@tracer.capture_method
def parse_event(event):
"""Parse details of event"""
result = ""
eventName = "UNKNOWN EVENT"
eventDetail = event.get('detail')
if eventDetail:
eventName = eventDetail.get('eventName')
try:
match eventName:
# S3 events
case "DeleteBucket":
result = f"Bucket \"{eventDetail.get('requestParameters').get('bucketName')}\" was deleted by \"{eventDetail.get('userIdentity').get('type')}\" \"{eventDetail.get('userIdentity').get('arn')}\""
case "PutBucketPolicy":
result = f"Bucket \"{eventDetail.get('requestParameters').get('bucketName')}\" policy added by \"{eventDetail.get('userIdentity').get('type')}\" \"{eventDetail.get('userIdentity').get('arn')}\""
case "DeleteBucketPolicy":
result = f"Bucket \"{eventDetail.get('requestParameters').get('bucketName')}\" policy deleted by \"{eventDetail.get('userIdentity').get('type')}\" \"{eventDetail.get('userIdentity').get('arn')}\""
# IAM events
case "CreateAccessKey":
result = f"Access Key \"{eventDetail.get('responseElements').get('accessKey').get('accessKeyId')}\" for user \"{eventDetail.get('requestParameters').get('userName')}\" created by \"{eventDetail.get('userIdentity').get('type')}\" \"{eventDetail.get('userIdentity').get('arn')}\""
case "DeleteAccessKey":
result = f"Access Key \"{eventDetail.get('requestParameters').get('accessKeyId')}\" for user \"{eventDetail.get('requestParameters').get('userName')}\" deleted by \"{eventDetail.get('userIdentity').get('type')}\" \"{eventDetail.get('userIdentity').get('arn')}\""
case "UpdateRole":
result = f"Role \"{eventDetail.get('requestParameters').get('roleName')}\" updated by \"{eventDetail.get('userIdentity').get('type')}\" \"{eventDetail.get('userIdentity').get('arn')}\""
case "DeleteRole":
result = f"Role \"{eventDetail.get('requestParameters').get('roleName')}\" deleted by \"{eventDetail.get('userIdentity').get('type')}\" \"{eventDetail.get('userIdentity').get('arn')}\""
# Console Login events
case "ConsoleLogin":
result = f"Root user console login from IP: \"{eventDetail.get('sourceIPAddress')}\""
# Default generic event
case _:
result = eventDetail
except:
result = event
return eventName, result
@tracer.capture_method
def send_slack_message(payload, webhook):
"""Send Slack message to passed in URL
"""
logger.debug(f"payload={payload} webhook={webhook}")
headers = {'Content-Type': 'application/json'}
return requests.post(webhook, data=json.dumps(payload), headers=headers)
@tracer.capture_method
def publish_to_sns(subject, message, topic):
logger.debug(f"subject={subject} message={message} topic={topic}")
# Send message to SNS
sns_client = boto3.client('sns')
return sns_client.publish(TopicArn=topic, Subject=subject, Message=message)
@tracer.capture_lambda_handler
@logger.inject_lambda_context(log_event=True)
@metrics.log_metrics(capture_cold_start_metric=True)
def lambda_handler(event, context):
try:
SNS_TOPIC_ARN = os.environ['SNS_TOPIC_ARN']
SLACK_WEBHOOK_URL = os.environ['SLACK_WEBHOOK_URL']
logger.debug(f"SNS_TOPIC_ARN={SNS_TOPIC_ARN}")
logger.debug(f"SLACK_WEBHOOK_URL={SLACK_WEBHOOK_URL}")
event_name, event_detail = parse_event(event)
slack_msg = f"{event_name}: {event_detail}"
logger.debug(f"slack_msg={slack_msg}")
slack_response = send_slack_message({"text": slack_msg}, SLACK_WEBHOOK_URL)
logger.debug(f"slack_response={slack_response}")
sns_subject = event_name
sns_msg = event_detail
sns_response = publish_to_sns(sns_subject, sns_msg, SNS_TOPIC_ARN)
logger.debug(f"sns_response={sns_response}")
except Exception as ex:
logger.exception("Exception hit")
        raise RuntimeError("Cannot process event") from ex
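If you want a quick feel for what parse_event produces, you can exercise it locally with a trimmed event like the sketch below. This assumes you run it from the account_event_handler directory so the handler module (app.py) can be imported, and that the function's dependencies (Powertools for AWS Lambda, boto3, and requests) are installed in your local environment.

import os

# Keep the Powertools Tracer quiet when running outside Lambda.
os.environ["POWERTOOLS_TRACE_DISABLED"] = "true"

from app import parse_event

# Trimmed event with placeholder values, shaped like the S3 DeleteBucket event shown earlier.
sample_event = {
    "detail": {
        "eventName": "DeleteBucket",
        "userIdentity": {
            "type": "AssumedRole",
            "arn": "arn:aws:sts::123456789012:assumed-role/Admin/example-user",
        },
        "requestParameters": {"bucketName": "my-example-bucket"},
    }
}

event_name, event_detail = parse_event(sample_event)
print(event_name)    # DeleteBucket
print(event_detail)  # Bucket "my-example-bucket" was deleted by "AssumedRole" "arn:aws:sts::..."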
Serverless Application Model (SAM)
In my example I am using the Serverless Application Model (SAM) as my Infrastructure as Code tool. If you click on the links you can find more information about SAM and its associated CLI (the SAM CLI).
The SAM template file (template.yaml) that sets up the AWS Lambda function for this project, along with the EventBridge rules, the SNS topic, and more, looks like this:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
serverless-account-watcher
SAM template to setup notifications for AWS account changes and notify interested parties.
Globals:
Function:
Timeout: 3
MemorySize: 128
Tracing: Active
LoggingConfig:
LogFormat: JSON
Environment:
Variables:
POWERTOOLS_SERVICE_NAME: AccountEventHandler
Parameters:
SlackWebhookURL:
Type: String
Description: URL to publish slack messages for account changes to. (NOTE Below is an invalid URL - need to change it)
Default: https://hooks.slack.com/triggers/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
Resources:
AccountEventHandler:
Type: AWS::Serverless::Function
Properties:
CodeUri: account_event_handler/
Handler: app.lambda_handler
Runtime: python3.12
Architectures:
- arm64
Layers:
- !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV2-Arm64:69
Environment:
Variables:
SNS_TOPIC_ARN: !Ref AccountNotificationSNSTopic
SLACK_WEBHOOK_URL: !Ref SlackWebhookURL
Policies:
- SNSPublishMessagePolicy:
TopicName: !GetAtt AccountNotificationSNSTopic.TopicName
Events:
TriggerForS3Events:
Type: EventBridgeRule
Properties:
RuleName: S3EventsRule
Pattern:
source:
- "aws.s3"
detail:
eventName:
- DeleteBucket
- DeleteBucketPolicy
- PutBucketPolicy
eventSource:
- "s3.amazonaws.com"
TriggerForIAMEvents:
Type: EventBridgeRule
Properties:
RuleName: IAMEventsRule
Pattern:
source:
- "aws.iam"
detail:
eventName:
- CreateAccessKey
- DeleteAccessKey
- UpdateRole
- DeleteRole
eventSource:
- "iam.amazonaws.com"
TriggerForSigninEvents:
Type: EventBridgeRule
Properties:
RuleName: SigninEventsRule
Pattern:
source:
- "aws.signin"
detail:
userIdentity:
type:
- Root
eventName:
- ConsoleLogin
eventSource:
- "signin.amazonaws.com"
AccountNotificationSNSTopic:
Type: "AWS::SNS::Topic"
Properties:
      DisplayName: "Account Notification SNS Topic"
Subscription:
- Endpoint: account_notifications@example.com
Protocol: email
TopicName: "AccountNotificationSNSTopic"
Outputs:
SNSTopic:
Description: SNS Topic that will receive account notifications
    Value: !Ref AccountNotificationSNSTopic
  LambdaFunctionARN:
Description: ARN of the Lambda function that will process account notifications
Value: !GetAtt AccountEventHandler.Arn
There are hundreds (and likely thousands) of AWS API events you can handle using this same approach. If you want to cover more of them, just update the SAM template file to set up rules for the additional events. If you want to parse out more details for the included events, or for other event types, you just need to add more code to the parse_event function above; a sketch of what that might look like follows below. In my case I just want to know when key events happen, and I will go find the full event details in CloudTrail when I want to inspect further.
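For example, to also get notified when an IAM user is deleted, the extra case added to parse_event's match statement might look like the following sketch (you would also add DeleteUser to the IAMEventsRule event name list in template.yaml; the field names follow the same CloudTrail conventions used by the existing IAM cases):

case "DeleteUser":
    result = f"User \"{eventDetail.get('requestParameters').get('userName')}\" deleted by \"{eventDetail.get('userIdentity').get('type')}\" \"{eventDetail.get('userIdentity').get('arn')}\""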
Examples of the project notifications
After deploying this project in your AWS account, you will receive an email at the address set up in the SAM template.yaml file asking you to confirm that you are OK with receiving emails from the Amazon SNS service. The email will look like this:
This confirmation only needs to be done once per email address and SNS topic.
Once you have confirmed, you can go ahead and try some of the actions this project is set up to listen for.
Here is an example of what will be seen in your email when you delete an S3 bucket.
Below is an example of the message seen in my Slack server:
Try the example in your AWS account
You can clone the GitHub repo and try this out in your own AWS account. You will need to replace a few values in the template.yaml file. The SNS subscription Endpoint, which has the value "account_notifications@example.com", will need to be updated to an email address you have access to. Also, the SlackWebhookURL Default value of https://hooks.slack.com/triggers/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 will have to be updated to a valid Slack webhook URL.
To install the project you will need to set up the AWS SAM CLI as described in the SAM section above, and then run "sam build" followed by "sam deploy".
Please let me know if you have any suggestions or problems trying out this example project.
For more articles from me please visit my blog at Darryl's World of Cloud or find me on X, LinkedIn, Medium, Dev.to, or the AWS Community.
For tons of great serverless content and discussions please join the Believe In Serverless community we have put together at this link: Believe In Serverless Community