Recap
We already have pretty good visibility into what happens with our function from the "outside". To understand the performance and the actual progress of the work inside the function, we installed the AWS Lambda Powertools layer, which allows us to do exactly that. Now it is time to start using it in the function code.
Python code
For my example I wrote very, and I mean it, very simple code. I also use it for a CI/CD lab, where I run some unit tests in the pipeline (to show the functionality), which is why the code consists of a few small functions.
So, here is the Lambda function code.
""" Demo lambda """
import json
def sumnumbers(a, b):
""" simple function """
result = a + b
return result
def mutiplynumbers(a, b):
""" simple function """
result = a * b
return result
def subnumbers(a, b):
""" simple function """
result = a - b
return result
def divisionnumbers(a, b):
""" simple function """
result = a / b
return result
def handler(event, context):
""" A very simple Lambda function """
numa = 456
numb = 234
firstoperation = sumnumbers(numa, numb)
secondoperation = mutiplynumbers(numa, numb)
thirdoperation = subnumbers(numa, numb)
fourthoperation = divisionnumbers(numa, numb)
response = "This is sum {}, multiply {}, substraction {}, division {}".format(
firstoperation,
secondoperation,
thirdoperation,
fourthoperation
)
return {
'statusCode': 200,
'body': json.dumps(response)
}
Decorate the handler
Tracing
Let's get to work. With AWS Lambda Powertools we can decorate each function to enable its features. However, there is one function which needs to be treated differently: as we all know, it is the handler function.
In the example there is a handler function which is triggered by Lambda. We will decorate it with Tracer first. In the end we will have all three elements enabled: tracing, metrics and logging (sounds like the three pillars of observability to you? It should :) ).
First, we need to import Tracer from the library (which is included as a layer).
from aws_lambda_powertools import Tracer
Initialize it outside the handler.
tracer = Tracer()
And finally, decorate the handler.
@tracer.capture_lambda_handler
def handler(event, context):
""" A very simple Lambda function """
Please remember, in order to make it operational, we need the previous step done (the previous article in this tutorial), which means a proper setup in the SAM template (see the sketch after the list below):
- Tracing needs to be enabled for the Lambda function: Tracing: Active
- The variable POWERTOOLS_SERVICE_NAME needs to be set, like in our example: POWERTOOLS_SERVICE_NAME: simpleFunctionService
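For reference, the relevant fragment of the SAM template could look roughly like this (a sketch only; the logical name, handler path and runtime are illustrative):

Resources:
  SimpleFunction:                  # illustrative logical name
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler         # illustrative handler path
      Runtime: python3.9           # illustrative runtime
      Tracing: Active              # enables X-Ray tracing for the function
      Environment:
        Variables:
          POWERTOOLS_SERVICE_NAME: simpleFunctionService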
Having that, we are able to capture:
- the cold start (an annotation is created about it),
- any response (or exception) generated by the handler; these will be included as tracing metadata.
What is the difference between an annotation and metadata? As the documentation states:
- annotations are key-value records associated with traces and indexed by X-Ray. You can work with them in X-Ray, create trace groups, etc.
- metadata are also key-value records associated with traces, but they are not indexed by X-Ray.
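In code, the difference is only which method we call on the tracer. A minimal sketch (the keys and values here are made up for illustration):

# annotation: indexed by X-Ray, can be used to filter and group traces
tracer.put_annotation("operationStatus", "success")
# metadata: not indexed, good for richer debugging context
tracer.put_metadata("operationDetails", {"numa": 456, "numb": 234})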
Functions
OK, we decorated the handler. What about the other functions? We can decorate them as well; let's do it for one of them as an example:
@tracer.capture_method
def sumnumbers(a, b):
    """ simple function """
    result = a + b
    return result
Now we are able to capture this function as well. Let's deploy it and see what happens in X-Ray.
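As in the previous parts of this tutorial, the deployment goes through SAM, for example:

sam build
sam deploy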
All right, we deployed the code. Please remember that if you run it through any pipeline and perform unit tests there, you have to have AWS Lambda Powertools installed in the build environment (the layer is only available inside Lambda). The simplest way is to install the library using pip:

pip install aws-lambda-powertools
This is exactly what I did. Of course, we could test the whole Lambda function, but that is not my focus here.
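For completeness, a minimal unit test for one of the decorated functions could look like this (a hypothetical test file; POWERTOOLS_TRACE_DISABLED turns tracing off so the test can run outside Lambda):

import os

# disable tracing before importing the module, so the
# Powertools decorators become no-ops outside of Lambda
os.environ["POWERTOOLS_TRACE_DISABLED"] = "1"

from app import sumnumbers  # 'app' is an assumed module name


def test_sumnumbers():
    assert sumnumbers(2, 3) == 5

Then pytest can execute it as one of the pipeline steps.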
OK, the code is deployed. After a few executions of the API, we should see something similar to this:
We can see that we have new information: the number of annotations. Let's jump to the trace with the highest number; in my case, 3.
When we switch to the Raw view, we get very detailed information. Let's focus on one important part: the ColdStart annotation. Yes, we now have this knowledge and we can use it in the future :)
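Because annotations are indexed, we can use this one right away, for example in an X-Ray filter expression that shows only the invocations which were cold starts (assuming the ColdStart key that Powertools sets):

annotation.ColdStart = true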
As the final element here, let's add some custom annotations to our functions.
@tracer.capture_method
def sumnumbers(a, b):
    """ simple function """
    result = a + b
    tracer.put_annotation("operationStatus", "success")
    tracer.put_annotation("operationResult", result)
    return result
Let's go to X-Ray again after the deployment. Click any trace, then click any of our small functions and go to the Annotations tab. You should see output similar to the picture below.