A bit of background
In some applications, messaging systems are a must: you need different components of your application to communicate asynchronously, which means putting a message somewhere and hoping another piece of your application picks those messages up and processes them. This is fairly simple and not a problem by itself, but sometimes the piece of your application that should be picking up those messages fails, and you only notice when you see the consequences of the messages not being processed (which could be too late, with your customers already noticing the bad effects).
Presenting the scenario
Let's suppose we have a single Service Bus (a messaging/queue PaaS component you can consume in Azure) with a single topic and two subscriptions in that topic. This is a very simple scenario, but the idea is to show how to monitor the number of messages in each subscription to detect when our application is not processing them. One application will be publishing messages into the topic at a regular pace and will always be running; another application will process those messages, but a bit more slowly than the one pushing them, so the messages pile up a little (just to have a nice graph :) ). Then, for some wild reason, the processing application stops working, and we should see in the graph that the messages in that subscription start piling up, telling us they are not being processed.
This example doesn't include an alert, but in Azure Monitor it is possible to set an alert that triggers when a certain number of messages accumulates in a subscription.
The limitations currently present in Azure
At this moment we do have a metric in Service Bus that presents the count of messages, but it is at the topic level, not the subscription level. Even though the information is accessible through the Service Bus API, it is not (yet) possible to set alerts or build graphs based on it. If you are only interested in the messages at the topic level, those metrics (still in preview) are enough, but if you want to know which subscription is not being processed, this article is definitely for you.
What we will need
We will need:
- A Service Principal, which is called an "App Registration" in the Azure portal. This SPN is what we will use to interact with our Service Bus to read the message counters and to push the counters to the Log Analytics workspace.
- A Log Analytics workspace, which is a really nice resource where we can store metrics we extract from Azure, or even push our own metrics (like we are about to do with the count of messages per subscription).
- A Service Bus with one topic and two subscriptions in it. This is an oversimplified scenario, but it will help present the case, and you can then adjust it to your needs.
Configuring the SPN
The first item we are going to need is a Service Principal, this is created in the Azure Active Directory under the "App Registration" section.
The configuration is fairly simple: you provide a name and, once it is created, set up a password (client secret) for it, and you are done.
At this point, take note of the secret (it is generated automatically by Azure) because you will not be able to retrieve it again in the future. If for some reason you lose it or it gets compromised, you can always create a new secret and remove this one.
It is also very important to note the Application ID of this SPN, as the script will need it to identify itself to Azure as this SPN.
After this is completed, there is one extra, very important step: we need to grant this Service Principal permissions to operate on our resources, otherwise it will not even be able to read anything. For this example I will keep it simple by giving it "Contributor" on the subscription where the resources are. This is not ideal at all, but it is fine for this example; please set yours up with fewer permissions to limit the actions this SPN can perform.
To do this, simply go to the Azure Subscription pane, click "Access control (IAM)", and add a role assignment as follows.
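If you prefer to script these portal steps, the same SPN and role assignment can be created with the Az PowerShell module (recent versions). This is a minimal sketch, assuming you are already logged in with Connect-AzAccount; the display name and subscription scope are placeholders for this example.

```powershell
# Create the SPN ("App Registration"); Azure generates a secret for it.
$sp = New-AzADServicePrincipal -DisplayName "sb-message-counter"

# Take note of these two: the Application ID and the generated secret.
$sp.AppId
$sp.PasswordCredentials.SecretText

# Grant it "Contributor" on the subscription (again: too broad, scope yours down).
New-AzRoleAssignment -ApplicationId $sp.AppId `
    -RoleDefinitionName "Contributor" `
    -Scope "/subscriptions/<your-subscription-id>"
```

This does in two cmdlets what the portal steps above do by hand; it needs an interactive Azure login, so run it from a session that is already authenticated.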
Setup the Log Analytics Workspace
We will need a Log Analytics Workspace to store the results of our script and later build alerts on top of it.
Once you have the Log Analytics workspace deployed, you will need to obtain its:
- WorkspaceID
- Primary Key (or Secondary, it is the same)
These two items tell the script which Log Analytics workspace to store the information in and grant it permission to write there.
You will find this information in the "Advanced settings" blade of the Log Analytics.
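To keep everything in one place, the rest of the script assumes a handful of variables holding the identifiers we collected so far. The names match the ones used by the functions later in this article; the values are placeholders you need to replace with your own.

```powershell
# SPN ("App Registration") details, used to authenticate against Azure
$TenantId      = "<your-azure-ad-tenant-id>"
$client_id     = "<your-spn-application-id>"
$client_secret = "<your-spn-secret>"

# Log Analytics workspace details, used to sign and post the data
$WorkspaceId = "<your-log-analytics-workspace-id>"
$sharedKey   = "<your-log-analytics-primary-key>"

# Service Bus details, used to build the management API URLs
$Subscription      = "<your-azure-subscription-id>"
$ResourceGroupName = "<your-resource-group>"
$ServiceBusName    = "<your-service-bus-namespace>"

# Name of the custom table that will be created in Log Analytics
$logType = "ServiceBusMessageCount"
```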
Then we can go to Logs in our Log Analytics workspace and check which tables are stored there. These are the defaults; you will see later that we add a new one to store our information.
Let's start coding
Obtain a valid header for our calls to Azure API
Ok, now that we have the SPN setup and the Log Analytics configured we can start with the needed code to work with them.
Our first piece of code will create the needed header to interact with the Azure API to perform calls.
function GetAzureAuthHeader {
    # Requests an OAuth token for the SPN; expects $TenantId, $client_id
    # and $client_secret to be defined in the script scope.
    $token = Invoke-RestMethod `
        -Uri "https://login.microsoftonline.com/$TenantId/oauth2/token?api-version=1.0" `
        -Method Post `
        -Body @{
            "grant_type"    = "client_credentials";
            "resource"      = "https://management.core.windows.net/";
            "client_id"     = $client_id;
            "client_secret" = $client_secret;
        };
    $header = @{
        'Authorization' = $("Bearer " + $token.access_token);
    };
    Write-Host "Logging in to Azure finished.";
    return $header;
}
Posting information to Log Analytics
In order to post information to your Log Analytics workspace you will need a signature. This signature is built from information about your workspace and the content you are posting. Here is a simple function that calculates the signature for you.
Function BuildSignature ($WorkspaceId, $sharedKey, $date, $contentLength, $method, $contentType, $resource) {
    $xHeaders = "x-ms-date:" + $date
    $stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
    $bytesToHash = [Text.Encoding]::UTF8.GetBytes($stringToHash)
    $keyBytes = [Convert]::FromBase64String($sharedKey)
    $sha256 = New-Object System.Security.Cryptography.HMACSHA256
    $sha256.Key = $keyBytes
    $calculatedHash = $sha256.ComputeHash($bytesToHash)
    $encodedHash = [Convert]::ToBase64String($calculatedHash)
    $authorization = 'SharedKey {0}:{1}' -f $WorkspaceId, $encodedHash
    return $authorization
}
Now that you have your signature, you need to post the information to Log Analytics; we will wrap that in a function to make it easier.
Function PostLogAnalyticsData($WorkspaceId, $sharedKey, $body, $logType) {
    $method = "POST"
    $contentType = "application/json"
    $resource = "/api/logs"
    $rfc1123date = [DateTime]::UtcNow.ToString("r")
    $contentLength = $body.Length
    $signature = BuildSignature `
        -WorkspaceId $WorkspaceId `
        -sharedKey $sharedKey `
        -date $rfc1123date `
        -contentLength $contentLength `
        -method $method `
        -contentType $contentType `
        -resource $resource
    $uri = "https://" + $WorkspaceId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
    $TimeStampField = "TimeGenerated"
    $headers = @{
        "Authorization"        = $signature;
        "Log-Type"             = $logType;
        "x-ms-date"            = $rfc1123date;
        "time-generated-field" = $TimeStampField;
    }
    # Invoke-RestMethod only returns the (empty) response body, so use
    # Invoke-WebRequest to get the HTTP status code back.
    $response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType -Headers $headers -Body $body -UseBasicParsing
    return $response.StatusCode
}
With these two functions you have everything you need to post your information to Log Analytics; now we need to obtain that information and push it.
Populate the subscriptions with some messages for our example
In this example we will simply add a few messages to the subscriptions to show what the script does; it is not intended to be a real-life scenario.
In our Service Bus we will have a single topic called "simpletopic" with two subscriptions called "Subscription1" and "Subscription2" (how creative). We will leave 2 messages in "Subscription1" and 3 messages in "Subscription2".
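To put those test messages in the topic you can use the Service Bus data-plane REST API, which authenticates with a SAS token rather than with the SPN. Below is a small sketch for this; the namespace, key name and key are placeholders, and the key comes from a Shared access policy on the namespace.

```powershell
# Build a SAS token for the Service Bus REST API: an HMAC-SHA256 over
# the URL-encoded resource URI plus an expiry timestamp.
function New-ServiceBusSasToken ($resourceUri, $keyName, $key) {
    $expiry = [DateTimeOffset]::UtcNow.AddHours(1).ToUnixTimeSeconds()
    $stringToSign = [Uri]::EscapeDataString($resourceUri) + "`n" + $expiry
    $hmac = New-Object System.Security.Cryptography.HMACSHA256
    $hmac.Key = [Text.Encoding]::UTF8.GetBytes($key)
    $signature = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToSign)))
    return "SharedAccessSignature sr=" + [Uri]::EscapeDataString($resourceUri) +
        "&sig=" + [Uri]::EscapeDataString($signature) + "&se=" + $expiry + "&skn=" + $keyName
}

# Example: send a couple of test messages to "simpletopic".
# $sbUri = "https://<your-namespace>.servicebus.windows.net/simpletopic"
# $sas = New-ServiceBusSasToken -resourceUri $sbUri -keyName "RootManageSharedAccessKey" -key "<your-key>"
# 1..2 | ForEach-Object {
#     Invoke-RestMethod -Uri "$sbUri/messages" -Method Post `
#         -Headers @{ "Authorization" = $sas } `
#         -ContentType "application/json" -Body "{ ""number"": $_ }"
# }
```

The commented lines show the POST to the topic's /messages endpoint; every subscription in the topic then receives its own copy of each message.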
Alright, all the setup is done and we have the three basic functions we need to retrieve the information and post it to Log Analytics. Let's put it all together!
Querying the Service Bus to get the message counts
Let's first get the topics of our Service Bus
$Header = GetAzureAuthHeader
$ServiceBusTopicQueryURL = "https://management.azure.com/subscriptions/$Subscription/resourceGroups/$ResourceGroupName/providers/Microsoft.ServiceBus/namespaces/$ServiceBusName/topics?api-version=2017-04-01"
$ServiceBusTopics = $(Invoke-RestMethod -Uri $ServiceBusTopicQueryURL -Headers $Header -Method Get -ErrorAction Stop)
In $ServiceBusTopics.value we will have all the topic names found; in our case it is only "simpletopic".
Then we make our next call to retrieve the subscriptions for that topic. In a real-world example you would do this in a foreach loop over all the topics returned.
$ServiceBusSubscriptionQueryURL = "https://management.azure.com/subscriptions/$Subscription/resourceGroups/$ResourceGroupName/providers/Microsoft.ServiceBus/namespaces/$ServiceBusName/topics/simpletopic/subscriptions?api-version=2017-04-01"
$ServiceBusSubscriptions = $(Invoke-RestMethod -Uri $ServiceBusSubscriptionQueryURL -Headers $Header -Method Get -ErrorAction Stop)
This will return the two subscriptions; here is a picture of the output.
You can already see the message counts in "properties"; they will look like this:
There you have the count of messages as "messageCount", which is the sum of all messages in the subscription regardless of whether they are active, dead-letter, scheduled, transfer or transfer-dead-letter. But in "countDetails" we DO have them split by message status, and this is what we are looking for!
Now we want to do something for every subscription we found: yes, you are right, store the counts in Log Analytics! Let's do that using the "PostLogAnalyticsData" function we created earlier.
For ActiveMessages
$jsonActive = @{
    "ServiceBusTopicName"                      = $Topic
    "ServiceBusSubscriptionName"               = $ServiceBusSubscription.name
    "ServiceBusSubscriptionActiveMessageCount" = $ServiceBusSubscription.properties.countDetails.activeMessageCount
}
$json = $jsonActive | ConvertTo-Json
# Submit the data to the API endpoint for active messages
PostLogAnalyticsData -WorkspaceId $WorkspaceId -sharedKey $sharedKey -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType
The tricky part here is the "logType": this is the name our table in Log Analytics will get, and it can be any name we want. We then provide a JSON body with the information we want to store; keep in mind this is entirely up to you. I am choosing to store the topic name, the subscription name and the number of active messages, but you can store whatever you want.
Executing all together
When you put all this together you will want to run it on a fixed schedule to count the messages regularly. Keep in mind that the information you push to Log Analytics can take up to 5 minutes to show up there, so it is not instant.
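Putting the pieces together, the recurring job could look like the following sketch. It assumes the three functions (GetAzureAuthHeader, BuildSignature, PostLogAnalyticsData) and the variables from the earlier sections are defined in the same script; the property names come from the API responses shown above.

```powershell
$Header = GetAzureAuthHeader
$baseUrl = "https://management.azure.com/subscriptions/$Subscription/resourceGroups/$ResourceGroupName/providers/Microsoft.ServiceBus/namespaces/$ServiceBusName"

# Walk every topic, then every subscription in it, and post the active count.
$ServiceBusTopics = Invoke-RestMethod -Uri "$baseUrl/topics?api-version=2017-04-01" -Headers $Header -Method Get -ErrorAction Stop
foreach ($Topic in $ServiceBusTopics.value.name) {
    $Subs = Invoke-RestMethod -Uri "$baseUrl/topics/$Topic/subscriptions?api-version=2017-04-01" -Headers $Header -Method Get -ErrorAction Stop
    foreach ($ServiceBusSubscription in $Subs.value) {
        $json = @{
            "ServiceBusTopicName"                      = $Topic
            "ServiceBusSubscriptionName"               = $ServiceBusSubscription.name
            "ServiceBusSubscriptionActiveMessageCount" = $ServiceBusSubscription.properties.countDetails.activeMessageCount
        } | ConvertTo-Json
        PostLogAnalyticsData -WorkspaceId $WorkspaceId -sharedKey $sharedKey `
            -body ([System.Text.Encoding]::UTF8.GetBytes($json)) -logType $logType
    }
}
```

Scheduling is up to you: an Azure Automation runbook, a scheduled task or a cron job all work, as long as it runs against Azure on a regular interval.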
After your first push to Log Analytics you will see a "Custom Logs" section appear, and below it a table with the name you gave plus the "_CL" suffix that Log Analytics appends to custom logs ("ServiceBusMessageCount_CL" in our example).
A few minutes after our first execution, we can see the numbers there.
OK! The information is in there. From here on you can build a graph and place it on a dashboard, create an Azure Monitor alert, or consume this information from a third-party application (like Grafana).
Final notes
I am putting this in a GitHub repository as a single file so it is easy for you to clone and build on top of.
GitHub: https://github.com/javiermarasco/devopsjourney/tree/master/MessageCounter
If you liked this post, please let me know; and if you find missing points or something to improve, please share your thoughts with me. I am always happy to get input and improve.
Thank you for reading the article and I wish you to have a very productive day!