Leonard Püttmann

Host your ML model as a serverless function on Azure

Building machine learning models is fun. But to deliver real business value, machine learning models need to be put into production. Often, that's easier said than done.

The logical solution is to make the model accessible via an endpoint that is hosted in the cloud. One option for this is to host the machine learning model on a virtual machine in the cloud. With this, however, comes the need to manage the VM. This can quickly become a hassle, especially when you need to serve a lot of endpoints.

One interesting alternative is to run your machine learning model as a serverless function in the cloud. While I think the name serverless is not quite fitting, because there is still a server running the code, the concept itself is amazing. Instead of managing hardware resources yourself, you just provide the code you would like to run and the provider handles the rest. Serverless functions are also dirt cheap. The code of the serverless function is usually packed up inside a Docker image, which is spun up every time you call your model. This means that you don't pay for a resource 24/7; you only pay as you go.

For today's article, we will be using serverless functions on Microsoft's Azure cloud. The first million executions per month are free there. So if your needs for hosting an ML model are modest, you should be able to serve your model without paying a single cent.

**In a nutshell, serverless functions are great if:**

  • The model is small and efficient.
  • You only need the model occasionally or at regular intervals.
  • Response times of a couple of seconds are fine for you.
  • You don't want to spend a lot of money (or don't have any).

Deploying serverless functions on Azure is super easy. In this article, we are going to deploy a decision tree model that was trained on data from a wind turbine to predict energy production. You'll need at least a basic understanding of cloud computing concepts as well as some knowledge of Python. However, if you have any questions along the way, feel free to ask them in the comments. To follow along, you'll need:

  • An active Azure subscription
  • VS Code as well as some Azure extensions
  • Python 3.8 or higher
  • Optional: Azure Functions Core Tools

Before we start building a serverless function app, we'll first take a look at our machine learning model as well as the data we trained the model with.

The whole code + data of the project is accessible here.

Predicting energy production of a wind turbine

The machine learning model meant for deployment is a simple decision tree that was trained on two and a half years of data from a wind turbine. The goal of the model is to predict the energy production of the wind turbine given information such as the ambient temperature, the technical condition of the turbine and, of course, the current wind speed. The dataset is really interesting and I encourage you to dive deeper into it. When looking at the energy produced by the wind turbine, we can see that between July and September the energy output seems to be the highest, often at the maximum, while production is much lower for the rest of the year.
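If you want to recreate this monthly view yourself, a quick resample does the trick. Below is a minimal sketch, not the original notebook code: the file name turbine_data.csv and the column name ActivePower are placeholders for whatever your copy of the dataset uses.

import pandas as pd
import matplotlib.pyplot as plt

# Load the turbine data with a datetime index (file name is a placeholder)
df = pd.read_csv('turbine_data.csv', parse_dates=True, index_col=0)

# Average the produced energy per month to make the seasonal pattern visible
df['ActivePower'].resample('M').mean().plot(title='Average energy production per month')
plt.show()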


To predict the energy production of the wind turbine, we are going to use a decision tree. While a decision tree will most likely yield worse results than, say, a random forest, an XGBoost regressor or a neural net, a simple model like this also has upsides. For one, it's very fast and lightweight, making it ideal to run as a serverless function. It's also very interpretable, making it easy to understand how exactly the model arrived at its prediction. And when the underlying data is good, decision trees can be quite powerful. This is why a decision tree, together with a logistic regression, is my go-to model when first tackling a problem.

That being said, we probably won't see stunning results from this model. Before training, I shifted the data so that the model tries to predict the energy production for the next 10 minutes. The further the prediction lies in the future, the more inaccurate I expect the model to become, but for demonstration purposes this should be fine.
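For context, the training itself boils down to a few lines. The sketch below is not the exact training script: it reuses the df from the plotting snippet above, assumes ActivePower is the target column and picks max_depth purely for illustration; shifting by one row corresponds to one 10-minute interval.

import joblib
from sklearn.tree import DecisionTreeRegressor

# Shift the target by one 10-minute step: features at time t predict production at t + 10 min
df['target'] = df['ActivePower'].shift(-1)
df = df.dropna()

X = df.drop(columns=['ActivePower', 'target'])
y = df['target']

# Fit a small decision tree (max_depth chosen for illustration)
dtr = DecisionTreeRegressor(max_depth=8)
dtr.fit(X, y)

# Persist the model so it can be packaged with the function code
joblib.dump(dtr, 'windpred_model.pkl')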

Creating an initial function app in the Azure portal


Time to get started with the creation of the serverless function. Go to the Azure portal and click on Function App to create a new function app. Choose to publish the function as code and select Python 3.9 as the runtime stack and version. I am going to deploy the function in North Europe, as it is the closest region to me, but feel free to choose whichever region is closest to you. As for the plan type, we want the Consumption (Serverless) plan. After that, hit Review + create to create the function app.


While it is technically possible to write and deploy code directly from the Azure portal, it's much easier to do this from VS Code. Let's jump into VS Code!

Installing VS Code extensions

Deploying from VS Code has some benefits compared to writing the code directly in Azure. First, we can debug and try out our function locally, which I think is great. Second, when deploying from VS Code, you have more options to customize your deployment. For example, we can ship the machine learning model as a .pkl file packed up with the function, which is exactly what we are interested in.


Go into the extensions section of VS Code and install the Azure Tools as well as the Azure Functions extension. To test functions locally, you also need to install the Azure Functions Core Tools. This last step is optional, but I highly recommend testing your functions locally before deploying them. Make sure to log in to your Azure account as well.

Initializing an empty function in VS Code

Once all the extensions are installed, open the directory you want to work in, hit F1 and search for "Azure Functions: create new function...". We've already created a function app in Azure, but that will only serve as the shell to which we will later deploy our actual function.


In the first step, you can select the programming language. I'm going to use Python, but many other languages are available, such as Java, C# or JavaScript. After that, you can select an interpreter for the virtual environment in which your function will run locally.

You can also select a template for your function, which determines how your function gets triggered. For example, you can create a timer trigger to run your function at a given interval or time of day. A function might also get triggered whenever there is a new entry in data storage. I am going to use the HTTP trigger, which runs the function every time it receives a POST request.

After that, you can set the name of the function as well as the access rights. If you need to access other tools from Azure, I recommend that you set the auth settings to "Admin".

Everything is set now, and all the needed components will automatically be created for us in the directory where we initialized the function. Time to write some code!
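For reference, the generated function.json for an HTTP-triggered Python function usually looks roughly like this; the exact contents depend on the options you picked, for example the authorization level:

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "admin",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}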


Writing our function

Inside your directory, you should find a folder with the name that you gave to your function. In there, you'll find a file called __init__.py. All the code that we want to run in the serverless function goes into this file. The file also contains a Python function called main, which is crucial for the serverless function. You may add more Python functions, but the main function has to be there!

import logging
import json

import joblib
import pandas as pd
import azure.functions as func

# Load the decision tree model once, when the function app starts up
dtr = joblib.load('windpred_model.pkl')

def main(req: func.HttpRequest) -> func.HttpResponse:

    logging.info('Python HTTP trigger function processed a request.')

    # Parse the received body as JSON; the client sends a JSON-encoded string
    try:
        data = req.get_json()
        if isinstance(data, str):
            data = json.loads(data)
    except ValueError:
        data = None

    # If the data is not empty, convert it to a pandas DataFrame
    if data is not None:
        response = []
        df = pd.DataFrame(data)
        df = df.apply(pd.to_numeric, errors='coerce')

        # Create a new prediction for every entry in the DataFrame
        for i in range(df.shape[0]):
            entry = df.iloc[[i]]
            y_hat = dtr.predict(entry)

            # Store the result in a dict; cast to float so it is JSON serializable
            results = {
                'energy_production': float(y_hat[0])
            }

            # Append the result to the response list
            response.append(results)

        # Return the predictions as a JSON response
        return func.HttpResponse(
            json.dumps(response),
            mimetype='application/json'
        )

    else:
        # If no valid JSON data is received, return an error response
        return func.HttpResponse(
             "Please pass a properly formatted JSON object to the API",
             status_code=400
        )
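Before deploying, it's worth calling the function locally once the local runtime is running (for example via the VS Code debugger or the Core Tools). The snippet below is a sketch under a few assumptions: 'windpred' stands in for whatever you named your function, 7071 is the Core Tools default port, and data.json is a small test payload in the format we put together later in this article.

import json
import requests

# Default local endpoint of the Functions host; 'windpred' is a placeholder name
local_url = 'http://localhost:7071/api/windpred'

# Load a few test rows saved as JSON
with open('data.json') as f:
    data = json.load(f)

# The function expects a JSON-encoded string in the request body
resp = requests.post(local_url, json=json.dumps(data))
print(resp.text)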

Deploying the function

Now we have everything in place for our serverless function and we are ready to deploy. Click on the Azure extension and open your subscription in the Resources section. Under Function App you'll find the function app that we previously created in the Azure portal. Right-click on it and click on "Deploy to Function App...". This will automatically push the function we configured in VS Code to Azure.
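One detail to double-check before deploying: when the app is published, Azure installs the Python packages listed in the requirements.txt that was generated in the project folder. For this function it needs to contain roughly the following; scikit-learn is required to unpickle the decision tree, even though it is never imported directly:

azure-functions
pandas
scikit-learn
joblib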


Trying out our serverless function

After a couple of minutes, the function app should be deployed and ready to use. Because our function is triggered by HTTP requests, we can test out our app with the help of the Python requests library.

To be able to send the data, we need to have it in JSON format. Luckily, pandas can save a DataFrame to JSON by calling .to_json() on it. I have taken a subset of the test data, which looks like this as JSON:

{"AmbientTemperatue":{"1553427000000":38.039763,"1547726400000":29.4031836,"1524628800000":33.7847183784,"1558305600000":33.6842655556,"1545313800000":25.8910933},"BearingShaftTemperature":{"1553427000000":44.663637,"1547726400000":40.0134959,"1524628800000":47.901935875,"1558305600000":40.8214605,"1545313800000":42.1678136},"Blade1PitchAngle":{"1553427000000":45.7368925375,"1547726400000":45.7368925375,"1524628800000":45.7368925375,"1558305600000":34.3081334429,"1545313800000":45.7368925375},"Blade2PitchAngle":{"1553427000000":43.6993571429,"1547726400000":43.6993571429,"1524628800000":43.6993571429,"1558305600000":32.3317821077,"1545313800000":43.6993571429},"Blade3PitchAngle":{"1553427000000":43.6993571429,"1547726400000":43.6993571429,"1524628800000":43.6993571429,"1558305600000":32.3317821077,"1545313800000":43.6993571429},"GearboxBearingTemperature":{"1553427000000":65.5114258,"1547726400000":61.7971398,"1524628800000":77.119133,"1558305600000":49.5740933,"1545313800000":69.9417784},"GearboxOilTemperature":{"1553427000000":59.7925489,"1547726400000":56.3766701,"1524628800000":64.204399375,"1558305600000":54.2150616667,"1545313800000":58.2121131},"GeneratorRPM":{"1553427000000":1053.90176,"1547726400000":1030.01957,"1524628800000":1751.7155625,"1558305600000":115.3844747778,"1545313800000":1433.95605},"GeneratorWinding1Temperature":{"1553427000000":66.001735,"1547726400000":56.8643122,"1524628800000":113.29087075,"1558305600000":57.8011734444,"1545313800000":70.1916634},"GeneratorWinding2Temperature":{"1553427000000":65.1331282,"1547726400000":56.0960214,"1524628800000":112.59216325,"1558305600000":57.2828797778,"1545313800000":69.546118},"HubTemperature":{"1553427000000":43.996185,"1547726400000":36.0191189,"1524628800000":42.996094375,"1558305600000":39.5969365,"1545313800000":34.0113608},"MainBoxTemperature":{"1553427000000":49.83125,"1547726400000":39.95625,"1524628800000":41.89842625,"1558305600000":43.1738124,"1545313800000":36.2125},"NacellePosition":{"1553427000000":135.25,"1547726400000":173.0,"1524628800000":60.75,"1558305600000":183.5,"1545313800000":172.0},"ReactivePower":{"1553427000000":49.440134,"1547726400000":18.04336296,"1524628800000":-9.3880610405,"1558305600000":-10.1450863947,"1545313800000":0.1142242497},"RotorRPM":{"1553427000000":9.4539536,"1547726400000":9.2260374,"1524628800000":15.708135375,"1558305600000":1.0182516089,"1545313800000":12.841184},"TurbineStatus":{"1553427000000":2.0,"1547726400000":2.0,"1524628800000":0.0,"1558305600000":2.0,"1545313800000":2.0},"WindDirection":{"1553427000000":135.25,"1547726400000":173.0,"1524628800000":60.75,"1558305600000":183.5,"1545313800000":172.0},"WindSpeed":{"1553427000000":4.49369983,"1547726400000":3.690674815,"1524628800000":2.5373755703,"1558305600000":2.7541209632,"1545313800000":7.19549971}}
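For reference, a payload like this can be written out straight from pandas. A small sketch, assuming X_test is a DataFrame holding the test features:

# X_test is assumed to be a pandas DataFrame with the test features
json_data = X_test.head(5).to_json()

# Save the payload so it can be loaded again when calling the function
with open('data.json', 'w') as f:
    f.write(json_data)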

As you can see, there is a lot of information that we are passing to the model, all packed up inside the JSON file. We can load the JSON data and call our function like this:

import json
import requests

# Load the JSON data
with open('data.json') as f:
    data = json.load(f)

# Send a request to our serverless function
URL = 'URL OF MODEL GOES HERE'
headers = {'Content-type': 'application/json'}
req = requests.post(URL, headers=headers, json=json.dumps(data))
print(req.text)


You can get the URL of the function in the Azure portal in the Code + Test section of your function. The output is the predicted energy production for the next 10 minutes.
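Since the function returns a JSON list, the predictions can be turned back into Python objects with one call; a short sketch reusing the req object from the request above:

# Parse the JSON response and print every predicted value
predictions = json.loads(req.text)
for p in predictions:
    print(p['energy_production'])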

I hope you found this small article about deploying machine learning models as serverless functions useful. If you have any thoughts or questions you would like to share, let me know in the comments! :-)
