
Azure ML: Deploying your model as a Web Service

Seamlessly deploy any ML model! 🚀

Deploying a model means packaging it as a service and making it accessible to others.

Today we will see how to leverage the Azure ML platform to deploy a machine learning model as a web service.

The deployment can be done in two ways:

  • Using UI (No-Code approach)

  • Using Azure's Python SDK (code available on GitHub)

We will take up both approaches and walk through each of them. Let's go! 🚀

Prerequisites:

  • An Azure subscription (a one-month free trial is available for all).

  • Basic knowledge of Python and machine learning.

Let's get started:

Step 1: Install Azure Python SDK

!pip install azureml-sdk 

Step 2: Create a Machine Learning Workspace:

from azureml.core import Workspace

ws = Workspace.create(
    name='AzureML_Deployment_WS',
    subscription_id='2f##b8*****2',  # your Azure subscription ID (redacted here)
    resource_group='AzureML_Deployment_RG',
    create_resource_group=True,
    location='eastus',
)

Step 3: Register the Model

from azureml.core.model import Model

model = Model.register(ws, model_name="classifier", model_path="saved_model_v1.pkl")

The model can also be registered and viewed through the UI.
Inside your ML workspace, launch ML Studio; the Models panel shown below appears.
Check this out ⬇️

Step 4: Prepare artefacts

There are three artefacts that you need for deploying your model:

  • Model weights: the saved model file (which you have already used to register the model).

  • Scoring script: a Python script that front-ends your model weights; it receives the request, loads the model for prediction, and sends the predictions back as the response. It is an inference script specific to your model.

# score.py

import json
import joblib
from azureml.core import Model


def init():
    global model
    model_name = "classifier"  # must match the name used at registration in Step 3
    path = Model.get_model_path(model_name)
    model = joblib.load(path)


def run(data):
    try:
        data = json.loads(data)
        result = model.predict(data["data"])
        return {"data": result.tolist(), "message": "Prediction successful"}
    except Exception as e:
        # str(e): the exception object itself is not JSON serializable
        return {"data": str(e), "message": "Failed to predict"}

  • Environment.yml file: A file specifying all your project dependencies.

# environment.yml

channels:
- anaconda
- conda-forge
dependencies:
- python=3.6.2
- pip:
  - pandas==1.1.5
  - azureml-defaults
  - joblib==0.17.0
  - scikit-learn==0.23.2
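Before deploying, it helps to sanity-check the scoring logic locally. The sketch below reproduces score.py's run() contract with a stand-in model (StubModel is a hypothetical stub, not the real registered classifier), so you can verify the JSON request/response shape without an Azure workspace:

```python
import json

class StubModel:
    """Stand-in for the loaded model; mimics scikit-learn's predict()."""
    def predict(self, rows):
        # Hypothetical rule, just for the local check:
        # label 1 if the first feature is greater than 2.5
        return [1 if row[0] > 2.5 else 0 for row in rows]

model = StubModel()

def run(data):
    # Same contract as score.py: JSON string in, dict out
    try:
        data = json.loads(data)
        result = model.predict(data["data"])
        return {"data": list(result), "message": "Prediction successful"}
    except Exception as e:
        return {"data": str(e), "message": "Failed to predict"}

payload = json.dumps({"data": [[5.1, 3.5, 1.4, 0.2], [1.0, 2.0, 3.0, 4.0]]})
print(run(payload))  # {'data': [1, 0], 'message': 'Prediction successful'}
```

If this round-trip works, the same payload shape will work against the deployed service.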

Step 5: Deploy your model:

First, we will take the no-code approach.
It's fairly simple; check it out ⬇️

No-code | UI based deployment

Let's take a look at deployment using the Azure Python SDK.
Now that we have all the artefacts ready, the following steps need to be carried out ⬇️

  • Set up the inference config: Create a source_dir directory at the same level as your deployment notebook, put your score.py and env.yml in it, and run the code below ⬇️

from azureml.core.model import InferenceConfig

inference_config = InferenceConfig(
    conda_file='./env.yml',
    source_directory="./source_dir",
    entry_script="./score.py",
)
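For reference, the InferenceConfig above assumes a layout like this (entry_script and conda_file are resolved relative to source_directory; the notebook name is just an example):

```
./
├── deployment.ipynb        # your deployment notebook
└── source_dir/
    ├── score.py            # entry script from Step 4
    └── env.yml             # conda environment file from Step 4
```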

  • Set up the deployment config: Here we specify the CPU and memory requirements of each container hosting your model server; run the code below ⬇️

from azureml.core.webservice import AciWebservice

deployment_config = AciWebservice.deploy_configuration(
    cpu_cores=2, memory_gb=3, auth_enabled=True
)

  • Deploy the service:

from azureml.core.model import Model

service = Model.deploy(
    ws, # The instance of workspace created above
    "myservice",
    [Model(ws, 'classifier')],  # the model registered in Step 3
    inference_config,
    deployment_config,
    overwrite=True,
)
service.wait_for_deployment(show_output=True)

Step 6: Access the deployed model endpoint:

Once your model is deployed, go to the Endpoints tab and find the REST endpoint where your model is served and ready for inference.
Check this out ⬇️

Service Endpoint.
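To call the service from code, you need the scoring URI and, since we set auth_enabled=True, one of the service keys. A minimal sketch, assuming the service object from Step 5 is still in scope (build_request is a hypothetical helper, not part of the SDK):

```python
import json

def build_request(scoring_uri, key, rows):
    """Assemble URL, headers, and body for a POST to the ACI endpoint."""
    headers = {
        "Content-Type": "application/json",
        # auth_enabled=True -> pass a service key as a Bearer token
        "Authorization": f"Bearer {key}",
    }
    body = json.dumps({"data": rows})  # the "data" field score.py expects
    return scoring_uri, headers, body

# Against the live service you would then run:
# import requests
# key, _ = service.get_keys()  # primary and secondary keys
# uri, headers, body = build_request(service.scoring_uri, key,
#                                    [[5.1, 3.5, 1.4, 0.2]])
# print(requests.post(uri, data=body, headers=headers).json())
```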

You can find all the code and artefacts in my GitHub repo.

Find me on X (Twitter) for daily tutorials on ML & MLOps.

Cheers! 🥂 
