Introduction
As mentioned in my previous post, the Azure Digital Twins (ADT) service is a new way to build next-generation IoT solutions. In the first post I showed you in a video how to manage ADT instances with the ADT Explorer. In the second post I showed how to do mostly the same, but using Postman and the ADT REST API.
ADT has control plane APIs and data plane APIs. The latter are used to manage the elements in an ADT instance. To use these APIs, Microsoft published a .NET (C#) SDK. An SDK is a convenient way to manage instances, since you can easily create applications for your digital twins. If for any reason you prefer another language, like Java, JavaScript or Python, you need to generate your own SDK.
In this post I describe how to autogenerate a Python SDK using the AutoRest tool and a Swagger file.
An example of a generated SDK can be found in this repo: https://github.com/pauldj54/ADTDataPlane
Autorest
AutoRest is a tool to generate client libraries for accessing RESTful web services. It is available on GitHub: https://github.com/Azure/autorest
Note: AutoRest requires Node.js, and version 10.x is recommended. If you need multiple versions of Node.js, I recommend the tool nvm-windows, which can be downloaded from this link: https://github.com/coreybutler/nvm-windows/releases
I will use PowerShell with admin rights for the next steps.
Now let’s select the desired Node.js version:

The steps shown in the figure above are the following:
- Print the currently installed versions
- Since only v12.x was installed, install the 10.x version
- List the installed versions again to confirm
- Switch the active version to 10.x
- Confirm the change.
Note the “*” prefix marking the selected Node.js version.
To install AutoRest, I followed the steps from the official Azure documentation https://docs.microsoft.com/en-us/azure/digital-twins/how-to-create-custom-sdks:
npm install -g autorest
# run using command 'autorest'
autorest
Generate our own SDK
1. To generate our own SDK, the Swagger file with the ADT data plane API definition is needed; it can be downloaded from here . Please be aware that the “examples” folder is also required; if it is missing, AutoRest throws an error.
2. Place the downloads in a directory on your computer. I created a folder under my git directory with the name “adt_dataplane_api”.
3. Open a console, navigate to the directory created in the previous step and issue the following command:
autorest --input-file=digitaltwins.json --python --output-folder=ADTApi --add-credentials --azure-arm --package-name=ADTApi --package-version=1.0
Basically you point AutoRest to the Swagger file (digitaltwins.json), select Python as the output language, and specify an output folder, a package name and other details.

4. If everything ran successfully, you should see the following output:

Converting our own SDK to a python package
It is very convenient to convert the generated SDK into a Python package and include it in your environments as needed. To do so, I followed these steps:
1. Create a setup.py file in the “ADTDataPlane” directory:
from setuptools import setup, find_packages

setup(
    name='adtdataplane',
    version='0.1',
    description='ADT data plane python SDK',
    author='Azure Digital Twin autogenerated using autorest by Paul Hernandez',
    url='https://github.com/pauldj54/ADTDataPlane',
    packages=find_packages(exclude=['tests*'])
)
2. Add the auto-generated code to git. I pushed it to my GitHub repository using these directions.
3. Now we are ready to install our newly generated package 😊

4. Verify the installation:

Manage an ADT Instance
Once we have our SDK available as a Python package, it is time to test it. For this post I registered an AAD application (app registration), and I am using the same ADT instance as in the previous post.
1. Find your Application (client) ID and Directory (tenant) ID:

2. Create a client secret and write it down:

3. Grant the role “Azure Digital Twins Owner (Preview)” to the registered app:

4. Create a config file in the root directory (or another directory) within your Python project and name it, for instance, settings.json . Hint: secrets and other sensitive information will be stored in this file, so make sure you don’t push it to git or your source control.
The file should look like this:
{
    "client_id": "<your-client-id>",
    "tenant_id": "<your-tenant-id>",
    "adt_instance_url": "https://management.azure.com",
    "secret": "<your-secret>",
    "endpoint": "https://<your-adt-instance>.api.neu.digitaltwins.azure.net",
    "scope": ["https://digitaltwins.azure.net/.default"],
    "authority": "https://login.microsoftonline.com/<your-tenant-id>"
}
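Before wiring these settings into the code below, a quick sanity check on the loaded configuration can save some debugging time later. This is just an illustrative sketch: the key names mirror the settings.json above, and the values here are placeholders, not real credentials.

```python
import json

# Keys the code in the following steps actually reads from settings.json
REQUIRED_KEYS = {"client_id", "tenant_id", "secret", "endpoint", "scope", "authority"}

def validate_settings(config):
    """Return the set of required keys missing from a loaded settings dict."""
    return REQUIRED_KEYS - set(config)

# Placeholder settings mirroring the settings.json structure above
sample = json.loads("""
{
    "client_id": "<your-client-id>",
    "tenant_id": "<your-tenant-id>",
    "adt_instance_url": "https://management.azure.com",
    "secret": "<your-secret>",
    "endpoint": "https://<your-adt-instance>.api.neu.digitaltwins.azure.net",
    "scope": ["https://digitaltwins.azure.net/.default"],
    "authority": "https://login.microsoftonline.com/<your-tenant-id>"
}
""")

missing = validate_settings(sample)
print(sorted(missing))  # an empty list means all required keys are present
```

With a real settings.json you would pass the dict returned by json.load(f) instead of the inline sample.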
5. Create a DTDL sample file to test the code. I created the file “SampleModel.json”, quite similar to the one in the official documentation:
{
    "@id": "dtmi:com:contoso:SampleModelPy;1",
    "@type": "Interface",
    "displayName": "SampleModelPy",
    "contents": [
        {
            "@type": "Relationship",
            "name": "contains"
        },
        {
            "@type": "Property",
            "name": "data",
            "schema": "string"
        }
    ],
    "@context": "dtmi:dtdl:context;2"
}
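Since the generated client only surfaces DTDL problems as service errors after the upload, a small client-side sanity check of the model document can help. This is a minimal sketch, not a full DTDL validator; the checked members are simply the required top-level ones visible in the interface above:

```python
def check_dtdl_interface(doc):
    """Very small sanity check for a DTDL v2 interface document."""
    problems = []
    # These top-level members are present in every DTDL v2 interface
    for key in ("@id", "@type", "@context"):
        if key not in doc:
            problems.append("missing " + key)
    if doc.get("@type") != "Interface":
        problems.append("@type should be 'Interface'")
    if not str(doc.get("@id", "")).startswith("dtmi:"):
        problems.append("@id should start with 'dtmi:'")
    return problems

# The SampleModel.json content from above
model = {
    "@id": "dtmi:com:contoso:SampleModelPy;1",
    "@type": "Interface",
    "displayName": "SampleModelPy",
    "contents": [
        {"@type": "Relationship", "name": "contains"},
        {"@type": "Property", "name": "data", "schema": "string"}
    ],
    "@context": "dtmi:dtdl:context;2"
}

print(check_dtdl_interface(model))  # an empty list means the basics look fine
```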
6. Import the following modules and install them if required:
import msal
from msrestazure.azure_active_directory import AADTokenCredentials
import adtdataplane
import logging
from azure.mgmt.consumption.models.error_response import ErrorResponseException
import json
msal is the Microsoft Authentication Library (MSAL) and is the preferred library according to the documentation. AADTokenCredentials is the class used to build the credentials, and adtdataplane is our generated SDK. Some other packages are required by the rest of the code.
7. Load the config file and create a confidential client application as follows:
# Load Config file
with open(r"settings.json") as f:
    config = json.load(f)

# Create a preferably long-lived app instance that maintains a token cache.
app = msal.ConfidentialClientApplication(
    config["client_id"], authority=config["authority"],
    client_credential=config["secret"],
    # token_cache=...  # Default cache is in memory only.
    # You can learn how to use SerializableTokenCache from
    # https://msal-python.rtfd.io/en/latest/#msal.SerializableTokenCache
)
8. I used this code snippet from the Azure Python SDK examples to obtain a token:
# The pattern to acquire a token looks like this.
result = None

# First, the code looks up a token from the cache.
# Because we're looking for a token for the current app, not for a user,
# use None for the account parameter.
result = app.acquire_token_silent(config["scope"], account=None)

if not result:
    logging.info("No suitable token exists in cache. Let's get a new one from AAD.")
    result = app.acquire_token_for_client(scopes=config["scope"])

if "access_token" in result:
    # Call a protected API with the access token.
    print(result["token_type"], result["access_token"])
else:
    print(result.get("error"))
    print(result.get("error_description"))
    print(result.get("correlation_id"))  # You might need this when reporting a bug.
9. I transform the acquired token into AAD token credentials and create an SDK client:
credentials = AADTokenCredentials(result)

try:
    client = adtdataplane.AzureDigitalTwinsAPI(credentials=credentials, base_url=config['endpoint'])
    logging.info("Service client created – ready to go")
except ValueError as err:
    print('Client creation failed with error: {0}'.format(err))
10. Now we can load a DTDL model:
# load models
with open(r"models\SampleModel.json") as f:
    dtdl = json.load(f)

dtdl_list = []
dtdl_list.append(dtdl)

try:
    response = client.digital_twin_models.add(model=dtdl_list, raw=True)
    print(response)
except adtdataplane.models.ErrorResponseException as e:
    print(e)
Please note the model location and modify the path accordingly.
11. Verify that the model was created:
# Verify the model was created
response = client.digital_twin_models.get_by_id('dtmi:com:contoso:SampleModelPy;1')
print(response)
You should see something like this:

12. We can also verify that the model was correctly uploaded using the ADT Explorer:


The entire Python code:
import msal
from msrestazure.azure_active_directory import AADTokenCredentials
import adtdataplane
import logging
from azure.mgmt.consumption.models.error_response import ErrorResponseException
import json

# Load Config file
with open(r"settings.json") as f:
    config = json.load(f)

# Create a preferably long-lived app instance that maintains a token cache.
app = msal.ConfidentialClientApplication(
    config["client_id"], authority=config["authority"],
    client_credential=config["secret"],
    # token_cache=...  # Default cache is in memory only.
    # You can learn how to use SerializableTokenCache from
    # https://msal-python.rtfd.io/en/latest/#msal.SerializableTokenCache
)

# The pattern to acquire a token looks like this.
result = None

# First, the code looks up a token from the cache.
# Because we're looking for a token for the current app, not for a user,
# use None for the account parameter.
result = app.acquire_token_silent(config["scope"], account=None)

if not result:
    logging.info("No suitable token exists in cache. Let's get a new one from AAD.")
    result = app.acquire_token_for_client(scopes=config["scope"])

if "access_token" in result:
    # Call a protected API with the access token.
    print(result["token_type"], result["access_token"])
else:
    print(result.get("error"))
    print(result.get("error_description"))
    print(result.get("correlation_id"))  # You might need this when reporting a bug.

credentials = AADTokenCredentials(result)

try:
    client = adtdataplane.AzureDigitalTwinsAPI(credentials=credentials, base_url=config['endpoint'])
    logging.info("Service client created – ready to go")
except ValueError as err:
    print('Client creation failed with error: {0}'.format(err))

# load models
with open(r"models\SampleModel.json") as f:
    dtdl = json.load(f)

dtdl_list = []
dtdl_list.append(dtdl)

try:
    response = client.digital_twin_models.add(model=dtdl_list, raw=True)
    print(response)
except adtdataplane.models.ErrorResponseException as e:
    print(e)

# Verify the model was created
response = client.digital_twin_models.get_by_id('dtmi:com:contoso:SampleModelPy;1')
print(response)
Next steps
Even though the scenario presented in this post is extremely basic, you now have a Python SDK to manage ADT instances. The benefit becomes more obvious when you have some data sets and want to populate your twins. In a future post I would like to show you how to write an ingestion program using the Python SDK and a data source, most probably a CSV or a JSON file. Let's see what the open data world offers us.
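As a small teaser for that ingestion idea, here is a rough sketch of how CSV rows could be turned into twin payloads for the SampleModelPy model from this post. The payload shape follows the DTDL model above; the commented-out client.digital_twins.add call is only my assumption about the generated client's surface and is not verified here:

```python
import csv
import io

def row_to_twin(row, model_id="dtmi:com:contoso:SampleModelPy;1"):
    """Build a twin payload for SampleModelPy from one CSV row."""
    return {
        "$metadata": {"$model": model_id},  # which model this twin instantiates
        "data": row["data"],                # the single property in SampleModelPy
    }

# Inline CSV standing in for a real data source
csv_text = "twin_id,data\nsensor-1,hello\nsensor-2,world\n"

twins = {}
for row in csv.DictReader(io.StringIO(csv_text)):
    twins[row["twin_id"]] = row_to_twin(row)
    # With a live instance and the client from above, the upload might look like:
    # client.digital_twins.add(row["twin_id"], twins[row["twin_id"]])

print(sorted(twins))  # the twin ids parsed from the CSV
```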
References
- Create custom SDKs for Azure Digital Twins using AutoRest: https://docs.microsoft.com/en-us/azure/digital-twins/how-to-create-custom-sdks
- Use the Azure Digital Twins APIs and SDKs: https://docs.microsoft.com/en-us/azure/digital-twins/how-to-use-apis-sdks
- Python-packaging: https://python-packaging.readthedocs.io/en/latest/minimal.html