List, get, and consume web services in Python

Important

This content is being retired and may not be updated in the future. The support for Machine Learning Server will end on July 1, 2022. For more information, see What's happening to Machine Learning Server?

Applies to: Machine Learning Server

This article is for data scientists who want to learn how to find, examine, and consume the analytic web services hosted in Machine Learning Server using Python. Web services offer fast execution and scoring of arbitrary Python or R code and models. Learn more about web services. This article assumes that you are proficient in Python.

After a web service has been published, any authenticated user can list, examine, and consume that web service. You can do so directly in Python using the functions in the azureml-model-management-sdk package, which is installed with Machine Learning Server. To list, examine, or consume the web service outside of Python, use the RESTful APIs, which provide direct programmatic access to a service's lifecycle, or use the Swagger-generated client libraries in your preferred language.

By default, web service operations are available to authenticated users. However, your administrator can also assign roles (RBAC) to further control the permissions around web services.

Important

Web service definition: In Machine Learning Server, your R and Python code and models can be deployed as web services. Exposed as web services hosted in Machine Learning Server, these models and code can be accessed and consumed in R, Python, programmatically using REST APIs, or using Swagger generated client libraries. Web services can be deployed from one platform and consumed on another. Learn more...

Requirements

Before you can use the web service management functions in the azureml-model-management-sdk Python package, you must have access to an instance of Machine Learning Server that is configured to host web services, and you must authenticate with Machine Learning Server in Python.
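The following is a minimal sketch of authenticating and creating the DeployClient object that the rest of this article refers to as client. The host URL and the admin credentials are placeholders; replace them with the values for your own server.

from azureml.deploy import DeployClient
from azureml.deploy.server import MLServer

## -- Placeholder connection details: replace with your server URL and credentials
host = 'http://localhost:12800'
ctx = ('admin', 'YOUR_PASSWORD')

## -- Authenticate and get a client for managing and consuming web services
client = DeployClient(host, use=MLServer, auth=ctx)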

Find and list web services

Any authenticated user can retrieve a list of web services using the list_services function on the DeployClient object. You can use arguments to return a specific web service or all labeled versions of a given web service.

## -- Return metadata for all services hosted on this server
client.list_services()

## -- Return metadata for all versions of service "myService" 
client.list_services('myService')

## -- Return metadata for a specific version "v1.0" of service "myService" 
client.list_services('myService', version='v1.0')

Once you find the service you want, use the get_service function to retrieve the service object for consumption.

Retrieve and examine service objects

Authenticated users can retrieve the web service object in order to get the client stub for consuming that service. Use the get_service function from the azureml-model-management-sdk package to retrieve the object.

After the object is returned, you can use a help function to explore the published service, such as print(help(myServiceObject)). You can call the help function on any of the azureml-model-management-sdk functions, even those that are dynamically generated, to learn more about them.

You can also print the capabilities that define the service holdings to see what the service can do and how it should be consumed. Service holdings include the service name, version, descriptions, inputs, outputs, and the name of the function to be consumed. Learn more about capabilities...

You can use the supported public functions to interact with the service object.

Example code:

# -- List all versions of the service 'myService'--
client.list_services('myService')

# -- Retrieve the service object for myService v2.0
svc = client.get_service('myService', version='v2.0')

# -- Learn more about that service.
print(help(svc))

# -- View the service object's capabilities/schema
svc.capabilities()
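Building on that example, the sketch below pulls a few individual holdings out of the dictionary returned by capabilities(). The key names used here ('name', 'version', 'inputs', 'outputs') are assumptions based on a typical service and may vary; the 'swagger' key is used later in this article.

# -- Inspect individual service holdings (key names are assumptions) --
cap = svc.capabilities()
print(cap['name'], cap['version'])
print('Inputs: ', cap['inputs'])
print('Outputs:', cap['outputs'])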

Note

You can only see the code stored within a web service if you have published the web service or are assigned to the "Owner" role. To learn more about roles in your organization, contact your Machine Learning Server administrator.

Consume web services

Web services are published to facilitate the consumption and integration of the operationalized models and code into systems and applications. Whenever the web service is deployed or updated, a Swagger-based JSON file, which defines the service, is automatically generated.

When you publish a service, you can let people know that it is ready for consumption by sharing its name and version with them. Web services are consumed by data scientists, quality engineers, and application developers. They can be consumed in Python, in R, or via the REST APIs.

Users can consume the service directly using a single consumption call, which is referred to as a "Request Response" approach. Learn about the various approaches to consuming web services.

Consuming in Python

After authenticating with Machine Learning Server, users can also interact with and consume services using other functions in the azureml-model-management-sdk Python package.

For a full example of request-response consumption, see this Jupyter notebook.

# Let's call the function `manualTransmission` in this service
res = svc.manualTransmission(120, 2.8)

# Pluck out the named output `answer`.
print(res.output('answer'))

# Get `swagger.json` that defines the service.
cap = svc.capabilities()
swagger_URL = cap['swagger']
print(swagger_URL)

# Print the contents of the swagger file.
print(svc.swagger())

Sharing the Swagger with application developers

Application developers can call and integrate a web service into their applications using the service-specific Swagger-based JSON file and by providing any required inputs to that service. The Swagger-based JSON file is used to generate client libraries for integration. Read "How to integrate web services and authentication into your application" for more details.

The easiest way to share the Swagger file with an application developer is to use the code shown in this Jupyter notebook. Alternatively, the application developer can request the file as an authenticated user, with an active bearer token in the request header, using this API:

GET /api/{{service-name}}/{{service-version}}/swagger.json
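As a sketch of both options, the snippet below writes the Swagger document to a local file using the service object from the earlier examples, and then requests the same file over REST. The host URL, bearer token, and the service name and version are placeholders; replace them with your own values.

import requests

# -- Option 1: write the service's swagger.json to disk so it can be shared
# -- (svc is the service object retrieved earlier with get_service)
with open('swagger.json', 'w') as f:
    f.write(svc.swagger())

# -- Option 2: request the file directly with a bearer token in the header
# -- (host, token, service name, and version below are placeholders)
host = 'http://localhost:12800'
headers = {'Authorization': 'Bearer YOUR_ACCESS_TOKEN'}
resp = requests.get('{0}/api/myService/v2.0/swagger.json'.format(host), headers=headers)
print(resp.json())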

See also