InferenceConfig Class
- Inheritance
- builtins.object → InferenceConfig
Constructor
InferenceConfig(entry_script, runtime=None, conda_file=None, extra_docker_file_steps=None, source_directory=None, enable_gpu=None, description=None, base_image=None, base_image_registry=None, cuda_version=None, environment=None)
Parameters
Name | Description
---|---
entry_script (required) | The path to a local file that contains the code to run for the image.
runtime | The runtime to use for the image. Currently supported runtimes are 'spark-py' and 'python'. Default value: None
conda_file | The path to a local file containing a conda environment definition to use for the image. Default value: None
extra_docker_file_steps | The path to a local file containing additional Docker steps to run when setting up the image. Default value: None
source_directory | The path to the folder that contains all files needed to create the image. Default value: None
enable_gpu | Indicates whether to enable GPU support in the image. The GPU image must be used on Microsoft Azure services such as Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. Defaults to False. Default value: None
description | A description to give this image. Default value: None
base_image | A custom image to be used as the base image. If no base image is given, the base image is chosen based on the given runtime parameter. Default value: None
base_image_registry | The image registry that contains the base image. Default value: None
cuda_version | The version of CUDA to install for images that need GPU support. The GPU image must be used on Microsoft Azure services such as Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. Supported versions are 9.0, 9.1, and 10.0. Default value: None
environment | An environment object to use for the deployment. The environment doesn't have to be registered. Provide either this parameter or the other parameters, but not both. The individual parameters will NOT serve as an override for the environment object. Default value: None
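The file passed as entry_script conventionally defines an init() function, run once when the container starts, and a run() function, invoked per scoring request. A minimal sketch of such a score script, assuming a JSON request body with a top-level "data" key (the exact contract depends on your model and deployment target):

```python
import json

def init():
    # Runs once when the container starts. A real script would load the
    # model here, typically from the AZUREML_MODEL_DIR directory.
    global model
    model = None  # placeholder: load your model here

def run(raw_data):
    # Invoked for each scoring request; raw_data is the raw request body.
    try:
        data = json.loads(raw_data)['data']
        # Placeholder prediction: report the number of inputs received.
        return {'n_inputs': len(data)}
    except Exception as exc:
        return {'error': str(exc)}
```

Returning a JSON-serializable object from run() lets the web service hand the result straight back to the caller.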
Remarks
The following sample shows how to create an InferenceConfig object and use it to deploy a model.

```python
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AciWebservice

# Assumes `ws` (a Workspace), `model` (a registered Model), and
# `environment` (an Environment) have already been created.
service_name = 'my-custom-env-service'

inference_config = InferenceConfig(entry_script='score.py', environment=environment)
aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

service = Model.deploy(workspace=ws,
                       name=service_name,
                       models=[model],
                       inference_config=inference_config,
                       deployment_config=aci_config,
                       overwrite=True)
service.wait_for_deployment(show_output=True)
```
Variables
Name | Description
---|---
entry_script | The path to a local file that contains the code to run for the image.
runtime | The runtime to use for the image. Currently supported runtimes are 'spark-py' and 'python'.
conda_file | The path to a local file containing a conda environment definition to use for the image.
extra_docker_file_steps | The path to a local file containing additional Docker steps to run when setting up the image.
source_directory | The path to the folder that contains all files needed to create the image.
enable_gpu | Indicates whether to enable GPU support in the image. The GPU image must be used on Microsoft Azure services such as Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service.
description | A description to give this image.
base_image | A custom image to be used as the base image. If no base image is given, the base image is chosen based on the given runtime parameter.
base_image_registry | The image registry that contains the base image.
cuda_version | The version of CUDA to install for images that need GPU support. The GPU image must be used on Microsoft Azure services such as Azure Container Instances, Azure Machine Learning Compute, Azure Virtual Machines, and Azure Kubernetes Service. Supported versions are 9.0, 9.1, and 10.0.
environment | An environment object to use for the deployment. The environment doesn't have to be registered. Provide either this parameter or the other parameters, but not both. The individual parameters will NOT serve as an override for the environment object.
Methods
Name | Description
---|---
build_create_payload | Build the creation payload for the container image.
build_profile_payload | Build the profiling payload for the model package.
validate_configuration | Check that the specified configuration values are valid. Raises a WebserviceException if validation fails.
validation_script_content | Check that the syntax of the score script is valid with ast.parse. Raises a UserErrorException if validation fails.
build_create_payload
Build the creation payload for the Container image.
build_create_payload(workspace, name, model_ids)
Parameters
Name | Description
---|---
workspace (required) | The workspace object to create the image in.
name (required) | The name of the image.
model_ids (required) | A list of model IDs to package into the image.
Returns
Type | Description
---|---
| The container image creation payload.
build_profile_payload
Build the profiling payload for the Model package.
build_profile_payload(profile_name, input_data=None, workspace=None, models=None, dataset_id=None, container_resource_requirements=None, description=None)
Parameters
Name | Description
---|---
profile_name (required) | The name of the profiling run.
input_data | The input data for profiling. Default value: None
workspace | A Workspace object in which to profile the model. Default value: None
models | A list of model objects. Can be an empty list. Default value: None
dataset_id | ID associated with the dataset containing input data for the profiling run. Default value: None
container_resource_requirements | Container resource requirements for the largest instance to which the model is to be deployed. Default value: None
description | Description to be associated with the profiling run. Default value: None
Returns
Type | Description
---|---
| The model profile payload.
Exceptions
Type | Description |
---|---|
validate_configuration
Check that the specified configuration values are valid.
Raises a WebserviceException if validation fails.
validate_configuration()
Exceptions

Type | Description
---|---
WebserviceException | Raised if validation fails.
validation_script_content
Check that the syntax of the score script is valid with ast.parse.
Raises a UserErrorException if validation fails.
validation_script_content()
Exceptions

Type | Description
---|---
UserErrorException | Raised if validation fails.
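The kind of syntax check validation_script_content performs can be approximated with the standard library alone. The sketch below illustrates the ast.parse technique named above; it is not the actual azureml implementation, and the helper name is hypothetical:

```python
import ast

def check_script_syntax(source: str) -> None:
    """Raise ValueError if the script source is not valid Python syntax."""
    try:
        # ast.parse compiles the source to an AST without executing it,
        # so syntax errors surface before any deployment work begins.
        ast.parse(source)
    except SyntaxError as exc:
        raise ValueError(f'entry script has a syntax error: {exc}') from exc

# Usage sketch: a well-formed score script body passes silently.
check_script_syntax('def run(raw_data):\n    return raw_data')
```

Catching malformed entry scripts this way fails fast locally, rather than after an image build.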