Configuring Microsoft Machine Learning Server 9.3 to Operationalize Analytics using ARM Templates
Microsoft Machine Learning Server 9.3 was released today.
Operationalization refers to the process of deploying R and Python models to Machine Learning Server in the form of web services and the subsequent consumption of these services within client applications to deliver business results.
We have now introduced a Command Line Interface (CLI) for Machine Learning Server operationalization, making configuration both easy and powerful. The CLI is similar to the Azure CLI and offers full parity with the Admin Utility from earlier releases.
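As a quick illustration, here is a minimal sketch of configuring a one-box setup with the admin CLI. The flag names follow the 9.3 documentation as we recall it; verify them with `--help` on your installation before scripting against them.

```bash
# Machine Learning Server 9.3 installs an 'az ml admin' command group on the
# server itself (separate from the cloud Azure CLI). Explore it with:
az ml admin --help

# Sketch of a one-box configuration (web node + compute node on this machine).
# Verify the exact flags with: az ml admin node setup --help
az ml admin node setup --onebox \
    --admin-password "<your-admin-password>" \
    --confirm-password "<your-admin-password>"
```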
In this release, we have also made web service request-response times significantly faster. You can now configure a dedicated session pool for a specific web service to preload models and code; this greatly reduces request-response time, especially when the models are large.
Installation is already taken care of when you use one of these Azure Marketplace images, which come with Machine Learning Server pre-installed:
- Microsoft Machine Learning Server 9.3.0 on Windows Server 2016
- Microsoft Machine Learning Server 9.3.0 on CentOS Linux 7.2
- Microsoft Machine Learning Server 9.3.0 on Ubuntu 16.04
- Microsoft Machine Learning Server 9.3.0 on Red Hat Enterprise Linux 7.2
We will use ARM template Custom Script Extensions and the new admin CLI to automate the One-Box and Enterprise configurations.
One-box configuration: As the name suggests, one web node and one compute node run on a single machine. This configuration is useful when you want to explore what it takes to operationalize R/Python analytics using Machine Learning Server. It is perfect for testing, proof-of-concepts, and small-scale prototyping, but might not be appropriate for production use.
Enterprise configuration: In this configuration, multiple nodes are configured across multiple machines along with other enterprise features such as high availability, Active Directory authentication, and secure connectivity. This configuration can be scaled up or down by adding or removing nodes.
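In both cases the automation follows the same idea: provision VMs from the Marketplace images above, then have a Custom Script Extension run the admin CLI once each VM is up. The snippet below is a rough, manual equivalent of that step using the Azure CLI, not the exact extension resource embedded in the templates; the resource group, VM name, and password are placeholders.

```bash
# Rough manual equivalent of what a template's Custom Script Extension does:
# run the admin CLI on a freshly provisioned Linux VM to configure a one-box setup.
# Resource group, VM name, and password below are placeholders.
az vm extension set \
    --resource-group my-mlserver-rg \
    --vm-name my-mlserver-vm \
    --publisher Microsoft.Azure.Extensions \
    --name CustomScript \
    --version 2.0 \
    --protected-settings '{"commandToExecute": "az ml admin node setup --onebox --admin-password <password> --confirm-password <password>"}'
# (On Windows VMs the corresponding extension is Microsoft.Compute/CustomScriptExtension.)
```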
We have created the following four ARM templates for easy deployment of the One-Box and Enterprise configurations:
- OneBox Configuration for Windows
- OneBox Configuration for Linux
- Dynamic Scaling Enterprise Configuration on Windows with Azure SQL DB as WebNode Backend
- Dynamic Scaling Enterprise Configuration on Linux with Azure Database for PostgreSQL as WebNode Backend
Just click the “Deploy to Azure” button on the GitHub page to deploy an ARM template. The architecture and instructions for connecting are provided on each template's GitHub page.
There are several other ways to deploy ARM templates: the Azure portal, .NET SDK, PowerShell, Azure CLI, Ruby, Visual Studio, the REST API, and the AzureSMR package. Refer to this article for detailed steps.
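For example, with the Azure CLI you can deploy a template straight from its raw GitHub URL. In the sketch below, the resource group name, template folder, and parameters file are placeholders; use the azuredeploy.json link and parameters from the template folder you picked in the repo.

```bash
# Create a resource group and deploy one of the templates from GitHub.
# <template-folder> is a placeholder for the folder of the template you chose.
az group create --name mlserver-rg --location eastus

az group deployment create \
    --resource-group mlserver-rg \
    --template-uri "https://raw.githubusercontent.com/Microsoft/microsoft-r/master/mlserver-arm-templates/<template-folder>/azuredeploy.json" \
    --parameters @azuredeploy.parameters.json
```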
For any issues or feature requests, please use the issues page on GitHub. Code contributions are also welcome; simply create a pull request.
Please watch this repo for new templates that might be added in the future: https://github.com/Microsoft/microsoft-r/tree/master/mlserver-arm-templates
USEFUL LINKS:
- Introducing Microsoft Machine Learning Server 9.3 Release
- What's new in Machine Learning Server
- Operationalize analytics with Machine Learning Server
- Configuring Machine Learning Server to operationalize analytics (One-Box Configuration)
- Configuring Machine Learning Server to operationalize analytics (Enterprise Configuration)
- Set up an auto-scale environment to operationalize your R analytics, with just ONE CLICK