Microsoft released Azure Machine Learning Workbench as a public preview at the Ignite conference on September 25, 2017. The tool is a new addition to the Azure ecosystem, which also includes the machine learning tool introduced three years earlier, Azure Machine Learning Studio, and Microsoft has said it plans to keep both products. When asked about the two, Microsoft said that the earlier tool, Azure Machine Learning Studio, is targeted at developers who want to add machine learning to their current applications; it is an easy-to-use tool that doesn’t require a person to be a trained data scientist. Azure Machine Learning Workbench is targeted at data scientists who want to bring in other libraries, such as TensorFlow for Python, and delve deep into the data.
Microsoft Moves into Machine Learning Management
Microsoft intends Azure Machine Learning Workbench to be more than a tool for machine learning analysis. It is part of a system to manage and monitor the deployment of machine learning solutions with Azure Machine Learning Model Management. The management aspects are part of the application installation. The Azure Machine Learning Workbench download is available only by creating an account in Microsoft’s Azure environment, where a Machine Learning Model Management resource will be created as part of the install. Within this resource, you will be directed to create a virtual environment in Azure where you will deploy and manage machine learning models.
This move into managing machine learning components follows a pattern first seen in Microsoft’s on-premises data science functionality. First, Microsoft helped companies manage the deployment of R code with SQL Server 2016, which includes the ability to move R code into SQL Server. Providing this capability decreased the time it took to implement a data science solution, because the code could be deployed easily without needing to be rewritten or embedded in another application. SQL Server 2017 expanded on this idea by allowing Python code to be deployed into SQL Server as well. With the cloud service Model Management, Microsoft is hoping to centralize implementation so that all machine learning services created can be managed in one place.
Hybrid Cloud, Desktop, and Python
While you must have an Azure account to use Machine Learning Workbench, the application is designed to run locally on either a Mac or a Windows computer. There is a developer edition of the tool so that you can learn it without incurring a bill, as is also the case with the previous product, Azure Machine Learning. The Machine Learning Workbench download must be accessed from within an Azure account and is installed on your local computer. When you run the application from your computer, it will prompt you to log in to your Azure account to load Azure Machine Learning Workbench.
The application is designed to use and create Python code. Azure Machine Learning Workbench makes no accommodation for machine learning components written in R, just Python. If you have created machine learning components in R, they can be incorporated into Azure Machine Learning Model Management by creating web services which encapsulate the R code. The R code does not interface with Workbench, but it can be made part of the projects managed in Azure. While it is possible to create a web service for R with the earlier product, Azure Machine Learning, there is no direct way to include R in Azure Machine Learning Workbench. There are a number of sample templates to get started with Python, including the ubiquitous Iris dataset, linear regression, and several others. Once a project is created, you can use your favorite IDE; the project contains Python code which can be read anywhere.
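The sample templates generate ordinary Python scripts. As a rough illustration of the kind of script the Iris template demonstrates (the template's actual code may differ), a minimal scikit-learn version looks like this:

```python
# A minimal Iris classification script in the spirit of the
# Workbench sample templates (not the template's exact code).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Load the ubiquitous Iris dataset: 150 flowers, 4 measurements each
iris = load_iris()
X, y = iris.data, iris.target

# Fit a simple logistic regression classifier
model = LogisticRegression(max_iter=200)
model.fit(X, y)

# Predict the species of one sample flower
print(iris.target_names[model.predict([[5.1, 3.5, 1.4, 0.2]])[0]])
```

Because the project is plain Python, a script like this runs the same way inside Workbench or in any editor.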
Staying within the Machine Learning Workbench application gives you access to arguably one of the neatest parts of the product, the data parser. This tool, originally code-named Project Pendleton, is designed to be an intuitive way to modify the contents of data, even better than the previous leader in data parsing, Power BI’s Power Query.
You can select the option “Derive Column by Example” or “Split Column by Example” and then start typing in a new column. For example, if you want to separate a column which contains both the date and the time, right-click on that column, select “Split Column by Example”, and type the date in the new column provided; the application will immediately determine that you want two columns and create them. A date column and a time column will be created for you after you type in a single date. After the sample columns have been created, you can approve the change, or reject it if it does not work the way you want.
Like Power Query, each change made to the data is recorded in a window called Steps on the right side of the application window. When you are done modifying the data, right-click on the Data Preparations source icon, which in my example is called UFO Clean, and the UI changes made to the data are used to generate Python code that performs the same changes. The generated Python code can be used to transform the source data programmatically.
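The code Workbench generates uses its own data-preparation machinery, but the date/time split described above is equivalent to a few lines of pandas. Here is a hedged sketch with a made-up column name, not the generated code itself:

```python
import pandas as pd

# Hypothetical sightings data with a combined date-and-time column
df = pd.DataFrame({"Sighted": ["2017-09-25 21:30:00", "2017-10-01 03:15:00"]})

# Equivalent of "Split Column by Example": derive separate
# date and time columns from the combined column
parsed = pd.to_datetime(df["Sighted"])
df["Date"] = parsed.dt.date
df["Time"] = parsed.dt.time

print(df)
```

The point of the UI is that you never have to write these lines yourself; typing one example value is enough for the tool to infer the split.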
The next step in the process is to write the Python code needed to evaluate the data and create a model, which in my case would determine where and when you are most likely to see the next UFO based on the dataset I have included in my project. Unlike its counterpart Azure Machine Learning, Azure Machine Learning Workbench requires you to know how to write the code needed to create a machine learning analysis in Python. You could write that Python code in any Python editor. If you use Azure Machine Learning Workbench, the Python library scikit-learn is installed as part of the application. Other libraries which you may want to use, such as the common library matplotlib, you will need to load within Azure Machine Learning Workbench.
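To show the kind of code you would write yourself at this step, here is a sketch of a train/test evaluation with scikit-learn. The features and label are synthetic stand-ins, not columns from my UFO dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in features (e.g. encoded location, month, hour)
# and a binary label; a real project would load prepared data instead.
rng = np.random.default_rng(0)
X = rng.random((200, 3))
y = (X[:, 0] + X[:, 1] > 1).astype(int)

# Hold out a test set so the model is scored on unseen rows
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Since scikit-learn ships with Workbench, a script like this runs without loading any additional libraries.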
Web Service: How Azure Machine Learning Workbench Solutions are Deployed
To deploy a package, you will need to export the completed model as a serialized Python object using the Python module pickle. This creates a file with the suffix .pkl, which is the file you will deploy. Azure Machine Learning Workbench expects you to deploy via Docker containers or by creating an Azure cluster. You will need to register the Docker container in Machine Learning Model Management for it to be deployed.
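Serializing the trained model is a single call to pickle. In this sketch the model is a placeholder for whatever estimator your project produced:

```python
import pickle

from sklearn.linear_model import LinearRegression

# Placeholder: any trained estimator from your project would go here
model = LinearRegression().fit([[0], [1], [2]], [0, 1, 2])

# Export the model as a serialized Python object; the resulting
# .pkl file is what gets deployed inside the Docker container
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# The deployed web service loads it back the same way to score requests
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)

print(restored.predict([[3]]))
```

The web service wrapper then only has to unpickle the file once at startup and call `predict` for each incoming request.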