How to Better Evaluate the Goodness-of-Fit of Regressions

Most of the content of this post is platform-agnostic. Since I'm currently using Azure Machine Learning, I'll take it as the starting point.

It’s quite simple for an average Azure Machine Learning user to create a regression experiment, feed data into it and get the predicted values. It’s also easy to obtain some metrics to evaluate the implemented model. Once you have them, the following questions arise:

  • How can I interpret these numbers?
  • Are these metrics enough to assess the goodness-of-fit of the model?

This post provides you with the statistical foundation behind these metrics and with some additional tools that will help you better understand how well the model has fitted the data. These tools are implemented in an R script you can simply copy and paste into an Execute R Script module.
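For reference, the standard regression metrics Azure ML reports (MAE, RMSE, Relative Absolute Error, Relative Squared Error and the Coefficient of Determination) can be reproduced by hand. The original post uses an R script; the following is an equivalent minimal sketch in Python, with the function name `regression_metrics` chosen here for illustration:

```python
import math

def regression_metrics(actual, predicted):
    """Compute common regression goodness-of-fit metrics from two
    equal-length sequences of actual and predicted values."""
    n = len(actual)
    mean_actual = sum(actual) / n
    abs_err = [abs(a - p) for a, p in zip(actual, predicted)]
    sq_err = [(a - p) ** 2 for a, p in zip(actual, predicted)]
    mae = sum(abs_err) / n                       # Mean Absolute Error
    rmse = math.sqrt(sum(sq_err) / n)            # Root Mean Squared Error
    # Relative errors normalise against a naive model that always
    # predicts the mean of the actual values.
    rae = sum(abs_err) / sum(abs(a - mean_actual) for a in actual)
    rse = sum(sq_err) / sum((a - mean_actual) ** 2 for a in actual)
    r2 = 1 - rse                                 # Coefficient of Determination
    return {"MAE": mae, "RMSE": rmse, "RAE": rae, "RSE": rse, "R2": r2}
```

A perfect fit yields zero errors and R² = 1, while a model no better than always predicting the mean yields RAE = RSE = 1 and R² = 0 — which is why these numbers alone are rarely enough, and residual diagnostics are worth adding.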

Read the rest of the article here:

How to bulk copy Azure ML Experiments from a Workspace to another one or do a Backup of them in Physical Files

Hi all,

today I want to tackle the issue of bulk copying multiple Azure ML experiments at once between different workspaces.

Maybe you already know that you can partially solve this task by copying the experiments one at a time, but you need access to both workspaces with your user. If you don’t have it, you can simply share a workspace in this way:

Once you can see both workspaces in your Azure Machine Learning Studio, you can simply select an experiment and then click “Copy to workspace”:


and then choose the destination workspace:


As you can imagine… you can’t simply select more than one experiment and copy them all at once:


Now suppose you have dozens of experiments and you simply don’t want to waste your time copying them all manually, or you can’t get access to a shared workspace for security reasons. Is there a way to bulk copy your experiments? I’ll show you how to do that with a few lines of PowerShell.


Fixed some SQL Server Partition Management Utility Bugs

The SQL Server Partition Management Utility is one of the best tools for managing partition-switch operations. It is a command-line tool and can be integrated into an SSIS package or used to generate the T-SQL scripts needed in a typical “sliding window” partition management scenario. A blog post that shows how to use this tool is this one.

In my case, I wanted to speed up the loading of a big partitioned fact table through an SSIS package (that calls two child packages). This package runs multiple instances of the tool in order to load several staging tables in parallel. Each staging table corresponds to a partition of the fact table. After each staging table is loaded, the SSIS package loads the target fact table using a partition-switch operation against that staging table.
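To make the pattern concrete, here is a minimal T-SQL sketch of a single partition-switch step. The table names and partition number are hypothetical, chosen only for illustration; they are not taken from the original package:

```sql
-- Hypothetical names: dbo.FactSales is the partitioned fact table;
-- dbo.Staging_P5 is a staging table with the same schema, indexes and
-- check constraints, on the same filegroup as partition 5.

-- The staging table is loaded first (in parallel with the others by
-- the SSIS package), then switched into the fact table. The switch is
-- a metadata-only operation, which is what makes this pattern fast.
ALTER TABLE dbo.Staging_P5
SWITCH TO dbo.FactSales PARTITION 5;
```

Each parallel instance of the tool drives one such staging table/partition pair, which is also why contention can appear once the degree of parallelism grows.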

Everything seemed to work fine, but during the test phase, when I tried to increase the degree of parallelism (that is, the number of concurrently executed instances of the tool), I got a deadlock error.