Azure Data Factory: Hub Not Found

You can use the new Azure portal to create or edit Azure Data Factory components. Once that is done, you may want to automate the creation of future Data Factory components from PowerShell, reusing the JSON files you edited in the portal GUI as configuration files for the PowerShell cmdlets. For example, you might try to create a new linked service using the settings in C:\Cooler\HDInsight.JSON:

New-AzureDataFactoryLinkedService -ResourceGroupName CoolerDemo -DataFactoryName $DataFactoryName -File C:\Cooler\HDInsight.JSON
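
For context, here is a minimal sketch of the surrounding script. The account, resource group, factory name, and JSON path are placeholders based on the example above, and Switch-AzureMode reflects the Azure PowerShell module of that era (the Data Factory cmdlets lived in Resource Manager mode):

Add-AzureAccount                        # sign in to your subscription
Switch-AzureMode AzureResourceManager   # Data Factory cmdlets require Resource Manager mode

$ResourceGroupName = "CoolerDemo"
$DataFactoryName   = "CoolerDataFactory"   # hypothetical factory name

New-AzureDataFactoryLinkedService -ResourceGroupName $ResourceGroupName `
    -DataFactoryName $DataFactoryName `
    -File "C:\Cooler\HDInsight.JSON"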

When you run the cmdlet, you may see an error like this:

New-AzureDataFactoryLinkedService : Hub: {SomeName_hub} not found.
CategoryInfo : CloseError: (:) [New-AzureDataFactoryLinkedService], ProvisioningFailedException
FullyQualifiedErrorID : Microsoft.Azure.Commands.DataFactories.NewAzureDataFactoryLinkedServiceCommand

If you check the JSON file that you exported from the portal and referenced in the PowerShell script, you will see it ends with something like this:

        "isPaused": false,
"hubName": "SomeName_hub"
}
}

The hubName is currently generated automatically from the name of the Data Factory and should not be present in the JSON files used by PowerShell. Remove the comma on the line above the hubName and the entire line starting with hubName:

        ,
        "hubName": "SomeName_hub"

That will leave the end of the file looking something like this:

        "isPaused": false
}
}

Check your other JSON files for Data Factory components and make the same edit in any that contain a hubName; a sketch of automating that cleanup across files follows below.
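
If you have many components, you can script the edit. This is a minimal sketch, assuming the JSON files live in a single folder such as C:\Cooler and that hubName always appears with a preceding comma as in the snippet above; adjust the path and pattern for your own files:

# Hypothetical cleanup script: strip the auto-generated hubName entry
# (and the comma before it) from every Data Factory JSON file in a folder.
$folder = "C:\Cooler"   # placeholder path from the example above

Get-ChildItem -Path $folder -Filter *.json | ForEach-Object {
    $text = Get-Content -Path $_.FullName -Raw

    if ($text -match '"hubName"') {
        # Remove the comma preceding the hubName entry and the entry itself.
        $clean = $text -replace ',\s*"hubName"\s*:\s*"[^"]*"', ''
        Set-Content -Path $_.FullName -Value $clean
        Write-Host "Removed hubName from $($_.Name)"
    }
}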

NOTE: This applies to Azure Data Factory as of April 2015. At some point hubName should become a parameter that the PowerShell cmdlets accept.

I hope you enjoyed this small bite of big data!

Cindy Gross – Neal Analytics: Big Data and Cloud Technical Fellow
@SQLCindy | @NealAnalytics | CindyG@NealAnalytics.com | https://smallbitesofbigdata.com

Technorati Tags: Azure Data Factory,Troubleshooting,error,PowerShell,Microsoft Azure,Neal Analytics,Big Data,configuration,deploy,SSIS,SQLCindy

Comments

  • Anonymous
    August 03, 2015
    Hi Cindy, Hope you are well. I am trying to create a linked service using Azure Batch as compute instead of HDInsight but keeps failing with this error:

    ENTITY PROVISIONING FAILED: AZURE BATCH OPERATION FAILED. CODE: '' MESSAGE: ''

    Json Script:

    {
        "name": "HostLinkedService",
        "properties": {
            "type": "AzureBatch",
            "typeProperties": {
                "accountName": "bigdata",
                "accessKey": "**********",
                "poolName": "xmltoavropool",
                "linkedServiceName": "BatchStorageLinkedService"
            }
        }
    }

    I've tried deploying this via PS and Portal both, same results. Batch and Factory are in the same region as well, is there a known issue around this atm? Cheers

  • Anonymous
    August 03, 2015
    This is resolved now, here- disqus.com/.../troubleshoot_azure_data_factory_issues