Azure Data Factory: linked service parameterization default values and CI/CD pipelines

2025-02-18T15:07:37.1933333+00:00

I am struggling to understand how ADF parameterization functions when deploying from one environment to another when using DevOps pipelines.

Our linked services may be set up with parameters such as

				
"name": "[LinkedServiceName]",
"properties": {
	"parameters": {
		"domain": {
			"type": "string",
			"defaultValue": "[dev-domain]"
		},
		"database": {
			"type": "string",
			"defaultValue": "[dev-database]"
		},
		"secret_name": {
			"type": "string",
			"defaultValue": "[dev-keyvault-secret]"
		},
		"keyvault_url": {
			"type": "string",
			"defaultValue": "[dev-keyvault-url]"
		}
	},
	"typeProperties": {
		"server": "@{linkedService().domain}",
		"database": "@{linkedService().database}",
		"encrypt": "mandatory",
		"trustServerCertificate": false,
		"authenticationType": "SQL",
		"userName": "[username]",
		"password": {
			"type": "AzureKeyVaultSecret",
			"store": {
				"referenceName": "AzureKeyVault",
				"type": "LinkedServiceReference",
				"parameters": {
					"url": {
						"value": "@linkedService().keyvault_url",
						"type": "Expression"
					}
				}
			},
			"secretName": {
				"value": "@linkedService().secret_name",
				"type": "Expression"
			}
		}
	}
}

That part is fine. But ADF then automatically fills in the default values wherever we use the linked service, such as in a dataset:

"name": "[DataSetName]",
"properties": {
	"linkedServiceName": {
		"referenceName": "[LinkedServiceName]",
		"type": "LinkedServiceReference",
		"parameters": {
			"domain": "[dev-domain]",
			"database": "[dev-database]",
			"secret_name": "[dev-keyvault-secret]",
			"keyvault_url": "[dev-keyvault-url]"
		}
	}
}

My deployment team is struggling with these default values when deploying from DEV to PROD.
Should the CI/CD pipeline replace the values in both the linked service and the dataset?
Or do I need to change something in the dataset, or edit the JSON files directly?

Azure Data Factory
An Azure service for ingesting, preparing, and transforming data at scale.
1 answer

  1. phemanth 13,905 Reputation points Microsoft Vendor
    2025-02-18T20:27:06.51+00:00

    @Lockyer, Aaron (SGRE COG B MS SYS BSPM)

    Welcome to the Microsoft Q&A forum.

    It seems like you're dealing with a common challenge in Azure Data Factory (ADF) deployments. When moving from one environment to another (e.g., DEV to PROD), parameterization can indeed be tricky.

Please check the steps below:

    1. Parameterize linked services: Ensure that your linked services are properly parameterized. This allows you to pass dynamic values at runtime, which is crucial for different environments.
    2. ARM templates: Use Azure Resource Manager (ARM) templates for your deployments. These templates can be parameterized so that values are replaced during deployment: define parameters in the ARM template and provide different values for each environment.
    3. Global parameters: Consider using global parameters in ADF. These can be set at the factory level and overridden during deployment, which reduces manual intervention.
    4. CI/CD pipelines: Yes, your CI/CD pipeline should replace the values in both the linked service and the dataset. Do this by defining the parameters in your ARM template and having the pipeline pass the appropriate values for each environment.
    5. JSON files: If you need to edit the JSON files directly, make sure the parameters are correctly defined and referenced, and that each environment gets the correct parameter values.
    6. Please check the helpful resources below:

    - Parameterize linked services - Azure Data Factory & Azure Synapse

    - CI/CD Azure Data Factory - Parameterization - best practices

    - Custom parameters in a Resource Manager template - Azure Data Factory

    Hope the above steps resolve the issue. Please let us know if the issue persists. Thank you.
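    To make steps 2 and 4 concrete: if the default parameterization template does not already expose your linked-service parameter default values in the generated ARM template, a custom `arm-template-parameters-definition.json` in the root folder of the collaboration branch can add them. The following is a minimal sketch, not a complete parameterization file — the `*` wildcard is intended to match any linked service and any parameter name, and `=` keeps the current (DEV) value as the template's default:

    ```json
    {
        "Microsoft.DataFactory/factories/linkedServices": {
            "*": {
                "properties": {
                    "parameters": {
                        "*": {
                            "defaultValue": "="
                        }
                    }
                }
            }
        }
    }
    ```

    After the next publish, the generated ARMTemplateParametersForFactory.json should then contain one ARM parameter per default value, which the release pipeline can override per environment.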

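    For step 4, the release stage can then supply PROD values through an environment-specific ARM parameters file. A sketch under stated assumptions — the parameter names below are illustrative placeholders, not real generated names; copy the exact names from the published ARMTemplateParametersForFactory.json:

    ```json
    {
        "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
        "contentVersion": "1.0.0.0",
        "parameters": {
            "factoryName": {
                "value": "[prod-factory-name]"
            },
            "LinkedServiceName_properties_parameters_domain_defaultValue": {
                "value": "[prod-domain]"
            },
            "LinkedServiceName_properties_parameters_database_defaultValue": {
                "value": "[prod-database]"
            },
            "LinkedServiceName_properties_parameters_secret_name_defaultValue": {
                "value": "[prod-keyvault-secret]"
            },
            "LinkedServiceName_properties_parameters_keyvault_url_defaultValue": {
                "value": "[prod-keyvault-url]"
            }
        }
    }
    ```

    Passing this file (or equivalent override parameters) to the deployment task keeps the DEV JSON untouched while PROD gets its own defaults.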
