Make sure the projects target .NET Framework 4.5.2.

To build a test client that mimics ADF execution locally, create a console application with the code outlined below.

  1. The following NuGet packages need to be installed first:
  • Microsoft.Azure.Management.DataFactories – provides the ADF management and runtime types
  • Moq – a mocking framework for .NET
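Both packages can be installed from the NuGet Package Manager Console (the first package name assumes the ADF V1 management SDK):

```
Install-Package Microsoft.Azure.Management.DataFactories
Install-Package Moq
```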


  2. Create linked services for Azure Table storage, Azure Batch and Azure DocumentDB:
  • Azure Storage Linked Service connection string (Input):
    • Account Name – schoolrecordsstorage
    • Account Key – can be found on the portal
    • Default Endpoint Protocol – https


  • Azure Batch Linked Service (Batch):
    • Account Name – schoolrecordsbatch
    • Batch URI – https://northeurope.batch.azure.com
    • Access Key – can be found on the portal
    • Pool Name – streamspool
    • Linked Service Name – the Azure Storage Linked Service's name


  3. Create a list of all the linked services used by the ADF pipeline
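A sketch of these two steps using the ADF V1 .NET SDK (`Microsoft.Azure.Management.DataFactories.Models`). The linked service names and the DocumentDB connection string pieces are illustrative placeholders; the keys must be copied from the portal:

```csharp
using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactories.Models;

// Input: Azure Table storage account
var storageLinkedService = new LinkedService
{
    Name = "AzureStorageLinkedService",
    Properties = new LinkedServiceProperties(new AzureStorageLinkedService
    {
        // Account key comes from the portal
        ConnectionString = "DefaultEndpointsProtocol=https;AccountName=schoolrecordsstorage;AccountKey=<account-key>"
    })
};

// Compute: Azure Batch pool that runs the custom activity
var batchLinkedService = new LinkedService
{
    Name = "AzureBatchLinkedService",
    Properties = new LinkedServiceProperties(new AzureBatchLinkedService
    {
        AccountName = "schoolrecordsbatch",
        BatchUri = "https://northeurope.batch.azure.com",
        AccessKey = "<access-key>",                   // from the portal
        PoolName = "streamspool",
        LinkedServiceName = storageLinkedService.Name // backing storage account
    })
};

// Output: DocumentDB account
var docDbLinkedService = new LinkedService
{
    Name = "DocumentDbLinkedService",
    Properties = new LinkedServiceProperties(new DocumentDbLinkedService
    {
        ConnectionString = "AccountEndpoint=<endpoint-url>;AccountKey=<auth-key>;Database=<db-name>"
    })
};

// The complete list that is later handed to Execute(...)
var linkedServices = new List<LinkedService>
{
    storageLinkedService, batchLinkedService, docDbLinkedService
};
```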


  4. Create datasets for Azure Table storage, blob storage and DocumentDB


  • Azure Storage Dataset definition (Input):
    • Structure – a List<DataElement> whose DataElement names are EnglishScore, MathScore, ChemistryScore, PhysicsScore, ComputerScore, AccountsScore, BusinessStdScore, EconomicsScore
    • Linked Service Name – the associated storage linked service
    • Type Properties – SubjectsScoreStorage is the table name
    • External – true (the data in storage comes from an external source rather than from another pipeline)
    • Availability – 1 Hourly
    • Policy – specify the retry interval, timeout and maximum number of retries


  • Azure Blob Dataset definition (Input):
    • Linked Service Name – the associated storage linked service
    • Type Properties – FolderPath = "binariescontainer"
    • External – true
    • Availability – 1 Hourly


  • Azure DocumentDB Dataset definition (Output):
    • Linked Service Name – associated documentDB linked service
    • Type Properties – TestAssetsR StudentCollection is the collection name
    • Availability – 1 Hourly


  5. Create a list of all the datasets used by the ADF pipeline
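The three dataset definitions and the list might look as below (same SDK assumption; the dataset names, the linked service names and the DocumentDB collection name are placeholders, and the retry policy on the table dataset is left as a comment):

```csharp
using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactories.Models;

var hourly = new Availability { Frequency = SchedulePeriod.Hour, Interval = 1 };

// Input: scores table in Azure Table storage
var tableDataset = new Dataset
{
    Name = "AzureStorageDataset",
    Properties = new DatasetProperties
    {
        LinkedServiceName = "AzureStorageLinkedService",
        TypeProperties = new AzureTableDataset { TableName = "SubjectsScoreStorage" },
        External = true,      // data arrives from outside ADF, not from another pipeline
        Availability = hourly,
        Structure = new List<DataElement>
        {
            new DataElement { Name = "EnglishScore" },
            new DataElement { Name = "MathScore" },
            new DataElement { Name = "ChemistryScore" },
            new DataElement { Name = "PhysicsScore" },
            new DataElement { Name = "ComputerScore" },
            new DataElement { Name = "AccountsScore" },
            new DataElement { Name = "BusinessStdScore" },
            new DataElement { Name = "EconomicsScore" }
        }
        // The policy with retry interval, timeout and maximum retries goes here as well
    }
};

// Input: blob container holding the custom activity binaries
var blobDataset = new Dataset
{
    Name = "AzureBlobDataset",
    Properties = new DatasetProperties
    {
        LinkedServiceName = "AzureStorageLinkedService",
        TypeProperties = new AzureBlobDataset { FolderPath = "binariescontainer" },
        External = true,
        Availability = hourly
    }
};

// Output: DocumentDB collection receiving the graded records
var docDbDataset = new Dataset
{
    Name = "DocumentDbDataset",
    Properties = new DatasetProperties
    {
        LinkedServiceName = "DocumentDbLinkedService",
        TypeProperties = new DocumentDbCollectionDataset { CollectionName = "StudentCollection" },
        Availability = hourly
    }
};

var datasets = new List<Dataset> { tableDataset, blobDataset, docDbDataset };
```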


  6. Create an activity within the pipeline:


  • Assign Storage and Azure Batch as inputs to it
  • Assign DocumentDB as output to it
  • Type Properties – Give the following details in here:
    • Assembly Name: StudentGradingAdfActivity.dll
    • Entry Point: TransformActivity (i.e., namespace.className of the entry class)
    • Package Linked Service: the Azure Storage Linked Service name
    • Package File: binariescontainer/StudentGradingAdfActivity.zip (i.e., a zip containing only the .dll and .pdb files, including the custom activity dll)
    • Extended Properties: Containing all the configurable properties, namely, sliceStart, sliceEnd, dataStorageAccountName, docDbEndpointUrl, sinkDbName, dataStorageContainer, azureStorageDatasetName, docDbDatasetName, docDbLinkedServiceName, azureStorageLinkedServiceName, blobDllStorageDatasetName, queueName, storedProcedureName, queueWrapperWriteSize, docsWriteBatchSize, (dataStorageAccountKey, docDbAuthKey, appInsightsInstrumentationKey)
    • Policy: Define the policy to follow for pipeline scheduling, namely, order of execution, retry count and timeout.
    • Scheduler: 1 Hourly
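Put together, the activity definition could be sketched like this (same SDK assumption; dataset and linked service names are placeholders, the entry-point namespace is assumed, and the extended properties are trimmed to a few representative keys — fill in the full set from the list above):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactories.Models;

var activity = new Activity
{
    Name = "StudentGradingActivity",
    // Inputs: the storage dataset plus the blob dataset holding the binaries
    Inputs = new List<ActivityInput>
    {
        new ActivityInput { Name = "AzureStorageDataset" },
        new ActivityInput { Name = "AzureBlobDataset" }
    },
    // Output: the DocumentDB dataset
    Outputs = new List<ActivityOutput> { new ActivityOutput { Name = "DocumentDbDataset" } },
    LinkedServiceName = "AzureBatchLinkedService",    // runs on the Batch pool
    TypeProperties = new DotNetActivity
    {
        AssemblyName = "StudentGradingAdfActivity.dll",
        EntryPoint = "StudentGradingAdfActivity.TransformActivity", // namespace assumed
        PackageLinkedService = "AzureStorageLinkedService",
        PackageFile = "binariescontainer/StudentGradingAdfActivity.zip",
        ExtendedProperties = new Dictionary<string, string>
        {
            { "sliceStart", "<slice-start>" },
            { "dataStorageAccountName", "schoolrecordsstorage" },
            { "docDbEndpointUrl", "<endpoint-url>" }
            // ... remaining keys from the list above go here
        }
    },
    Policy = new ActivityPolicy
    {
        ExecutionPriorityOrder = "OldestFirst",       // order of execution
        Retry = 3,
        Timeout = TimeSpan.FromHours(1)
    },
    Scheduler = new Scheduler { Frequency = SchedulePeriod.Hour, Interval = 1 }
};
```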


  7. Create a mock for IActivityLogger; it will be needed in the next step


  8. Call Execute(IEnumerable<LinkedService>, IEnumerable<Dataset>, Activity, IActivityLogger) on the custom activity class, passing the mock logger to it, since logging is not needed while running the ADF test client
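The last two steps, sketched with Moq; here `linkedServices`, `datasets` and `activity` are assumed to be the objects built in the earlier steps, and `TransformActivity` is assumed to be the custom activity class (implementing `IDotNetActivity`) named by the entry point:

```csharp
using System;
using Microsoft.Azure.Management.DataFactories.Runtime;
using Moq;

// Mock the logger - the test client does not need real ADF logging
var loggerMock = new Mock<IActivityLogger>();

// Optionally surface log lines on the console instead of discarding them
loggerMock
    .Setup(l => l.Write(It.IsAny<string>(), It.IsAny<object[]>()))
    .Callback<string, object[]>((format, args) => Console.WriteLine(format, args));

// Run the custom activity exactly as the Batch worker would inside ADF
IDotNetActivity customActivity = new StudentGradingAdfActivity.TransformActivity();
customActivity.Execute(linkedServices, datasets, activity, loggerMock.Object);
```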