You can keep appending the JSON arrays to the same append blob. Each append writes a complete, well-formed array, so the blob ends up containing back-to-back arrays:
[{<item 1>}, {<item 2>}, ..., {<item 1000>}][{<item 1001>}, {<item 1002>}, ..., {<item 2000>}]
In your data flow, configure the source to read from the append blob, then use a Derived Column transformation to repair the JSON. For example, the following expression replaces each `][` boundary with a comma and strips the remaining opening and closing brackets:
replace(replace(replace(columnName, '][', ','), '[', ''), ']', '')
Finally, configure the sink to write the transformed JSON to the SQL database, or stage the repaired JSON in blob storage and use a Copy Data activity to load it into SQL.
For example:
If your JSON in the blob looks like:
[{<item 1>}, {<item 2>}, ..., {<item 1000>}][{<item 1001>}, {<item 1002>}, ..., {<item 2000>}]
You can use the following expression in the Derived Column transformation to merge the arrays:
replace(replace(replace(yourColumn, '][', ','), '[', ''), ']', '')
This will transform the JSON into:
{<item 1>}, {<item 2>}, ..., {<item 1000>}, {<item 1001>}, {<item 1002>}, ..., {<item 2000>}
Then, you can wrap it back into a single array:
[{<item 1>}, {<item 2>}, ..., {<item 1000>}, {<item 1001>}, {<item 1002>}, ..., {<item 2000>}]
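The same repair logic can be checked locally before wiring it into the pipeline. The sketch below is a Python illustration (not pipeline code) of what the Derived Column expression does; the sample items are placeholders. Note the caveat: this string-based fix assumes no item contains `[`, `]`, or `][` inside its own string values, otherwise a real JSON parser is required.

```python
import json

# Two well-formed arrays appended back-to-back, as they appear in the blob.
blob_content = '[{"id": 1}, {"id": 2}][{"id": 3}, {"id": 4}]'

# Same logic as the Derived Column expression:
# replace each '][' boundary with ',', then strip the remaining brackets.
merged = blob_content.replace('][', ',').replace('[', '').replace(']', '')

# Wrap the comma-separated items back into a single array.
single_array = '[' + merged + ']'

# Verify the result parses as one valid JSON array.
items = json.loads(single_array)
print(len(items))  # 4 items from the two original arrays
```

If the items can contain bracket characters in their values, skip the string surgery and instead parse each appended array separately (e.g. split on `][` with a streaming parser) before merging.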