Hi Michael,
I've seen this issue before with Azure Load Testing, and it can be a bit tricky. The corrupted data—like the long garbled strings you're seeing—often stems from encoding or how Azure handles the CSV split. Here are some approaches that usually help:
Encoding Check: Sometimes, the problem comes from the file’s encoding. Azure Load Testing works best with CSVs saved as UTF-8 (with or without a BOM). If you haven’t already, re-save your CSV as UTF-8 and re-upload it. Tools like Notepad++ can quickly confirm and change the encoding if needed.
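If you'd rather script the re-encoding than do it by hand, here's a minimal sketch. The filenames and the guess that the source file is Latin-1 are assumptions; the sample-file setup just makes the snippet runnable standalone:

```python
# Hedged sketch: re-save a CSV as UTF-8 (filenames are hypothetical).
src = "testdata.csv"        # assumed original file
dst = "testdata_utf8.csv"   # assumed re-encoded output

# Create a sample Latin-1 file so the sketch is self-contained.
with open(src, "w", encoding="latin-1", newline="") as f:
    f.write("user1,pass1\nusér2,pass2\n")

# Decode using the suspected original encoding, then write UTF-8.
with open(src, "r", encoding="latin-1", newline="") as f:
    text = f.read()
with open(dst, "w", encoding="utf-8", newline="") as f:
    f.write(text)
```

Swap "latin-1" for whatever encoding the file actually uses; if you're unsure, a tool like Notepad++ or `chardet` can help identify it first.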
Header Row Experiment: Even though your file doesn’t need a header, adding a dummy header row sometimes helps the splitter allocate rows properly. It's worth a quick test to see if it resolves the garbling.
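Prepending a dummy header is quick to script too. This is a sketch with hypothetical column names and filenames; note that if your JMeter CSV Data Set Config defines variable names, it can skip the extra row via its "Ignore first line" option:

```python
# Hedged sketch: prepend a dummy header to a headerless CSV.
header = "col1,col2\n"  # hypothetical column names

# Sample headerless file so the sketch runs standalone.
with open("data.csv", "w", encoding="utf-8", newline="") as f:
    f.write("user1,pass1\nuser2,pass2\n")

with open("data.csv", encoding="utf-8", newline="") as f:
    body = f.read()
with open("data_with_header.csv", "w", encoding="utf-8", newline="") as f:
    f.write(header + body)
```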
Manual Splitting: To bypass Azure’s CSV splitting (which seems to be the root issue here), you could manually divide the CSV into smaller files—say, 500 rows each—and upload each file to a separate engine. This approach often avoids the garbling altogether.
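The manual split is easy to automate. A minimal sketch, assuming 500-row chunks and made-up filenames (the sample-data setup is just so it runs standalone):

```python
# Hedged sketch: split a CSV into fixed-size chunks, one file per engine.
import csv

CHUNK = 500  # assumed rows per engine

# Build a sample 1200-row CSV so the sketch is self-contained.
rows = [[f"user{i}", f"pass{i}"] for i in range(1200)]
with open("full.csv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f).writerows(rows)

# Write chunks: full_part1.csv, full_part2.csv, ...
with open("full.csv", newline="", encoding="utf-8") as f:
    all_rows = list(csv.reader(f))
for n, start in enumerate(range(0, len(all_rows), CHUNK), start=1):
    with open(f"full_part{n}.csv", "w", newline="", encoding="utf-8") as out:
        csv.writer(out).writerows(all_rows[start:start + CHUNK])
```

You'd then upload each part file separately instead of letting Azure do the split.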
CSV Format Consistency: Double-check the CSV for any irregularities, like inconsistent delimiters or stray quotes. Even small formatting hiccups can throw off the splitting process and cause these odd strings.
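One quick sanity check is to flag rows whose field count differs from the first row, since mismatched field counts are a common symptom of stray quotes or delimiters. A sketch (the filename and sample data are assumptions):

```python
# Hedged sketch: report rows whose field count differs from the first row.
import csv

# Sample file with a deliberately malformed row, so the sketch runs standalone.
with open("check.csv", "w", newline="", encoding="utf-8") as f:
    f.write('a,b,c\n1,2,3\n4,"5,x"\n')  # last row parses to only 2 fields

bad = []
with open("check.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    expected = len(next(reader))        # field count of the first row
    for lineno, row in enumerate(reader, start=2):
        if len(row) != expected:
            bad.append(lineno)

print(bad)  # line numbers with a mismatched field count; here [3]
```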
Consider External Data Source: If Azure Load Testing supports pulling data from an external source, using something like Azure Storage could give you more control, especially as you scale up. This way, you wouldn’t need to worry about CSV splitting at all.
If none of these work, reaching out to Azure Support is a good call. They might have insights into any quirks or recent issues with the CSV splitter. Let me know if any of these solutions help, or if you'd like to dive deeper into one!