Ingesting Your First CSV Into Azure Data Explorer

We've been using the Kusto query language internally for quite a while, and over that time, while I'm not a member of the Data Explorer team, I've helped a number of people get started with the language. Now that Azure Data Explorer is public, there are a lot of resources to get you off the ground.

Your first stop should be the main Data Explorer documentation. It has quickstarts that walk you through creating a cluster, ingesting data, and running basic queries. Next on our imaginary syllabus is a four-hour class available from Pluralsight. If you're serious about learning this language as quickly as possible, that's a great use of your time.

In this blog post, I'll walk through a quick scenario of ingesting data with C#. These steps assume that you have already created a cluster and a database.

  1. In the portal, go to the query editor for your database and execute: .create table ingestionTest(a:int, b:int, c:int)
  2. Create a new Console app and reference the Microsoft.Azure.Kusto.Ingest NuGet package. See the docs for more help with the package.
  3. Go to the C# API samples and copy the code from Ingest From Local File(s) using KustoDirectIngestClient into your new console project. (A sketch of what the fully modified sample might look like appears after this list.)
  4. Go to the Overview blade for your cluster in the portal and copy the URI value. Pass that value into the KustoConnectionStringBuilder constructor. (Note: you would normally not use direct ingestion. It can be hard on your cluster and requires you to manage a lot of retry logic and the like on your own. Queued ingestion is preferred; in that case, your connection string would be the "Data Ingestion URI" value.)
  5. Update the Initial Catalog to the name of your database and also set that as the value for the kustoDatabase variable lower in the code.
  6. The code sample is configured to authenticate with an AAD app, but for this example, let's keep it simple and run with your own AAD account. To do this, simply remove the ApplicationKey and ApplicationClientId fields from the object initializer. All you need to set are the FederatedSecurity and InitialCatalog fields.
  7. Update the kustoTable variable to be the name of the table that you created in step 1.
  8. Create a sample CSV file that has three integers on each row. Put that file name into the "files" list in the code.
  9. Run!
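
If it helps to see the destination, here's a minimal sketch of what the sample might look like after making the changes above. The cluster URI, database name, and file path are placeholders for your own values, and the exact client method names can differ between versions of the Microsoft.Azure.Kusto.Ingest package, so treat this as a rough guide rather than a copy of the official sample.

```csharp
using System.Collections.Generic;
using Kusto.Data;           // KustoConnectionStringBuilder
using Kusto.Data.Common;    // DataSourceFormat
using Kusto.Ingest;         // KustoIngestFactory, KustoIngestionProperties

class Program
{
    static void Main()
    {
        // The "URI" value from your cluster's Overview blade (placeholder shown here).
        var kustoConnectionStringBuilder =
            new KustoConnectionStringBuilder("https://mycluster.westus.kusto.windows.net")
            {
                FederatedSecurity = true,      // authenticate as your own AAD account
                InitialCatalog = "MyDatabase"  // the database you created earlier
            };

        var kustoDatabase = "MyDatabase";
        var kustoTable = "ingestionTest";      // the table created in step 1

        // Each row of the CSV is three integers, e.g. "1,2,3".
        var files = new List<string> { @"C:\temp\sample.csv" };

        // Direct ingestion: fine for a quick demo, but queued ingestion is preferred in production.
        using (var ingestClient = KustoIngestFactory.CreateDirectIngestClient(kustoConnectionStringBuilder))
        {
            var ingestionProperties = new KustoIngestionProperties(kustoDatabase, kustoTable)
            {
                Format = DataSourceFormat.csv
            };

            foreach (var file in files)
            {
                ingestClient.IngestFromStorageAsync(file, ingestionProperties).Wait();
            }
        }
    }
}
```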

Because this is using direct ingestion, you should immediately be able to go to the Query blade in the portal and query your table. So if you called your table "ingestionTest", execute a query that is just "ingestionTest". You should see the contents of your data there.
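
For example, a couple of quick sanity checks against the table from step 1:

```kusto
// Return all rows ingested so far.
ingestionTest

// Or just count them.
ingestionTest
| count
```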

In a real production scenario, you would be using queued ingestion and you'd probably want to ingest from a blob or a stream. For further reading, check out a bigger walkthrough of a scenario like the one I described above and also the best practices for ingestion.
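
To give a flavor of the difference, here's a rough sketch of the queued variant. The connection string now points at the Data Ingestion URI, the blob URI below is a made-up placeholder, and as before the exact API surface may vary with the package version you end up with.

```csharp
// Queued ingestion: connect to the "Data Ingestion URI" (https://ingest-...)
// rather than the cluster's query URI.
var ingestConnectionStringBuilder =
    new KustoConnectionStringBuilder("https://ingest-mycluster.westus.kusto.windows.net")
    {
        FederatedSecurity = true,
        InitialCatalog = "MyDatabase"
    };

using (var queuedClient = KustoIngestFactory.CreateQueuedIngestClient(ingestConnectionStringBuilder))
{
    var ingestionProperties = new KustoIngestionProperties("MyDatabase", "ingestionTest")
    {
        Format = DataSourceFormat.csv
    };

    // Hypothetical blob URI; include a SAS token or other credentials as appropriate.
    var blobUri = "https://mystorageaccount.blob.core.windows.net/demo/sample.csv?<SAS>";
    queuedClient.IngestFromStorageAsync(blobUri, ingestionProperties).Wait();
}
```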

Congrats! You have now ingested some data into Azure Data Explorer. Expand that out to a couple million rows and witness the power of the Kusto query language!

P.S. There are a bunch of old posts on this blog tagged "Data Explorer". Many years ago, I worked on a completely different product called Data Explorer. That has now grown up and is part of the data tab in Excel and part of Power BI. Those posts are unrelated to the recently announced "Azure Data Explorer." In a fun twist of fate, though, it is technically possible to use the previous Data Explorer technology via Excel and Power BI to query data in the new Azure Data Explorer.
