

Load Data Source Dataflow

Note

Bing Maps for Enterprise service retirement

Bing Maps for Enterprise is deprecated and will be retired. Free (Basic) account customers can continue to use Bing Maps for Enterprise services until June 30th, 2025. Enterprise account customers can continue to use Bing Maps for Enterprise services until June 30th, 2028. To avoid service disruptions, all implementations using Bing Maps for Enterprise REST APIs and SDKs will need to be updated to use Azure Maps by the retirement date that applies to your Bing Maps for Enterprise account type.

Azure Maps is Microsoft's next-generation maps and geospatial services for developers. Azure Maps has many of the same features as Bing Maps for Enterprise, and more. To get started with Azure Maps, create a free Azure subscription and an Azure Maps account. For more information about Azure Maps, see Azure Maps Documentation. For migration guidance, see Bing Maps Migration Overview.

Use the Load Data Source Dataflow API to create a data source that contains entity data for a user-specified entity type. For example, a data source could contain location and hours of operation information for a set of restaurants. With the Load Data Source Dataflow API you can:

  • Create a data source.

  • Update, add and delete data source entities.

  • Overwrite an existing data source.

  • Make your data source public.

You can also use the Bing Maps Account Center to create or update a data source. The Bing Maps Account Center also offers the option to geocode address data on upload. For more information, see Creating and Managing Data Sources.

Before using this API, review the Geocode and Data Source Limits.

The Load Data Source Dataflow API creates and updates data sources by creating load data source jobs. The Create a Load Data Source Job URL is used to create the load data source job and requires a data schema and a set of entity data. The data schema and entity data can be provided in XML format or as values separated by comma, tab, or pipe (|) characters. KML, KMZ, and shapefile formats are also supported. Shapefile data must be uploaded as a zipped set of .shp, .shx, and .dbf files. For more information about the data schema and input data, see Data Schema and Sample Input.
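The steps above can be sketched in code. The following Python sketch assembles a Create a Load Data Source Job request URL and a small pipe-delimited upload body. The endpoint, parameter names, and schema layout shown here are assumptions for illustration; see the Create a Load Data Source Job and Data Schema and Sample Input topics for the authoritative URL, parameters, and schema format.

```python
# Sketch only: the endpoint, query parameters, and schema layout below are
# assumptions; consult the Create a Load Data Source Job documentation for
# the actual values.
from urllib.parse import urlencode

DATAFLOW_ENDPOINT = "https://spatial.virtualearth.net/REST/v1/Dataflows/LoadDataSource"

def build_load_job_url(data_source_name, load_operation, input_format, key):
    """Assemble a Create a Load Data Source Job URL (illustrative)."""
    params = {
        "dataSourceName": data_source_name,
        "loadOperation": load_operation,   # e.g. "complete" to overwrite (assumed value)
        "input": input_format,             # e.g. "pipe" for pipe-delimited data (assumed value)
        "key": key,                        # your Bing Maps key
    }
    return DATAFLOW_ENDPOINT + "?" + urlencode(params)

# A pipe-delimited upload body: one schema row declaring the entity
# properties, followed by entity records. Column names are illustrative.
SAMPLE_PIPE_DATA = "\n".join([
    "EntityID(Edm.String,primaryKey)|Name(Edm.String)|Latitude(Edm.Double)|Longitude(Edm.Double)",
    "1|Contoso Grill|47.6097|-122.3331",
    "2|Fabrikam Deli|47.6205|-122.3493",
])
```

The job would then be created by POSTing `SAMPLE_PIPE_DATA` to the assembled URL; the response identifies the job so its status can be polled.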

After you have created a job, you can use the Get Load Data Source Status URL to get job status. A job can have one of three statuses: “Pending”, “Completed” or “Aborted”. A job has a “Pending” status when it is created and keeps that status until the job is “Completed” or “Aborted”. When a job completes, the response returns a unique base URL that you can use to query the data source with the Query API.
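The three-status lifecycle described above lends itself to a simple polling check. The Python sketch below interprets a job status response; the JSON shape used here is an assumption for illustration only (see the Response Data topic for the actual response format).

```python
# Sketch only: the response payload shape below is assumed; see the
# Response Data topic for the real structure returned by the service.

# A job stays "Pending" until it reaches one of these terminal statuses.
TERMINAL_STATUSES = {"Completed", "Aborted"}

def job_finished(status: str) -> bool:
    """Return True once the job has left the "Pending" state."""
    return status in TERMINAL_STATUSES

def extract_status(response: dict) -> str:
    """Pull the job status out of an assumed JSON response payload."""
    resource = response["resourceSets"][0]["resources"][0]
    return resource["status"]
```

In practice you would call the Get Load Data Source Status URL on an interval, feed each response through `extract_status`, and stop polling when `job_finished` returns True; a “Completed” response then carries the base URL for the Query API.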

You can delete a data source by using the Delete a Data Source API.
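As a rough sketch of the deletion step, the helper below forms a Delete a Data Source request URL. The URL pattern and the use of HTTP DELETE are assumptions based on Bing Maps Spatial Data Services conventions; see the Delete a Data Source topic for the actual request format.

```python
# Sketch only: the URL pattern is assumed; consult the Delete a Data Source
# documentation for the real endpoint. The request would be sent with the
# HTTP DELETE method.
def build_delete_url(access_id, data_source_name, entity_type, key):
    """Form an illustrative Delete a Data Source URL."""
    return ("https://spatial.virtualearth.net/REST/v1/data/"
            f"{access_id}/{data_source_name}/{entity_type}?key={key}")
```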

In this Section

  • Create a Load Data Source Job: Describes how to create a data source and upload entity data by using a load data source job. You can also use this API to stage or update a data source.

  • Get Load Data Source Status: Describes how to request status for a load data source job.

  • Publish a Staged Data Source: Describes how to publish a staged data source.

  • Response Data: Describes the responses returned when you create and get status for a load data source job.

  • Data Schema and Sample Input: Describes how to define a data schema and input data for an entity type. Examples are provided for XML format and for input data that is provided by using sets of values separated by pipe (|), comma, or tab characters.

  • Geography Types: Describes the variety of geography types available that represent a geographic area. These types can be used as an entity property and to query for entities in a custom geographical area.

  • Helpful Tips for Entity Data: Provides helpful tips to guide you when you create a data schema and entity data to upload to a data source.

  • C# Sample Code: Provides C# code that shows how to upload entity data to a data source by using the Load Data Source Dataflow API.

  • VB Sample Code: Provides VB code that shows how to upload entity data to a data source by using the Load Data Source Dataflow API.