Execute SQL Statements in a Loop Package Sample

The Execute SQL Statements in a Loop sample package uses a Foreach Loop container to iterate through SQL statements saved in text files in a folder. The Transact-SQL statements create five tables in the AdventureWorks database, one table for each of the valid sales territories in the United States. The collection value of the Foreach File enumerator maps to a variable. This variable is used in a property expression to update the value of the connection string in the File connection manager with each enumeration.

The data in the text file has two problems: some postal codes were interpreted as numeric values and are therefore missing their leading zeros, and the identifiers of some sales territories are not valid. The data flow in the package detects and corrects the incorrect postal codes, writes data rows that have an invalid sales territory to a file, and directs valid data rows to one of five tables in the AdventureWorks database.

When the package is rerun, the tables are truncated before the data is inserted.
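
The truncation has the same effect as running the following Transact-SQL statements. This is a sketch for reference only; the package performs the truncation through its own tasks, not by running this script.

-- Reference only: the effect the package achieves when it is rerun.
TRUNCATE TABLE AdventureWorks.dbo.Territory1;
TRUNCATE TABLE AdventureWorks.dbo.Territory2;
TRUNCATE TABLE AdventureWorks.dbo.Territory3;
TRUNCATE TABLE AdventureWorks.dbo.Territory4;
TRUNCATE TABLE AdventureWorks.dbo.Territory5;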

If you run the sample on a non-English version of Windows, you may have to substitute the localized name of the Program Files folder to open or run the sample.

Important

Samples are provided for educational purposes only. They are not intended to be used in a production environment and have not been tested in a production environment. Microsoft does not provide technical support for these samples.

Requirements

Before you run this sample package, ensure that the following conditions are met:

  • The sample package and data files that it uses must be installed on the local hard disk drive.
  • You must have installed the AdventureWorks database and have administrative permissions on it.
  • If you intend only to run the sample package from the command line, you must install SQL Server 2005 Integration Services (SSIS).
  • If you intend to open the package in SSIS Designer and run the sample package, you must install Business Intelligence Development Studio.

For more information about how to install samples, see the topic, "Installing Sample Integration Services Packages", in SQL Server 2005 Books Online. To obtain the latest version of the samples, including new samples released after the original release of SQL Server 2005, see SQL Server 2005 Samples and Sample Databases (April 2006).

Location of the Sample Package

If the samples were installed to the default installation location, the Execute SQL Statements in a Loop sample package is located in the following folder:

c:\Program Files\Microsoft SQL Server\90\Samples\Integration Services\Package Samples\ExecuteSQLStatementsInLoop Sample\ExecuteSQLStatementsInLoop\

The following files are required to run this sample package.

  • ExecuteSQLStatementsInLoop.dtsx - The package file.
  • Customer.txt - The source data file.
  • CustomersWithInvalidTerritoryID.txt - The file to which data that is not valid is written.
  • CreateProspectTableTerr1.sql - The Transact-SQL statement that creates the Territory1 table.
  • CreateProspectTableTerr2.sql - The Transact-SQL statement that creates the Territory2 table.
  • CreateProspectTableTerr3.sql - The Transact-SQL statement that creates the Territory3 table.
  • CreateProspectTableTerr4.sql - The Transact-SQL statement that creates the Territory4 table.
  • CreateProspectTableTerr5.sql - The Transact-SQL statement that creates the Territory5 table.
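
The CreateProspectTableTerrN.sql scripts are not reproduced in this topic. The following is a minimal sketch of the kind of statement such a script contains; the column list and the existence check are assumptions for illustration, and the installed scripts define the actual table structure.

-- Hypothetical sketch of a CreateProspectTableTerrN.sql script; the installed
-- script defines the actual columns. The connection targets the AdventureWorks database.
IF OBJECT_ID('dbo.Territory1', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.Territory1
    (
        CustomerID  int          NOT NULL,
        FirstName   nvarchar(50) NULL,
        LastName    nvarchar(50) NULL,
        PostalCode  nvarchar(10) NULL,
        TerritoryID int          NULL
    );
END;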

Running the Sample

You can run the package from the command line by using the dtexec utility, or you can run it in Business Intelligence Development Studio.

If you are using a non-English version of Windows, you may have to update the ConnectionString property of the file connection managers that the package uses in order to run the sample package successfully. Verify that the path in each connection manager is valid on your computer and, if necessary, modify it so that it uses the localized name of the Program Files folder.

For this sample, you may have to update "Program Files" in the ConnectionString property for the CreateTableSQL, Customers, and CustomersWithInvalidTerritoryID connection managers.

To run the package by using dtexec

  1. Open a Command Prompt window.

  2. Use the Change Directory command, cd, to change the directory to C:\Program Files\Microsoft SQL Server\90\DTS\Binn, the location of dtexec.

  3. Type the following command:

    dtexec /f "C:\Program Files\Microsoft SQL Server\90\Samples\Integration Services\Package Samples\ExecuteSQLStatementsInLoop Sample\ExecuteSQLStatementsInLoop\ExecuteSQLStatementsInLoop.dtsx"
    
  4. Press Enter.

For more information about how to run the package by using the dtexec utility, see the topic, "dtexec Utility", in SQL Server 2005 Books Online.

To run the package in Business Intelligence Development Studio

  1. Open Business Intelligence Development Studio.

  2. On the File menu, point to Open, and then click Project/Solution.

  3. Locate the ExecuteSQLStatementsInLoop Sample folder, and then double-click the file named ExecuteSQLStatementsInLoop.sln.

  4. In Solution Explorer, right-click ExecuteSQLStatementsInLoop.dtsx in the SSIS Packages folder, and then click Execute Package.

Important

If you open the Execute SQL Statements in a Loop sample package in SSIS Designer before you run the package for the first time, the Data Flow task shows a warning. The warning occurs because the SQL Server tables that the package uses do not exist yet; they are created the first time that you run the package. After you have run the package at least once, the warning no longer appears when you reopen the package in SSIS Designer. The package runs successfully despite the warning.
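
If you want to confirm which of the territory tables already exist, you can run a query like the following in SQL Server Management Studio; it returns one row for each table that has been created.

SELECT name
FROM AdventureWorks.sys.tables
WHERE name IN ('Territory1', 'Territory2', 'Territory3', 'Territory4', 'Territory5');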

Components in Sample

The following list describes the Integration Services tasks, containers, sources, transformations, destinations, and connection managers that are used in the sample.

  • Foreach Loop container - The Foreach Loop container, Run SQL Statements, uses the Foreach File enumerator to iterate through the files that contain Transact-SQL statements. The container includes an Execute SQL task.
  • Execute SQL task - The Execute SQL task, Create Tables, connects to the AdventureWorks database and runs the five Transact-SQL statements that create the territory tables.
  • Data Flow task - The Data Flow task, Extract-Clean-Load Data, executes a data flow that extracts the data from a text file, cleans and distributes the data, and then loads the data into tables in the AdventureWorks database.
  • Flat File source - The Flat File source, Extract data, extracts the data from the text file.
  • Conditional Split transformation - The Conditional Split transformation, Direct Rows by TerritoryID, directs rows to different outputs depending on the value in the TerritoryID column. Rows with invalid TerritoryID values are directed to the default output. (See the Transact-SQL sketch after this list.)
  • Derived Column transformation - The Derived Column transformation, Fix Postal Code, cleans the postal codes by adding a leading zero to four-character postal codes. (See the Transact-SQL sketch after this list.)
  • OLE DB destinations - Five OLE DB destinations load data into the five territory tables.
  • File connection manager - The File connection manager, Create Tables SQL, connects to the files that contain the SQL CREATE TABLE statements.
  • Flat File connection managers - The Flat File connection manager, Customers, connects to the text file that contains the customer source data. The Flat File connection manager, CustomersWithInvalidTerritoryID, connects to the text file to which the package writes invalid data.
  • OLE DB connection manager - The OLE DB connection manager, (local).AdventureWorks, connects to the AdventureWorks database on the local server.
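
For readers who think in Transact-SQL, the following query sketches the combined effect of the Fix Postal Code and Direct Rows by TerritoryID transformations. It is an illustration only: the package implements this logic in the data flow, and the PostalCode column name, the StagedCustomers staging table, and the assumption that the valid territory identifiers are 1 through 5 are not taken from the package.

-- Illustration of the data flow logic; the package does not run this query.
SELECT
    CASE WHEN LEN(PostalCode) = 4
         THEN '0' + PostalCode       -- Fix Postal Code: restore the missing leading zero
         ELSE PostalCode
    END AS PostalCode,
    TerritoryID
FROM dbo.StagedCustomers             -- hypothetical staging of the Customer.txt data
WHERE TerritoryID BETWEEN 1 AND 5;   -- Direct Rows by TerritoryID: keep only rows with a valid territory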

Sample Results

To see the execution results of the Execute SQL Statements in a Loop sample package, open a new query window in SQL Server Management Studio, type or copy the following Transact-SQL queries, and then run them.

SELECT * FROM AdventureWorks.dbo.Territory1
SELECT * FROM AdventureWorks.dbo.Territory2
SELECT * FROM AdventureWorks.dbo.Territory3
SELECT * FROM AdventureWorks.dbo.Territory4
SELECT * FROM AdventureWorks.dbo.Territory5

These queries retrieve the data that was extracted from the text file and loaded into the SQL Server tables.
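
If you only want to verify how many rows were loaded into each table, a summary query such as the following also works.

SELECT 'Territory1' AS TableName, COUNT(*) AS RowsLoaded FROM AdventureWorks.dbo.Territory1
UNION ALL SELECT 'Territory2', COUNT(*) FROM AdventureWorks.dbo.Territory2
UNION ALL SELECT 'Territory3', COUNT(*) FROM AdventureWorks.dbo.Territory3
UNION ALL SELECT 'Territory4', COUNT(*) FROM AdventureWorks.dbo.Territory4
UNION ALL SELECT 'Territory5', COUNT(*) FROM AdventureWorks.dbo.Territory5;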