CASI Kit Part 6 - Integrating SQL Azure with SharePoint 2010 and Windows Azure

This post is most useful when used as a companion to my five-part series on the CASI (Claims, Azure and SharePoint Integration) Kit.

·         Part 1: an introductory overview of the entire framework and solution, describing what the series is going to cover.

·         Part 2: the guidance part of the CASI Kit. It starts with making WCF the front end for all your data – could be datasets, Xml, custom classes, or just straight up Html. In phase 1 we take your standard WCF service and make it claims aware – that’s what allows us to take the user token from SharePoint and send it across application or data center boundaries to our custom WCF applications. In phase 2 I’ll go through the list of all the things you need to do to take this typical WCF application from on premises to hosted up in Windows Azure. Once that is complete you’ll have the backend in place to support a multi-application, multi-datacenter environment with integrated authentication.

·         Part 3: describes the custom toolkit assembly that provides the glue to connect your claims-aware WCF application in the cloud and your SharePoint farm. I’ll go through how you use the assembly, talk about the pretty easy custom control you need to create (about 5 lines of code) and how you can host it in a page in the _layouts directory as a means to retrieve and render data in a web part. The full source code for the sample custom control and _layouts page will also be posted.

·         Part 4: the web part that I’m including with the CASI Kit. It provides an out-of-the-box, no-code solution for hooking up an asynchronous client-side query to retrieve data from your cloud-hosted service and then display it in the web part. It also has hooks built into it so you can customize it quite a bit and use your own JavaScript functions to render the data.

·         Part 5: a brief walk-through of a couple of sample applications that demonstrate how to use the custom control described in part 3 in a couple of other common scenarios. One uses the control to retrieve user or configuration data and store it in the ASP.NET cache, then use it in a custom web part. The other uses the custom control to retrieve data from Azure and use it in a custom task; in this case, a custom SharePoint timer job. The full source code for these sample applications will also be posted.

With the CASI Kit I gave you some guidance and tools to help you connect pretty easily and quickly to Windows Azure from SharePoint 2010, while sending along the token for the current user so you can make very granular security decisions. The original WCF application that the CASI Kit consumed just used a hard-coded collection of data that it exposed. In a subsequent build (and not really documented in the CASI Kit postings), I upgraded the data portion of it so that it stored and retrieved data using Windows Azure table storage. Now, I’ve improved it quite a bit more by building out the data in SQL Azure and having my WCF in Windows Azure retrieve data from there. This really is a multi-cloud application suite now: Windows Azure, SQL Azure, and (ostensibly) SharePoint Online. The point of this post is really just to share a few tips when working with SQL Azure so you can get it incorporated more quickly into your development projects.

Integration Tips

1. You really must upgrade to SQL Server 2008 R2 in order to be able to open SQL Azure databases with the SQL Server Management Studio tool. You can technically make SQL Server 2008 work, but there are a bunch of hacky workaround steps to make that happen. 2008 R2 has it baked in the box and you will get the best experience there. If you really want to go the 2008 workaround route, check out this link: https://social.technet.microsoft.com/wiki/contents/articles/developing-and-deploying-with-sql-azure.aspx. There are actually a few good tips in this article so it’s worth a read no matter what.

2. Plan on using T-SQL to do everything. The graphical tools are not available to work with SQL Azure databases, tables, stored procedures, etc. One thing I’ve found helpful, since I’m not a real T-SQL wiz, is to just create a local SQL Server database first. In the SQL Management tool I create two connections – one to my local SQL instance and one to my SQL Azure instance. I create tables, stored procedures, etc. in the local SQL instance so I can use the graphical tools. Then I right-click the object and select Script [whatever SQL object] As…CREATE To…New Query Editor Window. That generates the T-SQL script to create my objects, and then I can paste it into a query window that I have open to my SQL Azure database, as in the example below.
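For example, here’s a hypothetical version of what a generated script might look like for the StoreInformation table that’s used later in this post (the column definitions are just an illustration – use whatever your table actually needs). One thing to watch for: SQL Azure requires a clustered index on every table, and it doesn’t support some options the local scripting tools emit, like ON [PRIMARY], so trim those out before running the script against your SQL Azure database:

CREATE TABLE [dbo].[StoreInformation](
    [StoreID] [int] NOT NULL,
    [StoreName] [nvarchar](50) NOT NULL,
    [City] [nvarchar](50) NULL,
    CONSTRAINT [PK_StoreInformation] PRIMARY KEY CLUSTERED ([StoreID] ASC)
)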

3. This one’s important folks – the default timeout is frequently not long enough to connect to SQL Azure. If you’re using the .NET SqlClient classes you can change it in the connection string, i.e. add "Connection Timeout=30;". If you’re using SQL Server Management Studio then click the Advanced button on the dialog where you enter the server name and credentials and change it to at least 30. The default is 15 seconds and fails often, but 30 seconds seems to work pretty well. Unfortunately there isn’t a way to change the connection timeout for an external content type (i.e. BDC application definition) to a SQL Azure data source. A full example connection string is shown below.
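For example, a SqlClient connection string with the longer timeout might look like this (the server name and credentials here are just placeholders):

"server=tcp:yourserver.database.windows.net;Database=Customers;User ID=CustomerReader;Password=FooBarFoo;Trusted_Connection=False;Encrypt=True;Connection Timeout=30"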

4. Don't use the database administrator account for connecting to your databases (i.e. the account you used to create the database itself). Create a separate account for working with the data. Unfortunately SQL Azure only supports SQL accounts, so you can't directly use the identity of the requesting user to make decisions about access to the data. You can work around that by using a WCF application that front-ends the data in SQL Azure and using claims authentication in Windows Azure, which is exactly the model the CASI Kit uses. Also, it takes a few steps to create a login that can be used to connect to data in a specific database. Here is an example:

--create the database first

CREATE DATABASE Customers

 

--now create the login, then create a user in the database from the login
--note: in SQL Azure, run CREATE LOGIN while connected to the master database,
--and CREATE USER while connected to the target database

CREATE LOGIN CustomerReader WITH PASSWORD = 'FooBarFoo'

CREATE USER CustomerReader FROM LOGIN CustomerReader

 

--now grant rights to the user

GRANT INSERT, UPDATE, SELECT ON dbo.StoreInformation TO CustomerReader

 

--grant EXECUTE rights to a stored proc

GRANT EXECUTE ON myStoredProc TO CustomerReader

 

For more examples and details, including setting server-level rights for accounts in SQL Azure, see https://msdn.microsoft.com/en-us/library/ee336235.aspx.
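On the server-level rights point, here’s a minimal sketch, assuming you want an account that can create databases and manage logins. The CustomerAdmin name is hypothetical; loginmanager and dbmanager are the SQL Azure server-level security roles in the master database:

--run all of these while connected to the master database
CREATE LOGIN CustomerAdmin WITH PASSWORD = 'FooBarFoo2'
CREATE USER CustomerAdmin FROM LOGIN CustomerAdmin
--dbmanager allows creating databases; loginmanager allows managing logins
EXEC sp_addrolemember 'dbmanager', 'CustomerAdmin'
EXEC sp_addrolemember 'loginmanager', 'CustomerAdmin'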

 

5. You must create firewall rules for your SQL Azure server in order to allow communications from different clients. If you want to allow communication from other Azure services, then you must create the MicrosoftServices firewall rule (which SQL Azure offers to create for you when you first create the database), which covers the address range 0.0.0.0 to 0.0.0.0. If you do not create this rule, none of your Windows Azure applications will be able to read, add or edit data in your SQL Azure databases! You should also create a firewall rule to allow communications from whatever server(s) you use to route to the Internet. For example, if you have a cable or DSL router or an RRAS server, then you want to use the external or NAT address your traffic comes from. You can also manage the firewall rules with T-SQL; see the sketch after this tip.
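Here’s a minimal sketch of the T-SQL route, assuming the sp_set_firewall_rule stored procedure available in SQL Azure’s master database (the HomeOffice rule name and IP address are hypothetical – substitute your own external address):

--run while connected to the master database with an administrative login
--this is the rule that lets other Azure services connect
EXEC sp_set_firewall_rule N'MicrosoftServices', '0.0.0.0', '0.0.0.0'
--and a rule for your own router or RRAS external address
EXEC sp_set_firewall_rule N'HomeOffice', '203.0.113.5', '203.0.113.5'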

 

Those should be some good tips to get your data up and running.

 

Data Access

Accessing the data itself from your custom code – Windows Azure WCF, web part, etc. – is fortunately pretty much exactly the same as retrieving data from an on-premises SQL Server. Here’s a quick code example, and then I’ll walk through a few parts of it in a little more detail:

//requires references to System.Data and System.Data.SqlClient

//set a longer timeout because the default of 15 seconds is often not
//enough; the SQL Azure docs recommend 30
string conStr = "server=tcp:foodaddy.database.windows.net;Database=Customers;" +
    "User ID=CustomerReader;Password=FooBarFoo;Trusted_Connection=False;" +
    "Encrypt=True;Connection Timeout=30";

string queryString = "select * from StoreInformation";
DataSet ds = new DataSet();

using (SqlConnection cn = new SqlConnection(conStr))
{
    //Fill opens and closes the connection for us
    SqlDataAdapter da = new SqlDataAdapter();
    da.SelectCommand = new SqlCommand(queryString, cn);
    da.Fill(ds);
    //do something with the data
}

 

Actually there’s not really anything here that needs much explanation, other than the connection string. The main things worth pointing out that are possibly different from a typical connection to an on-premises SQL Server are:

· server: precede the name of the SQL Azure server with “tcp:”.

· Trusted_Connection: this should be false since you’re not using integrated security

· Encrypt: this should be true for any connections to a cloud-hosted database

· Connection Timeout: as described above, the default timeout is 15 seconds and that will frequently not be enough; instead I set it to 30 seconds here.

One other scenario I wanted to mention briefly here is data migration. You can use the BCP tool that comes with SQL Server to move data from an on-premises SQL Server to SQL Azure. The basic routine for migrating data looks like this:

1. Create a format file – this is used for both exporting the local data and importing it into the cloud. In this example I’m creating a format file for the Region table in the Test database, and saving it to the file region.fmt.

 

bcp Test.dbo.Region format nul -T -n -f region.fmt

 

2. Export data from local SQL – In this example I’m exporting the Region table in the Test database to a file called RegionData.dat. I’m using the region.fmt format file I created in step 1.

 

bcp Test.dbo.Region OUT RegionData.dat -T -f region.fmt

 

3. Import data to SQL Azure. The main thing that’s important to note here is that when you are importing data into the cloud, you MUST include the “@serverName” with the user name parameter; the import will fail without it. In this example I’m importing data to the Region table in the Customers database in SQL Azure. I’m importing from the RegionData.dat file that I exported my local data into. My server name (the -S parameter) is the name of the SQL Azure server. For my user name (the -U parameter) I’m using the format username@servername as I described above. I’m telling it to use the region.fmt format file that I created in step 1, and I’ve set a batch size (the -b parameter) of 1000 records at a time.

 

bcp Customers.dbo.Region IN RegionData.dat -S foodaddy.database.windows.net -U speschka@foodaddy.database.windows.net -P FooBarFoo -f region.fmt -b 1000

 

That’s all for this post folks. Hopefully this gives you a good understanding of the basic mechanics of creating a SQL Azure database, connecting to it, and using it in your SharePoint site. As a side note, I used the CASI Kit to retrieve this data through my Windows Azure WCF front end to SQL Azure and render it in my SharePoint site. I followed my own CASI Kit blog posts when creating it to validate everything I’ve previously published there, and overall found it pretty much on the mark. There were a couple of minor corrections I made in part 3, along with a quick additional section on troubleshooting. Overall it took me about 30 minutes to create a new custom control and maybe 15 minutes to create a new Visual web part. I used the CASI Kit web part to display one set of SQL Azure data from the WCF, and in the Visual web part I created an instance of the custom control programmatically to retrieve a dataset, which I then bound to an ASP.NET grid. I brought it all together in one sample page that actually looks pretty good, and it could be extended to display data in a number of different ways pretty easily.

CASI_Kit_Part6.zip
