Presentation: Introducing Data-Tier Applications

SQL11UPD04-TSCRIPT-03

This wiki is a transcript of a previously recorded video.

Introducing Data-Tier Applications

Hi, I’m Greg Low. I’m a SQL Server MVP and part of the Microsoft Regional Director Program. In this section, I want to take a look at the improvements that have been made to data-tier applications in SQL Server 2012.

 

Data-tier applications were introduced in SQL Server 2008 R2 and gave us a new way of looking at how we might deploy applications. There are substantial improvements in how this mechanism has evolved in SQL Server 2012.

 

First up, this is often called the DAC framework, or some people refer to it as DAC Fx. It’s a whole collection of things: APIs, file and data formats, and the required services. The integration for this is both in SQL Server Data Tools, where we can build the applications, and in SQL Server Management Studio, where you can deploy, upgrade, or even extract existing databases as data-tier applications. So it’s a complete end-to-end story.

 

The aim of this is to simplify the deployment and versioning of databases in much the same way as we do with other applications. If you look at most development teams today, they use one type of process to build all of their high-level language implementations and the application itself. But as soon as they come to the databases, they often work in a very, very different way. Most organizations I go into today have a whole series of T-SQL scripts. So, for example, they’ll have a seed script that creates the initial version of the database, and beyond that they’ll store a large number of upgrade scripts that then have to be applied to bring the database up to the current version.

 

One of the challenges with this is that if someone doing development wants to know what the database actually looks like right now, that doesn’t exist anywhere, short of creating a database. I’d literally have to look at the seed script, then look at every single upgrade that’s been applied since, to get to that point. Now, when I go to do an upgrade against a production server, for example, I might generate a script. But many, many times I’ve seen situations where you go to apply that script and it works fine in dev, it works fine in test, perhaps there’s a staging environment where it may or may not work, and then when it’s applied to production, the same script falls over. That’s because, over time, drift has occurred between the environments: production and dev are now subtly different. That can lead to really difficult problems to solve. So the aim instead, with data-tier applications, is to focus on what the model should look like, rather than on how you get there.

Another thing this helps with: imagine you have five or six production servers. Can you hand-on-heart guarantee that every one of those is configured exactly the same in terms of the current schema and so on? Everyone starts out with good intentions, but often in the middle of the night, when someone has to patch something, things happen, you know? You end up in a situation where even the one script might not be able to be applied to the different servers. So a deployment model that is focused on the desired database schema, or database model, can lead to better outcomes.

 

These tools are fully integrated into SQL Server Data Tools and are designed for developers to work with. As I said, this lets you focus on the model, not on the scripts that get you there. And it works on premises and lets you target the cloud as well.

 

One of the biggest issues raised about the previous version, in SQL Server 2008 R2, was the upgrade process. When I first looked at this and saw it was a side-by-side process, I had real concerns, because when I went to do an upgrade, all the data in my existing database would be migrated to another database, with the original copy retained. At first I was very, very concerned about that. However, after some testing, I must admit that in some ways I came to like it, because I would never end up in a situation where I was halfway between versions. Either my upgrade worked, or it didn’t, and I was always back in a known state every time. That’s not what usually happens with scripts. When I apply a script and something goes horribly wrong, because things aren’t as I expect, it’s very, very common to end up in some sort of mess that’s fairly hard to recover from. However, migrating the data to another database is clearly not a solution when you’re dealing with larger amounts of data.

 

So the mindset in SQL Server 2008 R2 was that data-tier applications were targeted at departmental applications: really, the smaller applications in the organization. And this is another core thing to understand. Most organizations I go into have a very large number of applications. There are one or two applications that are large, and then a gigantic number of smaller applications. Yet in many cases, the same amount of management effort is spent on a small application as on a large one. That’s not sensible. The data-tier application framework lets you deal with the bulk of those applications in a much simpler way, even if it perhaps still isn’t suitable for your very largest applications. Maybe those need to be done differently.

 

Now, in Version 3 of the DAC framework, we have the option to upgrade in place, and that addresses one of the biggest issues with the side-by-side migration: there’s no need to copy and migrate the data. This was the number one thing that people asked for. It also works back to SQL Server 2005 SP4 or later, and with SQL Azure. An important thing to consider is that, because you’re upgrading in place, you also need to make sure there’s enough transaction log space for the upgrade process to occur.
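
To see what driving that looks like, here is a minimal sketch of an in-place upgrade from PowerShell using the DAC Fx 3.0 client API. The assembly path, connection string, database name, and .dacpac path are all assumptions for illustration; the same operation is also available through the upgrade wizard in SQL Server Management Studio.

# Load the DAC Fx 3.0 client assembly (this path is an assumption; adjust to your install).
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

# Point DacServices at the target instance.
$services = New-Object Microsoft.SqlServer.Dac.DacServices "Server=.;Integrated Security=true"

# Load the package produced by the SQL Server Data Tools build.
$package = [Microsoft.SqlServer.Dac.DacPackage]::Load("C:\builds\RetailDB.dacpac")

# The third argument ($true) asks for an upgrade of the existing database
# in place, rather than a fresh deployment.
$services.Deploy($package, "RetailDB", $true)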

 

Now finally, PowerShell has been one of the ways I can drive this whole extract, upgrade, and deploy process. PowerShell providers have been in place for SQL Server for a while, and the team that did the data-tier application work exposed it through PowerShell. They had an upgrade method in place, and that’s still there, but it’s now deprecated. It’s been replaced by an incremental upgrade method that upgrades the application without migrating the data.
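
As a sketch of that PowerShell route, the SMO DAC types from the SQL Server 2008 R2 SP1 era expose the incremental upgrade. Treat the paths, names, and exact signatures below (DacStore, DacType, DacUpgradeOptions, IncrementalUpgrade) as assumptions recalled from that API, not a definitive recipe.

# Load the SMO DAC assembly (resolution from the GAC is an assumption).
Add-Type -AssemblyName "Microsoft.SqlServer.Management.Dac"

# Connect a DacStore to the target instance.
$conn  = New-Object Microsoft.SqlServer.Management.Common.ServerConnection "MYSERVER"
$store = New-Object Microsoft.SqlServer.Management.Dac.DacStore $conn

# Load the new version of the package and run the in-place (incremental) upgrade.
$stream  = [System.IO.File]::OpenRead("C:\builds\RetailDB.dacpac")
$dacType = [Microsoft.SqlServer.Management.Dac.DacType]::Load($stream)
$options = New-Object Microsoft.SqlServer.Management.Dac.DacUpgradeOptions
$store.IncrementalUpgrade("RetailDB", $dacType, $options)
$stream.Close()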

 

Another issue raised with data-tier applications in earlier versions was that they only supported a small subset of data types. In this version, there’s been a substantial increase in the number of data types that are supported. In fact, if you’re working with SQL Azure, you’ll find that 98 percent of the data types supported in Azure are now supported in data-tier applications. So this becomes a very good way to build applications that will be deployed to the cloud. In this version in particular, probably the biggest additions people requested were geometry and geography, the spatial data types added in SQL Server 2008, and the spatial indexes to support them. These are now fully supported inside data-tier applications.
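
For example, a schema like the following, with a geography column and a spatial index, can now live inside a data-tier application. A minimal sketch, assuming the SQLPS module (for Invoke-Sqlcmd) and an illustrative RetailDB database:

# Create a table with a geography column and a supporting spatial index,
# both of which the DAC framework now supports.
Invoke-Sqlcmd -ServerInstance "." -Database "RetailDB" -Query @"
CREATE TABLE dbo.StoreLocation
(
    StoreID  int NOT NULL PRIMARY KEY,  -- clustered key, required for a spatial index
    Boundary geography NOT NULL
);

CREATE SPATIAL INDEX SI_StoreLocation_Boundary
    ON dbo.StoreLocation (Boundary);
"@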

 

When I build the project that I’m going to create the data-tier application from, I also have the ability to do platform targeting. So in this case I can say, look, I really do want to build a project targeting SQL Azure, and I can use this to create a down-level .dacpac file (version 2.1), which is what SQL Azure supports. This can be the output of my build. If I want, it’s still possible to generate SQL scripts, but typically you’ll want to generate one of these .dacpac files. And what happens when you do the build is that it prevents you from using features or aspects of the product that are not supported by the target platform.
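
Under the covers, that target platform is recorded in the SQL Server Data Tools project file. A minimal illustrative fragment of a .sqlproj (the DSP property is how SSDT records the target; the rest of the project file is omitted here):

<PropertyGroup>
  <!-- Target SQL Azure, so the build blocks features the platform doesn't support. -->
  <DSP>Microsoft.Data.Tools.Schema.Sql.SqlAzureDatabaseSchemaProvider</DSP>
</PropertyGroup>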

 

Now another issue that’s been raised is: what if I have an existing database and I want to export both the schema and the data? In earlier versions, I had the ability to extract the data-tier application, and that would extract a .dacpac from the existing database. What I have the ability to do in this version is extract not just the schema constructs, but the data as well. Instead of creating a .dacpac file, this creates a .bacpac file (dot b-a-c-p-a-c). This was added originally in a CTP of the feature pack. It has the contents of the .dacpac, but in addition it has the data in there as well. It uses JSON as the format for the data, which is much, much more compact than earlier formats. It’s also much easier to work with than something like BCP, and avoids all the basic problems with code pages, precision on numeric values, and so on. This is not intended as a backup mechanism, because it doesn’t deal with the data in a transactional way. If I had a database that was quiesced, not being used at all at the moment, I could use this to extract it, get the data out, and be comfortable with where it’s at. But keep in mind that if people are using the database at the same time, the data is copied table by table, so I don’t get a consistent transactional view across all the data. It’s important to realize that.
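
Here is a minimal sketch of producing a .bacpac from PowerShell with the DAC Fx 3.0 client API (the same assembly load as in the earlier upgrade sketch; the connection string, paths, and database name are assumptions):

# Export schema plus table data from an existing database into a .bacpac file.
$services = New-Object Microsoft.SqlServer.Dac.DacServices "Server=.;Integrated Security=true"
$services.ExportBacpac("C:\exports\RetailDB.bacpac", "RetailDB")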

 

As for the commands: I’ve got DACImportExport.CLI.exe pointed, as I said, at my server here and the database called RetailDB. I want to export to the file RetailDB.bacpac, and I’ve done that with Windows authentication. Similarly, there’s an import option, and it brings in both the schema and the data, and registers the database as a data-tier application. Now, this can greatly simplify migration from SQL Server on premises to SQL Azure. I can extract the data-tier application using the import/export tool, get a .bacpac file, and push it off to SQL Azure.
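
The matching import, sketched with the same DAC Fx 3.0 client API rather than the CTP-era command-line tool (so this is an alternative route, not that tool’s exact syntax; the paths and database name are assumptions):

# Import a .bacpac: creates the target database and loads both schema and data.
$bacpac = [Microsoft.SqlServer.Dac.BacPackage]::Load("C:\exports\RetailDB.bacpac")
$services.ImportBacpac($bacpac, "RetailDB_Copy")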

 

Now, if I’m working with SQL Azure, I’ve also got integration where I can take an existing SQL Azure database and push it out to a .bacpac file, or bring one in from a .bacpac file. So this gives us another way of archiving off databases, and so on, while they’re sitting in SQL Azure. Given that it’s all occurring in the cloud instead of on my local file system, what I need to have in place is Windows Azure storage, and new options have been added to the Azure platform so I can import and export my SQL Azure databases, moving them in and out of that storage. If I need to do it to and from my local file system, then when I connect SQL Server Management Studio up to SQL Azure, I have import and export options there as well.

 

So, a call to action: if you haven’t been trying the data-tier application framework, and maybe you had concerns about things like migrating data during upgrades, I’d really encourage you to go back, have another look, and start to get involved with it. It gives you the opportunity to simplify your application development, it lets you get away from scripts that are often quite error-prone, and you can distribute .dacpac files, or .bacpac files if you need some data to go with them, along with your applications.

 

So in an upcoming demonstration, we’ll take a look at an in-place upgrade of an existing SQL Server 2012 data-tier application.


Return to [[SQL Server 2012 Developer Training Kit BOM (en-US)]]