Code First Migrations: Walkthrough of August 2011 CTP
The information in this post is out of date.
Visit msdn.com/data/ef for the latest information on current and past releases of EF.
For Code First Migrations see https://msdn.com/data/jj591621
We have released the first preview of our migrations story for Code First development: Code First Migrations August 2011 CTP. This release includes an early preview of the developer experience for incrementally evolving a database as your Code First model evolves over time.
Please be sure to read the ‘Issues & Limitations’ section of the announcement post before using migrations.
This post will provide an overview of the functionality that is available inside of Visual Studio for interacting with migrations. This post assumes you have a basic understanding of the Code First functionality that was included in EF 4.1. If you are not familiar with Code First then please complete the Code First Walkthrough.
Building an Initial Model
Create a new ‘Demo’ Console application
Add the EntityFramework NuGet package to the project
Tools –> Library Package Manager –> Package Manager Console
Run the ‘Install-Package EntityFramework’ command
Note: If you have previously run the standalone installer for the original EF 4.1 RTM you will need to upgrade or remove the installation because migrations relies on EF 4.1 Update 1. This is required because the installer adds the EF 4.1 assembly to the Global Assembly Cache (GAC) causing the original RTM version to be used at runtime rather than Update 1.
Replace the contents of Program.cs with the following code:
using System;
using System.Data.Entity;
using System.Linq;

namespace Demo
{
    class Program
    {
        static void Main(string[] args)
        {
            using (var db = new DemoContext())
            {
                if (!db.People.Any())
                {
                    db.People.Add(new Person { Name = "John Doe" });
                    db.People.Add(new Person { Name = "Jane Doe" });
                    db.SaveChanges();
                }

                foreach (var person in db.People)
                {
                    Console.WriteLine(person.Name);
                }
            }
        }
    }

    public class DemoContext : DbContext
    {
        public DbSet<Person> People { get; set; }
    }

    public class Person
    {
        public int PersonId { get; set; }
        public string Name { get; set; }
    }
}
Run the application
Automatic Migrations
Make sure that you have run the application from the previous step
(This will ensure that the database has been created with the initial schema)
Update the Person class to include an Email property:
public class Person
{
    public int PersonId { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}
Run the application again
You will receive an exception informing you that the database no longer matches the model
“The model backing the 'DemoContext' context has changed since the database was created. Either manually delete/update the database, or call Database.SetInitializer with an IDatabaseInitializer instance. For example, the DropCreateDatabaseIfModelChanges strategy will automatically delete and recreate the database, and optionally seed it with new data.”
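For context, the initializer-based alternative that the exception message proposes looks something like the following sketch (using the EF 4.1 `Database.SetInitializer` API). Note that `DropCreateDatabaseIfModelChanges` deletes and recreates the whole database, losing any existing data, which is exactly the problem migrations is designed to solve:

```csharp
using System;
using System.Data.Entity;

// Sketch of the alternative suggested by the exception message.
// DropCreateDatabaseIfModelChanges drops and recreates the database
// whenever the model changes -- all existing data is lost.
class Program
{
    static void Main(string[] args)
    {
        Database.SetInitializer(new DropCreateDatabaseIfModelChanges<DemoContext>());

        using (var db = new DemoContext())
        {
            // The first database access triggers the initializer check.
            foreach (var person in db.People)
            {
                Console.WriteLine(person.Name);
            }
        }
    }
}
```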
Fortunately we now have a better alternative to the two options proposed in the exception message. Let’s install Code First Migrations!
- Tools –> Library Package Manager –> Package Manager Console
- Run the ‘Install-Package EntityFramework.SqlMigrations’ command
Installing migrations has added a couple of commands to Package Manager Console. Let’s use the Update-Database command to bring our schema in line with our model.
Run the ‘Update-Database‘ command in Package Manager Console
Migrations will now attempt to calculate the changes required to make the database match the model. In our case this is a very simple change, and there is no chance of data loss, so migrations will go ahead and apply the change. Later in this walkthrough we will look at taking control of more complicated changes as well as previewing the script that migrations will run.
What Changes Can Migrations Detect Automatically?
In this section we looked at adding a property; here is the full list of changes that migrations can take care of automatically:
- Adding a property or class
- Nullable columns will be assigned a value of null for any existing rows of data
- Non-Nullable columns will be assigned the CLR default for the given data type for any existing rows of data
- Renaming a property or class
- See ‘Renaming Properties & Classes’ for the additional steps required here
- Renaming an underlying column/table without renaming the property/class (using data annotations or the fluent API)
- Migrations can automatically detect these renames without additional input
- Removing a property
- See ‘Automatic Migrations with Data Loss’ section for more information
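As an illustration of the third item, here is a sketch of mapping a class to different underlying names via the fluent API in an `OnModelCreating` override (the "Customers" and "EmailAddress" names are just examples); migrations can pick up the resulting table/column renames without any extra input:

```csharp
using System.Data.Entity;

public class DemoContext : DbContext
{
    public DbSet<Person> People { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Rename the underlying table and column without touching the
        // class or property names; migrations detects both renames.
        modelBuilder.Entity<Person>().ToTable("Customers");
        modelBuilder.Entity<Person>()
            .Property(p => p.Email)
            .HasColumnName("EmailAddress");
    }
}
```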
Renaming Properties & Classes
So far we have looked at changes that migrations can infer without any additional information; now let’s take a look at renaming properties and classes.
Rename the Person.Email property to EmailAddress
public class Person
{
    public int PersonId { get; set; }
    public string Name { get; set; }
    public string EmailAddress { get; set; }
}
Attempt to migrate using the ‘Update-Database’ command in Package Manager Console
You will receive an error warning about data loss. This is because migrations doesn’t know about the property rename and is attempting to drop the Email column and add a new EmailAddress column.
“Update-Database : - .Net SqlClient Data Provider: ……. Rows were detected. The schema update is terminating because data loss might occur.”
Let’s preview what migrations is trying to do by running ‘Update-Database –Script‘
- This gives us a script showing us what migrations is trying to do. Inspecting this script confirms that migrations is trying to drop the Email column.
ALTER TABLE [dbo].[People] DROP COLUMN [Email];
We can inform migrations of the rename by running:
‘Update-Database -Renames:"Person.Email=>Person.EmailAddress"’
- Migrations will go ahead and migrate the database; this time the Email column is renamed to EmailAddress
- You’ll also notice that a ‘Migrations’ folder is added to your project with a single Model.refactorlog file in it. This file ensures that the same rename will be applied when migrations is run against a different database or on another developer's machine. The ‘Migrations’ folder will get some more use later in this walkthrough.
The renames parameter is a comma separated list of renames and can include class and property renames. Class renames use the same format as property renames i.e. –Renames:”Person=>Customer”.
Custom Scripts
Up until now we’ve let migrations take care of working out what SQL to execute. Now let’s take a look at how we can take control when we need to do something more complex.
Call for Feedback: From what we are seeing in our own internal use we don’t anticipate that custom scripts will be required very often. However, our efforts are somewhat sheltered from the ‘real world’ so we would love feedback on situations where you need to use custom scripts. In particular we are interested if there are significant scenarios where a code based alternative to writing raw SQL would be beneficial.
Remove the Person.Name property and add in FirstName and LastName properties
public class Person
{
    public int PersonId { get; set; }
    public string EmailAddress { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}
You’ll also need to update the code in the main method to deal with this model change:
static void Main(string[] args)
{
    using (var db = new DemoContext())
    {
        foreach (var person in db.People)
        {
            Console.WriteLine("{0}, {1}", person.LastName, person.FirstName);
        }
    }
}

If we try and upgrade we will get an error warning about data loss because migrations will try and drop the Name column. What we really want to do is take care of populating the new columns with data from the old column before dropping it.
Ask migrations to scaffold its best guess at the changes to a new custom script by running:
‘Add-CustomScript –n:”SplitPersonName”’
You’ll now notice a new sub-folder appear in the ‘Migrations’ folder in your project. The name of this folder contains a timestamp to control ordering and the human readable name that you supplied. This new folder also contains a few files:
- Source.xml – This file captures the state that the database should be in before this custom script is run. This allows migrations to replicate the changes we have made to the schema in the previous sections of this walkthrough before running our custom script.
- Up.sql – This is the script that will be run. Migrations has given you a starting point by populating it with the SQL it was trying to run.
- Target.xml – This is the state that the database should be in after the script has run (i.e. the current state of the model). This file will come into play more once we support downgrade as well as upgrade.
Let’s go ahead and change the script to add the new columns, migrate data and then drop the old column:
You’ll notice the script contains a lot of SET statements; this is just a limitation of this first preview, and we are working to remove them in future releases.
SET ANSI_PADDING, ANSI_WARNINGS, ARITHABORT, CONCAT_NULL_YIELDS_NULL, QUOTED_IDENTIFIER ON;
SET NUMERIC_ROUNDABORT, ANSI_NULLS OFF;
GO

PRINT N'Adding new name columns...';
GO

ALTER TABLE [dbo].[People]
    ADD [FirstName] NVARCHAR (MAX) NULL,
        [LastName]  NVARCHAR (MAX) NULL;
GO

PRINT N'Migrating Data...';
GO

UPDATE [dbo].[People]
SET [FirstName] = LEFT([Name], CHARINDEX(' ', [Name]) - 1),
    [LastName]  = RIGHT([Name], LEN([Name]) - CHARINDEX(' ', [Name]));
GO

PRINT N'Removing old name column...';
GO

ALTER TABLE [dbo].[People] DROP COLUMN [Name];
GO
Run the ‘Update-Database’ command to bring the database up to date
With our custom script complete we can go back to using the automatic upgrade functionality, until we find the need to take control again. Migrations allows you to swap between automatic upgrade and custom scripts as needed. The Source.xml file associated with each custom script allows migrations to reproduce the same migration steps we have performed against other databases.
Automatic Migrations with Data Loss
So far we have only made changes that avoid data loss but let’s take a look at how we can let an automatic migration execute even when it detects that data loss will occur.
Remove the EmailAddress property from Person
public class Person
{
    public int PersonId { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}
If we try and upgrade we will get an error warning about data loss, but we can run ‘Update-Database –Force’ to let migrations know that we are OK with data loss occurring.
Summary
In this walkthrough we saw an overview of the functionality included in the first preview of Code First Migrations. We saw automatic migrations, including property and class renames as well as migrating with data loss. We also saw how to use custom scripts to take control of parts of the migration process. We really want your feedback on what we have so far so please try it out and let us know what you like and what needs improving.
Rowan Miller
Program Manager
ADO.NET Entity Framework
Comments
Anonymous
July 27, 2011
Hi Rowan, This looks very promising. I am thinking about custom scripts being written in raw sql as opposed to code. I am writing an application that can target either a SQL Server database using Microsoft's SQL Server provider or an Oracle database using Devart's dotConnect for Oracle EF provider. EF DbContext / Code First handles this very well. Simply by changing the connection string in your web/app.config the appropriate provider is used and thus the appropriate database is targeted. And all this is completely transparent to any code you write as it is the provider that handles the generation of the appropriate raw sql for your target database from your ExpressionTree. If for migrations you have to write custom scripts in raw sql then I cannot get away from having to write one version for SQL Server and one version for Oracle. I can't see how you could possibly do this in code once and let the providers take care of producing the appropriate raw sql again, but perhaps you have some ideas. With regards to extra database constructs being added on database initialization that will be removed by migrations, you've mentioned indices. I just wanted to make you aware of an additional scenario I have in my application. I have a self-referencing hierarchy of entities. Since EF does not support SQL Server common table expressions or Oracle start with / connect by prior statements, I ended up surfacing this hierarchy in a view (I have a SQL Server version using a cte and an Oracle version using connect by prior) and have a class mapped to this view which I can involve in my LINQ statements. EF Code First will initially create a table for this class, which I then drop in my database initializer and replace with the appropriate version of the view depending on the provider specified in your connection string.
When table valued functions can be used from EF DbContext / Code First with both the SQL Server provider and Devart's dotConnect for Oracle EF provider I can use those instead of the views. So with regards to migrations, I would want to have this view preserved. kind regards Reco
Anonymous
July 27, 2011
The comment has been removed
Anonymous
July 27, 2011
As an extension to the question from Don Pinkster: Another scenario would be to let the application automatically update the database upon detecting it's trying to connect to an older version. How would that work?
Anonymous
July 27, 2011
I presume we will be seeing many more posts on migrations as it evolves, but I would like to see some thoughts on how this will be run in a production environment, without VS, and where we as vendors do not have direct access to a customer's database. Also, the -Force switch is fine for now, but I think that it would be in everyone's best interest if there was a '-ForceConditions' switch as well. This switch would allow you to specify that forcing certain changes is ok, if certain conditions are true, and if not, fail the script. This would be much safer in real world applications, IMO. As with many other users, I am excited about Migrations and can't wait to see what's to come.
Anonymous
July 28, 2011
@Don/Danthar, you can script the change via Update-Database -Script. @Danthar, speaking as my pre-Microsoft self, letting the application automatically update the database is scary for a couple of reasons: 1) multiple applications running simultaneously and trying to upgrade the database (concurrency issues) and 2) wouldn't that mean that users need DDL permissions to the database?
Anonymous
July 28, 2011
The comment has been removed
Anonymous
July 28, 2011
@Danthar/Holger K, as mentioned in the announcement post, we are also working on enabling migrations scenarios for areas such as team build, deployment (e.g. Web Deploy), and invoking migrations from custom code. The PowerShell commands will be just one of the wrappers that we provide over a core set of migrations APIs.
Anonymous
July 28, 2011
This looks very promising! Do you have an idea when a production ready version could be available?
Anonymous
July 29, 2011
The comment has been removed
Anonymous
July 30, 2011
The comment has been removed
Anonymous
August 01, 2011
I agree with those people suggesting that SQL scripts should not be the only option. Personally, I would prefer that the visual studio build did not migrate my database, I want all that done within my program. Given that code first does not require any SQL knowledge, none should be required for migrating the database. Everything should be able to be done within the code. This includes scenarios where there are major changes for example deleting of classes whose data needs to be moved to new classes, situations where classes have changed in the hierarchy, all those sorts of things which will require some C# code to do the work. Here is how I would expect it to work: var dbVer = GetDatabaseVersion(); if (dbVer==1.0) { SchemaChanges.Add(new Rename(Person.Email,EmailAddress)); } if (dbVer==1.1) { SchemaChanges.Add(new RemoveProperty(Person.Name,UpdateName)); } if (dbVer==1.2) { SchemaChanges.Add(new RemoveClass(Person,UpdatePerson)); } [..] bool UpdateName(string existingName, Person person) { person.FirstName = existingName.Left etc.. person.LastName = existingName.Left etc.. } // This gets passed a person class that represents the data in the database. bool UpdateName(dynamic existingPerson) { if (existingPerson.FirstName=="Mary") { AddFemale(new Female(existingPerson.FirstName etc... } etc.. } Hopefully this sort of things can be made possible, without them it will be very difficult to use. ...Stefan
Anonymous
August 01, 2011
The comment has been removed
Anonymous
August 02, 2011
The comment has been removed
Anonymous
August 02, 2011
The comment has been removed
Anonymous
August 04, 2011
Good work! Have a few questions though:
- Is it a way to update DB schema manually then run "Update-Database" just to make sure that everything is in sync?
- EF Code-First generates fancy foreign key & index names but in my DB there are names which follows naming convention (ex.: FK_Person_Role or AK_Person_Name). Will "Update-Database" tool complain about different names of foreign keys and indexes?
- Is there an option to specify a list parameters (which columns to skip etc.) in a separate file, then when running "Update-Database" command, parameters are taken from that file? (similar how StyleCop works)
Anonymous
August 05, 2011
Is there similar functionality for a non-code-first project (i.e. an edmx, database-based project)? It seems like I should be able to take my old edmx file and my new edmx file and generate the appropriate change scripts to bring my customer's database up to date.
Anonymous
August 08, 2011
The comment has been removed
Anonymous
August 08, 2011
The comment has been removed
Anonymous
August 17, 2011
In our project, with EF 4.0, we are very happy using Huagati DBMLTools for migration tasks (www.huagati.com/dbmltools/). It has a great workflow where we can see model changes that are not in the database and select what we want to be in there. In the output we have a SQL script for our changes. Maybe some of DBMLTools' features may have an impact on the EF Migrations tools.
Anonymous
August 22, 2011
The comment has been removed
Anonymous
August 25, 2011
The comment has been removed
Anonymous
September 07, 2011
@shawn – The ability to run arbitrary code is on our backlog to support in a future release.
@Stefan Olson – We have added code based migrations in Alpha 2, which is now available.
@TweeZz @SychevIgor @Ilia Ivanov @Devi – We’ve done some work to improve type resolution in Alpha 2. If you continue to have problems with Alpha 2 can you start a thread in our pre-release forum; social.msdn.microsoft.com/.../threads
@Boris Drajer – Code First does work pretty well in scenarios where your model is split between multiple modules and needs to be calculated at runtime. We’ll make sure that Migrations works well in these same scenarios.
@Konstantin Tarkus – The latest preview (Alpha 2) works a little differently in that it just looks at changes between versions of your model. The answer to point 1 is no, any changes relating to your model should be made through migrations; you are free to add extra stuff manually though. Point 2 is not a problem, EF isn’t concerned about foreign key names etc., as long as they are present. Point 3 somewhat goes away with Alpha 2; scripts are generated based on changes in the model, and you can always choose to remove bits of the script.
@e-labs @Bruce Swan – This migrations experience is very targeted towards Code First, even more so in the latest preview where migration scripts are expressed in code. For the EDMX/Designer based approach take a look at Juneau/EF integration - channel9.msdn.com/.../DEV314
@John Saunders – Yes, this is something we are discussing. What we do in this area will be guided by user feedback etc. Code First Migrations will have MSBuild integration in the future. Alpha 2 includes the ability to get a final change script.
@Gvozdin – Alpha 2 of Code First Migrations has a pretty different workflow that allows you to edit a script to avoid pushing some changes to the database. It also supports generating a final change script to be handed to a DBA etc.