Storing Complex Types, Binary Resources, and Other Tricky Things

-or-

Sync’ing OData to Local Storage in Windows Phone (Part 2)

In my previous post in this series, I described how to use a T4 template in Visual Studio to generate a hybrid proxy that supports both OData feeds and local database storage. There were also a few local storage issues that I didn’t get a chance to cover in the first post, namely what to do with binary resources (media resource streams), such as images, audio, or video files downloaded from the data service. I also came across a workable solution that enables you to store complex properties in the local database.

I also decided to create and publish a Windows Phone Mango app project to Code Gallery that will include my T4 template for the hybrid proxy and demonstrate how to use it. I decided to use the public Netflix feed as my data source because a) it has a fairly advanced data model that includes both complex types and media link entries (MLEs), and b) I can simply add this new local storage functionality to the original Netflix feed-based Windows Phone quickstart that I wrote. (I will probably also need to extend the sample to include traversing an association, but I will decide that later.)

Support for Complex Types

If you recall from my previous post, I was bemoaning the lack of complex type support in LINQ to SQL (L2S), and thereby also in local database. Fortunately, I came across an old blog post by Dino Esposito where he describes his solution to this problem. Dino’s basic approach was to create private backing properties in the main type for the properties of the complex type, and then attribute those backing properties for L2S storage (fortunately, L2S supports storing private fields). For example, here is what the definition for the complex property Title.BluRay, which returns the complex type DeliveryFormatAvailability, looks like in my T4-generated proxy:

private DeliveryFormatAvailability _bluray;

[global::System.Runtime.Serialization.DataMemberAttribute()]
public DeliveryFormatAvailability BluRay
{
    get { return _bluray; }
    set
    {
        if (_bluray != value)
        {
            _bluray = value;
            OnPropertyChanged("BluRay");
        }
    }
}
partial void OnBluRayChanging(DeliveryFormatAvailability value);
partial void OnBluRayChanged();

[global::System.Data.Linq.Mapping.ColumnAttribute(CanBeNull=false)]
private bool BluRay_Available
{
    get { return this.BluRay.Available; }
    set { this.BluRay.Available = value; }
}

[global::System.Data.Linq.Mapping.ColumnAttribute]
private DateTime? BluRay_AvailableFrom
{
    get { return this.BluRay.AvailableFrom; }
    set { this.BluRay.AvailableFrom = value; }
}

[global::System.Data.Linq.Mapping.ColumnAttribute]
private DateTime? BluRay_AvailableTo
{
    get { return this.BluRay.AvailableTo; }
    set { this.BluRay.AvailableTo = value; }
}

[global::System.Data.Linq.Mapping.ColumnAttribute]
private string BluRay_Rating
{
    get { return this.BluRay.Rating; }
    set { this.BluRay.Rating = value; }
}

[global::System.Data.Linq.Mapping.ColumnAttribute]
private int? BluRay_Runtime
{
    get { return this.BluRay.Runtime; }
    set { this.BluRay.Runtime = value; }
}

I knew that I had updated my templates to correctly generate the attributed complex property and its private backing properties, because L2S was able to load the model; however, I was getting null reference exceptions when trying to add objects. It turns out that L2S tries to set the backing properties before the OData client has set the complex type. The solution was simply to instantiate the complex properties in the default constructor of the entity type:

public Title()
{
    // We need to explicitly instantiate complex properties or L2S fails.
    this.BluRay = new DeliveryFormatAvailability();
    this.BoxArt = new BoxArt();
    this.Dvd = new DeliveryFormatAvailability();
    this.Instant = new InstantAvailability();
}

At this point, I was able to get the first page of Titles from the Netflix service, materialize them, and store them in the local database. The next time I started the app, it loaded the titles from the local database instead of calling the service again. Here’s the LoadData method that does this:

// Loads data when the application is initialized.
public void LoadData()
{
    // Instantiate the data service context.
    this._context = new NetflixCatalogEntities();

    try
    {
        // Try to get entities from the local database.
        var storedTitles = from t in localDb.Titles
                           orderby t.ReleaseYear descending, t.Name
                           select t;

        if (storedTitles.Count() == 0)
        {
            // Nothing stored yet; load the data from the OData service.
            var titlesFromService = new DataServiceCollection<Title>(this._context);
            titlesFromService.LoadCompleted += this.OnTitlesLoaded;
            titlesFromService.LoadAsync(GetQuery());
        }
        else
        {
            // Bind to the data from the local database.
            this.Titles = new ObservableCollection<Title>(storedTitles);
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show("Unable to load stored titles. " + ex.Message);
    }
}
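
The GetQuery method called above isn’t shown in this post. Here’s a rough sketch of what it might look like, based on the original quickstart; the PageSize constant and _pagesLoaded field are just placeholders of mine, and the ordering matches the local database query so that paging stays consistent between the two stores:

// Sketch only: build a paged Titles query that also requests the total
// count via $inlinecount (PageSize and _pagesLoaded are assumed members).
private const int PageSize = 20;   // assumed page size
private int _pagesLoaded = 0;      // assumed page counter

private DataServiceQuery<Title> GetQuery()
{
    var query = (from t in this._context.Titles
                 orderby t.ReleaseYear descending, t.Name
                 select t)
                .Skip(_pagesLoaded * PageSize)
                .Take(PageSize) as DataServiceQuery<Title>;

    // Ask the service to include the total count of titles in the response.
    return query.AddQueryOption("$inlinecount", "allpages");
}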

And, here’s where we store the entities returned by the data service request:

private void OnTitlesLoaded(object sender, LoadCompletedEventArgs e)
{
    if (e.Error == null)
    {
        // Get the binding collection, which is the sender.
        DataServiceCollection<Title> loadedTitles =
            sender as DataServiceCollection<Title>;

        if (loadedTitles != null)
        {
            // Make sure that we load all pages of the Titles feed.
            if (loadedTitles.Continuation != null)
            {
                loadedTitles.LoadNextPartialSetAsync();
            }

            // Set the total page count, if we requested one.
            if (e.QueryOperationResponse.Query
                .RequestUri.Query.Contains("$inlinecount=allpages"))
            {
                _totalCount = (int)e.QueryOperationResponse.TotalCount;
            }

            try
            {
                // Store the returned entities in the local database.
                localDb.Titles.InsertAllOnSubmit<Title>(loadedTitles);
                localDb.SubmitChanges();
            }
            catch (Exception ex)
            {
                MessageBox.Show("Titles could not be stored locally. "
                    + ex.Message);
            }

            loadedTitles.LoadCompleted -= OnTitlesLoaded;

            this.Titles = loadedTitles;

            IsDataLoaded = true;
        }

        // Update the pages loaded text binding.
        NotifyPropertyChanged("PagesLoadedText");
    }
    else
    {
        // Display the error message in the binding.
        this.Message = e.Error.Message;
    }
}

I still haven’t thought through all the ramifications of local storage with client paging, but assuming that we maintain the same sort order and the service isn’t changing much, we should be able to perform this “get it from local storage, and if it’s not there, get it from the service” logic for each page in the feed (of 82K Netflix titles). Obviously, I will have to figure out the memory issue at some point, or a user could fill up the phone with a large feed like Netflix. This is why you should always page in the client and not rely on the service to page!
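
To make that idea concrete, here’s a rough sketch (not code from the sample) of a page-by-page check against the local database; it reuses the assumed PageSize constant from the GetQuery sketch above and the same ordering as the service query:

// Sketch only: check whether a given page of titles is already stored
// locally; if it isn't, the caller falls back to the OData service.
private bool TryGetLocalPage(int pageNumber, out List<Title> page)
{
    page = (from t in localDb.Titles
            orderby t.ReleaseYear descending, t.Name
            select t)
           .Skip(pageNumber * PageSize)
           .Take(PageSize)
           .ToList();

    // A short page means it was never fully downloaded and stored.
    return page.Count == PageSize;
}

The short-page heuristic obviously breaks down for the last page of the feed, and it doesn’t solve the storage-size problem; it’s only meant to show where the local-versus-service decision would go.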

Storing Binary Resources

The final, massively tricky problem I need to deal with is getting and storing binary resources. In OData (and really it’s from AtomPub), an entity that is attributed as a media link entry (MLE) can have a related media resource (MR). This media resource, usually a BLOB such as an image, audio, or video file, is accessed from the data service separately from the entity itself. I’ve written a blog series on this for the OData team blog. In our case, we are storing the MLEs in the local database, but instead of filling up the database with BLOBs, it makes more sense to put them in isolated storage. Here’s where it gets tricky. The original app simply called the GetReadStreamUri method to get the URI of the image from the data service; then, when the binding was evaluated, the image was downloaded from that URI. This is easy and works great, but it means that you hit the data service for the same images over and over, every time you run the app, hence the need for local storage (especially when your images are large; what a waste of bandwidth).
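
For reference, the original approach amounts to little more than this (a sketch, not the exact quickstart code; the GetBoxArtUri helper name is mine):

// Sketch only: ask the tracking DataServiceContext for the URI of the
// media resource (the box art image) that belongs to a Title MLE.
private Uri GetBoxArtUri(Title title)
{
    // The Title must be tracked by the context for this call to succeed.
    return this._context.GetReadStreamUri(title);
}

Binding an image source to that URI means the phone downloads the image from the data service every single time, which is exactly the repeated traffic we want to avoid.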

Instead, we need to get the images ourselves from the data service and store them locally. This involves some stream reading and writing, but nothing too hard. (It’s also a little more cumbersome to get images from local storage than from the network, but a huge cost savings.) But what about when we have stored the entity in the local database and, for some reason, the image didn’t make it? Now we have trouble, because the entity isn’t being tracked by the DataServiceContext, and even if we attach it, the read stream URI is missing; that information was kept in the EntityDescriptor object, which is recreated on attach but doesn’t know the URI. It looks like we will need another async call to the service to get the correct URI of the media resource for an attached entity that is an MLE (drat).
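
The stream reading and writing itself is the easy part; here’s roughly what I have in mind (illustrative method names, using the System.IO, System.IO.IsolatedStorage, and System.Windows.Media.Imaging namespaces):

// Sketch only: persist a downloaded media resource stream to isolated
// storage, and load it back into a BitmapImage for binding later.
private void SaveImageToIsolatedStorage(Stream imageStream, string fileName)
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    using (var file = store.CreateFile(fileName))
    {
        var buffer = new byte[4096];
        int bytesRead;
        while ((bytesRead = imageStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            file.Write(buffer, 0, bytesRead);
        }
    }
}

private BitmapImage LoadImageFromIsolatedStorage(string fileName)
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    using (var file = store.OpenFile(fileName, FileMode.Open, FileAccess.Read))
    {
        var image = new BitmapImage();
        image.SetSource(file);
        return image;
    }
}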

I’ll talk about my complete solution to this binary resource retrieval and storage problem in my next post.

Glenn Gailey

Comments

  • Anonymous
    November 17, 2011
    Wow, that's an excellent job! And one more thing that still needs some consideration is how we can synchronize entities that are modified later. What I'm thinking about now is having a Timestamp field on all entities and keeping them synchronized between both contexts. That would let us easily find out which entities were changed and definitely minimize traffic consumption during the sync process. There might be some concurrency issues in the case of a read/write data service, but with read-only ones it should work just fine. What do you think about this, Glenn?

  • Anonymous
    November 19, 2011
    Thanks, T3z. That's exactly where I was headed, namely using a timestamp property to do progressive downloads from the OData feed to the client by including a "where timestamp greater than..." filter expression (a rough sketch of this filter appears after the comments). This, of course, requires that you have control of the data service so that you can add a timestamp property to entities in the data model. Also, the client will never know about deletes that happen at the service. Sync can get pretty complex, but this is probably a good enough solution for many clients. Perhaps someday OData will also support sync-based queries. Glenn.

  • Anonymous
    November 26, 2011
    I thought I'd posted my comment last week but, as I can see now, it didn't appear. What I was saying is that I used this approach in my recent project, as it was a simple and straightforward solution. After that, I discovered that there is an update to the Sync Framework that allows you to build offline Windows Phone clients, but I haven't had much time to play with it. It looks very promising, and it seems like a more extensive solution utilizing the OData protocol. Here's a link: code.msdn.microsoft.com/Sync-Framework-Toolkit-4dc10f0e Did you try it?

  • Anonymous
    November 27, 2011
    Thanks for the great lead-in to a future post...I have also been investigating this toolkit. The guidance in this series involves enabling the application to query an OData feed and store entities in the local database (and blobs in isolated storage). The Sync Framework Toolkit instead uses only isolated storage and requires a separate sync service, which is a specialized kind of OData service. Nonetheless, this toolkit does provide a more traditional and comprehensive sync experience (including the conflict detection and resolution that you get from the Sync Framework), which runs essentially outside of the application. I hope to be able to post in more depth about this toolkit in the next few weeks. Glenn.
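
For anyone interested in the timestamp-based approach discussed in the first two comments, a minimal sketch might look like the following. The Timestamp property and GetLastSyncTime helper are hypothetical; the public Netflix feed doesn’t expose such a property, so this only applies to a data service you control:

// Sketch only: request just the entities changed since the last sync,
// assuming the data model exposes a Timestamp property on Title.
DateTime lastSync = GetLastSyncTime();   // e.g., read from application settings

var changedTitles = from t in _context.Titles
                    where t.Timestamp > lastSync
                    select t;

var trackedTitles = new DataServiceCollection<Title>(_context);
trackedTitles.LoadCompleted += OnTitlesLoaded;
trackedTitles.LoadAsync(changedTitles);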