

Silverlight Caching Application Block

Retired Content

This content is outdated and is no longer being maintained. It is provided as a courtesy for individuals who are still using these technologies. This page may contain URLs that were valid when originally published, but now link to sites or pages that no longer exist.

The latest Enterprise Library information can be found at the Enterprise Library site.


On this page:
Comparing to Enterprise Library Caching Application Block | System.Runtime.Caching | Design of the Silverlight Caching Application Block | Object Cache and Memory-Backed Cache - Cache Item Policy, Cache Item Priorities, Cache Capabilities, Thread Safety | In-Memory Cache - Limiting the Size of the In-Memory Cache | Isolated Storage Cache - Limiting the Size of the Isolated Storage Cache, Item Size on Disk, JSON Serializer, Multiple Instances of the Isolated Storage Cache Running at the Same Time

The Caching Application Block in the Enterprise Library 5.0 Silverlight Integration Pack is an entirely new implementation, which very closely resembles the System.Runtime.Caching API available in .NET Framework 4.0.

This new implementation provides the following features:

  • In-memory cache
  • Persistent cache support in isolated storage
  • Various cache item purging strategies based on available size in isolated storage, maximum number of cached items, or expiration delay

These features will be discussed later in this chapter.

Comparing to Enterprise Library Caching Application Block

The caching support in the Enterprise Library 5.0 Silverlight Integration Pack is not based on the Enterprise Library 5.0 Caching Application Block. The main reason is that the Caching Application Block will be deprecated in the next version of Enterprise Library in favor of System.Runtime.Caching.

The main scenarios that the Enterprise Library 5.0 version of the Caching Application Block supports do not make much sense in a Silverlight application for a number of reasons. For example, storing cached data in a database is not possible in Silverlight.

Future versions of Enterprise Library will also build on top of System.Runtime.Caching.

System.Runtime.Caching

The .NET Framework 4.0 introduced the System.Runtime.Caching namespace. This namespace contains a new, extensible caching implementation that, for example, allows you to create custom caching providers.

In Silverlight 4, the System.Runtime.Caching namespace does not yet exist. The Enterprise Library 5.0 Silverlight Integration Pack contains classes that are very similar to System.Runtime.Caching, in the Microsoft.Practices.EnterpriseLibrary.Runtime.Caching namespace. If a future version of Silverlight does introduce the System.Runtime.Caching namespace, it will be relatively easy to swap out the Enterprise Library implementation for the Silverlight version.

Design of the Silverlight Caching Application Block

The following diagram describes the main classes that are used by the caching block.


As you can see, there are two implementations of a cache, called the InMemoryCache and the IsolatedStorageCache. Ultimately, they both derive from an abstract base class called ObjectCache.

Object Cache and Memory-Backed Cache

The ObjectCache class is the abstract base class in System.Runtime.Caching that all cache implementations must derive from. It provides methods for adding, retrieving, and removing objects from a cache.

The MemoryBackedCache base class contains common functionality for both the InMemoryCache and the IsolatedStorageCache. This base class also performs the expiration and scavenging logic based on notifications coming from two schedulers running on a background thread. The expiration scheduler will notify the cache if there are items that have expired. The scavenging scheduler will tell the cache to start removing items if the maximum size of the cache has been exceeded. For the InMemoryCache, the maximum size is determined by the number of items in the cache, whereas for the IsolatedStorageCache, the maximum size is determined by the space available on disk.

Cache Item Policy

When you add an item to the cache, you must also specify a cache item policy. This cache item policy allows you to control when an object will be removed from the cache. For example, you can specify a SlidingExpiration timespan for a cache item. This means that the item will be removed from the cache if it hasn't been accessed within the specified period. You can also specify an absolute DateTime to control when the item will expire.
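
For illustration, here is a minimal sketch of both expiration modes, assuming the block mirrors the ObjectCache.Add method and the SlidingExpiration and AbsoluteExpiration properties of CacheItemPolicy from System.Runtime.Caching; the cache instance, keys, and values are placeholders.

// 'cache' is an ObjectCache instance (for example an InMemoryCache or an
// IsolatedStorageCache) obtained elsewhere in the application.

// Sliding expiration: the item is removed if it has not been accessed
// for 20 minutes.
var slidingPolicy = new CacheItemPolicy
{
    SlidingExpiration = TimeSpan.FromMinutes(20)
};
cache.Add("GreetingText", "Hello from the cache", slidingPolicy);

// Absolute expiration: the item expires at a fixed point in time,
// one hour from now in this example.
var absolutePolicy = new CacheItemPolicy
{
    AbsoluteExpiration = DateTimeOffset.Now.AddHours(1)
};
cache.Add("TodaysRate", 1.2345m, absolutePolicy);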

Specifying an expiration does not guarantee that the item will remain in the cache until it expires. For example, if the cache becomes too large, the item can be removed earlier by the scavenging logic.

Note

The expiration logic is performed on a background thread at a configurable interval. This means an item is not removed from the cache at the exact moment it expires, but sometime afterwards. Normally this doesn't cause issues, because most operations on the cache check whether an item has expired. For example, if you test whether an item is stored in the cache, the cache will return false if the item is still present but has expired. However, if you check the actual contents of the cache using a debugger, or query the number of items using the Count property, you'll see that the item is still there until the expiration thread has removed it.

The CacheItemPolicy class also allows you to specify two callbacks: the RemovedCallback, which gets called when the item is removed from the cache, and the UpdateCallback, which is called when a cached value is updated.
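
As a sketch, assuming the callbacks follow the System.Runtime.Caching delegate shapes (the removal callback receives an arguments object that exposes the cache item and the reason for removal), a removal callback could be attached like this; the key, value, and logging call are placeholders.

var policy = new CacheItemPolicy
{
    SlidingExpiration = TimeSpan.FromMinutes(5),

    // Called after the item has been removed from the cache, whether it
    // expired, was scavenged, or was removed explicitly.
    RemovedCallback = args => System.Diagnostics.Debug.WriteLine(
        string.Format("Removed '{0}' because: {1}",
            args.CacheItem.Key, args.RemovedReason))
};
cache.Add("StatusMessage", "Loading complete", policy);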

Note

The UpdateCallback and RemovedCallback in the CacheItemPolicy class can be used with both the in-memory cache and the isolated storage cache. However, for the isolated storage cache it is not possible to serialize the callbacks. This means that the callbacks are lost when the application closes. The only way to restore these callbacks is to explicitly add the cached item again.

Cache Item Priorities

Cache item priorities are used by the scavenging logic to remove items from the cache when the maximum size of the cache is reached. The cache item policy allows you to specify a priority. System.Runtime.Caching defines two priorities: NotRemovable and Default.

The scavenging logic will first try to scavenge items that have CacheItemPriority.Default. However, if it then turns out that the cache is still too large, it will also remove items with CacheItemPriority.NotRemovable. So the term NotRemovable is a bit misleading, because any item can be removed from the cache if the cache becomes too full.
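
A short sketch, again assuming the System.Runtime.Caching-style Priority property on CacheItemPolicy; the keys and values are placeholders.

// Items with the default priority are the first candidates for scavenging.
cache.Add("RecentSearches", "silverlight caching",
    new CacheItemPolicy { Priority = CacheItemPriority.Default });

// NotRemovable items are only scavenged if removing the Default items
// did not shrink the cache enough.
cache.Add("CountryList", "Belgium;Canada;Denmark",
    new CacheItemPolicy { Priority = CacheItemPriority.NotRemovable });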

Cache Capabilities

The .NET Framework version of the object cache also provides a mechanism for describing the capabilities of a particular cache implementation. In theory, if you have several cache implementations at your disposal, you could use this information to select a particular cache implementation based on its capabilities. The isolated storage cache and in-memory cache describe their cache capabilities, but the Enterprise Library 5.0 Silverlight Integration Pack does not ship with any mechanism to select a caching implementation based on its capabilities.

Thread Safety

The Silverlight Caching Application Block is thread-safe. That means you can read from and write to the cache from different threads without worrying that the cache will become corrupted. In fact, the expiration and scavenging logic is executed on a separate thread.

Because the scavenging and expiration logic executes on a separate thread, it is possible for the contents of the cache to change between calls. Therefore you should interact with the cache using atomic operations. The following code shows an example of this:

// Correct: retrieve the item from the cache in a single, atomic operation.
object item = cache["CacheItemKey"];
if (item == null)
{
   // The item was not stored in the cache (or had already been removed).
}

// Incorrect: testing and retrieving in two separate operations.
if (cache.Contains("CacheItemKey"))
{
    // The item might have been removed between the Contains call and the
    // retrieval below, so you cannot assume that cachedValue is not null.
    object cachedValue = cache["CacheItemKey"];
}

When you query the contents of the cache, for example when you iterate over the entire contents of the cache, the cache creates a snapshot of itself at the moment you retrieve the enumerator. That means you can safely iterate over the contents of the cache without the enumerator breaking. However, while iterating, you will not see any changes that are made to the cache in the meantime.
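
For example, assuming the cache exposes the same key/value pair enumerator as ObjectCache in the .NET Framework, iterating over the snapshot might look like this:

// Iterates over a snapshot of the cache; changes made by other threads
// while this loop runs are not visible to the enumerator.
foreach (KeyValuePair<string, object> entry in cache)
{
    System.Diagnostics.Debug.WriteLine(
        string.Format("{0} = {1}", entry.Key, entry.Value));
}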

In-Memory Cache

The InMemoryCache allows you to create a cache that stores data in memory. This is very similar to the MemoryCache class from the System.Runtime.Caching namespace on the desktop version of the .NET Framework. The information stored in the in-memory cache is lost when the application closes.

Using the in-memory cache is similar to holding object references to retrieved data in code. Both techniques store the results in memory, but the in-memory cache also allows you to specify when the data expires and to refresh the data once it has expired. If you cache a lot of data, the in-memory cache can also help you manage the size of the cache.

Limiting the Size of the In-Memory Cache

With the in-memory cache, you can set limits on how large you'd like the cache to grow. You can set an upper boundary on the maximum number of cache entries. When the maximum number of items is reached, the scavenging logic will then start to trim the cache. You can also specify how aggressive the scavenging logic should be by specifying how many items should be left in the cache after the scavenging is complete.
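
The sketch below is purely illustrative: the constructor parameter names are hypothetical and may not match the block's actual API. The intent is only to show the two limits described above, namely when scavenging starts and how many items it leaves behind.

// Hypothetical construction call; the parameter names are illustrative only.
var cache = new InMemoryCache(
    "products-cache",
    maxItemsBeforeScavenging: 200,  // scavenging starts above 200 items
    itemsLeftAfterScavenging: 100); // and trims the cache back to 100 items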

It's not possible in Silverlight to determine how many bytes an object actually uses in memory, so the cache cannot be limited to a number of bytes. Because the limit is expressed as a number of items, memory consumption will be much higher if you cache large objects or lists of objects than if you cache many very small objects.

Isolated Storage Cache

Sometimes, you'll want data that's stored in the cache to be persisted to the file system so that it's still available the next time the application starts. If you have data that doesn't change often and that is needed during the startup of your application, then storing this data in isolated storage can improve the startup time of your application.

Note

The data stored in isolated storage is not secured. This means that end-users and system administrators can locate the data stored in isolated storage and either read or change this data.

Limiting the Size of the Isolated Storage Cache

By default, Silverlight will limit the size of isolated storage to 1 Mb of data. You can increase the size of the isolated storage quota, either through explicit user consent or through a .NET Framework configuration policy.

For most applications, you can ask the user for a larger isolated storage quota, but there is no guarantee that the user will permit it. To help with that situation, the isolated storage cache gives you several settings that you can use to control the size of the cache.
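
Requesting a larger quota is done with the standard Silverlight isolated storage API rather than with the caching block itself. The following is a minimal sketch; the handler name is illustrative, and IncreaseQuotaTo must be called from a user-initiated event such as a button click.

private void OnIncreaseQuotaClick(object sender, RoutedEventArgs e)
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    {
        long requestedQuota = 5 * 1024 * 1024; // 5 Mb
        if (store.Quota < requestedQuota)
        {
            // Silverlight shows a consent dialog; IncreaseQuotaTo returns
            // false if the user declines the request.
            bool granted = store.IncreaseQuotaTo(requestedQuota);
        }
    }
}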

When you create the cache, you'll have to set its maximum size. While this means that the cache will not grow larger than the maximum size, it does not guarantee that the cache can actually grow to this size. For example, if you have a 1 Mb quota and 300 Kb is already used by other data, then the space actually available for the cache is 700 Kb.

You must also specify when you want the scavenging logic to start and how much space the scavenging logic should reclaim. You do this by specifying percentages of the space actually available. For example, if you have 1 Mb of available space and specify that scavenging should start at 80% and stop at 50%, then the scavenging logic will start at 800 Kb and will trim the cache back to 500 Kb.
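
To make the arithmetic concrete, here is a small sketch of how these thresholds combine; the variable names are placeholders, and the block performs this calculation internally.

// Worked example of the thresholds described above.
long availableSpace = 1024 * 1024;  // roughly 1 Mb actually available
double startScavengingAt = 0.80;    // begin scavenging at 80% usage
double scavengeDownTo = 0.50;       // trim the cache back to 50% usage

long scavengingTrigger = (long)(availableSpace * startScavengingAt); // ~800 Kb
long scavengingTarget = (long)(availableSpace * scavengeDownTo);     // ~500 Kb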

Note

The scavenging percentages are applied to the available free space or to the maximum size you have specified, whichever is lower. So if you have specified a maximum cache size of 500 Kb and the isolated storage quota is set to 1 Mb, the percentages are applied to the maximum size. If you have specified a maximum size of 2 Mb but only 0.5 Mb of free space is available, then 0.5 Mb is used.

Item Size on Disk

The isolated storage cache stores each cache item in a separate file in isolated storage. Because of the way the file system handles small files, each cache item will consume at least 1 Kb of space. This means that with the default size of isolated storage, you can store a maximum of 1024 files in the cache.

It is therefore not recommended to store many small items as separate entries in the isolated storage cache. For example, if you wish to cache several small items, it's better to store them together as a single list than to store each item under a separate key in the cache.
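
For example, a single cache entry holding a list (a sketch; the key, values, and expiration are placeholders):

// Caching one list as a single entry costs one file on disk, instead of
// one file (and at least 1 Kb) per country name.
var countryNames = new List<string> { "Belgium", "Canada", "Denmark" };
cache.Add("CountryNames", countryNames,
    new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddDays(1) });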

JSON Serializer

Cache items are serialized in JavaScript Object Notation (JSON) format using the DataContractJsonSerializer. This means that the classes you'd like to store in the cache should be serializable to JSON. In general, this includes most primitive types, most array and collection types, as well as complex types that use DataContractAttribute and DataMemberAttribute. Please refer to the topic on Stand-Alone JSON Serialization on MSDN® for more information.
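
For example, a type like the following hypothetical class can be stored in the isolated storage cache, because DataContractJsonSerializer knows how to serialize it.

using System.Runtime.Serialization;

[DataContract]
public class NewsItem
{
    [DataMember]
    public string Title { get; set; }

    [DataMember]
    public DateTime PublishedOn { get; set; }
}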

Sharon Says:
You should ensure that items you store in the cache are serializable. If you try to add an object to the isolated storage cache that is not serializable, the add operation will fail silently and the item will not be added to the cache.

Multiple Instances of the Isolated Storage Cache Running at the Same Time

The space used in isolated storage is shared by all instances of your application. This is a problem for the isolated storage cache, because multiple instances of your application would read from and write to the same files. Synchronizing between multiple instances of the application would require complicated locking and synchronization logic, which would also have a negative impact on the performance of the cache.

Sharon Says:
Multiple instances of your application might be started more often than you think. If a user has your application open and presses Ctrl+N to open a new browser window, there will be two instances of your application running at the same time.

To overcome this issue, only the first instance of your application is given read and write access to isolated storage. Any subsequent instance that creates an isolated storage cache will have that cache automatically populated with the data that was stored in isolated storage at that point in time, but after creation the cache behaves like the in-memory cache.

The advantage of this approach is that multiple instances of your application have the ability to cache data. If you populate isolated storage with data that is used often and does not expire easily, then all instances of your application should have roughly the same performance.

When the isolated storage cache is locked and starts to behave like an in-memory cache, it is no longer possible to determine how much space the cached data would consume on disk. This means that the cache cannot use the configured scavenging percentages, because they work on the actual size on disk. To prevent this cache from growing beyond control, a limit is placed on the maximum number of items the cache can hold. This number is determined by the number of items that were already stored in isolated storage. To prevent the cache from becoming unusably small, this limit is set to a minimum of 20 items.

For example: The first instance of the application has started and added 40 items to the isolated storage cache. Then the second instance of the application starts. It also creates the isolated storage cache, which will be populated with the 40 items that were added by the first instance. From then on, the second instance will work as the in-memory cache. If you then add an additional 10 items to the second cache, it will start scavenging and remove the 10 oldest items, to make room for the newly added 10 items.


Last built: July 8, 2011