Web services and large content in .NET 2.0
There are a couple of features in .NET 2.0 (Whidbey) that help solve the problems of dealing with large data. The two main issues with sending large data in Web service messages are: 1. working set (memory), due to buffering by the serialization engine, and 2. bandwidth consumption, due to the roughly 33% inflation that Base64 encoding adds.
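To see where the 33% figure comes from: Base64 maps every 3 input bytes to 4 output characters. A quick illustration (a sketch, not part of the original article):

```csharp
using System;

class Base64Inflation
{
    static void Main()
    {
        //3,000 raw bytes become 4,000 Base64 characters: 4/3, i.e. ~33% larger
        byte[] raw = new byte[3000];
        string encoded = Convert.ToBase64String(raw);
        Console.WriteLine(encoded.Length); // 4000
    }
}
```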
Chunking to constrain working set
First, let's talk about working set. In .NET 2.0, you can implement IXmlSerializable on both the client and server side and use that to chunk data to/from the network. This allows you to send/receive arbitrarily large content without consuming an arbitrarily large working set. Here's an example:
On the server side, here's what you do to chunk large content back to the client:
First: The Web method needs to turn off ASP.NET buffering and return a type that implements IXmlSerializable:
[WebMethod]
[SoapDocumentMethod(ParameterStyle= SoapParameterStyle.Bare)]
public SongStream DownloadSong(DownloadAuthorization Authorization, string filePath)
{
//turn off response buffering
HttpContext.Current.Response.Buffer = false;
//return song
SongStream song = new SongStream(filePath);
return song;
}
Second: The type that implements IXmlSerializable needs to send chunks in its WriteXml method:
[XmlSchemaProvider("MySchema")]
public class SongStream : IXmlSerializable
{
private const string ns = "https://demos.teched2004.com/webservices";
private string filePath;
public SongStream()
{
//default constructor for serializer
}
public SongStream(string filePath)
{
this.filePath = filePath;
}
public static XmlQualifiedName MySchema(XmlSchemaSet xs)
{
//this method is called by the framework to get the schema for this type
// here we return an existing schema from disk
XmlSerializer schemaSerializer = new XmlSerializer(typeof(XmlSchema));
string xsdPath = null;
xsdPath = HttpContext.Current.Server.MapPath("SongStream.xsd");
XmlSchema s = (XmlSchema)schemaSerializer.Deserialize(
new XmlTextReader(xsdPath), null);
xs.XmlResolver = new XmlUrlResolver();
xs.Add(s);
return new XmlQualifiedName("songStream", ns);
}
void IXmlSerializable.WriteXml(System.Xml.XmlWriter writer)
{
//this is the chunking part
//note that ASP.NET buffering must be turned off for this to really chunk
int bufferSize = 4096;
byte[] songBytes = new byte[bufferSize];
FileStream inFile = File.Open(this.filePath, FileMode.Open, FileAccess.Read);
long length = inFile.Length;
//write filename
writer.WriteElementString("fileName", ns, Path.GetFileNameWithoutExtension(this.filePath));
//write size
writer.WriteElementString("size", ns, length.ToString());
//write song bytes as Base64 chunks (the client decodes each chunk with Convert.FromBase64String)
writer.WriteStartElement("song", ns);
int readLen = inFile.Read(songBytes, 0, bufferSize);
while (readLen > 0)
{
writer.WriteStartElement("chunk", ns);
writer.WriteBase64(songBytes, 0, readLen);
writer.WriteEndElement();
//flush after each chunk so it goes out on the wire instead of accumulating in memory
writer.Flush();
readLen = inFile.Read(songBytes, 0, bufferSize);
}
writer.WriteEndElement();
inFile.Close();
}
System.Xml.Schema.XmlSchema IXmlSerializable.GetSchema()
{
throw new System.NotImplementedException();
}
void IXmlSerializable.ReadXml(System.Xml.XmlReader reader)
{
throw new System.NotImplementedException();
}
}
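For reference, the XML this WriteXml implementation puts on the wire looks roughly like the following (element names come from the code above; the size and chunk values are illustrative):

```xml
<DownloadSongResult xmlns="https://demos.teched2004.com/webservices">
  <fileName>MySong</fileName>
  <size>3188736</size>
  <song>
    <chunk>SUQzAwAAAAAf...</chunk>
    <chunk>dGhpcyBpcyBq...</chunk>
  </song>
</DownloadSongResult>
```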
On the client side, here's what you do to chunk large content from the server:
First: The Web service method named DownloadSong in the proxy must be changed to return a type that implements IXmlSerializable. You can use a SchemaImporterExtension to do this automatically when the proxy is generated; for now, let's just assume you edited the proxy code by hand and changed it to return a type named SongFile, which resides on the client and implements IXmlSerializable.
Second: The type used on the client that implements IXmlSerializable (in this case called SongFile) needs to read chunks from the network stream and write them to disk to avoid arbitrarily large buffers. This particular implementation also fires progress events that can be used to update a UI, e.g. a progress bar.
//delegate used for download progress notifications
public delegate void ProgressMade(double percentComplete);
public class SongFile : IXmlSerializable
{
public static event ProgressMade OnProgress;
public SongFile()
{
}
private const string ns = "https://demos.teched2004.com/webservices";
public static string MusicPath;
private string filePath;
private double size;
void IXmlSerializable.ReadXml(System.Xml.XmlReader reader)
{
reader.ReadStartElement("DownloadSongResult", ns);
ReadFileName(reader);
ReadSongSize(reader);
ReadAndSaveSong(reader);
reader.ReadEndElement();
}
void ReadFileName(XmlReader reader)
{
string fileName = reader.ReadElementString("fileName", ns);
this.filePath =
Path.Combine(MusicPath, Path.ChangeExtension(fileName, ".mp3"));
}
void ReadSongSize(XmlReader reader)
{
this.size = Convert.ToDouble(reader.ReadElementString("size", ns));
}
void ReadAndSaveSong(XmlReader reader)
{
FileStream outFile = File.Open(
this.filePath, FileMode.Create, FileAccess.Write);
string songBase64;
byte[] songBytes;
reader.ReadStartElement("song", ns);
double totalRead=0;
while(true)
{
if (reader.IsStartElement("chunk", ns))
{
songBase64 = reader.ReadElementString();
songBytes = Convert.FromBase64String(songBase64);
//count decoded bytes so progress is measured against the raw file size
totalRead += songBytes.Length;
outFile.Write(songBytes, 0, songBytes.Length);
outFile.Flush();
if (OnProgress != null)
{
OnProgress(100 * (totalRead / size));
}
}
else
{
break;
}
}
outFile.Close();
reader.ReadEndElement();
}
public void Play()
{
System.Diagnostics.Process.Start(this.filePath);
}
}
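The chunk-reading loop in ReadAndSaveSong can be exercised in isolation against an in-memory document. A minimal sketch (the urn:demo namespace and the literal chunks are made up for the demo; "QUJD" and "REVG" are Base64 for "ABC" and "DEF"):

```csharp
using System;
using System.IO;
using System.Xml;

class ChunkReadDemo
{
    static void Main()
    {
        //simulated chunked payload, same shape the server's WriteXml emits
        string xml =
            "<song xmlns='urn:demo'>" +
            "<chunk>QUJD</chunk><chunk>REVG</chunk>" +
            "</song>";
        XmlReader reader = XmlReader.Create(new StringReader(xml));
        reader.MoveToContent();
        reader.ReadStartElement("song", "urn:demo");
        MemoryStream outStream = new MemoryStream();
        //same pattern as SongFile: decode and write one chunk at a time
        while (reader.IsStartElement("chunk", "urn:demo"))
        {
            byte[] bytes = Convert.FromBase64String(reader.ReadElementString());
            outStream.Write(bytes, 0, bytes.Length);
        }
        reader.ReadEndElement();
        Console.WriteLine(System.Text.Encoding.ASCII.GetString(outStream.ToArray())); // ABCDEF
    }
}
```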
By the way, you can implement IXmlSerializable today in .NET 1.0 and 1.1 and use it to chunk content. In .NET 2.0 we improve the experience by allowing you to emit a meaningful schema and by introducing SchemaImporterExtensions, which let you generate the right code on the client side.
Reducing Bandwidth Utilization
IIS 6.0 makes it easy to compress replies, including Web service replies. In .NET 2.0, the client will automatically tell the server that it accepts gzip compression, and it will automatically decompress replies. This lets you send Base64-encoded data (text) and compress it, which can reduce its size by up to 10:1. Note that we don't have a client-side compression feature, so if you are sending Base64 data from the client and you want to compress it, you would need to use a third-party compression library or roll your own.
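If you do decide to roll your own client-side compression, the System.IO.Compression namespace (new in .NET 2.0) gives you a starting point. A sketch of a gzip round trip (how you attach the compressed bytes to the request is up to you and not shown here):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class GzipRoundTrip
{
    public static byte[] Compress(byte[] data)
    {
        MemoryStream ms = new MemoryStream();
        GZipStream gz = new GZipStream(ms, CompressionMode.Compress);
        gz.Write(data, 0, data.Length);
        gz.Close(); //must close to flush the gzip trailer
        return ms.ToArray();
    }

    public static byte[] Decompress(byte[] data)
    {
        GZipStream gz = new GZipStream(new MemoryStream(data), CompressionMode.Decompress);
        MemoryStream outMs = new MemoryStream();
        byte[] buffer = new byte[4096];
        int read;
        while ((read = gz.Read(buffer, 0, buffer.Length)) > 0)
            outMs.Write(buffer, 0, read);
        return outMs.ToArray();
    }

    static void Main()
    {
        //highly repetitive text (like Base64 of low-entropy data) compresses well
        byte[] original = Encoding.ASCII.GetBytes(new string('A', 10000));
        byte[] packed = Compress(original);
        byte[] restored = Decompress(packed);
        Console.WriteLine("{0} -> {1} bytes, round-trip ok: {2}",
            original.Length, packed.Length, restored.Length == original.Length);
    }
}
```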
Comments
Anonymous
November 10, 2004
What about going the other way, does ASP.NET still do brain dead buffering of the request? (see http://www.pocketsoap.com/weblog/2003/12/1392.html)
Anonymous
November 10, 2004
Thanks for the question Simon. I asked my friend Erik Olson on the ASP.NET team and here's what he said:
ASP.NET has a threshold after which it's buffered onto disk and gets chunked in automagically from the request stream. The threshold is 512 KB. This is adjustable in the <httpRuntime/> config section and can be scoped to the URL using <location>. See requestLengthDiskThreshold.
Anonymous
November 11, 2004
When you say "In .NET 2.0, the client will automatically tell the server that it accepts gzip compression and it will automatically decompress replies." are you referring to the MTOM stuff?
Anonymous
November 11, 2004
No, this is not MTOM. This is using the HTTP Accept-Encoding header to tell the server it can send gzip-compressed content if it wants to.
Anonymous
November 12, 2004
So, this doesn't appear to complement MTOM, it appears to compete with it. When should you choose MTOM over this method?
Also, the encoding and then zipping of the data seems like it would be a bit slow. I'd like to get away from Base64 entirely.