Today's bug is...

Today I'm thinking about a recent bug we have with file attachments for work item tracking in Team Foundation System.  We are using an HTTP POST with multipart MIME to upload a file to the server.  The problem occurs when a file is larger than the maximum size configured on the server.  We currently use the maxRequestLength property in web.config to configure the maximum upload size.  The default is 4 MB, but we usually set it higher.  ASP.NET can only handle about 1 GB max, so one issue we're discussing with our Program Managers is whether customers will want to attach files larger than that.  We know internal teams may want to attach larger files, such as full Windows debug dumps, which can be quite large.
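For reference, here's roughly what that web.config entry looks like.  One thing to watch: maxRequestLength is specified in kilobytes, so the 4 MB default is 4096 (the larger value below is just an illustrative setting, not what we ship):

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB; 4096 KB = 4 MB is the ASP.NET default -->
    <httpRuntime maxRequestLength="4096" />

    <!-- e.g., to allow uploads of roughly 100 MB instead: -->
    <!-- <httpRuntime maxRequestLength="102400" /> -->
  </system.web>
</configuration>
```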

Since we haven't heard at this point that the 1 GB limit is a problem, we're going to continue with our HTTP POST of the full file and potentially make our upper limit something less than that.  However, we still want this setting to be configurable by the administrator; some people may not want huge files to be attached.  And at the very least, if larger files are needed, a network share could be configured and we could store links to the files as a workaround.

Back to the problem: when we upload a file larger than the maximum set in web.config, ASP.NET throws back a '500 Internal Server Error'.  The problem here is that our client doesn't know the failure was due to the file being too large; it could have been any of a number of server problems.  It's also not efficient to send the whole file only to have it rejected for being too large, and we want to give the user a clear message about why the upload failed.
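For context, here's a hypothetical sketch of what the client sees today when the upload is rejected (the endpoint URL is made up for illustration); nothing in the response distinguishes "file too large" from any other server fault:

```csharp
using System;
using System.Net;

class UploadSketch
{
    static void Main()
    {
        // Hypothetical upload endpoint, for illustration only.
        HttpWebRequest request =
            (HttpWebRequest)WebRequest.Create("http://tfserver/Upload.asmx");
        request.Method = "POST";

        try
        {
            // ... write the multipart MIME body here, then:
            request.GetResponse();
        }
        catch (WebException ex)
        {
            // All we get back is the generic status -- it could be a
            // too-large file or any other server-side failure.
            HttpWebResponse response = (HttpWebResponse)ex.Response;
            if (response != null)
            {
                Console.WriteLine("{0} {1}", (int)response.StatusCode,
                                  response.StatusDescription);  // 500 Internal Server Error
            }
        }
    }
}
```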

So a proposed solution is to add a new table to our database that will store this size and be generic enough to hold any other configuration settings we may need in the future.  Surprisingly, we haven't needed a table like this before; until now we've always found other ways to get such data to the client.

We will then expose this data via a web service method, something like GetConfigSettings(), that will pass back a schematized XML element or string with this data; that way, if we add new configuration settings, we won't need new web methods.  I'll also need a way to put the data into the database, such as a SetConfigSettings() that takes in schematized XML, or perhaps a more specific SetMaxAttachmentSize(); the latter would be easier to call from a web browser to set the value.  I need to think about this a little more today.
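To make the idea concrete, here's a rough sketch of what such methods might look like.  The method names come from above, but the XML shape and the ConfigSettings table are assumptions on my part, not the final design:

```csharp
using System.Web.Services;

public class ConfigService : WebService
{
    // Returns all settings as a schematized XML string, e.g.:
    //   <ConfigSettings>
    //     <Setting name="MaxAttachmentSize" value="4194304" />
    //   </ConfigSettings>
    // New settings can then be added without adding new web methods.
    [WebMethod]
    public string GetConfigSettings()
    {
        // In the real implementation this would be read from the
        // hypothetical ConfigSettings table in the database.
        return "<ConfigSettings>" +
               "<Setting name=\"MaxAttachmentSize\" value=\"4194304\" />" +
               "</ConfigSettings>";
    }

    // The more specific setter; easy to invoke from a browser for testing.
    [WebMethod]
    public void SetMaxAttachmentSize(int maxSizeInBytes)
    {
        // Would write the value to the ConfigSettings table.
    }
}
```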

Another possible solution, instead of storing the data in our database, is to have the web service method read maxRequestLength directly, which isn't easy to do right now.  I emailed someone on the ASP.NET team yesterday, and he talked to the implementer of configuration; currently, for Beta 2, this property (HttpRuntimeSection.MaxRequestLength) is not exposed, though they do plan to expose it for RTM.  He did say we could get at it using private reflection.
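If we go that route, the private reflection would look roughly like this.  This is a sketch only: it assumes the Beta 2 section object keeps MaxRequestLength non-public, and the property name is an assumption based on the config schema; exactly the kind of detail that can change before RTM:

```csharp
using System.Configuration;
using System.Reflection;

public static class HttpRuntimeConfig
{
    public static int GetMaxRequestLength()
    {
        // Grab the httpRuntime section; in Beta 2 its type doesn't
        // expose MaxRequestLength publicly.
        object section =
            ConfigurationManager.GetSection("system.web/httpRuntime");

        // Reach in with private reflection, per the ASP.NET team's suggestion.
        PropertyInfo prop = section.GetType().GetProperty(
            "MaxRequestLength",
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic);

        // Remember: web.config stores this value in KB.
        return (int)prop.GetValue(section, null);
    }
}
```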

Then there is the matter of getting the data to the client; there are a few options here.  In all of them, the client checks the file size before uploading and throws a proper exception if the file is too large.

1) Every client gets metadata from our server before it can do much of anything, so it knows things such as the valid fields for a work item.  We could add another table to this metadata, and the client would cache it.

2) We can make the web service call in our proxy just before uploading a file, storing the result in a static member so we only need to make the call once per instance of the client Object Model.

3) We can make the web service call in the proxy constructor, which can store all config settings (only one for now) in member variables.

I'm leaning towards option 3, since we wouldn't have to change any client code (except for the proxy, which I own) or any existing server-side code.
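Here's a rough sketch of option 3.  The class and member names are made up for illustration, and it assumes the GetConfigSettings() web method from above:

```csharp
using System;
using System.IO;

public class WorkItemServerProxy
{
    private readonly int maxAttachmentSize;

    public WorkItemServerProxy()
    {
        // One web service call per proxy instance; the result is
        // cached in a member variable. Parsing of the schematized
        // XML returned by GetConfigSettings() is elided here.
        maxAttachmentSize = FetchMaxAttachmentSizeFromServer();
    }

    public void UploadAttachment(string path)
    {
        // Check the size locally before sending anything over the
        // wire, so the user gets a clear error instead of a 500.
        FileInfo file = new FileInfo(path);
        if (file.Length > maxAttachmentSize)
        {
            throw new ArgumentException(String.Format(
                "Attachment '{0}' is {1} bytes; the server accepts at most {2} bytes.",
                path, file.Length, maxAttachmentSize));
        }

        // ... proceed with the multipart MIME HTTP POST ...
    }

    private int FetchMaxAttachmentSizeFromServer()
    {
        // Placeholder for the GetConfigSettings() web service call.
        return 4 * 1024 * 1024; // hypothetical 4 MB
    }
}
```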

So that's what I'm thinking about today.  I'm also working on completing another bug I started fixing yesterday, so if all goes well I should get to coding this one this afternoon or tomorrow.

Have an awesome day.

Sam

 

This posting is provided "AS IS" with no warranties, and confers no rights.

Comments

  • Anonymous
    March 08, 2005
    From my experience, IIS does not react well to uploads of even a few hundred megs in .NET, let alone a full gig. My understanding is that this is because .NET loads the entire stream into memory rather than chunking it off to disk as the file is uploaded. This then creates the market for products such as SAFileUp.

    Without SAFileUp, I get my web server continually restarting the ASP.NET worker process, because too much RAM is being used by .NET (leak protection?)

    How will Team Foundation address the issue of large files in this context?