My file's bigger than yours!

As you probably know by now, we intend to support versioning even very large files in Team System. Last I checked, the only hard limit is how big your database's hard disk is.

We've had to fix a few issues (and had to get ASP.Net and SQL 2005 to fix a few issues) along the way. One of those fixes lets the Source Control Proxy download very large files faster than it used to, while using less memory in the process - the difference with a 2GB+ file was rather dramatic.

So, I'm curious. We've gotten data from a few folks on how big their versioned data is "in total" - as in, how much space the whole repository needs, or how much space the latest version of everything represents. But how big do individual files get for you? What kinds of files are those? I know we've checked in some big crash dumps before, and zips of large collections of files. What other sorts of "big data" do you find yourself wanting to keep around/version/track history of?

 

In dreams I see myself flying...

Comments

  • Anonymous
    January 18, 2006
    The comment has been removed
  • Anonymous
    January 19, 2006
    Oof, those sound big all right. Do you know (or have a guess) as to how big such datasets are (in GB)?
  • Anonymous
    January 19, 2006
    The comment has been removed
  • Anonymous
    January 23, 2006
    Our system installation includes creating an Oracle database instance.
    For this purpose we use template tablespaces or backup sets, which is usually quicker than executing DDL and load scripts.

    In our case, these files are only between 75 and 110 MB. However, in future releases we might include example data (chromatograms) in the range of 0.5 to 2 GB.