FIM Service Database size data point

It is an exciting time as customers plan & start to roll out their FIM 2010 deployments.  One of the first questions asked is, "How much hardware do I need?"  The team is working on more documentation to help answer these kinds of questions, but for now I will share a data point that may help.

In an earlier post I included the hardware we are using to test performance in our lab. In this post I would like to give you a data point on the scale & size of one of the databases used on this hardware.

Scale

First is a breakdown of the top objects in one of our test databases.  This should give you a reference point for judging whether your deployment is bigger or smaller than what this db represents.  Size will also vary based on the configuration, number of attributes, etc. that are specific to your deployment.

Object Type               Count
Requests                  9 million
Expected Rules Entries    3 million
Workflow Instances        1.5 million
Gate Registrations        550k
Groups                    450k
People                    200k

Note: The requests & workflow instance counts are higher than I would expect in a deployment, as I have not pruned completed objects from this database.

Data file size

SQL File                     Size
FIMService SQL Data (mdf)    450 GB
FIMService SQL Log (ldf)     80 GB

Note: The log file will of course grow over time & needs to be maintained via regular transaction log backups.
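
As a starting point, here is a minimal T-SQL sketch of such a transaction log backup, typically scheduled via a SQL Agent job or maintenance plan. The backup destination path is an example only, and the database must be using the FULL recovery model for log backups to apply:

    -- Minimal sketch: back up the FIMService transaction log so space in the ldf can be reused.
    -- Assumes the FULL recovery model; the destination path below is an example only.
    BACKUP LOG [FIMService]
    TO DISK = N'D:\Backups\FIMService_log.trn'
    WITH NAME = N'FIMService transaction log backup';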

Comments

  • Anonymous
    April 29, 2010
    Darryl, This has been very useful information. Could you tell us how much of the data file's 450 GB is used? As well as how much of the 80 GB log file regularly gets used?

  • Anonymous
    May 05, 2010
    Thanks David.  The initial space used was around 250 GB of the 450 GB.  The log file usage was around 20 GB.  You can find more of this information on the TechNet site, which includes the growth of the db space used over time during a long-haul run we did in our lab. http://technet.microsoft.com/en-us/library/ff400287(WS.10).aspx

  • Anonymous
    January 19, 2011
    Darryl, Based on your comment in this article, "The requests & workflow instance counts are higher than I would expect in a deployment, as I have not pruned completed objects from this database." How would you recommend we prune the database of completed objects? Thanks, Jameel

  • Anonymous
    January 19, 2011
    The database used in this reference contained all the requests from the initial population of that database, along with various additional requests to do things like register all users, which we drove automatically with a test tool.  Typically the completed requests will expire & then be deleted from the database automatically over time by a SQL Agent job.

  • Anonymous
    January 19, 2011
    Darryl, Thanks for your response. I am in a situation very similar to the one you mentioned. Those initial imports left about 8 million request objects along with other collateral objects like Workflow Instances. I have verified that the DeleteExpiredSystemsObjects job is running periodically. I don't see these 8 million objects anymore when I search for all the requests. However, I don't see a drop in the size of the FIMService database. Do these objects need to be manually purged from SQL?

  • Anonymous
    January 19, 2011
    The file size of the SQL Server database will not change even when records are deleted from the database.  SQL Server only frees up this space internally for the database to use later. You can view a breakdown of the free & used space for the SQL database either by using SQL Server Management Studio (right-click the db -> Reports -> Standard Reports -> Disk Usage) or by running the query "exec sp_spaceused" against the db.
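
    For reference, a minimal sketch of that second option (assuming the database is named FIMService, as in the tables above):

        -- Switch to the FIM Service database (name assumed to be FIMService).
        USE [FIMService];
        GO
        -- Returns the database size plus reserved, data, index, and unused space.
        EXEC sp_spaceused;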