Tips for estimating OMPM scanning times
A common question among OMPM customers is “How do I estimate how long OMPM will take to run?” Though we don’t have a simple calculation to provide, we do have some tips to share from consultants and customers.
First, we’d like to share some customer advice from M. Nothnagel, who posted his estimation method in the Application Compatibility forums on TechNet. He uses a combination of light scans and deep scans to estimate completion time. You can find his forum post here:
A Microsoft Premier Field Engineer, Lee Palmer, suggests the following tip to speed up scanning:
One known performance gain is to skip scanning for 64-bit macro compatibility, which can make a real difference in scan times. Because we recommend 32-bit Office and most customers are not deploying 64-bit, it is not really necessary to have this option enabled in the ini file.
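As a minimal sketch of where that change is made: the 64-bit macro compatibility check is controlled by a setting in offscan.ini. The exact section and key names vary by OMPM version, so treat the names below as placeholders and verify them against the comments in your own offscan.ini before relying on them:

[SCAN]
;Placeholder key name - disables the 64-bit VBA macro compatibility check during deep scans
Scan64Bit=0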
And Curtis Sawin, a Senior Consultant, shared his tips for using Robocopy to estimate scan times:
In general, one of the biggest factors in how long a scan takes is how “close” the scanning computer is to the target file share. You can use “tracert” to find out how many hops are between the scanning computer and the file share. The fewer hops, the faster the scan.
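For example, you can check the hop count by running the following from the scanning computer (fileserver is a placeholder for the name of your file server); each numbered line of output after the header is one hop, so a short listing means the share is close:

tracert fileserver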
Additionally, I like to provide an estimate of how long a scan will take by doing the following:
1. Identify some folders to scan as a pilot scan (try to include at least 500-1000 files).
2. Configure offscan.ini to scan the pilot folders.
3. Determine the number of files in the pilot folder list.
4. Execute the pilot scan.
5. Determine the duration of the pilot scan.
6. Determine the number of files in a production scan.
7. Configure offscan.ini to scan the production folders.
8. Estimate the amount of time the production scan will take based on the data from steps 3, 5, and 6 (see the worked example below).
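As a rough illustration of that last step (the numbers here are made up): if the pilot covered 1,000 files and took 30 minutes, a production scan of 50,000 files would be estimated at 30 × (50,000 ÷ 1,000) = 1,500 minutes, or about 25 hours. The relationship is not perfectly linear (file sizes, scan depth, and network distance all matter), so treat the result as a planning figure rather than a guarantee.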
I’ve used robocopy in a batch file to determine the number of files (steps 3 and 6). Below are the contents of such a batch file:
:: List (but do not copy, because of /l) the Office file types under the current directory and append the results to a log on the desktop
robocopy %cd% %cd% /xj /w:5 /r:2 /s /ndl /l /if *.xls /if *.xlt /if *.xla /if *.xlc /if *.xlm /if *.ppt /if *.pot /if *.pps /if *.ppa /if *.doc /if *.dot /if *.wiz >> "%userprofile%\Desktop\DocumentTotals.log"
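To pull the file count out of that log, one option is to read the summary block that robocopy prints at the end of each run; for example (assuming the same log path as above):

findstr /c:"Files :" "%userprofile%\Desktop\DocumentTotals.log"

The first number on the matched line is the total number of files that met the filters. Because the batch file appends (>>) to the log, you will see one such line per run.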
This approach helps customers “see” how long a scan will take, and gives them a better comfort level and the ability to plan…or to scale out and use more than one scanning computer.
The above steps can be performed from computers that are both close to and far from the file share to demonstrate the difference the number of hops makes.
Comments
Anonymous
August 16, 2011
I really think you did a great job by providing this information. I think your readers may appreciate this related blog: leanmachineradio.com/.../so-how-long-do-you-think-it%E2%80%99s-really-going-to-take
Anonymous
November 05, 2012
In the robocopy command above, for the source location I understand you would put the location you want to scan. For the destination location, what would you put? Also, can you confirm that this command doesn't actually copy any files, it just outputs to a log file? Sorry if these are dumb questions but I thought it best to ask :)
Anonymous
November 05, 2012
Don't worry, I answered my own question: robocopy source destination (location where the log file will go), and no actual copies are done. Thanks anyway :)
Anonymous
November 07, 2012
I have been running the above robocopy script, and in Task Manager it shows the CPU and memory of the robocopy.exe process going up and down, but nothing has been updated in hours in the generated file. This is scanning a NAS which is around 10TB in size (total, not just Office files). Is it the case that it is scanning through but hasn't got to more Office files to log yet, or has the robocopy process failed in some way? I don't want to stop it if there is a chance it is still doing what it should be. Can anyone advise? Many thanks.
Anonymous
November 07, 2012
Again, I have answered my own question. As robocopy has to scan through all files but only logs the Office files, if there are a large number of other formats it may be a while before more logging is performed. By being brave and leaving it to do its thing, it has continued to log. Thanks.