Improving Your Image: Sector-Based, File-Based, and Sysprep - What Makes the Most Sense? Part 2: The Pros and Cons.

Welcome to blog number two in the series, where we’ll look at file-based images versus sector-based images. We already did the tools primer and defined the terms in the last blog post. Now we actually get some real work done by comparing the “tried-and-true” approach of sector-based images that most of us (including me) have grown up with against the relatively newer approach of composing installations with “builds” at deploy time using file-based “core images.” (I defined the terms in quotes in the first blog of the series, in case you are wondering why I used quotes.) Be warned, this blog doesn’t have any pictures or pretty screenshots.

I want to be true to real feedback and experiences here. The definition of ImageX in the TechNet Library kind of does that, but it doesn’t really highlight the benefits of sector-based images. This is a pros and cons article, so I’ll start with the Microsoft-defined benefits (or “pros”) of the file-based imaging tool, ImageX, then move on to sector-based images. The “cons” are really described in the paragraphs as I explain the “pros”; I’ll save a nice table comparing everything for the next and last post in the series.

Benefits of ImageX

The limitations of sector-based imaging led Microsoft® to develop the ImageX tool and the accompanying Windows image (.wim) file format. You can use ImageX to create an image, to modify the image without extracting and recreating the image, and to deploy the image to your environment. Because ImageX works at the file level, it provides the following capabilities:

  • More flexibility and control over the deployed media.
  • Rapid extraction of images to decrease setup time.
  • Reduced image size due to "single instancing", which means that the file data is stored separately from the path information. This enables files that exist in multiple paths or in multiple images to be stored once and shared across the images.
  • Non-destructive image application. The ImageX tool does not perform an all-inclusive overwrite of the contents of your drive. You can selectively add and remove information.
  • The ability to work across any platform supported by Windows.
  • Two different compression algorithms, Fast and Maximum, to reduce your image size further.
  • The ability to treat an image file like a directory. For example, you can add, copy, paste, and delete files from a file management tool, such as Windows Explorer.

Many OEMs and corporations need to deploy and install Windows as rapidly as possible, including all relevant updates, applications, and settings. Reduced deployment and installation time lower manufacturing costs for OEMs, and can decrease cost and scheduling risks for corporate deployments.

Is all of this really so one-sided though? Let’s look at each of the bullets individually and see what a typical naysayer would respond with.

  • More flexibility and control over the deployed media.  
    • I actually agree with this one, but… you do need some expertise in creating the deployed media and figuring out things like how to inject drivers at deploy time to make sure the hardware you’re targeting installs as desired. The truth is that if you haven’t been sysprepping and are creating an image per hardware class or HAL, then ImageX will build an image that installs on like hardware or the same HAL, just as the sector-based tools you might be using do. The problem with that is you can’t guarantee those images will move from device to device. The file output by ImageX is also serviceable offline, so you can mount it much like an ISO file and start adding or removing things from it. The Windows 7 image files even have providers to enumerate drivers, packages (updates, hotfixes and language packs) and features, so you don’t need to do things like mount registry hives, add files and change registry values by hand; offline servicing does that work as part of the process.
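
      As a rough sketch of what that offline servicing looks like (the paths and driver folder below are placeholders I made up for illustration), the DISM tool included with Windows 7 and the Windows AIK can mount a WIM, inject drivers and report what’s inside without ever booting the image:

        rem Mount the first image in the WIM to an empty folder
        dism /Mount-Wim /WimFile:D:\images\install.wim /Index:1 /MountDir:C:\mount

        rem Inject a folder of drivers, then enumerate drivers, packages and features
        dism /Image:C:\mount /Add-Driver /Driver:D:\drivers /Recurse
        dism /Image:C:\mount /Get-Drivers
        dism /Image:C:\mount /Get-Packages
        dism /Image:C:\mount /Get-Features

        rem Commit the changes back into the WIM
        dism /Unmount-Wim /MountDir:C:\mount /Commit
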
  • Rapid extraction of images to decrease setup time.
    • This isn’t really a fair comparison with the sector-based tools. I’ve seen sector-based images come down and apply quite quickly. Where the time savings really show up is when you compare a WIM-based setup or apply against an unattended setup from the good ol’ “I386” folder in Windows XP source media. Even though Windows XP Pro x86 source media is a paltry 600MB or so compared to about 2.25GB for Windows 7 Enterprise media, Windows 7 will usually install faster; you wouldn’t think that was possible with more than three times the file size, but try it. So if a retail image from the Windows 7 DVD takes about 20 minutes to install, Windows XP might take 25 minutes on the same hardware, but I’ve seen sector-based images lay down in 15 minutes or less with a sysprep specialization pass included. Speed is the key advantage of the sector-based image, so I’ll give this bullet to the sector-based imaging side.
  • Reduced image size due to “single instancing”, which means that the file data is stored separately from the path information. This enables files that exist in multiple paths or in multiple images to be stored once and shared across the images.  
    • That’s a mouthful, isn’t it? I don’t see this as a real advantage unless you are appending image WIM files to a master WIM file. Windows 7 Ultimate media has up to five images in the same WIM file, and we can select the SKU we install based on the image index number we apply with ImageX. That helps in cases where you may be using multiple images and appending them together (it might make sense in multi-language environments or when you have a mix of Professional and Enterprise), but if you have a standard set of apps for every user, they all get an x86 image and they all have drivers we manage, then there really isn’t a huge need to append image files. We’ll call this benefit a wash, then, for most customers.
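
      To make the index and append story concrete, here is roughly what it looks like with ImageX (the file names, index numbers and image names are made up for illustration):

        rem List the images (and their index numbers) stored in a single WIM
        imagex /info D:\images\install.wim

        rem Apply image index 4 from that WIM to the C: volume
        imagex /apply D:\images\install.wim 4 C:\

        rem Export an image from one WIM into a master WIM so common files are single-instanced
        imagex /export D:\images\install.wim 1 D:\images\master.wim "Windows 7 PROFESSIONAL"
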
  • Non-destructive image application. The ImageX tool does not perform an all-inclusive overwrite of the contents of your drive. You can selectively add and remove information.  
    • Now we’re talking! If I apply a sector-based image to a drive with data, that data is gone. “Yeah, but I like to format my drive anyway to ensure a ‘clean install’.” That’s fine, but if you need user data from the drive you’re applying the sector-based image to, be prepared to deal with the laws of physics and bus and network speeds to pull that data off the drive and then eventually put it back on. That means refresh and break/fix rebuild scenarios just got several hours longer in many cases. Now if I use a WIM file, I can apply it to a drive with data on it – yes, it’s true. That means I keep the user data on the same volume I am applying the WIM image to – or in other words save those hours of time it took to move the data off. “But won’t the drive be fragmented when we’re done?” Maybe, it will defrag itself once a week and we can force one if we really want to when we’re done. 

      If it helps, think of the outcome you might get if you used Robocopy to apply the entire set of operating system image files with their attributes to a hard drive with files and folders already present. So why doesn’t Robocopy by itself work, if I’m willing to give up the compression and single-instancing that ImageX and WIMs give me? I was on a great email thread a few months ago with two personal heroes of mine, Michael Niehaus and Mark Russinovich, trying to figure out the best way to move files from a larger VHD container to a smaller VHD container. Robocopy will keep file attributes and almost do the job, but it doesn’t handle short filenames (aka 8.3 filenames) in all cases. So let’s say I have a “jeremy~1.xml” and a “jeremy~2.xml” file; it’s possible that Robocopy will reverse those, turning “jeremy~1.xml” into “jeremy~2.xml” and vice versa during the copy. NTFS has had native support for long file names for many years, but there are still many applications that rely on short filenames, and we don’t want to break them.
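
      If you want to try the comparison yourself, both steps are one-liners. These are sketches with made-up paths, not commands from the post: the ImageX apply leaves existing files on the target volume in place, while the Robocopy copy preserves attributes and ACLs but not short 8.3 names.

        rem Apply image index 1 to C: -- existing files and folders on C: are left in place
        imagex /apply D:\images\win7.wim 1 C:\

        rem The Robocopy thought experiment: copy an expanded OS tree with attributes and ACLs
        rem (this does not preserve short 8.3 filenames, which is the catch described above)
        robocopy D:\expanded-image C:\ /E /COPYALL /DCOPY:T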

  • The ability to work across any platform supported by Windows.  
    • A sector-based image isn’t really limited here either. Sure, the Boot Configuration Data (BCD) store may need to be rebuilt and hooks into the Windows Recovery Environment re-established, but that is generally true of both WIM file-based image installs and sector-based image installs. I’ll call this benefit slightly favorable to a file-based image though, just because we can mount images across Windows client and server platforms and versions and peruse them using Windows Explorer or CMD.
  • Two different compression algorithms, Fast and Maximum, to reduce your image size further.  
    • I’ve never put much thought into this feature and tend to select “Fast”; in somewhat thin image environments, I don’t see a huge difference in size. The compression in general is a great thing and helps keep the size of images and image repositories in check. Advantage goes to file-based images based on compression alone (and single-instancing if used), but I’m not seeing a lot of people overly excited about two different levels of compression.
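
      For reference, the compression level is just a switch on the capture command. The path and image name here are illustrative:

        rem Capture the C: volume into a WIM; /compress accepts fast, maximum or none
        imagex /capture C: D:\images\win7-base.wim "Win7 Base" /compress maximum
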
  • The ability to treat an image file like a directory. For example, you can add, copy, paste, and delete files from a file management tool, such as Windows Explorer.  
    • This is pretty cool. The image itself isn’t a black box file, so you can mount it and add or remove things in it. I’ve already written about this at length. To be fair, there are explorer tools for the common sector-based images that let you view image contents and sometimes extract individual files or folders. These tend to support editing only for the FAT and FAT32 file systems, which is fine until you need to add something to an NTFS image. Most images from Windows XP forward are going to be NTFS, so not being able to add things to an NTFS image offline is a major limitation. It basically means you need to build the machine again, configure it and recapture it. Not necessarily a fun or fast operation.
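
      Mounting a WIM for editing is just a couple of commands (the paths are placeholders again); once it’s mounted, C:\mount behaves like any other folder in Windows Explorer:

        rem Mount image 1 read/write, make changes in Explorer, then commit them on unmount
        imagex /mountrw D:\images\win7.wim 1 C:\mount
        imagex /unmount /commit C:\mount
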
  • What about the benefits of a sector-based image?  
    • I’m not writing this blog to say that sector-based imaging is always bad. Like I always say, a good IT pro will answer just about every question with “It depends…” and this is no different. The thing I have been thinking about lately is how closely the advantages of sector-based images mirror those of thick images. Let’s explore the advantages of sector-based images.
  • Sector-based images apply faster.  
    • I won’t disagree here, but many of the same factors apply to the sector-based image as to the file-based image. How good is the network connection to the imaging server and the NIC? How big is the image? Is there data to migrate and restore before and after the machine is (re)imaged? On bare metal installs, a sector-based image (or even a thick file-based image) is often going to be faster than a thin file-based image or a build that adds applications and other configurations outside of the core image. This makes a lot of sense for manufacturing, for bare metal installs where every machine needs the identical image, and for companies where few languages are spoken and application requirements are extremely static and consistent across all users. The biggest problem is that you can’t really service a sector-based image offline, so rebuilds and recaptures may be much more common than with thin or thick file-based images. That said, I’ll give the sector-based (or thick file-based) image the advantage in build speed.
  • Sector-based images are guaranteed to work when moving from like machine to like machine.  
    • The sector copying process doesn’t really care about things like files, their attributes or whether they are corrupted. ImageX and file copying mechanisms tend to be affected by irregularities in the files being imaged. That is good and bad: it ensures that captured and applied images have files in good states, but it won’t clone a system where you might be willing to live with files in a bad state. Backup and restore mechanisms need this quality, and things like the Volume Shadow Copy Service in Windows rely on it as well to capture a sector-level view of the machine, regardless of the state of the files. This does provide a sort of “guarantee” that the image will work if it is applied to the same hardware it was captured on.
  • Sector-based images are good for cloning hard drives when replacing or moving to larger/faster hard drives.
    • I won’t argue that. ImageX was designed and scoped for imaging, not as a complete PC backup and restore mechanism. It assumes things like: the PC to be captured is in a good state, isn’t domain-joined and has been prepared for cloning with Sysprep. A sector-based tool, on the other hand, doesn’t really care whether those preparation steps have been taken, so you can capture the image of any hard drive, apply it to another, resize partitions accordingly and replace or upgrade your hard drive (or keep it backed up). “Backup and Restore” in Windows 7 does a really good job of system and directory-level backups on a machine-by-machine basis; it actually creates a sector-based (and extremely thick) VHD image using the Volume Shadow Copy Service.
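
      For context, the preparation ImageX expects before a capture is typically a Sysprep generalize pass, along these lines (the answer file name is just an example, and Sysprep works without one):

        rem Generalize the reference machine and shut it down so it is safe to capture
        C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown /unattend:C:\Windows\System32\Sysprep\unattend.xml
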
  • Sector-based imaging tools have multicast and file-based tools do not.  
    • Wrong. I threw that one in because, like image application speed, it’s something people often wrongly assume. File-based images can be multicast, but your deployment server needs to be running Windows Server 2008 or Windows Server 2008 R2. Multicast capability was a huge differentiator for third-party tools for a long time, and I still hear people say it can’t be done with WIM files. Not true. It can, and the Windows Servers that can do it often already exist in your environment (you’ve already paid for them and can use them), or they tend to be cheaper than third-party image deployment servers with multicast capabilities if you haven’t already acquired them.
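
      As a sketch of how that looks on a Windows Server 2008 or 2008 R2 box running Windows Deployment Services (the image and transmission names here are illustrative; check the WDSUTIL documentation for the full parameter list):

        rem Create an Auto-Cast multicast transmission for an install image in WDS
        WDSUTIL /New-MulticastTransmission /FriendlyName:"Win7 Enterprise Multicast" /Image:"Windows 7 ENTERPRISE" /ImageType:Install /TransmissionType:AutoCast
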
  • Closing thoughts for Part 2 of ‘Improving Your Image…’  
    • I’ve written a bit more than I had planned for part 2 of the series, but I think it is valuable and I hope you find it unbiased. You should start to see the cases where file-based and sector-based images make the most sense. Part 3 will be about build automation, and I’ll build a table with recommendations for the scenarios where sector-based, thin file-based and thick file-based images make the most sense. You can start to extract that from the points above and draw your own conclusions, but we haven’t really hit on the fun part yet: automation and task sequences. That stuff really changes the game, and even if you are a good scripter and insist on using sector-based images for everything, there is a ton of value you can get from a task sequencer, even when it is only used for post-OS-install, per-user customization. Stay tuned, I promise it will all be clear after part 3.

Goodbye for now,
Jeremy Chapman
Windows Deployment