

Migrating Content to SharePoint Online via PowerShell

This blog comes from guest writer Mike Barrett

Migrating data to SharePoint Online is a challenging process, and without a good understanding of how it works, migration results are difficult to predict. In this blog, I’ll discuss high-level details about migrating to SharePoint Online using PowerShell cmdlets that leverage the Migration API. It is possible to migrate data over the Client-Side Object Model (CSOM), but for performance and throttling reasons, the Migration API is preferred.
 
At a high level, each migration type has the same steps.

  1. Create the initial package. For SharePoint migrations this is a ‘prime’ package made with either Export-SPWeb or custom code. For file system data this package is created with New-SPOMigrationPackage.
  2. Convert the package so it is compatible with SPO (ConvertTo-SPOMigrationTargetedPackage).
  3. Upload the package to Azure Storage and start a migration job (Invoke-SPOMigrationEncryptUploadSubmit or Submit-SPOMigrationJob).
  4. Behind the scenes, the job started in step 3 scans the data and begins importing it into SharePoint Online.
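
To make these steps concrete before diving into the details, here is a bare-bones sketch of the file system flavor of the flow. The paths, URLs, and library name below are placeholders only; the full, parameterized scripts appear later in this post.

# Bare-bones outline only; placeholder paths and URLs, no error handling.
$cred = Get-Credential

# Step 1: create the initial package from a file share
New-SPOMigrationPackage -SourceFilesPath "C:\DataToMigrate" -OutputPackagePath "C:\Temp\InitialPackage" -TargetWebUrl "https://contoso.sharepoint.com/sites/target" -TargetDocumentLibraryPath "Documents"

# Step 2: convert the package so it is compatible with SPO
ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath "C:\DataToMigrate" -SourcePackagePath "C:\Temp\InitialPackage" -OutputPackagePath "C:\Temp\TargetedPackage" -Credential $cred -TargetWebUrl "https://contoso.sharepoint.com/sites/target" -TargetDocumentLibraryPath "Documents"

# Step 3: encrypt, upload to Azure Storage, and submit the migration job
$job = Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath "C:\DataToMigrate" -SourcePackagePath "C:\Temp\TargetedPackage" -Credentials $cred -TargetWebUrl "https://contoso.sharepoint.com/sites/target"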

The following sections of this article break down the details and differences of the migration types as well as provide some recommendations on improving migration performance.

  • Migrating File System Data
  • Migrating SharePoint Data
  • Improving SharePoint Data Migration Speeds

Migrating File System Data

File system data migration essentially has 5 steps.

  1. Ensure the destination site and library are created prior to the migration.
  2. Create the initial package (New-SPOMigrationPackage).
    1. This requires an output folder, specified by the OutputPackagePath parameter, where the package is saved.
  3. Convert the package so it is compatible with SPO migrations (ConvertTo-SPOMigrationTargetedPackage).
    1. This consumes the package created in step 2 and outputs data to a different output path, also named OutputPackagePath. Even though the parameter name is the same as in step 2, it is expecting a different path. In other words, the output of New-SPOMigrationPackage is the input of ConvertTo-SPOMigrationTargetedPackage, and ConvertTo-SPOMigrationTargetedPackage has its own output.
  4. Encrypt and upload the data to Azure Storage, and start the migration job (Invoke-SPOMigrationEncryptUploadSubmit).
    1. Similar to the previous commands, Invoke-SPOMigrationEncryptUploadSubmit uses the output of the previous command, ConvertTo-SPOMigrationTargetedPackage. The output of Invoke-SPOMigrationEncryptUploadSubmit is encrypted and uploaded automatically to Azure Storage.
  5. Optional: Monitor the migration job (Get-SPOMigrationJobProgress).

More information: Upload on-premises content to SharePoint Online

Here is a sample script to migrate file share data and monitor the job progress:

Note: This script requires that the destination library be created in advance, and that you provide two separate temp folders that are not part of the content being migrated.

[CmdletBinding(DefaultParameterSetName="UseCustomTermSet")]
Param
(
    [Parameter(Mandatory=$true)]
    $sourceFiles,

    [Parameter(Mandatory=$true)]
    $tempDir1,

    [Parameter(Mandatory=$true)]
    $tempDir2,

    [Parameter(Mandatory=$true)]
    $targetWeb,

    [Parameter(Mandatory=$true)]
    $targetDocLib
)

$cred = Get-Credential

# Create the initial package from the source file share
New-SPOMigrationPackage -SourceFilesPath $sourceFiles -OutputPackagePath $tempDir1 -TargetWebUrl $targetWeb -TargetDocumentLibraryPath $targetDocLib -IgnoreHidden -ReplaceInvalidCharacters

# Convert the initial package into an SPO-compatible (targeted) package
$finalPackages = ConvertTo-SPOMigrationTargetedPackage -ParallelImport -SourceFilesPath $sourceFiles -SourcePackagePath $tempDir1 -OutputPackagePath $tempDir2 -Credential $cred -TargetWebUrl $targetWeb -TargetDocumentLibraryPath $targetDocLib

# Encrypt the package, upload it to Azure Storage, and start the migration job
$job = Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $sourceFiles -SourcePackagePath $tempDir2 -Credentials $cred -TargetWebUrl $targetWeb

# This next line is optional but will show the progress of the job after the data is uploaded.
$status = Get-SPOMigrationJobProgress -TargetWebUrl $targetWeb -AzureQueueUri $job.ReportingQueueUri -Credentials $cred -JobIds $job.JobId.Guid -EncryptionParameters $job.Encryption
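
For reference, if the script above were saved as, say, Migrate-FileShare.ps1 (a hypothetical file name), an invocation might look like the following; the share path, temp folders, site URL, and library name are placeholder values, not values from a real environment.

# Hypothetical example values; substitute your own paths, site URL, and library name.
.\Migrate-FileShare.ps1 -sourceFiles "\\fileserver\ShareToMigrate" -tempDir1 "C:\MigrationTemp\Initial" -tempDir2 "C:\MigrationTemp\Targeted" -targetWeb "https://contoso.sharepoint.com/sites/target" -targetDocLib "Documents"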

Migrating SharePoint Data

As of the writing of this blog (September 25, 2017), the SharePoint Online cmdlets only support list- and library-level migrations. Migrating webs or sites is not something our cmdlets can do today.

  • It is best to begin by scanning your environment for migration blockers using the SharePoint Migration Assessment Tool (SMAT).
  • For information on the output of SMAT, see Scan Reports Roadmap.
  • If you don’t want to migrate using PowerShell, we have a new migration tool you can use; see Introducing the SharePoint Migration Tool.
  • Regarding supportability, the migration API supports migrating content from SharePoint 2013 to SharePoint Online. Using the migration API to migrate content from MOSS 2007 or SharePoint 2010 is not supported. In addition, the migration API does not facilitate SPO to SPO site migrations.

Migrating Lists

  1. The first step is to export the list using Export-SPWeb from the SharePoint server. The export should use the -NoFileCompression and -ItemUrl parameters to ensure the migration API cmdlets can properly read the files. To improve performance of the migration, only export the most recent version by using -IncludeVersions 1.

    For example, if your list is called Expenses and is located at https://SP in your on-premises farm, the export command could look like this:

    Export-SPWeb -Identity https://SP -ItemUrl '/lists/expenses' -Path c:\exportfolder -NoFileCompression -IncludeVersions 1

  2. Now that the package is created, go to the SharePoint site where you want to move the list and create an empty out-of-the-box (OOB) custom list. This step can be scripted if necessary; a sketch follows this list.

  3. Set up a machine to migrate the data to SharePoint Online by installing the SharePoint Online PowerShell Module.

  4. Once the PowerShell module is installed, you are ready to convert the package from a SharePoint on-premises ‘prime’ package to a SharePoint Online package. This is done using the ConvertTo-SPOMigrationTargetedPackage cmdlet. This cmdlet is used to convert list exports and library exports, as well as packages created from file system data.

  5. After the package data is converted into a valid SharePoint Online package, we use Invoke-SPOMigrationEncryptUploadSubmit to encrypt the data, upload it to Azure Storage, and start the migration job within SharePoint (TechNet doesn’t yet have an article on this cmdlet, but it is referenced in the article Upload on-premises content to SharePoint Online).

  6. Once the job has started, all that is needed is to wait for it to complete. If you want to check the status, you can use Get-SPOMigrationJobProgress.
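
As noted in step 2, pre-creating the empty destination list can be scripted. One possible approach, assuming the SharePoint PnP PowerShell module is available in your environment (this module, and the site URL and list name below, are assumptions for illustration rather than part of the migration cmdlets discussed in this post):

# Assumes the SharePoint PnP PowerShell module; the site URL and list name are placeholders.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/target" -Credentials (Get-Credential)

# GenericList is the template behind an out-of-the-box custom list
New-PnPList -Title "Expenses" -Template GenericList -Url "lists/expenses"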

Here is a sample script that uses each of these commands. This script expects $sourceFiles to be set to the path where the uncompressed on-premises export package lives. The $tempDir1 parameter is expected to be an empty folder that is NOT under the same path as $sourceFiles.

 [CmdletBinding(DefaultParameterSetName="UseCustomTermSet")]
Param
(
    [Parameter(Mandatory=$true)]
    $sourceFiles,
 
    [Parameter(Mandatory=$true)]
    $tempDir1,
 
    [Parameter(Mandatory=$true)]
    $targetWeb,
 
    [Parameter(Mandatory=$true)]
    $targetListRelativePath 
)
 
$cred = Get-Credential
 
$finalPackages = ConvertTo-SPOMigrationTargetedPackage -ParallelImport -SourceFilesPath $sourceFiles -SourcePackagePath $sourceFiles -OutputPackagePath $tempDir1 -Credential $cred -TargetWebUrl $targetWeb -TargetListPath $targetListRelativePath
 
$job = Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $sourceFiles -SourcePackagePath $tempDir1 -Credentials $cred -TargetWebUrl $targetWeb 
 
#This next line will produce some UI to show the progress of the job after it is uploaded. It is optional and can be removed if monitoring isn't what you want.
Get-SPOMigrationJobProgress -TargetWebUrl $targetWeb -AzureQueueUri $job.ReportingQueueUri -Credentials $cred -JobIds $job.JobId.Guid -EncryptionParameters $job.Encryption

Migrating Libraries

As of now, the recommended method for migrating SharePoint document libraries is Microsoft’s new SharePoint Migration Tool. The article How to use the SharePoint Migration Tool covers the use of this tool.

Improving SharePoint Data Migration Speeds

The most common issue I have seen with SharePoint migrations is dissatisfaction with the migration speed. There are many factors to consider regarding migration speeds. The article SharePoint Online and OneDrive Migration Speed discusses the factors that impact migration performance.
There are a few additional things to consider.

  1. As the article referenced above states, it is critical that you triage your data prior to migration. Here are three things to keep in mind when triaging your data.
    1. Large lists and libraries (greater than 5k items/files) will move much more slowly than small or medium lists. The larger the list, the more slowly it will migrate. If a large list or library must be migrated, consider migrating that content last. Migrating “larger” content last helps ensure these lists and libraries do not derail project timelines.
    2. Wide lists and libraries (many columns). Like large lists, lists with many columns migrate more slowly, since there is more data to migrate per item. Consider migrating wide lists and libraries at the end of the project as well.
    3. Lists and libraries with versions enabled. When you migrate a list or library with versions enabled, each item version must be migrated separately. This means that an item with 100 versions will take as long to migrate as 100 items with a single version. Consider exporting the most recent item version, or the 5 most recent item versions, to improve migration performance.
  2. Outside of data rationalization/triaging, using simultaneous migrations is the number one way to improve overall throughput. When migrating multiple packages at once, you can avoid throttling by using a different account for each simultaneous migration attempt (see the sketch after this list). A common example would be 16 accounts migrating 16 packages at once.
  3. Another common bottleneck is local network bandwidth. One good workaround is to use Azure VMs to migrate the export packages.
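
To illustrate point 2, the sketch below submits several already-converted packages as background jobs, each under its own account. The $packages structure (objects with SourcePath, PackagePath, and Credential properties) is a hypothetical arrangement for illustration; it is not produced by the cmdlets above, so adapt it to however you track your packages and accounts.

# Hedged sketch: submit already-converted packages in parallel, one account per package.
# $packages is assumed to be an array of objects with SourcePath, PackagePath, and Credential properties.
$targetWeb = "https://contoso.sharepoint.com/sites/target"

$jobs = foreach ($p in $packages) {
    Start-Job -ScriptBlock {
        param($sourcePath, $packagePath, $cred, $webUrl)
        Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
        Invoke-SPOMigrationEncryptUploadSubmit -SourceFilesPath $sourcePath -SourcePackagePath $packagePath -Credentials $cred -TargetWebUrl $webUrl
    } -ArgumentList $p.SourcePath, $p.PackagePath, $p.Credential, $targetWeb
}

# Wait for all submissions to complete and collect the returned job descriptors
$submitted = $jobs | Wait-Job | Receive-Job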

Comments

  • Anonymous
    October 19, 2017
    I have noticed that the default location for the migrationjobstatus log file generated by Get-SPOMigrationJobProgress is the path that PowerShell is initially opened from. Is there a way to change the log location?
    • Anonymous
      November 09, 2017
      Hi Chuck, I've been doing some testing on this and haven't found a way to redirect this log output. We can suppress the output using the NoLogFile parameter, but this will suppress all log file creation, which is less than ideal.
  • Anonymous
    December 20, 2017
    Any advice for moving data from SharePoint 2010 to SharePoint Online? I came across this article because I've used your API, but I got errors creating the package at New-SPOMigrationPackage. You say it's not supported; would it be possible to upgrade the cmp to be 2013-compatible and then proceed? Thanks
    • Anonymous
      January 11, 2018
      Hi Gary, it's best to upgrade the data to SharePoint 2013 prior to migrating that data using the migration API.
    • Anonymous
      February 13, 2018
      Hi, there is an open-source tool that can move SP2010 lists/libraries to SharePoint Online (and do a whole lot more): https://github.com/GLubomirov/SP_Hauler/releases/tag/1.0 Greetings, George