SharePoint 2013 - Usage Analytics (the story)

In every new SharePoint version there is considerable investment put into optimizing the way we aggregate and process usage information. By usage information I mean the requests made by users, and the patterns and trends that can be extrapolated from these requests.

SharePoint 2013 was no exception: although we kept the same mechanisms for gathering information that we had in 2010 (SPUsageProviders / SPUsageDefinitions / SPUsageReceivers), the processing of the gathered data moved to the Usage Analytics component of the Search Service Application.

Given that in the last couple of months we have had an increasing number of cases dealing with Usage Analytics, I want to share with you a broad overview of how we collect and process the data for the usage analytics reports.

Data collection

In SharePoint 2010 we introduced the concept of SPUsageProviders and SPUsageDefinitions. I will not go into much detail, but you can see these as the definition of a data set that can be programmatically logged to the usage database (WSS_Usage) from within your SharePoint code.

You can see all the definitions currently installed in your farm by running:

 Get-SPUsageDefinition

Each definition can have the following properties (and many more):

 - Columns - defines the columns of the data set that we log.
 - Table - the name of the partitions you will see in the WSS_Usage database.
 - RetentionPeriod - how long we keep the data in the WSS_Usage database.
 - UsageReceivers - discussed a bit later (a quick inspection example follows this list).
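
As a quick way to inspect these for yourself, you can dump a single definition; a minimal sketch (the exact property names in the output can vary slightly between builds):

 # Dump every property of one definition, including the table name, retention and receivers.
 $def = Get-SPUsageDefinition | where {$_.Name -eq "Page Requests"}
 $def | Format-List *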

Each time you add a new usage definition, we create a set of 32 partitions (with the name specified in the Table property of the definition) in the WSS_Usage database, plus a view named after the definition that basically accumulates the data from all the partitions.
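
If you want to see the partitions and the view for yourself, you can query the usage database directly. A hedged sketch, assuming the default WSS_Usage database name, a hypothetical SQL01 instance, and the SQL Server PowerShell module for Invoke-Sqlcmd; the TableName property is assumed to hold the value of the Table property described above:

 # List the partition tables and the cumulative view backing one definition.
 $def = Get-SPUsageDefinition | where {$_.Name -eq "Page Requests"}
 Invoke-Sqlcmd -ServerInstance "SQL01" -Database "WSS_Usage" -Query "SELECT name, type_desc FROM sys.objects WHERE name LIKE '$($def.TableName)%'"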

Programmatically, in your custom code (or in the Microsoft code), we can create new entries for each definition and add them to an internal store (the first 1000 entries in memory; after that they get moved to .usage files on disk).

Every 5 minutes a timer job (Microsoft SharePoint Foundation Usage Data Import) will get all the definitions and call the ImportEntries method of each one. If this method was not overridden, it will move the data from the in-memory store or .usage files to the corresponding tables in the WSS_Usage database.
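
While testing, you do not have to wait for the schedule; a minimal sketch, assuming the job's internal name is still job-usage-log-file-import:

 # Trigger an immediate usage data import instead of waiting up to 5 minutes.
 $job = Get-SPTimerJob | where {$_.Name -eq "job-usage-log-file-import"}
 $job.RunNow()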

When this is done, it will execute the SPUsageReceivers that are attached to the usage definitions. You can look at the usage receivers as event handlers triggered after each import of the usage data. You can develop your own receivers to run tasks like data analysis, data archival, and so on.

For the OOB usage analytics we use two different usage definitions: Page Requests and Usage Analytics. For Page Requests we save all the logged data in the WSS_Usage database; for the Usage Analytics definition the ImportEntries method was overridden, so we don’t log anything to the WSS_Usage database.

When we provision the first Search Service Application we will automatically also create an Analytics Component. During the provisioning of the SSA we also add a usage receiver to both the Page Requests and the Usage Analytics definitions. These receivers have the role of "transporting" the data from the usage definition store to the analytics component of the SSA. This happens by means of a web service method called SendAnalyticsUsageEvents. The web service will receive all the usage entries logged and write them to the EventStore (default location on the analytics component: C:\Program Files\Microsoft Office Servers\15.0\Data\Office Server\Analytics_GUID\EventStore, where the GUID is the ApplicationName property of the SSA).
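
To locate the analytics component and the GUID used in the event store path, something like the following should work; a sketch, assuming the components follow the default AnalyticsProcessingComponentN naming:

 $ssa = Get-SPEnterpriseSearchServiceApplication
 # List the analytics processing components in the active search topology.
 Get-SPEnterpriseSearchComponent -SearchTopology $ssa.ActiveTopology | where {$_.Name -like "AnalyticsProcessingComponent*"}
 # The GUID in the Analytics_GUID folder name comes from here.
 $ssa.ApplicationName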

One of the more common problems we have noticed is that the usage receivers are no longer registered on the usage definitions. You can check for this by using:

 $ad = Get-SPUsageDefinition | where {$_.Name -like "Analytics*"}
 $pd = Get-SPUsageDefinition | where {$_.Name -like "Page Requests"}
 $ad.Receivers
 $pd.Receivers

If the receivers are not displayed you can register them by using:

 $ad = Get-SPUsageDefinition | where {$_.Name -like "Analytics*"}
 $ad.Receivers.Add("Microsoft.Office.Server.Search.Applications, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c", "Microsoft.Office.Server.Search.Analytics.Internal.AnalyticsCustomRequestUsageReceiver")
 $ad.EnableReceivers = $true
 $ad.Update()

 $pd = Get-SPUsageDefinition | where {$_.Name -like "Page Requests"}
 $pd.Receivers.Add("Microsoft.Office.Server.Search.Applications, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c", "Microsoft.Office.Server.Search.Analytics.Internal.ViewRequestUsageReceiver")
 $pd.EnableReceivers = $true
 $pd.Update()

This pretty much ends the data gathering. The short version of the process: we use the Page Requests and Usage Analytics usage definitions to gather the data, and the attached usage receivers send the data to the SSA by means of the SendAnalyticsUsageEvents web service method, which stores it in the event store to await processing.

Data processing

The processing is done by a component called the analytics engine. This will run once every day, read the data from the event store, and then perform a number of computations on this data to compile the needed reports. The whole process is opaque and really complex: besides reading the event store, we also read data from the Search Analytics database and from the search index. The computations themselves can be fairly resource intensive and take a long time to finish. We recommend keeping a good amount of free space on the drive where the event store is located, as during processing we create temporary files that can grow quite large.

There are not many ways of interacting with the analysis engine. I would recommend using the following:

 $tj = Get-SPTimerJob -Type Microsoft.Office.Server.Search.Analytics.UsageAnalyticsJobDefinition
 $tj.GetAnalysisConfiguration()
 $tj.GetAnalysisInfo()
 

This will give some basic information: the running schedule, successful runs, the last errors, and so on. You can also start an off-schedule run of the analysis engine:

 $tj = Get-SPTimerJob -Type Microsoft.Office.Server.Search.Analytics.UsageAnalyticsJobDefinition
 $tj.DisableTimerjobSchedule()
 $tj.StartAnalysis()
 

And when the run finishes, don’t forget to re-enable the schedule:

 $tj.EnableTimerjobSchedule()

There is not much troubleshooting that can be done at the analytics engine level. If the correct data is in the event store, the reports should reflect it.
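
A quick sanity check on the input side is to confirm that events are actually landing in the event store; a sketch, assuming the default install path mentioned above:

 $ssa = Get-SPEnterpriseSearchServiceApplication
 $store = "C:\Program Files\Microsoft Office Servers\15.0\Data\Office Server\Analytics_$($ssa.ApplicationName)\EventStore"
 # Show the most recently written event files; an empty or stale list points back at the receivers.
 Get-ChildItem $store -Recurse -File | sort LastWriteTime -Descending | select FullName, LastWriteTime -First 10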

After the analysis engine finishes processing the data, it will write the results to the Search_Service_Application_AnalyticsReportingStoreDB database, into the AnalyticsItemData tables.
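
You can also read the processed numbers back through the object model rather than querying the database. A hedged sketch, assuming the GetRollupAnalyticsItemData(eventType, tenantId, siteId, scopeId) signature, that event type 1 is still Views in the default configuration, and http://intranet as a placeholder URL:

 $ssa = Get-SPEnterpriseSearchServiceApplication
 $site = Get-SPSite "http://intranet"   # placeholder site collection URL
 # Event type 1 = Views; returns the counts computed by the last analysis run.
 $ssa.GetRollupAnalyticsItemData(1, [Guid]::Empty, $site.ID, [Guid]::Empty)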

Final notes

The whole platform behind the usage analytics reports is extensible. The out-of-the-box code is not meant to cover all usage reporting scenarios. For more specialized reporting needs I recommend developing a custom solution that includes SPUsageReceivers and even custom usage definitions.

Also, I would like to repeat the warning that changing the settings of the different analyses can cause the engine to fail. Do not change settings you don't understand.

Comments

  • Anonymous
    June 10, 2014
    Hi Daniel, nice article ! There are a couple of nice things you can see with these PS snapins. I'm facing the following issue: I have added two additional Analytics DBs (via Add-SPServerScaleOutDatabase) and removed the original one. Since then, analytics reports do not work anymore. What I can see in the new DBs is that the tenant configuration didn't get copied over (table dbo.TenantSettings is empty). Do you have any idea how to force SharePoint into re-initializing this configuration ? Cheers, Rares Marinescu
