SharePoint 2013 TIP: how to get unique user metrics from the WFEs
This SharePoint 2013 TIP presents a method for obtaining unique user counts quickly, easily, and for free. The method uses existing IIS features and relies on the capabilities of Microsoft's Log Parser 2.2, which is available for free. It involves just two small files: a batch file you execute in a command window and a configuration file containing the SQL. Running it generates three CSV files: the first two are interim data files; the third is the one you want, containing a listing of dates and unique users per date. Once the script completes, it also writes to the console the total unique users found across all log files analyzed.
I run this script on a monthly basis to update unique user metrics. I've also used it to analyze unique users for an entire year's worth of log files - no configuration changes are needed whether you run the analysis on a month's or a year's worth of logs: just drop the two files into the folder containing all the log files and execute. What's provided here are the basics to help you get started: out of the box, you can start capturing these metrics right away using the method presented here, and once you're up to speed, you can customize it for many other applications.
Solution
- Assumptions: Keep the following basic assumptions in mind when using this method:
- Windows Server 2012
- IIS 8.0
- W3C Log File Format
- One log file per (IIS) site
- Daily Schedule
- Two WFEs (NLB)
- Default log file naming
- IIS Advanced Logging extension not installed.
- Windows NLB
- Crawl service account: spContent
- Farm service account: spFarm
- SharePoint setup user administrator account: spAdmin
- Log files end in ".log".
- Preparation:
- File Naming: By default, in my experience, standard Windows Server 2012 installations of IIS 8 name their log files in the format u_exYYMMDD.log, and they use this same format on every server. Thus, you cannot dump them all into a single folder without performing some renaming ahead of time. I append the WFE number to the files; for example, log files coming from the first WFE in one of my farms are renamed like so: u_exYYMMDD_WFE1.log, where the "_WFE1" suffix is the part added. Check your own log files to see how they are produced: if they follow the same naming approach shown here, then great: you don't need to customize the script. Otherwise, you'll need to review the script and make appropriate changes to the SUBSTR calls consistent with your file naming approach.
- File Consolidation: You will need to copy all IIS log files from all WFEs into a single folder, renaming them slightly so as to differentiate the log files of each server (a batch sketch for this step appears after the file listings below). You may have 2, 5, 10, or more log files for any given day.
- Running the script: if your file naming matches the pattern above, just drop the two files into the folder containing your consolidated log files and execute. Log Parser runs remarkably fast from the command line: I have been able to analyze many GB of log files in 10 seconds or less.
- Customizing: you'll note, when you open up and review the load file, that I filter out farm service accounts and other obviously unhelpful entries (e.g., 401 responses, NULL usernames). You'll need to customize these WHERE clauses for your own farm.
- Files: here are the two files you need. Just copy and paste.
guu.bat
REM Step 1: extract normalized (UserName, Date) rows from all *.log files using the query in guuLoad.txt
logparser -i:IISW3C file:guuLoad.txt -o:CSV -q >guuOutStep1.csv
REM Step 2: reduce to distinct (UserName, Date) pairs
logparser -i:CSV -o:CSV "select UserName,Date from guuOutStep1.csv group by UserName,Date" -q>guuOutStep2.csv
REM Step 3: count unique users per date
logparser -i:CSV -o:CSV "select Date, count(*) as users from guuOutStep2.csv group by Date order by Date" -q>guuOutStep3.csv
REM Finally, write the overall unique user total to the console
logparser -i:CSV "select COUNT(DISTINCT UserName) AS Total-Unique-Users-for-this-Month from guuOutStep2.csv"
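For orientation, guuOutStep3.csv ends up with one row per date, with the date in the MM/DD/YY form assembled by the load file. The rows below are purely illustrative (made-up numbers), included only to show the shape of the output:
Date,users
06/02/14,148
06/03/14,153
06/04/14,139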
guuLoad.txt
SELECT REPLACE_STR(TO_LOWERCASE(REPLACE_STR(cs-username, '0#.w|', '')), 'DOMAIN\', '') AS UserName,
    STRCAT(STRCAT(SUBSTR(EXTRACT_FILENAME(LogFilename), 6, 2), '/'),
    STRCAT(STRCAT(SUBSTR(EXTRACT_FILENAME(LogFilename), 8, 2), '/'),
    SUBSTR(EXTRACT_FILENAME(LogFilename), 4, 2))) AS Date
FROM *.log
WHERE
sc-status<>401 AND
cs-username<>'0#.w|DOMAIN\spcontent' AND
cs-username<>'0#.w|DOMAIN\spadmin' AND
cs-username<>'0#.w|DOMAIN\spfarm' AND
cs-username<>'' AND
cs-username IS NOT NULL AND
cs-username<>'DOMAIN\SPCONTENT' AND
cs-username<>'DOMAIN\SPADMIN'
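As promised under File Consolidation above, here is a minimal batch sketch for the copy-and-rename pass. The script name, the UNC admin-share paths, and the W3SVC1 site ID are assumptions for illustration - adjust them to your own farm, and run it from the folder where you want the consolidated logs to land.
consolidateLogs.bat
@echo off
REM Copy IIS logs from each WFE into the current (analysis) folder,
REM appending the WFE number so same-day files don't collide.
REM The UNC paths and suffixes below are assumptions - adjust to your farm.
for %%f in (\\WFE1\c$\inetpub\logs\LogFiles\W3SVC1\u_ex*.log) do copy "%%f" "%%~nf_WFE1.log"
for %%f in (\\WFE2\c$\inetpub\logs\LogFiles\W3SVC1\u_ex*.log) do copy "%%f" "%%~nf_WFE2.log"
This produces exactly the u_exYYMMDD_WFE1.log / u_exYYMMDD_WFE2.log naming that the load file's SUBSTR calls expect.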
References
- Microsoft Log Parser 2.2
- IIS Advanced Logging Extension
- How to remove special characters in a particular column
- REPLACE (Transact-SQL)
- CASE (Transact-SQL)
- CONTAINS (Transact-SQL)
- How to join multiple select statements in SQL
- COUNT (Transact-SQL)
- += (String Concatenation) (Transact-SQL)
- Standard Date and Time Format Strings
- Quick Analytics from SharePoint (kind of...)
Notes
- Claims-based naming: For my farms, I find that usernames captured in the IIS logs include both standard (e.g., DOMAIN\First.Last) and claims-based (e.g., 0#.w|DOMAIN\First.Last) versions. Therefore, I filter for both versions.
- Case sensitivity in WHERE clauses: I also find that the standard version is upper-case, while the claims-based version is lower-case. Since the WHERE statements are case sensitive, I've had to account for this in the SELECT statement using the TO_LOWERCASE method.
- Different syntax for common TSQL terms: Log Parser 2.2 implements a syntax that is very similar to TSQL, but not identical. There are some odd but understandable variations, such as REPLACE_STR for the TSQL REPLACE function, or STRCAT in place of TSQL's simple "+" concatenation operator.
- Inadequately defined help file: When searching the Log Parser 2.2 help file (a CHM file), you'll find that not everything is adequately described. For example, searching for the term CONCATENATE, so as to find the concatenation syntax, returns nothing. That doesn't mean concatenation hasn't been implemented; the term CONCATENATE simply doesn't appear on the page that documents Log Parser's concatenation function, STRCAT. In that case, I found the syntax I wanted by searching on related terms, such as STRING, and carefully reviewing the results. It took a while, but I eventually found it. Be sure to perform multiple searches within the help file when hunting for a particular term.
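To see these variations side by side, here is a small illustrative one-liner (TaggedUser is just a made-up alias; the 0#.w| prefix and *.log folder are as above) that exercises STRCAT, REPLACE_STR, and TO_LOWERCASE against the same IIS logs:
logparser -i:IISW3C "SELECT DISTINCT STRCAT('user: ', TO_LOWERCASE(REPLACE_STR(cs-username, '0#.w|', ''))) AS TaggedUser FROM *.log WHERE cs-username IS NOT NULL" -q
In TSQL, the equivalent SELECT expression would read 'user: ' + LOWER(REPLACE(cs-username, '0#.w|', '')).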