ULS Log File Data Mining with Log Parser

Finding meaningful information in SharePoint ULS logs is frustrating. It can be like finding a needle in a haystack. The SharePoint ULS Log Viewer is a great tool that I highly recommend for routine research, but it has performance problems processing large sets of log files.

Assume you want to find all the exceptions thrown in the last 72 hours. You go to the LOGS folder, only to discover that, due to some categories set to verbose logging coupled with a recurring error, there are 6 GB of log files. Many of the individual log files are so large you cannot open them with Excel or Notepad. What do you do now?

Fortunately, there is a tool available that makes it possible to data mine ULS logs. This tool was originally developed to data mine IIS log files. Over the years it has been enhanced and extended to data mine other log file formats, such as system event logs, comma-separated text files, Network Monitor logs, the registry, even the file system! It is indeed the Swiss Army knife of log file data mining. It is named Log Parser, now in version 2.2. LogParser can slice and dice log files in a multitude of ways. It can output the results to text files, SQL Server tables, or charts. In short, it is a great way to analyze ULS logs!
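
For example, here is a minimal sketch of loading ULS entries directly into a SQL Server table with the SQL output format; the server, database, and table names (dbserver01, ULSArchive, UlsEntries) are hypothetical, the example assumes Windows authentication, and the TSV input format is explained below:

LogParser -i:TSV -o:SQL -server:dbserver01 -database:ULSArchive -createTable:ON "SELECT * INTO UlsEntries FROM mossidx01-*.log"

With -createTable:ON, LogParser creates the table if it does not already exist.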

How do you use LogParser to analyze ULS logs? Key enabling facts are:

1. LogParser can process tab delimited files.

2. ULS logs are tab delimited files!

3. LogParser can infer column names if the first row of the file contains field names.

4. The first row of each ULS log file contains the field names!

5. LogParser can automatically parse all files of a given type in a directory without you having to individually name each file.

6. All ULS logs are in the same directory and have the same file type!

Now that we know LogParser can consume ULS logs, all we need to do is install it, construct the proper query syntax, and analyze the log files.

Step 1: Download the latest version from the Microsoft Download Center: https://www.microsoft.com/downloads/details.aspx?FamilyID=890cd06b-abf8-4c25-91b2-f8d975cf8c07&displaylang=en. Follow the installation instructions. On my machine, it is installed at c:\Program Files\Log Parser 2.2\. It will make life easier to add the LogParser directory to your Path environment variable.
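
Once installed, a quick sanity check confirms both the installation and that LogParser can read a ULS log and infer its column names from the header row. A minimal sketch, run from the LOGS directory; the file name pattern is illustrative, since ULS logs are named SERVERNAME-YYYYMMDD-HHMM.log by default:

LogParser -i:TSV "SELECT TOP 5 Timestamp, Area, Category, Level FROM mossidx01-*.log"

With no INTO clause, the results print directly to the console.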

Step 2: Create the query file. LogParser is a command line tool. It uses a SQL-like query language. Although you can embed the SQL query in the command line, I recommend keeping the SQL in separate text files and then referencing the SQL files from the command line. This makes it very easy to tweak the SQL and reuse it on many machines. Save the following query as area-category-level.sql (the name referenced in Step 3):

SELECT Area AS [Area],
   Category AS [Category],
   Level AS [Level],
   COUNT(*) AS [Count]
INTO %COMPUTER_NAME%-area-category-level.tsv
FROM %COMPUTER_NAME%-*.log
GROUP BY Area, Category, Level
HAVING [Count] > 1000
ORDER BY [Count] DESC

This syntax should be easy to read since it is very close to database SQL syntax. It selects 3 columns plus the COUNT aggregate function, groups the results, and then orders by count descending. Only groups with more than 1,000 entries are written out.

The only unusual part is the term %COMPUTER_NAME%. This is a substitution parameter passed in from the command line (see the next step). Parameters enable using the same query file in different situations. For this example, all farm server ULS log files are periodically copied to a common file share; passing in a server name allows this query to parse only the ULS log files for a particular server.
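
A minimal sketch of that copy step, assuming the default MOSS 2007 log location and that robocopy is available; the \\logserver\ulslogs share name is hypothetical:

robocopy "C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\LOGS" \\logserver\ulslogs *.log

Because each ULS file name begins with the server name, the %COMPUTER_NAME%-*.log pattern still selects a single server's files even in a shared directory.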

Step 3: Compose the command line.

LogParser -i:tsv -o:tsv file:area-category-level.sql?COMPUTER_NAME=mossidx01

Breaking down the parameters:

-i:tsv

The input format is a tab-separated values (TSV) text file.

-o:tsv

The output format is a tab-separated values (TSV) text file.

file:xxxxx

Path and name of the file containing the SQL-like query; area-category-level.sql in this example.

?

Separator between the SQL file name and optional parameters.

name=value

Parameter name/value pairs. In this example the parameter is COMPUTER_NAME and the value is "mossidx01".
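
When a query file takes more than one parameter, the name=value pairs are separated with a plus sign. A sketch, where query.sql and the parameter names are placeholders:

LogParser -i:tsv -o:tsv file:query.sql?NAME1=value1+NAME2=value2

The detail queries at the end of this post are invoked in exactly this form.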

Step 4: Execute the command and review the results. Here we see 1,462,076 log entries were processed in just 77 seconds.

>LogParser -i:tsv -o:tsv file:area-category-level.sql?COMPUTER_NAME=mossidx01

Statistics:
-----------
Elements processed: 1462076
Elements output: 10
Execution time: 77.27 seconds (00:01:17.27)

Step 5: Excel automatically recognizes a tab-separated file and transforms it into a worksheet. Opening the output file in Excel gives this result.

Area                         Category                  Level        Count
Search Server Common         PHSts                     Monitorable  1,321,494
Windows SharePoint Services  Timer                     Monitorable  110,492
Search Server Common         FTEFiles                  High         5,437
Windows SharePoint Services  Topology                  High         5,343
Search Server Common         MS Search Administration  High         4,376
Search Server Common         GatherStatus              Monitorable  3,299
SharePoint Portal Server     Business Data             Unexpected   1,719
Office Server                Setup and Upgrade         High         1,105
Search Server Common         Common                    Monitorable  1,086
ULS Logging                  Unified Logging Service   Monitorable  1,077
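
A similar summary can be rendered as a chart instead of a TSV file by using the CHART output format, which requires the Microsoft Office Web Components. Here is a minimal sketch with an inline query; the levels.gif file name is illustrative:

LogParser -i:TSV -o:CHART -chartType:ColumnClustered "SELECT Level AS [Level], COUNT(*) AS [Count] INTO levels.gif FROM mossidx01-*.log GROUP BY Level"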

Here are some other query examples to get your creative thinking going.

· Get the first 10,000 individual entries for a category and level, ordered by date and time.

SELECT TOP 10000
   Timestamp AS [Timestamp],
   Process AS [Process],
   Area AS [Area],
   Category AS [Category],
   EventID AS [EventID],
   Level AS [Level],
   Message AS [Message]
INTO %COMPUTER_NAME%-%LEVEL%-%CATEGORY%-detail.tsv
FROM %COMPUTER_NAME%-*.log
WHERE [Category] = '%CATEGORY%' AND [Level] = '%LEVEL%'
ORDER BY [Timestamp]
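
A matching command line, assuming the query above is saved as category-level-detail.sql (a hypothetical name) and using the Timer/Monitorable pair from the summary table:

LogParser -i:tsv -o:tsv file:category-level-detail.sql?COMPUTER_NAME=mossidx01+CATEGORY=Timer+LEVEL=Monitorable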

· Get the first 1,000 individual entries for a particular event ID, ordered by date and time.

SELECT TOP 1000
   Timestamp AS [Timestamp],
   Process AS [Process],
   Area AS [Area],
   Category AS [Category],
   EventID AS [EventID],
   Level AS [Level],
   Message AS [Message]
INTO %COMPUTER_NAME%-top-%EVENTID%-detail.tsv
FROM %COMPUTER_NAME%-*.log
WHERE [EventID] = '%EVENTID%'
ORDER BY [Timestamp]
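
· And to close the loop on the opening scenario: get entries whose message mentions an exception. A sketch; the output file name is illustrative, and note that the % characters in the LIKE pattern are wildcards, not substitution parameters. To honor the 72-hour window, copy only the log files from the last three days to the share.

SELECT TOP 1000
   Timestamp AS [Timestamp],
   Process AS [Process],
   Area AS [Area],
   Category AS [Category],
   EventID AS [EventID],
   Level AS [Level],
   Message AS [Message]
INTO %COMPUTER_NAME%-exceptions.tsv
FROM %COMPUTER_NAME%-*.log
WHERE [Message] LIKE '%Exception%'
ORDER BY [Timestamp]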

Conclusion: LogParser is a fantastic data mining tool. It allows you to quickly focus on specific information in the ULS logs without spending hours manually reviewing huge numbers of log entries. The help file installed with LogParser is excellent! Refer to it often to see samples and discover all the things you can do with LogParser.