

Troubleshooting a runtime issue with WaIISHost.exe

Developing an Azure application and seeing it launch successfully in the cloud is a lot of fun, but life with the cloud is not always easy. Over the last two days I had a terrible experience with an application error that occurred right after deployment. The “.NET Runtime” application error event is shown below:

 

Application: WaIISHost.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.TypeInitializationException
Stack:  at Microsoft.WindowsAzure.ServiceRuntime.Implementation.Loader.RoleRuntimeBridge.<InitializeRole>b__0()
at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
at System.Threading.ThreadHelper.ThreadStart()

 

It seemed clear that the Windows Azure ServiceRuntime hit a critical error when it tried to bring up the application service it hosted, but the error message is not informative enough to reveal the root cause. Another odd behavior was that the cloud package built from the same project on the other enlistment machine was doing just fine, so there had to be some significant difference between the two packages (.cspkg files) built from the two enlistments. I had no luck finding any online tool to compare the two package files, and looking through the build logs and all the files inside the packages didn’t surface any clue.
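Looking back, one simple way to compare two packages by hand is to treat each .cspkg as a zip archive and diff its entries. The sketch below is illustrative only (not a tool I used at the time); it assumes the packages were built unencrypted, in which case they open with the standard zip APIs, and the package paths are placeholders. Note that the role payload inside the package is itself an archive, so a thorough comparison would need to recurse into it.

// Sketch: list the entries of two .cspkg files and print the ones whose
// uncompressed sizes differ (or which exist in only one package).
// The package paths below are placeholders.
using System;
using System.Collections.Generic;
using System.IO.Compression;   // add a reference to System.IO.Compression.FileSystem
using System.Linq;

class PackageDiff
{
    static void Main()
    {
        var left = Entries(@"C:\drops\enlistmentA\Service.cspkg");
        var right = Entries(@"C:\drops\enlistmentB\Service.cspkg");

        foreach (var name in left.Keys.Union(right.Keys).OrderBy(n => n))
        {
            long a = left.ContainsKey(name) ? left[name] : -1;
            long b = right.ContainsKey(name) ? right[name] : -1;
            if (a != b)
                Console.WriteLine("{0}: {1} bytes vs {2} bytes", name, a, b);
        }
    }

    // Entry name -> uncompressed size for one package.
    static Dictionary<string, long> Entries(string path)
    {
        using (var zip = ZipFile.OpenRead(path))
            return zip.Entries.ToDictionary(e => e.FullName, e => e.Length);
    }
}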

After spending a decent amount of time investigating inside the VM, I finally found a log file named “WaIISHost.log” under this folder: “C:\Resources\Directory\625f7be7a6b241dfaa6924e3e6b8841e.xxxxxxxx.Service.DiagnosticStore”. A “surprising” detail of the exception is clearly recorded in this file. It looks something like this (for project confidentiality, I am using “xxx…” to represent the DLL names):

WaIISHost Information: 0 : [00003952:00000001, 2014/02/14 18:05:29.433, INFO ] DebuggerAttachComplete START
WaIISHost Information: 0 : [00003952:00000007, 2014/02/14 18:05:30.676, ERROR] Unhandled exception: IsTerminating 'True', Message

'System.TypeInitializationException: The type initializer for 'xxxxxxxx' threw an exception. ---> System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.WindowsAzure.ServiceRuntime, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.
at xxxxxxxx..cctor()
--- End of inner exception stack trace ---
............

 

The log information is very helpful; the stack trace let me identify the root cause immediately. It turned out that a dependency DLL was still referencing an older version of the Azure SDK (v2.0). The root of the root cause was an unexpected time delay introduced by the “Gated Check-in” feature recently implemented in VSTS, which is why the issue showed up in one enlistment but not in the other(s). Also, this issue may not be easy to reproduce on a local dev box if multiple versions of the Azure runtime assemblies are installed there.
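For anyone hitting a similar version mismatch, a quick scan of the build output can show which assembly still references the old ServiceRuntime. The sketch below is only an illustration of that idea, not part of the original investigation; the bin path is a placeholder, and non-managed DLLs are simply skipped.

// Sketch: report which assemblies in a folder reference
// Microsoft.WindowsAzure.ServiceRuntime, and which version they expect.
// The folder path is a placeholder; point it at the role's bin output.
using System;
using System.IO;
using System.Reflection;

class ServiceRuntimeRefs
{
    static void Main()
    {
        foreach (var dll in Directory.EnumerateFiles(@"C:\build\MyRole\bin", "*.dll"))
        {
            try
            {
                var asm = Assembly.ReflectionOnlyLoadFrom(dll);
                foreach (var reference in asm.GetReferencedAssemblies())
                {
                    if (reference.Name == "Microsoft.WindowsAzure.ServiceRuntime")
                        Console.WriteLine("{0} -> ServiceRuntime {1}",
                            Path.GetFileName(dll), reference.Version);
                }
            }
            catch (BadImageFormatException)
            {
                // Native or otherwise non-managed DLLs are skipped.
            }
        }
    }
}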

If I had started by searching for “WaIISHost log”, I would have quickly found this earlier post (https://blogs.msdn.com/b/kwill/archive/2011/05/05/windows-azure-role-architecture.aspx). My “painful” experience has motivated me to share this tip of using “WaIISHost.log” with you for effectively troubleshooting cloud service errors during startup.
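To find the log quickly on the role VM, something like the following sketch can help. It assumes the DiagnosticStore folder layout shown above, where the GUID-prefixed folder name varies per deployment.

// Sketch: locate WaIISHost.log on the role VM under the
// C:\Resources\Directory\<guid>.<role>.DiagnosticStore layout shown above.
using System;
using System.IO;

class FindWaIISHostLog
{
    static void Main()
    {
        const string root = @"C:\Resources\Directory";
        foreach (var store in Directory.EnumerateDirectories(root, "*.DiagnosticStore"))
            foreach (var log in Directory.EnumerateFiles(store, "WaIISHost.log", SearchOption.AllDirectories))
                Console.WriteLine(log);
    }
}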

 

(This article was originally written on Feb. 15, 2014)

Comments

  • Anonymous
    March 19, 2015
    Thanks a lot for the helpful info.