High Memory part 3 - Native Heap
So let's continue our digging into memory problems and how to track down what is happening. Our last post, High Memory continued, went into how to look at the managed heap and, if the problem is a System.Data.DataTable, how to inspect it and see what its columns are.
So what if the memory isn't all in the virtually allocated memory where the managed heap lives? What if it is in the native NT heap instead? In that case, !address -summary may look like this:
0:000> !address -summary
-------------------- Usage SUMMARY --------------------------
TotSize ( KB) Pct(Tots) Pct(Busy) Usage
41c2000 ( 67336) : 03.21% 04.95% : RegionUsageIsVAD
2cf36000 ( 736472) : 35.12% 00.00% : RegionUsageFree
3d0b000 ( 62508) : 02.98% 04.59% : RegionUsageImage
d80000 ( 13824) : 00.66% 01.02% : RegionUsageStack
36000 ( 216) : 00.01% 00.02% : RegionUsageTeb
4a434000 ( 1216720) : 58.02% 89.42% : RegionUsageHeap
0 ( 0) : 00.00% 00.00% : RegionUsagePageHeap
1000 ( 4) : 00.00% 00.00% : RegionUsagePeb
1000 ( 4) : 00.00% 00.00% : RegionUsageProcessParametrs
1000 ( 4) : 00.00% 00.00% : RegionUsageEnvironmentBlock
Tot: 7fff0000 (2097088 KB) Busy: 530ba000 (1360616 KB)
-------------------- Type SUMMARY --------------------------
TotSize ( KB) Pct(Tots) Usage
2cf36000 ( 736472) : 35.12% :
3d0b000 ( 62508) : 02.98% : MEM_IMAGE
c10000 ( 12352) : 00.59% : MEM_MAPPED
4e79f000 ( 1285756) : 61.31% : MEM_PRIVATE
-------------------- State SUMMARY --------------------------
TotSize ( KB) Pct(Tots) Usage
3ffd2000 ( 1048392) : 49.99% : MEM_COMMIT
2cf36000 ( 736472) : 35.12% : MEM_FREE
130e8000 ( 312224) : 14.89% : MEM_RESERVE
Largest free region: Base 685ba000 - Size 04b36000 (77016 KB)
From here we can see that most of the memory is now native: RegionUsageHeap accounts for 58% of the address space. The best way to find out what is causing all of this memory to be allocated is a tool from the IIS team called DebugDiag. When you run it, you can use the Memory and Handle Leak rule to have DebugDiag track native allocations and help you find out what is causing the high memory.
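To make the managed vs. native distinction a little more concrete before we get into DebugDiag, here is a minimal sketch (an illustration only, not code from the dump above). A plain managed allocation lands in the GC heap, which the CLR carves out of VirtualAlloc'd regions (RegionUsageIsVAD), while Marshal.AllocHGlobal hands back memory from the default process (NT) heap (RegionUsageHeap), where the garbage collector never sees it:

using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;

class NativeVsManagedSketch
{
    static void Main()
    {
        // Managed allocation: the GC heap lives in VirtualAlloc'd regions, so this
        // shows up under RegionUsageIsVAD and is visible to sos commands like !dumpheap.
        byte[] managed = new byte[10 * 1024 * 1024];

        // Native allocations: Marshal.AllocHGlobal draws from the default process
        // (NT) heap, so these bytes are counted under RegionUsageHeap instead and
        // the garbage collector never sees them.
        var nativeBlocks = new List<IntPtr>();
        for (int i = 0; i < 100000; i++)
        {
            nativeBlocks.Add(Marshal.AllocHGlobal(24));
        }

        Console.ReadLine(); // pause here and compare !address -summary snapshots

        // Native memory has to be released by hand; forgetting this is the classic
        // native heap leak that no amount of garbage collection will clean up.
        foreach (IntPtr block in nativeBlocks)
        {
            Marshal.FreeHGlobal(block);
        }
        GC.KeepAlive(managed);
    }
}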
After you have run the test under DebugDiag, you load the dump file into it and run the Memory Pressure Analyzers analysis report. When you look at the report it creates, you will see some summary information about memory, such as:
mscorwks.dll (a known Windows memory manager) is responsible for 761.87
MBytes worth of outstanding allocations. These allocations appear
to have originated from the following module(s) and function(s):
System.Threading._TimerCallback.TimerCallback_Context(System.Object)
From this we can click on the name of the DLL and jump to the report section for that file. The main parts to look at are the Top 5 functions by allocation count and the Top 5 functions by allocation size.
Top 5 functions by allocation size
mscorwks!EEHeapAlloc+12d 454.89 MBytes
mscorwks!EEVirtualAlloc+104 306.94 MBytes
mscorwks!DebuggerHeap::Alloc+2f 39.16 KBytes
mscorwks!CExecutionEngine::CheckThreadState+14d 288 Bytes
mscorwks!NLSTable::OpenOrCreateMemoryMapping+120 0 Bytes
If we click on a function, it will give us a Leak Probability and is then further broken down to show the Top 10 allocation sizes by allocation count and the Top 10 allocation sizes by total size. So we will get something like:
Top 10 allocation sizes by allocation count
24 Bytes 44,766 allocation(s)
28 Bytes 8,973 allocation(s)
40 Bytes 8,404 allocation(s)
20 Bytes 7,388 allocation(s)
16 Bytes 6,761 allocation(s)
64 Bytes 5,584 allocation(s)
5.26 KBytes 5,580 allocation(s)
19.54 KBytes 5,580 allocation(s)
2.35 KBytes 5,580 allocation(s)
36 Bytes 4,895 allocation(s)
So now we can see that, for this function, we should focus on allocations of a particular size. If we want to focus on the 24 Byte allocations, we scroll a little farther down in the report until we get to the callstacks. Just scroll until you find one with the size you are looking for. For example:
Call stack sample 3
Address 0x1c4704d0
Allocation Time 00:05:02 since tracking started
Allocation Size 24 Bytes
Function
mscorwks!EEHeapAlloc+12d
mscorwks!EEHeapAllocInProcessHeap+51
System.Threading._TimerCallback.TimerCallback_Context(System.Object)
System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
webengine!HashtableIUnknown::AddCallback+a
webengine!HttpCompletion::ProcessRequestInManagedCode+1a3
webengine!HttpCompletion::ProcessRequestInManagedCode+1a3
webengine!HttpCompletion::ProcessCompletion+3e
webengine!CorThreadPoolWorkitemCallback+18
mscorwks!ThreadpoolMgr::intermediateThreadProc+49
kernel32!BaseThreadStart+34
So now we have the callstacks that allocated the memory we need to focus on. There is some more information about DebugDiag on its blog: https://blogs.msdn.com/debugdiag/
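What you do with callstacks like these depends entirely on your own code, but the pattern is always the same: the stack points at the managed code that is driving the allocations (here a System.Threading.Timer callback), and the fix is to make sure every native allocation it triggers has a matching release. As a contrived sketch (this is assumed example code, not the code behind the dump above), a leaky timer callback and a fixed version might look like:

using System;
using System.Runtime.InteropServices;
using System.Threading;

class TimerLeakSketch
{
    // Leaky version: every tick allocates 24 bytes of native memory and never
    // frees it. Over time this shows up as steady RegionUsageHeap growth, with a
    // timer callback stack in DebugDiag, and nothing interesting in !dumpheap.
    static void LeakyTick(object state)
    {
        IntPtr buffer = Marshal.AllocHGlobal(24);
        // ... use buffer for some native interop work ...
        // missing: Marshal.FreeHGlobal(buffer);
    }

    // Fixed version: the native allocation is released even if the work throws.
    static void FixedTick(object state)
    {
        IntPtr buffer = Marshal.AllocHGlobal(24);
        try
        {
            // ... use buffer for some native interop work ...
        }
        finally
        {
            Marshal.FreeHGlobal(buffer);
        }
    }

    static void Main()
    {
        // Fire the leaky callback ten times a second; swap in FixedTick to stop the growth.
        using (var timer = new Timer(LeakyTick, null, 0, 100))
        {
            Console.ReadLine();
        }
    }
}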
Comments
Anonymous
August 17, 2009
What a pity I only just dug out your blog via Google :) Regarding "From here we can see that most of the memory now is native" from !address -summary: how can we decide that most of the memory is native? I think this command just shows all the memory information within a process, but doesn't indicate whether any of it is managed or unmanaged.
Anonymous
August 20, 2009
You look at the type of memory that is taking up all the space. RegionUsageHeap is taking up 58% of the memory from the output above. If you want to see where managed memory is, it is located in the Virtual Memory (RegionUsageIsVAD).
Anonymous
August 30, 2009
Understood. So normally the managed memory is located in RegionUsageIsVAD, whereas the unmanaged memory is in RegionUsageHeap, RegionUsageStack, etc.
Anonymous
October 07, 2009
We've been trying to track down an issue for some time. mscorwks!EEVirtualAlloc+119 shows two callstacks, but only memory addresses like below:
Function   Source   Destination
0x79E8C582
0x79E717B4
0x79F8C96F
kernel32!BaseThreadStart+34
I'm at a dead end with this because I was expecting .NET callstack info. Any ideas/tips?
Anonymous
October 07, 2009
One thing you can try to do with these numbers is to run the !ip2md command from sos.dll on them. So attach to your process or open a dump from that process and, after loading sos, run:
!ip2md 0x79E8C582
!ip2md 0x79E717B4
!ip2md 0x79F8C06F
See if any of them resolve to something managed. I think those are functions in mscorwks, though, and are native themselves. What are you trying to track down?