Fixing high Windows memory usage caused by large metafile


Shortly after installing a third-party Windows service, the memory usage of my Windows 2008 R2 server suddenly shot up: from as little as 4GB of the 16GB installed to as high as 99% most of the time, as reported by Task Manager:

task manager full memory

A detailed look at the Processes tab did not reveal any process that seemed to consume a lot of memory, so I downloaded Process Monitor to investigate and immediately noticed several thousand file system operations caused by a process that was trying to read from some data files, as shown in the following screenshot:

readfile1

A further investigation using RAMMap revealed that the metafile size is very high:

high metafile size

At this point the high memory usage can be attributed to the offending process, which creates many file handles while reading/writing data files, perhaps without closing them properly once done. This inflates the metafile, the part of the system cache that holds NTFS metadata used to keep track of file system activity. Because the metafile cache is shared between processes, the memory usage of the offending process as reported by Task Manager remains low: Task Manager does not take the metafile into account when calculating per-process memory usage.
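Besides RAMMap, the current system file cache limits can also be queried programmatically. The following is a minimal sketch (not from the project above) using the documented GetSystemFileCacheSize API in kernel32, with SIZE_T marshalled as IntPtr so it works in both 32-bit and 64-bit processes:

```csharp
using System;
using System.Runtime.InteropServices;

class CacheInfo
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GetSystemFileCacheSize(
        out IntPtr lpMinimumFileCacheSize,
        out IntPtr lpMaximumFileCacheSize,
        out int lpFlags);

    static void Main()
    {
        IntPtr min, max;
        int flags;
        if (GetSystemFileCacheSize(out min, out max, out flags))
        {
            // On a default configuration the maximum is effectively
            // unlimited, which is why the metafile can grow unchecked.
            Console.WriteLine("Minimum cache size: " + min + " bytes");
            Console.WriteLine("Maximum cache size: " + max + " bytes");
            Console.WriteLine("Flags: 0x" + flags.ToString("X"));
        }
        else
        {
            Console.WriteLine("GetSystemFileCacheSize failed, Win32 error " +
                Marshal.GetLastWin32Error());
        }
    }
}
```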

At first I tried to restart the offending application, which resulted in an immediate decrease in memory consumption. However, as the leak is due to a bug in the application, the server's memory consumption climbed back to 99% in just over three days. I then downloaded and installed the Dynamic Cache Service from Microsoft, hoping that capping the cache would keep the leak in check. To my disappointment, although the Dynamic Cache Service did help, the leak was serious enough that memory consumption would still reach 99% after the application had been running for two weeks. Scheduling a task to restart the application daily would reduce memory usage but could also cause service disruption, so I needed a more permanent solution.

Next, I tried “Empty Working Set” and “Empty System Working Set” from the Empty menu of RAMMap and noticed that the memory usage went down instantly, without restarting the service:

empty working set

After the sets had been emptied, the service still worked properly, so this is at least a good workaround. My next task was to write a program that empties the sets automatically and runs as a scheduled task while waiting for an official fix from the vendor. For this I used the code from the Analysis Services Stored Procedure project, which has a wrapper method to clear the file system cache:

FileSystemCache.ClearAllCaches();
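For reference, the usual way to flush the active file system cache working set is to call the documented SetSystemFileCacheSize API with (SIZE_T)-1 for both limits. The following is a minimal standalone sketch of that technique, not the project's actual implementation; it assumes the process is elevated, since the call requires the SeIncreaseQuotaPrivilege:

```csharp
using System;
using System.Runtime.InteropServices;

class FlushFileCache
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetSystemFileCacheSize(
        IntPtr MinimumFileCacheSize,
        IntPtr MaximumFileCacheSize,
        int Flags);

    static void Main()
    {
        // (IntPtr)(-1) marshals to SIZE_T -1 for both limits, which
        // tells Windows to empty the cache working set instead of
        // setting a fixed size. Run as administrator.
        if (SetSystemFileCacheSize((IntPtr)(-1), (IntPtr)(-1), 0))
            Console.WriteLine("System file cache working set emptied.");
        else
            Console.WriteLine("Failed, Win32 error " +
                Marshal.GetLastWin32Error());
    }
}
```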

However, it turns out that this cleanup is not as thorough as RAMMap's, and you will still need to empty the working set of each process with the following code to achieve the same effect:

static void Main(string[] args)
{
  try
  {
    Console.WriteLine("Clearing active/standby system file cache ...");
    FileSystemCache.ClearAllCaches();
    Console.WriteLine("Successfully cleared active/standby system file cache.");
  }
  catch (Exception ex)
  {
    Console.WriteLine(ex.Message);
  }

  int currentId = Process.GetCurrentProcess().Id;
  foreach (Process p in Process.GetProcesses())
  {
    // A try/catch per process, so that one protected process whose
    // handle cannot be opened does not abort the whole loop.
    try
    {
      if (p.Id == currentId)
      {
        Console.WriteLine("Ignoring current process #" + p.Id + " (" + p.ProcessName + ")");
      }
      else
      {
        Console.WriteLine("Emptying working set for process #" + p.Id + " (" + p.ProcessName + ")");
        EmptyWorkingSet(p.Handle);
        Console.WriteLine("Successfully emptied working set for process #" + p.Id + " (" + p.ProcessName + ")");
      }
    }
    catch (Exception ex)
    {
      Console.WriteLine("Error for process #" + p.Id + ": " + ex.Message);
    }
  }
}

// EmptyWorkingSet takes a process HANDLE; IntPtr is the correct
// marshalling on both 32-bit and 64-bit Windows.
[DllImport("psapi.dll", SetLastError = true)]
public static extern bool EmptyWorkingSet(IntPtr hProcess);

The entire source code, together with the compiled executable, can be downloaded here. I have removed the unnecessary .NET 4.0 references from the Analysis Services Stored Procedure project to make the executable compatible with .NET 2.0. Note that the program needs to be run as administrator to work properly.

ToughDev

A tough developer who likes to work on just about anything, from software development to electronics, and share his knowledge with the rest of the world.
