Just throw memory at the problem, right?
Some would argue that memory leaks aren’t fatal in Node — just restart the application or throw more RAM at it. However, as a leak grows, V8 becomes increasingly aggressive about garbage collection: collections run more frequently and take longer, slowing your app down. So in Node, memory leaks hurt performance.
Leaks can often be masked assassins. Leaky code hangs on to references to limited resources: you may run out of file descriptors, or suddenly find yourself unable to open new database connections. It may look like your backends are failing the application, when the problem really lives inside your own application.
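As a minimal sketch of how this happens (all names here are hypothetical): a module-level array that accumulates request objects "for logging" keeps every request — and whatever buffers or sockets it references — reachable forever, so V8 can never reclaim them.

```javascript
// Hypothetical leak: requests are pushed into a module-level array
// and never removed, so nothing they reference can be collected.
const requestLog = [];

function handleRequest(req) {
  requestLog.push(req); // retained forever: this is the leak
  return { status: 200 };
}

// Simulate traffic: each fake request drags a 1 MB buffer along.
for (let i = 0; i < 100; i++) {
  handleRequest({ id: i, body: Buffer.alloc(1024 * 1024) });
}

// ~100 MB (1 MB per entry) is now unreclaimable even though every
// request has long since been answered.
console.log(`retaining ~${requestLog.length} MB across ${requestLog.length} requests`);
```

Every request here is "done" from the application's point of view, yet its memory and any resources it holds stay pinned until the process restarts.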
In this week’s highlight, we cover StrongOps heap profiling. One of the many metrics monitored by StrongOps is the heap size and usage of your Node applications over time. This lets you dig deep into the V8 heap and helps you pinpoint the root cause of any memory leaks. What’s StrongOps? It’s a DevOps and performance monitoring dashboard for Node apps. Here’s the one minute intro video to learn more.
When running Node applications in production, heap usage, heap growth, and the frequency of garbage collection are key parameters to watch and tune for optimal memory performance.
The Heap Size graph of StrongOps monitors three key metrics of memory performance over time:
- Heap: Current heap size (MB)
- RSS: Resident set size (MB)
- V8 Full GC: Heap size sampled immediately after a full garbage collection (MB)
Viewing these metrics on historical timescales (1/3/6/12/24 hrs) helps detect patterns like slow or abrupt memory leaks in the system. Left unchecked, such leaks will eventually crash the application with out-of-memory errors. Heap growth under load is normal in Node, and the heap resizes itself based on consumption; however, the heap currently in use needs to stay within acceptable ranges. The V8 Full GC metric reflects the baseline of memory usage across GC cycles, where out-of-scope objects in the heap are cleaned up. V8 Full GC consumption is expected to follow a saw-tooth pattern; if the troughs of that pattern trend upward over time, memory is leaking.
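You can get a rough, in-process view of the same metrics StrongOps charts — heap used, heap total, and RSS — using Node's built-in `process.memoryUsage()`. The sampling interval and MB units below are illustrative choices, not StrongOps internals:

```javascript
// Sample the process's memory metrics, converted to megabytes.
const MB = 1024 * 1024;

function sampleMemory() {
  const m = process.memoryUsage();
  return {
    heapUsedMB: Math.round(m.heapUsed / MB),   // live objects in the V8 heap
    heapTotalMB: Math.round(m.heapTotal / MB), // heap memory V8 has reserved
    rssMB: Math.round(m.rss / MB),             // resident set size of the process
  };
}

const sample = sampleMemory();
console.log(sample);

// In a real app you would log this periodically, e.g.:
// setInterval(() => console.log(sampleMemory()), 60 * 1000);
```

Logged over hours, these samples expose the same upward drift that the StrongOps Heap Size graph makes visible at a glance.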
To use it, log in to your dashboard and select the “Memory” check box in the metric selection menu on the left panel of the dashboard.
Once heap size monitoring leads you to suspect a memory leak in a Node application, the best practice is to deep-profile the application at the object level to diagnose non-optimal memory patterns. StrongOps provides a heap profiler that enables you to monitor the counts and memory use of all instances of a constructor type, for example:
- State (Writable and Readable)
Normal (Non-Leaky) Memory Profile
The allocated memory and actual memory usage of each object type help isolate potentially leaking constructs. More often than not, memory leaks are found in collection objects like arrays or hashmaps; however, it is not uncommon for leaks to occur in Strings or even native objects.
Instance counts are expected to stay within a band that tracks the incoming request workload and concurrency against responses served. As responses are served and objects exit scope, instance counts should drop accordingly, and the memory those objects used should be freed back to the heap after garbage collection.
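A sketch of why healthy instance counts track workload (the names here are hypothetical, not a StrongOps API): per-request state that is released when the response goes out keeps counts in a band, while forgetting the release step produces unbounded growth.

```javascript
// Per-request state keyed by request id. Deleting the entry on
// response lets GC reclaim it, so the map's size tracks concurrency.
const inFlight = new Map();

function onRequest(id) {
  inFlight.set(id, { id, startedAt: Date.now() });
}

function onResponse(id) {
  inFlight.delete(id); // releasing the reference lets GC reclaim the object
}

// 1000 requests, each one answered: the count returns to zero.
for (let i = 0; i < 1000; i++) {
  onRequest(i);
  onResponse(i);
}
console.log('in-flight entries after workload:', inFlight.size);

// Forgetting to call onResponse() would leave inFlight.size at 1000 --
// exactly the kind of unbounded instance count growth a heap profiler
// surfaces.
```

In the profiler, the first pattern shows up as a stable band of instance counts; the second shows up as a count that only ever climbs.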
Leaky Memory Profile
In the case of unnatural instance count growth, retainers should be inspected. Retainer inspection can be done by taking memory snapshots, or heapdumps, and analyzing them in Chrome DevTools. We will dive into Node heapdumps in next week’s Memory Leak Analysis blog.
It’s easy to get started with StrongOps
Ready to start monitoring event loops, manage Node clusters and chase down memory leaks? We’ve made it easy to get started with StrongOps either locally or on your favorite cloud. Check out the Getting Started page and you’ll be up and running in minutes.
- Get the technical whitepaper “Getting Started with DevOps for Node.js”
- What’s in the upcoming Node v0.12 release? Big performance optimizations, read Ben Noordhuis’ blog to learn more.
- Ready to develop APIs in Node.js and get them connected to your data? We’ve made it easy to get started with LoopBack either locally or on your favorite cloud, with a simple npm install.