Norton Performance Graphs Explained

Every day, corporations, governments, and individuals must live with security vs. performance tradeoffs.  An extra layer of security in any situation, digital or otherwise, always has a performance impact.  A simple (and probably overused) example is airport security.  Standing in an airport security line is time consuming, but provides the level of security mandated by the government.  We are required to trade performance (walking straight to the terminal) for the security provided by the inspections.  Overzealous security can cause bottlenecks, which inhibit daily activities.  Eliminating the security removes the bottlenecks, but opens vulnerabilities.

This year's Norton products focus on providing the highest level of security available, while trying to minimize these bottlenecks.  Continuing the airport metaphor, the Norton products would allow you to "walk straight to the terminal" while performing the required scans.  The difference is that everything is done in the background, so security does not interfere with performance, while the product retains the ability to stop a threat before it reaches its target.


We spent a great deal of time engineering the Norton Products to be fast, lightweight, and non-invasive.  Real-time tasks are always necessary, such as the scanning performed by Auto-Protect, SONAR, the Firewall, and Intrusion Prevention.  These real-time tasks are heavily optimized using a variety of techniques. 

Other tasks and features, such as Norton Insight, Norton Community Watch, Automatic LiveUpdate, and broader virus and spyware scans, do not need to run all the time.  These tasks are essential to security, but carry a performance impact.  The 2009 Norton Products moved all of these tasks to the background and run them when the computer is not actively being used.  This approach ensures the computer is protected at all times, while leaving the computer's resources available for interactive tasks.

But, how do we prove our claims?  This is why the Norton Performance Graphs were developed. 

Graph Overview

The "CPU Usage" dialog in the 2009 Norton Products consists of several parts; the two which are discussed here in detail are the CPU and memory graphs.

Fig 1: Norton Idle Processing

These graphs represent a 90 minute history of CPU and Memory usage on the machine, with blue indicating the overall system's consumption of the resource, and yellow indicating the overall Norton consumption of the resource.  The gray shading indicates the Norton Product has detected the user is no longer interacting with the system.  This is the ideal time for the background tasks described above to be performed, and it is common to see a spike in resource consumption by Norton.  If you return to the computer while idle processing is taking place, the processing will automatically stop and be resumed at a later time (as shown below).

Fig 2: Norton Idle Processing ends when user activity begins

(Note that system uptime is about 75 minutes)

One of the challenges in building these graphs is collecting and archiving information without impacting system performance.  For this reason, the data samples plotted in the graphs are collected at regular intervals using performance counters provided by the Windows™ Operating System.  These counters are heavily optimized and provide a low-impact mechanism for acquiring this information.  Over a 90 minute period, this infrequent, lightweight data collection yields an accurate picture of resource usage on the system without a performance penalty.
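The sampling scheme described above can be sketched with a simple bounded buffer.  This is not Norton's implementation; it is a minimal illustration (the 15-second interval is an assumption, as the article does not state the actual sampling rate) of how a fixed-size history keeps memory use constant while old samples roll off:

```python
from collections import deque

# Hypothetical sketch: a fixed-size history buffer for a 90-minute graph.
# Sampling every 15 seconds (an assumed rate) yields 360 samples; a bounded
# deque discards the oldest sample automatically as new ones arrive.
HISTORY_MINUTES = 90
SAMPLE_INTERVAL_SECONDS = 15
MAX_SAMPLES = (HISTORY_MINUTES * 60) // SAMPLE_INTERVAL_SECONDS  # 360

history = deque(maxlen=MAX_SAMPLES)

def record_sample(system_cpu_percent, norton_cpu_percent):
    """Append one (system, product) CPU sample; old samples roll off."""
    history.append((system_cpu_percent, norton_cpu_percent))

# Simulate sampling past the buffer's capacity.
for i in range(400):
    record_sample(i % 100, (i % 100) / 4)

print(len(history))  # stays bounded at MAX_SAMPLES (360)
```

Because the buffer is bounded, the cost of maintaining the 90-minute history is the same whether the machine has been up for an hour or a month.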

The CPU Graph

In the idle screenshots above, several of the idle scans had run, which caused the spike in Norton CPU usage in the center of the graph.  This graph measures the overall system CPU usage and plots it in blue.  It then measures the current CPU utilization of all Norton processes, adds these values up, and plots the sum in yellow.  The CPU usage values are collected for all processors (and cores) on the system, and then divided by the number of processors.  For example, consider a computer with two processors.  If one is at 100% utilization and the other is at 0%, then the CPU graph will read 50%.

On a test computer with eight cores and two of them at 100%, the graph would show 25% CPU usage plus any additional usage from the other cores.
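The normalization described above is simple averaging across cores.  A minimal sketch, reproducing the two examples from the text:

```python
def overall_cpu_percent(per_core_utilization):
    """Average per-core CPU utilization into a single system-wide
    percentage, as the CPU graph does: sum across all processors
    (and cores), then divide by the number of processors."""
    return sum(per_core_utilization) / len(per_core_utilization)

# Two processors: one fully busy, one idle -> the graph reads 50%.
print(overall_cpu_percent([100, 0]))                       # 50.0
# Eight cores with two at 100% -> 25%, before any other activity.
print(overall_cpu_percent([100, 100, 0, 0, 0, 0, 0, 0]))   # 25.0
```

One consequence of this averaging is that on many-core machines a single fully-busy thread barely moves the graph, which is worth keeping in mind when comparing against per-core views.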

Fig. 3: Comparing Task Manager and the Norton CPU Graph

The Memory Graph

There are quite a few different ways to describe the memory used by a process, and by the system as a whole.  Even the information provided by Task Manager can be a little confusing at first glance.  In the screenshot below, the test system has about 3GB of memory, of which 1.81 GB is used.  Yet due to caching, and the varying metrics used to describe memory utilization, Task Manager also reports only one megabyte of free memory, which would seem to imply that physical memory is entirely consumed.

Fig. 4: Task Manager reporting physical memory usage

Measuring Memory Consumption

Microsoft has published an article on this subject that describes how Task Manager reports the different values. The metrics we are most interested in are "Working Set", "Private Bytes" (also known as "Commit Size" or "Committed Bytes"), and "Virtual Bytes".

Virtual Bytes references the entire addressable space of a process.  This includes the process code, data, and other virtual (non-physical) memory allocations.  It is not a measure of how much physical memory a process is consuming, just how much space (in bytes) the process can address.  On 32-bit systems, each individual process is limited to 2GB of virtual address space by default.

Private Bytes refers to the memory owned exclusively by a process.  It is also referred to as Commit Size, as this number reflects the amount of space the process would require in the paging file should it be completely swapped out.  This metric is a valid way to determine how much memory a process is directly consuming, but it does not take into account how much of that memory is currently backed by (and consuming) physical memory.

The Working Set of a process is a measure of how much physical memory is being referenced by a process at a given point in time.  It is not reflective of all memory referenced by a process, as additional memory assigned to the process may be swapped out (to the system page file), and therefore is not consuming physical memory.  The working set includes both the resident code occupying main memory, and the private memory allocations made by the process.

The Private Working Set of a process is the same as its working set, less the resident code.  It accounts only for the Private Bytes which are currently occupying main memory.  This metric is only available on Windows Vista™ or later.
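The relationships between the four metrics just defined can be made concrete with a toy example.  All the numbers below are hypothetical, and the breakdown is a simplified model of the definitions above, not how Windows internally accounts for memory:

```python
# Illustrative only (hypothetical numbers, in MB): how the metrics above
# relate for a single process, per the definitions in this article.
resident_code_mb = 18          # shared/code pages currently in RAM
private_working_set_mb = 24    # private pages currently in RAM
swapped_private_mb = 10        # private pages swapped out to the page file
reserved_virtual_mb = 200      # reserved/mapped address space beyond commits

# Private Bytes: all private memory, whether resident or swapped out.
private_bytes_mb = private_working_set_mb + swapped_private_mb       # 34
# Working Set: everything currently occupying physical memory.
working_set_mb = resident_code_mb + private_working_set_mb           # 42
# Virtual Bytes: the full addressable space, which is larger still.
virtual_bytes_mb = private_bytes_mb + resident_code_mb + reserved_virtual_mb

print(working_set_mb, private_bytes_mb, virtual_bytes_mb)
```

The point of the model is the ordering: the Private Working Set is the smallest figure, Working Set and Private Bytes overlap but measure different things, and Virtual Bytes is the largest.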

We chose the Working Set of a process as the measurement behind the memory graph for several reasons.  First, it is compatible with the data displayed in Task Manager on both Windows XP and Vista.  Second, as described above, the working set reflects only the physical memory a process is actually referencing at a given point in time, rather than memory that has been swapped out to the page file.  The overall memory consumption is computed as the total of the working sets of all processes, and the Norton memory consumption as the total of the working sets of all Norton processes.  This calculation intentionally excludes Kernel and System Memory, which is the memory used by the Operating System itself.  In a future version, we may include this value, along with the comparable data from the Norton components running in this layer of the OS.
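The aggregation just described reduces to two sums over a process snapshot.  A minimal sketch (the snapshot values are made up; ccSvcHst is the Norton service process named later in this article):

```python
def graph_memory_values(working_sets_mb, norton_process_names):
    """Compute the two values the memory graph plots: the total working
    set of all processes (blue) and of Norton processes only (yellow).
    Kernel/system memory is intentionally excluded, as in the article."""
    total = sum(working_sets_mb.values())
    norton = sum(ws for name, ws in working_sets_mb.items()
                 if name in norton_process_names)
    return total, norton

# Hypothetical per-process working sets, in MB.
snapshot = {"ccSvcHst.exe": 42, "explorer.exe": 30, "svchost.exe": 12}
print(graph_memory_values(snapshot, {"ccSvcHst.exe"}))  # (84, 42)
```

Because both series are sums of working sets, the yellow Norton line and the blue system line are directly comparable on the same axis, which is exactly the property the article calls out.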

Validating the Memory Graph using Task Manager

When comparing the data Task Manager reports to that of the Norton memory graph, the results line up closely.  In the screenshot below, the test system with roughly 3GB of memory is used again.  Task Manager reports that physical memory usage is at 2GB, and draws a line on its history graph at roughly the two-thirds mark.  The Norton graph does the same when rendering the overall system's memory consumption.  Interestingly, Task Manager and Norton are plotting two completely different values!  On Vista, Task Manager displays the total Committed Bytes on the system, while the Norton graph displays the total Working Set size of all running processes.  This approach was chosen so the memory used by Norton processes could be directly compared to that of all other processes.

Fig. 5: Comparing Task Manager and the Norton Memory Graph

One of the most memory and CPU intensive operations a Norton Product will perform is a Full System Scan.  While this scan is running, one would expect the Norton graphs to reflect that both CPU and Memory resources are being consumed by the scan.  But, running this scan on the same computer referenced above (8 processors, 3GB of memory) shows that system memory is only minimally impacted by the scan.

Fig. 6: A Full System Scan on an 8 CPU, 3GB RAM test machine.

(Note that system uptime is about 70 minutes)

This is because the memory graph uses a metric similar to the one on Task Manager's Processes tab.  In the screenshot above, the highlighted Norton process is using either 42 MB or 28 MB of memory, depending on which number is used.  On Vista, Task Manager displays the second number by default; on XP, it displays the first.  You can add these additional columns via the "View" menu in Task Manager.  Combined, the working sets of the Norton processes total 52 MB.  On this test system, with 3GB of memory, that is approximately 1.7% of system memory.
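The percentage figures quoted for the two test systems are straightforward to verify.  A quick sketch reproducing both calculations:

```python
def percent_of_physical(working_set_mb, physical_mb):
    """Express a combined working set as a percentage of physical memory."""
    return 100.0 * working_set_mb / physical_mb

# The two test systems described in this article:
print(round(percent_of_physical(52, 3 * 1024), 1))  # 52 MB of 3 GB -> ~1.7%
print(round(percent_of_physical(70, 512), 1))       # 70 MB of 512 MB -> ~13.7%
```

The second figure rounds to the "14% of memory" quoted for the 512 MB machine below.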

Running this same test on a different machine (with only 512MB of memory) shows the memory usage a little more clearly.

Fig. 7: A Full System Scan on a single CPU, 512 MB RAM test machine

The combined working sets of the Norton processes total 70 MB, or about 14% of memory, which is displayed on the right side of the memory graph in the screenshot above.

Validating the Memory Graph using Microsoft Performance Monitor

The data reported by the Norton Memory Graph can be cross-referenced using the Microsoft Performance Monitor, which is available on both Windows XP and Windows Vista.  To launch the Performance Monitor on XP, click Start -> Run and type "perfmon.msc"; on Vista, type "perfmon.msc" into the "Start Search" box at the bottom of the Start menu.  On XP, the Performance Monitor loads immediately; on Vista, you'll need to select "Performance Monitor" in the navigation tree.

To set up the counters, right click on the graph and select "Add Counters".  The counters used in the screenshot below are all from the "Process" group: Working Set, Private Bytes, and Virtual Bytes.  In the "instances" list for each counter, all ccSvcHst instances should be selected.  Since these counters are in bytes, they may need to be scaled so they fit on the graph.  This can be done after the counters are added, via the counter properties.
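For reference, the counter paths that result from these selections take the following general form (shown here for a single ccSvcHst instance; when multiple instances are running, Performance Monitor distinguishes them as ccSvcHst#1, ccSvcHst#2, and so on):

```text
\Process(ccSvcHst)\Working Set
\Process(ccSvcHst)\Private Bytes
\Process(ccSvcHst)\Virtual Bytes
```

These same paths can be used anywhere Windows accepts a performance counter path, not just in the Performance Monitor UI.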

Fig. 8: Performance Monitor's "Add Counters" dialog, displaying counters from the "Process" group.

Launching a full system scan will show how the three metrics discussed above behave as the scan begins.  Note the Y axis is not in percentages, but in megabytes for working set and private bytes, and in 100 megabyte units for virtual bytes.

Fig. 9: Performance Monitor collecting data as a Full System Scan begins

Fig. 10: Performance Monitor "Report View" when paused at the state in Fig. 9

Recalling the definitions above, Working Set is the physical memory actively used by the process (highlighted with a bold, black line).  As the full system scan starts, there is a much heavier demand for memory, so the working set of the process increases from about 10 MB to 60 MB.  Private Bytes, indicated by the green line, is the memory committed exclusively by the process.  The difference between private bytes and working set can be described as "the amount of memory a process has asked for" vs. "the amount of physical memory the process is actually using right now."  Finally, the purple line describes Virtual Bytes, the full addressable space of the process.  This value increases as well, as more libraries are loaded, threads are created, and other allocations are made to perform the requested scan.

The working set value maps back to the data reported by Task Manager and the Norton Memory Graph on the test system with 512 MB of memory.


There are a variety of features we are considering for the next version of Norton Performance Graphs, and for the background task scheduling system in general, and we're always open to suggestions.  The goal for this feature is to communicate what activities our Norton product is performing, when these activities are being performed, and what the performance cost of the activity is.  

While Norton Products may not help you get through security lines at the airport any quicker, they will let you get through your work, games, and other daily PC activities faster and more safely, doing their own work when you're not.

Additional Information

Memory Performance Information - Microsoft

This is an article describing how Microsoft Task Manager and Microsoft Performance Monitor interpret the various values describing the memory utilization of a process and the system.


Process Explorer - Sysinternals, Microsoft

This utility can be used instead of Task Manager, and provides much deeper detail into the resource utilization of a process than Task Manager.


Message Edited by MikeO on 09-08-2008 11:22 AM


Sorry to say that, but my impression is that choosing the Working Set counter for the chart served only one purpose: to make users think the memory consumption was much lower than it actually was.

The thing is, I've noticed that the Norton 2009 processes keep trimming their working sets quite aggressively (presumably using the SetProcessWorkingSetSize API call), hence making the numbers look much smaller than they really are (by constantly swapping out all the pages).

This not only provides misleading information to the user, but also somewhat degrades overall system performance: trimming the working set usually causes unnecessary/superfluous page-out/page-in operations that serve no practical purpose to the system and/or the user.

Just my .02 worth. :o)
I found this article very informative! I have a better understanding of what it all means. This is definitely one of the best security suites I have used, very light on resources yet it seems to do its job well.