Finding Performance Bottlenecks Using Rational Quantify and Rational PurifyPlus (Windows)

Purpose

This tool mentor provides an overview of how to use Rational Quantify® to quickly pinpoint performance bottlenecks in Visual C/C++, Visual Basic, Java, and Visual Studio .NET managed applications. This tool mentor applies to systems running Microsoft Windows.

PurifyPlus is a Rational product that includes Quantify functionality.

To learn more about Quantify, including how to project the effect of performance improvements, interpret source-code annotations, compare program runs, and fine-tune data collection, read the Getting Started manual for the PurifyPlus product family (Windows version). 

For step-by-step information about using Quantify, see the Quantify online Help.

Overview

Quantify provides a complete, accurate, and easy-to-interpret set of performance data for your program and its components, so that you can identify and eliminate performance bottlenecks in your code.  

Tool Steps

To improve a program's performance:

    1. Run the program using Quantify to collect performance data
    2. Use the Quantify data analysis tool to find and diagnose bottlenecks
    3. Eliminate bottlenecks and rerun to verify improvements

1. Run the program using Quantify to collect performance data

The first step in improving your program's performance is to run the program under Quantify to collect performance data.

You can do this by starting Quantify from your desktop, clicking the Run button on the Welcome screen, and specifying and running your program in the Run Program dialog. You can also run a program under Quantify from within Microsoft Visual Studio or Microsoft Visual Basic; first select the menu item Quantify > Engage Quantify Integration, then run your program as usual.

As you exercise your code, Quantify records data about your program's performance and displays the activity of its threads and fibers. You can pause and resume data recording at any time, and so limit profiling to specific portions of code. You can also take a series of snapshots of the current data and examine performance in stages.
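
If you prefer to control recording from inside the program itself rather than from the toolbar, PurifyPlus also provides a small C API for pausing and resuming data recording. The following is a minimal sketch, assuming the pure.h header shipped with PurifyPlus and the QuantifyStartRecordingData and QuantifyStopRecordingData calls described in the online Help (look up API functions); treat the exact names and build steps as assumptions to verify against your installed version.

    // Sketch: restrict profiling to one region of interest by pausing and
    // resuming data recording from inside the program. Assumes the pure.h
    // API header shipped with PurifyPlus; check the online Help topic on
    // API functions for the exact names and how to link the API stubs.
    #include "pure.h"

    static void startup_work()
    {
        // Initialization we do not want in the performance profile.
    }

    static void workload_of_interest()
    {
        // The code whose performance we want to measure.
    }

    int main()
    {
        QuantifyStopRecordingData();    // pause recording during startup
        startup_work();

        QuantifyStartRecordingData();   // resume for the region of interest
        workload_of_interest();
        QuantifyStopRecordingData();    // pause again before shutdown

        return 0;
    }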

When you exit your program, Quantify has an accurate profile of its performance. Because this base dataset can be very large, Quantify applies default filters to hide non-critical data from modules such as system libraries before it displays the performance profile. You can choose to display more or less data as you proceed with your analysis.

Tip: You can also use Quantify's command line interface to incorporate it into your test scripts, makefiles, and batch files for automated testing. For instructions, look up scripts in the Quantify online Help index. 

For more information, look up the following topics in the Quantify online Help index: 

  • running programs
  • run summary
  • recording data

2. Use the Quantify data analysis tool to find and diagnose bottlenecks

The second step in improving your program's performance is to analyze the performance data to find and diagnose bottlenecks.

When you exit the program for which Quantify has been collecting data, it displays the Call Graph window, which graphically depicts the calling structure and performance of the functions, procedures, or methods (collectively referred to here as functions) in the program. By default, the call graph displays the top 20 functions in the current dataset by function + descendants (F+D) time, that is, the time spent in a function itself plus the time spent in all the functions it calls. Quantify's results include virtually no overhead from the profiling process itself. The numbers you see are the time your program would take without Quantify.
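
To make the distinction concrete, the hypothetical program below shows where the two measurements typically accumulate (the names and the commentary are illustrative, not actual Quantify output): the leaf function gathers most of the function (F) time, while its caller gathers a large F+D time even though its own body does little work.

    // Hypothetical illustration of F time versus F+D time.
    #include <cmath>
    #include <cstdio>

    // Leaf function: does the real work, so it accumulates most of the
    // function (F) time.
    double score(double x)
    {
        return std::sqrt(x) * std::log(x + 1.0);
    }

    // Caller: its own body is cheap (small F time), but because it calls
    // score() a million times its function + descendants (F+D) time is
    // large, so it would appear near the top of a call graph sorted by F+D.
    double total_score(int n)
    {
        double total = 0.0;
        for (int i = 1; i <= n; ++i)
            total += score(static_cast<double>(i));
        return total;
    }

    int main()
    {
        std::printf("%f\n", total_score(1000000));
        return 0;
    }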

The call graph also highlights the most expensive path; thicker lines indicate more expensive paths. You can highlight other functions based on various criteria, including performance, calling relationships, and possible causes for bottlenecks. You can also show additional functions, hide functions, and move functions around to help with interpreting the call graph.

You can use Quantify's other data analysis windows for further examination of the program's performance. To review all functions in the current dataset, and to sort them by various criteria, use the Function List window. To display tabular and graphical data for a specific function, including data about its callers and descendants, use the Function Detail window. If debug data was available when you ran the program and you measured functions at line level, you can also use the Annotated Source window to analyze a specific function's performance line by line.

Quantify provides several ways to reduce large datasets and display only the data you're interested in. For example, you can specify filters to hide functions based on module name, pattern (for example, functions with CWnd in their name), or measurement type (for example, all waiting and blocking functions). You can also focus on a specific subtree.

You can easily analyze the performance of the program over several runs by merging the separate runs to create a new dataset. 

With the data you collect, you will be able to identify performance bottlenecks such as needless computations and recomputations, premature computations, or excessive and expensive library calls.
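
As a simple illustration of the recomputation pattern, the hypothetical fragment below calls std::strlen on an unchanging string once per loop iteration; hoisting that call out of the loop, as in the second function, removes the repeated work. The names and scenario are invented for illustration; a rerun under Quantify (step 3) would confirm the improvement.

    // Hypothetical example of a needless recomputation and its fix.
    #include <cstddef>
    #include <cstdio>
    #include <cstring>

    // Before: strlen() is recomputed on every iteration even though the
    // string never changes, so under Quantify it would show up with a
    // surprisingly high call count and F time.
    std::size_t count_spaces_slow(const char* text)
    {
        std::size_t count = 0;
        for (std::size_t i = 0; i < std::strlen(text); ++i)
            if (text[i] == ' ')
                ++count;
        return count;
    }

    // After: the length is computed once, eliminating the repeated calls.
    std::size_t count_spaces_fast(const char* text)
    {
        std::size_t count = 0;
        const std::size_t length = std::strlen(text);
        for (std::size_t i = 0; i < length; ++i)
            if (text[i] == ' ')
                ++count;
        return count;
    }

    int main()
    {
        const char* sample = "a short sample string";
        std::printf("%zu %zu\n", count_spaces_slow(sample), count_spaces_fast(sample));
        return 0;
    }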

For more information, look up the following topics in the Quantify online Help index:

  • call graph window
  • function list window
  • function detail window
  • annotated source window
  • highlighting functions
  • filtering data
  • subtrees

3. Eliminate bottlenecks and rerun to verify improvements

The third and final step in improving your program's performance is to modify your code to eliminate bottlenecks and then to compare the performance of the original code with that of the modified code.

After you make changes to your code, rerun the updated program under Quantify. Then compare the new results to the previous run by creating a "diff" dataset, which gives a clear indication of performance improvements and regressions. The call graph for this dataset highlights improvements in green and regressions in red; the function list displays the differences between the two runs, as well as the original data from each run.

Use the Quantify Navigator window to keep track of all the runs you're working with. You can save performance data as a Quantify data file (.qfy) to use for further analysis or to share with other Quantify users. You can save data to a tab-delimited ASCII text file (.txt) to use outside of Quantify, for example, in test scripts or in Microsoft Excel. You can also copy data directly from the Function List window to use in Excel. 
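
As a sketch of how an exported text file might be consumed outside Quantify, for example from a test harness, the fragment below reads a tab-delimited export and echoes each row's fields. The file name results.txt is hypothetical, and the actual column layout depends on which columns were displayed when the data was saved.

    // Sketch: read a tab-delimited Quantify export (.txt) row by row.
    // "results.txt" is a hypothetical file name; the column layout mirrors
    // whatever columns were visible when the data was saved.
    #include <cstddef>
    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    int main()
    {
        std::ifstream in("results.txt");
        if (!in) {
            std::cerr << "cannot open results.txt\n";
            return 1;
        }

        std::string line;
        while (std::getline(in, line)) {
            std::vector<std::string> fields;
            std::stringstream row(line);
            std::string field;
            while (std::getline(row, field, '\t'))
                fields.push_back(field);   // split the row on tab characters

            for (std::size_t i = 0; i < fields.size(); ++i)
                std::cout << (i ? " | " : "") << fields[i];
            std::cout << '\n';
        }
        return 0;
    }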

For more information, look up the following topics in the Quantify online Help index:

  • comparing runs
  • navigator window
  • saving data

