
Architecture Tradeoff Analysis Method Collection

This collection contains resources about the Architecture Tradeoff Analysis Method (ATAM), a method for evaluating software architectures against quality attribute goals.

Publisher:

Software Engineering Institute

The Architecture Tradeoff Analysis Method (ATAM) is a method for evaluating software architectures relative to quality attribute goals. ATAM evaluations expose architectural risks that potentially inhibit the achievement of an organization's business goals. The ATAM gets its name because it not only reveals how well an architecture satisfies particular quality goals, but it also provides insight into how those quality goals interact with each other—how they trade off against each other.

The ATAM is the leading method in the area of software architecture evaluation. An evaluation using the ATAM typically takes three to four days and brings together a trained evaluation team, architects, and representatives of the architecture's various stakeholders.

Challenges

Most complex software systems are required to be modifiable and to perform well. They may also need to be secure, interoperable, portable, and reliable. But for any particular system:

  • What, precisely, do quality attributes such as modifiability, security, performance, and reliability mean?
  • Can a system be analyzed to determine whether it has these desired qualities?
  • How soon can such an analysis occur?
  • How do you know whether a software architecture is suitable without having to build the system first?

Description

Business drivers and the software architecture are elicited from project decision makers. The business drivers are refined into scenarios, and the architectural decisions made in support of each scenario are identified. Analysis of these scenarios and decisions results in the identification of risks, non-risks, sensitivity points, and tradeoff points in the architecture. The risks are then synthesized into a set of risk themes, each showing how the risks it groups threaten a business driver.
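The flow just described produces a small, well-defined set of artifacts. As an illustration only (the class and function names below are hypothetical, not part of the ATAM definition), those artifacts can be sketched as simple record types, with risk-theme synthesis as a grouping step over the risk findings:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class Finding:
    """One analysis outcome tied to an architectural decision."""
    kind: str       # "risk", "non-risk", "sensitivity-point", or "tradeoff-point"
    decision: str   # the architectural decision that was analyzed
    rationale: str  # why the decision helps or hinders a quality attribute

@dataclass
class RiskTheme:
    """A synthesis of related risks and the business driver they threaten."""
    name: str
    threatened_driver: str
    risks: List[Finding] = field(default_factory=list)

def synthesize_risk_themes(
    findings: List[Finding],
    theme_of: Callable[[Finding], Tuple[str, str]],
) -> List[RiskTheme]:
    """Group risk findings into themes using a caller-supplied
    (theme name, threatened business driver) assignment."""
    themes: Dict[str, RiskTheme] = {}
    for f in findings:
        if f.kind != "risk":
            continue  # non-risks, sensitivity points, and tradeoffs are reported separately
        name, driver = theme_of(f)
        themes.setdefault(name, RiskTheme(name, driver)).risks.append(f)
    return list(themes.values())
```

In a real evaluation the theme assignment is a judgment made by the evaluation team, not a function; the sketch only shows how each theme ties a cluster of risks back to the business driver it threatens.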

The ATAM consists of nine steps:

  1. Present the ATAM. The evaluation leader describes the evaluation method to the assembled participants, tries to set their expectations, and answers questions they may have.
  2. Present business drivers. A project spokesperson (ideally the project manager or system customer) describes what business goals are motivating the development effort and hence what will be the primary architectural drivers (e.g., high availability, time to market, or high security).
  3. Present architecture. The architect will describe the architecture, focusing on how it addresses the business drivers.
  4. Identify architectural approaches. Architectural approaches are identified by the architect, but are not analyzed.
  5. Generate quality attribute utility tree. The quality factors that comprise system "utility" (performance, availability, security, modifiability, usability, etc.) are elicited, specified down to the level of scenarios, annotated with stimuli and responses, and prioritized.
  6. Analyze architectural approaches. Based on the high-priority factors identified in Step 5, the architectural approaches that address those factors are elicited and analyzed (for example, an architectural approach aimed at meeting performance goals will be subjected to a performance analysis). During this step, architectural risks, sensitivity points, and tradeoff points are identified.
  7. Brainstorm and prioritize scenarios. A larger set of scenarios is elicited from the entire group of stakeholders. This set of scenarios is prioritized via a voting process involving the entire stakeholder group.
  8. Analyze architectural approaches. This step reiterates the activities of Step 6, but using the highly ranked scenarios from Step 7. Those scenarios are considered to be test cases to confirm the analysis performed thus far. This analysis may uncover additional architectural approaches, risks, sensitivity points, and tradeoff points, which are then documented.
  9. Present results. Based on the information collected in the ATAM (approaches, scenarios, attribute-specific questions, the utility tree, risks, non-risks, sensitivity points, tradeoffs), the ATAM team presents the findings to the assembled stakeholders.
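The utility tree of Step 5, and the way its high-priority scenarios feed the analysis of Step 6, can be sketched in code. This is a minimal illustration, not part of the ATAM definition: the names are hypothetical, and the (H, M, L) importance/difficulty rankings follow common ATAM practice for prioritizing scenarios.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Scenario:
    """A concrete quality attribute scenario (stimulus and response folded
    into the text), ranked for business importance and difficulty."""
    text: str
    importance: str   # "H", "M", or "L" — value to the business
    difficulty: str   # "H", "M", or "L" — perceived difficulty to achieve

@dataclass
class UtilityNode:
    """One node of the utility tree: the root ("utility"), a quality
    attribute (e.g., "performance"), or a refinement of one."""
    name: str
    children: List["UtilityNode"] = field(default_factory=list)
    scenarios: List[Scenario] = field(default_factory=list)

def high_priority(node: UtilityNode) -> List[Scenario]:
    """Walk the tree and collect scenarios ranked (H, H) — typically the
    first candidates for the Step 6 analysis."""
    hits = [s for s in node.scenarios
            if s.importance == "H" and s.difficulty == "H"]
    for child in node.children:
        hits.extend(high_priority(child))
    return hits
```

For example, a "latency" refinement under "performance" might hold a scenario such as "Deliver a position update to the operator within 1 s under peak load," ranked (H, H); the walk above would surface it for analysis.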

The most important results are improved architectures. The output of an ATAM is an outbrief presentation and/or a written report that includes the major findings of the evaluation. These are typically

  • the set of architectural approaches identified
  • a "utility tree"—a hierarchic model of the driving architectural requirements
  • the set of scenarios generated and the subset that were mapped onto the architecture
  • a set of quality-attribute-specific questions that were applied to the architecture and the responses to these questions
  • a set of identified risks
  • a set of identified non-risks
  • a synthesis of the risks into a set of risk themes that threaten to undermine the business goals for the system

Benefits

  • risks identified early in the life cycle
  • increased communication among stakeholders
  • clarified quality attribute requirements
  • improved architecture documentation
  • documented basis for architectural decisions

The ATAM aids in eliciting sets of quality requirements along multiple dimensions, analyzing the effects of each requirement in isolation, and then understanding how these requirements interact.

Who Would Benefit

Many people have a stake in a system's architecture, and all of them exert whatever influence they can on the architect(s) to make sure that their goals are addressed. For example, the users want a system that is easy to use and has rich functionality. The maintenance organization wants a system that is easy to modify. The developing organization (as represented by management) wants a system that is easy to build and that will employ the existing work force to good advantage. The customer (who pays the bill) wants the system to be built on time and within budget. All of these stakeholders will benefit from applying the ATAM. And needless to say, the architect is also a primary beneficiary.

ATAM: Method for Architecture Evaluation

August 2000

This report presents technical and organizational foundations for performing architectural analysis, and presents the SEI's ATAM, a technique for analyzing software architectures.

Evaluating Software Architectures: Methods and Case Studies

October 2001

This book is a comprehensive guide to software architecture evaluation, describing specific methods that can quickly and inexpensively mitigate enormous risk in software projects.

Impact of Army Architecture Evaluations

April 2009

This 2009 report describes the results of a study of the impact that the ATAM evaluations and QAWs had on Army programs.

Categorizing Business Goals for Software Architectures

December 2005

This report provides a categorization of possible business goals for software-intensive systems, so that individuals have some guidance in the elicitation, expression, and documentation of business goals.

Risk Themes Discovered Through Architecture Evaluations

September 2006

This 2006 report analyzes the output of 18 evaluations conducted using the Architecture Tradeoff Analysis Method (ATAM). The goal of the analysis was to find patterns in the risk themes identified during those evaluations.

Progress Toward an Organic Software Architecture Capability in the U.S. Army

June 2007

This 2007 report describes the Software Architecture Initiative of the Army Strategic Software Improvement Program.

Risk Themes from ATAM Data: Preliminary Results

April 2006

In this 2006 presentation, Len Bass, Robert Nord, and William G. Wood of the Software Engineering Institute (SEI) present a preliminary analysis of the results of a collection of ATAMs.

Using the SEI Architecture Tradeoff Analysis Method to Evaluate WIN-T: A Case Study

September 2005

This report describes the application of the SEI ATAM (Architecture Tradeoff Analysis Method) to the U.S. Army's Warfighter Information Network-Tactical (WIN-T) system.

Integrating the Architecture Tradeoff Analysis Method (ATAM) with the Cost Benefit Analysis Method (CBAM)

December 2003

This technical note reports on a proposal to integrate the SEI ATAM (Architecture Tradeoff Analysis Method) and the CBAM (Cost Benefit Analysis Method).

Using the Architecture Tradeoff Analysis Method (ATAM) to Evaluate the Software Architecture for a Product Line of Avionics Systems: A Case Study

July 2003

This 2003 technical note describes an ATAM evaluation of the software architecture for an avionics system developed for the Technology Applications Program Office (TAPO) of the U.S. Army Special Operations Command Office.

SEI Architecture Analysis Techniques and When to Use Them

October 2002

When analyzing system and software architectures, the Quality Attribute Workshop (QAW) and the Architecture Tradeoff Analysis Method (ATAM) can be used in combination to obtain early and continuous benefits.

Use of the Architecture Tradeoff Analysis Method (ATAM) in Source Selection of Software-Intensive Systems

June 2002

This report explains the role of software architecture evaluation in a source selection and describes the contractual elements that are needed to support its use.

Using the Architecture Tradeoff Analysis Method to Evaluate a Wargame Simulation System: A Case Study

December 2001

This report describes the application of the ATAM (Architecture Tradeoff Analysis Method) to a major wargaming simulation system.

Applicability of General Scenarios to the Architecture Tradeoff Analysis Method

October 2001

In this report, we compare the scenarios elicited from five ATAM (Architecture Tradeoff Analysis Method) evaluations with the scenarios used to characterize the quality attributes.

Use of the ATAM in the Acquisition of Software-Intensive Systems

September 2001

This report discusses the role of software architecture evaluations in a system acquisition and describes the contractual elements that are needed to accommodate architecture evaluations in an acquisition. The report also provides an example of contractual language that incorporates the ATAM as a software architecture evaluation method in a system acquisition.

An Evaluation Theory Perspective of the Architecture Tradeoff Analysis Method (ATAM)

September 2000

This report analyzes and identifies the Architecture Tradeoff Analysis Method (ATAM)'s evaluation process and criteria, as well as its data-gathering and synthesis techniques, and more.

Using the Architecture Tradeoff Analysis Method to Evaluate a Reference Architecture: A Case Study

June 2000

This report describes the application of the ATAM (Architecture Tradeoff Analysis Method) to evaluate a reference architecture for ground-based command and control systems.

Software Architecture Evaluation with ATAM in the DoD System Acquisition Context

September 1999

This report explains the basics of software architecture and software architecture evaluation in a system acquisition context.

The Architecture Tradeoff Analysis Method

July 1998

This paper presents the Architecture Tradeoff Analysis Method (ATAM), a structured technique for understanding the tradeoffs inherent in the architectures of software-intensive systems.

Steps in an Architecture Tradeoff Analysis Method: Quality Attribute Models and Analysis

May 1998

This paper presents some of the steps in an emerging architecture tradeoff analysis method (ATAM).

The Architecture Tradeoff Analysis Method

April 1998

This paper presents the Architecture Tradeoff Analysis Method (ATAM), a structured technique for understanding the tradeoffs inherent in design.