Large-scale, software-intensive, widely distributed battlefield systems of systems (SoSs), such as the Future Combat Systems (FCS), have several challenging characteristics. Software elements are developed concurrently by different contractors and will (1) be installed in different weapon, sensor, and command-and-control platforms; (2) provide the basis for a shared view of the battlespace across platforms; and (3) enable distributed planning, decision making, and remote engagements. Development usually proceeds in a number of phases, with roughly 18 months to two years between phases. Each phase typically demonstrates the capability to provide the planned functionality and performance for that phase and may spin off some of these capabilities to the field.

A number of mission threads describe the inter-platform and inter-element operations of the system. Mission threads can cover tactical, logistical, and support operations, as well as maintenance, training, test, and development operations. They serve as drivers for developing the architecture and as the basis for test cases during a verification cycle. Each major software element being developed for the SoS may have its own software architecture design documentation (SADD), perhaps built with diverse tools and notations. This diversity makes it difficult to evaluate whether the integrated architecture composed of many of these shared elements will satisfy the mission threads. Moreover, since the architecture is driven primarily by quality attributes, the mission threads must be augmented with quality attribute considerations for architecture development and evaluation activities. This presentation will describe an approach to integrating all of these factors so that a successful evaluation of the SoS architecture against the mission threads can take place.