
American Systems Corporation

B-292755; B-292755.2, Dec 03, 2003

Highlights

American Systems Corporation (ASC) protests the elimination of its proposal from the competitive range under request for proposals No. N61339-02-R-0063, issued by the Naval Air Warfare Center Training Systems Division, for training systems devices and curricula.

We deny the protest.




DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: American Systems Corporation

File: B-292755; B-292755.2

Date: December 3, 2003

Joseph G. Billings, Esq., for the protester.
K. Lisa Daniel, Esq., Department of the Navy, for the agency.
Charles W. Morrow, Esq., and James A. Spangenberg, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Agency reasonably rated proposal as marginal/high risk for technical factor under solicitation requesting proposals for training systems devices and curricula, where the proposal failed to provide sufficient details of its written instructional system development processes and failed to provide sufficient details in its sample task response.

2. Discussions were not misleading, even though, on the basis of an incorrect assumption, protester misinterpreted a particular discussion question, where a reasonably diligent offeror would have correctly understood, or requested clarification of, the agency discussion question.

3. Agency reasonably made one of the awards under solicitation contemplating multiple awards of indefinite-delivery/indefinite-quantity contracts to an offeror whose marginal proposal contained one significant weakness, but not to an offeror whose marginal proposal contained two significant weaknesses under the same technical subfactor (with other aspects of the evaluation being relatively equal), where the agency reasonably determined this was a discriminator between the technical merits of the two proposals that justified award to one and not the other.

DECISION

American Systems Corporation (ASC) protests the elimination of its proposal from the competitive range under request for proposals (RFP) No. N61339-02-R-0063, issued by the Naval Air Warfare Center Training Systems Division, for training systems devices and curricula.

We deny the protest.

The RFP, issued as a partial small business set-aside, was to procure trainer/training systems and technology-based curricula. The RFP contemplated the award of multiple indefinite-delivery/indefinite-quantity (ID/IQ) task order contracts for an 8-year period for two separate contractual lots. Lot I is not at issue here because the protester submitted a proposal only for Lot II. Lot II, involving technology-based curricula, required the contractor to accept task orders to perform planning, analysis, design, development, implementation, evaluation, support, maintenance, modification, modeling and simulation, and management of technology-based training products.

The RFP provided for award of contracts for Lot II to those offerors with proposals representing the greatest value, considering three evaluation factors: technical, past performance, and price. The technical factor comprised two equally weighted subfactors, instructional systems development (ISD) and management. The technical and past performance factors were of equal importance and, when combined, were considered significantly more important than price.

The RFP required proposals to describe the offeror's "formal, written, documented and in-place processes that will be used in the performance of orders" under the ID/IQ contract, and noted that "the Government is concerned that awardees under the [contract] be organizationally mature with established processes and procedures that will ensure repeatable success in performance." RFP L.3.2. In addition, offerors were required to respond to a sample task, primarily at an oral presentation.

Under the technical factor, the RFP stated that the proposals would be evaluated to determine the offerors' ability to plan, analyze, design, develop, implement, evaluate, support, maintain, modify, and manage technology-based training products. Under the ISD subfactor, the RFP stated that proposals would be evaluated to determine the offerors' ability to provide established and proven processes to reliably ensure the successful completion of prospective orders, and that the "Sample Task will be evaluated to ensure incorporation of these processes."[1] RFP M.3.2(a). In responding to the sample task, offerors were required to demonstrate "understanding and application of the ISD process to the necessary courseware development encompassed by this solicitation." RFP L.5.7.2.1.

Twenty-nine offerors, including ASC (a large business and an incumbent contractor), submitted proposals for Lot II. The source selection evaluation board (SSEB) assigned each proposal a qualitative rating and a risk rating for each technical subfactor and a risk rating for past performance.[2]

ASC's initial proposal was rated marginal with high risk under the ISD subfactor and marginal with medium risk under the management subfactor, with low past performance risk. Among the major weaknesses found in ASC's proposal under the ISD subfactor were that it provided insufficient written processes for ISD, and that its sample task technical approach for courseware development was presented only at a high level, with inadequate details on the proposed plan for meeting the sample task requirements.

Based on these evaluation results, a competitive range of the nine most highly rated proposals was established, including those of ASC and Advanced Engineering & Research Associates (AERA), a small business whose initial proposal had received ratings identical to ASC's, but at a lower price.

The agency conducted detailed discussions with each offeror in the competitive range by issuing written evaluation notices (EN), supplemented by oral communications. ASC received numerous ENs encompassing the weaknesses found in its proposal, including some indicating that the agency was concerned about the dearth of details concerning ASC's ISD processes with regard to formative evaluation, and one stating that "[t]he sample task technical approach for courseware development was presented at a high level that furnished inadequate detail on the proposed plan for meeting the Sample Task requirements." Agency Report, Tab 64, EN No. ASC-ISD-11-PC.

Following the receipt of proposal revisions, the SSEB again rated the proposals in the competitive range. ASC's proposal was still rated marginal with high risk under the ISD subfactor, but its management subfactor rating had improved to satisfactory with low risk. AERA's proposal improved to marginal with medium risk under the ISD subfactor and to satisfactory with low risk under the management subfactor.[3]

Based on the evaluation results, the source selection advisory committee (SSAC) recommended award to the six highest-rated offerors, all of which had resolved all deficiencies and major weaknesses and had received at least satisfactory with low risk technical ratings and low risk past performance ratings.[4] Of the lower-rated proposals (including AERA's, ASC's and others), the SSAC recommended award only to AERA because of what the SSAC considered a clear distinction between AERA's proposal and the others.

As between AERA's and ASC's proposals, the SSAC found that AERA had only one remaining significant moderate weakness, concerning the level of detail in its ISD processes; the SSAC found that the discussions with AERA had revealed that AERA's ISD processes were fundamentally sound, even though they lacked detail at the lowest level of its processes.

In contrast, the SSAC found that ASC's proposal still had two major weaknesses and a minor weakness. One major weakness involved ASC's written ISD processes, where the agency found that ASC's discussion of formative evaluation in the evaluation phase of the ISD did not address the process evaluation portion and much of the product evaluation portion of formative evaluation. The other major weakness involved ASC's lack of details concerning the analysis and design phases of courseware development in its response to the sample task. The minor weakness noted was that ASC's ISD processes were scattered across multiple documents, making it difficult to discern the process flow.

Thus, the SSAC concluded that because ASC's proposal had one more major weakness than AERA's, which was viewed as a discriminator between the proposals, and because AERA had submitted a competitive price proposal (lower-priced than ASC's), had a low risk past performance rating, and is a small business (which would result in a stronger small business pool for set-asides made under the ID/IQ contract), AERA should also receive an award. Agency Report, Tab 123, Addend. to SSEB Report, at 5, 7; Tab 125, SSAC Award Recommendation.

ASC's proposal was eliminated from the competition on August 8, and award was made to the seven firms on August 15 without further discussions. After a debriefing, this protest followed.

ASC challenges each aspect of the agency's evaluation. In reviewing a protest of an agency's evaluation of proposals, our Office will not reevaluate proposals, but instead will examine the record to determine whether the agency's judgment was reasonable and consistent with the stated evaluation criteria and applicable statutes and regulations. A protester's mere disagreement with the agency's judgment in its determination of the relative merits of competing proposals does not establish that the evaluation was unreasonable. See SDS Int'l, Inc., B-291183.4, B-291183.5, Apr. 28, 2003, 2003 CPD ¶ 127 at 5-6. Based on our review of the record, we find the agency's evaluation and source selection were reasonable.

As noted above, the Navy found that ASC's written processes were incomplete because the proposal lacked sufficient details of the actual step-by-step processes that ASC would employ during formative evaluation, particularly with regard to process evaluation. See Tr. at 33-34, 36-38. In many cases, the agency found that ASC's proposal identified responsibilities, tasks and procedures, instead of processes. Tr. at 35-37, 60-61, 91-92. For example, in the area of process evaluation, ASC did not provide a detailed description of the formative evaluation process that it would utilize to ensure quality during the analysis, design, and development activities. See Tr. at 29-30. The Navy officials explained that the lack of detail pertaining to formative evaluation in the areas of process and product evaluations caused them to question the adequacy of ASC's written processes for ensuring repeatable success in performance. See Tr. at 13-14, 29, 38.

ASC asserts that its proposal does include formative evaluation elements to ensure instructionally sound courseware throughout the various phases of the ISD process, at a sufficient level of detail. To support this contention, ASC has offered several sworn statements and hearing testimony from a consultant with expertise in the ISD field. Although this individual testified at the hearing conducted by our Office that in his opinion the written processes were adequately described for formative evaluation (albeit scattered throughout several documents in ASC's proposal), he conceded that the proposal did not include the level of step-by-step detail desired by the agency, particularly with respect to the process for ensuring quality as part of the process evaluation subphase. This individual argued that, in his experience, it was not unusual to have less detailed written processes for formative evaluation involving process evaluation and the initial ISD phases when, as was the case here, no specific courseware task had been identified. See Tr. at 50-53.

The Navy officials explained at the hearing, however, that the level of detail reflected in an offeror's written processes, particularly for formative evaluation, provides the agency an opportunity to assess the offeror's ability to succeed on numerous projects involving evolving technologies over the 8-year term of the contract, which would involve dozens of orders and a wide range of courseware projects, customers, and training curricula. See Tr. at 11-12, 62-63, 75. The Navy officials explained that, in their view, the details and quality of a contractor's written processes ensure repeatable success on a long-term basis through an established structured system, and that a contractor that has not provided sufficient formative evaluation detail would cause the agency to have significant concerns about that contractor's performance. See Tr. at 13-14. The Navy officials further testified that the more highly rated proposals included the level of detail found lacking in ASC's written processes, and that the quality of the detail found in the written processes was the basis upon which the agency conducted the evaluation. Tr. at 63-65.

Although it is apparent that formative evaluation was addressed to some extent in ASC's proposal, such as in the area of validation, we find that the agency reasonably determined that ASC's written processes lacked step-by-step detail, particularly concerning the process evaluation subphase. As noted above, the RFP required proposals to describe the offeror's "formal, written, documented and in-place processes that will be used in the performance of orders" under the ID/IQ contract. RFP L.3.2. Consequently, the agency reasonably determined that ASC's failure to adequately address formative evaluation in its proposal constituted a significant weakness. While we recognize that the Navy and ASC's consultant disagree on the required or desired level and adequacy of detail needed to show effective written processes, such a disagreement is not a basis to overturn the agency's evaluation decision. See SDS Int'l, Inc., supra.

The second reason the Navy found warranted assigning a marginal/high risk rating for the ISD subfactor involved ASC's response to the sample task, including its oral presentation, which the Navy found addressed in adequate detail only the development phase of courseware development, but not the analysis and design phases.

ASC concedes that its proposal did not provide the same level of detail for the analysis and design phases as for the development phase, see Tr. at 101-03, 112, but contends that the Navy led it to believe that it needed to address only the development phase in its response to the discussions concerning its sample task response. ASC states that it made this assumption based on its understanding of the EN it received from the Navy on this matter, which referenced only courseware development, and because during oral discussions on this point the Navy had cited a development tool that had been discussed only in the development phase of ASC's plan for implementing the sample task.

According to the parties, the term "courseware development" can, depending on the context, refer either to all phases of ISD, including analysis, design, development, implementation and evaluation, or, more narrowly, only to the actual development phase. Agency Report at 23; Tr. at 95, 106-07; Affidavit of Protester's Consultant (Oct. 5, 2003) at 7. In this regard, the Navy argues that ASC interpreted the discussion question unreasonably because, according to the Navy, the common interpretation of the term "courseware development" covers all five ISD phases, and the word "phase" would ordinarily be added when the reference is to the development phase as a subset of ISD courseware development. The agency advises that this interpretation is consistent with the way the term is used in the RFP, and consistent with how ASC used the terminology in its own proposal.

Although discussions must address at least deficiencies and significant weaknesses identified in proposals, the precise content of discussions is largely a matter of the contracting officer's judgment. We review the adequacy of discussions to ensure that agencies point out weaknesses that, unless corrected, would prevent an offeror from having a reasonable chance for award. Northrop Grumman Info. Tech., Inc., B-290080 et al., June 10, 2002, 2002 CPD ¶ 136 at 6. In conducting discussions, an agency may not prejudicially mislead offerors. Burns and Roe Servs. Corp., B-251969.4, 94-1 CPD ¶ 160 at 4.

Based on our review, we conclude that ASC received meaningful discussions. As indicated by the Navy, the RFP specifically required offerors to discuss all of the processes related to analysis and design in responding to the sample task, and stated that the sample task would be evaluated to ensure incorporation of these processes. See RFP L.5.7.2, M.3.2(a)(1). Furthermore, notwithstanding what may have occurred during oral communications,[5] the written EN specifically requested that ASC address courseware development, with no mention that this subject was limited to the development phase. Given the potential dual meaning of "courseware development" and the context of the EN, as well as the RFP's emphasis on the offeror's ability to demonstrate all of its established, proven processes, we do not think that a reasonably diligent offeror would have interpreted the agency discussions as not applying to the entire courseware development process needed to meet the sample task requirements, without at least first seeking to clarify the matter. Indeed, ASC's representative was cognizant of the possible dual meaning of "courseware development," but admits that it was his own assumption that caused him to believe that a response was required only for the development phase of the firm's courseware plan. Tr. at 99-101. On this record, we find no basis to question the propriety of the agency's discussions.

ASC nevertheless argues that, given its adequate response regarding the development phase, the Navy reasonably could have extrapolated that ASC could sufficiently address the other phases, and that it was evident from ASC's response that the firm had misinterpreted the EN. ASC thus contends that its response should have led to further discussions rather than to its elimination from the competition.

An agency is not obligated to reopen negotiations to give an offeror the opportunity to remedy a defect that first appears in a revised proposal. See Burns and Roe Servs. Corp., supra. Further, the agency's witness testified that she did not consider it logical to assume that an adequate response to the development phase meant that an offeror was capable of developing adequate processes for the other two phases, because she considered the analysis phase to be a particularly difficult aspect of ISD courseware, which has tended to be done poorly by some contractors. Tr. at 117. On the record before us, we find that the Navy had a reasonable basis for attributing a weakness to ASC's proposal as it related to the sample task and the analysis and design phases of its courseware plan.

ASC questions whether the evaluation was fair, given that both AERA's and ASC's proposals received marginal ratings for the ISD subfactor because their written ISD processes lacked detail. However, the record reflects that the agency reasonably considered the additional major weakness associated with ASC's sample task to be a significant discriminator for purposes of making an award selection;[6] indeed, it was because of this additional significant weakness that ASC's proposal was considered inferior to AERA's under the ISD subfactor, as evidenced by ASC's high risk rating (as compared to AERA's medium risk rating) under this subfactor.[7] While ASC argues that its low risk past performance rating should have somehow counterbalanced any risk found under the ISD subfactor, AERA's past performance was also rated low risk, and the evaluation scheme provided for separate evaluations of these two evaluation factors.

Finally, ASC argues that the Navy's evaluation was unreasonable because it should have received an overall acceptable rating. In this regard, ASC contends that the management and ISD subfactors (for which ASC's proposal received satisfactory and marginal ratings, respectively) should have been averaged to arrive at an overall technical score, an approach it asserts was indicated by the equal weight attached to these two subfactors. Regardless of the logic of this argument, which assumes that the agency should round up (to satisfactory) and not down (to marginal), the record indicates that the Navy source selection officials individually considered the ratings of each proposal under the various subfactors, looked behind the ratings to determine their basis, and reasonably determined that ASC's overall technical rating should be marginal with high risk, and that AERA's proposal was technically superior to ASC's in a significant way.[8] Thus, we find no basis to conclude that the Navy acted improperly in eliminating ASC's proposal from the competition.

The protest is denied.

Anthony H. Gamboa
General Counsel









[1]
[2]
[3]
[4]
[5]
[6]
[7]
[8] See SDS Int'l, Inc., supra.
