Science and Technology Corporation

B-422601 Aug 23, 2024

Decision

Matter of: Science and Technology Corporation

File: B-422601

Date: August 23, 2024

Robert J. Symon, Esq., Nathaniel J. Greeson, Esq., Patrick R. Quigley, Esq., and Owen E. Salyers, Esq., Bradley Arant Boult Cummings LLP, for the protester.
Francis E. Purcell Jr., Esq., and Joseph R. Berger, Esq., Thompson Hine LLP, for Analytical Mechanics Associates, Inc., the intervenor.
Michael G. Anderson, Esq., H. Gray Marsee, Esq., and Shannon A. Sharkey, Esq., National Aeronautics and Space Administration, for the agency.
Jacob Talcott, Esq., and Jennifer D. Westfall-McGrail, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest challenging the agency’s evaluation of proposals is denied where the evaluation was reasonable and in accordance with the terms of the solicitation.

DECISION

Science and Technology Corporation (STC), a small business of Hampton, Virginia, protests the award of a contract to Analytical Mechanics Associates, Inc. (AMA), also a small business of Hampton, Virginia, under request for proposals (RFP) No. 80ARC023R0006, issued by the National Aeronautics and Space Administration (NASA) for aircraft and spaceflight systems engineering support services. The protester contends the agency unreasonably evaluated proposals under the mission suitability and past performance factors, made unreasonable cost adjustments, failed to conduct meaningful discussions, and made an unreasonable best-value tradeoff decision.

We deny the protest.

BACKGROUND

On March 16, 2023, the agency issued the RFP as a set-aside for small businesses in accordance with Federal Acquisition Regulation (FAR) part 15 and NASA FAR Supplement (NFS) subpart 1815.3. Agency Report (AR), Tab 3b, RFP sections B‑M at 135; Contracting Officer’s Statement (COS) at 1, 4.[1] The agency sought to procure a variety of services, including scientific research, engineering services, and program/project management support services.[2] AR, Tab 3c, Performance Work Statement (PWS) at 2. The solicitation contemplated award of a contract with fixed-price contract line item numbers (CLINs) for phase-in and core contract management requirements, and an indefinite-delivery, indefinite-quantity (IDIQ) CLIN under which the agency would issue cost-plus-fixed-fee task orders. COS at 1. The period of performance included a 1-year base period with four 1-year option periods. Id. The due date for proposals was May 12, 2023. AR, Tab 3b, RFP sections B-M at 102.

The solicitation provided for the evaluation of proposals based on the following factors: mission suitability, past performance, and cost/price. Id. at 135. The mission suitability factor consisted of two subfactors: management approach and technical approach. Id. at 139. Award would be made on a best-value tradeoff basis where mission suitability was more important than past performance, past performance was more important than cost/price, and, when combined, mission suitability and past performance were significantly more important than cost/price. Id. at 146.

For the management approach subfactor, the agency would evaluate the effectiveness and efficiency of the offeror’s approach to organizational structure/partnering, as well as its approach to staffing, recruitment, retention, training, and its phase-in plan. Id. at 140. The agency would also evaluate the effectiveness of the offeror’s approach to fostering innovation, the reasonableness of its compensation plan, and its representation of limited rights data and restricted computer software. Id. The solicitation further provided that if the source selection board determined that the offeror failed to adequately demonstrate the ability to perform with the resources proposed, it might assign a weakness or perform a probable cost adjustment. Id. at 111.

For the technical approach subfactor, the agency would evaluate the offeror’s approach to three sample task orders. Id. at 141. The three sample task orders covered the following three areas, respectively: systems analysis, entry systems, and launch vehicle analysis. Id. at 114‑115. For sample task order one (STO1), offerors were to provide a conceptual design of an entry vehicle capable of performing the mission, including a nominal trajectory, for preliminary mission design studies. Id. at 114. For sample task order two (STO2), offerors were to perform design, certification, and instrumentation of the forebody thermal protection system (TPS). Id. at 115. For sample task order three (STO3), offerors were to perform launch vehicle buffet and vibroacoustic environment analyses. Id. For each of the sample task orders, offerors were to provide three technical risks most critical to successful performance of the sample task order. Id. at 117‑120.

Under the mission suitability factor, proposals would receive a numerical score on a 1000-point scale. Id. at 139. A proposal could receive up to a total of 550 points for the management approach subfactor and up to a total of 450 points for the technical approach subfactor. Id. Depending on the point scores assigned, proposals would receive an adjectival rating of excellent, very good, good, fair, or poor.[3] Id. at 138.
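
Because the adjectival ratings are defined as percentile ranges of the available points (see note 3), the score-to-rating mapping can be expressed as a short sketch. This is purely illustrative and not part of the record; the function name and boundary handling are assumptions based on the stated ranges.

```python
# Illustrative sketch of the RFP's score-to-rating mapping (note 3);
# not the agency's tool -- thresholds follow the stated percentile ranges.
def adjectival_rating(points: float, maximum: float) -> str:
    pct = 100 * points / maximum
    if pct >= 91:
        return "Excellent"
    if pct >= 71:
        return "Very Good"
    if pct >= 51:
        return "Good"
    if pct >= 31:
        return "Fair"
    return "Poor"

# The evaluated scores reported below reproduce the assigned ratings:
assert adjectival_rating(286, 550) == "Good"         # STC, management (52%)
assert adjectival_rating(229.5, 450) == "Good"       # STC, technical (51%)
assert adjectival_rating(379.5, 550) == "Good"       # AMA, management (69%)
assert adjectival_rating(355.5, 450) == "Very Good"  # AMA, technical (79%)
```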

For past performance, the agency would evaluate recent and relevant past performance references to determine the offeror’s quality of performance in meeting contractual requirements and management requirements for the submitted references. Id. at 142. Proposals would receive a rating of very high level of confidence, high level of confidence, moderate level of confidence, low level of confidence, very low level of confidence, or a neutral rating.[4] Id. at 143‑144.

For cost/price, the agency would perform two analyses. See id. at 144‑145. For the offeror’s cost-plus-fixed-fee CLIN, the agency would perform a cost realism analysis to determine the probable cost to the government of performance. Id. at 144. For the fixed-price CLINs, the agency would evaluate pricing in accordance with FAR subsection 15.404-1(b)(2). Id. at 145. The agency would calculate the total evaluated price by adding the probable cost for the cost-plus-fixed-fee CLIN to the prices of the fixed-price CLINs. Id.
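
In other words, the total evaluated price is a simple sum of the realism-adjusted cost-plus-fixed-fee amount and the as-proposed fixed prices. A minimal illustration follows; the dollar figures are invented for the example and are not drawn from the record.

```python
# Hypothetical figures only -- illustrating the total evaluated price formula.
probable_cost_cpff_clin = 95_000_000  # CPFF IDIQ CLIN after cost realism adjustment
fixed_price_clins = 4_000_000         # phase-in and core contract management CLINs
total_evaluated_price = probable_cost_cpff_clin + fixed_price_clins  # 99,000,000
```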

The agency received proposals from STC and AMA by the due date. COS at 4. The evaluation results were as follows:

 

                                     STC                         AMA

Management Approach                  Good                        Good
(550 Point Maximum)                  (286 points)                (379.5 points)

Technical Approach                   Good                        Very Good
(450 Point Maximum)                  (229.5 points)              (355.5 points)

Total Mission Suitability Points     515.5 points                735 points
(1000 Point Maximum)

Past Performance                     High Level of Confidence    Very High Level of Confidence

Proposed Cost/Price                  $90,995,657                 $97,823,235

Probable Cost/Price                  $99,174,257                 $103,763,492

AR, Tab 10, Source Evaluation Board (SEB) Evaluation Report at 17. In conducting its evaluation, the SEB assigned STC’s proposal one strength and one weakness under the management approach subfactor, and one strength and three weaknesses under the technical approach subfactor.[5] Id. at 70. The SEB assigned AMA’s proposal five strengths and no weaknesses under the management approach subfactor, and one significant strength, three strengths, and one weakness under the technical approach subfactor. Id.

The strength that STC’s proposal received under the management approach subfactor was for STC’s proposed use of International Organization for Standardization (ISO) 9001 and Capability Maturity Model Integration (CMMI) Level 2 certifications, which the SEB concluded would benefit the agency by reducing risk and increasing effectiveness and efficiency in the performance of contract management functions. Id. at 35. The weakness under this subfactor was due to STC’s failure to describe how it intended to coordinate entities and personnel to accomplish the following requirements in section 2.1.1 of the PWS: safety and health, records management, secure handling and protection of government-controlled, contractor-generated data, information technology security, and installation-accountable government property. Id.

Under the technical approach subfactor, the SEB assigned STC’s proposal a strength for an effective approach to [DELETED] in the performance of STO3. The SEB concluded that this approach would likely enhance efficiency by reducing unnecessary [DELETED], which, in turn, would enhance the potential for successful contract performance. Id. at 59. STC’s proposal also received one weakness for its response to STO1 and two weaknesses for its response to STO2. Id. at 59-60. For STO1, the SEB concluded that STC’s approach for [DELETED] contained an inconsistency because STC proposed to develop [DELETED] using [DELETED] tools while stating in another area of its proposal that it would use a [DELETED]. Id. at 59. For STO2, the SEB concluded that STC’s proposal failed to provide an adequate discussion of its approach to develop [DELETED] deliverables. Id. at 60. The SEB also concluded that STC’s proposed [DELETED] was incomplete due to the omission of an approach to [DELETED]. Id.

In conducting the best-value tradeoff, the source selection authority (SSA) concluded that AMA’s proposal offered “a substantial discernible advantage” over STC’s proposal under the mission suitability factor. AR, Tab 11, Source Selection Decision Document (SSDD) at 12. The SSA explained that the strength assigned to AMA’s proposal under the management approach subfactor for its effective approach to using a certified quality management system was somewhat similar to the strength assigned to STC’s proposal for its certified business and task order processes. Id. at 11. Given that AMA’s proposal received four additional strengths, and STC’s proposal received a weakness for its failure to adequately describe its proposed organizational structure, the SSA concluded that AMA’s proposal would have a greater chance of successful performance. Id.

The SSDD reflects that, under the technical approach subfactor, the SSA was “particularly impressed” with AMA’s approach to performing the TPS material testing requirement under STO2 because it “illustrated an in-depth understanding of the many interrelated and multidisciplinary tasks.” Id. When comparing this evaluation to the evaluation of STC’s proposal, the SSA acknowledged that STC proposed an effective approach to [DELETED] under STO3 but raised concern over the three weaknesses STC’s proposal received for STO1 and STO2. Id. Ultimately, considering the evaluations under both subfactors, the SSA concluded that AMA’s proposal was the strongest overall under the mission suitability factor. Id.

In reviewing the past performance evaluation, the SSA noted that while the ratings were close, AMA proposed the most highly relevant efforts. Id. at 12. The SSA also explained that AMA had a higher quality of performance when compared to STC, noting that the quality of AMA’s past performance ranged from “exceptional merit” to “very effective” while STC’s past performance ranged from “exceptional merit” to “effective.” Id. In light of these considerations, the SSA concluded that AMA’s past performance was more advantageous. Id.

The final discriminator in the SSA’s best-value tradeoff analysis was cost/price. Id. The SSA noted that although AMA’s probable cost/price was approximately five percent higher than STC’s cost/price, this difference was not enough to overcome the advantages of AMA’s proposal under the mission suitability and past performance factors, particularly as cost/price was the least important evaluation factor. Id. Ultimately, the SSA concluded that AMA’s proposal represented the best value to the agency and selected AMA for award. Id.
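
The roughly five percent figure is consistent with the probable cost/price values reported in the evaluation table above, as the following quick check shows (the two dollar amounts are from the record; the arithmetic itself is illustrative):

```python
# Check of the SSA's "approximately five percent" figure using the
# probable cost/price values from the evaluation results above.
stc_probable = 99_174_257
ama_probable = 103_763_492
difference_pct = 100 * (ama_probable - stc_probable) / stc_probable
print(round(difference_pct, 1))  # 4.6 -- i.e., roughly five percent
```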

On May 9, 2024, the agency notified STC that it had selected AMA’s proposal for award. AR, Tab 12b, Award Notification at 2. STC requested a debriefing the same day. AR, Tab 12c, Request for Debrief at 2. The agency held an oral debriefing on May 15. AR, Tab 12d, Debriefing at 1. STC filed this protest with our Office on May 20.

DISCUSSION

STC challenges the evaluation of its proposal and AMA’s proposal under the mission suitability factor. Protest at 10. Under the management approach subfactor, STC contends that the agency should have assigned its proposal two significant strengths and no weaknesses. Id. at 10‑13. Under the technical approach subfactor, STC argues that its proposal should have received two strengths or one significant strength, and no weaknesses. Id. at 14. STC also contends that the agency unreasonably assigned AMA’s proposal two strengths under the technical approach subfactor, and that the agency engaged in disparate treatment by assigning AMA’s proposal, but not STC’s proposal, a significant strength here. Id. at 20‑23. STC further argues that the agency unreasonably evaluated STC’s past performance, and that the agency made improper cost adjustments to its proposal. Id. at 23, 30. Last, STC argues that the agency failed to hold meaningful discussions and conducted an unreasonable best‑value tradeoff. Id. at 47‑50. For reasons discussed below, we deny the protest.[6]

In reviewing a protest challenging an agency’s evaluation, our Office will not reevaluate proposals or substitute our judgment for that of the agency, as the evaluation of proposals is a matter within the agency’s discretion. The Bionetics Corp., B-420272, Jan. 7, 2022, 2022 CPD ¶ 27 at 3. Rather, we will review the record to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and applicable procurement statutes and regulations. Id. A protester’s disagreement with the agency’s judgment, without more, is insufficient to establish that an evaluation was unreasonable. Id.

Evaluation of STC’s Proposal under the Management Approach Subfactor

STC first argues that the agency unreasonably failed to assign its proposal a significant strength for its proposed use of the CMMI Level 2 and ISO 9001 certifications. Protest at 10‑11. According to STC, the assignment of merely a strength (as opposed to a significant strength) here was inconsistent with the SEB’s earlier finding that these certifications demonstrated that STC proposed a robust quality management system that would “greatly enhance” the potential for successful performance of the contract. Id. at 11. In response, the agency argues that assignment of a strength was reasonable and STC’s contention that these certifications would “greatly enhance” the potential for successful contract performance is solely STC’s judgment and not the assessment of the SEB. Memorandum of Law (MOL) at 4.

Based on the record, we have no basis to sustain this protest ground. The contemporaneous evaluation record indicates that the SEB understood the nature of these certifications, as demonstrated through its in-depth explanation of the finding that these certifications warranted a strength. AR, Tab 10, SEB Evaluation Report at 37‑38. In this regard, the SEB explained that the CMMI Level 2 certification ensures that STC has “a proven approach to drive out real benefits through improved project predictability and consistency.” Id. at 37. The SEB continued by explaining that the Level 2 certification is earned by demonstrating that the results are repeatable and consistent. Id. The SEB also concluded that the ISO 9001 certification is earned “through an assessment by an external certification body” and ensures that the firm has a “robust quality management system.” Id. Collectively, the SEB concluded that these certifications assured the agency that STC’s processes were standardized and would lead to “fewer errors, less rework, and improved productivity.” Id. Although STC argues that its proposal deserved a significant strength for these certifications, STC’s disagreement with the relative weight of these certifications is insufficient to establish that the evaluation was unreasonable. See The Bionetics Corp., supra. Therefore, this protest ground is denied.

STC next argues that its proposal should have been assigned a significant strength for the multiple solution attributes it proposed under the management approach subfactor. Protest at 11. Specifically, STC contends that the agency failed to appropriately consider that, per STC’s proposal, seventy‑five percent of the civil servants within NASA’s Code TNA branch are former STC employees. Id. at 12. STC also argues that the agency overlooked its proposed project manager who had seventeen years of experience at NASA. Id. In sum, STC contends that its “employee‑focused culture, highly competitive benefits, and other corroborated retention features,” as well as its ability to “attract and retain top-notch talent” warranted a significant strength. Id. at 11-12. The agency responds that the SEB reasonably concluded that STC’s multiple solution attributes met, but did not exceed, the requirements. MOL at 5.

We have no basis to conclude that the agency’s evaluation was unreasonable. Under the management approach subfactor, the solicitation provided that offerors were to describe, among other things, their approach to (1) providing the staffing necessary to successfully perform the requirements in the PWS, (2) recruiting and retaining employees, and (3) providing internal employee training and development. AR, Tab 3b, RFP sections B-M at 113. STC’s contention that its proposal demonstrated its ability to “retain and recruit top‑notch talent” falls squarely within the solicitation’s requirement for offerors to describe their approach for recruiting and retaining employees. Similarly, the fact that STC proposed a project manager with experience working with NASA coincides with the solicitation’s requirement that offerors provide the staff necessary to successfully perform the requirements. Finally, as the agency points out, STC’s argument that seventy‑five percent of the employees within NASA’s Code TNA branch are former STC employees does nothing to demonstrate its ability to comply with the requirements of the management approach subfactor; instead, given the high percentage of former STC employees that have transitioned to NASA’s Code TNA branch, it may suggest an issue with STC’s ability to retain employees. See MOL at 5. In our view, the protester has failed to demonstrate that the agency’s evaluation was unreasonable.

In its final challenge to the evaluation of its proposal under the management approach subfactor, STC argues that the agency unreasonably assigned its proposal a weakness for its failure to demonstrate how it would meet the requirements of section 2.1.1 of the PWS.[7] Protest at 12. STC contends that the agency’s assignment of a weakness here, particularly as it pertains to its failure to adequately describe its records management system, is internally inconsistent with the strength STC’s proposal received under this subfactor for its CMMI Level 2 certification. Id. at 13. According to STC, the CMMI certification demonstrates that STC has a “robust records management system in place.” Id.

The agency argues that our Office should deny this protest ground for two reasons. See MOL at 6‑9. First, the agency contends that the SEB assigned STC’s proposal a weakness for its failure to address five areas within PWS section 2.1.1, but STC’s protest ground addresses only its failure to address records management. Id. at 7. Therefore, even if STC were correct that the evaluation was internally inconsistent here, the weakness would still remain. Id. Second, the agency argues that, in any event, the weakness was appropriately assigned and not internally inconsistent because the CMMI certification demonstrated only that STC had a process for creating quality products, which was a separate consideration from whether STC adequately detailed a process for records management. Id. at 9.

We find the protester’s argument to be unpersuasive. In assigning this weakness, the SEB concluded that STC’s proposal failed to address the following areas of section 2.1.1 of the PWS: safety and health, records management, handling and protection of government-controlled, contractor-generated data, information technology security management, and installation-accountable government property. AR, Tab 10, SEB Evaluation Report at 39‑41. The SEB assigned STC’s proposal a weakness here, not only for its failure to adequately describe its ability to perform the records management requirement, but for its failure to describe how it would coordinate personnel and entities to perform in each of the enumerated areas of the PWS. See id. Therefore, STC’s argument that its CMMI certification demonstrated its ability to perform the records management requirement, even if true, still fails to address the remaining areas that the SEB found to be deficient in STC’s proposal. Because STC’s protest does not address these remaining areas, we have no basis to object to the agency’s assignment of the weakness and deny this protest ground.[8]

Evaluation of STC’s Proposal under the Technical Approach Subfactor

STC first argues that under the technical approach subfactor, the agency unreasonably assigned its proposal only one strength for its approach to [DELETED] for STO3. Protest at 15. As relevant here, the SEB concluded that STC’s proposed approach to selecting [DELETED] methodologies would “reduce[] unnecessary [DELETED]” and “minimize [DELETED] requirements, while assuring accuracy of the [DELETED] methodology used to assess [DELETED].” AR, Tab 10, SEB Evaluation Report at 62. According to STC, assuring accuracy and reducing unnecessary expenses are two separate considerations, and its proposal should have received two strengths, or alternatively, one significant strength, for these features. Protest at 15. The agency responds that the SEB reasonably assigned only one strength here because “efficiency and accuracy are interdependent, not two separate things.” COS at 15.

We have no basis to object to the agency’s assignment of a single strength. Although STC argues that it should have received separate strengths for efficiency and accuracy, the agency points out that both are required for success. See id. In this regard, the agency states that “[DELETED] efficiency without proper accuracy is not useful, and accuracy without [DELETED] efficiency is similarly not useful.” Id. Although STC asserts that, within the realm of [DELETED], accuracy and efficiency are two separate points that are often in conflict with one another, Protest at 15, it provides no support for this argument. Accordingly, this protest ground is denied.

STC next argues that the agency improperly assigned its proposal three weaknesses under the technical approach subfactor. Id. at 16. Concerning the first weakness, the SEB concluded that STC’s proposed approach to database generation demonstrated a lack of understanding of the requirements for STO1. AR, Tab 10, SEB Evaluation Report at 63. Specifically, the SEB found that STC’s approach to develop [DELETED] databases through [DELETED] tools conflicted with another area of STC’s proposal where it stated that it would use a [DELETED] to anchor solutions.[9] Id. at 64. The SEB noted that this anchored approach would not result in a [DELETED] database because it did not incorporate solely [DELETED] methods. Id. at 63‑64; MOL at 13. In response, STC contends that an anchored database can be considered a [DELETED] database and that the SEB’s contrary position is nothing more than a “bald assertion.” Comments at 26. Furthermore, STC argues that it proposed to provide [DELETED] databases. Comments at 26; Protest at 17.

This protest issue essentially presents a disagreement between STC and the agency’s technical evaluators over the components of a [DELETED] database and whether a [DELETED] database may permissibly include [DELETED] methods. We defer to the agency’s evaluators on technical matters such as this and conclude that STC has not shown that the agency’s evaluation conclusions were unreasonable. See Weibel Equip., Inc., B-406888, B-406888.2, Sept. 21, 2012, 2012 CPD ¶ 279 at 13 (refusing to disturb the evaluators’ conclusion that the protester failed to adequately explain its development process for a 300 MHz oscillator). Although STC contends that the agency’s position represents a “bald assertion,” STC itself has provided no support for its contrary position. Accordingly, this protest ground is denied.

Concerning the second and third weaknesses under the technical approach subfactor, STC argues that its proposal received these two weaknesses “for the same perceived flaw.” According to STC, its proposal received two weaknesses for “a failure to include TPS test plans and associated data.” Protest at 18-19. The agency responds that STC “misconstrues” the two weaknesses assigned by the SEB in an attempt to redefine the weaknesses to fit its argument. MOL at 14.

We have no basis to find the agency improperly double-counted these two weaknesses. STC received one weakness under STO2 for its failure to provide an adequate discussion of its approach for two TPS test plan deliverables, namely the development of arc jet test plans for TPS and TPS instrumentation certification, as required by the solicitation.[10] AR, Tab 10, SEB Evaluation Report at 65; AR, Tab 3b, RFP sections B-M at 118. STC’s proposal received a separate weakness for its risk mitigation plan’s omission of an approach to perform feature testing to inform its computational models. AR, Tab 10, SEB Evaluation Report at 67.

Although STC contends that these weaknesses arose from the same perceived flaw, they were assigned due to STC’s failure to sufficiently address two separate requirements of the solicitation. The solicitation required offerors to develop and implement relevant arc jet test plans for TPS and for instrumentation certification. AR, Tab 3b, RFP Sections B‑M at 118. There was a separate requirement to provide an approach for risk mitigation. Id. The SEB concluded that STC’s proposal failed to address these two areas. AR, Tab 10, SEB Evaluation Report at 65‑67. Accordingly, we have no basis to object to the agency’s decision to assign STC’s proposal two weaknesses due to its failure to address two separate requirements.

Evaluation of AMA’s Proposal under the Technical Approach Subfactor

STC next argues that the agency unreasonably and disparately evaluated AMA under the technical approach subfactor. Protest at 20. First, STC argues that the agency unreasonably assigned AMA’s proposal a strength for the use of the [DELETED] tool in its approach to configuration management. Id. According to STC, AMA’s proposal should not have received a strength here because, in STC’s judgment, AMA “merely introduces another tool” that performs the same tasks as similar tools “generally used” by the agency on a day-to-day basis, such as Microsoft Teams, Slack, and GitHub. Id. Second, STC contends that, to the extent AMA’s proposal appropriately received a strength for its [DELETED] tool, the agency also should have assigned its proposal a strength for a configuration management tool called [DELETED]. Id. at 21. The agency argues that AMA’s proposal appropriately received a strength here and that STC is incorrect in its description of the [DELETED] tool. MOL at 21. The agency also argues that the [DELETED] tool proposed by STC is not a configuration tool but is a “CFD specific data generation and manipulation tool.” Id.

Based on the record, we have no basis to find the agency’s evaluation unreasonable here. With respect to STC’s first argument, the SEB explained that the [DELETED] tool centralizes all documentation used in performance of STO2 and allows authorized team members access to the current version of each document while archiving older versions of the document. AR, Tab 10, SEB Evaluation Report at 50‑51. Although STC argues that the agency already possesses tools that perform these functions, such as Microsoft Teams and Slack, it fails to address how the features mentioned in the SEB evaluation report are already covered by the tools listed by STC. Instead, STC simply raises a disagreement with the evaluation without providing any support for its position. Our decisions explain that such disagreements do not provide a basis to sustain a protest. See The Bionetics Corp., supra.

With respect to STC’s argument that its proposed [DELETED] tool is a functional equivalent of the [DELETED] tool and thus its proposal should also have received a strength, we have no basis to sustain this protest ground. Essentially, STC alleges the agency engaged in disparate treatment here. See Protest at 21. When a protester alleges disparate treatment in an evaluation, it must show the differences in evaluation did not stem from differences between the offerors’ proposals. Battelle Memorial Inst., B-418047.5, B-418047.6, Nov. 18, 2020, 2020 CPD ¶ 369 at 5-6. We conclude that STC has not met this standard. As the contracting officer explains, these two tools serve different functions. COS at 19. In this regard, the [DELETED] tool is “significantly more limited” than the [DELETED] tool in that it is limited to [DELETED]. Id. Although STC contends that the two tools are essentially the same in terms of function, the record does not support this argument. Because the protester has failed to show that the differences in the evaluation did not stem from differences in the proposals, this protest ground is denied.

STC next argues that the agency engaged in disparate treatment by assigning AMA’s proposal a significant strength and its own proposal merely a strength for comparable features. In this regard, the agency assigned AMA’s proposal a significant strength for its “highly effective approach to preliminary design of an entry probe . . . by proposing to include analyses early in the design process” for STO1. Protest at 21. In contrast, the agency assigned only a strength to STC’s proposal for five questions that “will establish limitations of the available [DELETED], help develop [DELETED], and minimize [DELETED] requirements, while assuring accuracy of the [DELETED] methodology used to assess [DELETED]” for STO3. Id. STC contends that both features should have been treated equally because both concern the “efficiency of . . . proposed solutions.” Id.

We have no basis to sustain this protest ground. As the agency points out, STC’s argument that these two features are the same stems from an oversimplification of the contents of the proposals. MOL at 25. The SEB found that AMA’s approach to preliminary design for STO1 warranted a significant strength because of its “incorporation of the TPS instrumentation,” as well as its approach to addressing “environmental factors that could affect the payload, early in the design process.” AR, Tab 10, SEB Evaluation Report at 48. STC’s proposal, on the other hand, received a strength for its proposed approach to [DELETED] under STO3. Id. at 55. STC essentially engages in a comparison of apples (i.e., AMA’s preliminary design approach for STO1) to oranges (i.e., STC’s approach to [DELETED] under STO3) and argues that because they both deal with efficiency, they should have been treated equally. See Protest at 21. We deny this protest ground because STC has failed to show that the differences in the evaluation did not stem from differences in the proposals. See Battelle Memorial Inst., supra.

In its final challenge to the evaluation of AMA’s proposal under the technical approach subfactor, STC alleges that the agency unreasonably assigned AMA’s proposal a strength for “its proposed approach to perform buffet predictions and performance deliverables in STO3,” which included the “identification of mitigation candidates . . . through iterative computational analysis cycles.” Protest at 22. In summarizing this strength, STC concludes that AMA received this strength for the “design iteration of a launch vehicle,” which was outside the scope of the requirements of the solicitation. Id. The agency responds that STC mischaracterizes the strength by asserting that AMA received a strength for the design iteration of a launch vehicle; instead, the agency contends that the strength was assigned for a “value-added approach analysis.” MOL at 29.

We have no basis to sustain this protest ground. The solicitation provided that “[f]or each configuration the contractor shall provide launch vehicle analysis throughout the transonic regime to determine the buffet onset and analyze the unsteady buffet loading.” AR, Tab 3b, RFP sections B‑M at 119‑120. In evaluating AMA’s proposal, the SEB explained that AMA’s approach to mitigate adverse design features enhanced the likelihood of successful performance because it, among other things, included the “identification of mitigation candidates . . . through iterative computational analysis cycles.” AR, Tab 10, SEB Evaluation Report at 55. As is evident in the SEB evaluation report, the agency assigned this strength not for the design of a launch vehicle, as argued by STC, but for AMA’s approach to identifying mitigation measures that would ultimately lead to changes in design earlier in the process. See id. Although the approach may result in design changes, the strength was assigned for the analysis to identify areas needing mitigation, not for any design itself. Thus, STC’s argument that this strength was unreasonably assigned for the design of a launch vehicle is unsupported by the record.

Challenge to the Evaluation of STC’s Past Performance

STC next contends that the agency unreasonably evaluated its proposal under the past performance factor. Protest at 23.

Our Office will examine an agency’s evaluation of an offeror’s past performance only to ensure that it was reasonable and consistent with the stated evaluation criteria and applicable statutes and regulations, because determining the relative merit of an offeror’s past performance is primarily a matter within the agency’s discretion. American Envtl. Servs., Inc., B-406952.2, B-406952.3, Oct. 11, 2012, 2013 CPD ¶ 90 at 5. The evaluation of past performance, by its very nature, is subjective, and we will not substitute our judgment for reasonably based evaluation ratings; an offeror’s disagreement with an agency’s evaluation judgments, by itself, does not demonstrate that those judgments are unreasonable. Id.

In challenging the evaluation of its proposal under the past performance factor, STC first argues that the agency improperly evaluated its past performance reference for the AEMMS contract. Protest at 23. STC argues that despite the AEMMS contract covering all ten areas of the ASSESS PWS, the agency unreasonably concluded that only two of the ASSESS PWS sections covered by the AEMMS contract warranted a rating of “very highly relevant.” Protest at 23‑24. The agency responds that STC’s argument is unsupported by the record and that while the AEMMS contract covered all of the areas of the ASSESS contract, the degree to which it covered these areas varied. MOL at 31.

Based on the record, we have no basis to sustain this protest ground. We note, as provided above, that the subject ASSESS contract is a consolidation of two previous contracts: AEMMS and ESTRAD. AR, Tab 6, ASSESS Background at 11. Essentially, STC alleges that because the AEMMS contract covered all the requirements of the ASSESS contract, the assignment of any adjectival rating other than “very high level of confidence” was unreasonable. See Protest at 24. As the agency explains, however, simply covering the same requirements did not automatically result in the assignment of a rating of “very high level of confidence”; the agency also considered the degree to which the reference covered the requirements. MOL at 31. In this regard, the past performance evaluation report notes that STC’s proposal failed to demonstrate experience in the areas of conceptual design, analysis, trade studies, and the development of systems analysis tools and frameworks as required by section 2.2.2.6 of the PWS. AR, Tab 8a, STC Past Performance Evaluation Report at 6. The failure to address these areas resulted in the agency concluding that its demonstrated experience with regard to these requirements was only “relevant.” Id. STC does not challenge this factual conclusion. See Comments at 32‑34.

The report further notes that other areas of the AEMMS contract, such as those covering machine learning and structural analysis, were only “relevant” to the current PWS. AR, Tab 8a, STC Past Performance Evaluation Report at 4‑7. Again, STC does not raise any challenge to these findings in the past performance evaluation report; instead, it addresses only some areas of the evaluation while ignoring others and making a broad assertion that the overall relevance rating of its AEMMS contract reference is “simply inconsistent” with its demonstrated performance. Comments at 33. Because STC has not addressed several of the factual conclusions of the evaluators, we have no basis to find the evaluation here unreasonable. Accordingly, this protest ground is denied.[11]

STC next alleges that the agency unreasonably evaluated its other two past performance references, namely its Engineering and Technical Support Services (ETSS) contract reference and its Mechanical and Composite Hardware Fabrication Support Services (MCHFSS) contract reference. Protest at 25‑26. Regarding the ETSS contract reference, STC argues that the work was similar in size, scope, and complexity to the ASSESS contract. Comments at 36. The past performance evaluation report notes, however, that STC’s proposal failed to address numerous PWS areas. AR, Tab 8a, STC Past Performance Evaluation Report at 10‑11. Specifically, the ETSS reference did not address the PWS areas of workforce training, travel management, information technology security management, records management, safety and health, quality management, and risk management, among others. Id. Due to its failure to address these areas, the agency assigned this past performance reference a relevance rating of “pertinent.” Id. at 11. Similar to the above challenge, STC does not challenge the factual conclusions of the underlying past performance evaluation report; instead, it concludes that, based on the evaluation report, it deserved a higher rating. See Comments at 37 (arguing that “[u]nder the evaluation demonstrated in the contemporaneous record . . . STC’s performance [rating] should have been evaluated higher.”). We find this argument to lack support in the record.

With respect to the MCHFSS contract reference, STC contends that the agency unreasonably concluded that it covered only eight out of eleven PWS sections. Protest at 28. Specifically, STC argues that this conclusion stems from the “unsupported determination” that the reference did not cover project management support, as required by the PWS. Id. STC supports this argument by quoting an excerpt from the MCHFSS contract reference wherein STC explained that, among other things, its project management office created an exceptional workforce that strived to meet the agency’s expectations. Id. According to STC, this excerpt demonstrates that STC management continued to find qualified, competent employees to support the MCHFSS contract. Id. at 29. The agency responds that STC’s proposal demonstrated contract management, but not project management, here. MOL at 38.

Based on the record, we have no basis to find the agency’s evaluation unreasonable. The evaluators noted the portion of STC’s past performance proposal where it referenced project management for the MCHFSS contract. AR, Tab 8a, STC Past Performance Evaluation Report at 21. In reviewing STC’s proposal, the agency concluded that this section addressed only contract management. Id. As explained by the agency, project management involves the actual planning, implementation, reporting, and completion of the project; contract management, on the other hand, involves workforce management, the hiring and training of employees, and ensuring compliance with regulations. MOL at 38. Although STC’s proposal demonstrated contract management, it did not demonstrate relevance to the project management elements. AR, Tab 8a, STC Past Performance Evaluation Report at 21. Accordingly, this protest ground is denied.

Challenge to the Evaluation of STC’s Cost/Price Proposal

STC next contends that the agency made multiple unreasonable probable cost adjustments to STC’s proposed cost. Protest at 30. The protester maintains that but for these unreasonable adjustments, STC’s total cost/price would have been approximately $[DELETED], or [DELETED] percent, lower than AMA’s cost/price. Id.

When an agency evaluates a proposal for the award of a cost-reimbursement contract (or a contract including cost-reimbursable CLINs), an offeror’s proposed costs are not dispositive because, regardless of the costs proposed, the government is bound to pay the contractor its actual and allowable costs. FAR 15.305(a)(1), 15.404-1(d); American Electronics, Inc., B-421021 et al., Dec. 5, 2022, 2022 CPD ¶ 307 at 10. Accordingly, an agency must perform a cost realism analysis to determine the extent to which an offeror’s proposed costs are realistic for the work to be performed. FAR 15.404-1(d)(1); American Electronics, Inc., supra. In fulfilling this obligation, the agency is not required to conduct an in-depth cost analysis or verify each and every item; rather, the evaluation requires the exercise of informed judgment by the contracting agency. FAR 15.404‑1(d)(1); McLaughlin Research Corp., B-421528 et al., June 16, 2023, 2023 CPD ¶ 146 at 7-8. While an agency’s cost realism analysis need not achieve scientific certainty, the methodology employed must be reasonably adequate and provide some measure of confidence that the rates proposed are reasonable and realistic in view of other cost information reasonably available to the agency at the time of its evaluation. Tantus Techs., Inc., B-411608, B‑411608.3, Sept. 14, 2015, 2015 CPD ¶ 299 at 10.

First, STC argues that the agency unreasonably adjusted the amounts for the “taxes, state income and franchise” cost element and the sector allocation cost included in the pool used to generate its division three overhead rates. Protest at 32-33. With respect to the “taxes, state income and franchise” cost, the agency explains that STC’s proposal provided the historical cost for this element but failed to forecast an amount for it for any year of performance. MOL at 41. Because STC’s proposal did not explain why no costs were forecast for this element, the agency calculated the future amounts by taking the average of the historical costs incurred for this cost element. Id. STC explains that its proposal was silent on these rates because [DELETED]. Protest at 34.
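
A minimal sketch of that kind of fill-in adjustment follows; the mechanics and dollar figures are assumed for illustration and are not taken from the agency's cost model or the record.

```python
# Assumed illustration: where a proposal reports historical costs for an
# element but forecasts nothing for the performance years, project each
# future year at the average of the incurred historical amounts.
historical_costs = [120_000, 135_000, 150_000]           # hypothetical actuals
average = sum(historical_costs) / len(historical_costs)  # 135,000
probable_cost_forecast = {year: average for year in (2024, 2025, 2026)}
```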

We have no basis to object to this adjustment. We note that it is an offeror’s responsibility to submit a well‑written proposal that allows a meaningful review by the procuring agency, and where an offeror fails to do so, it runs the risk that a procuring agency will evaluate its proposal unfavorably. Lovelace Scientific and Tech. Servs., B‑412345, Jan. 19, 2016, 2016 CPD ¶ 23 at 10. Here, STC’s proposal left the cells where the [DELETED] for this element were to be entered [DELETED]. Because STC’s proposal did not make clear that this [DELETED], we have no grounds to object to the agency’s decision to adjust the costs here.

With regard to the sector allocation cost, the agency noted that STC’s proposed costs were much lower than its historical costs (e.g., $[DELETED] in 2019, $[DELETED] in 2020, and $[DELETED] in 2021 compared with $[DELETED] for 2024, $[DELETED] for 2025, and $[DELETED] for 2026). MOL at 41. STC argues that it utilized costs directly in line with those approved by the Defense Contract Audit Agency (DCAA) and readily accessible to the agency. Protest at 35. The agency explains, however, that it had no obligation to review DCAA documents as STC did not provide these documents with its proposal; instead, it argues that it was STC’s obligation to explain in its proposal the basis for the amounts it forecast, which it failed to do. MOL at 42‑43. We agree with the agency. Although STC may have had some reasonable basis for the amounts it forecast, it was incumbent upon STC to clearly explain the basis for these amounts in its proposal, which it failed to do here. See Lovelace Scientific and Tech. Servs., supra.

STC also challenges the agency’s adjustment of its proposed costs for general and administrative (G&A) expenses. Protest at 43. Specifically, STC argues that the agency improperly adjusted its costs for the following areas: payroll service center cost, postage and freight, and travel. Id. With respect to the adjustment of the payroll service center cost, and postage and freight cost, STC argues that the agency improperly doubled the total pool cost. Id. at 44. We disagree. As the agency points out, there was no doubling of the pool cost; instead, the probable cost adjustment for these two areas resulted in the pool increasing from $[DELETED] to $[DELETED] for 2024. MOL at 47. STC’s contention to the contrary is factually inaccurate.

With respect to the adjusted travel costs, STC argues that 2023 tier one provisional travel rates showed very little travel cost. Protest at 47. Again, however, STC did not provide this information with its proposal. Because it is the obligation of the offeror to provide a well-written proposal, we have no basis to object to the agency’s adjustments here.

Discussions and Best-Value Tradeoff

Finally, STC argues that the agency failed to conduct meaningful discussions and that the best-value tradeoff decision was unreasonable. Protest at 47, 50. We deny the former protest ground because the solicitation provided that the agency intended to make an award without conducting discussions. AR, Tab 3b, RFP sections B‑M at 135. The agency had no obligation to conduct discussions with STC as it did not conduct discussions with any other offeror. We also deny STC’s argument that the best-value tradeoff decision was unreasonable as it was premised on the arguments addressed above which we have concluded do not provide any basis to sustain the protest.

The protest is denied.

Edda Emmanuelli Perez
General Counsel

 

[1] All citations reference the Adobe PDF page number.

[2] The procurement here, referred to by the acronym ASSESS, consolidated requirements previously provided under two contracts: the Aeronautics and Exploration Mission Modeling and Simulation (AEMMS) contract, which was awarded to STC, and the Entry Systems Research and Technology Development (ESTRAD) contract, which was awarded to AMA. AR, Tab 6, ASSESS Background at 11.

[3] A rating of excellent corresponded to a point score in the 91-100 percentile range and indicated a comprehensive and thorough proposal of exceptional merit with one or more significant strengths and no deficiencies or significant weaknesses. AR, Tab 3b, RFP Sections B‑M at 138. A rating of very good corresponded to a point score in the 71-90 percentile range and indicated the proposal contained no deficiencies, demonstrated overall competence, contained one or more significant strengths, and contained strengths that outbalanced any weaknesses. Id. A rating of good corresponded to a point score in the 51-70 percentile range and indicated that the proposal contained no deficiencies, provided a reasonably sound response, and might contain strengths and weaknesses, but the weaknesses did not significantly detract from the offeror’s response. Id. A rating of fair corresponded to a point score in the 31-50 percentile range and indicated the proposal had no deficiencies, had one or more weaknesses, and the weaknesses outbalanced any strengths. Id. A rating of poor corresponded to a point score in the 0-30 percentile range and indicated that the proposal contained one or more deficiencies or significant weaknesses that demonstrated a lack of overall competence or would require a major proposal revision to correct. Id.

[4] As relevant here, a rating of very high level of confidence indicated that the offeror’s relevant past performance was of exceptional merit, was very highly pertinent to the acquisition, indicated exemplary performance, and did not possess any problems that would have an adverse effect on overall performance. AR, Tab 3b, RFP Sections B‑M at 143. A rating of high level of confidence indicated that the offeror’s relevant past performance was highly pertinent to the acquisition, that it was fully responsive to contract requirements, that the contract requirements were accomplished in a timely, efficient, and economical manner, and that any problems were minor with little identifiable effect on overall performance. Id.

[5] The solicitation defined a significant strength as an aspect of the proposal that greatly enhances the potential for successful contract performance. AR, Tab 3b, RFP Sections B‑M at 139. It defined a strength as an aspect of the proposal that enhances the potential for successful contract performance. Id. at 139‑140. It defined a weakness as a flaw in the proposal that increases the risk of unsuccessful contract performance. Id.

[6] Although we do not specifically address each argument raised by the protester, we have considered them and find none to be meritorious.

[7] Section 2.1.1 of the PWS defined the core contract management requirements for the solicitation. AR, Tab 3c, PWS at 4‑10. These requirements included, among other things, workforce training, travel management, employee credential management, quality management, and resource management. Id.

[8] Although STC’s comments on the agency report allege that STC’s protest challenges the entire weakness as it relates to all five areas referenced by the SEB, Comments at 23, we do not find this argument to be supported by the record. STC’s protest addresses only the records management finding and is otherwise silent regarding the other four areas. Protest at 12‑13. Furthermore, the comments do not point to any specific allegation made regarding these other four areas; they simply assert that the areas were challenged. Accordingly, we address only STC’s argument as it pertains to records management.

[9] According to the SEB, an anchored database is generated through a combination of data generated by both lower and higher order tools where the lower order data is anchored by the higher order data at selected points. AR, Tab 10, SEB Evaluation Report at 64.

[10] As background for this weakness, the contracting officer explained that a TPS (i.e., Thermal Protection System) exists to protect spacecraft from extreme temperatures during entry into Earth’s atmosphere or the atmospheres of other planets at high rates of speed. COS at 16. The arc jet test facility is a device in which gases are super-heated to extreme temperatures, flow at supersonic/hypersonic speeds, and pass through a nozzle aimed at a test sample in a vacuum. Id. at 16‑17.

[11] STC also asserts that the agency engaged in disparate treatment because it assigned a relevancy rating of “very highly pertinent” to the ESTRAD contract that AMA submitted as a reference. Protest at 25. STC does not raise any specific allegations concerning the evaluation of the ESTRAD contract reference but asserts that because both contracts formed the subject ASSESS contract, they should have received the same adjectival rating. See id. As explained above, protesters alleging disparate treatment must demonstrate that the differences in the evaluation did not stem from differences in the proposals. Battelle Memorial Inst., supra. STC has not met this burden here, and this ground of protest is denied.
