Parsons Government Services Inc.
Highlights
Parsons Government Services Inc., of Centreville, Virginia, protests the issuance of a task order to ManTech Advanced Systems International, Inc., of Herndon, Virginia, under task order request (TOR) No. 47QFCA24R0006, issued by the General Services Administration (GSA) for intelligence and cyber operations solutions and services for multiple projects in support of the Department of Defense's (DOD) National Security Innovation Network (NSIN) and NSIN's mission partner network. The protester contends that the agency improperly evaluated proposals, performed an unreasonable best-value tradeoff, and failed to properly consider whether ManTech had an unfair competitive advantage due to the hiring of a former government official.
DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Decision
Matter of: Parsons Government Services Inc.
File: B-422849
Date: November 21, 2024
Jeffery M. Chiow, Esq., Melissa P. Prusock, Esq., Eleanor M. Ross, Esq., and Cassidy Kim, Esq., Greenberg Traurig, LLP, for the protester.
Paul F. Khoury, Esq., George E. Petel, Esq., Lisa M. Rechden, Esq., and Vaibhavi Patria, Esq., Wiley Rein LLP, for ManTech Advanced Systems International, Inc., the intervenor.
Kelli Cochran-Seabrook, Esq., and Nathan C. Bangsil, Esq., U.S. General Services Administration, for the agency.
Nathaniel S. Canfield, Esq., and Evan D. Wesser, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.
DIGEST
1. Protest challenging evaluation of proposals is denied where the record demonstrates that the evaluation was reasonable, consistent with the solicitation, and did not result from unequal treatment of proposals.
2. Protest alleging that awardee gained an unfair competitive advantage based on employment of a former government official is denied where the record shows that the individual did not have access to non‑public, competitively useful information and was not involved in the drafting of the solicitation or the awardee’s proposal.
DECISION
Parsons Government Services Inc., of Centreville, Virginia, protests the issuance of a task order to ManTech Advanced Systems International, Inc., of Herndon, Virginia, under task order request (TOR) No. 47QFCA24R0006, issued by the General Services Administration (GSA) for intelligence and cyber operations solutions and services for multiple projects in support of the Department of Defense’s (DOD) National Security Innovation Network (NSIN) and NSIN’s mission partner network. The protester contends that the agency improperly evaluated proposals, performed an unreasonable best‑value tradeoff, and failed to properly consider whether ManTech had an unfair competitive advantage due to the hiring of a former government official.
We deny the protest.
BACKGROUND
NSIN is a program office within DOD’s Defense Innovation Unit that aims to build communities of innovators to generate new solutions to national security problems. Contracting Officer’s Statement (COS) at 1. The procurement at issue, referred to as the Interagency Intelligence and Cyber Operations Network (ICON) procurement, is sponsored by NSIN and will provide shared and integrated cyber resources, information, and capabilities services for intelligence and cyber operations mission partners. Id. The ICON task order to be issued will support the United States Cyber Command, Cyber Mission Forces, Geographic and Functional Combatant Commands, Joint Task Forces, Service Cyberspace Components (i.e., Marine Corps Forces Cyberspace Command and the United States Fleet Cyber Command), Military Services, DOD component services and agencies, and interagency intelligence partners. Id. at 1‑2.
Pursuant to Federal Acquisition Regulation (FAR) subpart 16.5, the agency issued the TOR via the GSA eBuy website on February 27, 2024, to vendors holding GSA Alliant 2 governmentwide acquisition contracts (GWAC). Id. at 2; Agency Report (AR), Tab 3.a, Amended TOR at 1, 97, 113.[1] The TOR, which the agency amended once, contemplated issuance of a single task order containing cost‑plus‑award‑fee and cost‑reimbursement line items, with a 1‑year base period, four 1‑year options, and a 6‑month option to extend. COS at 4; Amended TOR at 3‑7, 33, 76, 113. The agency would make its source selection decision on a best‑value tradeoff basis, considering four technical evaluation factors--technical and management approach; ICON project scenario; key personnel and project staffing; and corporate experience--and cost/price. Amended TOR at 113, 115. The TOR listed the four technical evaluation factors in descending order of importance and stated that, when combined, they were significantly more important than cost/price. Id. at 115. Only the first three technical evaluation factors are relevant to the allegations raised here.
With respect to the technical and management approach factor, the TOR instructed offerors to submit a draft transition‑in plan; a quality management plan; and oral technical proposal presentation (OTPP) slides. Id. at 106‑108, 111, 115. The OTPP slides, which offerors would use as the basis for an in‑person oral presentation, were to describe various aspects of the offeror’s approach to meeting task order requirements, as detailed below. Id. at 108‑109, 111. The TOR provided that the agency would evaluate proposals under the technical and management approach factor based on the clarity, completeness, effectiveness, efficiency, and feasibility of meeting the TOR’s requirements, examining the following aspects:
a. The degree of effectiveness and relevance of the offeror’s overall approach to fulfilling the objectives and tasks identified in the performance work statement (PWS), including a detailed description of how the offeror will apply its investments, partnerships, and capabilities to meet the overall objectives and task areas.
b. The effectiveness and comprehensiveness of the offeror’s overall approach to defensive cyberspace operations (DCO), including development and delivery of big data platforms, cyber analytics, and user activity monitoring.
c. The effectiveness and comprehensiveness of the offeror’s overall approach to integrated intelligence and offensive cyberspace operations (OCO), including development and delivery of integrated all-source, open source intelligence and signal intelligence; exploit development; vulnerability research and reverse engineering; and capability development for cyber weapons.
d. The degree of clarity, effectiveness, and feasibility of the offeror’s transition support, including its draft transition‑in plan.
e. The effectiveness and comprehensiveness of the offeror’s methodology, processes, and procedures for establishing and maintaining high quality in the performance of all tasks under the task order and technical direction letters issued thereunder, to include the quality management plan.
f. The effectiveness and comprehensiveness of the offeror’s approach for increasing opportunities for service‑disabled veteran‑owned small business (SDVOSB) firms and complying with the increase in SDVOSB goals to five percent.
Id. at 116.
Under the ICON project scenario factor, offerors were to submit their approaches to two technical scenarios as part of their OTPP slides and present them to the agency. Id. at 108‑109, 111‑112. Both scenarios were based upon the same fictitious operational background, in which a terrorist organization had taken several individuals hostage, and North Atlantic Treaty Organization (NATO) allies were going to use unconventional forces to rescue them. AR, Tab 11, TOR attach. V, ICON Project Scenario at 1. In the first scenario, the offeror was to undertake an OCO “kill chain” to identify the specific location of the terrorist organization’s system to enable the NATO forces to use the location to extract hostages. Id. In the second scenario, the offeror was to undertake a DCO “hunt and clear” from the perspective of the terrorist organization in order to eradicate the adversary from the organization’s network. Id. at 1‑2. The TOR provided that the agency would evaluate proposals under this factor based upon the following aspects:
a. The degree to which the offeror provided a detailed approach to support an OCO kill chain and the execution of a DCO hunt and clear operation that comprehensively and effectively described its solution to meeting scenario requirements.
b. The degree of feasibility and effectiveness of the offeror’s approach to identifying and resourcing specialized skill sets and subject matter expertise to address the work activities required in support of the project.
c. The degree of comprehensiveness and effectiveness of the offeror’s approach to tracking and managing cost, performance, and schedule.
Amended TOR at 116.
For the key personnel and project staffing factor, the TOR instructed offerors to submit a project staffing plan, a project staffing rationale, and a key personnel qualification matrix. Id. at 105‑106, 117. The TOR required offerors to identify key personnel in the project staffing plan, as well as to provide the names of any non‑key personnel that were known prior to proposal submission. Id. at 105. In the project staffing rationale, offerors were to explain how and why the staffing proposed in the project staffing plan would accomplish the offeror’s overall solution. Id. at 106. The key personnel qualification matrix was to relate the qualifications of key personnel to the key personnel qualification requirements of the TOR. Id. at 44‑45, 106. The TOR stated that the agency would evaluate proposals under the key personnel and project staffing factor based upon the following aspects:
a. The degree of relevance, comprehensiveness, effectiveness, and currency of the stated qualifications, experience, skills, and roles of each of the named key personnel to meet the requirements of the TOR and support the offeror’s technical and management approaches.
b. The degree of relevance, comprehensiveness, and effectiveness of the proposed qualifications, level of effort, and roles of the non‑key personnel to meet the requirements of the TOR and support the offeror’s technical and management approaches.
c. A relevant, comprehensive, efficient, and feasible methodology for hiring, retaining, and replacing appropriately qualified personnel throughout the life of the task order, including how the offeror’s strategy and approach will specifically address the shortage of cyber talent, competitive hiring landscape, and talent pipeline.
Id. at 117.
The TOR did not disclose definitions or adjectival ratings that the agency would use in its evaluation of the three technical evaluation factors discussed above. Relevant to the allegations raised by the protester, the agency’s technical evaluation plan (TEP) stated that a strength was “a significant, outstanding, or exceptional aspect of an offeror’s proposal that can be beneficial to the program or increases the probability of successful [t]ask [o]rder performance.” AR, Tab 5, TEP at 5. A weakness was defined as “a flaw in the proposal that increases the risk of unsuccessful performance.” Id.
The TEP further stated that the agency would assign ratings of excellent, good, acceptable, or not acceptable under each of the three technical evaluation factors discussed above, as well as to each proposal overall. As relevant to the allegations raised by the protester, the TEP defined the excellent, good, and acceptable adjectival ratings as follows:
Excellent: A high‑quality proposal that meets all requirements, may exceed some or many requirements, and shows a thorough understanding of the requirements. The risk of unsuccessful performance is very low. The proposal substantially meets the following criteria:
a. There are numerous strengths.
b. There are no deficiencies.
c. There are no significant weaknesses.
d. There are few, if any, weaknesses, and they are significantly outweighed by the strengths.
e. Most, if not all, risks resulting from the weaknesses are adequately mitigated by something in the offeror’s proposal.
f. Slight imperfections may exist, but they cause minimal or no impact.
Good: A quality proposal that meets all requirements, exceeds some requirements, and shows a sound understanding of the requirements. The risk of unsuccessful performance is low to moderate. The proposal substantially meets the following criteria:
a. There are strengths.
b. There are no deficiencies.
c. There may be significant weaknesses, but the presence of one or more significant weaknesses does not automatically disqualify the proposal from the good rating; the overall impact of these drawbacks on the effectiveness of the proposal should be limited.
d. The weaknesses identified are outweighed by the strengths.
e. Any risks resulting from the identified significant weaknesses must be mitigated by something in the offeror’s proposal.
f. Any risks resulting from the identified weaknesses (not significant) may or may not be mitigated by something in the offeror’s proposal.
g. A few slight imperfections may exist or the proposed approach may be less than 100 percent effective at accomplishing the government’s requirements, but their impact is minimal.
Acceptable: An adequate proposal that meets all requirements and shows some understanding of the requirements. The risk of unsuccessful performance is moderate. The proposal substantially meets the following criteria:
a. There may be strengths.
b. There are no deficiencies.
c. There may be significant weaknesses, but the presence of one or more significant weaknesses does not automatically disqualify the proposal from an acceptable rating; the impact of significant weaknesses may considerably lessen the effectiveness of the proposal.
d. There may be weaknesses that outweigh the strengths.
e. Any risks resulting from the identified weaknesses (significant or otherwise) may or may not be mitigated by something in the offeror’s proposal.
f. Some aspects of the response are less than detailed, less than comprehensive, or somewhat generic, but the overall impact of these drawbacks should not result in a proposal that is unlikely to succeed.
Id. at 5‑6.
The agency received three timely proposals in response to the TOR, including from the protester and ManTech. COS at 4. After review of the written submissions and in‑person oral presentations, the agency evaluated the protester’s and ManTech’s proposals as follows:
| Factor | Parsons | ManTech |
|---|---|---|
| TECHNICAL AND MANAGEMENT APPROACH | Good | Excellent |
| ICON PROJECT SCENARIO | Acceptable | Good |
| KEY PERSONNEL AND PROJECT STAFFING | Acceptable | Good |
| CORPORATE EXPERIENCE | Relevant | Relevant |
| OVERALL TECHNICAL RATING | Good | Excellent |
| TOTAL PROPOSED PRICE | $1,348,360,050 | $1,384,067,029 |
AR, Tab 7, Award Decision Document at 321; COS at 5.
The agency concluded that ManTech’s proposal was technically superior to the protester’s, further noting the associated price premium of $35,706,979, or approximately 2.65 percent, and finding that the cost savings of the protester’s proposal would not be beneficial to the government when weighed against the technical advantages of ManTech’s proposal. AR, Tab 7, Award Decision Document at 325. The agency therefore concluded that ManTech’s proposal offered the best value to the government, and selected ManTech for receipt of the task order. Id. at 325, 328. This protest followed.[2]
DISCUSSION
The protester raises numerous challenges to the agency’s evaluation of proposals under the technical and management approach, ICON project scenario, and key personnel and project staffing factors, contending variously that the agency’s evaluation was unreasonable, inconsistent with the TOR, or the result of unequal treatment. Additionally, the protester contends that ManTech gained an unfair competitive advantage through its hiring of the former Navy principal cyber adviser (PCA) and should have been disqualified from the competition. We have reviewed the record and conclude that there is no basis on which to sustain the protest. We discuss several representative examples of the protester’s arguments below.[3]
Technical and Management Approach
In evaluating proposals under the technical and management approach factor, which was the most important evaluation factor, the agency assigned seven strengths and no weaknesses to ManTech’s proposal, assigning it an adjectival rating of excellent. AR, Tab 6, Technical Evaluation Board (TEB) Consensus Report at 24‑28; Tab 7, Award Decision Document at 322‑323. In contrast, the agency assigned four strengths and three weaknesses to the protester’s proposal under that factor, assigning it an adjectival rating of good. AR, Tab 6, TEB Consensus Report at 38‑42; Tab 7, Award Decision Document at 322‑323. On the basis of that evaluation, the agency concluded that ManTech’s proposal was superior to the protester’s under the technical and management approach factor. AR, Tab 7, Award Decision Document at 322‑323, 325.
The protester contends that the agency’s evaluation of proposals under this factor was unreasonable in several respects. For example, the agency assigned a weakness to the protester’s proposal for misaligned OCO support, noting that the protester’s proposal focused on providing OCO support to the Cyber National Mission Force (CNMF), which the agency stated has a very limited OCO role. AR, Tab 6, TEB Consensus Report at 42; Tab 7, Award Decision Document at 323. Because of the proposal’s focus on providing OCO support to CNMF, the agency concluded that the protester’s approach was unclear as to how OCO support would be integrated into the protester’s performance of the ICON task order, and the agency assigned a weakness on that basis. Id.
The protester contends that the assignment of this weakness was unreasonable because “CNMF is currently the largest OCO [m]ission [p]artner under the incumbent . . . effort[,]” and further that the cited OTPP slide only “reference[d] CNMF as an example of how Parsons’ [OCO] approach has been successfully applied to its work for CNMF.” Protest at 23. The protester points to other OTPP slides that it contends demonstrated its approach to OCO support. Id. at 23‑24 (citing AR, Tab 4.b.3, Parsons OTPP Slides at 8, 9, 44, 46). The agency responds that it reasonably assigned this weakness, again citing the proposal’s focus on OCO support to CNMF, leading to concerns about how the protester would integrate OCO support into its performance of the task order. Memorandum of Law (MOL) at 16‑17.
As noted above, the task order competition here was conducted pursuant to FAR subpart 16.5. The evaluation of proposals in a task order competition is primarily a matter within the contracting agency’s discretion because the agency is responsible for defining its needs and the best method of accommodating them. CACI, Inc.-Fed., B‑420441.3, Nov. 5, 2022, 2022 CPD ¶ 278 at 6. When reviewing protests of an award in a task order competition, we do not reevaluate proposals, but examine the record to determine whether the evaluation and source selection decision are reasonable and consistent with the solicitation’s evaluation criteria and applicable procurement laws and regulations. DynCorp Int’l LLC, B‑411465, B‑411465.2, Aug. 4, 2015, 2015 CPD ¶ 228 at 7. A protester’s disagreement with the agency’s judgment, by itself, is not sufficient to establish that an agency acted unreasonably. CACI, supra at 6.
While the protester and agency disagree with respect to CNMF’s OCO role, we need not resolve the disagreement to conclude that the agency reasonably assigned the weakness. The PWS requires the selected contractor to “perform OCO support activities that enable NSIN and its Mission Partners to improve its OCO and overall security posture, policies, and procedures.” Amended TOR at 27. The PWS also defines the term “Mission Partners” as “the Department of Defense . . . , [i]ntelligence [c]ommunity, and other [g]overnment organizations[.]” Id. at 11. Thus, the TOR indicated that the selected contractor would be responsible for providing OCO support to the multiple stakeholders obtaining services under the ICON task order, and furthermore that the agency would evaluate the effectiveness and comprehensiveness of offerors’ approaches to providing that support. See id. at 116.
As discussed above, the protester contends that the OTPP slide specifically cited by the agency in assigning the weakness references support to CNMF only as an example of its approach to OCO support. The record, however, does not support the protester’s contention, as that slide discusses the provision of OCO support exclusively in terms of supporting CNMF. With respect to policy and strategy, the slide states that the protester will “[DELETED].” AR, Tab 4.b.3, Parsons OTPP Slides at 51. Similarly, with respect to processes, platforms, and systems, the slide states that the protester will “[DELETED][.]” Id. The slide further touts the protester’s “[DELETED]” as enabling the protester to efficiently implement its OCO approach. Id. Thus, the protester’s proposal indicates that the protester’s approach to OCO support is to provide that support to CNMF, as it neither discusses providing OCO support to any other ICON stakeholders nor suggests that the protester is providing its approach to CNMF OCO support only as a demonstration of its general OCO approach. On this record, the agency reasonably understood the protester’s OCO support approach to focus on CNMF, calling into question the protester’s understanding of the requirement to support OCO efforts for the broader ICON task order community.
The other slides cited by the protester as demonstrating the comprehensiveness and effectiveness of its OCO approach also do not support the protester’s contention. Slides 8 and 9 are part of the proposal’s discussion of the protester’s general approach to the task order requirements and discuss OCO only in passing terms. See id. at 8‑9. While slides 44 and 46 are specifically addressed to the OCO task, they discuss the protester’s approach to OCO mission execution, not the entities to which the protester proposes to provide OCO support or how the protester will apply that approach to support the various entities obtaining services under the ICON task order. See id. at 44, 46. Thus, they fail to demonstrate how the protester would integrate its OCO support approach within the ICON task order community, and do not counter the agency’s concern that the protester’s OCO support approach was focused on providing support only to CNMF. We therefore conclude that the agency reasonably assigned this weakness to the protester’s proposal.
As another example, the protester contends that the agency improperly evaluated proposals with respect to their risk management framework (RMF) approach, arguing both that the agency unreasonably assigned a weakness to the protester’s proposal and that the agency’s assignment of a strength to ManTech’s proposal was the result of unequal treatment. Protest at 24‑28. The record does not support either contention.
The agency assigned a weakness to the protester’s proposal because it did not provide in‑depth detail regarding appropriate RMF security practices, noting that the proposal discussed [DELETED], but lacked detail on the consideration of RMF during planning and throughout delivery. AR, Tab 6, TEB Consensus Report at 42; Tab 7, Award Decision Document at 322. The agency found that the protester’s RMF approach was underdeveloped, inadequately addressed throughout the proposal, and failed to detail how the protester would sustain RMF throughout the lifecycle of a system or component of a system. Id. The protester contends that this weakness was not warranted, arguing that slide 10 of its OTPP--which the agency cited in assigning the weakness, see AR, Tab 6, TEB Consensus Report at 42--introduced the protester’s RMF approach, while slides 30 and 35 provided further detail on the protester’s continuous RMF approach. Protest at 26‑27.
On our review of the record, the protester has not demonstrated that the agency’s evaluation was unreasonable. Slide 10 of the protester’s OTPP, as the agency noted, discusses the protester’s [DELETED], and states only that the approach “[DELETED][,]” with no further elaboration as to how the protester’s approach considers RMF during the planning and delivery phases. AR, Tab 4.b.3, Parsons OTPP Slides at 10. Slide 30 states that the protester’s “approach [DELETED][,]” but similarly provides little detail as to how the protester effectuates that approach. See id. at 30. Slide 35, also cited by the protester, references the use of the protester’s [DELETED] tool for security compliance, but does not place that approach within the context of the protester’s RMF approach. See id. at 35. In short, while the protester disagrees with the agency’s judgment that its proposal lacked detail regarding the protester’s RMF approach, particularly with respect to planning and delivery, the protester has not demonstrated that the agency’s judgment in this regard was unreasonable.
Relatedly, the protester alleges that the agency’s assignment of a strength to ManTech’s proposal for its RMF approach was the result of the agency’s unequal treatment of proposals. Protest at 27‑28. In contrast to the evaluation of the protester’s proposal, the agency concluded that ManTech’s proposal included a robust RMF approach to cyber weapons capability development, noting in part ManTech’s proposal to [DELETED], which the agency concluded could [DELETED]. AR, Tab 6, TEB Consensus Report at 27 (citing AR, Tab 16.b.2, ManTech OTPP Slides at 11, 36, 39, 41, 42, 50, 54, 55, 61); Tab 7, Award Decision Document at 322. The protester contends that its proposal contained aspects that were substantively indistinguishable from those that led the agency to assign a strength to ManTech’s proposal, and that the evaluation therefore was the result of unequal treatment. See Comments at 16‑17.
It is a fundamental principle of federal procurement law that agencies must treat all offerors equally and evaluate their proposals evenhandedly against the solicitation’s requirements and evaluation criteria. NTT Data Servs. Fed. Gov’t, LLC, B‑421708.3, B‑421708.4, Nov. 27, 2023, 2023 CPD ¶ 273 at 10. Where a protester alleges unequal treatment in a technical evaluation, it must show that the differences in ratings did not stem from differences between the proposals since agencies properly may assign dissimilar proposals different evaluation ratings. Id.
The record reflects that the agency assigned a strength to ManTech’s proposal for its robust RMF approach, which the agency found to be [DELETED]. In contrast, as detailed above, the agency found that the protester’s RMF approach lacked detail, especially with respect to the planning phase and throughout delivery, and therefore was unclear as to how the protester proposed to sustain RMF throughout system and component lifecycles. As discussed above, we conclude that the agency’s evaluation of the protester’s proposal was not unreasonable in that regard. Thus, where the agency’s respective strength and weakness assignments were based upon the comprehensiveness of each offeror’s RMF approach as reflected throughout their proposals, the protester has not demonstrated that the differences in ratings did not stem from differences in the proposals.
Moreover, while the protester proffers a side‑by‑side comparison of the ManTech OTPP slides cited by the agency in assigning a strength with slides from the protester’s OTPP that the protester contends proposed substantively indistinguishable aspects, see Comments at 16‑17, this particularized comparison does not demonstrate that the protester proposed a similarly robust RMF approach. For example, the protester contends that slide 30 of its OTPP contains similar detail to slide 36 of ManTech’s OTPP. See id. at 16. On our review of the record, these slides are substantively distinguishable. ManTech’s proposal specifically [DELETED]. See AR, Tab 16.b.2, ManTech OTPP Slides at 36.
In contrast, the protester’s proposal references “RMF [s]upport” with respect to [DELETED] but does not clearly identify how RMF support is applied within that approach in a manner similar to ManTech’s proposal. See AR, Tab 4.b.3, Parsons OTPP Slides at 30. The protester similarly contends that slide 20 of its OTPP has content that is substantively indistinguishable from that found at slide 42 of ManTech’s OTPP. See Comments at 17. Slide 20 of the protester’s OTPP, which is addressed to the protester’s approach to overall task order management, states that the protester will “[DELETED][.]” AR, Tab 4.b.3, Parsons OTPP Slides at 20. This list of general [DELETED] to be applied broadly is not comparable to the information found at slide 42 of ManTech’s OTPP, which discusses the use of a specific risk management method as applied to the integrated intelligence task of the PWS. See AR, Tab 16.b.2, ManTech OTPP Slides at 42. Based on the record, the protester has not demonstrated that its proposal was substantively indistinguishable from ManTech’s with respect to the detail and comprehensiveness of their RMF approaches. Accordingly, we deny this allegation.
As a final example, the agency assigned a strength to ManTech’s proposal centering on ManTech’s DCO approach, which the agency noted “included development and the delivery of big data platforms, cyber analytics, and user activity monitoring.” AR, Tab 6, TEB Consensus Report at 23; Tab 7, Award Decision Document at 322. The agency found that “[DELETED].” AR, Tab 6, TEB Consensus Report at 28. With respect to big data platforms, the agency noted that “ManTech’s approach would [DELETED].” Id.
The protester contends that the contents of ManTech’s proposal do not support the assignment of this strength, arguing that neither the OTPP slides cited by the agency--slides 34 and 35--nor any other portion of ManTech’s proposal demonstrate a basis for assigning the strength. Protest at 35‑36; Comments at 26‑28. In response to our request for supplemental briefing, the agency acknowledges that the references to slides 34 and 35 in its evaluation documentation were erroneous. See Supp. MOL at 4. In a sworn declaration, the TEB chair states that the agency assigned the strength on the basis of “the cohesive and complete manner [in which] ManTech outlined its approach to big data platforms, cyber analytics, and user activity monitoring,” and points to slides 5, 36, 46, 49‑52, 54, 58‑59, and 62 as providing the proposal information supporting the strength.[4] See Decl. of TEB Chair at 1‑2. On our review of the record, the agency had a reasonable basis on which to assign this strength.
Most salient to the agency’s assessment of ManTech’s proposed approach to cyber analytics, slide 51 of ManTech’s OTPP details a [DELETED]. AR, Tab 16.b.2, ManTech OTPP Slides at 51. The slide further lists members of the cross‑functional team that would execute this process, the tools to be used, and documents and methodologies to be employed. Id. Additionally, it describes an example of how ManTech successfully employed its approach to cyber analytics in support of national security objectives. Id. While the protester disagrees with the agency’s conclusions regarding ManTech’s approach to cyber analytics, see Supp. Comments at 4, that disagreement, without more, is insufficient to demonstrate that the agency’s judgment was unreasonable.
Similarly, slide 50 of ManTech’s OTPP, in particular, details ManTech’s proposed approach to the development and delivery of big data platforms. It also lays out a multi‑step process with accompanying details for each step, lists cross‑functional team members, and states the tools to be used. AR, Tab 16.b.2, ManTech OTPP Slides at 50. It further provides two examples of past successes in this area. Id. On this record, we conclude that the agency had a reasonable basis to assign a strength based on ManTech’s DCO approach.
As the foregoing examples illustrate, the protester has not demonstrated that the agency’s evaluation of proposals under the technical and management approach factor was unreasonable. We therefore deny this ground of protest.
ICON Project Scenario
The agency assigned two strengths and one weakness to ManTech’s proposal under the ICON project scenario factor, resulting in an adjectival rating of good for that factor. AR, Tab 6, TEB Consensus Report at 28‑30; Tab 7, Award Decision Document at 323‑324. The protester’s proposal received one strength and three weaknesses, and an adjectival rating of acceptable. AR, Tab 6, TEB Consensus Report at 42‑45; Tab 7, Award Decision Document at 323‑324. As with the technical and management approach factor, the agency concluded that ManTech’s proposal was superior under the ICON project scenario factor. AR, Tab 7, Award Decision Document at 323‑325.
The protester also alleges that the agency improperly evaluated proposals under this factor for several reasons. For example, the agency assigned a weakness to the protester’s proposal for the use of DCO tools that the agency found could increase unintentional open vulnerabilities. AR, Tab 6, TEB Consensus Report at 44; Tab 7, Award Decision Document at 323. In particular, the agency cited the proposed use of [DELETED], which the agency stated “increases defensive vulnerabilities[,]” AR, Tab 6, TEB Consensus Report at 44, and further that it “is not accredited on [DOD] networks[,]” id. at 43. The agency also cited the protester’s proposed use of [DELETED] and [DELETED], concluding that “[DELETED]’s data model environment does not provide the flexibility to execute overlapping missions at the same time[,]” and that “[DELETED] introduces numerous vulnerabilities[.]” Id. at 44. Finally, the agency cited the proposal’s inclusion of [DELETED], [DELETED], [DELETED], and [DELETED], which it found “should not be needed and were not relevant to the operation.” Id.
The protester contends that each of these conclusions is unreasonable, arguing that the agency failed to identify what vulnerabilities [DELETED] introduces, and further that the agency is mistaken as to [DELETED]’s accreditation for use on DOD networks. Protest at 42. The protester alleges that the agency also is incorrect with respect to [DELETED]’s ability to execute overlapping, simultaneous missions, and that the agency failed to consider known controls that can mitigate [DELETED]’s vulnerabilities. Id. at 42‑43. Lastly, the protester argues that the agency’s conclusions with respect to [DELETED], [DELETED], [DELETED], and [DELETED] were unreasonable in light of the protester’s proposal to provide a modular, scalable DCO solution, as well as discussion in the protester’s OTPP slides regarding the purpose for including those tools. Id. at 43‑44.
Based on the record provided, we conclude that this challenge presents only the protester’s disagreement with the agency’s evaluation. As the TEB chair states in his declaration, the agency’s evaluators assigned this weakness based on their expertise and experience on Army DCO programs, which included their inability to attain [DELETED] accreditation.[5] Decl. of TEB Chair at 2. He further cites [DELETED]’s “600‑700 free and open‑source tools that include[] countless attack vectors[,]” as well as its “default use of the root account for most operations[,]” which he states “is extremely dangerous in the cyber realm.” Id. Similarly, the TEB chair states that the agency’s evaluators also had experience with [DELETED] in Army DCO programs, “where it was eventually dropped as an option due to its limitations when configuring overlapping DCO missions.” Id. at 3. The agency evaluators encountered similar challenges with [DELETED], which they found the Army was unable to use as part of a DCO platform “due to the numerous Category 1 vulnerabilities[[6]] identified during the [RMF] process.” Id. Finally, with respect to [DELETED], [DELETED], [DELETED], and [DELETED], the TEB chair states that it appeared to the evaluators that the protester “was including any and all tools available rather than a targeted, well thought out focus on only necessary tools to accomplish the scenario actions.” Id. They expressed concern that additional tools provide further vectors for detection that can negatively impact execution. Id.
Thus, the record reflects that the agency assigned this weakness based upon the expertise and experience of the evaluators, including experience with the specific tools the protester proposed to use and limitations encountered on DOD systems. It is not improper for agency evaluators to base their evaluation conclusions on their personal experience where, as here, that experience is relevant. See, e.g., IntegriGuard LLC, B‑401626, B‑401626.2, Oct. 20, 2009, 2010 CPD ¶ 121 at 5 (evaluation was reasonable where evaluators determined, based on their extensive program knowledge and relevant experience, that the protester’s proposed productivity rate was unrealistic). The protester disagrees with those conclusions, but that disagreement, without more, is insufficient to sustain the protest. We therefore deny the protester’s allegation that the agency unreasonably assigned this weakness.
As a second example, the agency assigned a strength to ManTech’s proposal with respect to the OCO kill chain scenario for proposing to [DELETED], which the agency found “could . . . provide unconventional forces a competitive edge against adversaries and increase the probability of success[.]” AR, Tab 6, TEB Consensus Report at 29‑30; Tab 7, Award Decision Document at 323. The protester contends that the agency unreasonably assigned this strength, arguing that it is not supported by the record and fails to take into account increased risks of detection. Protest at 45‑47; Comments at 38‑40. We conclude that the record supports the reasonableness of the assignment of this strength, and that this allegation also represents only the protester’s disagreement with the agency’s evaluative judgments.
The TEB chair explains that this strength was based on the information found at slides 70 and 71 of ManTech’s OTPP. Decl. of TEB Chair at 3‑4. As he points out, slide 70 discusses the use of [DELETED]. AR, Tab 16.b.2, ManTech OTPP Slides at 70. Slide 71 elaborates, stating that “[i]f [DELETED],” ManTech proposed to “[DELETED][.]” Id. at 71. It further proposed to “[DELETED][.]” Id. Thus, the record demonstrates that the agency had a reasonable basis for finding that ManTech proposed to [DELETED].[7] Furthermore, it was reasonable for the agency to find that providing the scenario’s [DELETED] with [DELETED] gleaned through this approach would increase the chances of mission success. We therefore conclude that the protester has not demonstrated that the agency unreasonably assigned this strength.
As these representative examples demonstrate, the agency reasonably evaluated proposals under the ICON project scenario factor. We therefore deny the protester’s challenges to the evaluation under this factor.
Key Personnel and Project Staffing
Under the key personnel and project staffing factor, the agency assigned two strengths and no weaknesses to ManTech’s proposal, resulting in an adjectival rating of good. AR, Tab 6, TEB Consensus Report at 30‑34; Tab 7, Award Decision Document at 324‑325. The agency identified one strength and no weaknesses in the protester’s proposal under that factor and assigned it an adjectival rating of acceptable. AR, Tab 6, TEB Consensus Report at 45‑48; Tab 7, Award Decision Document at 324‑325. The agency again concluded that ManTech’s proposal was technically superior under the key personnel and project staffing factor. AR, Tab 7, Award Decision Document at 325.
The protester challenges the agency’s evaluation under this factor in several respects, none of which provides a basis to sustain the protest. For example, the agency did not assign a weakness or a strength to the protester’s proposal for hiring, retaining, and replacing qualified personnel. The agency concluded that the protester proposed a somewhat feasible approach in that regard but noted that the protester’s proposal “described the shortage of cyber talent and competitive hiring landscape but did not provide a methodology to proactively address the challenges.” AR, Tab 6, TEB Consensus Report at 47; AR, Tab 7, Award Decision Document at 324. The agency further found that the protester’s proposal “did not provide a comprehensive methodology for retaining appropriately qualified personnel[,]” noting in particular that the proposal did not explain the protester’s internal employee advancement system. AR, Tab 6, TEB Consensus Report at 47. Because the protester’s proposal in this regard “was generic, . . . adhered to industry standards, and met the TOR requirements for hiring and retention[,]” the agency did not assign a strength to the protester’s proposal, finding that the protester’s proposal presented a moderate risk of unsuccessful performance in this regard. Id. at 45, 47.
The protester alleges that the agency’s risk assessment ignored the contents of its proposal, which the protester contends addressed both the shortage of cyber talent and the protester’s internal employee advancement system. Protest at 49‑52. The protester further argues that the agency treated proposals unequally by assigning a strength to ManTech’s proposal for demonstrating a comprehensive methodology for hiring, retaining, and replacing talent. Id. at 52‑54. The record does not support the protester’s contentions.
In discussing the shortage of cyber talent, the protester’s proposal states that “providing competitive compensation, flexible work accommodations, and a company culture that focuses on the people” are “[c]ritical to hiring and retaining cyber talent[.]” AR, Tab 4.b.1, Parsons Written Technical Proposal at BB‑15. It was not unreasonable for the agency to conclude that these generally stated concepts provided little in the way of a detailed methodology for addressing the shortage of cyber talent. Additionally, while that section of the protester’s proposal refers to a table describing the protester’s overall approach to hiring, retention, and replacement, see id., that table does not speak to any methods specifically addressed to the challenges of hiring and retaining cyber talent, see id. at BB‑14-BB‑15. While the protester disagrees with the agency’s assessment of the extent to which the protester’s proposal recognized and proposed methods to contend with the challenges of hiring cyber talent, the record does not demonstrate that the agency’s judgment in this regard was unreasonable.
In contrast to the protester’s proposal, the agency found that ManTech “proposed a highly relevant, comprehensive, efficient, and feasible methodology for hiring, retaining, and replacing appropriately qualified personnel[.]” AR, Tab 6, TEB Consensus Report at 32; Tab 7, Award Decision Document at 324. In part, the agency’s evaluation was based upon ManTech’s “strategy for addressing the shortage of cyber talent[, which] centered around [DELETED].” Id. In this regard, both the protester and ManTech devoted a portion of their OTPP slides to the challenges arising from the shortage of cyber talent. The protester’s OTPP states that the protester’s team has [DELETED] and makes reference to [DELETED]. See AR, Tab 4.b.3, Parsons OTPP Slides at 12. In contrast, ManTech’s OTPP devotes an entire slide to describing ManTech’s approach to its cyber talent pipeline, listing specific strategies to capture and grow cyber talent. See AR, Tab 16.b.2, ManTech OTPP Slides at 17. Thus, the record does not support the contention that the agency treated proposals unequally in this way, as the proposals were substantively distinguishable.
Additionally, the protester has not demonstrated that the agency unreasonably found that the protester’s proposal lacked detail on the protester’s employee advancement system when discussing the protester’s approach to retaining qualified personnel. With respect to the protester’s approach to employees who seek outside job offers, the protester’s proposal states only that “[DELETED].” AR, Tab 4.b.1, Parsons Written Technical Proposal at BB‑15. It was not unreasonable for the agency to conclude that this single sentence lacked detail regarding the protester’s advancement system. Moreover, while the protester faults the agency for not similarly concluding that ManTech’s proposal lacked detail with respect to employee advancement, see Comments at 43, ManTech’s proposal is substantively distinguishable in this regard. For example, ManTech’s proposal discussed the [DELETED], [DELETED], and [DELETED]. See AR, Tab 16.b.1, ManTech Written Technical Proposal at BB‑15. Thus, the record demonstrates that the agency’s evaluation was neither unreasonable nor the product of unequal treatment.
As a final example, the protester contends that the agency improperly failed to credit its proposal for naming three additional key personnel beyond those the TOR required, as well as for the protester’s key personnel’s superior qualifications to those of ManTech’s key personnel. Protest at 56‑58; Comments at 47‑48. Here, the record reflects that the agency assigned strengths to both the protester’s and ManTech’s proposals for the qualifications of their key personnel. See AR, Tab 6, TEB Consensus Report at 33‑34, 47‑48; Tab 7, Award Decision Document at 324. In assigning those strengths, the agency noted that both offerors had proposed key personnel whose qualifications met all required and desired qualifications, and in many cases, exceeded both the required and desired qualifications. See AR, Tab 6, TEB Consensus Report at 33, 47. Thus, the record reflects that the agency specifically credited the protester’s proposal for the elements that the protester now asserts should have been weighted more favorably. To the extent the protester believes that its proposal merited more heavily or significantly weighted strengths, the protester’s disagreement with the agency’s judgment, without more, does not provide a basis to sustain the protest. Environmental Chem. Corp., B‑416166.3 et al., June 12, 2019, 2019 CPD ¶ 217 at 15; Protection Strategies, Inc., B‑414648.2, B‑414648.3, Nov. 20, 2017, 2017 CPD ¶ 365 at 8. Here, the record shows that the agency positively viewed and credited the protester’s proposal for its key personnel, and the protester’s belief that the agency should have ascribed even more weight to these evaluation findings is quintessentially a matter of disagreement with the evaluation.
Based on our review of the record, we conclude that the agency reasonably evaluated proposals under the key personnel and project staffing factor, as shown by the examples discussed above. Accordingly, we deny the protester’s challenge to that evaluation.[8]
Unfair Competitive Advantage
Lastly, the protester alleges that ManTech had an unfair competitive advantage arising from its hiring of the former Navy PCA, arguing that his duties in that former role must have given him access to competitively sensitive information about the protester and its subcontractors through their performance of the predecessor effort. Protest at 62‑65; Comments at 50‑52. The protester alleges that, given the former Navy PCA’s current role with ManTech, “there is no doubt that [he] was involved in ManTech’s ICON capture effort.” Protest at 64. Thus, the protester argues, ManTech may have gained an unfair competitive advantage through its hiring of a government official, and it therefore should have been disqualified from the competition. Id. at 65.
Where a firm may have gained an unfair competitive advantage through its hiring of a former government official, the firm can be disqualified from a competition based on the appearance of impropriety created by the situation, even if no actual impropriety can be shown, so long as the determination of an unfair competitive advantage is based on hard facts and not on mere innuendo or suspicion. Peraton, Inc., B‑422585 et al., Aug. 16, 2024, 2024 CPD ¶ 173 at 6. The assessment of whether an unfair competitive advantage has been created by a firm’s hiring of a former government official is based on a variety of factors, including an assessment of whether the government employee had access to non‑public proprietary or source selection sensitive information that was competitively useful. Id. The protester bears the burden of providing the hard facts needed to support an unfair competitive advantage allegation. See Science Applications Int’l Corp., B‑419961.3, B‑419961.4, Feb. 10, 2022, 2022 CPD ¶ 59 at 12; Perspecta Enter. Sols., LLC, B‑418533.2, B‑418533.3, June 17, 2020, 2020 CPD ¶ 213 at 11. Here, we conclude that there are no hard facts demonstrating an unfair competitive advantage.
While the protester infers that the former Navy PCA must have had access to non‑public, competitively sensitive information about the protester’s performance of the predecessor effort based on the duties of his position with the Navy, the individual has submitted a sworn declaration detailing his involvement with that effort. Relevant to the protester’s allegations, he states that he “oversaw the Navy’s cyber program, which included, at a high level, the resource allocation, policy[,] and readiness for all Navy and Marine Corps cyberspace activities.” Decl. of Former Navy PCA at 1. He further states that he “was not directly involved in any of the Navy’s cyber procurement efforts[,]” and that his “involvement remained at the Department‑level[.]” Id. He avers that “[t]o the extent [he] was briefed on any programs, it was never a discussion of any contractor’s performance issues or capabilities[,]” and that “[a]t no time did [he] have any access to contractors’ proprietary information or any competitively useful information.” Id. With respect to the protester’s performance under the predecessor vehicle, he states that he “never had any visibility into any proprietary details of Parsons’ performance[.]” Id.
Similarly, with respect to ManTech’s proposal in the instant procurement, he states that his involvement in ManTech’s procurement efforts is high‑level but that he “do[es] not play a material role in the submission.” Id. at 2. Thus, for the ICON proposal, he “did not draft the proposal, make any revisions to the proposal, or attend the oral presentation.” Id. Furthermore, the contracting officer confirmed that the Navy Cyber Warfare Development Group and Marine Corps Cyberspace Operations were not involved in drafting the ICON task order PWS or evaluating proposals submitted in response to the TOR. COS at 3. The contracting officer also confirmed that the individual in question was not the point of contact for those agencies, and that he had no involvement in drafting the PWS. Id.
On this record, we conclude that there are no hard facts demonstrating that ManTech gained an unfair competitive advantage through its hiring of the former Navy PCA. The protester has alleged, in essence, that this individual’s duties as the Navy PCA necessarily must have given him access to information regarding the protester’s performance under the predecessor effort, and furthermore, that he must have been involved in preparing ManTech’s proposal. His sworn declaration, however, provides no factual basis for us to conclude that he did, in fact, have access to information about the technical capabilities of the protester and its subcontractors through their prior performance. Furthermore, his declaration and the contracting officer’s statement indicate that he had no substantive involvement in either the drafting of the TOR and PWS or ManTech’s proposal.
Thus, the allegation that ManTech gained an unfair competitive advantage through the hiring of this individual rests solely on innuendo or suspicion, and not the requisite hard facts. We therefore deny this ground of protest. See Cybermedia Techs., Inc., B‑420881, B‑420881.2, Oct. 14, 2022, 2022 CPD ¶ 259 at 9 (where protester alleged that it was “extremely likely” former agency officials had access to information regarding performance of incumbent contracts but record demonstrated they were not involved in any aspect of the procurement, contracting officer did not err in not investigating); PRC, Inc., B‑274698.2, B‑274698.3, Jan. 23, 1997, 97‑1 CPD ¶ 115 at 17‑20 (denying unfair competitive advantage allegation where hearing testimony from former government official hired by awardee did not establish that he had access to competitively sensitive information); cf. Health Net Fed. Servs., LLC, B‑401652.3, B‑401652.5, Nov. 4, 2009, 2009 CPD ¶ 220 at 29‑35 (sustaining allegation that contracting officer improperly failed to investigate unfair competitive advantage where record established a prima facie case that former government official had access to competitively useful information and worked on the awardee’s proposal).
The protest is denied.
Edda Emmanuelli Perez
General Counsel
[1] Citations to the amended TOR are to the Adobe PDF page numbers.
[2] Based on the approximately $1.3 billion value of the task order, see AR, Tab 7, Award Decision Document at 328, the protest falls within our statutory grant of jurisdiction to hear protests in connection with task and delivery orders valued in excess of $10 million issued under civilian agency multiple‑award, indefinite‑delivery, indefinite‑quantity contracts. 41 U.S.C. § 4106(f); see also General Dynamics Info. Tech., Inc., B‑422272, B‑422272.2, Mar. 15, 2024, 2024 CPD ¶ 81 at 4 n.2 (applicable threshold for protest of task order in support of a DOD agency issued under Alliant 2 GWAC is $10 million).
[3] The protester advances additional collateral arguments. While we do not address all of the protester’s arguments, we have considered them and conclude that they provide no basis to sustain the protest. For example, the protester alleges that the agency’s evaluators were not representative of the ICON user community, which the protester contends contributed to the allegedly unreasonable evaluation of proposals. See Protest at 15‑16; Comments at 5‑6. We dismiss that allegation as failing to state a legally sufficient basis for protest. The composition of a technical evaluation panel is within the discretion of the contracting agency, and absent evidence of bad faith, bias, or conflict of interest, none of which has been alleged, let alone shown, here, we will not question the agency’s choice of evaluators. See NSR Sols., Inc., B‑406337, B‑406337.2, Apr. 18, 2012, 2012 CPD ¶ 154 at 2 n.2 (dismissing challenge that agency evaluation panel members lacked sufficient technical expertise).
The protester also contends that the agency unreasonably failed to ask clarifying questions during the protester’s oral presentation, leading to the assignment of weaknesses that the protester argues could have been addressed through clarifications. Protest at 18‑21; Comments at 8‑9. The TOR provided that the agency had the option, but was not required, to ask clarification questions of offerors in connection with their oral presentations. See Amended TOR at 110 (“Upon completion of the [i]n‑[p]erson [o]ral [t]echnical [p]roposal [p]resentation, the [g]overnment may caucus to formulate any clarification questions regarding the [w]ritten [t]echnical [p]roposal and [i]n‑[p]erson [o]ral [t]echnical [p]roposal [p]resentation.”); 113 (“The [g]overnment may . . . [a]sk clarifying questions during the [question and answer] period of the presentations if needed.”). Agencies have broad discretion whether to seek clarifications from offerors, and offerors have no automatic right to clarifications regarding proposals. Castellano Cobra UTE-MACC LEY 18‑1982, B‑421146.2, Jan. 19, 2024, 2024 CPD ¶ 37 at 4; Valkyrie Enters., LLC, B‑414516, June 30, 2017, 2017 CPD ¶ 212 at 7. This is especially true where, as here, the solicitation permits, but does not require, the agency to seek clarifications. See All Points Logistics, Inc., B‑418700.2, Jan. 11, 2021, 2021 CPD ¶ 19 at 8 (no right to clarifications where the solicitation reserved the right to ask clarification questions). As discussed below, we conclude that the agency reasonably assigned weaknesses to the protester’s proposal based upon the information submitted and presented to the agency. Given the agency’s wide discretion, we have no basis to find that the agency acted improperly by assigning those weaknesses without seeking to resolve them through clarifications.
[4] While the protester decries the declaration as a “post hoc effort to justify [the agency’s] unsupported finding,” Supp. Comments at 3, our Office generally will consider post‑protest explanations that provide a detailed rationale for contemporaneous conclusions, and simply fill in previously unrecorded details, as long as those explanations are credible and consistent with the contemporaneous record, Booz Allen Hamilton, Inc., B‑420116.6, B‑420116.7, Aug. 22, 2022, 2022 CPD ¶ 221 at 9. Here, as discussed in greater detail below, the TEB chair’s declaration fills in previously unrecorded details consistent with the contemporaneous record and demonstrates there was a reasonable basis for the agency to assign this strength to ManTech’s proposal. See, e.g., Wolff & Mueller Gov’t Servs. GmbH & Co. KG, B‑419181, B‑419181.2, Dec. 28, 2020, 2021 CPD ¶ 12 at 4‑6 (concluding that post‑protest explanations for errors in evaluation board report and source selection decision document were consistent with the contemporaneous record); SENTEL Corp., B‑407060, B‑407060.2, Oct. 26, 2012, 2012 CPD ¶ 309 at 9 & n.6 (same with respect to selection decision).
[5] As with the technical and management approach factor, see n.4 supra, we find that the TEB chair’s declaration is consistent with the contemporaneous record, and fills in previously unrecorded details regarding the basis for the agency’s evaluation under the ICON project scenario factor.
[6] As the TEB chair explains, “Category 1 RMF vulnerabilities are most at risk of serious exploitation and if exploited by a malicious attack, these vulnerabilities are the most significant threats to the wider network. If unchecked, Category 1 vulnerabilities are likely to directly lead to data breaches or loss of services.” Decl. of TEB Chair at 3.
[7] In its supplemental comments, the protester contends that its “core protest ground” took exception specifically to the agency’s use of the phrase “[DELETED]” in assigning this strength, as ManTech’s proposal does not use that particular terminology. Supp. Comments at 9. The TEB chair acknowledges that “[t]he reference to [DELETED] should have been identified as . . . [DELETED][.]” Decl. of TEB Chair at 3. In light of the evaluation record’s further reference to “[DELETED][,]” as well as the substantive content of ManTech’s OTPP slides discussed above, we assign no importance to the evaluators’ apparently erroneous reference to “[DELETED].”
[8] The protester also challenges the agency’s best‑value tradeoff decision based on its challenges to the underlying evaluation. Protest at 60‑62; Comments at 48‑50. This assertion is based on the protester’s various complaints that we have rejected. Accordingly, we dismiss the challenges to the best‑value tradeoff because they do not establish a valid basis of protest. NTT Data Servs. Fed. Gov’t, LLC, B‑420274, B‑420274.2, Jan. 18, 2022, 2022 CPD ¶ 69 at 17.