AttainX, Inc.

B-421546; B-421546.2, June 22, 2023

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: AttainX, Inc.

File: B-421546; B-421546.2

Date: June 22, 2023

Matthew T. Schoonover, Esq., Matthew P. Moriarty, Esq., John M. Mattox II, Esq., Ian P. Patterson, Esq., and Tim J. Laughlin, Esq., Schoonover & Moriarty LLC, for the protester.
Ryan Lambrecht, Esq., Department of Commerce, for the agency.
Glenn G. Wolcott, Esq., and Christina Sklarew, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Agency reasonably assessed a rating of “low confidence” to protester’s proposal under the solicitation’s most important evaluation factor, technical experience.

2. Agency’s methodology in evaluating proposals and identifying the most highly rated offerors was reasonable and consistent with the provisions of the solicitation.

DECISION

AttainX Think Tank LLC, of Herndon, Virginia, protests the evaluation of its proposal by the Department of Commerce, National Oceanic and Atmospheric Administration (NOAA), pursuant to request for proposals (RFP) No. 1305M4-22-RNEEA-0001, to provide “a wide assortment of professional, technical and scientific services.” See Combined Contracting Officer’s Statement and Memorandum of Law (COS/MOL) at 1.[1] AttainX primarily challenges the agency’s assessment of a “low confidence” rating under the most important evaluation factor, relevant technical experience.

We deny the protest.

BACKGROUND

On December 2, 2021, the agency issued the solicitation as a total small business set‑aside. The solicitation sought proposals to provide a broad range of professional, technical and scientific services in the “Satellite Domain,”[2] and stated that the agency intended to award between 10 and 25 IDIQ contracts under which task orders will subsequently be issued.[3] RFP at 112.[4] The services identified in the solicitation’s performance work statement (PWS) were divided into various “service areas,”[5] and each service area identified specific “elements” that may be required under subsequent task orders.[6] The solicitation further provided that source selection decisions would be made on the basis of “Highest Technically Rated Offerors with a Fair and Reasonable Price,” and provided that, in identifying the highest technically rated proposals, the agency would consider the following evaluation factors, listed in descending order of importance: relevant technical experience,[7] management approach, and past performance. Id.

The solicitation also provided that the procurement would be conducted in two phases, stating that phase one would consist of each offeror's "self-assessment" of its recent relevant experience. Id. at 101. More specifically, the solicitation required each offeror to submit a matrix (RFP attachment J-4) in which the offeror characterized its experience in performing each of the service areas' 190 elements as "extensive," "some," or "no" relevant experience. Id. at 114-15. The solicitation warned offerors that, as discussed below, experience claimed in phase one would have to be substantiated during phase two and, accordingly, advised offerors that they "should review and consider" the phase two requirements before submitting their phase one self-assessment proposals. Id. at 101. In this regard, the solicitation advised that: "[o]fferors should only claim experience for those elements where they can clearly substantiate (in Phase 2) the level of the experience they are claiming." AR, Tab 4b, attach. J-4, Tab 1. The solicitation provided that, following the phase one submissions, the agency would make advisory recommendations to offerors regarding whether they should proceed to phase two.

In phase two, offerors were required to submit additional information substantiating the experience claimed in phase one.[8] With regard to the technical experience evaluation factor, each offeror was required to identify up to 20 prior contracts under which it had gained the experience the offeror was claiming;[9] map the prior contracts to the relevant performance elements;[10] and submit a written narrative[11] in which the offeror “shall describe its depth of experience.”[12] RFP at 103. Of significance here, the solicitation specifically stated:

It is the Offeror’s responsibility to demonstrate [its] experience in [its] proposal. For example, the Offeror must demonstrate that the relevant experience examples provided in Phase Two align with the levels of experience provided in Phase One. The Offeror is required to ensure all proposal information submitted is verifiable. If the Source Selection Evaluation Board detects a high degree of contradictory or unsubstantiated information submitted in an Offeror’s proposal, the Government will negatively evaluate the proposal, and remove the Offeror from being considered for award.[[13]]

Id. at 117 (emphasis added).

The solicitation provided that, following submission of the phase two technical experience proposals, the agency would evaluate each offeror’s claimed experience, making judgments and assessments regarding the agency’s “degree of confidence in an Offeror’s understanding of and capability to perform work that is relevant to the elements of the PWS.” Id. With regard to these assessments, the solicitation identified multiple aspects of an offeror’s experience that would be considered.[14] Id. Under the heading “Basis for Award,” section M of the solicitation stated that an offeror “need not provide capability for all of the listed services set forth in the PWS to be considered for award,” elaborating that, in performing its evaluation and making its source selection decisions, the agency would consider whether a given offeror “demonstrate[d] a high level of technical merit or proficiency for a subset of the PWS services.”[15] Id. at 113.

On January 6, 2022, phase one proposals were submitted by 66 offerors, including AttainX.[16] AttainX claimed various levels of experience under virtually all (188 of 190) of the performance elements. Protest at 7. Based on its assertions of prior experience, AttainX was one of 40 offerors subsequently invited to submit phase two proposals. Id. On February 28, phase two proposals were submitted by 40 offerors, including AttainX.

Thereafter, the agency evaluated the phase two proposals.[17] In evaluating AttainX’s proposal under the technical experience factor, the agency found that AttainX’s proposal “repeatedly failed to demonstrate the requirements of the evaluation criteria, both by failing to address the requirements of the PWS’s service areas and elements, and by providing vague descriptions of its understanding and experience.” COS/MOL at 2-3; see AR, Tab 22, Consensus Evaluation Report at 1-19. Consistent with the solicitation’s warning that phase two proposals that failed to adequately substantiate the experience claimed in phase one would be “negatively evaluate[d]” and “remove[d] . . . from being considered for award,” see RFP at 117, the agency assessed a rating of low confidence to AttainX’s proposal under the technical experience evaluation factor, rendering the proposal ineligible for award.

In assessing a rating of low confidence to AttainX’s proposal under the technical experience factor, the agency’s contemporaneous evaluation documentation included the following summary:

Of the 188 elements the Offeror proposed, 49 elements (26%) supported a Low Confidence rating, 113 elements (60%) supported a Some Confidence rating, and 26 elements (14%) supported a High Confidence rating, reflecting the Offeror’s demonstrated understanding of the work and capability to successfully perform. The Service Areas for [REDACTED] were rated Low Confidence in half or more of their proposed elements. None of the Service Areas were rated High Confidence in half or more of their proposed elements. Because of the high number of service areas with Low Confidence ratings, and because of the significant number of proposed elements with Low Confidence ratings that were not offset by the relatively low number of High Confidence element ratings, the Government has low confidence [that AttainX] understands the requirement, has relevant technical experience, and will be successful in performing the contract.

AR, Tab 22, Consensus Evaluation Report at 1.

Overall, the proposals of AttainX and the offerors selected for award were rated as follows:

Offeror | Technical Experience | Management Approach | Past Performance | Cost/Price
Centuria | High Confidence | Some Confidence | Exceptional | Reasonable
Columbus Techs. and Services, Inc. | Some Confidence | High Confidence | Exceptional | Reasonable
Data Networks, Inc. | High Confidence | Some Confidence | Very Good | Reasonable
Earth Resources Technology, Inc. | High Confidence | Some Confidence | Very Good | Reasonable
ENSCO Inc. | High Confidence | Some Confidence | Exceptional | Reasonable
Global Science & Technology, Inc. | High Confidence | Some Confidence | Very Good | Reasonable
I.M. Systems Group, Inc. | High Confidence | High Confidence | Very Good | Reasonable
IBSS Corporation | High Confidence | Some Confidence | Very Good | Reasonable
INNOVIM, LLC | High Confidence | Some Confidence | Exceptional | Reasonable
Integrated Systems Solutions, Inc. | High Confidence | Some Confidence | Very Good | Reasonable
Relative Dynamics, Inc. | Some Confidence | Some Confidence | Very Good | Reasonable
RIVA Solutions, Inc. | High Confidence | Some Confidence | Very Good | Reasonable
Riverside Technology, Inc. | High Confidence | Some Confidence | Very Good | Reasonable
Science and Technology Corp. | High Confidence | Some Confidence | Exceptional | Reasonable
Spatial Front | High Confidence | Some Confidence | Very Good | Reasonable
AttainX | Low Confidence | Some Confidence | Very Good | Reasonable

AR, Tab 25.a, SSDD at 5, 8-9.

On March 1, the agency awarded IDIQ contracts to the 15 contractors listed above and notified the unsuccessful offerors, including AttainX, that their proposals had not been selected for award. This protest followed.

DISCUSSION

AttainX challenges various aspects of the agency’s source selection process, first and foremost challenging the agency’s assessment of a low confidence rating under the most important evaluation factor, relevant technical experience. Additionally, AttainX asserts that its proposal should have received a rating of high confidence, rather than some confidence, under the management approach evaluation factor. As discussed below, we find no basis to sustain AttainX’s protest.[18]

Evaluation of AttainX’s Technical Experience

In protesting the agency’s assessment of a low confidence rating under the technical experience evaluation factor, AttainX challenges: (1) the agency’s determination that AttainX failed to substantiate its claimed experience for a substantial portion of the performance elements; and (2) the agency’s evaluation methodology.

Substantiation of Claimed Experience

First, AttainX challenges the agency's evaluation with regard to each and every one of the 49 performance elements for which the agency assigned low confidence ratings, maintaining that the agency applied unstated evaluation factors, ignored information in AttainX's proposal, and failed to adequately document its evaluation. In short, AttainX asserts that the phase two descriptions of experience that it submitted for each of these elements were adequate to substantiate its claimed experience and should have led to ratings of either some confidence or high confidence.[19]

In challenging the reasonableness of the agency’s assessments, AttainX first asserts that the solicitation “provided remarkably little direction” regarding the performance requirements; maintains that the specifications for each element were “highly general” and “nothing more than the Element’s name”; and complains that the solicitation “did not provide discrete requirements, performance objectives, or sample deliverables.” Protest at 2, 4; Supp. Protest at 2. Based on its characterization of the solicitation provisions, AttainX asserts that the requirement to demonstrate relevant experience “did not present a high bar for offerors.” Supp. Protest at 4. Accordingly, AttainX maintains that the agency’s multiple assessments that AttainX’s phase two proposal did not adequately describe, and therefore failed to substantiate, its experience under the 49 elements held AttainX to a “heightened and unstated evaluation standard” that was inconsistent with the terms of the solicitation. Id. at 1.

AttainX further complains that the agency's evaluation "provides no analysis" to support its low confidence determinations, arguing that the agency had an affirmative obligation to document its contemporaneous evaluation with statements regarding what AttainX's proposal should have stated. Supp. Protest at 2-5. That is, AttainX argues that the agency's contemporaneous evaluation documentation was inadequate because it did not describe what information about AttainX's purported experience was missing from its proposal. Id.

In response, the agency first maintains that AttainX's description of the solicitation requirements is fundamentally inaccurate. COS/MOL at 24-26. Specifically, the agency notes that the solicitation requirements were defined by reasonably detailed specifications for each service area, which identified multiple requirements and objectives that offerors were required to demonstrate. The requirements were further defined by the solicitation's "General Definitions" section, which identified specific requirements for terms that were frequently used throughout the PWS. Finally, the requirements were further defined by the specific title of the element itself. Id.; see RFP at 14, 17. In short, the agency maintains that, contrary to AttainX's characterization of the solicitation's performance requirements, those requirements were well-defined.

Next, the agency notes that the solicitation expressly placed offerors on notice that they were responsible for demonstrating relevant experience by adequately describing their prior activities and ensuring that the information presented was verifiable; warned that “statements paraphrasing the requirements” would be considered “inadequate and unsatisfactory”; and further warned offerors that failure to adequately substantiate their claimed experience would lead to rejection of their proposals. See RFP at 94, 117.

Finally, with regard to AttainX’s assertion that the agency’s evaluation of each of the 49 performance elements that received ratings of low confidence was flawed and inadequately documented, the agency provides a detailed discussion of its evaluation for each of those elements. See COS/MOL at 26‑135.

In reviewing protests challenging an agency’s evaluation of proposals, our Office does not reevaluate proposals, but examines the record to determine whether the agency’s judgments were reasonable and in accordance with the stated evaluation criteria and applicable procurement laws and regulations. Trandes Corp., B‑411742 et al., Oct. 13, 2015, 2015 CPD ¶ 317 at 6. An offeror’s disagreement with the agency’s judgments, without more, is insufficient to establish that the agency acted unreasonably. STG, Inc., B‑405101.3 et al., Jan. 12, 2012, 2012 CPD ¶ 48 at 7. Additionally, an offeror has the burden of submitting a clearly written proposal, and where a proposal fails to clearly convey required information, the offeror runs the risk of an adverse agency evaluation. G.A. Braun, Inc., B-413735, Dec. 21, 2016, 2016 CPD ¶ 374 at 5.

In reviewing an agency’s evaluation, we do not limit our consideration to contemporaneously-documented evidence, but instead consider all the information provided, including the parties’ arguments and explanations concerning the contemporaneous record. Remington Arms Co., Inc., B-297374, B-297374.2, Jan. 12, 2006, 2006 CPD ¶ 32 at 10. Post-protest explanations that provide a detailed rationale for contemporaneous conclusions, and simply fill in previously unrecorded details, will generally be considered in our review of the reasonableness of evaluation decisions--provided those explanations are credible and consistent with the contemporaneous record.[20] OGSystems, LLC, B-417026.5, B-417026.6, July 16, 2019, 2019 CPD ¶ 273 at 4-5; NWT, Inc.; PharmChem Labs., Inc., B-280988, B-280988.2, Dec. 17, 1998, 98‑2 CPD ¶ 158 at 16.

Finally, where a protester and agency disagree over the meaning of solicitation language, we will resolve the matter by reading the solicitation as a whole and in a manner that gives effect to all its provisions; to be reasonable, and therefore valid, an interpretation must be consistent with the solicitation when read as a whole and in a reasonable manner. See, e.g., Alluviam LLC, B-297280, Dec. 15, 2005, 2005 CPD ¶ 223 at 2.

Here, we have reviewed the record and find no basis to question the agency's evaluation of AttainX's technical experience, which concluded that AttainX's descriptions of its experience under 49 performance elements were inadequate and failed to substantiate the experience claimed.

For example, AttainX’s phase one proposal asserted that it had experience performing the requirements contained in PWS section C.3.2.60, “Flight Segment – Pre-Launch, Launch, Early Orbit Raising.” The solicitation defined those requirements as “services . . . support[ing] program flight segments,” elaborating that “[a] flight segment is defined as a collection of airborne and spaceborne hardware, software and communications resources to support all phases of an observing system lifecycle,” and specifically contemplated services performed during pre-launch, launch, and early orbit. RFP at 21. AttainX’s phase two proposal identified two contracts (described by the agency as “data-buy” contracts) as its basis for claiming experience in performing this requirement. AttainX’s entire description of its experience (repeated verbatim for both contracts) was:

[REDACTED]

AR, Tab 15, AttainX Phase Two Technical Proposal at 18, 24.

In assigning a low confidence rating for this element, the agency’s contemporaneous evaluation stated that “[AttainX] asserted capability but failed to demonstrate capability or experience with this element,” adding that AttainX’s proposal “failed to demonstrate how this element occurred in [the] data-buy contract[s] cited.” AR, Tab 22, Consensus Evaluation Report at 9.

In responding to AttainX's protest, the agency notes that AttainX's description of its purported experience did not describe what actual work it performed and did not explain what "implement[ing] a process" entailed; the agency further notes that, although the performance element contemplated experience with pre-launch, launch, and orbit activities, AttainX's statement did not even cursorily address pre-launch and launch activities. COS/MOL at 71-72.

By way of another example, AttainX’s phase one proposal asserted that it had experience performing the requirements contained in PWS section C.3.2.66, “Ground Segment – Data Systems – Calibration, Validation, Verification.” The solicitation defined those requirements as “services [that] support the data systems elements of a program’s ground system,” elaborating that “[t]he ground segment is defined as [a] collection of on-ground hardware, software, network and communication resources that support all phases of an observing system lifecycle.” RFP at 21. The solicitation further provided detailed definitions of the terms “calibration,”[21] “validation,”[22] and “verification.”[23]

AttainX’s phase two proposal identified two contracts as the basis for its claimed experience performing this requirement. The entire description of its experience (repeated verbatim for both contracts) was:

[REDACTED]

AR, Tab 15, AttainX Phase Two Technical Proposal at 21, 26.

In assigning a rating of low confidence for this element, the agency’s contemporaneous evaluation stated that “[AttainX] asserted capability but failed to demonstrate capability or experience with this element,” adding that “[AttainX] states they do extensive testing – but no mention of Calibration, Validation, and Verification.” AR, Tab 22, Consensus Evaluation Report at 10.

In responding to AttainX’s protest, the agency notes that AttainX’s description of its purported experience did not describe what actual work it performed; did not explain what it did in the “development test and implementation of [its] ground station systems”; and provided no information about what the ground systems were, or what data systems were included in those ground systems. COS/MOL at 73-74.

Overall, based on our review of the agency’s contemporaneous evaluation documentation; its comprehensive response to the protest (which provides further explanation and detail regarding the agency’s basis for evaluating each of the 49 elements challenged by AttainX);[24] and AttainX’s phase two proposal, we find nothing unreasonable in the agency’s assessment of low confidence ratings for each of the 49 challenged elements.

First, we reject AttainX’s assertion that the solicitation’s descriptions of the performance requirements were “highly general,” “nothing more than the Element’s name,” and “did not present a high bar for offerors.” See Protest at 2, 4; Supp. Protest at 2, 4. Rather, as the agency points out, the solicitation here contained reasonably detailed definitions of the performance requirements based on the specifications identified for each service area, the solicitation’s definition of multiple relevant terms, and the specific title of each element; read together, we agree with the agency that the solicitation requirements were defined with reasonable detail.

Further, as noted above, the solicitation clearly directed offerors to describe their experience in a manner that could be substantiated, and to provide sufficient information for the agency to make reasonable assessments regarding the extent of the offeror’s understanding of the performance requirements. Here, as represented by the examples discussed above, we find nothing unreasonable in the agency’s determination that AttainX’s phase two proposal failed to meet those requirements.

Based on our review of the record, we find no basis to question the agency’s evaluation of AttainX’s proposal with regard to each of the challenged elements. Accordingly, AttainX’s assertions that the agency applied unstated evaluation criteria, ignored information in AttainX’s proposal, and failed to adequately document its evaluation are denied.

Agency’s Evaluation Methodology

In addition to protesting the agency’s low confidence assessments at the element level, AttainX also challenges the agency’s evaluation methodology, which incorporated those assessments into its overall assessment of AttainX’s proposal under the technical experience evaluation factor. First, AttainX complains that it was unreasonable for the agency to assess an overall rating of low confidence under the technical experience factor when AttainX’s proposal received ratings of some confidence for 60 percent of the solicitation requirements, and ratings of high confidence for 14 percent of the requirements. Protest at 9-11. In this context, AttainX maintains that, even if it failed to substantiate more than 25 percent of the solicitation requirements, this did not constitute the “high degree” of unsubstantiated information that warranted exclusion, see RFP at 117, and, accordingly, did not warrant assessment of an overall low confidence rating. Id.

Next, AttainX complains that the agency improperly considered the extent to which an offeror demonstrated experience in the various service areas,[25] rather than limiting its consideration to the total number of elements for which at least some experience was demonstrated.[26] Id. Overall, AttainX asserts that the agency’s evaluation methodology was unreasonable and contrary to the terms of the solicitation.

The agency responds that AttainX's challenges to the agency's methodology fail to acknowledge that the solicitation expressly advised offerors that, in evaluating technical experience, the agency would consider multiple aspects of offerors' proposals and make qualitative assessments regarding the offerors' relative qualifications and experience, stating that the agency would "assess its degree of confidence in an offeror's understanding and capability to perform [the] work."[27] COS/MOL at 20-24; see RFP at 117. The agency points out that, in addition to providing for consideration of an offeror's experience in performing each element, the solicitation advised offerors that the agency would also consider an offeror's experience regarding "a subset of the PWS services." COS/MOL at 20-24; see RFP at 113. Thus, the agency maintains that the solicitation put offerors on notice that the agency would make assessments regarding offerors' experience in the various service areas--that is, "subsets" of the PWS requirements. Further, in the context of considering the extent of offerors' experience in the various service areas, the agency maintains that it was reasonable to consider whether an offeror did, or did not, have experience performing a majority of a given service area's elements. COS/MOL at 20-24. Finally, the agency notes that the solicitation advised offerors of the agency's stated intent to award contracts to "a set of service providers" that would ensure competition at the task order level and, accordingly, permitted the agency's consideration of whether all service areas were sufficiently covered by capable contractors in order to facilitate competition. Id. at 15; see RFP at 14, 17, 112. Accordingly, the agency maintains that the solicitation placed offerors on notice that the evaluation methodology to be employed under the technical experience factor would not be limited to consideration of individual performance elements, and asserts that the agency's execution of the evaluation methodology was reasonable and consistent with the terms of the solicitation.

While procuring agencies are required to identify significant evaluation factors and subfactors in a solicitation, they are not required to identify every aspect of each factor that might be considered; rather, agencies reasonably may take into account considerations, even if unstated, that are reasonably related to or encompassed by the stated evaluation criteria. See, e.g., Front End Analytics, LLC, B-420024.2, B‑420024.3, Feb. 2, 2022, 2022 CPD ¶ 53 at 8.

Here, we reject the protester’s assertion that the methodology the agency employed in evaluating proposals under the technical experience evaluation factor was unreasonable or contrary to the terms of the solicitation. As discussed above, the solicitation clearly identified multiple aspects of offerors’ technical experience proposals that would be considered, including the extent to which the various offerors had experience in performing the requirements of the various service areas. In this context, we further find nothing unreasonable in the agency’s consideration of whether an offeror’s experience did, or did not, extend to a majority of elements in the various service areas. Finally, we do not find persuasive AttainX’s assertion that its failure to substantiate more than 25 percent of the solicitation requirements did not constitute a “high degree” of unsubstantiated information. As noted above, the solicitation specifically advised that: “offerors should only claim experience for those elements where they can clearly substantiate (in Phase 2) the level of the experience they are claiming.” See AR, Tab 4b, attach. J-4, Tab 1. On the record here, we reject AttainX’s various allegations challenging the evaluation methodology the agency employed under the technical experience evaluation factor.

Evaluation of Management Approach

Finally, AttainX asserts that its proposal should have received a rating of high confidence, rather than some confidence, under the second most important evaluation factor, management approach. However, as discussed above, we have concluded that the agency reasonably evaluated AttainX’s proposal as ineligible for award on the basis of the low confidence rating assessed under the most important evaluation factor, technical experience. Accordingly, even were we to agree that the agency should have assigned a rating of high confidence to AttainX under the management approach evaluation factor, AttainX would not be in line for award; therefore, there is no potential competitive prejudice to AttainX based on the alleged flaws in evaluation of its proposal under the management approach evaluation factor. See, e.g., MCR Federal, LLC, B‑411977, B‑411977.2, Nov. 23, 2015, 2016 CPD ¶ 3 at 5. Based on the record discussed above, we decline to further address AttainX’s complaints regarding the evaluation of its proposal under the management approach evaluation factor.

The protest is denied.

Edda Emmanuelli Perez
General Counsel

 

[1] Page number citations in this decision refer to the Adobe PDF page numbers in the documents submitted.

[2] The agency states that this procurement is “a follow-on to the Professional, Scientific, and Technical Services (‘ProTech’) Program, which was approved on May 20, 2015,” and explains that the ProTech program is comprised of multiple-award indefinite‑delivery indefinite-quantity (IDIQ) contracts in four “domains”: satellite, fisheries, oceans, and weather. COS/MOL at 4. The IDIQ contracts to be awarded under this procurement (generally referred to as “ProTech Satellite”) are “intended to satisfy the need for professional, technical, and scientific services to support the full range of related requirements for observing system activities, including satellite missions, which NOAA manages or in which NOAA participates, and managing the space and Earth environmental data that results from those missions.” Id.

[3] Noting that the agency did not expect that all of the required services could be acquired from a single contractor, the solicitation stated: “NOAA intends to achieve . . . a set of service providers who collectively can perform all of the required . . . services, and can provide NOAA with competition for coverage of services at the task order level.” Agency Report (AR), Tab 12.a, RFP amend. 2 at 13-14, 17.

[4] Unless otherwise indicated, all references to the RFP in this decision are to AR, Tab 12.a.

[5] The service areas were divided between “professional” service areas and “technical and scientific” service areas. RFP at 13-27.

[6] Specifically, and as discussed in more detail below, the solicitation contained over 20 “service areas” comprised of various “elements” (the number of elements in each of these service areas ranged from 2 to 18) for which offerors were required to demonstrate their technical experience. See AR, Tab 4b, RFP attach. J-4, Technical Experience Matrix, Tab 1. The solicitation stated that the specific performance requirements were defined by: (1) the specifications listed in each element; (2) the specification of each element’s service area; and (3) the definitions contained in section C.4 of the solicitation, titled “General Definitions.” RFP at 14, 17.

[7] The solicitation provided that the agency would assign ratings of high confidence, some confidence, or low confidence under the technical experience evaluation factor. Of relevance to this protest, the solicitation defined a low confidence rating as, “[t]he Government has low confidence that the Offeror understands the requirement, has sufficient relevant technical experience, and will be successful in performing the contract even with Government intervention,” and provided that assessment of a low confidence rating under this factor would render the proposal ineligible for award. RFP at 112-13.

[8] In phase two, offerors were also required to provide submissions relevant to the other evaluation factors; those submissions are not relevant to resolution of this protest and are not further discussed.

[9] The solicitation required that the prior contracts be identified by contract number and customer, and include contact information for the prior contracting officer and the contracting officer's representative. AR, Tab 4c, RFP attach. J-5, Technical Experience Form.

[10] See AR, Tab 4b, RFP attach. J-4, Phase Two Tab of Technical Experience Matrix.

[11] The narrative was limited to 40 pages. RFP at 95.

[12] In this context, the solicitation defined “depth” of experience as the extent to which the offeror’s description addressed “the entire mission lifecycle of an individual service element”; defined “lifecycle” as including “analysis,” “development,” and “execution”; and noted that “[e]xperience across the entire mission lifecycle of a service element will be evaluated more favorably than limited experience within the mission lifecycle.” Id. at 17-18, 103, 117.

[13] Consistent with these provisions, the solicitation also stated that “Offerors shall provide sufficient information for the Government to determine its level of confidence in the ability of the Offeror to perform the requirements of the RFP based on an assessment of relevant experience,” adding that “statements paraphrasing the requirements” would be considered “inadequate and unsatisfactory.” Id. at 94.

[14] In addition to warning that claims of experience that were not substantiated would lead to a “negative[] evaluat[ion]” and exclusion of the proposal from further consideration, the solicitation stated that an offeror’s claimed experience must “meet[] an element of the PWS”; “align[] with” at least one of seven “mission focus areas” identified in the solicitation; be “similar in size to current ProTech Satellite services”; and have been performed within the last five years. Id. at 101.

[15] As discussed below, the agency considered each of the various service areas as “a subset of the PWS services.” See AR, Tab 25.a, Source Selection Decision Document (SSDD) at 2.

[16] AttainX states that it “is a small business joint venture between AttainX and Think Tank.” Supp. Protest at 5.

[17] The solicitation advised offerors that the agency intended to make contract awards without conducting discussions. RFP at 113. Consistent with that provision, the agency did not conduct discussions with any offeror.

[18] In its various protest submissions, AttainX presents arguments that are variations of, or additions to, those specifically discussed below, including assertions that the agency failed to consider the experience of the joint venture partners in the aggregate; failed to reasonably advise offerors of the risk related to overstating their experience; and documented its evaluation of AttainX’s proposal in a contradictory manner. We have considered all of AttainX’s allegations and find no basis to sustain its protest.

[19] A “high confidence” rating was defined as: “[t]he Government has high confidence that the Offeror understands the requirement, has extensive relevant technical experience, and will be successful in performing the contract with little or no Government intervention.” RFP at 113. A “some confidence” rating was defined as: “[t]he Government has some confidence that the Offeror understands the requirement, has relevant technical experience, and will be successful in performing the contract with some Government intervention.” Id. Finally, as noted above, a “low confidence” rating was defined as: “[t]he Government has low confidence that the Offeror understands the requirement, has sufficient relevant technical experience, and will be successful in performing the contract even with Government intervention.” Id.

[20] In contrast, where an agency offers an explanation of its evaluation during the heat of litigation that is not borne out by the contemporaneous record, we will give little weight to the later explanation. See, e.g., Al Raha Grp. for Tech. Servs., Inc.; Logistics Mgmt. Int’l, Inc., B-411015.2, B-411015.3, Apr. 22, 2015, 2015 CPD ¶ 134 at 10.

[21] The solicitation defined calibration as: “a comparison between a known quantity or standard and its corresponding measured or sensed quantity. The concept generalizes to software, with algorithmic parameters or coefficients calibrated or ‘tuned’ to generate a result that conforms to some calibration standard.” RFP at 28.

[22] The solicitation defined validation as: “[a]ssessment of engineering, scientific, or technical fidelity. The several instances of validation throughout the PWS indicate that validation occurs at all scales ranging from individual data to products and algorithms, to systems operations, such as uplinking a satellite command load. Validation does not imply verification: a validated system may produce a scientifically accurate result, yet it may not meet the system’s accuracy requirements.” Id. at 29.

[23] The solicitation defined verification as “[a]ssessment of compliance with requirements and specifications. The several instances of verification throughout the PWS indicate that verification occurs at all scales ranging from individual data to satellite constellations. Verification does not imply validation: a system’s verified ability to timely generate a product does not imply the correctness of that product.” Id. at 30.

[24] As noted above, our Office will consider an agency’s post-protest explanations that fill in previously unrecorded details, provided those explanations are credible and consistent with the contemporaneous record. OGSystems, LLC, supra. Here, we view the agency’s post-protest explanations as simply providing additional details with regard to its contemporaneous evaluation conclusions; further, we find the agency’s explanations regarding the inadequacy of AttainX’s description of its experience to be consistent with AttainX’s proposal and, therefore, credible and consistent with the contemporaneous record. Id.

[25] As discussed above, the agency’s evaluation noted that AttainX failed to demonstrate experience in a majority of elements for eight service areas, and failed to demonstrate high confidence in a majority of the elements for any service area. See AR, Tab 22, Consensus Evaluation Report at 1. In this context, AttainX also asserts that the agency applied an “arbitrary metric” by calculating whether a proposal did, or did not, demonstrate experience in a majority (that is, over 50 percent) of the elements within a given service area. Protest at 11.

[26] In this context, AttainX asserts that each of the individual elements constituted evaluation factors that were to be given equal weight. Id. at 10-11.

[27] Among other things, the solicitation specifically contemplated the agency’s assessment regarding the demonstrated “depth” of an offeror’s experience. RFP at 103.
