Anika Systems, Inc.
Highlights
Anika Systems, Inc., of Leesburg, Virginia, the incumbent contractor, protests the issuance of a task order to Amaze Technologies, LLC, of Herndon, Virginia, under request for proposals (RFP) No. 70SBUR24R00000009, issued by the Department of Homeland Security (DHS) to provide support services to the United States Citizenship and Immigration Services (USCIS) Office of the Chief Data Officer (OCDO). The protester contends that the agency unreasonably evaluated technical proposals and unreasonably concluded that a price realism analysis was unnecessary.
DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Decision
Matter of: Anika Systems, Inc.
File: B-422681.5; B-422681.6
Date: April 8, 2025
Carla J. Weiss, Esq., Logan Kemp, Esq., and Annie Hudgins, Esq., Nichols Liu, LLP, for the protester.
Daniel J. Strouse, Esq., Pablo Nichols, Esq., and Sam Van Kopp, Esq., Cordatis LLP, for Amaze Technologies, LLC, the intervenor.
Richard W. Postma, Esq., Department of Homeland Security, for the agency.
Kenneth Kilgour, Esq., and Jennifer D. Westfall-McGrail, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.
DIGEST
1. Protest that the agency unreasonably evaluated technical proposals is sustained where the record demonstrates the evaluation was unreasonable and inconsistent with procurement law and regulation.
2. Protest that the agency unreasonably concluded that a price realism analysis was unnecessary is denied where the record demonstrates that the agency had a reasonable basis for its decision not to conduct that analysis.
DECISION
Anika Systems, Inc., of Leesburg, Virginia, the incumbent contractor, protests the issuance of a task order to Amaze Technologies, LLC, of Herndon, Virginia, under request for proposals (RFP) No. 70SBUR24R00000009, issued by the Department of Homeland Security (DHS) to provide support services to the United States Citizenship and Immigration Services (USCIS) Office of the Chief Data Officer (OCDO). The protester contends that the agency unreasonably evaluated technical proposals and unreasonably concluded that a price realism analysis was unnecessary.
We sustain the protest.
BACKGROUND
USCIS is a component of DHS with the mission of adjudicating requests for immigration benefits. The contracting officer explains that, aside from the agency’s workforce, data is its most valuable resource. Email from USCIS to GAO, Mar. 13, 2025, at 2. Adjudicators use applicant-supplied data to make benefit decisions, managers use data to prioritize and assign work, and leaders use data to drive immigration policy and set fees. Id. According to the contracting officer, “[t]he amount and variability of data across the USCIS enterprise and other DHS components contribute to the complexity of using it.” Id. The contracting officer describes USCIS’s “Data Strategy” as the agency’s plan “for making its data reliable, trusted, secure, and used to drive efficiencies across the enterprise and other DHS components.” Id.
To obtain contractor support, USCIS issued the RFP in accordance with Federal Acquisition Regulation (FAR) section 16.505 to holders of the General Services Administration’s (GSA) One Acquisition Solution for Integrated Services (OASIS) small business 8(a) pool 1 contract. Agency Report (AR), Tab 6, Conformed RFP at 3. The RFP contemplated the issuance of a fixed-price task order for data strategy support services (DSSS3) to the offeror whose proposal represented the best value to the government, considering technical approach, corporate experience, and price. Id. at 16, 36-37. The technical approach factor was more important than the corporate experience factor; when combined, these two factors were significantly more important than price. Id. at 33. The task order would have a 9-month base period, four 1-year options, and a fifth option of three months. Id. at 15.
Under the technical approach factor, offerors would submit a response to two scenarios; the agency would use those responses to assess the offeror’s ability to successfully accomplish the performance work statement requirements. Id. at 37. The agency would assign the technical approach factor a rating of high confidence, some confidence, or low confidence, which were defined as follows:
Factor 1: Technical Approach and Factor 2: Corporate Experience

| Confidence Level | Definition |
|---|---|
| High Confidence | The Government has High Confidence that the Offeror understands the requirement, responded effectively to the evaluation criteria, and will be successful in performing the task order with little or no Government intervention. |
| Some Confidence | The Government has some confidence that the Offeror understands the requirement, responded effectively to the evaluation criteria, and will be successful in performing the task order with some Government intervention. |
| Low Confidence | The Government has low confidence that the Offeror understands the requirement or will be successful in performing the task order even with Government intervention. An Offeror receiving a low confidence rating is not eligible for award without conducting exchanges. |
Id. at 38. A key determinant here is the required degree of government intervention. See id.
The solicitation described scenario 1 under the technical approach factor as follows:
USCIS requires a new data sharing agreement with Health and Human Services (HHS). USCIS will be receiving HHS data related to refugee resettlement. HHS has indicated that the data would be available via a flat file transfer. The intention is for the data to be used for adjudicative purposes, which may require repetitive checks, and will be available for USCIS adjudicators and officers in the USCIS case management system, Electronic Immigration System (ELIS). The data will be electronically stored in the USCIS data warehouse. This will require collaboration with both internal (within USCIS) stakeholders (including the Office of Information Technology-OIT) and external (external to USCIS) stakeholders to draft those agreements and policies.
Id. at 35.
Offerors were to respond to the following prompts pertaining to this scenario:
a. Demonstrate how you would determine the standards that could be applied to the data; describe the policies and describe the coordination processes to ingest the data and insure appropriate protection.
b. Provide a plan for the assessment of quality.
c. Create a plan to train USCIS users on this new data source, including describing the source, uses, and constraints associated with the dataset.
d. Identify potential visualizations to provide leadership visibility into the data and aid with decision-making on the issue.
e. Identify business requirements for automating processes throughout the data lifecycle.
f. Define the components of a communications and outreach plan to coordinate the availability and use of data with business and technology stakeholders.
Id.[1]
Under the corporate experience factor, the agency would “assess its level of confidence that the offeror understands the requirement, responded effectively to the evaluation criteria, and will be successful in performing the task order,” using the same confidence levels. Id. Anika does not challenge the agency’s evaluation of proposals under the corporate experience factor.
The solicitation advised offerors that the agency would consider price proposals to ensure the final price was fair and reasonable. Id. at 38. The solicitation further advised offerors that, in accordance with FAR section 15.404-1(g), the agency might also determine an offer unacceptable if the prices proposed were materially unbalanced. Id. With respect to price realism, the RFP stated: “The government reserves the right to conduct a price realism analysis, should it be determined necessary.” Id.
Twenty offerors, including Amaze, Anika, and Analytica LLC, submitted proposals. AR, Tab 32, Amended Price Analysis Report at 2. USCIS issued the task order to Analytica LLC, and Anika and Amaze both protested that award decision with our Office, asserting that the awardee was ineligible because it did not hold the required OASIS 8(a) contract. We dismissed both protests when the agency took corrective action. Anika Systems, Inc., B-422681, July 29, 2024 (unpublished decision); Amaze Techs., LLC, B-422681.2, July 29, 2024 (unpublished decision). During corrective action, the agency found Analytica ineligible on the basis argued by Anika and Amaze, and Analytica protested that determination to our Office; we denied the protest. Analytica LLC, B-422681.3, B‑422681.4, Nov. 26, 2024, 2024 CPD ¶ 290.
With Analytica ineligible for award, the contracting officer reviewed the remaining proposals to make a new award decision. Contracting Officer’s Statement (COS) at 3. The table below summarizes the agency’s evaluation of the proposals of Anika and Amaze:
| Factor | Anika | Amaze |
|---|---|---|
| Technical Approach | Some Confidence | High Confidence |
| Corporate Experience | High Confidence | Some Confidence |
| Price | $85,949,560 | $68,486,273 |
AR, Tab 25, Source Selection Document (SSD) at 5-6. The source selection authority (SSA), who was also the contracting officer for this procurement, began his best-value tradeoff by reviewing the technical evaluation committee (TEC) report and agreeing with its findings. AR, Tab 33, Addendum to Selection Decision at 3. The SSA noted that proposed prices had been found fair and reasonable based on adequate competition. Id. at 4; see also AR, Tab 32, Amended Price Analysis Report at 3. The price analysis stated that “[p]rice realism was not performed and not necessary for this firm-fixed price task order where only the labor rate was proposed. Additionally, no offeror’s evaluated price was significantly lower than the average price.” AR, Tab 32, Amended Price Analysis Report at 4.
Four offerors--including Amaze but not Anika--received the highest confidence level rating of high confidence under the most important technical evaluation factor--technical approach. AR, Tab 33, Addendum to Selection Decision at 3. Those four offerors also received ratings of some confidence under the corporate experience factor. Id. Anika was one of three additional offerors who received ratings of some confidence or higher for both technical approach and corporate experience. Id. The SSA indicated that he conducted a thorough review of the price analysis report to assess the offers in detail. Id. Of the four highest rated offerors that received identical confidence level ratings under the technical approach and corporate experience factors, Amaze had the lowest proposed price of $68,486,273 and a total evaluated price of $75,627,599.00. Id. at 4. The SSA noted that the agency had determined Amaze’s price fair and reasonable based on adequate price competition. Id. The SSA stated that he would “award to the highest rated offeror, who also offers the lowest price and therefore a trade-off is not in the best interest of the government.” Id.
USCIS issued the task order to Amaze, and Anika’s protest followed.[2]
DISCUSSION
Anika asserts myriad errors in USCIS’s evaluation of technical proposals, including that the agency used an unreasonable methodology for assigning confidence ratings. The protester also alleges that the agency unreasonably determined not to conduct a price realism analysis. As discussed below, we deny the allegation that the agency’s methodology for assigning confidence ratings was unreasonable and the allegation that the agency lacked a reasonable basis for its decision not to conduct a price realism analysis. We find many of the protester’s challenges to the underlying evaluation of technical proposals meritorious, however, and we sustain the protest on that basis.
Evaluation of Technical Proposals
The findings on which the agency based its assignment of a “some confidence” rating to Anika’s proposal under the technical approach factor were as follows:
· While the vendor identifies all six data literacy personas developed at USCIS, the training plan is limited to end users of the data only ([DELETED]) with no training defined for [DELETED]. While there are multiple approaches for delivering the training, [DELETED]; however, the plan seems to narrowly define the training needs as ‘data quality training’ which seems to conflict with the defined plan in Exhibit 5. It’s unclear what the vendor intends.
· The vendor demonstrates familiarity with USCIS data, existing visualizations, and mockups for potential use with the new HHS/USCIS integration. The mockups clearly align with establish work products and are in scope data from this potential scenario. While addressing the requirements, expected more innovation or creativity in expanding current government capabilities for a brand-new exchange.
· Vendor’s overall approach to automation is thorough, with examples from the initial collection of the data through transforming, using, storing, and disposition. The plan lacks boundaries for work between Team Anika and where OCDO government or OIT would execute all these activities, specifically citing solutions like [DELETED] numerous times without clearly identifying to who would complete the work in these actions. The vendor may be going into more detail than needed, creating confusion for actual work support. Overall, this plan could effectively address the requirements, however it’s unclear who is intended to do the work described.
· Vendor’s overall communication plan identifies the limitations of relying on disseminating ‘[DELETED],’ however, the approach the vendor subsequently recommends relies only on [DELETED]. There is differentiated communication based on the stakeholder (business or technical). It’s unclear what the vendor intends.
· The vendor’s approach for scenario 2 provided an accurate balance of blending current experience and best practices to achieve the goals identified in the scenario.
AR, Tab 23, TEC Report at 8.
As noted above, Anika contends that the agency’s methodology for assigning confidence ratings was unreasonable and that a number of the agency’s findings were unreasonable.[3] We first address Anika’s challenge to the methodology for assigning confidence ratings.
Methodology for Assignment of Confidence Ratings
Anika challenges the methodology the agency used to assign confidence ratings as unsupported and unjustified. Comments and Supp. Protest at 5. The protester contends that the record reflects that the evaluators did not consider how well the offeror understood the requirement, how effectively the offeror responded to the evaluation criteria, and the degree to which government intervention is required for the offeror to successfully perform the task order. Id. at 6. Anika asserts that those considerations were necessary to properly assign confidence ratings. Id. The protester argues that, because the evaluators did not identify findings as either increasing or decreasing confidence, it is impossible to know how any given finding affected the confidence rating. Id.
We disagree. While the protester claims it is impossible to know which findings increased and which decreased confidence, Anika nonetheless challenges those findings that decreased confidence. See id. at 7-16. Anika describes those findings which it challenged as “negative.” See, e.g., id. at 8 (discussing “the first negative finding, which was associated with the third prompt”). Additionally, Anika asserts that the agency disparately evaluated proposals when it assigned a “positive finding” to Amaze’s proposal but not to Anika’s. Id. at 7. Thus, it is apparent from Anika’s protest that it ascertained which of the agency’s findings were positive and increased confidence, and those which were negative and decreased confidence. Accordingly, we see no basis in the record to sustain the allegation that the methodology for assigning confidence ratings was unreasonable.
Challenges to Technical Findings
As noted above, the protester challenges a number of the agency’s technical findings. The protester disputes the negative findings attributed to its proposal, contends that the agency disparately evaluated the proposals with regard to certain content, and argues that the agency failed to reward its proposal for multiple positive features. We find merit in several, but not all, of the protester’s arguments.
Offeror Plans for Assessment of Quality
First, the protester contends that the agency evaluated the offerors’ plans for the assessment of quality unequally. According to the protester, this resulted in the agency attributing a positive finding to Amaze’s proposal, but no corresponding finding to Anika’s proposal, for comparable proposal content.
The second prompt under scenario 1 required offerors to provide a plan for the assessment of quality. RFP at 35. The agency found that Amaze’s proposal “[DELETED].” AR, Tab 23, TEC Report at 5. The protester contends that the agency “simply walks through the subheadings of Amaze’s response to the [solicitation’s] prompt [to provide a plan for the assessment of quality] without examining the substance.” Comments and Supp. Protest at 8, citing AR, Tab 15, Amaze Technical Approach Proposal at 3. Anika asserts that its proposal also incorporated these aspects into a [DELETED] to data quality, rather than simply “chunking the approach into subheadings.” Comments and Supp. Protest at 8, citing AR, Tab 19, Anika Technical Approach Proposal at 1-3. Anika compares the attributes of the two proposals and asserts that they both identify “the benchmarks of data quality[.]” Comments and Supp. Protest at 8.
In a task order competition, the evaluation of proposals is a matter within the contracting agency’s discretion, as the agency is responsible for defining its needs and the best method of accommodating them. CACI, Inc.--Fed., B-420729.2, Mar. 1, 2023, 2023 CPD ¶ 51 at 7. In reviewing protests of such evaluations, we do not reevaluate proposals but examine the record to determine whether the evaluation and source selection decision are reasonable and consistent with the solicitation and applicable procurement laws and regulations. Id. A protester’s disagreement with the agency’s judgment of the relative merit of competing proposals, without more, does not establish that the evaluation was unreasonable. Id. When a protester alleges disparate treatment in a technical evaluation, to prevail, it must show that the agency unreasonably evaluated the protester’s proposal in a different manner than another proposal that was substantively indistinguishable or nearly identical. Id. at 10. In other words, to establish disparate treatment a protester must show that the differences in evaluation did not stem from differences between the proposals. Id.
While we will not substitute our judgment for that of the agency, we will sustain a protest where the agency’s conclusions are inconsistent with the solicitation’s evaluation criteria, undocumented, or not reasonably based. NavQSys, LLC, supra at 3. Where an agency fails to document or retain evaluation materials, it bears the risk that there may not be adequate supporting rationale in the record for us to conclude that the agency had a reasonable basis for its evaluation conclusions. Id. Post-protest explanations that simply fill in previously unrecorded details will generally be considered in our review of the rationality of selection decisions, so long as those explanations are credible and consistent with the contemporaneous record. Artek Constr. Co., B‑418657, B-418657.2, July 17, 2020, 2020 CPD ¶ 285 at 9 n.14.
The agency contends that the positive statement about which Anika complains does not apply solely to Amaze’s half-page plan for assessment of quality, but rather encompasses Amaze’s entire technical approach proposal, which demonstrated that Amaze “responded effectively to the evaluation criteria,” thus meeting the solicitation’s standard for high confidence. Resp. to Comments at 7-8, quoting RFP at 38 (providing definition of high confidence).[4] The agency asserts that “[a] review of Amaze’s proposal shows the words [DELETED] appear multiple times throughout its [technical proposal], and not merely within the half-page section where Amaze describes its plan for ‘assessment of quality.’” Resp. to Comments at 7.
The agency does not respond substantively to the protester’s claim that Anika’s proposal also identified the capabilities for which Amaze’s proposal received credit. See id. at 8 (asserting, vaguely, that several areas of Anika’s proposal did not respond effectively to the evaluation criteria, without responding to Anika’s assertion that the agency treated proposals disparately). The justification provided for the agency’s evaluation of Amaze’s proposal is the frequency with which the words [DELETED] appear throughout Amaze’s proposal. Id. at 7. The agency provides no rationale why the repetition of those words provides reasonable support for the agency’s evaluation finding. See id. Moreover, Anika’s proposal also mentions numerous times most of the words cited by the agency. See AR, Tab 19, Anika Tech. Proposal. If mere mentions of those words warranted an agency finding that the offeror “responded effectively to the evaluation criteria,” then Anika’s proposal warranted such a finding, as well. In sum, the record supports the protester’s assertion that the agency treated the two offerors’ proposals unequally in assigning Amaze’s, but not Anika’s, a positive finding for identifying significant goals of standards, structure, relative tools for work, security, and audit capabilities. Thus, we sustain the allegation on this basis.
User Training
Anika contends that USCIS applied an unstated evaluation criterion when it downgraded Anika’s proposal because its training plan was limited to end users of the data, with no training provided to [DELETED]. Comments and Supp. Protest at 8, citing AR, Tab 23, TEC Report at 8. The protester argues that the agency’s finding ignored the solicitation requirement that offerors describe a plan for training “USCIS users (as opposed to all other data literary [personas]).” Comments and Supp. Protest at 8. As noted above, the third prompt under scenario 1 was to “[c]reate a plan to train USCIS users[[5]] on this new data source, including describing the source, uses, and constraints associated with the dataset.” RFP at 35. Anika proposed [DELETED]. AR, Tab 19, Anika Tech. Proposal at 6.
At issue is the agency finding that, ”[w]hile [Anika] identifies all [DELETED] data literacy personas[[6]] developed at USCIS, the training plan is limited to end users of the data only ([DELETED]) with no training defined for [DELETED].” AR, Tab 23, TEC Report at 8. Anika asserts that, because the solicitation did not define “USCIS users,” the protester reasonably interpreted the term to mean end users that would interact with USCIS data‑-[DELETED]. Comments and Supp. Protest at 9. That is, Anika’s proposal identified [DELETED] as “end users” and proposed a training program for those two groups. See AR, Tab 19, Anika Tech. Proposal at 6.
As noted above, this is the context in which the training requirement appears:
USCIS requires a new data sharing agreement with Health and Human Services (HHS). USCIS will be receiving HHS data related to refugee resettlement. HHS has indicated that the data would be available via a flat file transfer. The intention is for the data to be used for adjudicative purposes, which may require repetitive checks, and will be available for USCIS adjudicators and officers in the USCIS case management system, Electronic Immigration System (ELIS). The data will be electronically stored in the USCIS data warehouse. This will require collaboration with both internal (within USCIS) stakeholders (including the Office of Information Technology-OIT) and external (external to USCIS) stakeholders to draft those agreements and policies.
RFP at 35. In other words, USCIS will be receiving data from a new source. The new data will need to be transferred from HHS to USCIS. The data will be subject to quality checks and will require storage. The entire process of obtaining and warehousing the data--which will be used by adjudicators in the case management system--will require collaboration with internal and external stakeholders to draft agreements and policies.
Given the complexity of scenario 1, we think it was unreasonable for the protester to assume that “USCIS users” meant “end users” only; the record supports the agency’s contention that utilization of data from a new source will implicate many facets of USCIS besides [DELETED]. For that reason, we see no basis in the record to object to the assessment of this finding, and this protest allegation is denied.[7]
Potential Visualizations
Anika argues that, in evaluating the protester’s proposal under the RFP’s fourth prompt for scenario 1, USCIS employed an unstated evaluation criterion by requiring innovative and creative visualizations when proposing potential visualizations. Comments and Supp. Protest at 12. That prompt required offerors to “[i]dentify potential visualizations to provide leadership visibility into the data and aid with decision-making on the issue.” RFP at 35. The agency found that Anika’s proposal demonstrated familiarity with existing visualizations, but USCIS “expected more innovation or creativity in expanding current government capabilities for a brand-new exchange.” AR, Tab 23, TEC Report at 8. Anika argues that innovation and creativity were not solicitation requirements. Comments and Supp. Protest at 11‑12. USCIS asserts that the requirement to “[i]dentify potential visualizations” was, by definition, “forward-looking”; the agency wanted more than just “existing visualizations.” Resp. to Comments at 11. The protester contends that “‘potential visualizations’ does not mean ‘new visualizations,’” and that Anika reasonably understood potential visualizations to mean visualizations that could be used during contract performance. Supp. Comments at 12. Finally, Anika asserts that USCIS evaluated proposals disparately because the agency did not apply the criterion for innovative and creative visualizations to Amaze’s proposal; the protester contends that “nothing within the evaluation identifies that Amaze offered new visualizations, much less innovative or creative ones.” Comments and Supp. Protest at 12, citing AR, Tab 15, Amaze Tech. Proposal at 5-6. Amaze’s proposal, Anika argues, was not similarly penalized for a lack of innovative visualizations. Id.
USCIS also asserts that “Anika acknowledges in a footnote that non-incumbent offerors, i.e., offerors with no access to ‘existing visualizations,’ had no option but to offer potential visualizations, i.e., new visualizations. See Supp. Protest at 11 and n.5.” Resp. to Comments at 11. The agency is incorrect. The protester said: “[The solicitation] makes no mention that offerors must demonstrate ‘innovation or creativity’ or create new visualizations for a brand-new exchange over the visualizations that are currently used.” Comments and Supp. Protest at 11. In a footnote, the protester adds: “And this weakness could only be applied towards the incumbent because other offerors do not have visibility as to the current offerings.” Id. at n.5. Anika is not conceding that Amaze’s proposal must have satisfied the solicitation requirement; the protester is contending that the misreading of the solicitation was prejudicial, in that it permits the agency to apply the requirement only against Anika, the incumbent.
USCIS did not respond to Anika’s allegation that the agency disparately evaluated proposals, except to incorrectly assert that the protester acknowledged that any visualization from a non-incumbent offeror would be a “potential” visualization. See Resp. to Comments at 10-11. Such an interpretation of the requirement unreasonably assumes that proposed visualizations are “potential visualizations” if they are “new” to the offeror--not USCIS. Under that reading of the solicitation, “offerors with no access to ‘existing visualizations,’ had no option but to offer potential visualizations, i.e., new visualizations.” Id. at 11. Thus, for a non-incumbent offeror, any visualization would satisfy the RFP requirements. Such an interpretation, besides being illogical, is unsupported by the context in which “potential visualizations” appears: “Identify potential visualizations to provide leadership visibility into the data and aid with decision-making on the issue.” RFP at 35. Not every visualization proposed by a non-incumbent offeror provides visibility into the data and aids decision-making, and those qualities are integral to satisfying the RFP requirement.
Moreover, the record does not support a finding that Anika’s proposal relied on “existing” visualizations. The protester’s first potential visualization “[DELETED].” AR, Tab 19, Anika Tech. Proposal at 7. A second proposed visualization examines whether [DELETED]. Id. at 8. A plain reading of these two visualizations suggests that they have not been executed before and therefore are not “existing.” The evaluation offers no explanation for why the agency considers them “existing visualizations,” AR, Tab 23, TEC Report at 8, and the agency offers no after-the-fact rationale. See Resp. to Comments at 10-11. The agency’s evaluation of Anika’s proposal found that the protester’s proposed visualizations lacked innovation and creativity, AR, Tab 23, TEC Report at 8, and USCIS does not directly respond to Anika’s contention that the agency did not apply that same criterion to Amaze’s proposal. Comments and Supp. Protest at 12; see Resp. to Comments at 11-12. Based on the record, we find that the agency disparately evaluated proposals and unreasonably concluded that the protester’s proposal lacked innovative visualizations. We sustain the protest on this basis.
Data Automation Approach
Anika contends that USCIS disparately evaluated proposals under the fifth scenario 1 prompt, which required offerors to identify business requirements for automating processes throughout the data lifecycle, when the agency found that Anika’s proposal alone did not clearly define who would perform the required activities. Comments and Supp. Protest at 13-14. The protester asserts that, if it was unclear in Anika’s proposal who would be executing the proposed activities, Amaze’s proposal offered no more specificity regarding the intended actors. Id. at 14.
The agency found that Anika’s approach to data automation “lacks boundaries for work between Team Anika and where OCDO government or [Office of Information Technology] would execute all these activities, specifically citing solutions like [DELETED] numerous times without clearly identifying [ ] who would complete the work in these actions.” AR, Tab 23, TEC Report at 8. It is “unclear,” USCIS found, “who is intended to do the work.” Id. The agency explained that the protester’s proposal used numerous sentences that began with “we” as the subject, and that left it “unclear whom Anika is recommending will do the creating, using, and building actions.” Resp. to Comments at 12.
Anika argues that “Amaze’s proposal likewise does not specifically state Amaze would be doing each of the proposed activities.” Comments and Supp. Protest at 14. Anika provided the following example:
[DELETED].
Id., quoting AR, Tab 15, Amaze Tech. Proposal at 6. This passage, like many others in Amaze’s proposal, provides the agency no basis on which to conclude that Amaze proposed to perform the actions related to data automation.
USCIS argues that “for each of Amaze’s proposed identifying activities for automatic processes, Amaze declared that it (and not the government) would take the initiative.” Resp. to Comments at 12. The agency notes, for example, that Amaze’s proposal states that “[DELETED].” Id., quoting AR, Tab 15, Amaze Tech. Proposal at 9. That is one of only a few instances in Amaze’s discussion of automating processes throughout the data life cycle that Amaze’s proposal uses the subject “we.” See AR, Tab 15, Amaze Tech. Proposal at 8-9. Frequently, as Anika notes above, Amaze’s proposal is ambiguous as to who is responsible for performing the requirement. Amaze’s proposal states, as another example, that “[DELETED],” without explicitly stating that Amaze would be performing the embedding. See id. at 9. If the agency was confused about who Anika meant when it said “we,” we agree with Anika that USCIS cannot credibly claim that Amaze’s proposal offered greater clarity as to who would perform the proposed tasks. See Comments and Supp. Protest at 14. Accordingly, we again agree with Anika that USCIS disparately evaluated proposals in assigning the above negative finding to its proposal, and we sustain the protest on this basis.
Communication Plan
Anika argues that the agency unreasonably evaluated its proposal under the sixth scenario 1 prompt, which required offerors to “[d]efine the components of a communications and outreach plan to coordinate the availability and use of data with business and technology stakeholders.” RFP at 35; Comments and Supp. Protest at 15. The protester contends that its proposal in fact outlined the components of a communication plan. Comments and Supp. Protest at 15, citing AR, Tab 19, Anika Tech. Proposal at 10-11.
Anika’s proposal provided a communication plan that “describes the methods used for communication such as [DELETED].” AR, Tab 19, Anika Tech. Proposal at 10. USCIS found that Anika’s communication plan “identifies the limitations of relying on disseminating ‘[DELETED],’[[8]] however, the approach the vendor subsequently recommends relies on [DELETED].” AR, Tab 23, TEC Report at 8, quoting AR, Tab 19, Anika Tech. Proposal at 10.
Anika argues that the finding “ignores that the requirement was to ‘define the components of a communications and outreach plan,’ not to create the communications plan.” Supp. Comments at 15. We agree with the agency that the protester proposed only [DELETED] as communication channels. See AR, Tab 19, Anika Tech. Proposal at 10. In other words, the components of the protester’s proposed communication plan were [DELETED]; Anika’s proposal identifies no other components. See id. at 10-11; see also Comments and Supp. Protest at 14-15. The record provides no basis on which to object to the reasonableness of the finding that Anika’s communication plan relied on those two means of communication. Accordingly, this allegation is denied.
Four Missed Findings that Increase Confidence
Anika contends that the agency failed to find, under scenario 1 of the technical approach factor, that four features of the protester’s proposal would increase the agency’s confidence in performance “with little or no government intervention,” a criterion that, along with others, warranted a rating of “high confidence.” The protester argues it proposed: the identification of two major challenges and recommendations for overcoming them; a plan to coordinate processes to ingest data to ensure appropriate data protection; an approach to the assessment of data quality; and a plan to train USCIS users. Comments and Supp. Protest at 16. USCIS responds that these four recommendations “will require some government intervention” and therefore properly correspond to the solicitation’s rating of “Some Confidence,” not “High Confidence.” COS at 4. Without elaboration, Anika asserts that these missed findings would not require government intervention as that term is used in this solicitation. Id. at 17. In fact, Anika argues, “none of these [unidentified findings] involve the government at all.” Id. at 18. As discussed below, the record provides no basis for the agency’s determination that one of the four missed findings would have required government intervention.
USCIS contends that, regarding the first missed finding--the identification of major challenges--Anika proposes to [DELETED]. Resp. to Comments at 15, citing AR, Tab 19, Anika Tech. Proposal at 3. The protester fails to articulate why it was unreasonable for the agency to conclude that [DELETED]. See Comments and Supp. Protest at 17. Rather, the protester asserts that the agency “disingenuously misstates the meaning of ‘government intervention.’” Id. We disagree. We think that USCIS reasonably concluded that a proposal feature that relied on [DELETED] was one that required some government intervention. Accordingly, this protest allegation is denied.
To substantiate the protester’s second missed strength--a plan to ensure data protection--Anika cites proposal exhibit two, which, in turn, cites to exhibit three. See Comments and Supp. Protest at 17, citing to AR, Tab 19, Anika Tech. Proposal at 4. The agency notes that those exhibits contain the following steps:
[DELETED]
Resp. to Comments at 16, quoting AR, Tab 19, Anika Tech. Proposal at 4. Again, Anika does not substantively respond to USCIS’s claim that executing this strength would entail some government intervention and that therefore the agency reasonably did not find this proposal feature increased confidence. The record supports the reasonableness of the agency’s evaluation. As such, the allegation is denied.
Third, the protester argues that the agency should have credited Anika’s proposal for its approach to assessing data quality using an automated data quality tool, [DELETED]. Comments and Supp. Protest at 17. USCIS maintains that the bullet points above, regarding the second missed strength, somehow indicate that deploying the [DELETED] tool would entail some government intervention. See Resp. to Comments at 16-17. Those bullet points, however, are taken from exhibits two and three to Anika’s proposal, and they are unrelated to the deployment of the [DELETED]. See AR, Tab 19, Anika Tech. Proposal at 4. Rather, Anika’s proposal identifies the following four steps for “expanding the use of the [DELETED]”:
[DELETED].
Id. at 5. There is no obvious government intervention required in the four steps outlined in Anika’s proposal, and USCIS did not specifically explain what government intervention would be required by these steps. See Resp. to Comments at 16-17. Thus, while the agency asserts that it reasonably did not identify this proposal feature as increasing confidence because its implementation required some government intervention, we see no basis for that assertion in the record. Accordingly, we conclude that the agency unreasonably failed to credit Anika’s proposal with a finding that increases confidence for the protester’s deployment of the [DELETED]. We sustain the allegation on this basis.
Finally, Anika claims that USCIS should have credited the protester’s proposal for its plan to train USCIS users, but, as discussed above, failed to do so when it unreasonably evaluated Anika’s proposal. Comments and Supp. Protest at 18. We addressed that allegation above and found that the record provides no basis on which to find the agency’s evaluation unreasonable with respect to Anika’s proposed training plan.
Lack of Price Realism Determination
Anika contends that USCIS unreasonably concluded that a price realism analysis was not necessary. Comments and Supp. Protest at 19.
The solicitation advised offerors that “[t]he government reserves the right to conduct a price realism analysis, should it be determined necessary.” RFP at 38.
As a general matter, when awarding a fixed-price order or contract, an agency is only required to determine whether offered prices are fair and reasonable. FAR 15.402(a); High Plains Computing, Inc. d/b/a HPC Sols., B-422934, Dec. 6, 2024, 2024 CPD ¶ 298 at 4. Nonetheless, an agency may conduct a price realism analysis “for the limited purpose of assessing whether an offeror’s . . . low price reflects a lack of technical understanding or risk.” High Plains Computing, Inc. d/b/a HPC Sols., supra. GAO affords an agency broad discretion to conduct or not to conduct a price realism analysis where a solicitation advises offerors that the agency “reserves the right” to conduct that analysis. US&S-Pegasus JV, LLC, B-421681.8, B-421681.9, Nov. 19, 2024, 2024 CPD ¶ 284 at 6. A solicitation that advises offerors that an agency “reserves the right” to conduct a price realism evaluation does not impose any obligation on the agency to do so, and there is otherwise no procurement statute or regulation requiring the agency to perform such an analysis for a fixed-price contract. Id.
Citing US&S-Pegasus JV, LLC, Anika argues that where a solicitation “reserves the right” to conduct a price realism analysis, and thus offerors are on notice that a price realism assessment is possible, GAO reviews whether the agency acted unreasonably in deciding not to conduct one. Protest at 13-14. The protester notes that an agency may rely on a comparison of proposed prices when determining whether to conduct a price realism analysis. Id. at 14 n.8 (noting that “GAO has determined that a 5 [percent] price difference in proposals does not constitute an unreasonable decision to not conduct a price realism analysis. MES Simulation & Training Corp., B-416210, July 10, 2018, 2018 CPD ¶ 261”).
The contracting officer states that a price realism analysis was not performed and was not necessary for this firm-fixed price task order because no offeror’s evaluated price was significantly lower than the average price. AR, Tab 32, Amended Price Analysis Report at 4. Also relying on US&S-Pegasus JV, LLC, USCIS argues that its decision not to conduct a price realism analysis was reasonable. See Memorandum of Law (MOL) at 16, citing US&S-Pegasus JV, LLC. In US&S-Pegasus JV, LLC, the awardee’s price was 19 percent lower than the competition average and 22 percent lower than the independent government cost estimate (IGCE). US&S-Pegasus JV, LLC, supra at 6. GAO found that “the record demonstrates that the agency analyzed [the awardee’s] price by comparing it to the competition average and the IGCE, and reasonably determined that a price realism analysis was not necessary.” Id. USCIS notes that, in this procurement, Amaze’s proposed price is only 7.2 percent lower than the average price of the other acceptable offerors. MOL at 17.
In response, Anika claims that the comparison of the awardee’s price against the average of all offerors’ prices, while permitted in a price reasonableness analysis, “is irrelevant here where the concern is price realism.” Comments and Supp. Protest at 20. This argument is contradicted by the protester’s earlier acknowledgement--with citation to GAO decisions--that GAO may consider the price difference in proposals when assessing the reasonableness of an agency’s decision not to conduct a price realism analysis. See Protest at 14 n.8. We agree with the agency that a difference of just over 7 percent between the awardee’s price and the average proposed price is a reasonable basis for not conducting a price realism analysis.[9] Accordingly, this allegation is denied.
Prejudice
Competitive prejudice is an essential element of a viable protest, and where no prejudice is shown or is otherwise evident, our Office will not sustain a protest, even if a deficiency in the procurement is evident. Insight Tech. Sols., LLC, B-421764.2 et al., Sept. 29, 2023, 2023 CPD ¶ 224 at 9. We resolve any doubts regarding prejudice in favor of a protester because a reasonable possibility of prejudice is a sufficient basis for sustaining a protest. Id.
We found that USCIS disparately evaluated proposals when, under scenario 1 of the technical approach factor, it assessed only Amaze’s proposal a finding that increases confidence for identifying significant goals of standards, structure, relative tools for work, security, and audit capabilities. USCIS also assessed Anika’s proposal four findings that decrease confidence under the technical approach factor; we concluded that two of those findings were unreasonable. We also found unreasonable the agency’s failure to assess Anika’s proposal one of four additional findings that increase confidence. It is unclear whether, but for the evaluation errors, the agency would have reached the same conclusion as to the relative advantages of the two proposals. Because there is a reasonable possibility that, had the agency properly evaluated the offerors’ technical proposals, Anika would have received the award, we find that the protester was prejudiced by the errors in the evaluation of proposals under the technical approach factor. We sustain the protest on that basis.
RECOMMENDATION
As detailed above, we sustain the protest on the basis that the agency unreasonably evaluated technical proposals. We recommend that USCIS reevaluate proposals consistent with this decision and make a new award decision. In the event the reevaluation results in the selection of an offeror other than Amaze, we recommend that the agency terminate the task order issued to Amaze for the convenience of the government and issue the task order to the offeror found to represent the best value, if otherwise proper. We also recommend that Anika be reimbursed the costs of filing and pursuing its protest, including reasonable attorneys’ fees. 4 C.F.R. § 21.8(d)(1). Anika should submit its certified claim for costs, detailing the time expended and costs incurred, directly to the contracting agency within 60 days after receipt of this decision. 4 C.F.R. § 21.8(f)(1).
The protest is sustained.
Edda Emmanuelli Perez
General Counsel
[1] We are not providing a detailed summary of scenario 2 because Anika does not challenge the agency’s evaluation of proposals under that scenario.
[2] As noted above, the total evaluated price of the task order at issue here exceeds $10 million; accordingly, this procurement is within our jurisdiction to hear protests of task orders placed under civilian agency indefinite-delivery, indefinite-quantity contracts. 41 U.S.C. § 4106(f)(1)(B).
[3] The protester also argues that the agency failed to document its reevaluation of proposals. Comments and Supp. Protest at 2, citing NavQSys, LLC, B-417028.3, Mar. 27, 2019, 2019 CPD ¶ 130. In NavQSys, GAO sustained the protest because the agency found the protester ineligible for award but “nothing in the record reflects the agency’s apparent determination that award should no longer be made to NavQSys.” NavQSys, LLC, supra at 4. In this procurement, the contracting officer who made the most recent award decision relied on the initial evaluation. Resp. to Comments and Supp. Protest at 3. Indisputably, the record contains documentation of that evaluation. See, e.g., AR, Tab 23, TEC Report. The discussion below will consider the thoroughness of that documentation as well as the reasonableness of the evaluation.
[4] As discussed above, the degree of government intervention was a significant factor in the assigning of confidence ratings. RFP at 38.
[5] The agency notes that the solicitation did not define “USCIS users,” and instead “left it to the offerors to identify the users.” Resp. to Comments at 8.
[6] The [DELETED] data literacy personas that Anika’s proposal identified are: [DELETED]. AR, Tab 19, Anika Tech. Proposal at 6. Each persona, or employee group, has a unique relationship to data.
[7] The finding also faulted Anika’s proposed training plan as narrowly focused on data quality training, AR, Tab 23, TEC Report at 8, an assertion that the protester challenges. See Comments and Supp. Protest at 10. Notwithstanding Anika’s objection to USCIS’s characterization of the protester’s proposal, the record supports a finding that Anika’s proposed plan is deficient in that it only addresses training for end users.
[8] Anika’s proposal stated that “[h]igh dependency on dissemination of critical information via [DELETED] does not promote coordinated effort among branches within the OCDO to produce standardized communications to a broader business audience.” Id.
[9] Anika argues that Amaze’s proposed price was more than 20 percent below its incumbent price. Protest at 13-15. The intervenor asserts that, because Anika’s proposed price was the second highest of 20 offerors, “Anika’s reliance on its own high price as the benchmark upon which a price realism analysis should result” “lacks any support in the law.” Intervenor’s Comments at 4. We agree. Anika also argues that, because the solicitation’s pricing only included labor rates, and the RFP stated that the agency reserved the right to conduct a price realism analysis, USCIS’s claim that the solicitation only requested labor rates could not serve as a justification for not electing to conduct a price realism analysis. Comments and Supp. Protest at 19, citing AR, Tab 32, Amended Price Analysis Report at 4. We agree, but, as discussed above, we find that the agency nevertheless asserted a reasonable basis for its decision. Although the agency compared the offerors’ proposed prices to the IGCE, see id., all proposed prices were lower than the IGCE, and the focus of the discussion was on the comparison of Amaze’s proposed price to the average price.