CrowderGulf, LLC; DRC Emergency Services, LLC; Phillips & Jordan, Inc.
Highlights
CrowderGulf, LLC, of Mobile, Alabama; DRC Emergency Services, LLC, of Metairie, Louisiana; and Phillips & Jordan, Inc., of Knoxville, Tennessee, protest the award of several indefinite-delivery, indefinite-quantity (IDIQ) contracts under request for proposals (RFP) No. W912EK18R0022, issued by the Department of the Army, Corps of Engineers (Corps), for debris management services. The protesters primarily challenge the agency's evaluation of proposals and resulting source selection decisions.
DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Decision
Matter of: CrowderGulf, LLC; DRC Emergency Services, LLC; Phillips & Jordan, Inc.
File: B-418693.9; B-418693.10; B-418693.11; B-418693.12; B-418693.13; B-418693.14; B-418693.15; B-418693.17
Date: March 25, 2022
William M. Jack, Esq., Amba M. Datta, Esq., and Ken M. Kanzawa, Esq., Kelley Drye & Warren LLP, for CrowderGulf, LLC; David R. Hazelton, Esq., Kyle R. Jefcoat, Esq., Julia Lippman, Esq., and Joshua Craddock, Esq., Latham & Watkins LLP, for DRC Emergency Services, LLC; Robert J. Symon, Esq., Patrick R. Quigley, Esq., and Nathaniel J. Greeson, Esq., Bradley Arant Boult Cummings LLP, for Phillips & Jordan, Inc., the protesters.
Neil H. O'Donnell, Esq., Jeffrey M. Chiow, Esq., Lucas T. Hanback, Esq., and Stephen L. Bacon, Esq., Rogers Joseph O'Donnell, PC, for AshBritt, Inc.; Richard B. Oliver, Esq., and J. Matthew Carter, Esq., Pillsbury Winthrop Shaw Pittman LLP, for ECC Constructors, LLC, the intervenors.
Tristan S. Brown, Esq. and Rebecca E. Martinez, Esq., Department of the Army, for the agency.
Christopher Alwood, Esq., and Christina Sklarew, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.
DIGEST
1. Protests challenging the agency’s evaluation of proposals under the solicitation’s technical/management approach, past performance, and price evaluation factors are denied where the evaluation was reasonable and consistent with the solicitation’s criteria and the protesters could not establish that they were competitively prejudiced by the agency’s actions.
2. Protests challenging the agency’s comparative analysis and source selection decisions under the solicitation’s best-value tradeoff source selection scheme are denied where the agency’s comparative analysis and source selection decisions were reasonable, adequately documented, and consistent with the terms of the solicitation.
DECISION
CrowderGulf, LLC, of Mobile, Alabama; DRC Emergency Services, LLC, of Metairie, Louisiana; and Phillips & Jordan, Inc., of Knoxville, Tennessee, protest the award of several indefinite-delivery, indefinite-quantity (IDIQ) contracts under request for proposals (RFP) No. W912EK18R0022, issued by the Department of the Army, Corps of Engineers (Corps), for debris management services. The protesters primarily challenge the agency’s evaluation of proposals and resulting source selection decisions.
We deny the protests.
BACKGROUND
On May 2, 2019, the Corps issued the RFP as a partial small business set-aside, seeking to establish twenty IDIQ contracts for debris management operations after natural or man-made disasters across the United States. Contracting Officer’s Statement (COS) at 2. The RFP contemplated two groups of awards, one restricted to small businesses and the other open to all offerors, with each award to be made on a regional basis. [1] Agency Report (AR), Tab 2.20, RFP at 11. The solicitation defined twelve small business set-aside regions and eight unrestricted regions, specifying that the agency was to award a single IDIQ contract for each region.[2] Id. at 11‑13. The instant protests concern the Corps’s evaluation and award decisions for six of the eight unrestricted regions.[3]
The RFP required offerors to submit separate and complete proposals for each region in which the offerors wished to be considered for award and provided that the agency would evaluate each proposal separately. Id. at 2. The RFP did not limit the number of regional contracts a single offeror could be awarded. Id. The RFP provided for award on a best-value tradeoff basis, considering the following non‑price factors: (1) technical/management approach, (2) past performance, and (3) small business participation. Id. at 14. For the purpose of performing the best-value tradeoff, the technical/management approach factor and past performance factor were of equal importance, and each was more important than small business participation. Id. The RFP provided that, when combined, the non-price factors were approximately equal in importance to price. Id.
The agency was to evaluate proposals under the technical/management approach factor considering the adequacy of each offeror’s response to the RFP’s requirements, and the feasibility of the offeror’s proposed approach. Id. at 15. Specifically, the RFP provided that the agency would evaluate each offeror’s staffing approach, deployment/mobilization plan, management/operations plan, and safety plan. Id. at 16‑18. As relevant here, the RFP provided that, as part of the evaluation of offerors’ staffing approaches, the agency would consider “how major subcontractors fit into the overall staffing approach.” Id. at 16. The RFP provided that the agency would assign each proposal’s technical/management approach a combined technical/risk rating of outstanding, good, acceptable, marginal, or unacceptable. Id. at 15‑16.
The agency was to evaluate proposals under the past performance factor by considering past performance information provided by the offeror and the offeror’s references to determine the probability that the offeror would successfully perform the contract. Id. at 8‑9, 18‑20. The RFP instructed offerors to demonstrate their past performance by identifying at least three and no more than five completed projects.[4] Id. at 8. While the RFP provided that the agency could consider past performance information from other sources, it did not require the agency to do so. Id. at 9.
The RFP specified that the agency would evaluate past performance projects for recency, relevancy, and quality. Id. at 18‑19. The RFP considered projects completed within 12 years from the date of issuance of amendment 0013 to be recent. Id. at 18. The RFP noted that the agency could consider more recent past performance as more reflective of the offeror’s ability to successfully perform future debris removal requirements. Id. The agency was to evaluate a project’s relevancy by determining how relevant the submitted project was to the work required by the RFP.[5] Id.
The agency was to evaluate the quality of a proposed project “to determine how well the offeror performed on its past performance projects.” Id. at 19. The agency was to consider the “number, type, and severity” of any performance issues identified as well as the “effectiveness of corrective actions taken.” Id. The RFP provided that the agency would assign each proposal’s past performance an adjectival rating of substantial confidence, satisfactory confidence, neutral confidence, limited confidence, or no confidence. Id. at 20. As relevant here, the RFP provided that the agency would assess a rating of substantial confidence where it found, based on the offeror’s recent and relevant performance record, that the agency “has a high expectation that the offeror will successfully perform the required effort.” Id.
To evaluate the small business participation factor, the agency would assess all offerors’ small business participation plans. Id. at 20‑21. The agency would evaluate the small business participation plans to determine the extent of an offeror’s proposed participation of small businesses in the performance of the contract “relative to the objectives and goals” set forth in the solicitation. Id. at 20. The RFP provided that the agency would assign each proposal’s small business participation an adjectival rating of outstanding, good, acceptable, marginal, or unacceptable. Id. at 22.
The RFP specified that the total evaluated price for each proposal would be calculated based on the schedule of prices submitted by the offeror. Id. at 22. The RFP’s schedule of prices for each region included a set of estimated quantities for a “likely emergency event” to take place in that region. See AR, Tab 2.18, RFP amend. 0018, Schedule of Prices. The schedule of prices stated that:
The Government intends to multiply the estimated quantities in the Schedule of Prices by the rates proposed by each offeror for each [contract line item number (CLIN)] for the base period and each option year. The Government will then apply the escalation rates and add all prices for each year, base and options, to arrive at the total evaluated price for each region.
Id. at 1.
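The quoted methodology is straightforward arithmetic: for each contract line item, multiply the offeror's proposed unit rate by the region's estimated quantity, repeat for the base period and each option year, apply the escalation rates, and sum. The short Python sketch below illustrates one reading of that computation; the function name, data layout, and the treatment of escalation as a simple per-year multiplier are illustrative assumptions, not the agency's actual schedule-of-prices workbook.

```python
# Illustrative sketch of the total-evaluated-price arithmetic described in the
# schedule of prices. Names, data shapes, and the per-year escalation multiplier
# are assumptions for illustration only.

def total_evaluated_price(rates_by_year, estimated_quantities, escalation_by_year):
    """Sum of (estimated quantity x proposed CLIN rate), escalated, over all years."""
    total = 0.0
    for year, clin_rates in rates_by_year.items():
        year_price = sum(
            estimated_quantities.get(clin, 0) * rate
            for clin, rate in clin_rates.items()
        )
        total += year_price * escalation_by_year.get(year, 1.0)
    return total

# Hypothetical two-CLIN, two-year example.
rates = {"base": {"0001AA": 9.50, "0001AB": 12.00},
         "option1": {"0001AA": 9.50, "0001AB": 12.00}}
quantities = {"0001AA": 1_000_000, "0001AB": 250_000}   # notional estimated quantities
escalation = {"base": 1.00, "option1": 1.02}
print(f"${total_evaluated_price(rates, quantities, escalation):,.2f}")
```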
The RFP also provided that the agency would evaluate proposed pricing for reasonableness and balance. RFP at 22. The RFP specified that a materially unbalanced offer “may be rejected . . . if the contracting office determines that the lack of balance poses an unacceptable risk to the Government.” Id. The RFP also stated that the agency would not evaluate proposed prices for realism. Id. To be considered for award, offerors were required to affirmatively represent that they intended to comply with all Service Contract Act (SCA) wage rate requirements. Id. at 23.
The first closing date for receipt of proposals was June 24, 2019. AR, Tab 2.11, RFP amend. 0011 at 1. On April 1, 2020, the agency awarded contracts for each of the eight unrestricted regions. COS at 5. Following these awards, and before the agency had awarded any small business set-aside contracts under the RFP, four unsuccessful offerors filed protests with our Office, challenging the Corps’s evaluation and award decisions. Id. In response to those protests, the agency took corrective action, including amending the solicitation and allowing offerors to submit new proposals.[6] Id.
Following the agency’s corrective action, the closing date for receipt of new proposals was January 5, 2021. COS at 5; AR, Tab 2.16, RFP amend. 0016 at 2. On or before the January 5 due date, the agency received 37 offers for the regions protested here from at least seven different offerors.[7] COS at 5‑6; see AR, Tab 16.1, Final Price Evaluation Report Spreadsheets; AR, Tab 23a, DRC Unsuccessful Offeror Notice. After the initial evaluation of the new proposals, the agency entered into discussions with all offerors. See AR, Tab 10, Competitive Range Determination at 8. After the conclusion of discussions, on May 10, the Corps issued amendment 0019 to the RFP, which clarified certain solicitation language and set the due date for the receipt of final proposals to May 28. AR, Tab 2.19, RFP amend. 0019 at 1. The agency timely received revised final proposals from all offerors, including the protesters and intervenors here. COS at 6.
The agency evaluated the awardees’ and protesters’ final proposals for the protested regions as follows:
Region 1

| Offeror | Technical/Management Approach | Past Performance | Small Business Participation | Price |
|---|---|---|---|---|
| AshBritt | Good | Substantial | Outstanding | $195,787,201 |
| DRC | Outstanding | Satisfactory | Outstanding | $231,640,831 |
| CrowderGulf | Acceptable | Satisfactory | Outstanding | $262,350,190 |

Region 2

| Offeror | Technical/Management Approach | Past Performance | Small Business Participation | Price |
|---|---|---|---|---|
| AshBritt | Good | Substantial | Outstanding | $186,115,650 |
| DRC | Outstanding | Substantial | Outstanding | $231,758,494 |
| CrowderGulf | Acceptable | Substantial | Outstanding | $253,649,300 |

Region 4

| Offeror | Technical/Management Approach | Past Performance | Small Business Participation | Price |
|---|---|---|---|---|
| ECC | Outstanding | Substantial | Outstanding | $213,560,874 |
| DRC | Outstanding | Substantial | Outstanding | $303,421,093 |
| CrowderGulf | Acceptable | Substantial | Outstanding | $265,295,595 |

Region 6

| Offeror | Technical/Management Approach | Past Performance | Small Business Participation | Price |
|---|---|---|---|---|
| AshBritt | Good | Substantial | Outstanding | $186,115,650 |
| DRC | Outstanding | Substantial | Outstanding | $223,499,214 |
| CrowderGulf | Acceptable | Substantial | Outstanding | $252,671,350 |

Region 7

| Offeror | Technical/Management Approach | Past Performance | Small Business Participation | Price |
|---|---|---|---|---|
| AshBritt | Good | Substantial | Outstanding | $186,115,650 |
| DRC | Outstanding | Substantial | Outstanding | $222,006,695 |
| CrowderGulf | Acceptable | Substantial | Outstanding | $252,687,589 |
| Phillips & Jordan | Good | Substantial | Outstanding | $232,336,438 |

Region 8

| Offeror | Technical/Management Approach | Past Performance | Small Business Participation | Price |
|---|---|---|---|---|
| ECC | Outstanding | Satisfactory | Outstanding | $256,243,015 |
| DRC | Outstanding | Substantial | Outstanding | $402,288,073 |
AR, Tab 15.1, SSAC Region 1 Report at 3; Tab 15.2, SSAC Region 2 Report at 3; Tab 14, Final SSEB Report at 223‑224; Tab 15.6, SSAC Region 6 Report at 3; Tab 15.7, SSAC Region 7 Report at 3; Tab 15.8, SSAC Region 8 Report at 3.[8]
For each region, the SSA independently assessed proposals and reviewed the SSEB and SSAC reports. See, e.g., AR, Tab 19.1, Region 1 Source Selection Decision Document (SSDD) at 1‑6, 18. The SSA concurred with the SSEB’s evaluation and the SSAC’s recommendations, and, as relevant to the instant protests, selected AshBritt for award in regions 1, 2, 6, and 7, and ECC for award in regions 4 and 8. COS at 5‑6. On November 12, 2021, the Corps notified the protesters that they had not been selected for award. See, e.g., AR, Tab 23a, DRC Unsuccessful Offeror Notice. The agency provided debriefings that concluded on December 13, and these protests followed.
DISCUSSION
The protesters generally challenge the agency’s evaluation of proposals, conduct of discussions, and resulting source selection decisions. We note that the protesters raise many collateral arguments. While our decision does not specifically address every argument, we have reviewed all the arguments and conclude that none provides a basis to sustain the protests. We discuss several representative issues below.
As an initial matter, we dismiss several protest grounds that were not suitable for consideration on the merits. For example, CrowderGulf’s initial protest alleged that, based on the awardees’ low pricing, the Corps had awarded contracts that would allow the awardees to adjust their pricing during performance, contrary to the RFP’s requirement to propose fixed prices. CrowderGulf Protest at 16. The agency provided a detailed response to this protest allegation. Memorandum of Law (MOL) at 33. In its comments, CrowderGulf did not rebut or address the agency’s arguments. See CrowderGulf Comments & Supp. Protest. Accordingly, we dismiss the protest grounds on which CrowderGulf did not comment as abandoned. See Tec-Masters, Inc., B‑416235, July 12, 2018, 2018 CPD ¶ 241 at 6.
DRC alleges that the agency provided offerors with unequal access to information. DRC Protest at 10‑11; DRC Comments at 17‑20. DRC contends that, following the original awards made on April 1, 2020, the Corps improperly disclosed the original awardees’ unit-level pricing--including DRC’s--to all offerors in post-award debriefings and unreasonably failed to “level the playing field” by disclosing the original disappointed offerors’ unit-level pricing as part of its corrective action. Id.
Our prior decisions have considered the timing of protests challenging the propriety of an agency’s corrective action. See, e.g., Quotient, Inc., B‑416473.4, B‑416473.5, Mar. 12, 2019, 2019 CPD ¶ 106 at 4. We have considered a challenge to the ground rules under which the agency will conduct its corrective action and recompetition to be analogous to a challenge to the terms of a solicitation, thus involving a basis for protest that must be raised prior to the closing time for receipt of proposals. Odyssey Systems Consulting Group, Ltd., B-418440.8, B‑418440.9, Nov. 24, 2020, 2020 CPD ¶ 385 at 5‑6; see Domain Name Alliance Registry, B‑310803.2, Aug. 18, 2008, 2008 CPD ¶ 168 at 7‑8.
DRC maintains that our timeliness rules should not “be applied overly broadly” to the facts here, because DRC “raised the issue directly with the agency prior to proposal submission.” DRC Protest at 10 n.1. In support of its argument, DRC cites to the U.S. Court of Appeals for the Federal Circuit’s recent decision in Harmonia Holdings Group, LLC v. United States, 20 F.4th 759 (Fed. Cir. 2021). In Harmonia, the Federal Circuit held that the Blue & Gold waiver rule[9] did not apply where a disappointed offeror had challenged alleged solicitation defects in a timely pre-award agency-level protest. Id. at 767. The Court in Harmonia found that the protester’s pre-award protest preserved its challenges and provided “notice to interested parties,” allowing the protester to raise the same solicitation defects in a post-award protest before the U.S. Court of Federal Claims more than 5 months after the closing date for receipt of proposals. Id. DRC avers that a May 29, 2020 letter to the Corps and subsequent questions submitted to the agency regarding access to other offerors’ prior proposed pricing put the agency on sufficient notice that DRC may raise such challenges in a post-award protest. DRC Protest at 10 n.1; DRC Comments at 18‑20. However, DRC fails to recognize that, in this instance, our timeliness rules are different from those at the Court of Federal Claims.
Our Bid Protest Regulations contain strict rules for the timely submission of protests. These rules reflect the dual requirements of giving parties a fair opportunity to present their cases and resolving protests expeditiously without unduly disrupting or delaying the procurement process. Verizon Wireless, B‑406854, B‑406854.2, Sept. 17, 2012, 2012 CPD ¶ 260 at 4. Our timeliness rules specifically require that a protest based upon alleged improprieties in a solicitation that are apparent prior to the closing time for receipt of initial proposals be filed before that time. 4 C.F.R. § 21.2(a)(1). However, if a timely agency-level protest was previously filed, any subsequent protest to our Office must be filed within 10 days of actual or constructive knowledge of initial adverse agency action. 4 C.F.R. § 21.2(a)(3).
The facts here are distinguishable from those in Harmonia in that DRC has not alleged or demonstrated that it previously filed a timely agency-level protest. In Harmonia, the Court noted that the protester’s pre-award agency-level protest was a “timely, formal challenge of the solicitation” before the agency. 20 F.4th at 767. In contrast, DRC relies first on a letter that allegedly sought “to ensure that ‘DRC has received all of the same information provided to other offerors.’” DRC Comments at 18, 20 (quoting DRC May 29, 2020 Letter to the Agency).[10] DRC also points to a question it submitted to the agency in response to amendment 0016 asking how the Corps would ensure that some offerors did not have access to information “that could give them an unfair competitive advantage.” AR, Tab 24.2, RFP amend. 0016, Questions and Answers (Q&A) at 10.
Our Office has consistently stated that, to be regarded as an agency-level protest, a written statement must convey the intent to protest by a specific expression of dissatisfaction with the agency's actions and a request for relief. Masai Techs. Corp., B‑400106, May 27, 2008, 2008 CPD ¶ 100 at 3; ILC Dover, Inc., B‑244389, Aug. 22, 1991, 91‑2 CPD ¶ 188 at 2. In contrast, we have explained that a letter that merely expresses a suggestion, hope, or expectation, does not constitute an agency-level protest. Masai Techs. Corp., supra.
Here, we do not see how the communications proffered by DRC in its pleadings are more than an expression of an expectation of equal access to information. We find that these communications did not convey an intent to protest the agency’s actions and therefore do not constitute a formal agency-level protest that preserved its solicitation challenges. Accordingly, on this record, we see no basis to apply the agency-level protest exception at 4 C.F.R. § 21.2(a)(3) and hold DRC to the timeliness rules for protests of apparent solicitation defects in 4 C.F.R. § 21.2(a)(1).[11]
As noted above, the next closing date for receipt of proposals after the agency undertook its corrective action was January 5, 2021. COS at 5; AR, Tab 2.16, RFP amend. 0016 at 2. Accordingly, we dismiss this protest ground as untimely where it was not raised until December 17, 2021, almost a year after the closing time for receipt of proposals.[12]
DRC also alleges for the first time in its comments that the agency engaged in misleading discussions after conducting an impermissible price realism analysis in violation of the terms of the solicitation. DRC Comments at 14‑16. Specifically, DRC contends that the agency evaluated DRC’s pricing for CLIN [DELETED] as understated when reviewing pricing for balance, then asked DRC in discussions to explain how it arrived at that price in its proposals for regions 1, 2, 4, 6, 7, and 8. Id. at 15. DRC argues that asking for such information regarding a low-priced CLIN demonstrates that the agency was actually concerned that DRC’s price was too low, which would constitute a price realism analysis. Id. DRC alleges that it was competitively prejudiced because the agency’s discussion questions caused it to raise its prices to address the stated concerns. We dismiss this protest ground as untimely filed.
Our timeliness rules provide that protests, other than those based on alleged solicitation improprieties, shall be filed not later than 10 days after the basis of the protest is known or should have been known, with the exception of protests challenging a procurement “under which a debriefing is requested and, when requested, is required.” 4 C.F.R. § 21.2(a)(2). Here, the record is clear that DRC had all the information it needed to raise this protest ground at the time of its debriefing.
DRC received the evaluation notices which form the basis for its protest on April 27, 2021. AR, Tab 29.3, DRC Discussion Letters. Further, DRC was aware of its pricing disadvantages--both generally and on a line-item basis--once it received its debriefing from the agency. AR, Tab 21.1, DRC Debriefing (disclosing the awardees’ line-item pricing for regions 1, 2, 4, 6, 7, and 8). DRC does not point to, and our review of the record does not reveal, any new information produced in the agency report which forms a basis for this protest ground. Accordingly, DRC was required to raise this protest ground within 10 days of its requested and required debriefing. We therefore dismiss this protest ground as untimely where DRC failed to raise it until February 8, 2022, more than 10 days after the agency concluded debriefings on December 13, 2021. See COS at 7.
General Evaluation Challenge
Turning to the substantive allegations, DRC generally challenges the agency’s evaluation of proposals, alleging that the agency failed to conduct its evaluation in accordance with the solicitation’s requirement that each “proposal submitted for each region will be evaluated separately.” DRC 2nd Supp. Protest at 3 (citing RFP at 2). In this regard, DRC objects to the agency’s use of a single SSEB to evaluate proposals for all eight regions. Id. DRC argues that the fact that the SSEB created a single omnibus consensus evaluation document covering its evaluation of all regions, and that the evaluation often used repeated language when describing the same offeror’s proposals for different regions, demonstrates that the regional proposals were not “evaluated separately.” Id. at 3‑8.
The agency responds that its evaluation was reasonable and consistent with the solicitation’s requirement that each regional proposal be evaluated separately. Supp. MOL at 9‑15. The agency contends that nothing in the solicitation prevented it from having the same SSEB evaluate the proposals for all eight regions. Id. at 9. The agency also argues that it was reasonable to use the same evaluation language when describing similar aspects of an offeror’s proposals for different regions.[13] Id. at 10. Finally, the agency states that it clearly evaluated the proposals separately in the context of the region for which they were submitted, as demonstrated by its assessment of significant strengths to DRC’s proposals for seven different regions for providing a detailed approach for “the entire debris removal process specific to” that region. Id. at 11 (citing AR, Tab 14, SSEB Report at 70, 132, 206, 280, 464, 551, 615).
In reviewing a protest challenging an agency’s evaluation, our Office will not reevaluate proposals, nor substitute our judgment for that of the agency, as the evaluation of proposals is a matter within the agency’s discretion. Rather, we will review the record to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations. AECOM Mgmt. Servs., Inc., B‑417639.2, B‑417639.3, Sept. 16, 2019, 2019 CPD ¶ 322 at 9. A protester’s disagreement with the agency’s judgment, without more, is insufficient to establish that the agency acted unreasonably. Vertex Aerospace, LLC, B‑417065, B‑417065.2, Feb. 5, 2019, 2019 CPD ¶ 75 at 8.
Our review of the record confirms that a single SSEB created a single consensus evaluation document that contains the evaluation of each proposal in all eight regions. AR, Tab 14, SSEB Report. However, the SSEB report separately documents the evaluation of proposals for each different region. See, e.g., Id. at 14‑75 (documenting the SSEB’s consensus evaluation for region 1). Further, the record shows that almost every other step of the procurement was broken out into separate documents by region, including separate SSAC reports, supplemental unbalanced price analysis reports, price reasonableness determinations, and source selection decision documents.[14]
On this record, we see no basis to sustain this protest ground. DRC does not point to, and the record does not reveal, anything in the RFP that would require the agency to have different SSEBs evaluate different regions’ proposals. Further, DRC has not explained why a single SSEB cannot reasonably perform the evaluation for several separate procurements, regardless of whether the procurements are conducted through multiple solicitations, or--as here--under one. Also, while DRC points to several sections of repeated language in the SSEB’s evaluations addressing a single offeror’s proposals from different regions, see DRC 2nd Supp. Protest at 4‑8, DRC has not demonstrated that these identical evaluation conclusions derive from dissimilar proposal language. We conclude that the protester’s arguments here constitute nothing more than disagreement with the manner in which the agency undertook its separate proposal evaluations, and deny this ground of protest on that basis.
Unbalanced Price Analysis
CrowderGulf and DRC both raise a variety of challenges to the agency’s evaluation of pricing for balance. CrowderGulf Protest at 15‑24; CrowderGulf Comments & Supp. Protest at 3‑12; CrowderGulf Supp. Comments at 3‑12; DRC Protest at 11-16; DRC 2nd Supp. Protest at 8‑14; DRC Comments at 3‑14; DRC Supp. Comments at 4‑9. We have reviewed the protesters’ arguments and the price evaluation record, and find that none provides a basis to sustain the protest. As discussed below in a few representative examples, we find the agency’s evaluation of price for balance was reasonable and in accordance with the terms of the solicitation.[15]
As a general matter, unbalanced pricing exists when, despite an acceptable total evaluated price, the price of one or more contract line items is significantly overstated or understated. Federal Acquisition Regulation (FAR) 15.404‑1(g)(1). With respect to unbalanced pricing generally, the FAR requires that contracting officers analyze offers with separately-priced line items or subline items, to detect unbalancing. FAR 15.404‑1(g)(2). While both understated and overstated prices are relevant to the question of whether unbalanced pricing exists, the primary risk to be assessed in an unbalanced pricing context is the risk posed by overstatement of prices because, absent a price realism provision,[16] low prices (even below-cost prices) are not improper and do not themselves establish (or create the risk inherent in) unbalanced pricing. See Crown Point Systems, B‑413940, B‑413940.2, Jan. 11, 2017, 2017 CPD ¶ 19 at 5; Mancon, LLC, B‑417571.5, B‑417571.6, May 12, 2020, 2020 CPD ¶ 169 at 11. Our Office reviews the reasonableness of an agency’s determination about whether a firm’s prices are unbalanced, and an agency’s determination as to whether the unbalanced prices pose an unacceptable risk. Triumvirate Envtl., Inc., B‑406809, Sept. 5, 2012, 2012 CPD ¶ 244 at 5.
Here, the agency reports that it analyzed proposed prices for balance by first comparing each proposal’s line-item pricing to the independent government estimate (IGE) and to the other proposed prices received. COS at 26‑27. In this regard, the agency created a set of spreadsheets for each region comparing all the proposed CLIN prices, then calculated the maximum proposed price, minimum proposed price, the mean, the average of the minimum and maximum prices, the median, and the standard deviation for each set of proposed CLIN prices. See AR, Tab 11, Initial Price Evaluation Spreadsheets. The agency then highlighted those proposed CLIN prices that fell outside one or two standard deviations from the mean.[17] Id. With this data in hand, the agency reviewed the CLIN pricing for all regions to determine if it considered any of the proposed pricing to be over or understated. AR, Tab 11, Initial Price Analysis; AR, Tab 11.1, Contracting Officer Analysis of CLINs for Unbalance Risk. This analysis included comparing proposed prices within a proposal and between proposals submitted by the same offeror for different regions. AR, Tab 28, Decl. of Agency Price Evaluator at ¶ 12‑13.
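As described, this first screening step amounts to simple descriptive statistics applied CLIN by CLIN across the competing proposals. The Python sketch below shows the general idea; the helper name, data shape, flag labels, and example prices are hypothetical and are not drawn from the evaluation record.

```python
# Hedged sketch of the CLIN-by-CLIN screening described above: compute summary
# statistics across all offerors' proposed prices for one CLIN and flag prices
# falling more than one or two standard deviations from the mean.
import statistics

def screen_clin_prices(prices_by_offeror):
    """prices_by_offeror: {offeror name: proposed unit price} for a single CLIN."""
    values = list(prices_by_offeror.values())
    mean = statistics.mean(values)
    stdev = statistics.stdev(values) if len(values) > 1 else 0.0
    stats = {
        "min": min(values),
        "max": max(values),
        "mean": mean,
        "avg of min/max": (min(values) + max(values)) / 2,
        "median": statistics.median(values),
        "std dev": stdev,
    }
    flags = {
        offeror: ">2 std dev" if abs(price - mean) > 2 * stdev
        else ">1 std dev" if abs(price - mean) > stdev
        else ""
        for offeror, price in prices_by_offeror.items()
    }
    return stats, flags

# Hypothetical prices for one CLIN.
stats, flags = screen_clin_prices(
    {"Offeror A": 8.10, "Offeror B": 8.75, "Offeror C": 9.20, "Offeror D": 14.60})
print(stats)
print(flags)
```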
The agency then conducted discussions with offerors, which included informing each offeror if the agency considered any of their proposed pricing to be potentially unbalanced. See, e.g., AR, Tab 29.3, DRC Factor 4 Evaluation Notice for Region 1 (informing DRC that “CLINs 0002CB-0002CK . . . appear to be overstated”). After receipt of final proposals, the agency again analyzed and compared the proposed prices against each other and the IGE. See AR, Tab 16, Final Price Analysis and Price Evaluation Spreadsheets. The agency then analyzed whether it considered any of the final proposed CLIN prices to be unbalanced or whether award at the proposed prices would result in paying unreasonably high prices for contract performance. See, e.g., AR Tab 12.1, Region 1 Pre-Negotiation Objective Memorandum (POM)/Price Negotiation Memorandum (PNM) at 51‑54.
Further, after conducting the above analysis and determining that all proposed prices were sufficiently balanced, the agency undertook a supplemental unbalanced price analysis to investigate whether it would pay unreasonably high prices if events other than the likely scenario event used to evaluate price were to occur. See, e.g., AR, Tab 17.1, Supp. Unbalanced Price Analysis for Region 1 at 2. The agency recognized that its methodology for calculating a total evaluated price--where it created estimated ordering quantities based on a most likely to occur disaster event--resulted in several CLINs in each region not being included in the total evaluated price. Id. To address this, the agency calculated alternate estimated quantities and evaluated prices for the proposals based on other likely disaster events in the relevant region. As a result of this supplemental analysis, the agency found that alternate disaster scenarios in each region would not result in the agency paying unreasonably high prices for contract performance. AR, Tabs 17.1‑17.8, Supp. Unbalanced Price Analysis for Regions 1, 2, 4, 6, 7, and 8.
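Conceptually, the supplemental analysis re-prices each offeror's proposed rates against alternate quantity scenarios so that CLINs carrying a zero estimated quantity in the primary scenario still receive scrutiny. The sketch below illustrates that idea under hypothetical names, scenarios, and rates; it is not the agency's methodology in detail.

```python
# Hedged illustration: evaluate each offeror's CLIN rates under alternate
# quantity scenarios (other plausible disaster events), including quantities
# for CLINs that were zero in the primary "likely event" scenario.

def price_under_scenario(clin_rates, scenario_quantities):
    return sum(scenario_quantities.get(clin, 0) * rate
               for clin, rate in clin_rates.items())

offeror_rates = {
    "Offeror X": {"0001AA": 9.00, "0001AB": 11.00, "0002AA": 40.00},
    "Offeror Y": {"0001AA": 10.50, "0001AB": 12.50, "0002AA": 18.00},
}
scenarios = {
    "likely event":    {"0001AA": 1_000_000, "0001AB": 250_000, "0002AA": 0},
    "alternate event": {"0001AA": 400_000, "0001AB": 100_000, "0002AA": 50_000},
}
for scenario, quantities in scenarios.items():
    for offeror, rates in offeror_rates.items():
        total = price_under_scenario(rates, quantities)
        print(f"{scenario:>15} | {offeror} | ${total:,.2f}")
```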
First, CrowderGulf and DRC contend that the agency unreasonably used standard deviation calculations across proposals in its balanced pricing analysis in an overly mechanical way that resulted in unbalanced prices being overlooked. CrowderGulf Comments & Supp. Protest at 4‑8; DRC Comments at 12‑13. In support of their arguments, both protesters rely on our Office’s decision in Multimax, Inc. for the proposition that an analysis based on whether prices fall outside two standard deviations of the mean of proposed prices is mechanical and improper.[18] See CrowderGulf Comments & Supp. Protest at 4 (citing Multimax, Inc., B‑298249.6 et al., Oct. 24, 2006, 2006 CPD ¶ 165 at 11). In this regard, the protesters allege that the agency failed to exercise judgment and independently assess whether the results of such calculations reflect reasonable pricing. Id.
The agency responds that its evaluation was reasonable and can be “clearly distinguished” from Multimax because the price evaluators used the standard deviation calculations “as a starting point” for further analysis and consideration of potentially unbalanced pricing. Supp. MOL at 28‑31. The agency explains that the standard deviation calculations provided a helpful way to begin analyzing the more than 2000 separate CLIN prices in the regions relevant to this protest. This step was followed, however, by further analysis such as comparing proposed CLIN prices within a proposal, comparing CLIN prices in different proposals by the same offeror, and considering the assumptions underlying the prices that the offerors included in their proposals. Id. at 30.
Here, the record is clear that the agency did more than mechanically apply the results of its standard deviation calculations. By way of one example, the agency issued evaluation notices stating that it was concerned certain CLIN pricing might be overstated for CLINs that were not calculated to be more than one standard deviation from the mean. Compare AR, Tab 12.4, Region 4 POM/PNM at 31 (informing ECC that CLINs 0002AI-0002AN “appears to be overstated”), with AR, Tab 11, Initial Price Evaluation Spreadsheets (calculating that only one of the six proposed prices for CLINs 0002AI-0002AN was more than a standard deviation from the mean). CrowderGulf and DRC do not explain how such findings of potentially overstated pricing would be possible if the agency mechanically relied only on its standard deviation calculation.
Further, and as discussed above, the record shows that the agency also compared proposed CLIN pricing within proposals for balance and considered whether any of the prices could result in the agency paying unreasonably high prices during performance, including analyzing alternate scenarios to determine the impact of different possible ordering quantities for zero estimated quantity CLINs. AR, Tabs 12.1‑12.8, POM/PNMs for Regions 1, 2, 4, 6, 7, and 8; AR, Tabs 17.1‑17.8, Supp. Unbalanced Price Analysis for Regions 1, 2, 4, 6, 7, and 8. On this record, we find no basis to object to the agency’s use of standard deviation calculations[19] and find that its unbalanced pricing analysis was not mechanical. This protest ground is denied.
CrowderGulf also argues that the agency’s evaluation of price for balance was improper where it failed to evaluate allegedly understated line item prices for performance risk in accordance with FAR section 15.404-1(g). CrowderGulf Protest at 15‑16; CrowderGulf Comments and Supp. Protest at 7, n.4. Analyzing understated line items for performance risk requires the agency to consider whether a proposed price is too low to accomplish the required work--i.e., perform a price realism analysis. Here, the solicitation establishes that the agency would not conduct any price realism analysis. RFP at 22‑23. As noted above, “absent a price realism provision, there is nothing objectionable in an offeror’s proposal of low, or even below-cost, prices.” Mancon, LLC, supra at 11. Accordingly, we see no basis to conclude that the agency should have evaluated understated line-item prices for performance risk and deny this ground of protest.
The protesters proffered various challenges to the manner in which the agency conducted its pricing analysis, but none of these challenges demonstrates that the result of the analysis was unreasonable. As mentioned above, the ultimate question in an unbalanced pricing analysis is not whether any prices are unbalanced, but whether any such unbalanced pricing poses an unacceptable risk that the agency will pay an unreasonably high price during contract performance. See Triumvirate Envtl., Inc., supra, at 5. While the protesters argue that some overstated prices could result in the agency paying more than anticipated, and that the awardees’ evaluated price advantages could disappear, they do not establish that any specific awardee proposed prices that will result in the agency paying an unreasonably high price. In short, based on our review of the record and our conclusions above, we find that the agency’s unbalanced price analysis was reasonable and in accordance with the terms of the solicitation and FAR section 15.404‑1(g). Accordingly, we see no basis to disagree with the agency’s conclusions that the awardees’ prices are not materially unbalanced and that the risk of paying unreasonably high prices during contract performance is low.
Technical/Management Approach Evaluation
Phillips & Jordan and CrowderGulf challenge several aspects of the agency’s evaluation under the technical/management approach factor. Mainly, the protesters challenge the agency’s assessment of strengths, arguing both that the agency treated offerors disparately in its assessment of certain strengths and that it failed to recognize other, additional strengths in the proposals. As discussed below in a few representative examples, we find the agency’s evaluation of the proposals’ technical/management approach was reasonable and in accordance with the terms of the solicitation.
For example, both Phillips & Jordan and CrowderGulf allege that their technical/management approaches were evaluated unequally when compared to AshBritt’s. Phillips & Jordan Supp. Protest at 6‑9; CrowderGulf Comments & Supp. Protest at 26‑27. The protesters complain that while AshBritt was assigned a strength for its detailed explanation of how it planned to utilize two tools--an automated debris management system (ADMS) and [DELETED]--to provide real-time mission monitoring and quality control, neither protester was assessed a strength for what they contend are virtually identical proposal features. Phillips & Jordan Supp. Comments at 6‑9; CrowderGulf Supp. Comments at 31‑32.
In response, the agency argues that AshBritt’s proposal contained a “superior” explanation of its approach to utilize ADMS and [DELETED]. Supp. MOL at 7‑8. Further, the agency argues that there are aspects of AshBritt’s proposal that the other proposals do not contain. Id. at 46‑48. In support of its argument, the agency points to the level of detail concerning the process of utilizing ADMS and [DELETED] in AshBritt’s proposals, and notes that the strength, in part, reflects that these tools could help reduce the wait time for trucks during mobilization due to [DELETED]. Id. at 7‑8, 48.
It is a fundamental principle of federal procurement law that a contracting agency must treat all offerors equally and evaluate their proposals evenhandedly against the solicitation’s requirements and evaluation criteria. Abacus Tech. Corp.; SMS Data Prods. Grp., Inc., B‑413421 et al., Oct. 28, 2016, 2016 CPD ¶ 317 at 11. Where a protester alleges unequal treatment in a technical evaluation, it must show that the differences in the evaluation did not stem from differences between the proposals. Nexant Inc., B‑417421, B‑417421.2, June 26, 2019, 2019 CPD ¶ 242 at 10. Phillips & Jordan and CrowderGulf have not made such a showing here.
The contemporaneous record shows that the agency assessed a strength to AshBritt’s region 7 proposal for more than its “means and methods to provide real-time mission execution monitoring” and the “detailed synopsis and example screen shots” of the ADMS and [DELETED] tools AshBritt planned to use. AR, Tab 14, SSEB Report at 499. The strength also described several of the tools’ features discussed in AshBritt’s proposal and specifically noted the visualization capabilities of the tools’ various views and dashboard page displays. Id. AshBritt’s region 7 proposal contains a description of its ADMS and [DELETED] tools that includes more than 20 screenshots demonstrating the features described in the proposal. AR, Tab 4.7, AshBritt Region 7 Proposal, Volume 1 at 39‑49. While both Phillips & Jordan and CrowderGulf propose a required ADMS tool, neither provides the same level of examples or visualizations of the tools’ capabilities. AR, Tab 7.7, Phillips & Jordan Region 7 Proposal, Volume 1 at 40‑49; AR, Tab 5.7, CrowderGulf Region 7 Proposal, Volume 1 at 62‑72. Further, CrowderGulf does not propose to utilize the [DELETED] tool that was specifically referenced in the assessment of AshBritt’s strength and, while Phillips & Jordan does, its proposal discusses the [DELETED] tool in only a single paragraph, without providing the same visual detail as to the tool’s capabilities. AR, Tab 7.7, Phillips & Jordan Region 7 Proposal, Volume 1 at 48.
On this record, we find unobjectionable the agency’s conclusion that AshBritt’s proposal included a more detailed synopsis with more visual examples of its plan to utilize ADMS and [DELETED]. The record here demonstrates that the differences in the assessments of strengths stem from the differences in the details found in the proposals. Accordingly, we deny these grounds of protest.
Phillips & Jordan also alleges that the agency’s evaluation of AshBritt’s proposal for region 7 was unreasonable because it failed to account for the performance risk presented by AshBritt’s reliance on subcontractors. Phillips & Jordan Supp. Protest at 2‑6; Phillips & Jordan Supp. Comments at 2‑6. In this regard, Phillips & Jordan contends that AshBritt has only between [DELETED] and [DELETED] full-time employees, and is otherwise reliant on subcontractors to perform. Id. Phillips & Jordan argues that the agency’s failure to downgrade AshBritt’s proposal for the increased performance risk posed by its reliance on subcontractors demonstrates that the agency must not have considered performance risk at all. Id.
As noted above, our review of an evaluation challenge is to determine whether the agency’s evaluation was reasonable and consistent with the stated evaluation criteria and with applicable procurement statutes and regulations. AECOM Mgmt. Servs., Inc., supra. A protester’s disagreement with the agency’s judgment, without more, is insufficient to establish that the agency acted unreasonably. Vertex Aerospace, LLC, supra.
Here, the RFP provided that the agency would evaluate the offerors’ staffing approaches, including the offerors’ explanations “of how major subcontractors fit into the overall staffing approach.” RFP at 16. The record demonstrates that the agency considered AshBritt’s extensive network of subcontractors when evaluating how subcontractors fit into AshBritt’s staffing approach and found that it met, but did not exceed, this evaluation criterion. AR, Tab 14, SSEB Report at 497 (noting AshBritt’s “list of more than [DELETED] subcontractors”). Further, the SSEB report shows that the agency reviewed the page of AshBritt’s proposal that documents its allegedly small full-time staff. Id. at 496 (noting that it reviewed pages 9-19 of AshBritt’s region 7 proposal); AR, Tab 4.7, AshBritt Region 7 Proposal, Volume 1 at 10 (listing only [DELETED] full-time AshBritt or “reserve” employee positions).
Here, we have reviewed the evaluation record and find no basis to question the agency’s assessments regarding any performance risk from AshBritt’s reliance on subcontractors. In this regard, we note that while the agency was required to consider AshBritt’s general staffing approach and how major subcontractors fit into the staffing approach, the protester does not point to, and our review of the solicitation does not reveal, any requirement that the agency evaluate whether an offeror was too reliant on subcontractors versus in-house employees. Regardless, the record here shows that the agency considered both the broad subcontractor network AshBritt uses and the relatively small in-house staff it maintains. In short, Phillips & Jordan’s disagreements with the agency’s evaluation judgments, without more, do not provide a basis to sustain its protest.
Past Performance Evaluation
The protesters challenge several aspects of the agency’s evaluation under the past performance factor. We have reviewed the protesters’ arguments and the evaluation record and find that none provides a basis to sustain their protests. As discussed below in a few representative examples, we find either that the agency’s evaluation of past performance was reasonable and in accordance with the solicitation or that the protesters’ arguments fail to demonstrate competitive prejudice.
CrowderGulf and Phillips & Jordan contend that it was unreasonable for the agency not to consider the awardees’ allegedly poor past performance that the protesters claim was apparent from news articles discussing ongoing litigation, settlement agreements, and investigations. CrowderGulf Protest at 35‑37; CrowderGulf Comments & Supp. Protest at 35‑36; Phillips & Jordan Protest at 16‑17; Phillips & Jordan Comments at 7. Specifically, the protesters allege that some of this information was too “close at hand” for the agency to ignore in its evaluation of an offeror’s past performance. CrowderGulf Comments & Supp. Protest at 35; Phillips & Jordan Comments at 7.
The agency responds that the information in question was not reasonably within the possession of the Corps’s evaluators during the past performance evaluation. The agency argues that its past performance evaluation was reasonable because, in accordance with the terms of the RFP, it was based on the past performance information submitted by the offerors in their proposals. See, e.g., MOL at 18-23.
An agency’s evaluation of past performance, including its consideration of the relevance, scope, and significance of an offeror’s performance history, is a matter of discretion which we will not disturb unless the agency’s assessments are unreasonable or inconsistent with the solicitation criteria. Metropolitan Interpreters & Translators, Inc., B‑415080.7, B‑415080.8, May 14, 2019, 2019 CPD ¶ 181 at 10; see also SIMMEC Training Sols., B‑406819, Aug. 20, 2012, 2012 CPD ¶ 238 at 4. A protester’s disagreement with the agency’s judgment does not establish that an evaluation was unreasonable. FN Mfg., LLC, B‑402059.4, B‑402059.5, Mar. 22, 2010, 2010 CPD ¶ 104 at 7.
Here, as noted above, the RFP required the agency to evaluate past performance information submitted by the offerors in the form of contractor performance assessment reports (CPARs) or past performance questionnaires. RFP at 18‑20. While the RFP provided that the agency could consider past performance information from other sources, it did not require the agency to do so. Id. at 9. While the protesters generally concede that the agency was not required to consider past performance information not included in the proposals, they argue that an exception exists for certain past performance information that is too “close at hand” for the agency to ignore. CrowderGulf Comments & Supp. Protest at 35; Phillips & Jordan Comments at 7.
We have recognized that in certain limited circumstances, an agency has an obligation (as opposed to the discretion) to consider “outside information” bearing on an offeror’s past performance when it is “too close at hand” to require the offerors to shoulder the inequities that spring from an agency’s failure to obtain and consider the information. International Bus. Sys., Inc., B‑275554, Mar. 3, 1997, 97‑1 CPD ¶ 114 at 5. However, our Office has not extended the “too close at hand” principle to apply to every case where an agency might conceivably find additional information regarding an offeror’s proposal. See U.S. Facilities, Inc., B‑293029, B‑293029.2, Jan. 16, 2004, 2004 CPD ¶ 17 at 12. Rather, our Office has generally limited application of this principle to situations where the alleged “close at hand” information relates to contracts for the same services with the same procuring activity, or information personally known to the evaluators. TRW, Inc., B‑282162, B‑282162.2, June 9, 1999, 99‑2 CPD ¶ 12 at 5; Leidos, Inc., B‑414773, B‑414773.2, Sept. 12, 2017, 2017 CPD ¶ 303 at 10.
Here, CrowderGulf and Phillips & Jordan fail to demonstrate that their protest allegations meet this standard. First, our review of the pleadings reveals that only one of the news stories purporting to contain relevant past performance information about the awardees relates to a U.S. Army Corps of Engineers procurement. See CrowderGulf Protest at 35. Further, while the article cited by CrowderGulf does seem to refer to the California wildfire clean-up project AshBritt uses as a past performance reference, see AR, Tab 4.7, AshBritt Proposal, Volume 2 at xviii, CrowderGulf does not demonstrate that the Corps had knowledge of the homeowner complaints referenced in the article.
Second, regardless of the Corps’s general knowledge of the complaints in the article, the protesters have failed to demonstrate, with any evidence in the record, that any of the agency evaluators involved in this procurement were personally aware of this allegedly negative past performance information. Consequently, the protesters have not demonstrated that the agency’s past performance evaluation was unreasonable for failing to seek out and consider this type of past performance information.[20] As a result, we deny these grounds of protest.
The protesters also contend that the agency’s evaluation of AshBritt’s past performance was unreasonable in some instances where the agency’s findings were not supported by AshBritt’s submitted past performance information. We have reviewed the protesters’ arguments and the evaluation record and find that, despite some minor errors in the documentation of its evaluation, the agency’s evaluation of AshBritt’s past performance is reasonable and in accordance with the terms of the solicitation. Further, we find that the protesters do not establish that they were prejudiced by these errors in the agency’s documentation.
As a representative example, CrowderGulf specifically argues that the agency erred in its evaluation of AshBritt’s third past performance project when it concluded that the submitted past performance questionnaires had “no performance issues listed.” CrowderGulf Comments & Supp. Protest at 36 (quoting AR, Tab 14, SSEB Report at 24, 86, 418, 505). CrowderGulf points to AshBritt’s past performance proposals, which include a questionnaire under project 3 that states “AshBritt initially struggled to pull resources on time for debris pick up.” CrowderGulf Comments & Supp. Protest at 36; see also, e.g., AR, Tab 4.7, AshBritt Region 7 Proposal, Volume 2 at cxiii. The protester also argues that the agency never considered whether AshBritt took effective steps to correct this performance problem, and that the agency’s evaluation failures here render the agency’s assignment of a rating of substantial confidence to AshBritt’s past performance unreasonable.
The agency responds that its past performance evaluation was reasonable and in accordance with the terms of the solicitation. Specifically, the agency argues that CrowderGulf’s focus on a single cherry-picked statement does not render the entire evaluation of this past performance project unreasonable. Supp. MOL at 52.
As noted above, the RFP required the agency, as part of its past performance evaluation, to consider the “number, type, and severity” of any performance issues identified as well as the “effectiveness of corrective actions taken.” RFP at 19. Here, the record shows that the past performance project at issue consisted of 17 submitted past performance questionnaires (PPQs), all of which contained ratings that were almost entirely “excellent” or “very good.” See AR, Tab 4.7, AshBritt Region 7 Proposal, Vol. 2 at xli‑cxiv. The protester is correct that there was a single performance issue noted in a single PPQ, but the protester fails to note that this same comment also states that AshBritt addressed the issue within two weeks and went on to have 5 years of successful performance. Id. at cxiii (explaining that the county had renewed AshBritt’s contract and the reviewer was “looking forward to another 5 successful years”).
On this record, especially in light of the RFP requirement that the agency consider how an offeror addressed any performance issues, we cannot conclude that the agency’s evaluation of this project was unreasonable. While the protester does find a single statement in the agency’s evaluation that is contradicted by the language of the proposal, the protester does not meaningfully allege that the agency’s conclusions regarding either the quality of this past performance project or its overall assessment of substantial confidence in AshBritt’s past performance were incorrect. Considering that the underlying statement in the PPQ at issue demonstrates that AshBritt quickly resolved this minor performance issue and performed successfully for years, we do not find that this single incorrect evaluation statement regarding one of 17 otherwise high-quality PPQs for the project renders the agency’s conclusion that it has high expectations that AshBritt will successfully perform the required effort to be unreasonable.
Similarly, CrowderGulf challenges the agency’s evaluation of AshBritt’s California wildfire past performance project. CrowderGulf Comments & Supp. Protest at 34‑35. In this regard, CrowderGulf argues that the SSA’s finding that “the CPAR[s] rating for the wildfire project shows Exceptional or Very Good” ratings is not supported by the actual CPARs. Id. The agency responds that the protester is objecting to “a minor scrivener’s error.” Supp. MOL at 51. The agency argues that the SSEB report accurately described the CPARs ratings as satisfactory to very good, and that the SSA clearly states that he agreed with and adopted the SSEB’s and SSAC’s evaluations, notwithstanding the SSA’s mistaken use of the words “exceptional or very good” for this single past performance reference. Id.
Here, the record is clear that the CPARs for this one project all had ratings of either satisfactory or very good, with no assessed ratings of exceptional in the CPAR evaluation areas. See, e.g., AR, Tab 4.7, AshBritt Region 7 Proposal, Vol. 2 at iii‑xxix. The record also shows that the reviewing official on each CPAR for this project concluded that “[p]erformance on this contract was exceptional.” Id. at vi, xi, xvi, xx, xxiii, xxvi, xxix. Accordingly, while we find the SSA’s description of the CPARs for this project was inaccurate, we do not find that the agency’s overall evaluation of this project was unreasonable. In this regard, the protester does not meaningfully allege that the actual CPAR ratings--satisfactory and very good as opposed to very good and exceptional--for this past performance project render the agency’s evaluation of this project unreasonable.[21] Further, CrowderGulf does not point to, and our review of the solicitation does not reveal, any requirement in the RFP that the agency consider a past performance project high quality only if it received certain past performance ratings in a CPAR or PPQ.[22] In short, while the protester may disagree with the agency’s judgments based on this documentation error, it has not demonstrated that the agency’s conclusions regarding the quality of AshBritt’s past performance are unreasonable or not in accordance with the terms of the solicitation. Accordingly, we deny this ground of protest.
We also find that the protesters have not demonstrated that they were prejudiced by these errors in the agency’s documentation of its past performance evaluation. Competitive prejudice is an essential element of a viable protest. Where a protester fails to demonstrate that, but for the agency’s actions, it would have had a substantial chance of receiving the award, our Office will not sustain the protest. See, e.g., Access Interpreting, Inc., B‑413990, Jan. 17, 2017, 2017 CPD ¶ 24 at 5. As noted above, the agency was to evaluate the quality of performance for each submitted past performance project, then assign each offeror a confidence rating. RFP at 19‑20.
The record shows that in each region where AshBritt was the successful offeror, its past performance was assessed a rating of substantial confidence by the agency. AR, Tab 15.1, SSAC Region 1 Report at 3; AR, Tab 15.2, SSAC Region 2 Report at 3; AR, Tab 15.6, SSAC Region 6 Report at 3; AR, Tab 15.7, SSAC Region 7 Report at 3. For each of these regions, AshBritt submitted five past performance projects that the agency evaluated and found to be recent, either relevant or very relevant, and indicative of high-quality performance. AR, Tab 14, Final SSEB Report at 23‑25, 84‑87, 416‑419, 503‑506. Each of AshBritt’s five submitted projects included multiple CPARs or PPQs that collectively contained substantial qualitative and narrative assessments of AshBritt’s performance on those projects. See, e.g., AR, Tab 4.7, AshBritt Region 7 Proposal, Vol. 2 at iii‑clvi.
The protesters do not demonstrate that, apart from the alleged errors, the two challenged past performance projects discussed above or the other three past performance projects submitted by AshBritt were not recent, not relevant, or not actually indicative of high-quality performance. Further, in each region where AshBritt was the successful offeror, it was found to present the best value based on a significant price advantage, despite not having the most highly rated offer.[23] In their pleadings, the protesters do not demonstrate how a slight downgrade to the agency’s assessment of substantial confidence in AshBritt’s past performance would put at risk AshBritt’s significant evaluated price advantage in a procurement where price is the single most important factor for award. In short, while the protesters have pointed out some incorrect statements in the evaluation record, they have not demonstrated that, but for these documentation errors, they would have had a substantial chance of receiving the award; they have therefore failed to demonstrate prejudice. Accordingly, we see no basis to sustain these grounds of protest.
Discussions
Both CrowderGulf and DRC contend that the agency’s discussions either were not meaningful or were misleading due to the agency’s failure to inform them of their relatively higher prices.[24] CrowderGulf Protest at 11‑15; CrowderGulf Comments & Supp. Protest at 17‑18, 20‑21. The protesters argue, in essence, that the importance of price in the best-value tradeoff should have led the agency to inform offerors during discussions if their higher pricing made them less competitive. See DRC 2nd Supp. Protest at 19.
When an agency engages in discussions with an offeror, the discussions must be meaningful. In order to be meaningful, discussions must be sufficiently detailed so as to lead an offeror into the areas of its proposal requiring amplification or revision in a manner that materially enhances the offeror’s potential for receiving award. Powersolv, Inc., B‑402534, B‑402534.2, June 1, 2010, 2010 CPD ¶ 206 at 7. While the precise content of discussions is largely a matter of the contracting officer’s judgment, such discussions must, at a minimum, address significant weaknesses, deficiencies, and adverse past performance information to which the offeror has not yet had an opportunity to respond. FAR 15.306(d)(3); American States Utilities Servs., Inc., B‑291307.3, June 30, 2004, 2004 CPD ¶ 150 at 6. These minimum requirements do not obligate an agency to address in discussions an offeror’s price merely because it is significantly high, or too high, relative to its competitive standing. See Joint Logistics Managers, Inc., B‑410465.2, B‑410465.3, May 5, 2015, 2015 CPD ¶ 152 at 5 (noting that “a significantly higher price, or a price that is too high, is not a significant weakness or a deficiency as contemplated by the regulatory scheme delineating the rules for discussions”).
With respect to issues related to price, our decisions have consistently concluded that the decision to inform an offeror that its price is too high during discussions is discretionary. See Hydraulics Int'l, Inc., B‑284684, B‑284684.2, May 24, 2000, 2000 CPD ¶ 149 at 17. Unless an offeror’s proposed price is so high as to be unreasonable or unacceptable, an agency is not required to inform an offeror during discussions that its proposed price is high in comparison to a competitor’s proposed price, even where price is the determinative factor for award. Peridot Solutions, LLC, B‑408638, Nov. 6, 2013, 2013 CPD ¶ 260 at 4.
Our review of the record shows that the agency only informed offerors of prices it considered potentially unbalanced or unreasonable. See, generally, AR, Tab 29, Discussion Letters with Evaluation Notices. At no point does the record indicate that the agency found CrowderGulf’s or DRC’s final proposed prices to be unreasonably high, or that the agency otherwise excluded the protesters from consideration for award due to their higher prices. Accordingly, we conclude that it was within the agency’s discretion whether to inform the protesters that their prices were significantly higher than other offerors’. That the agency did not do so does not support the conclusion that the discussions were misleading or not meaningful, as the protesters allege here. Hydraulics Int'l, Inc., supra.
Best-Value Tradeoff and Source Selection Decisions
Finally, DRC challenges the agency’s best-value tradeoffs and source selection decisions. DRC Comments at 17‑18. In this regard, DRC argues that the SSDDs’ tradeoff analyses are cursory and do not show a meaningful comparison of proposals. Id. at 21‑22. DRC complains that the agency “treated price as the sole and determining factor for award, consistently awarding the contracts to low[er] rated, lower priced offerors.” Id. at 22.
Source selection officials in negotiated procurements have broad discretion in determining the manner and extent to which they will make use of technical and price evaluation results; price/technical tradeoffs may be made, and the extent to which one may be sacrificed for the other is governed only by the tests of rationality and consistency with the evaluation criteria. 2H & V Constr. Servs., B‑411959, Nov. 23, 2015, 2015 CPD ¶ 368 at 8. Where a price/technical tradeoff is made, the source selection decision must be documented, and the documentation must include the rationale for any tradeoffs made, including the benefits associated with additional costs. FAR 15.308; The MIL Corp., B‑297508, B‑297508.2, Jan. 26, 2006, 2006 CPD ¶ 34 at 13. However, there is no requirement that an agency selection decision discuss the agency’s comparison of every proposal received in order to document the selection of the awardees’ proposals. Rather, the documentation need only be sufficient to establish the agency was aware of the relative merits and costs of the competing proposals and that the source selection was reasonably based. See General Dynamics‑Ordnance & Tactical Sys., B‑401658, B‑401658.2, Oct. 26, 2009, 2009 CPD ¶ 217 at 8.
Here, our review of the record shows that the agency conducted a thorough evaluation and documented what it considered to be the relative merits of the proposals in each region several times, including in the SSDDs. See, e.g., AR, Tab 19.1, Region 1 SSDD. For each of the several tradeoffs the agency conducted in each region, it analyzed the two proposals’ relative merits under each evaluation factor and ultimately decided whether the superior non-price proposal justified any evaluated price premium. Id. at 23‑25. As noted above, the RFP set forth price as the most important evaluation factor for award, specifying that it was as important as all the non-price evaluation factors combined. RFP at 14. Accordingly, we see nothing objectionable in the SSA’s conclusions that lower-priced proposals often represented a better value than more expensive proposals with slightly higher technical ratings.
DRC’s complaints that the SSA essentially converted this best-value tradeoff procurement into one made on a lowest-priced, technically acceptable basis are unpersuasive. See DRC Comments at 22. The record shows that two regions were not awarded to the lowest-priced offerors. In region 1, the agency found that AshBritt’s non-price proposal was superior to that of the lowest-priced offeror and justified the evaluated price premium. AR, Tab 19.1, Region 1 SSDD at 23. Further, while the record here does not include the SSDD for region 3, it does demonstrate that DRC was selected for award in that region despite not having the lowest-priced proposal. Compare AR, Tab 23b, CrowderGulf Unsuccessful Offeror Notice at 1 (stating DRC was the awardee for region 3), with, AR, Tab 16, Final Price Evaluation Spreadsheets, sheet “Region 3” (showing that DRC did not have the lowest evaluated price in the region).
DRC’s disagreement with the agency’s determinations, without more, does not establish that the source selection was unreasonable. CACI–WGI, Inc., B‑408520.2, Dec. 16, 2013, 2013 CPD ¶ 293 at 17.
The protests are denied.
Edda Emmanuelli Perez
General Counsel
[1] The Corps issued 20 amendments to the solicitation but did not issue a single conformed copy of the solicitation after issuing amendment 20. Accordingly, citations to the RFP without noting an amendment in this decision are to the updated language issued as part of amendment 20. Citations to all other solicitation language will include the amendment where the relevant section of the RFP was last amended.
[2] The RFP specified the U.S. states and territories that would be covered under each regional IDIQ contract. The regions relevant to this decision are: region 1 covering Washington, Oregon, Idaho, Montana, Wyoming, North Dakota, South Dakota, Nebraska, Colorado, Kansas, and Missouri; region 2 covering Minnesota, Wisconsin, Iowa, Illinois, Louisiana, and Mississippi; region 4 covering Maine, Vermont, New Hampshire, Massachusetts, Rhode Island, Connecticut, New York, Pennsylvania, Delaware, Maryland, District of Columbia, and Virginia; region 6 covering Texas, Oklahoma, and Arkansas; region 7 covering North Carolina, South Carolina, Georgia, Alabama, and Florida; and, region 8 covering Hawaii, Alaska, Guam, American Samoa, and “other U.S. territories” within the Corps’s Pacific Ocean division. RFP at 11‑12.
[3] The protesters here challenge the award of contracts for regions 1, 2, 4, 6, 7, and 8. Region 5 was still under evaluation and had not been awarded when the instant protests were filed. COS at 1 n.1. Two offerors filed protests challenging the region 3 evaluation and award decision, but they subsequently withdrew all grounds of protest regarding region 3. D&J Enters., Inc., B‑418693.8, Jan. 6, 2022 (unpublished decision); CrowderGulf Partial Withdrawal of Protest, Jan. 12, 2022.
[4] The RFP defined a project as “all work happening in response to the same disaster,” noting that “debris removal missions are often accomplished with multiple contracts” from various ordering entities. Id. at 8. Offerors were to include past performance information about each project in the form of completed contractor performance assessment reports (CPARs), if available, or completed past performance questionnaires. Id. at 8‑9.
[5] The RFP provided that the agency would assign each project a past performance relevancy rating of very relevant, relevant, somewhat relevant, or not relevant. Id. at 19.
[6] As a result of the Corps’s intended corrective action, we dismissed those protests as academic. AshBritt, Inc., B‑418693, B‑418693.4, May 29, 2020 (unpublished decision); Ceres Envtl. Servs., Inc., B‑418693.2, B‑418693.3, May 29, 2020 (unpublished decision); Phillips & Jordan, Inc., B‑418693.5, May 29, 2020 (unpublished decision); D&J Enters., Inc., B‑418693.6, May 29, 2020 (unpublished decision). Subsequently, another offeror filed a protest with our Office challenging the solicitation amendments issued as part of the Corps’s corrective action. Our Office denied the protest. RELYANT Global, LLC, B-418693.7, Apr. 9, 2021, 2021 CPD ¶ 166.
[7] Due to the agency’s redaction of proposal and evaluation information for offerors that were neither an awardee nor a protester, the record is unclear regarding exactly how many offerors submitted proposals. Compare AR, Tab 16.1, Final Price Evaluation Report Spreadsheets, with, AR, Tab 10, Competitive Range Determination at 2.
[8] The agency conducted its evaluation of final proposals through a source selection evaluation board (SSEB) that separately evaluated proposals, and a source selection advisory council (SSAC) that reviewed the SSEB’s findings and conducted a comparative analysis of proposals in order to make award recommendations to the source selection authority (SSA). See, generally, AR, Tab 14, SSEB Report; see also, e.g., AR, Tab 15.1, SSAC Region 1 Report at 1.
[9] The Blue & Gold waiver rule, which is applicable to bid protests filed before the U.S. Court of Federal Claims, is similar to our Office’s pre-award protest timeliness rule at 4 C.F.R. § 21.2(a)(1). Blue & Gold Fleet, L.P. v. United States, 492 F.3d 1308, 1313 (Fed. Cir. 2007). In that case, the Federal Circuit held:
a party who has the opportunity to object to the terms of a government solicitation containing a patent error and fails to do so prior to the close of the bidding process waives its ability to raise the same objection subsequently in a bid protest action in the Court of Federal Claims.
Id.
[10] Although DRC cites and quotes a specific May 29, 2020 letter in its comments, and refers to other nonspecific “letters,” it did not include a copy of any of these letters with its pleadings. See DRC Comments at 18, 20.
[11] Even if, for the sake of argument, we were to find that DRC’s pre-award communications here constituted a timely pre-award agency-level protest, which we do not, DRC’s protest would still be untimely. The record here shows that DRC had at least constructive knowledge of initial adverse agency action on its objections as of December 4, 2020. On that date, the agency responded to one of DRC’s questions regarding unequal access to pricing information, stating “the Government did not provide any of the offerors with non-public information or source selection information about any of the other offerors during the course of the prior procurement.” AR, Tab 24.2, RFP amend. 0016, Q&A at 10. Accordingly, to be timely under 4 C.F.R. § 21.2(a)(3), DRC would have had to file this protest ground with our Office no later than December 14, 2020.
[12] DRC makes a collateral argument that the agency’s failure to disclose the original disappointed offerors’ pricing to DRC during discussions resulted in unequal discussions. DRC Protest at 16‑17; DRC Comments at 17. However, we fail to see how these arguments provide a basis to sustain a protest where DRC does not allege, and the record does not reveal, that the agency provided the unit-level pricing information from DRC’s April 1, 2020 award to other offerors as part of discussions. See, e.g., AR, Tab 29, Ashbritt Discussion Letters. To the extent that this argument still challenges the agency’s disclosure of DRC’s pricing information in a debriefing before undertaking corrective action, we dismiss it as untimely, as we did the unequal access to information argument above.
[13] In this regard, the agency’s response highlights several aspects of DRC’s six different regional proposals in which each proposal uses identical language to describe DRC’s approach and capabilities. Supp. MOL at 12‑14.
[14] The agency’s price evaluation report and price analysis were not separated into different documents for the different regions. See AR, Tab 16, Final Price Evaluation Report; Tab 16.1, Final Price Analysis. However, like the SSEB report, the price documents clearly delineated the separate evaluations and analysis.
[15] DRC also argues that the agency’s unbalanced pricing analysis should have considered the effect of the contractor’s authority to choose the distance to which it would remove debris. DRC Protest at 13; DRC Comments at 6. In this regard, DRC notes that ECC and AshBritt both priced certain short-distance debris removal CLINs much lower than the corresponding medium- or long-distance debris removal CLINs of the same type. DRC Protest at 13‑15. DRC concludes that the agency should have evaluated, as part of its unbalanced pricing analysis, whether ECC and AshBritt could “game” contract performance to minimize the use of their allegedly understated short-distance disposal CLINs. Id. However, the solicitation is clear at paragraphs 2.4.13 and 2.4.17 of the performance work statement that the agency must approve all debris dumpsites to be used under the contract, including temporary debris storage and reduction sites, and final disposal sites. AR, Tab 2.17, RFP amend. 0017 at 19. Given the agency’s control over disposal sites here, we do not see how the protester’s arguments provide a basis to sustain a protest. To the extent the protester is alleging that the agency will not use its authority to ensure contractors do not improperly steer work toward more lucrative CLINs, we consider the argument to concern a matter of contract administration that is beyond the scope of our bid protest jurisdiction. See 4 C.F.R. § 21.5(a).
[16] Where a solicitation contemplates the award of a fixed-price or time-and-materials contract, price realism is not ordinarily considered, because a fixed-price type contract places the risk and responsibility for costs and resulting profit or loss on the contractor. HP Enter. Servs., LLC, B‑413888.2 et al., June 21, 2017, 2017 CPD ¶ 239 at 5; see FAR 15.402(a). While an agency may conduct a price realism analysis in awarding a fixed‑price contract, it is for the limited purpose of measuring an offeror’s understanding of the requirements or to assess the risk inherent in the offeror’s proposal. FAR 15.404‑1(d)(3); Hewlett Packard Enter. Co.--Costs, B‑413444.3, Mar. 3, 2017, 2017 CPD ¶ 85 at 5; Emergint Techs., Inc., B‑407006, Oct. 18, 2012, 2012 CPD ¶ 295 at 5-6. Absent a solicitation provision providing for a price realism evaluation, however, agencies are neither required nor permitted to conduct one in awarding a fixed‑price or labor‑hour contract. Delta Risk, LLC, B‑416420, Aug. 24, 2018, 2018 CPD ¶ 305 at 18‑19.
[17] For region 5, the agency price evaluator found that the standard deviation calculations were not helpful for analyzing the pricing due to an offeror with “consistently higher prices” skewing the data. AR, Tab 28, Decl. of Agency Price Evaluator at ¶ 7. Accordingly, for region 5, the evaluator removed the offeror’s consistently higher prices from the statistical calculations “in order to better assess pricing outliers for the remaining offerors.” Id.; AR, Tab 11, Initial Price Evaluation Spreadsheets, tab “Region 5-Reduced.”
[18] In Multimax, the agency identified a labor rate as potentially unreasonable or unbalanced only if “the rate both exceeded (or was lower than) the IGCE rate, and was more than two standard deviations greater (or less) than the mean rate of all offerors for that category.” Multimax, supra at 9. Our Office sustained the protest challenging this methodology, finding the agency’s analysis unreasonable where it mechanically applied its formula without considering whether the resulting outliers actually reflected unreasonable or unbalanced pricing. Id. at 11.
[19] DRC also alleges that the specific standard deviation calculations used here were unreasonable because the agency used the wrong formula in its Microsoft Excel spreadsheet. DRC Comments at 13 (citing DRC Comments exh. 1, Expert Decl. at ¶¶ 16‑17). DRC argues that the agency calculated the standard deviation as if the set of numbers provided‑‑the proposed prices‑‑were a sample drawn from a larger group of numbers rather than the complete set of prices, which it alleges resulted in a larger and less useful standard deviation. Id. We fail to see how such arguments provide a basis to sustain a protest. The protester does not point to, and our review of the record does not reveal, any requirement that the agency use a specific formula when calculating standard deviations as part of a price analysis. Further, even if we were to agree with the protester that the agency’s use of a specific standard deviation formula was improper, which we do not, the protester has not demonstrated that the use of this formula caused the agency to overlook any proposed pricing that could result in the agency paying unreasonably high prices during performance.
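To illustrate the distinction DRC describes, the following short sketch compares the sample standard deviation (which divides by n - 1) with the population standard deviation (which divides by n). This is a hypothetical illustration only: it is written in Python rather than Excel, and the figures are invented rather than drawn from any offeror’s proposal.

import statistics

# Hypothetical proposed prices for a single CLIN; not taken from the record.
proposed_prices = [100.0, 120.0, 135.0, 150.0, 310.0]

sample_sd = statistics.stdev(proposed_prices)       # sample formula, divides by n - 1
population_sd = statistics.pstdev(proposed_prices)  # population formula, divides by n

print(f"sample standard deviation:     {sample_sd:.2f}")      # approximately 84.2
print(f"population standard deviation: {population_sd:.2f}")  # approximately 75.3

# Because the sample formula always yields the larger value, an outlier screen based on a band
# of plus or minus a set number of standard deviations around the mean will flag fewer prices;
# this is the effect DRC characterizes as a "larger and less useful" standard deviation.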
[20] DRC, on the other hand, contends that the agency performed an unequal past performance evaluation where it considered allegedly negative past performance information regarding DRC based on an ongoing lawsuit, but not the similar information flagged by the protesters above, related to AshBritt and ECC. DRC 2nd Supp. Protest at 14‑16. However, the initial SSEB report is clear that this information is different from the other outside information cited by the protesters because it was “brought to the Agency’s attention” and was therefore known to the agency at the time of the evaluation. AR, Tab 9, Initial SSEB Report at 10. Further, the record demonstrates that this information was ultimately not used by the agency in discussions with DRC or in DRC’s final past performance evaluation, as the contracting officer did not consider it to be past performance information. See generally AR, Tab 14, SSEB Report (not evaluating the ongoing lawsuit information provided by an outside source); Contracting Officer Decl. at ¶¶ 8‑9. On this record, we see no basis to conclude that the past performance evaluations were unequal in their consideration of information outside the proposals.
[21] Notably, the SSEB correctly documented the CPAR evaluation ratings as satisfactory and very good and still concluded that this project demonstrated high-quality performance. See, e.g., AR, Tab 14, Final SSEB Report at 416‑417. It was the SSAC and SSA that in some instances incorrectly documented these CPAR ratings as very good or exceptional, but they still concurred with the SSEB’s overall evaluation of AshBritt’s past performance. See AR, Tab 15.7, Region 7 SSAC Report at 13.
[22] CrowderGulf does allege that the agency unreasonably considered this past performance project to be high‑quality, noting that, for the CPARs at issue, every CPAR was rated only satisfactory under the “Quality” evaluation area. CrowderGulf Comments & Supp. Protest at 35; see also, e.g., AR, Tab 4.7, AshBritt Region 7 Proposal, Vol. 2 at iii. We note that each CPAR contained six rated evaluation areas, and AshBritt received ratings of very good in some of them. Id. at iii‑iv. CrowderGulf does not point to, and our review of the solicitation does not reveal, any requirement that the agency consider only the “Quality” evaluation area of a CPAR when evaluating past performance information for quality. Accordingly, we see no basis to find the agency’s consideration of all six rated evaluation areas within a CPAR to be unreasonable.
[23] AshBritt was evaluated to have a more than $35 million price advantage compared to the next closest priced protester in region 1, a more than $45 million price advantage compared to the next closest priced protester in region 2, a more than $37 million price advantage compared to the next closest priced protester in region 6, and a more than $36 million price advantage compared to the next closest priced protester in region 7. See AR, Tab 19.1, Region 1 SSDD at 5; AR, Tab 19.2, Region 2 SSDD at 5; AR, Tab 19.6, Region 6 SSDD at 5; AR, Tab 19.7, Region 7 SSDD at 5.
[24] The protesters also raise several collateral arguments that discussions were misleading due to the alleged evaluation errors raised in the protests--especially the alleged errors in the agency’s unbalanced pricing analysis. See, e.g., CrowderGulf Comments & Supp. Protest at 3 (alleging the agency’s use of standard deviation calculations as part of its balance analysis caused misleading discussions). Given our conclusions above regarding the general reasonableness of the agency’s evaluation, we do not see how any of these arguments would form a basis to sustain a protest, and therefore deny them.