Risk Analysis and Mitigation Partners

B-409687; B-409687.2: July 15, 2014


Decision

Matter of: Risk Analysis and Mitigation Partners

File: B-409687; B-409687.2

Date: July 15, 2014

Stuart B. Nibley, Esq., Andrew N. Cook, Esq., John P. Estep, Esq., Amy M. Conant, Esq., and Kirstin D. Dietel, Esq., K&L Gates LLP, for the protester.
Richard J. Conway, Esq., and Erin Wilcox Burns, Esq., Dickstein Shapiro LLP, for STARR; and Richard B. O’Keeffe, Jr., Esq., William A. Roberts, III, Esq., and Samantha S. Lee, Esq., Wiley Rein LLP, for The Compass PTS, Joint Venture, the intervenors.
Jeffrey D. Webb, Esq., and KalMarie Rawald, Esq., Department of Homeland Security, for the agency.
Heather Weiner, Esq., and Jonathan L. Kang, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest that the agency used unstated evaluation criteria in assessing weaknesses in the protester’s proposal is sustained where the agency’s evaluation was based on criteria that offerors could not have reasonably known to address in their proposals.

DECISION

Risk Analysis and Mitigation Partners (RAMP), of Fairfax, Virginia, protests the evaluation and nonselection of its qualification statement for negotiation of an architect/engineering (A/E) services contract, pursuant to solicitation/synopsis No. HSFE60-14-R-0003, issued by the Department of Homeland Security, Federal Emergency Management Agency (FEMA), for architectural and engineering services.[1] RAMP challenges the agency’s evaluation as unreasonable.

We sustain the protest in part, and deny it in part.

BACKGROUND

The procurement was conducted pursuant to the Brooks Act, 40 U.S.C. § 1101 et seq. and its implementing regulations, Federal Acquisition Regulation (FAR) subpart 36.6. In accordance with these regulations, on December 23, 2013, FEMA announced through the Federal Business Opportunities website the subject A/E requirements, and invited capable firms to submit Standard Form (SF) 330, “Architect-Engineer Qualifications” statements--referred to herein as proposals. The solicitation contemplated the award of “two or more” indefinite-delivery, indefinite-quantity, performance-based A/E contracts, for a base year and four 12-month options. Solicitation at 17. The solicitation sought to replace FEMA’s current suite of A/E contracts--which expired or will expire in March 2014, September 2014, and May 2015--with new contracts for FEMA’s production and technical services. The new contracts will consist of three programs: Risk Mapping, Assessment, and Planning (Risk MAP), Hazard Mitigation Technical Assistance Program (HMTAP), and Technical Assistance and Research Contracts (TARC). Id. at 2-3.

The procedures provided for procurements of A/E requirements under the Brooks Act do not include a price competition; rather, the agency must select the most highly qualified firm(s), on the basis of demonstrated competence and qualifications, and negotiate contracts with those firms at a fair and reasonable level of compensation. Photo Science, Inc., B-296391, July 25, 2005, 2005 CPD ¶ 140 at 1-2; see FAR subpart 36.6. In accordance with these procedures, the solicitation stated that the agency would evaluate the offerors’ proposals to select the two most highly-rated firms for negotiations based on the evaluation factors in the solicitation, and the “Performance Objectives, Goals and Outcomes,” contained in the solicitation’s statement of objectives (SOO). Solicitation at 7.

The solicitation included seven evaluation factors: (1) specialized experience and technical competence; (2) regional support and delivery; (3) professional qualifications necessary for satisfactory performance of required services; (4) capacity to accomplish the work required; (5) past performance on contracts with government agencies and private industry in terms of cost control, quality of work, and compliance with performance schedules; (6) location in the geographic area of the project and knowledge of the locality; and (7) subcontracting plan. Id. at 7-12. For purposes of evaluation, factors 1 and 2 were equally important, and were each more important than factor 3; factors 3, 4, 5, and 6 were equally important. Id. Factor 7 was to be evaluated on a pass/fail basis, and was no more or less important than the other factors. Id. at 7.

Under Factor 1, specialized experience and technical competence, the solicitation stated that the government would evaluate an offeror’s experience and capabilities based on the following five sub-factors, listed in descending order of importance: (A) increasing resilience and producing all elements of regulatory and non-regulatory Risk Map products with high quality; (B) providing technical support for Risk Analysis and Floodplain Management issues to internal and external stakeholders, including regions, Tribes, states, and local partners; (C) providing Risk Reduction and Hazard Mitigation Assistance technical support; (D) providing building sciences technical support; and (E) producing and managing map revision (MT-2) application forms for conditional letters and letters of map revision. Id. at 7-8.

Under Factor 2, regional support, the solicitation stated that the government would evaluate an offeror’s regional support and delivery approach based on the following four subfactors, listed in descending order of importance: (A) regional support; (B) data development and program delivery; (C) innovation and efficiency; and (D) proposed approach to effectively manage contract transition, quality management, and approach to incorporate broad program and policy impacts, including the methods and techniques used to balance the government’s competing interests. Id. at 8.

The solicitation also included three sample task order scenarios, which required that offerors provide a description[2] of their proposed solution for the scenario, including, for example, how they will work with FEMA and other providers; a basis of estimate; a project schedule; and their approach to balancing competing demands. Id. at 10-12.

In addition, the solicitation’s SOO listed the following six performance objectives, goals and outcomes:

Objective 1: Increase Resiliency: Produce Regulatory and Non-Regulatory Risk MAP Products

Objective 2: Provide Technical Support to FEMA, CTPs, and other Risk MAP Program Stakeholders

Objective 3: Produce MT-2s in a Cost-Efficient and Timely Manner

Objective 4: Provide Production and Technical Services for FEMA’s Risk Reduction and Hazard Mitigation Assistance (HMA) Missions

Objective 5: Provide Technical Services, Assistance, and Research Services

Objective 6: Provide Program Management and Subject Matter Expertise

Solicitation at 21-25. As discussed below, each objective included a list of bullet points describing that objective. Id.

The solicitation stated that offerors’ “[r]esponses to this Solicitation will provide straightforward, concise information that satisfies the requirements specified,” and “[e]mphasis should be placed on brevity, conformity to instructions, specified requirements of this Solicitation, and clarity of content.” Id. at 3-4. The solicitation also stated that an offeror’s submission “must not exceed sixty (60) pages,” single sided. Solicitation at 4.

Four offerors responded to the solicitation by the January 30, 2014, closing date, including RAMP, STARR, and The Compass PTS, Joint Venture (Compass). Contracting Officer (CO) Statement at 6. RAMP is one of the incumbent contractors for the requirement. Agency Report (AR), Tab D, Initial Technical Consensus Evaluation, at 44.

Following FEMA’s initial evaluation of proposals, the contracting officer opened discussions with all four offerors. CO Statement at 6. Discussions involved a 1-hour meeting with each offeror, during which the offeror was given 15 minutes to provide the agency with an overview of its proposal, and 45 minutes to respond to the agency’s questions. Id. The agency’s questions, which included both general questions provided to all offerors, and detailed questions from the technical evaluation panel (TEP) that were specific to each offeror’s proposal, were provided to the offerors 1 hour prior to their scheduled discussions. Id. Discussions were audio recorded, and the offerors were told that revised proposals were not required. Id.

The TEP incorporated the information gathered during discussions into the technical evaluation report. Id. at 7. The TEP assigned Compass’ proposal an overall rating of good, and assigned RAMP’s and STARR’s proposals overall ratings of satisfactory. AR, Tab D, Initial Technical Consensus Evaluation, at 71. The TEP stated that none of these three offerors had any significant weaknesses or deficiencies. Id.

As relevant here, the TEP assigned the following adjectival ratings to RAMP’s and STARR’s proposals:[3]

|             | RAMP         | STARR        | COMPASS |
|-------------|--------------|--------------|---------|
| OVERALL     | SATISFACTORY | SATISFACTORY | GOOD[4] |
| Factor 1    | Satisfactory | Good         |         |
| Subfactor A | Satisfactory | Good         |         |
| Subfactor B | Satisfactory | Satisfactory |         |
| Subfactor C | Good         | Satisfactory |         |
| Subfactor D | Good         | Good         |         |
| Subfactor E | Good         | Good         |         |
| Factor 2    | Satisfactory | Satisfactory |         |
| Subfactor A | Satisfactory | Satisfactory |         |
| Subfactor B | Satisfactory | Satisfactory |         |
| Subfactor C | Satisfactory | Excellent    |         |
| Subfactor D | Satisfactory | Satisfactory |         |
| Factor 3    | Good         | Satisfactory |         |
| Factor 4    | Satisfactory | Satisfactory |         |
| Factor 5    | Medium       | High         |         |
| Factor 6    | Good         | Good         |         |
| Factor 7    | Pass         | Pass         |         |

AR, Tab D, Initial Technical Consensus Evaluation, at 3.

The TEP recommended that the source selection authority (SSA) select Compass and STARR for negotiations. Id. at 71. Specifically, the TEP found Compass to be the most highly-qualified offeror based on its overall rating of good. Id. The TEP concluded that STARR was the second most highly-qualified offeror because STARR received higher ratings than RAMP under factor 1, subfactor 1A, and subfactor 2C. Id. The TEP ranked RAMP as the third most highly-qualified offeror. Id.

On March 25, the SSA approved the TEP’s recommended ranking of the offerors. Id. at 73. In response to the protest, the SSA stated that, in approving the ranking, he relied on STARR’s rating of good for factor 1 in placing STARR as the second most highly-qualified firm. SSA Statement, May 7, 2014, at 2. On March 27, the contracting officer entered into negotiations with Compass and STARR. CO Statement at 7. Also on March 27, FEMA notified RAMP that it was not selected as one of the two most highly-qualified offerors for negotiations. Id. On April 2, the agency provided RAMP with a debriefing. This protest followed.

In its protest, RAMP raises three main arguments: (1) the agency applied unstated evaluation criteria in assessing weaknesses in RAMP’s proposal; (2) the agency improperly assessed weaknesses under the innovation and efficiency subfactor for items in RAMP’s proposal that were not proposed as innovations; and (3) the agency failed to assign adjectival ratings that appropriately reflected the balance of strengths and weaknesses in RAMP’s proposal.[5] For the reasons discussed below, we find that the agency applied unstated evaluation criteria in evaluating RAMP’s proposal, and sustain the protest on these bases. We deny the remaining protest grounds.

UNSTATED EVALUATION CRITERIA

RAMP argues that FEMA’s evaluation of its proposal under subfactor 1A, increasing resilience, subfactor 1B, risk analysis and floodplain management, and factor 4, capacity to accomplish the work required, was unreasonable because it relied on unstated evaluation criteria. Specifically, RAMP argues that the agency unreasonably downgraded its proposal based on criteria that were not set forth in the solicitation.

Agencies are required to evaluate proposals based solely on the factors identified in the solicitation, and must adequately document the bases for their evaluation conclusions. Intercon Assocs., Inc., B-298282, B-298282.2, Aug. 10, 2006, 2006 CPD ¶ 121 at 5. While agencies properly may apply evaluation considerations that are not expressly outlined in the RFP if those considerations are reasonably and logically encompassed within the stated evaluation criteria, there must be a clear nexus between the stated and unstated criteria. Raytheon Co., B-404998, July 25, 2011, 2011 CPD ¶ 232 at 15-16. An agency may not give importance to specific factors, subfactors, or criteria beyond that which would reasonably be expected by offerors. Lloyd H. Kessler, Inc., B-284693, May 24, 2000, 2000 CPD ¶ 96 at 3. While we will not substitute our judgment for that of the agency, we will question the agency’s conclusions where they are inconsistent with the solicitation criteria and applicable procurement statutes and regulations, undocumented, or not reasonably based. Public Commc’ns Servs., Inc., B-400058, B-400058.3, July 18, 2008, 2009 CPD ¶ 154 at 17.

As discussed in detail below, we find that the agency’s assessment of weaknesses was based on the following concerns: RAMP’s failure to address certain “standards,” which were not specifically referenced in the solicitation, but which were listed along with over 400 other FEMA standards in a document on FEMA’s website; RAMP’s failure to discuss issues raised by the agency during a pre-solicitation, industry day presentation, but that were not addressed in the solicitation; and RAMP’s failure to list all proposed engineers for a labor category, even though such a list was not a requirement for the applicable evaluation factor. We agree with the agency that the concerns cited in the weaknesses assessed for RAMP’s proposal have some relationship to the evaluation criteria set forth in the solicitation; as discussed below, however, we find that the concerns are so removed from the expressly stated criteria in the solicitation that offerors could not have reasonably understood that they were required to address the concerns in their proposals. We therefore conclude that the agency improperly relied upon unstated evaluation criteria in assessing these weaknesses, and sustain the protest on this basis.[6]

Evaluation of Risk Analysis and Floodplain Management Subfactor

As relevant here, under the risk analysis and floodplain management technical support subfactor (subfactor 1B), the solicitation stated that offerors would be evaluated based on their “experience and capabilities providing technical support for Risk Analysis and Floodplain Management issues . . . .” Solicitation at 8.

In response to this subfactor, RAMP’s proposal provided information detailing its risk analysis technical support and floodplain management technical support experience.[7] AR, Tab C, RAMP Proposal, at 31-32. It also described RAMP’s expertise concerning [DELETED], such as [DELETED]. Id. at 14.

RAMP’s proposal received a satisfactory rating for this subfactor, and the technical evaluators assessed several strengths for RAMP’s proposal, including that “RAMP has demonstrated experience and capability providing technical support for Risk Analysis and Floodplain Management issues . . . that is highly likely to succeed in the future, and is low risk.” AR, Tab D, Initial Technical Consensus Evaluation, at 37-38. The evaluators, however, also assessed the following weaknesses to RAMP’s proposal under this subfactor:

RAMP’s proposal does not demonstrate that RAMP has experience coordinating with or supporting the Scientific Resolution Panel, nor did it articulate RAMP’s experience and capabilities in supporting such work. RAMP’s proposal does not list experience providing technical support to populate or maintain information in the National Levee Database, a critical element for coordination with the U.S. Army Corps of Engineers and other stakeholders. RAMP’s proposal does not list any experience or capability to provide technical support for ice-jam analysis. Combined, these limitations and omissions introduce a moderate risk that RAMP may not be able to deliver timely, effective and comprehensive technical support for routine Risk MAP production work, a large percentage of the overall contract scope.

Id. at 39.

As discussed above, RAMP argues that the weaknesses assessed based on its failure to describe its experience with the scientific resolution panel, ice-jam analysis, and the national levee database constituted unstated evaluation subfactors because they were not set forth in the solicitation, and because offerors could not reasonably have been expected to know the importance of these specific criteria.

FEMA contends that its assessment of the weaknesses was reasonable because the scientific resolution panel, ice-jam analysis, and national levee database are each reasonably related to or encompassed by the solicitation’s evaluation factors. With regard to the scientific resolution panel and ice-jam analysis, FEMA points to performance objective 1 in the SOO, “Increase Resiliency: Produce Regulatory and Non-Regulatory Risk MAP Products,” which stated the following objective:

Produce effectual, flexible, and sustainable Risk MAP products that are aligned with user needs using innovative production processes in order to increase risk awareness and achieve mitigation actions. Provide innovations and methodologies to enable the mapping program to become significantly more efficient and cost-effective.

Solicitation at 21. In particular, the agency relies on one of ten bullet points listed under this objective, which stated: “Produce the full range of Risk MAP engineering and mapping elements (e.g. Hydrology and Hydraulics), ensuring compliance with FEMA standards.” Id. The agency explains that the scientific resolution panel process and ice-jam analysis are “FEMA standards,” and argues that supporting these standards is reasonably encompassed within the solicitation’s evaluation criteria under this subfactor to “provid[e] technical support for Risk Analysis issues.” Id.

The agency concedes that the solicitation contains no reference to the scientific resolution panel or ice-jam analysis, but asserts that these standards are contained in FEMA’s Federal Insurance and Mitigation Administration (FIMA) Policy document, which is available on FEMA’s website. AR at 12, 14-15. In addition, FEMA contends that information provided by the agency during a presentation at the FIMA Industry Day on August 22, 2013, should have placed offerors on notice of the importance of addressing the scientific resolution panel process in their proposals. Id. at 15-16.

We conclude that FEMA applied unstated evaluation criteria in assessing the weaknesses here. With regard to the scientific resolution panel and ice-jam analysis, the agency correctly notes that the solicitation required offerors to address their “experience and capabilities providing technical support for Risk Analysis and Floodplain Management issues,” and included a performance objective to “[p]roduce the full range of Risk MAP engineering and mapping elements . . . , ensuring compliance with FEMA standards.” Solicitation at 8, 21. Nonetheless, we do not think the solicitation reasonably advised offerors of the agency’s view that offerors were required to specifically address their experience with the “scientific resolution panel” and “ice-jam analysis,” as these two areas constitute only two categories of hundreds of “FEMA standards” contained in the agency’s FIMA Policy document. AR, Tab Q, FIMA Policy Document. Moreover, the FIMA Policy Document was not included as part of the solicitation, and the solicitation did not require offerors to address all of the FEMA standards in their proposals.

In addition, the record shows that FEMA’s evaluation of RAMP’s proposal did not assess weaknesses for the protester’s failure to address every FEMA standard; the agency assessed weaknesses only for the failure to address certain of the hundreds of standards. See AR, Tab D, Initial Technical Consensus Evaluation, at 37-39. As stated above, an agency may not give importance to specific factors, subfactors, or criteria beyond that which would reasonably be expected by offerors. Lloyd H. Kessler, Inc., supra, at 3. Here, based on the language in the solicitation, we conclude that offerors could not have known that the agency expected the offerors to focus their proposals on a particular number of the numerous standards listed in the FIMA Policy document. Further, the agency did not raise any of these issues with the protester during discussions. Accordingly, we conclude that the agency’s evaluation in this regard was inconsistent with the solicitation’s stated evaluation factors.

With regard to the national levee database, FEMA points to the term “levees” in the following objective listed under the solicitation’s performance objective 2, which stated the following:

Provide technical support to FEMA (headquarters and regional) and external stakeholders on complex technical Risk MAP issues such as levees, coastal, geographic information systems (GIS) data, the National Flood Hazard Layer (NFHL), dam safety, risk assessments, and post-preliminary processing.

Solicitation at 22. Based on this objective, FEMA contends that “[u]se and interaction with the National Levee Database (NLD) is a crucial part of providing technical support for relevant stakeholders, including the United States Army Corps of Engineers (the agency that developed the [national levee database]).” AR at 16. FEMA argues that providing support for the Army Corps of Engineers and national levee database is reasonably encompassed within the solicitation’s risk analysis and floodplain management technical support subfactor evaluation criteria to “provid[e] technical support for Risk Analysis issues.” Id. While FEMA acknowledges that the solicitation does not mention either the national levee database, or support for the Army Corps of Engineers, it asserts that information provided by the agency during the agency’s presentation at the FIMA Industry Day on August 22, 2013, identified the Army Corps of Engineers as a key stakeholder and advised offerors that they would be required to provide “analyses relating to levees” in their proposals. AR at 16. Accordingly, FEMA contends that its assessment of a weakness for RAMP’s proposal for its failure to address this issue was reasonable.

Here again, we conclude that FEMA’s evaluation relied on unstated evaluation criteria. With regard to the national levee database, although the solicitation included a performance objective to, among other things, “[p]rovide technical support to FEMA” on issues “such as levees,” this statement did not advise offerors that they were required to specifically address the “national levee database” nor did the solicitation reference the significance of the Army Corps of Engineers. Solicitation at 22. Although the agency asserts that information provided at the FIMA Industry Day establishes the required nexus between the solicitation’s stated evaluation criteria and the unstated criteria utilized by the evaluators during the evaluation, the information provided at the FIMA Industry Day was not thereafter incorporated into the solicitation, nor was this issue raised with the protester during discussions. See IBM Global Bus. Servs., B-404498, B-404498.2, Feb. 23, 2011, 2012 CPD ¶ 36 at 4 (sustaining protest where the agency applied an unstated evaluation criterion based on information provided in industry day briefing materials that was not thereafter incorporated into the solicitation).

Evaluation of Increasing Resilience Subfactor

Next, RAMP challenges the weakness its proposal received under the increasing resilience subfactor (subfactor 1A) for not demonstrating experience providing average annualized loss (AAL) studies. As relevant here, the solicitation stated that offerors would be evaluated under the increasing resilience subfactor, in part, based on their “experience . . . producing all elements of regulatory and non-regulatory Risk MAP products with high quality.” Solicitation at 7.

In response to this subfactor, RAMP’s proposal included examples of its delivery of quality regulatory and non-regulatory Risk MAP products, including [DELETED], as well as [DELETED]. AR, Tab C, RAMP Proposal, at 30.

The technical evaluators assessed several strengths to RAMP’s proposal, and assessed it a rating of satisfactory. AR, Initial Technical Consensus Evaluation, at 37. The evaluators, however, also assessed a weakness to RAMP’s proposal under this subfactor because it did not “demonstrate experience generating . . . annualized loss studies.” Id. at 38.

The agency argues that its assessment of the weakness was reasonable because the AAL studies were reasonably encompassed within the solicitation’s evaluation criteria. In this regard, similar to the agency’s position discussed previously regarding the scientific resolution panel and ice-jam analysis, FEMA points to the solicitation’s objective to “[p]roduce the full range of Risk MAP engineering and mapping elements (e.g. Hydrology and Hydraulics), ensuring compliance with FEMA standards.” Solicitation at 21. FEMA argues that the AAL studies are a “FEMA standard,” and that supporting this standard is reasonably encompassed within the solicitation’s evaluation criteria under the increasing resilience subfactor to “produc[e] all elements of regulatory and non-regulatory Risk MAP products with high quality.” AR at 15, 19. For the same reasons articulated above, we conclude that the solicitation did not reasonably advise offerors that they were required to address their experience with the AAL studies, nor was this unstated evaluation consideration reasonably subsumed within the solicitation’s stated evaluation criteria. We also sustain the protest on this basis.

Evaluation of Capacity to Accomplish the Work Required Factor

Finally, RAMP contends that FEMA applied an unstated evaluation criterion under factor 4, capacity to accomplish the work required, by assessing a weakness to RAMP’s proposal based on the quantity of [DELETED] in RAMP’s proposal. Specifically, RAMP argues that it was unreasonable for the agency to count the number of personnel listed under the [DELETED] labor category because the solicitation did not require offerors to break out personnel under specific labor categories, and RAMP’s joint venture partners included their proposed [DELETED] under broader labor categories.

As relevant here, the solicitation stated that the agency would evaluate an offeror’s “ability to provide staff to meet surge requirements necessary to provide post-disaster PTS services.” Solicitation at 9. The solicitation also stated that, although proposals must not discuss cost or price, information that could affect cost or price, “such as labor hours, labor categories, labor mix, or materials, must be discussed in sufficient detail in the proposal to allow the Agency to evaluate the contractor’s understanding of the work.” Id. at 5.

In response to this factor, RAMP’s proposal stated that, based on its experience as an incumbent on the Risk MAP, HMTAP, and TARC contracts, RAMP “estimates an annual required capacity of approximately [DELETED], or approximately [DELETED] on the new PTS A&E contract,” and that “[t]his level of staffing would require only [DELETED] staff that are experienced on Risk MAP, HMTAP, and TARC.” AR, Tab C, RAMP Proposal, at 46. In addition, RAMP detailed its experience meeting surge requirements in some of the largest disasters in U.S. history, including Hurricane Katrina, between 2004 and 2007, stating: “The [DELETED] during this time required over [DELETED] committed through FEMA disaster deployments and the HMTAP, TARC and FEMA Map Mod contracts.” Id. In addition, RAMP’s proposal stated that, “[i]n the extraordinary event of simultaneous major catastrophic disasters, RAMP has more than [DELETED] additional engineering and professional staff beyond our base of [DELETED] with the necessary technical qualifications to provide [DELETED].” Id.

FEMA assessed a satisfactory rating to RAMP’s proposal for this factor, stating: “The RAMP team is currently serving as incumbents on contracts for each of the programs in this procurement, and can meet existing demands.” AR, Tab D, Initial Technical Consensus Evaluation, at 44. As the record shows, page 44 of RAMP’s proposal demonstrates access to [DELETED], and page 28 demonstrates [DELETED]. AR, Tab C, RAMP Proposal, at 28, 44. The agency, however, identified a weakness for RAMP’s proposal under this factor, finding that “RAMP[’s] proposal introduces some surge capacity concerns for coastal tasks, as this issue was a concern during high-volume tasks after Hurricane Sandy.” Id. at 45. Specifically, the evaluators stated the following:

There are only [DELETED] listed in RAMP’s proposal. This level of coastal expertise will likely be strained to deliver day to day operations for this contract given the amount of coastal work anticipated (coastal concerns are Risk MAP’s highest risk). If a disaster were to strike that required coastal expertise, the RAMP team is at risk of not being able to respond and to operate in the capacity needed. This introduces a moderate risk of delays to disaster-response and continuing operations for coastal tasks.

Id.

RAMP challenges the weakness it received based on there being “only [DELETED] listed in RAMP’s proposal,” arguing that the solicitation did not require a listing of all [DELETED] that an offeror could employ to meet surge needs. Id. We agree.

While the solicitation provided for the evaluation of an offeror’s “ability to provide staff to meet surge requirements necessary to provide post-disaster PTS services,” it did not specify that offerors had to list their proposed [DELETED] in a labor category titled “[DELETED].” Solicitation at 9. As the record reflects, RAMP’s joint venture partners listed their proposed [DELETED] in labor categories, such as [DELETED]. See AR, Tab C, RAMP Proposal, Part II. For example, URS Group listed [DELETED], and stated it has experience performing [DELETED] work in the amount of [DELETED] per year. Id. Based on our review of the record, and the terms of the solicitation, we find unreasonable the agency’s assessment of a weakness for RAMP’s lack of [DELETED] because the agency focused solely on a specific labor category, not required by the solicitation, without analyzing the full extent of the staff proposed in RAMP’s proposal.

Prejudice

Considering that price is not a factor, and that RAMP’s proposal was ranked the third most highly-rated proposal even when improperly evaluated using the unstated criteria, we conclude that there is a reasonable possibility that RAMP’s proposal could be selected as one of the two most highly-rated proposals if the actual evaluation factors and subfactors are disclosed, and the protester is provided an opportunity to submit a proposal based upon the agency’s actual requirements. See Lloyd H. Kessler, Inc., B-284693, May 24, 2000, 2000 CPD ¶ 96 at 4-5. In addition, because FEMA did not provide any of the evaluation record reflecting the agency’s evaluation of the other offerors, we do not know whether the evaluation scores given to Compass and STARR were likewise improper due to the agency’s application of the unstated criteria to their proposals, or whether the evaluation was otherwise unequal. Moreover, the record provided by the agency shows that the selection decision was made based on the adjectival evaluation ratings, rather than a discussion of any specific strengths or weaknesses. See AR, Tab D, Initial Technical Consensus Evaluation, at 71. On this record, we conclude that RAMP has been prejudiced by the agency’s actions and sustain the protest.

OTHER EVALUATION ISSUES

Next, RAMP contends that FEMA's evaluation under several of the other subfactors was unreasonable. Specifically, RAMP challenges three weaknesses assessed to its proposal under the innovation and efficiency subfactor, arguing that the agency improperly based these weaknesses on two items included in RAMP's proposal that the protester did not intend to be considered as innovations under this subfactor. RAMP also challenges the agency's assigned ratings for several subfactors, arguing that they were inconsistent with the solicitation's definitions for those ratings. We have reviewed all of the protester's remaining arguments and conclude that none provides a basis to sustain the protest. We address one example below.

RAMP argues that FEMA unreasonably assigned three weaknesses to its proposal under the innovation and efficiency subfactor. As relevant here, the solicitation stated that proposals would be evaluated under this subfactor as follows:

The Offeror’s experience and proposed solution for providing continuous process improvement, and the likelihood of delivering significant, large scale efficiencies and innovation throughout the life of the contract for all production and technical services, including engineering, mapping, and MT-2/Letter of Map Revision (LOMR) services.

Solicitation at 8.

RAMP’s proposal included a list of [DELETED] in the innovations and efficiencies section of RAMP’s proposal. AR, Tab C, RAMP Proposal, at 38-39. In addition, in response to the solicitation’s scenario 1, RAMP’s proposal discussed using “our proven [DELETED]” to cost-effectively produce high-quality [DELETED] products. Id. at 53. In response to the solicitation’s scenario 2, RAMP’s proposal stated, in relevant part, that RAMP “will develop specialized tools, such as enhancing the [DELETED] we built for FEMA after Hurricane Sandy.” Id. at 54.

The technical evaluators assessed the following weaknesses for RAMP’s proposal under this subfactor:

The RAMP team did not reference any [technical assistance and research contracts] services in response to this sub-factor. Further, other proposed enhancements, such as adding current and future risk information to [DELETED] (page 38), have yet to be completed. RAMP noted the [DELETED] (page 57) as an innovation, but this tool is still under development [and] has not been delivered to FEMA nor deployed by FEMA.

AR, Tab D, Initial Technical Consensus Evaluation, at 42-43. The evaluators also noted that RAMP attempted to address these weaknesses during discussions by "highlighting their innovations officer and restating the innovations approach detailed in the proposal," and by stating that "in this environment you don't need whiz-bang extras, and we don't spend time chasing mythical silver bullet[s, but will] partner with you to look for technological advances and procedural enhancements to help deliver your programs better, faster, and cheaper." Id. The evaluators interpreted these statements as promoting improvements and a collaborative approach, rather than innovations, and concluded that "[w]hile continuous process improvement is critical, the solicitation was very clear in setting expectations of significant innovation and efficiencies." Id. Accordingly, the evaluators found that RAMP's response to discussions did not remove or address any of the weaknesses. Id.

With regard to the weaknesses based on RAMP's proposal to provide its [DELETED] and [DELETED], the protester argues that neither of these items was included in the list of [DELETED] described in the innovations and efficiencies section of RAMP's proposal. RAMP contends that the agency's evaluation under this subfactor should have focused solely on the innovations proposed by RAMP specific to this subfactor, and that the agency should not have looked to other sections of RAMP's proposal in considering innovations. FEMA responds that offerors were advised that their responses to the solicitation's three scenarios would be evaluated across all of the evaluation criteria, and that it was within the agency's discretion to consider relevant information under more than one evaluation factor. See Solicitation, Questions and Answers (Q&A), at 8, 11.

Based on this record, we find nothing unreasonable regarding the agency’s evaluation. As the agency notes, offerors were advised that responses to the scenarios were not specific to one factor, and that responses should be “as comprehensive as possible.” Id. The record also reflects that RAMP included [DELETED] and the [DELETED] as part of its proposed solutions provided in response to the solicitation’s scenarios. AR, Tab C, RAMP Technical Proposal, at 53-54. To the extent the protester contends that it was improper for the agency to consider information from RAMP’s response to the scenarios in its evaluation under this subfactor, this issue was apparent on the face of the solicitation, and therefore, this argument is untimely. Bid Protest Regulations, 4 C.F.R. § 21.2(a)(1) (2014) (protests based upon alleged improprieties in a solicitation which are apparent prior to the time set for receipt of initial proposals shall be filed prior to the time set for receipt of initial proposals).

CONCLUSION AND RECOMMENDATION

In sum, we sustain RAMP's protest because FEMA evaluated RAMP's proposal using unstated evaluation factors. We recommend that FEMA reevaluate proposals consistent with the discussion above. Alternatively, to the extent the unstated evaluation criteria reflect the agency's requirements, we recommend that FEMA amend the solicitation to advise offerors of the agency's requirements and intended evaluation approach. If the agency amends the solicitation, it should provide all offerors an opportunity to submit revised proposals, reevaluate proposals in a manner that is reasonable and consistent with the solicitation's evaluation criteria, and make a new selection of offerors for negotiation. We also recommend that RAMP be reimbursed its costs of filing and pursuing the protest. Bid Protest Regulations, 4 C.F.R. § 21.8(d)(1). The protester's certified claims for such costs, detailing the time expended and costs incurred, must be submitted directly to the agency within 60 days after receipt of this decision. 4 C.F.R. § 21.8(f)(1).

The protest is sustained in part, and denied in part.

Susan A. Poling
General Counsel



[1] RAMP is a joint venture between Dewberry Consultants, LLC, URS Group, Inc., and ESP Associates, PA.

[2] The offerors were limited to 5 pages for two of the scenarios, and to 300 words for each task description for the third scenario. Solicitation at 11-12.

[3] For all of the evaluation factors and subfactors other than factor 5, past performance, and factor 7, subcontracting plan, the TEP assessed the proposals as excellent, good, satisfactory, and unsatisfactory. Solicitation at 12-13. For past performance, the TEP rated the proposals as high, medium, poor, or neutral. Id. at 13. The TEP evaluated factor 7 on a pass/fail basis. Id. at 12.

[4] The record provided by the agency included only the overall rating for Compass, but not the factor and subfactor ratings. However, as discussed below, because the agency selected two firms for negotiations, prejudice is established in this case based solely on the ratings of the second highest-rated firm, STARR.

[5] RAMP also argues that the agency failed to adequately document its rationale for the evaluation and for its decision to enter into negotiations with Compass and STARR. Because we find that the agency otherwise applied unstated criteria in evaluating RAMP’s proposal, we need not address this issue in detail. We recommend, however, that FEMA ensure that any revised evaluations adequately document the basis for the agency’s judgments.

[6] RAMP also challenges a weakness it received under this subfactor for not sufficiently demonstrating experience providing high-quality non-regulatory data. Based on our review of the record, we find that the protester’s arguments provide no basis to sustain the protest.

[7] For example, with regard to risk analysis technical support, RAMP detailed its experience with [DELETED]. AR, Tab C, RAMP Proposal, at 31-32. With regard to floodplain management technical support, RAMP's proposal provided information regarding RAMP's experience with [DELETED]. Id.
