DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: Solers Inc.

File: B-409079; B-409079.2

Date: January 27, 2014

Patricia H. Wittie, Esq., Karla J. Letsche, Esq., and Daniel J. Strouse, Esq., Wittie, Letsche & Waldo, LLP, for the protester.
David S. Cohen, Esq., Norah D. Molnar, Esq., John J. O’Brien, Esq., Gabriel E. Kennon, Esq., Amy J. Spence, Esq., and Nicole Picard, Esq., Cohen Mohr LLP, for the intervenor.
LaVette Lyas-Webster, Esq., Department of Defense, Defense Information Systems Agency, for the agency.
Christopher L. Krafchek, Esq., and Jonathan L. Kang, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Challenge to the evaluation of the realism of the awardee’s proposed price and cost is sustained where the record does not demonstrate that the agency’s evaluation was reasonable.

2. Challenge to the evaluation of the protester’s technical approach is sustained where the contemporaneous record does not support the agency’s explanation for its assessment of the merits of the protester’s proposal.

DECISION

Solers, Inc., of Arlington, Virginia, protests award of a task order to Data Systems Analysts, Inc. (DSA), of Fairfax, Virginia, by the Department of Defense, Defense Information Systems Agency (DISA), under request for proposals (RFP) No. E200549.00 for the development and sustainment of the Defense Industrial Base Network (DIBNet). The protester argues that the agency failed to reasonably evaluate the awardee’s proposed price and cost, and the protester’s technical proposal.

We sustain the protest.

BACKGROUND

DISA issued the RFP on April 25, 2013, as a small business set-aside that sought proposals from firms holding one of the ENCORE II multiple-award indefinite-delivery/indefinite-quantity (ID/IQ) contracts.[1] RFP at 1. DIBNet is a web-based tool that enables program participants to exchange classified and unclassified cyber threat information on a daily basis. Agency Report (AR) at 3. The RFP required offerors to propose the following systems engineering and technical assistance services in support of the DIBNet program: custom code development and maintenance; requirements analysis; information assurance analysis; certification and accreditation support; integration of commercial off-the-shelf products and services; quality assurance system evaluation; developmental testing; installation support; documentation; and system performance analysis. RFP, Performance Work Statement (PWS), at 5.

The RFP anticipated award of a task order with fixed-price and cost-plus-award fee contract line item numbers (CLINs), for a base period of 1 year with four 1-year options. RFP at 2. The RFP stated that proposals would be evaluated based on the following factors: (1) technical/management approach; (2) past performance; and (3) price/cost.[2] The technical/management approach factor had the following four subfactors: (1) architecture and engineering; (2) DIBNet testing; (3) DIBNet deployment; and (4) management approach. Id. at 2-3. For purposes of award, the RFP stated that the technical/management factor was more important than the past performance factor, and that the non-price/cost factors, when combined, were “significantly more important” than price/cost. Id. at 2.

As relevant here, the RFP stated that offerors’ price/cost proposals would be evaluated for reasonableness and completeness. Id. at 3. With regard to the cost-reimbursement CLINs, the RFP stated that the agency would evaluate the realism of the proposed costs. Id. With regard to the fixed-price CLINs, the RFP stated that the agency “reserves the right to conduct a realism analysis of the offeror’s proposed price.” Id. at 3-4. The RFP required offerors to “ensure price proposals include detailed information regarding the resources required to accomplish the task . . . [including] labor categories, labor hours, number of employees for each labor category, rates, travel, and incidental equipment.” Id.

DISA received proposals from four offerors, including DSA and Solers, by the closing date of May 31. AR at 13. The agency evaluated each offeror’s proposal and conducted discussions. Following discussions, the agency requested final proposal revisions (FPRs).

DISA evaluated the offerors’ FPRs and revised the cost evaluation and technical ratings. As relevant here, the final technical ratings and cost evaluations were as follows:

                                     DSA                       Solers

TECHNICAL/MANAGEMENT
  Architecture and Engineering       Acceptable                Acceptable
  DIBNet Testing                     Acceptable                Acceptable
  DIBNet Deployment                  Acceptable                Acceptable
  Management Approach                Acceptable                Acceptable

PAST PERFORMANCE                     SUBSTANTIAL CONFIDENCE    SUBSTANTIAL CONFIDENCE

EVALUATED PRICE/COST                 $25,789,312               $35,700,944

AR, Tab 10, Final Selection Recommendation Document (SRD), at 1, 11, 15.

The contracting officer (CO), who also served as the source selection authority for the procurement, concluded that each offeror met the requirements of the solicitation and merited ratings of acceptable. AR, Tab 11, Price Negotiation Memorandum, at 8. The CO concluded that no offeror provided any strengths that exceeded the solicitation requirements, and that award should be made based on DSA’s overall low price/cost. Id.; see also Tab 10, Final SRD, at 1, 19. The agency advised Solers that DSA was selected for award and provided a debriefing on September 30. This protest followed.

DISCUSSION

Solers argues that DISA unreasonably evaluated the realism of the awardee’s proposed price/cost and the merits of the protester’s technical proposal. As discussed below, we conclude that the record of the agency’s evaluation of the offerors’ price/cost proposals is inadequate, and thus does not demonstrate that the evaluation was reasonable. We also conclude that the agency’s evaluation of the protester’s technical proposal, as described in the agency’s response to the protest, was based on non-contemporaneous judgments. We therefore sustain these protest arguments.

Cost/Price Realism Evaluation

Solers argues that DISA’s evaluation of DSA’s price/cost proposal did not reasonably evaluate the realism of the awardee’s proposed labor mix (that is, appropriate staffing of labor categories to perform the work) for the cost-reimbursement and fixed-price CLINs, or the adequacy of its proposed level of effort (that is, the labor hours and full-time equivalent (FTE) personnel) for the fixed-price CLINs. Supp. Protest at 4, 13. For the reasons discussed below, we conclude that neither the contemporaneous evaluation, nor the agency’s testimony at a hearing convened by our Office to address gaps in the contemporaneous record, adequately explains how the agency evaluated the realism of DSA’s price/cost proposal.[3] For this reason, we sustain the protest.

When an agency evaluates a proposal for the award of a cost-reimbursement contract, or portion of a contract (such as a cost-reimbursement CLIN), an offeror’s proposed costs are not dispositive because, regardless of the costs proposed, the government is bound to pay the contractor its actual and allowable costs. Federal Acquisition Regulation (FAR) §§ 15.305(a)(1); 15.404-1(d); Palmetto GBA, LLC, B‑298962, B-298962.2, Jan. 16, 2007, 2007 CPD ¶ 25 at 7. Consequently, the agency must perform a cost realism analysis to determine the extent to which an offeror’s proposed costs are realistic for the work to be performed. FAR § 15.404‑1(d)(1). An agency’s cost realism analysis need not achieve scientific certainty; rather, the methodology employed must be reasonably adequate and provide some measure of confidence that the rates proposed are reasonable and realistic in view of other cost information reasonably available to the agency as of the time of its evaluation. See SGT, Inc., B-294722.4, July 28, 2005, 2005 CPD ¶ 151 at 7. Our review of an agency’s cost realism evaluation is limited to determining whether the cost analysis is reasonably based. Jacobs COGEMA, LLC, B‑290125.2, B-290125.3, Dec. 18, 2002, 2003 CPD ¶ 16 at 26.

As a general matter, when awarding a fixed-price contract, or portion of a contract (such as a fixed-price CLIN), an agency is only required to determine whether the offered prices are fair and reasonable, that is, whether proposed prices are too high. FAR § 15.402(a). A price realism evaluation, in contrast, applies cost realism analysis techniques to fixed prices, and is intended to evaluate whether proposed prices are too low by assessing an offeror’s understanding of the requirements. FAR § 15.404-1(d)(3); Ball Aerospace & Tech. Corp., B-402148, Jan. 25, 2010, 2010 CPD ¶ 37 at 8. Where, as here, an agency states in a solicitation that it “reserves the right” to conduct a price realism analysis, the decision to conduct such an analysis is a matter within the agency’s discretion. Guident Techs., Inc., B‑405112.3, June 4, 2012, 2012 CPD ¶ 166 at 13 n.9. However, where an agency elects to conduct a price realism evaluation, we will review that evaluation for reasonableness. See Science Applications Int’l Corp., B‑407013, Oct. 19, 2012, 2012 CPD ¶ 308 at 5-6.

As discussed above, the solicitation here required offerors to propose their price/cost based on cost-reimbursement and fixed-price CLINs. DISA evaluated each offeror’s proposed price/cost to “determine if it is reasonable, realistic, and complete.” AR, Tab 12A, Cost/Price Report, at 7, 13. The agency considered the offerors’ direct labor rates, direct labor hours, indirect rates, travel and other direct costs, subcontractor costs, and fee. Id. at 7‑8, 13-14. The agency’s price/cost analyst stated that he reviewed the offerors’ direct labor rates for the cost-reimbursement and fixed-price CLINs, and their proposed indirect costs, and concluded that these price/cost elements were realistic for DSA and Solers. Decl. of Price/Cost Analyst (Dec. 9, 2013) at 2. The price/cost analyst stated that the evaluation of the realism of offerors’ proposed labor mix and level of effort was performed at his request by the technical evaluation team. Id. at 3. During the hearing, the agency provided the testimony of a member of the technical evaluation team who was responsible for this evaluation. This witness explained that the technical evaluators analyzed the adequacy of the offerors’ proposed labor mix and level of effort, found that both offerors’ proposals were realistic, and documented these findings in a spreadsheet which was attached to the cost/price evaluation. Hearing Transcript (Tr.) at 131:1-132:2.

The agency’s evaluation provided the following summary of the offerors’ proposed costs, prices, labor hours, and FTE staff as compared to the independent government cost estimate (IGCE):

                           IGCE         DSA          SOLERS

Labor Hours
  Fixed-price              [Deleted]    [Deleted]    [Deleted]
  Cost-reimbursement       [Deleted]    [Deleted]    [Deleted]
  TOTAL                    [Deleted]    [Deleted]    [Deleted]

Labor Price/Cost
  Fixed-price              [Deleted]    [Deleted]    [Deleted]
  Cost-reimbursement       [Deleted]    [Deleted]    [Deleted]
  TOTAL                    [Deleted]    [Deleted]    [Deleted]

Proposed FTEs
  Fixed-price              [Deleted]    [Deleted]    [Deleted]
  Cost-reimbursement       [Deleted]    [Deleted]    [Deleted]
  TOTAL FTEs               [Deleted]    [Deleted]    [Deleted]

Other Direct Costs         [Deleted]    [Deleted]    [Deleted]

Total Price/Cost           [Deleted]    $25.8M       $35.7M

AR, Tab 10, Final SRD, at 17; Tab 12B, FTE Summaries.

Solers argues that, despite the disparities between the agency’s estimates and DSA’s proposed labor mix and level of effort, DISA failed to reasonably evaluate the realism of the awardee’s proposed labor mix for the cost-reimbursement and fixed-price CLINs, or the adequacy of its proposed level of effort for the fixed-price CLINs. Specifically, the protester argues that the record does not show how the agency concluded that DSA’s proposed labor mix was realistic for the cost-reimbursement and fixed-price CLINs, and also argues that DSA’s proposed level of effort for the fixed-price CLINs was unrealistically low.[4] As discussed below, we conclude that the record does not demonstrate how the agency found that DSA’s proposed price/cost was realistic for the work to be performed.

As an initial matter, DISA notes that the solicitation did not obligate it to conduct a price realism evaluation, and therefore contends that the agency was not required to evaluate the realism of the fixed-price CLINs with regard to labor mix or level of effort. Supp. AR at 37. As discussed above, however, when an agency elects to conduct a price realism evaluation, that evaluation must be reasonable. See Science Applications Int’l Corp., supra. Here, the agency acknowledges that it evaluated both Solers’ and DSA’s fixed-price CLINs for realism with regard to the proposed labor rates, Tr. at 19:13‑18, and also acknowledges that price realism was evaluated regarding the “entire evaluated price of the effort,” Decl. of Price/Cost Analyst (Dec. 9, 2013) at 2. To the extent that the agency contends that it did not perform any other price realism analyses, and that its evaluation did not rely upon conclusions regarding the realism of DSA’s proposed labor mix or level of effort, the record does not support this argument.

As discussed above, the price/cost analyst evaluated the realism of the offerors’ labor rates; he explained in his testimony that he did not evaluate the realism of the offerors’ proposed labor mix or level of effort. Tr. at 39:4-12. Instead, the evaluation of labor mix and level of effort was performed by the technical evaluation team. The agency’s witness for the technical evaluation team who performed the evaluation of labor mix and level of effort explained that her evaluation considered whether the proposed prices and costs were realistic. See id. at 131:1-132:2. Furthermore, the record shows that the agency’s conclusions regarding the realism of DSA’s price/cost were based on the overall proposed labor mix and level of effort--the agency’s evaluations did not distinguish between the cost-reimbursement and fixed-price CLINs. See AR, Tab 11, Price Negotiation Memorandum, at 6; Tab 12A, Cost/Price Evaluation, at 7-8. Specifically, the agency’s evaluation cited DSA’s overall price/cost, hours, and FTEs for the combined cost-reimbursement and fixed-price CLINs, and concluded that the awardee’s proposal was realistic. See id. On this record, we conclude that the agency’s evaluation of DSA was based on the conclusion that its proposed price for the fixed-price CLINs was also realistic with regard to its labor mix and level of effort.

With regard to the realism evaluations here, DISA correctly notes that a cost or price realism evaluation must consider the unique technical approaches proposed by each offeror, and that, to the extent that an agency concludes that an offeror’s proposed costs are realistic for its technical approach, such an evaluation may be reasonable despite differences as compared to other offerors or a government estimate. See FAR § 15.404-1(c)(1); Systems Techs., Inc., B-404985, B‑404985.2, July 20, 2011, 2011 CPD ¶ 170 at 5. Here, however, neither the contemporaneous record provided by DISA, nor the testimony provided by agency witnesses, demonstrates how the agency evaluated the offerors’ technical approaches for the purpose of determining the realism of the proposed labor mix or level of effort.

DISA argues that it reasonably evaluated DSA’s proposed labor mix and level of effort because the agency issued discussion questions to both DSA and Solers that required each to provide more detail concerning its proposed labor mix and level of effort. Supp. AR at 25. The agency further notes that a report prepared by the technical evaluators found that DSA’s labor mix and level of effort were realistic for its proposed technical approach. See AR, Tab 12B, Task Monitor Technical Questionnaire, at 2.

As relevant here, five of the agency’s discussion questions, in the form of evaluation notices (EN), requested that DSA address its proposed labor mix and level of effort. See AR, Tab 6A, EN, DSA-MGMT-001, at 23-25; DSA-MGMT-002, at 26-28; DSA-MGMT-06, at 38-39; DSA-COST-002, at 44-46; and DSA-COST-003, at 47-49. For example, the agency advised DSA that “[i]t is not clear that the [o]fferor understands the complexity and scope of 3.2 Task Area 5--Requirements Analysis with the labor mix that has been presented in the cost proposal (used by the Technical Evaluation Team).” AR, Tab 6A, EN, DSA-MGMT-003, at 29. None of the discussion questions, however, shows how the agency evaluated the proposed labor categories, hours, or FTEs; rather, each only expresses the agency’s concern regarding the particular aspect of DSA’s proposal that was deemed inadequate. Moreover, the record does not show how the agency evaluated DSA’s responses to the ENs, that is, how the agency concluded that the proposed labor mix and level of effort were realistic. See AR, Tab 12A, Cost/Price Evaluation Report, at 8; Tab 10, Final SRD, at 17-18; Tab 20, Interim SRD, at 36.

Similarly, the agency’s cost report provides only the following general statements that DSA’s proposed costs were realistic:

The Price/Cost Analyst discussed the technical evaluation with [the contracting officer’s representative (COR)] and explained that the realism evaluation had to be based on the offeror’s specific approach. Per this discussion and the technical realism analysis, the COR found that the proposed labor mix and numbers of hours are reasonable and realistic for this effort. Therefore, the Price/Cost Analyst took no exception to the proposed hours and no adjustments were required.

AR, Tab 12A, Cost/Price Evaluation Report, at 7-8.

As relevant here, the questionnaire referenced above set forth the following questions and responses:

                                                                         Yes    No

Direct Labor

1. Is the proposed mix/type of labor effort appropriate for the          X
   IT support to be performed?

2. Are the proposed number of hours/percents of effort reasonable        X
   for the IT Support to be performed?

AR, Tab 12B, Task Monitor Technical Questionnaire, at 1. Apart from the “yes” responses, however, the document did not contain any analysis supporting these conclusions.

During the hearing conducted by our Office to further develop the record, the source selection evaluation board (SSEB) chair, the price/cost team lead, the CO, the COR, and the technical evaluator provided testimony concerning the agency’s price and cost realism evaluations. Of these witnesses, only the technical evaluator was involved with the evaluation of the offerors’ proposed labor mix and level of effort. Tr. at 86:7-87:11.

The technical evaluator explained that her evaluation of the offerors’ proposed labor mixes and levels of effort relied upon her own knowledge and personal experience to judge whether the offerors’ staffing was realistic to perform the PWS requirements. Id. at 81:11-82:13. The technical evaluator stated that she evaluated each offeror’s proposal, and then discussed her findings with the other technical evaluators in order to reach a consensus. Id. at 72:5-21. She explained that when the evaluators had concerns regarding the realism of an offeror’s proposal, they issued an EN and asked the offeror to further address the matter. Id. at 72:21-73:9. If the offeror’s response to the EN satisfied the evaluators’ concerns, the evaluators simply noted that the EN had been resolved and that there were no more questions. Id.; see AR, Tab 12A, Cost/Price Evaluation Report, at 7-8. As discussed above, however, the contemporaneous evaluation record does not explain how the evaluators concluded that the concerns raised in the ENs had been resolved, or the basis for the agency’s conclusions that DSA’s proposed price/cost was realistic.

Moreover, the technical evaluator’s testimony did not describe in any meaningful detail the basis for any of the evaluator’s conclusions. Instead, when asked to describe with specificity how the evaluators resolved concerns regarding the realism of DSA’s proposed labor mix and level of effort, she merely reiterated that she personally reviewed the proposals and then conferred with the other evaluators.[5] See id. at 76:1-79:3, 80:3-81:10, 84:9-87:11, 94:19-98:19, 104:7-107:9, 115:17-121:5, 126:14-128:18, 130:22-133:20, 142:18-143:16, 150:6-158:22.

In addition to the lack of an adequate record concerning the agency’s realism analysis, the technical evaluator also explained that her review of DSA’s proposed labor mix relied on her own assumptions concerning how the offeror would perform the work. For example, the technical evaluator testified that if she found that the awardee proposed too many personnel for one labor category, and too few in another labor category, she assumed that the awardee would be able to account for the shortfall by reassigning the excess personnel. See id. at 105:1‑106:13. The technical evaluator, however, did not point to anything in the awardee’s proposal that supported her assumption that such substitutions could account for the evaluated shortfalls. In this regard, the technical evaluator could not explain her basis for concluding that, if DSA’s proposal stated that it would need a particular number of personnel to perform a task, DSA would not actually need that level of personnel and could devote them to other areas of its technical approach (for example, areas that the technical evaluators felt were understaffed). See id. at 108:8-110:20. Because the technical evaluator’s assumptions in this regard lacked a reasonable basis, we conclude that her apparent reliance on such assumptions also renders her evaluation of the adequacy of DSA’s labor mix unreasonable.

For the reasons discussed above, we find that neither the contemporaneous record nor the hearing testimony provides a basis for our Office to find that DISA reasonably evaluated the realism of DSA’s proposed labor mix for the cost-reimbursement and fixed-price CLINs, or the realism of its proposed level of effort for the fixed-price CLINs. See TriCenturion, Inc.; SafeGuard Servs., LLC, B‑406032 et al., Jan. 25, 2012, 2012 CPD ¶ 52 at 17. To the extent that the CO relied on the judgment of the technical evaluation board members in concluding that DSA’s proposed labor mix and level of effort were realistic, the record does not show how they reached their judgments or whether they were reasonable. See id.

We specifically note here that our conclusions are based on the inadequacies of the contemporaneous record, as produced by DISA. DISA was provided multiple opportunities to ensure that the record was complete. GAO specifically requested additional information to address the lack of an adequate record, but none was provided. In addition, GAO provided the agency and the other parties an opportunity before the hearing to submit additional documents for the purpose of establishing a clear record for the hearing and for the ultimate resolution of the protest. GAO Confirmation of Hearing Notice (Jan. 4, 2014) at 3. Based on this record, we sustain the protester’s challenges to the adequacy of DISA’s evaluation of the realism of DSA’s proposed labor mix for the cost-reimbursement and fixed-price CLINs.

Technical/Management Approach Evaluation

Solers also argues that DISA’s evaluation of its technical proposal was unreasonable because the agency failed to recognize various aspects of the proposal that exceeded the solicitation requirements. The agency’s response to the protest acknowledges that DISA viewed numerous areas of Solers’ proposal as being “advantageous” because they exceeded the solicitation requirements. See AR at 41; Supp. AR at 2-3. However, the agency contends that none of these areas constituted strengths that would distinguish Solers’ proposal from DSA’s because the advantages of Solers’ proposal were not “quantifiable.” See id. For the reasons discussed below, we conclude that the record does not adequately explain the basis for the agency’s evaluation, and we therefore sustain the protest.

As discussed above, the solicitation stated that DISA would assign adjectival ratings to proposals under each of the first three subfactors of the technical/management approach factor. The solicitation provided that the first three technical subfactors would be evaluated to determine if the offeror “proposed an approach to fully meet or exceed” the applicable task in the PWS. RFP at 2. Moreover, the RFP advised offerors that their proposals would be assigned adjectival ratings which would be based on the assessment of strengths and weaknesses. RFP, Evaluation Tables, at 1-2. As also discussed above, the agency did not assign any strengths or weaknesses to Solers’ or DSA’s proposals, and assigned both offerors ratings of acceptable for all of the technical approach evaluation subfactors. AR, Tab 10, Final SRD, at 1. Based on these evaluations, the agency found that the offerors were equivalent under the non-price/cost evaluation. Id. at 19.

Solers argues that its proposal exceeded the PWS requirements in numerous areas, and that the agency unreasonably failed to assign its proposal strengths for these features. Protest at 11-13. In response to the protest, DISA provided a declaration by the SSEB chair, which explained that although Solers’ proposal exceeded certain of the PWS requirements, the proposal did not merit any strengths. AR, Tab 18, SSEB Chair Decl. (Nov. 15, 2013), at 2-4. Specifically, the SSEB chair explained that, with regard to five areas of Solers’ proposal, the technical evaluators agreed that the proposal was “advantageous” and exceeded the PWS requirements. Id. The SSEB chair stated, however, that Solers’ proposal did not merit a strength for any of these areas because the benefits and advantages provided to the government were not “quantifiable.” Id.

In his testimony at the hearing, the SSEB chair explained that by “quantifiable,” he meant “something tangible that can [be] put into a contract that [the government] can get back from a contractor, that [the government] can put [a] value on that's going to save time, reduce risk or save money.” Tr. at 219:11-15. For this reason, the SSEB chair explained, in order to receive a strength, a proposal was required to demonstrate both that it exceeded the requirements of the PWS and that it provided a quantifiable benefit, that is, a benefit that can be expressed in “a tangible way.” Id. at 216:19-217:15.

As the SSEB chair’s hearing testimony showed, however, the analysis reflected in his declaration concerning the evaluation of Solers’ proposal was made in response to the protest, and did not reflect the agency’s contemporaneous evaluation judgments. Specifically, the SSEB chair acknowledged that he did not review the offerors’ proposals. Id. at 228:15-20. The SSEB chair further acknowledged that he did not discuss any of the alleged strengths with the technical evaluators prior to award. Id. at 231:12-14; 233:5-16. Instead, the SSEB chair first discussed the alleged strengths with the evaluators for the purpose of preparing his declaration in response to the protest. Id. at 230:10-231:14; 234:7-11.

Significantly, the SSEB chair stated that the two-part assessment of whether a particular feature of Solers’ proposal merited a strength--that is, whether it both exceeded the PWS requirements and provided a quantifiable benefit--was not a contemporaneous judgment. The SSEB chair stated that although the technical evaluators advised him in response to the protest that the potential strengths identified by the protester exceeded the PWS requirements, they did not give him any advice or opinion as to whether those advantages were “quantifiable.” Id. at 235:4-19. Instead, the judgment as to whether the advantageous aspects of Solers’ proposal provided a quantifiable benefit was made by the SSEB chair for the first time in his preparation of the declaration responding to the protest. Id. at 232:9-233:4; 234:12-235:3.

In sum, DISA’s response to the protest specifically acknowledged that the evaluators believed that certain areas of the protester’s proposal exceeded the requirements of the PWS; despite these judgments, the agency states that these areas did not merit a strength because there was no quantifiable benefit. As the record above shows, however, the analysis as to whether Solers’ proposal provided quantifiable benefits was not made by the evaluators prior to the award, but was instead made by the SSEB chair in response to the protest. Where, as here, an agency offers an explanation of its evaluation during the heat of litigation that is not borne out by the contemporaneous record, we give little weight to the later explanation. CIGNA Gov’t Servs., LLC, B-401062.2, B-401062.3, May 6, 2009, 2010 CPD ¶ 283 at 6; Boeing Sikorsky Aircraft Support, B-277263.2, B-277263.3, Sept. 29, 1997, 97-2 CPD ¶ 91 at 15. We therefore conclude based on the record here that the agency’s evaluation lacks a reasonable basis.[6]

In addition to our concerns regarding DISA’s reliance on a non-contemporaneous analysis to respond to the protest, we also have concerns regarding the definition of a strength cited by the SSEB chair. As the parties agree, the RFP did not define the term strength, and instead advised offerors that proposals would be evaluated as to whether they “fully meet or exceed” the PWS requirements. RFP at 2. Moreover, the RFP advised offerors that their proposals would be assigned adjectival ratings which would be based on the assessment of strengths and weaknesses. Id., Evaluation Tables, at 1-2. The hearing testimony provided by the SSEB chair and the CO did not provide a clear definition of the term “quantifiable benefit,” and it is not clear, based on the record, what kind of benefit would be viewed as quantifiable. For this reason, we think that the agency should consider amending the solicitation to more clearly advise offerors of the type of advantages the government is seeking under their technical approaches.

We also find that Solers was prejudiced by the agency’s improper evaluation. Because the record does not demonstrate that the agency reasonably considered the relative merits of the offerors’ proposals, we have no basis to conclude whether such a review would or would not have changed the agency’s view of this competition. See Kellogg, Brown & Root Servs., Inc.--Recon., B-309752.8, Dec. 20, 2007, 2008 CPD ¶ 84 at 5. Accordingly, we sustain the protest on this basis.

CONCLUSION AND RECOMMENDATION

For the reasons discussed above, we conclude that DISA unreasonably evaluated the realism of the awardee’s proposed labor mix for the cost-reimbursement and fixed-price CLINs, and the adequacy of its proposed level of effort for the fixed-price CLINs; we also conclude that the agency unreasonably evaluated the protester’s proposal under the technical/management approach factor. We recommend that DISA reevaluate these proposals, consistent with our decision, and conduct discussions and obtain revised proposals if appropriate. If the agency believes that the solicitation does not clearly advise offerors of the type of advantages the government is seeking under their technical approaches, the agency should amend the RFP and provide offerors an opportunity to submit new proposals. The agency also should make a new award decision, supported by adequate documentation.

We also recommend that DISA reimburse Solers the costs of filing and pursuing its protest, including reasonable attorney’s fees. 4 C.F.R. § 21.8(d)(1) (2013). The protester should submit its certified claim for costs, detailing the time expended and cost incurred, directly to the contracting agency within 60 days after receipt of this decision. 4 C.F.R. § 21.8(f)(1).

The protest is sustained.

Susan A. Poling
General Counsel

 



[1] Although the solicitation anticipated the issuance of a task order under an ID/IQ contract, the solicitation was issued as an RFP and the evaluation record primarily refers to “offerors” and “proposals.” For the sake of consistency, and because the distinction between a quotation and a proposal has no bearing on our analysis in this protest, we use the terms offerors and proposals in this decision.

[2] The record uses the terms price/cost and cost/price interchangeably.

[3] Our Office conducted a hearing on January 8, 2014, to further develop certain protest issues, at which the CO, the source selection evaluation board chair, the cost/price team lead, the CO’s representative, and a technical evaluator provided testimony.

[4] Solers states that, because it proposed a similar number of hours for the cost-reimbursement CLINs to DSA, the protester does not challenge this aspect of the awardee’s proposal or the agency’s evaluation. Supp. Protest (Nov. 29, 2013) at 11 n.4. The protester also does not challenge the agency’s evaluation of the offerors’ direct labor rates or indirect costs.

[5] Although the technical evaluator did not provide specific details regarding her evaluation, she stated that she took notes which described the basis for her conclusions, some of which she retained. Tr. 82:14-83:3, 100:4-18. The agency did not provide these documents in response to the protest.

[6] In contrast to the SSEB chair’s testimony, the CO testified that she recalled speaking with the SSEB chair and the evaluation teams prior to the award decision about whether the offerors’ proposals, including Solers’, warranted strengths. Tr. at 263:17-264:2. The CO suggested that certain of the potential strengths cited by Solers in its protest may have been brought to her attention by the evaluators as potential strengths, but that none were adequately supported. Id. at 265:8-20. The CO, however, could not recall what strengths were discussed, and was not certain whether the strengths cited in Solers’ protest were the same as those discussed with the SSEB. Id. at 264:3-8; 286:2-5. Moreover, the CO stated that her recollection was based in part on a draft version of the SRD, which was not provided to our Office or the parties in connection with the protest. Id. at 282:11-14. On this record, and without any contemporaneous documentation of the assessment, we find that the CO’s testimony does not support the reasonableness of the agency’s evaluation.
