The SI Organization, Inc.

B-410496,B-410496.2 Jan 07, 2015

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:      The SI Organization, Inc.

File:                B-410496; B-410496.2

Date:              January 7, 2015

Mary Beth Bosco, Esq., Elizabeth M. Gill, Esq., and Elizabeth N. Jochum, Esq., Holland & Knight, LLP, for the protester.
Claude P. Goddard, Jr., Esq., Daniel J. Donohue, Esq., Grace O. Aduroja, Esq., and Walter A.I. Wilson, Esq., Polsinelli PC, for TASC, Inc., the intervenor.
Jill A. O'Connor, Esq., National Geospatial-Intelligence Agency, for the agency.
Pedro E. Briones, Esq., and Nora K. Adkins, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest challenging agency’s cost, technical, and past performance evaluations is denied where the record demonstrates that the agency’s conclusions were reasonable.

2.  Protest challenging agency’s best value tradeoff determination is denied where the record demonstrates that the agency’s tradeoff decision was reasonable and in accordance with the terms of the solicitation.

DECISION

The SI Organization, Inc., (SI)[1] of Chantilly, Virginia, protests the award of a contract to TASC, Inc., also of Chantilly, under request for proposals (RFP) No. HM1572-13-R-0007, issued by the National Geospatial-Intelligence Agency (NGA) for an applications operations service provider.  SI argues that the agency’s evaluation of technical, past performance, and cost proposals was unreasonable.  The protester also asserts that the agency’s best value award decision is unreasonable and not consistent with the terms of the solicitation.

We deny the protest.

BACKGROUND

The RFP provided for the award of a cost reimbursement contract, with cost-plus-award-fee (CPAF) and cost-reimbursable contract line items (CLINs), for a base year and 3 option years.  RFP at 11, 66.  The solicitation sought the services of an applications operations service provider (AOSP) to identify, pre-screen, negotiate prices (with other vendors), recommend, and deliver applications (or apps) for inclusion in the agency’s geospatial intelligence (GEOINT) app store.  Statement of Objectives (SOO) at 1.  The AOSP contractor, in essence, serves as a “trusted broker” of mobile, web, and desktop apps for the agency in support of its national security missions.  Id. at 4; RFP at 2.

The solicitation stated that award would be made on a best‑value basis considering the following evaluation factors (listed in descending order of importance):  technical/management, past performance, security, and cost.  RFP at 111, 113.  The RFP stated that cost is significantly less important than all non-cost factors combined.  Id. at 113.  The technical/management factor included three subfactors:  management and technical approach; business and contract administration; and testing and evaluation services.  Id.  The past performance factor included two subfactors:  technical past performance and management past performance.  Offerors were instructed to submit proposals in separate volumes for each evaluation factor.  Id. at 76.

In their technical/management proposals, offerors were to address, among other things, their approach to providing AOSP services, including personnel and resources needed to carry out the SOO, and include a technical basis‑of‑estimate (BOE), without costs, for the offeror’s proposed labor/skill mix.  See id. at 78, 81.  Under the management and technical approach subfactor (no. 1.1), offerors were to provide a staffing plan, identify key personnel skills and labor hours, and describe the workflow for their marketing strategy, among other things.  See id. at 79.   Under the business and contract administration subfactor (no. 1.2), offerors were to propose a marketing strategy for reaching the most, and best, app vendors, and describe their methodology for estimating and negotiating a per‑download price with potential app vendors.  See id. at 79‑80.  Under the testing and evaluation services subfactor (no. 1.3), offerors were to address their approach to pre‑screening, testing, and evaluating apps, including managing a collaborative testing environment, and cite at least one recent example of their experience in that regard.  See id. at 80‑81.  The RFP stated that the agency would evaluate an offeror’s understanding of the program requirements and whether the offeror demonstrated a technically sound approach with relevant experience, and assess a technical risk rating based on evaluated technical weaknesses.[2]  Id. at 115, 119.

In their past performance proposals, offerors were to provide information and client questionnaires for at least three (but no more than five) projects, similar in scope, magnitude, and complexity to the requirement, and performed within the last 3 years.[3]  Id. at 85-86.  The RFP stated that the agency would evaluate the relevance, magnitude, and scope of past performance, including for an offeror’s proposed subcontractors, and assign a relevancy rating for each project and an overall past performance confidence rating.  Id. at 115-16.

In their cost proposals, offerors were to propose an estimated cost and award fee for AOSP services (CLIN 0001).  The solicitation provided not-to-exceed values for the two cost-reimbursable CLINs--pass through (i.e., payment) to app vendors based on downloads and apps usage (CLIN 0002), and travel (CLIN 0003)--and instructed offerors to insert these values in their cost proposal as plug numbers for the respective CLINs.  Id. at 2-3, 87-88.  Offerors were also to provide a cost for each SOO requirement; supporting cost or pricing data for each CLIN and for various cost elements (such as direct labor and overhead); and detailed descriptions of the offeror’s BOE, assumptions, and business strategies, among other things.  Id. at 89‑93.  Moreover, offerors were to provide a copy of any existing forward pricing rate agreement, including cognizant audit agency rate recommendations.  See id. at 91.

The RFP stated that the agency would evaluate cost proposals for completeness, reasonableness, and realism.  Id. at 113, 120-21.  Offerors were also advised that the agency would consider their BOEs (and basis‑of‑materials) to determine whether the proposed approach could be accomplished using the hours, skills, materials, and travel proposed.  Id. at 121.  The solicitation stated that the agency would develop a most probable cost (MPC) estimate, as necessary, to be used in the agency’s best value analysis, and that an offeror’s total evaluated cost would be the sum, for all performance periods, of the MPC for all CPAF CLINs and the plug numbers for the cost-reimbursable CLINs.  Id. at 122.

NGA received eight proposals, including from SI and TASC, which were evaluated by separate technical, past performance, and cost evaluation teams (TET, PET, CET, respectively).[4]  Agency Report (AR), Tab 14, Source Selection Decision, at 1; Tab 13, Source Selection Award Recommendation, Evaluation Team Structure, at 7.  After initial evaluations, the agency established a competitive range limited to SI’s and TASC’s proposals, and conducted discussions with the two firms.  See Tab 8, Discussions.

NGA requested final revised proposals from SI and TASC, which were evaluated as follows:



                                         SI                        TASC
Technical/Management Overall             Purple / Good             Purple / Good
  Management & Technical Approach        Blue / Outstanding        Purple / Good
  Business & Contract Administration     Purple / Good             Purple / Good
  Testing & Evaluation Services          Purple / Good             Blue / Outstanding
Technical Risk                           Low                       Low
Past Performance                         Satisfactory Confidence   Substantial Confidence
  Technical Past Performance             Satisfactory Confidence   Substantial Confidence
  Management Past Performance            Substantial Confidence    Substantial Confidence
Proposed Cost                            $24,321,283               $25,130,128
Evaluated Cost                           $24,321,283               $25,130,128


AR, Tab 14, Source Selection Decision, at 9; Tab 11C, SI PET Report, at 8-11; Tab 12C, TASC PET Report, at 8‑12; see RFP at 114-15 (adjectival rating descriptions).  The agency’s evaluation ratings and narratives were documented in separate technical and past performance evaluation reports, and included the evaluators’ assessments of numerous strengths--and no weaknesses--in SI’s and TASC’s respective proposals.  See AR, Tabs 11A & 12A, TET Reports; Tabs 11C & 12C, PET Reports.

The CET also prepared detailed reports documenting their evaluations of the firms’ cost proposals, including the evaluators’ cost realism assessments and BOE reviews.  AR, Tabs 11E & 12E, CET Reports.  The CET compared SI’s and TASC’s cost proposals to their respective BOEs and found them consistent.  AR, Tab 11E, SI CET Report at 33; Tab 12E, TASC CET Report, at 35.  The CET also found TASC’s and SI’s proposed costs realistic and that no MPC adjustments (including to TASC’s indirect cost rates) were necessary.  AR, Tab 11E, SI CET Report at 33; Tab 12E, TASC CET Report, at 35.

The source selection authority (SSA) for the procurement reviewed the TET, PET, and CET reports, and concurred with their findings and conclusions.  AR, Tab 14, Source Selection Decision, at 15.  The SSA determined that, “[a]lthough the TASC proposal is slightly more expensive, the technical merits of the TASC proposal--particularly in sub-factor 1.3 [testing and evaluation services]--and the substantial confidence past performance rating make the award to TASC the best value to the Government.”  Id.  Award was made to TASC and this protest followed.

DISCUSSION

SI protests the agency’s technical, past performance, and cost evaluations, as well as NGA’s cost/technical trade-off and best value determination.  While our decision here does not specifically discuss each and every argument, we have considered all of the protester’s assertions and find that none furnishes a basis for sustaining the protest.

Cost Realism Analysis

The protester challenges NGA’s cost realism analysis with respect to TASC’s proposed indirect cost rates and marketing strategy.  The protester argues that it was unreasonable for NGA not to upwardly adjust TASC’s proposed costs in light of a recent Defense Contract Audit Agency (DCAA) audit that found that [DELETED].  Comments & Supp. Protest at 23-24, citing AR, Tab 19C‑1, DCAA Indep. Audit of TASC’s Indirect Expense Rate Forward Pricing Proposal for FYs 2013-2015, Dec. 18, 2013, at 3.  SI insists that the agency’s cost realism analysis was also flawed insofar as it compared TASC’s proposed indirect rates to the firm’s actual 2013 indirect cost rates, because TASC has recently made significant changes to its rate structure.  Protester’s Supp. Comments at 16.  Moreover, the protester maintains that the agency did not properly assess the risk of TASC’s “unique” marketing strategy for [DELETED], because TASC did not identify labor hours in that regard.[5]  See id. at 18; Comments & Supp. Protest at 26‑27.  The protester urges that a proper cost realism analysis would have resulted in an upward MPC adjustment of TASC’s cost proposal, and changed the outcome of the agency’s cost/technical trade-off in favor of award to SI for its “substantially equal” proposal.  See Comments & Supp. Protest at 23; Protest at 8.

NGA argues that it evaluated TASC’s cost proposal extensively and performed a thorough cost realism analysis.  Supp. AR at 16.  The agency disputes SI’s assertion that it ignored questions over TASC’s indirect cost rates, and states that it coordinated its cost realism analysis with DCAA to determine the best method for evaluating TASC’s proposed rates in light of the firm’s recent rate restructuring.  See id. at 16-18.  NGA also argues that it reasonably determined that TASC’s labor/skill mix (which included [DELETED]) was sufficient to carry out TASC’s proposed technical approach and marketing strategy, and contends that offerors’ BOEs were not required to itemize every vendor interaction.  See id. at 18-19.  The agency also points out that, regardless of who identifies a potential app vendor [DELETED], the vendor and its app would undergo the same review and negotiation process contemplated by the SOO, which the agency has already determined could be realistically managed by TASC with its proposed labor/skill mix.  See id.

When an agency evaluates a proposal for the award of a cost-reimbursement contract, an offeror’s proposed estimated costs are not dispositive because, regardless of the costs proposed, the government is bound to pay the contractor its actual and allowable costs.  Federal Acquisition Regulation (FAR) §§ 15.305(a)(1); 15.404-1(d); Palmetto GBA, LLC, B-298962, B-298962.2, Jan. 16, 2007, 2007 CPD ¶ 25 at 7.  Consequently, the agency must perform a cost realism analysis to determine whether the estimated proposed cost elements are realistic for the work to be performed, reflect a clear understanding of the requirements, and are consistent with the unique methods of performance and materials described in the offeror’s proposal.  FAR § 15.404-1(d)(1); Advanced Commc’n Sys., Inc., B‑283650 et al., Dec. 16, 1999, 2000 CPD ¶ 3 at 5.  An offeror’s proposed costs should be adjusted, when appropriate, based on the results of the cost realism analysis.  FAR § 15.404-1(d)(2)(ii).  Our review of an agency’s cost realism evaluation is limited to determining whether the cost analysis is reasonably based and not arbitrary.  Jacobs COGEMA, LLC, B‑290125.2, B‑290125.3, Dec. 18, 2002, 2003 CPD ¶ 16 at 26.

The solicitation, as discussed above, stated that NGA would evaluate the realism of offerors’ proposed costs, including analyzing technical BOEs to determine whether an offeror’s approach could be met with its proposed labor/skill mix.  RFP at 121.  The RFP also stated that the agency would assess the technical risk of potential schedule disruptions, increased costs, or unsuccessful performance, among other things, based on assessed weaknesses in offerors’ proposals.  Id. at 115.  Offerors were also advised that the agency would only develop a MPC estimate if necessary for the agency’s best value analysis.  Id. at 122 (emphasis added).

Here, in response to the protest, NGA has provided a detailed and lengthy (each CET report is over 40 pages) record of its evaluation of cost proposals, its cost realism analysis, and its best value trade-off.  The contemporaneous record shows that the agency’s cost realism analysis--including of TASC’s indirect rates--included:  (1) a comparison of TASC’s cost and technical proposals; (2) a comparison of TASC’s technical BOE to the agency’s independent government cost estimate; (3) extensive data collection and collaboration with both DCAA and Defense Contract Management Agency (DCMA) officials, including a DCAA auditor “who [was] intimately familiar” with TASC’s historical costs data; and (4) a comparison of TASC’s proposed indirect rates to its 2013 actual indirect rates based on DCAA’s advice.[6]  See AR, Tab 12, TASC CET Rep., at 9-10, 19-20, 33, 35; CET Lead Evaluator’s Declaration at 1‑3.

Using these steps, the agency’s cost evaluators determined that TASC’s technical proposal and BOE were consistent with its cost volume, and that both proposal volumes agreed with TASC’s proposed labor/skill mix for the offeror’s, as well as its subcontractor’s, proposed costs.  AR, Tab 12, TASC CET Rep., at 5, 35.  The evaluators also determined that TASC’s proposed labor hours were adequate to support its technical solution and that no labor hour adjustments were needed.  Id. at 33.  The agency concluded that the difference between TASC’s actual 2013 indirect rates and its proposed indirect rates was immaterial (1.35 percent) and, as noted above, that no MPC adjustment to TASC’s cost proposal was required.  Id. at 20.

We find that the agency’s cost realism analysis techniques were consistent with FAR requirements, and that the agency’s conclusion that TASC’s cost proposal did not require a MPC adjustment was reasonable.  An agency is not required to conduct an in-depth cost analysis, see FAR § 15.404‑1(c), or to verify each and every item in assessing cost realism; rather, the evaluation requires the exercise of informed judgment by the contracting agency.  Cascade Gen., Inc., B‑283872, Jan. 18, 2000, 2000 CPD ¶ 14 at 8.  The methodology employed must, as here, be reasonably adequate and provide some measure of confidence that the rates proposed are reasonable and realistic in view of other cost information reasonably available to the agency as of the time of its evaluation.  SGT, Inc., B‑294722.4, July 28, 2005, 2005 CPD ¶ 151 at 7.  There is no requirement that an agency follow any particular cost realism evaluation method, or evaluate offerors’ proposed costs using every possible method of analysis.  See id.; Cascade Gen., Inc., supra.  In short, we agree with NGA that the agency did not ignore questions about TASC’s indirect rates.  See Supp. AR at 16.

To the extent that the protester complains that the agency’s comparison of TASC’s FY 2013 actual indirect rates to its proposed rates may be “misleading,” because of recent changes to TASC’s rate structure, an agency’s cost realism analysis need not achieve scientific certainty.  See SGT, Inc., supra.  Indeed, the RFP contemplated that an offeror’s indirect rates may vary significantly from their recent experience, or that an offeror may not have current government rate recommendations.  RFP at 93-94.  The protester has also not meaningfully rebutted the agency’s argument that TASC’s marketing strategy would result in the same app review process already contemplated by the RFP, or acknowledged that TASC’s proposal explicitly states that [DELETED] will be “managed independently” of TASC.[7]  See AR at 40; Supp. AR at 18-19.

Technical Evaluation

SI also protests the agency’s technical evaluations, arguing that NGA’s “relative scoring of [SI’s and TASC’s] proposals is internally inconsistent, not supported by the [r]ecord, and not rationally based.”  Protest at 11.  The protester challenges the agency’s evaluation of SI’s marketing strategy, compensation model, and collaborative testing environment, as well as the agency’s risk assessment of TASC’s “unproven” compensation model and marketing strategy, among other things.  See id. at 11-12; Comments & Supp. Protest at 22.  The protester also contends that the agency’s technical evaluators did not conduct an independent review of TASC’s final proposal, insofar as they used the final consensus report for the initial proposals as a template for their evaluation of final proposal revisions.  Protester’s Supp. Comments at 12.  This process, SI contends, only permits the evaluation of changes from the initial proposals, and limits the analysis of how those changes impact the proposal as a whole.  Comments & Supp. Protest at 12.

An agency’s evaluation of technical proposals is primarily the responsibility of the contracting agency, since the agency is responsible for defining its needs and identifying the best method of accommodating them.  Wyle Labs., Inc., B‑311123, Apr. 29, 2008, 2009 CPD ¶ 96 at 5-6.  In reviewing protests of an agency’s evaluation, our Office does not reevaluate proposals, rather, we review the record to determine if the evaluation was reasonable, consistent with the solicitation’s evaluation scheme, as well as procurement statutes and regulations, and adequately documented.  See Wackenhut Servs., Inc., B‑400240, B‑400240.2, Sept. 10, 2008, 2008 CPD ¶ 184 at 6; Cherry Road Techs.; Elec. Data Sys. Corp., B‑296915 et al., Oct. 24, 2005, 2005 CPD ¶ 197 at 6.

The RFP stated that the agency would evaluate an offeror’s understanding of program requirements and whether the offeror demonstrated a technically sound approach with relevant experience.  See RFP at 119.  The RFP also stated that the agency would evaluate whether an offeror demonstrated an ability to thoroughly fulfill the requirements, and proposed appropriate personnel, resources, and a sound marketing approach, among other things.  See id.  Moreover, the RFP stated that the agency would evaluate an offeror’s expertise in vendor outreach and contract administration.  See id.  Furthermore, the RFP stated that the agency would evaluate an offeror’s approach and expertise in screening, testing, and implementing software, as well as managing a collaborative testing environment.  Id.  Offerors were advised that NGA would assess a technical risk rating based on any evaluated technical weaknesses.  Id. at 115.

We find, based on our review of the contemporaneous evaluation record, that SI’s protest of the agency’s technical evaluations amounts to little more than disagreement with the agency’s subjective judgments, and largely reflects the protester’s quibbling over evaluation ratings.  See, e.g., Protest at 11 (relative scoring and number of strengths); Comments & Supp. Protest at 18-19 (“The SI’s Outstanding rating [under subfactor 1.1] was supported by more major strengths, more meets the standards and only one [less] minor strength than TASC’s Outstanding [rating under subfactor 1.3].”); Protester’s Supp. Comments at 11 (A “higher technical rating for The SI or a lower one for TASC would have changed the best value calculus, giving the SI the higher technical score and the lower price.”).

Where the evaluation and source selection decision, as discussed below, reasonably consider the underlying basis for the ratings, including the advantages and disadvantages associated with the specific content of competing proposals, in a manner that is fair and equitable, and consistent with the terms of the solicitation, the protester’s disagreement (as here, see generally id.) over the actual numerical, adjectival, or color ratings is essentially inconsequential in that it does not affect the reasonableness of the judgments made in the source selection decision.[8]  General Dynamics, American Overseas Marine, B‑401874.14, B‑401874.15, Nov. 1, 2011, 2012 CPD ¶ 85 at 10.

Here, the contemporaneous evaluation record reflects that the agency reasonably considered the qualitative merits of SI’s and TASC’s technical proposals.  For example, the record shows that, consistent with the evaluation criteria described above, the agency found that SI’s marketing strategy demonstrated a substantial investment and the resources necessary for successful AOSP marketing activities; that SI showed experience in that regard, as well as engineering experience with NGA’s information technology systems; and that SI’s prescreening workflow offered shortened development time, increased testing efficiency, and a good collaborative testing environment.  See AR, Tab 11A, SI TET Report, at 1.  Similarly, with regard to TASC, the agency found its proposed key personnel to be better than required; that it proposed a very good workable compensation model based upon a study of the app industry; and that it demonstrated multiple examples of successful app development and testing with multiple organizations.  AR, Tab 12A, TASC TET Report, at 1.  These findings were substantiated by numerous assessed strengths, with pin-point citations to the offerors’ proposals.  Moreover, as we note above, the agency did not assess any weaknesses in either offeror’s technical proposal.[9]

Thus, nothing in this record suggests that the agency’s technical evaluations were inconsistent with the solicitation’s evaluation criteria, or that NGA violated procurement laws or regulations in evaluating the protester’s or the awardee’s proposals.  See, e.g., QinetiQ North America, Inc., B‑405163.2 et al., Jan. 25, 2012, 2012 CPD ¶ 53 at 15 (protest of agency’s technical evaluations denied where record shows that agency reasonably evaluated proposals consistent with evaluation criteria, extensively documenting qualitative differences between the protester’s and awardee’s proposals).  To the extent that SI complains about the technical evaluators’ use of their initial consensus report as a template for their final proposal evaluations, we consider an agency’s evaluation record adequate if the consensus documents and source selection decision sufficiently document the agency’s rationale.  Alliance Tech. Servs., Inc., B‑311329, B‑311329.2, May 30, 2008, 2008 CPD ¶ 108 at 3.  The overriding concern for our purposes is not whether an agency’s final evaluation conclusions are consistent with earlier evaluation conclusions (individual or group), but whether they are reasonable and consistent with the stated evaluation criteria, and reasonably reflect the relative merits of the proposals.  See, e.g., URS Fed. Tech. Servs., Inc., B‑405922.2, B‑405922.3, May 9, 2012, 2012 CPD ¶ 155 at 9.  The agency’s evaluation findings, as described above, are reasonable and extensively documented here.

Past Performance Evaluation

The protester also challenges the agency’s past performance evaluation.  Protest at 14-16.  SI contends that the agency’s evaluation of its past performance was based solely on three questionnaires but otherwise ignored information contained in SI’s past performance proposal.  Comments & Supp. Protest at 14.  For example, the protester contends that NGA failed to consider SI’s performance of its Enterprise Systems Engineering Architecture (ESEA) project with the Maryland Procurement Office, National Security Agency.  Id. at 14-15.

In reviewing a protest challenging an agency’s past performance evaluation, we will examine the record to determine whether the agency’s judgment was reasonable and consistent with the stated evaluation criteria and applicable statutes and regulations.  Ostrom Painting & Sandblasting, Inc., B-285244, July 18, 2000, 2000 CPD ¶ 132 at 4.  A protester’s disagreement with an agency’s past performance evaluation provides no basis to question the reasonableness of the evaluator’s judgments.  Citywide Managing Servs. of Port Washington, Inc., B‑281287.12, B-281287.13, Nov. 15, 2000, 2001 CPD ¶ 6 at 10-11.

SI’s protest of the agency’s past performance evaluation, like its challenge of NGA’s technical evaluations, amounts to little more than quibbling over adjectival ratings and enumerated strengths, and is based on the protester’s selective identification of isolated statements in the record.  Contrary to the protester’s assertion, for example, the record clearly shows that the agency evaluated SI’s past performance proposal based on more than three questionnaires.  See, e.g., AR, Tab 20, Past Performance Emails.  To the extent that the protester complains that the evaluators, in their PET consensus narratives, may not have specifically commented on any strengths in SI’s ESEA project, (Comments & Supp. Protest at 15), the record actually shows that the evaluators otherwise reviewed, commented on, and rated the relevance of that project.  AR, Tab 11C, SI PET Report at 6‑11.

Best Value Determination

Finally, SI protests NGA’s source selection decision, arguing that it was flawed because the agency’s best value tradeoff relied on past performance as the determinative selection factor, contrary to the terms of the solicitation.  Protest at 9, 16.  The protester also challenges the agency’s best value determination on the basis of improper cost, technical, and past performance evaluations as discussed above.

We find that SI’s objections to the agency’s best value decision, like the protester’s objections to the agency’s past performance evaluation, are based on selective identification of isolated statements in the solicitation and the record.  The RFP’s basis of award provisions here state as follows:

(1)  The Government intends to award one Contract resulting from this Request for Proposal (RFP) to a responsible Offeror whose proposal, conforming to this RFP, is determined to be of best value to the Government, cost or price and non-cost factors considered.

* * * * *

(2)  The Government will evaluate proposals received under this RFP to determine the Offeror whose proposal represents the best value to the Government, cost and non-cost factors considered.  The Government will evaluate the proposals based on the following factors:  Technical/Management, Past Performance, Security and Cost.  The Government will perform a trade-off analysis between cost and non‑cost factors to determine best value. The Government may award to a higher rated, higher cost/priced Offeror, where the decision is consistent with the evaluation factors and the Source Selection Authority (SSA) reasonably determines that the technical superiority and/or overall business approach of the higher priced Offeror merits the additional cost.

* * * * *

In determining the award of a contract, the Government will give primary consideration to the Offeror that can perform the contract in a manner most advantageous to the Government, cost/price and other factors considered.

RFP at 111 (emphasis added).

Quoting (in part) the RFP provisions above, the protester insists that NGA was only permitted to select TASC’s higher-priced proposal for award if the SSA determined “that the technical superiority and/or overall business approach of the higher priced Offeror merits the additional cost.”  Protest at 8-9; Comments & Supp. Protest at 3‑4.  Although, admittedly, the solicitation’s basis of award provisions are not a model of clarity in this regard, the protester’s interpretation of those provisions is unreasonable because it would foreclose any consideration of the RFP’s other non‑cost evaluation factors as part of the agency’s best value tradeoff.

For a number of reasons, this interpretation is untenable.  First, the protester’s argument selectively ignores the provision’s explicit statements--in four places--that best value would be determined considering cost and non-cost factors.  RFP at 111.  Where a dispute exists as to the actual meaning of a solicitation requirement, our Office will resolve the matter by reading the solicitation as a whole and in a manner that gives effect to all of its provisions.  Sea-Land Serv., Inc., B‑278404.2, Feb. 9, 1998, 98-1 CPD ¶ 47 at 5.  Second, the protester’s interpretation is inconsistent with the FAR, which states, in relevant part, that the “source selection authority’s (SSA) decision shall be based on a comparative assessment of proposals against all source selection criteria in the solicitation.”  FAR § 15.308 (emphasis added).[10]  Third, the protester should have questioned any ambiguities in the solicitation’s stated basis for award prior to submitting its proposal.  A firm may not compete under a patently ambiguous solicitation and then complain when the agency proceeds in a way inconsistent with one of the possible interpretations.  Rather, the firm has an affirmative obligation to seek clarification prior to the first due date for responding to the solicitation following introduction of the ambiguity into the solicitation.  4 C.F.R. § 21.2(a)(1); see Dix Corp., B‑293964, July 13, 2004, 2004 CPD ¶ 143 at 3; Gartner Inc., B‑408933.2, B‑408933.3, Feb. 12, 2014, 2014 CPD ¶ 67 at 3.  Accordingly, we find the agency’s best value tradeoff was consistent with the solicitation.

Additionally, the record does not demonstrate that the award was made to TASC solely on the basis of its higher past performance rating.  SI’s assertions largely stem from isolated statements from SI’s debriefing and from the source selection evaluation board (SSEB) chairperson’s award recommendations to the SSA, and not from the final award decision.  See Protest at 8-9 (citing SI’s debriefing); Comments & Supp. Protest at 3 (citing SSEB’s briefing slides to SSA).  As described above, the SSA’s tradeoff analysis and best value determination explicitly states that the SSA determined that, “[a]lthough the TASC proposal is slightly more expensive, the technical merits of the TASC proposal--particularly in sub-factor 1.3 [testing and evaluation services]--and the substantial confidence past performance rating make the award to TASC the best value to the Government.”[11]  AR, Tab 14, Source Selection Decision, at 15.  The protester’s reliance on its debriefing and the SSEB chairperson’s recommendation is thus misplaced, because it does not reflect the agency’s actual best value determination.  See FAR § 15.308 (“While the SSA may use reports and analyses prepared by others, the source selection decision shall represent the SSA’s independent judgment.”)  In this respect, nothing in the record supports the protester’s argument that the agency deviated from the RFP’s relative weighting of evaluation factors.  See AR, Tab 14, Source Selection Decision, at 14‑15 (SSA’s narrative discussion of evaluation factors in final determination summary).

Source selection officials have broad discretion in determining the manner and extent to which they will make use of the technical and cost evaluation results, and their judgments are governed only by the tests of rationality and consistency with the stated evaluation criteria.  Client Network Servs., Inc., B-297994, Apr. 28, 2006, 2006 CPD ¶ 79 at 9; Atteloir, Inc., B-290601, B-290602, Aug. 12, 2002, 2002 CPD ¶ 160 at 5.  Where, as here, a solicitation provides for a tradeoff between the cost/price and non-cost factors, the agency retains discretion to make award to a firm with a higher technical rating, despite the higher price, so long as the tradeoff decision is properly justified and otherwise consistent with the stated evaluation and source selection scheme.  See, e.g., TtEC–Tesoro, JV, B-405313, B-405313.3, Oct. 7, 2011, 2012 CPD ¶ 2 at 10.  In reviewing an agency’s source selection decision, we examine the supporting record to determine if it was reasonable and consistent with the solicitation’s evaluation criteria and applicable procurement statutes and regulations.  See Honeywell Tech. Solutions, Inc., B-406036, Jan. 3, 2012, 2012 CPD ¶ 43 at 5.

As discussed above, we find no merit to SI’s objections to the agency’s cost, technical, or past performance evaluations.  Thus, there is no basis to question the agency’s reliance upon those evaluation judgments in making its source selection, and the protester’s disagreement in that regard does not establish that the agency acted unreasonably or provide a basis to sustain its protest.  See Citywide Managing Servs. of Port Washington, Inc., supra, at 10‑11.

The protest is denied.

Susan A. Poling
General Counsel



[1] The SI Organization, Inc., changed its name to Vencore, Inc., shortly before filing this protest.  Protest at 1 n.1.  However, the firm filed its protest with our Office under its former name, because the firm had submitted its proposal for this competition prior to its name change.  For the purpose of this decision we will refer to the company as SI.

[2]  Offerors were advised that the technical risk rating and the three technical evaluation subfactors were of equal importance.  RFP at 113.

[3] The solicitation permitted offerors to submit references for their subcontractors.  RFP at 86.

[4] The protester does not challenge the agency’s evaluation of proposals under the security factor.

[5] The protester argues, for example, that the agency’s cost realism analysis did not consider that TASC [DELETED].  See Comments & Supp. Protest at 17; Protester’s Supp. Comments at 18.

[6] The agency, in reviewing TASC’s initial cost proposal, made several attempts to obtain a forward pricing rate recommendation (FPRR) from DCAA.  CET Lead Evaluator’s Declaration at 1.  DCAA was unable to provide a current FPRR because, as it advised NGA, TASC’s 2014 rate structures had changed from 2013, and DCAA recommended that NGA use two earlier DCAA audit reports to evaluate TASC’s 2014 proposed rates.  Id.  TASC subsequently submitted a new forward pricing rate proposal to DCAA, which DCAA approved 3 days after NGA made contract award.  See AR, Tab 19C‑2, DCAA Mem. to NGA Contracting Officer, Mar. 28, 2014; NGA Email to Parties, Dec. 2, 2014, attach., DCAA Indep. Audit of TASC’s Indirect Expense Rate Forward Pricing Proposal for FYs 2014-2016, Sept. 25, 2014, at 1-3.

[7] Because we find that the agency’s cost realism analysis was reasonable, we need not address the protester’s complaint that NGA did not state, in its agency report, whether NGA intends to implement the CET’s recommendation that the agency include a rate re-opener clause in TASC’s contract award.  See Comments & Supp. Protest at 23; AR, Tab 12, TASC CET Rep., at 19.

[8] Our Office has consistently recognized that ratings, be they numerical, adjectival, or color, are merely guides for intelligent decision-making in the procurement process.  Citywide Managing Servs. of Port Washington, Inc., B‑281287.12, B‑281287.13, Nov. 15, 2000, 2001 CPD ¶ 6 at 11.  The evaluation of proposals and assignment of adjectival ratings should generally not be based upon a simple count of strengths and weaknesses, but on a qualitative assessment of the proposals consistent with the evaluation scheme.  See Clark/Foulger-Pratt JV, B‑406627, B‑406627.2, July 23, 2012, 2012 CPD ¶ 213 at 14.

[9] To the extent that SI disputes TASC’s evaluation under testing and evaluation services (subfactor 1.3), because both offerors ostensibly proposed the same hosting platform, the protester concedes that TASC’s proposal included more information in that regard.  Compare Comments & Supp. Protest at 22 with Supp. Comments at 13.

[10] The RFP was issued in accordance with FAR Part 15.  RFP at 1.

[11] In this regard, contrary to the protester’s assertion (Protester’s Comments & Supp. Protest at 18), the record does not show that the agency considered TASC’s and SI’s proposals to be technically equal.  