
DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: HP Enterprise Services, LLC

File: B-408825

Date: December 23, 2013

Kevin J. Maynard, Esq., Samantha S. Lee, Esq., Will Novak, Esq., Rand L. Allen, Esq., and Brian G. Walsh, Esq., Wiley Rein LLP, for the protester.
David W. Burgett, Esq., C. Peter Dungan, Esq., and Erin L. Alexander, Esq., Hogan Lovells US LLP, for IBM U.S. Federal, the intervenor.
Marvin K. Gibbs, Esq., Department of the Air Force, for the agency.
Eric M. Ransom, Esq., and Edward Goldstein, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest challenging the agency’s evaluation of protester’s technical proposal and the source selection decision is denied where record shows that the evaluation was reasonable and consistent with the terms of the solicitation and applicable statutes and regulations.

DECISION

HP Enterprise Services, LLC, of Herndon, Virginia, protests the award of a contract to IBM U.S. Federal, of Bethesda, Maryland, by the Department of the Air Force under request for proposals No. FA8771-11-R-0001, for the development of an integrated personnel and pay system.

We deny the protest.

BACKGROUND

The Air Force issued the RFP on November 26, 2012, seeking a contractor to implement the Air Force’s “Integrated Personnel and Pay System” program, which involves bringing separate existing personnel and pay systems together under a new single integrated system. The program requires the selected contractor to design, develop, test, integrate, train, deploy, operate, and support the new integrated software system. Award was to be made on a best-value basis considering three factors: technical, past performance, and cost/price. The RFP advised that the technical and cost/price factors were of equal importance, and that past performance was less important than technical or cost/price. The technical and past performance factors, when combined, were deemed to be significantly more important than cost or price; however, cost/price would contribute substantially to the selection decision. RFP at Bates 7178.

The technical evaluation factor included four subfactors: (1) system design and implementation; (2) deployment, operations, and support; (3) systems engineering and program management; and (4) small business. According to the RFP, offerors were to be assigned adjectival ratings ranging from outstanding to unacceptable, and risk ratings ranging from low risk to high risk, in accordance with Department of Defense source selection procedures. Id. at 7177. The RFP provided that to receive a rating higher than “acceptable” an offeror must demonstrate strengths “that provide benefit(s) to the Government above and beyond satisfying the requirement without deficiencies.” Id. at 7179.

Under past performance, the RFP provided for an evaluation of the recency, relevancy, and quality of an offeror’s past performance. To be considered recent, a past performance effort had to have been performed, “in whole or in part, within the five (5) year period beginning at a date five years prior to the date of this solicitation;” however, “[o]nly the work that actually occurred within this recency window will be considered.” Id. at 7183. Finally, concerning price, the RFP established that the agency would evaluate the total evaluated price for reasonableness, realism, and balance.

Also of relevance to the protest, the RFP established that the agency would provide “Peoplesoft” and “Oracle E Business Suite” software as government furnished property. Both applications are asset management tools developed by Oracle. The RFP provided these applications to offerors without assessment as to the quality or effectiveness of either application to satisfy the requirements of the RFP, and informed offerors that use of the applications was not mandatory. Id. at 7133. The agency left to the offerors the decision of whether to utilize either product in their proposals.

The Air Force received five proposals in response to the RFP, including the proposals of HP and IBM. After holding discussions and completing its evaluation, the agency rated the offerors as follows:

                         HP                   IBM                  Offeror A            Offeror B            Offeror C
Technical Subfactor 1    Acceptable/Low Risk  Acceptable/Low Risk  Acceptable/Low Risk  Acceptable/Low Risk  Acceptable/Low Risk
Technical Subfactor 2    Acceptable/Low Risk  Acceptable/Low Risk  Acceptable/Low Risk  Acceptable/Low Risk  Acceptable/Low Risk
Technical Subfactor 3    Good/Low Risk        Good/Low Risk        Acceptable/Low Risk  Acceptable/Low Risk  Good/Low Risk
Technical Subfactor 4    Acceptable           Acceptable           Acceptable           Acceptable           Acceptable
Past Performance         Satisfactory         Satisfactory         Satisfactory         Satisfactory         Satisfactory
Evaluated Cost/Price     $50,595,816          $32,080,114          $54,351,432          $34,765,349          $67,360,334

Source Selection Decision Document (SSDD) at 3. As reflected above, HP, IBM, and Offeror C were rated higher under technical subfactor 3, under which each received a “good” technical rating. These three higher-rated offerors received their good rating under subfactor 3 (systems engineering and program management) based on a strength for having proposed to use “a configured baseline product during blueprinting.” Id. at 5. Across the five proposals evaluated, the agency did not identify any other strength under any factor.

Based on a consideration of the evaluation findings, the source selection authority (SSA) concluded that IBM’s lowest-priced proposal represented the best value to the government. In reaching this conclusion, the SSA found that there was “no evaluated technical superiority or superior past performance [of any other proposal] which justifies a price premium over the IBM proposal.” Id. at 5. On August 19, the Air Force notified HP that IBM’s proposal had been selected for the award. The Air Force subsequently provided HP a debriefing, and this protest followed.

DISCUSSION

HP argues that the agency conducted an unreasonable technical and past performance evaluation, improperly converted the procurement into a lowest-priced technically-acceptable competition, and conducted a flawed best value analysis.[1] As discussed below, HP’s allegations are without merit.

Technical and Past Performance Evaluation

Regarding the agency’s evaluation under the technical factor, HP alleges that the Air Force failed to meaningfully assess the offerors’ proposed software packages and key personnel. HP also challenges the agency’s evaluation under the past performance factor, arguing that the agency did not reasonably consider its performance on a highly relevant contract. The record does not support HP’s contentions.

The evaluation of an offeror’s proposal is a matter within the agency’s discretion. IPlus, Inc., B-298020, B-298020.2, June 5, 2006, 2006 CPD ¶ 90 at 7, 13. In reviewing an agency’s evaluation, our Office will not reevaluate proposals; instead, we will examine the record to ensure that it was reasonable and consistent with the solicitation’s stated evaluation criteria and applicable procurement statutes and regulations. Metro Mach. Corp., B-402567, B-402567.2, June 3, 2010, 2010 CPD ¶ 132 at 13; Urban-Meridian Joint Venture, B-287168, B-287168.2, May 7, 2001, 2001 CPD ¶ 91 at 2. An offeror’s disagreement with the agency’s evaluation is not sufficient to render the evaluation unreasonable. Ben-Mar Enters., Inc., B-295781, Apr. 7, 2005, 2005 CPD ¶ 68 at 7.

Concerning the evaluation of the offerors’ software packages, HP asserts that “the evaluators did not consider the relative capabilities of the offerors’ proposed [commercial off the shelf] software packages (Oracle eBS versus Peoplesoft 9.1), or the degree of development, customization or extensions associated with those different packages.” Protest at 14. As an initial matter, this allegation is misplaced since it is based solely on information allegedly provided to HP in response to a question during its oral debriefing. The agency explains that HP’s debriefing question related specifically to the government furnished property offered in the RFP, which the agency properly did not comparatively evaluate because the RFP provided that use of the government furnished property was not mandatory and offered no comment as to the quality or effectiveness of either application to satisfy the agency’s requirements. RFP at Bates 7133.

More importantly, however, the record contradicts HP’s claims that the agency did not consider the degree of development, customization, or extensions associated with the different software packages offered in the offerors’ proposals. In this regard, the individual evaluator worksheets and consensus evaluation report clearly reflect consideration of the degree of development, customization, and extensions required by the offerors’ solutions. See AR, Tab 12, HP Initial Technical Group Consensus Report, at Bates 18-19. The record also demonstrates that HP was aware that the agency had evaluated the degree of development, customization, or extensions required by the proposals since HP received specific discussion questions in these areas. See AR, Tab 39, HP Responses to Evaluation Notices, at Bates 18-19. Accordingly, we find no basis on which to sustain HP’s protest in this area.

Next, concerning HP’s allegation that the agency failed to reasonably assess the offerors’ key personnel, we note that the RFP did not provide for a qualitative evaluation of proposed personnel. Rather, the RFP merely contemplated an evaluation of the proposed plans for hiring and retaining qualified personnel in several key positions. RFP at Bates 7150. With respect to the actual stated basis for evaluation, the record reflects that the agency reviewed HP’s proposed staffing plan and concluded that it was:

realistic and achievable for hiring and retaining qualified personnel in the following key positions: Program Manager, Chief Engineer, and Functional/Business Lead(s). [HP] sufficiently described their hiring and retention plans for key positions to include employee climate surveys, flexible work arrangements, and a rewards and recognition program.

AR, Tab 46, HP Final Technical Group Consensus Report, at 5. Accordingly, where the RFP did not contemplate an evaluation of specific individuals proposed as key personnel, but only the offerors’ plans for hiring and retaining qualified personnel, we find no error in the agency’s evaluation.

Regarding the past performance evaluation, HP contends that the Air Force failed to consider “eight pages of detailed narrative included in [HP’s] past performance proposal regarding the highly relevant [DELETED] contract.”[2] HP Comments at 6. HP also alleges that the Air Force’s “somewhat relevant” and “satisfactory” ratings for the [DELETED] contract were inconsistent with a strength HP received under technical subfactor 3 (systems engineering and program management) for HP’s use of products of the [DELETED] contract to facilitate blueprinting and development for the Air Force’s requirement. HP maintains that the [DELETED] contract should have been considered “very relevant.”

Based on our review of the record, we conclude that the agency’s evaluation of HP’s past performance under the [DELETED] contract was reasonable and consistent with the RFP. The agency explains that it rated the [DELETED] contract only “somewhat relevant” because HP’s development work on the [DELETED] contract did not fall within the period of consideration established by the RFP and because the work actually performed within the period of consideration was limited to system operation. As noted above, the RFP cautioned that the agency would only consider performance that actually occurred within five years of issuance of the solicitation. The propriety of the agency’s evaluation in this regard is supported by HP’s own proposal, which indicated that “[w]e finished implementing [DELETED] in April 2007 and then moved to steady-state operations, maintenance, and support, which lasted until our contract ended in [DELETED].” AR, Tab 8, HP Initial Proposal, at Bates 528. Further, during phone conversations with [DELETED] on the [DELETED] contract, [DELETED] confirmed that “no development work or upgrades were completed during the recency period.” Contracting Officer’s Statement at 24; AR, Tab 25, Call Log (contemporaneous call log demonstrating multiple conversations with [DELETED]). Accordingly, HP’s past performance evaluation challenge is without merit.[3]

Best Value Trade-Off Decision

Regarding the best value trade-off decision, HP asserts that the SSA failed to look behind the overall adjectival ratings or perform a meaningful best value analysis, and thereby improperly converted the procurement into a lowest-priced technically-acceptable competition. In support of its argument, HP points to over 30 statements from the agency’s proposal analysis report addressing the technical subfactors, which HP contends positively portray its proposal and should have been considered by the SSA as technical discriminators in the best value analysis. HP also generally asserts that it was improper for the SSA to fail to identify discriminators under the past performance factor.

Agencies enjoy discretion in making cost/technical tradeoffs where the solicitation provides for the award of a contract on a best value basis; the agency’s selection decision is governed only by the test of rationality and consistency with the solicitation’s stated evaluation scheme. Marine Hydraulics Int’l, Inc., B-403386.3, May 5, 2011, 2011 CPD ¶ 98 at 4. Additionally, a source selection official may rely on evaluation reports provided by technical evaluators. See General Dynamics C4 Sys., Inc., B-406965, B-406965.2, Oct. 9, 2012, 2012 CPD ¶ 285; Diemaster Tool, Inc., B-241239, B-241239.2, Jan. 30, 1991, 91-1 CPD ¶ 89 at 6.

Our review of the record does not support HP’s claims. The various proposal analysis report statements cited by HP as positively portraying its proposal are in fact consistent with the RFP’s description of an “acceptable” evaluation rating, and do not support a conclusion that HP’s proposal included strengths or technical discriminators ignored by the evaluators or SSA. For example, many of the statements cited by HP reflect that HP’s proposal was “appropriate,” “meets technical and functional requirements,” and demonstrates a “sound understanding.” HP Comments at 8-13. These evaluation statements do not establish strengths or discriminators where, according to the RFP, a rating higher than acceptable “requires strengths in the offeror’s proposal that provide benefit(s) to the Government above and beyond satisfying the requirement without deficiencies.” RFP at Bates 7179. The evaluation language cited by HP here does not describe aspects of HP’s proposal that provided benefits above and beyond satisfying the RFP’s requirements. Further, rather than demonstrating a failure to meaningfully analyze the proposals, the evaluation statements cited by HP demonstrate that the agency thoroughly evaluated the offerors’ proposals.[4]

Further, HP’s contention that the agency’s evaluation had the effect of treating all offerors as technically equal, reducing the best value evaluation scheme to a lowest-priced technically-acceptable competition, is without a basis in the record. The SSDD demonstrates that the SSA considered three of the five offerors technically superior on the basis of each having an evaluated strength for their “use of a configured baseline product during blueprinting.” SSDD at 5. The SSDD also reflects that the SSA considered IBM to have an advantage under the past performance evaluation criteria, where the SSDD concluded that “IBM had the better past performance because of the number and quality of relevant past performance.” Id. The SSA ultimately concluded, however, that “[w]hile there are differences in the contracts evaluated . . . there is no advantage gained by any offerors’ past performance which creates a discriminator in my award decision.” Id. In light of the comprehensive evaluation record and detailed SSDD, both of which reflect a consideration of the relative merits of the offerors’ proposals, we have no basis to find the agency’s selection decision inconsistent with the terms of the solicitation or otherwise improper.

The protest is denied.

Susan A. Poling
General Counsel



[1] HP also raised several price realism arguments, which were dismissed because HP largely argued that IBM’s cost/price must have been unrealistic where it was significantly lower than HP’s own cost/price. The fact that an awardee’s cost or price is lower than the protester’s does not by itself bring the agency’s cost/price realism evaluation into question. See Siebe Envtl. Controls, B-275999.2, Feb. 12, 1997, 97-1 CPD ¶ 70 at 3. HP also alleged that the agency improperly failed to utilize its own independent government cost/price estimate in conducting the realism analysis; however, the agency reasonably explained that it did not rely on the independent estimate because the estimate had not been structured in the same way as the RFP’s contracting approach. Finally, HP filed two supplemental protests which our Office concluded did not state a valid basis to challenge the agency’s award decision.

[2] The [DELETED] contract was [DELETED], which was very similar in size and scope to the Air Force’s requirement. However, as detailed below, HP’s work on the [DELETED] contract during the recency period for past performance related only to operations and support tasks, and did not include any of the design and development work required under this RFP.

[3] Our review of the record also reflects that the Air Force reasonably rated HP’s overall performance on the [DELETED] contract as satisfactory, where such a rating was consistent with feedback the Air Force received from [DELETED] regarding HP’s performance on the [DELETED] contract. See AR, Tab 27, [DELETED] Reference Letter, at 1.

[4] The SSA, in the SSDD, explains that he “reviewed the briefing materials, the Proposal Analysis Report (PAR) and the Comparative Analysis Report (CAR),” prior to making the award decision. Accordingly, the SSA was also aware of the proposal analysis report evaluation statements cited by HP.
