The Boeing Company

B-409941, B-409941.2, Sep 18, 2014

Highlights

The Boeing Company, Intelligence Systems Group, of Springfield, Virginia, protests the National Geospatial-Intelligence Agency's (NGA) award of a contract to Harris Corporation, of Melbourne, Florida, under request for proposals (RFP) No. HM1574-13-R-0003, for foundation geospatial intelligence content management services. Boeing challenges the agency's evaluation of proposals and contends that the agency conducted misleading discussions.

We deny the protest.


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: The Boeing Company

File: B-409941; B-409941.2

Date: September 18, 2014

Scott M. McCaleb, Esq., Jon W. Burd, Esq., Tara L. Ward, Esq., and Craig Smith, Esq., Wiley Rein LLP, and Mark W. Reardon, Esq., and Padriac B. Fennelly, Esq., The Boeing Company, for the protester.
Thomas P. Humphrey, Esq., John E. McCarthy Jr., Esq., James G. Peyster, Esq., Robert Sneckenberg, Esq., and Skye Mathieson, Esq., Crowell & Moring LLP, for Harris Corporation, the intervenor.
Jeffrey L. Augustin, Esq., and Mason C. Alinger, Esq., National Geospatial-Intelligence Agency, for the agency.
Cherie J. Owen, Esq., and David A. Ashen, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protest of agency’s cost realism adjustment is denied where protester’s proposal failed to provide sufficient detail to substantiate a productivity claim in its proposal, and agency evaluators relied on their personal experience with the subject matter to determine that protester’s unsupported claim was unrealistic.

2. Protest that agency failed to conduct meaningful discussions is denied where the specific flaw in the protester’s proposal that resulted in a weakness was not reasonably apparent until the final proposal revision and, in any case, the agency’s discussions led the protester into the general area of its proposal requiring revision.

DECISION

The Boeing Company, Intelligence Systems Group, of Springfield, Virginia, protests the National Geospatial-Intelligence Agency’s (NGA) award of a contract to Harris Corporation, of Melbourne, Florida, under request for proposals (RFP) No. HM1574‑13-R-0003, for foundation geospatial intelligence content management services. Boeing challenges the agency’s evaluation of proposals and contends that the agency conducted misleading discussions.

We deny the protest.

BACKGROUND

The RFP, issued on May 3, 2013, was one of three solicitations seeking regional geospatial intelligence content management. In this regard, the contractor will assist with creating a dynamic “map of the world” that will be populated with geospatial intelligence information that will enable NGA to provide mission-essential content as needed. RFP at 206. The contract for Region C, covering the regions of the globe under the purview of the United States Africa Command (USAFRICOM) and United States Southern Command (USSOUTHCOM), is the subject of this protest. The solicitation contemplated award of an indefinite-delivery requirements contract under which the NGA will issue cost reimbursable, incentive, and fixed-price delivery orders. RFP at 71.

Award was to be made to the offeror whose proposal represented the best value to the government considering the following evaluation factors: technical/management; past performance; security; and cost/price. RFP at 97. As relevant here, the technical/management factor included five subfactors (in descending order of importance): (1) technical and transformational capability, including consideration of the offeror’s “[t]echnical innovations to improve performance, cost and/or schedule,” Id. at 99 (section M), 82 (section L); (2) quality assurance; (3) emerging products, data and services; (4) management approach and personnel plan; and (5) small business participation plan. The technical/management factor was more important than past performance, while all non-cost/price factors, when combined, were significantly more important than cost/price.[1] Id.

Under the cost/price factor, proposals were to be evaluated for completeness, reasonableness, and realism. RFP at 98. With regard to realism, the solicitation provided as follows:

Cost realism analysis will be performed in accordance with the criteria included in FAR [Federal Acquisition Regulation] 15.404-1(d). A realism evaluation will be accomplished by reviewing the cost/price proposal to determine that cost/price is realistic for the work to be performed and reflects a clear understanding of the requirement. The cost realism analysis will be conducted by the Government to determine the Most Probable Cost (MPC) for the [cost plus incentive fee] and [cost plus fixed fee] CLINs.

Id. at 101.

In order to evaluate offerors’ technical approaches, the RFP required offerors to propose solutions to eight sample tasks, termed “delivery orders” in the RFP, for the base year and six sample tasks for the option years. RFP at 267-68, 414-15. As relevant here, the GeoNames Integration sample task required offerors to describe how they would enrich existing data regarding eleven identified features[2] at the global or regional density scale.[3] Another sample task, titled “Argentina,” required offerors to explain how they would approach enriching and updating the agency’s existing topographic data store information.[4] Offerors also were to perform currency updates of the agency’s information under this sample task, which would include analyzing information regarding possible changes on the earth using refined persistent change model (RPM) data.[5] RFP at 282; Hearing Transcript (Tr.) at 80. A third sample task, Human Geography, required offerors to deliver human geography data[6] for groups of people living in Gabon, using source data that could be in the English, French, Fang, or Bantu languages. RFP at 442-43; Tr. at 97, 101.

The agency received proposals from six offerors by the June 6, 2013 closing date. After reviewing initial proposals, NGA established a competitive range consisting of Boeing, Harris, and two other offerors, and opened discussions. AR at 4-5. The agency conducted two rounds of discussions, the first in writing, and the second in person. Id. at 4-5. After receiving offerors’ final proposal revisions, the source selection evaluation board evaluated the proposals submitted by Harris and Boeing as follows:

 

Evaluation Factors & Subfactors              Harris                   Boeing

Technical & Management                       Good                     Acceptable
  Technical & Transformational Capability    Outstanding              Acceptable
  Quality Assurance                          Acceptable               Acceptable
  Emerging Products, Data & Services         Good                     Acceptable
  Management Approach & Personnel Plan       Acceptable               Acceptable
  Small Business Participation Plan          Acceptable               Good

Past Performance                             Satisfactory/Relevant    Satisfactory/Relevant

Security Eligibility Requirements            Pass                     Pass

Cost/Price
  Complete                                   Yes                      Yes
  Reasonable                                 Yes                      Yes
  Realistic                                  Yes                      Yes

Proposed Cost/Price                          $161.58 Million          $159.14 Million
MPC Adjustment                               ($3.38 Million)          $1.78 Million
Total Evaluated Cost/Price                   $158.20 Million          $160.92 Million


AR, Tab L.2, SSAC Decision Document, at 39.
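For clarity, the MPC adjustments shown in the table reconcile each offeror's proposed cost/price with its total evaluated cost/price; the parenthetical figure denotes a downward adjustment:

Harris: $161.58 million - $3.38 million = $158.20 million
Boeing: $159.14 million + $1.78 million = $160.92 million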

At the hearing our Office conducted in this matter, the chair of the technical and management evaluation panel testified that Harris's rating of outstanding under technical and transformational capability (the most important technical subfactor) was due, in part, to the numerous efficiencies and innovations proposed by Harris. Tr. at 51. For example, efficiencies [DELETED], resulting in greater time savings. Id. Further, Harris proposed to use [DELETED]. Id. at 307-10. In contrast, Boeing's approach was evaluated as more "person-oriented" and as involving more human interaction, thereby contributing to Boeing's much higher number of proposed labor hours under several of the sample tasks. Tr. at 57, 89 (citing AR, Tab J.3, Boeing's Final Technical/Management Proposal, at 129).

In evaluating the two proposals, the agency made several adjustments to the offerors’ proposed labor hours, resulting in adjustments to their most probable cost. For example, as relevant here, with respect to the Argentina sample task, the agency found Boeing’s claim that it could process [DELETED] refined persistent change model[7] edits per hour, twice the rate proposed by Boeing in its initial proposal, AR, Tab J.3, Boeing Final Proposal Revision, at 79 (“doubling our estimate for the number of features we can edit per hour”), to be unsupported and unrealistic. AR, Tab K.3, Assessment of Boeing’s Basis of Estimate, at 6. Based on the personal technical experience of the panel members, the evaluators believed that [DELETED] edits per hour was more realistic and adjusted Boeing’s labor hours and resulting cost accordingly. Tr. at 89.
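To illustrate the mechanics of such an adjustment in general terms (the actual figures here are redacted, so the numbers below are purely hypothetical), a claimed edit rate converts a fixed volume of RPM features into labor hours, and the lower rate the evaluators consider realistic yields a corresponding upward hour and cost adjustment:

hours = features to be edited ÷ edits per hour
cost adjustment = (agency-estimated hours - proposed hours) × burdened labor rate

For example, 10,000 hypothetical features at a proposed 10 edits per hour equals 1,000 hours, but at an agency-judged 5 edits per hour equals 2,000 hours; at an assumed $100 per hour, the most probable cost would rise by (2,000 - 1,000) × $100 = $100,000.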

With regard to Harris’s proposal, the evaluators concluded that, given Harris’s unique technical approach to the Human Geography sample task, Harris had proposed too many [DELETED] labor hours. Specifically, Harris proposed to use [DELETED] to accomplish much of the work in the pre-production stage. Tr. at 104. Harris’s approach in this regard was viewed as an innovative approach that would increase efficiency because the work performed by the [DELETED] in the beginning stages of the task would significantly reduce the work of the [DELETED] in later stages. Tr. at 159. In contrast, Boeing proposed a different approach to achieving the goals of this task: rather than using [DELETED] and [DELETED], Boeing proposed to use [DELETED] and [DELETED] to accomplish the work. See AR, Tab J.3, Boeing’s Final Proposal, at 312-13; Tr. at 153. Given Harris’s approach, the evaluators determined that the amount of work needed in the production stage should be reduced, resulting in fewer hours of work for the [DELETED] than proposed by Harris. Id. at 104-105, 152‑53; AR, Tab K.2, Assessment of Harris’s Basis of Estimate, at 36. Overall, the agency made upward cost adjustments of approximately $1.78 million to Boeing’s proposal and downward adjustments of approximately $3.38 million to Harris’s proposal. AR, Tab L.2, SSAC Decision Document, at 39.

After reviewing the source selection evaluation board's report and the source selection advisory council's comparative analysis and award recommendation, the source selection authority (SSA) selected Harris's proposal for award. In selecting Harris's proposal, the SSA noted that it had the top-rated technical/management proposal and received a rating of outstanding under the most important technical subfactor, while Boeing had the lowest-rated proposal of the four offerors in the competitive range. AR, Tab L.3, Source Selection Decision Document (SSDD), at 6. The SSA also noted that Harris had the lowest evaluated cost/price of all offerors. Since Harris had the highest-rated and lowest-priced proposal, the SSA determined that its proposal represented the best value to the government. This protest followed.

DISCUSSION

Boeing primarily contends that offerors did not compete on a common basis; that the evaluation was unreasonable; and that the agency engaged in misleading discussions.

The evaluation of an offeror's proposal is a matter largely within the agency's discretion. Booz Allen Hamilton, Inc., B-409272 et al., Feb. 25, 2014, 2014 CPD ¶ 84 at 4. In reviewing a protest that challenges an agency's evaluation of proposals, our Office will not reevaluate proposals; rather, we will examine the record to determine whether the agency's judgment was reasonable and consistent with the stated evaluation criteria and applicable statutes and regulations. Id.; WAI-Stoller Servs., LLC; Navarro Research & Eng'g, Inc., B-408248.6 et al., May 22, 2014, 2014 CPD ¶ 164 at 8. Here, we have reviewed all of Boeing's arguments and find that none provides a basis for sustaining the protest. We discuss several of the arguments below.

Common Understanding of the Requirements

Boeing asserts that the record demonstrates that the offerors had widely divergent understandings of the scope of work required under the solicitation, and that, therefore, the offerors did not compete on an equal basis. Specifically, Boeing notes that its proposed solution required "10 to 20 times more labor hours to accomplish the work" for some sample tasks than did Harris's proposed solution. Protester's Comments & Supp. Protest at 2. Because the offerors proposed such divergent levels of effort to accomplish the tasks identified in the solicitation, Boeing contends, the solicitation either was not sufficiently clear or permitted the offerors to adopt different interpretations of the requirements, preventing fair competition. Id.

Solicitations must contain sufficient information to enable offerors to compete intelligently and on a relatively equal basis. American Custom Meats, LLC, B‑409564, June 12, 2014, 2014 CPD ¶ 195 at 9. However, where the RFP contains detailed requirements and the solicitation anticipates that offerors will propose different technical approaches to achieve the required result, a showing that the offerors proposed different levels of effort is not sufficient to demonstrate that offerors lacked a common understanding of the RFP’s requirements. See L-3 Servs., Inc., B-406292, April 2, 2012, 2012 CPD ¶ 170 at 14.

Here, the agency spent nearly four years developing the solicitation and refining its requirements through multiple interactions with industry (including Boeing), responding to over a thousand questions and comments, and releasing several draft versions of the solicitation. Supp. AR at 2-3. The resulting solicitation included a detailed statement of the agency’s requirements. In addition, the solicitation emphasized that one of the goals of the solicitation was to “seek innovation,” RFP at 234, and stated that the agency would evaluate offerors’ proposed “[t]echnical innovations to improve performance, cost and/or schedule.” Id. at 82. In these circumstances, the simple fact that Boeing and Harris proposed different technical approaches that involved varying levels of effort for particular tasks does not establish a disparate understanding of the solicitation’s requirements or an unequal competition; rather, this fact appears to indicate simply that the offerors proposed different or innovative technical approaches, as envisioned by the solicitation. Therefore, we find this basis of protest to be without merit.

Cost Adjustments

Next, Boeing challenges several aspects of the agency's cost realism analysis. For example, the protester contends that the agency arbitrarily made upward adjustments to Boeing's cost based on the agency's unreasonable conclusion that Boeing's proposal to complete [DELETED] refined persistent change model (RPM) feature edits per hour was unrealistic.

When an agency evaluates proposals for the award of a cost-reimbursement contract, an offeror’s proposed estimated cost of contract performance is not considered controlling since, regardless of the costs proposed by the offeror, the government is bound to pay the contractor its actual and allowable costs. Serco Inc., B-407797.3, B-407797.4, Nov. 8, 2013, 2013 CPD ¶ 264 at 9; see FAR § 16.301. An agency must perform a cost-realism analysis to determine the extent to which an offeror’s proposed costs represent what the contract costs are likely to be under the offeror’s unique technical approach, assuming reasonable economy and efficiency. ManTech SRS Techs., Inc., B-408452, B-408452.2, Sept. 24, 2013, 2013 CPD ¶ 249 at 5. Based on the results of the cost realism analysis, an offeror’s proposed costs should be adjusted when appropriate. FAR § 15.404‑1(d)(2)(ii). An agency’s cost realism analysis need not achieve scientific certainty; rather, the methodology employed must be reasonably adequate and provide a measure of confidence that the agency’s conclusions about the most probable costs under an offeror’s proposal are reasonable and realistic in view of the cost information reasonably available to the agency at the time of its evaluation. Serco Inc., supra.

We find the agency’s adjustment of Boeing’s hours to be unobjectionable. First, the evaluators found nothing in Boeing’s proposal, and Boeing has failed to identify anything, that provided a basis for its proposed rate of [DELETED] RPM edits per hour. In this regard, Boeing’s proposal made only a general reference to its prior experience with “using RPM data over the past 18 months on certain NGA task orders,” but provided no details regarding its estimation that it could achieve [DELETED] edits per hour. AR, Tab J.3, Boeing Final Technical Proposal, at 129. At the hearing in this matter, the chair of the technical and management evaluation panel testified that Boeing’s proposal provided insufficient support for this claim; according to the chair, “[t]hey didn’t explain how they got to [DELETED] features per hour.” Tr. at 92.

Further, based on their own extensive experience performing RPM edits and developing the RPM model, the evaluators determined that a rate of [DELETED] RPM edits (which would involve adding, modifying, and deleting information for changed features) per hour seemed high. Tr. at 92-93, 148-49; see AR, Tab K.3, Assessment of Boeing's Basis of Estimate, at 6. Boeing has not shown that the claim in its proposal that it could achieve [DELETED] RPM edits per hour was consistent with its prior experience or the experience of the evaluators. Based on this record, we find the agency's upward adjustments to Boeing's proposed cost in this area to be reasonable. See EDO Corp., B-296861, Sept. 28, 2005, 2005 CPD ¶ 196 at 4 (finding agency's realism adjustments based on evaluators' experience to be reasonable where offeror did not provide a sufficient explanation for why the proposed hours were adequate for the tasks proposed).

Next, Boeing asserts that the agency conducted a disparate evaluation because it did not similarly increase Harris’s proposed RPM edit rate.[8] Specifically, Boeing notes that the agency accepted Harris’s proposal of far fewer hours for the task, without making an upward adjustment. Protester’s Comments & Supp. Protest at 7.

The record here does not show that the agency conducted a disparate evaluation. As set forth above, the offerors proposed different technical approaches to accomplishing the RFP’s requirements. With regard to the Argentina sample task, for enriching and updating the agency’s existing topographic data, Boeing’s approach would require an analyst to call up the imagery, look at two (or more) images, and determine whether a change must be made based on apparent changes shown on the images. Tr. at 76-77. In contrast, Harris proposed to rely on [DELETED], eliminating the need for a [DELETED]. Further, Harris’s projections were substantiated by several pages of supporting details and calculations showing how the software would allow Harris to accomplish the task in the time proposed. Id. at 469-71, 475‑79, 481.

Cost realism evaluations are to consider whether an offeror's proposed cost elements are realistic and consistent with its unique methods of performance. Sys. Techs., Inc., B-404985, B-404985.2, July 20, 2011, 2011 CPD ¶ 170 at 5. Where, as here, the varying number of labor hours between competing proposals results from different technical approaches, there is no requirement that an agency's cost realism analysis normalize the proposed labor hours. See ManTech SRS Techs., Inc., supra, at 7. Boeing has not shown that the agency's evaluation failed to reasonably account for the unique methods of performance of the two offerors or was otherwise unreasonable. In these circumstances, we find no merit in the protester's claim that the agency's cost realism adjustments resulted from unequal treatment.

Discussions

Boeing contends that the agency did not conduct meaningful discussions because it failed to advise Boeing of a major weakness in Boeing's approach to the GeoNames Integration sample task (for enriching existing data regarding eleven identified features). Specifically, in the agency's evaluation of Boeing's final proposal, it assigned a major weakness related to Boeing's approach to filtering geographical features in the GeoNames database. In this regard, the evaluators noted that Boeing's final revised proposal included the following statement under the heading Ground Rules and Assumptions: "This analysis was conducted on the basis of a full April 2013 GNDB extract." AR, Tab K.1, Boeing FPR Consensus Evaluation Report, at 6; see AR, Tab J.3, Boeing Final Technical Proposal, at 91. The chair of the technical and management evaluation panel explained at the hearing that this weakness was due to Boeing's proposal to perform its work using the "full database," rather than first filtering results to only those features at the 1:250,000 and 1:1,000,000 scales specified in the RFP. The agency found that this unfiltered approach would result in more work under the cost-reimbursable contract, and was inefficient. Tr. at 123, 135.

Boeing does not dispute that its technical approach failed to first filter the data according to the scales identified in the solicitation. See Protest at 11 (“[r]ather than apply a global or regional filter to the data . . . Boeing proposed to use more detailed data [which would] capture[] data at a more granular level”). Rather, Boeing contends that the agency improperly failed to raise this concern during discussions. Protester’s Comments & Supp. Protest at 9-13.

The fundamental purpose of discussions is to afford offerors the opportunity to improve their proposals to maximize the government’s ability to obtain the best value, based on the requirement and the evaluation factors set forth in the solicitation. AT&T Gov’t Solutions, Inc., B-406926 et al., Oct. 2, 2012, 2012 CPD ¶ 88 at 17. In negotiated procurements, whenever discussions are conducted by an agency, they are required to be meaningful, equitable, and not misleading. Vectronix, Inc., B-407330, Dec. 19, 2012, 2013 CPD ¶ 13 at 8. However, where a weakness is first introduced in an offeror’s final proposal revision after discussions were concluded, the agency has no obligation to reopen discussions to address the new weakness. Smiths Detection, Inc., B-298838, B‑298838.2, Dec. 22, 2006, 2007 CPD ¶ 5 at 13 n.13; Ouachita Mowing, Inc., B‑276075, B‑276075.2, May 8, 1997, 97-1 CPD ¶ 167 at 4. Further, the requirement that discussions be meaningful does not obligate an agency to spoon-feed an offeror. CEdge Software Consultants, LLC, B-409380, Apr. 1, 2014, 2014 CPD ¶ 107 at 6. Instead, to satisfy the requirement for meaningful discussions, an agency need only lead an offeror into the areas of its proposal requiring amplification or revision. Id.; AAA Mobile Showers, Inc., B-311420.2, Mar. 27, 2009, 2009 CPD ¶ 75 at 7.

We find no basis to question the conduct of discussions here. Regarding Boeing's technical approach to the GeoNames Integration sample task, the agency maintains that the problem with Boeing's approach first became apparent in Boeing's final proposal. Although Boeing's initial proposal contained the statement under the heading "Source Harvesting" that "[t]his analysis was conducted on the basis of a full April 2013 GNDB extract," the agency evaluator testified at the hearing that, because the statement was placed only under the source harvesting heading, the agency understood the statement to relate only to the beginning "pre-production" step of the process. Tr. at 124. However, when Boeing's final proposal placed the statement under the ground rules and assumptions heading, the evaluators concluded that Boeing's full-extract approach would be applied not just to the pre-production stage, but also to the later production and estimation stage. Tr. at 134. We find that the agency reasonably interpreted the statement differently when it appeared under the broader ground rules and assumptions heading, rather than under the more limited source harvesting heading, such that the full extent of the problem only became apparent in Boeing's final proposal.

In any case, we also find that the agency satisfied its obligation to lead the offeror into the general area of the agency's fundamental concern with Boeing's excessive staffing. Specifically, although the agency did not realize, until receiving Boeing's final proposal revision, that Boeing's high number of proposed labor hours was due to its approach of using the full database, rather than first filtering to the appropriate scale, the agency did notice, and discuss with Boeing, the fact that its proposed labor hours for this sample task were considered to be high. For example, in written discussions, the agency notified Boeing that its production hours were "very high" and suggested that the problem could be due to Boeing's proposal to "re-collect" features. AR, Tab H.1, Boeing Items for Discussion, at 11, 55; Tr. at 127. Further, during oral discussions, the agency repeatedly reminded Boeing of the importance of focusing on the scales identified in the sample task. See Intervenor's Comments on the Supplemental Agency Report at 52-53 (transcribing agency's oral discussions with Boeing: "I just want to verify that the requirement is 1 to 250 [thousand] scale down to a 1 to 1 million scale . . . I just want to reiterate: be sure of the requirement, 1 to 250 to 1 to 1 million"); see also Tr. at 127-28 ("in the verbal discussions we had, I stated at least twice for Boeing to remember that this was a 1 to 250 scale"). The crux of the weakness assigned to Boeing's approach was that Boeing planned to "pull" the entire database, rather than filtering its results to only those features that would appear under the scales identified in the solicitation (1 to 250,000 and 1 to 1 million), resulting in more labor hours to accomplish the task. We find that the record reflects that, although the agency did not know the exact cause of Boeing's high labor hours until it received the final proposal revision, the agency put Boeing on notice that its labor hours for this task were high and that the agency was concerned that Boeing might not be adhering to the solicitation's required scales. Under these circumstances, we find that the agency sufficiently led Boeing into the area of its proposal that required revision.

Unequal Technical Evaluation

Finally, Boeing argues that the agency engaged in unequal treatment in its evaluation of the offerors. For example, the protester notes that NGA assigned a major weakness to Boeing’s proposal when the firm overestimated the effort needed to accomplish the GeoNames Integration sample task, but when the agency found that Harris overestimated the effort needed to accomplish the Human Geography sample task, the agency made a downward adjustment to Harris’s proposed labor hours resulting in a downward cost adjustment, but did not assign a weakness. Boeing contends that the agency’s different treatment of the two situations demonstrates disparate treatment. Protester’s Comments & Supp. Protest at 14.

It is a fundamental principle of federal procurement law that a contracting agency must treat all offerors equally and evaluate their proposals evenhandedly against the solicitation’s requirements and evaluation criteria. ADNET Systems, Inc., et al., B‑408685.3 et al., June 9, 2014, 2014 CPD ¶ 173 at 16. Where a protester alleges unequal treatment in a technical evaluation, it must show that the differences in ratings did not stem from differences between the offerors’ proposals. Paragon Systems, Inc.; SecTek, Inc., B-409066.2, B-409066.3, June 4, 2014, 2014 CPD ¶ 169 at 8-9.

Here, the agency argues, and we agree, that the differing treatment by the agency was a result of two different types of flaws in the offerors’ proposals. With regard to Harris’s proposal, the evaluators noted a sound technical approach, but concluded that Harris had simply overestimated the number of hours that would be needed for the [DELETED] to accomplish their assigned tasks, given the efficiencies that would be achieved by using [DELETED] to accomplish some of the pre-production steps. Tr. at 151, 159. In contrast, with regard to Boeing’s proposal, the agency found that, while the number of hours proposed was realistic for Boeing’s chosen approach, the technical approach itself was problematic because it was inefficient. Tr. at 150-51. As a result, the evaluators assigned a weakness to Boeing’s proposal due to its flawed technical approach. Id.; AR, Tab K.1, Boeing FPR Consensus Evaluation Report, at 6. We find that the difference in the agency’s treatment of the offerors stemmed from differences in the offerors’ proposals, not from disparate treatment. Therefore, we see no merit in Boeing’s claim of an unequal or disparate evaluation.

The protest is denied.

Susan A. Poling
General Counsel



[1] The security factor was pass/fail. RFP at 97.

[2] The identified features were mostly natural features, such as mountain ranges, oceans, and canyons. RFP at 269-70; Tr. at 113.

[3] At the hearing our Office conducted in this matter, the chair of the technical and management evaluation panel testified that this task involved “fill[ing] in the blanks” on existing data in the agency’s GeoNames Database. Hearing Transcript (Tr.) at 115.

[4] Enriching the agency’s database information included updating the information and ensuring that it represented an accurate portrayal of features in the area of interest. Tr. at 79. This would be done by first validating the existing data, and then collecting and integrating additional data. Id. at 79-80.

[5] RPM data involves analyzing images of the same location, taken at different points in time, to identify potential changes on the face of the earth. See Tr. at 31, 33.

[6] Human geography data could include, for example, demographic and population measures, economy, ethnicity, religion, language, social and political groups, land use, climate, water supply, and education. RFP at 209.

[7] As set forth above, refined persistent change model data involves analyzing images of the same location, taken at different points in time, to identify potential changes on the face of the earth. See Tr. at 31, 33.

[8] Because Harris’s proposal offered a different technical approach to performing this sample task, the proposal’s approach to calculating the hours required to perform the work also varied. As the chair of the technical and management evaluation panel explained at the hearing, Boeing’s approach to the calculation was based on the number of RPM edits per hour a person could perform, while Harris’s approach to the calculation was based on its [DELETED]’s ability to analyze a certain area (expressed in square kilometers) per hour. Tr. at 86, 90. Thus, rather than estimating labor hours based on RPM edits per hour, Harris’s proposal used square kilometers as its basis for calculations. See AR, Tab I.2, Harris Cost/Price Proposal, at 469.
