
Smiths Detection, Inc.

B-298838, B-298838.2, Dec 22, 2006

Highlights

Smiths Detection, Inc. protests the award of contracts to Science Applications International Corporation (SAIC), American Science & Engineering, Inc. (AS&E), and L3 Communications Security & Detection Systems, Inc. (L3) under request for proposals No. HSHQDC-05-R-00007, issued by the Department of Homeland Security (DHS), Domestic Nuclear Detection Office (DNDO), for the research and development, developmental test and evaluation, spiral development, pilot deployment, production, and operational deployment of the Cargo Advanced Automated Radiography System (CAARS) Program. Smiths objects to the agency's evaluation of proposals, and maintains that the agency failed to conduct meaningful discussions, conducted a flawed cost realism analysis, and failed to make a proper best value determination.

We deny the protest.

B-298838, B-298838.2, Smiths Detection, Inc., December 22, 2006

DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of: Smiths Detection, Inc.

File: B-298838, B-298838.2

Date: December 22, 2006

John S. Pachter, Esq., Jonathan D. Shaffer, Esq., Richard C. Johnson, Esq., Tamara F. Dunlap, Esq., Stephanie D. Capps, Esq., and Mary Pat Gregory, Esq., Smith Pachter McWhorter PLC, for the protester.

Thomas P. Barletta, Esq., Daniel C. Sauls, Esq., Paul R. Hurst, Esq., Paul I. Lieberman, Esq., Michael C. Drew, Esq., and Ana Holmes Voss, Esq., Steptoe & Johnson LLP, for Science Applications International Corporation, an intervenor.

Robert J. Sherry, Esq., Sheila A. Armstrong, Esq., Laura Patterson Hoffman, Esq., and Matthew G. Ball, Esq., Kirkpatrick & Lockhart Nicholson Graham LLP, for American Science & Engineering, Inc., an intervenor.

Catherine Anderson, Esq., and Marion Cordova, Esq., Department of Homeland Security, for the agency.

Linda C. Glass, Esq., and Glenn G. Wolcott, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1. Protest that agency's evaluation and source selection decision (SSD) were flawed is denied where the record shows that the agency's evaluation and SSD were reasonable and consistent with the solicitation's evaluation factors.

2. Source selection authority (SSA) performed a reasonable cost/technical tradeoff in determining that the awardees' proposals represented the best value, where the SSA's judgment, based upon the results of a reasonable, documented technical evaluation, demonstrates the SSA's understanding of the evaluated strengths and weaknesses of the respective proposals, and shows a reasonable weighing of the offerors' respective technical and cost advantages consistent with the solicitation's evaluation criteria.

3. Discussions were meaningful where the discussions led the protester into the areas of its proposal that required improvement or further clarification.

4. Agency's cost evaluation was reasonable even though the agency did not verify each and every item of an offeror's proposed costs in conducting its cost realism analysis, since the cost evaluation was the result of the agency's exercise of informed judgment.

DECISION

Smiths Detection, Inc. protests the award of contracts to Science Applications International Corporation (SAIC), American Science & Engineering, Inc. (AS&E), and L3 Communications Security & Detection Systems, Inc. (L3) under request for proposals No. HSHQDC-05-R-00007, issued by the Department of Homeland Security (DHS), Domestic Nuclear Detection Office (DNDO), for the research and development, developmental test and evaluation, spiral development, pilot deployment, production, and operational deployment of the Cargo Advanced Automated Radiography System (CAARS) Program.[1] Smiths objects to the agency's evaluation of proposals, and maintains that the agency failed to conduct meaningful discussions, conducted a flawed cost realism analysis, and failed to make a proper best value determination.

We deny the protest.

BACKGROUND

The solicitation was issued on February 17, 2006, and as amended, provided for the award of up to three indefinite-delivery/indefinite-quantity (ID/IQ) contracts for a period of 7 years. Simultaneously with, or immediately following, the award of the basic ID/IQ contract, the agency plans to issue Task Order No. 1, covering concept and technology development and developmental test and evaluation, to each ID/IQ contractor on a cost plus award fee basis. This effort will culminate with the delivery of one prototype CAARS to DNDO for test and evaluation.

The RFP provided that the award would be made based on the best overall proposals that are determined to be most beneficial to the government with appropriate consideration given to the following evaluation factors listed in descending order of importance: technical, management, past performance and cost. RFP sect. M.1. The RFP provided that when combined, the technical, management, and past performance factors were significantly more important than cost.

The RFP also provided that technical and management proposals would be evaluated qualitatively and rated as exceptional, acceptable, marginal, or unacceptable.[2] Under the past performance evaluation factor, the evaluation would be based on a confidence assessment of high confidence, significant confidence, satisfactory confidence, unknown confidence, little confidence or no confidence. Offerors' proposed costs were not to be rated or scored, but were to be evaluated for realism.

The agency received eight proposals by the closing date. The technical evaluation board (TEB) evaluated the initial technical proposals and a cost evaluation board (CEB) evaluated the initial cost proposals. As a result of the initial evaluations, five offerors were included in the competitive range. On July 25, to begin discussions, all competitive range offerors were provided a set of clarification/discussion questions. In addition, each offeror was provided a copy of the results of the TEB's Initial Consensus Report for its individual proposal. AR, Tab 25, Initial TEB Report. The TEB Report included initial ratings and the agency's narrative evaluation for each subfactor, as well as a bullet summary of each strength, weakness and deficiency. DNDO also held oral discussions with each competitive range offeror. Following discussions, final proposal revisions (FPR) were received from all five offerors.

The FPRs were evaluated by the TEB and a consensus rating for each offeror was assigned. The CEB reviewed the costs proposed by each of the five offerors and determined the cost in each cost proposal to be reasonable, realistic and complete for the level of effort proposed by each offeror. AR, Tab 25, Post Negotiation Memo, at 9. The final evaluation results with regard to the proposals of Smiths and the three awardees were as follows:[3]

Offeror | Technical Rating | Management Rating | Past Performance Confidence Assessment | Overall Rating | Total Evaluated Cost + Fee ($)
AS&E | Exceptional | Acceptable | Satisfactory | Exceptional | 28,830,288
L3 | Exceptional | Exceptional | Significant | Exceptional | 7,491,713
SAIC | Exceptional | Acceptable | Significant | Exceptional | 13,490,390
Smiths | Acceptable | Acceptable | Satisfactory | Acceptable | [DELETED]

The source selection authority (SSA) reviewed the evaluation results and compared the proposals, giving consideration to each of the evaluation factors set forth in the RFP and their relative weighting. AR, Tab 4, SSD at 24. The SSA determined that award should be made to L3, SAIC and AS&E whose proposals represented the best value to the government, noting that there was a clear demarcation between the proposals of L3, SAIC and AS&E and that of Smiths.

Specifically, the SSA noted that L3 and SAIC submitted proposals that were rated higher than Smiths' proposal with regard to the non-cost evaluation factors, and offered lower evaluated costs than Smiths' proposal. With regard to AS&E, the SSA noted that AS&E's proposal received an overall rating of exceptional for the non-cost evaluation factors, but offered a much higher cost. The SSA concluded that AS&E's proposal was sufficiently superior in technical quality to warrant paying a cost/price premium of approximately [DELETED] million over the cost/price offered by Smiths' proposal. The SSA specifically noted that AS&E was the only offeror to propose a technological solution that was not based on [DELETED] and that AS&E's unique software design had the capability to display and examine container cargo in [DELETED] in [DELETED] ranges of materials, which exceeded the performance requirement of two ranges and increased the ability to discriminate against innocent alarms. The SSA concluded that AS&E's approach had the potential to provide performance far beyond the [DELETED] industry standard proposed by the other offerors. The SSA further concluded that, in comparison, AS&E's technology, with its ability to portray high Z detections [DELETED], has the potential to significantly improve the accuracy and speed of detection and alarm against threat materials, with an exceptionally low false alarm rate. The SSA recognized that there was a risk associated with the development of AS&E's technology, but determined that the potential payoff of such a technology outweighed the risks, especially since the technology had already been demonstrated in a laboratory environment.

Based on the evaluation discussed above, the SSA subsequently concluded that, when comparing Smiths' lower-rated proposal to AS&E's superior proposal, Smiths' plan to execute the program relying on staff located in both [DELETED], without a sufficiently detailed explanation of this staffing approach, introduced risk to the program that offset the potential savings associated with Smiths' lower evaluated cost. Accordingly, the SSA determined, based on an integrated assessment of all proposals, that L3, SAIC and AS&E represented the best overall value to the government. Contract awards were made to L3, SAIC and AS&E on September 8, 2006. After receiving a debriefing, Smiths filed an initial protest with our Office on September 19, 2006, and a supplemental protest on October 30, 2006.

DISCUSSION

Smiths maintains that the agency failed to evaluate offerors on a consistent and equitable basis. More specifically, Smiths contends that the agency improperly evaluated all offerors' proposals under the technical factor, the management factor and the past performance factor, that the agency performed an improper cost evaluation of L3's proposal, that the agency failed to conduct meaningful discussions with Smiths, and that the agency failed to perform a proper best value determination.[4]

Our Office reviews challenges to an agency's evaluation of proposals only to determine whether the agency acted reasonably and in accord with the solicitation's evaluation criteria and applicable procurement statutes and regulations. Marine Animal Prods. Int'l, Inc., B-247150.2, July 13, 1992, 92-2 CPD para. 16 at 5. A protester's mere disagreement with the agency's judgment is not sufficient to establish that an agency acted unreasonably. Entz Aerodyne, Inc., B-293531, Mar. 9, 2004, 2004 CPD para. 70 at 3.

Technical Factors

The technical evaluation factor consisted of the following subfactors: software design, hardware design, system performance, open architecture design and production capabilities. The software design and hardware design subfactors were equally important and, when combined, were more important than the other three subfactors. RFP sect. M.1. As shown above, Smiths received an overall rating of acceptable under the technical evaluation factor, while L3, SAIC and AS&E all were rated exceptional. The four offerors were all rated exceptional under the software design subfactor and acceptable under the hardware design subfactor. Smiths' basic challenge to the evaluation under these factors is that the proposals were improperly and arbitrarily given equal adjectival ratings when the proposals had differing numbers of strengths, weaknesses and deficiencies. For example, Smiths maintains that SAIC should not have been rated exceptional under the software design subfactor because SAIC had only three strengths, while Smiths had six strengths.

In our view, the protester's arguments are misplaced. The number of strengths, deficiencies, or weaknesses noted in an offeror's proposal does not dictate what overall adjectival rating a proposal receives. The record shows that the evaluators examined the totality of the approach of each offeror for each of the evaluation factors. Moreover, adjectival ratings, like scores, are useful guides to intelligent decision-making; they are not binding on the SSA, who has discretion to determine the weight to accord them in making an award decision. Porter/Novelli, B-258831, Feb. 21, 1995, 95-1 CPD para. 101 at 5. Of concern to our Office is whether the record as a whole supports the reasonableness of the evaluation results and the source selection decision. Orbital Techs. Corp., B-281453 et al., 99-1 CPD para. 59 at 9. Here, the record reflects that along with the adjectival ratings, the TEB provided a narrative description of the evaluation of each proposal for each evaluation factor. The SSA was provided with, and considered, the TEB's report on its evaluation of the revised proposals; this report summarized the TEB's views of the proposals in the context of the adjectival ratings assigned to the proposals under each evaluation factor.

Based on our review of the record, it is clear that Smiths' protest merely expresses disagreement with the agency's judgment as to the value of the offerors' various technical approaches. For example, under the software design factor, the agency concluded that all four offerors' software designs exceeded the RFP requirements and were, therefore, rated exceptional. While the protester disagrees with the agency's determination with respect to the SAIC proposal, on the basis that SAIC had fewer identified strengths, Smiths' protest does not demonstrate that the agency's evaluation was unreasonable.

Smiths also argues that the evaluation of its proposal under the hardware design subfactor was improper because the agency assigned a weakness for Smiths' failure to demonstrate a solution to [DELETED] related to the [DELETED].[5] Smiths complains that SAIC was not assigned a similar weakness under the hardware design subfactor even though SAIC's proposal, in Smiths' view, presented even more risk with regard to [DELETED]. Overall, Smiths maintains that all awardees should have received reduced ratings under the hardware design subfactor.

The record shows that all four offerors received acceptable ratings for hardware design. While the agency noted strengths in each offeror's approach, the perceived risks in each approach resulted in ratings of only acceptable. For example, the agency noted that AS&E proposed several solutions that exceeded the requirements, but AS&E was only rated acceptable because the agency recognized a risk associated with the engineering of AS&E's [DELETED]. Likewise, the L3 design exceeded several RFP requirements, but was rated only acceptable because of risks in the area of [DELETED]. SAIC's weakness centered on its design approach which relied on the successful design development of an [DELETED] that would not [DELETED]. AR, Tab 7, TEB Report. Overall, the record clearly shows that the evaluators understood the hardware design of each offeror, recognized each design's strengths, risks, and limitations, and reasonably concluded that each offeror's design was properly rated as acceptable based on the risks associated with its hardware design.

The protester argues that during discussions, it demonstrated both its understanding of, and a proposed solution for, [DELETED] but that the agency, by requiring Smiths to demonstrate its proposed solution, improperly held Smiths to a standard not required by the solicitation. The agency responds that [DELETED] and intensity variation are problems that cause processing noise which leads to missed detections and/or high false alarms, elaborating that it did not expect Smiths to demonstrate its solution, but rather to demonstrate that it understood the problem by addressing basic issues involved in its plan--such as defining the sources of [DELETED] and identifying mitigation strategies associated with the sources. AR, Tab 4, SSD, at 20; Supplemental Contracting Officer's Statement at 6. The agency concluded that Smiths did not adequately address these issues.[6] In any event, Smiths continues to argue that an exceptional rating under the hardware design evaluation factor may have resulted in an overall rating of exceptional for Smiths' proposal. However, from our review of the record, we do not find the agency's evaluation unreasonable.[7]

Management Factor

The management evaluation factor consisted of the following subfactors: program management approach, management control processes, and utilization of small disadvantaged business concerns. The program management approach and management control processes subfactors were of equal importance and were significantly more important than the utilization of small disadvantaged business concerns subfactor.

For the management evaluation factor, Smiths' proposal received an overall rating of acceptable. Smiths proposed to execute this program by relying on staff located both in the [DELETED]. For the program management approach subfactor, Smiths' proposal was rated unacceptable because the evaluators concluded that Smiths failed to adequately explain how it planned to execute the program utilizing the expertise and experience of its personnel [DELETED] who lacked security clearances--with the less experienced, but security-cleared, personnel at its facility in [DELETED].[8] AR, Tab 4, SSD, at 21. The agency noted that although Smiths' staff in [DELETED] has experience with x-ray development efforts, this is the first such effort for the staff in [DELETED]. The agency described Smiths' plan as a "leader-follower" arrangement in which some of the personnel at Smiths' [DELETED] facility would observe initial development in [DELETED], and then development would move to [DELETED]. The agency determined that Smiths' proposal inadequately described the division of labor, responsibility and accountability that would be required to implement its management approach, particularly in light of the security issues. Additionally, the agency concluded that three of Smiths' key personnel, who recently joined Smiths, lacked experience and expertise with Smiths' processes and technology, which created additional risk in Smiths' proposal. Finally, the agency concluded that the software integrated product team leader had no apparent experience in high Z detection software development relevant to this effort.

Smiths argues that its proposal did not use the term "leader-follower," and that it was improper for the agency to downgrade Smiths' proposal for this approach based on the agency's concerns regarding Smiths' compliance with the solicitation's security clearance requirements.[9] Smiths maintains that it explained during oral discussions how it intended to minimize the risk created by reliance on staff located in [DELETED].

Smiths' explanations notwithstanding, the agency concluded that, as a result of the oral discussions, it became even more apparent that there was no clear delineation of roles and responsibilities between the staff at the [DELETED] facility and the staff at [DELETED]. Specifically, the agency found that Smiths never adequately explained how it would accomplish the development of the critical inspection and detection software and algorithms by relying on [DELETED] employees with no security clearances, and [DELETED] employees with security clearances--but no experience with Smiths' processes or technologies and limited experience in developing radiography or high Z detection software.

Although Smiths complains about the agency's use of the term "leader-follower" to describe Smiths' proposed approach, based on our review of the record it is clear the agency fully understood the approach that Smiths proposed to employ, and believed that the proposed approach created unacceptable risk to the program, particularly in light of the security requirements. In pursuing this protest, Smiths has failed to demonstrate that the agency's concerns are unreasonable.[10]

Next, Smiths objects to the agency's conclusions that its recently hired key personnel (program manager, chief engineer and lead software programmer) lacked experience and expertise with Smiths' processes and technologies. Smiths points out that while these personnel were recent hires, Smiths' program management and system engineering processes are not unusual or unique. Moreover, Smiths states that the program manager, chief engineer and software lead will not be operating alone, but will be assisted by and collaborate with highly qualified integrated product team members who possess extensive experience. The record shows that the agency understood and recognized the experience of Smiths' key personnel, but nonetheless concluded that their lack of experience and expertise with Smiths' processes and technology, coupled with the "leader-follower" approach, as identified by the agency, posed an unacceptable risk. While Smiths may disagree with the agency's conclusion, its disagreement does not make the evaluation unreasonable.

Finally, with respect to the utilization of SDB concerns evaluation factor, while Smiths' proposal was rated excellent, Smiths maintains that the agency improperly rated AS&E, a small business, and L3 acceptable under this evaluation factor, maintaining that these two offerors failed to meet the solicitation's minimum requirements, including identification of specific SDB goals. However, while the RFP listed several criteria to be evaluated under the utilization of SDB concerns evaluation factor, including consideration of the extent to which SDB concerns are specifically identified, the RFP did not indicate that the failure to provide specific SDB contracting goals would result in an offeror being rated unacceptable; rather, the RFP gave offerors the opportunity to explain any reasons for the lack of SDB participation. RFP sect. M.3.2. In fact, L3 proposed [DELETED] to participate in performing the contract and indicated that it anticipated utilizing SDBs in the areas of [DELETED]. Similarly, AS&E provided information concerning its proposed goal for SDB participation and identified [DELETED].[11] Based on our review of the solicitation and the offerors' proposals, the agency reasonably determined that AS&E and L3 met the solicitation requirements regarding utilization of SDB concerns.[12]

Cost Evaluation

Smiths protests that the agency failed to properly evaluate L3's cost proposal. According to Smiths, L3's cost proposal failed to include all required costs and was based on differing, inconsistent estimates of hours for software development.

In performing cost evaluations, an agency is not required to verify the cost for each and every item proposed. Rather, the evaluation of competing cost proposals requires the exercise of informed judgment by the contracting agency; since such an analysis is a judgment matter on the part of the contracting agency, our review is limited to a determination of whether an agency's cost evaluation was reasonably based. Fairchild Weston Sys., Inc., B-229568.2, Apr. 22, 1988, 88-1 CPD para. 394 at 4. As discussed below, our review of the record leads us to conclude that the agency's cost evaluation was reasonable and consistent with the terms of the solicitation.

The RFP provided that, in performing the cost evaluation, the agency would examine the offeror's proposed labor hours, labor rates, materials costs, burden rates, and other costs in light of information available to the contracting officer, including the relationship of such proposed labor hours and costs to the effort described in the offeror's overall proposal and government estimates, and any other costs likely to be incurred by the offeror in performance of the requirement. RFP sect. M.3.4. The record shows that the agency reviewed and evaluated specific elements of each offeror's cost proposal against the corresponding elements of the respective technical proposal, and determined that the estimated proposed cost elements were realistic, given the technical solution and approach being proposed. AR, Tab 5, Post Negotiation Memorandum, at 19. The protester argues that L3's proposed costs, which were the lowest, should have been increased because L3 allegedly did not include costs for certain elements that Smiths maintains should have been included. The agency responds that each offeror, including L3, proposed its own technology for development design, based on its particular approach to contract performance. Based on the agency's understanding of each offeror's proposed approach, no adjustments to the individual cost proposals were made. Id. attach. 2a, at 4. Based on our review of the record, the agency's cost evaluation was reasonable and consistent with the terms of the solicitation.

Moreover, the record shows that even if the agency had made all of the cost adjustments that Smiths maintains should have been made, L3's proposal would still have been significantly less expensive, and higher rated technically, than Smiths' proposal. Consequently, Smiths has not shown that it has been prejudiced in this regard. See Wyle Labs., Inc., B-288892, B-288892.2, Dec. 19, 2001, 2002 CPD para. 12 at 17-18.

Meaningful Discussions

Smiths protests that the agency did not conduct meaningful discussions with Smiths with respect to the agency's concerns regarding the [DELETED] issue and the program management issue. Specifically, Smiths contends that the agency never informed Smiths that it had to "demonstrate" a solution resolving the agency's concerns regarding [DELETED] or that its management approach was unacceptable because of Smiths' failure to provide information regarding division of labor, responsibility and accountability.

Contracting agencies have considerable discretion in determining the nature and scope of discussions. PRB Assocs., Inc., B-277994, B-277994.2, Dec. 18, 1997, 98-1 CPD para. 13 at 6. Although discussions must be "meaningful," that is, sufficiently detailed so as to lead an offeror into areas of its proposal requiring amplification or revision, an agency is not required to "spoon feed" an offeror as to each and every item that must be revised to improve its proposal or to achieve the maximum score, Uniband, Inc., B-289305, Feb. 8, 2002, 2002 CPD para. 51 at 11; nor is an agency required to hold successive rounds of discussions until all proposal defects have been corrected. Metro Mach. Corp., B-295744, B-295744.2, Apr. 21, 2005, 2005 CPD para. 112 at 19.

Here, the record shows that the protester was given several opportunities to respond to the agency's concerns regarding both the [DELETED] issue and Smiths' management approach; Smiths simply failed to adequately respond to the agency's questions. Specifically, the agency provided all competitive range offerors with a set of clarification/discussion questions, a copy of the results of the initial evaluation of each respective proposal, and the TEB's Initial Consensus Report, prior to conducting oral discussions with each offeror. With respect to the [DELETED] issue, the TEB report indicated that there was a risk associated with [DELETED]. Further, the agency's written discussion question specifically asked Smiths to describe how issues associated with [DELETED] would be minimized. Finally, the record shows that the issue was raised again by the agency during oral discussions.

Here, with regard to the agency's concerns regarding [DELETED], the record clearly establishes that the agency repeatedly brought these concerns to Smiths' attention during discussions; Smiths simply did not provide information that eliminated those concerns. Instead, Smiths acknowledges in its protest submissions that there was additional information that it could have provided, but did not. Based on the agency's repeated expressions of concern regarding this matter, it was incumbent on Smiths to provide all available information regarding its ability to address those concerns. In this regard, an offeror is responsible for affirmatively demonstrating the merits of its proposal. See generally Will-Burt Co., B-250626.2, Jan. 25, 1993, 93-1 CPD para. 61 at 4. We deny this aspect of Smiths' protest.

With respect to the adequacy of the agency's discussions regarding Smiths' management approach, the record shows that Smiths was provided a copy of the TEB report--which expressly stated that Smiths' management approach was considered unacceptable due to the risks the agency associated with Smiths' reliance on staff located in both [DELETED]. Further, Smiths was specifically asked to describe how the facilities, equipment, quality system, and personnel that manufactured the system in [DELETED] would be duplicated in [DELETED]. AR, Tab 10, Smiths Clarification/Discussions, at 2. Finally, during oral discussions this issue was raised yet again. It was incumbent upon Smiths to provide a detailed explanation that demonstrated the soundness of its management approach. The agency reasonably concluded that Smiths failed to adequately respond to its concerns and, in fact, that Smiths' responses increased the agency's concerns regarding risk to the program.[13] Based on our review of the record, we find the discussions were meaningful.

Source Selection Decisions

Finally, Smiths challenges the reasonableness of the SSA's selection decisions. Here, as discussed above, we have concluded that the agency's evaluation was reasonable and consistent with the evaluation criteria. As explained above, the SSA selected L3 and SAIC for award because their proposals received higher technical ratings than Smiths' proposal and offered lower evaluated costs. Accordingly, no cost/technical tradeoffs were required for the awards to L3 and SAIC. With regard to AS&E's proposal, the SSA performed a tradeoff analysis between AS&E's higher-rated, but higher-cost, proposal and Smiths' lower-rated, lower-cost proposal. In performing this tradeoff, the SSA recognized that the AS&E approach presented a risk, but concluded that AS&E's unique technical approach was worth the additional cost. The SSA also concluded that Smiths' management approach introduced an unacceptable risk to the program that offset any potential savings associated with Smiths' proposal.

Based on our review of the record, we conclude that the agency's evaluation and source selection decision were reasonable and in accordance with the terms of the solicitation.

The protest is denied.

Gary L. Kepplinger

General Counsel



[1] The DNDO is chartered to develop, acquire, and support the deployment and improvement of a domestic system to detect attempts to import, assemble, or transport a nuclear explosive device, fissile material, or radiological material intended for illicit use. RFP attach. 1, Statement of Work at 1. The objective of CAARS is to provide the capability to automatically detect materials with a high atomic number (high Z). Agency Report (AR) Tab 4, SSD at 1. The implemented technology will be able to distinguish between low density materials such as aluminum and steel, and higher density materials such as lead, uranium, or plutonium.

[2] The RFP defined an exceptional rating as "exceeds specified minimum performance or capability requirements in a way beneficial to the DHS/DNDO." RFP sect. M.2.1.

[3] The agency's evaluation of other offerors' proposals is not relevant to Smiths' protest; accordingly, other offerors' proposals are not further discussed.

[4] In its comments to the agency's supplemental report, Smiths withdrew various protest issues, including: the agency's alleged reliance on information contained in SAIC's proposal to evaluate Smiths' proposal; the agency's alleged failure to assign a weakness to SAIC's and L3's proposals for shielding capabilities; and the agency's alleged overrating of L3's management control process.

[5] The protester also argues that had the agency not arbitrarily shifted the [DELETED] weakness to the more heavily weighted hardware design subfactor instead of the system performance subfactor, Smiths would have been the only offeror with no weaknesses under the first two technical subfactors and, on this basis, asserts that its proposal would have received an overall rating of exceptional. As stated above, Smiths places too much emphasis on the adjectival ratings rather than the narrative evaluation regarding the value of its technical approach.

[6] In contrast, the record shows that SAIC specifically identified three primary sources of [DELETED] as well as specific mitigation measures related to each source. Id.

[7] Similar to the arguments discussed above, Smiths challenges the agency's evaluation with regard to the specific adjectival ratings assigned under the system performance and open architecture design subfactors, arguing that the awardees' proposals should have been downgraded for various risks related to their various approaches. For example, the protester maintains that AS&E should not have been rated acceptable under the open architecture design subfactor because the agency would not be obtaining sufficient data from AS&E under AS&E's approach, and that AS&E's proposal was rated exceptional based only on a proposed concept. Smiths also contends that L3 failed to provide sufficient information regarding radiation shielding. However, our review of the record shows that the evaluators reasonably considered all these risks when rating the respective proposals and that the SSA took them into account when making the selection decision. AR, Tab 4, SSD.

[8] Prior to the closing date for submission of initial proposals, the solicitation was amended to require security clearances for key personnel. The record indicates that Smiths' proposed approach to rely on staff in both the [DELETED] was in response to this solicitation amendment.

[9] Smiths maintains that the agency's late issuance of a mandatory security requirement for key personnel improperly resulted in Smiths receiving an unacceptable rating for program management. To the extent Smiths' protest is challenging the provisions of the RFP requiring security clearances for key personnel, the protest is untimely. Our Bid Protest Regulations require that protests based upon alleged improprieties which do not exist in the initial solicitation but which are subsequently incorporated into the solicitation must be protested not later than the next closing time for receipt of proposals following the incorporation. 4 C.F.R. sect. 21.2(a)(1) (2006).

[10] Additionally, Smiths argues that its unacceptable rating for program management approach is inconsistent with its rating of acceptable for production capabilities/facilities. We do not see any inconsistency here. The evaluators found that Smiths provided a sound plan to accomplish production that provided high confidence that Smiths could accomplish the required production rate. The agency nevertheless remained legitimately concerned about Smiths' plan for transitioning its production expertise from [DELETED]. Based on Smiths' proposed management approach, the agency lacked confidence that Smiths' proposed personnel had the requisite knowledge to successfully execute its production program. Based on the record, it was not unreasonable for the agency to conclude that Smiths had the production capability, but that the proposed management approach created unacceptable risk.

[11] AS&E, a small business, is not required to maintain small business subcontracting plans.

[12] Smiths also asserts that the ratings for L3's and SAIC's proposals should have been lower in various other non-cost evaluation areas, including past performance. We have reviewed all of Smiths' arguments in this regard and conclude that they constitute disagreement with the agency's judgments. As such, Smiths' arguments provide no basis for sustaining the protest.

[13] With respect to any weaknesses introduced in Smiths' final proposal revision after discussions were concluded, the agency had no obligation to reopen discussions to address these matters. See Ouachita Mowing, Inc., B-276075, B-276075.2, May 8, 1997, 97-1 CPD para. 167 at 4.
