
Harris IT Services Corporation

B-410898.5 Feb 23, 2016

Highlights

Harris IT Services Corporation, of Herndon, Virginia, protests the exclusion of its proposal from the competitive range under request for proposals (RFP) No. VA118-15-R-0558, issued by the Department of Veterans Affairs (VA) for information technology (IT) services. The protester challenges the agency's evaluation of its technical proposal and contends that the VA failed to meaningfully consider all of the evaluation factors when establishing the competitive range.

We deny the protest.


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  Harris IT Services Corporation

File:  B-410898.5

Date:  February 23, 2016

Andrew E. Shipley, Esq., Lee P. Curtis, Esq., John F. Henault, Esq., and Seth H. Locke, Esq., Perkins Coie LLP, for the protester.
Desiree A. DiCorcia, Esq., Frank V. DiNicola, Esq., Tiffany Alford, Esq., and Lea Duerinck, Esq., Department of Veterans Affairs, for the agency.
Brent Burris, Esq., and Edward Goldstein, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

1.  Protest challenging agency’s evaluation of sample task responses using model answers that were not disclosed to offerors is denied where the record shows that the evaluation was reasonable and consistent with the solicitation.

2.  Protest that agency failed to meaningfully consider all evaluation factors when establishing competitive range is denied where record reflects that agency’s competitive range determination was reasonable and consistent with the solicitation.

DECISION

Harris IT Services Corporation, of Herndon, Virginia, protests the exclusion of its proposal from the competitive range under request for proposals (RFP) No. VA118-15-R-0558, issued by the Department of Veterans Affairs (VA) for information technology (IT) services.[1]  The protester challenges the agency’s evaluation of its technical proposal and contends that the VA failed to meaningfully consider all of the evaluation factors when establishing the competitive range. 

We deny the protest.

BACKGROUND

The RFP, issued on November 19, 2014, anticipated the award of up to 20 indefinite-delivery/indefinite-quantity (IDIQ) contracts, each with a 5-year base period and one 5-year option period.[2]  CO’s Statement at 1.  The solicitation provided that task orders under the awarded contracts would be issued on fixed-price, time-and-materials, and cost-reimbursement bases, and established a maximum total value of $22.3 billion for all orders over the base and option periods.  RFP at 89-90.  The solicitation sought IT services encompassing the entire range of IT requirements for the VA.  Performance Work Statement (PWS) at 16.

The RFP established that award would be made on a best-value basis considering price and the following five non-price evaluation factors:  (1) technical;[3] (2) past performance;[4] (3) veterans involvement;[5] (4) veterans employment;[6] and (5) small business participation commitment (SBPC).[7]  RFP at 120-21.  The technical factor was further divided into two subfactors, sample tasks and management.  Id. at 121.

For the purpose of making the best-value award decisions, the RFP established the relative importance of the various factors and subfactors.  In this regard, the technical factor was significantly more important than past performance, which was slightly more important than veterans involvement, which was slightly more important than veterans employment, which was slightly more important than SBPC, which was slightly more important than price.  Id. at 120.  Thus, price was the least important factor.  Under the most important factor, technical, the RFP indicated that the sample tasks subfactor was more important than the management subfactor.  Id.

Regarding the sample tasks subfactor, the RFP directed offerors to describe their approach to performing three hypothetical tasks, two of which are at issue here.  Id. at 113.  Sample task 1 was predicated on the VA having acquired a common cloud computing environment for use across the agency.  RFP, Attach. 16, Sample Tasks.  Under task 1, the contractor was required to support the migration of 17 IT applications from their current environments to the cloud environment.  Id.  In responding to sample task 1, the RFP directed offerors to describe their “approach for executing all migration efforts required” for the task and their “approach for providing sustainment services” so as to “ensure that the applications operate and perform as required in the new cloud environment.”  Id.  The RFP further instructed offerors to identify the labor categories that they would use to perform the sample task.  Id.

Sample task 2 involved identifying and analyzing modernization efforts for the VA’s Veterans Health Information Systems and Technology Architecture (VistA).[8]  Id.  In responding to sample task 2, the RFP directed offerors to “[d]escribe the key requirements and technical approaches [they] would consider to implement a program to modernize VistA” and to identify the labor categories they would use to perform this task.  Id.

Concerning the evaluation of offerors’ sample task responses, the RFP provided that the tasks were designed to “test” offerors’ “expertise and innovative capabilities” with respect to the types of work required by the solicitation.  RFP at 121.  The RFP informed offerors that they would not be given an opportunity during discussions to correct or revise their sample task responses.  Id.  The RFP further provided that the agency would evaluate the sample task responses to assess offerors’ understanding of the problems and the feasibility of their approach.  RFP at 121-22.  As to the former, the RFP provided for evaluating the extent to which an offeror’s proposal “demonstrates a clear understanding of all features involved in solving the problems and meeting the requirements presented by the sample task” as well as “the extent to which uncertainties are identified and resolutions proposed.”  Id. at 122.

With regard to feasibility of approach, the RFP provided for evaluating whether the offeror’s proposed “methods and approach to meeting the sample task requirements provided the Government with a high level of confidence of successful completion.”  Id.  The solicitation also indicated that the VA would evaluate the realism of the labor categories proposed to perform the sample tasks.  Id.

Under the management subfactor, an offeror was to address: (1) how the T4NG PWS requirements would be accomplished by the offeror and its subcontractors; (2) a proposed management approach to ensure development of a quality assurance system and processes to capture performance and contract metrics; (3) a proposed approach to recognize, react to, and correct problems that may arise in the performance of a task order; (4) a proposed approach to effectively forecast and control costs; and (5) how the offeror would attract and retain its workforce.  Id. at 114.  As with the sample tasks subfactor, the RFP provided that, in evaluating offerors’ responses under this subfactor, the agency would evaluate the extent to which offerors demonstrated an understanding of the problems and the feasibility of their approaches.  Id. at 122.

The VA received timely proposals from [deleted] offerors, [deleted] of which were submitted by large businesses, including Harris.[9]  AR, Tab 10, Competitive Range Memo, at 3.  The results of the agency’s evaluation of the protester’s proposal are as follows:

 

Harris Evaluation

Overall Technical:  Good
Technical--Sample Tasks Subfactor:  Good
  Sample Task 1 (Cloud Migration):  Acceptable
  Sample Task 2 (VistA Modernization):  Acceptable
  Sample Task 3 (Telehealth Expansion)[10]:  Outstanding
Technical--Management Subfactor:  Acceptable
Past Performance:  Low Risk
Veterans Involvement:  No Credit
Veterans Employment (total number of veterans; percentage of workforce comprised of veterans):  479 veterans; 20.74 percent
Small Business Participation Commitment:  Outstanding
Total Evaluated Price[11]:  $15,397,115,988


Id. at 8.

To facilitate its evaluation of the sample tasks, VA technical experts developed a model for each sample task prior to the receipt of proposals.  AR, Tab 9, Sample Task Evaluation, at 1-3.  The model identified key focus areas and lower-level sub‑areas that, in the agency’s view, an offeror would need to address to demonstrate its understanding of the challenges associated with the task orders and the feasibility of its approach to performing the task orders.  Id.

With respect to sample task 1, the agency’s technical experts identified the following five key focus areas:  (1) project management; (2) migration analysis; (3) applications migration; (4) system cutover; and (5) operations and sustainment.  Id. at 1.  Under each of the five key focus areas, the agency also identified various sub‑areas.  For example, under the project management focus area, the VA identified four sub-areas (management activities for the project, methods to develop a schedule, methods to establish stakeholders, and the application of resources and staffing (labor categories)).  Id.

Similarly, with respect to sample task 2, the VA identified five key focus areas:  (1) implementing a program managerial structure; (2) implementing key program processes; (3) understanding of key requirements; (4) understanding of VistA legacy knowledge; and (5) a technical approach to implement a VistA modernization program.  Id.  For each key focus area, the agency identified various sub-areas.  For example, under the key focus area of implementing a program managerial structure, the VA identified three sub-areas (coordination of existing VistA modernization efforts, implementing a program structure, and understanding proper governance).  Id.

According to the VA, the agency rated an offeror’s sample task responses based on the extent to which the offeror addressed the key focus areas and sub-areas.  CO’s Statement at 7-8, 10-11.  Where a sample task response provided a greater level of relevant detail addressing the various sub-areas, the agency assessed the response as having a more thorough understanding of the challenges associated with performance of the task order and a more feasible approach.  Id. at 10-11.  After assessing the extent to which an offeror’s sample task response addressed the sub-areas, the agency assigned strengths, weaknesses, and deficiencies at the key focus area level.  Id.

The VA rated Harris’s response to sample task 1 as acceptable overall, assessing it with strengths under the key focus areas of project management and migration analysis, but also assigning weaknesses under the key areas of applications migration and operations and sustainment.  AR, Tab 9, Sample Task Evaluation, at 5-8.  As to the applications migration weakness, the agency found that the protester’s sample task response provided only a minimal level of detail addressing the four sub-areas under this focus area.  Id. at 6-7. 

For example, regarding the migration activities sub-area, the VA found that Harris’s approach “lacked detail on the processes for configuring the new [cloud environment] to satisfy the application system performance requirements” and also lacked detail concerning “coordination with systems/services that interface with the migrating application system.”  Id. at 6.  Additionally, the agency found that the protester’s proposal provided a minimal level of detail concerning each of the sub-areas under the operations and sustainment key focus area.  Id. at 7-8.  Overall, the VA found that Harris’s sample task 1 response demonstrated a minimally feasible approach.  In the agency’s view, the protester’s proposed approach increased the risk that the applications migrated to the cloud environment would not have high operational availability, potentially compromising the VA’s ability to access health records, benefits information, and other data.  Id. at 8.

Regarding sample task 2, the agency assessed Harris’s response a strength under the focus area of implementing program processes, weaknesses in three focus areas (implementing a program managerial structure, understanding of key requirements, and understanding of legacy VistA), and a significant weakness in the focus area of technical approach to implement a VistA modernization program.  Id. at 9-12.  Similar to task 1, the agency found that Harris’s response to sample task 2 provided minimal detail concerning several of the sub-areas, and as a result, demonstrated an overall lack of understanding of the requirements necessary to successfully perform the task.  Id.

After completing its initial evaluation of proposals, the source selection evaluation board (SSEB) presented the results of its evaluation to the CO and SSA.  CO’s Statement at 2-3.  Following the SSEB’s briefing, the CO prepared a competitive range memo, approved by the SSA, which established three competitive ranges--one for large business offerors, one for SDVOSB and VOSB offerors, and one for non-VOSB small businesses offerors.  AR, Tab 10, Competitive Range Memo, at 7.

With respect to the [deleted] proposals received from large business offerors, the agency rated 7 as unacceptable under the technical factor, and as a result, excluded these proposals from the competitive range.  Id. at 3.  The CO also excluded the 13 proposals that the VA rated as acceptable under the technical factor.  Id.  Given the relative weights of the evaluation factors, the agency concluded that these offerors did not have a reasonable prospect for award.  Id. at 4.

The CO then excluded from the competitive range three proposals that received an overall rating of good under the technical factor, including Harris’s.  The CO concluded, among other things, that Harris’s proposal, which received an acceptable rating under two of the three sample task responses, was not as strong as the proposals that received ratings of good or better for two or more of the sample tasks.  Id.  Further, the CO noted that per the terms of the solicitation, Harris could not resolve the flaws with its sample task responses through discussions.  Id.  The agency’s decision also emphasized that Harris’s proposal received only an acceptable rating under the management subfactor.  Id.

With respect to the large business offerors, the CO established a competitive range comprised of [deleted] proposals.[12]  Id.  Following a debriefing, Harris timely filed a protest with this Office on October 9, 2015.  On October 28, we dismissed that protest because another party had filed a protest with the Court of Federal Claims involving the same procurement.  Harris IT Servs. Corp., B-410898.4, Oct. 28, 2015.  On November 12, the Court dismissed the protest before it, and Harris filed the instant protest with our Office on November 16. 

DISCUSSION

Harris challenges the VA’s evaluation and resulting competitive range determination on several grounds.  First, the protester contends that the agency mechanically compared its sample task responses to the sample task models and, as a result, failed to consider Harris’s unique approach to the sample tasks.  Protest at 20-30.  Second, the protester argues that the agency used unstated evaluation criteria in its evaluation of Harris’s sample task responses.  Id. at 31-35.  Third, Harris contends that the VA failed to meaningfully consider all of the evaluation factors in making its competitive range determination.  Id. at 36-41.  For the reasons discussed below, we deny the protest. 

It is well-established that in reviewing challenges to the agency’s evaluation of proposals, we do not reevaluate proposals, but rather, review the agency’s evaluation to ensure that it was reasonable, consistent with the terms of the solicitation, and consistent with applicable statutes and regulations.  Philips Med. Sys. N. Am. Co., B-293945.2, June 17, 2004, 2004 CPD ¶ 129 at 2. 

Here, Harris argues that the VA used the sample task models as essentially “go/no go” checklists, and thus improperly penalized the protester’s responses where they deviated from the model responses.  Protest at 20-21.  Harris likewise contends that the agency’s rigid application of its model answers was inconsistent with the RFP, which provided that the sample tasks were designed to test offerors’ “expertise and innovative capabilities.”  Id. at 30. 

In response, the VA argues that an offeror could not fully demonstrate its understanding of the issues presented by the sample tasks or the feasibility of the offeror’s approach without addressing the key focus areas and related sub-areas identified by the agency’s technical experts.  CO’s Statement at 7-8.  The agency further contends that the key focus areas and sub‑areas were broad enough to allow offerors to propose a variety of solutions.  Id.  The agency argues, however, that even a unique and innovative approach would need to address the focus areas and sub-areas, as they encompassed aspects fundamental to any sample task solution.  Id. at 11-12.

As an initial matter, although Harris generally contends that the VA rigidly applied its model answers, excluding alternative solutions, the protester has not identified a specific unique or innovative approach contained in its sample task responses that the VA failed to consider.  Moreover, the record reflects that the VA’s model answers did not represent a single, specific solution against which offerors were mechanically evaluated.  Rather, the key focus areas and sub‑areas represented topics and issues that the VA determined should be addressed as part of an offeror’s sample task solutions.  AR, Tab 9, Sample Task Evaluation, at 1-3.  In this regard, the protester has not shown how the agency’s consideration of higher-level key focus areas such as applications migration and operations and sustainment under sample task 1, or underlying sub-areas, such as disaster recovery and help desk and incident response, was so narrow as to prevent the VA from considering Harris’s specific technical approach.  Further, as discussed more fully below, the key focus areas and sub-areas identified by the VA were reasonably related to or encompassed by the sample tasks evaluation criteria.  In short, the protester has not shown that the agency’s evaluation of the sample tasks was unreasonable or inconsistent with the RFP.[13]  See DSS Healthcare Sols., LLC, B-403713.3, June 22, 2011, 2011 CPD ¶ 147 at 2-4 (denying protest challenging VA’s use of similar evaluation scheme, where protester failed to show that key focus areas and lower-level sub-areas were not reasonably related to performing the sample tasks); MicroTechnologies, LLC, B-403713.6, June 9, 2011, 2012 CPD ¶ 131 at 2-4 (same).

Next, Harris contends that the agency applied unstated evaluation criteria when evaluating the sample tasks.  Protest at 31-35.  As a general matter, when evaluating proposals, an agency properly may take into account specific, albeit not expressly identified, matters that are logically encompassed by, or related to, the stated evaluation criteria.  Open Sys. Science of Virginia, Inc., B-410572, B‑410572.2, Jan. 14, 2015, 2015 CPD ¶ 37 at 11.  Harris argues that the agency improperly considered management-related issues under the key focus area of implementing a program managerial structure when evaluating sample task 2.  Protest at 32‑33.  According to Harris, offerors had no reason to expect that the VA would consider program management in the context of the sample task evaluation since the RFP directed offerors to provide their management approaches under an entirely separate subfactor.  Comments at 28-31. 

Although Harris is correct that the RFP established management approach as a stand‑alone subfactor under the technical factor, the RFP did not preclude the VA from also considering whether an offeror proposed a sound management approach in the context of the specific sample tasks.  In this regard, the record reflects that the management subfactor was focused on an offeror’s management approach as it pertained to the contract as a whole and to task order management generally.  RFP at 113-14.  By contrast, in evaluating sample task 2, the agency assessed the extent to which offerors addressed the management-related tasks that the VA considered necessary to successfully implement a VistA modernization program.  AR, Tab 9, Sample Task Evaluation, at 1-3.  Notably, the protester does not dispute the agency’s assertion that implementing a program managerial structure was necessary to successfully performing sample task 2.  Thus, we find that the VA reasonably expected offerors’ sample task responses to address their management approach as it specifically related to those tasks. 

Harris also argues that the VA unreasonably assigned its proposal a weakness based in part on the fact that the protester did not identify the Veterans Access, Choice and Accountability Act of 2014 (VACAA) and the fiscal year (FY) 2014 National Defense Authorization Act (NDAA) as key requirements under sample task 2.  Protest at 35.  Specifically, the protester contends that identification of these laws was not reasonably related to the RFP’s evaluation criteria because the RFP provided no indication that the VACAA or the FY 2014 NDAA should be addressed in their sample task 2 responses.  Id.  We find this argument to be without merit.

As discussed above, for sample task 2, offerors were to describe the key requirements they would consider in implementing a program to modernize VistA, an IT system that provides integrated electronic health records for VA patients.  RFP, Attach. 16, Sample Tasks.  Further, the VA’s evaluation reflects--and the protester does not dispute--that the VACAA expanded the options for veterans to receive health care at non-VA facilities and the FY 2014 NDAA mandated that the VA’s electronic health records systems be interoperable with Department of Defense systems.  AR, Tab 9, Sample Task Evaluation, at 10.  Given the relevance of these laws to the functionalities of VistA, we find that the agency reasonably considered them to be important considerations under sample task 2.[14]

Harris also challenges the agency’s assignment of a weakness to the protester’s sample task 2 response for failing to identify its proposed project manager and other staff.  The protester maintains that the agency’s evaluation in this regard was contrary to the solicitation, which did not require offerors to provide resumes for individual employees.  Protest at 34-35.  The VA responds that it did not negatively assess Harris’s proposal for not identifying specific individuals, but rather for failing to recognize and discuss the importance of having a project manager and technical team with experience in VistA technologies.  CO’s Statement at 17-18.  Although Harris disputes the VA’s explanation of its evaluation, we find that it is supported by the contemporaneous record.  AR, Tab 9, Sample Task Evaluation, at 9-10.  Furthermore, Harris does not challenge the VA’s determination that staff with expertise in VistA technologies would be necessary to successfully performing sample task 2.  Accordingly, we have no basis to find the agency’s evaluation unreasonable or inconsistent with the terms of the solicitation.[15]

Next, Harris argues that the VA relied exclusively on offerors’ technical ratings and did not meaningfully consider the evaluation factors of price, past performance, or veterans employment in establishing the competitive range.  Comments at 9-23.  As discussed below, the contemporaneous record demonstrates that the agency’s competitive range determination was reasonable and consistent with the solicitation.

As an initial matter, to the extent the protester asserts that the agency was required to conduct a best-value tradeoff analysis among all of the evaluation factors when establishing the competitive range, this argument is misplaced.  Rather, when establishing a competitive range, Federal Acquisition Regulation (FAR) § 15.306(c)(1) directs contracting agencies to evaluate proposals against all evaluation criteria, and eliminate those proposals that are not among the most highly-rated or that the agency otherwise reasonably concludes have no realistic prospect of being selected for award.  FAR § 15.306(c)(1); Wahkontah Servs., Inc., B-292768, Nov. 18, 2003, 2003 CPD ¶ 214 at 4.  The determination of whether a proposal is in the competitive range is principally a matter within the reasonable exercise of discretion of the procuring agency.  Foster‑Miller, Inc., B-296194.4, B-296194.5, Aug. 31, 2005, 2005 CPD ¶ 171 at 6.  Accordingly, in reviewing an agency’s evaluation of proposals and subsequent competitive range determination, our Office will not reevaluate proposals or make our own determination as to their relative merits; rather, we review an agency’s evaluation and exclusion of a proposal from the competitive range for reasonableness and consistency with the solicitation criteria and applicable statutes and regulations.  Outreach Process Partners, LLC, B-405529, Nov. 21, 2011, 2011 CPD ¶ 255 at 3.

In the instant matter, the record reflects that the SSEB evaluated proposals against all of the evaluation factors, and presented the results of its evaluation to the CO and SSA.  CO’s Statement at 2.  The CO represents that over the course of three days, the SSEB provided a thorough discussion of its evaluation results, and that he and the SSA “actively discussed” those results prior to establishing the competitive range.  Id. at 2; Supp. CO’s Statement at 1.  Following the SSEB’s briefing, the CO prepared a competitive range memo, which included a chart summarizing the ratings of all offerors as well as the TEP for each offeror.  AR, Tab 10, Competitive Range Memo, at 3, 8-10.  For the large business proposals, the competitive range memo also summarized the evaluation results for each evaluation factor by noting the range of evaluation ratings.  Id. at 3.

As discussed above, after eliminating those proposals rated as only acceptable under the technical factor, the CO determined that the proposals of Harris and two other offerors that received ratings of good under the technical factor did “not have a reasonable prospect for award given all the factors and their relative importance.”  Id. at 4.  With respect to the protester’s proposal, the CO concluded that it was not as strong as the proposals included in the competitive range due to its ratings of acceptable under two of the sample tasks and the management subfactor.  Id.  The CO likewise excluded Offeror 58’s proposal (which received a rating of outstanding under the management subfactor), based on the proposal’s acceptable ratings for two of the sample tasks and its above average TEP.  Id.  Finally, the CO also excluded the proposal of Offeror 121 from the competitive range, notwithstanding its ratings of good for two of the sample tasks, due to its above average TEP and rating of acceptable under the management subfactor.  Id.

Based on the record described above, we find that the agency properly considered all of the evaluation factors in establishing the competitive range.  Although the protester complains that the competitive range memo does not contain a comparative analysis of its proposal under each evaluation factor, as noted above, such a detailed comparison was not required.  In this regard, the record reflects that in considering all of the evaluation factors, the CO placed the greatest emphasis on the technical factor evaluation.  Id.  Given the relative importance of this factor--it was significantly more important than the next-most important factor--and that offerors could not improve their sample task ratings through discussions, the CO’s emphasis on the technical factor in making the competitive range decision was entirely reasonable.  

Finally, Harris argues that the VA should have included its proposal in the competitive range since it was lower-priced and for some of the evaluation factors, higher-rated, than proposals that the agency included in the competitive range.  Comments at 3-7, 23-25.  For example, Harris complains that, as compared to Offeror 85’s proposal, which the agency included in the competitive range, its proposal was lower-priced, received higher ratings under past performance (low risk versus moderate risk), represented a higher percentage of veterans employed (20.74 percent versus 14.86 percent), and received a better SBPC rating (outstanding versus good).  Id. at 7, 24.  The record, however, reflects that Offeror 85’s proposal was higher-rated under the most important factor, technical, as it not only received better ratings at the individual sample task level[16] (as did all of the other offerors included in the competitive range), but was also rated higher under the management subfactor (outstanding versus acceptable).  Furthermore, Offeror 85’s proposal was rated higher under the veterans involvement factor (some credit versus no credit) and represented a greater number of veterans employed (2,923 employees versus 479 employees).  AR, Tab 10, Competitive Range Memo, at 8-9.  Given the relative weights of the evaluation factors and the broad discretion afforded agencies in establishing a competitive range, we have no basis to conclude that the VA acted unreasonably in deciding to include Offeror 85’s proposal in the competitive range, while deciding to exclude the proposal submitted by Harris.[17]

The protest is denied.

Susan A. Poling
General Counsel



[1] The procurement at issue is commonly referred to as the Transformation Twenty-One Total Technology Next Generation procurement, or T4NG.  Contracting Officer’s (CO’s) Statement at 1.

[2] Of the 20 anticipated awards, the RFP reserved 4 awards for Service Disabled Veteran Owned Small Businesses (SDVOSBs), 4 additional awards for SDVOSBs or Veteran Owned Small Businesses (VOSBs), and 4 awards for small businesses generally.  RFP at 120-21.  Large businesses were eligible to compete for eight unreserved awards, although the solicitation provided that the VA could make additional unreserved awards if the agency determined it to be in the best interest of the government.  Id.

[3] For the technical factor (and related subfactors), the VA rated proposals as outstanding, good, acceptable, susceptible of being made acceptable, and unacceptable.  Agency Report (AR), Tab 8, Source Selection Authority (SSA) Briefing, at 20.

[4] For the past performance factor, the agency rated proposals as low risk, moderate risk, high risk, and unknown risk.  Id. at 171.

[5] For the veterans involvement factor, the agency assigned ratings of full credit, partial credit, some credit, or no credit.  Id. at 551.  In accordance with VA Acquisition Regulation § 852.215-70, only SDVOSBs could receive a rating of full credit and only VOSBs could receive a rating of partial credit.  RFP at 110.

[6] Under the veterans employment factor, the RFP established that the agency would evaluate the extent to which offerors employed veterans.  Id. at 123.  Accordingly, the RFP instructed offerors to identify the percentage of their workforce that was comprised of veterans, as well as the total number of veterans employed at the time of proposal submission.  Id. at 115-16.

[7] For the SBPC factor, the agency rated proposals as outstanding, good, acceptable, susceptible of being made acceptable, and unacceptable.  AR, Tab 8, SSA Briefing, at 603. 

[8] VistA is a VA IT system that is deployed throughout the VA healthcare system and is comprised of approximately 200 applications and modules.  See VistA Monograph, 11-14 (Oct. 13, 2013), http://www.ehealth.va.gov/docs/VistA_Monograph.pdf.  VistA includes integrated electronic health records for VA patients and administrative tools for day‑to-day VA operations.  See http://www.ehealth.va.gov/vista.asp (June 3, 2015).

[9] Of the [deleted] proposals submitted, 1 was withdrawn and 1 was eliminated from consideration due to an organizational conflict of interest.  AR, Tab 10, Competitive Range Memorandum (Memo), at 3.  The record does not reflect the size status of these two offerors.

[10] Sample task 3, which is not at issue in this protest, directed offerors to describe their approach to executing all of the tasks necessary to design, acquire, install, test, deploy, and maintain an expanded and enhanced enterprise-wide telehealth capability.  RFP, Attach. 16, Sample Tasks. 

[11] Although not at issue in this protest, an offeror’s total evaluated price (TEP) was derived by summing the offeror’s labor costs, total materials/other direct costs, and total travel costs, for the base period and five-year option period.  RFP at 124.  Offerors’ labor costs were calculated by multiplying their blended labor rates by the agency-provided corresponding level of effort.  Id.
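
As a purely illustrative aside, the TEP arithmetic described in this footnote can be laid out in a short Python sketch; the labor categories, blended rates, levels of effort, and cost figures below are hypothetical placeholders, not values from the RFP or from any offeror’s proposal.

    def total_evaluated_price(blended_rates, levels_of_effort, materials_odc, travel):
        # Labor cost: blended labor rate multiplied by the agency-provided
        # level of effort, summed across all labor categories.
        labor = sum(blended_rates[category] * levels_of_effort[category]
                    for category in blended_rates)
        # TEP: labor plus total materials/other direct costs plus total travel,
        # covering the base period and the five-year option period.
        return labor + materials_odc + travel

    # Hypothetical inputs for illustration only.
    rates = {"systems_engineer": 95.00, "program_manager": 140.00}     # dollars per hour
    hours = {"systems_engineer": 200_000, "program_manager": 50_000}   # level of effort in hours
    print(total_evaluated_price(rates, hours, materials_odc=2_500_000, travel=750_000))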

[12] The CO also established a competitive range of [deleted] SDVOSBs and VOSBs and a competitive range of [deleted] non-VOSB small businesses offerors.  AR, Tab 10, Competitive Range Memo, at 7.

[13] In its comments on the agency report, the protester raises additional arguments as to why the specific weaknesses assigned to its sample tasks were unwarranted given the content of its proposal.  Comments at 38-46.  These arguments are untimely, however, as Harris was aware of the factual bases for these contentions at the time it filed its November 16 protest but did not raise them at that time.  4 C.F.R. § 21.2(a)(2) (requiring that a protest issue be filed within 10 days after its basis is known or should have been known); see also Lanmark Tech., Inc., B-410214.3, March 20, 2015, 2015 CPD ¶ 139 at 5 n.2 (piecemeal presentation of protest grounds raised for the first time in comments is untimely).

[14] In reaching this conclusion, we also find no merit to Harris’s contention that the VA’s response to an offeror question indicated that offerors were not to address technical standards, such as the laws at issue, in their sample task responses.  Comments at 34.  In this regard, the record reflects that the question/answer the protester relies on did not pertain to sample task 2 or otherwise address the content of offerors’ sample task responses.  AR, Tab 6, Offeror Questions, at 37.  As such, we agree with the VA that the agency’s response to this question did not in any way contradict the clear direction in sample task 2 that offerors were to identify the key requirements they considered relevant to performing the task.

[15] Harris similarly disputes the VA’s explanation of the contemporaneous evaluation record concerning the agency’s negative assessment of Harris’s sample task 1 response for failing to address configuration of the new cloud environment.  Comments at 26-28.  Here again, based on our review of the record, we have no basis to question the agency’s explanation.  In any event, it does not appear that Harris was prejudiced by any such error, as the protester has not timely challenged the agency’s determination that Harris’s proposal also did not adequately address the three other sub-areas under the applications migration key focus area.

[16] Offeror 85’s sample task responses received ratings of acceptable, good, and outstanding whereas Harris’s sample task responses received two acceptable ratings and one outstanding rating.  AR, Tab 10, Competitive Range Memo, at 8-9.

[17] We have also reviewed the record with respect to the ratings and TEPs of the other proposals included in the competitive range and find no basis to question the agency’s determination that Harris’s proposal was not among the most highly-rated.
