DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.
Decision
Matter of: Network Designs, Inc.
File: B-418461.7; B-418461.17
Date: February 22, 2021
John M. Manfredonia, Esq., Manfredonia Law Offices, LLC, for the protester.
Tara Nash, Esq., Desiree A. DiCorcia, Esq., and Frank V. DiNicola, Esq., Department of Veterans Affairs, for the agency.
Katherine I. Riback, Esq., and Evan C. Williams, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.
DIGEST
1. Protest challenging agency’s evaluation of sample tasks using a model answer not disclosed to offerors is denied where the record shows that the evaluation was reasonable and consistent with the solicitation.
2. Protest alleging various errors in agency’s evaluation of protester’s proposal is denied where protester has failed to demonstrate that agency’s evaluation was unreasonable or inconsistent with the solicitation.
Network Designs, Inc., a service-disabled veteran-owned small business (SDVOSB) of McLean, Virginia, protests the elimination of its proposal from the competition by the Department of Veterans Affairs (VA) under request for proposals (RFP) No. 36C10B19R0046 for professional and information technology (IT) services. The protester argues that the agency unreasonably eliminated its proposal from the competitive range.
We deny the protest.
BACKGROUND
The procurement at issue is commonly referred to as the Transformation Twenty-One Total Technology Next Generation procurement or T4NG. Contracting Officer’s Statement (COS) at 1. With a program ceiling of $22.3 billion, the T4NG contract is a multiple-award indefinite-delivery, indefinite-quantity (IDIQ) contract that provides professional and IT services for the Department of Veterans Affairs. Agency Report (AR), Tab 5, T4NG On‑Ramp RFP at 7. T4NG delivers contractor-provided solutions in support of IT, health IT, and telecommunications, to include services and incidental hardware/software, for customer requirements that vary across the entire spectrum of existing and future technical environments. Id. at 12; COS at 1.
Currently, there are 28 prime vendors that hold the T4NG contract: 14 large businesses and 14 SDVOSBs. AR, Tab 10, Step Two Competitive Range Determination at 1. Relevant here, the T4NG contract includes “on-ramp” procedures--i.e., procedures for adding additional companies--for the purposes of adding new SDVOSBs, veteran-owned small businesses, and small business contractors. RFP at 77. The RFP specified that the on‑ramp procedures could be implemented at any time by reopening the competition and utilizing the same terms and conditions of the T4NG contract. Id.
On November 12, 2019, the VA issued the T4NG on-ramp solicitation, which contemplated the award of IDIQ contracts that could result in placing individual task orders on a performance-based time-and-material, cost-reimbursement, and fixed-price basis for a period of approximately 5 years. Id. at 16. The solicitation stated that the agency intended to award seven contracts to SDVOSBs, but reserved the right to make more or fewer awards. Id. at 132; COS at 1‑2. The awards would be made to the offerors whose proposals represented the best value, considering the following factors, listed in decreasing level of importance: technical, past performance, veterans employment, small business participation commitment (SBPC), and price. RFP at 133.
The technical factor comprised two subfactors, sample task and management. The sample task subfactor consisted of sample tasks 1 and 2, which were of equal importance. Id. Under the technical factor, the sample task subfactor was significantly more important than the management subfactor. Id. Overall, the technical factor was significantly more important than the past performance factor, and each remaining factor was slightly more important than the factor that followed it. Id. All non-price factors, when combined, were significantly more important than price. Id.
Under each technical factor and subfactor, and under the SBPC factor, proposals would receive an adjectival rating of outstanding, good, acceptable, susceptible to being made acceptable, or unacceptable. AR, Tab 4, Source Selection Evaluation Plan at 23. Under the past performance factor, each proposal would receive an adjectival risk assessment. Id. at 23-24. Pertinent to this protest, the solicitation advised that offerors were responsible for including sufficient details, in a concise manner, to permit a complete and accurate evaluation of each proposal. RFP at 122.
The solicitation established that the evaluation would consist of two steps, termed step one and step two. Id. at 132. In step one, offerors were required to submit a proposal that included three parts: a response to sample task 1, a price volume, and a volume of certifications and representations. Id. After the evaluation of the step one submissions, the agency would establish a competitive range. Id. The step one evaluations of an offeror’s sample task 1 and price were carried forward to the step two evaluation and would not be reevaluated. Id.
In step two, the agency would evaluate an offeror’s response to sample task 2 and finalize the adjectival rating for the sample task subfactor. Id. at 133. During step two, the VA would also evaluate the management subfactor, the technical factor, the past performance factor, the veterans employment factor, and the SBPC factor. Id. As to the price evaluation, the agency would conduct a price realism evaluation by examining an offeror’s labor rates to assess performance risk, but would not adjust an offeror’s overall price. Id. at 136.
Broadly, the sample task evaluation would assess the extent to which an offeror demonstrated an understanding of all facets of the problem and whether its proposed solution provided the agency with a high level of confidence in successful project completion. Id. at 134. Sample task 1 asked offerors to explain how they would analyze, report, prioritize, remediate, and track VA’s infrastructure and IT components in anticipation of a new electronic health records (EHR) system. Id. at 203. Sample task 2 asked offerors for their plan to build an online form submission application. Id. at 209.
In evaluating responses to the sample tasks, the RFP stated that the agency would assess the extent to which the response demonstrated the offeror’s understanding of all of the features involved in solving the problems presented and meeting the requirements, including identifying uncertainties and proposing resolutions to address those uncertainties. Id. The response would also be evaluated for the feasibility of its approach, which encompassed considering whether the offeror’s methods and approach to the sample task requirements provided the agency with a high degree of confidence of successful completion. Id. at 134.
The solicitation stated that these tasks were designed to test the offeror’s expertise and innovative capabilities to respond to the types of situations that may be encountered in contract performance. Id. Importantly, the solicitation provided that because the sample tasks were “designed to test the [o]fferor’s expertise and innovative capabilities to respond to the types of situations that may be encountered in [contract] performance,” even if the agency entered into discussions, offerors would “not be given an opportunity to correct or revise a Sample Task response.” Id.
To aid in the evaluation of sample task 1, the agency formed an Integrated Product Team (IPT) that, through a number of meetings, developed successive versions of a model answer. Decl. of Senior Technical Advisor (Dec. 15, 2020) at 3. The IPT’s model answer identified high-level focus areas, and lower-level focus areas intrinsic to the higher-level focus areas, that it deemed necessary to meet the agency’s requirements for the sample task. Memorandum of Law (MOL) at 15; COS at 18-19; AR, Tab 8, Network Designs Technical Factor Report at 2-3.
The agency evaluated 94 step one proposals. COS at 3.[1] The senior technical advisor reviewed randomly selected proposals, chose two proposals that he felt had “the best understanding of [s]ample [t]ask 1 based upon the then-current version of the model answer,” and “further refined” the model answer. Decl. of Senior Technical Advisor (Dec. 15, 2020) at 4.
The agency then established a competitive range of 33 of the highest-rated proposals, including Network Designs, and held discussions with those offerors. AR, Tab 10, Step Two On-Ramp Competitive Range Determination Memorandum at 3. Step two proposals, which included sample task 2, were requested and received from the offerors in the competitive range. Id.
The agency assigned Network Designs’s proposal a rating of acceptable for the technical factor, as well as the sample task subfactor and the management subfactor.[2] Id. at 4. Under the sample task subfactor, Network Designs received a rating of acceptable for sample task 1, based upon the agency’s assignment of two strengths and three weaknesses. AR, Tab 8, Network Designs Technical Evaluation Report at 8‑14. With respect to sample task 2, Network Designs received a rating of acceptable, and the agency identified one significant strength, two strengths, and one significant weakness. Id.
The agency also assigned the protester’s proposal a rating of low risk under the past performance factor, recognized, under the veterans employment factor, the firm’s commitment that veterans would make up [DELETED]% of its workforce, and assessed Network Designs’s proposal a rating of good under the SBPC factor. AR, Tab 10, Step Two On-Ramp Competitive Range Determination Memorandum at 3. Network Designs’s proposed price was $9,575,219,298. Id. at 4.
The Source Selection Authority (SSA) determined that Network Designs’s proposal was not among the highest-rated proposals, and eliminated it from the second competitive range. Id. at 6; AR, Tab 11, Unsuccessful Offeror Letter. In making her decision, the SSA explained that she decided to exclude from the step two competitive range all proposals with a rating of acceptable under the technical factor. AR, Tab 10, Step Two Competitive Range Determination at 5. These proposals also all received a rating of acceptable under the sample task subfactor. Id.
In selecting the proposals to be included in the step two competitive range, the SSA recognized that the solicitation did not permit revisions of sample task responses through discussions, and thus, an offeror’s rating under the sample task subfactor could not be improved. Id. The SSA further noted that none of the excluded proposals had issues requiring remediation under the management subfactor. Id. As a result, the SSA concluded that for the excluded proposals, the adjectival rating under the technical factor would not improve after the step two competitive range discussions. Id.
The SSA then considered the evaluations under the less important evaluation factors and concluded that “none of these differences [in the veterans employment and SBPC factors or in price] were significant enough to outweigh the ‘Good’ or better ratings received for the Technical Factor, the significantly most important factor, especially considering the equal ratings for Past Performance, the second most important factor.” Id. Finally, the SSA noted that the range of prices proposed in the step two competitive range was considerably narrower than in the step one competitive range. Id.
The SSA also inquired whether any proposal with a rating of good or outstanding under the technical factor had ratings under the veterans employment or SBPC factors so low, or a price so high, that the low rating or high price would be sufficient to exclude that proposal from the competitive range. Id. at 6. The SSA also considered whether the proposals with a rating of acceptable under the technical factor nevertheless had strengths under the veterans employment or SBPC factors, or proposed a price so low, that the benefit would outweigh the lower rating under the technical factor, and concluded that no proposal with such a rating demonstrated such strengths. Id. at 5-6. The SSA concluded that none of the ratings or the relative prices provided a basis to change the competitive range. Id. at 6.
Network Designs received a debriefing from the agency, and this timely protest to our Office followed.
DISCUSSION
Network Designs primarily challenges the evaluation of its proposal under the technical factor, asserting that the agency unreasonably assigned its proposal a rating of “acceptable.” We have fully considered all of the protest grounds raised, and although we address only a portion of the arguments below, we find that none provide a basis to sustain the protest.
Technical Evaluation
Under the technical factor, Network Designs first contests the agency’s method of developing the model answer to sample task 1. Comments and Supp. Protest at 8‑10. The protester also argues, in the alternative, that even assuming that the model answer was correctly developed, the agency improperly used unstated evaluation criteria by employing the model answer to evaluate sample task 1. Additionally, Network Designs argues that the agency unreasonably evaluated its proposal when evaluating sample tasks 1 and 2. As discussed below, we find both the agency’s development of the model answer and its evaluation of Network Designs’s proposal unobjectionable.
In reviewing protests challenging the evaluation of proposals, we do not conduct a new evaluation or substitute our judgment for that of the agency but examine the record to determine whether the agency’s judgment was reasonable and in accord with the RFP evaluation criteria. Gonzales Consulting Services, Inc., B-416676, B-416676.2, Nov. 20, 2018, 2018 CPD ¶ 396 at 7. An offeror has the burden of submitting an adequately written proposal, and it runs the risk that its proposal will be evaluated unfavorably if it fails to do so. Hawk Institute for Space Sciences, B-409624, June 20, 2014, 2014 CPD ¶ 200 at 3. A protester’s disagreement with the agency’s judgment, without more, is not sufficient to establish that an agency acted unreasonably. Id.
Sample Task 1
As stated above, Network Designs challenges the method the agency used to develop the model answer to sample task 1. Supp. Comments at 8-10. Specifically, the protester alleges that the agency committed prejudicial error when it adjusted the model answer after receiving and reviewing a small sample of proposals. Id. at 2. In this regard, Network Designs contends that the modification of the model answer necessarily resulted in bias in favor of other offerors. As discussed below, however, the record does not support the protester’s argument that the later changes to the model answer benefitted certain offerors or competitively harmed the protester.
In response to the protest, the agency described in detail its method of developing the model answer to sample task 1. Decl. of Senior Technical Advisor (Dec. 15, 2020) at 4. As relevant here, the agency’s senior technical advisor explained that development of the sample task model answer was an iterative process. In this regard, the senior technical advisor states that the model answer had been revised multiple times before the receipt of proposals, and was revised at least three more times afterwards, before it was finalized. Id. at 5.
The agency goes on to state that after the receipt of proposals, it compared the model answer to a sample of two proposals and determined that the model answer should be “further refined.”[3] Id. The agency points out that, during these revisions, the final model answer was reduced from 6 high-level focus areas with 22 sub-focus areas to only 5 high-level focus areas with 16 sub-focus areas. Id.; Supp. MOL at 11.
With respect to the evaluation of sample task 1, the agency explains that no proposals were evaluated under this subfactor until after the model answer was finalized. Decl. of Senior Technical Advisor (Dec. 15, 2020) at 5. Regarding the two proposals that were compared to the model answer during its preliminary evaluation, the agency states that both of these proposals were subsequently eliminated from the competition. Supp. MOL at 11.
Based upon our review of the record, we find that the agency’s iterative method of developing a model answer to sample task 1 was unobjectionable. First, the record shows that the revisions made to the model answer after receipt of proposals resulted in the model answer being less restrictive. Thus, it does not appear that any offerors were competitively harmed by the agency’s actions in this regard.
Second, we disagree with the protester’s assertion that the iterative development of the model answer was somehow “biased.” Supp. Comments at 5. Government officials are presumed to act in good faith and we will not attribute unfair or prejudicial motives to procurement officials on the basis of inference or supposition. See Silynx Communications, Inc., B-310667, B-310667.2, Jan. 23, 2008, 2008 CPD ¶ 36 at 5. Where a protester alleges bias, it must not only provide credible evidence clearly demonstrating bias against the protester, but must show that this bias translated into action that unfairly affected the protester’s competitive position. Id. Here, we reject the protester’s claim that the revised model answer must have favored the two offerors whose proposals were used for the preliminary comparison because these proposals were subsequently eliminated from the competition. Since the protester has failed to show either an improper benefit to other offerors or competitive harm to its firm, we deny this aspect of the protest.[4]
Next, the protester argues that the agency employed unstated evaluation criteria by using a government-developed solution made up of high-level focus areas. Protest at 21. For example, the protester contends that the agency unreasonably assigned it a weakness under the Analyze and Remediate IT Components focus area because its proposal had “minimal detail” and did not address wireless security, a topic that, in the protester’s view, was not mentioned in the PWS or elsewhere in the solicitation. Comments and Supp. Protest at 16, 19. Network Designs contends that the agency’s emphasis on certain high-level focus areas also resulted in an agency evaluation that failed in many instances to consider other aspects of the protester’s proposal that demonstrated its understanding of the sample task. Id. at 13, 15, 17.
In response, the agency first notes that the solicitation specifically asked offerors to “describe in detail your approach” to solving the sample task 1 problem so that it could discern the “extent to which” offerors demonstrated a clear understanding of all of the features involved in solving the problems and meeting the requirements. RFP at 134. The agency also explains that it developed a model answer that identified high-level focus areas, and lower-level focus areas intrinsic to the higher-level focus areas, to assist the evaluators in determining whether offerors’ responses to sample task 1 were complete. MOL at 15; COS at 18-19; AR, Tab 8, Network Designs Technical Factor Evaluation Report at 2-3. The VA maintains that these high-level focus areas were purposefully broad so as not to limit offerors to any specific approach. MOL at 15. Additionally, the agency states that it expected offerors to demonstrate their understanding of the sample task by providing detailed, clear, and pertinent information for the lower-level focus areas under a particular higher-level focus area. Id. at 12.
On this record, we conclude that the VA’s evaluation of Network Designs’s proposed solution to sample task 1 was reasonable and consistent with the solicitation. As a general matter, when evaluating proposals, an agency properly may take into account specific, albeit not expressly identified, matters that are logically encompassed by, or related to, the stated evaluation criteria. Synaptek Corp., B-410898.6, Feb. 29, 2016, 2016 CPD ¶ 78 at 8 (denying protest challenging VA’s use of model answer evaluation scheme, where protester failed to show that key focus areas and lower-level sub-areas were not reasonably related to performing the sample tasks). Here, we first find the agency’s consideration of high-level focus areas and related sub‑areas reasonable because all of these areas were sufficiently related to requirements contained in the PWS.
Next, we also find no basis to disturb the substance of the agency’s evaluation of the protester’s sample task 1 response. As stated above, under sample task 1, offerors were instructed to “describe in detail your approach to analyze, remediate, and report VA infrastructure/IT deficiencies across the organization to prepare VA facilities for the new EHR system.” RFP at 203. The RFP also cautioned offerors to provide detail sufficient to permit a complete and accurate evaluation of each proposal. Id. at 122. Offerors were advised that the agency would consider “the extent to which the Offeror demonstrates a clear understanding of all features involved in solving the problems and meeting the requirements presented by the [s]ample [t]ask; and the extent to which uncertainties are identified and resolutions proposed.” Id. at 134.
With regard to the agency’s assignment of a rating of acceptable to the protester’s response to sample task 1, the agency explains that it determined the response lacked sufficient detail across certain focus areas to merit a higher rating. AR, Tab 8, Network Designs Technical Factor Report at 9-10. For example, as explained above, the agency assigned the protester’s proposal a weakness for not discussing in enough detail “its approach to analyze and remediate IT component deficiencies.” Id. at 10. As part of this weakness, the agency explains that while Network Designs proposed to adhere to Federal Information Processing Standards (FIPS), it was unclear if FIPS 140-2 would be followed for wireless security. The agency also determined that the proposal lacked detail on Wi-Fi Protected Access 2 (WPA2) to ensure wireless security. Id. In this regard, the agency stated that the protester’s lack of understanding of the security requirements added high risk that it would not be able to provide a secure operational environment, safe from wireless security vulnerabilities. Id. The agency concluded that the minimal level of detail in the protester’s proposal overall presented risk that clinicians and users may experience “sub-optimal performance.” Id.
In support of its protest, Network Designs contends that its proposal demonstrates its understanding of wireless security with regard to cybersecurity requirements by its statement that it will adhere to the National Institute of Standards and Technology Cybersecurity Framework, FIPS, and Federal Information Security Management Act of 2002 standards. In the protester’s view, this language “demonstrated at least an adequate, if not thorough, understanding of Analyzing and Remediating Network Deficiencies,” and, thus, the proposal warranted a higher rating. Comments and Supp. Protest at 19 (citing AR, Tab 7, Network Designs Proposal, Sample Task 1, at 6). The protester’s argument in this regard amounts to disagreement with the agency’s judgment, which, without more, does not render the agency’s conclusions unreasonable. Trofholz Techs., Inc., B-404101, Jan. 5, 2011, 2011 CPD ¶ 144 at 3-4. Given the solicitation’s warning that firms must provide sufficient detail to allow the agency to perform a complete and accurate evaluation (RFP at 122), we find that the protester has provided our Office with no basis to question the agency’s evaluation. Consequently, this protest ground is denied.
Sample Task 2
Regarding sample task 2, Network Designs argues that the agency unreasonably assigned its proposal a significant weakness because its “architecture diagrams did not clearly depict the two environments (testing and production) that it used to build and support the solution.” Protest at 35; AR, Tab 8, Network Designs Technical Evaluation Report at 14. The protester also contends that its response to sample task 1 adequately addressed testing and production environments. Comments and Supp. Protest at 21.
In response, the agency explains that this significant weakness was associated with one of the three major deliverables, i.e., the minimum viable product (MVP) documentation. RFP at 211. The agency notes that the solicitation clearly required offerors to provide a diagram for the two environments by requiring that the following MVP documentation shall be provided: “b. [a]rchitecture/network diagram(s) of the cloud platform, environments, and cloud services used in the development, testing, integration and deployment of the WCST [widget claims submission tool].” Id. Concerning the protester’s statement that its response to sample task 1 adequately addressed the testing and production environments, the agency states that the information was not presented in the protester’s answer to sample task 2, and notes that it was not required to piece together the information from disparate parts of the proposal. Supp. MOL at 22.
Here, we find that the agency reasonably assigned the protester’s proposal a significant weakness for its response to sample task 2. In this regard, based upon our review of Network Designs’s proposal, we agree with the agency that the protester’s architecture diagrams did not clearly depict the two environments (i.e., testing and production). Additionally, the protester’s assertion that the agency could have gathered this information by reviewing its response to sample task 1 is not persuasive, as the agency is not required to piece together portions of a proposal in conducting an evaluation. James Constr., B‑402429, Apr. 21, 2010, 2010 CPD ¶ 98 at 5. Indeed, where a proposal omits, inadequately addresses, or fails to clearly convey required information, the offeror runs the risk of an adverse agency evaluation. Addvetco, Inc., B-412702, B‑412702.2, May 3, 2016, 2016 CPD ¶ 112 at 7-8. Thus, we find no basis to object to the agency’s assignment of a significant weakness to Network Designs’s proposal based on the firm’s response to sample task 2.
As established above, given that the agency reasonably awarded Network Designs’s proposal a rating of acceptable under the technical factor--the most important evaluation factor--and given that this rating was lower than those of the proposals included in the competitive range, we conclude that the agency reasonably excluded the firm’s proposal from the second competitive range.
The protest is denied.
Thomas H. Armstrong
General Counsel
[1] The agency received 98 step one proposals; however, three were untimely and therefore immediately eliminated, and one offeror withdrew its proposal. COS at 3 n.1.
[2] The definition of an acceptable rating for the technical evaluation is:
A proposal that meets all of the Government’s requirements, contains at least minimal detail, demonstrates at least a minimal understanding of the problems, and is at least minimally feasible (moderate to high degree of risk).
AR, Tab 4, Source Selection Evaluation Plan at 23.
[3] The agency refers to the practice of comparing the sample task model answer to at least a portion of the received proposals as “test running.” Decl. of Senior Technical Advisor (Dec. 15, 2020) at 4.
[4] The record also does not support Network Designs’s contention that wireless security was a new focus area added to the model answer after the agency reviewed the answers of two offerors to sample task 1. Comments and Supp. Protest at 9; Supp. AR at 12. The agency states that wireless security was part of the pre-proposal sample task 1 model answer, covered under the broader sub-focus area of Security, a sub-focus area under the high-level focus area of Remediation Infrastructure & Support. Id. The agency also notes that numerous performance work statement (PWS) sections contain wireless requirements (i.e., sections 4.3, 4.6.2.1, 4.6.2.3, 4.6.2.4, 4.6.2.6, and 4.11.3). Id. at 13. The agency notes as well that the proposals of the two offerors whose answers to sample task 1 were compared to the model answer ultimately were eliminated from the competition, and both received adverse assessments for wireless security. Id. at 11; Decl. of Senior Technical Advisor (Dec. 15, 2020), Exh. 1, Offeror No. 9, Sample Task at 4; Exh. 2, Offeror No. 67, Technical Factor Report at 10. This supports the agency’s assertion that its review of these proposals did not cause the addition of wireless security to the model answer, as the protester alleges.