
TeleCommunication Systems, Inc.

B-413265, B-413265.2 Sep 21, 2016

Highlights

TeleCommunication Systems, Inc. (TCS), of Annapolis, Maryland, protests the issuance of a task order to DataPath, Inc., of Duluth, Georgia, under request for task execution plans (RTEP) S4G-035, issued by the Department of the Army, Army Materiel Command (Army), for satellite communication support services. TCS challenges the evaluation of the task execution plans (TEPs) and the selection decision.

We deny the protest.


DOCUMENT FOR PUBLIC RELEASE
The decision issued on the date below was subject to a GAO Protective Order. This redacted version has been approved for public release.

Decision

Matter of:  TeleCommunication Systems, Inc.

File:  B-413265; B-413265.2

Date:  September 21, 2016

Laurel A. Hockey, Esq., David S. Cohen, Esq., John J. O’Brien, Esq., Amy J. Spencer, Esq., and Daniel Strouse, Esq., Cohen Mohr LLP; Yelena Simonyuk, Esq., Proskauer Rose LLP, for the protester.

Lee Dougherty, Esq., and Katherine A. Straw, Esq., Montgomery Fazzone PLLC, for DataPath, Inc., an intervenor.
Stacy G. Wilhite, Esq., and Peter S. Kozlowski, Esq., Department of the Army, for the agency.
Young H. Cho, Esq., and Christina Sklarew, Esq., Office of the General Counsel, GAO, participated in the preparation of the decision.

DIGEST

Protest challenging the agency’s evaluation and the selection of a lower-priced proposal rated as technically equal to the protester’s is denied where the record shows that the agency’s evaluation and selection decision were reasonable and consistent with the solicitation.

DECISION

TeleCommunication Systems, Inc. (TCS), of Annapolis, Maryland, protests the issuance of a task order to DataPath, Inc., of Duluth, Georgia, under request for task execution plans (RTEP) S4G-035, issued by the Department of the Army, Army Materiel Command (Army), for satellite communication support services.  TCS challenges the evaluation of the task execution plans (TEPs) and the selection decision.

We deny the protest.

BACKGROUND

The RTEP was issued on December 14, 2015, pursuant to Federal Acquisition Regulation (FAR) subpart 16.5, to firms that had been awarded a Global Tactical Advanced Communications Systems (GTACS) indefinite-delivery, indefinite-quantity (ID/IQ) contract, for satellite communication support services.  RTEP[1] at 1; Combined Contracting Officer’s Statement and Memorandum of Law (COS/MOL)
at 1-2.  The RTEP contemplated the award of a single task order with cost-plus-fixed-fee (CPFF), cost, and fixed-price elements.  The performance was to include a 60-day transition-in period; a 10-month base period; three 12-month option periods and one six-month option period.  RTEP at 1.  Award was to be made on a best-value basis in accordance with FAR § 15.101-1, considering the following factors:  technical, past performance, and cost/price.  Id. at 13-14.  The solicitation stated that award was to be made to the offeror whose TEP[2] was acceptable under the technical factor, and represented the best value after a tradeoff analysis between the past performance factor and the cost/price factor.  Id. at 14.  The solicitation also stated that past performance would be significantly more important than cost/price.  Id.

The technical factor contained the following six subfactors, referred to as parts:  sample repair, sample hiring scenario, manpower, transition-in plan, corporate experience, and subcontracting.  Id. at 14.  As relevant here, the solicitation stated that this factor would be evaluated on an acceptable/unacceptable basis, and that to be considered acceptable, an offer had to be acceptable under all six subfactors.  Id.

For past performance, the solicitation required offerors to submit at a minimum three but no more than five past performance references, for which the agency would assign relevancy ratings (very relevant, relevant, somewhat relevant, or not relevant).  Id. at 6, 15-16.  An offeror’s past performance also was to be evaluated to determine how well the contractor performed on those contracts.  Id. at 16.  Based on this information and information that the Army obtained from other sources available to the agency, the agency would assign an overall performance confidence rating.  Id.  As relevant here, a substantial confidence rating was defined as “[b]ased on the [o]fferor[’]s recent relevant performance record, the [g]overnment has high expectation that the [o]fferor will successfully perform the required effort.”  Id.

The cost/price factor contained three parts (fixed-price elements; cost-reimbursement elements; and sample repair).  See id. at 7-13.  As relevant here, the solicitation stated that the cost-reimbursement elements would be evaluated for cost realism and that the “most probable cost may be determined by adjusting (for purposes of evaluation only) each [o]fferor’s proposed cost when appropriate, to reflect any additions or reductions in cost elements to realistic levels based on the results of the cost realism analysis to ensure a realistic cost.”  Id. at 17.  The solicitation further stated that the total evaluated price (the total price for the fixed price elements and the most probable cost of the cost-reimbursement elements and the sample repair subfactor) to be used in the award determination would be evaluated to ensure that all proposed costs were fair, reasonable, and realistic.  See id. at 7, 14, 17. 

The agency received three timely responses, including those from TCS and DataPath.  The TEPs were evaluated by the contracting officer, the contract specialist, a technical evaluator, and a cost/price analyst.  See Agency Report (AR), Tab 10, Source Selection Decision Document (SSDD) at 6.  The agency conducted discussions with all three offerors and subsequently issued amendment 04 to the solicitation which provided, as relevant here, the exact manpower required (i.e., labor categories and labor hours) to perform the work, as well as a corresponding pricing model that the offerors were to complete by proposing direct labor rates for each identified labor category for the cost-reimbursement elements of the task order.[3]  See id. at 5; RTEP at 5, 7-8; id., attach. 7, Price Proposal, Total TEP Summary CPFF Contract Line Item Numbers (CLINs) Worksheet. 

All three offerors submitted final proposal revisions (FPRs), which were evaluated as follows:

 

                          TCS                Offeror A          DataPath
Technical                 Acceptable         Acceptable         Acceptable
Past Performance          Substantial        Substantial        Substantial
                          Confidence         Confidence         Confidence
Total Proposed Price      $469,310,264.40    $384,999,807.50    $363,100,762.72
Total Evaluated Price     $469,310,264.40    $390,011,388.76    $372,563,480.97


AR, Tab 10a, SSDD at 6. 

The contracting officer, who was the selection official for this procurement, compared the proposals, giving appropriate consideration to the evaluation criteria set forth in the RTEP and their relative importance.  See id. at 11.  As relevant here, the contracting officer reviewed the past performance evaluations for each offeror and concluded that all three offerors clearly demonstrated successful past performance on relevant prior efforts.  Id. at 10.  As between TCS and DataPath, the contracting officer found that the two offerors had “essentially equal past performance ratings with an equal [number] of contract examples determined to be very relevant as compared to the current effort,”[4] while DataPath’s proposal was 29.25 percent lower in price than TCS’s.  Id. at 11.  As a result, the contracting officer found that when compared to DataPath, TCS’s proposal did not represent the best value to the government.  Id.

As between offeror A and DataPath, the contracting officer found that while both offerors received substantial confidence ratings for past performance, more of DataPath’s contract references were deemed very relevant;[5] with regard to their performance, the past performance information for the two offerors showed that they performed equally well.  Id.  Due to the difference in the relevancy of the offerors’ past performance references, the government found it had slightly more confidence in DataPath’s likelihood of successful performance, in comparison to offeror A’s.  Id.  The contracting officer also noted that DataPath’s proposal--which was $17 million (4.68 percent) lower than offeror A’s--had a “significantly lower most probable cost to the [g]overnment,” and concluded that DataPath’s proposal provided the best value to the government and selected this firm for award.  Id.
at 11-12.  

On June 3, 2016, the Army notified the three offerors, including TCS, of the award decision.  AR, Tab 12a, Notice to Unsuccessful Offeror to TCS.  The protester was debriefed on June 9, and this protest followed.[6]

DISCUSSION

TCS challenges numerous aspects of the agency’s evaluation of offeror A’s and DataPath’s technical proposals; the agency’s cost realism analysis; and the selection decision. 

In reviewing protests of an agency’s evaluation and source selection decision, even in a task or delivery order competition as here, we do not reevaluate proposals; rather, we review the record to determine whether the evaluation and source selection decision are reasonable and consistent with the solicitation’s evaluation criteria and applicable procurement laws and regulations.  See Ball Aerospace & Techs. Corp., B-411359, B-411359.2, July 16, 2015, 2015 CPD ¶ 219 at 7.  A protester’s disagreement with the agency’s judgment, by itself, is not sufficient to establish that an agency acted unreasonably.  Id.  Although we do not specifically address all of TCS’s arguments, we have fully considered all of them and find that they afford no basis on which to sustain the protest.

Technical Evaluation

TCS argues that the agency’s evaluation of DataPath’s and offeror A’s proposals as acceptable under the technical factor, sample hiring scenario subfactor, was unreasonable because both offerors failed to satisfy the minimum requirements set forth in the solicitation.  Supplemental (Supp.) Protest and Comments at 4-10; Supp. Comments at 2-7. 

The solicitation provided that the technical factor would be evaluated on an acceptable/unacceptable basis; and that in order to be acceptable, offerors must be acceptable in all six subfactors.  RTEP at 14.  The solicitation defined “[a]cceptable” to mean that “the TEP clearly meets the minimum requirements of the RTEP,” and defined “[u]nacceptable” to mean that “the TEP does not clearly meet the minimum requirements of the RTEP or the TEP fails to follow the instructions of the RTEP or the TEP fails to provide all of the submission requirements.”  Id. at 14-15. 

With regard to the sample hiring scenario subfactor, the solicitation stated that:

At a minimum, the [o]fferor shall provide a response that will meet the following [g]overnment requirements as outlined in Attachment 10[7]:

The [o]fferor shall describe the following:

a.  The process to hire three (3) qualified new hires (without current VISAs) and deploy them to Forward Operating Base Union III, Iraq, for a period of one year, in accordance with the PWS [performance work statement] at the earliest possible date.  At a minimum, [o]fferors shall include a detailed description of the following portions of the hiring process:

i.  Iraq Embassy VISA process;

ii.  Medical Process (the [o]fferor shall specify any required forms, testing, labs, or immunizations).

iii.  Training Process (to include all required courses); 

b.  The timeline for deployment of the three (3) FSR new hires from the selection of candidates to their arrival at Forward Operating Base Union III, Iraq.

RTEP at 4-5 (emphasis added).  Relying on the highlighted language above, the protester contends the “minimum” information that offerors were required to include in order to be deemed technically acceptable was “all of the required training courses and all of the required labs.”  Supp. Protest and Comments at 5-8.  In response, the agency contends that the solicitation required offerors to provide a response that addressed the sample hiring scenario set forth in the solicitation, which was “designed to test the knowledge of the offerors of the overall process,” and did not contain any explicit minimum requirements (e.g., every step in the Iraq visa process, every medical procedure, or every training process) that the offerors would have to satisfy to be deemed acceptable (or which would be utilized to exclude those offerors that fail to list a single step, procedure, or process).  See AR, Supp. COS/MOL at 2-4. 

In this regard, the agency states that, had the solicitation listed such “minimum requirements” as the protester claims it did, that would have resulted in offerors submitting “technical proposals that were useless to the [a]gency,” because in that circumstance, offerors could simply have repeated back the stated “minimum requirements” set forth in the solicitation.  Such proposals would not allow the agency to make a proper determination as to each offeror’s understanding of the requirements.  Id. at 4.

Where a protester and agency disagree over the meaning of solicitation language, we will resolve the matter by reading the solicitation as a whole and in a manner that gives effect to all of its provisions; to be reasonable, and therefore valid, an interpretation must be consistent with the solicitation when read as a whole and in a reasonable manner.  DKW Commc’ns, Inc., B-412652.3, B-412652.6, May 2, 2016, 2016 CPD ¶ 143 at 7; Alluviam LLC, B-297280, Dec. 15, 2005, 2005 CPD ¶ 223
at 2.  We have also long held that the evaluation of proposals is a matter within the discretion of the procuring agency; we will question the agency’s evaluation only where the record shows that the evaluation does not have a reasonable basis or is inconsistent with the solicitation.  Hardiman Remediation Servs., Inc., B-402838, Aug. 16, 2010, 2010 CPD ¶ 195 at 3.

On this record, we agree with the agency that the solicitation did not set forth any minimum requirements that offerors were required to satisfy in responding to the sample hiring scenario in order to be deemed “acceptable.”  Rather, the solicitation merely required that an offeror submit its description of the process for hiring three qualified new hires (as described in the scenario) that addressed, at a minimum, the Iraq Embassy visa process, medical process, and training process; as well as a timeline from selection of the new hires to their arrival at the forward operating base in Iraq, in accordance with the PWS. 

Further, the record shows that DataPath’s and offeror A’s proposals both addressed these requirements, which the agency determined to be acceptable.  See AR, Tab 13a, DataPath Technical Factor Evaluation at 3-5; AR, Tab 13c, offeror A Technical Factor Evaluation at 3-5.  Accordingly, we find no merit to the protester’s challenges to the technical acceptability of DataPath’s and offeror A’s proposals in this regard. 

Past Performance

TCS argues that the agency’s evaluation and selection decision unreasonably ignored negative past performance information about DataPath when it assigned the awardee’s past performance a substantial confidence rating and determined that the two offerors were “essentially equal” under this factor.  See Protest at 15‑16; Supp. Protest and Comments at 14-21.  In this regard, TCS claims that, in contrast to its own “flawless” performance, the record shows that DataPath had numerous performance problems, which the agency failed to meaningfully consider in its evaluation and selection decision.  See Protest at 16; Supp. Protest and Comments at 18-20.  TCS further argues that had the agency considered this information, it would have focused on the difference in quality of the two firms’ past performance, and would have found TCS to be superior to DataPath under the past performance factor.  Supp. Protest and Comments at 20-21. 

Our Office will examine an agency’s evaluation of an offeror’s past performance only to ensure that it was reasonable and consistent with the stated evaluation criteria and applicable statutes and regulations, since determining the relative merit of an offeror’s past performance is primarily a matter within the agency’s discretion.  American Envtl. Servs., Inc., B-406952.2, B-406952.3, Oct. 11, 2012,
2013 CPD ¶ 90 at 5; AT&T Gov’t Solutions, Inc., B-406926 et al., Oct. 2, 2012,
2013 CPD ¶ 88 at 15.  The evaluation of past performance, by its very nature, is subjective, and we will not substitute our judgment for reasonably based evaluation ratings; an offeror’s disagreement with an agency’s evaluation judgments, by itself, does not demonstrate that those judgments are unreasonable.  Cape Envtl. Mgmt., Inc., B-412046.4, B-412046.5, May 9, 2016, 2016 CPD ¶ 128 at 8-9.   

Here, the record shows that the agency’s evaluation took into consideration DataPath’s alleged performance issues in assigning a substantial confidence rating and in the selection official’s finding that the two offerors were essentially equal.

For example, the protester asserts that while the agency’s evaluation of DataPath’s past performance does not ignore an issue DataPath experienced during its performance of the incumbent contract, the issue is “mention[ed] [] only in passing,” without any analysis of how the issue was weighed in the assignment of the confidence rating.  See Supp. Protest and Comments at 18-19. 

In response, the agency contends that TCS “wrongly overstates the severity of [DataPath’s] past performance issue on [the] task order.”  AR, COS/MOL
at 23-24.  The Army explains that it reviewed a 2015 interim Contractor Performance Assessment Reporting System (CPARS) report[8] for a task order issued under the incumbent contract--under which DataPath is a subcontractor--indicating that while “DataPath FSRs [field support representatives] have performed very well in their specific areas and are extremely professional, experienced and knowledgeable,” there were staffing issues that occurred under the task order (“there have been many open requisites that have taken extremely long to fill or have filled with unqualified personnel which is unacceptable.”).  See id.; AR, Tab 8m, DataPath Past Performance CPARS for Task Order 0014, at 1; AR, Tab 8a, DataPath Past Performance Evaluation at 2. 

In addition to the CPARS report, the agency also reviewed a questionnaire completed by the project lead for the task order, assessing DataPath’s past performance in five areas.  See AR, COS/MOL at 23.  DataPath’s performance received three good ratings, one acceptable rating, and one outstanding rating.[9]  AR, Tab 8k, DataPath Past Performance Evaluation Questionnaire Form for Task Order 0014, at 2-3.  The evaluation form also included a comment that “[b]ased on R23G and WWSS related work, DataPath is fully capable of performing to the level required for this effort.”  Id. at 3. 

The agency explains that notwithstanding the information in the CPARS, it found that the more recent information provided in the questionnaire[10] indicated that DataPath’s performance on the incumbent task order was exemplary.  See AR, COS/MOL at 23-24.  Further, while the interim CPARS noted minor staffing issues related to FSR personnel for which DataPath, as a subcontractor, was not directly accountable, the actual quality of DataPath’s performance was noted as being extremely successful.  Therefore, the agency relied more heavily on DataPath’s overall performance on the contract, rather than its noted shortcomings as a subcontractor in simply filling positions.  Id.  Finally, the agency states that it was able to assign a substantial confidence rating because DataPath had two other “very relevant” and two “relevant” contract references indicating that the company’s past performance was exemplary.  Id. at 23. 

In sum, the record shows that the agency considered the staffing issue that the protester alleges was ignored; however, the agency concluded that the quality of DataPath’s overall performance on the task order, as well as the totality of DataPath’s past performance, outweighed the minor staffing shortcomings on the task order.  Further, based on the offeror’s recent and relevant performance record, the government had a high expectation that the offeror would successfully perform the required effort.  See AR, COS/MOL at 24; see also AR, Tab 8a, DataPath Past Performance Evaluation at 9-10.  We find nothing unreasonable about the agency’s conclusions. 

We also agree with the agency that the protester overstates the severity of DataPath’s performance problems.  In addition to the alleged performance issues on the incumbent task order, the protester contends that the following statement from another interim CPARS shows negative past performance information:  “This would be exceptional if I were rating the quality and professionalism of the field support representatives (FSRs) in the field operating the terminal. I downgraded it to very good due to some logistics hiccups and length to troubleshoot some efforts.  Overall, the program has been high quality.”  We disagree with the protester’s characterization of this report.  In our view, the overall past performance reflected in that reference was positive; further, the assessing official commented that  “Given what I know today about the contractor’s ability to execute what he promised in his proposal, I definitely would award to him today given that I had a choice.”  Compare Supp. Protest and Comments at 19 with AR, Tab 8a, DataPath Past Performance Evaluation at 4. 

An offeror’s disagreement with an agency’s evaluation judgments, by itself, does not demonstrate that those judgments are unreasonable.  Cape Envtl. Mgmt., Inc., supra.  Accordingly, we find no merit to TCS’s protest challenging the agency’s evaluation of DataPath’s past performance. 

Similarly, we find no merit to TCS’s challenges to the selection decision, which found that both TCS and DataPath were assigned “essentially equal past performance ratings with an equal [number] of contract examples determined to be very relevant as compared to the current effort.”  See Supp. Protest and Comments at 20-21; AR, Tab 10, SSDD at 11.  While the protester disputes the contracting officer’s finding that all three offerors “clearly demonstrated their successful [p]ast [p]erformance” and that the “only distinguisher among the overall [s]ubstantial [c]onfidence ratings is their relevancy ratings for the contract examples provided,” TCS has not otherwise challenged the agency’s assessment of the relevancy ratings or overall confidence assessments, and the record shows that the contracting officer reviewed the past performance evaluation of the offerors which discussed the alleged “negative information.”  See AR, Tab 10, SSDD at 10; AR, Tab 9a, Price Negotiation Objective Memorandum at 12, 7.  Accordingly, this protest ground is denied. 

Cost Evaluation 

TCS argues that the agency failed to evaluate whether the direct labor rates proposed by DataPath were realistic, and instead relied on a mechanical application of a standard deviation methodology.  Supp. Protest and Comments at 22-25.  In this regard, TCS argues that DataPath’s labor rates were significantly lower than the other offerors’ proposed labor rates, and that the inclusion of those rates in the agency’s analysis artificially lowered the standard against which DataPath was evaluated, resulting in a flawed and unreliable cost realism analysis.  Id. at 23. 

When an agency evaluates a proposal for the award of a cost-reimbursement contract or task order, an offeror’s costs are not dispositive because, regardless of the costs proposed, the government is bound to pay the contractor its actual and allowable costs.  FAR § 15.305(a)(1); Exelis Sys. Corp., B-407673 et al.,
Jan. 22, 2013, 2013 CPD ¶ 54 at 7; CGI Fed. Inc., B-403570 et al., Nov. 5, 2010, 2011 CPD ¶ 32 at 5 n.1.  Consequently, an agency must perform a cost realism analysis to determine the extent to which an offeror’s proposed costs are realistic for the work to be performed.  FAR § 15.404-1(d)(1); Solers Inc., B-409079,
B-409079.2, Jan. 27, 2014, 2014 CPD ¶ 74 at 4.  An agency’s cost realism analysis requires the exercise of informed judgment, and we review an agency’s judgment in this area only to see that the cost realism analysis was reasonably based and not arbitrary.  Info. Ventures, Inc., B-297276.2 et al., Mar. 1, 2006, 2006 CPD ¶ 45 at 7.  The analysis need not achieve scientific certainty; rather, the methodology employed must be reasonably adequate and provide some measure of confidence that the agency’s conclusions about the most probable costs under an offeror’s proposal are reasonable and realistic in view of other cost information reasonably available to the agency at the time of its evaluation.  Id.

In general, an agency must independently analyze the realism of an offeror’s proposed costs based upon the offeror’s particular approach, personnel, and other circumstances; a cost estimation method which mechanically adjusts proposed labor rates fails to satisfy the requirement for an independent analysis of an offeror’s proposed costs.  CSI, Inc.; Visual Awareness Techs. and Consulting, Inc.,
B-407332.5 et al., Jan. 12, 2015, 2015 CPD ¶ 35 at 10-11; Metro Mach., Corp.,
B-402567, B-402567.2, June 3, 2010, 2010 CPD ¶ 132 at 6.  However, where, as here, a solicitation provides a cost model that specifies the labor mix and level of effort for offerors’ proposals--thereby making offerors responsible for proposing costs based on their own rates, but not for proposing differing technical approaches
--an agency may reasonably evaluate the rates proposed for those established labor categories based on other data, such as the rates proposed by other offerors.  See Booz Allen Hamilton, Inc., B-412744, B-412744.2, May 26, 2016, 2016 CPD
¶ 151 at 10; CSI, Inc.; Visual Awareness Techs. and Consulting, Inc., supra; Energy Enter. Solutions, LLC; Digital Mgmt., Inc., B-406089 et al., Feb. 7, 2012, 2012 CPD ¶ 96 at 9-10.  As a result, we do not find the agency’s use of the standard deviation methodology as a tool for determining the realism of the offerors’ proposed labor rates to be objectionable. 

Further, the agency explains that in addition to this analysis using the standard deviation methodology discussed above--which resulted in the agency making minor upward adjustments (less than $1) to nine labor categories--the agency also reviewed the salary survey data submitted by DataPath, and performed an independent cost analysis comparing the average direct rates of the offerors to those in the independent government cost estimate (IGCE).  This additional review showed that the offerors’ proposed direct labor rates were generally lower than the IGCE rates (DataPath’s by an average of 17.02 percent, TCS’s by an average of 17.97 percent, and offeror A’s by an average of 24.05 percent).  AR, COS/MOL at 13, 15-16; see AR, Tab 9a, Unredacted Price Negotiation Objective Memorandum at 22-23, 55-56; AR, Tab 8b, DataPath Final Cost Report at 5-7; AR, Tab 5w, DataPath Cost Narrative at 11, 82-97; AR, Tab 9b, Unredacted IGCE vs. Offeror Labor Rates.  On this record, we see nothing objectionable in the agency’s cost realism analysis of DataPath’s direct labor rates, and therefore find the protester’s arguments to be without merit. 

Selection Decision

Finally, TCS argues that the best-value tradeoff decision was flawed because it relied on a flawed evaluation.  Protest at 17; Supp. Protest and Comments at 25. 

Where, as here, a procurement provides for issuance of a task order on a
best-value basis, it is the function of the selection official to perform a price/technical tradeoff, that is, to determine whether one proposal’s technical superiority is worth its higher price.  See MILVETS Sys. Tech., Inc., B-409051.7, B-409051.9,
Jan. 29, 2016, 2016 CPD ¶ 53 at 10 (citing Research and Dev. Solutions, Inc.,
B-410581, B-410581.2, Jan. 14, 2015, 2015 CPD ¶ 38 at 11).  An agency has broad discretion in making a tradeoff between price and nonprice factors, and the extent to which one may be sacrificed for the other is governed only by the tests of rationality and consistency with the solicitation’s stated evaluation criteria.  Id. (citing Portage, Inc., B-410702, B-410702.4, Jan. 26, 2015, 2015 CPD ¶ 66 at 19).  Where selection officials reasonably regard proposals as being essentially technically equal, price properly may become the determining factor in making award, notwithstanding that the solicitation assigned price less importance than the technical factors.  Synergetics, Inc., B-299904, Sept. 14, 2007, 2007 CPD ¶ 168 at 7. 

As described above, the record does not support TCS’s challenges to the agency’s evaluation of past performance or cost.  Accordingly, we find no merit to TCS’s objections to the agency’s selection decision, which are based upon those alleged errors.  Further, on this record, we find nothing unreasonable in the selection official’s decision to select, as between the two technically-equal proposals, DataPath’s lower-priced proposal.

The protest is denied.

Susan A. Poling
General Counsel



[1] The solicitation was amended five times.  All citations to the RTEP are to the final version, as amended on April 19, 2016. 

[2] The solicitation used the term TEP to refer to the response to the RTEP.  See id. at 1. 

[3] Offerors were also instructed to submit “documentation of the most recent [d]irect [l]abor [r]ates,” to include at least one of four categories, one of which was salary survey data.  RTEP at 9-10.  In this regard, the solicitation advised offerors that “[w]hen salary surveys are used to support proposed direct labor rates, [o]fferors are encouraged to propose a direct labor rate for each category within the range of the 25th to 90th percentiles of the survey data.”  Id. at 10.  The solicitation further advised that if rates higher or lower than this range were proposed, offerors were to explain in their cost narratives the plan of action for filling the position, and that the “proposed cost may be determined NOT fair and reasonable and/or NOT realistic if unsupported.”  Id.

[4] TCS submitted three past performance references that were found to be very relevant; DataPath submitted five past performance references, three of which were found to be very relevant and two of which were found to be relevant.  See AR,
Tab 10, SSDD at 10. 

[5] Offeror A submitted five past performance references, one of which was found to be very relevant and the remaining references to be relevant.  See AR,
Tab 10, SSDD at 11.

[6] The awarded value of the task order at issue exceeds $10 million.  Accordingly, this procurement is within our jurisdiction to hear protests related to the issuance of orders under multiple-award ID/IQ contracts.  10 U.S.C. § 2304c(e). 

[7] Attachment 10 set forth a sample hiring scenario, which contained essentially the same instructions as those set forth in the RTEP instructions, stating in addition that the scenario was “Three Field Support Representatives (FSR’s) to include one (1) Information Assurance, one (1) NetApp Engineer and one (1) Virtualization and Storage Engineering Support resigned from their positions in Iraq.”  Compare RTEP at 4-5 with RTEP, Attachment 10. 

[8] The CPARS itself assessed the prime contractor’s performance under the task order, but also included the information about DataPath’s performance as a subcontractor, discussed above.  See AR, Tab 8m, DataPath Past Performance CPARS for Task Order 0014, at 1.  The assessing official is the contracting officer for the subject procurement.  See id. at 3. 

[9] The available ratings were outstanding, good, acceptable, marginal, and unsatisfactory.  AR, Tab 8k, DataPath Past Performance Evaluation Questionnaire Form for Task Order 0014, at 1. 

[10] The questionnaire was dated February 7, 2016.  See AR, Tab 8k, DataPath Past Performance Evaluation Questionnaire Form for Task Order 0014, at 4. 
