ELAINE D. KAPLAN, District Judge.
In this pre-award bid protest, Plaintiff KSC Boss Alliance, LLC ("KBA") challenges as arbitrary and capricious NASA's decision to exclude it from the competitive range in a procurement for "base operations and spaceport services" at John F. Kennedy Space Center in Cape Canaveral, Florida.
For the reasons discussed below, the Court finds that KBA's arguments lack merit. KBA's motion is therefore DENIED, and the government's and PSP's cross-motions are GRANTED.
NASA issued Solicitation NNK18619079R ("the Solicitation") on November 1, 2017. AR Tab 15 at 13589. The Solicitation requested proposals for the Kennedy Space Center Base Operations and Spaceport Services Contract ("the BOSS Contract").
The Solicitation contemplated a "single award, fixed-price contract with a cost-reimbursable Contract Line Item Number ('CLIN') and a fixed-price Indefinite Delivery Indefinite Quantity ('IDIQ') component." AR Tab 58a at 28303. The cost-reimbursable CLIN applied to the "baseline work" set forth in the Performance Work Statement ("PWS"). That baseline work included:
AR Tab 15 at 13720-21. The IDIQ component would include "work which cannot be adequately defined in advance for inclusion with baseline work."
Under the Solicitation, the award would be made to the responsible offeror whose proposal represented the best value to the government.
The Solicitation warned prospective offerors that their "initial proposal[s] should contain [their] best terms from a cost or price and technical standpoint," and indicated that NASA did not intend to hold discussions.
NASA's Source Evaluation Board ("SEB") would conduct evaluations of the proposals pursuant to NFS 1815.370.
As set forth in the Solicitation and in accordance with NFS 1815.304-70, the SEB would evaluate proposals based on three factors: (1) Mission Suitability; (2) Past Performance; and (3) Price. AR Tab 15 at 13707. The Price factor was more important than the Mission Suitability factor, which was more important than the Past Performance factor.
The Mission Suitability factor comprised three subfactors: (1) Management Approach; (2) Technical Approach; and (3) Small Business Utilization. AR Tab 38 at 27802.
As the government explains, under the Solicitation's evaluation procedures, the percentile score assigned to each subfactor would be multiplied by the points available for that subfactor to arrive at the subfactor's point total.
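For illustration only (the figures are drawn from the scoring discussed later in this opinion and are used here simply to show the mechanics of the calculation), a proposal assigned a percentile score of 68 under a subfactor worth 525 available points would receive

0.68 × 525 = 357

points toward its total Mission Suitability score.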
The Solicitation provided that, consistent with FAR 15.305(a)(2) and NFS 1815.305(a)(2), NASA would "evaluate [offerors'] and proposed major subcontractors' recent performance of work similar in size, content, and complexity to the requirements of [the Solicitation]." AR Tab 15 at 13711. Based upon that evaluation, NASA would then assign confidence ratings as provided in NFS 1815.305(a)(2): Very High, High, Moderate, Low, Very Low, or Neutral.
Price—the most important factor—was to be evaluated in accordance with FAR Subpart 15.4. The Solicitation stated that NASA would perform a price analysis consistent with FAR 15.404-1(b).
The Solicitation required that all proposals be fully submitted by December 29, 2017.
On April 11, 2018, the SEB presented the results of its initial evaluation to the Source Selection Authority ("SSA"). AR Tab 38 at 27787. The presentation summarized the evaluation factors as well as the ratings and scores achieved by the offerors.
In the "Final Summary" section of its presentation, the SEB provided a chart detailing the offerors' Mission Suitability adjectival ratings and scores, their Past Performance ratings, and their total evaluated prices:
The contracting officer decided during the evaluation process that discussions should be held. Accordingly, FAR 15.306(c) required the establishment of a competitive range composed of the most highly rated proposals. AR Tab 39 at 27913. The SEB "recommended that the Government hold discussions with Offeror A, Offeror B, and PSP, which the SEB determined to be the most highly rated proposals" pursuant to FAR 15.306(c).
On April 12, 2018, the contracting officer prepared a five-page memorandum explaining the rationale for her competitive range determination.
In her memorandum, the contracting officer provided an overview of the ratings assigned to each offeror and an explanation of her competitive range determination. She observed that Offeror A had the highest Mission Suitability score but that it also had the highest price.
The memorandum also included a discussion of the ratings assigned to KBA's proposal and an explanation for its exclusion from the competitive range. The contracting officer observed that KBA's Mission Suitability score "was the second lowest of the five proposals."
The contracting officer determined that it would not be worthwhile to engage in discussions with KBA to address the weaknesses in its proposal. She opined that "[e]ven if KBA were to correct these weaknesses as a result of discussions, without any strengths or significant strengths in the Management and Technical subfactors, it is highly unlikely that discussions would result in KBA substantially increasing its Mission Suitability score without significant proposal revisions."
On April 16, 2018, NASA notified KBA that its proposal was not among the most highly rated proposals and was not included in the competitive range.
On May 4, 2018, KBA filed a protest with GAO. AR Tab 55. Relying upon GAO's decision in
GAO denied the protest on July 27, 2018. AR Tab 71 at 60112. It concluded that NASA's "underlying evaluation of KBA's proposal was reasonable and in accordance with the stated evaluation criteria."
On August 6, 2018, ten days after GAO rejected KBA's protest, NASA awarded the BOSS contract to PSP, Defendant-Intervenor herein. Press Release, Nat'l Aeronautics & Space Admin., NASA Awards Base Operations, Spaceport Services Contract at Kennedy Space Center (Aug. 6, 2018),
In its pleadings, KBA challenges NASA's competitive range determination on several grounds. First, it contends that NASA relied too heavily on the adjectival ratings and point scores assigned to the proposals and that it failed to adequately document the reasons for its competitive range determination. KBA also contends that the point scores themselves were distorted and arbitrary, and that the determination did not reflect a meaningful consideration of relevant factors, including price.
On October 10, 2018, contract awardee PSP filed a motion to intervene. ECF No. 15. The Court granted the motion the next day. ECF No. 16. KBA filed its motion for judgment on the administrative record on November 6, 2018 (ECF No. 31), and cross-motions from the government (ECF No. 36) and PSP (ECF No. 35) followed. The cross-motions have been fully briefed. Oral argument was held on February 27, 2019.
The Court of Federal Claims has jurisdiction over bid protests in accordance with the Tucker Act, 28 U.S.C. § 1491, as amended by the Administrative Dispute Resolution Act of 1996 § 12, 28 U.S.C. § 1491(b). Specifically, the Court has the authority "to render judgment on an action by an interested party objecting to a solicitation by a Federal agency for bids or proposals for a proposed contract or to a proposed award or the award of a contract or any alleged violation of statute or regulation in connection with a procurement or a proposed procurement." 28 U.S.C. § 1491(b)(1);
To possess standing to bring a bid protest, a plaintiff must be an "interested party"—that is, an actual or prospective bidder or offeror whose direct economic interest would be affected by the award of the contract or by a failure to award the contract.
In a pre-award protest, to possess the direct economic interest necessary for standing, a protester must have suffered a non-trivial competitive injury that can be addressed by judicial relief.
KBA, an actual offeror, challenges its exclusion from the competitive range, a procurement decision falling within the Court's "broad grant of jurisdiction over objections to the procurement process."
Parties may move for judgment on the administrative record pursuant to Rule 52.1 of the Rules of the Court of Federal Claims ("RCFC"). Pursuant to RCFC 52.1, the Court reviews an agency's procurement decision based on the administrative record.
The Court reviews challenges to procurement decisions under the same standards used to evaluate agency actions under the Administrative Procedure Act, 5 U.S.C. § 706 ("APA").
This "highly deferential" standard of review "requires a reviewing court to sustain an agency action evincing rational reasoning and consideration of relevant factors."
The scope of judicial review of competitive range determinations is particularly narrow. Thus, "a contracting officer has broad discretion in determining competitive range, and such decisions are not disturbed unless clearly unreasonable."
In short, a disappointed offeror "bears a heavy burden" in attempting to show that a procuring agency's decision lacked a rational basis.
In this case, KBA's challenges to its exclusion from the competitive range in the BOSS procurement fall into two general categories. First, it claims that NASA's decisions to assign its proposals certain weaknesses under the Management and Technical subfactors of the Mission Suitability factor were arbitrary and capricious. Second, it challenges the competitive range determination itself, alleging: 1) that it was inadequately documented; 2) that it placed excessive reliance on adjectival ratings and point scores (which were "distorted and arbitrary"); and 3) that the determination does not reflect a meaningful consideration of relevant factors, including price. Pl.'s Mem. in Supp. of Mot. for J. on the Admin. R. ("Pl.'s Mem.") at 7, ECF No. 31-1.
For the reasons set forth below, each of KBA's protest grounds lacks merit. KBA has not met its "heavy burden" of showing that its exclusion from the competitive range lacked a rational basis. Therefore, its motion for judgment on the administrative record must be denied and the government and PSP's motions granted.
The SEB assigned KBA's proposal two weaknesses under the Management Approach subfactor (one of them at the "significant" level). It assigned five weaknesses under the Technical Approach subfactor (one of which was at the "significant" level). KBA now contends that several of these evaluation decisions were arbitrary and capricious because they were based on a misreading or misunderstanding of the relevant elements of KBA's proposal.
It is well established that the evaluation of proposals for their technical quality generally requires the special expertise of procurement officials. Reviewing courts therefore give the greatest deference possible to these determinations.
Thus, to successfully challenge NASA's determinations regarding the soundness of KBA's proposal, KBA must show that the agency "entirely failed to consider an important aspect of the problem, offered an explanation for its decision that runs counter to the evidence before [it], or [made a decision that] is so implausible that it could not be ascribed to a difference in view or the product of agency expertise."
The Solicitation establishes that the baseline work of the contract would encompass two types of service requests: 1) Baseline Repairs and Replacements ("BRRs"), i.e., "request[s] for work to existing infrastructure that is essential to protect, preserve, or restore Facilities, Systems, Equipment, and Utilities (FSEU)," AR Tab 15 at 13842; and 2) Service Orders ("SOs"), i.e., "request[s] for facilities-related work that is new in nature and not typically essential to protect, preserve, or restore FSEU,"
Under the Performance Work Statement (at PWS 1.2.1-23), the contractor was required to "[m]anage the established counts of SOs and BRRs with the established counts exchange rate (ECER)" and "[e]xchange the established counts of SOs and BRRs within each customer using the ECER to meet fluctuating needs."
In the section of its proposal responsive to the foregoing requirements and the corresponding evaluation criteria, KBA stated as follows, in pertinent part:
AR Tab 20a at 15635.
The SEB assigned KBA a significant weakness with respect to this aspect of its proposal. It concluded that "KBA's approach of using a set point of 85% to monitor and track the expenditure of 'work units' by customer is an inappropriate approach to managing the established counts of [BRRs] and [SOs]." AR Tab 32a at 27668. The SEB explained that the approach did not "demonstrate how KBA will not exceed the total combined established units," nor did it "demonstrate how KBA will perform PWS 1.2.1-23 requirement to exchange the established counts of BRRs and SOs within each customer using the ECER to meet fluctuating needs."
The SEB found these flaws in KBA's proposal significant because "[m]anaging the established counts [was] a significant portion of the contract," and because, in the SEB's view, KBA "fail[ed] to describe an approach for actively managing the usage rates of the counts until the 85% set point is reached."
KBA now contends (as it did unsuccessfully before GAO) that the agency's assignment of a significant weakness concerning its proposal to manage established counts is based on a misunderstanding of the proposal. Thus, while KBA concedes that the "85% set point notification" does not satisfy the requirement set forth in PWS 1.2.1-23 that the contractor actively manage established counts, it asserts that it was "never intended" to do so. Pl.'s Mem. at 24. Instead, according to KBA, it proposed to manage established counts by "develop[ing] a budget management system in Excel that monitors and tracks the expenditure of work units by client" and by "communicating with the government regarding needs for approaches to ensure sufficient units are available, such as 'exchanges between customers.'"
KBA's challenge to this aspect of NASA's evaluation lacks merit. "An offeror has the responsibility to submit a well-written proposal with adequately detailed information that allows for a meaningful review by the procuring agency."
The agency's conclusion was particularly reasonable given the conclusory nature of the other language in KBA's proposal, which asserted (without explaining how) that KBA "manage[s] and prioritize[s] [electronic work requests] to ensure the total combined established units for each baseline customer are not exceeded, and communicate[s] with the Government on recommended approaches . . . to ensure sufficient units are available throughout each fiscal year." AR Tab 20a at 15635. In fact, this portion of KBA's proposal essentially just restates the Solicitation requirements, which offerors were cautioned not to do.
The Court finds similarly unpersuasive KBA's argument that "NASA evaluators were incorrect in their assertion that KBA's proposal failed to address the monthly surveillance review meeting." Pl.'s Mem. at 24. KBA appears to argue that the evaluators erred by failing to infer that this requirement was satisfied by a separate section of its proposal—which assigned its Program General Manager to participate in the surveillance review meetings. But the SEB's criticism was not directed at whether the appropriate personnel would be participating in the meetings; rather, it was directed at KBA's failure to specify that established counts would be discussed at each meeting. Again, it was KBA's responsibility to submit a well-written proposal that was responsive to the requirements of the Solicitation.
Finally, KBA has challenged the agency's view that its proposal created a risk of imposing "budgetary burdens," contending that budget issues are "not properly considered on a fixed-price contract." Pl.'s Mem. at 25-26. As the government points out, however, KBA misunderstands the nature of the "budgetary" risks the SEB identified. The SEB concluded that KBA's proposed approach could "cause budgetary burdens on customers to fund purchases of additional pre-priced counts or deferral of priority work." AR Tab 32a at 27668. In other words, the risk identified was that additional counts would need to be procured or work deferred because of inadequate management of the established counts. That risk is appropriately a consideration where, as here, the fixed-price CLIN covers only a set number of counts. This challenge to the SEB's decision to assign a significant weakness to KBA's proposal, therefore, falls short.
KBA also challenges the SEB's decision to assign its proposal a weakness based on its conclusion that KBA's proposed "outage management process . . . include[d] cost type contract assumptions," which the SEB believed "demonstrate[d] a lack of understanding of and ability to manage the contract as a fixed-price contract." AR Tab 32a at 27670. According to the SEB, these "cost type" assumptions were reflected in KBA's proposal to assign a budget to each outage, to use an outage cost control coordinator to track, update, and report the cost performance against the outage budget, and to present such reports at the weekly outage meeting.
KBA asserts that the inferences that the SEB drew from its proposal were unreasonable. It contends that the proposal "clearly indicated that KBA understood that outage management was to be performed under fixed-price baseline or IDIQ work." Pl.'s Mem. at 27. In support of that assertion, it cites several pages in the Technical Approach 1.0 section of its proposal, in which it referenced the concept of "baseline work" or "Baseline Repairs and Replacement" while also discussing outages.
The Court is also not persuaded by KBA's argument that the Solicitation "require[d] that actual costs are tracked in order to actively manage categories of BRRs and SOs that correspond to different cost levels," and that therefore, "KBA's proposal to monitor costs and work scope for outages performed as part of the overall baseline fixed-price effort is consistent with contract requirements." Pl.'s Mem. at 27-28. As the government explains, the cost tracking associated with BRRs and SOs "addresses a separate PWS requirement" from outage management. Def.'s Mot. at 34; Def.'s Reply in Supp. of Mot. for J. on the Admin. R. at 13-14, ECF No. 42.
In short, the Court is satisfied that the SEB examined the proposal in its entirety, applied its expertise to the proposal's interpretation, and supplied an adequate explanation of its reasoning. Its assignment of a weakness to this aspect of KBA's proposal was rational and supported by the record.
The agency assigned KBA's proposal a significant weakness under the Technical Approach subfactor because its "basis of estimate" ("BOE") spreadsheet "fail[ed] to identify any other direct costs" ("ODCs") for operations, maintenance, and engineering to perform the [PWS] requirements. AR Tab 32a at 27672. KBA challenges this determination on the grounds that the Solicitation did not require it to provide the information omitted from its spreadsheet and that, in any event, the information was included elsewhere in its proposal. These contentions lack merit.
The SEB's decision to assign the significant weakness was based on what it viewed as a failure on KBA's part to comply with Section L.16.2 of the Solicitation. That provision specified the information an offeror was required to include in its proposal in response to the Technical Approach subfactor. It stated that offerors must provide BOEs "for the requirements identified in Attachment L-03 (BOE Template), using the format provided." AR Tab 15 at 13698; see
Based on these instructions, each offeror submitted a completed version of Attachment L-03 with its proposal. It is undisputed, however, that the BOE Template that KBA submitted did not include entries for the two columns covering "other direct costs" to perform the operations, maintenance, and engineering requirements. AR Tab 32a at 27672. The SEB observed that the maintenance function, for example, involved lubrication and the cleaning of equipment, and yet KBA's spreadsheet did not include other direct costs for the materials needed to perform these tasks.
KBA contends that the Solicitation did not require any itemization of ODCs for the PWS requirements at issue. Pl.'s Mem. at 32-33. Further, it asserts that it "incorporated ODCs into its BOE for these PWS sections," but rather than break them out on the BOE Template, the ODCs "instead were made part of KBA's overall proposed cost."
The Court finds KBA's arguments unpersuasive. As detailed above, the Solicitation specified that offerors were to provide BOEs for the requirements listed, and that they were to use the format provided in Attachment L-03. AR Tab 15 at 13698. That attachment, in turn, included two ODC columns.
In light of the foregoing, it was reasonable for the agency to assign a significant weakness to this aspect of KBA's proposal, notwithstanding that KBA included ODCs in its "overall proposed cost." The agency instructed offerors to fill out the BOE Template so that it could fully assess their understanding of the contract requirements and the reasonableness of their approaches. The SEB's assignment of the significant weakness based on KBA's failure to sufficiently demonstrate its comprehension was reasonable and consistent with the Solicitation.
The Court also rejects KBA's argument that it was "prohibited" from providing ODCs for the requirement to provide a trained and fully licensed workforce. The fact that the Solicitation required training to be provided at the contractor's expense did not obviate the requirement that offerors list the ODCs of such training to demonstrate to the agency's satisfaction that they understood the training requirement and had a reasonable proposal to meet it.
Once again, it was KBA's responsibility to submit a "well-written proposal with adequately detailed information that allow[ed] for a meaningful review by [NASA]."
Finally, NASA assigned KBA a weakness under the Technical Approach subfactor based on KBA's failure to meet the PWS 3.1.2-2 requirement to "[d]evelop and maintain a process to manage Maintenance Action Requests (MARs) within Maximo." AR Tab 15 at 13767. Like its other challenges to the SEB's evaluation decisions, KBA's claim that the assignment of this weakness was improper lacks merit.
Maximo is a government-provided computerized maintenance management system that is used at the Kennedy Space Center.
KBA again contends that the SEB misread its proposal. It argues that in addition to providing for review of MARs during the Baseline Work Integration Meeting, other aspects of its proposal included "a continuous process/work flow within Maximo." Pl.'s Mem. at 36. Specifically, KBA notes that its proposal made reference to related software tools and that it stated: "We use EWRS and Maximo so the current work status of all WONs is available on demand."
As the government points out, the separate section of the proposal cited by KBA is not responsive to the particular PWS requirement at hand.
The Court also finds rational the SEB's concern that, under KBA's proposal, an excessive amount of time would be spent on MARs at weekly meetings. KBA's proposal specifies that it will maintain an MAR log listing work orders, which in turn is to be reviewed at each weekly Baseline Work Integration Meeting, where "each open action/issue will be reviewed and dispositioned in a number of ways." AR Tab 20a at 15669. The proposal further contemplates that "[d]isposition outcome(s) will be reviewed at the next scheduled meeting with the expectation [that] the action will be positively addressed and closed."
In short, the Court cannot second-guess the SEB's conclusion that KBA's proposal did not adequately meet this technical requirement. KBA's challenge to the weakness the SEB assigned regarding this issue is therefore without merit.
In addition to challenging NASA's assignment of weaknesses to aspects of its proposal as discussed above, KBA contends that the competitive range determination itself was arbitrary and capricious. It argues that, in making the determination, the contracting officer "improperly focused almost exclusively on high-level adjectival ratings, the number of assigned Strengths/Weaknesses, and distorted point scores," and further, that she "did not document any meaningful analysis of the evaluation findings or proposal features underlying those high-level findings." Pl.'s Mem. at 8.
As noted above, to prevail in its challenge to the contracting officer's competitive range determination, KBA must establish that it was "clearly unreasonable." For the reasons set forth below, the Court finds that KBA has failed to do so. The reasons for the contracting officer's competitive range determination were fully documented in the record and were not based solely on adjectival ratings and point scores. Instead, the determination reflects a meaningful consideration of all relevant factors. Further, and contrary to KBA's argument, NASA's point-score system did not artificially inflate the evaluated differences between the proposals. Rather, it provided a useful point of reference for distinguishing which proposals were the most highly rated and, therefore, should be included in the competitive range.
FAR 15.306(c)(1) governs the establishment of the competitive range. It states that "[b]ased on the ratings of each proposal against all evaluation criteria, the contracting officer shall establish a competitive range comprised of all of the most highly rated proposals, unless the range is further reduced for purposes of efficiency." FAR 15.306(c)(1) does not prescribe any documentation requirement or mandatory content for a contracting officer's memorandum memorializing her competitive range determination.
Here, the basis for the contracting officer's competitive range determination is well documented in the competitive range memorandum and in the record of the SEB's evaluation of the proposals. Thus, the SEB's presentation of its initial evaluation of each offeror's proposal to the SSA contains an explanation of the strengths and weaknesses the SEB assigned to each offeror's proposal for each technical subfactor under the Mission Suitability factor. These strengths and weaknesses, which were memorialized in individual memoranda, served as the basis for the adjectival ratings and point scores the SEB assigned. AR Tab 38 at 27804-39. The presentation further contains an explanation of the reasoning for the "Very High" confidence rating assigned for each offeror's past performance and includes a summary of the total evaluated price of all offerors.
In the competitive range memorandum itself, the contracting officer explained that she had decided to place the three most highly scored and rated proposals into the competitive range. That determination was supported by the fact that there was a natural breakpoint between the Mission Suitability scores of the three highest-rated proposals (Offeror A, Offeror B, and PSP) as compared to the remaining two proposals (Offeror C and KBA).
The contracting officer also provided a specific explanation in the memorandum as to why she had found that KBA's proposal was not among the most highly rated. She cited the proposal's several significant weaknesses, which had resulted in it being assigned the second-lowest Mission Suitability score of the five proposals. These included, as described above, "several Basis of Estimate areas that demonstrated a lack of understanding in various resource areas" and an "approach to managing the established counts" that the SEB found inadequate.
The contracting officer also explained that, in her view, it was unlikely that holding discussions with KBA would be fruitful. She observed that even if KBA corrected its weaknesses as a result of discussions, its proposal still would lack any strengths or significant strengths in the Management and Technical subfactors. She predicted that KBA would therefore have to make "significant proposal revisions" in order to increase its Mission Suitability score substantially.
As the foregoing demonstrates, the record reflects that the contracting officer concluded that KBA's proposal was not among the most highly rated because it was the second highest in price and yet had significant weaknesses and no strengths with respect to the two most important Mission Suitability subfactors. KBA's adjectival ratings and point scores were inferior to those of the three proposals that the contracting officer concluded were the most highly rated, including two proposals that came in at a significantly lower price. The contracting officer's reasoning was adequately documented through the competitive range memorandum and the supporting evaluation materials. Therefore, KBA's contention that the competitive range determination does not reflect a meaningful consideration of relevant factors lacks merit.
In addition to its argument that the contracting officer did not adequately document the basis for her competitive range determination, KBA challenges as "arbitrary and misleading" the Mission Suitability point-scoring system that NASA employed in connection with its evaluation process. Pl.'s Mem. at 15. KBA contends that a comparison of the total number of strengths and weaknesses assigned to each offeror's proposal does not portend a "large disparity" in the "underlying Mission Suitability evaluations."
KBA's "distortion" argument lacks merit. It is premised on the notion that the strengths and weaknesses that the SEB identified in each subfactor were of equal importance in scoring the Mission Suitability factor. In fact, however, each subfactor was assigned a weight based on its importance (525 points for Management Approach, 375 points for Technical Approach, and only 100 points for Small Business Utilization). Strengths and weaknesses assigned under the Management Approach subfactor, therefore, carried more than five times the weight of those assigned under the Small Business Utilization subfactor. And, as the summary chart below shows, KBA lacked any strengths that might outweigh its weaknesses under the two most important subfactors, Management and Technical Approach.
The Court finds similarly without merit KBA's contention that the competitive range determination was arbitrary and capricious because there is nothing in the record that explains "how the evaluators arrived at specific point scores within a 'percentile point range'" before multiplying the percentile scores by the number of available points for each subfactor. Pl.'s Mem. at 16. Thus, KBA argues that "the underlying record still does not show . . . how the evaluators determined whether an offeror fell within the low-end, middle, or high-end of a given 'percentile point range.'"
KBA's contention that there is nothing in the record that explains how specific point scores were assigned ignores the government's explanation that the evaluators reached "consensus findings" and assigned adjectival ratings and scores which were "validated" for consistency.
The evidence supporting this inference is reflected in the summary chart above, which shows each offeror's percentile scores and total points for each Mission Suitability subfactor, as well as the offerors' total Mission Suitability scores. Taking the Management Approach subfactor as an example, Offeror A and PSP received almost identical percentile scores of 68 and 69, which is rational considering their identical distribution of two "other" strengths each, with no significant strengths or any weaknesses of either type. In contrast, although it earned the same adjectival rating of "Good," Offeror B received a percentile score approximately ten points lower, which reasonably follows from the fact that it achieved only one strength and was also assigned a weakness. Along the same lines, there is a rational relationship between Offeror A and Offeror B's Technical Approach scores within the "Very Good" range, because Offeror A did slightly better than Offeror B in that it earned a strength whereas Offeror B did not.
The relative scores of KBA and Offeror C similarly reflect a rational scoring scheme. Each of these offerors achieved the same "Fair" rating for Management Approach and Technical Approach, but their percentile scores differed based upon underlying differences in their respective evaluations. Offeror C fared slightly better on the scoring of its Management Approach because it earned a strength under that subfactor and KBA did not. KBA fared better under Technical Approach because Offeror C was assigned two significant weaknesses and five weaknesses under that subfactor, pulling its percentile score down toward the bottom of the "Fair" range.
These types of rational relationships hold across all of the percentile scores within the given adjectival ranges. Proposals evaluated similarly under a given subfactor received similar percentile scores for that subfactor, and where the percentile scores varied, the differences are based on the relative weaknesses and strengths assigned as part of the underlying evaluation.
In sum, while the specific basis for determining the percentile scores is of "less than ideal clarity . . . the agency's path may reasonably be discerned." Motor Vehicle Mfrs. Ass'n, 463 U.S. at 43. The Court is satisfied that the point-score system, which was used as a guide to the competitive range determination, did not "distort" or "artificially inflate" the differences among the offerors' proposals.
KBA mounts various challenges to the contracting officer's substantive treatment of the relative qualities of each offeror's proposal. For example, it claims that the competitive range memorandum "merely count[ed] the Weaknesses assigned to KBA's proposal" and failed to acknowledge or discuss the fact that other proposals with the same or similar weaknesses were included in the competitive range.
But the contracting officer did not merely count weaknesses; she considered the relative importance of the weaknesses assigned to each offeror's proposal and weighed them against their strengths. KBA's proposal was not comparable to those it cites, for the proposals of the other offerors that had similar weaknesses were also assigned strengths and/or significant strengths under the Management and Technical Approach subfactors. KBA's proposal, on the other hand, was assigned only the weaknesses, and no strengths.
The evaluation scheme required NASA to balance each proposal's strengths and weaknesses to determine its adjectival rating.
KBA also alleges that in determining the competitive range, NASA gave short shrift to its high ratings and strong showing under the Small Business Utilization subfactor within the Mission Suitability evaluation. Pl.'s Mem. at 9-12. KBA argues that "[h]ad NASA's competitive range determination actually looked behind the offerors' adjectival ratings under the Small Business Utilization plan subfactor and actually documented a reasoned consideration of the actual evaluation findings, KBA's Small Business Utilization plan clearly would have been deemed to be superior to the other offerors' plans."
But NASA in fact did look behind the offerors' adjectival ratings: as described above, the SEB's evaluation documented the strengths and weaknesses it assigned under the Small Business Utilization subfactor for each offeror's proposal, and those findings were part of the record before the contracting officer.
KBA's challenges to the contracting officer's analysis of the Past Performance factor are similarly unpersuasive. As explained above, all proposals were assigned a "Very High" confidence rating under this factor. KBA contends that the contracting officer should have treated its proposal as superior to the others with respect to past performance. It contends that "a reasonable evaluation of KBA's `exceptional' and `very highly pertinent' past performance on the
The record shows that—contrary to KBA's argument—the agency engaged in a meaningful evaluation of each offeror's past performance under the criteria set forth in the Solicitation.
Finally, KBA asserts that the competitive range determination "does not document a meaningful consideration of price." Pl.'s Mem. at 18 (emphasis omitted). More specifically, KBA posits that the competitive range determination did not "document any sort of comparison of the offerors' prices to the Independent Government Cost Estimate," nor did it "probe the underlying reasons" for the disparity between KBA and Offeror A's prices—$658.3 million and $670.0 million, respectively—and the other three offerors' prices, all of which were below $490 million. Pl.'s Mem. at 18-19 (emphasis removed); AR Tab 38 at 27851 (table showing each offeror's total evaluated price).
KBA's argument that the agency did not document a comparison of the offerors' prices is inconsistent with the administrative record. The underlying SEB presentation explicitly reflects that—consistent with the criteria set forth in the Solicitation—the SEB "perform[ed] a price analysis in accordance with FAR 15.404-1(b)." AR Tab 15 at 13711 (Solicitation); AR Tab 38 at 27850 (SEB presentation).
KBA's argument that NASA should have "probe[d] the underlying reasons" for the price differences between the higher- and lower-priced proposals also lacks merit. The reasons for the price differences would only be relevant if the Solicitation called for a price realism analysis. "Where an award of a fixed-price contract is contemplated, a proposal's price realism is not ordinarily considered, since a fixed-price contract places the risk of loss on the contractor."
Further, as explained above, the contracting officer explicitly noted KBA's high price (coupled with its relatively low Mission Suitability score) in explaining why its proposal was not included in the competitive range. Although Offeror A's proposed price was slightly higher than KBA's price, Offeror A's proposal also contained several strengths (unlike KBA's), and Offeror A achieved ratings of "Good" and "Very Good" for two Mission Suitability subfactors under which KBA achieved a rating of only "Fair." The record therefore shows that price was considered as one of the factors relevant to the determination of which proposals were the most highly rated.
In summary, the record reveals that the contracting officer considered all of the evaluation criteria, including price, and weighed the proposals holistically in determining which to include in the competitive range. She did not rely exclusively on adjectival ratings and numerical scores, but rather used them as points of reference or guides in making comparisons among the offerors. Accordingly, the Court rejects KBA's argument that the record does not reflect that the contracting officer engaged in a meaningful analysis of all relevant factors in making her competitive range determination.
For the reasons discussed above, KBA's motion for judgment on the administrative record is DENIED. The government's and PSP's cross-motions for judgment on the administrative record are GRANTED.