           In the United States Court of Federal Claims
                                          No. 14-394C
                                 (Filed Under Seal: July 25, 2014)
                           (Reissued for Publication: August 11, 2014)*

*************************************
CEDGE SOFTWARE CONSULTANTS, *
LLC,                                   *
                                       *
                 Plaintiff,            *
                                       *
 v.                                    *
                                       *             Bid Protest; Cross-Motions for Judgment
THE UNITED STATES,                     *             on the Administrative Record; Assignment
                                       *             of a Deficiency; Removal From
                 Defendant,            *             Competitive Range; Discussions
                                       *
and                                    *
                                       *
TRIDENT TECHNOLOGIES, LLC,             *
                                       *
                 Defendant-Intervenor. *
*************************************

James Y. Boland, Tysons Corner, VA, for plaintiff.

Lisa L. Donahue, United States Department of Justice, Washington, DC, for defendant.

W. Brad English, Huntsville, AL, for defendant-intervenor.

                                    OPINION AND ORDER

SWEENEY, Judge

       In this bid protest, plaintiff asserts that the procuring agency improperly excluded it from
the competitive range, conducted inadequate discussions, and evaluated the offerors’ proposals
disparately on a key factor. The parties have cross-moved for judgment on the administrative
record. For the reasons set forth below, the court denies plaintiff’s motion and grants the
motions of defendant and defendant-intervenor.



       *
          The court provided the parties with an opportunity to suggest redactions to this ruling,
but in an August 11, 2014 joint status report, they indicated that no redactions were necessary.
                                      I. BACKGROUND

                                         A. Solicitation

        On June 12, 2013, the United States Transportation Command (“USTRANSCOM”)
issued solicitation number HTC711-13-R-D003 for an Enterprise Architecture, Data, and
Engineering contract to meet the information technology engineering needs of three United
States Department of Defense components located at Scott Air Force Base, Illinois:
USTRANSCOM; the United States Air Force Air Mobility Command Directorate of Command,
Control, Communications, and Computer Information Systems; and the Military Surface
Deployment and Distribution Command Communications Directorate (collectively, “the
procuring agencies”).1 AR 294, 613-15. More specifically, USTRANSCOM sought to procure:

       an integrated enterprise architecture from the enterprise level down through the
       solution level and across solution level architectures. Integration is achieved
       through the architecture tool suite, the use of standardized templates and
       guidelines, training, and the architecture review process. The work effort will
       support analytical services required to support and implement . . . operational and
       system requirements solutions. The Contractor must possess a comprehensive
       understanding of the DOD Architecture Framework (DODAF), the Federal
       Enterprise Architecture Framework (FEAF) and the relationships/dependencies
       between architecture models to support assigned projects. The scope of the
       architecture and data management services . . . includes the operational and
       system perspectives of Command and Control (C2), planning, transportation,
       logistics, and business support system domains. Using established strategic vision
       documentation, the Contractor will provide architecture and data management
       support. . . . Systems administration of the tools that house the architecture and
       data artifacts is also required.

Id. at 615.

        The Performance Work Statement (“PWS”) included in the solicitation contains a
description of nine tasks for which the contractor would be responsible: contract management;
enterprise architecture development and maintenance; data management; modernization,
development, support, and security for enterprise architecture tools; enterprise engineering
support; alternate functional area communications and computer systems management duties;
information support plan development; agile development; and prototyping. Id. at 615-48.

        USTRANSCOM intended to award a single indefinite-delivery/indefinite-quantity
contract “to the responsible offeror whose offer conforming to the solicitation [would] be most
advantageous to the Government, price and other factors considered.” Id. at 682, 758. The four
factors to be considered were technical capability, staffing approach, past performance, and price.


       1
        The court derives the facts in the background section from the administrative record
(“AR”) and documents specifically referenced in the administrative record.


Id. Technical capability was significantly more important than the other two nonprice factors,
which were equally important. Id. Combined, the nonprice factors were approximately equally
important as price. Id. Given the weight assigned to each factor, offerors were advised that the
contract could be awarded to “a higher rated, higher priced offeror,” but that USTRANSCOM
would “not pay a price premium that it consider[ed] to be disproportionate to the benefits
associated with the proposed margin of service superiority.” Id. at 682. Offerors were further
advised that USTRANSCOM might “conduct discussions with offerors” and “limit the
competitive range for purposes of efficiency.” Id.

        The technical capability factor had four subfactors; in “descending order of importance,”
they were technical capability, enterprise architecture development, enterprise engineering
support, and staffing approach. Id. Of particular relevance in this protest are the first two
subfactors:

       Subfactor 1: Technical capability–The Offeror shall submit a sound plan for
       accomplishing the requirements of the PWS. The plan should provide a logical
       approach that ensures timely support for all tasks as described in the PWS.

       Subfactor 2: Enterprise Architecture Development (Task 2)–Offerors shall submit
       an integrated model subset addressing the following:

               (a) The Offeror shall develop and submit Department of Defense
               Architecture Framework (DODAF) v2.02 models based on the attached
               use case . . . and applicable reference listed in sub-paragraph (b), below.
               Models required for submission:

                      1. OV-5a, Operational Decomposition Tree
                      2. OV-6c, Event-Trace Description (Developed using BPMN)2


       2
           “The OV-6c provides a time-ordered examination of the Resource Flows as a result of a
particular scenario. . . . Operational Event/Trace Descriptions, sometimes called sequence
diagrams, event scenarios, or timing diagrams, allow the tracing of actions in a scenario or
critical sequence of events.” OV-6c: Event-Trace Description, http://dodcio.defense.gov/
TodayinCIO/DoDArchitectureFramework/dodaf20_ov6c.aspx (last visited July 11, 2014), quoted
in AR 3718. Offerors were to develop their OV-6c models using BPMN, i.e., Business Process
Model and Notation, a standardized method of graphically representing internal business
procedures. Object Management Group, Business Process Model and Notation (BPMN) Version
2.0 1, 21 (2011), http://www.omg.org/spec/BPMN/2.0/ (“BPMN 2.0”).

        The scenario to be depicted in the OV-6c was the process described in the use case
attached to the solicitation. AR 682. “In BPMN, a Process is depicted as a graph of Flow
Elements, which are a set of Activities, Events, Gateways, and Sequence Flow that adhere to a
finite execution semantics.” BPMN 
2.0, supra, at 502
. A sequence flow is “[a] connecting

                       3. AV-2, Integrated Dictionary

               (b) Reference Materials. The following reference materials will be used
               by the evaluation team to review submissions under this subfactor:

                       1. DODAF, Version 2.02
                       2. Enterprise Architecture Planning, Developing a Blueprint for
                       Data, Applications, and Technology, Steven H. Spewak, and
                       Steven C. Hill[,] A Wiley–QED publication[,] 1992
                       3. The Practical Guide to Business Process Reengineering Using
                       IDEF0, Clarence Feldmand, Dorset House Publishing, 1998
                       4. BPMN Method & Style, Bruce Silver

Id. at 682-83
(footnote added). The “attached use case” mentioned in the description of the
second subfactor was a hypothetical situation in which a grandmother planned to install an in-
ground swimming pool in her backyard, and included the following information: (1) a
description of the stakeholders and interested parties; (2) a “main success scenario” containing
fifty-one steps; (3) a number of “extensions,” i.e., deviations from the main success scenario; (4)
a list of the technologies required for the project, such as a backhoe and survey equipment; and
(5) a list of data required for the project, such as bids and contracts. 
Id. at 434-38.
        USTRANSCOM described how it would evaluate submitted proposals in section M of
the solicitation. 
Id. at 758-62.
With respect to the two technical capability subfactors at issue in
this protest, section M provided:

       Factor 1–Technical Capability

               Measure of Merit, Subfactor 1: This Measure of Merit is met when the
       offeror has submitted a plan that provides a logical approach that ensures timely
       support for all tasks as described in the PWS. The plan reflects a clear
       understanding of the work.

              Measure of Merit, Subfactor 2: The Measure of Merit for Subfactor 2 is
       met when the offeror submits a fully integrated model subset based on this
       scenario.

               ....



object that shows the order in which activities are performed in a Process . . . .” Id.; accord AR
3907 (containing an excerpt from Silver, BPMN Method & Style). A gateway is “used to control
how the Process flows . . . through Sequence Flows as they converge and diverge within a
Process.” BPMN 
2.0, supra, at 90
.
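
        The BPMN building blocks defined in this footnote (activities, sequence flows, and
gateways) can be pictured concretely. The short Python sketch below is offered purely as an
illustration; it is not drawn from the administrative record or from any offeror’s submission, and
its class names and payment example are hypothetical, loosely echoing the pool-installation use
case.

from dataclasses import dataclass, field

@dataclass
class Activity:
    """A step in a process that performs work."""
    name: str

@dataclass
class EventBasedGateway:
    """Routes the flow according to whichever event occurs first."""
    name: str
    branches: dict = field(default_factory=dict)  # event label -> next Activity

@dataclass
class SequenceFlow:
    """Connects two flow elements, fixing the order in which they are performed."""
    source: object
    target: object

# Hypothetical fragment: the pool company waits for payment before digging.
invoice = Activity("Send invoice")
dig_hole = Activity("Dig hole")
cancel = Activity("Close out project")
wait = EventBasedGateway("Await customer response",
                         branches={"payment received": dig_hole,
                                   "order cancelled": cancel})
flows = [SequenceFlow(invoice, wait)]

def trace(start, event):
    """Follow the sequence flows from a starting activity, letting the gateway pick a branch."""
    node, order = start, [start.name]
    for flow in flows:
        if flow.source is node:
            node = flow.target
            order.append(node.name)
    if isinstance(node, EventBasedGateway):
        node = node.branches[event]
        order.append(node.name)
    return order

print(trace(invoice, "payment received"))
# prints: ['Send invoice', 'Await customer response', 'Dig hole']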

       Color ratings, as documented in the table below, will be used for the
technical capability evaluation. Subfactors 1-4 above will be evaluated separately
and will be given individual color ratings.

  Color         Rating                             Description
 Blue       Outstanding       Proposal meets requirements and indicates an
                              exceptional approach and understanding of the
                              requirements. The proposal contains multiple
                              strengths and no deficiencies.
 Purple     Good              Proposal meets requirements and indicates a
                              thorough approach and understanding of the
                              requirements. Proposal contains at least one
                              strength and no deficiencies.
 Green      Acceptable        Proposal meets requirements and indicates an
                              adequate approach and understanding of the
                              requirements. Proposal has no strengths or
                              deficiencies.
 Yellow     Marginal          Proposal does not clearly meet requirements and
                              has not demonstrated an adequate approach and
                              understanding of the requirements.
 Red        Unacceptable      Proposal does not meet requirements and contains
                              one or more deficiencies and is unawardable.

        Technical proposal risk is the assessment of technical risk, which is
manifested by the identification of weakness(es), [and] considers potential for
disruption of schedule, increased costs, degradation of performance, the need for
increased Government oversight, or the likelihood of unsuccessful contract
performance. Risk will be assessed of the technical proposal and will be assigned
one of the following proposal risk ratings:

   Rating                                 Description
 Low          Has little potential to cause disruption of schedule, increased cost,
              or degradation of performance. Normal contractor effort and
              normal Government monitoring will likely be able to overcome
              any difficulties.




         Moderate     Can potentially cause disruption of schedule, increased cost, or
                      degradation of performance. Special contractor emphasis and
                      close Government monitoring will likely be able to overcome
                      difficulties.
         High         Is likely to cause significant disruption of schedule, increased cost,
                      or degradation of performance. Is unlikely to overcome any
                      difficulties, even with special contractor emphasis and close
                      Government monitoring.

Id. at 758-59.
                               B. Evaluation of Initial Proposals

        USTRANSCOM received five proposals in response to its solicitation, including one
from plaintiff CEdge Software Consultants, LLC (“CEdge”) and another from defendant-
intervenor Trident Technologies, LLC (“Trident”). Id. at 4084. A Source Selection Evaluation
Board (“SSEB”) reviewed the proposals; identified the proposals’ strengths, weaknesses, and
deficiencies; and reached consensus ratings.

        With respect to CEdge’s proposal, the SSEB identified two strengths under the first
technical capability subfactor–one relating to “the use of advanced penetration testing tools” and
the other relating to “the use of the DataFlux tool.” Id. at 4009-10. The SSEB noted no
weaknesses or deficiencies, and ultimately assigned CEdge a purple/good rating for this subfactor.
Id. In contrast, under the second technical capability subfactor, the SSEB identified one
weakness, two significant weaknesses, and four deficiencies with CEdge’s proposal. Id. at 4011-
13. Of particular importance is the SSEB’s evaluation of CEdge’s OV-6c model. The SSEB
identified two deficiencies with this model, noting that the model “was not developed using
BPMN” and “only provided a small subset of the fully dressed use case.” Id. at 4012. Based on
its review and evaluation, the SSEB assigned CEdge a red/unacceptable rating for this subfactor.
Id. at 4013. It also assigned CEdge a moderate risk rating for the technical capability factor
overall. Id. at 4023-24.
        Turning to Trident’s proposal, the SSEB identified three strengths under the first
technical capability subfactor–one pertaining to “the use of certified personnel in critical task
areas,” the second pertaining to a “demonstrated knowledge of design patterns,” and the third
pertaining to “the use of DataFlux.” Id. at 2560-61. The SSEB noted one weakness and no
deficiencies, and ultimately assigned Trident a blue/outstanding rating for this subfactor. Id.
With respect to the second technical capability subfactor, the SSEB identified five
weaknesses, one significant weakness, and two deficiencies with Trident’s proposal. Id. at 2563-
64. Again, of particular importance in this protest is the SSEB’s evaluation of OV-6c models.
The SSEB identified three weaknesses with Trident’s model, noting that the model’s narrative
was inconsistent with the diagram, that the use of an external activity “disrupted the logical flow”
of the model, and that the “use of event-based gateways” was inconsistent. Id. at 2564. The
SSEB also identified one deficiency pertaining to the “limited scoping of the use case scenario.”
Id. Based on its review and evaluation, the SSEB assigned Trident a red/unacceptable rating for
this subfactor. Id. at 2565. It also assigned Trident a moderate risk rating for the technical
capability factor overall. Id. at 2558-59.
                            C. Competitive Range and Discussions

                          1. Initial Competitive Range Determination

        The SSEB briefed the Source Selection Authority (“SSA”) on its findings. Id. at 4047-
66. It recommended that the proposals of three offerors–CEdge, Trident, and a third company–be
placed in the competitive range for discussions. Id. at 4063, 4087. That same day, the SSA
approved a Competitive Range Determination document adopting the SSEB’s recommendation.
Id. at 4084-98. In that document, the SSA described how discussions would be conducted:

       These Offerors [in the competitive range] will be provided their initial evaluation
       results including their individual past performance confidence assessment rating,
       individual technical and staffing approach ratings, and risk ratings. Evaluation
       notices (EN) for each item requiring discussion will be issued. The Contracting
       Officer will address these issues by conducting written (and oral, if necessary)
       discussions and then requesting each Offeror’s written response to the EN along
       with supporting documentation. Subsequent ENs will be issued, if required.

Id. at 4097.
                                      2. Evaluation Notices

        The contracting officer sent letters to each of the offerors in the competitive range, id. at
2683-84, 2689-92, along with an evaluation notice (“EN”) for each weakness and deficiency
identified by the SSEB, id. at 2693-860, 4099-113. Thus, Trident received eight ENs related to
the second technical capability subfactor, four of which related to the weaknesses and deficiency
assigned to its OV-6c model. The first–EN-TRIDENT-05–pertained to the deficiency and
provided:

              Your proposal failed to address the full scope of the use case scenario.
       The limited scoping of the use case scenario (Page 67) resulted in an absence of




       collapsed sub-processes and un-modeled use case extensions.3 In addition, the
       limited scoping of the OV-6c also impacted the activities in the OV-5a and entries
       into the AV-2.

               Please address the full scope of the use case for OV-6c.

Id. at 2853 (footnote added). The remaining ENs for this subfactor related to the assessed
weaknesses. EN-TRIDENT-06 provided:

               Your proposed OV-6c narrative was not consistent with what was modeled
       in the OV-6c diagram. For example, the offeror’s narrative stated ‘the General
       Contractor obtains signed contracts’ which indicates inputs; however, the model
       shows the General Contractor providing outputs for this particular activity.

              Please provide an updated OV-6c narrative and diagram which maintain
       consistency throughout.

Id. at 2854.
EN-TRIDENT-07 provided:

               Your proposed use of the external activity, A99, disrupted the logical flow
       of the OV-6c.

               Please provide an updated OV-6c that demonstrates a clear and logical
       flow.

Id. at 2855.
And, EN-TRIDENT-08 provided:

              Your proposal was inconsistent in the use of event-based gateways
       throughout the OV-6c. For example, the pool company did not proceed to dig the
       hole (A 3.3) until a payment was received which should have been reflected in an
       event-based gateway and not an External Activity A99.

              Please provide an updated OV-6c properly utilizing event-based gateways
       where applicable.

Id. at 2856.
       Similarly, CEdge received seven ENs related to the second technical capability subfactor,


       3
          A subprocess, “one of BPMN’s most important concepts,” is “an activity containing
subparts that can be expressed as a process flow.” AR 3907. It “is simultaneously an activity, a
step in a process that performs work, and a process, a flow of activities from a start event to one
or more end events.” 
Id.

two of which related to the deficiencies of its OV-6c model. The first–EN-CEDGE-04–provided:

               Your proposal failed to provide an OV-6c encompassing the entire fully
       dressed use case included in the solicitation. Your proposed OV-6c only provided
       a small subset of the fully dressed use case.

              Please provide an updated OV-6c that encompasses the entire use case
       provided with the solicitation.

Id. at 4102.
The second–EN-CEDGE-05–provided:

              The offeror failed to use BPMN, as required on solicitation Page 118
       Subfactor 2a, to develop the proposed OV-6c Event Trace Description.

               Please provide an OV-6c using BPMN.

Id. at 4103. CEdge sought clarification regarding EN-CEDGE-05, asking the contracting officer
whether USTRANSCOM was requesting that CEdge use a collaboration diagram, rather than a
choreography diagram, for its OV-6c model. Id. at 4114. The contracting officer responded that
CEdge was required to submit a model that complied with the designated reference materials,
and the relevant reference only addressed collaboration, not choreography, diagrams. Id. at
4103, 4114-15.

                                      3. Revised Proposals

        All three offerors in the competitive range responded to the ENs and submitted revised
proposals. Id. at 2861-3600. The SSEB reviewed the revised proposals; identified the proposals’
remaining strengths, weaknesses, and deficiencies; and updated its consensus ratings.4 Id. at
3619-34, 4343-56. With respect to the first technical capability subfactor, Trident remedied its
one weakness, and both Trident and CEdge retained their earlier strengths and color/adjectival
ratings (i.e., blue/outstanding for Trident and purple/good for CEdge). Id. at 3622-23, 4343-44.

        With respect to the second technical capability subfactor, Trident’s revised OV-6c model
remedied the issues raised in the ENs. Id. at 2853-56. In the absence of any strengths,
weaknesses, or deficiencies, the SSEB assigned Trident a green/acceptable rating for this
subfactor. Id. at 3624-25. In fact, upon reviewing Trident’s revised proposal, the SSEB did not
identify any weaknesses or deficiencies under the technical capability factor, id. at 3619, meaning
that further discussions or proposal revisions were unnecessary, see Federal Acquisition
Regulation (“FAR”) 15.306(d)(3) (requiring discussions regarding significant weaknesses and


       4
          Although the SSEB’s second round of evaluation worksheets are undated, their
contents suggest that they were created after the first revised proposals were submitted.

deficiencies). Moreover, the SSEB upgraded its risk rating of Trident’s technical capability from
moderate to low. AR 3619.

        CEdge’s revised OV-6c model, in contrast, remained unsatisfactory. The SSEB
identified the following deficiency: “The offeror’s proposed OV-6c Event Trace Description was
not developed in[ ]compliance with the BPMN 2.0 as it lacks proper sequence flow and/or sub-
processes.” Id. at 4345. Finding the “severity” of this deficiency to be “significant in nature,” it
did not change CEdge’s red/unacceptable rating for the second technical capability subfactor. Id.
at 4346. Moreover, it downgraded its risk rating of CEdge’s technical capability from moderate
to high. Id. at 4356.
      Due in part to this deficiency, the contracting officer continued her discussions with
CEdge. She issued EN-CEDGE-18, which provided:

               Your revised OV-6c was not in compliance with the BPMN 2.0
       specification as indicated. Regardless of diagram choice (choreography vice
       collaboration) the revised OV-6c lacked proper sequence flow and/or sub-
       processes.

            Please provide an OV-6c collaboration diagram that is in compliance with
       BPMN 2.0 specifications.

Id. at 4228. CEdge responded to this EN and submitted a second revised proposal. Id. at 4228,
4230-96. The contracting officer and the SSEB remained dissatisfied, providing the following
internal assessment of CEdge’s response:

               The offeror’s OV-6c collaboration diagram was not compliant with BPMN
       2.0 specifications. The OV-6c depicted some sequence flow, which was missing
       from the previous model, but is still incomplete. Although messages, also known
       as information exchanges, were identified between pools,5 it lacked data
       exchanges between performers within the pools as required by the [solicitation]
       reference BPMN Method and Style (Silver, page 160, figure 15-1).6 The
       subprocesses identified were linked to the technology, preventing the re-use of


       5
           “A Pool represents a Participant in a Collaboration.” BPMN 2.0, supra note 2, at 502.
       6
          Although data exchanges are depicted on the cited figure, see AR 3914, there is no
indication in the excerpts from the reference included in the administrative record that data
exchanges are required to be depicted on collaboration diagrams. To the contrary, the reference
considers “the flow of process data in BPMN” to be “implicit.” Id. at 3911; accord id. (“I don’t
use data objects very often, but it is a matter of personal style.”); see also BPMN 2.0, supra note
2, at 203 (indicating that data modeling can be accomplished through the use of data objects,
messages, and/or data associations).

       those activities within the architecture. For example, “Configure new irrigation
       equipment” is linked to irrigation, but if it had been “Configure equipment” the
       activity could have been reused in the different lanes by different performers
       (surveyor or landscaper).7 The model also grouped pools by performer and not
       process. For example, one pool was identified as “Contractors” but should have
       been the name of the process, such as “Pool Installation”. The offeror[’]s
       approach conflicts with guidance provided in Silver, page 12, paragraph 4. “You
       sometimes see pools labeled with the name of an organization [performer], but for
       pools that contain activity flows–some don’t, as we will see later–it’s best practice
       to label them with the name of the process.” As a result, the offeror’s OV-6c
       lacked proper sequence of activities among performers. According to the use
       case, the landscaper had to perform some activities then return after pool
       installation to perform more activities. The offeror’s sequencing didn’t have the
       landscaper perform any activities until the pool was installed.

Id. at 4228 (final alteration in the original) (footnotes added).
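
        The critique above turns on BPMN collaboration concepts: pools named for processes
rather than performers, lanes partitioning the activities of individual performers within a pool,
and message flows carrying information exchanges between pools. The short Python sketch
below is offered purely as an illustration and is not drawn from any offeror’s model; every name
in it is hypothetical, merely arranging a toy version of the pool-installation collaboration along
the lines the assessment describes.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Lane:
    """A partition used to organize and categorize activities within a pool."""
    performer: str
    activities: List[str] = field(default_factory=list)

@dataclass
class Pool:
    """A participant in a collaboration; best practice is to name it for the process."""
    process_name: str
    lanes: List[Lane] = field(default_factory=list)

@dataclass
class MessageFlow:
    """An information exchange between two pools."""
    source_pool: str
    target_pool: str
    message: str

# A pool named for the process, with one lane per performer.
pool_installation = Pool(
    "Pool Installation",
    lanes=[
        Lane("Surveyor", ["Survey yard"]),
        Lane("Pool Contractor", ["Dig hole", "Install pool"]),
        Lane("Landscaper", ["Prepare yard", "Restore landscaping after installation"]),
    ],
)
project_management = Pool(
    "Project Management",
    lanes=[Lane("Project Manager", ["Award contracts", "Approve payments"])],
)

# Message flows connect pools, not the performers inside them.
message_flows = [
    MessageFlow("Project Management", "Pool Installation", "Signed contract"),
    MessageFlow("Pool Installation", "Project Management", "Completion notice"),
]

for flow in message_flows:
    print(f"{flow.source_pool} -> {flow.target_pool}: {flow.message}")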

                      4. Removal of CEdge From the Competitive Range

        After receiving and reviewing the second revised proposals submitted by CEdge and the
third offeror in the competitive range, the SSEB briefed the SSA. The SSEB indicated that
Trident had resolved all of its ENs in the first round, id. at 4400; that the proposal of the third
offeror in the competitive range had one weakness, one significant weakness, and four
deficiencies after the second round of ENs, id. at 4396; and that CEdge’s proposal had one
deficiency after the second round of ENs, leading the SSEB to conclude that CEdge had not
“demonstrate[d] the understanding nor the discipline to apply the standardized modeling
methodologies use[d] by [the procuring agencies] to define and document requirements,” id. at
4395. The SSEB ultimately recommended that the proposals of CEdge and the third offeror be
removed from the competitive range, leaving only Trident to compete for the contract. Id. at
4411, 4413.

        The SSA had two concerns with the SSEB’s recommendation. First, the proposals of the
two offerors recommended for exclusion “had good technical ratings in all subfactors” except for
the second, for which both proposals received red/unacceptable ratings, rendering them
“unawardable.” Id. at 4415. Second, these two offerors proposed the “lowest labor rates and
overall prices,” which, given the existing “fiscally constrained environment,” might prevent the
government from obtaining the best value if their proposals were removed from the competition.
Id. Because of these concerns, the SSA sought advice from an outside consultant employed by
the MITRE Corporation. Id. at 4115-16. She provided the consultant with the submission
requirements and evaluation criteria set forth in the solicitation, the four reference materials


       7
       A lane is “[a] partition that is used to organize and categorize activities within a Pool.”
BPMN 2.0, supra note 2, at 501.

pertinent to the second technical capability subfactor identified in the solicitation, the use case,
and the three offerors’ most recent technical proposals. Id. She did not provide the consultant
with the SSEB’s evaluation worksheets or the ENs. Id. at 4115.
        The consultant prepared a memorandum containing an analysis of the models submitted
by the three offerors in the competitive range. Id. at 4416-17. With respect to CEdge’s models,
he provided the following critique:

       A. Notes:
              • Extensions, Technology, and Data List line items–not incorporated
              • Called out Main Success line items in AV-2, but activity-descriptions
                 not aligned well or missing within the OV-6c and OV-5c
                     • For example, line items 20 and 22 were to be done by the
                         respective pool and fence contractors, but these were included
                         in Activity 1.1 which is before the contracts are awarded in
                         OV-6c
                     • Line item 21 was called out in the OV-6c (activity 2.1), but line
                         items 20 and 22 are missing
                     • Some activities involved contractors, but were bundled in
                         project manager which is not correct
                         • For example, can’t approve a bid unless one receives it
                             from a contractor (OV-6c, Activity 1.1, 1.5)
                         • Partial payments included after project is completed which
                             is not the correct event-trace (Activity 1.7) per Government
                             use case
                     ....

       B. Discussion
              • Did the Government ask CEdge why extensions, technology and data
                  list not used? Do they understand how to use/apply this information?
              • Not sure they understood the relationship between Grandma and the
                  contractors
              • Event trace was incorrect/incomplete and did not reflect the use case
                  study requirements

       C. Summary
             • Weak submission
             • Incomplete architecture products and overly simplified
             • Architecture products appear to communicate limited architecture
                knowledge and experience–lots of risk with this use case architecture
                response

Id. at 4416-17. With respect to Trident’s models, the consultant wrote:


       A. Notes:
              • First pass–no errors found
              • Accounted for all Extensions

       B. Discussion
              • AV-2 well written and uses authoritative sources to define activities
              • Excellent attention-to-detail between AV-2, OV-5a, and OV-6c
              • OV-6c had the appropriate level-of-detail, sequence, and was well
                  organized

       C. Summary
             • By far the best architecture submission–bidder understands both the
                “art and science” of producing viable architectures
             • Quality of submission is very professional

Id. at 4417-18. Overall, the consultant concluded that the models submitted by CEdge and the
third offeror “were weaker submissions and demonstrated a lack of understanding of the
fundamental architectural concepts needed.” Id. at 4418. He shared these findings with the SSA.
Id. at 4415, 4418.

        Thereafter, the SSA approved a new Competitive Range Determination document. Id.
at 4433-41. The document contains an analysis of CEdge’s proposal; portions relevant to this
protest are as follows:

       1. C-Edge’s . . . Technical Proposal . . . .

               ....

               b. Subfactor 2 - Enterprise Architecture Development: C-Edge did not
               meet the technical requirements for this Subfactor after two rounds of
               discussions. All weaknesses/significant weaknesses and deficiencies were
               resolved with the exception of EN-CEDGE-05 and subsequent
               EN-CEDGE-18.

                       (1) EN-CEDGE-05 was issued because the OV-6c model was not
                       created using Business Process Model Notation (BPMN) as
                       required by the [solicitation]. Their response to this EN was not
                       adequate and subsequent EN-CEDGE[-]18 was issued. The
                       response submitted for EN-CEDGE-18 did not resolve the issues
                       identified. . . .

                       (2) The purpose of the use case and architectural business process
                       modeling exercise was to demonstrate an offeror’s level of


               expertise, discipline with architecture concepts (consistency and
               attention to detail) and overall understanding of fundamental
               architecture concepts needed to support a robust and evolving
               enterprise. Based on the repeated errors, CEdge did not
               demonstrate the level of expertise, discipline, or understanding to
               apply the standardized modeling methodologies required by [the
               procuring agencies] to define and document requirements. The use
               case provided was a simple project compared to the work that will
               be required by this contract. While the solicitation only required 3
               models, a normal IT architecture can include 30 different models.
               Architecture models provide the backbone for engineering
               decisions and the blue print for information technology solutions.
               Flawed logic and documentation of requirements results in flawed
               engineering of technical solutions which impairs our ability to
               respond and support the warfighter. As mobility and sustainment
               operations are increasingly dependent on technology, flawed
               solutions mean critical cargo will not be delivered, surface and air
               missions will not be executed, and planes will not fly.

       ....

       e. Technical Risk: At the end of discussions CEdge has an unresolved
       deficiency in architecture, which is the foundation for the engineering of
       enterprise systems. As a result, CEdge’s Technical Risk was changed
       from Moderate to High, as its proposed approach is likely to cause
       significant disruption of schedule, increased cost, or degradation of
       performance. CEdge is unlikely to overcome any difficulties, even with
       special contractor emphasis and close Government monitoring.

....

4. . . . . CEdge’s revised [total evaluated price] is $56,198,461.68.

5. C-Edge’s revised proposal is not one of the most highly rated proposals . . . . In
total, C-Edge received seven (7) Strengths, and one (1) Deficiency for their
Technical and Staffing proposals. C-Edge received a Past Performance
Confidence Assessment Rating of Satisfactory and has the lowest [total evaluated
price]; however they are technically unacceptable due to the Deficiencies in
Subfactor 2. While CEdge has more strengths overall than any other offeror, their
inability to produce a fully integrated architecture model for a simple scenario is
fatal. As noted above, sound engineering decisions and IT solutions are
dependent upon the architecture model. The evaluation notice provided CEdge
with information specific enough to allow them to resolve the issues. Considering


       the fundamental, or basic, errors included in the model submitted, further ENs
       would have required the Government to specifically call out each instance where
       CEdge’s model failed to conform to the requirements of the solicitation (e.g., the
       landscaper tasks are not sequenced per the Use Case). To do so would have
       negated the purpose of the use case and architectural business process modeling
       exercise which was to demonstrate an offeror’s level of expertise, discipline with
       architecture concepts (consistency and attention to detail) and overall
       understanding of fundamental architecture concepts needed to support a robust
       and evolving enterprise.

Id. at 4435-37 (emphasis removed). Ultimately, after reviewing the recommendation of the
SSEB and the consultant’s analysis, the SSA approved the removal of CEdge and the third
offeror from the competitive range. Id. at 4415, 4440. She expressed confidence that “the SSEB
had full and meaningful discussions with all Offerors” and that “a competitive range of 1 [was]
in the Government[’]s best interest.” Id. at 4415.
       The following day, the contracting officer notified Trident that discussions had concluded
and invited Trident to submit a final revision to its proposal. Id. at 3684. She also notified
CEdge that discussions had concluded and that because its proposal was “no longer amongst the
most highly qualified,” it was “removed from the competitive range.” Id. at 4443. The
contracting officer summarized CEdge’s ratings in the notification letter, and provided copies of
the SSEB’s evaluation worksheets for CEdge’s review. Id.

                       D. Debriefing, Initial Protest, and Contract Award

       CEdge immediately requested a debriefing from the contracting officer, who responded in
writing to CEdge’s questions:

               Q2. . . . [I]s the deficiency on the OV-6c based on the “proper” sequence
       flow? The deficiency noted continues to be vague, and does not provide the
       specificity needed to provide a proper response. What are the specific items in
       our OV-6c that you find non-compliant?

               Government Answer: The Government disagrees that the deficiency was
       vague. EN-CEDGE-18 specifically advised that your revised OV-6c “lacked
       proper sequence flow”. This is a commonly used term and should have been
       sufficient for you to correct the deficiency. Furthermore, if you felt the EN was so
       vague that you couldn’t provide an adequate answer, then you had the opportunity
       to request clarification as you did during the first round of discussions. No
       clarification was requested and your response to EN-CEDGE-18 indicated your
       proposal had been updated with a revised OV-6c “to address the issue listed in the
       evaluation notice”.



              ....

               Q3. We produced the OV-6c using the SparxEA tool which uses business
       rules to ensure artifacts are produced in compliance with BPMN 2.0 standards.
       Further, our OV-6c was verified as compliant through an independent review by
       the author of the original guidance document for creating OV-6c diagrams using
       BPMN. We are highly confident that our OV-6c is BPMN 2.0 compliant. What
       is the specific reference you used to define “proper” in relation to the sequence
       flow and conclude that our OV-6c is non-compliant?

              Government Answer: The four reference materials used by the evaluation
       team to review the submissions for subfactor 2 are listed in the solicitation,
       Section L-4, Factor l, paragraph (b).

              ....

               Q5. If your assessment is based on a misinterpretation of the use case
       (correct sequence of building a pool activities), then we were never afforded the
       opportunity to address that situation. Is the rating based on the assessment that
       one or more sub-processes is out of sequence? As an example, did we incorrectly
       sequence “Install Fence” before “Install Porch”?

               Government Answer: See question 2 above. The sequence flow
       deficiency was addressed in EN-CEDGE-18.

              ....

             Q9. We would like to understand your basis for the change in risk rating
       from moderate to high[.]

               Government Answer: Based upon the initial proposal review the Source
       Selection Evaluation Board thought the deficiencies were due to a lack of
       understanding of the requirement and could be addressed through discussions.
       However, after two rounds of discussions it became evident that you had not
       sufficiently demonstrated a knowledge of architecture necessary to develop
       architecture artifacts. As a result, your risk rating was changed from moderate to
       high.

Id. at 3717-18. Dissatisfied with these responses, CEdge lodged a protest with the Government
Accountability Office (“GAO”). Id. at 3720-36. The GAO denied CEdge’s protest in an April 1,
2014 decision. Id. at 4476-82. USTRANSCOM ultimately awarded the contract to Trident on
April 10, 2014. Id. at 4508. Trident’s total evaluated price for the contract, which included a
six-month extension of services, was $74,166,938.31, id., approximately $18 million more than
CEdge’s total evaluated price.

                E. Proceedings in the United States Court of Federal Claims

         On May 8, 2014, CEdge filed a protest in this court challenging the removal of its
proposal from the competitive range. In its subsequently filed amended complaint, CEdge
asserts three claims for relief. First, it contends that it was improper for USTRANSCOM to
assign a deficiency for its OV-6c model and then remove its proposal from the competitive range.
Am. Compl. ¶¶ 102-22. Second, it avers that discussions were not meaningful and were
misleading and unequal. Id. ¶¶ 123-51. Third, it alleges that USTRANSCOM’s evaluation of
the first technical capability subfactor was unequal. Id. ¶¶ 152-60. CEdge seeks a declaration
that USTRANSCOM acted improperly under each claim for relief and an injunction directing
USTRANSCOM to “suspend Trident’s contract, reinstate CEdge into the competitive range,
make a rational best value decision on the basis of the final proposals, and if CEdge represents
the best value . . . , cancel the award to Trident and award the contract to CEdge.” Id. at 30.
       Trident intervened and the parties all moved for judgment on the administrative record.
Upon the completion of briefing, the court heard argument and is now prepared to rule.

                                        II. DISCUSSION

        Each party has filed a motion for judgment on the administrative record pursuant to Rule
52.1 of the Rules of the United States Court of Federal Claims (“RCFC”), urging the court to
enter judgment in its favor. In ruling on such motions, “the court asks whether, given all the
disputed and undisputed facts, a party has met its burden of proof based on the evidence in the
record.” A & D Fire Prot., Inc. v. United States, 72 Fed. Cl. 126, 131 (2006) (citing Bannum,
Inc. v. United States, 404 F.3d 1346, 1356 (Fed. Cir. 2005)8). Because the court makes “factual
findings . . . from the record evidence,” judgment on the administrative record “is properly
understood as intending to provide for an expedited trial on the administrative record.” Bannum,
Inc., 404 F.3d at 1356.

                                     A. Standard of Review

        In a bid protest, the Court of Federal Claims reviews the challenged agency action
pursuant to the standards set forth in 5 U.S.C. § 706. 28 U.S.C. § 1491(b)(4) (2012). Although
section 706 contains several standards, “the proper standard to be applied in bid protest cases is
provided by 5 U.S.C. § 706(2)(A): a reviewing court shall set aside the agency action if it is
‘arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law.’”


       8
         The decision in Bannum was based upon then-RCFC 56.1, which was abrogated and
replaced by RCFC 52.1. RCFC 52.1 was designed to incorporate the decision in Bannum. See
RCFC 52.1, Rules Committee Note (June 20, 2006).

Banknote Corp. of Am. v. United States, 365 F.3d 1345, 1350 (Fed. Cir. 2004). Under this
standard, the court “may set aside a procurement action if ‘(1) the procurement official’s decision
lacked a rational basis; or (2) the procurement procedure involved a violation of regulation or
procedure.’” Centech Grp., Inc., 554 F.3d at 1037 (quoting Impresa Construzioni Geom.
Domenico Garufi v. United States, 238 F.3d 1324, 1332 (Fed. Cir. 2001)).

        Procurement officials “are entitled to exercise discretion upon a broad range of issues
confronting them in the procurement process.” Impresa Construzioni Geom. Domenico Garufi,
238 F.3d at 1332 (internal quotation marks omitted). Thus, when a protester challenges the
procuring agency’s decision as irrational, the court’s review is “highly deferential” to the
agency’s decision, Advanced Data Concepts, Inc. v. United States, 216 F.3d 1054, 1058 (Fed.
Cir. 2000), and “[t]he court is not empowered to substitute its judgment for that of the agency,”
Citizens to Preserve Overton Park, Inc. v. Volpe, 401 U.S. 402, 416 (1971). “Accordingly, the
test for reviewing courts is to determine whether the contracting agency provided a coherent and
reasonable explanation of its exercise of discretion, and the disappointed bidder bears a heavy
burden of showing that the award decision had no rational basis.” Impresa Construzioni Geom.
Domenico Garufi, 238 F.3d at 1332-33 (citation and internal quotation marks omitted); accord
Advanced Data Concepts, Inc., 216 F.3d at 1058 (“The arbitrary and capricious standard . . .
requires a reviewing court to sustain an agency action evincing rational reasoning and
consideration of relevant factors.”). When a protester claims that the procuring agency’s decision
violates a statute, regulation, or procedure, it must show that the violation was “clear and
prejudicial.” Impresa Construzioni Geom. Domenico Garufi, 238 F.3d at 1333 (internal
quotation marks omitted).

               B. CEdge’s First Claim for Relief–the Assignment of a Deficiency

        In its first claim for relief, CEdge alleges that USTRANSCOM erred by assigning it a
deficiency for its OV-6c model and subsequently removing it from the competitive range. In
advancing this argument, CEdge relies on the FAR’s definition of deficiency: “[A] material
failure of a proposal to meet a Government requirement or a combination of significant
weaknesses in a proposal that increases the risk of unsuccessful contract performance to an
unacceptable level.”9 FAR 15.001. According to CEdge, the issues identified by
USTRANSCOM with its OV-6c model were not material; therefore, a deficiency assessment was
inappropriate.

        The deficiency that led to the removal of CEdge’s proposal from the competitive range
was communicated to CEdge in EN-CEDGE-18. In that EN, USTRANSCOM requested an OV-
6c collaboration diagram compliant with the BPMN 2.0 specification, explaining that CEdge’s
revised diagram was not BPMN 2.0-compliant and “lacked proper sequence flow and/or sub-
processes.” AR 4228. A straightforward reading of this EN indicates that USTRANSCOM
believed that the activities depicted in CEdge’s revised OV-6c model may have been out of


       9
           “Deficiency” was not defined in the solicitation.

sequence and that CEdge may have omitted one or more subprocesses from the model.
USTRANSCOM’s comment regarding BPMN 2.0 noncompliance and request that CEdge
submit a BPMN 2.0-compliant diagram do not, as CEdge argues, alter this reading. A reasonable
offeror, informed that its model “lacked proper sequence flow and/or sub-processes,” would have
compared its model with the scenario described in the use case to verify that its model reflected
the proper sequence of events and contained all required subprocesses. Moreover, if CEdge had
been confused by USTRANSCOM’s comments, it could have sought clarification, as it did
during the first round of discussions.

        Issues remained with the second revised OV-6c model that CEdge submitted in response
to EN-CEDGE-18. According to USTRANSCOM, CEdge’s model continued to be
noncompliant with the BPMN 2.0 specification and was still incomplete. USTRANSCOM
specifically remarked on the lack of data exchanges, improper linking of subprocesses, and the
grouping of pools by performer rather than by process, which resulted in the improper sequence
of activities by the performers. Indeed, USTRANSCOM’s evaluation was consistent with the
findings of the consultant who independently reviewed CEdge’s models without the benefit of
the SSEB’s evaluation worksheets or the resulting ENs; the consultant noted that CEdge’s OV-6c
model did not contain the extensions set forth in the use case or some of the steps described in
the use case’s main success scenario.

         As set forth in the solicitation, offerors were required to submit three integrated models.
All three models were to be based on the use case and the applicable listed references. And, the
OV-6c model was to be developed using BPMN. CEdge argues that its proposal satisfied these
requirements, contending that there was only one, minor error in its OV-6c model–the
sequencing error with the landscaper. Thus, from CEdge’s point of view, there was no material
failure to meet the solicitation’s requirements; rather, the only issue was whether its error was
significant enough to constitute a weakness.10 Significantly, however, CEdge ignores the other


       10
            CEdge suggests that USTRANSCOM was expressly prohibited from assigning a
deficiency for the second technical capability subfactor so long as an offeror’s proposal contained
an OV-6c model based on BPMN standards that was fully integrated with the other two models
described in the solicitation. The solicitation does not so limit USTRANSCOM. Although
section M of the solicitation indicated that the measure of merit for this subfactor would be met
by the submission of a “fully integrated model subset based on” the use case, AR 758,
USTRANSCOM possessed the discretion to determine what constituted a “fully integrated
model subset” and how much of the use case needed to be represented in the “fully integrated
model subset.” See Forestry Surveys & Data v. United States, 44 Fed. Cl. 493, 499 (1999)
(“[A]gency evaluation personnel are given great discretion in determining the scope of an
evaluation factor.”). In fact, in the first round of discussions, USTRANSCOM made it perfectly
clear that it expected the entirety of the use case to be represented in the models. See AR 2853
(EN-TRIDENT-05), 4102 (EN-CEDGE-04). Moreover, it is perfectly reasonable for
USTRANSCOM to evaluate the accuracy and completeness of the offerors’ models; inherent in
the requirement that these models be submitted is the requirement that these models meet certain

flaws identified by USTRANSCOM and its consultant–e.g., the model’s lack of completeness,
the improper linking of subprocesses, and the grouping of pools by performer rather than process.
It is readily apparent from the evaluations and comments of the SSEB, the contracting officer,
and the SSA that they considered the multiple flaws in CEdge’s second revised OV-6c model to
constitute a failure to satisfy the solicitation’s requirements. And, the fact that USTRANSCOM
assigned a deficiency for CEdge’s OV-6c model means that it deemed CEdge’s failure to satisfy
the solicitation’s requirements to be material. Such a determination was within
USTRANSCOM’s discretion, and the court will not second-guess it. See E.W. Bliss Co. v.
United States, 77 F.3d 445, 449 (Fed. Cir. 1996).

       As CEdge represents, USTRANSCOM removed CEdge’s proposal from the competitive
range as a result of the deficiency and associated red/unacceptable rating that it assigned to
CEdge for CEdge’s flawed OV-6c model. See AR 4437 (noting that CEdge’s proposal was
“technically unacceptable” due to the deficiency in the second technical capability subfactor).
Because USTRANSCOM’s assignment of a deficiency had a rational basis, USTRANSCOM’s
removal of CEdge’s proposal from the competitive range also had a rational basis.11
Accordingly, CEdge cannot prevail on its first claim for relief.

                       C. CEdge’s Second Claim for Relief–Discussions

        CEdge may still succeed on the merits of its protest, however, if it proves its allegation
that USTRANSCOM conducted flawed discussions. Discussions “are undertaken with the intent
of allowing the offeror to revise its proposal,” FAR 15.306(d), with the “primary objective of . . .
maximiz[ing] the Government’s ability to obtain best value,” FAR 15.306(d)(2). The FAR
specifically addresses the scope of discussions:

       At a minimum, the contracting officer must . . . indicate to, or discuss with, each
       offeror still being considered for award, deficiencies, significant weaknesses, and
       adverse past performance information to which the offeror has not yet had an


standards of quality. See Bean Stuyvesant L.L.C. v. United States, 48 Fed. Cl. 303, 321 (2000)
(noting that “a solicitation need not identify each element to be considered by the agency during
the course of the evaluation where such element is intrinsic to the stated factors” (internal
quotation marks omitted)).
       11
            FAR 15.306(d)(5) expressly permits the removal of a proposal from the competitive
range if the offeror “is no longer considered to be among the most highly rated offerors being
considered for award . . . .” See also FAR 15.306(c)(3) (noting that “[i]f the contracting officer
. . . decides that an offeror’s proposal should no longer be included in the competitive range, the
proposal shall be eliminated from consideration for award”). Accordingly, CEdge’s argument
that USTRANSCOM was prohibited from removing its proposal from the competitive range,
first raised in its reply in support of its motion for judgment on the administrative record, lacks
merit.

       opportunity to respond. The contracting officer also is encouraged to discuss
       other aspects of the offeror’s proposal that could, in the opinion of the contracting
       officer, be altered or explained to enhance materially the proposal’s potential for
       award. However, the contracting officer is not required to discuss every area
       where the proposal could be improved. The scope and extent of discussions are a
       matter of contracting officer judgment.

FAR 15.306(d)(3). CEdge specifically contends that the discussions conducted by
USTRANSCOM were not meaningful, and were misleading and unequal.

                               1. Discussions Were Meaningful

        CEdge first argues that discussions were not meaningful because USTRANSCOM failed
to advise it of the specific sequencing error in its OV-6c model. For discussions to be
meaningful, they must “generally lead offerors into the areas of their proposals requiring
amplification or correction, which means that discussions should be as specific as practical
considerations permit.” Advanced Data Concepts, Inc. v. United States, 43 Fed. Cl. 410, 422
(1999) (internal quotation marks omitted), aff’d, 216 F.3d at 1054. Ultimately, however, the
scope of discussions is left to the contracting officer’s discretion. See FAR 15.306(d)(3);
Banknote Corp. of Am. v. United States, 56 Fed. Cl. 377, 384 (2003), aff’d, 365 F.3d at 1345.
Here, the administrative record reflects that USTRANSCOM’s discussions with CEdge were
meaningful.

        USTRANSCOM conducted two rounds of discussions with CEdge. In the first round,
USTRANSCOM advised CEdge that its OV-6c model was not a collaboration diagram, as
required by the relevant reference specified in the solicitation. Because CEdge did not use the
proper type of diagram for its OV-6c model, there would have been no reason or opportunity for
USTRANSCOM to identify more specific concerns at that time. Because USTRANSCOM led
CEdge to the area of its proposal that required correction, this round of discussions was
meaningful.

         In response to USTRANSCOM’s comments in the first round of discussions, CEdge
submitted a new OV-6c model with its revised proposal. USTRANSCOM reviewed the revised
model and advised CEdge that it had identified the following issues: (1) the model did not
comply with the BPMN 2.0 specification and (2) the model “lacked proper sequence flow and/or
sub-processes.” AR 4228. This communication provided CEdge with notice that it should focus
its attention on its model’s BPMN 2.0 compliance, sequence flow, and subprocesses. More
specific guidance was unnecessary under the applicable precedent.12 Indeed, as noted above, a
reasonable offeror receiving this communication would have compared its model with the
scenario described in the use case to verify that its model reflected the proper sequence of events
and contained all required subprocesses. Because USTRANSCOM disclosed to CEdge the
aspects of the model that required correction, this round of discussions was meaningful.

       12
          The fact that USTRANSCOM identified more specific issues with the next revised
OV-6c model submitted by CEdge is, contrary to CEdge’s suggestion, irrelevant. Given the
broad categorization of the issues that USTRANSCOM identified with CEdge’s first revised
OV-6c model, it is reasonable to conclude that more issues existed with the model than could be
practically communicated to CEdge. In such circumstances, USTRANSCOM was under no
obligation to identify every issue with specificity. See Advanced Data Concepts, Inc., 43 Fed.
Cl. at 422 (noting that “discussions should be as specific as practical considerations permit”
(emphasis added)); see also D & S Consultants, Inc. v. United States, 101 Fed. Cl. 23, 40 (2011)
(“[T]he procuring agency is not required to address in express detail all inferior or inadequate
aspects of a proposal.” (internal quotation marks omitted)).

        After this second round of discussions, CEdge submitted another revised proposal that
included a newly revised OV-6c model. USTRANSCOM evaluated CEdge’s second revised
OV-6c model and determined that while some sequence flow issues had been resolved, the
model remained incomplete. These comments were not disclosed to CEdge at the time they were
made, presumably because the contracting officer determined that no further discussions were
warranted, a decision well within her discretion. See Banknote Corp. of Am., 56 Fed. Cl. at 384.
In the absence of any discussions, no meaningfulness inquiry can be performed.

       In sum, CEdge has not established that the discussions conducted by USTRANSCOM
concerning its OV-6c model were not meaningful.

                              2. Discussions Were Not Misleading

        CEdge next argues that discussions were misleading. Discussions are misleading when a
procuring agency issues “incorrect, confusing or ambiguous” communications that misdirect an
offeror attempting to revise its proposal. DMS All-Star Joint Venture v. United States, 90 Fed.
Cl. 653, 670 (2010). Here, CEdge contends that EN-CEDGE-18 was misleading because its
contents implied that USTRANSCOM’s only concern with CEdge’s revised OV-6c model was
one of BPMN compliance. CEdge’s contention lacks merit.

       As noted above, EN-CEDGE-18 provided:

               Your revised OV-6c was not in compliance with the BPMN 2.0
       specification as indicated. Regardless of diagram choice (choreography vice
       collaboration) the revised OV-6c lacked proper sequence flow and/or sub-
       processes.

            Please provide an OV-6c collaboration diagram that is in compliance with
       BPMN 2.0 specifications.




AR 4228. Given the contents of the first paragraph of the EN, the second paragraph is
reasonably read as a request for a BPMN 2.0-compliant collaboration diagram that contains
proper sequence flow and subprocesses. There is nothing incorrect, confusing, or ambiguous
about this communication; USTRANSCOM adequately alerted CEdge to its assessment that the
revised OV-6c model had issues beyond BPMN compliance.

                                   3. Discussions Were Equal

        In addition to contending that discussions were not meaningful and were misleading,
CEdge asserts that discussions were unequal. The FAR prohibits contracting officers from
favoring “one offeror over another” when conducting discussions. FAR 15.306(e)(1).
Consequently, although contracting officers should tailor discussions to each offeror’s proposal,
FAR 15.306(d)(1), they should not “inform some offerors of a concern . . . while staying silent
with respect to identical issues in other offerors’ proposals,” Ashbritt, Inc. v. United States, 87
Fed. Cl. 344, 372 (2009). Here, CEdge alleges that in conducting discussions regarding the
offerors’ OV-6c models, USTRANSCOM provided Trident with much more specific comments
than it provided to CEdge. Although CEdge’s allegation is superficially compelling, a closer
examination reveals its flaws.

        When comparing the discussions that USTRANSCOM conducted with CEdge to those
conducted with Trident, it is important to note that upon evaluating the two offerors’ initial OV-
6c models, USTRANSCOM found them to be of vastly different quality. CEdge submitted a
model that used the incorrect type of diagram, foreclosing USTRANSCOM from being able to
provide more specific comments. In contrast, Trident submitted a model using the correct type
of diagram, which allowed USTRANSCOM to be more specific in its comments. There can be
no dispute that USTRANSCOM properly tailored this first round of discussions to each proposal
and treated the offerors as equally as possible given the disparity in their submissions.

        The parties subsequently submitted revised OV-6c models. USTRANSCOM concluded
that Trident had resolved the issues with its initial model. CEdge’s revised model, however, was
problematic. Although CEdge used the correct type of diagram, USTRANSCOM identified
other issues related to compliance with the BPMN 2.0 specification and to sequence
flow/subprocesses. Given this broad categorization of the issues with CEdge’s revised model, it
is reasonable to conclude that USTRANSCOM identified more issues with the model than it
could practically communicate to CEdge. This conclusion is supported by USTRANSCOM’s
evaluation of CEdge’s second revised OV-6c model, which reflects that the issues
USTRANSCOM had identified with CEdge’s first revised OV-6c model had been reduced in
number. See AR 4228 (noting that CEdge’s second revised OV-6c model included some
sequence flow that had been missing from CEdge’s first revised OV-6c model). In light of the
number of issues it identified with CEdge’s first revised model, USTRANSCOM properly
tailored its discussions with CEdge. Accordingly, the fact that USTRANSCOM was not more
specific in its second round of discussions with CEdge does not render those discussions
unequal.


                  D. CEdge’s Third Claim for Relief–Disparate Evaluations

       CEdge’s final contention is that USTRANSCOM’s evaluation of the first technical
capability subfactor was unequal. However, even if CEdge were able to prove this allegation, it
could not prevail in this protest. The court has concluded that USTRANSCOM did not err in
removing CEdge’s proposal from the competitive range based on the deficiency it assigned for
CEdge’s OV-6c model. This deficiency rendered CEdge’s proposal unacceptable for award, and
any changes in the evaluation of another factor or subfactor could not change this fact.
Consequently, the court need not assess the merits of CEdge’s third claim for relief.

                                      III. CONCLUSION

        For the foregoing reasons, the court concludes that CEdge has not established that
USTRANSCOM acted irrationally in assigning it a deficiency for its OV-6c model, removing its
proposal from the competitive range, or conducting discussions. As a result, CEdge is unable to
succeed on the merits of its protest. The court therefore DENIES CEdge’s motion for judgment
on the administrative record and GRANTS defendant’s and defendant-intervenor’s cross-
motions for judgment on the administrative record. CEdge’s protest is DISMISSED with
prejudice. No costs. The clerk shall enter judgment accordingly.

         The court has filed this ruling under seal. The parties shall confer to determine proposed
redactions agreeable to all parties. Then, by no later than Friday, August 8, 2014, the parties
shall file a joint status report indicating their agreement with the proposed redactions, attaching
a copy of those pages of the court’s ruling containing proposed redactions, with all
proposed redactions clearly indicated.

       IT IS SO ORDERED.


                                                      s/ Margaret M. Sweeney
                                                      MARGARET M. SWEENEY
                                                      Judge



