DOS OF EDEN SPRINGS, LLC, D/B/A EDEN SPRINGS NURSING AND REHABILITATION CENTER vs AGENCY FOR HEALTH CARE ADMINISTRATION, 09-005321 (2009)
Division of Administrative Hearings, Florida Filed: Crawfordville, Florida Sep. 29, 2009 Number: 09-005321 Latest Update: Sep. 25, 2014

Conclusions

THE PARTIES resolved all disputed issues and executed a Settlement Agreement. The parties are directed to comply with the terms of the attached settlement agreement, attached hereto and incorporated herein as Exhibit "1." Based on the foregoing, this file is CLOSED. DONE and ORDERED on this the ___ day of ___, 2014, in Tallahassee, Florida. ELIZABETH DUDEK, SECRETARY, Agency for Health Care Administration.

A PARTY WHO IS ADVERSELY AFFECTED BY THIS FINAL ORDER IS ENTITLED TO A JUDICIAL REVIEW WHICH SHALL BE INSTITUTED BY FILING ONE COPY OF A NOTICE OF APPEAL WITH THE AGENCY CLERK OF AHCA, AND A SECOND COPY, ALONG WITH FILING FEE AS PRESCRIBED BY LAW, WITH THE DISTRICT COURT OF APPEAL IN THE APPELLATE DISTRICT WHERE THE AGENCY MAINTAINS ITS HEADQUARTERS OR WHERE A PARTY RESIDES. REVIEW PROCEEDINGS SHALL BE CONDUCTED IN ACCORDANCE WITH THE FLORIDA APPELLATE RULES. THE NOTICE OF APPEAL MUST BE FILED WITHIN 30 DAYS OF RENDITION OF THE ORDER TO BE REVIEWED.

Copies furnished to: Theodore Mack, Esquire, Powell & Mack, 3700 Bellwood Drive, Tallahassee, FL 32303 (Via U.S. Mail); Bureau of Health Quality Assurance, Agency for Health Care Administration (Interoffice Mail); Bureau of Finance and Accounting, Agency for Health Care Administration (Interoffice Mail); Stuart Williams, General Counsel, Agency for Health Care Administration (Interoffice Mail); Zainab Day, Medicaid Audit Services, Agency for Health Care Administration (Interoffice Mail); Shena Grantham, Chief Medicaid FFS Counsel (Interoffice Mail); Willis F. Melvin, Jr., Esquire, Assistant General Counsel, Agency for Health Care Administration, 2727 Mahan Drive, Building 3, Tallahassee, Florida 32308-5403 (Via Interoffice Mail); State of Florida, Division of Administrative Hearings, The Desoto Building, 1230 Apalachee Parkway, Tallahassee, Florida 32399-3060 (Via U.S. Mail).

CERTIFICATE OF SERVICE: I HEREBY CERTIFY that a true and correct copy of the foregoing has been furnished to the above named addressees by U.S. Mail on this the ___ day of ___, 2014. Richard J. Shoop, Esquire, Agency Clerk, State of Florida, Agency for Health Care Administration, 2727 Mahan Drive, Building #3, Tallahassee, Florida 32308-5403

SOUTH FLORIDA COMMUNITY CARE NETWORK, LLC, D/B/A COMMUNITY CARE PLAN vs AGENCY FOR HEALTH CARE ADMINISTRATION, 18-003513BID (2018)
Division of Administrative Hearings, Florida Filed: Tallahassee, Florida Jul. 09, 2018 Number: 18-003513BID Latest Update: Jan. 25, 2019

The Issue

1. Does Petitioner, AHF MCO of Florida, Inc., d/b/a PHC Florida HIV/AIDS Specialty Plan (Positive), have standing to contest the intended award to Simply for Regions 10 and 11 or to seek rejection of all proposals? (Case Nos. 18-3507 and 18-3508)
2. Should the intended decision of Respondent, Agency for Health Care Administration (Agency), to contract with Simply Healthcare Plans, Inc. (Simply), for Medicaid managed care plans for HIV/AIDS patients in Region 10 (Broward County) and Region 11 (Miami-Dade and Collier Counties) be invalidated and all proposals rejected? (Case Nos. 18-3507 and 18-3508)
3. Must the Agency negotiate with Petitioner, South Florida Community Care Network, LLC, d/b/a Community Care Plan (Community), about a plan to provide HIV/AIDS Medicaid managed care services in Region 10 because it was the only responsive proposer of services that was a Provider Service Network (PSN)? (Case No. 18-3512)
4. Must the Agency negotiate with Community to provide Medicaid managed care services in Region 10 for people with Serious Mental Illnesses because Community is a PSN? (Case No. 18-3511)
5. Must the Agency contract with Community to provide Medicaid managed care services for Children with Special Needs in Region 10 because Community is a PSN? (Case No. 18-3513)
6. Must the Agency negotiate with Community to provide Medicaid managed care services for Child Welfare patients in Region 10 because Community is a PSN? (Case No. 18-3514)

Findings Of Fact

THE PARTIES

Agency: Section 20.42, Florida Statutes, establishes the Agency as Florida's chief health policy and planning agency. The Agency is the single state agency authorized to select eligible plans to participate in the Medicaid program.

Positive: Positive is a Florida not-for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Positive serves about 2,000 patients in Florida. Positive's health plan is accredited by the Accreditation Association for Ambulatory Healthcare. Its disease management program is accredited by the National Committee for Quality Assurance. Currently, the Agency contracts with Positive for a SMMC HIV/AIDS Specialty Plan serving Regions 10 and 11.

Simply: Simply is a Florida for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Currently, the Agency contracts with Simply to provide a SMMC HIV/AIDS Specialty Plan for Regions 1 through 3 and 5 through 11. Simply has maintained the largest patient enrollment of all HIV/AIDS plans in Florida since Florida started its statewide Medicaid managed care program.

Community Care: Community is a Florida limited liability company. It is a PSN as defined in sections 409.912(1)(b) and 409.962(14), Florida Statutes.

Staywell: Staywell is the fictitious name for WellCare of Florida, Inc., serving Florida's Medicaid population.

Sunshine: Sunshine State Health Plan (Sunshine) is a Florida corporation. It offers managed care plans to Florida Medicaid recipients.

THE INVITATION TO NEGOTIATE TIMELINE

On July 14, 2017, the Agency released 11 ITNs soliciting plans for Florida's Medicaid managed care program in 11 statutorily defined regions. Region 10, Broward County, and Region 11, Miami-Dade and Collier Counties, are the regions relevant to this proceeding. Part IV of chapter 409 creates a statewide, integrated managed care program for Medicaid services. This program, called Statewide Medicaid Managed Care, includes two programs: Managed Medical Assistance and Long-term Care. Section 409.966(2) directs the Agency to conduct separate and simultaneous procurements to select eligible plans for each region using the ITN procurement process created by section 287.057(1)(c). The ITNs released July 14, 2017, fulfilled that command. The Agency issued 11 identical ITNs of 624 pages, one for each region, in omnibus form. They provided elements for four types of plans. Some elements were common to all types. Others were restricted to a specific plan type defined by intended patient population. The plan types are comprehensive plans, long-term care plus plans, managed medical assistance plans, and specialty plans. Section 409.962(16) defines "Specialty Plan" as a "managed care plan that serves Medicaid recipients who meet specified criteria based on age, medical condition, or diagnosis." Responding vendors identified the plan type or types that they were proposing. The Agency issued Addendum No. 1 to the ITNs on September 14, 2017. On October 2, 2017, the Agency issued Addendum No. 2 to the ITNs. Addendum 2 included 628 questions about the ITNs and the Agency's responses to the questions. Florida law permits potential responders to an ITN to challenge the specifications of an ITN, including the addendums. § 120.57(3)(b), Fla. Stat. Nobody challenged the specifications of the ITNs.
As contemplated by section 287.057(c)(2), the Agency conducted "a conference or written question and answer period for purposes of assuring the vendors' full understanding of the solicitation requirements." Positive, Community, and Simply, along with United Healthcare of Florida, Inc., HIV/AIDS Specialty Plan (United), submitted responses to the ITN in Region 10 proposing HIV/AIDS Specialty Plans. Community was the only PSN to propose an HIV/AIDS plan for Region 10. Positive, Simply, and United submitted replies to the ITN for Region 11, proposing HIV/AIDS Specialty Plans. Community, United, Staywell, and one other provider submitted proposals to provide Serious Mental Illness (SMI) Specialty Plan services in Region 10. Community was the only responding PSN. Community, Sunshine, and Staywell submitted proposals to provide Child Welfare Specialty Plans (CW) in Region 10. Community was the only PSN. Community, Staywell, and two others submitted proposals to offer Specialty Plans for Children with Special Needs (CSN) in Region 10. Community was one of two responding PSNs.

Proposal scoring began November 6, 2017, and ended January 16, 2018. The Agency announced its intended awards on April 24, 2018. On April 24, 2018, the Agency issued its notices of intent to award specialty contracts in Regions 10 and 11. The following charts summarize the Agency's ranking of the proposals and its intended awards. The two highest-ranked plans in each chart, which the Agency selected for negotiations, are those ranked 1 and 2.

Region 10 – Children with Special Needs
Respondent | Intended Award | Ranking
Staywell | No | 1
Community | No | 2
Miami Children's Health Plan, LLC | No | 3
Our Children PSN of Florida, LLC | No | 4

Region 10 – Child Welfare
Respondent | Intended Award | Ranking
Staywell | No | 1
Sunshine | Yes | 2
Molina Healthcare of Florida, Inc. | No | 3
Community | No | 4

Region 10 – HIV/AIDS
Respondent | Intended Award | Ranking
Simply | Yes | 1
United | No | 2
Community | No | 3
Positive | No | 4

Region 10 – Serious Mental Illness
Respondent | Intended Award | Ranking
Staywell | Yes | 1
United | No | 2
Florida MHS, Inc. | No | 3
Community | No | 4

Region 11 – HIV/AIDS
Respondent | Intended Award | Ranking
Simply | Yes | 1
United | No | 2
Positive | No | 3

All of the Specialty Plan awards noticed by the Agency went to bidders who also proposed, and received, comprehensive plan awards. The protests, referrals, and proceedings before the Division summarized in the Preliminary Statement followed the Agency's announcement of its intended awards.

TERMS

The voluminous ITN consisted of a two-page transmittal letter and three Attachments (A, B, and C), with a total of 34 exhibits to them. They are: Attachment A, Exhibits A-1 through A-8; Attachment B, Exhibits B-1 through B-3; and Attachment C, Exhibits C-1 through C-8. The ITN establishes a two-step selection process: an evaluation phase and a negotiation phase. In the evaluation phase, each respondent was required to submit a proposal responding to criteria of the ITN. Proposals were to be evaluated, scored, and ranked. The goal of the evaluation phase was to determine which respondents would move to negotiations, not which would be awarded a contract. The top two ranking Specialty Plans per specialty population would be invited to negotiations. In the negotiation phase, the Agency would negotiate with each invited respondent. After that, the Agency would announce its intended award of a contract to the plan or plans that the Agency determined would provide the best value.
Together, the attachments and exhibits combined instructions, criteria, forms, certifications, and data into a "one size fits all" document that described the information required for four categories of managed care plans to serve Medicaid patients. The ITN also provided data to consider in preparing responses. The transmittal letter emphasized, "Your response must comply fully with the instructions that stipulate what is to be included in the response." The ITNs identified Jennifer Barrett as the procurement officer and sole point of contact with the Agency for vendors. The transmittal letter is reproduced here.

This solicitation is being issued by the State of Florida, Agency for Health Care Administration, hereinafter referred to as "AHCA" or "Agency", to select a vendor to provide Statewide Medicaid Managed Care Program services. The solicitation package consists of this transmittal letter and the following attachments and exhibits:

Attachment A Instructions and Special Conditions
Exhibit A-1 Questions Template
Exhibit A-2-a Qualification of Plan Eligibility
Exhibit A-2-b Provider Service Network Certification of Ownership and Controlling Interest
Exhibit A-2-c Additional Required Certifications and Statements
Exhibit A-3-a Milliman Organizational Conflict of Interest Mitigation Plan
Exhibit A-3-b Milliman Employee Organizational Conflict of Interest Affidavit
Exhibit A-4 Submission Requirements and Evaluation Criteria Instructions
Exhibit A-4-a General Submission Requirements and Evaluation Criteria
Exhibit A-4-a-1 SRC# 6 - General Performance Measurement Tool
Exhibit A-4-a-2 SRC# 9 - Expanded Benefits Tool (Regional)
Exhibit A-4-a-3 SRC# 10 - Additional Expanded Benefits Template (Regional)
Exhibit A-4-a-4 SRC# 14 - Standard CAHPS Measurement Tool
Exhibit A-4-b MMA Submission Requirements and Evaluation Criteria
Exhibit A-4-b-1 MMA SRC# 6 - Provider Network Agreements/Contracts (Regional)
Exhibit A-4-b-2 MMA SRC# 14 - MMA Performance Measurement Tool
Exhibit A-4-b-3 MMA SRC# 21 - Provider Network Agreements/Contracts Statewide Essential Providers
Exhibit A-4-c LTC Submission Requirements and Evaluation Criteria
Exhibit A-4-c-1 LTC SRC# 4 - Provider Network Agreements/Contracts (Regional)
Exhibit A-4-d Specialty Submission Requirements and Evaluation Criteria
Exhibit A-5 Summary of Respondent Commitments
Exhibit A-6 Summary of Managed Care Savings
Exhibit A-7 Certification of Drug-Free Workplace Program
Exhibit A-8 Standard Contract
Attachment B Scope of Service - Core Provisions
Exhibit B-1 Managed Medical Assistance (MMA) Program
Exhibit B-2 Long-Term Care (LTC) Program
Exhibit B-3 Specialty Plan
Attachment C Cost Proposal Instructions and Rate Methodology Narrative
Exhibit C-1 Capitated Plan Cost Proposal Template
Exhibit C-2 FFS PSN Cost Proposal Template
Exhibit C-3 Preliminary Managed Medical Assistance (MMA) Program Rate Cell Factors
Exhibit C-4 Managed Medical Assistance (MMA) Program Expanded Benefit Adjustment Factors
Exhibit C-5 Managed Medical Assistance (MMA) Program IBNR Adjustment Factors
Exhibit C-6 Managed Medical Assistance (MMA) Program Historical Capitated Plan Provider Contracting Levels During SFY 15/16 Time Period
Exhibit C-7 Statewide Medicaid Managed Care Data Book
Exhibit C-8 Statewide Medicaid Managed Care Data Book Questions and Answers

Your response must comply fully with the instructions that stipulate what is to be included in the response.
Respondents submitting a response to this solicitation shall identify the solicitation number, date and time of opening on the envelope transmitting their response. This information is used only to put the Agency mailroom on notice that the package received is a response to an Agency solicitation and therefore should not be opened, but delivered directly to the Procurement Officer. The ITN describes the plans as follows: Comprehensive Long-term Care Plan (herein referred to as a “Comprehensive Plan”) – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients. Long-term Care Plus Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients enrolled in the Long-term Care program. This plan type is not eligible to provide services to recipients who are only eligible for MMA services. Managed Medical Assistance (MMA) Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients. This plan type is not eligible to provide services to recipients who are eligible for Long-term Care services. Specialty Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients who are defined as a specialty population in the resulting Contract. Specialty Plans are at issue. The ITN did not define, describe, or specify specialty populations to be served. It left that to the responding vendors. Beyond that, the ITN left the ultimate definition of the specialty population for negotiation, saying in Section II(B)(1)(a) of Attachment B, Exhibit B-3, “[t]he Agency shall identify the specialty population eligible for enrollment in the Specialty Plan based on eligibility criteria based upon negotiations.” Some respondents directly identified the specialty population. Simply’s transmittal letter stated that it proposed “a Specialty plan for individuals with HIV/AIDS.” Positive’s response to Exhibit A-4-d Specialty SRC 4, eligibility and enrollment, stated, “the specialty population for the PHC [Positive] plan will be Medicaid eligible, male and female individuals from all age groups who are HIV positive with or without symptoms and those individuals who have progressed in their HIV disease to meet the CDC definition of AIDS.” Some others left definition of the specialty population to be inferred from the ITN response. The result is that the ITN left definition of the specialty populations initially to the respondents and ultimately to negotiations between the Agency and successful respondents. Petitioners and Intervenors describe the populations that they propose serving as HIV/AIDS patients, patients with SMI, CSN, and child welfare populations. ITN respondents could have proposed serving only cancer patients, serving only obstetric patients, or serving only patients with hemophilia. The part of the ITN requiring a respondent to identify the plan type for which it was responding offered only four alternative blocks to check. 
They were: "Comprehensive Plan," "Long-Term Care Plus Plan," "Managed Medical Assistance Plan," or "Specialty Plan."

Attachment A to the ITN, labeled "Instructions and Special Conditions," provides an overview of the solicitation process; instructions for response preparation and content; information regarding response submission requirements; information regarding response evaluation, negotiations, and contract awards; and information regarding contract implementation. Exhibits A-1 to A-3 and A-5 to A-7 of the ITN contain various certifications and attestations that respondents had to prepare and verify. Exhibit A-4 contains submission requirement components (SRCs) to which respondents had to prepare written responses. Exhibit A-8 contains the state's standard SMMC contract. ITN Exhibit A-4-a contains 36 general submission requirements and evaluation criteria (General SRCs). ITN Exhibit A-4-b contains 21 MMA submission requirements and evaluation criteria (MMA SRCs). ITN Exhibit A-4-c contains 13 LTC submission requirements and evaluation criteria (LTC SRCs). ITN Exhibit A-4-d contains five specialty submission requirements and evaluation criteria (Specialty SRCs).

The responses that the 36 SRCs require vary greatly. Some are as simple as providing documents or listing items. Others require completing tables or spreadsheets with data. Consequently, responses to some SRCs apparently could be reviewed in very little time, even a minute or less. Others requiring narrative responses might take longer. Examples follow. General SRC 1 required a list of the respondent's contracts for managed care services and 12 information items about them, including things such as whether they were capitated, a narrative describing the scope of work, the number of enrollees, and accomplishments and achievements. General SRC 2 asked for documentation of experience operating a Medicaid health plan in Florida. General SRC 3 asked for information confirming the location of facilities and employees in Florida. General SRC 12 requested a flowchart and written description of how the respondent would execute its grievance and appeal system. It listed six evaluation criteria. MMA SRC 2 asks for a description of the respondent's organizational commitment to quality improvement "as it relates to pregnancy and birth outcomes." It lists seven evaluation criteria. MMA SRC 10 asks for a description of the respondent's plan for transition of care between service settings. It lists six evaluation criteria, including the respondent's process for collaboration with providers. Specialty SRC 1 asks for detailed information about the respondent's managed care experience with the specialty population. Specialty SRC 5 asks for detailed information about the respondent's provider network standards and provides five evaluation criteria for evaluating the answers.

Exhibit A-8 of the ITN contains the standard SMMC contract. Attachment B and Exhibits B-1 to B-3 of the ITN contain information about the scope of service and core provisions for plans under the SMMC program. Attachment C and Exhibits C-1 to C-8 of the ITN contain information related to the cost proposals and rate methodologies for plans under the SMMC program. The ITN permitted potential respondents to submit written questions about the solicitation to the Agency by August 14, 2017. Some did. On September 14, 2017, the Agency issued Addendum No. 1 to the ITN.
Among other things, Addendum No. 1 changed the anticipated date for the Agency's responses to respondents' written questions from September 15 to October 2, 2017. The Agency issued Addendum No. 2 to the ITN on October 2, 2017. Addendum No. 2 included a chart with 628 written questions from potential respondents and the Agency's answers. Attachment A at A 10-(d) makes it clear that the answers are part of the addendum. Both Addendums to the ITN cautioned that any protest of the terms, conditions, or specifications of the Addendums to the ITN had to be filed with the Agency within 72 hours of their posting. No respondent protested.

Instructions for the A-4 Exhibits included these requirements: Each SRC contains form fields. Population of the form fields with text will allow the form field to expand and cross pages. There is no character limit. All SRCs marked as "(Statewide)" must be identical for each region in which the respondent submits a reply. For timeliness of response evaluation, the Agency will evaluate each "(Statewide)" SRC once and transfer the score to each applicable region's evaluation score sheet(s). The SRCs marked as "(Regional)" will be specific and only apply to the region identified in the solicitation and the evaluation score will not be transferred to any other region.

The instructions continue: Agency evaluators will be instructed to evaluate the responses based on the narrative contained in the SRC form fields and the associated attachment(s), if applicable. Each response will be independently evaluated and awarded points based on the criteria and points scale using the Standard Evaluation Criteria Scale below unless otherwise identified in each SRC contained within Exhibit A-4. This is the scale:

STANDARD EVALUATION CRITERIA SCALE
Point Score | Evaluation
0 | The component was not addressed.
1 | The component contained significant deficiencies.
2 | The component is below average.
3 | The component is average.
4 | The component is above average.
5 | The component is excellent.

The ITN further explained that different SRCs would be worth different "weights," based on the subject matter of the SRC and on whether they were General, MMA, LTC, or Specialty SRCs. It assigned weights by establishing different "weight factors" applied as multipliers to the score a respondent received on a criterion. For example, "Respondent Background/Experience" could generate a raw score of 90. Application of a weight factor of three made 270 the maximum possible score for this criterion. "Oversight and Accountability" could generate a raw score of 275. A weight factor of one, however, made the maximum score available 275.

General SRC 6 solicits HEDIS data. HEDIS is a tool that consists of 92 measures across six domains of care that make it possible to compare the performance of health plans on an "apples-to-apples" basis. SRC 6 states: The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include, in table format, the target population (TANF, ABD, dual eligible), the respondent's results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/ HEDIS 2016 and CY 2016/ HEDIS 2017) for the respondent's three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent's largest Contracts.
If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. The respondent shall provide the data requested in Exhibit A-4-a-1, General Performance Measurement Tool[.] x x x Score: This section is worth a maximum of 160 raw points x x x For each of the measure rates, a total of 10 points is available per state reported (for a total of 360 points available). The respondent will be awarded 2 points if their reported plan rate exceeded the national Medicaid mean and 2 points if their reported plan rate exceeded the applicable regional Medicaid mean, for each available year, for each available state. The respondent will be awarded an additional 2 points for each measure rate where the second year's rate is an improvement over the first year's rate, for each available state. An aggregate score will be calculated and respondents will receive a final score of 0 through 150 corresponding to the number and percentage of points received out of the total available points. For example, if a respondent receives 100% of the available 360 points, the final score will be 150 points (100%). If a respondent receives 324 (90%) of the available 360 points, the final score will be 135 points (90%). If a respondent receives 36 (10%) of the available 360 points, the final score will be 15 points (10%).

The SRC is plainly referring to the broad Medicaid-eligible population when it says "the target population (TANF, ABD, dual eligible)." "Dual eligible" populations are persons eligible for Medicaid and Medicare. There, as throughout the ITN, the solicitation delineates between a target population of all Medicaid-eligible patients and a specialty population as described in a respondent's ITN proposal. The clear instructions for SRC 6 require, "Use the drop-down box to select the state for which you are reporting and enter the performance measure rates (to the hundredths place, or XX.XX) for that state's Medicaid population for the appropriate calendar year." Community did not comply.

General SRC 14 solicits similar data, in similar form using a similar tool, about a respondent's Consumer Assessment of Healthcare Providers and Systems (CAHPS). CAHPS data is basically a satisfaction survey. It asks respondents to provide "in table format the target population (TANF, ABD, dual eligible) and the respondent's results for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) items/composites specified below for the 2017 survey for its adult and child populations for the respondent's three (3) largest Medicaid Contracts (as measured by number of enrollees)." Just like General SRC 6 did with HEDIS data, General SRC 14 instructed bidders to put their CAHPS data for the "target population (TANF, ABD, dual eligible)" "for the respondent's three (3) largest Medicaid Contracts (measured by number of enrollees)" for multiple states into an Excel spreadsheet "to the hundredths place[.]" Also, like General SRC 6, General SRC 14 includes an objective formula described in the ITN for scoring bidders' CAHPS data.

RANKING PROVISIONS

Attachment A at (D)(4)(c)(2) stated: Each response will be individually scored by at least three (3) evaluators, who collectively have experience and knowledge in the program areas and service requirements for which contractual services are sought by this solicitation. The Agency reserves the right to have specific sections of the response evaluated by less than three (3) individuals.
The ITN's example of how total point scores would be calculated, discussed below, also indicated that some sections may be scored by less than three evaluators. The explanatory chart had a column for "[o]ther Sections evaluated by less than three (3) evaluators." The Agency's policy, however, has been to assign at least three evaluators to score program-specific SRCs. Attachment A at (D)(4)(e)(2) advised respondents how the Agency will rank the competing responses. It was clear and specific, even providing an example of the process showing how the scores "will" be calculated. Step one of the explanatory chart stated that the Agency would calculate a total point score for each response. Step two stated that "[t]he total point scores will be used to rank the responses by an evaluator. . . ." Next, the rankings by the evaluator are averaged to determine the average rank for each respondent. This average ranking is critical because ranking is how the ITN said the Agency would select respondents for negotiation and how the Agency did select respondents for negotiation. The step two and step three charts, reproduced below, demonstrate that the ITN contemplated an evaluation process in which each response was to be evaluated in its entirety by three different evaluators, or maybe less than three, but indisputably in its entirety by those who evaluated it. This did not happen.

Step 2: The total point scores will be used to rank the responses by evaluator (Response with the highest number of points = 1, second highest = 2, etc.).

POINTS SUMMARY
Respondent | Evaluator A | Evaluator B | Evaluator C | Evaluator D
Respondent 1 | 446 | 396 | 311 | 413
Respondent 2 | 425 | 390 | 443 | 449
Respondent 3 | 397 | 419 | 389 | 435
Respondent 4 | 410 | 388 | 459 | 325

RANKING SUMMARY
Respondent | Evaluator A | Evaluator B | Evaluator C | Evaluator D
Respondent 1 | 1 | 2 | 4 | 3
Respondent 2 | 2 | 3 | 2 | 1
Respondent 3 | 4 | 1 | 3 | 2
Respondent 4 | 3 | 4 | 1 | 4

Step 3: An average rank will be calculated for each response for all the evaluators.
Respondent 1: (1+2+4+3) ÷ 4 = 10 ÷ 4 = 2.5
Respondent 2: (2+3+2+1) ÷ 4 = 8 ÷ 4 = 2.0
Respondent 3: (4+1+3+2) ÷ 4 = 10 ÷ 4 = 2.5
Respondent 4: (3+4+1+4) ÷ 4 = 12 ÷ 4 = 3.0

PROVIDER SERVICE NETWORK PROVISIONS

Florida law permits a PSN to limit services provided to a target population "based on age, chronic disease state, or medical condition of the enrollee." This allows a PSN to offer a specialty plan. For each region, the eligible plan requirements of section 409.974(1) state, "At least one plan must be a provider service network if any provider service networks submit a responsive bid." Section 409.974(3) says: "Participation by specialty plans shall be subject to the procurement requirements of this section. The aggregate enrollment of all specialty plans in a region may not exceed 10 percent of the total enrollees of that region." The ITN addressed those requirements. The Negotiation Process section of Attachment A, Instructions and Special Conditions, says: The Agency intends to invite the following number of respondents to negotiation:
Comprehensive Plans – The top four (4) ranking Comprehensive Plans.
Long-term Care Plus Plans – The top two (2) ranking Long-term Care Plus Plans.
Managed Medical Assistance Plans – The top two (2) ranking Managed Medical Assistance Plans.
Specialty Managed Medical Assistance Plans – The top two (2) ranking Specialty Managed Medical Assistance Plans per specialty population.
If there are no provider service networks included in the top ranked respondents listed above, the Agency will invite the highest ranked PSN(s) to negotiations in order to fulfill the requirements of Section 409.974(1), Florida Statutes and Section 409.981(1), Florida Statutes. Emphasis supplied.

The ITN specifications in Section D.7, titled Number of Awards, state as follows about Specialty Plan awards:

7. Number of Awards. In accordance with Sections 409.966, 409.974, and 409.981, Florida Statutes, the Agency intends to select a limited number of eligible Managed Care Plans to provide services under the SMMC program in Region 10. The Agency anticipates issuing the number of Contract awards for Region 10 as described in Table 5, SMMC Region, below, excluding awards to Specialty MMA Plans.

Table 5 – SMMC Region
Region | Total Anticipated Contract Awards
Region 10 | 4

If a respondent is awarded a Contract for multiple regions, the Agency will issue one (1) Contract to include all awarded regions. The Agency will award at least one (1) Contract to a PSN provided a PSN submits a responsive reply and negotiates a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. A respondent that is awarded a Contract as a Comprehensive Plan is determined to satisfy the requirements in Section 409.974, Florida Statutes and Section 409.981, Florida Statutes and shall be considered an awardee of an MMA Contract and a LTC Contract. The Agency will issue one (1) Contract to reflect all awarded populations in all awarded regions. In addition to the number of Contracts awarded in this region, additional Contracts may be awarded to Specialty Plans that negotiate terms and conditions determined to be the best value to the State and negotiate a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. The Agency reserves the right to make adjustments to the enrollee eligibility and identification criteria proposed by a Specialty Plan prior to Contract award in order to ensure that the aggregate enrollment of all awarded Specialty Plans in a region will not exceed ten percent (10%) of the total enrollees in that region, in compliance with Section 409.974(3), Florida Statutes. If a respondent is awarded a Contract as a Specialty Plan and another plan type, the Agency will issue one (1) Contract to include all awarded populations in all awarded regions.

A prospective vendor asked about the interplay of Specialty Plan options and the PSN requirements. The question and the answer provided in Addendum 2 follow:

Q. Please clarify the number of PSN awards per region and how PSN awards will be determined based on the PSN's plan type (e.g., Comprehensive, LTC Plus, MMA, Specialty). As you know, Sections 409.974 and 409.981, Florida Statutes require one MMA PSN and one LTC PSN award per region (assuming a PSN is responsive) and the Agency has stated that an award to a Comprehensive Plan PSN will meet the requirements of both statutes. However, can the Agency further clarify whether other types of PSNs would meet the statutory requirements? Specifically, would a PSN LTC Plus award meet the requirements of Section 409.981, Florida Statutes?
Similarly, would an award to a Specialty Plan PSN meet the requirements of Section 409.974, Florida Statutes?

A. See Attachment A Instructions and Special Conditions, Section D Response Evaluations, and Contract Award, Sub-Section 7 Number of Awards. Yes, a PSN LTC Plus award would meet the requirements of Section 409.981(2). A Specialty Plan PSN would not meet the requirements of Section 409.974(1).

The only reasonable interpretation of this answer is that Specialty Plan PSNs do not satisfy the requirement to contract with a responsive PSN imposed by section 409.974. None of the prospective vendors, including Community, challenged this clarification.

EVALUATION PROCESS

THE EVALUATORS

The Agency selected 11 people to evaluate the proposals. The Agency assigned each person a number used to identify who was assigned to which task and to track performance of evaluation tasks. The procurement officer sent the evaluators a brief memo of instructions. It provided dates; described logistics of evaluation; emphasized the importance of independent evaluation; and prohibited communicating about the ITN and the proposals with anyone other than the procurement office. The Agency also conducted an instructional session for evaluators.

Evaluator 1, Marie Donnelly: During the procurement, Ms. Donnelly was the Agency's Chief of the Bureau of Medicaid Quality. She held this position for five years before resigning. This bureau bore responsibility for ensuring that the current SMMC plans met their contract requirements for quality and quality improvement measures. Her role specifically included oversight of Specialty Plans.

Evaluator 2, Erica Floyd Thomas: Ms. Thomas is the chief of the Bureau of Medicaid Policy. She has worked for the Agency since 2001. Her Medicaid experience includes developing policies for hospitals, community behavioral health, residential treatment, and contract oversight. Before serving as bureau chief, she served as an Agency administrator from 2014 through 2017. Ms. Thomas oversaw the policy research and development process for all Medicaid medical, behavioral, dental, facility, and clinic coverage policies to ensure they were consistent with the state Plan and federal Medicaid requirements.

Evaluator 3, Rachel LaCroix, Ph.D.: Dr. LaCroix is an administrator in the Agency's Performance Evaluation and Research Unit. She has worked for the Agency since 2003. All her positions have been in the Medicaid program. Dr. LaCroix has served in her current position since 2011. She works with the performance measures and surveys that the current SMMC providers report to the Agency. Dr. LaCroix is a nationally recognized expert on healthcare quality metrics like HEDIS. She is also an appointee on the National Association of Medicaid Directors' task force for national performance measures.

Evaluator 4, Damon Rich: Mr. Rich has worked for the Agency since April 2009. He is the chief of the Agency's Bureau of Recipient and Provider Assistance. This bureau interacts directly with AHCA's current SMMC care providers about any issues they have, and with Medicaid recipients, usually about their eligibility or plan enrollment. Before Mr. Rich was a bureau chief, he worked as a field office manager for the Agency. Mr. Rich's experience as bureau chief and field office manager includes oversight of the current SMMC Specialty Plans.

Evaluator 5, Eunice Medina:
Ms. Medina is the chief of the Agency's Bureau of Medicaid Plan Management, which includes a staff of over 60 individuals who manage the current SMMC contracts. Her experience and duties essentially encompass all aspects of the current SMMC plans. Ms. Medina started working with the Agency in 2014.

Evaluator 6, Devona "DD" Pickle: Ms. Pickle most recently joined the Agency in 2011. She also worked for the Agency from November 2008 through November 2010. Ms. Pickle's Agency experience all relates in some way to the Medicaid program. Since March 2013, Ms. Pickle has served as an administrator over managed care policy and contract development in the Bureau of Medicaid Policy. Her job duties include working with the current SMMC contractors. Ms. Pickle is also a Florida licensed mental health counselor.

Evaluator 7, Tracy Hurd-Alvarez: Ms. Hurd-Alvarez has worked for the Agency's Medicaid program since 1997. Since 2014, she has been a field office manager, overseeing compliance monitoring for all the current SMMC contractors. Before assuming her current position, Ms. Hurd-Alvarez implemented the LTC SMMC program.

Evaluator 8, Gay Munyon: Ms. Munyon is currently the Chief of the Bureau of Medicaid Fiscal Agent Operations. Ms. Munyon began working with the Agency in April 2013. Ms. Munyon's bureau oversees fulfillment of the Agency's contract with the current SMMC fiscal agent. Her unit's responsibilities include systems maintenance and modifications and overseeing the fiscal agent, which answers phone calls, processes claims, and processes applications. Ms. Munyon has 25 years of experience working with the Medicaid program.

Evaluator 9, Laura Noyes: Ms. Noyes started working for the Agency in April 2011. Her years of Agency experience all relate to the Medicaid program, including overseeing six current comprehensive managed care plans by identifying trends in contractual non-compliance.

Evaluator 10, Brian Meyer: Mr. Meyer is a CPA, who has worked for the Agency in the Medicaid program since 2011. He is currently chief of the Bureau of Medicaid Data Analytics. Mr. Meyer's primary responsibility is overseeing the capitation rates for the current SMMC contractors. His experience includes Medicaid plan financial statement analysis, surplus requirement calculation analysis and, in general, all types of financial analysis necessary to understand financial performance of the state's Medicaid plans.

Evaluator 11, Ann Kaperak: Since April 2015, Ms. Kaperak has served as an administrator in the Agency's Bureau of Medicaid Program Integrity. Ms. Kaperak's unit oversees the fraud and abuse efforts of the current SMMC plans. She also worked for the Medicaid program from November 2012 through May 2014. Ms. Kaperak worked as a regulatory compliance manager for Anthem/Amerigroup's Florida Medicaid program between May 2014 and April 2015.

Positive and Community challenge the Agency's plan selections by questioning the qualifications of the evaluators. The first part of their argument is that the evaluators did not have sufficient knowledge about HIV/AIDS and its treatment. The evidence does not prove the theory. For instance, Positive's argument relies upon criticizing the amount of clinical experience evaluators had managing patients with HIV/AIDS. That approach minimizes the fact that the managed care plan characteristics involve so much more than disease-specific considerations.
For instance, many of the components require determining if the respondent provided required documents, verifying conflict of interest documents, management structure, quality control measures, and the like. General SRCs asked for things like dispute resolution models (SRC 16), claims processing information (SRC 17), and fraud and abuse compliance plans (SRC 31). MMA SRCs included criteria like telemedicine (SRC 4), demonstrated progress obtaining executed provider agreements (SRC 6), and a credentialing process (SRC 12). Specialty SRCs included criteria like copies of contracts for managed care for the proposed specialty population (SRC 1), specific and detailed criteria defining the proposed specialty population (SRC 4), and the like. The evidence does not prove that disease-specific experience is necessary to evaluate responses to these and other SRCs. SRC 6 involving HEDIS data and SRC 14 involving CAHPS data are two good examples. They required respondents to input data into a spreadsheet. All the evaluators had to do was determine what those numbers showed. Evaluation did not require any understanding of disease or how the measures were created. All the evaluator had to know was the number in the spreadsheet.

The second part of the evaluator qualification criticisms is that the evaluators did not give adequate weight to some responses. Positive and Community just disagree with the measures requested and the evaluation of them. They conclude from that disagreement that the evaluators' qualifications were deficient. The argument is not persuasive. The last sentence of paragraph 69 of Positive's proposed recommended order exemplifies the criticisms of Positive and Community of the evaluators' qualifications. It states, "The fact that PHC [Positive] was ranked last among competing HIV plans shows that the SRC evaluators did not understand enough about managing individuals with HIV/AIDS to score its proposal competently." The argument is circular and "ipse dixit". It does not carry the day. The collective knowledge and experience of the evaluators, with a total of 128 years of Medicaid experience, made them capable of reasonably evaluating the managed care plan proposals, including the Specialty Plan proposals. The record certainly does not prove otherwise.

EVALUATION PROCESS

The Agency assigned the evaluators to the SRCs that it determined they were qualified to evaluate and score. The Agency did not assign entire responses to an evaluator for review. Instead, it elected a piecemeal review process, assigning various evaluators to various sections, the SRCs, of each response. Paragraph 30 of the Agency's proposed recommended order describes this decision as follows: Although the ITN had contemplated ranking each vendor by evaluator, based on an example in the ITN, such ranking presumed a process where all evaluators scored all or nearly all of the responses to the ITN, which had occurred in the procurement five years ago. In this procurement, each evaluator reviewed only a subset of SRCs based on their knowledge and experience; therefore, ranking by evaluator was not logical because each had a different maximum point score.

The initial SRC scoring assignments were: General SRCs 1 through 4, LTC SRCs 1 and 2, and Specialty SRC 1: Marie Donnelly, Laura Noyes, and Brian Meyer. General SRCs 5 through 8, MMA SRCs 1 through 7, LTC SRCs 3 and 4, and Specialty SRCs 1 and 2: Marie Donnelly, Erica Floyd-Thomas, and Rachel LaCroix.
General SRCs 9 through 14, MMA SRCs 8 through 11, LTC SRCs 5 through 7, and Specialty SRC 4: Damon Rich, Eunice Medina, and DD Pickle. General SRCs 15 through 17, MMA SRCs 12 and 13, and LTC SRCs 8 through 10: Damon Rich, Tracy Hurd-Alvarez, Gay Munyon. General SRCs 18 through 25, MMA SRCs 14 through 20, LTC SRCs 11 and 12, and Specialty SRC 5: Erica Floyd-Thomas, Eunice Medina, and DD Pickle. General SRCs 26 through 33 and LTC SRC 13: Gay Munyon, Ann Kaperak, and Brian Meyer. General SRCs 34 through 36 and MMA SRC 21: Marie Donnelly, Rachel LaCroix, and Tracy Hurd-Alvarez. The ranking process presented in the ITN and described in paragraphs 62-64, contemplated ranking each respondent by evaluator. The Agency carried this process over from an earlier procurement. In this procurement, despite what the ITN said, the Agency assigned responsibilities so that each evaluator reviewed only a subset of SRCs. Therefore, the ranking of responses by evaluator presented in the ITN could not work. It was not even possible because no one evaluator reviewed a complete response and because each SRC had a different maximum point score. Instead, the Agency, contrary to the terms of the ITN, ranked proposals by averaging the “total point scores” assigned by all of the evaluators. The Agency considered issuing an addendum advising the parties of the change. The addendum would have informed the respondents and provided them an opportunity to challenge the change. The Agency elected not to issue an addendum. EVALUATION AND SCORING The evaluators began scoring on November 6, 2017, with a completion deadline of December 29, 2017. The 11 evaluators had to score approximately 230 separate responses to the ITNs. The evaluators had to score 67,175 separate items to complete the scoring for all responses for all regions for all types of plans. No one at the Agency evaluated how much time it should take to score a particular item. None of the parties to this proceeding offered persuasive evidence to support a finding that scoring any particular item would or should take a specific length of time or that scoring all of the responses would or should take a specific length of time. Evaluators scored the responses in conference room F at the Agency’s headquarters. This secure room was the exclusive location for evaluation and scoring. Each evaluator had a dedicated workspace equipped with all tools and resources necessary for the task. The workspaces included a computer terminal for each evaluator. The system allowed evaluators to review digital copies of the ITN and proposals and to enter evaluation points in spreadsheets created for the purpose of recording scores. Evaluators also had access to hard copies of the proposals and the ITN. The Agency required evaluators to sign in and to sign out. The sign-in and sign-out sheets record the significant amount of time the evaluators spent evaluating proposals. Evaluators were not permitted to communicate with each other about the responses. To minimize distractions, the Agency prohibited cell phones, tablets and other connected devices in the room. The Agency also authorized and encouraged the evaluators to delegate their usual responsibilities. Agency proctors observed the room and evaluators throughout the scoring process. They were available to answer general and procedural questions and to ensure that the evaluators signed in and signed out. A log sheet documented how much time each evaluator spent in the scoring conference room. Some evaluators took extensive notes. 
For example, Ms. Medina took over 200 pages of notes. Similarly, Ms. Munyon took nearly 400 pages of typewritten notes. The evaluators worked hard. None, other than Dr. LaCroix, testified that they did not have enough time to do their job. The computer system also automatically tracked the evaluators' progress. Tracking reports showed the number of items assigned to each evaluator and the number of scoring items completed. The first status report was generated on December 8, 2017, approximately halfway through the scheduled scoring. At that time, only 28 percent of the scoring items were complete. Ms. Barrett usually ran the status reports in the morning. She made them available to the evaluators to review. The pace of evaluation caused concern about timely completion and prompted discussions of ways to accelerate scoring. Because it was clear that the majority of the evaluators would not complete scoring their SRCs by December 29, 2017, the Agency extended the scoring deadline to January 12, 2018. It also extended the hours for conference room use.

Most respondents filed proposals for more than one type of plan and more than one region. This fact, combined with the provision in the instructions saying that all statewide SRC responses must be identical for each region and that scores would transfer to each applicable region's score sheets, enabled evaluators to score many SRCs just once. The system would then auto-populate the scores to the same SRC for all proposals by that respondent. This time-saving measure permitted scoring on many of the items to be almost instantaneous after review of the first response to an SRC.

The fact that so many respondents submitted proposals for so many regions and types of plans provided the Agency another opportunity for time-saving. The Agency loaded Adobe Pro on the evaluators' computers as a timesaving measure. This program allowed the evaluators to compare a bidder's Comprehensive Plan Proposal to the same company's regional and Specialty Plan proposals. If the Adobe Pro comparison feature showed that the proposal response was the same for each plan, the Agency permitted evaluators to score the response once and assign the same score for each item where the respondent provided the same proposal. This sped up scoring. It meant, however, that for SRCs where evaluators did this, they were not reviewing the SRC response in the specific context of the specialty plan population, each of which had specific and limited characteristics that made them different from the broader General and MMA plan populations. This is significant because so many SRCs required narrative responses where context would matter. There is no instruction for the Specialty SRCs in Exhibit A-4 analogous to the requirement that responses for statewide SRCs must be identical for each region. In other words, the instructions do not say all SRCs marked as statewide must be identical for each specialty plan proposal and that the Agency will evaluate each Statewide SRC once and transfer the score to each applicable Specialty Plan score. In fact, according to the procurement officer, the Agency expected that evaluators would separately evaluate and score the statewide SRCs for Comprehensive Plans and for Specialty Plans, even if the same bidder submitted them.
Despite the Agency's expectation and the absence of an authorizing provision in the ITN, many evaluators, relying on the Adobe Pro tool, copied the SRC scores they gave to a respondent's comprehensive plan proposal to its specialty plan proposal if the respondent submitted the same response to an SRC for a Comprehensive Plan and a Specialty Plan. For instance, Ms. Thomas (Evaluator 2) and Ms. Munyon (Evaluator 8) did this to save time. Ms. Donnelly (Evaluator 1) did this even when the comprehensive and specialty responses were not identical. This does not amount to the independent evaluation of the responses pledged by the ITN.

On separate days, Evaluator 1 scored 1,315 items, 954 items, 779 items, and 727 items. On separate days, Evaluator 2 scored 613 items, 606 items, 720 items, 554 items, and 738 items. Evaluator 4 scored 874 items on one day. Evaluator 5 scored 813 items in one day. Evaluator 6 scored 1,001 items in one day. Evaluator 8 scored 635 items in one day. The record does not identify the items scored. It also does not permit determining how many of the item scores resulted from auto-population or assignment of scores based upon previous scoring of an identical response. It bears repeating, however, that the record does not support any finding on how long scoring the response to one SRC or an entire response could reasonably be expected to take.

Even with the extended scoring period and time-saving measures, the Agency concluded that Evaluator 3 would not be able to finish all of the SRCs assigned to her. Rather than extend the deadline for scoring a second time, the Agency decided to reassign the nine SRCs that Evaluator 3 had not begun scoring to two other evaluators. The Agency did not include scores of other SRCs for which Evaluator 3 had not completed scoring. The Agency only counted Evaluator 3's scores for an SRC if she scored the SRC for everyone. The result was that only two people scored nine of the Specialty Plan SRCs. The Agency did not reassign all of Evaluator 3's SRCs. It only reassigned the SRCs to evaluators who were qualified to evaluate the items, who were not already assigned those items to score, and who had already finished or substantially completed their own evaluations. The decision to reassign the SRCs was not based on any scoring that had already been completed.

The Agency did not allow changes to data submitted by any of the vendors. It allowed vendors to exchange corrupted electronic files for ones which could be opened and allowed vendors to exchange electronic files to match up with the paper copies that had been submitted. The Agency allowed Community to correct its submission where it lacked a signature on its transmittal letter and allowed Community to exchange an electronic document that would not open. It did not allow Community to change its reported HEDIS scores, which were submitted in the decimal form required by the instructions. Community erred in the numbers that it reported. There is no evidence showing that other vendors received a competitive or unfair advantage over Community in the Agency's review of the SMI Specialty Plan submission for Region 10. There was no evidence that the Agency allowed any other vendors to change any substantive information in their submittals for that proposed specialty in that region.

HEDIS ISSUES

Positive asserts that Simply's proposal is non-responsive because Simply submitted HEDIS data from the general Medicaid population in response to SRC 6 and MMA SRC 14.
Positive contends that Simply obtained a competitive advantage by supplying non-HIV/AIDS HEDIS data in response to SRC 6 and MMA SRC 14 because HIV/AIDS patients are generally a sicker group and require more care and because some HEDIS measures cannot be reported for an HIV/AIDS population. HEDIS stands for Healthcare Effectiveness Data and Information Set and is a set of standardized performance measures widely used in the healthcare industry. The instructions for both SRC 6 and MMA SRC 14 provide, in relevant part: The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include in table format, the target population (TANF, ABD, dual eligible), the respondent's results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/HEDIS 2016 and CY 2016/HEDIS 2017) for the respondent's three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent's largest Contracts. If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. (JE 1 at 75 (SRC 6); JE 1 at 158 (MMA SRC 14)).

SRC 6 and MMA SRC 14 instruct respondents to provide HEDIS measures for "the target population (TANF, ABD, dual eligible)." Id. TANF, ABD, and dual eligible are eligibility classifications for the Medicaid population. The Agency sought information regarding the target Medicaid-eligible population, even from respondents proposing a Specialty Plan, because Specialty Plans are required to serve all of the healthcare needs of their recipients, not just the needs related to the criteria making those recipients eligible for the Specialty Plan. Following the instructions in SRC 6 and MMA SRC 14, Simply provided HEDIS data from the Medicaid-eligible population for its three largest Medicaid contracts as measured by the total number of enrollees. For the requested Florida HEDIS data, Simply utilized legacy HEDIS data from Amerigroup Florida, Inc., a Comprehensive Plan. Amerigroup and Simply had merged in October of 2017. Therefore, at the time of submission of Simply's proposal, the HEDIS data from Amerigroup Florida was the data from Simply's largest Medicaid contract in Florida for the period requested by the SRCs.

Positive asserts that the Agency impermissibly altered scoring criteria after the proposals were submitted when the Agency corrected technical issues within a HEDIS Measurement Tool spreadsheet. SRC 6 and MMA SRC 14 required the submission of numeric data for the requested HEDIS performance measures. To simplify submission of the numeric data for the requested HEDIS performance measures, the Agency required respondents to utilize a HEDIS Measurement Tool spreadsheet. The evaluation criteria for SRC 6 and MMA SRC 14 provided that respondents will be awarded points if the reported HEDIS measures exceed the national or regional mean for such performance measures. Some respondents, including Positive, entered "N/A," "small denominator," or other text inputs into the HEDIS Measurement Tool.
During the evaluation and scoring process, the Agency discovered that if a respondent input any text into the HEDIS Measurement Tool, the tool would assign random amounts of points, even though respondents had not input measurable, numeric data. The Agency reasonably resolved the problem by removing any text and inserting a zero in place of the text. The correction of the error in the HEDIS Measurement Tool prevented random points from being awarded to respondents and did not alter scores in any way contrary to the ITN. It was reasonable and fair to all respondents.
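The tool correction described in the two preceding findings is, in substance, a data-sanitization step applied before the SRC 6 / MMA SRC 14 point formula is computed. The sketch below is purely illustrative and is not the Agency’s actual HEDIS Measurement Tool; the function names, sample rates, and sample means are hypothetical stand-ins. Assuming those stand-ins, it shows how text entries such as “N/A” or “small denominator” can be coerced to zero so that points are awarded only for measurable rates that exceed the national or regional Medicaid mean or that improve from the first year to the second.

def to_rate(cell):
    """Coerce a reported HEDIS cell to a number; treat text such as
    "N/A" or "small denominator" as zero, mirroring the correction the
    Agency described for the HEDIS Measurement Tool."""
    try:
        return float(cell)
    except (TypeError, ValueError):
        return 0.0

def score_measure(rate_y1, rate_y2, national_mean, regional_mean):
    """Award points for one HEDIS measure reported for one state, restating
    the SRC 6 / MMA SRC 14 criteria: 2 points per year for exceeding the
    national Medicaid mean, 2 points per year for exceeding the regional
    Medicaid mean, and 2 points if the second year improves on the first."""
    y1 = to_rate(rate_y1)
    y2 = to_rate(rate_y2)
    points = 0
    for rate in (y1, y2):
        if rate > national_mean:
            points += 2
        if rate > regional_mean:
            points += 2
    if y2 > y1:
        points += 2
    return points

# A text entry can no longer earn points; a measurable rate above both means
# that also improves year over year earns the full 10 points for that state.
print(score_measure("small denominator", "N/A", 55.00, 52.50))  # 0
print(score_measure(60.11, 61.02, 55.00, 52.50))                # 10

On this approach, a blank or textual cell simply contributes nothing, which matches the finding that the fix prevented random points from being awarded without otherwise altering any scores.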

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order rejecting all responses to the ITNs to provide a Medicaid Managed Care plan for patients with HIV/AIDS in Regions 10 and 11. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for patients with serious mental illness. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for patients with serious mental illness. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for child welfare specialty services. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order awarding Wellcare of Florida, Inc., d/b/a Staywell Health Plan of Florida, a contract for a specialty Medicaid Managed Care plan for patients with Serious Mental Illness in Region 10. Based on the foregoing Findings of Fact and Conclusions of Law it is RECOMMENDED that the Agency for Health Care Administration enter a final order dismissing the Petition in Case No. 18-3513. DONE AND ENTERED this day of , , in Tallahassee, Leon County, Florida. S JOHN D. C. NEWTON, II Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this day of , .

USC (1) 42 U.S.C. 1396u; Florida Laws (9) 120.57, 20.42, 287.057, 409.912, 409.962, 409.966, 409.97, 409.974, 409.981
# 3
AGENCY FOR HEALTH CARE ADMINISTRATION vs NORTH CENTRAL FLORIDA HOSPITAL, INC., D/B/A HAVEN HOSPICE, 09-005554MPI (2009)
Division of Administrative Hearings, Florida Filed:Gainesville, Florida Oct. 13, 2009 Number: 09-005554MPI Latest Update: Feb. 25, 2010

Conclusions THE PARTIES resolved all disputed issues and executed a settlement agreement, which is attached and incorporated by reference. The parties are directed to comply with the terms of the attached settlement agreement. Based on the foregoing, this file is hereby CLOSED. DONE AND ORDERED on this the ___ day of ________, 2010, in Tallahassee, Florida. Thomas W. Arnold, Secretary, Agency for Health Care Administration. Agency for Health Care Administration v. North Central Florida Hospice, Inc. d/b/a Haven Hospice, Final Order - Page 1 of 3. Filed February 25, 2010 12:11 PM Division of Administrative Hearings. A PARTY WHO IS ADVERSELY AFFECTED BY THIS FINAL ORDER IS ENTITLED TO A JUDICIAL REVIEW WHICH SHALL BE INSTITUTED BY FILING ONE COPY OF A NOTICE OF APPEAL WITH THE AGENCY CLERK OF AHCA, AND A SECOND COPY ALONG WITH FILING FEE AS PRESCRIBED BY LAW, WITH THE DISTRICT COURT OF APPEAL IN THE APPELLATE DISTRICT WHERE THE AGENCY MAINTAINS ITS HEADQUARTERS OR WHERE A PARTY RESIDES. REVIEW PROCEEDINGS SHALL BE CONDUCTED IN ACCORDANCE WITH THE FLORIDA APPELLATE RULES. THE NOTICE OF APPEAL MUST BE FILED WITHIN 30 DAYS OF RENDITION OF THE ORDER TO BE REVIEWED. Copies furnished to: Alison Ingram North Central Florida Hospital, Inc. d/b/a Haven Hospice 4200 Northwest 90th Boulevard Gainesville, Florida 32606 (Via U.S. Mail) Justin M. Senior, General Counsel Agency for Health Care Administration 2727 Mahan Drive Building 3, Mail Station 3 Tallahassee, Florida 32308 (Interoffice Mail) Kim Kellum, Chief Medicaid Counsel Agency for Health Care Administration 2727 Mahan Drive Building 3, Mail Station 3 Tallahassee, Florida 32308 (Interoffice Mail) Tracie L. Hardin, Esquire Agency for Health Care Administration 2727 Mahan Drive Building 3, Mail Station 3 Tallahassee, Florida 32308 (Interoffice Mail) Bureau of Health Quality Assurance 2727 Mahan Drive, Mail Stop 9 Tallahassee, Florida 32308 (Interoffice Mail) Ken Yon, Bureau Chief Medicaid Program Integrity 2727 Mahan Drive Building 2, Mail Station 6 Tallahassee, Florida 32308 (Interoffice Mail) Peter Williams, Inspector General Medicaid Program Integrity 2727 Mahan Drive Building 2, Mail Station 6 Tallahassee, Florida 32308 (Interoffice Mail) Division of Administrative Hearings The Desoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (Via U.S. Mail) Agency for Health Care Administration Bureau of Finance and Accounting 2727 Mahan Drive Building 2, Mail Station 14 Tallahassee, Florida 32308 (Interoffice Mail) CERTIFICATE OF SERVICE I HEREBY CERTIFY that a true and correct copy of the foregoing has been furnished to the above named addressees by U.S. Mail, or the method designated, on this the ___ day of ________, 2010. Richard Shoop, Esquire Agency Clerk State of Florida Agency for Health Care Administration 2727 Mahan Drive, Building #3 Tallahassee, Florida 32308-5403 (850) 922-5873

# 4
SOUTH FLORIDA COMMUNITY CARE NETWORK, LLC, D/B/A COMMUNITY CARE PLAN vs AGENCY FOR HEALTH CARE ADMINISTRATION, 18-003512BID (2018)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Jul. 09, 2018 Number: 18-003512BID Latest Update: Jan. 25, 2019

The Issue Does Petitioner, AHF MCO of Florida, Inc., d/b/a PHC Florida HIV/AIDS Specialty Plan (Positive), have standing to contest the intended award to Simply for Regions 10 and 11 or to seek rejection of all proposals? (Case No. 18-3507 and 18-3508) Should the intended decision of Respondent, Agency for Health Care Administration (Agency), to contract with Simply Healthcare Plans, Inc. (Simply), for Medicaid managed care plans for HIV/AIDS patients in Regions 10 (Broward County) and Region 11 (Miami-Dade and Collier Counties) be invalidated and all proposals rejected? (Case Nos. 18-3507 and 18-3508) Must the Agency negotiate with Petitioner, South Florida Community Care Network, LLC, d/b/a Community Care Plan (Community), about a plan to provide HIV/AIDS Medicaid managed care services in Region 10 because it was the only responsive proposer of services that was a Provider Service Network (PSN)? (Case No. 18-3512) Must the Agency negotiate with Community to provide Medicaid managed care services in Region 10 for people with Serious Mental Illnesses because Community is a PSN? (Case No. 18-3511) Must the Agency contract with Community to provide Medicaid managed care services for Children with Special Needs in Region 10 because Community is a PSN? (Case No. 18-3513) Must the Agency negotiate with Community to provide Medicaid managed care services for Child Welfare patients in Region 10 because Community is a PSN? (Case No. 18-3514)

Findings Of Fact THE PARTIES Agency: Section 20.42, Florida Statutes, establishes the Agency as Florida’s chief health policy and planning agency. The Agency is the single state agency authorized to select eligible plans to participate in the Medicaid program. Positive: Positive is a Florida not-for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Positive serves about 2,000 patients in Florida. Positive’s health plan is accredited by the Accreditation Association for Ambulatory Healthcare. Its disease management program is accredited by the National Committee for Quality Assurance. Currently, the Agency contracts with Positive for a SMMC HIV/AIDS Specialty Plan serving Regions 10 and 11. Simply: Simply is a Florida for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Currently, the Agency contracts with Simply to provide a SMMC HIV/AIDS Specialty Plan for Regions 1 through 3 and 5 through 11. Simply has maintained the largest patient enrollment of all HIV/AIDs plans in Florida since Florida started its statewide Medicaid managed care program. Community Care: Community is a Florida limited liability company. It is a PSN as defined in sections 409.912(1)(b) and 409.962(14), Florida Statutes. Staywell: Staywell is the fictitious name for WellCare of Florida, Inc., serving Florida’s Medicaid population. Sunshine: Sunshine State Health Plan (Sunshine) is a Florida corporation. It offers managed care plans to Florida Medicaid recipients. THE INVITATION TO NEGOTIATE TIMELINE On July 14, 2017, the Agency released 11 ITNs plans for Florida’s Medicaid managed care program in 11 statutorily defined regions. Region 10, Broward County, and Region 11, Miami-Dade and Collier Counties, are the regions relevant to this proceeding. Part IV of chapter 409, creates a statewide, integrated managed care program for Medicaid services. This program called Statewide Medicaid Managed Care includes two programs, Managed Medical Assistance and Long-term Care. Section 409.966(2), directs the Agency to conduct separate and simultaneous procurements to select eligible plans for each region using the ITN procurement process created by section 287.057(1)(c). The ITNs released July 14, 2017, fulfilled that command. The Agency issued 11 identical ITNs of 624 pages, one for each region, in omnibus form. They provided elements for four types of plans. Some elements were common to all types. Others were restricted to a specific plan type defined by intended patient population. The plan types are comprehensive plans, long-term care plus plans, managed medical assistance plans, and specialty plans. Section 409.962(16) defines “Specialty Plan” as a “managed care plan that serves Medicaid recipients who meet specified criteria based on age, medical condition, or diagnosis.” Responding vendors identified the plan type or types that they were proposing. The Agency issued Addendum No. 1 to the ITNs on September 14, 2017. On October 2, 2017, the Agency issued Addendum No. 2 to the ITNs. Addendum 2 included 628 questions about the ITNs and the Agency’s responses to the questions. Florida law permits potential responders to an ITN to challenge the specifications of an ITN, including the addendums. § 120.57(3)(b), Fla. Stat. Nobody challenged the specifications of the ITNs. 
As contemplated by section 287.057(c)(2), the Agency conducted “a conference or written question and answer period for purposes of assuring the vendors’ full understanding of the solicitation requirements.” Positive, Community, and Simply, along with United Healthcare of Florida, Inc., HIV/AIDS Specialty Plan (United), submitted responses to the ITN in Region 10 proposing HIV/AIDS Specialty Plans. Community was the only PSN to propose an HIV/AIDS plan for Region 10. Positive, Simply, and United submitted replies to the ITN for Region 11, proposing HIV/AIDS Specialty Plans. Community, United, Staywell, and one other provider submitted proposals to provide SMI Specialty Plan services in Region 10. Community was the only responding PSN. Community, Sunshine, and Staywell submitted proposals to provide Child Welfare Specialty Plans (CW) in Region 10. Community was the only PSN. Community, Staywell, and two others submitted proposals to offer Specialty Plans for Children with Special Needs (CSN) in Region 10. Community was one of two responding PSNs. Proposal scoring began November 6, 2017, and ended January 16, 2018. The Agency announced its intended awards on April 24, 2018. On April 24, 2018, the Agency issued its notices of intent to award specialty contracts in Regions 10 and 11. The following charts summarize the Agency’s ranking of the proposals and its intended awards. The two highest ranked plans in each chart are the plans the Agency selected for negotiations.

Region 10 – Children with Special Needs
Respondent | Intended Award | Ranking
Staywell | No | 1
Community | No | 2
Miami Children’s Health Plan, LLC | No | 3
Our Children PSN of Florida, LLC | No | 4

Region 10 – Child Welfare
Respondent | Intended Award | Ranking
Staywell | No | 1
Sunshine | Yes | 2
Molina Healthcare of Florida, Inc. | No | 3
Community | No | 4

Region 10 – HIV/AIDS
Respondent | Intended Award | Ranking
Simply | Yes | 1
United | No | 2
Community | No | 3
Positive | No | 4

Region 10 – Serious Mental Illness
Respondent | Intended Award | Ranking
Staywell | Yes | 1
United | No | 2
Florida MHS, Inc. | No | 3
Community | No | 4

Region 11 – HIV/AIDS
Respondent | Intended Award | Ranking
Simply | Yes | 1
United | No | 2
Positive | No | 3

All of the Specialty Plan awards noticed by the Agency went to bidders who also proposed, and received, comprehensive plan awards. The protests, referrals, and proceedings before the Division summarized in the Preliminary Statement followed the Agency’s announcement of its intended awards. TERMS The voluminous ITN consisted of a two-page transmittal letter and three Attachments (A, B, and C), with a total of 34 exhibits to them. They are: Attachment A, Exhibits A-1 through A-8, Attachment B, Exhibits B-1 through B-3, and Attachment C, Exhibits C-1 through C-8. The ITN establishes a two-step selection process: an evaluation phase and a negotiation phase. In the evaluation phase, each respondent was required to submit a proposal responding to criteria of the ITN. Proposals were to be evaluated, scored, and ranked. The goal of the evaluation phase was to determine which respondents would move to negotiations, not which would be awarded a contract. The top two ranking Specialty Plans per specialty population would be invited to negotiations. In the negotiation phase, the Agency would negotiate with each invited respondent. After that, the Agency would announce its intended award of a contract to the plan or plans that the Agency determined would provide the best value. 
Together, the attachments and exhibits combined instructions, criteria, forms, certifications, and data into a “one size fits all” document that described the information required for four categories of managed care plans to serve Medicaid patients. The ITN also provided data to consider in preparing responses. The transmittal letter emphasized, “Your response must comply fully with the instructions that stipulate what is to be included in the response.” The ITNs identified Jennifer Barrett as the procurement officer and sole point of contact with the Agency for vendors. The transmittal letter is reproduced here. This solicitation is being issued by the State of Florida, Agency for Health Care Administration, hereinafter referred to as “AHCA” or “Agency”, to select a vendor to provide Statewide Medicaid Managed Care Program services. The solicitation package consists of this transmittal letter and the following attachments and exhibits: Attachment A Instructions and Special ConditionsExhibit A-1 Questions TemplateExhibit A-2-a Qualification of Plan Eligibility Exhibit A-2-b Provider Service Network Certification of Ownership and Controlling InterestExhibit A-2-c Additional Required Certifications and StatementsExhibit A-3-a Milliman Organizational Conflict of Interest Mitigation Plan Exhibit A-3-b Milliman Employee Organizational Conflict of Interest AffidavitExhibit A-4 Submission Requirements and Evaluation Criteria InstructionsExhibit A-4-a General Submission Requirements and Evaluation Criteria Exhibit A-4-a-1 SRC# 6 - General Performance Measurement ToolExhibit A-4-a-2 SRC# 9 - Expanded Benefits Tool (Regional) Exhibit A-4-a-3 SRC# 10 - Additional Expanded Benefits Template (Regional)Exhibit A-4-a-4 SRC# 14 - Standard CAHPS Measurement Tool Exhibit A-4-b MMA Submission Requirements and Evaluation Criteria Exhibit A-4-b-1 MMA SRC# 6 - Provider Network Agreements/Contracts (Regional)Exhibit A-4-b-2 MMA SRC# 14 - MMA Performance Measurement Tool Exhibit A-4-b-3 MMA SRC# 21 - Provider Network Agreements/Contracts Statewide Essential Providers Exhibit A-4-c LTC Submission Requirements and Evaluation CriteriaExhibit A-4-c-1 LTC SRC# 4 - Provider Network Agreements/Contracts (Regional) Exhibit A-4-d Specialty Submission Requirements and Evaluation CriteriaExhibit A-5 Summary of Respondent CommitmentsExhibit A-6 Summary of Managed Care Savings Exhibit A-7 Certification of Drug-Free Workplace ProgramExhibit A-8 Standard Contract Attachment B Scope of Service - Core Provisions Exhibit B-1 Managed Medical Assistance (MMA) ProgramExhibit B-2 Long-Term Care (LTC) ProgramExhibit B-3 Specialty Plan Attachment C Cost Proposal Instructions and Rate Methodology NarrativeExhibit C-1 Capitated Plan Cost Proposal TemplateExhibit C-2 FFS PSN Cost Proposal Template Exhibit C-3 Preliminary Managed Medical Assistance (MMA) Program Rate Cell Factors Exhibit C-4 Managed Medical Assistance (MMA) Program Expanded Benefit Adjustment Factors Exhibit C-5 Managed Medical Assistance (MMA) Program IBNR Adjustment Factors Exhibit C-6 Managed Medical Assistance (MMA) Program Historical Capitated Plan Provider Contracting Levels During SFY 15/16 Time Period Exhibit C-7 Statewide Medicaid Managed Care Data BookExhibit C-8 Statewide Medicaid Managed Care Data Book Questions and Answers Your response must comply fully with the instructions that stipulate what is to be included in the response. 
Respondents submitting a response to this solicitation shall identify the solicitation number, date and time of opening on the envelope transmitting their response. This information is used only to put the Agency mailroom on notice that the package received is a response to an Agency solicitation and therefore should not be opened, but delivered directly to the Procurement Officer. The ITN describes the plans as follows: Comprehensive Long-term Care Plan (herein referred to as a “Comprehensive Plan”) – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients. Long-term Care Plus Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients enrolled in the Long-term Care program. This plan type is not eligible to provide services to recipients who are only eligible for MMA services. Managed Medical Assistance (MMA) Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients. This plan type is not eligible to provide services to recipients who are eligible for Long-term Care services. Specialty Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients who are defined as a specialty population in the resulting Contract. Specialty Plans are at issue. The ITN did not define, describe, or specify specialty populations to be served. It left that to the responding vendors. Beyond that, the ITN left the ultimate definition of the specialty population for negotiation, saying in Section II(B)(1)(a) of Attachment B, Exhibit B-3, “[t]he Agency shall identify the specialty population eligible for enrollment in the Specialty Plan based on eligibility criteria based upon negotiations.” Some respondents directly identified the specialty population. Simply’s transmittal letter stated that it proposed “a Specialty plan for individuals with HIV/AIDS.” Positive’s response to Exhibit A-4-d Specialty SRC 4, eligibility and enrollment, stated, “the specialty population for the PHC [Positive] plan will be Medicaid eligible, male and female individuals from all age groups who are HIV positive with or without symptoms and those individuals who have progressed in their HIV disease to meet the CDC definition of AIDS.” Some others left definition of the specialty population to be inferred from the ITN response. The result is that the ITN left definition of the specialty populations initially to the respondents and ultimately to negotiations between the Agency and successful respondents. Petitioners and Intervenors describe the populations that they propose serving as HIV/AIDS patients, patients with SMI, CSN, and child welfare populations. ITN respondents could have proposed serving only cancer patients, serving only obstetric patients, or serving only patients with hemophilia. The part of the ITN requiring a respondent to identify the plan type for which it was responding offered only four alternative blocks to check. 
They were: “Comprehensive Plan,” Long-Term Care Plus Plan,” “Managed Medical Assistance Plan,” or “Specialty Plan.” Attachment A to the ITN, labeled “Instructions and Special Conditions,” provides an overview of the solicitation process; instructions for response preparation and content; information regarding response submission requirements; information regarding response evaluation, negotiations, and contract awards; and information regarding contract implementation. Exhibits A-1 to A-3 and A-5 to A-7 of the ITN contain various certifications and attestations that respondents had to prepare and verify. Exhibit A-4 contains submission requirement components (SRCs) to which respondents had to prepare written responses. Exhibit A-8 contains the state’s standard SMMC contract. ITN Exhibit A-4-a contains 36 general submission requirements and evaluation criteria (General SRCs). ITN Exhibit A-4-b contains 21 MMA submission requirements and evaluation criteria (MMA SRCs). ITN Exhibit A-4-c contains 13 LTC submission requirements and evaluation criteria (LTC SRCs). ITN Exhibit A-4-d contains five specialty submission requirements and evaluation criteria (Specialty SRCs). The responses that the 36 SRCs require vary greatly. Some are as simple as providing documents or listing items. Others require completing tables or spreadsheets with data. Consequently, responses to some SRCS apparently could be reviewed in very little time, even a minute or less. Others requiring narrative responses might take longer. Examples follow. General SRC 1 required a list of the respondent’s contracts for managed care services and 12 information items about them including things such as whether they were capitated, a narrative describing the scope of work; the number of enrollees; and accomplishments and achievement. General SRC 2 asked for documentation of experience operating a Medicaid health plan in Florida. General SRC 3 asked for information confirming the location of facilities and employees in Florida. General SRC 12 requested a flowchart and written description of how the respondent would execute its grievance and appeal system. It listed six evaluation criteria. MMA SRC 2 asks for a description of the respondent’s organizational commitment to quality improvement “as it relates to pregnancy and birth outcomes.” It lists seven evaluation criteria. MMA SRC 10 asks for a description of the respondent’s plan for transition of care between service settings. It lists six evaluation criteria including the respondent’s process for collaboration with providers. Specialty SRC 1 asks for detailed information about respondent’s managed care experience with the specialty population. Specialty SRC 5 asks for detailed information about the respondent’s provider network standards and provides five evaluation criteria for evaluating the answers. Exhibit A-8 of the ITN contains the standard SMMC contract. Attachment B and Exhibits B-1 to B-3 of the ITN contain information about the scope of service and core provisions for plans under the SMMC program. Attachment C and Exhibits C-1 to C-8 of the ITN contain information related to the cost proposals and rate methodologies for plans under the SMMC program. The ITN permitted potential respondents to submit written questions about the solicitation to the Agency by August 14, 2017. Some did. On September 14, 2017, the Agency issued Addendum No. 1 to the ITN. Among other things, Addendum No. 
1 changed the anticipated date for the Agency’s responses to respondents’ written questions from September 15 to October 2, 2017. The Agency issued Addendum No. 2 to the ITN on October 2, 2017. Addendum No. 2 included a chart with 628 written questions from potential respondents and the Agency’s answers. Attachment A at A 10-(d) makes it clear that the answers are part of the addendum. Both Addendums to the ITN cautioned that any protest of the terms, conditions, or specifications of the Addendums to the ITN had to be filed with the Agency within 72 hours of their posting. No respondent protested. Instructions for the A-4 Exhibits included these requirements: Each SRC contains form fields. Population of the form fields with text will allow the form field to expand and cross pages. There is no character limit. All SRCs, marked as “(Statewide)” must be identical for each region in which the respondent submits a reply. For timeliness of response evaluation, the Agency will evaluate each “(Statewide)” SRC once and transfer the score to each applicable region’s evaluation score sheet(s). The SRCs marked as “(Regional)” will be specific and only apply to the region identified in the solicitation and the evaluation score will not be transferred to any other region. The instructions continue: Agency evaluators will be instructed to evaluate the responses based on the narrative contained in the SRC form fields and the associated attachment(s), if applicable. Each response will be independently evaluated and awarded points based on the criteria and points scale using the Standard Evaluation Criteria Scale below unless otherwise identified in each SRC contained within Exhibit A-4. This is the scale: STANDARD EVALUATION CRITERIA SCALE Point Score Evaluation 0 The component was not addressed. 1 The component contained significant deficiencies. 2 The component is below average. 3 The component is average. 4 The component is above average. 5 The component is excellent. The ITN further explained that different SRCs would be worth different “weights,” based on the subject matter of the SRC and on whether they were General, MMA, LTC, or Specialty SRCs. It assigned weights by establishing different “weight factors” applied as multipliers to the score a respondent received on a criteria. For example, “Respondent Background/Experience” could generate a raw score of 90. Application of a weight factor of three made 270 the maximum possible score for this criteria. “Oversight and Accountability” could generate a raw score of 275. A weight factor of one, however, made the maximum score available 275. General SRC 6 solicits HEDIS data. HEDIS is a tool that consists of 92 measures across six domains of care that make it possible to compare the performance of health plans on an “apples-to-apples” basis. SRC 6 states: The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include, in table format, the target population (TANF, ABD, dual eligible), the respondent’s results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/ HEDIS 2016 and CY 2016/ HEDIS 2017) for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent’s largest Contracts. 
If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. The respondent shall provide the data requested in Exhibit A-4-a-1, General Performance Measurement Tool[.] x x x Score: This section is worth a maximum of 160 raw points x x x For each of the measure rates, a total of 10 points is available per state reported (for a total of 360 points available). The respondent will be awarded 2 points if their reported plan rate exceeded the national Medicaid mean and 2 points if their reported plan rate exceeded the applicable regional Medicaid mean, for each available year, for each available state. The respondent will be awarded an additional 2 points for each measure rate where the second year’s rate is an improvement over the first year’s rate, for each available state. An aggregate score will be calculated and respondents will receive a final score of 0 through 150 corresponding to the number and percentage of points received out of the total available points. For example, if a respondent receives 100% of the available 360 points, the final score will be 150 points (100%). If a respondent receives 324 (90%) of the available 360 points, the final score will be 135 points (90%). If a respondent receives 36 (10%) of the available 360 points, the final score will be 15 points (10%). The SRC is plainly referring to the broad Medicaid- eligible population when it says “the target population (TANF, ABD, dual eligible).” “Dual eligible” populations are persons eligible for Medicaid and Medicare. There, as throughout the ITN, the ITN delineates between a target population of all Medicaid-eligible patients and a specialty population as described in a respondent’s ITN proposal. The clear instructions for SRC 6 require, “Use the drop-down box to select the state for which you are reporting and enter the performance measure rates (to the hundredths place, or XX.XX) for that state's Medicaid population for the appropriate calendar year.” Community did not comply. General SRC 14 solicits similar data, in similar form using a similar tool, about a respondent’s Consumer Assessment of Healthcare Providers and Systems (CAHPS). CAHPS data is basically a satisfaction survey. It asks respondents to provide “in table format the target population (TANF, ABD, dual eligible) and the respondent’s results for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) items/composites specified below for the 2017 survey for its adult and child populations for the respondent’s three (3) largest Medicaid Contracts (as measured by number of enrollees).” Just like General SRC 6 did with HEDIS data, General SRC 14 ITN instructed bidders to put their CAHPS data for the “target population (TANF, ABD, dual eligible)” “for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees)” for multiple states into an excel spreadsheet “to the hundredths place[.]” Also, like General SRC 6, General SRC 14 includes an objective formula described in the ITN for scoring bidders’ CAHPS data. RANKING PROVISIONS Attachment A at (D)(4)(c)(2) stated: Each response will be individually scored by at least three (3) evaluators, who collectively have experience and knowledge in the program areas and service requirements for which contractual services are sought by this solicitation. The Agency reserves the right to have specific sections of the response evaluated by less than three (3) individuals. 
The ITN’s example of how total point scores would be calculated, discussed below, also indicated that some sections may be scored by less than three evaluators. The explanatory chart had a column for “[o]ther Sections evaluated by less than three (3) evaluators.” The Agency’s policy, however, has been to assign at least three evaluators to score program specific SRCs. Attachment A at (D)(4)(e)(2) advised respondents how the agency will rank the competing responses. It was clear and specific, even providing an example of the process showing how the scores “will” be calculated. Step one of the explanatory chart stated that the Agency would calculate a total point score for each response. Step two stated that “[t]he total point scores will be used to rank the responses by an evaluator. . . .” Next, the rankings by the evaluator are averaged to determine the average rank for each respondent. This average ranking is critical because ranking is how the ITN said the Agency would select respondents for negotiation and how the Agency did select respondents for negotiation. The step two and step three charts, reproduced below, demonstrate that the ITN contemplated an evaluation process in which each response was to be evaluated in its entirety by three different evaluators, or maybe less than three, but indisputably in its entirety by those who evaluated it. This did not happen.

Step 2
The total point scores will be used to rank the responses by evaluator (Response with the highest number of points = 1, second highest = 2, etc.).

POINTS SUMMARY
Respondent | Evaluator A | Evaluator B | Evaluator C | Evaluator D
Respondent 1 | 446 | 396 | 311 | 413
Respondent 2 | 425 | 390 | 443 | 449
Respondent 3 | 397 | 419 | 389 | 435
Respondent 4 | 410 | 388 | 459 | 325

RANKING SUMMARY
Respondent | Evaluator A | Evaluator B | Evaluator C | Evaluator D
Respondent 1 | 1 | 2 | 4 | 3
Respondent 2 | 2 | 3 | 2 | 1
Respondent 3 | 4 | 1 | 3 | 2
Respondent 4 | 3 | 4 | 1 | 4

Step 3
An average rank will be calculated for each response for all the evaluators.
Respondent 1: 1+2+4+3 = 10 ÷ 4 = 2.5
Respondent 2: 2+3+2+1 = 8 ÷ 4 = 2.0
Respondent 3: 4+1+3+2 = 10 ÷ 4 = 2.5
Respondent 4: 3+4+1+4 = 12 ÷ 4 = 3.0

PROVIDER SERVICE NETWORK PROVISIONS Florida law permits a PSN to limit services provided to a target population “based on age, chronic disease state, or medical condition of the enrollee.” This allows a PSN to offer a specialty plan. For each region, the eligible plan requirements of section 409.974(1) state, “At least one plan must be a provider service network if any provider service networks submit a responsive bid.” Section 409.974(3) says: “Participation by specialty plans shall be subject to the procurement requirements of this section. The aggregate enrollment of all specialty plans in a region may not exceed 10 percent of the total enrollees of that region.” The ITN addressed those requirements. The Negotiation Process section of Attachment A, Instructions and Special Conditions, says: The Agency intends to invite the following number of respondents to negotiation: Comprehensive Plans The top four (4) ranking Comprehensive Plans. 
Long-term Care Plus Plans The top two (2) ranking Long-term Care Plus Plans Managed Medical Assistance Plans The top two (2) ranking Managed Medical Assistance Plans Specialty Managed Medical Assistance Plans The top two (2) ranking Specialty Managed Medical Assistance Plans per specialty population. If there are no provider service networks included in the top ranked respondents listed above, the Agency will invite the highest ranked PSN(s) to negotiations in order to fulfill the requirements of Section 409.974(1), Florida Statutes and Section 409.981(1), Florida Statutes. Emphasis supplied. The ITN specifications in Section D.7, titled Number of Awards, state as follows about Specialty Plan awards: 7. Number of Awards In accordance with Sections 409.966, 409.974, and 409.981, Florida Statutes, the Agency intends to select a limited number of eligible Managed Care Plans to provide services under the SMMC program in Region 10. The Agency anticipates issuing the number of Contract awards for Region 10 as described in Table 5, SMMC Region, below, excluding awards to Specialty MMA Plans. Table 5 SMMC Region Region Total Anticipated Contract Awards Region 10 4 If a respondent is awarded a Contract for multiple regions, the Agency will issue one (1) Contract to include all awarded regions. The Agency will award at least one (1) Contract to a PSN provided a PSN submits a responsive reply and negotiates a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. A respondent that is awarded a Contract as a Comprehensive Plan is determined to satisfy the requirements in Section 409.974, Florida Statutes and Section 409.981, Florida Statutes and shall be considered an awardee of an MMA Contract and a LTC Contract. The Agency will issue one (1) Contract to reflect all awarded populations in all awarded regions. In addition to the number of Contracts awarded in this region, additional Contracts may be awarded to Specialty Plans that negotiate terms and conditions determined to be the best value to the State and negotiate a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. The Agency reserves the right to make adjustments to the enrollee eligibility and identification criteria proposed by a Specialty Plan prior to Contract award in order to ensure that the aggregate enrollment of all awarded Specialty Plans in a region will not exceed ten percent (10%) of the total enrollees in that region, in compliance with Section 409.974(3), Florida Statutes. If a respondent is awarded a Contract as a Specialty Plan and another plan type, the Agency will issue one (1) Contract to include all awarded populations in all awarded regions. A prospective vendor asked about the interplay of Specialty Plan options and the PSN requirements. The question and the answer provided in Addendum 2 follow: Q. Please clarify the number of PSN awards per region and how PSN awards will be determined based on the PSN's plan type (e.g., Comprehensive, LTC Plus, MMA, Specialty). As you know, Sections 409.974 and 409.981, Florida Statutes require one MMA PSN and one LTC PSN award per region (assuming a PSN is responsive) and the Agency has stated that an award to a Comprehensive Plan PSN will meet the requirements of both statutes. However, can the Agency further clarify whether other types of PSNs would meet the statutory requirements? Specifically, would a PSN LTC Plus award meet the requirements of Section 409.981, Florida Statutes? 
Similarly, would an award to a Specialty Plan PSN meet the requirements of Section 409.974, Florida Statutes? A. See Attachment A Instructions and Special Conditions, Section D Response Evaluations, and Contract Award, Sub-Section 7 Number of Awards. Yes, a PSN LTC Plus award would meet the requirements of Section 409.981(2). A Specialty Plan PSN would not meet the requirements of Section 409.974(1). The only reasonable interpretation of this answer is that Specialty Plan PSNs do not satisfy the requirement to contract with a responsive PSN imposed by section 409.974. None of the prospective vendors, including Community, challenged this clarification. EVALUATION PROCESS THE EVALUATORS The Agency selected 11 people to evaluate the proposals. The Agency assigned each person a number used to identify who was assigned to which task and to track performance of evaluation tasks. The procurement officer sent the evaluators a brief memo of instructions. It provided dates; described logistics of evaluation; emphasized the importance of independent evaluation; and prohibited communicating about the ITN and the proposals with anyone other than the procurement office. The Agency also conducted an instructional session for evaluators. Evaluator 1, Marie Donnelly: During the procurement, Ms. Donnelly was the Agency’s Chief of the Bureau of Medicaid Quality. She held this position for five years before resigning. This bureau bore responsibility for ensuring that the current SMMC plans met their contract requirements for quality and quality improvement measures. Her role specifically included oversight of Specialty Plans. Evaluator 2, Erica Floyd Thomas: Ms. Thomas is the chief of the Bureau of Medicaid Policy. She has worked for the Agency since 2001. Her Medicaid experience includes developing policies for hospitals, community behavioral health, residential treatment, and contract oversight. Before serving as bureau chief, she served as an Agency administrator from 2014 through 2017. Ms. Thomas oversaw the policy research and development process for all Medicaid medical, behavioral, dental, facility, and clinic coverage policies to ensure they were consistent with the state Plan and federal Medicaid requirements. Evaluator 3, Rachel LaCroix, Ph.D.: Dr. LaCroix is an administrator in the Agency’s Performance Evaluation and Research Unit. She has worked for the Agency since 2003. All her positions have been in the Medicaid program. Dr. LaCroix has served in her current position since 2011. She works with the performance measures and surveys that the current SMMC providers report to the Agency. Dr. LaCroix is a nationally recognized expert on healthcare quality metrics like HEDIS. She is also an appointee on the National Association of Medicaid Directors’ task force for national performance measures. Evaluator 4, Damon Rich: Mr. Rich has worked for the Agency since April 2009. He is the chief of the Agency’s Bureau of Recipient and Provider Assistance. This bureau interacts directly with AHCA’s current SMMC care providers about any issues they have, and with Medicaid recipients, usually about their eligibility or plan enrollment. Before Mr. Rich was a bureau chief, he worked as a field office manager for the Agency. Mr. Rich’s experience as bureau chief and field office manager includes oversight of the current SMMC Specialty Plans. Evaluator 5. Eunice Medina: Ms. 
Medina is the chief of the Agency’s Bureau of Medicaid Plan Management, which includes a staff of over 60 individuals, who manage the current SMMC contracts. Her experience and duties essentially encompass all aspects of the current SMMC plans. Ms. Medina started working with the Agency in 2014. Evaluator 6, Devona “DD” Pickle: Ms. Pickle most recently joined the Agency in 2011. She also worked for the Agency from November 2008 through November 2010. Ms. Pickle’s Agency experience all relates in some way to the Medicaid program. Since March 2013, Ms. Pickle has served as an administrator over managed care policy and contract development in the Bureau of Medicaid Policy. Her job duties include working with the current SMMC contractors. Ms. Pickle is also a Florida licensed mental health counselor. Evaluator 7, Tracy Hurd-Alvarez: Ms. Hurd-Alvarez has worked for the Agency’s Medicaid program since 1997. Since 2014, she has been a field office manager, overseeing compliance monitoring for all the current SMMC contractors. Before assuming her current position, Ms. Hurd-Alvarez implemented the LTC SMMC program. Evaluator 8, Gay Munyon: Ms. Munyon is currently the Chief of the Bureau of Medicaid Fiscal Agent Operations. Ms. Munyon began working with the Agency in April 2013. Ms. Munyon’s bureau oversees fulfillment of the Agency’s contract with the current SMMC fiscal agent. Her unit’s responsibilities include systems maintenance and modifications and overseeing the fiscal agent, which answers phone calls, processes claims, and processes applications. Ms. Munyon has 25 years of experience working with the Medicaid program. Evaluator 9, Laura Noyes: Ms. Noyes started working for the Agency in April 2011. Her years of Agency experience all relate to the Medicaid program, including overseeing six current comprehensive managed care plans by identifying trends in contractual non-compliance. Evaluator 10, Brian Meyer: Mr. Meyer is a CPA, who has worked for the Agency in the Medicaid program since 2011. He is currently chief of the Bureau of Medicaid Data Analytics. Mr. Meyer’s primary responsibility is overseeing the capitation rates for the current SMMC contractors. His experience includes Medicaid plan financial statement analysis, surplus requirement calculation analysis and, in general, all types of financial analysis necessary to understand financial performance of the state’s Medicaid plans. Evaluator 11, Ann Kaperak: Since April 2015, Ms. Kaperak has served as an administrator in the Agency’s Bureau of Medicaid Program Integrity. Ms. Kaperak’s unit oversees the fraud and abuse efforts of the current SMMC plans. She also worked for the Medicaid program from November 2012 through May 2014. Ms. Kaperak worked as a regulatory compliance manager for Anthem/Amerigroup’s Florida Medicaid program between May 2014 and April 2015. Positive and Community challenge the Agency’s plan selections by questioning the qualifications of the evaluators. The first part of their argument is that the evaluators did not have sufficient knowledge about HIV/AIDS and its treatment. The evidence does not prove the theory. For instance, Positive’s argument relies upon criticizing the amount of clinical experience evaluators had managing patients with HIV/AIDS. That approach minimizes the fact that the managed care plan characteristics involve so much more than disease- specific considerations. 
For instance, many of the components require determining if the respondent provided required documents, verifying conflict of interest documents, management structure, quality control measures, and the like. General SRCs asked for things like dispute resolution models (SRC 16), claims processing information (SRC 17), and fraud and abuse compliance plans (SRC 31). MMA SRCs included criteria, like telemedicine (SRC 4), demonstrated progress obtaining executed provider agreements (SRC 6), and a credentialing process (SRC 12). Specialty SRCs included criteria like copies of contracts for managed care for the proposed specialty population (SRC 1), specific and detailed criteria defining the proposed specialty population (SRC 4), and the like. The evidence does not prove that disease-specific experience is necessary to evaluate responses to these and other SRCs. SRC 6 involving HEDIS data and SRC 14 involving CAHPS data are two good examples. They required respondents to input data into a spreadsheet. All the evaluators had to do was determine what those numbers showed. Evaluation did not require any understanding of disease or how the measures were created. All the evaluator had to know was the number in the spreadsheet. The second part of the evaluator qualification criticisms is that the evaluators did not give adequate weight to some responses. Positive and Community just disagree with the measures requested and the evaluation of them. They conclude from that disagreement that the evaluators’ qualifications were deficient. The argument is not persuasive. The last sentence of paragraph 69 of Positive’s proposed recommended order exemplifies the criticisms of Positive and Community of the evaluators’ qualifications. It states, “The fact that PHC [Positive] was ranked last among competing HIV plans shows that the SRC evaluators did not understand enough about managing individuals with HIV/AIDs to score its proposal competently.” The argument is circular and “ipse dixit”. It does not carry the day. The collective knowledge and experience of the evaluators, with a total of 128 years of Medicaid experience, made them capable of reasonably evaluating the managed care plan proposals, including the Specialty plan proposals. The record certainly does not prove otherwise. EVALUATION PROCESS The Agency assigned the evaluators to the SRCs that it determined they were qualified to evaluate and score. The Agency did not assign entire responses to an evaluator for review. Instead it elected a piecemeal review process assigning various evaluators to various sections, the SRCs of each response. Paragraph 30 of the Agency’s proposed recommended order describes this decision as follows: Although the ITN had contemplated ranking each vendor by evaluator, based on an example in the ITN, such ranking presumed a process where all evaluators scored all or nearly all of the responses to the ITN, which had occurred in the procurement five years ago. In this procurement, each evaluator reviewed only a subset of SRCs based on their knowledge, and experience; therefore, ranking by evaluator was not logical because each had a different maximum point score. The initial SRC scoring assignments were: General SRCs 1 through 4, LTC SRCs 1 and 2, and Specialty SRC 1: Marie Donnelly, Laura Noyes, and Brian Meyer. General SRCs 5 through 8, MMA SRCs 1 through 7, LTC SRCs 3 and 4, and Specialty SRCs 1 and 2: Marie Donnelly, Erica Floyd- Thomas, and Rachel LaCroix. 
General SRCs 9 through 14, MMA SRCs 8 through 11, LTC SRCs 5 through 7, and Specialty SRC 4: Damon Rich, Eunice Medina, and DD Pickle. General SRCs 15 through 17, MMA SRCs 12 and 13, and LTC SRCs 8 through 10: Damon Rich, Tracy Hurd-Alvarez, Gay Munyon. General SRCs 18 through 25, MMA SRCs 14 through 20, LTC SRCs 11 and 12, and Specialty SRC 5: Erica Floyd-Thomas, Eunice Medina, and DD Pickle. General SRCs 26 through 33 and LTC SRC 13: Gay Munyon, Ann Kaperak, and Brian Meyer. General SRCs 34 through 36 and MMA SRC 21: Marie Donnelly, Rachel LaCroix, and Tracy Hurd-Alvarez. The ranking process presented in the ITN and described in paragraphs 62-64, contemplated ranking each respondent by evaluator. The Agency carried this process over from an earlier procurement. In this procurement, despite what the ITN said, the Agency assigned responsibilities so that each evaluator reviewed only a subset of SRCs. Therefore, the ranking of responses by evaluator presented in the ITN could not work. It was not even possible because no one evaluator reviewed a complete response and because each SRC had a different maximum point score. Instead, the Agency, contrary to the terms of the ITN, ranked proposals by averaging the “total point scores” assigned by all of the evaluators. The Agency considered issuing an addendum advising the parties of the change. The addendum would have informed the respondents and provided them an opportunity to challenge the change. The Agency elected not to issue an addendum. EVALUATION AND SCORING The evaluators began scoring on November 6, 2017, with a completion deadline of December 29, 2017. The 11 evaluators had to score approximately 230 separate responses to the ITNs. The evaluators had to score 67,175 separate items to complete the scoring for all responses for all regions for all types of plans. No one at the Agency evaluated how much time it should take to score a particular item. None of the parties to this proceeding offered persuasive evidence to support a finding that scoring any particular item would or should take a specific length of time or that scoring all of the responses would or should take a specific length of time. Evaluators scored the responses in conference room F at the Agency’s headquarters. This secure room was the exclusive location for evaluation and scoring. Each evaluator had a dedicated workspace equipped with all tools and resources necessary for the task. The workspaces included a computer terminal for each evaluator. The system allowed evaluators to review digital copies of the ITN and proposals and to enter evaluation points in spreadsheets created for the purpose of recording scores. Evaluators also had access to hard copies of the proposals and the ITN. The Agency required evaluators to sign in and to sign out. The sign-in and sign-out sheets record the significant amount of time the evaluators spent evaluating proposals. Evaluators were not permitted to communicate with each other about the responses. To minimize distractions, the Agency prohibited cell phones, tablets and other connected devices in the room. The Agency also authorized and encouraged the evaluators to delegate their usual responsibilities. Agency proctors observed the room and evaluators throughout the scoring process. They were available to answer general and procedural questions and to ensure that the evaluators signed in and signed out. A log sheet documented how much time each evaluator spent in the scoring conference room. Some evaluators took extensive notes. 
For example, Ms. Median took over 200 pages of notes. Similarly, Ms. Munyon took nearly 400 pages of typewritten notes. The evaluators worked hard. None, other than Dr. LaCroix, testified that they did not have enough time to do their job. The computer system also automatically tracked the evaluators’ progress. Tracking reports showed the number of items assigned to each evaluator and the number of scoring items completed. The first status report was generated on December 8, 2017, approximately halfway through the scheduled scoring. At that time, only 28 percent of the scoring items were complete. Ms. Barrett usually ran the status reports in the morning. She made them available to the evaluators to review. The pace of evaluation caused concern about timely completion and prompted discussions of ways to accelerate scoring. Because it was clear that the majority of the evaluators would not complete scoring their SRCs by December 29, 2017, the Agency extended the scoring deadline to January 12, 2018. It also extended the hours for conference room use. Most respondents filed proposals for more than one type of plan and more than one region. This fact combined with the provision in the instructions saying that all statewide SRC responses must be identical for each region and that scores would transfer to each applicable region’s score sheets, enabled evaluators to score many SRCs just once. The system would then auto-populate the scores to the same SRC for all proposals by that respondent. This time saving measure permitted scoring on many of the items to be almost instantaneous after review of the first response to an SRC. The fact that so many respondents submitted proposals for so many regions and types of plans provided the Agency another opportunity for time-saving. The Agency loaded Adobe Pro on the evaluators’ computers as a timesaving measure. This program allowed the evaluators to compare a bidder’s Comprehensive Plan Proposal to the same company’s regional and Specialty Plan proposals. If the Adobe Pro comparison feature showed that the proposal response was the same for each plan, the Agency permitted evaluators to score the response once and assign the same score for each item where the respondent provided the same proposal. This speeded scoring. It, however, meant that for SRCs where evaluators did this, that they were not reviewing the SRC response in the specific context of the specialty plan population, each of which had specific and limited characteristics that made them different from the broader General and MMA plan populations. This is significant because so many SRCs required narrative responses where context would matter. There is no Specialty SRCs A-4 instruction requirement for specialty plans analogous to the requirement that responses for statewide SRCs must be identical for each region. In other words, the instructions do not say all SRCs marked as statewide must be identical for each specialty plan proposal and that the Agency will evaluate each Statewide SRC once and transfer the score to each applicable Specialty Plan score. In fact, according to the procurement officer, the Agency expected that evaluators would separately evaluate and score the statewide SRCs for Comprehensive Plans and for Specialty Plans, even if the same bidder submitted them. 
Despite the Agency’s expectation and the absence of an authorizing provision in the ITN, many evaluators, relying on the Adobe Pro tool, copied the SRC scores they gave to a respondent’s comprehensive plan proposal to its specialty plan proposal if the respondent submitted the same response to an SRC for a Comprehensive Plan and a Specialty Plan. For instance, Ms. Thomas (Evaluator 2) and Ms. Munyon (Evaluator 8) did this to save time. Ms. Donnelly (Evaluator 1) did this even when the comprehensive and specialty responses were not identical. This does not amount to the independent evaluation of the responses pledged by the ITN. On separate days, Evaluator 1 scored 1,315 items, 954 items, 779 items and 727 items. On separate days, Evaluator 2 scored 613 items, 606 items, 720 items, 554 items and 738 items. Evaluator 4 scored 874 items on one day. Evaluator 5 scored 813 items in one day. Evaluator 6 scored 1,001 items in one day. Evaluator 8 scored 635 items in one day. The record does not identify the items scored. It also does not permit determining how many of the item scores resulted from auto-population or assignment of scores based upon previous scoring of an identical response. It bears repeating, however, that the record does not support any finding on how long scoring the response to one SRC or an entire response could reasonably be expected to take. Even with the extended scoring period and time-saving measures, the Agency concluded that Evaluator 3 would not be able to finish all of the SRCs assigned to her. Rather than extend the deadline for scoring a second time, the Agency decided to reassign the nine of Evaluator 3’s SRCs that she had not begun scoring to two other evaluators. The Agency did not include scores of other SRCs for which Evaluator 3 had not completed scoring. The Agency only counted Evaluator 3’s scores for an SRC if she scored the SRC for everyone. The result was that only two people scored nine of the Specialty Plan SRCs. The Agency did not reassign all of Evaluator 3’s SRCs. It only reassigned the SRCs to evaluators who were qualified to evaluate the items, who were not already assigned those items to score, and who had already finished or substantially completed their own evaluations. The decision to reassign the SRCs was not based on any scoring that had already been completed. The Agency did not allow changes to data submitted by any of the vendors. It allowed vendors to exchange corrupted electronic files for ones which could be opened and allowed vendors to exchange electronic files to match up with the paper copies that had been submitted. The Agency allowed Community to correct its submission where it lacked a signature on its transmittal letter and allowed Community to exchange an electronic document that would not open. It did not allow Community to change its reported HEDIS scores, which were submitted in the decimal form required by the instructions. Community erred in the numbers that it reported. There is no evidence showing that other vendors received a competitive or unfair advantage over Community in the Agency’s review of the SMI Specialty Plan submission for Region 10. There was no evidence that the Agency allowed any other vendors to change any substantive information in their submittals for that proposed specialty in that region. HEDIS ISSUES Positive asserts that Simply’s proposal is non-responsive because Simply submitted HEDIS data from the general Medicaid population in response to SRC 6 and MMA SRC 14. 
Positive contends that Simply obtained a competitive advantage by supplying non-HIV/AIDS HEDIS data in response to SRC 6 and MMA SRC 14 because HIV/AIDS patients are generally a sicker group and require more care and because some HEDIS measures cannot be reported for an HIV/AIDS population. HEDIS stands for Healthcare Effectiveness Data and Information Set; it is a set of standardized performance measures widely used in the healthcare industry. The instructions for both SRC 6 and MMA SRC 14 provide, in relevant part: The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include in table format, the target population (TANF, ABD, dual eligible), the respondent's results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/HEDIS 2016 and CY 2016/HEDIS 2017) for the respondent's three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent's largest Contracts. If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. (JE 1 at 75 (SRC 6); JE 1 at 158 (MMA SRC 14)). SRC 6 and MMA SRC 14 instruct respondents to provide HEDIS measures for "the target population (TANF, ABD, dual eligible)." Id. TANF, ABD, and dual eligible are eligibility classifications for the Medicaid population. The Agency sought information regarding the target Medicaid-eligible population, even from respondents proposing a Specialty Plan, because Specialty Plans are required to serve all of the healthcare needs of their recipients, not just the needs related to the criteria making those recipients eligible for the Specialty Plan. Following the instructions in SRC 6 and MMA SRC 14, Simply provided HEDIS data from the Medicaid-eligible population for its three largest Medicaid contracts as measured by the total number of enrollees. For the requested Florida HEDIS data, Simply utilized legacy HEDIS data from Amerigroup Florida, Inc., a Comprehensive Plan. Amerigroup and Simply had merged in October of 2017. Therefore, at the time of submission of Simply's proposal, the HEDIS data from Amerigroup Florida was the data from Simply's largest Medicaid contract in Florida for the period requested by the SRCs. Positive asserts that the Agency impermissibly altered scoring criteria after the proposals were submitted when the Agency corrected technical issues within a HEDIS Measurement Tool spreadsheet. SRC 6 and MMA SRC 14 required the submission of numeric data for the requested HEDIS performance measures. To simplify submission of that numeric data, the Agency required respondents to utilize a HEDIS Measurement Tool spreadsheet. The evaluation criteria for SRC 6 and MMA SRC 14 provided that respondents would be awarded points if the reported HEDIS measures exceeded the national or regional mean for those performance measures. Some respondents, including Positive, entered "N/A," "small denominator," or other text inputs into the HEDIS Measurement Tool.
During the evaluation and scoring process, the Agency discovered that if a respondent input any text into the HEDIS Measurement Tool, the tool would assign random amounts of points, even though the respondent had not input measurable, numeric data. The Agency reasonably resolved the problem by removing any text and inserting a zero in its place. The correction of the error in the HEDIS Measurement Tool prevented random points from being awarded to respondents and did not alter scores in any way contrary to the ITN. It was reasonable and fair to all respondents.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order rejecting all responses to the ITNs to provide a Medicaid Managed Care plan for patients with HIV/AIDS in Regions 10 and 11. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for patients with serious mental illness. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for patients with serious mental illness. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for child welfare specialty services. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order awarding Wellcare of Florida, Inc., d/b/a Staywell Health Plan of Florida, a contract for a specialty Medicaid Managed Care plan for patients with Serious Mental Illness in Region 10. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order dismissing the Petition in Case No. 18-3513. DONE AND ENTERED this day of , , in Tallahassee, Leon County, Florida. S JOHN D. C. NEWTON, II Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this day of , .

USC (1) 42 U.S.C. 1396u Florida Laws (9) 120.57, 20.42, 287.057, 409.912, 409.962, 409.966, 409.97, 409.974, 409.981
# 5
SOUTH BROWARD HOSPITAL DISTRICT, D/B/A MEMORIAL REGIONAL HOSPITAL vs AGENCY FOR HEALTH CARE ADMINISTRATION, 12-000424CON (2012)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Jan. 27, 2012 Number: 12-000424CON Latest Update: Mar. 14, 2012

Conclusions THIS CAUSE comes before the Agency For Health Care Administration (the "Agency") concerning Certificate of Need ("CON") Application No. 10131 filed by The Shores Behavioral Hospital, LLC (hereinafter "The Shores") to establish a 60-bed adult psychiatric hospital and CON Application No. 10132 to establish a 12-bed substance abuse program in addition to the 60 adult psychiatric beds pursuant to CON Application No. 10131. (The entity is a limited liability company according to the Division of Corporations.) The Agency preliminarily approved CON Application No. 10131 and preliminarily denied CON Application No. 10132. South Broward Hospital District d/b/a Memorial Regional Hospital (hereinafter "Memorial") thereafter filed a Petition for Formal Administrative Hearing challenging the Agency's preliminary approval of CON 10131, which the Agency Clerk forwarded to the Division of Administrative Hearings ("DOAH"). The Shores thereafter filed a Petition for Formal Administrative Hearing to challenge the Agency's preliminary denial of CON 10132, which the Agency Clerk forwarded to DOAH. Upon receipt at DOAH, Memorial, CON 10131, was assigned DOAH Case No. 12-0424CON and The Shores, CON 10132, was assigned DOAH Case No. 12-0427CON. On February 16, 2012, the Administrative Law Judge issued an Order of Consolidation consolidating both cases. On February 24, 2012, the Administrative Law Judge issued an Order Closing File and Relinquishing Jurisdiction based on the parties' representation that they had reached a settlement. The parties have entered into the attached Settlement Agreement (Exhibit 1). It is therefore ORDERED: 1. The attached Settlement Agreement is approved and adopted as part of this Final Order, and the parties are directed to comply with the terms of the Settlement Agreement. 2. The Agency will approve and issue CON 10131 and CON 10132 with the following conditions: a. Approval of CON Application 10131 to establish a Class III specialty hospital with 60 adult psychiatric beds is concurrent with approval of the co-batched CON Application 10132 to establish a 12-bed adult substance abuse program in addition to the 60 adult psychiatric beds in one single hospital facility. b. Concurrent to the licensure and certification of 60 adult inpatient psychiatric beds, 12 adult substance abuse beds and 30 adolescent residential treatment (DCF) beds at The Shores, all 72 hospital beds and 30 adolescent residential beds at Atlantic Shores Hospital will be delicensed. c. The Shores will become a designated Baker Act receiving facility upon licensure and certification. d. The location of the hospital approved pursuant to CONs 10131 and 10132 will not be south of Las Olas Boulevard and The Shores agrees that it will not seek any modification of the CONs to locate the hospital farther south than Davie Boulevard (County Road 736). 3. Each party shall be responsible for its own costs and fees. 4. The above-styled cases are hereby closed. DONE and ORDERED this 2. day of March, 2012, in Tallahassee, Florida. ELIZABETH DUDEK, Secretary AGENCY FOR HEALTH CARE ADMINISTRATION

# 6
PRESBYTERIAN RETIREMENT COMMUNITIES, INC., D/B/A WESTMINISTER TOWERS vs AGENCY FOR HEALTH CARE ADMINISTRATION, 02-004442 (2002)
Division of Administrative Hearings, Florida Filed:Orlando, Florida Nov. 18, 2002 Number: 02-004442 Latest Update: May 21, 2004

The Issue Whether Petitioner, Presbyterian Retirement Communities, Inc., d/b/a Westminster Towers: (1) should be given a "conditional" or "standard" license effective June 17, 2002; and (2) whether Petitioner is subject to an administrative fine of $2,500.

Findings Of Fact Based on the oral and documentary evidence presented at the final hearing, the following findings of fact are made: Petitioner is a long-term, skilled nursing facility located in Orlando, Florida. Respondent is a State of Florida agency responsible for surveying nursing homes to ensure compliance with applicable state and federal requirements. An annual survey was conducted by Respondent on Petitioner during June 17 through 20, 2002. As a result of the survey, Respondent asserted that Petitioner failed to adequately notify the attending physician of Resident No. 13's urinary tract infection, resulting in a delay in treatment of the infection. This resulted in citing Petitioner for a Class II deficiency, Tag F309, as follows: "Respondent failed to ensure that each resident received the necessary care and services to attain or maintain the highest practicable physical, mental, and psychological well-being, in accordance with the comprehensive assessment and plan of care." A federal scope and severity rating of level "G" was assigned to this deficiency. "Scope and severity" levels are identified by letters A through L. A level "G" rating requires that "harm or pain has come to the resident," more specifically, the resident must "have more than minimal harm with discomfort." If a level "G" scope and severity is assigned, a Class II deficiency is cited. Resident No. 13 was a 108-year-old female with a history of urinary tract infections. She was alert, oriented and articulate. She was capable of advising caregivers of her wants, needs, and physical condition. On May 27, 2002 Resident No. 13 complained of "some burning upon urinating." Petitioner's staff called Resident No. 13's attending physician by calling the "on-call" physician. The "on call" physician ordered a urinalysis and culture; a urine sample was obtained by Petitioner's staff noting that the urine was "cloudy." The laboratory that performs the testing is at a remote location. On May 28, 2002, the urinalysis results were received by Petitioner and transmitted by facsimile to the attending physician's office on the same day. The culture results were received by Petitioner on May 30, 2002, a Thursday, but were not faxed to the attending physician's office until June 1, 2002, a Saturday. On May 29, 2002, the attending physician performed a routine assessment and evaluation of Resident No. 13. His notes of the examination read as follows: No complaints. Feels well. Appetite is adequate. Otherwise, non-ROS. An extremely elderly lady doing quite well. Will continue to monitor and keep close tabs on her. On June 5, 2002, the nurses notes reflect that Resident No. 13 stated, "it hurts when I urinate." Her urine was discolored and was odiferous. Petitioner's staff notified the attending physician's office. The attending physician ordered the antibiotic, oxacillin, on June 6th. This antibiotic was inappropriate for Resident No. 13. On June 7, 2002, the attending physician ordered a second antibiotic, dioxicillin; this was also inappropriate, as there is no such antibiotic. Again, the physician was notified, and on June 8, 2002, he ordered an antibiotic, dicloxicillin, which was administered to Resident No. 13 during the early morning hours of the following day, June 9, 2002. Notwithstanding the administration of dicloxicillin, a broad spectrum antibiotic, the urinalysis and culture reports of the specimen taken on May 28, 2002, indicated colonized, saprophytic organisms and did not indicate pathologic organisms. 
The administration of an antibiotic is an optional treatment. The symptoms exhibited by Resident No. 13, burning sensation on urination, odiferous urine and a change in urine color can be caused by conditions other than urinary tract infections. Burning sensations can be caused by atrophic vaginitis and other non-pathogenic causes. Typical symptoms of a geriatric patient suspected of having a urinary tract infection are: fever, abdominal and flank pain, change in mental status, and fatigue. There is no indication in the records of Resident No. 13, during the relevant period, of the presence of these symptoms; the examination of the attending physician on May 28, 2002, does not indicate any symptoms typical of a urinary tract infection; in fact, he reports that Resident No. 13 is "doing quite well." Individuals familiar with Resident No. 13 observed no changes in her physical or mental status during the period from May 27 through June 8, 2002.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is Recommended that Respondent enter a final order determining that the deficiency described under Tag F309 in the June 17 through 20, 2002, survey did not occur, issue a Standard licensure rating to Petitioner, and that the Administrative Complaint seeking a fine be dismissed. DONE AND ENTERED this 15th day of July, 2003, in Tallahassee, Leon County, Florida. S JEFF B. CLARK Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 15th day of July, 2003. COPIES FURNISHED: Joanna Daniels, Esquire Agency for Health Care Administration 2727 Mahan Drive, Mail Station 3 Tallahassee, Florida 32308 Alex Finch, Esquire Goldsmith, Grout & Lewis, P.A. 2180 North Park Avenue, Suite 100 Post Office Box 2011 Winter Park, Florida 32790-2011 Lealand McCharen, Agency Clerk Agency for Health Care Administration 2727 Mahan Drive, Mail Stop 3 Tallahassee, Florida 32308 Valda Clark Christian, General Counsel Agency for Health Care Administration 2727 Mahan Drive Fort Knox Building III, Suite 3431 Tallahassee, Florida 32308

CFR (3) 42 CFR 483, 42 CFR 483.25, 42 CFR 483.25 Florida Laws (6) 120.569, 120.57, 400.022, 400.23, 400.235, 408.035
# 8
AHF MCO OF FLORIDA, INC., D/B/A PHC FLORIDA HIV/AIDS SPECIALTY PLAN vs AGENCY FOR HEALTH CARE ADMINISTRATION, 18-003508BID (2018)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Jul. 09, 2018 Number: 18-003508BID Latest Update: Jan. 25, 2019

The Issue Does Petitioner, AHF MCO of Florida, Inc., d/b/a PHC Florida HIV/AIDS Specialty Plan (Positive), have standing to contest the intended award to Simply for Regions 10 and 11 or to seek rejection of all proposals? (Case No. 18-3507 and 18-3508) Should the intended decision of Respondent, Agency for Health Care Administration (Agency), to contract with Simply Healthcare Plans, Inc. (Simply), for Medicaid managed care plans for HIV/AIDS patients in Regions 10 (Broward County) and Region 11 (Miami-Dade and Collier Counties) be invalidated and all proposals rejected? (Case Nos. 18-3507 and 18-3508) Must the Agency negotiate with Petitioner, South Florida Community Care Network, LLC, d/b/a Community Care Plan (Community), about a plan to provide HIV/AIDS Medicaid managed care services in Region 10 because it was the only responsive proposer of services that was a Provider Service Network (PSN)? (Case No. 18-3512) Must the Agency negotiate with Community to provide Medicaid managed care services in Region 10 for people with Serious Mental Illnesses because Community is a PSN? (Case No. 18-3511) Must the Agency contract with Community to provide Medicaid managed care services for Children with Special Needs in Region 10 because Community is a PSN? (Case No. 18-3513) Must the Agency negotiate with Community to provide Medicaid managed care services for Child Welfare patients in Region 10 because Community is a PSN? (Case No. 18-3514)

Findings Of Fact THE PARTIES Agency: Section 20.42, Florida Statutes, establishes the Agency as Florida's chief health policy and planning agency. The Agency is the single state agency authorized to select eligible plans to participate in the Medicaid program. Positive: Positive is a Florida not-for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Positive serves about 2,000 patients in Florida. Positive's health plan is accredited by the Accreditation Association for Ambulatory Healthcare. Its disease management program is accredited by the National Committee for Quality Assurance. Currently, the Agency contracts with Positive for a SMMC HIV/AIDS Specialty Plan serving Regions 10 and 11. Simply: Simply is a Florida for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Currently, the Agency contracts with Simply to provide a SMMC HIV/AIDS Specialty Plan for Regions 1 through 3 and 5 through 11. Simply has maintained the largest patient enrollment of all HIV/AIDS plans in Florida since Florida started its statewide Medicaid managed care program. Community Care: Community is a Florida limited liability company. It is a PSN as defined in sections 409.912(1)(b) and 409.962(14), Florida Statutes. Staywell: Staywell is the fictitious name for WellCare of Florida, Inc., serving Florida's Medicaid population. Sunshine: Sunshine State Health Plan (Sunshine) is a Florida corporation. It offers managed care plans to Florida Medicaid recipients. THE INVITATION TO NEGOTIATE TIMELINE On July 14, 2017, the Agency released 11 ITNs soliciting plans for Florida's Medicaid managed care program in 11 statutorily defined regions. Region 10, Broward County, and Region 11, Miami-Dade and Collier Counties, are the regions relevant to this proceeding. Part IV of chapter 409 creates a statewide, integrated managed care program for Medicaid services. This program, called Statewide Medicaid Managed Care, includes two programs: Managed Medical Assistance and Long-term Care. Section 409.966(2) directs the Agency to conduct separate and simultaneous procurements to select eligible plans for each region using the ITN procurement process created by section 287.057(1)(c). The ITNs released July 14, 2017, fulfilled that command. The Agency issued 11 identical ITNs of 624 pages, one for each region, in omnibus form. They provided elements for four types of plans. Some elements were common to all types. Others were restricted to a specific plan type defined by intended patient population. The plan types are comprehensive plans, long-term care plus plans, managed medical assistance plans, and specialty plans. Section 409.962(16) defines "Specialty Plan" as a "managed care plan that serves Medicaid recipients who meet specified criteria based on age, medical condition, or diagnosis." Responding vendors identified the plan type or types that they were proposing. The Agency issued Addendum No. 1 to the ITNs on September 14, 2017. On October 2, 2017, the Agency issued Addendum No. 2 to the ITNs. Addendum 2 included 628 questions about the ITNs and the Agency's responses to the questions. Florida law permits potential responders to an ITN to challenge the specifications of an ITN, including the addendums. § 120.57(3)(b), Fla. Stat. Nobody challenged the specifications of the ITNs.
As contemplated by section 287.057(c)(2), the Agency conducted "a conference or written question and answer period for purposes of assuring the vendors' full understanding of the solicitation requirements." Positive, Community, and Simply, along with United Healthcare of Florida, Inc., HIV/AIDS Specialty Plan (United), submitted responses to the ITN in Region 10 proposing HIV/AIDS Specialty Plans. Community was the only PSN to propose an HIV/AIDS plan for Region 10. Positive, Simply, and United submitted replies to the ITN for Region 11, proposing HIV/AIDS Specialty Plans. Community, United, Staywell, and one other provider submitted proposals to provide SMI Specialty Plan services in Region 10. Community was the only responding PSN. Community, Sunshine, and Staywell submitted proposals to provide Child Welfare Specialty Plans (CW) in Region 10. Community was the only PSN. Community, Staywell, and two others submitted proposals to offer Specialty Plans for Children with Special Needs (CSN) in Region 10. Community was one of two responding PSNs. Proposal scoring began November 6, 2017, and ended January 16, 2018. The Agency announced its intended awards on April 24, 2018. On April 24, 2018, the Agency issued its notices of intent to award specialty contracts in Regions 10 and 11. The following charts summarize the Agency's ranking of the proposals and its intended awards. The two highest ranked plans in each chart, ranked 1 and 2, are the plans the Agency selected for negotiations.
Region 10 – Children with Special Needs (Respondent / Intended Award / Ranking):
Staywell / No / 1
Community / No / 2
Miami Children's Health Plan, LLC / No / 3
Our Children PSN of Florida, LLC / No / 4
Region 10 – Child Welfare (Respondent / Intended Award / Ranking):
Staywell / No / 1
Sunshine / Yes / 2
Molina Healthcare of Florida, Inc. / No / 3
Community / No / 4
Region 10 – HIV/AIDS (Respondent / Intended Award / Ranking):
Simply / Yes / 1
United / No / 2
Community / No / 3
Positive / No / 4
Region 10 – Serious Mental Illness (Respondent / Intended Award / Ranking):
Staywell / Yes / 1
United / No / 2
Florida MHS, Inc. / No / 3
Community / No / 4
Region 11 – HIV/AIDS (Respondent / Intended Award / Ranking):
Simply / Yes / 1
United / No / 2
Positive / No / 3
All of the Specialty Plan awards noticed by the Agency went to bidders who also proposed, and received, comprehensive plan awards. The protests, referrals, and proceedings before the Division summarized in the Preliminary Statement followed the Agency's announcement of its intended awards. TERMS The voluminous ITN consisted of a two-page transmittal letter and three Attachments (A, B, and C), with a total of 34 exhibits to them. They are: Attachment A, Exhibits A-1 through A-8; Attachment B, Exhibits B-1 through B-3; and Attachment C, Exhibits C-1 through C-8. The ITN establishes a two-step selection process: an evaluation phase and a negotiation phase. In the evaluation phase, each respondent was required to submit a proposal responding to criteria of the ITN. Proposals were to be evaluated, scored, and ranked. The goal of the evaluation phase was to determine which respondents would move to negotiations, not which would be awarded a contract. The top two ranking Specialty Plans per specialty population would be invited to negotiations. In the negotiation phase, the Agency would negotiate with each invited respondent. After that, the Agency would announce its intended award of a contract to the plan or plans that the Agency determined would provide the best value.
Together, the attachments and exhibits combined instructions, criteria, forms, certifications, and data into a "one size fits all" document that described the information required for four categories of managed care plans to serve Medicaid patients. The ITN also provided data to consider in preparing responses. The transmittal letter emphasized, "Your response must comply fully with the instructions that stipulate what is to be included in the response." The ITNs identified Jennifer Barrett as the procurement officer and sole point of contact with the Agency for vendors. The transmittal letter is reproduced here. This solicitation is being issued by the State of Florida, Agency for Health Care Administration, hereinafter referred to as "AHCA" or "Agency", to select a vendor to provide Statewide Medicaid Managed Care Program services. The solicitation package consists of this transmittal letter and the following attachments and exhibits:
Attachment A Instructions and Special Conditions
Exhibit A-1 Questions Template
Exhibit A-2-a Qualification of Plan Eligibility
Exhibit A-2-b Provider Service Network Certification of Ownership and Controlling Interest
Exhibit A-2-c Additional Required Certifications and Statements
Exhibit A-3-a Milliman Organizational Conflict of Interest Mitigation Plan
Exhibit A-3-b Milliman Employee Organizational Conflict of Interest Affidavit
Exhibit A-4 Submission Requirements and Evaluation Criteria Instructions
Exhibit A-4-a General Submission Requirements and Evaluation Criteria
Exhibit A-4-a-1 SRC# 6 - General Performance Measurement Tool
Exhibit A-4-a-2 SRC# 9 - Expanded Benefits Tool (Regional)
Exhibit A-4-a-3 SRC# 10 - Additional Expanded Benefits Template (Regional)
Exhibit A-4-a-4 SRC# 14 - Standard CAHPS Measurement Tool
Exhibit A-4-b MMA Submission Requirements and Evaluation Criteria
Exhibit A-4-b-1 MMA SRC# 6 - Provider Network Agreements/Contracts (Regional)
Exhibit A-4-b-2 MMA SRC# 14 - MMA Performance Measurement Tool
Exhibit A-4-b-3 MMA SRC# 21 - Provider Network Agreements/Contracts Statewide Essential Providers
Exhibit A-4-c LTC Submission Requirements and Evaluation Criteria
Exhibit A-4-c-1 LTC SRC# 4 - Provider Network Agreements/Contracts (Regional)
Exhibit A-4-d Specialty Submission Requirements and Evaluation Criteria
Exhibit A-5 Summary of Respondent Commitments
Exhibit A-6 Summary of Managed Care Savings
Exhibit A-7 Certification of Drug-Free Workplace Program
Exhibit A-8 Standard Contract
Attachment B Scope of Service - Core Provisions
Exhibit B-1 Managed Medical Assistance (MMA) Program
Exhibit B-2 Long-Term Care (LTC) Program
Exhibit B-3 Specialty Plan
Attachment C Cost Proposal Instructions and Rate Methodology Narrative
Exhibit C-1 Capitated Plan Cost Proposal Template
Exhibit C-2 FFS PSN Cost Proposal Template
Exhibit C-3 Preliminary Managed Medical Assistance (MMA) Program Rate Cell Factors
Exhibit C-4 Managed Medical Assistance (MMA) Program Expanded Benefit Adjustment Factors
Exhibit C-5 Managed Medical Assistance (MMA) Program IBNR Adjustment Factors
Exhibit C-6 Managed Medical Assistance (MMA) Program Historical Capitated Plan Provider Contracting Levels During SFY 15/16 Time Period
Exhibit C-7 Statewide Medicaid Managed Care Data Book
Exhibit C-8 Statewide Medicaid Managed Care Data Book Questions and Answers
Your response must comply fully with the instructions that stipulate what is to be included in the response.
Respondents submitting a response to this solicitation shall identify the solicitation number, date and time of opening on the envelope transmitting their response. This information is used only to put the Agency mailroom on notice that the package received is a response to an Agency solicitation and therefore should not be opened, but delivered directly to the Procurement Officer. The ITN describes the plans as follows: Comprehensive Long-term Care Plan (herein referred to as a “Comprehensive Plan”) – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients. Long-term Care Plus Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients enrolled in the Long-term Care program. This plan type is not eligible to provide services to recipients who are only eligible for MMA services. Managed Medical Assistance (MMA) Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients. This plan type is not eligible to provide services to recipients who are eligible for Long-term Care services. Specialty Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients who are defined as a specialty population in the resulting Contract. Specialty Plans are at issue. The ITN did not define, describe, or specify specialty populations to be served. It left that to the responding vendors. Beyond that, the ITN left the ultimate definition of the specialty population for negotiation, saying in Section II(B)(1)(a) of Attachment B, Exhibit B-3, “[t]he Agency shall identify the specialty population eligible for enrollment in the Specialty Plan based on eligibility criteria based upon negotiations.” Some respondents directly identified the specialty population. Simply’s transmittal letter stated that it proposed “a Specialty plan for individuals with HIV/AIDS.” Positive’s response to Exhibit A-4-d Specialty SRC 4, eligibility and enrollment, stated, “the specialty population for the PHC [Positive] plan will be Medicaid eligible, male and female individuals from all age groups who are HIV positive with or without symptoms and those individuals who have progressed in their HIV disease to meet the CDC definition of AIDS.” Some others left definition of the specialty population to be inferred from the ITN response. The result is that the ITN left definition of the specialty populations initially to the respondents and ultimately to negotiations between the Agency and successful respondents. Petitioners and Intervenors describe the populations that they propose serving as HIV/AIDS patients, patients with SMI, CSN, and child welfare populations. ITN respondents could have proposed serving only cancer patients, serving only obstetric patients, or serving only patients with hemophilia. The part of the ITN requiring a respondent to identify the plan type for which it was responding offered only four alternative blocks to check. 
They were: “Comprehensive Plan,” Long-Term Care Plus Plan,” “Managed Medical Assistance Plan,” or “Specialty Plan.” Attachment A to the ITN, labeled “Instructions and Special Conditions,” provides an overview of the solicitation process; instructions for response preparation and content; information regarding response submission requirements; information regarding response evaluation, negotiations, and contract awards; and information regarding contract implementation. Exhibits A-1 to A-3 and A-5 to A-7 of the ITN contain various certifications and attestations that respondents had to prepare and verify. Exhibit A-4 contains submission requirement components (SRCs) to which respondents had to prepare written responses. Exhibit A-8 contains the state’s standard SMMC contract. ITN Exhibit A-4-a contains 36 general submission requirements and evaluation criteria (General SRCs). ITN Exhibit A-4-b contains 21 MMA submission requirements and evaluation criteria (MMA SRCs). ITN Exhibit A-4-c contains 13 LTC submission requirements and evaluation criteria (LTC SRCs). ITN Exhibit A-4-d contains five specialty submission requirements and evaluation criteria (Specialty SRCs). The responses that the 36 SRCs require vary greatly. Some are as simple as providing documents or listing items. Others require completing tables or spreadsheets with data. Consequently, responses to some SRCS apparently could be reviewed in very little time, even a minute or less. Others requiring narrative responses might take longer. Examples follow. General SRC 1 required a list of the respondent’s contracts for managed care services and 12 information items about them including things such as whether they were capitated, a narrative describing the scope of work; the number of enrollees; and accomplishments and achievement. General SRC 2 asked for documentation of experience operating a Medicaid health plan in Florida. General SRC 3 asked for information confirming the location of facilities and employees in Florida. General SRC 12 requested a flowchart and written description of how the respondent would execute its grievance and appeal system. It listed six evaluation criteria. MMA SRC 2 asks for a description of the respondent’s organizational commitment to quality improvement “as it relates to pregnancy and birth outcomes.” It lists seven evaluation criteria. MMA SRC 10 asks for a description of the respondent’s plan for transition of care between service settings. It lists six evaluation criteria including the respondent’s process for collaboration with providers. Specialty SRC 1 asks for detailed information about respondent’s managed care experience with the specialty population. Specialty SRC 5 asks for detailed information about the respondent’s provider network standards and provides five evaluation criteria for evaluating the answers. Exhibit A-8 of the ITN contains the standard SMMC contract. Attachment B and Exhibits B-1 to B-3 of the ITN contain information about the scope of service and core provisions for plans under the SMMC program. Attachment C and Exhibits C-1 to C-8 of the ITN contain information related to the cost proposals and rate methodologies for plans under the SMMC program. The ITN permitted potential respondents to submit written questions about the solicitation to the Agency by August 14, 2017. Some did. On September 14, 2017, the Agency issued Addendum No. 1 to the ITN. Among other things, Addendum No. 
1 changed the anticipated date for the Agency’s responses to respondents’ written questions from September 15 to October 2, 2017. The Agency issued Addendum No. 2 to the ITN on October 2, 2017. Addendum No. 2 included a chart with 628 written questions from potential respondents and the Agency’s answers. Attachment A at A 10-(d) makes it clear that the answers are part of the addendum. Both Addendums to the ITN cautioned that any protest of the terms, conditions, or specifications of the Addendums to the ITN had to be filed with the Agency within 72 hours of their posting. No respondent protested. Instructions for the A-4 Exhibits included these requirements: Each SRC contains form fields. Population of the form fields with text will allow the form field to expand and cross pages. There is no character limit. All SRCs, marked as “(Statewide)” must be identical for each region in which the respondent submits a reply. For timeliness of response evaluation, the Agency will evaluate each “(Statewide)” SRC once and transfer the score to each applicable region’s evaluation score sheet(s). The SRCs marked as “(Regional)” will be specific and only apply to the region identified in the solicitation and the evaluation score will not be transferred to any other region. The instructions continue: Agency evaluators will be instructed to evaluate the responses based on the narrative contained in the SRC form fields and the associated attachment(s), if applicable. Each response will be independently evaluated and awarded points based on the criteria and points scale using the Standard Evaluation Criteria Scale below unless otherwise identified in each SRC contained within Exhibit A-4. This is the scale: STANDARD EVALUATION CRITERIA SCALE Point Score Evaluation 0 The component was not addressed. 1 The component contained significant deficiencies. 2 The component is below average. 3 The component is average. 4 The component is above average. 5 The component is excellent. The ITN further explained that different SRCs would be worth different “weights,” based on the subject matter of the SRC and on whether they were General, MMA, LTC, or Specialty SRCs. It assigned weights by establishing different “weight factors” applied as multipliers to the score a respondent received on a criteria. For example, “Respondent Background/Experience” could generate a raw score of 90. Application of a weight factor of three made 270 the maximum possible score for this criteria. “Oversight and Accountability” could generate a raw score of 275. A weight factor of one, however, made the maximum score available 275. General SRC 6 solicits HEDIS data. HEDIS is a tool that consists of 92 measures across six domains of care that make it possible to compare the performance of health plans on an “apples-to-apples” basis. SRC 6 states: The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include, in table format, the target population (TANF, ABD, dual eligible), the respondent’s results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/ HEDIS 2016 and CY 2016/ HEDIS 2017) for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent’s largest Contracts. 
If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. The respondent shall provide the data requested in Exhibit A-4-a-1, General Performance Measurement Tool[.] x x x Score: This section is worth a maximum of 160 raw points x x x For each of the measure rates, a total of 10 points is available per state reported (for a total of 360 points available). The respondent will be awarded 2 points if their reported plan rate exceeded the national Medicaid mean and 2 points if their reported plan rate exceeded the applicable regional Medicaid mean, for each available year, for each available state. The respondent will be awarded an additional 2 points for each measure rate where the second year's rate is an improvement over the first year's rate, for each available state. An aggregate score will be calculated and respondents will receive a final score of 0 through 150 corresponding to the number and percentage of points received out of the total available points. For example, if a respondent receives 100% of the available 360 points, the final score will be 150 points (100%). If a respondent receives 324 (90%) of the available 360 points, the final score will be 135 points (90%). If a respondent receives 36 (10%) of the available 360 points, the final score will be 15 points (10%). The SRC is plainly referring to the broad Medicaid-eligible population when it says "the target population (TANF, ABD, dual eligible)." "Dual eligible" populations are persons eligible for both Medicaid and Medicare. There, as throughout the ITN, the ITN delineates between a target population of all Medicaid-eligible patients and a specialty population as described in a respondent's ITN proposal. The clear instructions for SRC 6 require, "Use the drop-down box to select the state for which you are reporting and enter the performance measure rates (to the hundredths place, or XX.XX) for that state's Medicaid population for the appropriate calendar year." Community did not comply. General SRC 14 solicits similar data, in similar form using a similar tool, about a respondent's Consumer Assessment of Healthcare Providers and Systems (CAHPS). CAHPS data is basically a satisfaction survey. It asks respondents to provide "in table format the target population (TANF, ABD, dual eligible) and the respondent's results for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) items/composites specified below for the 2017 survey for its adult and child populations for the respondent's three (3) largest Medicaid Contracts (as measured by number of enrollees)." Just as General SRC 6 did with HEDIS data, General SRC 14 instructed bidders to put their CAHPS data for the "target population (TANF, ABD, dual eligible)" "for the respondent's three (3) largest Medicaid Contracts (measured by number of enrollees)" for multiple states into an Excel spreadsheet "to the hundredths place[.]" Also, like General SRC 6, General SRC 14 includes an objective formula described in the ITN for scoring bidders' CAHPS data. RANKING PROVISIONS Attachment A at (D)(4)(c)(2) stated: Each response will be individually scored by at least three (3) evaluators, who collectively have experience and knowledge in the program areas and service requirements for which contractual services are sought by this solicitation. The Agency reserves the right to have specific sections of the response evaluated by less than three (3) individuals.
The ITN's example of how total point scores would be calculated, discussed below, also indicated that some sections may be scored by less than three evaluators. The explanatory chart had a column for "[o]ther Sections evaluated by less than three (3) evaluators." The Agency's policy, however, has been to assign at least three evaluators to score program-specific SRCs. Attachment A at (D)(4)(e)(2) advised respondents how the Agency will rank the competing responses. It was clear and specific, even providing an example of the process showing how the scores "will" be calculated. Step one of the explanatory chart stated that the Agency would calculate a total point score for each response. Step two stated that "[t]he total point scores will be used to rank the responses by an evaluator. . . ." Next, the rankings by the evaluator are averaged to determine the average rank for each respondent. This average ranking is critical because ranking is how the ITN said the Agency would select respondents for negotiation and how the Agency did select respondents for negotiation. The step two and step three charts, reproduced below, demonstrate that the ITN contemplated an evaluation process in which each response was to be evaluated in its entirety by three different evaluators, or maybe less than three, but indisputably in its entirety by those who evaluated it. This did not happen. Step 2: The total point scores will be used to rank the responses by evaluator (Response with the highest number of points = 1, second highest = 2, etc.).
POINTS SUMMARY (total points, by Evaluator A / B / C / D):
Respondent 1: 446 / 396 / 311 / 413
Respondent 2: 425 / 390 / 443 / 449
Respondent 3: 397 / 419 / 389 / 435
Respondent 4: 410 / 388 / 459 / 325
RANKING SUMMARY (rank, by Evaluator A / B / C / D):
Respondent 1: 1 / 2 / 4 / 3
Respondent 2: 2 / 3 / 2 / 1
Respondent 3: 4 / 1 / 3 / 2
Respondent 4: 3 / 4 / 1 / 4
c) Step 3: An average rank will be calculated for each response for all the evaluators.
Respondent 1: 1+2+4+3=10÷4=2.5
Respondent 2: 2+3+2+1=8÷4=2.0
Respondent 3: 4+1+3+2=10÷4=2.5
Respondent 4: 3+4+1+4=12÷4=3.0
PROVIDER SERVICE NETWORK PROVISIONS Florida law permits a PSN to limit services provided to a target population "based on age, chronic disease state, or medical condition of the enrollee." This allows a PSN to offer a specialty plan. For each region, the eligible plan requirements of section 409.974(1) state, "At least one plan must be a provider service network if any provider service networks submit a responsive bid." Section 409.974(3) says: "Participation by specialty plans shall be subject to the procurement requirements of this section. The aggregate enrollment of all specialty plans in a region may not exceed 10 percent of the total enrollees of that region." The ITN addressed those requirements. The Negotiation Process section of Attachment A, Instructions and Special Conditions, says: The Agency intends to invite the following number of respondents to negotiation:
Comprehensive Plans: The top four (4) ranking Comprehensive Plans
Long-term Care Plus Plans: The top two (2) ranking Long-term Care Plus Plans
Managed Medical Assistance Plans: The top two (2) ranking Managed Medical Assistance Plans
Specialty Managed Medical Assistance Plans: The top two (2) ranking Specialty Managed Medical Assistance Plans per specialty population.
If there are no provider service networks included in the top ranked respondents listed above, the Agency will invite the highest ranked PSN(s) to negotiations in order to fulfill the requirements of Section 409.974(1), Florida Statutes and Section 409.981(1), Florida Statutes. Emphasis supplied. The ITN specifications in Section D.7, titled Number of Awards, state as follows about Specialty Plan awards: 7. Number of Awards In accordance with Sections 409.966, 409.974, and 409.981, Florida Statutes, the Agency intends to select a limited number of eligible Managed Care Plans to provide services under the SMMC program in Region 10. The Agency anticipates issuing the number of Contract awards for Region 10 as described in Table 5, SMMC Region, below, excluding awards to Specialty MMA Plans.
Table 5 SMMC Region (Region / Total Anticipated Contract Awards): Region 10 / 4
If a respondent is awarded a Contract for multiple regions, the Agency will issue one (1) Contract to include all awarded regions. The Agency will award at least one (1) Contract to a PSN provided a PSN submits a responsive reply and negotiates a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. A respondent that is awarded a Contract as a Comprehensive Plan is determined to satisfy the requirements in Section 409.974, Florida Statutes and Section 409.981, Florida Statutes and shall be considered an awardee of an MMA Contract and a LTC Contract. The Agency will issue one (1) Contract to reflect all awarded populations in all awarded regions. In addition to the number of Contracts awarded in this region, additional Contracts may be awarded to Specialty Plans that negotiate terms and conditions determined to be the best value to the State and negotiate a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. The Agency reserves the right to make adjustments to the enrollee eligibility and identification criteria proposed by a Specialty Plan prior to Contract award in order to ensure that the aggregate enrollment of all awarded Specialty Plans in a region will not exceed ten percent (10%) of the total enrollees in that region, in compliance with Section 409.974(3), Florida Statutes. If a respondent is awarded a Contract as a Specialty Plan and another plan type, the Agency will issue one (1) Contract to include all awarded populations in all awarded regions. A prospective vendor asked about the interplay of Specialty Plan options and the PSN requirements. The question and the answer provided in Addendum 2 follow: Q. Please clarify the number of PSN awards per region and how PSN awards will be determined based on the PSN's plan type (e.g., Comprehensive, LTC Plus, MMA, Specialty). As you know, Sections 409.974 and 409.981, Florida Statutes require one MMA PSN and one LTC PSN award per region (assuming a PSN is responsive) and the Agency has stated that an award to a Comprehensive Plan PSN will meet the requirements of both statutes. However, can the Agency further clarify whether other types of PSNs would meet the statutory requirements? Specifically, would a PSN LTC Plus award meet the requirements of Section 409.981, Florida Statutes?
Similarly, would an award to a Specialty Plan PSN meet the requirements of Section 409.974, Florida Statutes? A. See Attachment A Instructions and Special Conditions, Section D Response Evaluations, and Contract Award, Sub-Section 7 Number of Awards. Yes, a PSN LTC Plus award would meet the requirements of Section 409.981(2). A Specialty Plan PSN would not meet the requirements of Section 409.974(1). The only reasonable interpretation of this answer is that Specialty Plan PSNs do not satisfy the requirement to contract with a responsive PSN imposed by section 409.974. None of the prospective vendors, including Community, challenged this clarification. EVALUATION PROCESS THE EVALUATORS The Agency selected 11 people to evaluate the proposals. The Agency assigned each person a number used to identify who was assigned to which task and to track performance of evaluation tasks. The procurement officer sent the evaluators a brief memo of instructions. It provided dates; described logistics of evaluation; emphasized the importance of independent evaluation; and prohibited communicating about the ITN and the proposals with anyone other than the procurement office. The Agency also conducted an instructional session for evaluators. Evaluator 1, Marie Donnelly: During the procurement, Ms. Donnelly was the Agency’s Chief of the Bureau of Medicaid Quality. She held this position for five years before resigning. This bureau bore responsibility for ensuring that the current SMMC plans met their contract requirements for quality and quality improvement measures. Her role specifically included oversight of Specialty Plans. Evaluator 2, Erica Floyd Thomas: Ms. Thomas is the chief of the Bureau of Medicaid Policy. She has worked for the Agency since 2001. Her Medicaid experience includes developing policies for hospitals, community behavioral health, residential treatment, and contract oversight. Before serving as bureau chief, she served as an Agency administrator from 2014 through 2017. Ms. Thomas oversaw the policy research and development process for all Medicaid medical, behavioral, dental, facility, and clinic coverage policies to ensure they were consistent with the state Plan and federal Medicaid requirements. Evaluator 3, Rachel LaCroix, Ph.D.: Dr. LaCroix is an administrator in the Agency’s Performance Evaluation and Research Unit. She has worked for the Agency since 2003. All her positions have been in the Medicaid program. Dr. LaCroix has served in her current position since 2011. She works with the performance measures and surveys that the current SMMC providers report to the Agency. Dr. LaCroix is a nationally recognized expert on healthcare quality metrics like HEDIS. She is also an appointee on the National Association of Medicaid Directors’ task force for national performance measures. Evaluator 4, Damon Rich: Mr. Rich has worked for the Agency since April 2009. He is the chief of the Agency’s Bureau of Recipient and Provider Assistance. This bureau interacts directly with AHCA’s current SMMC care providers about any issues they have, and with Medicaid recipients, usually about their eligibility or plan enrollment. Before Mr. Rich was a bureau chief, he worked as a field office manager for the Agency. Mr. Rich’s experience as bureau chief and field office manager includes oversight of the current SMMC Specialty Plans. Evaluator 5. Eunice Medina: Ms. 
Medina is the chief of the Agency’s Bureau of Medicaid Plan Management, which includes a staff of over 60 individuals, who manage the current SMMC contracts. Her experience and duties essentially encompass all aspects of the current SMMC plans. Ms. Medina started working with the Agency in 2014. Evaluator 6, Devona “DD” Pickle: Ms. Pickle most recently joined the Agency in 2011. She also worked for the Agency from November 2008 through November 2010. Ms. Pickle’s Agency experience all relates in some way to the Medicaid program. Since March 2013, Ms. Pickle has served as an administrator over managed care policy and contract development in the Bureau of Medicaid Policy. Her job duties include working with the current SMMC contractors. Ms. Pickle is also a Florida licensed mental health counselor. Evaluator 7, Tracy Hurd-Alvarez: Ms. Hurd-Alvarez has worked for the Agency’s Medicaid program since 1997. Since 2014, she has been a field office manager, overseeing compliance monitoring for all the current SMMC contractors. Before assuming her current position, Ms. Hurd-Alvarez implemented the LTC SMMC program. Evaluator 8, Gay Munyon: Ms. Munyon is currently the Chief of the Bureau of Medicaid Fiscal Agent Operations. Ms. Munyon began working with the Agency in April 2013. Ms. Munyon’s bureau oversees fulfillment of the Agency’s contract with the current SMMC fiscal agent. Her unit’s responsibilities include systems maintenance and modifications and overseeing the fiscal agent, which answers phone calls, processes claims, and processes applications. Ms. Munyon has 25 years of experience working with the Medicaid program. Evaluator 9, Laura Noyes: Ms. Noyes started working for the Agency in April 2011. Her years of Agency experience all relate to the Medicaid program, including overseeing six current comprehensive managed care plans by identifying trends in contractual non-compliance. Evaluator 10, Brian Meyer: Mr. Meyer is a CPA, who has worked for the Agency in the Medicaid program since 2011. He is currently chief of the Bureau of Medicaid Data Analytics. Mr. Meyer’s primary responsibility is overseeing the capitation rates for the current SMMC contractors. His experience includes Medicaid plan financial statement analysis, surplus requirement calculation analysis and, in general, all types of financial analysis necessary to understand financial performance of the state’s Medicaid plans. Evaluator 11, Ann Kaperak: Since April 2015, Ms. Kaperak has served as an administrator in the Agency’s Bureau of Medicaid Program Integrity. Ms. Kaperak’s unit oversees the fraud and abuse efforts of the current SMMC plans. She also worked for the Medicaid program from November 2012 through May 2014. Ms. Kaperak worked as a regulatory compliance manager for Anthem/Amerigroup’s Florida Medicaid program between May 2014 and April 2015. Positive and Community challenge the Agency’s plan selections by questioning the qualifications of the evaluators. The first part of their argument is that the evaluators did not have sufficient knowledge about HIV/AIDS and its treatment. The evidence does not prove the theory. For instance, Positive’s argument relies upon criticizing the amount of clinical experience evaluators had managing patients with HIV/AIDS. That approach minimizes the fact that the managed care plan characteristics involve so much more than disease- specific considerations. 
For instance, many of the components require determining if the respondent provided required documents, verifying conflict of interest documents, management structure, quality control measures, and the like. General SRCs asked for things like dispute resolution models (SRC 16), claims processing information (SRC 17), and fraud and abuse compliance plans (SRC 31). MMA SRCs included criteria, like telemedicine (SRC 4), demonstrated progress obtaining executed provider agreements (SRC 6), and a credentialing process (SRC 12). Specialty SRCs included criteria like copies of contracts for managed care for the proposed specialty population (SRC 1), specific and detailed criteria defining the proposed specialty population (SRC 4), and the like. The evidence does not prove that disease-specific experience is necessary to evaluate responses to these and other SRCs. SRC 6 involving HEDIS data and SRC 14 involving CAHPS data are two good examples. They required respondents to input data into a spreadsheet. All the evaluators had to do was determine what those numbers showed. Evaluation did not require any understanding of disease or how the measures were created. All the evaluator had to know was the number in the spreadsheet. The second part of the evaluator qualification criticisms is that the evaluators did not give adequate weight to some responses. Positive and Community just disagree with the measures requested and the evaluation of them. They conclude from that disagreement that the evaluators’ qualifications were deficient. The argument is not persuasive. The last sentence of paragraph 69 of Positive’s proposed recommended order exemplifies the criticisms of Positive and Community of the evaluators’ qualifications. It states, “The fact that PHC [Positive] was ranked last among competing HIV plans shows that the SRC evaluators did not understand enough about managing individuals with HIV/AIDs to score its proposal competently.” The argument is circular and “ipse dixit”. It does not carry the day. The collective knowledge and experience of the evaluators, with a total of 128 years of Medicaid experience, made them capable of reasonably evaluating the managed care plan proposals, including the Specialty plan proposals. The record certainly does not prove otherwise. EVALUATION PROCESS The Agency assigned the evaluators to the SRCs that it determined they were qualified to evaluate and score. The Agency did not assign entire responses to an evaluator for review. Instead it elected a piecemeal review process assigning various evaluators to various sections, the SRCs of each response. Paragraph 30 of the Agency’s proposed recommended order describes this decision as follows: Although the ITN had contemplated ranking each vendor by evaluator, based on an example in the ITN, such ranking presumed a process where all evaluators scored all or nearly all of the responses to the ITN, which had occurred in the procurement five years ago. In this procurement, each evaluator reviewed only a subset of SRCs based on their knowledge, and experience; therefore, ranking by evaluator was not logical because each had a different maximum point score. The initial SRC scoring assignments were: General SRCs 1 through 4, LTC SRCs 1 and 2, and Specialty SRC 1: Marie Donnelly, Laura Noyes, and Brian Meyer. General SRCs 5 through 8, MMA SRCs 1 through 7, LTC SRCs 3 and 4, and Specialty SRCs 1 and 2: Marie Donnelly, Erica Floyd- Thomas, and Rachel LaCroix. 
General SRCs 9 through 14, MMA SRCs 8 through 11, LTC SRCs 5 through 7, and Specialty SRC 4: Damon Rich, Eunice Medina, and DD Pickle. General SRCs 15 through 17, MMA SRCs 12 and 13, and LTC SRCs 8 through 10: Damon Rich, Tracy Hurd-Alvarez, and Gay Munyon. General SRCs 18 through 25, MMA SRCs 14 through 20, LTC SRCs 11 and 12, and Specialty SRC 5: Erica Floyd-Thomas, Eunice Medina, and DD Pickle. General SRCs 26 through 33 and LTC SRC 13: Gay Munyon, Ann Kaperak, and Brian Meyer. General SRCs 34 through 36 and MMA SRC 21: Marie Donnelly, Rachel LaCroix, and Tracy Hurd-Alvarez. The ranking process presented in the ITN and described in paragraphs 62-64 contemplated ranking each respondent by evaluator. The Agency carried this process over from an earlier procurement. In this procurement, despite what the ITN said, the Agency assigned responsibilities so that each evaluator reviewed only a subset of SRCs. Therefore, the ranking of responses by evaluator presented in the ITN could not work. It was not even possible because no one evaluator reviewed a complete response and because each SRC had a different maximum point score. Instead, the Agency, contrary to the terms of the ITN, ranked proposals by averaging the “total point scores” assigned by all of the evaluators. The Agency considered issuing an addendum advising the parties of the change. The addendum would have informed the respondents and provided them an opportunity to challenge the change. The Agency elected not to issue an addendum. EVALUATION AND SCORING The evaluators began scoring on November 6, 2017, with a completion deadline of December 29, 2017. The 11 evaluators had to score approximately 230 separate responses to the ITNs. The evaluators had to score 67,175 separate items to complete the scoring for all responses for all regions for all types of plans. No one at the Agency evaluated how much time it should take to score a particular item. None of the parties to this proceeding offered persuasive evidence to support a finding that scoring any particular item would or should take a specific length of time or that scoring all of the responses would or should take a specific length of time. Evaluators scored the responses in conference room F at the Agency’s headquarters. This secure room was the exclusive location for evaluation and scoring. Each evaluator had a dedicated workspace equipped with all tools and resources necessary for the task. The workspaces included a computer terminal for each evaluator. The system allowed evaluators to review digital copies of the ITN and proposals and to enter evaluation points in spreadsheets created for the purpose of recording scores. Evaluators also had access to hard copies of the proposals and the ITN. The Agency required evaluators to sign in and to sign out. The sign-in and sign-out sheets record the significant amount of time the evaluators spent evaluating proposals. Evaluators were not permitted to communicate with each other about the responses. To minimize distractions, the Agency prohibited cell phones, tablets, and other connected devices in the room. The Agency also authorized and encouraged the evaluators to delegate their usual responsibilities. Agency proctors observed the room and evaluators throughout the scoring process. They were available to answer general and procedural questions and to ensure that the evaluators signed in and signed out. A log sheet documented how much time each evaluator spent in the scoring conference room. Some evaluators took extensive notes. 
For example, Ms. Medina took over 200 pages of notes. Similarly, Ms. Munyon took nearly 400 pages of typewritten notes. The evaluators worked hard. None, other than Dr. LaCroix, testified that they did not have enough time to do their job. The computer system also automatically tracked the evaluators’ progress. Tracking reports showed the number of items assigned to each evaluator and the number of scoring items completed. The first status report was generated on December 8, 2017, approximately halfway through the scheduled scoring. At that time, only 28 percent of the scoring items were complete. Ms. Barrett usually ran the status reports in the morning. She made them available to the evaluators to review. The pace of evaluation caused concern about timely completion and prompted discussions of ways to accelerate scoring. Because it was clear that the majority of the evaluators would not complete scoring their SRCs by December 29, 2017, the Agency extended the scoring deadline to January 12, 2018. It also extended the hours for conference room use. Most respondents filed proposals for more than one type of plan and more than one region. This fact, combined with the provision in the instructions saying that all statewide SRC responses must be identical for each region and that scores would transfer to each applicable region’s score sheets, enabled evaluators to score many SRCs just once. The system would then auto-populate the scores to the same SRC for all proposals by that respondent. This time-saving measure permitted scoring on many of the items to be almost instantaneous after review of the first response to an SRC. The fact that so many respondents submitted proposals for so many regions and types of plans provided the Agency another opportunity for time-saving. The Agency loaded Adobe Pro on the evaluators’ computers as a time-saving measure. This program allowed the evaluators to compare a bidder’s Comprehensive Plan Proposal to the same company’s regional and Specialty Plan proposals. If the Adobe Pro comparison feature showed that the proposal response was the same for each plan, the Agency permitted evaluators to score the response once and assign the same score for each item where the respondent provided the same proposal. This sped up scoring. It meant, however, that for SRCs where evaluators did this, they were not reviewing the SRC response in the specific context of the specialty plan populations, each of which had specific and limited characteristics that made it different from the broader General and MMA plan populations. This is significant because so many SRCs required narrative responses where context would matter. There is no Specialty SRCs A-4 instruction requirement for specialty plans analogous to the requirement that responses for statewide SRCs must be identical for each region. In other words, the instructions do not say all SRCs marked as statewide must be identical for each specialty plan proposal and that the Agency will evaluate each Statewide SRC once and transfer the score to each applicable Specialty Plan score. In fact, according to the procurement officer, the Agency expected that evaluators would separately evaluate and score the statewide SRCs for Comprehensive Plans and for Specialty Plans, even if the same bidder submitted them. 
Despite the Agency’s expectation and the absence of an authorizing provision in the ITN, many evaluators, relying on the Adobe Pro tool, copied the SRC scores they gave to a respondent’s comprehensive plan proposal to its specialty plan proposal if the respondent submitted the same response to an SRC for a Comprehensive Plan and a Specialty Plan. For instance, Ms. Thomas (Evaluator 2) and Ms. Munyon (Evaluator 8) did this to save time. Ms. Donnelly (Evaluator 1) did this even when the comprehensive and specialty responses were not identical. This does not amount to the independent evaluation of the responses pledged by the ITN. On separate days, Evaluator 1 scored 1,315 items, 954 items, 779 items, and 727 items. On separate days, Evaluator 2 scored 613 items, 606 items, 720 items, 554 items, and 738 items. Evaluator 4 scored 874 items on one day. Evaluator 5 scored 813 items in one day. Evaluator 6 scored 1,001 items in one day. Evaluator 8 scored 635 items in one day. The record does not identify the items scored. It also does not permit determining how many of the item scores resulted from auto-population or assignment of scores based upon previous scoring of an identical response. It bears repeating, however, that the record does not support any finding on how long scoring the response to one SRC or an entire response could reasonably be expected to take. Even with the extended scoring period and time-saving measures, the Agency concluded that Evaluator 3 would not be able to finish all of the SRCs assigned to her. Rather than extend the deadline for scoring a second time, the Agency decided to reassign the nine of Evaluator 3’s SRCs that she had not begun scoring to two other evaluators. The Agency did not include scores of other SRCs for which Evaluator 3 had not completed scoring. The Agency only counted Evaluator 3’s scores for an SRC if she scored the SRC for everyone. The result was that only two people scored nine of the Specialty Plan SRCs. The Agency did not reassign all of Evaluator 3’s SRCs. It only reassigned the SRCs to evaluators who were qualified to evaluate the items, who were not already assigned those items to score, and who had already finished or substantially completed their own evaluations. The decision to reassign the SRCs was not based on any scoring that had already been completed. The Agency did not allow changes to data submitted by any of the vendors. It allowed vendors to exchange corrupted electronic files for ones which could be opened and allowed vendors to exchange electronic files to match up with the paper copies that had been submitted. The Agency allowed Community to correct its submission where it lacked a signature on its transmittal letter and allowed Community to exchange an electronic document that would not open. It did not allow Community to change its reported HEDIS scores, which were submitted in the decimal form required by the instructions. Community erred in the numbers that it reported. There is no evidence showing that other vendors received a competitive or unfair advantage over Community in the Agency’s review of the SMI Specialty Plan submission for Region 10. There was no evidence that the Agency allowed any other vendors to change any substantive information in their submittals for that proposed specialty in that region. HEDIS ISSUES Positive asserts that Simply’s proposal is non-responsive because Simply submitted HEDIS data from the general Medicaid population in response to SRC 6 and MMA SRC 14. 
Positive contends that Simply obtained a competitive advantage by supplying non-HIV/AIDS HEDIS data in response to SRC 6 and MMA SRC 14 because HIV/AIDS patients are generally a sicker group and require more care and because some HEDIS measures cannot be reported for an HIV/AIDS population. HEDIS stands for Healthcare Effectiveness Data and Information Set and is a set of standardized performance measures widely used in the healthcare industry. The instructions for both SRC 6 and MMA SRC 14 provide, in relevant part: The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include in table format, the target population (TANF, ABD, dual eligible), the respondent’s results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/HEDIS 2016 and CY 2016/HEDIS 2017) for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent’s largest Contracts. If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. (JE 1 at 75 (SRC 6); JE 1 at 158 (MMA SRC 14)). SRC 6 and MMA SRC 14 instruct respondents to provide HEDIS measures for “the target population (TANF, ABD, dual eligible).” Id. TANF, ABD, and dual eligible are eligibility classifications for the Medicaid population. The Agency sought information regarding the target Medicaid-eligible population, even from respondents proposing a Specialty Plan, because Specialty Plans are required to serve all of the healthcare needs of their recipients, not just the needs related to the criteria making those recipients eligible for the Specialty Plan. Following the instructions in SRC 6 and MMA SRC 14, Simply provided HEDIS data from the Medicaid-eligible population for its three largest Medicaid contracts as measured by the total number of enrollees. For the requested Florida HEDIS data, Simply utilized legacy HEDIS data from Amerigroup Florida, Inc., a Comprehensive Plan. Amerigroup and Simply had merged in October of 2017. Therefore, at the time of submission of Simply’s proposal, the HEDIS data from Amerigroup Florida was the data from Simply’s largest Medicaid contract in Florida for the period requested by the SRCs. Positive asserts that the Agency impermissibly altered scoring criteria after the proposals were submitted when the Agency corrected technical issues within a HEDIS Measurement Tool spreadsheet. SRC 6 and MMA SRC 14 required the submission of numeric data for the requested HEDIS performance measures. To simplify submission of the numeric data for the requested HEDIS performance measures, the Agency required respondents to utilize a HEDIS Measurement Tool spreadsheet. The evaluation criteria for SRC 6 and MMA SRC 14 provided that respondents would be awarded points if the reported HEDIS measures exceeded the national or regional mean for such performance measures. Some respondents, including Positive, entered “N/A,” “small denominator,” or other text inputs into the HEDIS Measurement Tool. 
During the evaluation and scoring process, the Agency discovered that if a respondent input any text into the HEDIS Measurement Tool, the tool would assign random amounts of points, even though the respondent had not input measurable, numeric data. The Agency reasonably resolved the problem by removing any text and inserting a zero in place of the text. The correction of the error in the HEDIS Measurement Tool prevented random points from being awarded to respondents and did not alter scores in any way contrary to the ITN. It was reasonable and fair to all respondents.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order rejecting all responses to the ITNs to provide a Medicaid Managed Care plan for patients with HIV/AIDS in Regions 10 and 11. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for patients with serious mental illness. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for child welfare specialty services. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order awarding Wellcare of Florida, Inc., d/b/a Staywell Health Plan of Florida, a contract for a specialty Medicaid Managed Care plan for patients with Serious Mental Illness in Region 10. Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order dismissing the Petition in Case No. 18-3513. DONE AND ENTERED this day of , , in Tallahassee, Leon County, Florida. S JOHN D. C. NEWTON, II Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this day of , .

USC (1) 42 U.S.C. 1396u; Florida Laws (9) 120.57, 20.42, 287.057, 409.912, 409.962, 409.966, 409.97, 409.974, 409.981
# 9
NAPLES COMMUNITY HOSPITAL, INC. vs AGENCY FOR HEALTH CARE ADMINISTRATION, 92-001510CON (1992)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Mar. 04, 1992 Number: 92-001510CON Latest Update: Jun. 08, 1993

The Issue Whether the application of Petitioner Naples Community Hospital, Inc. for a Certificate of Need to add a total of 35 beds to Naples Community Hospital and North Collier Community Hospital should be approved based on peak seasonal demand for acute care beds in the relevant subdistrict.

Findings Of Fact Naples Community Hospital, Inc. ("NCH") holds the license for and operates Naples Community Hospital ("Naples"), a 331-bed not-for-profit acute care hospital, and North Collier Community Hospital ("North Collier"), a 50-bed acute care hospital. NCH also operates a 22-bed comprehensive rehabilitation facility and a 23-bed psychiatric facility. NCH is owned by Community Health Care, Inc. ("CHC"). Both Naples and North Collier are located within Agency for Health Care Administration ("AHCA") district 8 and are the only hospitals within subdistrict 2 of the district. Naples is located in central Collier County. North Collier is (as the name implies) located in northern Collier County approximately 2-3 miles from the county line. NCH's primary service area is Collier County, from which approximately 85-90 percent of its patients come, with a secondary service area extending north into Lee County. Neither Naples nor North Collier is a teaching hospital as defined by Section 407.002(27), Florida Statutes (1991). NCH is not proposing a joint venture in this CON application. NCH has a record of providing health care services to Medicaid patients and the medically indigent. NCH proposes to provide health care services to Medicaid patients and the medically indigent. Neither Naples nor North Collier is currently designated by the Office of Medicaid as a disproportionate share provider. NCH has the funds for capital and initial operating expenditures for the project. NCH has sufficient financial resources to construct and equip the proposed project. The costs and methods of the proposed construction are reasonable. The Agency for Health Care Administration ("AHCA") is the state agency charged with responsibility for administering the Certificate of Need program. Southwest Florida Regional Medical Center ("Southwest") is a 400-bed for-profit acute care hospital located in Fort Myers, Lee County. Lee County is adjacent to and north of Collier County. Southwest is owned by Columbia Hospital Corporation ("Columbia"), which also owns Gulf Coast Hospital in Fort Myers, and two additional hospitals in AHCA District 8. Southwest's primary service area is Lee County. Although Southwest asserts that it would be negatively impacted by the addition of acute care beds at NCH, the greater weight of the credible evidence fails to support the assertion. The primary market service areas of NCH and Southwest are essentially distinct. However, the facilities are located in such proximity as to indicate that secondary service areas overlap and that, at least during peak winter season periods, approval of the NCH application could potentially impact Southwest's operations. Southwest has standing to participate in this proceeding. Southwest offered evidence to establish that it would be substantially affected by approval of the NCH application. The NCH length-of-stay identified in the Southwest documents is inaccurate and under-reports actual length-of-stay statistics. The documentation also includes demographic information from a zip code (33912) which contributes an insignificant portion of NCH patients, and relies on only two years of data in support of the assertion that utilization in the NCH service area is declining. Southwest's chief operating officer testified that he considers Gulf Coast Hospital, another Columbia-owned facility, to offer more competition to Southwest than does NCH. Further, a physician must have admitting privileges at a hospital before she can admit patients to the facility. 
Of the physicians holding admitting privileges at Southwest, only two, both cardiologists, also have admitting privileges at NCH. Unlike Southwest, NCH does not have an open heart surgery program. Accordingly, at least as to physician-admitted patients, approval of the NCH application would likely have little impact. On August 26, 1991, NCH submitted to AHCA a letter of intent indicating that NCH would file a Certificate of Need ("CON") application in the September 26, 1991 batching cycle for the addition of 35 acute care beds to the Naples and North Collier facilities. The letter of intent did not specify how the additional beds would be divided between the two facilities. The determination of the number of beds for which NCH would apply was based solely on the fact that the applicant had 35 observation beds which could be readily converted to acute care beds. The observation beds NCH proposes to convert are equipped identically to the acute care beds at NCH and are currently staffed. The costs involved in such conversion are minimal and relatively insignificant. Included with the letter of intent was a certified corporate resolution which states that on July 24, 1991, the NCH Board of Trustees authorized the filing of an application for the additional beds, authorized NCH to incur related expenses, stated that NCH would accomplish the proposed project within time and budget allowances set forth in the application, and that NCH would license and operate the facility. By certification executed August 7, 1991, the NCH secretary certified that the resolution was enacted at the July 24, 1991 board meeting and that the resolution did not contravene the NCH articles of incorporation or bylaws. Article X, Sections 10.1 and 10.1.3 of the NCH bylaws provide that no CON application shall be legally effective without the written approval of CHC. On September 26, 1991, NCH filed an application for CON No. 6797 proposing to add 31 acute care beds to Naples and 4 acute care beds to North Collier. The CON application included a copy of the NCH board resolution and certification which had been previously submitted with the letter of intent, as well as the appropriate filing fee. NCH published appropriate public notice of the application's filing. As of the date of the CON application's filing, CHC had not issued written approval of the CON application prior to the action of the NCH Board of Directors and the filing of the letter of intent or the application. On October 2, 1992, four days prior to the administrative hearing in this case, the board of CHC ratified the actions of NCH as to the application for CON at issue in this case. The CHC board has previously ratified actions of NCH in such fashion. There is uncontroverted testimony that the CHC board was aware of the NCH application and that no reservation was expressed by any CHC board member regarding the CON application. Although NCH's filing of the CON application without appropriate authorization from its parent company appears to be in violation of the NCH bylaws, such does not violate the rules of the AHCA. There is no evidence that the AHCA requested written authorization from the CHC board. After review of the application, the AHCA identified certain deficiencies in the application and notified NCH, which apparently rectified the deficiencies. The AHCA deemed the application complete on November 8, 1991. As required by statute, NCH included a list of capital projects as part of the CON application. 
The list of capital projects attached to the application was incomplete. The capital projects list failed to identify approximate expenditures of $370,000 to construct a patio enclosure, $750,000 to install an interim sprinkler system, $110,000 to construct emergency room triage space, and $125,000 to complete electrical system renovations. At hearing, witnesses for NCH attempted to clarify the omissions from the capital projects list. The witnesses claimed that such omitted projects were actually included within projects which were identified on the list. When identifying the listed projects within which the omitted projects were supposedly included, the witnesses testified inconsistently. For example, one witness testified that the patio project was included in the emergency room expansion project listed in the application. Another witness claimed that the patio enclosure was included in an equipment purchase category. Based on the testimony, it is more likely that the patio enclosure was neither a part of an emergency room expansion nor an equipment purchase, but was a separate construction project which was omitted from the CON application. Similarly inconsistent explanations were offered for the other projects which were omitted from the capital projects list. The testimony was not credible. The capital projects omitted from the list do not affect the ability of NCH to implement the CON sought in this proceeding. The parties stipulated to the fact that NCH has sufficient financial resources to construct and equip the proposed project. As part of the CON application, NCH was required to submit a pro forma income statement for the time period during which the bed additions would take place. The application failed to include a pro forma statement for the appropriate time period. Based on the stipulation of the parties that the costs and methods of the proposed construction are reasonable, and that NCH has adequate resources to fund the project, the failure to include the relevant pro forma is immaterial. Pursuant to applicable methodology, the AHCA calculates numeric acute care bed need projections for each subdistrict's specific planning period. Accordingly, the AHCA calculated the need for additional acute care beds in district 8, subdistrict 2 for the July 1996 planning horizon. The results of the calculation are published by the agency. The unchallenged, published fixed need pool for the planning horizon at issue in this proceeding indicated that there was no numeric need for additional acute care beds in district 8, subdistrict 2, Collier County, Florida, pursuant to the numeric need methodology under Rule 59C-1.038, Florida Administrative Code. The CON application filed by NCH is based on the peak seasonal demand experienced by hospitals in the area during the winter months, due to part-time residents. NCH asserts that the utilization of acute care beds during the winter months (January through April) results in occupancy levels in excess of 75 percent and justifies the addition of acute care beds, notwithstanding the numerical need determination. Approval of the CON application is not justified by the facts in this case. The AHCA's acute care bed need methodology accounts for high seasonal demand in certain subdistricts in a manner which provides that facilities have bed space adequate to accommodate peak demand. 
The calculation which requires that the average annual occupancy level exceed 75 percent reflects AHCA consideration of occupancy levels which rise and fall with seasonal population shifts. The applicant has not challenged the methodology employed by the AHCA in projecting need. Peak seasonal acute care bed demand may justify approval of a CON application seeking additional beds if the lack of available beds poses a credible threat of potentially negative impact on patient outcomes. The peak seasonal demand experienced by NCH has not adversely affected patient care, and there is insufficient evidence to establish that, at this time, such peak demand poses a credible threat of potential negative impact on patient outcomes in the foreseeable future. There is no dispute regarding the existing quality of care at Naples, North Collier, Southwest, or any other acute care hospital in district 8. The parties stipulated that NCH has the ability to provide quality of care and a record of providing quality of care. In this case, the applicant is seeking to convert existing beds from a classification of "observation" to "acute care". The observation beds NCH proposes to convert are equipped identically to the acute care beds at NCH. Approval of the CON application would result in no net increase in the number of licensed beds. NCH offered anecdotal evidence suggesting that delays in transferring patients from the Naples emergency room to acute care beds (a "logjam") were caused by peak seasonal occupancy rates. There was no evidence offered as to the situation at the North Collier emergency room. The anecdotal evidence is insufficient to establish that "logjams" (if they occur at all) are related to an inadequate number of beds identified as "acute care" at NCH facilities. There are other factors which can result in delays in moving patients from emergency rooms to acute care beds, including facility discharge patterns, delays in obtaining medical test results, and staffing practices. NCH asserted at hearing that physicians who refer patients to NCH facilities will not refer such patients to other facilities. The evidence fails to establish that such physician practice is reasonable or provides justification for approval of CON applications under "not normal" circumstances and further fails to establish that conditions at NCH are such as to result in physicians attempting to locate other facilities in which to admit patients. The rule governing approval of acute care beds provides that, prior to such approval, the annual occupancy rate for acute care beds in the subdistrict or for the specific provider must exceed 75 percent. This requirement has not been met. Applicable statutes require that, in considering applications for CONs, the AHCA consider accessibility of existing providers. The AHCA-established standard provides that acute care bed accessibility requirements are met when at least 90 percent of the residents in an urban subdistrict are within a 30-minute automobile trip to such facilities. At least 90 percent of Naples residents are presently within a 30-minute travel time to NCH acute care beds. The number of acute care beds in the subdistrict substantially exceeds the demand for such beds. Additional beds would result in inefficient utilization of existing beds, would further increase the current oversupply of beds, would delay the time at which need for additional beds may be determined and, as such, would prevent competing facilities from applying for and receiving approval for such beds. 
The financial feasibility projections set forth in the CON application rely on assumptions as to need and utilization projections which are not supported by the greater weight of the evidence and are not credited. Accordingly, the evidence fails to establish that the addition of 35 acute care beds to NCH facilities is financially feasible in the long term or that the income projections set forth in the CON application are reasonable. As to projections related to staffing requirements and costs, the beds already exist and are currently staffed on a daily, shift-by-shift basis, based on patient census and acuity of illness. There is reason to believe that the staffing patterns will remain fairly constant, and accordingly the projections, based on historical data, are reasonable. Generally stated, where there is no numeric or "not normal" need for the proposed addition of 35 acute care beds in the relevant subdistrict, it could be predicted that the addition of acute care beds would exacerbate the oversupply of available beds and could cause a slight reduction in the occupancy levels experienced by other providers. In this case, the market service areas are sufficiently distinct as to suggest that such would not necessarily be the result. However, based on the lack of need justifying approval of the CON application under any existing circumstances, it is unnecessary to address in detail the impact on existing providers. The state and district health plans identify a number of preferences which should be considered in determining whether a CON application should be approved. The plans suggest that such preferences are to be considered when competing CON applications are reviewed. In this case, there is no competing application and the applicability of the preferences is unclear. However, in any event, application of the preferences to this proposal fails to support approval of the application.

Recommendation RECOMMENDED that a Final Order be entered DENYING the application of Naples Community Hospital, Inc., for Certificate of Need 6797. DONE and RECOMMENDED this 19th day of March, 1993, in Tallahassee, Florida. WILLIAM F. QUATTLEBAUM Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 19th day of March, 1993. APPENDIX TO RECOMMENDED ORDER, CASE NO. 92-1510 To comply with the requirements of Section 120.59(2), Florida Statutes, the following constitute rulings on proposed findings of fact submitted by the parties. Petitioner The Petitioner's proposed findings of fact are accepted as modified and incorporated in the Recommended Order except as follows: 3-4, 6-8, 16-20, 29-36, 38, 41, 44, 47, 49-61, 80, 88, 95-96, 100, 104, 108, 117-119, 122-125, 127, 134-138. Rejected as unnecessary. 15. Rejected as irrelevant. Peak seasonal demand is accounted for by the numeric need determination methodology. There is no credible evidence which supports a calculation of three years of four-month winter occupancy to reach a 12-month average occupancy rate. 21-27, 37, 42-43, 62-64, 66, 97, 99, 101-103, 105-107, 109, 120-121, 126. Rejected as not supported by the greater weight of credible and persuasive evidence. 28. Rejected as not supported by the greater weight of credible and persuasive evidence and contrary to the stipulation filed by the parties. Rejected as not supported by the greater weight of credible and persuasive evidence, which fails to establish that the transfer of patients from emergency room to acute care beds is delayed due to numerical availability of beds. Rejected as not supported by the greater weight of credible and persuasive evidence, which fails to establish that the alleged lack of acute care beds is based on an insufficient number of total beds as opposed to other factors which affect bed availability. Rejected as immaterial and contrary to the greater weight of the evidence. Rejected as immaterial and contrary to the greater weight of the evidence, which fails to establish the reasonableness of considering only a four-month period under "not normal" circumstances where the period and the peak seasonal demand are included within the averages utilized to project bed need. 86. Rejected as cumulative. 114. Rejected as unsupported hearsay. Respondent/Intervenor The Respondent and Intervenor filed a joint proposed recommended order. The proposed order's findings of fact are accepted as modified and incorporated in the Recommended Order except as follows: 6, 45, 51, 53, 59-67, 69-70, 94-113. Rejected as unnecessary. 16. Rejected as to use of term "false", conclusion of law. 58. Rejected as not clearly supported by credible evidence. 71-93, 114-124. Rejected as cumulative. COPIES FURNISHED: Douglas M. Cook, Director Agency for Health Care Administration 2727 Mahan Drive Tallahassee, Florida 32308 Sam Power, Agency Clerk Agency for Health Care Administration The Atrium, Suite 301 325 John Knox Road Tallahassee, Florida 32303 Harold D. Lewis, Esquire Agency for Health Care Administration The Atrium, Suite 301 325 John Knox Road Tallahassee, Florida 32303 W. David Watkins, Esquire Oertel, Hoffman, Fernandez, & Cole Post Office Box 6507 Tallahassee, Florida 32314-6507 Edward G. Labrador, Esquire Thomas Cooper, Esquire Agency for Health Care Administration 2727 Mahan Drive Tallahassee, Florida 32308 John D.C. 
Newton, II, Esquire Aurell, Radey, Hinkle, Thomas & Beranek Monroe Park Tower, Suite 1000 101 North Monroe Street Post Office Drawer 11307 Tallahassee, Florida 32302

Florida Laws (1) 120.57; Florida Administrative Code (1) 59C-1.008
# 10
