SOUTH FLORIDA COMMUNITY CARE NETWORK, LLC, D/B/A COMMUNITY CARE PLAN vs AGENCY FOR HEALTH CARE ADMINISTRATION, 18-003513BID (2018)
Division of Administrative Hearings, Florida Filed: Tallahassee, Florida Jul. 09, 2018 Number: 18-003513BID Latest Update: Jan. 25, 2019

The Issue

Does Petitioner, AHF MCO of Florida, Inc., d/b/a PHC Florida HIV/AIDS Specialty Plan (Positive), have standing to contest the intended award to Simply for Regions 10 and 11 or to seek rejection of all proposals? (Case Nos. 18-3507 and 18-3508)

Should the intended decision of Respondent, Agency for Health Care Administration (Agency), to contract with Simply Healthcare Plans, Inc. (Simply), for Medicaid managed care plans for HIV/AIDS patients in Region 10 (Broward County) and Region 11 (Miami-Dade and Collier Counties) be invalidated and all proposals rejected? (Case Nos. 18-3507 and 18-3508)

Must the Agency negotiate with Petitioner, South Florida Community Care Network, LLC, d/b/a Community Care Plan (Community), about a plan to provide HIV/AIDS Medicaid managed care services in Region 10 because it was the only responsive proposer of services that was a Provider Service Network (PSN)? (Case No. 18-3512)

Must the Agency negotiate with Community to provide Medicaid managed care services in Region 10 for people with Serious Mental Illnesses because Community is a PSN? (Case No. 18-3511)

Must the Agency contract with Community to provide Medicaid managed care services for Children with Special Needs in Region 10 because Community is a PSN? (Case No. 18-3513)

Must the Agency negotiate with Community to provide Medicaid managed care services for Child Welfare patients in Region 10 because Community is a PSN? (Case No. 18-3514)

Findings of Fact

THE PARTIES

Agency: Section 20.42, Florida Statutes, establishes the Agency as Florida’s chief health policy and planning agency. The Agency is the single state agency authorized to select eligible plans to participate in the Medicaid program.

Positive: Positive is a Florida not-for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Positive serves about 2,000 patients in Florida. Positive’s health plan is accredited by the Accreditation Association for Ambulatory Healthcare. Its disease management program is accredited by the National Committee for Quality Assurance. Currently, the Agency contracts with Positive for an SMMC HIV/AIDS Specialty Plan serving Regions 10 and 11.

Simply: Simply is a Florida for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Currently, the Agency contracts with Simply to provide an SMMC HIV/AIDS Specialty Plan for Regions 1 through 3 and 5 through 11. Simply has maintained the largest patient enrollment of all HIV/AIDS plans in Florida since Florida started its statewide Medicaid managed care program.

Community Care: Community is a Florida limited liability company. It is a PSN as defined in sections 409.912(1)(b) and 409.962(14), Florida Statutes.

Staywell: Staywell is the fictitious name for WellCare of Florida, Inc., serving Florida’s Medicaid population.

Sunshine: Sunshine State Health Plan (Sunshine) is a Florida corporation. It offers managed care plans to Florida Medicaid recipients.

THE INVITATION TO NEGOTIATE TIMELINE

On July 14, 2017, the Agency released 11 ITNs for Florida’s Medicaid managed care program in 11 statutorily defined regions. Region 10 (Broward County) and Region 11 (Miami-Dade and Collier Counties) are the regions relevant to this proceeding. Part IV of chapter 409 creates a statewide, integrated managed care program for Medicaid services. This program, called Statewide Medicaid Managed Care (SMMC), includes two programs: Managed Medical Assistance and Long-term Care. Section 409.966(2) directs the Agency to conduct separate and simultaneous procurements to select eligible plans for each region using the ITN procurement process created by section 287.057(1)(c). The ITNs released July 14, 2017, fulfilled that command.

The Agency issued 11 identical ITNs of 624 pages, one for each region, in omnibus form. They provided elements for four types of plans. Some elements were common to all types. Others were restricted to a specific plan type defined by intended patient population. The plan types are comprehensive plans, long-term care plus plans, managed medical assistance plans, and specialty plans. Section 409.962(16) defines “Specialty Plan” as a “managed care plan that serves Medicaid recipients who meet specified criteria based on age, medical condition, or diagnosis.” Responding vendors identified the plan type or types that they were proposing.

The Agency issued Addendum No. 1 to the ITNs on September 14, 2017. On October 2, 2017, the Agency issued Addendum No. 2 to the ITNs. Addendum No. 2 included 628 questions about the ITNs and the Agency’s responses to those questions. Florida law permits potential respondents to an ITN to challenge the specifications of an ITN, including the addendums. § 120.57(3)(b), Fla. Stat. Nobody challenged the specifications of the ITNs.
As contemplated by section 287.057(c)(2), the Agency conducted “a conference or written question and answer period for purposes of assuring the vendors’ full understanding of the solicitation requirements.”

Positive, Community, and Simply, along with United Healthcare of Florida, Inc., HIV/AIDS Specialty Plan (United), submitted responses to the ITN in Region 10 proposing HIV/AIDS Specialty Plans. Community was the only PSN to propose an HIV/AIDS plan for Region 10. Positive, Simply, and United submitted replies to the ITN for Region 11, proposing HIV/AIDS Specialty Plans. Community, United, Staywell, and one other provider submitted proposals to provide SMI Specialty Plan services in Region 10. Community was the only responding PSN. Community, Sunshine, and Staywell submitted proposals to provide Child Welfare Specialty Plans (CW) in Region 10. Community was the only PSN. Community, Staywell, and two others submitted proposals to offer Specialty Plans for Children with Special Needs (CSN) in Region 10. Community was one of two responding PSNs.

Proposal scoring began November 6, 2017, and ended January 16, 2018. The Agency announced its intended awards on April 24, 2018. On April 24, 2018, the Agency issued its notices of intent to award specialty contracts in Regions 10 and 11. The following charts summarize the Agency’s ranking of the proposals and its intended awards. The two highest ranked plans in each chart, which the Agency selected for negotiations, are those ranked 1 and 2.

Region 10 – Children with Special Needs
Respondent                           Intended Award   Ranking
Staywell                             No               1
Community                            No               2
Miami Children’s Health Plan, LLC    No               3
Our Children PSN of Florida, LLC     No               4

Region 10 – Child Welfare
Respondent                           Intended Award   Ranking
Staywell                             No               1
Sunshine                             Yes              2
Molina Healthcare of Florida, Inc.   No               3
Community                            No               4

Region 10 – HIV/AIDS
Respondent                           Intended Award   Ranking
Simply                               Yes              1
United                               No               2
Community                            No               3
Positive                             No               4

Region 10 – Serious Mental Illness
Respondent                           Intended Award   Ranking
Staywell                             Yes              1
United                               No               2
Florida MHS, Inc.                    No               3
Community                            No               4

Region 11 – HIV/AIDS
Respondent                           Intended Award   Ranking
Simply                               Yes              1
United                               No               2
Positive                             No               3

All of the Specialty Plan awards noticed by the Agency went to bidders who also proposed, and received, comprehensive plan awards. The protests, referrals, and proceedings before the Division summarized in the Preliminary Statement followed the Agency’s announcement of its intended awards.

TERMS

The voluminous ITN consisted of a two-page transmittal letter and three Attachments (A, B, and C), with a total of 34 exhibits to them. They are: Attachment A, Exhibits A-1 through A-8; Attachment B, Exhibits B-1 through B-3; and Attachment C, Exhibits C-1 through C-8. The ITN establishes a two-step selection process: an evaluation phase and a negotiation phase. In the evaluation phase, each respondent was required to submit a proposal responding to criteria of the ITN. Proposals were to be evaluated, scored, and ranked. The goal of the evaluation phase was to determine which respondents would move to negotiations, not which would be awarded a contract. The top two ranking Specialty Plans per specialty population would be invited to negotiations. In the negotiation phase, the Agency would negotiate with each invited respondent. After that, the Agency would announce its intended award of a contract to the plan or plans that the Agency determined would provide the best value.
Together, the attachments and exhibits combined instructions, criteria, forms, certifications, and data into a “one size fits all” document that described the information required for four categories of managed care plans to serve Medicaid patients. The ITN also provided data to consider in preparing responses. The transmittal letter emphasized, “Your response must comply fully with the instructions that stipulate what is to be included in the response.” The ITNs identified Jennifer Barrett as the procurement officer and sole point of contact with the Agency for vendors. The transmittal letter is reproduced here.

This solicitation is being issued by the State of Florida, Agency for Health Care Administration, hereinafter referred to as “AHCA” or “Agency”, to select a vendor to provide Statewide Medicaid Managed Care Program services. The solicitation package consists of this transmittal letter and the following attachments and exhibits:

Attachment A Instructions and Special Conditions
Exhibit A-1 Questions Template
Exhibit A-2-a Qualification of Plan Eligibility
Exhibit A-2-b Provider Service Network Certification of Ownership and Controlling Interest
Exhibit A-2-c Additional Required Certifications and Statements
Exhibit A-3-a Milliman Organizational Conflict of Interest Mitigation Plan
Exhibit A-3-b Milliman Employee Organizational Conflict of Interest Affidavit
Exhibit A-4 Submission Requirements and Evaluation Criteria Instructions
Exhibit A-4-a General Submission Requirements and Evaluation Criteria
Exhibit A-4-a-1 SRC# 6 - General Performance Measurement Tool
Exhibit A-4-a-2 SRC# 9 - Expanded Benefits Tool (Regional)
Exhibit A-4-a-3 SRC# 10 - Additional Expanded Benefits Template (Regional)
Exhibit A-4-a-4 SRC# 14 - Standard CAHPS Measurement Tool
Exhibit A-4-b MMA Submission Requirements and Evaluation Criteria
Exhibit A-4-b-1 MMA SRC# 6 - Provider Network Agreements/Contracts (Regional)
Exhibit A-4-b-2 MMA SRC# 14 - MMA Performance Measurement Tool
Exhibit A-4-b-3 MMA SRC# 21 - Provider Network Agreements/Contracts Statewide Essential Providers
Exhibit A-4-c LTC Submission Requirements and Evaluation Criteria
Exhibit A-4-c-1 LTC SRC# 4 - Provider Network Agreements/Contracts (Regional)
Exhibit A-4-d Specialty Submission Requirements and Evaluation Criteria
Exhibit A-5 Summary of Respondent Commitments
Exhibit A-6 Summary of Managed Care Savings
Exhibit A-7 Certification of Drug-Free Workplace Program
Exhibit A-8 Standard Contract
Attachment B Scope of Service - Core Provisions
Exhibit B-1 Managed Medical Assistance (MMA) Program
Exhibit B-2 Long-Term Care (LTC) Program
Exhibit B-3 Specialty Plan
Attachment C Cost Proposal Instructions and Rate Methodology Narrative
Exhibit C-1 Capitated Plan Cost Proposal Template
Exhibit C-2 FFS PSN Cost Proposal Template
Exhibit C-3 Preliminary Managed Medical Assistance (MMA) Program Rate Cell Factors
Exhibit C-4 Managed Medical Assistance (MMA) Program Expanded Benefit Adjustment Factors
Exhibit C-5 Managed Medical Assistance (MMA) Program IBNR Adjustment Factors
Exhibit C-6 Managed Medical Assistance (MMA) Program Historical Capitated Plan Provider Contracting Levels During SFY 15/16 Time Period
Exhibit C-7 Statewide Medicaid Managed Care Data Book
Exhibit C-8 Statewide Medicaid Managed Care Data Book Questions and Answers

Your response must comply fully with the instructions that stipulate what is to be included in the response.
Respondents submitting a response to this solicitation shall identify the solicitation number, date and time of opening on the envelope transmitting their response. This information is used only to put the Agency mailroom on notice that the package received is a response to an Agency solicitation and therefore should not be opened, but delivered directly to the Procurement Officer. The ITN describes the plans as follows: Comprehensive Long-term Care Plan (herein referred to as a “Comprehensive Plan”) – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients. Long-term Care Plus Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients enrolled in the Long-term Care program. This plan type is not eligible to provide services to recipients who are only eligible for MMA services. Managed Medical Assistance (MMA) Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients. This plan type is not eligible to provide services to recipients who are eligible for Long-term Care services. Specialty Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients who are defined as a specialty population in the resulting Contract. Specialty Plans are at issue. The ITN did not define, describe, or specify specialty populations to be served. It left that to the responding vendors. Beyond that, the ITN left the ultimate definition of the specialty population for negotiation, saying in Section II(B)(1)(a) of Attachment B, Exhibit B-3, “[t]he Agency shall identify the specialty population eligible for enrollment in the Specialty Plan based on eligibility criteria based upon negotiations.” Some respondents directly identified the specialty population. Simply’s transmittal letter stated that it proposed “a Specialty plan for individuals with HIV/AIDS.” Positive’s response to Exhibit A-4-d Specialty SRC 4, eligibility and enrollment, stated, “the specialty population for the PHC [Positive] plan will be Medicaid eligible, male and female individuals from all age groups who are HIV positive with or without symptoms and those individuals who have progressed in their HIV disease to meet the CDC definition of AIDS.” Some others left definition of the specialty population to be inferred from the ITN response. The result is that the ITN left definition of the specialty populations initially to the respondents and ultimately to negotiations between the Agency and successful respondents. Petitioners and Intervenors describe the populations that they propose serving as HIV/AIDS patients, patients with SMI, CSN, and child welfare populations. ITN respondents could have proposed serving only cancer patients, serving only obstetric patients, or serving only patients with hemophilia. The part of the ITN requiring a respondent to identify the plan type for which it was responding offered only four alternative blocks to check. 
They were: “Comprehensive Plan,” “Long-Term Care Plus Plan,” “Managed Medical Assistance Plan,” or “Specialty Plan.”

Attachment A to the ITN, labeled “Instructions and Special Conditions,” provides an overview of the solicitation process; instructions for response preparation and content; information regarding response submission requirements; information regarding response evaluation, negotiations, and contract awards; and information regarding contract implementation. Exhibits A-1 to A-3 and A-5 to A-7 of the ITN contain various certifications and attestations that respondents had to prepare and verify. Exhibit A-4 contains submission requirement components (SRCs) to which respondents had to prepare written responses. Exhibit A-8 contains the state’s standard SMMC contract. ITN Exhibit A-4-a contains 36 general submission requirements and evaluation criteria (General SRCs). ITN Exhibit A-4-b contains 21 MMA submission requirements and evaluation criteria (MMA SRCs). ITN Exhibit A-4-c contains 13 LTC submission requirements and evaluation criteria (LTC SRCs). ITN Exhibit A-4-d contains five specialty submission requirements and evaluation criteria (Specialty SRCs).

The responses that the 36 SRCs require vary greatly. Some are as simple as providing documents or listing items. Others require completing tables or spreadsheets with data. Consequently, responses to some SRCs apparently could be reviewed in very little time, even a minute or less. Others requiring narrative responses might take longer. Examples follow. General SRC 1 required a list of the respondent’s contracts for managed care services and 12 information items about them, including whether they were capitated; a narrative describing the scope of work; the number of enrollees; and accomplishments and achievements. General SRC 2 asked for documentation of experience operating a Medicaid health plan in Florida. General SRC 3 asked for information confirming the location of facilities and employees in Florida. General SRC 12 requested a flowchart and written description of how the respondent would execute its grievance and appeal system. It listed six evaluation criteria. MMA SRC 2 asks for a description of the respondent’s organizational commitment to quality improvement “as it relates to pregnancy and birth outcomes.” It lists seven evaluation criteria. MMA SRC 10 asks for a description of the respondent’s plan for transition of care between service settings. It lists six evaluation criteria, including the respondent’s process for collaboration with providers. Specialty SRC 1 asks for detailed information about the respondent’s managed care experience with the specialty population. Specialty SRC 5 asks for detailed information about the respondent’s provider network standards and provides five evaluation criteria for evaluating the answers.

Exhibit A-8 of the ITN contains the standard SMMC contract. Attachment B and Exhibits B-1 to B-3 of the ITN contain information about the scope of service and core provisions for plans under the SMMC program. Attachment C and Exhibits C-1 to C-8 of the ITN contain information related to the cost proposals and rate methodologies for plans under the SMMC program.

The ITN permitted potential respondents to submit written questions about the solicitation to the Agency by August 14, 2017. Some did. On September 14, 2017, the Agency issued Addendum No. 1 to the ITN. Among other things, Addendum No. 1 changed the anticipated date for the Agency’s responses to respondents’ written questions from September 15 to October 2, 2017.
The Agency issued Addendum No. 2 to the ITN on October 2, 2017. Addendum No. 2 included a chart with 628 written questions from potential respondents and the Agency’s answers. Attachment A at A 10-(d) makes it clear that the answers are part of the addendum. Both Addendums to the ITN cautioned that any protest of the terms, conditions, or specifications of the Addendums to the ITN had to be filed with the Agency within 72 hours of their posting. No respondent protested.

Instructions for the A-4 Exhibits included these requirements: Each SRC contains form fields. Population of the form fields with text will allow the form field to expand and cross pages. There is no character limit. All SRCs marked as “(Statewide)” must be identical for each region in which the respondent submits a reply. For timeliness of response evaluation, the Agency will evaluate each “(Statewide)” SRC once and transfer the score to each applicable region’s evaluation score sheet(s). The SRCs marked as “(Regional)” will be specific and only apply to the region identified in the solicitation, and the evaluation score will not be transferred to any other region.

The instructions continue: Agency evaluators will be instructed to evaluate the responses based on the narrative contained in the SRC form fields and the associated attachment(s), if applicable. Each response will be independently evaluated and awarded points based on the criteria and points scale using the Standard Evaluation Criteria Scale below unless otherwise identified in each SRC contained within Exhibit A-4. This is the scale:

STANDARD EVALUATION CRITERIA SCALE
Point Score   Evaluation
0             The component was not addressed.
1             The component contained significant deficiencies.
2             The component is below average.
3             The component is average.
4             The component is above average.
5             The component is excellent.

The ITN further explained that different SRCs would be worth different “weights,” based on the subject matter of the SRC and on whether they were General, MMA, LTC, or Specialty SRCs. It assigned weights by establishing different “weight factors” applied as multipliers to the score a respondent received on a criterion. For example, “Respondent Background/Experience” could generate a raw score of 90. Application of a weight factor of three made 270 the maximum possible score for this criterion. “Oversight and Accountability” could generate a raw score of 275. A weight factor of one, however, made the maximum score available 275.

General SRC 6 solicits HEDIS data. HEDIS is a tool that consists of 92 measures across six domains of care that make it possible to compare the performance of health plans on an “apples-to-apples” basis. SRC 6 states:

The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include, in table format, the target population (TANF, ABD, dual eligible), the respondent’s results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/HEDIS 2016 and CY 2016/HEDIS 2017) for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent’s largest Contracts.
If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. The respondent shall provide the data requested in Exhibit A-4-a-1, General Performance Measurement Tool[.] x x x Score: This section is worth a maximum of 160 raw points x x x For each of the measure rates, a total of 10 points is available per state reported (for a total of 360 points available). The respondent will be awarded 2 points if their reported plan rate exceeded the national Medicaid mean and 2 points if their reported plan rate exceeded the applicable regional Medicaid mean, for each available year, for each available state. The respondent will be awarded an additional 2 points for each measure rate where the second year’s rate is an improvement over the first year’s rate, for each available state. An aggregate score will be calculated and respondents will receive a final score of 0 through 150 corresponding to the number and percentage of points received out of the total available points. For example, if a respondent receives 100% of the available 360 points, the final score will be 150 points (100%). If a respondent receives 324 (90%) of the available 360 points, the final score will be 135 points (90%). If a respondent receives 36 (10%) of the available 360 points, the final score will be 15 points (10%).

The SRC is plainly referring to the broad Medicaid-eligible population when it says “the target population (TANF, ABD, dual eligible).” “Dual eligible” populations are persons eligible for Medicaid and Medicare. There, as throughout the ITN, it delineates between a target population of all Medicaid-eligible patients and a specialty population as described in a respondent’s ITN proposal. The clear instructions for SRC 6 require, “Use the drop-down box to select the state for which you are reporting and enter the performance measure rates (to the hundredths place, or XX.XX) for that state's Medicaid population for the appropriate calendar year.” Community did not comply.

General SRC 14 solicits similar data, in similar form using a similar tool, about a respondent’s Consumer Assessment of Healthcare Providers and Systems (CAHPS). CAHPS data is basically a satisfaction survey. It asks respondents to provide “in table format the target population (TANF, ABD, dual eligible) and the respondent’s results for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) items/composites specified below for the 2017 survey for its adult and child populations for the respondent’s three (3) largest Medicaid Contracts (as measured by number of enrollees).” Just like General SRC 6 did with HEDIS data, General SRC 14 instructed bidders to put their CAHPS data for the “target population (TANF, ABD, dual eligible)” “for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees)” for multiple states into an Excel spreadsheet “to the hundredths place[.]” Also, like General SRC 6, General SRC 14 includes an objective formula described in the ITN for scoring bidders’ CAHPS data.

RANKING PROVISIONS

Attachment A at (D)(4)(c)(2) stated: Each response will be individually scored by at least three (3) evaluators, who collectively have experience and knowledge in the program areas and service requirements for which contractual services are sought by this solicitation. The Agency reserves the right to have specific sections of the response evaluated by less than three (3) individuals.
The ITN’s example of how total point scores would be calculated, discussed below, also indicated that some sections may be scored by less than three evaluators. The explanatory chart had a column for “[o]ther Sections evaluated by less than three (3) evaluators.” The Agency’s policy, however, has been to assign at least three evaluators to score program-specific SRCs.

Attachment A at (D)(4)(e)(2) advised respondents how the Agency would rank the competing responses. It was clear and specific, even providing an example of the process showing how the scores “will” be calculated. Step one of the explanatory chart stated that the Agency would calculate a total point score for each response. Step two stated that “[t]he total point scores will be used to rank the responses by an evaluator. . . .” Next, the rankings by the evaluator are averaged to determine the average rank for each respondent. This average ranking is critical because ranking is how the ITN said the Agency would select respondents for negotiation and how the Agency did select respondents for negotiation. The step two and step three charts, reproduced below, demonstrate that the ITN contemplated an evaluation process in which each response was to be evaluated in its entirety by three different evaluators, or maybe less than three, but indisputably in its entirety by those who evaluated it. This did not happen.

Step 2: The total point scores will be used to rank the responses by evaluator (Response with the highest number of points = 1, second highest = 2, etc.).

POINTS SUMMARY
               Evaluator A   Evaluator B   Evaluator C   Evaluator D
Respondent 1       446           396           311           413
Respondent 2       425           390           443           449
Respondent 3       397           419           389           435
Respondent 4       410           388           459           325

RANKING SUMMARY
               Evaluator A   Evaluator B   Evaluator C   Evaluator D
Respondent 1        1             2             4             3
Respondent 2        2             3             2             1
Respondent 3        4             1             3             2
Respondent 4        3             4             1             4

Step 3: An average rank will be calculated for each response for all the evaluators.
Respondent 1: 1 + 2 + 4 + 3 = 10 ÷ 4 = 2.5
Respondent 2: 2 + 3 + 2 + 1 = 8 ÷ 4 = 2.0
Respondent 3: 4 + 1 + 3 + 2 = 10 ÷ 4 = 2.5
Respondent 4: 3 + 4 + 1 + 4 = 12 ÷ 4 = 3.0

PROVIDER SERVICE NETWORK PROVISIONS

Florida law permits a PSN to limit services provided to a target population “based on age, chronic disease state, or medical condition of the enrollee.” This allows a PSN to offer a specialty plan. For each region, the eligible plan requirements of section 409.974(1) state, “At least one plan must be a provider service network if any provider service networks submit a responsive bid.” Section 409.974(3) says: “Participation by specialty plans shall be subject to the procurement requirements of this section. The aggregate enrollment of all specialty plans in a region may not exceed 10 percent of the total enrollees of that region.” The ITN addressed those requirements. The Negotiation Process section of Attachment A, Instructions and Special Conditions, says:

The Agency intends to invite the following number of respondents to negotiation:
Comprehensive Plans – The top four (4) ranking Comprehensive Plans.
Long-term Care Plus Plans – The top two (2) ranking Long-term Care Plus Plans.
Managed Medical Assistance Plans – The top two (2) ranking Managed Medical Assistance Plans.
Specialty Managed Medical Assistance Plans – The top two (2) ranking Specialty Managed Medical Assistance Plans per specialty population.

If there are no provider service networks included in the top ranked respondents listed above, the Agency will invite the highest ranked PSN(s) to negotiations in order to fulfill the requirements of Section 409.974(1), Florida Statutes and Section 409.981(1), Florida Statutes. Emphasis supplied.

The ITN specifications in Section D.7, titled Number of Awards, state as follows about Specialty Plan awards:

7. Number of Awards
In accordance with Sections 409.966, 409.974, and 409.981, Florida Statutes, the Agency intends to select a limited number of eligible Managed Care Plans to provide services under the SMMC program in Region 10. The Agency anticipates issuing the number of Contract awards for Region 10 as described in Table 5, SMMC Region, below, excluding awards to Specialty MMA Plans.

Table 5 – SMMC Region
Region       Total Anticipated Contract Awards
Region 10    4

If a respondent is awarded a Contract for multiple regions, the Agency will issue one (1) Contract to include all awarded regions. The Agency will award at least one (1) Contract to a PSN provided a PSN submits a responsive reply and negotiates a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. A respondent that is awarded a Contract as a Comprehensive Plan is determined to satisfy the requirements in Section 409.974, Florida Statutes and Section 409.981, Florida Statutes and shall be considered an awardee of an MMA Contract and a LTC Contract. The Agency will issue one (1) Contract to reflect all awarded populations in all awarded regions. In addition to the number of Contracts awarded in this region, additional Contracts may be awarded to Specialty Plans that negotiate terms and conditions determined to be the best value to the State and negotiate a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. The Agency reserves the right to make adjustments to the enrollee eligibility and identification criteria proposed by a Specialty Plan prior to Contract award in order to ensure that the aggregate enrollment of all awarded Specialty Plans in a region will not exceed ten percent (10%) of the total enrollees in that region, in compliance with Section 409.974(3), Florida Statutes. If a respondent is awarded a Contract as a Specialty Plan and another plan type, the Agency will issue one (1) Contract to include all awarded populations in all awarded regions.

A prospective vendor asked about the interplay of Specialty Plan options and the PSN requirements. The question and the answer provided in Addendum 2 follow:

Q. Please clarify the number of PSN awards per region and how PSN awards will be determined based on the PSN's plan type (e.g., Comprehensive, LTC Plus, MMA, Specialty). As you know, Sections 409.974 and 409.981, Florida Statutes require one MMA PSN and one LTC PSN award per region (assuming a PSN is responsive) and the Agency has stated that an award to a Comprehensive Plan PSN will meet the requirements of both statutes. However, can the Agency further clarify whether other types of PSNs would meet the statutory requirements? Specifically, would a PSN LTC Plus award meet the requirements of Section 409.981, Florida Statutes?
Similarly, would an award to a Specialty Plan PSN meet the requirements of Section 409.974, Florida Statutes? A. See Attachment A Instructions and Special Conditions, Section D Response Evaluations, and Contract Award, Sub-Section 7 Number of Awards. Yes, a PSN LTC Plus award would meet the requirements of Section 409.981(2). A Specialty Plan PSN would not meet the requirements of Section 409.974(1). The only reasonable interpretation of this answer is that Specialty Plan PSNs do not satisfy the requirement to contract with a responsive PSN imposed by section 409.974. None of the prospective vendors, including Community, challenged this clarification. EVALUATION PROCESS THE EVALUATORS The Agency selected 11 people to evaluate the proposals. The Agency assigned each person a number used to identify who was assigned to which task and to track performance of evaluation tasks. The procurement officer sent the evaluators a brief memo of instructions. It provided dates; described logistics of evaluation; emphasized the importance of independent evaluation; and prohibited communicating about the ITN and the proposals with anyone other than the procurement office. The Agency also conducted an instructional session for evaluators. Evaluator 1, Marie Donnelly: During the procurement, Ms. Donnelly was the Agency’s Chief of the Bureau of Medicaid Quality. She held this position for five years before resigning. This bureau bore responsibility for ensuring that the current SMMC plans met their contract requirements for quality and quality improvement measures. Her role specifically included oversight of Specialty Plans. Evaluator 2, Erica Floyd Thomas: Ms. Thomas is the chief of the Bureau of Medicaid Policy. She has worked for the Agency since 2001. Her Medicaid experience includes developing policies for hospitals, community behavioral health, residential treatment, and contract oversight. Before serving as bureau chief, she served as an Agency administrator from 2014 through 2017. Ms. Thomas oversaw the policy research and development process for all Medicaid medical, behavioral, dental, facility, and clinic coverage policies to ensure they were consistent with the state Plan and federal Medicaid requirements. Evaluator 3, Rachel LaCroix, Ph.D.: Dr. LaCroix is an administrator in the Agency’s Performance Evaluation and Research Unit. She has worked for the Agency since 2003. All her positions have been in the Medicaid program. Dr. LaCroix has served in her current position since 2011. She works with the performance measures and surveys that the current SMMC providers report to the Agency. Dr. LaCroix is a nationally recognized expert on healthcare quality metrics like HEDIS. She is also an appointee on the National Association of Medicaid Directors’ task force for national performance measures. Evaluator 4, Damon Rich: Mr. Rich has worked for the Agency since April 2009. He is the chief of the Agency’s Bureau of Recipient and Provider Assistance. This bureau interacts directly with AHCA’s current SMMC care providers about any issues they have, and with Medicaid recipients, usually about their eligibility or plan enrollment. Before Mr. Rich was a bureau chief, he worked as a field office manager for the Agency. Mr. Rich’s experience as bureau chief and field office manager includes oversight of the current SMMC Specialty Plans. Evaluator 5. Eunice Medina: Ms. 
Medina is the chief of the Agency’s Bureau of Medicaid Plan Management, which includes a staff of over 60 individuals, who manage the current SMMC contracts. Her experience and duties essentially encompass all aspects of the current SMMC plans. Ms. Medina started working with the Agency in 2014. Evaluator 6, Devona “DD” Pickle: Ms. Pickle most recently joined the Agency in 2011. She also worked for the Agency from November 2008 through November 2010. Ms. Pickle’s Agency experience all relates in some way to the Medicaid program. Since March 2013, Ms. Pickle has served as an administrator over managed care policy and contract development in the Bureau of Medicaid Policy. Her job duties include working with the current SMMC contractors. Ms. Pickle is also a Florida licensed mental health counselor. Evaluator 7, Tracy Hurd-Alvarez: Ms. Hurd-Alvarez has worked for the Agency’s Medicaid program since 1997. Since 2014, she has been a field office manager, overseeing compliance monitoring for all the current SMMC contractors. Before assuming her current position, Ms. Hurd-Alvarez implemented the LTC SMMC program. Evaluator 8, Gay Munyon: Ms. Munyon is currently the Chief of the Bureau of Medicaid Fiscal Agent Operations. Ms. Munyon began working with the Agency in April 2013. Ms. Munyon’s bureau oversees fulfillment of the Agency’s contract with the current SMMC fiscal agent. Her unit’s responsibilities include systems maintenance and modifications and overseeing the fiscal agent, which answers phone calls, processes claims, and processes applications. Ms. Munyon has 25 years of experience working with the Medicaid program. Evaluator 9, Laura Noyes: Ms. Noyes started working for the Agency in April 2011. Her years of Agency experience all relate to the Medicaid program, including overseeing six current comprehensive managed care plans by identifying trends in contractual non-compliance. Evaluator 10, Brian Meyer: Mr. Meyer is a CPA, who has worked for the Agency in the Medicaid program since 2011. He is currently chief of the Bureau of Medicaid Data Analytics. Mr. Meyer’s primary responsibility is overseeing the capitation rates for the current SMMC contractors. His experience includes Medicaid plan financial statement analysis, surplus requirement calculation analysis and, in general, all types of financial analysis necessary to understand financial performance of the state’s Medicaid plans. Evaluator 11, Ann Kaperak: Since April 2015, Ms. Kaperak has served as an administrator in the Agency’s Bureau of Medicaid Program Integrity. Ms. Kaperak’s unit oversees the fraud and abuse efforts of the current SMMC plans. She also worked for the Medicaid program from November 2012 through May 2014. Ms. Kaperak worked as a regulatory compliance manager for Anthem/Amerigroup’s Florida Medicaid program between May 2014 and April 2015. Positive and Community challenge the Agency’s plan selections by questioning the qualifications of the evaluators. The first part of their argument is that the evaluators did not have sufficient knowledge about HIV/AIDS and its treatment. The evidence does not prove the theory. For instance, Positive’s argument relies upon criticizing the amount of clinical experience evaluators had managing patients with HIV/AIDS. That approach minimizes the fact that the managed care plan characteristics involve so much more than disease- specific considerations. 
For instance, many of the components require determining if the respondent provided required documents, verifying conflict of interest documents, management structure, quality control measures, and the like. General SRCs asked for things like dispute resolution models (SRC 16), claims processing information (SRC 17), and fraud and abuse compliance plans (SRC 31). MMA SRCs included criteria, like telemedicine (SRC 4), demonstrated progress obtaining executed provider agreements (SRC 6), and a credentialing process (SRC 12). Specialty SRCs included criteria like copies of contracts for managed care for the proposed specialty population (SRC 1), specific and detailed criteria defining the proposed specialty population (SRC 4), and the like. The evidence does not prove that disease-specific experience is necessary to evaluate responses to these and other SRCs. SRC 6 involving HEDIS data and SRC 14 involving CAHPS data are two good examples. They required respondents to input data into a spreadsheet. All the evaluators had to do was determine what those numbers showed. Evaluation did not require any understanding of disease or how the measures were created. All the evaluator had to know was the number in the spreadsheet. The second part of the evaluator qualification criticisms is that the evaluators did not give adequate weight to some responses. Positive and Community just disagree with the measures requested and the evaluation of them. They conclude from that disagreement that the evaluators’ qualifications were deficient. The argument is not persuasive. The last sentence of paragraph 69 of Positive’s proposed recommended order exemplifies the criticisms of Positive and Community of the evaluators’ qualifications. It states, “The fact that PHC [Positive] was ranked last among competing HIV plans shows that the SRC evaluators did not understand enough about managing individuals with HIV/AIDs to score its proposal competently.” The argument is circular and “ipse dixit”. It does not carry the day. The collective knowledge and experience of the evaluators, with a total of 128 years of Medicaid experience, made them capable of reasonably evaluating the managed care plan proposals, including the Specialty plan proposals. The record certainly does not prove otherwise. EVALUATION PROCESS The Agency assigned the evaluators to the SRCs that it determined they were qualified to evaluate and score. The Agency did not assign entire responses to an evaluator for review. Instead it elected a piecemeal review process assigning various evaluators to various sections, the SRCs of each response. Paragraph 30 of the Agency’s proposed recommended order describes this decision as follows: Although the ITN had contemplated ranking each vendor by evaluator, based on an example in the ITN, such ranking presumed a process where all evaluators scored all or nearly all of the responses to the ITN, which had occurred in the procurement five years ago. In this procurement, each evaluator reviewed only a subset of SRCs based on their knowledge, and experience; therefore, ranking by evaluator was not logical because each had a different maximum point score. The initial SRC scoring assignments were: General SRCs 1 through 4, LTC SRCs 1 and 2, and Specialty SRC 1: Marie Donnelly, Laura Noyes, and Brian Meyer. General SRCs 5 through 8, MMA SRCs 1 through 7, LTC SRCs 3 and 4, and Specialty SRCs 1 and 2: Marie Donnelly, Erica Floyd- Thomas, and Rachel LaCroix. 
General SRCs 9 through 14, MMA SRCs 8 through 11, LTC SRCs 5 through 7, and Specialty SRC 4: Damon Rich, Eunice Medina, and DD Pickle. General SRCs 15 through 17, MMA SRCs 12 and 13, and LTC SRCs 8 through 10: Damon Rich, Tracy Hurd-Alvarez, Gay Munyon. General SRCs 18 through 25, MMA SRCs 14 through 20, LTC SRCs 11 and 12, and Specialty SRC 5: Erica Floyd-Thomas, Eunice Medina, and DD Pickle. General SRCs 26 through 33 and LTC SRC 13: Gay Munyon, Ann Kaperak, and Brian Meyer. General SRCs 34 through 36 and MMA SRC 21: Marie Donnelly, Rachel LaCroix, and Tracy Hurd-Alvarez. The ranking process presented in the ITN and described in paragraphs 62-64, contemplated ranking each respondent by evaluator. The Agency carried this process over from an earlier procurement. In this procurement, despite what the ITN said, the Agency assigned responsibilities so that each evaluator reviewed only a subset of SRCs. Therefore, the ranking of responses by evaluator presented in the ITN could not work. It was not even possible because no one evaluator reviewed a complete response and because each SRC had a different maximum point score. Instead, the Agency, contrary to the terms of the ITN, ranked proposals by averaging the “total point scores” assigned by all of the evaluators. The Agency considered issuing an addendum advising the parties of the change. The addendum would have informed the respondents and provided them an opportunity to challenge the change. The Agency elected not to issue an addendum. EVALUATION AND SCORING The evaluators began scoring on November 6, 2017, with a completion deadline of December 29, 2017. The 11 evaluators had to score approximately 230 separate responses to the ITNs. The evaluators had to score 67,175 separate items to complete the scoring for all responses for all regions for all types of plans. No one at the Agency evaluated how much time it should take to score a particular item. None of the parties to this proceeding offered persuasive evidence to support a finding that scoring any particular item would or should take a specific length of time or that scoring all of the responses would or should take a specific length of time. Evaluators scored the responses in conference room F at the Agency’s headquarters. This secure room was the exclusive location for evaluation and scoring. Each evaluator had a dedicated workspace equipped with all tools and resources necessary for the task. The workspaces included a computer terminal for each evaluator. The system allowed evaluators to review digital copies of the ITN and proposals and to enter evaluation points in spreadsheets created for the purpose of recording scores. Evaluators also had access to hard copies of the proposals and the ITN. The Agency required evaluators to sign in and to sign out. The sign-in and sign-out sheets record the significant amount of time the evaluators spent evaluating proposals. Evaluators were not permitted to communicate with each other about the responses. To minimize distractions, the Agency prohibited cell phones, tablets and other connected devices in the room. The Agency also authorized and encouraged the evaluators to delegate their usual responsibilities. Agency proctors observed the room and evaluators throughout the scoring process. They were available to answer general and procedural questions and to ensure that the evaluators signed in and signed out. A log sheet documented how much time each evaluator spent in the scoring conference room. Some evaluators took extensive notes. 
For example, Ms. Medina took over 200 pages of notes. Similarly, Ms. Munyon took nearly 400 pages of typewritten notes. The evaluators worked hard. None, other than Dr. LaCroix, testified that they did not have enough time to do their job. The computer system also automatically tracked the evaluators’ progress. Tracking reports showed the number of items assigned to each evaluator and the number of scoring items completed. The first status report was generated on December 8, 2017, approximately halfway through the scheduled scoring. At that time, only 28 percent of the scoring items were complete. Ms. Barrett usually ran the status reports in the morning. She made them available to the evaluators to review. The pace of evaluation caused concern about timely completion and prompted discussions of ways to accelerate scoring. Because it was clear that the majority of the evaluators would not complete scoring their SRCs by December 29, 2017, the Agency extended the scoring deadline to January 12, 2018. It also extended the hours for conference room use.

Most respondents filed proposals for more than one type of plan and more than one region. This fact, combined with the provision in the instructions saying that all statewide SRC responses must be identical for each region and that scores would transfer to each applicable region’s score sheets, enabled evaluators to score many SRCs just once. The system would then auto-populate the scores to the same SRC for all proposals by that respondent. This time-saving measure permitted scoring on many of the items to be almost instantaneous after review of the first response to an SRC. The fact that so many respondents submitted proposals for so many regions and types of plans provided the Agency another opportunity for time-saving. The Agency loaded Adobe Pro on the evaluators’ computers as a timesaving measure. This program allowed the evaluators to compare a bidder’s Comprehensive Plan proposal to the same company’s regional and Specialty Plan proposals. If the Adobe Pro comparison feature showed that the proposal response was the same for each plan, the Agency permitted evaluators to score the response once and assign the same score for each item where the respondent provided the same proposal. This sped up scoring. It meant, however, that for SRCs where evaluators did this, they were not reviewing the SRC response in the specific context of the specialty plan population, each of which had specific and limited characteristics that made them different from the broader General and MMA plan populations. This is significant because so many SRCs required narrative responses where context would matter.

The Exhibit A-4 instructions contain no requirement for specialty plans analogous to the requirement that responses for statewide SRCs must be identical for each region. In other words, the instructions do not say that all SRCs marked as statewide must be identical for each specialty plan proposal and that the Agency will evaluate each statewide SRC once and transfer the score to each applicable Specialty Plan score. In fact, according to the procurement officer, the Agency expected that evaluators would separately evaluate and score the statewide SRCs for Comprehensive Plans and for Specialty Plans, even if the same bidder submitted them.
Despite the Agency’s expectation and the absence of an authorizing provision in the ITN, many evaluators, relying on the Adobe Pro tool, copied the SRC scores they gave to a respondent’s comprehensive plan proposal to its specialty plan proposal if the respondent submitted the same response to an SRC for a Comprehensive Plan and a Specialty Plan. For instance, Ms. Thomas (Evaluator 2) and Ms. Munyon (Evaluator 8) did this to save time. Ms. Donnelly (Evaluator 1) did this even when the comprehensive and specialty responses were not identical. This does not amount to the independent evaluation of the responses pledged by the ITN.

On separate days, Evaluator 1 scored 1,315 items, 954 items, 779 items, and 727 items. On separate days, Evaluator 2 scored 613 items, 606 items, 720 items, 554 items, and 738 items. Evaluator 4 scored 874 items on one day. Evaluator 5 scored 813 items in one day. Evaluator 6 scored 1,001 items in one day. Evaluator 8 scored 635 items in one day. The record does not identify the items scored. It also does not permit determining how many of the item scores resulted from auto-population or assignment of scores based upon previous scoring of an identical response. It bears repeating, however, that the record does not support any finding on how long scoring the response to one SRC or an entire response could reasonably be expected to take.

Even with the extended scoring period and time-saving measures, the Agency concluded that Evaluator 3 would not be able to finish all of the SRCs assigned to her. Rather than extend the deadline for scoring a second time, the Agency decided to reassign the nine of Evaluator 3’s SRCs that she had not begun scoring to two other evaluators. The Agency did not include scores of other SRCs for which Evaluator 3 had not completed scoring. The Agency only counted Evaluator 3’s scores for an SRC if she scored the SRC for everyone. The result was that only two people scored nine of the Specialty Plan SRCs. The Agency did not reassign all of Evaluator 3’s SRCs. It only reassigned the SRCs to evaluators who were qualified to evaluate the items, who were not already assigned those items to score, and who had already finished or substantially completed their own evaluations. The decision to reassign the SRCs was not based on any scoring that had already been completed.

The Agency did not allow changes to data submitted by any of the vendors. It allowed vendors to exchange corrupted electronic files for ones which could be opened and allowed vendors to exchange electronic files to match up with the paper copies that had been submitted. The Agency allowed Community to correct its submission where it lacked a signature on its transmittal letter and allowed Community to exchange an electronic document that would not open. It did not allow Community to change its reported HEDIS scores, which were submitted in the decimal form required by the instructions. Community erred in the numbers that it reported. There is no evidence showing that other vendors received a competitive or unfair advantage over Community in the Agency’s review of the SMI Specialty Plan submission for Region 10. There was no evidence that the Agency allowed any other vendors to change any substantive information in their submittals for that proposed specialty in that region.

HEDIS ISSUES

Positive asserts that Simply’s proposal is non-responsive because Simply submitted HEDIS data from the general Medicaid population in response to SRC 6 and MMA SRC 14.
Positive contends that Simply obtained a competitive advantage by supplying non-HIV/AIDS HEDIS data in response to SRC 6 and MMA SRC 14 because HIV/AIDS patients are generally a sicker group and require more care and because some HEDIS measures cannot be reported for an HIV/AIDS population. HEDIS stands for Healthcare Effectiveness Data and Information Set and is a set of standardized performance measures widely used in the healthcare industry. The instructions for both SRC 6 and MMA SRC 14 provide, in relevant part:

The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include in table format, the target population (TANF, ABD, dual eligible), the respondent’s results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/HEDIS 2016 and CY 2016/HEDIS 2017) for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent’s largest Contracts. If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. (JE 1 at 75 (SRC 6); JE 1 at 158 (MMA SRC 14)).

SRC 6 and MMA SRC 14 instruct respondents to provide HEDIS measures for “the target population (TANF, ABD, dual eligible).” Id. TANF, ABD, and dual eligible are eligibility classifications for the Medicaid population. The Agency sought information regarding the target Medicaid-eligible population, even from respondents proposing a Specialty Plan, because Specialty Plans are required to serve all of the healthcare needs of their recipients, not just the needs related to the criteria making those recipients eligible for the Specialty Plan. Following the instructions in SRC 6 and MMA SRC 14, Simply provided HEDIS data from the Medicaid-eligible population for its three largest Medicaid contracts as measured by the total number of enrollees. For the requested Florida HEDIS data, Simply utilized legacy HEDIS data from Amerigroup Florida, Inc., a Comprehensive Plan. Amerigroup and Simply had merged in October of 2017. Therefore, at the time of submission of Simply’s proposal, the HEDIS data from Amerigroup Florida was the data from Simply’s largest Medicaid contract in Florida for the period requested by the SRCs.

Positive asserts that the Agency impermissibly altered scoring criteria after the proposals were submitted when the Agency corrected technical issues within a HEDIS Measurement Tool spreadsheet. SRC 6 and MMA SRC 14 required the submission of numeric data for the requested HEDIS performance measures. To simplify submission of the numeric data for the requested HEDIS performance measures, the Agency required respondents to utilize a HEDIS Measurement Tool spreadsheet. The evaluation criteria for SRC 6 and MMA SRC 14 provided that respondents would be awarded points if the reported HEDIS measures exceeded the national or regional mean for such performance measures. Some respondents, including Positive, entered “N/A,” “small denominator,” or other text inputs into the HEDIS Measurement Tool.
During the evaluation and scoring process, the Agency discovered that if a respondent input any text into the HEDIS Measurement Tool, the tool would assign random amounts of points, even though respondents had not input measurable, numeric data. The Agency reasonably resolved the problem by removing any text and inserting a zero in place of the text. The correction of the error in the HEDIS Measurement Tool prevented random points from being awarded to respondents and did not alter scores in any way contrary to the ITN. It was reasonable and fair to all respondents.
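For illustration only, the short sketch below (hypothetical Python, not part of the ITN, the HEDIS Measurement Tool, or any Agency system) works through the arithmetic described in the findings above: the SRC 6 per-measure point rules, with text entries such as “N/A” treated as zero in the manner of the Agency’s correction; the conversion of aggregate raw points to the 0 through 150 final score; and the Step 2/Step 3 average-ranking example. All names and data structures are assumptions made for the example.

```python
# Hypothetical sketch of the arithmetic described in the ITN; not the Agency's tool.

def src6_points_for_measure(rates, national_mean, regional_mean):
    """Points for one HEDIS measure, one state, per the SRC 6 rules quoted above.

    `rates` is the (year 1, year 2) pair of reported plan rates. Text entries
    such as "N/A" or "small denominator" are treated as zero, mirroring the
    Agency's correction of text inputs in the HEDIS Measurement Tool.
    """
    def as_number(value):
        try:
            return float(value)
        except (TypeError, ValueError):
            return 0.0  # a text entry earns no points

    year1, year2 = (as_number(r) for r in rates)
    points = 0
    for rate in (year1, year2):
        if rate > national_mean:
            points += 2   # exceeds the national Medicaid mean
        if rate > regional_mean:
            points += 2   # exceeds the applicable regional Medicaid mean
    if year2 > year1:
        points += 2       # second year improved over the first
    return points         # at most 10 points per measure, per state


def final_score(points_earned, points_available=360, maximum=150):
    """Scale aggregate raw points to the 0-150 final score (e.g., 324/360 -> 135)."""
    return round(maximum * points_earned / points_available)


def average_rank(points_by_evaluator):
    """Rank each evaluator's total point scores (highest = 1) and average the ranks."""
    ranks = {}
    for totals in points_by_evaluator.values():
        ordered = sorted(totals, key=totals.get, reverse=True)
        for position, respondent in enumerate(ordered, start=1):
            ranks.setdefault(respondent, []).append(position)
    return {r: sum(p) / len(p) for r, p in ranks.items()}


if __name__ == "__main__":
    # Step 2 figures from the ITN's example.
    points = {
        "Evaluator A": {"Respondent 1": 446, "Respondent 2": 425, "Respondent 3": 397, "Respondent 4": 410},
        "Evaluator B": {"Respondent 1": 396, "Respondent 2": 390, "Respondent 3": 419, "Respondent 4": 388},
        "Evaluator C": {"Respondent 1": 311, "Respondent 2": 443, "Respondent 3": 389, "Respondent 4": 459},
        "Evaluator D": {"Respondent 1": 413, "Respondent 2": 449, "Respondent 3": 435, "Respondent 4": 325},
    }
    print(average_rank(points))  # Respondent 1: 2.5, Respondent 2: 2.0, Respondent 3: 2.5, Respondent 4: 3.0
    print(final_score(324))      # 135
```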

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order rejecting all responses to the ITNs to provide a Medicaid Managed Care plan for patients with HIV/AIDS in Regions 10 and 11.

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for patients with serious mental illness.

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for child welfare specialty services.

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order awarding WellCare of Florida, Inc., d/b/a Staywell Health Plan of Florida, a contract for a specialty Medicaid Managed Care plan for patients with Serious Mental Illness in Region 10.

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order dismissing the Petition in Case No. 18-3513.

DONE AND ENTERED this day of , , in Tallahassee, Leon County, Florida. S JOHN D. C. NEWTON, II Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this day of , .

USC (1) 42 U.S.C. 1396u Florida Laws (9) 120.57, 20.42, 287.057, 409.912, 409.962, 409.966, 409.97, 409.974, 409.981
# 2
METRO TREATMENT OF FLORIDA, L.P. vs DEPARTMENT OF CHILDREN AND FAMILIES, 20-004323 (2020)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Sep. 29, 2020 Number: 20-004323 Latest Update: Dec. 24, 2024

The Issue Whether the procedure utilized by Respondent, Department of Children and Families (Department), for breaking a tie for the award of a Methadone Medication-Assisted Treatment (MAT) license, pursuant to the "FY 2018/2019 Methadone Medication-Assisted Treatment Needs Assessment Notice of Intended Award for Brevard County, July 10, 2020," (Notice) is an unadopted rule under section 120.52(16) and thus cannot form the basis for the Department's decision to award an MAT license in Brevard County to Intervenor CFSATC7 d/b/a Central Florida Treatment Centers (Central Florida), pursuant to section 120.57(1)(e).

Findings Of Fact The Parties Metro is a provider of specialized quality care for opioid disorder treatment and operates methadone medication treatment centers nationwide, including the state of Florida, and supports education and understanding of addiction as a disease, so that more patients and communities can find the care that is needed to address opioid addiction. Metro’s MAT counseling and medical services programs are customized to a patient’s needs, and services are delivered in a way that respects their dignity, value, and self-worth. Metro currently has 18 licensed MAT clinics and one satellite clinic in Florida. The Department is the agency with regulatory authority over the provision of substance abuse services. See § 397.321(1), Fla. Stat. These duties include, but are not limited to, the licensing and regulation of the delivery of substance abuse services, including clinical treatment and clinical treatment services such as “medication-assisted treatment for opiate addiction.” §§ 397.321(1) and (6); 397.311(26)(a)7., Fla. Stat. The Department also promulgates rules governing substance abuse providers. See § 397.321(1), Fla. Stat. Central Florida is a Florida corporation licensed to operate MAT clinics within Florida. Central Florida currently operates numerous MAT clinics within Florida. Methadone Medication-Assisted Treatment MAT is the use of medications, in combination with counseling and behavioral therapies, to provide a whole-patient approach to the treatment of substance abuse. In Florida, MAT providers for opiate addiction may not be licensed unless they provide supportive rehabilitation programs such as counseling, therapy, and vocational rehabilitation. See § 397.427(1), Fla. Stat. Generally, methadone treatment requires many patients seeking treatment to come to the clinic every day. During the initial induction period, the patient sees the clinic’s physician and is monitored so that the clinic’s medical professionals can ensure that the patient’s medication is level and stable. Thereafter, a patient comes to the clinic every day to receive a methadone dose until the patient is eligible, through negative urine screens, for a limited supply of take-home medication. The substance abuse regulatory scheme in Florida is designed to provide a statewide system of care for the prevention, treatment, and recovery of children and adults with serious substance abuse disorders. Substance abuse providers, which include MAT clinics, are subject to a strict statutory, regulatory, and licensing scheme, which provides direction for a continuum of community-based services including prevention, treatment, and detoxification services. See Ch. 394 and 397, Fla. Stat. The Department is responsible for the licensure and oversight of all substance abuse providers, and administers and maintains a comprehensive regulatory process for this purpose. Chapter 397, Florida Statutes, and Florida Administrative chapter 65D-30 govern and regulate this process. The Department’s duties include the licensing and regulation of the delivery of substance abuse services pursuant to chapter 397. The licensed services include “medication-assisted treatment for opiate use disorders.” § 397.311(26)(a)7., Fla. Stat. The Department is tasked with determining the need for establishing MAT providers for opiate addiction. There is currently an unmet need for opioid treatment in Florida. 
Generally, providers of MAT services for opiate addiction may only be established in response to the Department's determination and publication for additional medication treatment services. See § 397.427, Fla. Stat. The primary reason for the Department's annual determination of need requirement is to make sure clinics are located where people need them, as timely access to treatment is a recognized public health strategy for addressing substance abuse.

Florida Administrative Code Rule 65D-30.014

Rule 65D-30.014 (Rule) specifies the "Standards for Medication and Methadone Maintenance Treatment" in Florida. (The Department has amended the Rule since conducting the determination of need and evaluations pertinent to this matter; however, the undersigned will refer to the version of the Rule (amended 6-15-19) that was promulgated and in effect at that time.) Rule 65D-30.014(3) requires that the following application procedures be followed:

(3) Determination of Need. The Department shall annually perform the assessment detailed in the "Methodology of Determination of Need Methadone Medication-Assisted Treatment," CF-MH 4038, May 2019, incorporated by reference and available at http://www.flrules.org/Gateway/reference.asp?No=Ref-10669. The Department shall publish the results of the assessment in the Florida Administrative Register by June 30. Facilities owned and operated by the Florida Department of Corrections are exempt from the needs assessment process. However, these facilities must apply for a license to deliver this service.

The publication shall direct interested parties to submit a letter of intent to apply for licensure to provide medication-assisted treatment for opioid use disorders to the Regional Office of Substance Abuse and Mental Health where need has been demonstrated. The publication shall provide a closing date for submission of letters of intent. Interested parties must identify the fiscal year of the needs assessment to which they are responding and the number of awards they are applying for per county identified in the assessment in their letter of intent.

Within seven (7) business days of the closing date, the Regional Office shall notify parties who submitted a letter of intent on how to proceed. If the number of letters of intent equals or is less than the determined need, parties shall be awarded the opportunity to proceed to licensure by completing an "Application for Licensure to Provide Substance Abuse Services" form, C&F-SA Form 4024, May 2019, incorporated by reference and available at http://www.flrules.org/Gateway/reference.asp?No=Ref-10668. If the number of letters of intent exceeds the determined need, parties shall be invited to submit a "Methadone Medication-Assisted Treatment (MAT) Application to Proceed to Licensure Application" form, CF-MH 4041, May 2019, incorporated by reference and available at http://www.flrules.org/Gateway/reference.asp?No=Ref-10671. Applications may not be rolled over for consideration in response to a needs assessment published in a different year and may only be submitted for a current fiscal year needs assessment.

The Department shall utilize an evaluation team made up of industry experts to conduct a formal rating of applications as stipulated in the "Methadone Medication-Assisted Treatment (MAT) Application Evaluation" form, CF-MH 4040, May 2019, incorporated by reference and available at http://www.flrules.org/Gateway/reference.asp?No=Ref-10670.
The evaluation team members shall not be affiliated with the Department, current methadone medication-assisted treatment providers operating in Florida, or the applicants. The selection of a provider shall be based on the following criteria:

Capability to Serve Selected Area(s) of Need and Priority Populations. Area(s) of Need are the counties identified as having a need for additional clinics. Priority Populations are pregnant women, women with young children, and individuals with financial hardships;
Patient Safety and Quality Assurance/Improvement;
Scope of Methadone Medication-Assisted Treatment Services;
Capability and Experience; and
Revenue Sources.

Applicants with the highest-scored applications in each county shall be awarded the opportunity to apply for licensure for the number of programs specified in their letter of intent to meet the need of that county. If there is unmet need, the next highest scored applicant(s) will receive an award(s) based on the remaining need and the number of programs specified in their letter of intent. This process will continue until the stated need is met. Regional offices shall inform the highest-scoring applicant(s) in writing of the award. All awarded applicants must submit a letter of intent to apply for licensure to the appropriate regional office within 30 calendar days after the award. If an applicant declines an award or fails to submit the letter of intent within the specified time, the Department shall rescind the award. After the Department rescinds the original award for that selected area of need, the applicant with the next highest score shall receive the award.

Awarded applicants must receive at least a probationary license within two (2) years of the published needs assessment connected to their application. See rule 65D-30.0036, F.A.C. for licensure application requirements. Applicants may submit a request to the State Authority and Substance Abuse and Mental Health Program Office for an exception if unable to meet timeframes due to a natural disaster that causes physical damage to the applicant's building(s). Proof of natural disaster and impact on physical property must accompany the request. Upon receipt of the request for exception and accompanying proof, a one-time extension shall be granted for six (6) months. Providers who are delayed for a reason other than a natural disaster may petition the Department for a rule waiver pursuant to section 120.542, F.S.

Rule 65D-30.014(3)(c)2.a. through c. are the portions of the Rule that address the application process of how providers will be selected to apply for licensure, and are applicable to this proceeding. The Rule cites section 397.321(5) as rulemaking authority, and cites sections 397.311(26), 397.321, 397.410, and 397.427 as the laws implemented. Rule 65D-30.014(3)(c)2.a. requires that applicants for a particular clinic be evaluated by industry experts who are independent of the Department, and not Department personnel. Rule 65D-30.014(3)(c)2.b. further provides that industry experts would select the best-suited applicant for each county pursuant to the process set forth in the Rule. The Rule limited the evaluation team to the following five criteria:

Capability to Serve Selected Area(s) of Need and Priority Populations. Area(s) of Need are the counties identified as having a need for additional clinics.
Priority Populations are pregnant women, women with young children, and individuals with financial hardships;
Patient Safety and Quality Assurance/Improvement;
Scope of Methadone Medication-Assisted Treatment Services;
Capability and Experience; and
Revenue Sources.

Pursuant to the Rule, the applicants with the highest-rated score in each county shall be awarded the opportunity to apply for licensure for the number of programs specified in the applicant's letter of intent to meet the need of that county. Neither chapter 397 nor the Rule contains a procedure to break a tie score between applicants.

FY 2018/2019 Needs Assessment

The Department conducted an MAT needs assessment for fiscal year 2018/2019, and determined that 42 new MAT clinics were needed in Florida, including one in Brevard County. Six providers, including Metro and Central Florida, submitted letters of intent/applications for Brevard County, which is the subject of the Notice. As described in the Rule—specifically, rule 65D-30.014(3)(c)2.a.—a team of external evaluators received and scored the applications received for Brevard County.

The evaluators' scoring of applications for Brevard County resulted in a tie for the highest score between Metro and Central Florida. The individual scores from the evaluators varied; however, the combined scores for both Metro and Central Florida totaled 641 points each. The individual scoring, as reflected within the Notice, provides as follows:

Brevard County Team 1 Evaluation Scores (Applicant by County: Academic / Medical / Public Policy / Total)
CFSATC dba Central Florida Treatment Centers: 215 / 211 / 215 / 641
Metro Treatment of Florida, LP: 205 / 218 / 218 / 641
CRC Health Treatment Clinics, LLC: 214 / 187 / 212 / 613
Maric Healthcare, LLC: 200 / 205 / 198 / 503
Psychological Addiction Services, LLC: 143 / 177 / 149 / 469
Treatment Centers of America: 156 / 120 / 167 / 443

The Tiebreaker

The Notice further provides the following concerning the tie scores between Metro and Central Florida:

The evaluator scoring of applications for Brevard County resulted in a tie for the highest score between Metro Treatment of Florida (Metro Treatment) and Central Florida Treatment Centers (Central Florida). The individual scores from the evaluators varied; however, the combined scores totaled 614 [sic] points each.[2] There is no tie breaking procedure set forth in rule 65D-30.014, F.A.C., or other rules in the Florida Administrative Code. To resolve the tie in this circumstance, the Department reviewed a variety of possible factors in order to recommend an award. These factors included performance indicators, corporate status, and Florida operations as follows:

An average score for licensure inspections over the past three years
Data from the Department's Central Registry System from 10/1/2019 to 5/1/2020. Methadone medication-assisted treatment providers are required to register and participate in a Department-approved electronic registry system by rule 65D-30.014(4)(f), F.A.C. The data points considered were:
Percentage of a provider's failure to enter required demographic information
Percentage of a provider's failure to enter required photographs
Percentage of a provider's failure to enter required dosing information
Whether the provider operates exclusively in Florida
Involvement of women in senior management positions

2 The parties do not dispute that the total combined score should reflect 641, and not 614.
The Notice further provided:

Award Recommendation Criteria (Top Score Highlighted in Bold Italics)
Provider: Inspection Average / % Missing Demographics / % Missing Photograph / % Missing Dosing
Central Florida Treatment Centers: 96.6% / 1.6% / 3.71% / 2.33%
Metro Treatment of Florida: 93.6% / 11.31% / 1.62% / 9.75%

Additionally, the Notice stated:

Based on the four performance-based measures, Central Florida demonstrated a higher level of adherence to licensure requirements and entering data into the Central Registry System. In addition, Central Florida operates exclusively in Florida and has a woman as the Chief Executive Officer of the corporation. Based on these factors, the Department recommends award of the opportunity for licensure in Brevard County to Central Florida.

Metro challenges the agency statements in the Notice—as quoted in paragraphs 27 and 28 above—that set forth the Department's tiebreaking procedure, as constituting an unadopted rule.3

3 The Petition only challenges the tie breaking criteria the Department utilized as an unadopted rule upon which agency action cannot be based, pursuant to section 120.57(1)(e), and does not challenge any other aspect of the Department's handling of the evaluation of the letters of intent for the Brevard County license.

Ms. Gazioch testified that, after receiving the scoring for Brevard County from the evaluation team, which was a tie, "the Department made the final decision of who to award to." She stated that the Rule did not address what the Department should do in the event of a tie. After consulting with officials within the Department, she testified as to the decision the Department ultimately made:

[T]he course of action that the Department took was to award the opportunity to apply for licensure in Brevard County to Central Florida Treatment Centers. And that was based on looking at the average inspection scores, licensing inspection scores, looking at data entered into the central registry and compliance with certain items, such as missing demographics, as well as missing photographs in the central registry system, and also missing dosing in the central registry system.

Ms. Gazioch further testified as to the reason the Department considered these particular tie-breaking factors:

Because they were factors that are equally – that could be equally measured across, really, any licensed methadone opioid treatment provider. The inspection average obviously speaks to compliance with rule and statute in terms of implementing an opioid treatment program. And then, obviously, the documentation that is entered into the registry is very, very important to make sure that, you know, as clients move through the system and they move from one provider to another, or in the event of a hurricane where somebody might have to get a guest dose, it's always very important to have the information accurate and updated in the central registry system. So that's another quality indicator that we felt was important to look at compliance with the information in that system.

Ms. Gazioch also testified that as a result of the tie, the Department was concerned that it might not be able to open a clinic in Brevard County, even though "the need was clear based on the needs assessment.
So we felt that we were in a position that we had to move forward with a tiebreaker to at least be able to establish one clinic that was needed in that county."

The Department's decision to award the opportunity to apply for licensure in Brevard County to Central Florida was based on the tiebreaking factors contained in the Notice and listed in paragraph 26 above. Obviously, these tiebreaking factors are not found in the Rule. There is no evidence in the record that establishes whether the Department had time to initiate rulemaking to adopt a tiebreaking procedure for the Rule. There is no evidence in the record that establishes whether rulemaking (to establish a tiebreaking procedure) was feasible or practicable. There is no evidence in the record that establishes whether the Department would have utilized a different tiebreaking procedure in another county, if one had occurred. However, if a tie happened involving an applicant that did not currently operate in Florida, or only recently began operating in Florida, many of the tiebreaking criteria utilized by the Department for Brevard County would be inapplicable.

Although the Department developed and utilized the tiebreaking procedures in arriving at its decision to award the opportunity to apply for licensure in Brevard County to Central Florida, the external evaluators scored the applications pursuant to the Rule, and the Department did not change the scores from the external evaluators in arriving at its decision to award the opportunity to apply for licensure in Brevard County to Central Florida.
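For illustration only, the indicator-by-indicator comparison recited in the Notice can be expressed as a short sketch. The figures are the percentages quoted in the Notice; the simple tallying logic is an assumption added for clarity, not a finding about how the Department actually weighed the factors, which also included Florida-only operations and a woman serving as chief executive officer.

```python
# Illustrative sketch of the tiebreaker comparison described in the Notice.
# The "count the better indicator" tally is an assumption, not the
# Department's stated method; values are the percentages quoted in the Notice.

providers = {
    "Central Florida": {"inspection_avg": 96.6, "missing_demographics": 1.60,
                        "missing_photograph": 3.71, "missing_dosing": 2.33},
    "Metro":           {"inspection_avg": 93.6, "missing_demographics": 11.31,
                        "missing_photograph": 1.62, "missing_dosing": 9.75},
}

# For the inspection average a higher value is better; for the three
# "missing data" percentages a lower value is better.
indicators = [("inspection_avg", True), ("missing_demographics", False),
              ("missing_photograph", False), ("missing_dosing", False)]

wins = {name: 0 for name in providers}
for metric, higher_is_better in indicators:
    pick = max if higher_is_better else min
    best = pick(providers, key=lambda name: providers[name][metric])
    wins[best] += 1

print(wins)  # {'Central Florida': 3, 'Metro': 1}
```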

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, the undersigned hereby RECOMMENDS that the Department of Children and Families enter a final order dismissing the Petition for Formal Administrative Hearing Involving Material Disputed Facts of Metro Treatment of Florida, L.P., and awarding the MAT license in Brevard County to CFSATC d/b/a Central Florida Treatment Centers.

4 The undersigned also finds instructive the administrative law judge's determination that a "coin toss" tie-breaking procedure in a competitive procurement that was not supported by the applicable statute or rule was not an unadopted rule because the procedure was "not a statement of general applicability because it was, in essence, an ad hoc decision, for obscure reasons, by which the Department elected to break the tie purportedly involved in the case at hand, solely applicable to these two applicants." T.S. v. Dep't of Educ., Div. of Blind Servs., Case No. 05-1695BID, RO at p. 29-30 (DOAH Oct. 7, 2005), rejected in part, Case No. DOE-2005-1076 (Fla. DOE Nov. 23, 2005). The undersigned notes that the Department of Education, in its final order, rejected the administrative law judge's findings and conclusions as "immaterial, irrelevant, and unnecessary" on this issue because it determined that there was in fact no tie between the applicants.

DONE AND ENTERED this 9th day of December, 2020, in Tallahassee, Leon County, Florida. S ROBERT J. TELFER III Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 9th day of December, 2020.

COPIES FURNISHED:
Daniel Ryan Russell, Esquire, Dean Mead, Post Office Box 351, Tallahassee, Florida 32302 (eServed)
John L. Wharton, Esquire, Dean, Mead & Dunbar, Suite 815, 215 South Monroe Street, Tallahassee, Florida 32301 (eServed)
Maureen McCarthy Daughton, Esquire, Maureen McCarthy Daughton, LLC, Suite 3-231, 1400 Village Square Boulevard, Tallahassee, Florida 32312 (eServed)
William D. Hall, Esquire, Dean Mead, Suite 130, 215 South Monroe Street, Tallahassee, Florida 32301 (eServed)
Mia L. McKown, Esquire, Holland & Knight LLP, Suite 600, 315 South Calhoun Street, Tallahassee, Florida 32301 (eServed)
Lacey Kantor, Agency Clerk, Department of Children and Families, Building 2, Room 204Z, 1317 Winewood Boulevard, Tallahassee, Florida 32399-0700 (eServed)
Javier Enriquez, General Counsel, Department of Children and Families, Building 2, Room 204F, 1317 Winewood Boulevard, Tallahassee, Florida 32399-0700 (eServed)
Chad Poppell, Secretary, Department of Children and Families, Building 1, Room 202, 1317 Winewood Boulevard, Tallahassee, Florida 32399-0700 (eServed)

Florida Laws (9) 120.52, 120.542, 120.56, 120.569, 120.57, 120.68, 397.311, 397.321, 397.427 Florida Administrative Code (1) 65D-30.014 DOAH Case (2) 05-1695BID, 20-4323
# 3
AGENCY FOR HEALTH CARE ADMINISTRATION vs PARMANAND GURNANI, M.D., 05-002573MPI (2005)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Jul. 18, 2005 Number: 05-002573MPI Latest Update: Dec. 24, 2024
# 6
DEPARTMENT OF HEALTH, BOARD OF PHARMACY vs NGONI C. KWANGARI, 00-000372 (2000)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Jan. 21, 2000 Number: 00-000372 Latest Update: Dec. 24, 2024
# 7
SOUTH FLORIDA COMMUNITY CARE NETWORK, LLC, D/B/A COMMUNITY CARE PLAN vs AGENCY FOR HEALTH CARE ADMINISTRATION, 18-003514BID (2018)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Jul. 09, 2018 Number: 18-003514BID Latest Update: Jan. 25, 2019

The Issue Does Petitioner, AHF MCO of Florida, Inc., d/b/a PHC Florida HIV/AIDS Specialty Plan (Positive), have standing to contest the intended award to Simply for Regions 10 and 11 or to seek rejection of all proposals? (Case No. 18-3507 and 18-3508) Should the intended decision of Respondent, Agency for Health Care Administration (Agency), to contract with Simply Healthcare Plans, Inc. (Simply), for Medicaid managed care plans for HIV/AIDS patients in Regions 10 (Broward County) and Region 11 (Miami-Dade and Collier Counties) be invalidated and all proposals rejected? (Case Nos. 18-3507 and 18-3508) Must the Agency negotiate with Petitioner, South Florida Community Care Network, LLC, d/b/a Community Care Plan (Community), about a plan to provide HIV/AIDS Medicaid managed care services in Region 10 because it was the only responsive proposer of services that was a Provider Service Network (PSN)? (Case No. 18-3512) Must the Agency negotiate with Community to provide Medicaid managed care services in Region 10 for people with Serious Mental Illnesses because Community is a PSN? (Case No. 18-3511) Must the Agency contract with Community to provide Medicaid managed care services for Children with Special Needs in Region 10 because Community is a PSN? (Case No. 18-3513) Must the Agency negotiate with Community to provide Medicaid managed care services for Child Welfare patients in Region 10 because Community is a PSN? (Case No. 18-3514)

Findings Of Fact THE PARTIES Agency: Section 20.42, Florida Statutes, establishes the Agency as Florida’s chief health policy and planning agency. The Agency is the single state agency authorized to select eligible plans to participate in the Medicaid program. Positive: Positive is a Florida not-for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Positive serves about 2,000 patients in Florida. Positive’s health plan is accredited by the Accreditation Association for Ambulatory Healthcare. Its disease management program is accredited by the National Committee for Quality Assurance. Currently, the Agency contracts with Positive for a SMMC HIV/AIDS Specialty Plan serving Regions 10 and 11. Simply: Simply is a Florida for-profit corporation operating a Medicaid health plan dedicated to serving people with HIV/AIDS. Currently, the Agency contracts with Simply to provide a SMMC HIV/AIDS Specialty Plan for Regions 1 through 3 and 5 through 11. Simply has maintained the largest patient enrollment of all HIV/AIDs plans in Florida since Florida started its statewide Medicaid managed care program. Community Care: Community is a Florida limited liability company. It is a PSN as defined in sections 409.912(1)(b) and 409.962(14), Florida Statutes. Staywell: Staywell is the fictitious name for WellCare of Florida, Inc., serving Florida’s Medicaid population. Sunshine: Sunshine State Health Plan (Sunshine) is a Florida corporation. It offers managed care plans to Florida Medicaid recipients. THE INVITATION TO NEGOTIATE TIMELINE On July 14, 2017, the Agency released 11 ITNs plans for Florida’s Medicaid managed care program in 11 statutorily defined regions. Region 10, Broward County, and Region 11, Miami-Dade and Collier Counties, are the regions relevant to this proceeding. Part IV of chapter 409, creates a statewide, integrated managed care program for Medicaid services. This program called Statewide Medicaid Managed Care includes two programs, Managed Medical Assistance and Long-term Care. Section 409.966(2), directs the Agency to conduct separate and simultaneous procurements to select eligible plans for each region using the ITN procurement process created by section 287.057(1)(c). The ITNs released July 14, 2017, fulfilled that command. The Agency issued 11 identical ITNs of 624 pages, one for each region, in omnibus form. They provided elements for four types of plans. Some elements were common to all types. Others were restricted to a specific plan type defined by intended patient population. The plan types are comprehensive plans, long-term care plus plans, managed medical assistance plans, and specialty plans. Section 409.962(16) defines “Specialty Plan” as a “managed care plan that serves Medicaid recipients who meet specified criteria based on age, medical condition, or diagnosis.” Responding vendors identified the plan type or types that they were proposing. The Agency issued Addendum No. 1 to the ITNs on September 14, 2017. On October 2, 2017, the Agency issued Addendum No. 2 to the ITNs. Addendum 2 included 628 questions about the ITNs and the Agency’s responses to the questions. Florida law permits potential responders to an ITN to challenge the specifications of an ITN, including the addendums. § 120.57(3)(b), Fla. Stat. Nobody challenged the specifications of the ITNs. 
As contemplated by section 287.057(c)(2), the Agency conducted “a conference or written question and answer period for purposes of assuring the vendors’ full understanding of the solicitation requirements.” Positive, Community, and Simply, along with United Healthcare of Florida, Inc., HIV/AIDS Specialty Plan (United), submitted responses to the ITN in Region 10 proposing HIV/AIDS Specialty Plans. Community was the only PSN to propose an HIV/AIDS plan for Region 10. Positive, Simply, and United submitted replies to the ITN for Region 11, proposing HIV/AIDS Specialty Plans. Community, United, Staywell, and one other provider submitted proposals to provide SMI Specialty Plan services in Region 10. Community was the only responding PSN. Community, Sunshine, and Staywell submitted proposals to provide Child Welfare Specialty Plans (CW) in Region 10. Community was the only PSN. Community, Staywell, and two others submitted proposals to offer Specialty Plans for Children with Special Needs (CSN) in Region 10. Community was one of two responding PSNs. Proposal scoring began November 6, 2017, and ended January 16, 2018. The Agency announced its intended awards on April 24, 2018. On April 24, 2018, the Agency issued its notices of intent to award specialty contracts in Regions 10 and 11. The following charts summarize the Agency’s ranking of the proposals and its intended awards. The two highest ranked plans, which the Agency selected for negotiations, are identified in bold. Region 10 – Children with Special Needs Respondent Intended Award Ranking Staywell No 1 Community No 2 Miami Children’s Health Plan, LLC No 3 Our Children PSN of Florida, LLC No 4 Region 10 – Child Welfare Respondent Intended Award Ranking Staywell No 1 Sunshine Yes 2 Molina Healthcare of Florida, Inc. No 3 Community No 4 Region 10 – HIV/AIDS Respondent Intended Award Ranking Simply Yes 1 United No 2 Community No 3 Positive No 4 Region 10 – Serious Mental Illness Respondent Intended Award Ranking Staywell Yes 1 United No 2 Florida MHS, Inc. No 3 Community No 4 Region 11 – HIV/AIDS Respondent Intended Award Ranking Simply Yes 1 United No 2 Positive No 3 All of the Specialty Plan awards noticed by the Agency went to bidders who also proposed, and received, comprehensive plan awards. The protests, referrals, and proceedings before the Division summarized in the Preliminary Statement followed the Agency’s announcement of its intended awards. TERMS The voluminous ITN consisted of a two-page transmittal letter and three Attachments (A, B, and C), with a total of 34 exhibits to them. They are: Attachment A, Exhibits A-1 through A-8, Attachment B, Exhibits B-1 through B-3, and Attachment C, Exhibits C-1 through C-8. The ITN establishes a two-step process for selecting: an evaluation phase and a negotiation phase. In the evaluation phase, each respondent was required to submit a proposal responding to criteria of the ITN. Proposals were to be evaluated, scored, and ranked. The goal of the evaluation phase was to determine which respondents would move to negotiations, not which would be awarded a contract. The top two ranking Specialty Plans per specialty population would be invited to negotiations. In the negotiation phase, the Agency would negotiate with each invited respondent. After that, the Agency would announce its intended award of a contract to the plan or plans that the Agency determined would provide the best value. 
Together, the attachments and exhibits combined instructions, criteria, forms, certifications, and data into a "one size fits all" document that described the information required for four categories of managed care plans to serve Medicaid patients. The ITN also provided data to consider in preparing responses. The transmittal letter emphasized, "Your response must comply fully with the instructions that stipulate what is to be included in the response." The ITNs identified Jennifer Barrett as the procurement officer and sole point of contact with the Agency for vendors. The transmittal letter is reproduced here.

This solicitation is being issued by the State of Florida, Agency for Health Care Administration, hereinafter referred to as "AHCA" or "Agency", to select a vendor to provide Statewide Medicaid Managed Care Program services. The solicitation package consists of this transmittal letter and the following attachments and exhibits:

Attachment A Instructions and Special Conditions
Exhibit A-1 Questions Template
Exhibit A-2-a Qualification of Plan Eligibility
Exhibit A-2-b Provider Service Network Certification of Ownership and Controlling Interest
Exhibit A-2-c Additional Required Certifications and Statements
Exhibit A-3-a Milliman Organizational Conflict of Interest Mitigation Plan
Exhibit A-3-b Milliman Employee Organizational Conflict of Interest Affidavit
Exhibit A-4 Submission Requirements and Evaluation Criteria Instructions
Exhibit A-4-a General Submission Requirements and Evaluation Criteria
Exhibit A-4-a-1 SRC# 6 - General Performance Measurement Tool
Exhibit A-4-a-2 SRC# 9 - Expanded Benefits Tool (Regional)
Exhibit A-4-a-3 SRC# 10 - Additional Expanded Benefits Template (Regional)
Exhibit A-4-a-4 SRC# 14 - Standard CAHPS Measurement Tool
Exhibit A-4-b MMA Submission Requirements and Evaluation Criteria
Exhibit A-4-b-1 MMA SRC# 6 - Provider Network Agreements/Contracts (Regional)
Exhibit A-4-b-2 MMA SRC# 14 - MMA Performance Measurement Tool
Exhibit A-4-b-3 MMA SRC# 21 - Provider Network Agreements/Contracts Statewide Essential Providers
Exhibit A-4-c LTC Submission Requirements and Evaluation Criteria
Exhibit A-4-c-1 LTC SRC# 4 - Provider Network Agreements/Contracts (Regional)
Exhibit A-4-d Specialty Submission Requirements and Evaluation Criteria
Exhibit A-5 Summary of Respondent Commitments
Exhibit A-6 Summary of Managed Care Savings
Exhibit A-7 Certification of Drug-Free Workplace Program
Exhibit A-8 Standard Contract
Attachment B Scope of Service - Core Provisions
Exhibit B-1 Managed Medical Assistance (MMA) Program
Exhibit B-2 Long-Term Care (LTC) Program
Exhibit B-3 Specialty Plan
Attachment C Cost Proposal Instructions and Rate Methodology Narrative
Exhibit C-1 Capitated Plan Cost Proposal Template
Exhibit C-2 FFS PSN Cost Proposal Template
Exhibit C-3 Preliminary Managed Medical Assistance (MMA) Program Rate Cell Factors
Exhibit C-4 Managed Medical Assistance (MMA) Program Expanded Benefit Adjustment Factors
Exhibit C-5 Managed Medical Assistance (MMA) Program IBNR Adjustment Factors
Exhibit C-6 Managed Medical Assistance (MMA) Program Historical Capitated Plan Provider Contracting Levels During SFY 15/16 Time Period
Exhibit C-7 Statewide Medicaid Managed Care Data Book
Exhibit C-8 Statewide Medicaid Managed Care Data Book Questions and Answers

Your response must comply fully with the instructions that stipulate what is to be included in the response.
Respondents submitting a response to this solicitation shall identify the solicitation number, date and time of opening on the envelope transmitting their response. This information is used only to put the Agency mailroom on notice that the package received is a response to an Agency solicitation and therefore should not be opened, but delivered directly to the Procurement Officer. The ITN describes the plans as follows: Comprehensive Long-term Care Plan (herein referred to as a “Comprehensive Plan”) – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients. Long-term Care Plus Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services and Long-term Care services to eligible recipients enrolled in the Long-term Care program. This plan type is not eligible to provide services to recipients who are only eligible for MMA services. Managed Medical Assistance (MMA) Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients. This plan type is not eligible to provide services to recipients who are eligible for Long-term Care services. Specialty Plan – A Managed Care Plan that is eligible to provide Managed Medical Assistance services to eligible recipients who are defined as a specialty population in the resulting Contract. Specialty Plans are at issue. The ITN did not define, describe, or specify specialty populations to be served. It left that to the responding vendors. Beyond that, the ITN left the ultimate definition of the specialty population for negotiation, saying in Section II(B)(1)(a) of Attachment B, Exhibit B-3, “[t]he Agency shall identify the specialty population eligible for enrollment in the Specialty Plan based on eligibility criteria based upon negotiations.” Some respondents directly identified the specialty population. Simply’s transmittal letter stated that it proposed “a Specialty plan for individuals with HIV/AIDS.” Positive’s response to Exhibit A-4-d Specialty SRC 4, eligibility and enrollment, stated, “the specialty population for the PHC [Positive] plan will be Medicaid eligible, male and female individuals from all age groups who are HIV positive with or without symptoms and those individuals who have progressed in their HIV disease to meet the CDC definition of AIDS.” Some others left definition of the specialty population to be inferred from the ITN response. The result is that the ITN left definition of the specialty populations initially to the respondents and ultimately to negotiations between the Agency and successful respondents. Petitioners and Intervenors describe the populations that they propose serving as HIV/AIDS patients, patients with SMI, CSN, and child welfare populations. ITN respondents could have proposed serving only cancer patients, serving only obstetric patients, or serving only patients with hemophilia. The part of the ITN requiring a respondent to identify the plan type for which it was responding offered only four alternative blocks to check. 
They were: “Comprehensive Plan,” Long-Term Care Plus Plan,” “Managed Medical Assistance Plan,” or “Specialty Plan.” Attachment A to the ITN, labeled “Instructions and Special Conditions,” provides an overview of the solicitation process; instructions for response preparation and content; information regarding response submission requirements; information regarding response evaluation, negotiations, and contract awards; and information regarding contract implementation. Exhibits A-1 to A-3 and A-5 to A-7 of the ITN contain various certifications and attestations that respondents had to prepare and verify. Exhibit A-4 contains submission requirement components (SRCs) to which respondents had to prepare written responses. Exhibit A-8 contains the state’s standard SMMC contract. ITN Exhibit A-4-a contains 36 general submission requirements and evaluation criteria (General SRCs). ITN Exhibit A-4-b contains 21 MMA submission requirements and evaluation criteria (MMA SRCs). ITN Exhibit A-4-c contains 13 LTC submission requirements and evaluation criteria (LTC SRCs). ITN Exhibit A-4-d contains five specialty submission requirements and evaluation criteria (Specialty SRCs). The responses that the 36 SRCs require vary greatly. Some are as simple as providing documents or listing items. Others require completing tables or spreadsheets with data. Consequently, responses to some SRCS apparently could be reviewed in very little time, even a minute or less. Others requiring narrative responses might take longer. Examples follow. General SRC 1 required a list of the respondent’s contracts for managed care services and 12 information items about them including things such as whether they were capitated, a narrative describing the scope of work; the number of enrollees; and accomplishments and achievement. General SRC 2 asked for documentation of experience operating a Medicaid health plan in Florida. General SRC 3 asked for information confirming the location of facilities and employees in Florida. General SRC 12 requested a flowchart and written description of how the respondent would execute its grievance and appeal system. It listed six evaluation criteria. MMA SRC 2 asks for a description of the respondent’s organizational commitment to quality improvement “as it relates to pregnancy and birth outcomes.” It lists seven evaluation criteria. MMA SRC 10 asks for a description of the respondent’s plan for transition of care between service settings. It lists six evaluation criteria including the respondent’s process for collaboration with providers. Specialty SRC 1 asks for detailed information about respondent’s managed care experience with the specialty population. Specialty SRC 5 asks for detailed information about the respondent’s provider network standards and provides five evaluation criteria for evaluating the answers. Exhibit A-8 of the ITN contains the standard SMMC contract. Attachment B and Exhibits B-1 to B-3 of the ITN contain information about the scope of service and core provisions for plans under the SMMC program. Attachment C and Exhibits C-1 to C-8 of the ITN contain information related to the cost proposals and rate methodologies for plans under the SMMC program. The ITN permitted potential respondents to submit written questions about the solicitation to the Agency by August 14, 2017. Some did. On September 14, 2017, the Agency issued Addendum No. 1 to the ITN. Among other things, Addendum No. 
1 changed the anticipated date for the Agency’s responses to respondents’ written questions from September 15 to October 2, 2017. The Agency issued Addendum No. 2 to the ITN on October 2, 2017. Addendum No. 2 included a chart with 628 written questions from potential respondents and the Agency’s answers. Attachment A at A 10-(d) makes it clear that the answers are part of the addendum. Both Addendums to the ITN cautioned that any protest of the terms, conditions, or specifications of the Addendums to the ITN had to be filed with the Agency within 72 hours of their posting. No respondent protested. Instructions for the A-4 Exhibits included these requirements: Each SRC contains form fields. Population of the form fields with text will allow the form field to expand and cross pages. There is no character limit. All SRCs, marked as “(Statewide)” must be identical for each region in which the respondent submits a reply. For timeliness of response evaluation, the Agency will evaluate each “(Statewide)” SRC once and transfer the score to each applicable region’s evaluation score sheet(s). The SRCs marked as “(Regional)” will be specific and only apply to the region identified in the solicitation and the evaluation score will not be transferred to any other region. The instructions continue: Agency evaluators will be instructed to evaluate the responses based on the narrative contained in the SRC form fields and the associated attachment(s), if applicable. Each response will be independently evaluated and awarded points based on the criteria and points scale using the Standard Evaluation Criteria Scale below unless otherwise identified in each SRC contained within Exhibit A-4. This is the scale: STANDARD EVALUATION CRITERIA SCALE Point Score Evaluation 0 The component was not addressed. 1 The component contained significant deficiencies. 2 The component is below average. 3 The component is average. 4 The component is above average. 5 The component is excellent. The ITN further explained that different SRCs would be worth different “weights,” based on the subject matter of the SRC and on whether they were General, MMA, LTC, or Specialty SRCs. It assigned weights by establishing different “weight factors” applied as multipliers to the score a respondent received on a criteria. For example, “Respondent Background/Experience” could generate a raw score of 90. Application of a weight factor of three made 270 the maximum possible score for this criteria. “Oversight and Accountability” could generate a raw score of 275. A weight factor of one, however, made the maximum score available 275. General SRC 6 solicits HEDIS data. HEDIS is a tool that consists of 92 measures across six domains of care that make it possible to compare the performance of health plans on an “apples-to-apples” basis. SRC 6 states: The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include, in table format, the target population (TANF, ABD, dual eligible), the respondent’s results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/ HEDIS 2016 and CY 2016/ HEDIS 2017) for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent’s largest Contracts. 
If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. The respondent shall provide the data requested in Exhibit A-4-a-1, General Performance Measurement Tool[.] x x x Score: This section is worth a maximum of 160 raw points x x x For each of the measure rates, a total of 10 points is available per state reported (for a total of 360 points available). The respondent will be awarded 2 points if their reported plan rate exceeded the national Medicaid mean and 2 points if their reported plan rate exceeded the applicable regional Medicaid mean, for each available year, for each available state. The respondent will be awarded an additional 2 points for each measure rate where the second year’s rate is an improvement over the first year’s rate, for each available state. An aggregate score will be calculated and respondents will receive a final score of 0 through 150 corresponding to the number and percentage of points received out of the total available points. For example, if a respondent receives 100% of the available 360 points, the final score will be 150 points (100%). If a respondent receives 324 (90%) of the available 360 points, the final score will be 135 points (90%). If a respondent receives 36 (10%) of the available 360 points, the final score will be 15 points (10%). The SRC is plainly referring to the broad Medicaid- eligible population when it says “the target population (TANF, ABD, dual eligible).” “Dual eligible” populations are persons eligible for Medicaid and Medicare. There, as throughout the ITN, the ITN delineates between a target population of all Medicaid-eligible patients and a specialty population as described in a respondent’s ITN proposal. The clear instructions for SRC 6 require, “Use the drop-down box to select the state for which you are reporting and enter the performance measure rates (to the hundredths place, or XX.XX) for that state's Medicaid population for the appropriate calendar year.” Community did not comply. General SRC 14 solicits similar data, in similar form using a similar tool, about a respondent’s Consumer Assessment of Healthcare Providers and Systems (CAHPS). CAHPS data is basically a satisfaction survey. It asks respondents to provide “in table format the target population (TANF, ABD, dual eligible) and the respondent’s results for the Consumer Assessment of Healthcare Providers and Systems (CAHPS) items/composites specified below for the 2017 survey for its adult and child populations for the respondent’s three (3) largest Medicaid Contracts (as measured by number of enrollees).” Just like General SRC 6 did with HEDIS data, General SRC 14 ITN instructed bidders to put their CAHPS data for the “target population (TANF, ABD, dual eligible)” “for the respondent’s three (3) largest Medicaid Contracts (measured by number of enrollees)” for multiple states into an excel spreadsheet “to the hundredths place[.]” Also, like General SRC 6, General SRC 14 includes an objective formula described in the ITN for scoring bidders’ CAHPS data. RANKING PROVISIONS Attachment A at (D)(4)(c)(2) stated: Each response will be individually scored by at least three (3) evaluators, who collectively have experience and knowledge in the program areas and service requirements for which contractual services are sought by this solicitation. The Agency reserves the right to have specific sections of the response evaluated by less than three (3) individuals. 
The ITN's example of how total point scores would be calculated, discussed below, also indicated that some sections may be scored by less than three evaluators. The explanatory chart had a column for "[o]ther Sections evaluated by less than three (3) evaluators." The Agency's policy, however, has been to assign at least three evaluators to score program specific SRCs. Attachment A at (D)(4)(e)(2) advised respondents how the agency will rank the competing responses. It was clear and specific, even providing an example of the process showing how the scores "will" be calculated. Step one of the explanatory chart stated that the Agency would calculate a total point score for each response. Step two stated that "[t]he total point scores will be used to rank the responses by an evaluator. . . ." Next, the rankings by the evaluator are averaged to determine the average rank for each respondent. This average ranking is critical because ranking is how the ITN said the Agency would select respondents for negotiation and how the Agency did select respondents for negotiation. The step two and step three charts, reproduced below, demonstrate that the ITN contemplated an evaluation process in which each response was to be evaluated in its entirety by three different evaluators, or maybe less than three, but indisputably in its entirety by those who evaluated it. This did not happen.

Step 2: The total point scores will be used to rank the responses by evaluator (Response with the highest number of points = 1, second highest = 2, etc.).

POINTS SUMMARY (Evaluator A / Evaluator B / Evaluator C / Evaluator D)
Respondent 1: 446 / 396 / 311 / 413
Respondent 2: 425 / 390 / 443 / 449
Respondent 3: 397 / 419 / 389 / 435
Respondent 4: 410 / 388 / 459 / 325

RANKING SUMMARY (Evaluator A / Evaluator B / Evaluator C / Evaluator D)
Respondent 1: 1 / 2 / 4 / 3
Respondent 2: 2 / 3 / 2 / 1
Respondent 3: 4 / 1 / 3 / 2
Respondent 4: 3 / 4 / 1 / 4

Step 3: An average rank will be calculated for each response for all the evaluators.
Respondent 1: 1+2+4+3 = 10; 10 ÷ 4 = 2.5
Respondent 2: 2+3+2+1 = 8; 8 ÷ 4 = 2.0
Respondent 3: 4+1+3+2 = 10; 10 ÷ 4 = 2.5
Respondent 4: 3+4+1+4 = 12; 12 ÷ 4 = 3.0

PROVIDER SERVICE NETWORK PROVISIONS

Florida law permits a PSN to limit services provided to a target population "based on age, chronic disease state, or medical condition of the enrollee." This allows a PSN to offer a specialty plan. For each region, the eligible plan requirements of section 409.974(1) state, "At least one plan must be a provider service network if any provider service networks submit a responsive bid." Section 409.974(3) says: "Participation by specialty plans shall be subject to the procurement requirements of this section. The aggregate enrollment of all specialty plans in a region may not exceed 10 percent of the total enrollees of that region." The ITN addressed those requirements. The Negotiation Process section of Attachment A, Instructions and Special Conditions, says: The Agency intends to invite the following number of respondents to negotiation: Comprehensive Plans The top four (4) ranking Comprehensive Plans.
Long-term Care Plus Plans The top two (2) ranking Long-term Care Plus Plans Managed Medical Assistance Plans The top two (2) ranking Managed Medical Assistance Plans Specialty Managed Medical Assistance Plans The top two (2) ranking Specialty Managed Medical Assistance Plans per specialty population. If there are no provider service networks included in the top ranked respondents listed above, the Agency will invite the highest ranked PSN(s) to negotiations in order to fulfill the requirements of Section 409.974(1), Florida Statutes and Section 409.981(1), Florida Statutes. Emphasis supplied. The ITN specifications in Section D.7, titled Number of Awards, state as follows about Specialty Plan awards: 7. Number of Awards In accordance with Sections 409.966, 409.974, and 409.981, Florida Statutes, the Agency intends to select a limited number of eligible Managed Care Plans to provide services under the SMMC program in Region 10. The Agency anticipates issuing the number of Contract awards for Region 10 as described in Table 5, SMMC Region, below, excluding awards to Specialty MMA Plans. Table 5 SMMC Region Region Total Anticipated Contract Awards Region 10 4 If a respondent is awarded a Contract for multiple regions, the Agency will issue one (1) Contract to include all awarded regions. The Agency will award at least one (1) Contract to a PSN provided a PSN submits a responsive reply and negotiates a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. A respondent that is awarded a Contract as a Comprehensive Plan is determined to satisfy the requirements in Section 409.974, Florida Statutes and Section 409.981, Florida Statutes and shall be considered an awardee of an MMA Contract and a LTC Contract. The Agency will issue one (1) Contract to reflect all awarded populations in all awarded regions. In addition to the number of Contracts awarded in this region, additional Contracts may be awarded to Specialty Plans that negotiate terms and conditions determined to be the best value to the State and negotiate a rate acceptable to the Agency. The Agency, at its sole discretion, shall make this determination. The Agency reserves the right to make adjustments to the enrollee eligibility and identification criteria proposed by a Specialty Plan prior to Contract award in order to ensure that the aggregate enrollment of all awarded Specialty Plans in a region will not exceed ten percent (10%) of the total enrollees in that region, in compliance with Section 409.974(3), Florida Statutes. If a respondent is awarded a Contract as a Specialty Plan and another plan type, the Agency will issue one (1) Contract to include all awarded populations in all awarded regions. A prospective vendor asked about the interplay of Specialty Plan options and the PSN requirements. The question and the answer provided in Addendum 2 follow: Q. Please clarify the number of PSN awards per region and how PSN awards will be determined based on the PSN's plan type (e.g., Comprehensive, LTC Plus, MMA, Specialty). As you know, Sections 409.974 and 409.981, Florida Statutes require one MMA PSN and one LTC PSN award per region (assuming a PSN is responsive) and the Agency has stated that an award to a Comprehensive Plan PSN will meet the requirements of both statutes. However, can the Agency further clarify whether other types of PSNs would meet the statutory requirements? Specifically, would a PSN LTC Plus award meet the requirements of Section 409.981, Florida Statutes? 
Similarly, would an award to a Specialty Plan PSN meet the requirements of Section 409.974, Florida Statutes?

A. See Attachment A Instructions and Special Conditions, Section D Response Evaluations, and Contract Award, Sub-Section 7 Number of Awards. Yes, a PSN LTC Plus award would meet the requirements of Section 409.981(2). A Specialty Plan PSN would not meet the requirements of Section 409.974(1).

The only reasonable interpretation of this answer is that Specialty Plan PSNs do not satisfy the requirement to contract with a responsive PSN imposed by section 409.974. None of the prospective vendors, including Community, challenged this clarification.

EVALUATION PROCESS

THE EVALUATORS

The Agency selected 11 people to evaluate the proposals. The Agency assigned each person a number used to identify who was assigned to which task and to track performance of evaluation tasks. The procurement officer sent the evaluators a brief memo of instructions. It provided dates; described logistics of evaluation; emphasized the importance of independent evaluation; and prohibited communicating about the ITN and the proposals with anyone other than the procurement office. The Agency also conducted an instructional session for evaluators.

Evaluator 1, Marie Donnelly: During the procurement, Ms. Donnelly was the Agency's Chief of the Bureau of Medicaid Quality. She held this position for five years before resigning. This bureau bore responsibility for ensuring that the current SMMC plans met their contract requirements for quality and quality improvement measures. Her role specifically included oversight of Specialty Plans.

Evaluator 2, Erica Floyd Thomas: Ms. Thomas is the chief of the Bureau of Medicaid Policy. She has worked for the Agency since 2001. Her Medicaid experience includes developing policies for hospitals, community behavioral health, residential treatment, and contract oversight. Before serving as bureau chief, she served as an Agency administrator from 2014 through 2017. Ms. Thomas oversaw the policy research and development process for all Medicaid medical, behavioral, dental, facility, and clinic coverage policies to ensure they were consistent with the State Plan and federal Medicaid requirements.

Evaluator 3, Rachel LaCroix, Ph.D.: Dr. LaCroix is an administrator in the Agency's Performance Evaluation and Research Unit. She has worked for the Agency since 2003. All her positions have been in the Medicaid program. Dr. LaCroix has served in her current position since 2011. She works with the performance measures and surveys that the current SMMC providers report to the Agency. Dr. LaCroix is a nationally recognized expert on healthcare quality metrics like HEDIS. She is also an appointee on the National Association of Medicaid Directors' task force for national performance measures.

Evaluator 4, Damon Rich: Mr. Rich has worked for the Agency since April 2009. He is the chief of the Agency's Bureau of Recipient and Provider Assistance. This bureau interacts directly with AHCA's current SMMC care providers about any issues they have, and with Medicaid recipients, usually about their eligibility or plan enrollment. Before Mr. Rich was a bureau chief, he worked as a field office manager for the Agency. Mr. Rich's experience as bureau chief and field office manager includes oversight of the current SMMC Specialty Plans.

Evaluator 5, Eunice Medina: Ms.
Medina is the chief of the Agency's Bureau of Medicaid Plan Management, which includes a staff of over 60 individuals who manage the current SMMC contracts. Her experience and duties essentially encompass all aspects of the current SMMC plans. Ms. Medina started working with the Agency in 2014.

Evaluator 6, Devona "DD" Pickle: Ms. Pickle most recently joined the Agency in 2011. She also worked for the Agency from November 2008 through November 2010. Ms. Pickle's Agency experience all relates in some way to the Medicaid program. Since March 2013, Ms. Pickle has served as an administrator over managed care policy and contract development in the Bureau of Medicaid Policy. Her job duties include working with the current SMMC contractors. Ms. Pickle is also a Florida-licensed mental health counselor.

Evaluator 7, Tracy Hurd-Alvarez: Ms. Hurd-Alvarez has worked for the Agency's Medicaid program since 1997. Since 2014, she has been a field office manager, overseeing compliance monitoring for all the current SMMC contractors. Before assuming her current position, Ms. Hurd-Alvarez implemented the LTC SMMC program.

Evaluator 8, Gay Munyon: Ms. Munyon is currently the Chief of the Bureau of Medicaid Fiscal Agent Operations. Ms. Munyon began working with the Agency in April 2013. Ms. Munyon's bureau oversees fulfillment of the Agency's contract with the current SMMC fiscal agent. Her unit's responsibilities include systems maintenance and modifications and overseeing the fiscal agent, which answers phone calls, processes claims, and processes applications. Ms. Munyon has 25 years of experience working with the Medicaid program.

Evaluator 9, Laura Noyes: Ms. Noyes started working for the Agency in April 2011. Her years of Agency experience all relate to the Medicaid program, including overseeing six current comprehensive managed care plans by identifying trends in contractual non-compliance.

Evaluator 10, Brian Meyer: Mr. Meyer is a CPA who has worked for the Agency in the Medicaid program since 2011. He is currently chief of the Bureau of Medicaid Data Analytics. Mr. Meyer's primary responsibility is overseeing the capitation rates for the current SMMC contractors. His experience includes Medicaid plan financial statement analysis, surplus requirement calculation analysis and, in general, all types of financial analysis necessary to understand financial performance of the state's Medicaid plans.

Evaluator 11, Ann Kaperak: Since April 2015, Ms. Kaperak has served as an administrator in the Agency's Bureau of Medicaid Program Integrity. Ms. Kaperak's unit oversees the fraud and abuse efforts of the current SMMC plans. She also worked for the Medicaid program from November 2012 through May 2014. Ms. Kaperak worked as a regulatory compliance manager for Anthem/Amerigroup's Florida Medicaid program between May 2014 and April 2015.

Positive and Community challenge the Agency's plan selections by questioning the qualifications of the evaluators. The first part of their argument is that the evaluators did not have sufficient knowledge about HIV/AIDS and its treatment. The evidence does not prove the theory. For instance, Positive's argument relies upon criticizing the amount of clinical experience evaluators had managing patients with HIV/AIDS. That approach minimizes the fact that the managed care plan characteristics involve so much more than disease-specific considerations.
For instance, many of the components require determining if the respondent provided required documents, verifying conflict of interest documents, management structure, quality control measures, and the like. General SRCs asked for things like dispute resolution models (SRC 16), claims processing information (SRC 17), and fraud and abuse compliance plans (SRC 31). MMA SRCs included criteria like telemedicine (SRC 4), demonstrated progress obtaining executed provider agreements (SRC 6), and a credentialing process (SRC 12). Specialty SRCs included criteria like copies of contracts for managed care for the proposed specialty population (SRC 1), specific and detailed criteria defining the proposed specialty population (SRC 4), and the like. The evidence does not prove that disease-specific experience is necessary to evaluate responses to these and other SRCs. SRC 6 involving HEDIS data and SRC 14 involving CAHPS data are two good examples. They required respondents to input data into a spreadsheet. All the evaluators had to do was determine what those numbers showed. Evaluation did not require any understanding of disease or how the measures were created. All the evaluator had to know was the number in the spreadsheet.

The second part of the evaluator qualification criticisms is that the evaluators did not give adequate weight to some responses. Positive and Community just disagree with the measures requested and the evaluation of them. They conclude from that disagreement that the evaluators' qualifications were deficient. The argument is not persuasive. The last sentence of paragraph 69 of Positive's proposed recommended order exemplifies the criticisms of Positive and Community of the evaluators' qualifications. It states, "The fact that PHC [Positive] was ranked last among competing HIV plans shows that the SRC evaluators did not understand enough about managing individuals with HIV/AIDs to score its proposal competently." The argument is circular and "ipse dixit." It does not carry the day. The collective knowledge and experience of the evaluators, with a total of 128 years of Medicaid experience, made them capable of reasonably evaluating the managed care plan proposals, including the Specialty Plan proposals. The record certainly does not prove otherwise.

EVALUATION PROCESS

The Agency assigned the evaluators to the SRCs that it determined they were qualified to evaluate and score. The Agency did not assign entire responses to an evaluator for review. Instead, it elected a piecemeal review process, assigning various evaluators to various sections (the SRCs) of each response. Paragraph 30 of the Agency's proposed recommended order describes this decision as follows:

Although the ITN had contemplated ranking each vendor by evaluator, based on an example in the ITN, such ranking presumed a process where all evaluators scored all or nearly all of the responses to the ITN, which had occurred in the procurement five years ago. In this procurement, each evaluator reviewed only a subset of SRCs based on their knowledge, and experience; therefore, ranking by evaluator was not logical because each had a different maximum point score.

The initial SRC scoring assignments were:

General SRCs 1 through 4, LTC SRCs 1 and 2, and Specialty SRC 1: Marie Donnelly, Laura Noyes, and Brian Meyer.

General SRCs 5 through 8, MMA SRCs 1 through 7, LTC SRCs 3 and 4, and Specialty SRCs 1 and 2: Marie Donnelly, Erica Floyd-Thomas, and Rachel LaCroix.
General SRCs 9 through 14, MMA SRCs 8 through 11, LTC SRCs 5 through 7, and Specialty SRC 4: Damon Rich, Eunice Medina, and DD Pickle.

General SRCs 15 through 17, MMA SRCs 12 and 13, and LTC SRCs 8 through 10: Damon Rich, Tracy Hurd-Alvarez, and Gay Munyon.

General SRCs 18 through 25, MMA SRCs 14 through 20, LTC SRCs 11 and 12, and Specialty SRC 5: Erica Floyd-Thomas, Eunice Medina, and DD Pickle.

General SRCs 26 through 33 and LTC SRC 13: Gay Munyon, Ann Kaperak, and Brian Meyer.

General SRCs 34 through 36 and MMA SRC 21: Marie Donnelly, Rachel LaCroix, and Tracy Hurd-Alvarez.

The ranking process presented in the ITN and described in paragraphs 62-64 contemplated ranking each respondent by evaluator. The Agency carried this process over from an earlier procurement. In this procurement, despite what the ITN said, the Agency assigned responsibilities so that each evaluator reviewed only a subset of SRCs. Therefore, the ranking of responses by evaluator presented in the ITN could not work. It was not even possible because no one evaluator reviewed a complete response and because each SRC had a different maximum point score. Instead, the Agency, contrary to the terms of the ITN, ranked proposals by averaging the "total point scores" assigned by all of the evaluators. The Agency considered issuing an addendum advising the parties of the change. The addendum would have informed the respondents and provided them an opportunity to challenge the change. The Agency elected not to issue an addendum.
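The practical difference between the two approaches can be seen in a short sketch that applies both to the ITN's illustrative Step 2 point totals reproduced earlier. This is an illustration only: it assumes the ITN's example figures rather than any actual proposal scores, and the function names are hypothetical. With those figures, averaging per-evaluator ranks and averaging the raw total point scores order the respondents differently, which shows why the method used to combine evaluators' scores can affect which respondents are invited to negotiate.

```python
# Minimal sketch using only the ITN's illustrative Step 2 numbers (hypothetical
# function names; not actual proposal scores or the Agency's software).

points = {  # total point scores from the ITN's example, by Evaluator A, B, C, D
    "Respondent 1": [446, 396, 311, 413],
    "Respondent 2": [425, 390, 443, 449],
    "Respondent 3": [397, 419, 389, 435],
    "Respondent 4": [410, 388, 459, 325],
}

def average_rank(points):
    """ITN method: rank responses within each evaluator (1 = highest points),
    then average each respondent's ranks across evaluators."""
    names = list(points)
    ranks = {name: [] for name in names}
    for col in range(4):  # one column per evaluator
        ordered = sorted(names, key=lambda n: points[n][col], reverse=True)
        for rank, name in enumerate(ordered, start=1):
            ranks[name].append(rank)
    return {name: sum(r) / len(r) for name, r in ranks.items()}

def average_points(points):
    """Method the Agency actually used: average the total point scores."""
    return {name: sum(vals) / len(vals) for name, vals in points.items()}

print(average_rank(points))    # Respondent 2 = 2.0; Respondents 1 and 3 = 2.5; 4 = 3.0
print(average_points(points))  # ordering differs: Respondent 2, then 3, then 4, then 1
```

Under the ITN's rank-averaging example, Respondent 1 ties Respondent 3 and finishes ahead of Respondent 4; averaging the raw point totals instead places Respondent 1 last. The two methods are not interchangeable even when every evaluator scores every response.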
EVALUATION AND SCORING

The evaluators began scoring on November 6, 2017, with a completion deadline of December 29, 2017. The 11 evaluators had to score approximately 230 separate responses to the ITNs. The evaluators had to score 67,175 separate items to complete the scoring for all responses for all regions for all types of plans. No one at the Agency evaluated how much time it should take to score a particular item. None of the parties to this proceeding offered persuasive evidence to support a finding that scoring any particular item would or should take a specific length of time or that scoring all of the responses would or should take a specific length of time.

Evaluators scored the responses in conference room F at the Agency's headquarters. This secure room was the exclusive location for evaluation and scoring. Each evaluator had a dedicated workspace equipped with all tools and resources necessary for the task. The workspaces included a computer terminal for each evaluator. The system allowed evaluators to review digital copies of the ITN and proposals and to enter evaluation points in spreadsheets created for the purpose of recording scores. Evaluators also had access to hard copies of the proposals and the ITN. The Agency required evaluators to sign in and to sign out. The sign-in and sign-out sheets record the significant amount of time the evaluators spent evaluating proposals. Evaluators were not permitted to communicate with each other about the responses. To minimize distractions, the Agency prohibited cell phones, tablets, and other connected devices in the room. The Agency also authorized and encouraged the evaluators to delegate their usual responsibilities. Agency proctors observed the room and evaluators throughout the scoring process. They were available to answer general and procedural questions and to ensure that the evaluators signed in and signed out. A log sheet documented how much time each evaluator spent in the scoring conference room.

Some evaluators took extensive notes. For example, Ms. Medina took over 200 pages of notes. Similarly, Ms. Munyon took nearly 400 pages of typewritten notes. The evaluators worked hard. None, other than Dr. LaCroix, testified that they did not have enough time to do their job.

The computer system also automatically tracked the evaluators' progress. Tracking reports showed the number of items assigned to each evaluator and the number of scoring items completed. The first status report was generated on December 8, 2017, approximately halfway through the scheduled scoring. At that time, only 28 percent of the scoring items were complete. Ms. Barrett usually ran the status reports in the morning. She made them available to the evaluators to review. The pace of evaluation caused concern about timely completion and prompted discussions of ways to accelerate scoring. Because it was clear that the majority of the evaluators would not complete scoring their SRCs by December 29, 2017, the Agency extended the scoring deadline to January 12, 2018. It also extended the hours for conference room use.

Most respondents filed proposals for more than one type of plan and more than one region. This fact, combined with the provision in the instructions saying that all statewide SRC responses must be identical for each region and that scores would transfer to each applicable region's score sheets, enabled evaluators to score many SRCs just once. The system would then auto-populate the scores to the same SRC for all proposals by that respondent. This time-saving measure permitted scoring on many of the items to be almost instantaneous after review of the first response to an SRC.

The fact that so many respondents submitted proposals for so many regions and types of plans provided the Agency another opportunity for time-saving. The Agency loaded Adobe Pro on the evaluators' computers as a timesaving measure. This program allowed the evaluators to compare a bidder's Comprehensive Plan Proposal to the same company's regional and Specialty Plan proposals. If the Adobe Pro comparison feature showed that the proposal response was the same for each plan, the Agency permitted evaluators to score the response once and assign the same score for each item where the respondent provided the same proposal. This sped up scoring. It meant, however, that for SRCs where evaluators did this, they were not reviewing the SRC response in the specific context of the specialty plan population, each of which had specific and limited characteristics that made them different from the broader General and MMA plan populations. This is significant because so many SRCs required narrative responses where context would matter.

There is no Specialty SRCs A-4 instruction requirement for specialty plans analogous to the requirement that responses for statewide SRCs must be identical for each region. In other words, the instructions do not say all SRCs marked as statewide must be identical for each specialty plan proposal and that the Agency will evaluate each Statewide SRC once and transfer the score to each applicable Specialty Plan score. In fact, according to the procurement officer, the Agency expected that evaluators would separately evaluate and score the statewide SRCs for Comprehensive Plans and for Specialty Plans, even if the same bidder submitted them.
Despite the Agency's expectation and the absence of an authorizing provision in the ITN, many evaluators, relying on the Adobe Pro tool, copied the SRC scores they gave to a respondent's comprehensive plan proposal to its specialty plan proposal if the respondent submitted the same response to an SRC for a Comprehensive Plan and a Specialty Plan. For instance, Ms. Thomas (Evaluator 2) and Ms. Munyon (Evaluator 8) did this to save time. Ms. Donnelly (Evaluator 1) did this even when the comprehensive and specialty responses were not identical. This does not amount to the independent evaluation of the responses pledged by the ITN.

On separate days, Evaluator 1 scored 1,315 items, 954 items, 779 items, and 727 items. On separate days, Evaluator 2 scored 613 items, 606 items, 720 items, 554 items, and 738 items. Evaluator 4 scored 874 items on one day. Evaluator 5 scored 813 items in one day. Evaluator 6 scored 1,001 items in one day. Evaluator 8 scored 635 items in one day. The record does not identify the items scored. It also does not permit determining how many of the item scores resulted from auto-population or assignment of scores based upon previous scoring of an identical response. It bears repeating, however, that the record does not support any finding on how long scoring the response to one SRC or an entire response could reasonably be expected to take.

Even with the extended scoring period and time-saving measures, the Agency concluded that Evaluator 3 would not be able to finish all of the SRCs assigned to her. Rather than extend the deadline for scoring a second time, the Agency decided to reassign the nine of Evaluator 3's SRCs that she had not begun scoring to two other evaluators. The Agency did not include scores of other SRCs for which Evaluator 3 had not completed scoring. The Agency only counted Evaluator 3's scores for an SRC if she scored the SRC for everyone. The result was that only two people scored nine of the Specialty Plan SRCs. The Agency did not reassign all of Evaluator 3's SRCs. It only reassigned the SRCs to evaluators who were qualified to evaluate the items, who were not already assigned those items to score, and who had already finished or substantially completed their own evaluations. The decision to reassign the SRCs was not based on any scoring that had already been completed.

The Agency did not allow changes to data submitted by any of the vendors. It allowed vendors to exchange corrupted electronic files for ones which could be opened and allowed vendors to exchange electronic files to match up with the paper copies that had been submitted. The Agency allowed Community to correct its submission where it lacked a signature on its transmittal letter and allowed Community to exchange an electronic document that would not open. It did not allow Community to change its reported HEDIS scores, which were submitted in the decimal form required by the instructions. Community erred in the numbers that it reported. There is no evidence showing that other vendors received a competitive or unfair advantage over Community in the Agency's review of the SMI Specialty Plan submission for Region 10. There was no evidence that the Agency allowed any other vendors to change any substantive information in their submittals for that proposed specialty in that region.

HEDIS ISSUES

Positive asserts that Simply's proposal is non-responsive because Simply submitted HEDIS data from the general Medicaid population in response to SRC 6 and MMA SRC 14.
Positive contends that Simply obtained a competitive advantage by supplying non-HIV/AIDS HEDIS data in response to SRC 6 and MMA SRC 14 because HIV/AIDS patients are generally a sicker group and require more care and because some HEDIS measures cannot be reported for an HIV/AIDS population. HEDIS stands for Healthcare Effectiveness Data and Information Set, a set of standardized performance measures widely used in the healthcare industry. The instructions for both SRC 6 and MMA SRC 14 provide, in relevant part:

The respondent shall describe its experience in achieving quality standards with populations similar to the target population described in this solicitation. The respondent shall include in table format, the target population (TANF, ABD, dual eligible), the respondent's results for the HEDIS measures specified below for each of the last two (2) years (CY 2015/HEDIS 2016 and CY 2016/HEDIS 2017) for the respondent's three (3) largest Medicaid Contracts (measured by number of enrollees). If the respondent does not have HEDIS results for at least three (3) Medicaid Contracts, the respondent shall provide commercial HEDIS measures for the respondent's largest Contracts. If the Respondent has Florida Medicaid HEDIS results, it shall include the Florida Medicaid experience as one (1) of three (3) states for the last two (2) years. (JE 1 at 75 (SRC 6); JE 1 at 158 (MMA SRC 14)).

SRC 6 and MMA SRC 14 instruct respondents to provide HEDIS measures for "the target population (TANF, ABD, dual eligible)." Id. TANF, ABD, and dual eligible are eligibility classifications for the Medicaid population. The Agency sought information regarding the target Medicaid-eligible population, even from respondents proposing a Specialty Plan, because Specialty Plans are required to serve all of the healthcare needs of their recipients, not just the needs related to the criteria making those recipients eligible for the Specialty Plan.

Following the instructions in SRC 6 and MMA SRC 14, Simply provided HEDIS data from the Medicaid-eligible population for its three largest Medicaid contracts as measured by the total number of enrollees. For the requested Florida HEDIS data, Simply utilized legacy HEDIS data from Amerigroup Florida, Inc., a Comprehensive Plan. Amerigroup and Simply had merged in October 2017. Therefore, at the time of submission of Simply's proposal, the HEDIS data from Amerigroup Florida was the data from Simply's largest Medicaid contract in Florida for the period requested by the SRCs.

Positive asserts that the Agency impermissibly altered scoring criteria after the proposals were submitted when the Agency corrected technical issues within a HEDIS Measurement Tool spreadsheet. SRC 6 and MMA SRC 14 required the submission of numeric data for the requested HEDIS performance measures. To simplify submission of that numeric data, the Agency required respondents to utilize a HEDIS Measurement Tool spreadsheet. The evaluation criteria for SRC 6 and MMA SRC 14 provided that respondents would be awarded points if the reported HEDIS measures exceed the national or regional mean for such performance measures. Some respondents, including Positive, entered "N/A," "small denominator," or other text inputs into the HEDIS Measurement Tool.
During the evaluation and scoring process, the Agency discovered that if a respondent input any text into the HEDIS Measurement Tool, the tool would assign random amounts of points, even though respondents had not input measurable, numeric data. The Agency reasonably resolved the problem by removing any text and inserting a zero in place of the text. The correction of the error in the HEDIS Measurement Tool prevented random points from being awarded to respondents and did not alter scores in any way contrary to the ITN. It was reasonable and fair to all respondents.
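The nature of the correction can be illustrated with a minimal sketch. This is not the Agency's actual spreadsheet logic; the function names, the point value, and the benchmark figure are hypothetical. The only behavior taken from the findings is that text entries such as "N/A" or "small denominator" are treated as zero and that points are awarded only when a reported, numeric HEDIS result exceeds the applicable national or regional mean.

```python
# Minimal sketch (hypothetical names and values) of the HEDIS Measurement Tool
# correction: text entries become zero so that only measurable, numeric results
# can earn points for exceeding a benchmark mean.

def to_numeric(entry):
    """Coerce a reported HEDIS cell to a number; any text ('N/A', 'small
    denominator', etc.) becomes 0, mirroring the correction described above."""
    try:
        return float(entry)
    except (TypeError, ValueError):
        return 0.0

def score_measure(reported, benchmark_mean, points_if_above=1):
    """Award points only when the cleaned, numeric result exceeds the mean."""
    value = to_numeric(reported)
    return points_if_above if value > benchmark_mean else 0

national_mean = 55.0  # assumed benchmark, for illustration only
print(score_measure("62.4", national_mean))               # 1 point
print(score_measure("small denominator", national_mean))  # 0 points, not random
```

Treated this way, a non-numeric entry simply earns no points for that measure; it cannot generate spurious points the way the uncorrected spreadsheet did.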

Recommendation

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order rejecting all responses to the ITNs to provide a Medicaid Managed Care plan for patients with HIV/AIDS in Regions 10 and 11.

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for patients with serious mental illness.

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order inviting Community to negotiate to provide a Medicaid Managed Care plan in Region 10 for child welfare specialty services.

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order awarding Wellcare of Florida, Inc., d/b/a Staywell Health Plan of Florida, a contract for a specialty Medicaid Managed Care plan for patients with Serious Mental Illness in Region 10.

Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Agency for Health Care Administration enter a final order dismissing the Petition in Case No. 18-3513.

DONE AND ENTERED this day of , , in Tallahassee, Leon County, Florida.

S
JOHN D. C. NEWTON, II
Administrative Law Judge
Division of Administrative Hearings
The DeSoto Building
1230 Apalachee Parkway
Tallahassee, Florida 32399-3060
(850) 488-9675
Fax Filing (850) 921-6847
www.doah.state.fl.us

Filed with the Clerk of the Division of Administrative Hearings this day of , .

USC (1): 42 U.S.C. 1396u. Florida Laws (9): 120.57, 20.42, 287.057, 409.912, 409.962, 409.966, 409.97, 409.974, 409.981.
SOUTHPOINTE PHARMACY vs DEPARTMENT OF HEALTH AND REHABILITATIVE SERVICES, 92-003321F (1992)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Jun. 01, 1992 Number: 92-003321F Latest Update: Apr. 29, 1994

Findings Of Fact Petitioner, South Beach Pharmacy, Inc., d/b/a Southpointe Pharmacy (Southpointe), was at all times material hereto, a pharmacy located in Dade County, Florida, and a provider under the Medicaid program. Respondent, Department of Health and Rehabilitative Services (DHRS), was and is the state agency responsible for regulating the Medicaid program in Florida. Pharmacies participating in the Medicaid program are subject to routine audits, which are coordinated by the DHRS Office of Program Integrity. In early 1989 a routine audit of Southpointe was conducted by the Professional Foundation for Health Care (PFHC) at the request of DHRS. Following that audit, DHRS reasonably determined that further investigation was warranted and asked PFHC to perform an audit referred to as an "aggregate analysis." PFHC performed the aggregate analysis audit as instructed and determined that an overpayment had been made to Southpointe. The PFHC audit was submitted to and reviewed for accuracy and correctness by the Office of Program Integrity. In August 1989, DHRS took proposed final agency action against Southpointe in the form of a letter which, among other things, demanded repayment of funds alleged to have been overpaid under the Medicaid program, assessed an administrative fine against Southpointe, and terminated Southpointe from the Medicaid program for two years. That letter was revised in September 1989 to change the amount of the alleged overpayment. The alleged overpayment was challenged by Southpointe. The matter was submitted to the Division of Administrative Hearings and assigned DOAH Case No. 89-6057. At the times pertinent to this proceeding DHRS had not adopted the "aggregate analysis" methodology by rule. Instead, DHRS relied on incipient, non-rule policy and attempted, without success, to explicate its reasons for relying on this methodology. The first time the "aggregate analysis" methodology was used in an effort to determine the overpayment by Medicaid to a pharmacy was in the case of David's Pharmacy v. Department of Health and Rehabilitative Services, DOAH Case No. 88-1668 (Final Order entered September 15, 1988). The Final Order entered in the David's Pharmacy case specifically recognized that DHRS was not entitled to rely on non-rule policy in imposing sanctions against a provider because of the wording of Section 409.266(11)(g), Florida Statutes (1989), which limits the imposition of sanctions against a Medicaid provider to situations where the provider is not in compliance with the Florida Administrative Code. Further, the Final Order in the David's Pharmacy case concluded that the aggregate analysis methodology was flawed and, consequently, could not be relied upon by DHRS in determining that an overpayment had been made. Although DHRS again attempted to rely on the aggregate analysis methodology in the audit of Southpointe, DHRS had not adopted the aggregate analysis methodology as a rule (even though there were no changes in the governing statute) and it did not cure all the defects in the methodology that were specifically raised by the Final Order in David's Pharmacy. The only material change in the aggregate analysis procedure between the time of the David's Pharmacy final order and the time it was used to audit Southpointe was the elimination of the use of a Medicaid percentage applied to the quantities of audited drugs. The Recommended Order submitted by the undersigned following the formal hearing in the underlying proceeding (DOAH Case No. 
89-6057) found that DHRS had not adopted the "aggregate analysis" methodology as a rule and that DHRS had not explicated its policy in attempting to rely on this non-rule policy. The Recommended Order concluded that DHRS had failed to prove any overpayment to Southpointe. The Recommended Order also found that certain data relied on by PFHC in performing the aggregate analysis was unreliable, which resulted in the amount of claimed overpayment being overstated. While DHRS was not aware that this data was unreliable, this data merely affected the amount of the overpayment. It was DHRS's continued reliance on the aggregate analysis that led DHRS to the assertion that there had been an overpayment. DHRS, by its Final Order in DOAH Case No. 89-6057, rejected many of the facts and the conclusion contained in the Recommended Order. Instead, DHRS determined that there had been an overpayment to Southpointe, demanded repayment of the alleged overpayment, imposed an administrative fine in the amount of $250.00, and suspended Southpointe as a Medicaid provider for three months. Thereafter, Southpointe appealed DHRS's Final Order to the First District Court of Appeal. The First District Court of Appeal reversed DHRS and concluded, in pertinent part, as follows: . . . Therefore, as found by the hearing officer, the Department was proceeding not under any existing rule but rather under incipient policy. That finding was based upon competent and substantial evidence, and we hold it was a gross abuse of discretion for the Department to reject that finding of fact. * * * . . . [W]e agree with the hearing officer that HRS failed in its mission to support and defend the aggregate analysis with competent and substantial evidence. In an earlier final order issued by the Department, David's Pharmacy v. Department of Health and Rehabilitative Services, 11 FALR 2935 (HRS 1988), wherein aggregate analysis was utilized for the first time, the Department found HRS had not appropriately explicated this non-rule policy by its failing to produce evidence that would establish a rational, reasonable basis for the procedure. In the instant case, despite rather pat testimony to the effect that the aggregate analysis is indeed contemplated by the rule, it was shown that HRS had not checked a single Medicaid patient to determine if the medication had been dispensed, or a single physician to see if the medication had been prescribed. Robert Peirce testified that the only thing HRS had done since David's Pharmacy, was to delete the requirement of utilizing a "percentage of Medicaid sales" from the formula. As pointed out by Southpointe, none of the other shortcomings of aggregate analysis which were identified in the David's final order were remedied by HRS at the hearing below. For example, neither a beginning nor ending inventory had been taken into consideration, and no consideration was given to whether Southpointe had acquired additional drugs to augment its inventory by means other than direct purchase from its manufacturers 1/ DHRS has failed to establish that it was substantially justified in taking action against Southpointe based on the aggregate analysis methodology. There was no evidence to show that an award of fees and costs to Southpointe would be unjust in this case. Southpointe has become obligated to pay costs and attorney's fees in excess of $15,000.00, the maximum allowable recovery under the Equal Access to Justice Act. The parties stipulated that these costs and fees are reasonable. 
Petitioner is a prevailing small business within the meaning of Section 57.111, Florida Statutes, and has met all conditions precedent for such an award.

Florida Laws (3): 120.57, 120.68, 57.111.
