RICHARD LEE vs. DEPARTMENT OF HEALTH AND REHABILITATIVE SERVICES, 77-002211 (1977)
Division of Administrative Hearings, Florida Number: 77-002211 Latest Update: Jun. 02, 1978

Findings Of Fact In 1955, the Department of Health and Rehabilitative Services (HRS) entered into a contract with the federal Social Security Administration under which HRS agreed to evaluate applications for social security disability benefits and to allow or disallow claims accordingly. At the time of the hearing, respondent employed 343 persons who were involved in adjudicating some 2,500 cases every week. On the average, every claim allowed has a value of ninety thousand dollars ($90,000.00). From 1955 through 1972, every determination was reviewed in a central office in Baltimore, Maryland. Since then, as an economy measure required by federal legislation, review has been limited to more or less random samples, and most determinations have gone unreviewed by anybody outside of the office making the determination.

The original means of choosing samples for review involved the selection of a one- or two-digit number, which was changed periodically. When applicants' social security numbers ended in the chosen numerals, their cases were sent to another office for review, after the initial determinations had been made. Under this system, the primary evaluators could tell which cases would be reviewed if they found out what digits were being used. This created possibilities for non-random samples.

While the first sampling method was in effect, one of respondent's district supervisors, Robert Melcher, limited his review of determinations that had been made in HRS' Orlando office, which he headed, to the cases of applicants whose social security numbers ended in digits designated as calling for inclusion in samples to be mailed elsewhere for review. If one of these determinations struck him as problematic, he might direct that additional work be done with a view toward the matter's eventually being reconsidered.
This sometimes had the incidental effect of delaying final processing in Orlando until a time when different digits had been announced for review-sample purposes, so that the case escaped any review outside the Orlando office.

Petitioner is one of two unit supervisors who answer directly to Mr. Melcher. The other is Creighton Hoyt. Each unit supervisor supervises a team of medical disability examiners. Detailed statistics are kept with respect to the job performance of each medical disability examiner, each unit, and each branch office in Florida. As one result, a definite rivalry between Unit I and Unit II has grown up in the Orlando office. James Drake, who has succeeded to the position vacated by petitioner's demotion, was formerly a medical disability examiner in Unit II, headed by Mr. Hoyt. Ms. Johnnie M. Sherrod worked as a medical disability examiner in Unit I, headed by petitioner. Harry Jackson Speir, Jr., also worked as a medical disability examiner in HRS' Orlando office, starting in June of 1973.

From 1975 until the time of the hearing, a second method of selecting samples for review purposes obtained in HRS' Orlando office, in accordance with national guidelines. Becky Bowman, a clerk IV or "coder," divided the files she received from the medical disability examiners and their supervisors into two piles. In one pile were files involving claims arising exclusively under Title II of the Social Security Act, and in the other pile were files involving all other claims. Mrs. Bowman, who is technically directly answerable to Leon Simkins, in Tallahassee, was supervised on a day-to-day basis by petitioner and Mr. Melcher. She was told to pick every tenth, then, beginning in February of 1977, every fourteenth case from the first pile for mailing to Tallahassee for quality assurance review; every thirtieth case in the first pile she was told to mail to Atlanta; and every fortieth case in the second pile Mrs. Bowman was told to mail to Baltimore.
Although the procedures have stayed more or less constant since 1975, the intervals at which cases were to be selected changed in February of 1977 and again in December of 1977. Originally, Mrs. Bowman only sorted, while Ms. Margaret Dingfelder prepared the files for mailing.

From time to time, Mr. Speir took cases he was working on to Mr. Melcher to ask for advice on difficult points. In three separate instances, Mr. Melcher said to Mr. Speir, referring to the case they were discussing, "This is one we don't want to go to Q.A. [the quality assurance section]," or words to that effect. In each instance, Mr. Speir attached a note to the case file before delivering it to Mrs. Bowman. The notes read something like "Not for Q.A." On one occasion, Mrs. Bowman asked Mr. Speir who had authorized bypassing quality assurance review, and Mr. Speir told her that Mr. Melcher had authorized it.

Ms. Elaine Keir, who worked as Mr. Melcher's secretary from June of 1973 until October of 1976, remembers occasions when Mr. Melcher told medical disability examiners that more evidence should be gathered for a particular case. She remembers other occasions when Mr. Melcher told medical disability examiners to see that a particular case was not included in the quality assurance review sample. She had the impression that Mr. Melcher, who was concerned that his office's processing-time statistics compare favorably with other branch offices' statistics, asked for further evidence in cases on which relatively little time had been spent, while suggesting bypassing review procedures in cases in which considerable time had already been expended but in which problems persisted nonetheless.

From time to time, petitioner Lee, who began work with respondent as a medical disability examiner supervisor in May of 1973, received instructions from Mr. Melcher, his immediate supervisor, to see that a particular file was not sent for quality assurance review.
Aware of the impropriety of interfering with sampling procedures intended to ensure randomness, petitioner began, on March 18, 1976, to keep a record of Mr. Melcher's requests. He received 22 such requests through September of 1977. Initially, petitioner attached a note to any file designated by Mr. Melcher as one to be diverted. The notes read "No Q.A." and were intended as directives to Mrs. Bowman. Mrs. Bowman, also aware of the impropriety of sabotaging the sampling procedures, suggested to petitioner that he dispense with the notes. At her suggestion, petitioner began laying files sideways in a tray on Mrs. Bowman's desk whenever he had been told by Mr. Melcher that a file should not be sent elsewhere for review. Petitioner never indicated to Mrs. Bowman in any way that a case should be diverted from a quality assurance review sample unless Mr. Melcher had first directed him to do so. On three occasions, Ms. Sherrod heard Mr. Melcher tell petitioner to see that Mrs. Bowman did not include cases in quality assurance review samples.

In August or September of 1977, James Drake noticed a file turned sideways in a tray on Mrs. Bowman's desk. When he started to move it, she stopped him, saying that petitioner wanted the case routed around quality assurance review. Mr. Drake reported this incident to Mr. Hoyt, upon the latter's return from vacation. Mr. Hoyt, who had earlier heard a similar story from his secretary, summoned Mrs. Bowman to his office and listened to her confirm the reports that had preceded her. On September 21, 1977, Mr. Hoyt wrote Mr. Melcher a memorandum entitled "Inequities existing between Unit I & Unit II," in which he set out, inter alia, what he had been told by Mrs. Bowman. Mr. Hoyt sent a copy of this memorandum to James C. Russ, which resulted in Mr. Russ' investigating the charges. Inasmuch as petitioner admitted what he had done, Mr. Russ' investigation was short and straightforward.

Petitioner did not accuse Mr. Melcher when he was originally interrogated on these matters, and Mr. Melcher denied complicity. Petitioner's superiors greeted petitioner's accusation with skepticism when it did come. They nonetheless conducted a somewhat perfunctory second investigation, which apparently failed to uncover sufficient evidence to satisfy them that Mr. Melcher had indeed directed petitioner to divert certain cases so that they would not be included in quality assurance review samples. Even so, Mr. Melcher's superiors did reprimand Mr. Melcher, orally and in writing, for the part they perceived him to have played in these events.
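The file-routing rule in these findings is, in effect, systematic sampling: every Nth file in a pile is pulled for outside review, so removing a file from the pile before the count shifts which cases are sampled, which is why the diversions at issue defeated the procedure's randomness. The mechanics can be sketched as follows (a hypothetical illustration only; the intervals are the February 1977 figures from the findings, while the function and case names are invented):

```python
# Systematic-sampling sketch of the review-routing rule described in the
# findings: every Nth case in a pile is selected for outside review.
def route_for_review(pile, interval, destination):
    """Select every `interval`-th case in `pile` for review at `destination`."""
    return [(case, destination) for i, case in enumerate(pile, start=1)
            if i % interval == 0]

# Pile of claims arising exclusively under Title II (hypothetical case names).
title_ii_pile = [f"case-{n}" for n in range(1, 43)]

# Per the findings (as of February 1977): every 14th Title II case to
# Tallahassee for quality assurance review, every 30th to Atlanta.
to_tallahassee = route_for_review(title_ii_pile, 14, "Tallahassee (Q.A.)")
to_atlanta = route_for_review(title_ii_pile, 30, "Atlanta")

print([c for c, _ in to_tallahassee])  # ['case-14', 'case-28', 'case-42']
print([c for c, _ in to_atlanta])      # ['case-30']
```

Because selection depends only on a file's position in the pile, holding a file back until after the count (or laying it aside, as described above) silently changes the sample without leaving any trace in the selection rule itself.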

Recommendation Upon consideration of the foregoing, it is RECOMMENDED: That respondent rescind petitioner's demotion, and issue a written reprimand to petitioner instead. DONE and ENTERED this 19th day of April, 1978, in Tallahassee, Florida.

ROBERT T. BENTON, II
Hearing Officer
Division of Administrative Hearings
Room 530, Carlton Building
Tallahassee, Florida 32304
(904) 488-9675

COPIES FURNISHED:

Carlton L. Welch, Esquire
331 Laurina Street, No. 547
Jacksonville, Florida 32216

Douglas E. Whitney, Esquire
1350 North Orange Avenue
Winter Park, Florida 32789

Mrs. Dorothy B. Roberts
Appeals Coordinator
530 Carlton Building
Tallahassee, Florida 32304

Carroll Webb, Executive Director
Administrative Procedures Committee
Room 120, Holland Building
Tallahassee, Florida 32304

=================================================================
AGENCY FINAL ORDER (AGAINST SUSPENSION)
=================================================================

STATE OF FLORIDA CAREER SERVICE COMMISSION

IN THE APPEAL OF
RICHARD LEE, AGAINST SUSPENSION
Petitioner,
vs. CASE NO. 77-2211
BY THE DEPARTMENT OF HEALTH AND REHABILITATIVE SERVICES, OFFICE OF DISABILITY DETERMINATIONS,
Respondent.
/

# 1
DEPARTMENT OF CHILDREN AND FAMILY SERVICES vs NORWOOD ELEMENTARY AFTER SCHOOL, 02-001664 (2002)
Division of Administrative Hearings, Florida Filed:Miami, Florida Apr. 29, 2002 Number: 02-001664 Latest Update: Jan. 17, 2003

The Issue The issue in this case concerns whether the Respondent, Norwood Elementary After School Program, should be fined $200.00 for violation of licensing standards applicable to childcare facilities as alleged in the Charging Document dated December 20, 2001.

Findings Of Fact Norwood Elementary School is a public elementary school located in Miami-Dade County, Florida. The Norwood Elementary After School Program is a childcare facility licensed by the Petitioner. The Norwood Elementary After School Program is operated by the YWCA of Greater Miami. The YWCA of Greater Miami has a long history in child care in the Miami-Dade County community. The YWCA of Greater Miami operates seven early childhood centers, three of which are nationally accredited, and 18 after school sites, caring for over 2,500 children every day.

The Norwood Elementary After School Program is a center for children with special needs. It provides services to children who have various disabilities, including children who are profoundly handicapped, trainable handicapped, and severely mentally retarded. Eighty percent of the staff of the Norwood Elementary After School Program are employed by Miami-Dade County Public Schools and are individuals who work with the special needs children during the day as well as after school. These are individuals who are specially trained to work with special needs children. These staff members know the children as well as their parents, and the children know them. Rosalind Dunwoody, the on-site director of the Norwood Elementary After School Program, is an employee of Dade County Public Schools. She works as a paraprofessional at Norwood Elementary during the school day and for the Norwood Elementary After School Program in the afternoons.

D.D. is a four-year-old boy who is in the special needs program at Norwood Elementary during the day and also is in the Norwood Elementary After School Program. D.D. is in the profoundly retarded classroom at Norwood Elementary and his behavior is impulsive. For his own safety, when he is being fed or when he is sitting to play or color or do some other task, D.D. is secured in his chair by a seat belt.
It is a regular wooden child’s chair with a seatbelt, similar to the seatbelts used on airplanes, that fastens around the child’s hips, waist, or lower abdomen. Neither the child’s arms nor legs are restrained. D.D. uses a similar chair during the daytime program at Norwood Elementary and he is able to buckle and unbuckle the seatbelt himself, usually buckling himself in when he sits in the chair to be fed. Such chairs are utilized for similarly disabled children when they are sitting to eat or to perform some task. The purpose of the chair is to ensure the safety of the child. D.D.’s parents have provided Norwood with written permission to buckle D.D. into the chair for his safety. D.D. has never been buckled in his chair for disciplinary purposes or for the purpose of protecting other children in the program.

On April 25, 2001, Ms. Reiling, the Petitioner's licensing counselor responsible for the Norwood Elementary After School Program, received a complaint from the mother of another special needs child to the effect that her daughter had been bitten at the Norwood After School Program by D.D. On April 26, 2001, the day after she received the complaint, Ms. Reiling visited the site, and she observed D.D. in his chair with the seatbelt buckled. Ms. Reiling suggested to Ms. Dunwoody, the Director of Norwood, that ambulatory children should be separated from non-ambulatory children and that D.D. should not be strapped to a chair. That same day, Ms. Reiling prepared an inspection report in which she noted: “Children must not be strapped to chair for discipline.” The report prepared by Ms. Reiling also noted that, in response to Ms. Reiling’s suggestions, Ms. Dunwoody explained that the chair was not being used for disciplinary purposes. Ms. Reiling’s report states: “Ms. Dunwoody states child not put in chair for discipline.”

Subsequent to her April 26, 2001 inspection, Ms. Reiling returned to Norwood on May 16, 2001, and on May 30, 2001. On both of these visits, Ms. Reiling noted that, while the ambulatory children were separated from the non-ambulatory children as she had suggested, D.D. was buckled in his chair. When she again mentioned that she did not believe it was good practice to buckle D.D. into his chair, the staff explained that D.D. was buckled in his chair because they were about to feed him.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Department issue a final order dismissing the charges against the Respondent. DONE AND ENTERED this 10th day of September, 2002, in Tallahassee, Leon County, Florida.

MICHAEL M. PARRISH
Administrative Law Judge
Division of Administrative Hearings
The DeSoto Building
1230 Apalachee Parkway
Tallahassee, Florida 32399-3060
(850) 488-9675 SUNCOM 278-9675
Fax Filing (850) 921-6847
www.doah.state.fl.us

Filed with the Clerk of the Division of Administrative Hearings this 10th day of September, 2002.

Florida Laws (5): 120.569, 120.57, 402.301, 402.310, 402.319
# 2
JOHN L. WINN, AS COMMISSIONER OF EDUCATION vs PETER NEWTON, 05-000102PL (2005)
Division of Administrative Hearings, Florida Filed:Clearwater, Florida Jan. 13, 2005 Number: 05-000102PL Latest Update: Sep. 06, 2005

The Issue The issues in the case are whether the allegations set forth in the Administrative Complaint filed by Petitioner against Respondent are correct, and, if so, what penalty should be imposed.

Findings Of Fact Respondent is a Florida teacher, holding Florida Educator's Certificate 780153 (covering the area of Emotionally Handicapped education) valid through June 30, 2007. At all times material to this case, Respondent was employed as a teacher of emotionally handicapped third-grade students at Skycrest Elementary School in the Pinellas County School District. Respondent was employed by the Pinellas County School Board as a teacher of emotionally handicapped students for more than six years.

The Pinellas County School District assessed student and instructional performance through the use of the "Pinellas Instructional Assessment Portfolio." The portfolio consisted of two tests administered three times each school year. The tests were known as the "Parallel Reading-Florida Comprehensive Assessment Test" and the "Parallel Math-Florida Comprehensive Assessment Test." The portfolio tests were used by the school district to gauge progress towards meeting the Sunshine State Standards established by the Florida Department of Education (DOE) to determine the academic achievement of Florida students. The portfolio tests, administered over a two-day period, also served to prepare students to take the Florida Comprehensive Assessment Test (FCAT). The FCAT was administered according to requirements established through the DOE and was designed to measure progress towards meeting Sunshine State Standards. Third-grade students were required to achieve a passing score on the FCAT in order to move into the fourth grade.

One of the purposes of the portfolio tests was to measure student progress and provide information relative to each student's abilities. Based on test results, additional instruction was provided to remedy academic deficiencies and further prepare students to pass the FCAT. Emotionally handicapped students were required to take the reading and the math portfolio tests.
The school district had specific procedures in place related to administration of the tests. Teachers responsible for administration of the tests received instruction on appropriate test practices. Respondent was aware of the rules governing administration of the tests. The procedures permitted teachers to offer general encouragement to students, but teachers were prohibited from offering assistance. Teachers were not allowed to read questions to students. Teachers were not permitted to provide any information to students related to the content of test responses. During the December 2002 testing period, Respondent provided improper assistance to the nine emotionally handicapped students he taught. During the test, Respondent reviewed student answers to multiple-choice questions and advised students to work harder on the answers, indicating that the answers were incorrect. Respondent assisted students by reading questions, helping students to pronounce words and phrases, and advising students as to the location in the test materials where answers could be found. Some of Respondent's students were apparently overwhelmed by the test process and ceased working on the tests. Respondent reviewed their progress and advised the students to continue answering questions. There is no evidence that Respondent directly provided answers to students, but Respondent clearly assisted students to determine which responses were correct. The assistance provided by Respondent to his students exceeded that which was allowed under test rules. Respondent acknowledged that the assistance was inappropriate, but asserted that he did so to provide confidence to the students that they could take and pass the FCAT, and advance to the fourth grade. Respondent's improper assistance to his students prevented school officials from obtaining an accurate measurement of the academic abilities of his students. The test results were invalidated and the students were retested. 
According to the parties, a newspaper article related to the matter was published in a local newspaper.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that Petitioner enter a final order reprimanding Respondent for violating Florida Administrative Code Rule 6B-1.006(3)(a), and placing him on probation for a period of one year. DONE AND ENTERED this 18th day of May, 2005, in Tallahassee, Leon County, Florida.

S
WILLIAM F. QUATTLEBAUM
Administrative Law Judge
Division of Administrative Hearings
The DeSoto Building
1230 Apalachee Parkway
Tallahassee, Florida 32399-3060
(850) 488-9675 SUNCOM 278-9675
Fax Filing (850) 921-6847
www.doah.state.fl.us

Filed with the Clerk of the Division of Administrative Hearings this 18th day of May, 2005.

COPIES FURNISHED:

Kathleen M. Richards, Executive Director
Education Practices Commission
Department of Education
325 West Gaines Street, Room 224
Tallahassee, Florida 32399-0400

Mark Herdman, Esquire
Herdman & Sakellarides, P.A.
2595 Tampa Road, Suite J
Palm Harbor, Florida 34684

Ron Weaver, Esquire
Post Office Box 5675
Douglasville, Georgia 30154-0012

Marian Lambeth, Program Specialist
Bureau of Educator Standards
Department of Education
325 West Gaines Street, Suite 224-E
Tallahassee, Florida 32399-0400

Daniel J. Woodring, Esquire
Department of Education
1244 Turlington Building
325 West Gaines Street
Tallahassee, Florida 32399-0400

Florida Laws (3): 1012.01, 1012.795, 120.57
# 3
PATRICK F. MURPHY, JR. vs DEPARTMENT OF CHILDREN AND FAMILY SERVICES, 99-004884 (1999)
Division of Administrative Hearings, Florida Filed:Jacksonville, Florida Nov. 19, 1999 Number: 99-004884 Latest Update: Feb. 07, 2001

The Issue Is Petitioner entitled to receive developmental services from the Department of Children and Family Services (the Department), due to his developmental disability based on retardation, pursuant to Chapter 393, Florida Statutes?

Findings Of Fact Petitioner was born on June 21, 1979, and at the time of the hearing was 21 years of age. Petitioner was evaluated at the University Hospital of Jacksonville in Jacksonville, Florida, in 1982, at two and one-half years of age. A report from that evaluation indicated that Petitioner was afflicted with a seizure disorder, speech delay, and right-sided dysfunction.

During September 1985, at age six years, three months, he was evaluated at the Hope Haven Children's Clinic, in Jacksonville, Florida. His hearing was tested and determined to be normal. A psychological evaluation noted that his communication skills were below his age level. A report of this evaluation indicates he was a slow learner with weaknesses in processing, retaining, and retrieving information, particularly in the area of speech and language development. On January 21, 1986, Petitioner was again evaluated at the Hope Haven Children's Clinic in Jacksonville, Florida. It was noted at that time he had difficulty in following directions and performing in a regular school environment. He was far behind his classmates academically. During this evaluation he was administered a Peabody Individual Achievement Test and received a standard score of 75 on both mathematics and reading recognition. These scores are above the range of retardation.

Petitioner was examined by the School Psychology Services Unit, Student Services, of the Duval County School Board, on February 17, 1987, when he was seven years and seven months of age. At the time he was receiving "specific learning disabilities full time services" while at Englewood Elementary School in Jacksonville, Florida. It was noted that he was difficult to evaluate because he was easily distracted. During the evaluation, on the Wechsler Intelligence Scale for Children-Revised, Petitioner received a full-scale intelligence quotient (IQ) of 74.
This score indicated that he was below average in verbal abstract reasoning, verbal expression and practical knowledge, visual attentiveness, visual analysis, and visual synthesis. He was determined to be within the "slow learner's" range of development.

Petitioner was referred to School Social Work Services, Duval County School Board in Jacksonville, Florida, on January 9, 1990. He was referred to the School Psychology Services Unit, where a Wechsler Intelligence Scale for Children-Revised was administered on June 11, 1990. His full-scale IQ was determined to be 74. He was also administered a Vineland assessment, which measures adaptive behavior rather than intelligence. In this case, Petitioner's mother provided answers regarding Petitioner's behavior and adaptability. At the time of this assessment, he was almost 11 years of age.

On December 18, 1996, Petitioner was evaluated by the Sand Hills Psychoeducational Program in Augusta, Georgia, when he was 17 years of age. He was administered a battery of tests. The WAIS-R indicated borderline intellectual ability, but not retardation. His Stanford-Binet composite score was 56; this score included a verbal reasoning score of 58 and an abstract visual reasoning score of 72. The split in the scores generates doubt as to the validity of the test. Psychologist Cydney Yerushalmi, Ph.D., an expert witness for the Department, and psychologist Barbara Karylowski, Ph.D., an expert witness for the Petitioner, opined that the Stanford-Binet was inappropriate for a person who had attained the age of 17 because it would tend to produce lower scores.

Dr. Karylowski tested Petitioner's IQ in February and March 2000. She concluded that Petitioner had a full-scale IQ of 68, which is mild retardation. At the time of that test Petitioner was 20 years of age. Dr. Karylowski opined that the scores she obtained were consistent with all of the scores obtained in prior testing because the confidence interval for his IQ was 68 to 77.
This would place Petitioner within the range of retardation. Two standard deviations from the mean IQ is 70. It is Dr. Karylowski's opinion that Petitioner is mildly retarded. Her opinion is based on criteria set forth in the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV), published by the American Psychiatric Association. The DSM-IV definition of retardation requires significantly subaverage general intellectual functioning that is accompanied by significant limitations in adaptive behavior. The onset must occur before the age of 18 years. Accordingly, she believes that Petitioner was retarded before attaining the age of 18. Luis Quinones, M.D., was accepted as an expert witness in the field of psychiatry. Dr. Quinones opined that Petitioner meets the definition of mental retardation under DSM-IV. This means that he believes Petitioner was retarded before attaining the age of 18. He gave great weight to the Petitioner's lack of adaptive functioning in forming this opinion. Dr. Quinones opined that the definition of retardation in the DSM-IV is essentially the same as that found in Section 393.063(44), Florida Statutes. Dr. Yerushalmi evaluated Petitioner and reviewed his medical records. She administered the Wechsler Adult Intelligence Scale, Third Edition, to Petitioner, on August 11, 1999. Petitioner had a verbal score of 74, a performance score of 75, and a full-scale score of 72. She opined that Petitioner was not retarded under the definition set forth in Section 393.063(44), Florida Statutes. The aforementioned statute requires that one must meet a threshold of two or more standard deviations from the mean IQ, in order to be classified as retarded. All measures of IQ have a statistical confidence interval or margin of error of approximately five points, according to the DSM-IV. 
If one accepts the lower range of the confidence interval of the scores Petitioner has attained over the years, then he meets the two-or-more-standard-deviation threshold. Acceptance of the upper limits of the confidence interval would indicate that Petitioner clearly does not fall within the range of retardation. The significance of the confidence interval is reduced substantially when test results produced over a long period of time, by different test administrators, all indicate that Petitioner's IQ is not two or more standard deviations from the mean. Petitioner was diagnosed with many disorders by a variety of practitioners prior to the age of 18. No diagnosis of mental retardation was ever made.

At age 21, Petitioner often behaves as if he were much younger, has focused on an 11-year-old as a girlfriend, and may become violent when not properly medicated. He likes to play pretend games of the sort that one would think would entertain a child. For instance, he likes to pretend that he is a law enforcement officer when he rides in a car. He prefers interacting with children who are five or six years younger. He is deficient in the area of personal hygiene. He likes to act silly. He is incapable of holding a driver's license.

At the time of the hearing, Petitioner lived with his aunt, Ms. Mary Margaret Haeberle, who is a special education school teacher. She has provided a nurturing environment for Petitioner. Although Petitioner's parents divorced when he was a child, they have worked to address his needs. His younger sister understood Petitioner's problems and attempted to ameliorate them.

Upon consideration of all of the evidence, it is found that Petitioner was not possessed of an IQ which was two or more standard deviations from the mean. Therefore, there is no need to consider his adaptive function in concluding that he is not retarded.
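The arithmetic underlying the two-standard-deviation threshold in these findings can be sketched as follows. The mean of 100, standard deviation of 15, and approximately five-point margin of error are the figures cited in the findings and attributed to the DSM-IV; the function itself is a hypothetical illustration, not part of the record:

```python
# Threshold arithmetic from the findings: Wechsler-type IQ scales have a
# mean of 100 and a standard deviation of 15, so "two or more standard
# deviations from the mean" means a score of 100 - 2*15 = 70 or below.
MEAN, SD, MARGIN = 100, 15, 5  # margin of error approx. 5 points per DSM-IV

threshold = MEAN - 2 * SD  # 70

def meets_threshold(full_scale_iq: int) -> str:
    """Compare a score's confidence interval against the statutory cutoff."""
    low, high = full_scale_iq - MARGIN, full_scale_iq + MARGIN
    if high <= threshold:
        return "below threshold even at the upper bound"
    if low <= threshold:
        return "straddles the threshold (outcome depends on which bound is accepted)"
    return "above threshold even at the lower bound"

# Petitioner's repeated full-scale score of 74 has a confidence interval of
# roughly 69 to 79, which straddles the cutoff of 70 -- exactly the ambiguity
# the findings describe.
print(threshold)            # 70
print(meets_threshold(74))
```

This is why the findings turn on which end of the confidence interval is accepted: a score of 74 meets the cutoff at its lower bound (69) but clearly exceeds it at its upper bound (79).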

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law set forth herein, it is RECOMMENDED: That the Department enter a final order finding that Petitioner is not entitled to receive developmental services due to a developmental disability based on retardation. DONE AND ENTERED this 4th day of January, 2001, in Tallahassee, Leon County, Florida.

HARRY L. HOOPER
Administrative Law Judge
Division of Administrative Hearings
The DeSoto Building
1230 Apalachee Parkway
Tallahassee, Florida 32399-3060
(850) 488-9675 SUNCOM 278-9675
Fax Filing (850) 921-6847
www.doah.state.fl.us

Filed with the Clerk of the Division of Administrative Hearings this 4th day of January, 2001.

COPIES FURNISHED:

Michael R. Yokan, Esquire
1301 Riverplace Boulevard, Suite 2600
Jacksonville, Florida 32207

Robin Whipple-Hunter, Esquire
Department of Children and Family Services
5920 Arlington Expressway
Jacksonville, Florida 32231-0083

Virginia A. Daire, Agency Clerk
Department of Children and Family Services
Building 2, Room 204B
1317 Winewood Boulevard
Tallahassee, Florida 32399-0700

Josie Tomayo, General Counsel
Department of Children and Family Services
1317 Winewood Boulevard
Building 2, Room 204
Tallahassee, Florida 32399-0700

Florida Laws (3): 120.57, 393.063, 393.066
# 5
MIKAEL A. FERNANDEZ vs DEPARTMENT OF CHILDREN AND FAMILY SERVICES, 02-000226 (2002)
Division of Administrative Hearings, Florida Filed:Miami, Florida Jan. 16, 2002 Number: 02-000226 Latest Update: Oct. 14, 2002

The Issue Whether the Petitioner is eligible to enroll in the Developmental Disabilities Program administered by the Respondent.

Findings Of Fact Based on the oral and documentary evidence presented at the final hearing and on the entire record of this proceeding, the following findings of fact are made: The Department is the state agency charged with administering and determining eligibility for services to developmentally disabled individuals pursuant to Florida's Developmental Disabilities Prevention and Community Services Act, Chapter 393, Florida Statutes. Section 393.065, Florida Statutes (2001). The program developed by the Department is known as the Developmental Disabilities Program.

Mr. Fernandez is 31 years of age and a resident of Miami, Florida. Mr. Fernandez submitted an application to the Department requesting that it enroll him in its Developmental Disabilities Program and provide him services as a developmentally disabled individual under the categories of retardation and autism. The Department evaluated Mr. Fernandez's application and determined that he was not eligible to receive services through the Developmental Disabilities Program under either category. In making this determination, the Department considered a Psychological Evaluation Report dated June 26, 2001, that was prepared by Hilda M. Lopez, Ph.D., a licensed clinical psychologist to whom Mr. Fernandez was referred by the Department.

To assess Mr. Fernandez's intellectual functioning and cognitive abilities, Dr. Lopez administered the Wechsler Adult Intelligence Scale-Third Edition ("WAIS"). According to her report, Mr. Fernandez attained a Verbal I.Q. score of 80 points, a Performance I.Q. score of 80 points, and a Full Scale I.Q. score of 78 points. These scores place Mr. Fernandez in the Borderline range of intellectual functioning. The Department considers persons who score 70 points or less on the WAIS to be mentally retarded. The mean score on the WAIS is 100 points, and the standard deviation is 15 points.

To assess Mr. Fernandez's adaptive behavior, Dr.
Lopez administered the Vineland Adaptive Behavior Scales ("Vineland"). Mr. Fernandez attained an Adaptive Behavior Composite score of 66, which indicates that his adaptive behavior is in the low range. His scores reveal deficits in the domains of Living Skills, Communication, and Socialization. Dr. Lopez also tested Mr. Fernandez for autism using the Childhood Autism Rating Scale. In the report, Dr. Lopez noted that Mr. Fernandez was rated by his father and by Dr. Lopez after observing, interacting, and interviewing Mr. Fernandez. Dr. Lopez reported that Mr. Fernandez's score was 26.5 points, which places him within the non-autistic range. Dr. Lopez observed in the report, however, that Mr. Fernandez "showed the following behavior problems: inappropriate emotional reactions, mildly abnormal fear and nervousness, resistance with [sic] changes in routine, mildly abnormal adaptation to change, and restlessness." A score of 30 points or more on the Childhood Autism Rating Scale is indicative of autism disorder. When Mr. Fernandez was a child of four or five years old, he was apparently diagnosed with autism, and he and his family took part in a behavior modification program in Boston, Massachusetts. As a result of the work done by Mr. Fernandez and his parents in this program, Mr. Fernandez learned to talk, although long after his peers, and improved his social skills. Based on her psychological evaluation of Mr. Fernandez, Dr. Lopez recommended the following: Mr. Fernandez will greatly benefit from a program geared at providing him with help to enhance his functional skills. Facilitation of social services to provide needed support and monitoring. Stimulation program oriented to develop his cognitive skills, to improve attention, memory, verbal communication and problem solving in order to achieve optimal capability. He will benefit from supported employment and referral to Vocational Rehabilitation Services for proper counseling and training. Mr. 
Fernandez was unable to produce any documents relating to his early diagnoses and treatment or his special education placements because these documents were lost in a fire that destroyed the Fernandez home. According to his father, Mr. Fernandez makes friends easily and communicates verbally very effectively. He worked for a while in a family business where his limitations were tolerated, and he flourished in this job. On the other hand, Mr. Fernandez is easily frustrated and confused, and he has difficulty following directions in simple matters. His father is seeking services on Mr. Fernandez's behalf that will teach him to live on his own and to become a productive citizen. The uncontroverted evidence presented by Mr. Fernandez establishes that he is in need of several of the services available through the Department's Developmental Disabilities Program. The evidence presented by Mr. Fernandez is not, however, sufficient to establish that he is eligible to participate in the Developmental Disabilities Program under the eligibility criteria established by the legislature for developmental disabilities.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Department of Children and Family Services enter a final order denying the application of Mikael Fernandez for enrollment in the Developmental Disabilities Program. DONE AND ENTERED this 28th day of June, 2002, in Tallahassee, Leon County, Florida. PATRICIA HART MALONO Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 28th day of June, 2002.

Florida Laws (6) 120.569, 120.57, 393.062, 393.063, 393.065, 393.066
# 6
THE MARION COUNTY SCHOOL BOARD vs DESIREE SEATON, 21-000303 (2021)
Division of Administrative Hearings, Florida Filed:Ocala, Florida Jan. 25, 2021 Number: 21-000303 Latest Update: Jan. 10, 2025

The Issue Whether Respondent (“Desiree Seaton”) violated Petitioner, the School Board of Marion County’s (“the School Board”),1 drug-free workplace policy; and, if so, whether her employment with the School Board should be terminated.

1 The School Board’s official name is “The School Board of Marion County.” § 1001.40, Fla. Stat. (2020)(providing that “[t]he governing body of each school district shall be a district school board. Each district school board is constituted a body corporate by the name of ‘The School Board of County, Florida.’”). The case style has been amended accordingly.

Findings Of Fact Based on the oral and documentary evidence adduced at the final hearing, the entire record of this proceeding, and matters subject to official recognition, the following Findings of Fact are made: The School Board maintains an alcohol and drug-free workplace. Section 6.33 of the School Board’s Human Resources Manual provides that: It is further the intent of the School Board of Marion County to comply with the Omnibus Transportation Employee Testing Act (OTETA), regulations of the Federal Highway Administration (FHWA) contained in 49 CFR Parts 40 and 382, et al, Section 2345.091, Florida Statutes, the provisions of the Drug-Free Workplace Act, and other applicable state and federal safety programs. This policy shall also affirm the Board’s position that an employee in a safety sensitive position may be considered impaired at any measurable level by the use of alcohol and/or controlled substances. Pursuant to OTETA and its implementing regulations, drug and alcohol testing is mandated for all safety sensitive identified employees who function in a safety sensitive position. Section 6.33 further specifies that prohibited substances include “marijuana, amphetamines, opiates, phencyclidine (PCP), and cocaine.” In addition, “[i]llegal use includes the use or possession of any illegal drug, and the misuse of legally prescribed or obtained prescription drugs.” Also, “when the use of a controlled substance is pursuant to the instructions of a physician, the employee shall immediately notify his/her supervisor.” Section 6.33 states that random drug testing “may take place at any time, with or without proximity to driving,” and that there will be random drug testing for “all identified safety sensitive positions.” A “safety sensitive position” is defined as “[a]ny function for which a commercial driver’s license is mandated and in which a driver operates a vehicle designed to carry sixteen (16) or more passengers, a vehicle which weighs 26,001 pounds or more, or a vehicle which carries a placard indicating hazardous cargo.” Furthermore, drug testing shall be conducted by “independent, certified laboratories utilizing recognized techniques.” While the School Board maintains a drug and alcohol-free workplace, it encourages employees with chemical dependency to seek treatment: The School Board of Marion County recognizes that chemical dependency is an illness that can be successfully treated. It is the policy of The School Board of Marion County to seek rehabilitation of employees with a self-admitted or medically determined drug problem. The School Board of Marion County will make every effort to assist those self-admitted and/or referred employees while being treated. Employees who are unwilling to participate in rehabilitation may be subject to appropriate action, pursuant to School Board policy, applicable Florida Statutes, State Board of Education rules, and applicable provisions of collective bargaining agreements. Substance Abuse Program – At any time prior to notification of a required test, an employee is encouraged to contact the Employee Assistance program. Such employees may be required to submit to testing as a part of a treatment program.

3 Ms. Seaton’s exhibits were misnumbered in that there was no Respondent’s Exhibit 10.
The laboratory that conducts drug-testing for the School Board randomly selects individuals who will be tested during the upcoming quarter. The School Board then schedules those individuals for testing throughout the quarter so that a large number of drivers are not unavailable for work at the same time. During the next quarter, a different set of individuals is selected. Brent Carson is the School Board’s Director of Professional Practices. He becomes involved in employee disciplinary cases that rise above the level of a reprimand. Mr. Carson testified that the School Board has no ability to test employees other than the individuals the laboratory selects for testing: Q: To protect the integrity of the random testing, do you have the ability to vary from that random list provided by the outside lab? A: We have to test who they say – who they identify as the random employees. Q: So if you decided to pick and choose – if they pick someone and you said, no, I’m not going to bother with that person today, do you believe that could affect the randomness, if you will, if that’s the right word, of the test procedure, that it could affect the testing procedure and call into question if you start picking and choosing who’s not giving tests to people on the list? A: Yes, that would definitely, I think, impugn the efficacy of having random tests. If an employee has a positive drug test for a prescription medication, then the School Board’s Medical Review Officer (“MRO”) gives that employee three days to produce a valid prescription for that medication. If the employee produces a valid prescription, then the positive test is deemed to be a negative test. In addition, an employee can have a urine sample retested at his or her own expense. If there is no retest and no valid prescription is produced, then the School Board puts the employee on paid administrative leave pending the outcome of disciplinary proceedings.
With regard to the consequences of a positive test, the Manual states that “[c]overed employees testing positive at any level for alcohol or controlled substances are in violation of district policy and will be immediately removed from their safety sensitive positions. A violation of federal, state, or District requirements shall be grounds for dismissal.” Mr. Carson testified that there is no progressive discipline for safety-sensitive positions. The first time an employee tests positive for an illegal substance or one for which that employee does not have a prescription, that employee is recommended for termination. Mr. Carson testified that the Superintendent has always recommended termination for violations of the School Board’s drug-free workplace policy: “Whether it’s random, whether it’s reasonable suspicion or whether it’s a drug test based off of injury, we have always recommended the termination of the employee.” Ms. Seaton Tests Positive for Opioids Ms. Seaton began working for the School Board as a bus driver in December of 2017. On February 5, 2018, Ms. Seaton signed a document acknowledging that bus drivers must “[s]ubmit to random, post accident and reasonable suspicion drug testing.”[4] Ms. Seaton has undergone surgeries in the past and testified that she has been prescribed hydrocodone “for years on and off depending on the surgery.”5 Ms. Seaton claims to be allergic to oxycodone, and it has been her habit to take hydrocodone only when she has excruciating pain.6 Ms. Seaton suffered a work-related injury on October 2, 2020, and described it as follows: I always help out where I can. So we have spare buses that we need to move from one compound to the other, and on this particular day I was taking one of the spare buses back over to another compound. As I was getting off the bus, I always grab with my right hand to the bar and my left hand on the dashboard. My hand slipped off the dashboard and I went forward. And from there I suffered a rotator cuff tear and some other, like, bone spurs.[7] After the accident, Ms. Seaton took a drug test on October 2, 2020, and the test returned negative results for opiates, marijuana, cocaine, amphetamines, propoxyphene, PCP, barbiturates, and benzodiazepines. Medical documentation from an October 5, 2020, evaluation by a workers’ compensation physician indicates Ms. Seaton had a contusion of the left elbow and shoulder, a left shoulder strain, a left elbow strain, and a neck strain. An MRI on January 5, 2021, revealed a posterior labrum tear along with a possible anterior dislocation of her left shoulder. Since her accident, Ms. Seaton had been driving her mother’s car because it is an automatic, and Ms. Seaton has a stick shift. Ms. Seaton flew out-of-town to visit her son in Baltimore on October 22, 2020. Because Ms. Seaton did not want to leave her mother without transportation, she drove her own manual-shift car to and from an airport in Orlando, 90 minutes each way. However, using her left arm for driving caused her a great deal of pain. Upon her return to Florida, Ms. Seaton took a hydrocodone during the night of Sunday, October 25, 2020, because the pain was preventing her from sleeping. The hydrocodone came from a prescription: A: I’ve had hydrocodone prescribed to me for years on and off, depending on the surgery, because I can’t take oxycodone, which is the one that they’re saying came up on my test. The one that I took for – on October 25th, I want to say, it was a Sunday, it was from my previous surgery that I had. ALJ: Hold on. We need to get this straight. It looks like your drug test was October 27th, according to Petitioner’s Exhibit 1. A: Correct. ALJ: Are you telling me you took something prior to – just prior to October 27th? A: Correct. ALJ: What did you take? A: Hydrocodone. ALJ: Did you have a prescription for hydrocodone? A: Yes. During her stay in Baltimore, Ms. Seaton ate two biscuits sprinkled with poppy seeds. On October 26, 2020, and on the morning of October 27, 2020, Ms. Seaton also ate bagels sprinkled with poppy seeds. Ms. Seaton was notified during the morning of October 27, 2020, that she had been selected for drug testing that day. At that point in time, she was on light duty due to her injury and assigned to the transportation help desk.8

4 Prior to the positive drug test at issue in the instant case, Ms. Seaton had no disciplinary issues and had no other positive drug tests.

5 Ms. Seaton had a double knee replacement surgery in August of 2019 and was prescribed hydrocodone. Respondent’s Exhibit 7 is a photograph of a pill bottle indicating that Ms. Seaton had been prescribed 60 hydrocodone pills. However, no date is visible from the photograph.

6 Respondent’s Exhibit 8 is a letter from a physician stating that Ms. Seaton has treated with him since December 21, 2018. The letter notes that Ms. Seaton is allergic to codeine and Premarin. There is no mention of Ms. Seaton being allergic to oxycodone. Also, hydrocodone was not among the medications this particular physician has prescribed for Ms. Seaton.

7 Ms. Seaton has had a difficult recovery from her injury and is dissatisfied with the treatment she received through workers’ compensation. After receiving a second opinion from her primary care physician, Ms. Seaton had shoulder surgery on February 26, 2021. At the time of the final hearing, she did not know whether the surgery would ultimately prove to be a success: “I am still in ongoing treatment. It started October 2nd. I went through holy heck with our – the way that Concentra work[s] – which is the people they use for workmen’s comp – they make you go through physical therapy before you can actually get an MRI done, because they say that it’s required by the insurance company. They had given me ultrasound – not an ultrasound. X-rays when I first had the injuries. And from there they said I had to go through physical therapy, I went through that a month. And then from there I went for an MRI which determined that they saw something, but they couldn’t know exactly. So they, then again, another MRI, a contrast MRI. I want to say I had that done December 23rd where they finally saw that. And we still, let me still – I didn’t have my surgery until February 26th. So from October 2nd to February 26th, I did not have surgery. And I was in constant pain. At nighttime with the rotator cuff, it’s kind of – in the daytime it’s tolerable, but at nighttime it’s excruciating pain, something to do with the way the muscles go. I’m not a doctor, but – I mean, it would be online. But it’s when you’re laying down you’re in a lot of pain. I had pain from my neck all the way shooting to my arm. It would be like a shooting pain and [ ] constant. On December 23rd, when I actually had the MRI to determine that I did have a rotator cuff tear, at that point I got tired of the Concentra doctors because they weren’t doing anything for my pain, and I went to my primary care for a second opinion, [and] he sent me to a pain management doctor. As of December 30th I have been on pain management with him, which is, like, Lyrica and hydrocodone and tramadol. So between the two. I still have therapy like I go three times a week. And I’m expected – like six more weeks. I still can’t – they’re not feeling that I’m where I’m supposed to be at this point. I’m supposed to be able to lift my arm a certain way, and it’s not. So I still have another set of therapy that I have to go through. I’m praying that everything goes back to normal. But I still have neck pain and we’re waiting to see if that clears up, I might have to go back to a neck specialist next.”

On approximately November 4, 2020, the testing laboratory reported that Ms.
Seaton’s urine sample had tested positive for oxycodone and oxymorphone.9 The School Board notified Ms. Seaton on November 5, 2020, that she had been placed on administrative leave, with pay, during the pendency of an internal investigation. Mr. Carson met with Ms. Seaton on December 2, 2020, to inform her of the Superintendent’s recommendation that she be terminated. Ms. Seaton told Mr. Carson that she did not know how she could have tested positive for oxycodone because she is allergic to that medication. Mr. Carson and Ms. Seaton disagree about other aspects of the meeting. Specifically, Ms. Seaton claims that she mentioned during the December 2, 2020, meeting that she took hydrocodone and had a prescription for that medication. Mr. Carson does not recall Ms. Seaton making that comment.10 Mr. Carson and Ms. Seaton spoke again on January 8, 2021, and Ms. Seaton stated for the first time to Mr. Carson that she had taken a long trip during the weekend prior to the October 27, 2020,11 drug test. She relayed that she was experiencing a lot of pain after driving and took some pills to alleviate the pain. According to Mr. Carson, Ms. Seaton did not identify the pills she took, state that she had a prescription, or offer him evidence that she had a prescription for opioids.12 As described above in the Preliminary Statement, Ms. Seaton speculated in her December 11, 2020, response to the Superintendent’s allegations that the positive test result could have been caused by poppy seeds she ate in the days preceding the drug test. This was the first time that Mr. Carson was aware of Ms. Seaton asserting that poppy seeds could have caused her positive test result.13

8 Even though Ms. Seaton was on light duty status, Mr. Carson testified that she was still subject to random drug testing: “Employees that are subject to random drug tests based off of their status because they’re CDL holders and drivers, they’re expected to stay in the pool for random drug tests if they are on light duty. The only time they are removed from that list is if they’re in a no-work status.”

9 The laboratory report entered into evidence was not authenticated, either by a witness or by self-authentication as provided in section 90.902, Florida Statutes (2020). Furthermore, no witness was produced to testify that the laboratory report was a business record and thus subject to an exception to the hearsay rule. The laboratory report is, therefore, unreliable hearsay.

10 During questioning by Petitioner’s counsel, Ms. Seaton claimed that she told the School Board’s MRO about her hydrocodone prescription: Q: Now, the note on the drug test that says it was positive lists oxycodone. Correct? A: Yes. Q: Do you understand that oxycodone is a different drug than hydrocodone? A: Yes. After doing research, yes. Well, actually speaking with the MRO officer, because he called it Percocet and I said, well, that’s impossible because I can’t take Percocet because I’m allergic to it. And so I told him, I said, all the Percocets, all those things, every time I have a surgery the doctors try to give me that and I tell them, no, I can’t have that because I get really sick and break out with [a] rash and vomiting, so they don’t prescribe that. That’s why I get prescribed hydrocodone. Q: So you’re saying that you told the MRO you took hydrocodone? A: Correct, hydro. Q: And even after you told him that, he still reported a positive test. Correct? A: He said he had to go by what he has there.

11 October 27, 2020, was a Tuesday.

12 Ms. Seaton explained during the final hearing that she did not provide the School Board with a copy of her prescription because no one ever asked her to do so.

Ms.
Seaton testified that she did not tell the School Board about her hydrocodone prescription because she was on desk duty following the accident and did not anticipate ever driving a school bus again: ALJ: I guess what I’m struggling with is given your accidents and the pain you were experiencing, it seems perfectly reasonable that you would be on some sort of opioid. I guess on the other hand, you know, if you tested positive, I guess it seems like a reasonable person would show the School Board a prescription for any kind of pain med, regardless [of] whether they tested positive, or not. I guess that’s what I’m struggling a little bit with. * * * So is it your testimony that – according to my notes, there were three – there have been three conversations or discussions between you and the School Board. The first one with Mr. Carson where he told you about the positive test. And let me just clarify. During that first conversation, did you mention the hydrocodone? A: Yes, I did. With Mr. Carson in the first conversation. ALJ: All right. So you disagree with his testimony that during the first conversation you said simply, I have no idea how that tested positive? * * * A: Yes. And I did ask him because I wanted to remember that, I said to him, as much pain as I was in, if I had to do it again, I would. But the difference is I would tell my supervisor. Because I really didn’t – in the role that I was in, which was a desk job, I was not in any safety risk for anyone, I would never get on a bus, nor was I – I knew I wasn’t getting on a bus any time soon with the injury that I had. But I would never, ever put anybody at risk. I wouldn’t even get on a bus because my CDL, I figured my CDL was going to be taken. That’s another thing - - ALJ: Ms. Seaton, did you say, -- I may be mistaken. I thought I heard you testify that you’ve had a hydrocodone prescription for many years. Was that accurate or did I mishear? A: On different occasions for surgeries, correct. * * * ALJ: On the day that you injured your shoulder on the school bus and hurt your rotator cuff, the injury that we were talking about, at that time did you have any hydrocodone prescription? A: Yes. * * * ALJ: Were you taking hydrocodone at that time? A: No, sir.[14] During the final hearing, Ms. Seaton moved Respondent’s Exhibit 7 into evidence, and a portion thereof was a picture of a prescription bottle for 60 hydrocodone pills with Ms. Seaton’s name on the bottle. Ms. Seaton offered the following testimony in support of that Exhibit: ALJ: So, Ms. Seaton, this picture of the prescription bottle, can you give me some background on this? When was this prescribed to you? When do you fill it? Who prescribed it to you, and why? * * * A: The original prescription was prescribed to me in August, and it was for my double knee replacement by Dr. Raymond Weiand at the Orthopedic Institute. Petitioner’s Counsel: August, you said, prior to the injury, August of 2020? A: No, ‘19. * * * ALJ: I think you may have discussed this, but were you taking hydrocodone consistently or without a break from that date to the day of your accident and beyond? A: No, sir. I only took hydrocodone when I had excruciating pain. This is not something that I take on a regular, like – like if I have pain then I was taking it. That’s why I put Exhibit 1, it will state - - it wasn’t in my system. ALJ: But is your testimony that at some point after your accident which resulted in your injured shoulder, is it your testimony that you are taking hydrocodone to relieve the pain resulting from that accident? A: That is correct. The night when I returned from the trip, I was in so much – I kept waking up out of my sleep because the pain was so bad that I took the pill for it to go to sleep, to go back to sleep, because I did not want to miss work. ALJ: Okay. Mr. Levitt, do you have any cross on that issue regarding this exhibit? Petitioner’s Counsel: Let me think --- So you have August 2019 for a knee operation, and when was the last time you took it for the knee operation? Like back in 2019, or as the judge asked, were you continuing to take it? A: I took it around my birthday, July – July 28th of the 2020, I took some then. Petitioner’s Counsel: For what, for your knee? A: Yes. Petitioner’s Counsel: But this was never prescribed for your shoulder. Correct? A: No, sir. Ms. Seaton had left shoulder surgery on February 26, 2021. The post-operative diagnosis notes she had a rotator cuff tear and superior labral tearing.

13 Mr. Carson testified that “[m]y brief understanding of it is that you would have to consume a great deal of poppy seeds for it to alter any type of drug test. I don’t know what that limit is. But that’s not something that we’re able to delineate in a drug test, whether it’s truly a substance or if it’s poppy seeds.” Mr. Carson disclosed that the basis for that aforementioned statement came from “the internet.” Because the School Board elicited no testimony indicating that Mr. Carson has any independent knowledge or expertise with drug testing or a related field, the undersigned does not credit his assertion that someone would have to “consume a great deal of poppy seeds” in order to affect a drug test.

14 Ms. Seaton then testified that her trip to Baltimore resulted in her taking hydrocodone to alleviate pain in her left shoulder.

Ultimate Findings Petitioner’s Exhibit 1 is the only record evidence supporting the School Board’s allegation that Ms. Seaton “provided a urine sample and it was reported as a positive test for opioids.” Petitioner’s Exhibit 1 is a report from a laboratory indicating that the urine sample Ms. Seaton provided on October 27, 2020, tested positive for oxycodone and oxymorphone.
Petitioner’s Exhibit 1 is hearsay in that it is an out-of-court statement being offered to prove the truth of the matter asserted therein, i.e., that Ms. Seaton’s urine sample from October 27, 2020, tested positive for opioids. The School Board did not present a records custodian from the testing laboratory or otherwise attempt to have Petitioner’s Exhibit 1 accepted into evidence under the business records exception to the hearsay rule. There is no record evidence supplementing or corroborating that Ms. Seaton’s urine sample was positive for opioids, the allegation specifically pled in the Administrative Complaint. Thus, there is no evidentiary support for the School Board’s allegation that Ms. Seaton committed “misconduct in office” or that there is “just cause for discipline.”

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Administrative Complaint be DISMISSED. DONE AND ENTERED this 28th day of May, 2021, in Tallahassee, Leon County, Florida. S G. W. CHISENHALL Administrative Law Judge 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 28th day of May, 2021. COPIES FURNISHED: Mark E. Levitt, Esquire Allen, Norton & Blue, P.A. 1477 West Fairbanks Avenue, Suite 100 Winter Park, Florida 32789 Dr. Diane Gullett, Superintendent Marion County Public Schools 512 Southeast 3rd Street Ocala, Florida 34471 Matthew Mears, General Counsel Department of Education Turlington Building, Suite 1244 325 West Gaines Street Tallahassee, Florida 32399-0400 Desiree M. Seaton 5 Hemlock Loop Lane Ocala, Florida 34472

CFR (2) 49 CFR 382; 49 CFR 40 Florida Laws (7) 1001.40, 1012.22, 120.569, 120.57, 90.801, 90.803, 90.902 DOAH Case (1) 21-0303
# 7
WILLIAM MARCH vs DIVISION OF PARI-MUTUEL WAGERING, 94-001251F (1994)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Mar. 07, 1994 Number: 94-001251F Latest Update: Oct. 02, 1995

The Issue Whether Petitioner is entitled to attorney's fees and costs pursuant to Section 57.111, Florida Statutes, and, if so, in what amount.

Findings Of Fact Petitioner, William March (March), is a resident of Miami, Florida. He is a horse trainer and is licensed by the Respondent, Department of Business and Professional Regulation, Division of Pari-Mutuel Wagering (DBPR). March is self-employed. He has a public stable and trains horses for different clients. His clients do not direct his actions. He charges his clients a set rate per day plus expenses. He also receives a certain percentage of the purse for winning horses. March carries workers' compensation insurance. His income tax returns reflect a sole proprietorship. He does not have more than 25 full-time employees. March did not present evidence on the amount of his net worth; thus, I cannot determine whether his net worth is less than $2 million. In April, 1992, March was the trainer of record for a horse named Miami's Finest. On April 22, 1992, Miami's Finest finished first place in the fifth race at the Tampa Bay Downs Racetrack. DBPR took a urine specimen from Miami's Finest after the race. The specimen was sent to DBPR's laboratory in Tallahassee to be analyzed. Four tests were conducted on the sample. The first test is known as a TLC test and is a screening tool. The test resulted in the sample being passed. The second test is known as an ELISA test and is a separate screening procedure. The result of this test was that the sample was called "suspicious." Following the suspicious call on the ELISA test, the sample was sent for confirmation in keeping with the normal procedure. The first confirmation test resulted in an inconclusive finding, though the test did indicate the presence of particular ions for a metabolite of acepromazine. The decision was made to run a second confirmation test. The second confirmation test utilized a re-extraction based on enzyme hydrolysis which provides a cleaner residue to analyze. The test then confirmed the presence of a metabolite of acepromazine based on two criteria, retention time and ions. 
The positive result was signed by the Bureau Chief and a reviewing Chemist Administrator. On June 5, 1992, Walter Blum, State Steward, received a copy of a memorandum from Jane F. Foos, Chief of the Bureau of Laboratory Services, to William E. Tabor, Director of the Division of Pari-Mutuel Wagering. The memorandum stated that a specimen of horse urine designated as sample number 808691 was analyzed and found to contain "2-(1-hydroxyethyl) promazine, a metabolite of acepromazine (tranquilizer), and/or a derivative thereof." Sample information form (DBR form 13-003) indicated that sample 808691 was taken from a horse named Miami's Finest at the Tampa Bay Downs racetrack on April 22, 1992. March was listed as the trainer. On June 6, 1992, March met with Blum and the other stewards. March was told of the positive results of the laboratory tests and officially notified of the charges against him. He told the stewards that he had treated Miami's Finest with acepromazine on April 20, 1992, because the horse was high strung and he was going to ship her to Tampa to run in a race on April 22, 1992. March waived formal notice of hearing. A hearing was held before the stewards on June 25, 1992. Prior to the hearing, March had requested a split sample of the urine for testing by a laboratory. There was an insufficient amount of urine available for a split sample test. The Rulings of the Judges/Stewards dated June 25, 1992, found that March violated Sections 550.241(1) and (3)(a), Florida Statutes, suspended March for 25 days effective June 27, 1992, disqualified Miami's Finest as the first place winner of the fifth race on April 22, 1992, at Tampa Bay Downs, and ordered that the first place money be returned to the horsemen's bookkeeper and redistributed. March appealed the Stewards' rulings to the Division of Pari-Mutuel Wagering (Division). The Stewards' rulings were affirmed by the Division. March appealed the decision of the Division to the Third District Court of Appeal.
In March v. Florida Department of Business Regulation, Division of Pari-Mutuel Wagering, 629 So. 2d 290 (Fla. 3d DCA 1993), the court reversed, stating that the Division had violated its own rules by not providing a split sample and by not staying the hearing once the request for the split sample was made.

Florida Laws (3) 120.57, 120.68, 57.111
KPMG CONSULTING, INC. vs DEPARTMENT OF REVENUE, 02-001719BID (2002)
Division of Administrative Hearings, Florida Filed: Tallahassee, Florida May 01, 2002 Number: 02-001719BID Latest Update: Oct. 15, 2002

The Issue The issue to be resolved in this proceeding concerns whether the Department of Revenue (Department, DOR) acted clearly erroneously, contrary to competition, arbitrarily, or capriciously when it evaluated the Petitioner's submittal in response to an Invitation to Negotiate (ITN) for a child support enforcement automated management system-compliance enforcement (CAMS CE), awarding the Petitioner a score of 140 points out of a possible 230 points and disqualifying the Petitioner from further consideration in the invitation to negotiate process.

Findings Of Fact Procurement Background: The Respondent, the DOR, is a state agency charged with the responsibility of administering the Child Support Enforcement Program (CSE) for the State of Florida, in accordance with Section 20.21(h), Florida Statutes. The DOR issued an ITN for the CAMS Compliance Enforcement implementation on February 1, 2002. This procurement is designed to give the Department a "state of the art system" that will meet all Federal and State Regulations and Policies for Child Support Enforcement, improve the effectiveness of collections of child support, and automate enforcement to the greatest extent possible. It will automate data processing and other decision-support functions and allow rapid implementation of changes in regulatory requirements resulting from revised Federal and State Regulation Policies and Florida initiatives, including statutory initiatives. CSE services suffer from dependence on an inadequate computer system known as the "FLORIDA System," which was not originally designed for CSE and is housed and administered in another agency. The current FLORIDA System cannot meet the Respondent's needs for automation, does not meet the Respondent's management and reporting requirements, and does not provide the flexibility the Respondent needs. The DOR needs a system that will ensure the integrity of its data, will allow the Respondent to consolidate some of the "stand-alone" systems it currently has in place, will remedy certain deficiencies of the FLORIDA System, and will help the Child Support Enforcement system and program secure needed improvements. The CSE is also governed by Federal Policy, Rules and Reporting requirements concerning performance.
In order to improve the CSE's effectiveness in responding to its business partners (the court system, the Department of Children and Family Services, the Sheriff's Departments, employers, financial institutions, and workforce development boards), as well as to Federal requirements, it became apparent that the CSE agency needs a new computer system with the flexibility to respond to the complete requirements of the CSE program. In order to accomplish its goal of acquiring a new computer system, the CSE began the procurement process. The Department hired a team from the Northrup Grumman Corporation, headed by Dr. Edward Addy, to lead the procurement development process. Dr. Addy began a process of defining CSE needs and then developing an ITN which reflected those needs. The process included many individuals in CSE who would be the daily users of the new system. These individuals included Andrew Michael Ellis, Revenue Program Administrator III for Child Support Enforcement Compliance Enforcement; Frank Doolittle, Process Manager for Child Support Enforcement Compliance Enforcement; and Harold Bankirer, Deputy Program Director for the Child Support Enforcement Program. There are two alternative strategies for implementing a large computer system such as CAMS CE: a customized system developed especially for CSE, or a Commercial Off The Shelf/Enterprise Resource Plan (COTS/ERP) system. A COTS/ERP system is a pre-packaged software program which is implemented as a system-wide solution. Because there is no existing COTS/ERP for child support programs, the team recognized that customization would be required to make the product fit its intended use. The team recognized that other system attributes were also important, such as the ability to convert "legacy data" and to address such factors as data base complexity and data base size. The Evaluation Process: The CAMS CE ITN put forth a tiered process for selecting vendors for negotiation.
The first tier involved an evaluation of key proposal topics. The key topics were the vendors' past corporate experience (past projects) and their key staff. A vendor was required to score 150 out of a possible 230 points to enable it to continue to the next stage or tier of consideration in the procurement process. The evaluation team wanted to remove vendors who did not have a serious chance of becoming the selected vendor at an early stage. This would prevent an unnecessary expenditure of time and resources by both the CSE and the vendor. The ITN required that the vendors provide three corporate references showing their past corporate experience for evaluation. In other words, the references involved past jobs they had done for other entities which showed relevant experience in relation to the ITN specifications. The Department provided forms to the vendors, who in turn provided them to the corporate references that they themselves had selected. The vendors also included in their proposals a summary of their corporate experience, drafted by the vendors themselves. Table 8.2 of the ITN provided positive and negative criteria by which the corporate references would be evaluated. The list in Table 8.2 is not meant to be exhaustive and is in the nature of an "included but not limited to" standard. The vendors had the freedom to select references whose projects the vendors believed best fit the criteria upon which each proposal was to be evaluated. For the key staff evaluation standard, the vendors provided summary sheets as well as résumés for each person filling a lead role as key staff members on their proposed project team. Having a competent project team was deemed by the Department to be critical to the success of the procurement and implementation of a large project such as the CAMS CE. Table 8.2 of the ITN provided the criteria by which the key staff would be evaluated. The Evaluation Team: The CSE selected an evaluation team which included Dr. Addy, Mr. Ellis, Mr.
Bankirer, Mr. Doolittle and Mr. Esser. Although Dr. Addy had not previously performed the role of an evaluator, he has responded to several procurements for Florida government agencies. He is familiar with Florida's procurement process and has a doctorate in Computer Science as well as seventeen years of experience in information technology. Dr. Addy was the leader of the Northrup Grumman team which primarily developed the ITN with the assistance of personnel from the CSE program itself. Mr. Ellis, Mr. Bankirer and Mr. Doolittle participated in the development of the ITN as well. Mr. Bankirer and Mr. Doolittle had previously been evaluators in other procurements for Federal and State agencies prior to joining the CSE program. Mr. Esser is the Chief of the Bureau of Information Technology at the Department of Highway Safety and Motor Vehicles and has experience in similar, large computer system procurements at that agency. The evaluation team selected by the Department thus has extensive experience in computer technology, as well as knowledge of the requirements of the subject system. The Department provided training regarding the evaluation process to the evaluators as well as a copy of the ITN, the Source Selection Plan and the Source Selection Team Reference Guide. Section 6 of the Source Selection Team Reference Guide entitled "Scoring Concepts" provided guidance to the evaluators for scoring proposals. Section 6.1 entitled "Proposal Evaluation Specification in ITN Section 8" states: Section 8 of the ITN describes the method by which proposals will be evaluated and scored. SST evaluators should be consistent with the method described in the ITN, and the source selection process documented in the Reference Guide and the SST tools are designed to implement this method. All topics that are assigned to an SST evaluator should receive at the proper time an integer score between 0 and 10 (inclusive). 
Each topic is also assigned a weight factor that is multiplied by the given score in order to place a greater or lesser emphasis on specific topics. (The PES workbook is already set to perform this multiplication upon entry of the score.) Tables 8-2 through 8-6 in the ITN Section 8 list the topics by which the proposals will be scored along with the ITN reference and evaluation and scoring criteria for each topic. The ITN reference points to the primary ITN section that describes the topic. The evaluation and scoring criteria list characteristics that should be used to affect the score negatively or positively. While these characteristics should be used by each SST evaluator, each evaluator is free to emphasize each characteristic more or less than any other characteristic. In addition, the characteristics are not meant to be inclusive, and evaluators may consider other characteristics that are not listed . . . (Emphasis supplied). The preponderant evidence demonstrates that all the evaluators followed these instructions in conducting their evaluations and none used a criterion that was not contained in the ITN, either expressly or implicitly. Scoring Method: The ITN used a 0 to 10 scoring system. The Source Selection Team Guide required that the evaluators use whole integer scores. They were not required to start at "7," which was the average score necessary to achieve a passing 150 points, and then to score up or down from 7. The Department also did not provide guidance to the evaluators regarding a relative value of any score, i.e., what is a "5" as opposed to a "6" or a "7." There is no provision in the ITN which establishes a baseline score or starting point from which the evaluators were required to adjust their scores. The procurement development team had decided to give very little structure to the evaluators as they wanted to have each evaluator score based upon his or her understanding of what was in the proposal. 
Within the ITN the development team could not sufficiently characterize every potential requirement, in the form that it might be submitted, and provide the consistency of scoring that one would want in a competitive environment. This open-ended approach is a customary method of scoring, particularly in more complex procurements in which generally less guidance is given to evaluators. Providing precise guidance regarding the relative value of any score, regarding the imposition of a baseline score or starting point, from which evaluators were required to adjust their scores, instruction as to weighing of scores and other indicia of precise structure to the evaluators would be more appropriate where the evaluators themselves were not sophisticated, trained and experienced in the type of computer system desired and in the field of information technology and data retrieval generally. The evaluation team, however, was shown to be experienced and trained in information technology and data retrieval and experienced in complex computer system procurement. Mr. Barker is the former Bureau Chief of Procurement for the Department of Management Services. He has 34 years of procurement experience and has participated in many procurements for technology systems similar to CAMS CE. He established that the scoring system used by the Department at this initial stage of the procurement process is a common method. It is customary to leave the numerical value of scores to the discretion of the evaluators based upon each evaluator's experience and review of the relevant documents. According wider discretion to evaluators in such a complex procurement process tends to produce more objective scores. The evaluators scored past corporate experience (references) and key staff according to the criteria in Table 8.2 of the ITN. The evaluators then used different scoring strategies within the discretion accorded to them by the 0 to 10 point scale. Mr. 
Bankirer established a midrange of 4 to 6 and added or subtracted points based upon how well the proposal addressed the CAMS CE requirements. Evaluator Ellis used 6 as his baseline and added or subtracted points from there. Dr. Addy evaluated the proposals as a composite without a starting point. Mr. Doolittle started with 5 as an average score and then added or subtracted points. Mr. Esser gave points for each attribute in Table 8.2, for key staff, and added the points for the score. For the corporate reference criterion, he subtracted a point for each attribute the reference lacked. As each of the evaluators used the same methodology for the evaluation of each separate vendor's proposal, each vendor was treated the same and thus no specific prejudice to KPMG was demonstrated. Corporate Reference Evaluation: KPMG submitted three corporate references: Duke University Health System (Duke), SSM Health Care (SSM), and Armstrong World Industries (Armstrong). Mr. Bankirer gave the Duke reference a score of 6, the SSM reference a score of 5 and the Armstrong reference a score of 7. Michael Strange, the KPMG Business Development Manager, believed that 6 was a low score. He contended that an average score of 7 was required to make the 150-point threshold for passage to the next level of the ITN consideration. Therefore, a score of 7 would represent minimum compliance, according to Mr. Strange. However, neither the ITN nor the Source Selection Team Guide identified 7 as a minimally compliant score. Mr. Strange's designation of 7 as a minimally compliant score is not provided for in the specifications or the scoring instructions. Mr. James Focht, Senior Manager for KPMG testified that 6 was a low score, based upon the quality of the reference that KPMG had provided. However, Mr. 
Bankirer found that the Duke reference was actually a small-sized project, with little system development attributes, and that it did not include information regarding a number of records, the data base size involved, the estimated and actual costs and attributes of data base conversion. Mr. Bankirer determined that the Duke reference had little similarity to the CAMS CE procurement requirements and did not provide training or data conversion as attributes for the Duke procurement which are attributes necessary to the CAMS CE procurement. Mr. Strange and Mr. Focht admitted that the Duke reference did not specifically contain the element of data conversion and that under the Table 8.2, omission of this information would negatively affect the score. Mr. Focht admitted that there was no information in the Duke Health reference regarding the number of records and the data base size, all of which factors diminish the quality of Duke as a reference and thus the score accorded to it. Mr. Strange opined that Mr. Bankirer had erred in determining that the Duke project was a significantly small sized project since it only had 1,500 users. Mr. Focht believed that the only size criterion in Table 8.2 was the five million dollar cost threshold, and, because KPMG indicated that the project cost was greater than five million dollars, that KPMG had met the size criterion. Mr. Focht believed that evaluators had difficulty in evaluating the size of the projects in the references due to a lack of training. Mr. Focht was of the view that the evaluator should have been instructed to make "binary choices" on issues such as size. He conceded, however, that evaluators may have looked at other criteria in Table 8.2 to determine the size of the project, such as database size and number of users. However, the corporate references were composite scores by the evaluators, as the ITN did not require separate scores for each factor in Table 8.2. Therefore, Mr. 
Focht's focus on binary scoring for size, to the exclusion of other criteria, mis-stated the objective of the scoring process. The score given to the corporate references was a composite of all of the factors in Table 8.2, and not merely monetary value size. Although KPMG apparently contends that size, in terms of dollar value, is the critical factor in determining the score for a corporate reference, the vendor questions and answers provided at the pre-proposal conference addressed the issue of relevant criteria. Question 40 of the vendor questions and answers, Volume II, did not single out "project greater than five million dollars" as the only size factor or criterion. QUESTION: Does the state require that each reference provided by the bidder have a contract value greater than $5 million; and serve a large number of users; and include data conversion from a legacy system; and include training development? ANSWER: To get a maximum score for past corporate experience, each reference must meet these criteria. If the criteria are not fully met, the reference will be evaluated, but will be assigned a lower score depending upon the degree to which the referenced project falls short of these required characteristics. Therefore, the cost of the project is shown to be only one component of a composite score. Mr. Strange opined that Mr. Bankirer's comment regarding the Duke reference, "little development, mostly SAP implementation" was irrelevant. Mr. Strange's view was that the CAMS CE was not a development project and Table 8.2 did not specifically list development as a factor on which proposals would be evaluated. Mr. Focht stated that in his belief Mr. Bankirer's comment suggested that Mr. Bankirer did not understand the link between the qualifications in the reference and the nature of KPMG's proposal. Both Strange and Focht believe that the ITN called for a COTS/ERP solution. Mr. Focht stated that the ITN references a COTS/ERP approach numerous times. 
Although many of the references to COTS/ERP in the ITN also refer to development, Mr. Strange also admitted that the ITN was open to a number of approaches. Furthermore, both the ITN and the Source Selection Team Guide stated that the items in Table 8.2 are not all inclusive and that the evaluators may look to other factors in the ITN. Mr. Bankirer noted that there is no current CSE COTS/ERP product on the market. Therefore, some development will be required to adapt an off-the-shelf product to its intended use as a child support case management system. Mr. Bankirer testified that the Duke project was a small-size project with little development. Duke has three sites while CSE has over 150 sites. Therefore, the Duke project is smaller than CAMS. There was no information provided in the KPMG submittal regarding data base size and number of records with regard to the Duke project. Mr. Bankirer did not receive the information he needed to infer a larger sized-project from the Duke reference. Mr. Esser also gave the Duke reference a score of 6. The reference did not provide the data base information required, which was the number of records in the data base and the number of "gigabytes" of disc storage to store the data, and there was no element of legacy conversion. Dr. Addy gave the Duke reference a score of 5. He accepted the dollar value as greater than five million dollars. He thought that the Duke Project may have included some data conversion, but it was not explicitly stated. The Duke customer evaluated training so he presumed training was provided with the Duke project. The customer ratings for Duke were high as he expected they would be, but similarity to the CAMS CE system was not well explained. He looked at size in terms of numbers of users, number of records and database size. The numbers that were listed were for a relatively small-sized project. There was not much description of the methodology used and so he gave it an overall score of 5. Mr. 
Doolittle gave the Duke reference a score of 6. He felt that it was an average response. He listed the number of users, the number of locations, and that it was on time and on budget, but found that there was no mention of data conversion, database size or number of records (consistent with the other evaluators). A review of the evaluators' comments makes it apparent that KPMG's scores are more a product of a paucity of information provided by KPMG's corporate references than of a lack of evaluator knowledge of the material being evaluated. Mr. Ellis gave a score of 6 for the Duke reference. He used 6 as his baseline. He found the required elements but nothing more that, in his mind, justified raising the score above 6. Mr. Focht and Mr. Strange expressed the same concerns regarding Bankirer's comment regarding little development for the SSM Healthcare reference as they had for the Duke Health reference. However, both Mr. Strange and Mr. Focht admitted that the reference provided no information regarding training. Mr. Strange admitted that the reference had no information regarding data conversion. Training and data conversion are criteria contained in Table 8.2. Mr. Strange also admitted that KPMG had access to Table 8.2 before the proposal was submitted and could have included the information in the proposal. Mr. Bankirer gave the SSM reference a score of 5. He commented that the SAP implementation was not relevant to what the Department was attempting to do with the CAMS CE system. CAMS CE does not have any materials management or procurement components, which was the function of the SAP components in the SSM reference project. Additionally, there was no training indicated in the SSM reference. Mr. Esser gave the SSM reference a score of 3. His comments were "no training provided, no legacy data conversion, project evaluation was primarily for SAP not KPMG".
However, it was KPMG's responsibility in responding to the ITN to provide project information concerning a corporate reference in a clear manner, rather than requiring that an evaluator infer compliance with the specifications. Mr. Focht believed that legacy data conversion could be inferred from the reference's description of the project. Mr. Strange opined that Mr. Esser's comment was inaccurate, as KPMG installed SAP and made the software work. Mr. Esser gave the SSM reference a score of 3 because the reference described SAP's role, but not KPMG's role, in the installation of the software. When providing information in the reference, SSM gave answers relating to SAP to the questions regarding system capability, system usability, and system reliability, but did not state KPMG's role in the installation. SAP is a large enterprise software package. This answer created an impression of little KPMG involvement in the project. Dr. Addy gave the SSM reference a score of 6. Dr. Addy found that the size was over five million dollars and customer ratings were high, except for a 7 for usability with reference to a "long learning curve" for users. Data conversion was implied. There was no strong explanation of similarity to CAMS CE. It was generally a small-sized project. He could reason some similarity into it, even though it was not well described in the submittal. Mr. Doolittle gave the SSM reference a score of 6. Mr. Doolittle noted, as positive factors, that the total cost of the project was greater than five million dollars and that it supported 24 sites and 1,500 users, as well as "migration from a mainframe." However, there were negative factors, such as training not being mentioned and a long learning curve for its users. Mr. Ellis gave a score of 6 for SSM, feeling that KPMG met all of the requirements but did not offer more than the basic requirements. Mr. Strange opined that Mr. Bankirer, Dr. Addy and Mr.
Ellis (evaluators 1, 5 and 4) were inconsistent with each other in their evaluation of the SSM reference. He stated that this inconsistency showed a flaw in the evaluation process in that the evaluators did not have enough training to uniformly evaluate past corporate experience, thereby, in his view, creating an arbitrary evaluation process. Mr. Bankirer gave the SSM reference a score of 5, Ellis a score of 6, and Addy a score of 6. Even though the scores were similar, Mr. Strange contended that they gave conflicting comments regarding the size of the project. Mr. Ellis stated that the size of the project was hard to determine as the cost was listed as greater than five million dollars and the database size given, but the number of records was not given. Mr. Bankirer found that the project was low in cost and Dr. Addy stated that over five million dollars was a positive factor in his consideration. However, the evaluators looked at all of the factors in Table 8.2 in scoring each reference. Other factors that detracted from KPMG's score for the SSM reference were: similarity to the CAMS system not being explained, according to Dr. Addy; no indication of training (all of the evaluators); the number of records not being provided (evaluator Ellis); little development shown (Bankirer) and usability problems (Dr. Addy). Mr. Strange admitted that the evaluators may have been looking at other factors besides the dollar value size in order to score the SSM reference. Mr. Esser gave the Armstrong reference a score of 6. He felt that the reference did not contain any database information or cost data and that there was no legacy conversion shown. Dr. Addy also gave Armstrong a score of 6. He inferred that this reference had data conversion as well as training and the high dollar volume which were all positive factors. He could not tell, however, from the project description, what role KPMG actually had in the project. Mr. 
Ellis gave a score of 7 for the Armstrong reference stating that the Armstrong reference offered more information regarding the nature of the project than had the SSM and Duke references. Mr. Bankirer gave KPMG a score of 7 for the Armstrong reference. He found that the positive factors were that the reference had more site locations and offered training but, on the negative side, was not specific regarding KPMG's role in the project. Mr. Focht opined that the evaluators did not understand the nature of the product and services the Department was seeking to obtain as the Department's training did not cover the nature of the procurement and the products and services DOR was seeking. However, when he made this statement he admitted he did not know the evaluators' backgrounds. In fact, Bankirer, Ellis, Addy and Doolittle were part of a group that developed the ITN and clearly knew what CSE was seeking to procure. Further, Mr. Esser stated that he was familiar with COTS and described it as a commercial off-the-shelf software package. Mr. Esser explained that an ERP solution or Enterprise Resource Plan is a package that is designed to do a series of tasks, such as produce standard reports and perform standard operations. He did not believe that he needed further training in COTS/ERP to evaluate the proposals. Mr. Doolittle was also familiar with COTS/ERP and believed, based on the amount of funding, that it was a likely response to the ITN. Dr. Addy's doctoral dissertation research was in the area of software re-use. COTS is one of the components that comprise a development activity and re-use. He became aware during his research of how COTS packages are used in software engineering. He has also been exposed to ERP packages. ERP is only one form of a COTS package. In regard to the development of the ITN and the expectations of the development team, Dr. Addy stated that they were amenable to any solution that met the requirements of the ITN. 
They fully expected the compliance solutions were going to be comprised of mostly COTS and ERP packages. Furthermore, the ITN in Section 1.1, on page 1-2 states, ". . . FDOR will consider an applicable Enterprise Resource Planning (ERP) or Commercial Off the Shelf (COTS) based solution in addition to custom development." Clearly, this ITN was an open procurement and to train evaluators on only one of the alternative solutions would have biased the evaluation process. Mr. Doolittle gave each of the KPMG corporate references a score of 6. Mr. Strange and Mr. Focht questioned the appropriateness of these scores as the corporate references themselves gave KPMG average ratings of 8.3, 8.2 and 8.0. However, Mr. Focht admitted that Mr. Doolittle's comments regarding the corporate references were a mixture of positive and negative comments. Mr. Focht believed, however, that as the reference corporations considered the same factors for providing ratings on the reference forms, that it was inconsistent for Mr. Doolittle to separately evaluate the same factors that the corporations had already rated. However, there is no evidence in the record that KPMG provided Table 8.2 to the companies completing the reference forms and that the companies consulted the table when completing their reference forms. Therefore, KPMG did not prove that it had taken all measures available to it to improve its scores. Moreover, Mr. Focht's criticism would impose a requirement on Mr. Doolittle's evaluation which was not supported by the ITN. Mr. Focht admitted that there was no criteria in the ITN which limited the evaluator's discretion in scoring to the ratings given to the corporate references by those corporate reference customers. All of the evaluators used Table 8.2 as their guide for scoring the corporate references. As part of his evaluation, Dr. Addy looked at the methodology used by the proposers in each of the corporate references to implement the solution for that reference company. 
He was looking at methodology to determine its degree of similarity to CAMS CE. While not specifically listed in Table 8.2 as a similarity to CAMS, Table 8.2 states that the list is not all inclusive. Clearly, methodology is a measure of similarity and therefore is not an arbitrary criterion. Moreover, as Dr. Addy used the same process and criteria in evaluating all of the proposals, there was no prejudice to KPMG by use of this criterion, since all vendors were subjected to it. Mr. Strange stated that KPMG appeared to receive lower scores for SAP applications than other vendors. For example, evaluator 1 gave a score of 7 to Deloitte's reference for Suntax. Suntax is an SAP implementation. It is difficult to draw comparisons across vendors, yet the evaluators consistently found that KPMG references lacked key elements such as data conversion, information on starting and ending costs, and information on database size. All of these missing elements contributed to a reduction in KPMG's scores. Nevertheless, KPMG received average scores of 5.5 for Duke, 5.7 for SSM and 6.3 for Armstrong, compared with the score of 7 received by Deloitte for Suntax. There is a gap of only 0.7 to 1.5 points between Deloitte's and KPMG's scores for SAP implementations, despite the deficient information within KPMG's corporate references. Key Staff Criterion: The proposals contain a summary of the experience of key staff and attached résumés. KPMG's proposed key staff person for Testing Lead was Frank Traglia. Mr. Traglia's summary showed that he had 25 years' experience, respectively, in the areas of child support enforcement, information technology, project management and testing. Strange and Focht admitted that Traglia's résumé did not specifically list any testing experience. Mr. Focht further admitted that it was not unreasonable for evaluators to give the Testing Lead a lower score due to the lack of specific testing information in Traglia's résumé. Mr.
Strange explained that the résumé came from a database of résumés. The summary sheet, however, was prepared by the KPMG employees who prepared the proposal. All of the evaluators resolved the conflict between the summary sheet and the résumé by crediting the résumé as more accurate. Each evaluator thought that the résumé was more specific and expected to see specific information regarding testing experience on the résumé of someone proposed as the Testing Lead. Evaluators Addy and Ellis gave the Testing Lead criterion scores of 4 and 5. Mr. Ron Vandenberg (evaluator 8) gave the Testing Lead a score of 9. Mr. Vandenberg was the only evaluator to give the Testing Lead a high score; the other evaluators gave the Testing Lead an average score of 4.2. The Vandenberg score thus appears anomalous. All of the evaluators gave the Testing Lead a lower score because the résumé did not specifically list testing experience. Dr. Addy found that the summary sheet listed 25 years of experience in child support enforcement, information technology, project management and system testing. As he did not believe this person had 100 years of experience, he assumed those experience categories ran concurrently. A strong candidate for Testing Lead should demonstrate a combination of testing experience, education and certification, according to Dr. Addy. Mr. Doolittle also expected to see testing experience mentioned in the résumé. When evaluating the Testing Lead, Mr. Bankirer first looked at the team skills matrix and found it interesting that testing was not one of the categories of skills listed for the Testing Lead. He then looked at the summary sheet and résumé for Mr. Traglia. He gave Mr. Traglia a lower score because he thought that KPMG should have put forward someone with demonstrable testing experience. The evaluators gave a composite score to key staff based on the criteria in Table 8.2. In order to derive the composite score that he gave each staff person, Mr. 
Esser created a scoring system wherein he awarded points for each attribute in Table 8.2 and then added the points together to arrive at a composite score. Among the criteria he rated, Mr. Esser awarded points for CSE experience. Mr. Focht and Mr. Strange contended that, because the term CSE experience is not actually listed in Table 8.2, Mr. Esser was incorrect in awarding points for CSE experience in his evaluation. Table 8.2 does refer to relevant experience, but it provides no specific definition of that term. Mr. Focht stated that relevant experience is limited to COTS/ERP experience, system development, life cycle and project management methodologies; however, these factors also are not listed in Table 8.2. Mr. Strange limited relevance to experience in the specific role for which the key staff person was proposed, a limitation likewise not imposed by Table 8.2. CSE experience is no more or less relevant than the factors posited by KPMG as relevant experience. Moreover, KPMG included a column for CSE experience in its own descriptive table of key staff; its inclusion of this information demonstrates that KPMG itself must have believed CSE experience was relevant at the time it submitted its proposal. Mr. Strange held the view that, at the bidders' conference, in reply to a vendor question, the Department representative stated that CSE experience was not required, and that Mr. Esser therefore could not use such experience to evaluate key staff. Question 47 of the Vendor Questions and Answers, Volume 2, stated: QUESTION: In scoring the Past Corporate Experience section, Child Support experience is not mentioned as a criterion. Would the State be willing to modify the criteria to include at least three Child Support implementations as a requirement? ANSWER: No. 
However, a child support implementation that also meets the other characteristics (contract value greater than $5 million, serves a large number of users, includes data conversion from a legacy system and includes training development) would be considered "similar to CAMS CE." The Department's statement concerned the scoring of corporate experience, not key staff, and so was inapplicable to Mr. Esser's scoring system. Mr. Esser gave the Training Lead a score of 1. According to Mr. Esser, the Training Lead did not have a ten-year résumé, for which he deducted one point. The Training Lead had no specialty certification, no extensive experience and no child support experience, and received no points for those factors. Mr. Esser added one point for the minimum of four years of specific experience and one point for the relevance of his education. Mr. Esser gave the Project Manager a score of 5. The Project Manager had a ten-year résumé and the required references and received a point for each. Mr. Esser gave two points for exceeding the minimum required information technology experience. The Project Manager had twelve years of project management experience, for one point, but lacked certification, a relevant education and child support enforcement experience, for which he was accorded no points. Mr. Esser gave the Project Liaison person a low score. According to Mr. Focht, the Project Liaison should have received a higher score since she had a professional history of having worked for the state technology office. Mr. Esser, however, stated that she did not have four years of specific experience and did not have extensive experience in the field, although she had a relevant education. Mr. Esser gave the Software Lead person a score of 4. The Software Lead, according to Mr. Focht, had long experience implementing SAP solutions for a wide variety of clients and should have received a higher score. Mr. 
Esser gave a point each for having a ten-year résumé, four years of specific experience in software, extensive experience in the area and a relevant education. According to Mr. Focht, the Database Lead had experience with database pools, including the Florida Retirement System, and should have received more points. Mr. Strange concurred with Mr. Focht, stating that Mr. Esser had given low scores to key staff who had good experience that should have generated more points. Mr. Strange believed that Mr. Esser's scoring was inconsistent but provided no basis for that conclusion. Other evaluators also gave key staff positions scores of less than 7. Dr. Addy gave the Software Lead person a score of 5. The Software Lead had 16 years of experience and SAP development experience as positive factors but had no development lead experience. He had a Bachelor of Science and a Master of Science in Mechanical Engineering and a Master's in Business Administration, which were not good educational matches for the role of Software Lead. Dr. Addy gave the Training Lead person a score of 5. The Training Lead had six years of consulting experience, a background in SAP consulting and some training experience, but did not have certification or education in training. His educational background also was in electrical engineering, which is not a strong background for a training person. Dr. Addy gave the subcontractor managers a score of 5. Two of the subcontractors did not list managers at all, which detracted from the score. Mr. Doolittle gave the Training Lead person an average score; he believed that, based on the person's experience and training, it was an average response. Table 8.2 contained an item under which a proposer could have points deducted from a score if the key staff person's references were not excellent. The Department did not check references at this stage in the evaluation process. As a result, the evaluators simply did not consider that item when scoring. 
No proposer's score was adversely affected thereby. KPMG contends that checking references would have given the evaluators greater insight into the work done by those individuals and into their relevance and capabilities on the project team. Mr. Focht admitted, however, that any claimed effect on KPMG's score is conjectural. Mr. Strange stated that without reference checks the information in the proposals could not be validated, but he provided no basis for his opinion that reference checking was necessary at this preliminary stage of the evaluation process. Dr. Addy stated that the process called for checking references during the timeframe of the oral presentations. The evaluators did not expect the references to change any scores at this point in the process. KPMG asserted that references should be checked to ascertain the veracity of the information in the proposals. However, even if the information in some other proposal was inaccurate, it would not change the outcome for KPMG; KPMG would still not have the required number of points to advance to the next evaluation tier. Divergency in Scores The Source Selection Plan established a process for resolving divergent scores. Any item receiving scores with a range of 5 or more was deemed divergent. The plan provided that the Coordinator identify divergent scores and then report to the evaluators that there were divergent scores for that item. The Coordinator was precluded from telling an evaluator whether his score was the divergent one, i.e., the highest or lowest score. Evaluators would then review that item but were not required to change their scores. The purpose of the divergent score process was to have evaluators review their scores to see if there were any misperceptions or errors that skewed the scores. The team wished to avoid having any influence on the evaluators' scores. Mr. 
Strange testified that the Department did not follow the divergent score process in the Source Selection Plan because the Coordinator did not tell the evaluators why the scores were divergent. Mr. Strange stated that the evaluators should have been informed which scores were divergent. The Source Selection Plan, however, merely instructed the Coordinator to inform the evaluators of the reason why the scores were divergent. Scores were inherently divergent if there was a five-point score spread; the reason for the divergence was self-explanatory. The evaluators stated that they scored the proposals, submitted the scores and each received an e-mail from Debbie Stephens informing him that there were divergent scores and that they should consider re-scoring. None of the evaluators ultimately changed their scores. Mr. Esser's scores were the lowest of the divergent scores, but he did not re-score his proposals because he had spent a great deal of time on the initial scoring and believed his scores to be valid. Neither witness Focht nor witness Strange for KPMG provided more than speculation regarding the effect of the divergent scores on KPMG's ultimate score or any role the divergent scoring process may have played in KPMG's failure to attain the 150-point passing score. Deloitte - Suntax Reference: Susan Wilson, a Child Support Enforcement employee connected with the CAMS project, signed a reference for Deloitte Consulting regarding the Suntax System. Mr. Focht was concerned that the evaluators were influenced by her signature on the reference form. Mr. Strange further stated that having someone who was heavily involved in the project sign a reference did not appear to be fair. He was not able to state any positive or negative effect on KPMG from Ms. Wilson's reference for Deloitte, however. Evaluator Esser has met Susan Wilson but has had no significant professional interaction with her. He could not recall anything that he knew about Ms. 
Wilson that would favorably influence him in scoring the Deloitte reference. Dr. Addy likewise was not influenced by Ms. Wilson. Mr. Doolittle had worked with Ms. Wilson for only a very short time and did not know her well. He had also evaluated other proposals in which Department employees were references and was not influenced by that either. Mr. Ellis had known Ms. Wilson for only two to four months; her signature on the reference form did not influence him either positively or negatively. Mr. Bankirer had not known Ms. Wilson for long when he evaluated the Suntax reference; he took the reference at face value and was not influenced by her signature. It is not unusual for someone within an organization to prepare a reference for a company that is competing for work to be done for the organization.

Recommendation Having considered the foregoing Findings of Fact, Conclusions of Law, the evidence of record and the pleadings and arguments of the parties, it is, therefore, RECOMMENDED that a final order be entered by the State of Florida Department of Revenue upholding the proposed agency action which disqualified KPMG from further participation in the evaluation process regarding the subject CAMS CE Invitation to Negotiate. DONE AND ENTERED this 26th day of September, 2002, in Tallahassee, Leon County, Florida. P. MICHAEL RUFF Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with Clerk of the Division of Administrative Hearings this 26th day of September, 2002. COPIES FURNISHED: Cindy Horne, Esquire Earl Black, Esquire Department of Revenue Post Office Box 6668 Tallahassee, Florida 32399-0100 Robert S. Cohen, Esquire D. Andrew Byrne, Esquire Cooper, Byrne, Blue & Schwartz, LLC 1358 Thomaswood Drive Tallahassee, Florida 32308 Seann M. Frazier, Esquire Greenburg, Traurig, P.A. 101 East College Avenue Tallahassee, Florida 32302 Bruce Hoffmann, General Counsel Department of Revenue 204 Carlton Building Tallahassee, Florida 32399-0100 James Zingale, Executive Director Department of Revenue 104 Carlton Building Tallahassee, Florida 32399-0100

Florida Laws (3): 120.569, 120.57, 20.21
DORIAN KENNETH ZINCK vs BOARD OF PROFESSIONAL ENGINEERS, 94-002664 (1994)
Division of Administrative Hearings, Florida Filed:West Palm Beach, Florida May 10, 1994 Number: 94-002664 Latest Update: Sep. 20, 1995

Findings Of Fact The National Council of Examiners for Engineering and Surveying (hereinafter "NCEES") writes and otherwise prepares the examinations for candidates seeking engineering licenses in 55 states and jurisdictions. The examinations are then administered by the states and jurisdictions which constitute NCEES' member boards. Respondent, State of Florida, Board of Professional Engineers, is a member board and uses NCEES' examinations. The Fundamentals of Engineering (hereinafter "FE") examination is given twice a year, in April and in October. The FE examination measures the basic knowledge a candidate has acquired in a bachelor degree program in the first two years during which the candidate takes basic engineering and science courses. Passage of the examination does not result in licensure as an engineer; it results in either an "engineer intern" or an "engineer in training" certificate which shows that the examinee has completed the necessary educational requirements to sit for that eight-hour examination and to have passed it. The next step is that a successful candidate will then complete four years of experience and then pass a principles and practices examination called the "PE" examination in order to then be licensed as a professional engineer. The FE exam is a minimal competency examination. Questions for the FE examination are written by individuals and are then reviewed by a committee. That committee is composed of registered professional engineers who are practicing engineers and engineers from the academic world, from consulting firms, and from governmental entities. Each question or item on the examination is reviewed by at least 12 to 15 individuals during the review process which takes from one to one and a half years. As part of the development process, individual items appear on examinations as pre-test questions. 
The purpose of using pre-test questions is to determine the characteristics of that specific item, as to how hard or easy the item is when used on the target population (candidates for the FE examination), and to verify that minimally competent candidates can answer the test item correctly. If pre-test questions perform as expected, they are used on subsequent examinations. If they do not perform adequately, the questions go back to the committee to be changed or to be discarded. Pre-test questions on examinations are not scored, and whether an examinee correctly answers that question is irrelevant to the raw score or final grade achieved by that candidate on the examination. Pre-test questions are distributed proportionately throughout the examination, and no subject area on the examination ever consists of only pre-test questions. Pre-test questions are used by other national testing programs. No unfairness inures to candidates from the presence of pre-test questions on an examination for two reasons. First, all candidates are treated equally. Candidates do not know that the examination contains pre-test questions, and, even if they did, they do not know which questions are pre-test questions and which questions will be scored. Second, the length of the examination itself is not increased by adding pre-test questions. The examination has the same number of questions whether pre-test questions are included or not. In the actual exam preparation, NCEES uses American College Testing and/or Educational Testing Service as contractors. The contractors pull the proper number of items in each subject area from the item bank and assemble the examination which is then sent to the NCEES committee of registered professional engineers to see if changes in the examination are necessary. Once approved, the contractor then prints the examination booklets and sends them to the member boards to administer the examination. 
Answer sheets from an exam administration are transmitted to the contractor for scanning and statistical analysis. The contractor then recommends a passing point based on a scaling and equating process so that future exams are no easier or harder than past exams. When NCEES approves the passing point, the contractor sends the examination scores or results to the member boards. When the examination is changed in some fashion, a new base line or pass point must be established to ensure that the new examination remains equal in difficulty to past examinations and remains a good measure of competency. The new examination is referred to as the anchor examination. The October, 1990, FE examination was an anchor exam. The member boards of NCEES determined that the October, 1993, FE examination would be changed to a supplied reference document examination, meaning that the candidate during the examination could use only the supplied reference handbook, a pencil, and a calculator. Candidates would no longer be able to bring their own reference materials to use during the examination. One of the reasons for the change was fairness to the candidates. The FE examination was not being administered uniformly nationwide since some member boards prohibited bringing certain publications into the examination which were allowed by other member boards. Accordingly, it was determined that NCEES would write and distribute at the examination its Fundamentals of Engineering Reference Handbook, thereby placing all candidates nationwide on an equal footing in that all examinees would be using this same reference material of charts, mathematical formulas, and conversion tables during the examination, and no other reference materials would be used during the examination itself. In August of 1991, NCEES approved the concept of a supplied reference handbook, and a beginning draft was sent to the FE sub-committee of the examination committee for review. 
The individual members of the sub-committee actually took two FE examinations using the draft of the supplied reference document to ensure that all material needed to solve the problems on an FE examination was included in the reference document and that the document was accurate. On a later occasion the committee took the examination that would be administered in October of 1993 using a subsequent draft of the supplied reference handbook. The last review of the handbook occurred in February of 1993 when the committee used that draft to review the October 1993 examination for the second time, and NCEES' Fundamentals of Engineering Reference Handbook, First Edition (1993) was finished. When NCEES received its first copies back from the printer, it mailed copies to the deans of engineering at 307 universities in the United States that have accredited engineering programs for review and input. As a result, NCEES became aware of some typographical and other errors contained in that document. In July of 1993 NCEES assembled a group of 12 individuals for a passing point workshop for the October 1993 a/k/a the '93 10 examination. The group consisted of three members of the committee, with the remainder being persons working in the academic world or as accreditation evaluators, and recent engineer interns who had passed the FE examination within the previous year and were not yet professional engineers. That group took the '93 10 FE examination using the first edition of the Handbook and then made judgments to determine the pass point for that examination. During that two day workshop, the errors in the Handbook were pointed out to the working group so it could determine if any of the errors contained in the Handbook had any impact on any of the problems contained in the '93 10 examination. The group determined that none of the errors in the Handbook impacted on any test item on the '93 10 FE examination. 
In September of 1993 subsequent to the passing point workshop, the '93 10 FE exam and the first edition of the Handbook went back to the committee of registered professional engineers for a final check, and that committee also determined that none of the errors in the Handbook would have any impact on the questions in the '93 10 FE examination. An errata sheet to the first edition of the Handbook was subsequently prepared but was not available until December of 1993. In September of 1994 the second printing of the Handbook was completed, and that version incorporated the changes contained on the errata sheet. Of the errors contained in the first edition of the Handbook, only one error was substantive; that is, one mathematical equation was wrong. However, no item on the '93 10 FE exam could be affected by that mathematical error. The remaining errors were typographical or simply matters of convention, i.e., errors in conventional terminology and symbols found in most textbooks such as the use of upper case instead of lower case or symbols being italicized as opposed to being non-italicized. Candidates for the '93 10 FE examination were able to purchase in advance as a study guide, a Fundamentals of Engineering sample examination which had its second printing in March of 1992. The sample examination was composed of questions taken from previous FE exams which would never be used again on an actual FE examination. The sample examination consisted of actual test questions and multiple choice answers. The sample examination did not show candidates how to solve the problems or work the computation, but merely gave multiple choice responses. Errors were contained on the two pages where the answers to the sample examination were given. The answer key was wrong as to two items on the morning sample examination and was wrong for all of the electrical circuit items, one of the subject areas included in the afternoon sample examination. 
An errata sheet was prepared and distributed in September of 1993 to those who had purchased the sample examination. Petitioner took the '93 10 FE examination, which contained 140 items during the morning portion and 70 items during the afternoon portion. Approximately 25 percent of the questions on the examination were pre-test questions. The minimum passing score for that examination was 70, and Petitioner achieved a score of only 68. Accordingly, Petitioner failed that examination.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that a final order be entered finding that Petitioner failed to achieve a passing score on the October 1993 Fundamentals of Engineering examination and dismissing the amended petition filed in this cause. DONE and ENTERED this 14th day of April, 1995, at Tallahassee, Florida. LINDA M. RIGOT, Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 14th day of April, 1995. APPENDIX TO RECOMMENDED ORDER Petitioner's proposed findings of fact numbered 1-5 and 8 have been adopted either verbatim or in substance in this Recommended Order. Petitioner's proposed finding of fact numbered 7 has been rejected as being subordinate to the issues herein. Petitioner's proposed findings of fact numbered 6 and 9 have been rejected as not constituting findings of fact but rather as constituting recitation of the testimony or conclusions of law. Respondent's proposed findings of fact numbered 1-15 have been adopted either verbatim or in substance in this Recommended Order. Respondent's proposed finding of fact numbered 16 has been rejected as being unnecessary to the issues involved herein. COPIES FURNISHED: Wellington H. Meffert, II Assistant General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0750 Dorian Kenneth Zinck, pro se 521 Beech Road West Palm Beach, Florida 33409 Angel Gonzalez, Executive Director Board of Professional Engineers Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0755 Lynda Goodgame, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (3): 120.57, 471.013, 471.015
