STEVEN FRANK vs DEPARTMENT OF HEALTH AND REHABILITATIVE SERVICES, 94-001440 (1994)
Division of Administrative Hearings, Florida Filed: West Palm Beach, Florida Mar. 17, 1994 Number: 94-001440 Latest Update: Oct. 20, 1994

Findings Of Fact Steven Frank (Petitioner) is legally incompetent and his adoptive father, Edward Frank, is his guardian. At the time of the hearing, Petitioner, a 37-year-old male, was a patient at South Florida State Hospital 1/ in West Palm Beach, Florida, on a unit for persons who have been dually diagnosed with developmental disabilities and mental illness. At the facility he is receiving treatment for his mental illness. He has not been given an IQ test since being admitted. As a child, Petitioner was determined to be mentally retarded. Before he was eight years old, Petitioner had been given IQ tests on several occasions, and his IQ ranged from 52 (moderate mental retardation) to 58 (mild mental retardation). At age eight, he tested at 68 (mild mental retardation). As a teenager, Petitioner began to receive psychiatric treatment. Around the age of seventeen, he began to have violent outbursts. Throughout his adult life, Petitioner has received psychiatric treatment at a number of facilities. At some of the facilities, his IQ was tested. In 1983, around the age of twenty-seven, Petitioner was a psychiatric patient at Sharon General Hospital. He was given an IQ test and tested at 72, which equated to general intellectual functioning in the borderline range. In late 1986, Petitioner was admitted to Jackson Memorial Hospital in Miami, Florida, for diagnostic purposes to determine if alternative treatment would be beneficial. He was given an IQ test and tested at 75, which equated to the borderline range of intellectual functioning. In 1987, around the age of thirty-one, Petitioner was a psychiatric patient at Montanari Residential Treatment Center, a residential treatment facility in Hialeah, Florida. He was given an IQ test and tested at 75, which equated to the borderline range of intellectual functioning. Petitioner was diagnosed, among other things, as being a chronic, residual schizophrenic and as having borderline intellectual functioning and pervasive developmental disorder. He also showed signs of organic brain damage. In 1989, Petitioner was discharged from Montanari, even though there was no improvement in his condition, because of the decision by Developmental Services of the Department of Health and Rehabilitative Services (Respondent) to place Petitioner in a group home setting. In 1987, Petitioner was determined eligible for the Developmental Services Program by Respondent's District XI, the Dade County area, even though he tested 75 on the IQ test. Petitioner has not been given an IQ test since 1987. The accepted criteria used for determining mental retardation, and used by Respondent to determine eligibility for its Developmental Services Program, are as follows:
Significantly subaverage intellectual functioning: an IQ of approximately 70 or below on an individually administered IQ test (for infants, a clinical judgment of significantly subaverage intellectual functioning).
Concurrent deficits or impairments in present adaptive functioning (i.e., the person's effectiveness in meeting the standards expected for his or her age by his or her cultural group) in at least two of the following areas: communication, self-care, home living, social/interpersonal skills, use of community resources, self-direction, functional academic skills, work, leisure, health, and safety.
The onset is before age 18 years.
Code based on degree of severity reflecting level of intellectual impairment:
317 Mild Mental Retardation: IQ level 50-55 to approximately 70
Moderate Mental Retardation: IQ level 35-40 to 50-55
Severe Mental Retardation: IQ level 20-25 to 35-40
Profound Mental Retardation: IQ level below 20 or 25
319 Mental Retardation, Severity Unspecified: when there is a strong presumption of Mental Retardation but the person's intelligence is untestable by standard tests
On the IQ tests there is a three-point margin of error. In determining an individual's eligibility for its Developmental Services Program, Respondent has a two-step process. First, it determines whether the individual meets the IQ requirement for mental retardation. If, and only if, the individual satisfies this first step does Respondent proceed to the second step, which is determining whether the individual meets the adaptive functioning requirements. In making determinations regarding mental retardation, Respondent does not consider IQ test results prior to age nine because such test results are not considered reliable for placing a child. Environmental factors may interfere with test results, and labeling children as mentally retarded may interfere with the child receiving an appropriate education. The basis for placement is clinical judgment. At the request of Respondent's District IX, the Palm Beach County area, in October 1993, while a patient in the psychiatric unit at the University Medical Center in Jacksonville, Florida, a psychological evaluation of Petitioner was performed. The purpose of the evaluation was to assist District IX in determining Petitioner's eligibility for its Developmental Services Program. The evaluation was completed in one day with no intelligence testing being performed due to Petitioner's mental condition at that time. 2/ The psychologist reviewed Petitioner's past records, observed Petitioner, and interviewed staff. She determined that Petitioner was not mentally retarded based upon his testing at 72 and 75 on the IQ tests previously administered as an adult, which was beyond his developmental years, and that he was, therefore, not eligible for Respondent's Developmental Services Program. Respondent's evaluator determined that Petitioner failed to satisfy the IQ requirements and, therefore, it was not necessary to examine Petitioner's adaptive functioning. At the request of Petitioner's parents, in January 1994, a psychological examination was performed on Petitioner by a psychologist while he was a patient at South Florida State Hospital. The examination occurred over several occasions, on different days and at different times of the day. Additionally, Petitioner's records were examined, and interviews of the hospital staff on Petitioner's unit and of his parents were conducted. Again, no IQ test was administered. The psychologist's diagnosis was consistent with that expressed by South Florida State Hospital: Petitioner suffered from both developmental disabilities and mental illness. The psychologist determined that Petitioner was eligible for Respondent's Developmental Services Program and for psychiatric services. Petitioner's IQ results in his late twenties and early thirties should be evaluated from the lower tested result, i.e., at 72, and the margin of error should be placed at the lower, not the higher, end of the spectrum. The lower tested result thus becomes 69.
Petitioner has a significant delay in social/adaptive skills and has deficits or impairments in adaptive functioning in the following areas: communication, self-care, home living, social/interpersonal skills, self-direction, work, leisure, health, and safety. His strong area is functional academic skills. Petitioner has both developmental and psychiatric needs. One need is not more important than the other; Petitioner requires assistance in both. An intermediate care facility for the mentally retarded is best suited to address Petitioner's dual needs. Respondent's Developmental Services Program is not an entitlement program. Even though an individual may be eligible for the Program, the individual may not be admitted to the program if funds are not available. There is no dispute regarding the onset of Petitioner's condition before eighteen years of age.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Department of Health and Rehabilitative Services enter a final order declaring Steven Frank eligible for the Developmental Services Program and placement in the intermediate care facility for the mentally retarded. DONE AND ENTERED in Tallahassee, Leon County, Florida, this 20th day of October 1994. ERROL H. POWELL Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 20th day of October 1994.

Florida Laws (2) 120.57, 393.063
OSCAR JONES vs COASTAL MARITIMES SERVICES, 02-002787 (2002)
Division of Administrative Hearings, Florida Filed: Jacksonville, Florida Jul. 16, 2002 Number: 02-002787 Latest Update: Apr. 30, 2003

The Issue Whether Respondent discriminated against Petitioner in its employment decisions in violation of Section 760.10, Florida Statutes (2001).

Findings Of Fact Oscar Jones (Petitioner) is a black male. He began working for Respondent in July 1997, as a longshoreman working on "chicken boats." In that position, Petitioner loaded boxes of frozen chicken into the holds of refrigerated ships. Respondent, Coastal Maritime Services, LLC (Respondent), is engaged in the business of stevedoring and seaport terminal operations, including loading and unloading ships, and receiving cargo. On May 28, 1998, Petitioner was injured when a very heavy box of frozen chicken fell on his ankle. Other than first aid at the worksite, Petitioner declined further medical treatment that day. He was given a medical form authorizing treatment at the medical clinic which provided medical services to injured employees who might be covered under Respondent's workers' compensation insurance. The next day, on May 29, 1998, Petitioner sought medical treatment for his injury at the medical facility which handled Respondent's workers' compensation injuries. As part of that treatment, Petitioner was asked to take a drug test, and Petitioner consented. Although Respondent had no formal written drug test policy in effect at the time of Petitioner's injury, the general policy and practice was that a work-related injury would subject an employee to a voluntary drug test. Petitioner's drug test came back positive for marijuana. As a result of the positive drug test result, Respondent's insurance carrier controverted Petitioner's workers' compensation claim. There was no evidence that Respondent's management had any responsibility or involvement in the carrier's decision to controvert Petitioner's entitlement to workers' compensation benefits. During the 12-month time period of January 1998 through December 1998, Petitioner was not the only employee of Respondent required to take a drug test after a workplace injury. In fact, in June 1998 (the same time period as Petitioner's test), seven white employees were required to take a drug test and three black employees were required to take a drug test. For the entire 1998 calendar year, 51 total drug tests were administered, with 31 of those tests administered to non-black employees (for example, white or Hispanic) and only 21 of those tests administered to black employees. Similarly, for the entire 1998 calendar year, a total of 18 employees were not administered drug tests, either because medical attention was refused or because of the severity of the injury. Of those 18 employees, 11 were non-black employees and seven were black employees. Employees who were not required to take a drug test were those who either refused medical attention or were severely injured and had to seek treatment from hospital emergency rooms where drug tests were not given. Clearly, race played no part in who was required to take a drug test or who received one. Petitioner did cite the names of two white employees, Jay Chavers and Andy Wiley, who allegedly were treated more favorably than Petitioner, in that those two employees did not take a drug test. However, those two employees were not "similarly situated" to Petitioner. First, the injuries of both Mr. Chavers and Mr. Wiley were much more serious in nature than the contusion (bruise) that Petitioner had suffered, and both were taken to emergency rooms for their injuries where drug tests were not routinely administered. Specifically, Mr. Chavers had fallen from a height and suffered numerous broken bones, thus rendering him incapable of giving consent to a drug test at the hospital. As to Mr. Wiley, his injuries were not subject to workers' compensation coverage, unlike Petitioner's. Thus, given the nature of the injuries of Mr. Chavers and Mr. Wiley, those two individuals were not sufficiently "similarly situated" to Petitioner to enable him to establish a prima facie case of racial discrimination. Petitioner's positive drug test result had no other impact on his employment with Respondent, apart from the controversion of his workers' compensation benefits. Indeed, Respondent attempted to get Petitioner to return to work. Shortly thereafter, in early June 1998, Petitioner contacted Kathleen Wiley, now Respondent's chief financial officer, who in 1998 was Respondent's office manager. Petitioner expressed concern to Ms. Wiley about his workers' compensation benefits and his employment status with Respondent. Ms. Wiley informed Petitioner that he was still considered to be employed with Respondent and that he needed to contact Ben Brown for a light duty assignment. Petitioner was expressly informed that light duty work was available that would meet the medical restrictions imposed after his injury. Petitioner never followed up with Mr. Brown about light duty work. Almost immediately thereafter, in June 1998, Respondent hired Bud Underwood as its new safety manager. Mr. Underwood's responsibilities were to oversee workers' compensation cases and follow up on accidents and injured employees. Ms. Wiley directed Mr. Underwood to follow up on Petitioner's situation and get him to return for a light duty assignment. In late June or early July 1998, Mr. Underwood contacted Petitioner as directed and offered him light duty work within his medical restrictions. Petitioner informed Mr. Underwood in very obscene terms that he was not going to accept any light duty assignments. Petitioner never appeared for any light duty assignments after that conversation. Based upon Petitioner's response to that telephonic offer of light duty employment, Respondent sent Petitioner a letter around July 9, 1998, informing him that based upon his refusal of light duty work, he had been deemed to have abandoned his employment, and thus was no longer employed by Respondent due to self-termination. Thereafter, in September 1998, Petitioner contacted Respondent by telephone seeking employment. However, by that time, opportunities for longshoremen such as Mr. Jones were extremely limited, as the "chicken boat" operation had all but shut down for financial reasons, and no positions were available at the time. Thus, Respondent sent Petitioner a letter dated September 2, 1998, informing him that no positions were available, but encouraging him to reapply. Despite that invitation, Petitioner never submitted any subsequent inquiry for employment. Respondent's "chicken boat" operation had shut down completely by February 1999. Petitioner later applied for unemployment compensation benefits, but those benefits were denied on the ground that Petitioner had abandoned his employment by refusing the light duty work that was offered to him. In fact, in an evidentiary hearing held in his unemployment compensation matter, the Unemployment Appeals Referee found as a fact that Petitioner admitted that he had refused the light duty work offered to him.
Petitioner's appeal of that adverse decision was, likewise, denied by the Unemployment Appeals Commission. Petitioner's race played no role in Respondent's determination that Petitioner had abandoned his employment or in Respondent's determination that no position existed for Petitioner in September 1998. Similarly, race played no role in the insurance carrier's decisions regarding Petitioner's workers' compensation benefits. In fact, Petitioner voluntarily settled his workers' compensation claim disputes in a settlement agreement signed by him and his attorney dated March 22, 1999. Petitioner had a family to support and needed the money. Pursuant to that settlement agreement, Petitioner agreed to accept $4,500 in full, final and complete settlement, release and discharge of any and all claims against the employer arising out of Petitioner's alleged accident, injury, and disability in issue, including, but not limited to claims for temporary total, temporary partial, permanent total, and/or permanent partial disability compensation, and past and future medical benefits. Petitioner verified that the settlement was adequate and was not entered into under duress. Rather, Petitioner of his own accord thought that the settlement was in his best interest. The Department of Labor approved the settlement. Petitioner has made no credible showing that there was any relationship between his race and the adverse employment actions of which he has complained.

Recommendation Based on the foregoing Findings of Facts and Conclusions of Law, it is RECOMMENDED that the Petition be dismissed. DONE AND ENTERED this 16th day of December, 2002, in Tallahassee, Leon County, Florida. DIANE CLEAVINGER Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 16th day of December, 2002. COPIES FURNISHED: Peter Reed Corbin, Esquire Richard L. Ruth, Jr., Esquire Ford & Harrison LLP 121 West Forsyth Street Suite 1000 Post Office Box 41566 Jacksonville, Florida 32203 Denise Crawford, Agency Clerk Florida Commission on Human Relations 2009 Apalachee Parkway, Suite 100 Tallahassee, Florida 32301 Oscar Jones 1817 East 27th Street Jacksonville, Florida 32206 Cecil Howard, General Counsel Florida Commission on Human Relations 2009 Apalachee Parkway, Suite 100 Tallahassee, Florida 32301

USC (1) 42 U.S.C. 2000e Florida Laws (4) 120.57, 760.01, 760.10, 760.11
CHESTER SMITH vs DEPARTMENT OF CHILDREN AND FAMILY SERVICES, 98-001870 (1998)
Division of Administrative Hearings, Florida Filed: Jacksonville, Florida Apr. 20, 1998 Number: 98-001870 Latest Update: Apr. 21, 1999

The Issue Is Petitioner eligible for Developmental Services from the Department of Children and Family Services?

Findings Of Fact Audrey Smith is the natural mother of Petitioner Chester (Charlie) Smith. She filed an application with Respondent Department of Children and Family Services Developmental Services Program on behalf of her son. The application was denied, and this case followed. Chester Smith did not appear for formal hearing. Audrey Smith represented that she had a power of attorney to act on her son's behalf and that she was his payee for federal SSI benefits, arising from Petitioner's disability and his father's death. Neither of these instruments was offered in evidence, but because she had applied to the agency on Chester's behalf and had requested formal hearing, Mrs. Smith was accepted as Petitioner's "next friend" and qualified representative. The Developmental Services Program, administered by Respondent, provides services to persons with specific developmental disabilities, including mental retardation, cerebral palsy, spina bifida, autism, and Prader-Willi Syndrome, pursuant to Chapter 393, Florida Statutes. Petitioner, born October 18, 1953, had originally been turned down for services as not meeting the statutory and rule requirements of "mental retardation." During the informal hearings following that denial and preceding referral of the disputed issues of fact to the Division of Administrative Hearings for formal hearing, Mrs. Smith asserted Petitioner's entitlement to services on the basis of "autism." She also asserted this entitlement in her request for formal hearing. In determining Petitioner's eligibility for services, agency staff psychologist Fe Ripka reviewed four psychological evaluations previously performed on Petitioner. Ms. Ripka did not testify, but her January 27, 1997, report was placed in evidence. She only reviewed evaluations done in May 1965, July 1966, February 1995, and April 1996. Ms. Ripka's degrees and titles show "M.A.," "LMHC," and "Psychologist." No specific education, training or experience on her part was related. Her report emphasized Petitioner's verbal and full scale IQs as controlling of eligibility. She concluded, on the basis of her review, that Petitioner did not suffer from mental retardation. Her report made no determination on the basis of autism. Petitioner's mother related that Petitioner was deprived of oxygen at birth and never developed normally. She has presumed him "brain damaged." Petitioner has required special classes and other remedial help throughout his life. He is now 45 years old. From 2 ½ to 8 ½ years of age, Petitioner was treated at the Putnam Children's Clinic. Not much is known about the treatment. Petitioner's Exhibit 9 (also part of Respondent's Exhibit 4) contains records from the Devereux Foundation Schools of Devon, Pennsylvania, including an August 22, 1967, "Exit Interview and Discharge Diagnosis Form" with a discharge diagnosis of "000-x28 Schizophrenic Reaction, Childhood Type . . . autism and possible mental retardation." The "Initial Psychiatric Evaluation" of November 5, 1965, by Robert Ewalds, M.D., a psychiatrist, related that Petitioner's manner was "generally autistic," with borderline intellectual functioning, "a history of autism," and a thinking disorder/chronic schizophrenic process, and that Petitioner would require custodial care indefinitely. The January 7, 1966, "Psycho-Educational Evaluation" of F. Howard Buss, Ph.D., and W.S. Holloway, B.A., of Devereux's Psychology Department, made an "Educational Diagnosis" of Petitioner as "achieving academically at a level below measured intellectual functioning and well below chronological age expectations." Henry Platt, Ph.D., of the Psychology Department performed a July 30, 1966, "Psychological Evaluation" which related the following critical matters: Intelligence: Current intellectual functioning, as measured by the WISC, was at a low average level in the verbal area (IQ 86), submarginal in the performance area (IQ 62), with a marginal level for the full test (IQ 72). * * * . . . findings were in line with those reported on the WISC about a year ago, despite the slight drop in scores on present testing.
             VIQ 2/   PIQ 3/   FSIQ 4/
May 1965       89       68       77
July 1966      86       62       72
After Pennsylvania, Petitioner lived in Minnesota with his adult married sister until recently. He received developmental disability benefits from the State of Minnesota until he moved to Florida to live with his mother in 1997. Petitioner was tested February 14, 1995, by the Scott County, Minnesota, Human Services agency (Petitioner's Exhibit 10). The Wechsler Adult Intelligence Scale and the Vineland Adaptive Behavior Scale tests were administered. In a written opinion, April Leaveck, Psy.D., opined that Petitioner had scored a verbal IQ of 82, a performance IQ of 67, and a full scale IQ of 74, with a percentile ranking of four, which constituted a "borderline range of intellectual functioning." The Vineland testing showed a low-deficit adaptive level in each of the three domains and an overall low-deficit adaptive level with an age equivalent score of seven years, eight months. Petitioner was 42 years old at the time. In the evaluator's opinion, a significant discrepancy in his verbal and performance scores reflected "brain damage at birth." All of the foregoing reports also attest to Petitioner's lifelong impairment in reciprocal interpersonal relationships and social interaction. All of them indicate he was hard to test because of distractibility. An April 1996 evaluation, performed when Petitioner was 43, showed a Stanford-Binet IQ of 59. (Petitioner's Exhibit 2) On or about April 16, 1998, and subsequent to Ms. Ripka's review, Petitioner was tested by Larry Neidigh, Ph.D., Licensed Psychologist and Diplomate of the American College of Forensic Examiners. His Wechsler test scoring, when Petitioner was 45, showed a Verbal IQ of 69, a Performance IQ of 62, and a Full Scale IQ of 63. He opined that, applying all variables, a valid estimate of Petitioner's intellectual functioning was between 60 and 68. Petitioner is currently being seen at the Clay County Florida Behavioral Services Day Treatment Program by Russell Findley, M.D. Dr. Findley is a Florida-licensed medical physician. He is treating Petitioner for Bipolar Disorder, using a variety of modalities, including psychotropic drugs. He has concluded that Petitioner's medical history, including the historical facts of birth trauma and initiation of mental health treatment when Petitioner was only 2 ½ years old, is suggestive that Petitioner's "primary process is best described as developmental, not [a] mental health problem," and that Petitioner has significant intellectual impairment, not consonant with Bipolar Disorder. Dr. Findley testified that Petitioner is "mildly mentally retarded" (TR 76-77) and again, "In my clinical impression, it is mental retardation." (TR 77) He did not consider "schizophrenia" to be a valid current diagnosis.
Petitioner's Bipolar Disorder is in remission due to the drugs currently being administered to him. With the Bipolar Disorder in remission, what Dr. Findley sees in Petitioner is consistent with mental retardation. It is possible that the new medications render Petitioner's more recent IQ tests more accurate than the earlier ones because he is less distractible and more easily tested. With a standard deviation of two, Dr. Findley is aware of the prior IQs of 72 and 74. He has administered no IQ tests himself. He considers modern testing to be more accurate. Within the DSM-IV standards of medical/psychiatric diagnosis, he considers Petitioner to be "Axis I, bipolar disorder in remission with mild MR 5/" extending over the whole of Petitioner's life. (TR 84) Dr. Findley was not asked about autism. The parties agreed to the admission of an excerpt from "Mental Retardation: Definition, Classification, and Systems of Support," published by the American Association of Mental Retardation, which reads:
Mental Retardation
Diagnostic Features
The essential feature of Mental Retardation is significantly subaverage general intellectual functioning (Criterion A) that is accompanied by significant limitations in adaptive functioning in at least two of the following skill areas: communication, self-care, home living, social/interpersonal skills, use of community resources, self-direction, functional academic skills, work, leisure, health, and safety (Criterion B). The onset must occur before age 18 years (Criterion C). Mental Retardation has many different etiologies and may be seen as a final common pathway of various pathological processes that affect the functioning of the central nervous system. General intellectual functioning is defined by the intelligence quotient (IQ or IQ-equivalent) obtained by assessment with one or more of the standardized, individually administered intelligence tests (e.g., Wechsler Intelligence Scales for Children -- Revised, Stanford-Binet, Kaufman Assessment Battery for Children). Significantly subaverage intellectual functioning is defined as an IQ of about 70 or below (approximately 2 standard deviations below the mean). It should be noted that there is a measurement error of approximately 5 points in assessing IQ, although this may vary from instrument to instrument (e.g., a Wechsler IQ of 70 is considered to represent a range of 65-75). Thus, it is possible to diagnose Mental Retardation in individuals with . . . [remainder missing]
To sum up, Petitioner's documented assessments, by year and age, are as follows:
Exhibit No.   Date       Age     Full Scale IQ   Other Diagnosis, if any
P-9           5/65       11 ½    77
P-9           11/5/65    12 ½                    generally autistic; a history of autism
P-9           7/30/66    13      72
P-9           8/22/67    14                      000-x28 schizophrenic reaction, autism and possible mental retardation
P-10          2/14/95    42      74
P-2           4/96       43      59
P-6           4/16/98    45      63              true IQ between 60-68
Also, the current diagnosis of Dr. Findley, pursuant to the generally recognized authority of DSM-IV, may be summed up as follows: Petitioner suffers from mild mental retardation, previously camouflaged by his Bipolar Disorder. Petitioner has never met the standards of personal independence and social responsibility of his chronological age. He has never held other than a protected job. He has never solely cared for his own person. Since infancy, he has been under the care and supervision of either his family in Pennsylvania, his adult sister in Minnesota, where he has long received developmental benefits, or his mother since 1997.
He has suffered from impairment in reciprocal social interaction continuously since infancy.

Recommendation Upon the foregoing findings of fact and conclusions of law, it is RECOMMENDED that the Department of Children and Family Services enter a Final Order determining Petitioner eligible for "autism" benefits and denying him retardation benefits. DONE AND ENTERED this 8th day of January, 1999, in Tallahassee, Leon County, Florida. ELLA JANE P. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 8th day of January, 1999.

Florida Laws (2) 120.57, 393.063
DEPARTMENT OF LAW ENFORCEMENT, CRIMINAL JUSTICE STANDARDS AND TRAINING COMMISSION vs JO ANNE THORNTON, 94-004174 (1994)
Division of Administrative Hearings, Florida Filed: Miami, Florida Jul. 26, 1994 Number: 94-004174 Latest Update: Feb. 05, 1996

Findings Of Fact Based upon the oral and documentary evidence adduced at the hearing and the entire record in this proceeding, the following findings of fact are made: Respondent is a certified correctional officer in the State of Florida having been issued certificate # 84145 on April 23, 1991. Respondent was employed as a correctional officer with the Metro-Dade Corrections and Rehabilitation Department ("M-D CR") beginning in April 1991. Prior to obtaining her certification as a correctional officer, Respondent worked for the State Corrections Department for approximately seven (7) years as a clerk and later as a technician. No evidence has been presented in this case as to any prior disciplinary action taken against Respondent or any other job related problems. By memorandum dated July 9, 1993, Respondent was notified of her biannual physical which was to include a drug/alcohol screening. The scheduled date for the physical and screening was August 5, 1993 at 9:00 a.m. On August 5, 1993, Respondent presented at Mount Sinai Medical Center for her physical. She filled out and signed a Consent & Release Form and a Specimen Collection Checklist & Chain of Custody Form. She then submitted a urine sample for testing. Respondent's urine sample was handled in accordance with a standard set of procedures for dividing, labelling and sealing the specimen. Respondent had an opportunity to observe the splitting of the sample and she initialed the containers after they were sealed. Respondent's urine specimens were transported by courier to Toxicology Testing Service ("TTS") for routine screening. The evidence established that TTS has adopted adequate procedures to track the chain of custody of the urine samples it receives and protect the integrity of the samples. There is no evidence in this case that there are any gaps or breaks in the chain of custody for Respondent's samples, that the integrity of the samples was ever compromised, that the testing procedures were not followed and/or that the equipment was contaminated or not working properly. After Respondent's samples were received at TTS, an immunoassay screening test was performed on a portion of one of the samples. That screening test was positive for the presence of cocaine at a level that was barely over the minimum threshold level of 50 Nanograms per milliliter. 1/ After the initial screening test was determined to be positive, Respondent's sample was analyzed with a confirmatory testing procedure which utilized gas chromatography/mass spectrometry ("GCMS"). 2/ On or about August 10, 1993, Dr. Terry Hall, Director of TTS, issued a final report indicating that Respondent's urine had tested positive for cocaine. Specifically, the Report stated that, upon analysis, the urine sample provided by Respondent tested positive for the presence of the cocaine metabolite, benzoylecgonine, in a concentration of 71 Nanograms per milliliter. The TTS test results of Respondent's urine are consistent with the ingestion of cocaine because cocaine is the only drug commonly available that, when ingested into the human body, produces the cocaine metabolite, benzoylecgonine. While the testing by TTS demonstrated the presence of cocaine metabolite in Respondent's system, it does not establish how ingestion occurred. Absent proof that the drug was possessed or administered under the authority of a prescription issued by a physician or that the presence of cocaine metabolite could otherwise be lawfully explained, unlawful ingestion is a reasonable inference. 
However, it is also possible that the ingestion was involuntary and/or unknowing. 3/ M-D CR and Respondent were notified on August 11, 1993, that the urine sample Respondent provided on August 5, 1993, tested positive for cocaine. Respondent has not worked as a correctional officer since that date. Upon notification of the test results, Respondent vehemently denied using drugs. She took immediate steps to try to prove her innocence. Respondent contacted the Dade County Police Benevolent Association (the "PBA"), which arranged for Consulab of Cedars of Lebanon Hospital to do a drug screen at the 50 Nanogram per milliliter level on a urine sample provided by Respondent. On August 12, 1993, Respondent provided a urine sample to Consulab. Respondent claims that the results of that test did not reveal the presence of cocaine or cocaine metabolite in her urine. 4/ The Consulab test result reported by Respondent is not necessarily inconsistent with the results reported by TTS because the levels detected by TTS were relatively small and any cocaine in Respondent's system could have been fully metabolized during the time between the two tests. On September 2, 1993, the PBA, on behalf of Respondent, requested a retest of Respondent's August 5, 1993, urine sample. Prior to the retest, Respondent was present and able to inspect the seal on the container from the split sample of her August 5, 1993 urine specimen. On or about September 9, 1993, Dr. Terry Hall issued a final report on the retest of Respondent's August 5 urine sample. The retest was positive for cocaine metabolite at a level of 67 Nanograms per milliliter. This result is consistent with the earlier GC/MS test result. On or about August 19, 1993, Respondent's employer, the M-D CR, issued a Disciplinary Action Report to Respondent based on the TTS reports. The Report advised Respondent that proceedings were being initiated to dismiss her from employment. On or about November 5, 1993, Director Charles A. Felton of the M-D CR dismissed Respondent from her employment with the M-D CR. By letter dated November 9, 1993, Commander Miriam Carames, Employee Discipline Coordinator for the M-D CR, advised the Florida Department of Law Enforcement ("FDLE") of Respondent's termination. On or about November 22, 1993, Respondent wrote a personal letter to Director Felton explaining her side of the events leading to her termination and proclaiming her innocence. In accordance with the PBA's collective bargaining agreement, Respondent requested an arbitration hearing on her dismissal. The arbitration hearing on Respondent's termination was conducted on December 21, 1993. The decision of Arbitrator Charles A. Hall of the American Arbitration Association was rendered on February 1, 1994, and issued by letter dated February 9, 1994. That decision found that Respondent should be returned to full duty, without loss of pay, provided she agreed to six months of random drug testing. By letter dated May 3, 1994, Metro-Dade County Manager Joaquin Avino overturned the decision of Arbitrator Charles A. Hall and ordered Respondent dismissed from her employment with the M-D CR. That decision is currently being appealed. There is no evidence that Respondent has had any problems or difficulties in carrying out her responsibilities as a correctional officer.
From Respondent's initial employment as a clerk with the state corrections department through her employment as a correctional officer beginning in 1991, Respondent has consistently been recognized as a professional, loyal and dedicated employee. Her job evaluations have always been satisfactory or better. Respondent received the State of Florida Department of Corrections, Circuit 11, Employee of the Year Award for 1988. She has further demonstrated dedication to her profession through continued training in the law enforcement field. Respondent's coworkers and supervisors testified that Respondent has a reputation for integrity, honesty and fairness in the treatment of inmates and coworkers. They also testified that she respects the rights of others, respects the law and has a reputation for overall good moral character and has never been observed to be impaired, or known to use drugs. Respondent is the mother of 3 teenage girls and has been very active in her Church. She has devoted substantial personal time and resources to community service. Respondent strongly denies taking or ingesting cocaine. Respondent provided no explanations at hearing for the positive test results. She was at a loss to provide a plausible explanation for what she perceives to be an aberration. Respondent presented the testimony of a number of witnesses who know her well to lend credence to her denial. Those witnesses testified credibly that Respondent is a person of good moral character who, among other qualities, has the ability to differentiate between right and wrong and the character to observe the difference, has respect for the rights of others, has respect for the law, and can be relied upon in a position of trust and confidence. Those witnesses, who have known Respondent for an extended period of time commencing well before the incident in question, believe it is the antithesis of Respondent's character to have ingested or used cocaine. In summary, the results of the urinalysis create a suspicion of unlawful drug use. However, the test results alone do not conclusively establish unlawful use. The results could have been due to some unknown test failure or inadvertent ingestion. After considering the nominal amount of cocaine metabolite disclosed by testing, the evidence presented regarding Respondent's character, as well as her employment record, the evidence is not clear and convincing that Respondent has unlawfully ingested cocaine. While no conclusion can be reached, with any degree of certainty, as to the reason for the positive test results, the test results cannot and should not be ignored. Without a plausible explanation for the test results, those results do raise some unanswered questions and doubts as to Respondent's character which do provide a basis for action by the Commission under its rules.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that a Final Order be entered finding that there are some doubts regarding Respondent's moral fitness for continued service in accordance with Rule 11B-27.0011(4)(c)4. In view of this finding, Respondent should be placed on probation for two years subject to random drug testing. DONE AND RECOMMENDED this 18th day of August, 1995, in Tallahassee, Leon County, Florida. J. STEPHEN MENTON Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 18th day of August, 1995.

Florida Laws (6) 120.57, 120.60, 893.03, 893.13, 943.13, 943.1395 Florida Administrative Code (3) 11B-27.0011, 11B-27.00225, 11B-27.005
JOHN L. WINN, AS COMMISSIONER OF EDUCATION vs PETER NEWTON, 05-000102PL (2005)
Division of Administrative Hearings, Florida Filed: Clearwater, Florida Jan. 13, 2005 Number: 05-000102PL Latest Update: Sep. 06, 2005

The Issue The issues in the case are whether the allegations set forth in the Administrative Complaint filed by Petitioner against Respondent are correct, and, if so, what penalty should be imposed.

Findings Of Fact Respondent is a Florida teacher, holding Florida Educator's Certificate 780153 (covering the area of Emotionally Handicapped education) valid through June 30, 2007. At all times material to this case, Respondent was employed as a teacher of emotionally handicapped third-grade students at Skycrest Elementary School in the Pinellas County School District. Respondent was employed by the Pinellas County School Board as a teacher of emotionally handicapped students for more than six years. The Pinellas County School District assessed student and instructional performance through the use of the "Pinellas Instructional Assessment Portfolio." The portfolio consisted of two tests administered three times each school year. The tests were known as the "Parallel Reading-Florida Comprehensive Assessment Test" and the "Parallel Math-Florida Comprehensive Assessment Test." The portfolio tests were used by the school district to gauge progress towards meeting the Sunshine State Standards established by the Florida Department of Education (DOE) to determine the academic achievement of Florida students. The portfolio tests, administered over a two-day period, also served to prepare students to take the Florida Comprehensive Assessment Test (FCAT). The FCAT was administered according to requirements established through the DOE and was designed to measure progress towards meeting Sunshine State Standards. Third-grade students were required to achieve a passing score on the FCAT in order to move into the fourth grade. One of the purposes of the portfolio tests was to measure student progress and provide information relative to each student's abilities. Based on test results, additional instruction was provided to remedy academic deficiencies and further prepare students to pass the FCAT. Emotionally handicapped students were required to take the reading and the math portfolio tests. The school district had specific procedures in place related to administration of the tests. Teachers responsible for administration of the tests received instruction on appropriate test practices. Respondent was aware of the rules governing administration of the tests. The procedures permitted teachers to offer general encouragement to students, but teachers were prohibited from offering assistance. Teachers were not allowed to read questions to students. Teachers were not permitted to provide any information to students related to the content of test responses. During the December 2002 testing period, Respondent provided improper assistance to the nine emotionally handicapped students he taught. During the test, Respondent reviewed student answers to multiple-choice questions and advised students to work harder on the answers, indicating that the answers were incorrect. Respondent assisted students by reading questions, helping students to pronounce words and phrases, and advising students as to the location in the test materials where answers could be found. Some of Respondent's students were apparently overwhelmed by the test process and ceased working on the tests. Respondent reviewed their progress and advised the students to continue answering questions. There is no evidence that Respondent directly provided answers to students, but Respondent clearly assisted students to determine which responses were correct. The assistance provided by Respondent to his students exceeded that which was allowed under test rules.
Respondent acknowledged that the assistance was inappropriate, but asserted that he provided it to give the students confidence that they could take and pass the FCAT and advance to the fourth grade. Respondent's improper assistance prevented school officials from obtaining an accurate measurement of his students' academic abilities. The test results were invalidated and the students were retested. According to the parties, an article related to the matter was published in a local newspaper.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that Petitioner enter a final order reprimanding Respondent for violating Florida Administrative Code Rule 6B-1.006(3)(a), and placing him on probation for a period of one year. DONE AND ENTERED this 18th day of May, 2005, in Tallahassee, Leon County, Florida. S WILLIAM F. QUATTLEBAUM Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 18th day of May, 2005. COPIES FURNISHED: Kathleen M. Richards, Executive Director Education Practices Commission Department of Education 325 West Gaines Street, Room 224 Tallahassee, Florida 32399-0400 Mark Herdman, Esquire Herdman & Sakellarides, P.A. 2595 Tampa Road, Suite J Palm Harbor, Florida 34684 Ron Weaver, Esquire Post Office Box 5675 Douglasville, Georgia 30154-0012 Marian Lambeth, Program Specialist Bureau of Educator Standards Department of Education 325 West Gaines Street, Suite 224-E Tallahassee, Florida 32399-0400 Daniel J. Woodring, Esquire Department of Education 1244 Turlington Building 325 West Gaines Street Tallahassee, Florida 32399-0400

Florida Laws (3) 1012.01, 1012.795, 120.57
ROBIN CARTER MILLAN vs DEPARTMENT OF CHILDREN AND FAMILY SERVICES, 98-005602 (1998)
Division of Administrative Hearings, Florida Filed: Largo, Florida Dec. 22, 1998 Number: 98-005602 Latest Update: Jan. 24, 2000

The Issue The issue in this case is whether the Petitioner, Robin Carter Millan, is eligible for the Developmental Services Program of the Department of Children and Family Services (DCFS).

Findings Of Fact The Petitioner, Robin Carter Millan, requested developmental services from the Department of Children and Family Services (DCFS) in September 1997, when she was 26 years old. The Petitioner's mother, Ann Millan, met with an intake counselor and completed a Referral/Intake Information Questionnaire. Consistent with a long-standing preference not to label her child as autistic, Mrs. Millan listed her daughter's primary disability as mental retardation. After the Petitioner submitted additional information, DCFS psychologist specialist-coordinator Jane Schiereck sent the Petitioner a letter dated March 6, 1998, notifying the Petitioner that DCFS had determined her ineligible for developmental services because the information submitted included IQ test scores exceeding the maximum for mental retardation. At the hearing, the Petitioner's mother presented evidence that the Petitioner actually has autism--a pervasive, neurologically-based developmental disability which causes severe learning, communication, and behavior disorders with age of onset during childhood. Schiereck testified that the evidence proved the Petitioner is eligible for developmental services under the category of autism. According to Schiereck, however, the Petitioner did not apply for services under the category of autism and therefore had to reapply under that category. The Petitioner agreed to do so. However, Schiereck also testified that the intake procedures and eligibility determination preceded the filing of an application.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that DCFS enter a final order determining the Petitioner eligible for developmental services. DONE AND ENTERED this 14th day of May, 1999, in Tallahassee, Leon County, Florida. J. LAWRENCE JOHNSTON Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 14th day of May, 1999. COPIES FURNISHED: Amy V. Archibald, Esquire Department of Children and Family Services 11351 Ulmerton Road, Suite 100 Largo, Florida 33778-1630 Robin Carter Millan c/o Robert and Ann Millan 3963 Eagle Cove West Drive Palm Harbor, Florida 34685 John S. Slye, General Counsel Department of Children and Family Services Building 2, Room 204 1317 Winewood Boulevard Tallahassee, Florida 32399-0700 Gregory D. Venz, Agency Clerk Department of Children and Family Services Building 2, Room 204 1317 Winewood Boulevard Tallahassee, Florida 32399-0700

Florida Laws (3) 393.062, 393.063, 393.065
JIM HORNE, AS COMMISSIONER OF EDUCATION vs LISA M. GAUSE, 04-003635PL (2004)
Division of Administrative Hearings, Florida Filed: Avon Park, Florida Oct. 06, 2004 Number: 04-003635PL Latest Update: Jul. 11, 2005

The Issue The issue is whether Respondent committed the acts alleged in the Amended Administrative Complaint, and if so, what discipline should be imposed.

Findings Of Fact Respondent holds, and at all relevant times held, a valid Florida Educator’s Certificate. Respondent is, and at all relevant times was, a fifth-grade teacher at Avon Park Elementary School in Highlands County. Respondent has been an elementary school teacher for 19 years. She taught fourth and fifth grade at Zolfo Springs Elementary School in Hardee County from 1986 through the end of the 2000-01 school year. She started teaching at Avon Park Elementary School at the beginning of the 2001-02 school year. Respondent is currently on a year-to-year contract. Her contract was renewed for the 2003-04 and 2004-05 school years notwithstanding the allegations in this case, which occurred during the 2002-03 school year. Respondent has not had any disciplinary problems over the course of her career, and other than the allegations in this case, she has never been accused of any unethical or unprofessional conduct. Respondent has always received good annual performance evaluations. Respondent’s most recent performance evaluations -- for the 2002-03 and 2003-04 school years -- state that she “meets or exceeds expectations” in all categories, including the category that assesses whether Respondent “act[s] in a professional and ethical manner and adhere[s] to the Code and Principles of Professional Conduct.” Consistent with the information in Respondent’s annual performance evaluations, the principal at Avon Park Elementary School, who is Respondent’s current supervisor, testified that Respondent “does a good job” as a teacher and that she values Respondent quite highly as a teacher; the former principal at Zolfo Springs Elementary School, who was Respondent’s supervisor for approximately five of the years that Respondent taught at that school, testified that Respondent’s reputation for complying with the code of ethics is “excellent” and that Respondent always “monitored and cherished” her professionalism; one of Respondent’s co-workers at Avon Park Elementary School testified that Respondent is “a very effective and professional teacher”; and the students who testified at the hearing characterized Respondent as a good teacher. Respondent has administered the FCAT to her students since the test’s inception in the 1990s, and as a result, she is very familiar with what teachers can and cannot do when administering the test. Respondent and other teachers at Avon Park Elementary School received training on the administration of the 2003 FCAT, and as part of the training, Respondent received a copy of the Test Administration Manual for the 2003 FCAT. The Test Administration Manual is published by the state Department of Education (Department) and is distributed to teachers by the testing coordinators at each school. The school-level testing coordinators report to a testing coordinator at the school district level, who is ultimately responsible for the administration of the FCAT to the district’s students. The Test Administration Manual summarizes the “dos and don’ts” of test administration for the FCAT. It also includes a copy of the statute and rule governing test security, which for the 2003 FCAT were Section 228.301, Florida Statutes, and Florida Administrative Code Rule 6A-10.042. On the issue of test security, the Test Administration Manual explains that: it is not appropriate to talk with [students] about any test item or to help them answer any test item.
For example, if students finish the test before the allotted time for the session has elapsed, or have not attempted to complete a question, it would be appropriate to encourage them to go back and check their work. It is not acceptable to provide the students with any information that would allow them to infer the correct answer, such as suggesting that they might want to check their work on a specific question. (Emphasis in original). The FCAT is required by state law to be administered annually to public school students in the third through tenth grades to measure the students’ proficiency in reading, writing, science, and math. The FCAT measures the students’ performance against state standards. The Norm Referenced Test (NRT), which is administered in conjunction with the FCAT, measures the students’ performance in math and reading against national standards. The FCAT is an important test, both to students and the schools. The student’s promotion to the next grade and/or class placement is affected to some degree by his or her performance on the FCAT. The school’s grade, which has an impact on the funding that the school district receives from the state, is also affected to some degree by the students’ performance on the FCAT. The math and reading portions of the 2003 FCAT were administered to fifth graders on Monday through Wednesday, March 3-5, 2003. The science portion of the FCAT and the NRT were administered the following week, on Monday through Wednesday, March 10-12, 2003. Throughout the 2002-03 school year, Respondent “taught the FCAT” and gave her class practice FCAT questions. She used the questions as teaching tools and to help prepare her students for the actual FCAT. Respondent would sometimes explain the wording of the practice questions to her students and, as needed, she would provide the students other assistance, both individually and as a class, while they were working on the practice questions. On Friday, February 28, 2003, Respondent administered two practice tests to her students in which she tried to simulate the environment in which the students would be taking the actual FCAT the following week. For example, the tests were timed and Respondent walked around the room as she proctored the tests. Respondent helped the students during the practice tests as she had done with the practice questions administered throughout the year. At one point, she stopped the test and reviewed a math problem on the board with the class because she observed a number of students having problems with a particular question. Respondent administered the math and reading portions of the actual FCAT to 18 students in her homeroom class on March 3-5, 2003. None of those students were exceptional education students who were entitled to special accommodations. Respondent did a 15 to 20 minute “mini-review” each morning that the students were taking the actual FCAT during which she went over terminology and concepts that the students might see on the test that day. Respondent started the administration of the actual FCAT by reading the directions verbatim from the “scripts” in the Test Administration Manual. Once the students began taking the test, she monitored them from her desk and she also walked around the room on a periodic basis. Respondent also went to students’ desks when they raised their hands. 
The Test Administration Manual contemplates that students might raise their hands and ask questions during the test; indeed, the “scripts” that the teacher is required to read verbatim state more than once, “Please raise your hand if you have any questions.” Respondent denied giving the students any assistance in answering the test questions on the actual FCAT. According to Respondent, when a student asked her about a particular test question, she told the student that “I can’t help you,” “go back and re-read the directions,” “do the best you can,” or other words to that effect. The Department’s testing coordinator, Victoria Ash, testified that responses such as those are acceptable. Respondent also made a general statement to the class during the test reminding the students to go back and check their work if they finished the test before the allotted time expired. Ms. Ash testified that a general reminder such as that is “absolutely acceptable.” Respondent’s testimony was corroborated by student J.M., who credibly testified that he recalled more than once hearing Respondent tell other students that she could not help them during the actual FCAT. Several students testified that Respondent helped them during the actual FCAT by explaining words that they did not understand, explaining how to solve math problems, and/or by suggesting that they check their work on particular problems. That testimony was not persuasive because it lacked specificity and precision, and other than A.P., B.B. (boy), and K.J., the students testified that they were not certain that the help they remembered receiving was on the actual FCAT rather than on the practice tests that they were given by Respondent. With respect to B.B. (boy), the undersigned did not find his testimony persuasive because he also testified that Respondent helped the entire class with a math problem during the actual test, which contradicted the statements given by the other students and which suggests that he was recalling events from the practice test during which Respondent gave such help to the entire class. With respect to A.P. and K.J., the undersigned did not find them to be particularly credible witnesses based upon their demeanors while testifying. There were other inconsistencies in the students’ accounts of Respondent’s administration of the FCAT that make their testimony generally unpersuasive. For example, B.B. (girl) testified that Respondent played classical music during the actual test, which was not corroborated by any other student in the class and was contradicted by Respondent’s credible testimony that she played music during the practice tests to relax the students but that she and the other fifth-grade teachers at Avon Park Elementary School made a conscious decision not to play music during the actual FCAT. As a result of the students’ apparent confusion regarding events occurring during practice tests rather than the actual FCAT, the inconsistencies in the students’ accounts of the events during the administration of the test, the general lack of specificity and precision in the students’ accounts of the events, and Respondent’s credible denial of any wrongdoing, the evidence does not clearly and convincingly establish the truth of the allegations against Respondent. In making the foregoing finding, due consideration was given to the investigation undertaken by the district-level testing coordinator, Rebecca Fleck, at the time of the allegations against Respondent, and the materials generated through that investigation. 
The reason for the investigation was a phone call that Ms. Fleck received on Wednesday, March 5, 2003, from a Department employee who told Ms. Fleck that the Department had received an anonymous complaint about Respondent’s administration of the FCAT. Ms. Fleck went to Avon Park Elementary School on Friday, March 7, 2003, to investigate the complaint. On that date, she met with the school’s assistant principal and interviewed several of the students in Respondent’s class. She also spoke briefly with Respondent to “get her side of the story,” which consistent with her testimony at the hearing, was an unequivocal denial of any wrongdoing. Ms. Fleck decided, based upon the student interviews, that Respondent should not administer the science portion of the FCAT or the NRT the following week. As a result, Respondent was assigned to work at the school district office on March 10-12, 2003, while her students were taking the tests on those dates. Ms. Fleck also decided to interview and get statements from all of the students in Respondent’s class, which she did on the following Monday and Tuesday, March 10 and 11, 2003. On those days, the students were called to the principal’s office in groups of two or three and they were asked to fill out a questionnaire developed by Ms. Fleck. Pam Burnaham, the principal of Avon Park Elementary School, and Ms. Fleck supervised the students while they filled out the questionnaires. The students were not told that Ms. Fleck was investigating alleged wrongdoing by Respondent; they were told that the purpose of the questionnaire was to find out about their “FCAT experience.” Ms. Fleck testified that she was confident that the students understood that the questionnaire related only to the actual FCAT and not any of the practice tests administered by Respondent; however, Ms. Burnaham testified that she did not place any emphasis on the distinction, and as noted above, the students’ testimony at the hearing indicates that they may have been confused on this issue. Ms. Fleck concluded based upon the students’ responses on the questionnaires that Respondent “coached” the students during the administration of the actual FCAT. As a result, she invalidated the tests of all 18 students in Respondent’s class. Ms. Fleck’s decision to invalidate the students’ tests was not unreasonable based upon what she was told by the students, which she believed to be true; however, the invalidation of the tests is not sufficient in and of itself to impose discipline on Respondent because, as discussed above, the truth of the students’ allegations was not clearly and convincingly proven at the hearing. Several of the students gave written statements to a Department investigator in late May 2003 regarding the help that they recalled being given by Respondent on the FCAT. No weight is given to those statements because no credible evidence was presented regarding the circumstances under which the statements were made, the statements were made several months after the events described in the statements, and as was the case with the questionnaires the students filled out for Ms. Fleck, the undersigned is not persuaded that the students understood at the time they were giving the statements that they were describing events that occurred during the actual FCAT rather than the practice tests that they were given by Respondent. There is no persuasive evidence that any of the students in Respondent’s class whose tests were invalidated suffered any adverse educational consequences. 
Even though the school administrators did not have the benefit of the students’ FCAT scores for purposes of placement and/or developing a remediation plan, they had other information on which they could make those decisions, including the students’ scores on the NRT, which was administered the week after the FCAT and was not invalidated. Other than being reassigned to the school district office during the administration of the NRT, Respondent did not suffer any adverse employment consequences from the school district as a result of the students’ allegations and/or the invalidation of the students’ tests. To the contrary, Respondent continued to get good performance reviews and her contract has been renewed twice since the administration of the 2003 FCAT. Respondent did not administer the 2004 FCAT because this case was still pending. She was given other duties at Avon Park Elementary School while her students were taking the 2004 FCAT.

Recommendation Based upon the foregoing findings of fact and conclusions of law, it is RECOMMENDED that the Commission issue a final order dismissing the Amended Administrative Complaint against Respondent. DONE AND ENTERED this 6th day of April, 2005, in Tallahassee, Leon County, Florida. S T. KENT WETHERELL, II Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 6th day of April, 2005.

Florida Laws (8) 1008.22, 1008.24, 1012.79, 1012.795, 1012.796, 120.569, 120.57, 90.803
# 8
KPMG CONSULTING, INC. vs DEPARTMENT OF REVENUE, 02-001719BID (2002)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida May 01, 2002 Number: 02-001719BID Latest Update: Oct. 15, 2002

The Issue The issue to be resolved in this proceeding concerns whether the Department of Revenue (Department, DOR) acted clearly erroneously, contrary to competition, arbitrarily or capriciously when it evaluated the Petitioner's submittal in response to an Invitation to Negotiate (ITN) for a child support enforcement automated management system-compliance enforcement (CAMS CE) in which it awarded the Petitioner a score of 140 points out of a possible 230 points and disqualified the Petitioner from further consideration in the invitation to negotiate process.

Findings Of Fact Procurement Background: The Respondent, the Department of Revenue (DOR), is a state agency charged with the responsibility of administering the Child Support Enforcement Program (CSE) for the State of Florida, in accordance with Section 20.21(h), Florida Statutes. The DOR issued an ITN for the CAMS Compliance Enforcement implementation on February 1, 2002. This procurement is designed to give the Department a "state of the art system" that will meet all Federal and State Regulations and Policies for Child Support Enforcement, improve the effectiveness of collections of child support and automate enforcement to the greatest extent possible. It will automate data processing and other decision-support functions and allow rapid implementation of changes in regulatory requirements resulting from revised Federal and State Regulation Policies and Florida initiatives, including statutory initiatives. CSE services suffer from dependence on an inadequate computer system known as the "FLORIDA System," which was not originally designed for CSE and is housed and administered in another agency. The current FLORIDA System cannot meet the Respondent's needs for automation, does not meet the Respondent's management and reporting requirements, and does not provide the flexibility the Respondent needs. The DOR needs a system that will ensure the integrity of its data, allow the Respondent to consolidate some of the "stand-alone" systems it currently has in place to remedy certain deficiencies of the FLORIDA System, and help the Child Support Enforcement program secure needed improvements. The CSE is also governed by Federal Policy, Rules and Reporting requirements concerning performance. In order to improve its effectiveness in responding to its business partners in the court system, the Department of Children and Family Services, the Sheriff's Departments, employers, financial institutions and workforce development boards, as well as to the Federal requirements, it has become apparent that the CSE agency needs a new computer system with the flexibility to respond to the complete requirements of the CSE program. In order to accomplish its goal of acquiring a new computer system, the CSE began the procurement process. The Department hired a team from the Northrup Grumman Corporation, headed by Dr. Edward Addy, to lead the procurement development process. Dr. Addy began a process of defining CSE needs and then developing an ITN which reflected those needs. The process included many individuals in CSE who would be the daily users of the new system. These individuals included Andrew Michael Ellis, Revenue Program Administrator III for Child Support Enforcement Compliance Enforcement; Frank Doolittle, Process Manager for Child Support Enforcement Compliance Enforcement; and Harold Bankirer, Deputy Program Director for the Child Support Enforcement Program. There are two alternative strategies for implementing a large computer system such as CAMS CE: a customized system developed especially for CSE or a Commercial Off The Shelf/Enterprise Resource Plan (COTS/ERP) solution. A COTS/ERP system is a pre-packaged software program, which is implemented as a system-wide solution. Because there is no existing COTS/ERP for child support programs, the team recognized that customization would be required to make the product fit its intended use. 
The team recognized that other system attributes were also important, such as the ability to convert "legacy data" and to address such factors as data base complexity and data base size. The Evaluation Process: The CAMS CE ITN put forth a tiered process for selecting vendors for negotiation. The first tier involved an evaluation of key proposal topics. The key topics were each vendor's past corporate experience (past projects) and its key staff. A vendor was required to score 150 out of a possible 230 points to enable it to continue to the next stage or tier of consideration in the procurement process. The evaluation team wanted to remove, at an early stage, vendors who did not have a serious chance of becoming the selected vendor. This would prevent an unnecessary expenditure of time and resources by both the CSE and the vendor. The ITN required that the vendors provide three corporate references showing their past corporate experience for evaluation. In other words, the references involved past jobs they had done for other entities which showed relevant experience in relation to the ITN specifications. The Department provided forms to the vendors, who in turn provided them to the corporate references that the vendors themselves selected. The vendors also included in their proposals a summary of their corporate experience, drafted by the vendors themselves. Table 8.2 of the ITN provided positive and negative criteria by which the corporate references would be evaluated. The list in Table 8.2 is not meant to be exhaustive and is in the nature of an "included but not limited to" standard. The vendors had the freedom to select references whose projects the vendors believed best fit the criteria upon which each proposal was to be evaluated. For the key staff evaluation standard, the vendors provided summary sheets as well as résumés for each person filling a lead role as a key staff member on their proposed project team. Having a competent project team was deemed by the Department to be critical to the success of the procurement and implementation of a large project such as the CAMS CE. Table 8.2 of the ITN provided the criteria by which the key staff would be evaluated. The Evaluation Team: The CSE selected an evaluation team which included Dr. Addy, Mr. Ellis, Mr. Bankirer, Mr. Doolittle and Mr. Esser. Although Dr. Addy had not previously performed the role of an evaluator, he has responded to several procurements for Florida government agencies. He is familiar with Florida's procurement process and has a doctorate in Computer Science as well as seventeen years of experience in information technology. Dr. Addy was the leader of the Northrup Grumman team which primarily developed the ITN with the assistance of personnel from the CSE program itself. Mr. Ellis, Mr. Bankirer and Mr. Doolittle participated in the development of the ITN as well. Mr. Bankirer and Mr. Doolittle had previously been evaluators in other procurements for Federal and State agencies prior to joining the CSE program. Mr. Esser is the Chief of the Bureau of Information Technology at the Department of Highway Safety and Motor Vehicles and has experience in similar, large computer system procurements at that agency. The evaluation team selected by the Department thus has extensive experience in computer technology, as well as knowledge of the requirements of the subject system. 
The Department provided training regarding the evaluation process to the evaluators as well as a copy of the ITN, the Source Selection Plan and the Source Selection Team Reference Guide. Section 6 of the Source Selection Team Reference Guide entitled "Scoring Concepts" provided guidance to the evaluators for scoring proposals. Section 6.1 entitled "Proposal Evaluation Specification in ITN Section 8" states: Section 8 of the ITN describes the method by which proposals will be evaluated and scored. SST evaluators should be consistent with the method described in the ITN, and the source selection process documented in the Reference Guide and the SST tools are designed to implement this method. All topics that are assigned to an SST evaluator should receive at the proper time an integer score between 0 and 10 (inclusive). Each topic is also assigned a weight factor that is multiplied by the given score in order to place a greater or lesser emphasis on specific topics. (The PES workbook is already set to perform this multiplication upon entry of the score.) Tables 8-2 through 8-6 in the ITN Section 8 list the topics by which the proposals will be scored along with the ITN reference and evaluation and scoring criteria for each topic. The ITN reference points to the primary ITN section that describes the topic. The evaluation and scoring criteria list characteristics that should be used to affect the score negatively or positively. While these characteristics should be used by each SST evaluator, each evaluator is free to emphasize each characteristic more or less than any other characteristic. In addition, the characteristics are not meant to be inclusive, and evaluators may consider other characteristics that are not listed . . . (Emphasis supplied). The preponderant evidence demonstrates that all the evaluators followed these instructions in conducting their evaluations and none used a criterion that was not contained in the ITN, either expressly or implicitly. Scoring Method: The ITN used a 0 to 10 scoring system. The Source Selection Team Guide required that the evaluators use whole integer scores. They were not required to start at "7," which was the average score necessary to achieve a passing 150 points, and then to score up or down from 7. The Department also did not provide guidance to the evaluators regarding a relative value of any score, i.e., what is a "5" as opposed to a "6" or a "7." There is no provision in the ITN which establishes a baseline score or starting point from which the evaluators were required to adjust their scores. The procurement development team had decided to give very little structure to the evaluators as they wanted to have each evaluator score based upon his or her understanding of what was in the proposal. Within the ITN the development team could not sufficiently characterize every potential requirement, in the form that it might be submitted, and provide the consistency of scoring that one would want in a competitive environment. This open-ended approach is a customary method of scoring, particularly in more complex procurements in which generally less guidance is given to evaluators. 
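As an illustration only, the weighted-scoring arithmetic described in the excerpt above can be sketched in a few lines of Python. The topic names and weight values below are hypothetical assumptions, not figures drawn from the ITN or the PES workbook; they are chosen solely so that the weights sum to 23 and the maximum weighted total equals the 230 points available at this tier.

    # Illustrative sketch only: the ITN's 0-to-10 integer scoring with weight
    # factors, as described in the Source Selection Team Reference Guide excerpt
    # above. Topic names and weights are hypothetical; they are chosen so the
    # weights sum to 23, making the maximum weighted total 230 points.
    PASSING_POINTS = 150
    MAX_TOPIC_SCORE = 10

    topics = [
        ("Corporate reference 1", 3),   # hypothetical weight
        ("Corporate reference 2", 3),   # hypothetical weight
        ("Corporate reference 3", 3),   # hypothetical weight
        ("Key staff", 14),              # hypothetical weight
    ]
    assert sum(weight for _, weight in topics) * MAX_TOPIC_SCORE == 230

    def weighted_total(scores):
        """scores maps each topic name to an integer score from 0 to 10."""
        total = 0
        for topic, weight in topics:
            score = scores[topic]
            if not isinstance(score, int) or not 0 <= score <= MAX_TOPIC_SCORE:
                raise ValueError(f"{topic}: score must be an integer from 0 to 10")
            total += score * weight
        return total

    print(weighted_total({topic: 6 for topic, _ in topics}))  # 138 -- below 150
    print(weighted_total({topic: 7 for topic, _ in topics}))  # 161 -- clears 150

Under these assumed weights, a uniform integer score of 6 on every topic totals 138 points while a uniform 7 totals 161, which illustrates why an average score of roughly 7 was needed to reach the 150-point threshold, even though, as noted below, neither the ITN nor the scoring instructions designated 7 as a minimally compliant score.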
Providing precise guidance regarding the relative value of any score, regarding the imposition of a baseline score or starting point, from which evaluators were required to adjust their scores, instruction as to weighing of scores and other indicia of precise structure to the evaluators would be more appropriate where the evaluators themselves were not sophisticated, trained and experienced in the type of computer system desired and in the field of information technology and data retrieval generally. The evaluation team, however, was shown to be experienced and trained in information technology and data retrieval and experienced in complex computer system procurement. Mr. Barker is the former Bureau Chief of Procurement for the Department of Management Services. He has 34 years of procurement experience and has participated in many procurements for technology systems similar to CAMS CE. He established that the scoring system used by the Department at this initial stage of the procurement process is a common method. It is customary to leave the numerical value of scores to the discretion of the evaluators based upon each evaluator's experience and review of the relevant documents. According wider discretion to evaluators in such a complex procurement process tends to produce more objective scores. The evaluators scored past corporate experience (references) and key staff according to the criteria in Table 8.2 of the ITN. The evaluators then used different scoring strategies within the discretion accorded to them by the 0 to 10 point scale. Mr. Bankirer established a midrange of 4 to 6 and added or subtracted points based upon how well the proposal addressed the CAMS CE requirements. Evaluator Ellis used 6 as his baseline and added or subtracted points from there. Dr. Addy evaluated the proposals as a composite without a starting point. Mr. Doolittle started with 5 as an average score and then added or subtracted points. Mr. Esser gave points for each attribute in Table 8.2, for key staff, and added the points for the score. For the corporate reference criterion, he subtracted a point for each attribute the reference lacked. As each of the evaluators used the same methodology for the evaluation of each separate vendor's proposal, each vendor was treated the same and thus no specific prejudice to KPMG was demonstrated. Corporate Reference Evaluation: KPMG submitted three corporate references: Duke University Health System (Duke), SSM Health Care (SSM), and Armstrong World Industries (Armstrong). Mr. Bankirer gave the Duke reference a score of 6, the SSM reference a score of 5 and the Armstrong reference a score of 7. Michael Strange, the KPMG Business Development Manager, believed that 6 was a low score. He contended that an average score of 7 was required to make the 150-point threshold for passage to the next level of the ITN consideration. Therefore, a score of 7 would represent minimum compliance, according to Mr. Strange. However, neither the ITN nor the Source Selection Team Guide identified 7 as a minimally compliant score. Mr. Strange's designation of 7 as a minimally compliant score is not provided for in the specifications or the scoring instructions. Mr. James Focht, Senior Manager for KPMG testified that 6 was a low score, based upon the quality of the reference that KPMG had provided. However, Mr. 
Bankirer found that the Duke reference was actually a small-sized project, with little system development attributes, and that it did not include information regarding a number of records, the data base size involved, the estimated and actual costs and attributes of data base conversion. Mr. Bankirer determined that the Duke reference had little similarity to the CAMS CE procurement requirements and did not provide training or data conversion as attributes for the Duke procurement which are attributes necessary to the CAMS CE procurement. Mr. Strange and Mr. Focht admitted that the Duke reference did not specifically contain the element of data conversion and that under the Table 8.2, omission of this information would negatively affect the score. Mr. Focht admitted that there was no information in the Duke Health reference regarding the number of records and the data base size, all of which factors diminish the quality of Duke as a reference and thus the score accorded to it. Mr. Strange opined that Mr. Bankirer had erred in determining that the Duke project was a significantly small sized project since it only had 1,500 users. Mr. Focht believed that the only size criterion in Table 8.2 was the five million dollar cost threshold, and, because KPMG indicated that the project cost was greater than five million dollars, that KPMG had met the size criterion. Mr. Focht believed that evaluators had difficulty in evaluating the size of the projects in the references due to a lack of training. Mr. Focht was of the view that the evaluator should have been instructed to make "binary choices" on issues such as size. He conceded, however, that evaluators may have looked at other criteria in Table 8.2 to determine the size of the project, such as database size and number of users. However, the corporate references were composite scores by the evaluators, as the ITN did not require separate scores for each factor in Table 8.2. Therefore, Mr. Focht's focus on binary scoring for size, to the exclusion of other criteria, mis-stated the objective of the scoring process. The score given to the corporate references was a composite of all of the factors in Table 8.2, and not merely monetary value size. Although KPMG apparently contends that size, in terms of dollar value, is the critical factor in determining the score for a corporate reference, the vendor questions and answers provided at the pre-proposal conference addressed the issue of relevant criteria. Question 40 of the vendor questions and answers, Volume II, did not single out "project greater than five million dollars" as the only size factor or criterion. QUESTION: Does the state require that each reference provided by the bidder have a contract value greater than $5 million; and serve a large number of users; and include data conversion from a legacy system; and include training development? ANSWER: To get a maximum score for past corporate experience, each reference must meet these criteria. If the criteria are not fully met, the reference will be evaluated, but will be assigned a lower score depending upon the degree to which the referenced project falls short of these required characteristics. Therefore, the cost of the project is shown to be only one component of a composite score. Mr. Strange opined that Mr. Bankirer's comment regarding the Duke reference, "little development, mostly SAP implementation" was irrelevant. Mr. 
Strange's view was that the CAMS CE was not a development project and that Table 8.2 did not specifically list development as a factor on which proposals would be evaluated. Mr. Focht stated that, in his belief, Mr. Bankirer's comment suggested that Mr. Bankirer did not understand the link between the qualifications in the reference and the nature of KPMG's proposal. Both Strange and Focht believe that the ITN called for a COTS/ERP solution. Mr. Focht stated that the ITN references a COTS/ERP approach numerous times. Although many of the references to COTS/ERP in the ITN also refer to development, Mr. Strange also admitted that the ITN was open to a number of approaches. Furthermore, both the ITN and the Source Selection Team Guide stated that the items in Table 8.2 are not all inclusive and that the evaluators may look to other factors in the ITN. Mr. Bankirer noted that there is no current CSE COTS/ERP product on the market. Therefore, some development will be required to adapt an off-the-shelf product to its intended use as a child support case management system. Mr. Bankirer testified that the Duke project was a small-size project with little development. Duke has three sites while CSE has over 150 sites. Therefore, the Duke project is smaller than CAMS. There was no information provided in the KPMG submittal regarding data base size and number of records with regard to the Duke project. Mr. Bankirer did not receive the information he needed to infer a larger-sized project from the Duke reference. Mr. Esser also gave the Duke reference a score of 6. The reference did not provide the data base information required, which was the number of records in the data base and the number of "gigabytes" of disc storage needed to store the data, and there was no element of legacy conversion. Dr. Addy gave the Duke reference a score of 5. He accepted the dollar value as greater than five million dollars. He thought that the Duke Project may have included some data conversion, but it was not explicitly stated. The Duke customer evaluated training, so he presumed training was provided with the Duke project. The customer ratings for Duke were high, as he expected they would be, but similarity to the CAMS CE system was not well explained. He looked at size in terms of numbers of users, number of records and database size. The numbers that were listed were for a relatively small-sized project. There was not much description of the methodology used, and so he gave it an overall score of 5. Mr. Doolittle gave the Duke reference a score of 6. He felt that it was an average response. He listed the number of users, the number of locations, and that it was on time and on budget, but found that there was no mention of data conversion, database size or number of records, consistent with the other evaluators. A review of the evaluators' comments makes it apparent that KPMG's scores were more a product of the paucity of information provided by KPMG's corporate references than of any lack of evaluator knowledge of the material being evaluated. Mr. Ellis gave a score of 6 for the Duke reference. He used 6 as his baseline. He found the required elements but nothing more that, in his mind, justified raising the score above 6. Mr. Focht and Mr. Strange expressed the same concerns regarding Mr. Bankirer's "little development" comment for the SSM Healthcare reference as they had for the Duke Health reference. However, both Mr. Strange and Mr. Focht admitted that the reference provided no information regarding training. Mr. 
Strange admitted that the reference had no information regarding data conversion. Training and data conversion are criteria contained in Table 8.2. Mr. Strange also admitted that KPMG had access to Table 8.2 before the proposal was submitted and could have included the information in the proposal. Mr. Bankirer gave the SSM reference a score of 5. He commented that the SAP implementation was not relevant to what the Department was attempting to do with the CAMS CE system. CAMS CE does not have any materials management or procurement components, which was the function of the SAP components and the SSM reference procurement or project. Additionally, there was no training indicated in the SSM reference. Mr. Esser gave the SSM reference a score of 3. His comments were "no training provided, no legacy data conversion, project evaluation was primarily for SAP not KPMG". However, it was KPMG's responsibility in responding to the ITN to provide project information concerning a corporate reference in a clear manner rather than requiring that an evaluator infer compliance with the specifications. Mr. Focht believed that legacy data conversion could be inferred from the reference's description of the project. Mr. Strange opined that Mr. Esser's comment was inaccurate as KPMG installed SAP and made the software work. Mr. Esser gave the SSM reference a score of 3 because the reference described SAP's role, but not KPMG's role in the installation of the software. When providing information in the reference SSM gave answers relating to SAP to the questions regarding system capability, system usability, system reliability but did not state KPMG's role in the installation. SAP is a large enterprise software package. This answer created an impression of little KPMG involvement in the project. Dr. Addy gave the SSM reference a score of 6. Dr. Addy found that the size was over five million dollars and customer ratings were high except for a 7 for usability with reference to a "long learning curve" for users. Data conversion was implied. There was no strong explanation of similarity to CAMS CE. It was generally a small-sized project. He could reason some similarity into it, even though it was not well described in the submittal. Mr. Doolittle gave the SSM reference a score of 6. Mr. Doolittle noted, as positive factors, that the total cost of the project was greater than five million dollars, that it supported 24 sites and 1,500 users as well "migration from a mainframe." However, there were negative factors such as training not being mentioned and a long learning curve for its users. Mr. Ellis gave a score of 6 for SSM, feeling that KPMG met all of the requirements but did not offer more than the basic requirements. Mr. Strange opined that Mr. Bankirer, Dr. Addy and Mr. Ellis (evaluators 1, 5 and 4) were inconsistent with each other in their evaluation of the SSM reference. He stated that this inconsistency showed a flaw in the evaluation process in that the evaluators did not have enough training to uniformly evaluate past corporate experience, thereby, in his view, creating an arbitrary evaluation process. Mr. Bankirer gave the SSM reference a score of 5, Ellis a score of 6, and Addy a score of 6. Even though the scores were similar, Mr. Strange contended that they gave conflicting comments regarding the size of the project. Mr. 
Ellis stated that the size of the project was hard to determine as the cost was listed as greater than five million dollars and the database size given, but the number of records was not given. Mr. Bankirer found that the project was low in cost and Dr. Addy stated that over five million dollars was a positive factor in his consideration. However, the evaluators looked at all of the factors in Table 8.2 in scoring each reference. Other factors that detracted from KPMG's score for the SSM reference were: similarity to the CAMS system not being explained, according to Dr. Addy; no indication of training (all of the evaluators); the number of records not being provided (evaluator Ellis); little development shown (Bankirer) and usability problems (Dr. Addy). Mr. Strange admitted that the evaluators may have been looking at other factors besides the dollar value size in order to score the SSM reference. Mr. Esser gave the Armstrong reference a score of 6. He felt that the reference did not contain any database information or cost data and that there was no legacy conversion shown. Dr. Addy also gave Armstrong a score of 6. He inferred that this reference had data conversion as well as training and the high dollar volume which were all positive factors. He could not tell, however, from the project description, what role KPMG actually had in the project. Mr. Ellis gave a score of 7 for the Armstrong reference stating that the Armstrong reference offered more information regarding the nature of the project than had the SSM and Duke references. Mr. Bankirer gave KPMG a score of 7 for the Armstrong reference. He found that the positive factors were that the reference had more site locations and offered training but, on the negative side, was not specific regarding KPMG's role in the project. Mr. Focht opined that the evaluators did not understand the nature of the product and services the Department was seeking to obtain as the Department's training did not cover the nature of the procurement and the products and services DOR was seeking. However, when he made this statement he admitted he did not know the evaluators' backgrounds. In fact, Bankirer, Ellis, Addy and Doolittle were part of a group that developed the ITN and clearly knew what CSE was seeking to procure. Further, Mr. Esser stated that he was familiar with COTS and described it as a commercial off-the-shelf software package. Mr. Esser explained that an ERP solution or Enterprise Resource Plan is a package that is designed to do a series of tasks, such as produce standard reports and perform standard operations. He did not believe that he needed further training in COTS/ERP to evaluate the proposals. Mr. Doolittle was also familiar with COTS/ERP and believed, based on the amount of funding, that it was a likely response to the ITN. Dr. Addy's doctoral dissertation research was in the area of software re-use. COTS is one of the components that comprise a development activity and re-use. He became aware during his research of how COTS packages are used in software engineering. He has also been exposed to ERP packages. ERP is only one form of a COTS package. In regard to the development of the ITN and the expectations of the development team, Dr. Addy stated that they were amenable to any solution that met the requirements of the ITN. They fully expected the compliance solutions were going to be comprised of mostly COTS and ERP packages. Furthermore, the ITN in Section 1.1, on page 1-2 states, ". . . 
FDOR will consider an applicable Enterprise Resource Planning (ERP) or Commercial Off the Shelf (COTS) based solution in addition to custom development." Clearly, this ITN was an open procurement, and to train evaluators on only one of the alternative solutions would have biased the evaluation process. Mr. Doolittle gave each of the KPMG corporate references a score of 6. Mr. Strange and Mr. Focht questioned the appropriateness of these scores as the corporate references themselves gave KPMG average ratings of 8.3, 8.2 and 8.0. However, Mr. Focht admitted that Mr. Doolittle's comments regarding the corporate references were a mixture of positive and negative comments. Mr. Focht believed, however, that because the reference corporations considered the same factors in providing ratings on the reference forms, it was inconsistent for Mr. Doolittle to separately evaluate the same factors that the corporations had already rated. However, there is no evidence in the record that KPMG provided Table 8.2 to the companies completing the reference forms or that the companies consulted the table when completing their reference forms. Therefore, KPMG did not prove that it had taken all measures available to it to improve its scores. Moreover, Mr. Focht's criticism would impose a requirement on Mr. Doolittle's evaluation which was not supported by the ITN. Mr. Focht admitted that no criterion in the ITN limited the evaluators' discretion in scoring to the ratings given to the corporate references by those corporate reference customers. All of the evaluators used Table 8.2 as their guide for scoring the corporate references. As part of his evaluation, Dr. Addy looked at the methodology used by the proposers in each of the corporate references to implement the solution for that reference company. He was looking at methodology to determine its degree of similarity to CAMS CE. While methodology is not specifically listed in Table 8.2 as a measure of similarity to CAMS, Table 8.2 states that the list is not all inclusive. Clearly, methodology is a measure of similarity and therefore is not an arbitrary criterion. Moreover, as Dr. Addy used the same process and criteria in evaluating all of the proposals, there was no prejudice to KPMG by use of this criterion since all vendors were subjected to it. Mr. Strange stated that KPMG appeared to receive lower scores for SAP applications than other vendors. For example, evaluator 1 gave a score of 7 to Deloitte's reference for Suntax. Suntax is an SAP implementation. It is difficult to draw comparisons across vendors, yet the evaluators consistently found that KPMG's references lacked key elements such as data conversion, information on starting and ending costs, and information on database size. All of these missing elements contributed to a reduction in KPMG's scores. Nevertheless, KPMG received average scores of 5.5 for Duke, 5.7 for SSM and 6.3 for Armstrong, compared with the score of 7 received by Deloitte for Suntax. There is a gap of only 0.7 to 1.5 points between Deloitte's and KPMG's scores for SAP implementations, despite the deficient information within KPMG's corporate references. Key Staff Criterion: The proposals contained a summary of the experience of key staff and attached résumés. KPMG's proposed key staff person for Testing Lead was Frank Traglia. Mr. Traglia's summary showed that he had 25 years' experience in each of the areas of child support enforcement, information technology, project management and testing. 
Strange and Focht admitted that Traglia's résumé did not specifically list any testing experience. Mr. Focht further admitted that it was not unreasonable for evaluators to give the Testing Lead a lower score due to the lack of specific testing information in Traglia's résumé. Mr. Strange explained that the résumé was from a database of résumés. The summary sheet, however, was prepared by those KPMG employees who prepared the proposal. All of the evaluators resolved the conflicting information between the summary sheet and the résumé by crediting the résumé as more accurate. Each evaluator thought that the résumé was more specific and expected to see specific information regarding testing experience on the résumé for someone proposed as the Testing Lead person. Evaluators Addy and Ellis gave the Testing Lead criterion scores of 4 and 5. Mr. Ron Vandenberg (evaluator 8) gave the Testing Lead a score of 9. Mr. Vandenberg was the only evaluator to give the Testing Lead a high score. The other evaluators gave the Testing Lead an average score of 4.2. The Vandenberg score thus appears anomalous. All of the evaluators gave the Testing Lead a lower score because the résumé did not specifically list testing experience. Dr. Addy found that the summary sheet listed 25 years of experience in child support enforcement, information technology, project management and system testing. As he did not believe this person had 100 years of experience, he assumed those experience categories ran concurrently. A strong candidate for Testing Lead should demonstrate a combination of testing experience, education and certification, according to Dr. Addy. Mr. Doolittle also expected to see testing experience mentioned in the résumé. When evaluating the Testing Lead, Mr. Bankirer first looked at the team skills matrix and found it interesting that testing was not one of the categories of skills listed for the Testing Lead. He then looked at the summary sheet and résumé from Mr. Traglia. He gave a lower score to Traglia as he thought that KPMG should have put forward someone with demonstrable testing experience. The evaluators gave a composite score to key staff based on the criteria in Table 8.2. In order to derive the composite score that he gave each staff person, Mr. Esser created a scoring system wherein he awarded points for each attribute in Table 8.2 and then added the points together to arrive at a composite score. Among the criteria he rated, Mr. Esser awarded points for CSE experience. Mr. Focht and Mr. Strange contended that, because the term CSE experience is not actually listed in Table 8.2, Mr. Esser was incorrect in awarding points for CSE experience in his evaluation. Table 8.2 does refer to relevant experience. There is no specific definition provided in Table 8.2 for relevant experience. Mr. Focht stated that relevant experience is limited to COTS/ERP experience, system development, life cycle and project management methodologies. However, these factors are also not listed in Table 8.2. Mr. Strange limited relevance to experience in the specific role for which the key staff person was proposed. This is a limitation that also is not imposed by Table 8.2. CSE experience is no more or less relevant than the factors posited by KPMG as relevant experience. Moreover, KPMG included a column for CSE experience in its own descriptive table of key staff. KPMG must have seen this information as relevant if it included it in its proposal as well. 
Inclusion of this information in its proposal demonstrated that KPMG must have believed CSE experience was relevant at the time it submitted its proposal. Mr. Strange held the view that, at the bidders' conference, in a reply to a vendor question, the Department representative stated that CSE experience was not required, and that Mr. Esser therefore could not use such experience to evaluate key staff. Question 47 of the Vendor Questions and Answers, Volume 2 stated: QUESTION: In scoring the Past Corporate Experience section, Child Support experience is not mentioned as a criterion. Would the State be willing to modify the criteria to include at least three Child Support implementations as a requirement? ANSWER: No. However, a child support implementation that also meets the other characteristics (contract value greater than $5 million, serves a large number of users, includes data conversion from a legacy system and includes training development) would be considered "similar to CAMS CE." The Department's statement involved the scoring of corporate experience, not key staff. It was inapplicable to Mr. Esser's scoring system. Mr. Esser gave the Training Lead a score of 1. According to Esser, the Training Lead did not have a ten-year résumé, for which he deducted one point. The Training Lead had no specialty certification or extensive experience and had no child support experience, and received no points for those attributes. Mr. Esser added one point for the minimum of four years of specific experience and one point for the relevance of his education. Mr. Esser gave the Project Manager a score of 5. The Project Manager had a ten-year résumé and the required references and received a point for each. He gave two points for exceeding the minimum required information technology experience. The Project Manager had twelve years of project management experience, for a score of one point, but lacked certification, a relevant education and child support enforcement experience, for which he was accorded no points. Mr. Esser also gave the Project Liaison person a low score. According to Mr. Focht, the Project Liaison should have received a higher score since she has a professional history of having worked for the state technology office. Mr. Esser, however, stated that she did not have four years of specific experience and did not have extensive experience in the field, although she had a relevant education. Mr. Esser gave the Software Lead person a score of 4. The Software Lead, according to Mr. Focht, had a long history of implementing SAP solutions for a wide variety of clients and should have received a higher score. Mr. Esser gave a point each for having a ten-year résumé, four years of specific experience in software, extensive experience in this area and relevant education. According to Mr. Focht, the Database Lead had experience with database pools, including the Florida Retirement System, and should have received more points. Mr. Strange concurred with Mr. Focht, stating that Esser had given low scores to key staff who had good experience, which should have generated more points. Mr. Strange believed that Mr. Esser's scoring was inconsistent but provided no basis for that conclusion. Other evaluators also gave key staff positions scores of less than 7. Dr. Addy gave the Software Lead person a score of 5. The Software Lead had 16 years of experience and SAP development experience as positive factors but had no development lead experience. 
He had a Bachelor of Science and a Master of Science in Mechanical Engineering and a Master's in Business Administration, which were not good educational matches for the role of a Software Lead person. Dr. Addy gave the Training Lead person a score of 5. The Training Lead had six years of consulting experience, a background in SAP consulting and some training experience, but did not have certification or education in training. His educational background also was electrical engineering, which is not a strong background for a training person. Dr. Addy gave the subcontractor managers a score of 5. Two of the subcontractors did not list managers at all, which detracted from the score. Mr. Doolittle gave the Training Lead person an average score; he believed that, based on his experience and training, it was an average response. Table 8.2 contained an item in which a proposer could have points detracted from a score if the key staff person's references were not excellent. The Department did not check references at this stage in the evaluation process. As a result, the evaluators simply did not consider that item when scoring. No proposer's score was adversely affected thereby. KPMG contends that checking references would have given the evaluators greater insight into the work done by those individuals and their relevance and capabilities in the project team. Mr. Focht admitted, however, that any claimed effect on KPMG's score is conjectural. Mr. Strange stated that, without reference checks, information in the proposals could not be validated, but he provided no basis for his opinion that reference checking was necessary at this preliminary stage of the evaluation process. Dr. Addy stated that the process called for checking references during the timeframe of oral presentations. The development team did not expect the references to change any scores at this point in the process. KPMG asserted that references should be checked to ascertain the veracity of the information in the proposals. However, even if the information in some other proposal was inaccurate, it would not change the outcome for KPMG. KPMG would still not have the required number of points to advance to the next evaluation tier. Divergency in Scores The Source Selection Plan established a process for resolving divergent scores. Any item receiving scores with a range of 5 or more was determined to be divergent. The plan provided that the Coordinator identify divergent scores and then report to the evaluators that there were divergent scores for that item. The Coordinator was precluded from telling an evaluator whether his score was the divergent score, i.e., the highest or lowest score. Evaluators would then review that item, but were not required to change their scores. The purpose of the divergent score process was to have evaluators review their scores to see if there were any misperceptions or errors that skewed the scores. The team wished to avoid having any influence on the evaluators' scores. Mr. Strange testified that the Department did not follow the divergent score process in the Source Selection Plan because the coordinator did not tell the evaluators why the scores were divergent. Mr. Strange stated that the evaluators should have been informed which scores were divergent. The Source Selection Plan, however, merely instructed the coordinator to inform the evaluators of the reason why the scores were divergent. Scores were inherently divergent if there was a five-point score spread, so the reason for the divergence was self-explanatory. 
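The five-point-spread rule recited above can likewise be sketched in a few lines; the fragment below assumes nothing about the Department's actual tools, and the item labels and scores are hypothetical examples (the second loosely mirrors the pattern described for the Testing Lead item, where one score of 9 stood apart from scores of 4 and 5).

    # Illustrative sketch only of the divergence rule recited above: an item is
    # flagged when the spread between the highest and lowest evaluator scores on
    # that item is 5 or more. Item labels and scores below are hypothetical.
    DIVERGENCE_SPREAD = 5

    def divergent_items(item_scores):
        """item_scores maps each item name to the list of evaluators' scores."""
        flagged = {}
        for item, scores in item_scores.items():
            spread = max(scores) - min(scores)
            if spread >= DIVERGENCE_SPREAD:
                flagged[item] = spread
        return flagged

    example = {
        "Corporate reference A": [5, 6, 6, 7, 6],  # spread 2 -- not divergent
        "Key staff item B": [4, 5, 4, 3, 9],       # spread 6 -- divergent
    }
    print(divergent_items(example))  # {'Key staff item B': 6}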
The evaluators stated that they scored the proposals and submitted the scores, and that each then received an e-mail from Debbie Stephens informing him that there were divergent scores and that he should consider re-scoring. None of the evaluators ultimately changed their scores. Mr. Esser's scores were the lowest of the divergent scores, but he did not re-score his proposals, as he had spent a great deal of time on the initial scoring and felt his scores to be valid. Neither of KPMG's witnesses, Mr. Focht and Mr. Strange, provided more than speculation regarding the effect of the divergent scores on KPMG's ultimate score or any role the divergent scoring process may have had in KPMG not attaining the 150-point passing score. Deloitte - Suntax Reference: Susan Wilson, a Child Support Enforcement employee connected with the CAMS project, signed a reference for Deloitte Consulting regarding the Suntax System. Mr. Focht was concerned that the evaluators were influenced by her signature on the reference form. Mr. Strange further stated that having someone who is heavily involved in the project sign a reference did not appear to be fair. He was not able to state any positive or negative effect on KPMG of Wilson's reference for Deloitte, however. Evaluator Esser has met Susan Wilson but has had no significant professional interaction with her. He could not recall anything that he knew about Ms. Wilson that would favorably influence him in scoring the Deloitte reference. Dr. Addy also was not influenced by Wilson. Mr. Doolittle has worked with Wilson for only a very short time and did not know her well. He has also evaluated other proposals where department employees were a reference and was not influenced by that either. Mr. Ellis had known Wilson for only two to four months. Her signature on the reference form did not influence him either positively or negatively. Mr. Bankirer had not known Wilson for a long time when he evaluated the Suntax reference. He took the reference at face value and was not influenced by Wilson's signature. It is not unusual for someone within an organization to create a reference for a company that is competing for work to be done for the organization.

Recommendation Having considered the foregoing Findings of Fact, Conclusions of Law, the evidence of record and the pleadings and arguments of the parties, it is, therefore, RECOMMENDED that a final order be entered by the State of Florida Department of Revenue upholding the proposed agency action which disqualified KPMG from further participation in the evaluation process regarding the subject CAMS CE Invitation to Negotiate. DONE AND ENTERED this 26th day of September, 2002, in Tallahassee, Leon County, Florida. P. MICHAEL RUFF Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with Clerk of the Division of Administrative Hearings this 26th day of September, 2002. COPIES FURNISHED: Cindy Horne, Esquire Earl Black, Esquire Department of Revenue Post Office Box 6668 Tallahassee, Florida 32399-0100 Robert S. Cohen, Esquire D. Andrew Byrne, Esquire Cooper, Byrne, Blue & Schwartz, LLC 1358 Thomaswood Drive Tallahassee, Florida 32308 Seann M. Frazier, Esquire Greenburg, Traurig, P.A. 101 East College Avenue Tallahassee, Florida 32302 Bruce Hoffmann, General Counsel Department of Revenue 204 Carlton Building Tallahassee, Florida 32399-0100 James Zingale, Executive Director Department of Revenue 104 Carlton Building Tallahassee, Florida 32399-0100

Florida Laws (3) 120.569, 120.57, 20.21
# 9
DORIAN KENNETH ZINCK vs BOARD OF PROFESSIONAL ENGINEERS, 94-002664 (1994)
Division of Administrative Hearings, Florida Filed:West Palm Beach, Florida May 10, 1994 Number: 94-002664 Latest Update: Sep. 20, 1995

Findings Of Fact The National Council of Examiners for Engineering and Surveying (hereinafter "NCEES") writes and otherwise prepares the examinations for candidates seeking engineering licenses in 55 states and jurisdictions. The examinations are then administered by the states and jurisdictions which constitute NCEES' member boards. Respondent, State of Florida, Board of Professional Engineers, is a member board and uses NCEES' examinations. The Fundamentals of Engineering (hereinafter "FE") examination is given twice a year, in April and in October. The FE examination measures the basic knowledge a candidate has acquired in a bachelor degree program in the first two years during which the candidate takes basic engineering and science courses. Passage of the examination does not result in licensure as an engineer; it results in either an "engineer intern" or an "engineer in training" certificate which shows that the examinee has completed the necessary educational requirements to sit for that eight-hour examination and to have passed it. The next step is that a successful candidate will then complete four years of experience and then pass a principles and practices examination called the "PE" examination in order to then be licensed as a professional engineer. The FE exam is a minimal competency examination. Questions for the FE examination are written by individuals and are then reviewed by a committee. That committee is composed of registered professional engineers who are practicing engineers and engineers from the academic world, from consulting firms, and from governmental entities. Each question or item on the examination is reviewed by at least 12 to 15 individuals during the review process which takes from one to one and a half years. As part of the development process, individual items appear on examinations as pre-test questions. The purpose of using pre-test questions is to determine the characteristics of that specific item, as to how hard or easy the item is when used on the target population (candidates for the FE examination), and to verify that minimally competent candidates can answer the test item correctly. If pre-test questions perform as expected, they are used on subsequent examinations. If they do not perform adequately, the questions go back to the committee to be changed or to be discarded. Pre-test questions on examinations are not scored, and whether an examinee correctly answers that question is irrelevant to the raw score or final grade achieved by that candidate on the examination. Pre-test questions are distributed proportionately throughout the examination, and no subject area on the examination ever consists of only pre-test questions. Pre-test questions are used by other national testing programs. No unfairness inures to candidates from the presence of pre-test questions on an examination for two reasons. First, all candidates are treated equally. Candidates do not know that the examination contains pre-test questions, and, even if they did, they do not know which questions are pre-test questions and which questions will be scored. Second, the length of the examination itself is not increased by adding pre-test questions. The examination has the same number of questions whether pre-test questions are included or not. In the actual exam preparation, NCEES uses American College Testing and/or Educational Testing Service as contractors. 
The contractors pull the proper number of items in each subject area from the item bank and assemble the examination, which is then sent to the NCEES committee of registered professional engineers to determine whether changes to the examination are necessary. Once the examination is approved, the contractor prints the examination booklets and sends them to the member boards to administer the examination. Answer sheets from an exam administration are transmitted to the contractor for scanning and statistical analysis. The contractor then recommends a passing point based on a scaling and equating process so that future exams are no easier or harder than past exams. When NCEES approves the passing point, the contractor sends the examination scores or results to the member boards.

When the examination is changed in some fashion, a new baseline or pass point must be established to ensure that the new examination remains equal in difficulty to past examinations and remains a good measure of competency. The new examination is referred to as the anchor examination. The October 1990 FE examination was an anchor exam.

The member boards of NCEES determined that the October 1993 FE examination would be changed to a supplied-reference-document examination, meaning that during the examination a candidate could use only the supplied reference handbook, a pencil, and a calculator. Candidates would no longer be able to bring their own reference materials to use during the examination. One of the reasons for the change was fairness to the candidates: the FE examination was not being administered uniformly nationwide, since some member boards prohibited bringing into the examination certain publications that were allowed by other member boards. Accordingly, it was determined that NCEES would write and distribute at the examination its Fundamentals of Engineering Reference Handbook, thereby placing all candidates nationwide on an equal footing in that all examinees would be using the same reference material of charts, mathematical formulas, and conversion tables during the examination, and no other reference materials would be used during the examination itself.

In August of 1991, NCEES approved the concept of a supplied reference handbook, and a beginning draft was sent to the FE sub-committee of the examination committee for review. The individual members of the sub-committee took two FE examinations using the draft of the supplied reference document to ensure that all material needed to solve the problems on an FE examination was included in the reference document and that the document was accurate. On a later occasion the committee took the examination that would be administered in October of 1993 using a subsequent draft of the supplied reference handbook. The last review of the handbook occurred in February of 1993, when the committee used that draft to review the October 1993 examination for the second time, and NCEES' Fundamentals of Engineering Reference Handbook, First Edition (1993) was finished.

When NCEES received its first copies back from the printer, it mailed copies to the deans of engineering at the 307 universities in the United States that have accredited engineering programs, for review and input. As a result, NCEES became aware of some typographical and other errors contained in that document. In July of 1993 NCEES assembled a group of 12 individuals for a passing point workshop for the October 1993, a/k/a the '93 10, examination.
The group consisted of three members of the committee, with the remainder being persons working in the academic world or as accreditation evaluators, and recent engineer interns who had passed the FE examination within the previous year and were not yet professional engineers. That group took the '93 10 FE examination using the first edition of the Handbook and then made judgments to determine the pass point for that examination. During that two-day workshop, the errors in the Handbook were pointed out to the working group so it could determine whether any of the errors contained in the Handbook had any impact on any of the problems contained in the '93 10 examination. The group determined that none of the errors in the Handbook affected any test item on the '93 10 FE examination.

In September of 1993, subsequent to the passing point workshop, the '93 10 FE exam and the first edition of the Handbook went back to the committee of registered professional engineers for a final check, and that committee also determined that none of the errors in the Handbook would have any impact on the questions in the '93 10 FE examination. An errata sheet to the first edition of the Handbook was subsequently prepared but was not available until December of 1993. In September of 1994 the second printing of the Handbook was completed, and that version incorporated the changes contained on the errata sheet. Of the errors contained in the first edition of the Handbook, only one was substantive; that is, one mathematical equation was wrong. However, no item on the '93 10 FE exam could be affected by that mathematical error. The remaining errors were typographical or simply matters of convention, i.e., errors in conventional terminology and symbols found in most textbooks, such as the use of upper case instead of lower case or symbols being italicized as opposed to non-italicized.

Candidates for the '93 10 FE examination were able to purchase in advance, as a study guide, a Fundamentals of Engineering sample examination, which had its second printing in March of 1992. The sample examination was composed of questions taken from previous FE exams that would never be used again on an actual FE examination. It consisted of actual test questions and multiple choice answers; it did not show candidates how to solve the problems or work the computations, but merely gave multiple choice responses. Errors were contained on the two pages where the answers to the sample examination were given: the answer key was wrong as to two items on the morning sample examination and was wrong for all of the electrical circuit items, one of the subject areas included in the afternoon sample examination. An errata sheet was prepared and distributed in September of 1993 to those who had purchased the sample examination.

Petitioner took the '93 10 FE examination, which contained 140 items during the morning portion and 70 items during the afternoon portion. Approximately 25 percent of the questions on the examination were pre-test questions. The minimum passing score for that examination was 70, and Petitioner achieved a score of only 68. Accordingly, Petitioner failed that examination.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that a final order be entered finding that Petitioner failed to achieve a passing score on the October 1993 Fundamentals of Engineering examination and dismissing the amended petition filed in this cause.

DONE and ENTERED this 14th day of April, 1995, at Tallahassee, Florida.

LINDA M. RIGOT, Hearing Officer
Division of Administrative Hearings
The DeSoto Building
1230 Apalachee Parkway
Tallahassee, Florida 32399-1550
(904) 488-9675

Filed with the Clerk of the Division of Administrative Hearings this 14th day of April, 1995.

APPENDIX TO RECOMMENDED ORDER
Petitioner's proposed findings of fact numbered 1-5 and 8 have been adopted either verbatim or in substance in this Recommended Order. Petitioner's proposed finding of fact numbered 7 has been rejected as being subordinate to the issues herein. Petitioner's proposed findings of fact numbered 6 and 9 have been rejected as not constituting findings of fact but rather as constituting recitation of the testimony or conclusions of law. Respondent's proposed findings of fact numbered 1-15 have been adopted either verbatim or in substance in this Recommended Order. Respondent's proposed finding of fact numbered 16 has been rejected as being unnecessary to the issues involved herein.

COPIES FURNISHED:

Wellington H. Meffert, II
Assistant General Counsel
Department of Business and Professional Regulation
1940 North Monroe Street
Tallahassee, Florida 32399-0750

Dorian Kenneth Zinck, pro se
521 Beech Road
West Palm Beach, Florida 33409

Angel Gonzalez, Executive Director
Board of Professional Engineers
Department of Business and Professional Regulation
1940 North Monroe Street
Tallahassee, Florida 32399-0755

Lynda Goodgame, General Counsel
Department of Business and Professional Regulation
1940 North Monroe Street
Tallahassee, Florida 32399-0792

Florida Laws (3) 120.57, 471.013, 471.015
