JOHN D. WATSON vs FLORIDA ENGINEERS MANAGEMENT CORPORATION, 98-004756 (1998)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Oct. 26, 1998 Number: 98-004756 Latest Update: Apr. 20, 1999

The Issue The issue in this case is whether the Petitioner is entitled to additional credit for his response to question number 123 of the Principles & Practice Civil/Sanitary Engineer Examination administered on April 24, 1998.

Findings Of Fact Petitioner took the April 24, 1998, Principles & Practice Civil/Sanitary Engineer examination. A score of 70 is required to pass the exam, and a score of 70 corresponds to a raw score of 48. Petitioner obtained a score of 69, which corresponds to a raw score of 47. Therefore, Petitioner needs one (1) additional raw score point. On question number 123, Petitioner received a score of six points out of a possible ten. Question number 123 is scored in increments of two raw points. Two additional raw score points awarded to the Petitioner would equal a raw score of 49, equating to a conversion score of seventy-one, a passing score. The National Council of Examiners for Engineering and Surveying (NCEES), the organization that produces the examination, provides a Solution and Scoring Plan which outlines the scoring process used in question number 123. The Petitioner is not allowed a copy of the examination question or the Solution and Scoring Plan for preparation of the Proposed Recommended Order. Question number 123 has three parts: part A, part B, and part C. For a score of ten on question number 123, the Solution and Scoring Plan states that the solution to part A must be correct within allowable tolerances; the solution to part B must state two variables that affect the answer in part A; and the solution to part C must state that anti-lock brakes do not leave skid marks, thus making it very hard to determine braking distance. For a score of eight points on question number 123, the Solution and Scoring Plan states that part A could contain one error and lists specific allowable errors, and that part B and part C must be answered correctly, showing mastery of the concepts involved. Petitioner made an error in part A which falls into the allowable errors listed in the Solution and Scoring Plan under the eight-point scoring plan. Petitioner answered part B correctly. Petitioner contends that he also answered part C correctly, and should be awarded eight points. NCEES marked part C incorrect. Question number 123 is a problem involving a vehicle (vehicle number one) that skids on asphalt and hits another vehicle (vehicle number two). Part C asks "Explain how your investigation of this accident would have changed if vehicle one had been equipped with anti-lock brakes." The Petitioner's answer was as follows: If vehicle one does not "lock" its brakes, its deceleration will be dependent upon its brakes. (Not f). [Judge's note: f is used as the symbol for the co-efficient of friction between the tires and road surface in the problem.] The rate of deceleration (a) must be determined (from testing, mfg, [manufacturer,] etc.) As stated above, the Board accepts a solution that recognizes that a vehicle equipped with anti-lock brakes will not leave skid marks, which could otherwise be used for computing initial speed using the skid distance equation. The Petitioner's answer presupposes that there are no skid marks because the vehicle's wheels do not lock, due to the anti-lock brakes; therefore, the co-efficient of friction of the tires, which generates the skid marks, has no effect. The Petitioner introduced a portion of a commonly used examination preparation manual (Petitioner's Exhibit 1), which states, regarding a vehicle that does not lock its brakes, "its decelerations will be dependent upon its brakes."
The Board's expert recognized the statement by the Petitioner in response to part C as true, but indicated it was not responsive to the question in that it did not state specifically that the vehicle would not produce skid marks that could be measured for use in the skid distance equation. The solution sheet states regarding part C, "Part C is answered correctly by explaining that anti-lock brakes would not leave skid marks thus making it very hard to determine the braking distance."
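For context, the "skid distance equation" referenced in these findings is the standard work-energy relation used in accident reconstruction: a locked-wheel skid of length d on a surface with friction coefficient f implies an initial speed of v = sqrt(2*f*g*d). The sketch below is illustrative only; the function name and sample figures are assumptions, not values from question 123.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def initial_speed_from_skid(skid_length_m: float, friction_coeff: float) -> float:
    """Initial speed (m/s) implied by a locked-wheel skid.

    From the work-energy balance 0.5*m*v**2 = friction_coeff*m*G*d,
    which rearranges to v = sqrt(2*f*g*d) -- the skid distance equation
    the findings describe.
    """
    return math.sqrt(2.0 * friction_coeff * G * skid_length_m)


# Hypothetical figures, not from the examination question: a 30 m skid
# on dry asphalt (f ~ 0.7) implies about 20.3 m/s (~73 km/h).
print(initial_speed_from_skid(30.0, 0.7))

# With anti-lock brakes the wheels do not lock and leave no measurable
# skid marks, so skid_length_m is unavailable and this equation cannot
# be applied; the deceleration rate must instead come from testing or
# manufacturer data, as the Petitioner's answer observed.
```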

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law set forth herein, it is, RECOMMENDED: That the Board of Professional Engineers enter a Final Order giving Petitioner credit for part C on the examination, resulting in a passing grade on the test. DONE AND ENTERED this 25th day of March, 1999, in Tallahassee, Leon County, Florida. STEPHEN F. DEAN Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 25th day of March, 1999. COPIES FURNISHED: Natalie A. Lowe Vice President of Legal Affairs Florida Engineers Management Corporation 1208 Hays Street Tallahassee, Florida 32301 John D. Watson 88 Marine Street St. Augustine, Florida 32084 Dennis Barton, Executive Director Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301

Florida Laws (1) 120.57
JULIE MCCUE vs PAM STEWART, AS COMMISSIONER OF EDUCATION, 17-000423 (2017)
Division of Administrative Hearings, Florida Filed:Orlovista, Florida Jan. 18, 2017 Number: 17-000423 Latest Update: Jan. 22, 2018

The Issue The issue for determination is whether Petitioner’s challenge to the failing score she received on the essay section of the Florida Educational Leadership Examination (FELE) should be sustained.

Findings Of Fact Petitioner is a teacher. She received her undergraduate degree in education with a major in social studies from Bowling Green State University in 1996. Since earning her bachelor’s degree, she has taught history, psychology, and sociology over a 20-year span, at high schools in North Carolina, Ohio, and for the past three years, Florida. Petitioner holds a Florida teacher certificate. She did not have to take an exam for that certificate. She likely was issued her Florida teacher certificate on the basis of the Ohio teacher certificate she held when she moved to Florida. Petitioner aspires to add to her teacher certificate by attaining certification in educational leadership, which would require that she take and pass all subparts of the FELE. Petitioner testified that in the district where she is employed as a teacher, she would qualify for a raise in her teacher’s pay upon receiving a master’s degree in educational leadership followed by DOE certification in educational leadership. Petitioner accomplished the first step by receiving a master’s degree in educational leadership from Concordia University in Chicago, Illinois, in 2015.3/ She then initiated the process to take the FELE. Educational leadership certification would also make Petitioner eligible for a leadership position, such as principal, vice principal, or a school district administrative leadership position, if she chooses to go that route. However, Petitioner’s primary motivation in seeking this certification is for the additional compensation, and not because she wants an educational leadership position.4/ Respondent, Pam Stewart, as Commissioner of Education, is the state’s chief educational officer and executive director of DOE. §§ 20.15(2) and 1001.10(1), Fla. Stat. One of DOE’s responsibilities is to review applications for educator certification, and determine the qualifications of applicants according to eligibility standards and prerequisites for the specific type of certification sought. See § 1012.56, Fla. Stat. One common prerequisite is taking and passing an examination relevant to the particular certification. Respondent is authorized to contract for development, administration, and scoring of educator certification exams. § 1012.56(9)(a), Fla. Stat. Pursuant to this authority, following a competitive procurement in 2011, Pearson was awarded a contract to administer and score Florida’s educator certification exams, including the FELE. The State Board of Education (SBE) is the collegial agency head of DOE. § 20.15(1), Fla. Stat. As agency head, the SBE was required to approve the contract with Pearson. The SBE is also charged with promulgating certain rules that set forth policies related to educator certification, such as requirements to achieve a passing score on certification exams. DOE develops recommendations for the SBE regarding promulgating and amending these rules. In developing its recommendations, DOE obtains input and information from a diverse group of Florida experts and stakeholders, including active teachers and principals, district administrators, and academicians from colleges and universities. FELE Essay Development and Scoring DOE develops the FELE, as well as the other educator certification exams, in-house. The FELE is developed and periodically revised to align with SBE-promulgated standards for educational leadership, as well as SBE-promulgated generic subject area competencies. 
In addition, as required by statute, certification exams, including the FELE, must be aligned to SBE-approved student standards. Details about the FELE, such as the applicable generic competencies, the exam organization, and passing score requirements, are set forth in Florida Administrative Code Rule 6A-4.00821 (the FELE rule). The FELE rule has been amended periodically, but the current version includes a running history, setting forth FELE details that applied during past time periods, as well as those currently in effect. The FELE consists of three subtests. Subtest one is a multiple choice test covering the area described as “Leadership for Student Learning.” Subtest two, also a multiple choice test, covers “Organizational Development.” Subtest three covers “Systems Leadership,” and has two sections: a multiple choice section; and a written performance assessment, or essay, section. The FELE has contained an essay component for many years (as far back as any witness could remember). Before January 2015, the essay score was included in a single composite score given for subtest three. The multiple choice part accounted for most of the weight of the composite score (70 percent); the essay portion accounted for 30 percent of the composite score. Based on input from educators, academicians, and other subject matter experts, DOE recommended that the FELE subtest three be changed by establishing separate passing score requirements for each section, thereby requiring examinees to pass each section. The SBE adopted the recommendation, which is codified in the FELE rule, and has applied to FELE scoring since January 1, 2015. The effect of the change is that an examinee not as proficient in effective written communications can no longer compensate for a weak essay with a strong performance on the multiple choice section. To a lesser extent (given the prior 70:30 weight allocation), the reverse is also true. The policy underlying this scoring change is to give more emphasis to testing writing skills, in recognition of the critical importance of those skills. By giving heightened scrutiny to writing skills, the FELE better aligns with increasingly rigorous SBE-approved student standards for written performance. This policy change is reasonable and within the purview of the SBE; in any event, it is not subject to debate in this case, because Petitioner did not challenge the FELE rule. The generic competencies to be demonstrated by means of the FELE are set forth in the publication “Competencies and Skills Required for Certification in Education Leadership in Florida, Fourth Edition 2012,” adopted by reference in the FELE rule and effective as of January 1, 2014. The competency and skills generally tested by the FELE written performance assessment are:
Knowledge of effective communication practices that accomplish school and system-wide goals by building and maintaining collaborative relationships with stakeholders:
Analyze data and communicate, in writing, appropriate information to stakeholders.
Analyze data and communicate, in writing, strategies for creating opportunities within a school that engage stakeholders.
Analyze data and communicate, in writing, strategies that increase motivation and improve morale while promoting collegial efforts.
This generic description provides a high-level view (aptly described as from the 30,000-foot level) of the competency and skills that an educational leader should possess, which are tested by the written performance assessment.
DOE’s job is to distill those qualities down to a test. As reasonably summarized by DOE’s witnesses, the purpose of the FELE written performance assessment, as established by the SBE, is to test for effective written communication skills, and data analysis that drives appropriate strategies for improvement. These overall concepts are built into the general FELE rubric which serves as a guide to scoring, the individual essay prompts, and the supplemental rating criteria (essentially prompt-specific rubrics, making the general rubric specific to each essay prompt). The FELE rule sets forth requirements for how the “test scoring agency” (Pearson) must conduct the scoring of the written performance assessment: Raters/Judges. The test scoring agency shall appoint persons to score the written performance assessment who have prior experience as educational leaders, instructional leaders, or school building administrators. Chief Raters. The chief raters shall be raters who have prior experience as educational leaders, instructional leaders, or school building administrators and have demonstrated success as raters. Pursuant to Pearson’s agreement with DOE, DOE retains the right to approve raters who will be scoring the written performance assessments. Therefore, Pearson proposes raters who meet the specified qualifications, and then DOE approves or disapproves the proposed raters. Approved raters must undergo training before they are appointed by Pearson to conduct scoring. There is currently one chief rater for the FELE written performance assessment. The chief rater was a rater before being trained for, and assuming, the chief rater position. The chief rater was trained by Florida DOE chief raters when Pearson became the contractor and the scoring was transitioned to Pearson’s offices in Hadley, Massachusetts, during 2012 to 2013. Pearson employs holistic scoring as the exclusive method for scoring essays, including FELE written performance assessments (as specified in Pearson’s contract with DOE). The holistic scoring method is used to score essay examinations by professionals across the testing service industry. Pearson has extensive experience in the testing service industry, currently providing test scoring services to more than 20 states. Dr. Michael Grogan, Pearson’s director of performance assessment scoring services and a former chief rater, has been leading sessions in holistic scoring or training others since 2003. He described the holistic scoring method as a process of evaluating the overall effect of a response, weighing its strengths and weaknesses, and assigning the response one score. Through training and use of tools, such as rubrics and exemplars, the evaluation process becomes less subjective and more standardized, with the professional bias of individual raters minimized, leading to consistent scoring among trained raters. Training is therefore an integral part of Pearson’s testing services for which DOE contracted. In an intensive two-day training program conducted by the chief rater in Hadley, prospective raters are trained in the holistic scoring method used to score FELE essays. Pearson’s rater training program begins with a review of background about the holistic scoring method generally, including discussions about rater bias. From there, trainees are oriented to the FELE-specific training material.
They thoroughly review and discuss the rubric, the score scale, the operational prompt raters will be scoring, and exemplars (other responses to the prompt that have been pre-scored). The rater candidates then employ these tools to begin independently scoring exemplars. Raters-in-training conduct many rounds of independent scoring sessions, interspersed with group discussions regarding how the essays should have been scored. The trainees then move into the calibration test phase, in which they independently score essay exemplars, paired with an experienced rater who independently scores the same exemplars. The trainees score essay after essay, then compare scores with the experienced rater, with the goal of achieving consistency in scores by equaling or coming within one point of the other rater’s score. Ultimately, the raters must pass the calibration test by achieving scoring consistency to qualify for appointment as raters to score actual FELE essays. Each FELE essay is scored independently by two DOE-approved raters who meet the qualifications in the FELE rule and who have successfully completed training. Pairs of raters receive scoring assignments, one prompt at a time. The assignments are received anonymously; one rater does not know who the other assigned rater is. And neither rater knows anything about the examinee, as the essay is identified solely by a blind number. FELE essay raters work in one room, at individual computer terminals, in Hadley. Security of all testing information is vigilantly maintained, through confidentiality agreements and secure, limited, and protected computer access. For each scoring assignment, raters adhere to a step-by-step process that reinforces their initial training. Raters must first score sample responses to a historic prompt that is different from the assigned prompt, as a training refresher to invoke the holistic scoring mindset. From there, raters review the assigned prompt and the scoring guides (general rubric and supplemental rating criteria). Raters then must score an anchor set of six sample responses, one exemplifying each score category; the historic scores are not revealed until the raters complete their scoring. Raters compare their scores with the anchor scores, and work through any discrepancies. Raters then go through a calibration process of scoring 10 more sample responses to the same prompt. After scoring all 10 essays, the raters learn the scores deemed appropriate for those responses, and must work through any discrepancies until consistency is achieved. Only after scoring many sample essays and achieving success in scoring consistency are the raters permitted to turn to the assigned FELE essay for review and scoring. The chief rater supervises and monitors the raters while they are engaged in their scoring work. The chief rater is physically present in the same room with the raters, monitoring their work online in real time. As raters enter scores, those scores are immediately known by the chief rater, so that any “red flag” issues in scoring results and trends can be addressed immediately. As another tool, “ghost papers,” which are pre-scored essays, are randomly assigned to raters as if they are actual FELE essays. The chief rater monitors ghost paper scoring as another check on consistency with a predetermined measure. The scores of the two raters assigned to score a FELE essay are added together for the total holistic score.
Thus, the total score range for a FELE essay is between two points and 12 points: the lowest possible score of two points would be achieved if each rater assigns a score of one point; and the highest score of 12 points would be achieved if each rater assigns six points. The sum of the two raters’ scores will be the score that the FELE essay receives unless the raters’ scores disagree by more than one point. If the two raters’ scores differ by more than one point, then the chief rater steps in to resolve the discrepancy. After FELE essays are scored, the examinee is informed of the final score of between two and 12 points, and the examinee is told whether the score is a passing or failing score. Seven points is a passing score, according to the FELE rule. Raters do not develop written comments as part of their evaluation of FELE essays. Their holistic evaluation is expressed by the point value they assign to the essay. Through the intensive training and the subsequent calibration and recalibration before each FELE essay scoring assignment, Pearson has achieved excellent consistency in rater scoring of the FELE written performance assessment. From September 12, 2016, through October 8, 2016, the four Pearson raters who were scoring FELE essays (including Petitioner’s essay) achieved a coefficient alpha index of 98 percent, meaning that 98 percent of the time, the scores assigned to an essay by a pair of raters were either identical or adjacent (within one point), and when adjacent, were balanced (i.e., each rater was as often the higher scorer as he or she was the lower scorer). This exceeds industry standards. A comparable, high coefficient alpha index was achieved by FELE essay raters for each month in 2015 and 2016. The lowest coefficient alpha index, still exceeding industry standards, was 93 percent in a single month (February 2015). In two months (December 2015 and July 2016), the coefficient alpha index was 94 percent, with the remaining 21 months at between 95 percent and 98 percent. Examinee Perspective: Preparation for the FELE Essay DOE provides detailed information and aids on its website regarding the FELE, including the essay section, for potential examinees. This includes a 40-page test information guide for the FELE. The test information guide contains all of the SBE-adopted competencies and skills, including the competency and skills tested by the written performance assessment. The guide also contains the general FELE essay scoring rubric, and a sample prompt that is representative of the essay prompts actually used. DOE also posts on its website three additional sample FELE essay prompts along with the supplemental rating criteria that correspond to those prompts. Petitioner does not challenge the appropriateness of these materials generally, which she accessed and used to prepare for the FELE written performance assessment. However, Petitioner complained that DOE does not provide more study guide materials or endorse specific vendors of study guide materials so as to more thoroughly prepare potential examinees for their essay tests. Petitioner also complained that when an examinee fails an essay test, DOE does not provide substantive explanations to help the examinee understand the reasons for the failing score and how the examinee can perform better. 
DOE appropriately responded to this criticism by reference to standards for testing agencies adopted by three authoritative bodies: the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education. These standards dictate that, as a testing agency, DOE’s responsibility is to develop tests that evaluate whether individuals are prepared with the necessary skills. It is not DOE’s responsibility, and it would not be appropriate for DOE, as the testing agency, to prepare individuals to pass its tests, or coach individuals on how to perform better on tests they do not pass. The information DOE makes publicly available is appropriate and sufficient to explain the FELE essay exam and scoring process, and to allow an examinee to know what to expect in a prompt and what is expected of the examinee in a response. The DOE test information guide explains the FELE essay and scoring process, as follows: Your response will be scored holistically by two raters. The personal views you express will not be an issue; however, the skill with which you express those views, the logic of your arguments, the quality of your data analysis and interpretation, and the appropriateness of your implementation plans will be very important in the scoring. Your response will be scored on two constructs: communication skills, including ideas, focus, organization, and mechanics (capitalization, punctuation, spelling, and usage) and data analysis, interpretation, and evaluation, including data explanation, application, relevant implications, and analysis of trends. The raters will use the criteria on the following page when evaluating your response. The score you receive for your written performance assessment will be the combined total of the two raters’ scores. (R. Exh. 2 at 13 of 40). On “the following page” of the test information guide, the general FELE essay rubric is set forth in its entirety. The rubric is also available on the DOE website as a separate, stand-alone document. The rubric is simply a comparative description of the extent to which an essay demonstrates the generic competency and skills to be tested--effective written communication skills, with data analysis that drives appropriate strategies for improvement. For example, recognizing that part of effective written communication is use of proper grammar and syntax, the rubric describes that quality comparatively, differentiating between best, better, good, not-so-good, worse, and worst. Similarly, the rubric addresses whether proposed strategies are appropriate by comparing the extent to which the strategies are aligned with the data findings, relevant implications, and trends. But these are just parts--and not discrete parts--of the evaluation. As explained in the test information guide, holistic evaluation judges the overall effect of a response, considering all aspects of effective communication and data analysis, in a process of weighing and balancing strengths and weaknesses. Of course, DOE does not make publicly available those essay prompts being used in FELE tests, or the supplemental rating criteria for those prompts; these are protected, confidential testing material. It would be unreasonable for examinees to expect more from a testing agency than what DOE makes available. Score Verification An examinee who fails the written performance assessment (or any other FELE subtest or section) may request score verification, to verify that the failed exam was scored correctly.
The score verification procedures are set forth in the FELE rule. The score verification rule provides that DOE makes the determination as to whether an examinee’s test was scored correctly. DOE is authorized to consult with field-specific subject matter experts in making this determination. In practice, though not required by the FELE rule, when a score verification request is directed to the score assigned to a FELE written performance assessment, DOE always consults with a field-specific subject matter expert known as a “chief reviewer.” Chief reviewers are another category of experts (in addition to raters and chief raters) proposed by Pearson pursuant to qualifications identified by DOE, subject to DOE approval. Once approved by DOE, prospective chief reviewers undergo the same rater training in the holistic scoring process as do all other raters, to gain experience in scoring essays and to undergo calibration to achieve scoring consistency. In addition, chief reviewers are given training for the chief reviewer role of conducting review and scoring of essays when scores have been contested.5/ Unlike raters and chief raters, chief reviewers do not work at Pearson in Hadley, Massachusetts; they are Florida experts, actively working as principals of Florida schools. Chief reviewers only become involved when an examinee who failed the FELE written performance assessment invokes the score verification process. A chief reviewer is assigned to evaluate whether that essay was scored correctly. The chief reviewer conducts that evaluation by first going through the same step-by-step process as raters, following the same retraining and calibration steps that involve scoring many sample essays. Upon achieving success in the calibration test, the chief reviewer moves on to evaluate the assigned essay response independently, before reviewing the scores the raters gave to that essay. Upon reviewing the raters’ scores, the chief reviewer offers his or her view as to whether the essay score should stand or be changed, and provides a summary rationale for that opinion. This information is conveyed to DOE, which determines the action to take--verify or change the score--and notifies the examinee of the action taken. Petitioner’s FELE Attempts Petitioner took all parts of the FELE for the first time in the summer of 2015, in June and July. She passed subtest one, but failed subtest two and both sections (multiple choice and written performance assessment) of subtest three. FELE examinees can retake failed subtests/sections, and need only retake the parts failed. There are no limits on the number of retakes. The requirements for retakes are that at least 30 days must have elapsed since the last exam attempt, and that examinees pay the registration fees specified in the FELE rule for each retake of a failed subtest and/or section. On April 23, 2016, roughly nine months after her first attempt, Petitioner retook subtest two and both sections of subtest three. To prepare, Petitioner used the “very limited” resources on the DOE website, and purchased some “supplementals,” which she described as materials “on the market that supposed FELE experts sell.” (Tr. 33). She used the material to study and practice writing essays. Petitioner passed subtest two and the multiple choice portion of subtest three. However, she did not pass the written performance assessment section of subtest three. Petitioner retook the written performance assessment 33 days later (May 26, 2016), but again, did not pass.
Petitioner did not invoke the score verification process to question the failing scores she received on her first three FELE essays. Those three failing scores stand as final, as she did not challenge them. Petitioner explained that she did not challenge them because she was embarrassed, because as a teacher, she believed that she would pass the test. However, while Petitioner has had many years of success as a teacher, the skills for teaching do not necessarily correlate to the skills required for educational leadership positions, as several DOE witnesses credibly attested. Nonetheless, Petitioner tried again, in an effort to qualify for the pay raise her district would provide. She retook the FELE essay section for the fourth time on September 28, 2016. Petitioner testified that, as she had done before, she reviewed the material on DOE’s website, such as the test information guide with its general rubric, and she practiced writing essays using the sample essay prompts and supplemental rating criteria. In what was described as a “eureka moment,” she also found what she described as “the rubric” on the website, which she proceeded to memorize. Rather than the rubric, however, what Petitioner memorized was the generic competency and skills tested by the written performance assessment. Petitioner made a point of incorporating words from the competency and skills document in her essay. Petitioner did not pass. Each of the four times Petitioner took the FELE written performance assessment, including the most recent attempt at issue in this case, both raters assigned to score her essay gave the essay three points, for a total score of six points. Since in each of her four attempts, Petitioner’s essay was scored the same by both raters, Petitioner’s essays were never reviewed by a chief rater, because there was never a discrepancy in the raters’ scores for the chief rater to resolve. Petitioner’s Challenge to Her Fourth Six-Point Essay Score When Petitioner was notified that her fourth essay attempt resulted in the same score--six, on a scale ranging from two points to 12 points--this time Petitioner took the next step, by requesting a score verification session. Following the procedures in the FELE rule for score verification, Petitioner registered, paid the required fee, and went to the designated Pearson site. There, she was able to review the essay prompt, as well as her written response. Petitioner testified that she prepared a “statement of specific scoring errors” (so named in the FELE rule--more aptly, in her case, a statement explaining why she thinks her essay score was erroneous), which she submitted to Pearson at the end of her session. By rule, the statement is then filed with DOE. The statement Petitioner prepared was not offered into evidence, apparently by choice, as Petitioner was looking for it at one point, stating that it was “part of the confidential stuff” (Tr. 78) that had been produced by DOE. Petitioner attempted to describe the statement of scoring errors that she recalls completing. She described it as primarily demonstrating where in her essay she addressed what she characterized as the “rubric” that she had found on DOE’s website and memorized. As noted previously, this was not the rubric, but rather, was the high-level description of the competency and skills tested by the FELE written performance assessment. 
As described, Petitioner’s statement explaining that she “memorized” the competency/skills ingredients, and showing where she included competency/skills buzz-words in her essay (e.g., “morale”; she also said “celebration,” but that word does not appear in the competency/skills), would not seem to be the sort of statement that would be persuasive as to a claim of an erroneous score. It would be a mistake to memorize and repeat words from the generic competency/skills without regard to whether they are used in a way that makes sense in responding to the specific instructions of the essay prompt. DOE conducted its review, and the score was verified through a process consistent with DOE’s practice of consulting a chief reviewer retained by Pearson with DOE approval, who was qualified as a subject matter expert in the field of Florida educational leadership. The assigned chief reviewer was also qualified by Pearson training in the holistic scoring method and in conducting score verification reviews. The chief reviewer who undertook to verify Petitioner’s essay score did not review Petitioner’s statement explaining why she believed her essay score was erroneous. Instead, he independently evaluated Petitioner’s essay, following the same holistic method, including the step-by-step retraining and calibration process, used by all raters to score a FELE essay. Then the chief reviewer reviewed the scores separately assigned by the two raters who scored Petitioner’s essay. He concluded that the assigned scores of three were appropriate for Petitioner’s essay, and that no change should be made. The chief reviewer provided a summary rationale for his determination.6/ Petitioner complains that the chief reviewer should have been given her statement explaining why her score was erroneous, because that might have affected the chief reviewer’s decision. However, pursuant to the FELE rule, the chief reviewer’s role is consultative only; DOE makes the determination of whether Petitioner’s essay was scored correctly, which is why the rule provides that the statement of asserted scoring errors is filed with DOE. Petitioner presented no evidence proving that DOE did not consider Petitioner’s statement explaining why she believed her essay score was erroneous. No testimony was offered by a witness with personal knowledge of any review given to Petitioner’s statement; that review would have been done by a member of DOE’s “scoring and reporting team” (Tr. 260-261), none of whom testified. Even if Petitioner had proven that the statement was not considered by DOE, her failure to offer that statement into evidence would make it impossible to determine the import, if any, of such a failure. Petitioner was notified by DOE that the “essay score that you questioned has been reviewed by a Chief Reviewer. As a result of this review, the Department has determined that the written performance section that you questioned is indeed scored correctly.” Petitioner was informed that if she was not satisfied with the outcome, she was entitled to dispute the decision pursuant to sections 120.569 and 120.57. Petitioner availed herself of that opportunity,7/ and was given the chance in a de novo evidentiary hearing to present evidence to support her challenge to her exam score. At the hearing, Petitioner offered only her own testimony as support for her challenge to the scoring of her essay.
She isolated portions of the supplemental rating criteria and attempted to identify where her essay addressed the isolated portions, for which, in her view, she ought to have been awarded “a point” here or “a half-point” there. She also referred to isolated parts of the summary comments from the raters and chief reviewers, and attempted to identify the parts of her essay that did or did not do what the comment portions stated. Petitioner was not shown to be, tendered as, or qualified as an expert in either educational leadership or holistic scoring of essays. Her attempt to tally points by comparing isolated parts of the prompt-specific rubric to isolated parts of her essay is contrary to the holistic scoring approach used to score the FELE written performance assessment. Petitioner offered no comprehensive, holistic evaluation of her essay as a whole, nor was she shown to be qualified to do so. Besides being contrary to the holistic scoring method, Petitioner’s critique of the scoring of her essay was wholly unpersuasive. Without undermining the confidentiality of the ingredients of Petitioner’s testimony (the essay prompt, her essay, the supplemental rating criteria, and the historic anchors), overall, the undersigned did not find Petitioner’s critique credible or accurate. Although awkward to try to explain in code, some examples follow to illustrate the basis for this overall finding. As one example, Petitioner referred to data points that the prompt-specific rubric indicated should be identified in response to the prompt. If a “data point” that should have been identified was that A was consistently lower than B, Petitioner called attention to a part of her essay identifying A as low. She acknowledged that her essay did not expressly compare A to B at all, much less over time, but Petitioner argued that those comparisons were implicit. She said that she should have gotten at least a half-point for partially identifying the data point. That argument is rejected. The point that needed to be made was a comparative assessment over a time span. Where another data point called for identifying that two things were “substantially lower” than other things, Petitioner said that she sufficiently identified this point by saying that one of those two things was “lowest” (or “worst”). However, the point that needed to be made was not just that something was lowest or worst, but also, that another thing was also lower, and that the degree of separation between those two things and other things was substantial. Overall as to the data points, Petitioner failed to identify several significant trends, and failed to offer sufficient comparative analysis as to the trends she did identify. She reported data or averages of data without identifying the relevant implications of the data, as would have come from making the appropriate comparisons and identifying the appropriate trends. In terms of the competency/skills language, she did not analyze the data and communicate, in writing, appropriate information to the stakeholders identified in the prompt as the target audience. The data point failures were particularly problematic when taken to the next step of proposing specific strategies that would lead to improvement in the areas shown to be needed from the data points. 
For example, Petitioner’s failure to identify the second data point in the supplemental rating criteria resulted in Petitioner proposing action that was at odds with what the second data point showed.8/ Petitioner’s attempted critique of her essay score was riddled with other inconsistencies. For example, Petitioner acknowledged that she often failed to summarize specific data for each of the three years, choosing instead to provide three-year averages. Petitioner’s explanation was that she did not want to repeat data in the prompt because that would be condescending to her target audience. This is a weak rationale, one which is at odds with the instructions given with the prompt. Petitioner also said it should have been a positive that instead of just citing yearly numbers, she went to the trouble of calculating three-year averages. Instead, it appeared more negative than positive, by masking information needed to respond to the prompt. While Petitioner defended her omission of specific data because of the target audience she was instructed to address, Petitioner inconsistently sought to explain an odd statement using the word “celebrated” (Jt. Exh. 3 at 1, first sentence of second paragraph) as being directed more to certain other stakeholders than to the target audience. She did this because the “rubric” (i.e., the competency/skills) said to communicate to stakeholders, and also “talks about morale and celebration.” (Tr. 59). This is an example of Petitioner’s ineffective strategy of throwing out words from the competency/skills in ways that were contrary to specific instructions in the prompt. The target audience identified in an essay prompt may be certain stakeholders, instead of all stakeholders. For example, the sample prompt in the test information guide (R. Exh. 2 at 34) instructs the writer to prepare a memorandum for school advisory council members. The use of the word “stakeholders” in the competency/skills would not justify ignoring the essay prompt instructions by writing with a communication style more suited to a different audience of other stakeholders. Petitioner disagreed with the suggestion in both chief reviewers’ written comments that the essay’s responses to the third and fourth bullet points in the prompt (Jt. Exh. 1) were generalized, lacking specifics and examples. Petitioner failed to persuasively establish that her essay provided sufficient detail in this regard to avoid being fairly characterized as responding to these bullet points with “generalizations.” Because Petitioner failed to adequately analyze the data, relevant implications, and trends, her responses to these bullet points were either too general (e.g., research to find strategies), or, in the one instance where specific action was described, the action was at odds with data points she missed. Her responses lacked appropriate specific action driven by data analysis. Petitioner admitted that her essay had a number of misspellings, grammatical errors, and punctuation errors. She acknowledged that this is an area that the raters are supposed to consider. It is a necessary part of effective written communication. In this regard, by the undersigned’s count, 29 of the 37 sentences in Petitioner’s essay suffer from one or more errors of grammar, syntax, punctuation, or misspellings. More than half of those sentences (at least 15 of 29) suffer from errors of grammar and syntax, such as pairing “neither” with “or” instead of “neither . . .
nor,” using non-parallel structure, using plural subjects with singular verbs or singular subjects with plural verbs, and using conditional language (such as “would do” and “would be”) without a corresponding condition (e.g., that action would be appropriate, if the trend continues). In addition, the last sentence of the second paragraph on page one is not a complete sentence, ending in mid-word. Petitioner admitted that she ran out of time to complete the thought. As to this consideration, Petitioner’s essay appears to the undersigned to fall somewhere between the general rubric’s description for a “three” (“The writer demonstrates some errors in the use of proper grammar and syntax that do not detract from the overall effect.”), and the general rubric’s description for a “two” (“The writer demonstrates serious and frequent errors in proper grammar and syntax.”). Petitioner’s essay admittedly did not meet the general rubric’s description for a score of “four” (“The writer demonstrates satisfactory use of proper grammar and syntax.”). This does not automatically doom Petitioner’s essay to a score of three or less than three. However, it demonstrates the fallacy of Petitioner’s approach of seizing on isolated parts of the prompt-specific rubric (supplemental rating criteria) to compare to her essay, without approaching the scoring process holistically. Even if Petitioner had persuasively critiqued parts of the essay scoring, as Respondent aptly notes, it is not simply a matter of checking off boxes and adding up points. Petitioner failed to prove that the holistic scoring of her essay was incorrect, arbitrary, capricious, or devoid of logic and reason. She offered no evidence that a proper holistic evaluation of her essay would result in a higher total score than six; indeed, she offered no holistic evaluation of her essay at all. Petitioner’s critique of various parts in isolation did not credibly or effectively prove that her score of six was too low; if anything, a non-expert’s review of various parts in isolation could suggest that a score of six would be generous. But that is not the scoring approach called for here. Petitioner failed to prove that there was anything unfair, discriminatory, or fraudulent about the process by which the written performance assessment exam was developed, administered, and scored.9/ Petitioner pointed to the passage rate on the FELE written performance exam following the adoption of a separate passing score requirement. In 2015 and 2016, the passage rates for first-time test takers were 54 percent and 50 percent, respectively. The data is collected and reported for first-time test takers only, because that is considered the most reliable. Historically, performance on essay examinations goes down, not up, with multiple retakes. The passage rates reflect a mix of both examinees prepared in an academic educational leadership program geared to Florida standards, and those whose educational background does not include a Florida-focused program. Historically, examinees from academic programs aligned to Florida standards have greater success passing the FELE essay than those from out-of-state academic programs that are not aligned to Florida standards. Petitioner may have been at a disadvantage in this regard, as it does not appear that her master’s program at Concordia University was aligned to Florida’s educational leadership standards. The passage rates, standing alone, do not prove that the written performance assessment is unfair, arbitrary, or capricious. 
It may be that the SBE’s decision to increase scrutiny of the writing skills of FELE examinees results in fewer examinees achieving a passing score. Perhaps that is a good thing. Perhaps too many examinees achieved passing scores on the FELE in the past, despite weak written communication skills. In any event, the overall written performance assessment passage rates, standing alone, provide no support for Petitioner’s challenge to the score given to her essay. Petitioner failed to prove that the scoring verification process was unfair, arbitrary, capricious, or contrary to the procedures codified in the FELE rule. Petitioner pointed to evidence that essay scores are changed only on occasion, and that no scores were changed in 2016. Those facts, standing alone, do not support an inference that the score verification process is unfair, arbitrary, or capricious. An equally reasonable or more reasonable inference is that the scores to be verified were appropriate.
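To recap the scoring mechanics recounted in these findings in compact form, the sketch below models the two-rater combination rule (1-6 points per rater, summed to a 2-12 total, a gap of more than one point referred to the chief rater, seven points passing) and the identical-or-adjacent consistency measure underlying the reported coefficient alpha figures. The names and structure are illustrative assumptions, not Pearson's actual implementation.

```python
from typing import Optional

PASSING_TOTAL = 7  # minimum passing total under the FELE rule


def combine_rater_scores(first: int, second: int) -> Optional[int]:
    """Sum two independent 1-6 rater scores into a 2-12 total.

    Returns None when the scores differ by more than one point,
    signaling that the chief rater must resolve the discrepancy.
    """
    if not (1 <= first <= 6 and 1 <= second <= 6):
        raise ValueError("each rater assigns between 1 and 6 points")
    if abs(first - second) > 1:
        return None  # discrepancy: chief rater resolves
    return first + second


def adjacent_agreement_rate(score_pairs: list[tuple[int, int]]) -> float:
    """Share of rater pairs that were identical or within one point --
    the consistency the findings describe for the 93-98 percent figures."""
    agreed = sum(1 for a, b in score_pairs if abs(a - b) <= 1)
    return agreed / len(score_pairs)


# Petitioner's four attempts: both raters assigned 3 each time.
total = combine_rater_scores(3, 3)
print(total, total is not None and total >= PASSING_TOTAL)  # 6 False
```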

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that a final order be entered rejecting Petitioner’s challenge to the failing score she received on the written performance assessment section of the Florida Educational Leadership Exam taken in September 2016, and dismissing the petition in this proceeding. DONE AND ENTERED this 13th day of October, 2017, in Tallahassee, Leon County, Florida. S ELIZABETH W. MCARTHUR Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 13th day of October, 2017.

Florida Laws (5) 1001.10, 1012.56, 120.569, 120.57, 20.15
WAYNE TERWILLIGER vs BOARD OF PROFESSIONAL ENGINEERS, 94-003745 (1994)
Division of Administrative Hearings, Florida Filed:Fort Lauderdale, Florida Jul. 07, 1994 Number: 94-003745 Latest Update: Jun. 03, 1996

Findings Of Fact The Petitioner sat for the October 1993 administration of the licensure examination for Metallurgical Engineering. When his examination was graded, he was assigned a raw score of 45 points. A raw score of 48 points is the minimum passing grade on the subject examination. The Respondent stipulated at hearing that the Petitioner is entitled to one additional raw score point, which brings the Petitioner's total undisputed raw score to 46. One of the essay questions on the subject examination was Item 258. According to the scoring plan for Item 258, an exam-taker could earn from 0 to 10 points in two-point increments, depending on the quality of his answer. The scoring plan for Item 258 specifies that 2 points should be awarded for an answer that demonstrates "rudimentary knowledge" and that 4 points should be awarded for an answer that demonstrates "more than rudimentary knowledge but [is] insufficient to demonstrate competence." When the Petitioner's examination was graded the first time, he was awarded 0 points for his answer to Item 258. When the Petitioner's examination was regraded, he was awarded 2 points for his answer to Item 258. 1/ The evidence at hearing establishes that the Petitioner's answer to Item 258 demonstrates more than rudimentary knowledge, but is insufficient to demonstrate competence. 2/ Accordingly, pursuant to the scoring plan for Item 258, the Petitioner is entitled to receive 4 points for his answer.
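The arithmetic in these findings is compact enough to check directly. In the sketch below, the tier labels are paraphrased from the findings and the names are illustrative assumptions, not the format of NCEES's actual scoring plan.

```python
# Scoring plan for Item 258: 0 to 10 points in two-point increments,
# keyed to the demonstrated quality of the answer (labels paraphrased).
ITEM_258_TIERS = {
    "rudimentary knowledge": 2,
    "more than rudimentary, short of competence": 4,
    # ...higher tiers continue in two-point steps up to 10
}

PASSING_RAW = 48

undisputed_raw = 46  # 45 as graded, plus 1 point stipulated at hearing
# The 46 includes the 2 points awarded for Item 258 on regrade; moving
# Item 258 to the 4-point tier adds two more points, reaching 48.
corrected_raw = undisputed_raw - 2 + ITEM_258_TIERS["more than rudimentary, short of competence"]
print(corrected_raw, corrected_raw >= PASSING_RAW)  # 48 True
```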

Recommendation On the basis of all of the foregoing, it is RECOMMENDED that a Final Order be issued in this case concluding that the Petitioner is entitled to a raw score of 48 points on the subject examination, which is a passing grade. DONE AND ENTERED this 8th day of March 1995 in Tallahassee, Leon County, Florida. MICHAEL M. PARRISH Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 8th day of March 1995.

Florida Laws (3) 120.57, 471.013, 471.015
DEPARTMENT OF BUSINESS AND PROFESSIONAL REGULATION, BOARD OF ACCOUNTANCY vs AHMAD LABIB BALTAGI, 11-006262PL (2011)
Division of Administrative Hearings, Florida Filed:Marathon, Florida Dec. 12, 2011 Number: 11-006262PL Latest Update: Jul. 06, 2012

The Issue Whether Respondent committed the violations alleged in the Administrative Complaint and, if so, what disciplinary action should be taken against him.

Findings Of Fact The Department is the state agency charged with the duty to regulate the practice of certified public accountants in Florida and to prosecute administrative complaints pursuant to chapters 120, 455, and 473, Florida Statutes. At all times relevant to the allegations of the Complaint, Baltagi was licensed in Florida as a certified public accountant. Baltagi's license number is AC 0028318. Baltagi is the sole owner of Baltagi Business Services, Inc., which does business as "Fast Cash Services." He is also the sole owner of Labib Baltagi, Inc., which prepares income tax returns as a Jackson Hewitt tax franchise. The two businesses are located adjacent to each other, but are two separate businesses. Fast Cash Services had as its primary business operation the cashing of checks for the customers of Baltagi's Jackson Hewitt tax franchise. In 2006, the United States, on behalf of the Internal Revenue Service (IRS), filed a complaint against Baltagi, alleging that he had prepared 32 federal tax returns for Native Americans that failed to include per-capita distributions from gaming proceeds in their taxable income. In June of 2006, Baltagi signed a Stipulated Judgment of Permanent Injunction, which provided as follows: Pursuant to 26 U.S.C. §§ 7402(a), 7407 and 7408, defendants and their employees are permanently enjoined from: preparing or assisting in the preparation of, or counseling or advising the preparation or filing of, federal tax returns which assert that per capita distributions of gaming proceeds paid to Native Americans are exempt from federal income tax; preparing or assisting in the preparation of, or counseling or advising the preparation of, federal tax returns that assert any position for which there is not a realistic possibility of being sustained on its merits that results in the understatement of tax liability, or that evinces a willful, intentional, or reckless disregard for the applicable laws, rules, and regulations; engaging in any fraudulent or deceptive conduct which interferes with the proper administration of the internal revenue laws. In summary, the Stipulated Judgment prohibited Baltagi from preparing federal income tax returns that asserted that per capita gaming proceeds were exempt from federal income taxes, and from preparing federal income tax returns that understate tax liability by asserting any other frivolous or unrealistic position. The Stipulated Judgment merely prohibited Baltagi from actions that all persons, whether they are certified public accountants or not, are prohibited from performing. Baltagi was never prohibited from filing tax returns, and his license was not, in any manner, disciplined. In fact, subsequent to this Stipulated Judgment, the IRS accepted Baltagi into its Enrolled Agent Program. There was no clear and convincing evidence establishing that the IRS is a licensing agency, or that it regulates Florida certified public accountants. Also in 2006, Fast Cash Services entered into a contract with iStream Financial Services, Inc., and its affiliate, Kenny Bank and Trust (KBT). Fast Cash Services was tasked with verifying identification via current driver's licenses or other appropriate forms of identification for clients who sought to cash a check. In July and August of 2009, KBT received multiple Department of Treasury reclamation claims from Fast Cash Services. Someone other than the named payee cashed the reclamation checks, and one of Baltagi's employees failed to notice the discrepancy.
As a result of these checks being cashed, the Circuit Court for Waukesha County, Wisconsin, entered a default judgment against Baltagi in the amount of $276,160.42 in response to a complaint filed by iStream Financial Services and KBT. Baltagi began to pay KBT damages as a result of the judgment, although he firmly believes he was the victim of fraud as to the cashing of those reclamation checks. The fact that a default judgment was entered against Baltagi does not, standing alone, establish that Baltagi failed to maintain good moral character.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that Petitioner dismiss the Administrative Complaint against Respondent. DONE AND ENTERED this 12th day of April, 2012, in Tallahassee, Leon County, Florida. S JESSICA E. VARN Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 12th day of April, 2012. COPIES FURNISHED: Frederick R. Dudley, Esquire Holland & Knight 315 S. Calhoun Street, Suite 600 Tallahassee, Florida 32301 fred.dudley@hklaw.com C. Erica White, Esquire Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399 erica.white@dbpr.state.fl.us Eric R. Hurst, Esquire Department of Business and Professional Regulation 1940 North Monroe Street, Suite 42 Tallahassee, Florida 32399-2202 eric.hurst@dbpr.state.fl.us Layne Smith, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-2202 eric.hurst@dbpr.state.fl.us Veloria Kelly, Director Division of Certified Public Accounting Board of Accountancy 240 Northwest 76th Drive, Suite A Gainesville, Florida 32607

Florida Laws (6) 120.569, 120.57, 455.227, 473.308, 473.3141, 473.323
# 4
VADIM J. ALTSHULER vs BOARD OF PROFESSIONAL ENGINEERS, 98-002342 (1998)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida May 18, 1998 Number: 98-002342 Latest Update: Jan. 27, 1999

The Issue Whether Petitioner is entitled to additional credit for his response to Question Number 146 of the Principles and Practice of Engineering examination administered on October 31 through November 1, 1997.

Findings Of Fact Petitioner took the professional engineering licensing examination with emphasis in mechanical engineering on October 31, 1997. The passing score on the examination was 70. Petitioner obtained a converted score of 65, corresponding to a raw score of 43; a raw score of 48 would have generated a converted score of 70. Petitioner therefore needed at least 5 additional raw score points to achieve a passing grade and a converted score of 70. Out of a possible 10 points on Question Number 146, Petitioner received a score of 4 points. The National Council of Examiners for Engineering and Surveying (NCEES), the organization that produces the examination, provides a Solution and Scoring Plan outlining the scoring process for question 146. Further, NCEES rescored Petitioner's test but found no basis to award additional points. There are 5 categories to question 146. All six elements of question 146 must be completely and correctly answered to receive full credit of 10 points for the question. Instructions for the question provide: A perfect solution is not required, as the examinee is allowed minor psychrometric chart reading errors (two maximum) or minor math errors (two maximum). The total number of minor errors allowed is two. Errors in solution methodology are not allowed. Examinee handles all concepts (i.e., sensible and total heat, sensible heat ratio, coil ADP and BF, adiabatic mixing, and coil heat transfer) correctly. (emphasis supplied.) Testimony at the final hearing of Petitioner's expert in mechanical engineering establishes that Petitioner did not qualify for additional points for the answers provided for question 146. Petitioner failed to use the definition of bypass factor indicated in the problem. Instead, Petitioner used the Lindenburg method rather than the Carrier method to calculate the bypass factor. The Carrier method was implied by the way the problem was structured: the system outlined in question 146 did not have the special configuration that would be present if the Lindenburg method were applicable. Petitioner also missed the total coil capacity due to misreading the psychrometric chart. By his own admission at the final hearing, Petitioner misread the data provided because they were printed one right above the other in the question. Petitioner read the point on the psychrometric chart for an outdoor dry-bulb temperature of 95 degrees and 78 percent relative humidity as the outdoor air condition. The question required a dry-bulb temperature of 95 degrees and a wet-bulb temperature of 78 degrees. Petitioner's misreading constituted an error in methodology as opposed to a minor chart reading error. Question Number 146 on the examination was properly designed to test the candidate's competency, provided enough information for a qualified candidate to supply the correct answer, and was graded correctly and in accord with the scoring plan.
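The consequence of the misreading described above is easy to reproduce numerically. The sketch below is an editorial illustration, not part of the record: it uses the open-source psychrolib package in place of the printed psychrometric chart the candidates used, and that library choice is an assumption; only the two state points come from the findings.

```python
# Hypothetical sketch: contrast the state point the question required
# (95 F dry bulb / 78 F wet bulb) with the state Petitioner read
# (95 F dry bulb / 78% relative humidity), with psychrolib standing in
# for a printed psychrometric chart.
import psychrolib

psychrolib.SetUnitSystem(psychrolib.IP)  # inch-pound units (F, psi, Btu/lb)
P_ATM = 14.696  # standard atmospheric pressure, psi

# State the question specified: 95 F db / 78 F wb
w_required = psychrolib.GetHumRatioFromTWetBulb(95.0, 78.0, P_ATM)
h_required = psychrolib.GetMoistAirEnthalpy(95.0, w_required)

# State produced by the misreading: 95 F db / 78% RH
w_misread = psychrolib.GetHumRatioFromRelHum(95.0, 0.78, P_ATM)
h_misread = psychrolib.GetMoistAirEnthalpy(95.0, w_misread)

print(f"required: W = {w_required:.4f} lb/lb, h = {h_required:.1f} Btu/lb")
print(f"misread:  W = {w_misread:.4f} lb/lb, h = {h_misread:.1f} Btu/lb")
# The humidity ratios (and every downstream cooling-load figure) differ
# substantially, which is consistent with treating the misreading as a
# methodology error rather than a minor chart-reading error.
```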

Recommendation Based on the foregoing, it is, hereby, RECOMMENDED: That a final order be entered confirming Petitioner’s score on the examination question which is at issue in this proceeding. DONE AND ENTERED this 25th day of August, 1998, in Tallahassee, Leon County, Florida. DON W. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 25th day of August, 1998. COPIES FURNISHED: Natalie A. Lowe, Esquire Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301 Vadim J. Altshuler 9794 Sharing Cross Court Jacksonville, Florida 32257 Dennis Barton, Executive Director Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301 Lynda L. Goodgame, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (1) 120.57
# 5
SHARON G. YOUNG vs DEPARTMENT OF CHILDREN AND FAMILY SERVICES, 98-000984 (1998)
Division of Administrative Hearings, Florida Filed:Jacksonville, Florida Mar. 02, 1998 Number: 98-000984 Latest Update: Apr. 06, 1999

The Issue Whether the Petitioner is eligible for developmental services provided by the Department of Children and Family Services?

Findings Of Fact The Petitioner, Sharon G. Young, is a 23-year-old white female who is currently a patient in a rehabilitation facility for treatment of permanent disabilities suffered when she contracted encephalitis when she was 19 years old. After a series of serious seizures as an infant, Petitioner was identified as having a seizure disorder and mental disabilities. As a result, she was medicated with Phenobarbital and placed in special education programs in the public school system. She was tested in this program periodically. Because Petitioner's current condition prevents testing her now to determine what her condition was prior to contracting encephalitis, the degree of her mental disability must be determined using the tests performed by the school system. Reports of tests performed in 1978, 1983, 1986, and 1989 were introduced. The Department's expert, Filipinas Ripka, conducted the review of these tests' reports to determine whether Petitioner was eligible for services. Ms. Ripka was accepted as an expert in psychological testing. She did not examine the Petitioner and had never tested the Petitioner. Her opinion was based solely upon review of the reports prepared by the school psychologists in the years indicated above. According to her testimony, Ms. Ripka gave different emphasis to the various tests and reached different conclusions regarding Petitioner's condition than the school psychologists did. The initial test in 1978 was conducted when Petitioner was approximately 3 years, 7 months old (40 months). That test report references an earlier evaluation on February 16, 1978, when Petitioner was 33 months old. At 33 months the Petitioner exhibited expressive language development of 12 months, receptive language development of 16-20 months, perceptual performance abilities in the range of 20-25 months, social skills at 30 months, fine motor skills at 18-23 months, cognitive/linguistic/verbal skills at 17-20 months, and gross motor skills at 18-20 months. The school psychologist examined and tested Petitioner, and observed that she was easily distracted and had a short attention span. Assessment of Petitioner was attempted using several different tools. On those tests upon which Petitioner could be scored, she tested in the mild range of retardation with an IQ of 50. She was unable to perform certain of these tests sufficiently for reliable scoring; however, the results of those tests were consistent with the findings that she was mildly mentally retarded, i.e., had an IQ of 50. In 1983, the Petitioner was retested. That report references tests performed in 1981, and their results showed Petitioner had a Verbal IQ of 82, a Performance IQ of 73, and a Full Scale IQ of 76. The examiner found the Petitioner was functioning at the borderline level according to a Wechsler Intelligence Scale score of 70. However, she demonstrated an inability to copy abstract symbols, which placed her six standard deviations below the expectancy of her age group on the Bender Visual Motor Gestalt Test. She was two standard deviations below her expectancy on the VADS score, indicating a significant weakness for processing digits. Her Draw-A-Person Test was interpreted to indicate neurological impairment. The Petitioner was re-tested in 1986, when she was 11 years old, because she was not performing well and was having academic difficulty in her school placement.
Petitioner had scored in the 19th percentile in reading, the 16th percentile in math, and the 23rd percentile in language on the Stanford Achievement Test. Upon testing, Petitioner had a Verbal IQ of 70, a Performance IQ of 68, and a Full Scale IQ of 68 on the Wechsler Intelligence Scale for Children-Revised. Petitioner had a score of 58 on the Peabody Picture Vocabulary Test, or an age equivalence of 6 years, 6 months. Her Bender Visual Motor Gestalt Test showed an age equivalency of 5 years, 8 months, and her error score was more than four standard deviations below the mean age. Her short-term retention was within one standard deviation relative to chronological age. In 1989, Petitioner was tested as part of a triennial evaluation. Petitioner was 13 years old and cooperated with the examiner. On the Wechsler Intelligence Scale, Petitioner received a Verbal IQ of 64, a Performance IQ of 80, and a Full Scale IQ of 70 plus or minus 3. The examiner concluded that there was a 68 percent probability that her true IQ was between 67 and 73. She showed a significant difference between her Performance IQ and Verbal IQ. The examiner found Petitioner functioned in the lower end of the Borderline range of intelligence, and that her strengths were her ability to visually analyze and her fine motor skills. Her lowest scores were in the area of word knowledge. She demonstrated a processing deficit in visual-motor integration. The Respondent is the state agency charged with providing developmental services to eligible persons in Florida. A score of 70 or less places a person at least two standard deviations below the mean score on the Wechsler Intelligence Scale.
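The two-standard-deviation threshold the findings turn on reduces to simple arithmetic. The sketch below is an editorial gloss, not part of the record; it assumes only the Wechsler convention stated above (mean 100, standard deviation 15).

```python
# Minimal arithmetic sketch of the eligibility cutoff, assuming the
# Wechsler convention of mean 100 and standard deviation 15.
WECHSLER_MEAN = 100
WECHSLER_SD = 15

cutoff = WECHSLER_MEAN - 2 * WECHSLER_SD  # two standard deviations below mean
print(cutoff)  # 70

# z-scores for the 1986 and 1989 Full Scale IQs reported above
for full_scale_iq in (68, 70):
    z = (full_scale_iq - WECHSLER_MEAN) / WECHSLER_SD
    print(full_scale_iq, round(z, 2))  # 68 -> -2.13, 70 -> -2.0
```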

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law set forth herein, it is RECOMMENDED: That the Department enter a final order finding that the Petitioner is eligible for developmental services. DONE AND ENTERED this 10th day of August, 1998, in Tallahassee, Leon County, Florida. STEPHEN F. DEAN Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 10th day of August, 1998. COPIES FURNISHED: Robert Bencivenga, Esquire Jacksonville Area Legal Aid, Inc. 126 West Adams Street Jacksonville, Florida 32202 Roger L.D. Williams, Esquire Department of Children and Family Services Post Office Box 2417 Jacksonville, Florida 32231 Gregory D. Venz, Agency Clerk Department of Children and Family Services Building 2, Room 204 1317 Winewood Boulevard Tallahassee, Florida 32399-0700 Richard A. Doran, General Counsel Department of Children and Family Services Building 2, Room 204 1317 Winewood Boulevard Tallahassee, Florida 32399-0700

Florida Laws (2) 120.57, 393.063
# 6
JUVENILE SERVICES PROGRAM, INC. vs DEPARTMENT OF JUVENILE JUSTICE, 96-005982BID (1996)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Dec. 27, 1996 Number: 96-005982BID Latest Update: May 05, 1997

The Issue The issues for determination in this case are: 1) whether the Respondent’s decision to award a contract to operate a juvenile work release halfway house program to the Henry and Rilla White Foundation was clearly erroneous, contrary to competition, arbitrary, or capricious; and 2) whether the award of the contract is void as a matter of law because of procedural violations by the selection committee and the Respondent.

Findings Of Fact Petitioner, JUVENILE SERVICES PROGRAM, INC. (JSP), is a Florida-based private not-for-profit corporation which was founded to serve troubled youths and their families. Respondent, FLORIDA DEPARTMENT OF JUVENILE JUSTICE (DJJ), is the agency of the State of Florida with the statutory authorization for planning, coordinating, and managing programs for the delivery of services within the juvenile justice consortium. Section 20.316, Florida Statutes. RFP #16P05 On September 27, 1996, Respondent DJJ advertised and released a Request For Proposal (RFP) #16P05 to provide a Work Release Halfway House for Delinquent Males in District IX, serving Palm Beach County, Florida. In response to the RFP, four bids were submitted to DJJ by the following parties: the Henry and Rilla White Foundation, Total Recovery, Inc., Psychotherapeutic Services Inc., and Petitioner JSP. The DJJ bid selection committee of evaluators for the RFP consisted of Jack Ahern, Steve Brown, Jaque Layne, Patricia Thomas, and, from the Office of Budget Finance, Fred Michael Mauterer. The contract manager for the RFP was Diane Rosenfelder. On October 28, 1996, each DJJ evaluator was sent a package consisting of a copy of the RFP, which included the evaluation sheet, a copy of each proposal submitted to DJJ, a conflict of interest questionnaire, a certificate of compliance, a description of the proposal selection process, and instructions. Each package sent to the evaluators had a different colored cover sheet which identified the specific evaluator. After completing the evaluations, each evaluator returned the signed conflict of interest forms and certificates of compliance to Diane Rosenfelder. The evaluations were identified by the color of the cover sheets, as well as by the signed conflict of interest forms and certificates of compliance. DJJ initially intended to provide each evaluator with an Award Preference Form, which was to be used in the event the final evaluation scores were very close. The Award Preference Forms, however, were inadvertently omitted from the packages sent to the evaluators. The evaluation process resulted in the Henry and Rilla White Foundation receiving the highest average score of 391.50 points. Petitioner JSP received the second highest average score of 360.50 points. The award of points was determined by each evaluator, as indicated by the evaluator checking the box in Section 5 of the evaluation sheet or by filling in the appropriate point score. The contract manager, Diane Rosenfelder, corrected addition errors on the scoring sheets. The budget part of the evaluation was completed by Fred Michael Mauterer, Senior Management Analyst Supervisor. In accordance with the evaluation scores, DJJ determined that the best response was submitted by the Henry and Rilla White Foundation, which was awarded the contract. On November 8, 1996, Petitioner JSP filed a timely Notice of Protest of the award, which was supplemented on December 9, 1996, with the required posting of a $5,000 bond. Alleged Errors and Discrepancies in the Evaluation Process Petitioner JSP alleges that several errors in the evaluation process require that the contract award to the Henry and Rilla White Foundation be set aside and that the RFP be reissued and rebid. Petitioner first alleges that the bid selection committee failed to follow certain instructions during the evaluation process. The instructions were prepared by the contract manager, Diane Rosenfelder. The instructions were not required by rule or policy of DJJ.
The contract manager considered the instructions advisory in nature. The instructions stated that the members of the bid selection committee should not contact each other with respect to the proposals under evaluation. The evaluators, however, were permitted to contact the contract manager, who would record all questions and answers. There were instances in which the contract manager did not record questions from the evaluators to the contract manager. There is no evidence that the evaluators contacted each other regarding the proposals during the evaluation process. The instructions asked the evaluators to explain high or low scores given to the proposals under consideration. None of the evaluators made specific explanations of high or low scores. The contract manager who prepared the instructions considered this instruction discretionary, and there is no evidence that any score given by an individual evaluator was without basis. The evaluators were instructed to provide page numbers from the proposals used to score each item. None of the evaluators complied with this instruction. As indicated above, however, there is no evidence that the actual scores given by the evaluators were without basis. As set forth above, none of the evaluators received the Award Preference Form. This form was to be used in the case of very close scoring of the proposals. The actual scores from the bid selection committee reflected a clear preference for the proposal submitted by the Henry and Rilla White Foundation. Accordingly, there was no demonstrated need for DJJ to rely upon the Award Preference Forms in making its decision to award the contract. The letter of introduction sent to the bid selection committee members from the contract manager stated that the proposal score sheets, the evaluators' award preferences, and the best interest of the district would be considered in determining the award. The contract manager considered this statement advisory in nature. DJJ has not promulgated specific standards relating to the best interest of District IX; however, the proposal evaluation forms sent to the bid selection committee inherently include criteria setting out standards for the determination of the best proposal for the district. The evidence reflects that one of the evaluators, Patricia Thomas, erroneously checked the box on each proposal which gave each of the proposals fifty points as certified minority enterprises, and erroneously wrote "50" as a point count on one evaluation score sheet. None of the proposals included a copy of the certification for minority enterprise as required by Section 287.0945, Florida Statutes, and the contract manager recognized that the evaluator had made a mistake in this regard. In response to this error, the contract manager consulted her supervisors. Because each proposal was awarded the same points, DJJ did not consider the evaluator's error as prejudicial to any proposal or to the bid selection process, and did not reject the evaluator's scoring of the proposals. There is no showing that Petitioner JSP was prejudiced by DJJ's decision in this regard. The contract manager added signature lines to the last page of the evaluation sheets. Some of the sheets were returned unsigned from the evaluators. There is no DJJ requirement that the evaluation sheets specifically contain the signatures of the evaluators.
The contract manager did not consider the signature page mandatory, and the evaluation proposal score sheets were clearly identified by both color coding and the certificates of conflict signed by the evaluators. There is no evidence that the procedural discrepancies affected the substance of the evaluators' scoring of the proposals, nor did the procedural discrepancies prejudice the evaluators' consideration of Petitioner's proposal.
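The award ultimately turned on straightforward averaging of the evaluators' score sheets. The sketch below is an editorial illustration only: the individual evaluator scores are invented, since only the averages (391.50 and 360.50) appear in the record.

```python
# Minimal sketch of the tally described above. Individual evaluator scores
# are invented for illustration; only the averages appear in the record.
proposals = {
    "Henry and Rilla White Foundation": [400, 385, 390, 395, 387.5],
    "Juvenile Services Program, Inc.":  [365, 355, 360, 362, 360.5],
}

for name, scores in proposals.items():
    average = sum(scores) / len(scores)
    print(f"{name}: {average:.2f}")  # 391.50 and 360.50 with these values
```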

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that the Respondent enter a final order upholding the proposed agency action to award the contract to the Henry and Rilla White Foundation, and dismissing the Petition filed in this case. DONE and ORDERED this 23rd day of April, 1997, in Tallahassee, Florida. RICHARD HIXSON Administrative Law Judge Division of Administrative Hearings DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (904) 488-9675 SUNCOM 278-9675 Fax Filing (904) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 23rd day of April, 1997. COPIES FURNISHED: Dominic E. Amadio, Esquire Republic Bank Building, Suite 305 100 34th Street North St. Petersburg, Florida 33713 Scott C. Wright, Assistant General Counsel Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100 Calvin Ross, Secretary Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100 Janet Ferris, General Counsel Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100

Florida Laws (2) 120.57, 20.316
# 7
GEORGIOS GAITANTZIS vs FLORIDA ENGINEERS MANAGEMENT CORPORATION, 98-004757 (1998)
Division of Administrative Hearings, Florida Filed:Jacksonville, Florida Oct. 26, 1998 Number: 98-004757 Latest Update: Apr. 20, 1999

The Issue Did Petitioner pass the Mechanical Engineers Examination he took on April 24, 1998?

Findings Of Fact On April 24, 1998, Petitioner took the Mechanical Engineers Examination. He received a score of 69 for his effort. A passing score was 70. The Mechanical Engineers Examination was administered under Respondent's auspices. As alluded to in the preliminary statement, Petitioner challenged the score received on problem 146. The maximum score available for that problem was ten points. Petitioner received eight points. In accordance with the National Council of Examiners for Engineering and Surveying Principles and Practice of Engineering Examinations for spring 1998, score conversion table - discipline specific, Petitioner had a raw score of 47, which equated to a converted score of 69, including the eight raw points received for problem 146. In addition, the examination provided a scoring plan for problem 146, which assigns scores in increments of two points from zero to ten. To pass, it would be necessary for Petitioner to receive an incremental increase of two points, raising his score from eight points to ten points. This would give him a raw score of 49 points. According to the score conversion table - discipline specific, that would give Petitioner 71 points. According to the scoring plan for problem 146, to receive the ten points Petitioner would have to demonstrate: Exceptional competence (it is not necessary that the solution to the problem be perfect); generally complete, one math error. Shows in-depth understanding of cooling load calculation psychrometrics. Problem 146 required Petitioner to: "Determine the required cooling coil supply air quantity (cfm) and the conditions (°F db and °F wb) of the air entering and leaving the coil." Petitioner was provided a psychrometric chart to assist in solving problem 146. The examination candidates were also allowed to bring reference sources to the examination to assist in solving the examination problems. Petitioner brought to the examination the Air-Conditioning Systems Design Manual prepared by the ASHRAE 581-RP Project Team, Harold G. Lorsch, Principal Investigator. Petitioner used that manual to determine the wet-bulb temperature of the air entering the coil. In particular, he used an equation from the manual involving air mixtures. For that part of the solution he arrived at a temperature of 65.6°F wb. According to the problem solution by Respondent's affiliate testing agency, reference ASHRAE Fundamentals Chapter 26, the coil entering wet-bulb temperature taken from the psychrometric chart was 66.12°F wb. The scorer, in grading Petitioner's solution for problem 146, placed an "x" by the answer provided, 65.6°F wb, and wrote the words "psychrometric chart." No other entry or comment was made by that scorer in initially reviewing the solution Petitioner provided for that problem. This led to the score of eight. The scoring plan for problem 146 for the April 1998 examination administered by Respondent equates the score of eight as: MORE THAN MINIMUM BUT LESS THAN EXCEPTIONAL COMPETENCE Either a) Provides correct solution to problem with two math errors or incorrect dry-bulb or wet-bulb for coil entering or leaving conditions or minor total cooling load error, or b) Provides correct solutions to items c and d and minor math errors in items a and b of Score 6 below. Petitioner was entitled to review the results of his examination. He exercised that opportunity on September 21, 1998, through a post-examination review session. Petitioner requested and was provided re-scoring of his solution to problem 146.
According to correspondence from the National Council of Examiners for Engineering and Surveying to the Florida Member Board from Patricia M. Simpson, Assistant Supervisor of scoring services, the score did not change through re-scoring. In this instance, the October 14, 1998, correspondence on re-scoring states, in relation to problem 146: Incorrect methodology used in calculating coil entering wet-bulb temperature. Incorrect coil entering wet-bulb temperature provided. No calculation provided for coil leaving temperature conditions. The coil leaving wet-bulb temperature in Respondent's proposed solution was 53.22°F wb, taken from the psychrometric chart. Petitioner's solution for the coil leaving wet-bulb temperature, taken from the psychrometric chart, was 53.3°F wb. At hearing, Respondent did not provide an expert to establish the basis for the point deduction in the original score and the re-scoring of Petitioner's solution for problem 146. Moreover, Respondent did not present expert witnesses to defend the commentary, that is, the preferred written solution in its examination materials. Consequently, Respondent's preferred solution constitutes hearsay about which no facts may be found accepting the validity of Respondent's proposed solution, as opposed to merely reporting that information.1 By contrast, Petitioner provided direct evidence concerning the solution provided for problem 146 in response to the criticisms of his solution, which were unsupported by competent evidence at hearing. More importantly, the criticisms were responded to at hearing by Geoffrey Spencer, P.E., a mechanical engineer licensed to practice in Florida, who was accepted as an expert in that field for purposes of the hearing. As Petitioner explained at hearing, he used the Air-Conditioning Systems Design Manual equation to arrive at the coil entering wet-bulb temperature, which he believed would provide the answer as readily as the use of the psychrometric chart. (Although the psychrometric chart had been provided to Petitioner for solving problem 146, the instructions for that problem did not prohibit the use of the equation or formula.) Petitioner in his testimony pointed out the equivalency of the use of the psychrometric chart and the use of the equation. Petitioner deemed the equation to be more accurate than the psychrometric chart. Petitioner had a concern that if the answer on the coil entering wet-bulb temperature was inaccurate, this would present difficulty in solving the rest of problem 146 because the error would be carried forward. Petitioner pointed out in his testimony that the solution for determining the coil entering wet-bulb temperature was set out in his answer. The answer derived by use of the formula was more time consuming to obtain but less prone to error, according to Petitioner's testimony. Petitioner points out in his testimony that the answer he derived, 65.6°F wb, is not significantly different from Respondent's proposed solution of 66.12°F wb. (The instructions concerning problem 146 did not explain to what fraction of a degree the candidate had to respond in order to get full credit for that portion of the solution to the problem.) Petitioner in his testimony concerning his solution for the coil leaving wet-bulb temperature indicated that the calculation for arriving at that temperature was taken from the psychrometric chart and is sufficiently detailed to be understood.
Further, Petitioner testified that the degree of accuracy with which the answer was given, 53.3°F wb, as opposed to Respondent's proposed solution of 53.22°F wb, is in recognition of the use of the psychrometric chart. Petitioner questions whether Respondent's proposed solution, carried to two decimal places, could be arrived at through the use of the psychrometric chart. In relation to the calculation of the coil entering wet-bulb temperature, Mr. Spencer testified that either the formula from the Air-Conditioning Systems Design Manual or the psychrometric chart could have been used. Moreover, Mr. Spencer stated his opinion that Petitioner's solution for the coil entering wet-bulb temperature, 65.6°F wb, is sufficiently close to Respondent's proposed solution of 66.12°F wb to be acceptable. Mr. Spencer expressed the opinion that Petitioner had correctly used the formula from the manual in solving for the coil entering wet-bulb temperature. Mr. Spencer expressed the opinion that the psychrometric chart is an easier source for obtaining the solution than the formula from the manual. In Mr. Spencer's opinion, use of the formula shows a more basic knowledge of the physics involved than use of the psychrometric chart would demonstrate. In relation to the coil leaving wet-bulb temperature, Mr. Spencer expressed the opinion that Petitioner had adequately explained the manner of deriving the answer. Further, Mr. Spencer expressed the opinion that the answer derived was sufficiently accurate. The testimony of Petitioner and the opinion of Mr. Spencer are unrefuted and accepted.
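The adiabatic-mixing step at the heart of this dispute can be sketched in a few lines. The sketch below is an editorial illustration, not part of the record: the flow rates and state points of problem 146 are not in evidence, so the numbers are invented, and psychrolib stands in for both the psychrometric chart and the manual's equation.

```python
# Hypothetical sketch of adiabatic mixing of two moist-air streams.
# The actual flow rates and state points of problem 146 are not in the
# record, so the values below are invented for illustration.
import psychrolib

psychrolib.SetUnitSystem(psychrolib.IP)  # F, psi, Btu/lb
P_ATM = 14.696  # standard atmospheric pressure, psi

def mix(db1, wb1, cfm1, db2, wb2, cfm2):
    """Adiabatically mix two moist-air streams (flow-weighted averages)."""
    w1 = psychrolib.GetHumRatioFromTWetBulb(db1, wb1, P_ATM)
    w2 = psychrolib.GetHumRatioFromTWetBulb(db2, wb2, P_ATM)
    f = cfm1 / (cfm1 + cfm2)          # flow fraction (mass flow approximated by volume)
    db_mix = f * db1 + (1 - f) * db2  # mixed dry-bulb temperature
    w_mix = f * w1 + (1 - f) * w2     # mixed humidity ratio
    wb_mix = psychrolib.GetTWetBulbFromHumRatio(db_mix, w_mix, P_ATM)
    return db_mix, wb_mix

# e.g., 3,000 cfm of return air at 75 F db / 62.5 F wb mixed with
# 1,000 cfm of outdoor air at 95 F db / 78 F wb (invented values)
db, wb = mix(75.0, 62.5, 3000.0, 95.0, 78.0, 1000.0)
print(f"coil entering: {db:.1f} F db / {wb:.1f} F wb")
```

Whether the mixed wet-bulb is then read from a chart or computed from an equation, the two routes should agree to within chart-reading precision, which is the equivalency Petitioner and Mr. Spencer described.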

Recommendation Upon consideration of the facts found and conclusions of law reached, it is RECOMMENDED: That a final order be entered which finds that Petitioner passed the Florida Board of Professional Engineers April 24, 1998, Mechanical Engineers Examination with a score of 71. DONE AND ENTERED this 22nd day of February, 1999, in Tallahassee, Leon County, Florida. CHARLES C. ADAMS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 22nd day of February, 1999.

Florida Laws (2) 120.569, 120.57
# 8
DON HALL vs DEPARTMENT OF CHILDREN AND FAMILY SERVICES, 99-004530 (1999)
Division of Administrative Hearings, Florida Filed:Jacksonville, Florida Oct. 26, 1999 Number: 99-004530 Latest Update: Sep. 28, 2000

The Issue The issue is whether Petitioner's son is eligible for assistance from the Developmental Services Program.

Findings Of Fact Based upon all of the evidence, the following findings of fact are determined: Background In this proceeding, Petitioner, Donald Hall, Sr., has appealed an eligibility decision of Respondent, Department of Children and Family Services (Department), which denied an application for mental retardation assistance for his son, Donald Hall, Jr. (Don), now almost 21 years of age, under the Developmental Services Program (Program). As a ground, the Department simply stated that the son was "not eligible for assistance." As clarified at hearing, Respondent takes the position that Don does not meet the statutory definition of a retarded person and therefore he does not qualify for assistance. The test for assistance The Program provides services to persons with specific developmental disabilities, such as mental retardation, cerebral palsy, spina bifida, and autism. In order to be eligible for mental retardation assistance, an individual must meet the definition of "retardation," as that term is defined in Section 393.063(44), Florida Statutes (1999). That provision defines the term as meaning "significantly subaverage general intellectual functioning existing concurrently with deficits in adaptive behavior and manifested during the period from conception to age 18." As further defined by the same statute, the term "significantly subaverage general intellectual functioning" means "performance which is two or more standard deviations from the mean score on a standardized intelligence test specified in the rules of the department." In this case, the mean score is 100, and the standard deviation is 15; thus, an individual must have general intellectual functioning at least two standard deviations below 100, or a score of less than 70, in order to qualify under this part of the definition. To determine intellectual functioning, standardized testing is performed; one such test is the Wechsler Intelligence Scale for Children (Wechsler), as revised from time to time, which was administered to Don. "Adaptive behavior" is defined as "the effectiveness or degree with which an individual meets the standards of personal independence and social responsibility expected of his or her age, cultural group, and community." In plainer terms, adaptive behavior means the individual's ability to function in everyday tasks in the world. This includes such things as providing personal care to oneself, expressing oneself, and finding one's way around. This behavior is measured by instruments such as the Vineland Adaptive Behavior Scale (Vineland). Finally, both the subaverage general intellectual functioning and the deficits in adaptive behavior must have manifested and been present before the individual reached the age of 18. In this case, the Department asserts that it is "eighty percent" sure that Don is not mentally retarded. It acknowledges, however, that he does have "significant difficulties in all areas of functioning." More specifically, the Department bases its denial on the fact that Don's 1995 tests indicated that his adaptive behavior was equivalent to that of other children of the same age, and that his intellectual functioning tests, principally the 1990 test and one score in 1995, revealed that he is in the borderline range between low average and mentally retarded. Don's background Don was born on November 5, 1979.
Even while attending an educable mentally handicapped class at Parkwood Heights Elementary School, a public school in Duval County, Florida, Don experienced difficulty in coping with the curriculum. Indeed, after he had already repeated the first and third grades, and was in danger of failing the fourth grade as well, public school officials transferred Don from the public school to Morning Star School (Morning Star), a private school for students with learning disabilities, including those who are mildly mentally handicapped. Later, when teachers at Morning Star expressed concern that Don had "gone as far as they could help him," and he was too old to retain eligibility, Don was referred by a child study team to Alden Road Exceptional Student Center (Alden Road), a public school (grades 6-12) for mentally handicapped students. Due to his present age (almost 21), he has only one year of eligibility left at Alden Road. At the school, Don receives limited academic instruction and has a supervised job. Don became eligible for Social Security death benefits when his natural mother died. Recently, his parents (father and stepmother) made application for those benefits to be converted to greater, more permanent Social Security benefits because of his condition. Their request was quickly approved, and Donald now receives lifetime monthly Social Security benefits. Don's test results for general intellectual functioning On April 24, 1990, when Don was 10 years old, he was given a psychological evaluation, which included the Wechsler test, to produce verbal, performance, and full scale intelligence quotients (IQs). The verbal IQ is a composite score of several subtests that make up the intelligence scale, including verbal reasoning, verbal memory, and verbal expressive skills. The performance score is based on a group of nonverbal tests, such as putting blocks and puzzles together, sequencing pictures, and marking coded symbols in a timed environment. Those results indicated a verbal IQ of 78, a performance IQ of 77, and a full scale IQ of 76. These scores placed him in the "borderline range" of intellectual functioning, somewhere between low average and mentally retarded. The Wechsler test was revised in 1991 to provide a more valid estimate of intellectual functioning compared to the current day population. This resulted in students who retook the test scoring at least 5 points lower than they did on the earlier version of the test, and sometimes even more. Therefore, it is not surprising that Don attained lower scores on subsequent tests. The evidence establishes that a child will typically attain higher IQ scores at an earlier age, and that as he grows older, his scores will "tail off." This is because a child's intellectual skills reach a plateau, and the child is not learning new skills at a higher level as his age increases. Therefore, later test scores are more indicative of Don's intellectual functioning. In 1993, when he was 13 years old, Don was again evaluated by the Duval County School Board and received a verbal IQ of 65, a performance IQ of 54, and a full scale IQ of 56 on the Wechsler test. More than likely for the two reasons given above, these scores were substantially lower than the scores achieved in 1990, and they indicated that Don was "in the range of mild mental retardation" and therefore eligible for services.
In 1995, when Don was 16 years old, he was again given the Wechsler test by a psychologist and was found to have a verbal IQ of 71, a performance IQ of 54, and a full scale IQ below 70. Except for the verbal score, Don's IQ scores placed him in the range of mild mental retardation. On the 1995 verbal IQ score, which is made up of ten subtests, Don had one subtest with a score of 91, which raised his overall verbal IQ score to 71. Without that score, the verbal IQ would have been in the 60s, or in the mildly mentally retarded range. The evidence shows that it is quite common for children with mild to moderate deficiencies to score within the average range on some types of achievement measures. For example, some mildly retarded children will achieve a high level on academic tests, such as in the 80s or 90s, but they have little comprehension as to what those words mean. More than likely, Don fits within this category, and an overall verbal score of less than 70 is more reflective of his intellectual functioning. Based on the 1993 and 1995 tests, Don has general intellectual functioning of at least two standard deviations below 100, and therefore he qualifies for assistance under this part of the test. Adaptive behavior skills As noted above, this category measures Don's ability to deal with everyday tasks. To be eligible for services, an applicant must have deficits in his adaptive behavior which manifested before the age of 18. Presently, and for eight months out of the year, Don works from noon until 8:00 p.m. Monday through Friday at Jacksonville University "in the skullery room and [doing] tables." He relies on community transportation (from door to door) to get to and from work. When not working, he attends Alden Road, where he receives limited academic instruction. According to a Vineland instrument prepared by an Alden Road teacher in December 1995, Don then had an overall adaptive behavior composite of 16 years old, or one roughly equivalent to other children of the same age. More specifically, in terms of communication, he was functioning at the age of 16; in terms of daily living skills, he was performing at greater than the 18-year-old level; and in terms of socialization, he was slightly lower than a 16-year-old. The teacher who prepared the raw data from which the test score was derived was surprised to learn that her data produced a result indicating that Don had adaptive skills equivalent to someone his own age. Based on her actual experience with him in the classroom, she found Don to be "functioning way below" her own son, who was the same age as Don. She further established that he can follow only the most "simple" instructions, and that he will always need someone "looking out for him." This was corroborated by Don's parents and family friends. The Vineland test result also differs markedly from Don's real life experience. Don lives at home with his father and stepmother; he requires "constant supervision all day," even while working; and he is unable to live by himself. He is a "very trusting person," is easily subject to unscrupulous persons who could take advantage of him, and cannot manage his own money. Indeed, his psychologist described him as being "an easy target to be taken advantage of [by others]." Although Don is able to administer to some of his basic personal hygiene needs, he still requires constant reminders to do such things as wash his hair or brush his teeth.
Finally, Don has minimal problem solving skills, and he is easily confused by instructions unless they are "very simple." In short, these are real deficits in adaptive behavior and are sufficient to make Don eligible for Program services.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Department of Children and Family Services enter a final order granting Petitioner's application for Program benefits for Donald Hall, Jr. DONE AND ENTERED this 14th day of July, 2000, in Tallahassee, Leon County, Florida. DONALD R. ALEXANDER Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 14th day of July, 2000. COPIES FURNISHED: Virginia A. Daire, Agency Clerk Department of Children and Family Services Building 2, Room 204B 1317 Winewood Boulevard Tallahassee, Florida 32399-0700 Josefina M. Tomayo, General Counsel Department of Children and Family Services Building 2, Room 204 1317 Winewood Boulevard Tallahassee, Florida 32399-0700 Kathryn L. Sands, Esquire 1830 Atlantic Boulevard Jacksonville, Florida 32207-3404 Roger L. D. Williams, Esquire Department of Children and Family Services Post Office Box 2417 Jacksonville, Florida 32231-0083

Florida Laws (3) 120.569, 120.57, 393.063
