JONATHAN A. BATISTA vs BOARD OF PROFESSIONAL ENGINEERS, 20-003075RX (2020)
Division of Administrative Hearings, Florida. Filed: Tallahassee, Florida, Jul. 10, 2020. Number: 20-003075RX. Latest Update: Sep. 22, 2024.

The Issue

The issue in this case is whether Florida Administrative Code Rule 61G15-21.004(2) is an invalid exercise of delegated legislative authority.

Findings of Fact

The Parties

Petitioner is an applicant for licensure as a professional engineer ("P.E.")2 in Florida, and currently works in the discipline of environmental engineering in Florida. His practice focuses primarily on water-related areas within that discipline. Petitioner is not currently licensed as a P.E. Respondent is a board within the Department of Business and Professional Regulation ("Department"). It is the state agency created pursuant to section 471.007, Florida Statutes, and charged with licensing professional engineers in Florida. Respondent is vested with the authority to adopt rules to implement chapter 471, regarding the regulation of the practice of engineering in Florida, as defined in section 471.005(7). Respondent adopted the Challenged Rule at issue in this proceeding.

2 For purposes of this Final Order, the terms "professional engineer" or "P.E." will be used to refer to persons who are licensed engineers under chapter 471, Florida Statutes.

Statutory and Rule Background

The engineering profession in Florida is regulated pursuant to chapter 471. A person may become licensed as a P.E. in Florida by applying for licensure, fulfilling specified educational and experience requirements, and either being endorsed for licensure as provided in sections 471.015(3) and (5), or passing the required licensure examinations. § 471.015, Fla. Stat. Pursuant to section 471.015, Respondent has adopted Florida Administrative Code Chapter 61G15-20, which codifies, in rule, the requirements for licensure as a P.E. in Florida. An applicant for licensure must be a graduate of a Board-approved engineering program; have the requisite number of years of engineering experience; and have passed the specified licensure exams. Fla. Admin. Code R. 61G15-20.0010.

Section 455.217(1)(d) authorizes Respondent to adopt, by rule, the use of a national professional licensing examination that the Department has certified as meeting the requirements of national examinations and generally accepted testing standards. To implement section 455.217(1), Respondent has adopted rule 61G15-21.001, titled "Examination Designated; General Requirements." This rule requires that, unless an applicant qualifies for licensure by endorsement, he or she must pass the National Council of Examiners for Engineering and Surveying ("NCEES") licensure exam. Part I of the NCEES exam is the Fundamentals of Engineering ("FE") exam, and part II is the Principles and Practice ("PP") exam. Respondent has entered into a contract with NCEES to provide the FE and PP exams in Florida. A person must pass both the FE and PP exams to be licensed as a P.E. in Florida. § 471.015(1), Fla. Stat.

The Challenged Rule states: "[t]he passing grade for Principles and Practice Exam is determined by the National Council of Examiners for Engineering and Surveying, where psychometric statistical methods are used to determine the level of performance that corresponds with minimal competence in the discipline." Fla. Admin. Code R. 61G15-21.004(2). The Challenged Rule is a subsection of rule 61G15-21.004, which is titled "Passing Grade." The Challenged Rule specifically and exclusively addresses the method for determining the passing grade on the PP exam. Sections 455.217(1)(d) and 471.013 are cited as the rulemaking authority for the Challenged Rule, and sections 455.217(1)(d) and 471.015(1) are cited as the law implemented by the Challenged Rule.
The term "engineering," as used in section 471.005(7), includes the term "professional engineering," and defines the types of services and creative work that constitutes "engineering." An "engineer," as defined in section 471.005(5), includes the terms "professional engineer" and "licensed engineer," and means a person who is licensed to engage in the practice of engineering under chapter 471. By contrast, an "engineer intern," as defined in section 471.005(6), means a person who has graduated from a Board-approved engineering curriculum and has passed the FE exam. By definition, these are distinct terms. The term "engineer" is used to describe a person licensed as a P.E. under chapter 471, while the term "engineer intern" is used to described a person who may engage in the kinds of activities described within the term "engineering," as defined in section 471.005(7), but who is not licensed as a P.E. in Florida, and, therefore, is not authorized to hold himself or herself out as a licensed engineer in Florida. The Rule Challenge Petition The Rule Challenge Petition alleges four grounds under section 120.52(8) for invalidating the Challenged Rule. Alleged Invalidity of Challenged Rule under Section 120.52(8)(c) In paragraph 15 of the Rule Challenge Petition, Petitioner asserts that section 455.217(1)(c)—which, at the time the Rule Challenge Petition was filed, was cited as the specific authority for, and law implemented by, the Challenged Rule—did not authorize the Challenged Rule, so that the Challenged Rule enlarged, modified, or contravened the specific provisions of law implemented, pursuant to section 120.52(8)(c). As a result of Petitioner having filed the Rule Challenge Petition, Respondent discovered that it had not updated its citation of the specific authority for, and law implemented by, the Challenged Rule, when section 455.217(1) was amended and renumbered in 1997, so that section 455.217(1)(c) no longer was the correct citation to the law implemented by the Challenged Rule. Respondent requested the Department of State, Administrative Code and Register Section ("DOS"), to make a technical, non-substantive change to the Challenged Rule. As authorized by Florida Administrative Code Rule 1-1.010(10), DOS updated the statutory citation to section 471.217(1)(d), which is the correct citation to the law implemented by the Challenged Rule. This technical change nullifies the alleged invalidity ground set forth in paragraph 15 of the Rule Challenge Petition, and Petitioner concedes this. In paragraph 16 of the Rule Challenge Petition, Petitioner also alleges that the Challenged Rule enlarges, modifies, or contravenes section 455.217(1)(a), because the PP examination does not adequately and reliably measure an applicant's ability to practice the profession regulated by the Department. However, as discussed below, section 455.217(1)(a) is not cited as a specific provision of law implemented by the Challenged Rule, so cannot form the basis of a challenge to the Rule under section 120.52(8)(c). Alleged Invalidity of Challenged Rule under Section 120.52(8)(d) In paragraph 17 of the Rule Challenge Petition, Petitioner alleges that the Challenged Rule is invalid under section 120.52(8)(d) because it is vague, fails to establish adequate standards for agency decisions, or vests unbridled discretion in the agency. 
In support of this alleged invalidity ground, Petitioner asserts that the Challenged Rule is vague because "the level of performance on the PP exam is stated to correspond with minimal competency, yet there are no rules which provide definitive guidance to NCEES on what constitutes the general areas of competency in regards to engineering practice."3 Petitioner also alleges, in paragraph 19 of the Rule Challenge Petition, that the Challenged Rule is invalid under section 120.52(8)(d) because it equates passage of the PP exam with a 30-year practice experience requirement for licensure by endorsement set forth in section 471.015(5)(b). To this point, Petitioner states: "I can't think of anything more arbitrary than the principles and practice exam equating to near[-]retirement level experience."4

3 Section 455.217(1)(b) requires, for each exam developed by the Department or a contracted vendor, that the general areas of competency covered by the exam be specified by rule. The last sentence of that subsection states that the requirements of subsection (b) do not apply to national exams, such as the NCEES PP exam, which are approved and administered pursuant to section 455.217(1)(d). Thus, the law implemented by the Challenged Rule does not require areas of competency to be specified in the Challenged Rule.

4 Because paragraph 19 of the Rule Challenge Petition alleges that the rule is arbitrary, the undersigned considers this paragraph to constitute a challenge to the Challenged Rule under section 120.52(8)(e), rather than under section 120.52(8)(d), as cited in the Rule Challenge Petition, and has addressed this ground in the Conclusions of Law section dealing with that alleged invalidity ground.

Alleged Invalidity of Challenged Rule under Section 120.52(8)(e)

In paragraph 18 of the Rule Challenge Petition, Petitioner alleges that the Challenged Rule is invalid pursuant to section 120.52(8)(e) because it is arbitrary, for several reasons. Paraphrased, these reasons are: passage of the PP exam does not accurately reflect, or equate to, minimal competence in the discipline; the PP Exam does not accurately evaluate an individual's engineering ability level, but instead evaluates an individual's exam performance compared to average group exam performance; the PP exam does not reliably distinguish between minimal competence and incompetence to practice engineering, as evidenced by the fact that engineers who fail the PP exam still competently perform, and, thus, keep their engineering jobs; passing the PP exam, by itself, does not certify an individual to competently perform any engineering service or creative work as defined in section 471.005(7); the PP exam does not reliably determine if an examinee is minimally competent, due to an incorrect reference point; and the PP exam does not reliably distinguish between individuals whose practice of engineering would protect the public health and safety and those whose practice of engineering would constitute a danger to public health and safety.

Alleged Invalidity of Challenged Rule under Section 120.52(8)(f)

In paragraph 20 of the Rule Challenge Petition, Petitioner alleges that the Challenged Rule is invalid pursuant to section 120.52(8)(f) because it imposes regulatory costs on the regulated person, county, or city, which could be reduced by the adoption of less costly alternatives that substantially accomplish the statutory objectives.
To this point, Petitioner proposes a state-conducted investigation of an applicant as a substitute for the PP exam, and contends that "there's a premium associated with a national exam. It's also clear that the regulatory cost imposed on me and all future examinees could be substantially reduced if the Board conducted their exam as an investigation and did not incorporate the NCEES exam." However, as more fully discussed below, this challenge ground is time-barred by section 120.541(1)(g), and, therefore, is not a legally-cognizable basis for invalidating the Challenged Rule in this proceeding.

The Parties' Stipulated Facts

Petitioner is an Engineer Intern, Texas EIT 56990. Pursuant to section 471.015(1), the Florida Engineers Management Corporation "shall issue a license to any applicant who the Board certifies is qualified to practice engineering and who has passed the [FE] exam and the [PP] exam." Both the FE exam and the PP exam are created by NCEES, pursuant to section 455.217(1)(d), which states, in pertinent part: "a board . . . may approve by rule the use of any national examination which the [Department of Business and Professional Regulation] has certified as meeting the requirements of national examinations and generally accepted testing standards pursuant to department rules."

Pursuant to section 471.015(1), Petitioner passed the FE exam on September 3, 2016. Passing the FE is a prerequisite to take the PP exam. Petitioner registered to take the NCEES computer-based PP exam for environmental engineering on April 22, 2020. He paid the $350 exam fee and additional monies for test preparation material, and spent at least 100 hours preparing for the exam. He was notified, on April 6, 2020, by Pearson Vue, the test center company, that the exam was cancelled due to the Covid-19 pandemic. On April 7, 2020, Petitioner registered to take the exam on July 15, 2020, which was the earliest available date for taking the exam in his local area. On April 9, 2020, Petitioner canceled his registration for the July 15, 2020, exam and decided to apply for licensure as a P.E. without passing the PP exam.

Pursuant to section 471.015(2)(a)1., on May 18, 2020, Petitioner filed his application for licensure with the Board. The application provides information stating that Petitioner meets the requirements of section 471.013(1)(a)1., and has at least four years of active engineering experience of a character indicating competence to be in responsible charge. In compliance with the education requirements of section 471.013(1)(a)1., Petitioner earned a Bachelor of Science degree in Chemical Engineering from the University of Florida, which is accredited by ABET. Petitioner provided an official transcript to the Board for verification purposes. In compliance with rule 61G15-20.002, Petitioner states in his application that he has at least four years of acceptable engineering practice, consisting of one year of equivalent experience through his Master of Engineering degree from the University of Texas at Austin, an ABET-accredited institution; and over three years of professional engineering work verified by licensed engineers. According to rule 61G15-20.002, an applicant must list three current personal references who are professional engineers. Thirteen licensed engineers submitted documentation to the Board regarding Petitioner's type of qualifying experience, level of engineering competency, and professional integrity.
All references circled "yes" to the question "would you employ the applicant in a position of trust?" Two additional references in the engineering industry also provided their opinion on Petitioner's integrity and competence. On June 1, 2020, Petitioner received notification from the Board that his Florida 0901 1031-P.E. Endorsement application was incomplete. Petitioner addressed all items listed in a timely fashion to participate in the June 10, 2020, Board meeting. The Board stated during an informal hearing on June 10, 2020, that Petitioner would not be granted P.E. licensure due to not having passed the PP Exam. Petitioner is substantially affected by the Challenged Rule because it disqualifies him from becoming a licensed engineer in Florida without passing the PP exam.

Findings of Fact Based on Evidence Presented at the Final Hearing

A. Findings Regarding the Evidence Presented in Petitioner's Case

Testimony of Witnesses Hoot and Grace

David Hoot and Nigel Grace, both of whom are licensed professional engineers, testified regarding Petitioner's abilities and skills as a practicing engineer. Neither Hoot nor Grace was qualified as an expert in the field of psychometrics or related topics. Therefore, any opinions regarding these subject matters to which they testified at the hearing have been treated as personal opinions, rather than expert opinions.

Hoot characterized Petitioner as a good, diligent young engineer, and described Petitioner's role in various projects on which they worked together. Based on his work with Petitioner, it was Hoot's personal opinion that Petitioner possesses the integrity and competence to work as a licensed engineer who would serve the public health and safety.

Hoot testified that when he took the PP exam approximately 38 years ago, it was a free response exam. He stated that he was "not exactly sure" that a multiple-choice exam captures an examinee's ability to apply reason and judgment, but he acknowledged that he does not know how the PP exam currently is developed. He offered his personal opinion that it was possible for a competent engineer to fail the PP exam. Hoot also offered his personal view that engineering licensure constitutes a standard of care; however, he did not think anything provides a guarantee of competence. It was his personal view that experience tended to make one more competent. He also offered his personal view that as an engineer gains more experience and becomes more specialized, it is understandable that he or she would not score as well as an engineer who is gearing up toward the four-year experience goal which enables them to apply for licensure as a P.E. As Hoot put it, "life happens. You have children involved. You move jobs. I think you . . . maybe have less time to study. . . you get farther away from the study habits of . . . learning to be able to take tests."

Grace, who is employed as a P.E. with Brown and Caldwell ("B&C"), a large U.S. engineering firm, testified regarding Petitioner's work experience while he was employed by B&C. Petitioner's experience included working on drinking water projects, utilities, upgrading process equipment, site evaluations, bench scale testing, and other projects. Based on Grace's work with Petitioner, it was his personal opinion that Petitioner possesses the integrity and competence to work as a licensed engineer who would serve the public health and safety.
Grace took the PP exam approximately 28 years ago, and at the time, a major portion of the exam consisted of long-form written exam questions that provided the opportunity for examinees to provide free response answers and earn partial credit for partially-correct answers. He testified that the exam also had a multiple-choice component. Grace testified that, "based on instinct," he knows well-designed multiple-choice questions can provide the same insight into an examinee's decision-making judgment as long-form questions. Grace's personal view is that passing the PP exam does not guarantee competence, and it is possible for an engineer to be competent in some engineering disciplines but not others. Grace also agreed that it was logical that an examinee with approximately four years' experience would perform better on the PP exam because he or she would have better-honed test-taking skills and be fresher in some areas tested on the exam. By contrast, engineers who have practiced longer have more experience, but often have become specialized and further removed from the test-taking environment. Thus, passing a broad-spectrum exam becomes a bigger hurdle for engineers who have practiced longer.

Petitioner's Testimony

Petitioner testified on his own behalf at the final hearing. As discussed above, Petitioner holds a bachelor of science degree in chemical engineering from an ABET-accredited institution, and holds a master of engineering degree from an ABET-accredited institution. Although Petitioner, through his training as an engineer, is skilled at mathematics and statistics, he is not trained, and does not have any substantial experience, in the field of psychometrics.5

5 As discussed in greater detail below, psychometrics is a specialized field of study that concerns the theory and technique of objective psychological measurement of skills, knowledge, abilities, and educational achievement.

Petitioner acknowledged that he does not have a degree in psychometrics and that he is not trained in, or knowledgeable about, preparing and administering high-stakes professional examinations. At the time he filed the Rule Challenge Petition, Petitioner had no personal knowledge of the work done by psychometricians; did not know what a "cut score" was; and was not familiar with the Modified Angoff Method, item response theory, the specifics of converting raw scores to scaled scores, or any other psychometric tools and methods employed to prepare and score the PP exam. Petitioner acquired some rudimentary knowledge in a few of these areas in preparing for the final hearing in this proceeding.6

Section 90.701(2), Florida Statutes, prohibits a lay witness from testifying as to an opinion regarding a matter involving specialized knowledge.7 Accordingly, it is determined that Petitioner is not competent to provide an expert opinion regarding psychometrics and related areas, such as item response theory. He was not qualified, tendered, or accepted at the final hearing as an expert in psychometrics or any related areas. Because Petitioner was not qualified to testify as an expert witness at the final hearing, Petitioner's testimony regarding psychometrics, item response theory, scaled scores, the Modified Angoff Method, high-stakes professional testing, and all other specialized subject matters, consisted of opinion testimony by a lay witness.
The only instances in which a lay witness may provide opinion testimony are when the lay witness's opinion is based on firsthand knowledge through personal perception.8 As the undersigned explained during the final hearing, rather than excluding Petitioner's lay opinion testimony, she would consider these pertinent evidentiary principles in determining the weight to be afforded Petitioner's lay opinion testimony in this proceeding.

6 Petitioner does not have any special knowledge, formal training, education, or experience in the specialized field of psychometrics. His knowledge about these areas was acquired by reading and study in preparation for the final hearing. Because Petitioner lacks special knowledge, experience, training, and education in psychometrics, he is not qualified to testify as an expert in psychometrics or related topics, such as item response theory. See Chavez v. State, 12 So. 3d 199, 205 (Fla. 2009)(in determining whether a witness is qualified to render an opinion as an expert in a specialized field, the court must determine whether the witness is adequately qualified to render an opinion based on special knowledge, experience, training, or education).

7 Chesser v. State, 30 So. 3d 625 (Fla. 1st DCA 2010)(it is error for a court to accept opinion testimony of a lay witness in a specialized subject matter area).

8 Nat'l Commc'ns. Indus., Inc. v. Tarlini, 367 So. 2d 670, 671 (Fla. 1st DCA 1979)(lay witness testimony regarding a specialized subject matter was not admissible into evidence because the testimony was not regarding a subject matter about which the witness could testify based on common knowledge or his personal perception.)

Applying these evidentiary principles, it is determined that Petitioner's testimony regarding psychometrics and related topics, such as item response theory, classical test theory, and high-stakes test reliability and validity, concerned specialized subject matters not within the realm of common knowledge or based on Petitioner's personal perception. Rather, such specialized subject matters required expert witness testimony, pursuant to section 90.702, and as addressed above, Petitioner was not shown to be an expert in any of these specialized subject matters. Because Petitioner's testimony constituted the type of opinion testimony that is not permissible by a lay witness, pursuant to section 90.701, such testimony is not afforded weight in this proceeding.

Petitioner testified that "minimal competence," which is the standard measured on the PP Exam, equates to "competence" as defined in the dictionary—that is, the "sufficient ability for a specific need or requirement." However, this position ignores that, for purposes of the Challenged Rule, "minimal competence" is a term of art specifically used, in the psychometric measurement context, to describe the level of competence that corresponds to a passing score on the PP exam. As more fully discussed below, the PP exam is developed and scored using psychometric tools and methods.

Petitioner contends that engineering experience is, by itself, a reliable measure of competence, so it is unnecessary to also pass the PP exam. Petitioner testified "the application process and [PP] exam have two entirely different methods to identify the same result: whether an engineer in training is competent enough to become a licensed engineer. It is not logical for two checks of competence to come up with different results. There should be consistency."
In support of this position, Petitioner relies on section 471.015(2)(a), which requires "at least 4 years of active engineering experience of a character indicating competence to be in 'responsible charge' of engineering." § 471.015(2)(a), Fla. Stat. "Responsible charge" is defined in rule 61G15-18.011(1) as the degree of control an engineer is required to maintain over engineering decisions made personally or by others over which the engineer exercises supervisory direction and control authority. An engineer in responsible charge is the "engineer of record," as defined in rule 61G15-30.002(1). Rule 61G15-30.002(1) defines "engineer of record" as a Florida professional engineer who is in responsible charge. Thus, an engineer who is qualified to be in responsible charge for purposes of section 471.015(2)(a) must, in addition to having the minimum statutory experience, be a licensed P.E. This means that he or she necessarily must have passed the PP examination. These statutory and rule provisions collectively reinforce the point that for an engineer to demonstrate competence for purposes of holding himself or herself out as an "engineer," as defined in section 471.005(7), he or she must satisfy all three requirements of section 471.015(2)(a)—i.e., education, experience, and passing the licensing exam.9

9 The requirement to satisfy these three criteria, including the PP exam, is codified in section 471.015(2)(a). Eliminating the exam requirement and relying strictly on education and/or experience for licensure would require the Florida Legislature to amend this statute to eliminate the exam requirement. The undersigned is not authorized by statute or the Florida Constitution to eliminate the PP exam requirement for licensure under chapter 471.

Petitioner echoed the testimony of Hoot and Grace that licensure is not a guarantee of competence, and that passing the PP exam does not guarantee minimal competence. To this point, he testified that he does not believe that the PP exam adequately and reliably measures an applicant's ability to practice engineering, and that experience is a better indicator of competence than passing the exam. By way of example, Petitioner described his own experience10—which he characterized as "directly matching" the activities in which a licensed engineer engages—and compared that experience to measuring competence by an exam, which Petitioner characterized as "attempt[ing] to indirectly measure my ability as an engineer." Based on his personal experience, Petitioner contends that experience better demonstrates competence to be licensed as a P.E.; that passing the PP exam does not indicate minimal competence to practice engineering; and that failing the PP exam does not mean that the examinee is not minimally competent. He further testified that examinees who fail the PP exam likely are minimally competent, since the engineering jobs they hold when applying for licensure likely would require that they be minimally competent in order to have been hired.11

10 Petitioner's experience, set forth in his P.E. licensure application, was verified by his supervising engineers.

11 Petitioner appears to conflate being determined not "minimally competent" for purposes of passing the PP exam, with "incompetence," which is defined in Florida Administrative Code Rule 61G15-19.001(5) as the "physical or mental incapacity or inability of a professional engineer to perform the duties normally required of the professional engineer."
Part of this confusion may be due, in part, to Respondent's response to one of Petitioner's interrogatories asking for a definition of "minimal competence." Rather than directly answering the interrogatory, Respondent referred Petitioner to the definition of "incompetence" for purposes of imposing discipline under Respondent's disciplinary rules—thus causing Petitioner to understandably assume that failing to demonstrate minimal competence through passing the PP exam equates to "incompetence," as defined in rule 61G15-19.001(5). However, the fact that Petitioner has not demonstrated "minimal competence" on the PP exam does not mean that he is incompetent; it simply means that he has not yet passed the PP exam for licensure as a P.E. in Florida. To this point, if failing to demonstrate "minimal competence" by passing the PP exam equated to incompetence, every person who performs engineering work in Florida but has not passed the PP exam would be "incompetent," and, thus, potentially subject to disciplinary action.

To further illustrate this point, Petitioner noted that the data regarding passage rate of the PP exam shows that examinees having zero years of experience are almost twice as likely to pass the PP exam as examinees having 11 or more years of experience. However, as Hoot and Grace explained, and as further discussed below, engineers having more than four to five years of experience begin to specialize in narrower fields and "life happens," in that personal and professional circumstances render it more difficult to prepare for and take a high-stakes test.

Petitioner also disputed the accuracy of the PP exam preparation and scoring process. In particular, he took issue with the "model law engineer" standard to which the exam is designed. As discussed more extensively below, this standard equates to the competence level of an engineer having four years of engineering experience and who is capable of practicing engineering in a manner that protects the public health and safety. In particular, Petitioner contends that designing the PP exam to the "model law engineer" standard is unfair to anyone taking the exam who does not have exactly four years of engineering experience. Notably, however, section 471.015(1), which is the statute implemented by the Challenged Rule, establishes four years as the engineering experience required for licensure as a P.E. Thus, the "model law engineer" standard is rationally related to the statutory minimum experience level for purposes of demonstrating minimum competency to be licensed.

Petitioner also contends that the subject matters tested on the PP exam are unfairly broad, so that engineers who specialize in a particular area within an engineering discipline—such as specializing in water-related areas in environmental engineering—are disadvantaged by being required to take an exam that covers a broad range of areas beyond his or her area of specialty. Petitioner further contends that it is irrational to test an examinee on particular areas that are irrelevant to his or her work and/or desired career path. However, the PP exam for a particular discipline is specifically designed to ensure that a licensed P.E. is competent to practice over a range of specific areas encompassed within that particular discipline.
This is because once a person becomes a licensed P.E., he or she may practice engineering within any discipline or specific area within that particular discipline, subject to the professional and ethical requirements to limit practice to the disciplines and areas in which the engineer is actually competent. Thus, the breadth of the PP exam is designed to help ensure minimal competence to practice engineering in a manner that protects the public health and safety.

Petitioner also contends that because the PP exam for some engineering disciplines tests a broader range of areas than the PP exam may test for other disciplines, the exam inconsistently measures minimal competency across the range of engineering disciplines. However, as discussed in detail below, the subject matters tested on the PP exam for a given discipline are chosen by subject matter experts who are licensed engineers practicing in that particular engineering discipline, and are deemed, by those subject matter experts, to be most important to test for purposes of measuring competency in that discipline. Thus, while the number of discrete subject matters tested on the PP exams may differ across the various engineering disciplines, this difference is, factually and logically, a function of expert consensus regarding which subject matters need to be tested to demonstrate minimal competence.

Petitioner also contends that the Challenged Rule is vague because it does not specifically identify the disciplines, and the areas within each discipline, that are tested on the PP exam. Respondent has contracted with NCEES to be responsible for preparing, administering, and scoring the PP exams, pursuant to section 455.217(1)(d). NCEES conducts a methodical process, discussed in detail below, to determine the specific disciplines for which to develop a PP exam and the areas to be covered on the PP exam for a discipline. Exam specifications are then developed by subject matter experts within that discipline, and are published by NCEES. These specifications inform prospective examinees regarding the particular areas that will be tested on the PP exam for the discipline, and the number of questions for each specific area that will appear on the exam. Thus, prospective examinees are not left to wonder or guess about which disciplines will be tested; the areas within each discipline that will be tested; or the relative weight that will be assigned to each area tested.12

Although the Challenged Rule does not identify the specific disciplines tested on the PP exam, rule 61G15-21.001(1)(b)—which actually adopts the PP exam as an engineering licensure exam in Florida—states that the PP exam "is given by discipline." Therefore, even if section 455.217(1)(d) required the specific areas of competency to be identified by rule, such areas would have been identified in rule 61G15-21.001(1), rather than in the Challenged Rule.13

12 See Cole Vision v. Dep't of Bus. and Prof'l Reg., 688 So. 2d 404, 410 (Fla. 1st DCA 1997)(a rule is impermissibly vague if it is drafted in terms so vague that men of common intelligence must necessarily guess at its meaning or application).

13 Neither section 455.217(1)(d) nor section 471.015(1) specifically authorizes or requires Respondent to adopt rules identifying the general areas of competency tested on the PP exam. By contrast, exams developed by the agency pursuant to section 455.217(1)(b) must identify, by rule, the general areas of competency to be tested.
Had the Legislature intended for exams authorized under section 455.217(1)(d)—of which the PP exam is an example—to adhere to the same requirement, the statute would have so stated. See Pro-Art Dental Lab, Inc. v. V-Strategic Grp., LLC, 986 So. 2d 1244, 1258 (Fla. 2008)(the specific mention of one thing in a statute implies the exclusion of another). Furthermore, section 120.54(1)(g) expressly requires a rule to address only one subject. Thus, if the Challenged Rule also addressed the areas of competency to be tested on the PP exam, it would violate section 120.54(1)(g).

Petitioner also asserted, at the final hearing, that Respondent did not certify the PP exam as meeting the requirements of national examinations and generally accepted testing standards pursuant to department rules, as required by section 455.217(1)(d). However, Petitioner did not raise this alleged invalidity basis in the Rule Challenge Petition, so he is foreclosed from raising and litigating it at the final hearing. See § 120.56(1)(b), Fla. Stat.

Petitioner also testified that, in transitioning from paper-and-pencil PP exams to computer-based exams, NCEES is relying on two different theories—classical test theory and item response theory—and that this reliance does not comport with generally accepted testing standards. However, as discussed above, Petitioner was not qualified as an expert in the specialized area of high-stakes examination preparation and scoring; thus, his testimony constitutes lay opinion regarding this specialized subject matter. He did not present any competent substantial evidence to support his contention that the PP exam does not meet generally accepted testing standards.14

14 Because Petitioner was not qualified, tendered, or accepted as an expert in these specialized subject matters, and because his testimony on these matters consists of inadmissible lay opinion testimony, this testimony has not been afforded weight. §§ 90.701 and 90.702, Fla. Stat.

Petitioner also testified that item response theory, which is a psychometric tool used in developing and scoring the PP exam, is an invalid means of determining the competence of an engineer. To this point, Petitioner testified that the "model law engineer" is not a real person, but is instead an imaginary person created by subject matter experts to define what a minimally competent engineer should know. Thus, according to Petitioner, the model law engineer standard is the wrong reference point for determining minimal competency to practice engineering.15

Petitioner also testified that the PP exam does not accurately measure ability, which is a latent trait for which an arbitrary measurement scale must be created. He testified that the model law engineer standard is the midpoint of this scale, and that the purpose of the scale is to determine whether examinees fall above or below that midpoint.16 He further contended that the PP exam does not accurately measure ability, because performance on the exam may be influenced by extraneous variables, such as test anxiety.

Petitioner also testified regarding item response theory, which, as previously noted, is a psychometric tool used in developing and scoring high-stakes exams—a subject about which Petitioner had no training or knowledge until he prepared for the final hearing in this proceeding.
Specifically, Petitioner testified that the item characteristic curve is the basic building block of item response theory, and that there are two technical properties of an item characteristic curve: difficulty of the item, and the ability of the item to discriminate between examinees' abilities. Petitioner testified that another basic principle of item response theory is that the examinee's ability is a variable with respect to the items used to determine it. According to Petitioner, this principle rests on two conditions: that all items measure the same underlying latent trait, and the values of all item parameters are in a common metric. According to Petitioner, this principle reflects that the item characteristic curve spans the entire ability scale; thus, the practical implication is that a test located anywhere along the ability scale can be used to estimate an examinee's ability, such that an examinee could take a test that is easy or hard, and on average, would score at the same estimated ability level. Petitioner testified that this stands in contrast to classical test theory, which he contends is a better discriminator of examinee ability.17

15 Petitioner's testimony on this point was based on excerpts from a book titled Item Response Theory and a book titled The End of Average. The Item Response Theory book is a treatise on psychometrics, a highly specialized field about which Petitioner was not qualified to testify as an expert, and which is not susceptible to lay witness opinion testimony. Thus, Petitioner's testimony on these points is not assigned weight. See §§ 90.701 and 90.702, Fla. Stat. Additionally, excerpts from The End of Average were determined irrelevant, so were not admitted into evidence.

16 As support for this testimony, Petitioner selectively cited and quoted the deposition testimony of Timothy Miller, Respondent's expert on the development and scoring of NCEES's psychometric-based PP exams. The specific context of Miller's deposition testimony was that when an exam item is overexposed, it is subject to drift, which means that the percentage of correct answers for the item increases to the point that the item no longer is a good discriminator. As further discussed below, Petitioner's testimony on this point was directly and persuasively countered by Miller's expert testimony regarding scaled scores and setting the passing score for the PP exam.

Petitioner also testified that the psychometric methods used to develop and score the computer-based PP exams are flawed because "difficulty" is subjective and entirely dependent on the individuals developing the PP exam. Thus, according to Petitioner, in scoring a computer-based multiple-choice PP exam, it is impossible to know whether a particular examinee got the answer right due to a reasonable approach in answering the question, or by guessing. Petitioner contends that for this reason, multiple-choice test questions developed using item response theory are not good discriminators of examinees' ability; thus, even if an examinee does not correctly answer enough questions to pass the exam, that does not mean that the examinee is not knowledgeable in that area.18

Petitioner further testified that because difficulty is a subjective parameter, different forms of the PP exam inherently have different levels of difficulty. Thus, according to Petitioner, it is a matter of luck whether an examinee takes a more difficult form or an easier form of the exam.
Further to this point, Petitioner testified that because an examinee does not take multiple forms of the exam, but instead takes only one form, the determination of the examinee's ability is solely dependent on a subjective parameter—i.e., the difficulty of the test questions as determined by subject matter experts. Petitioner contends that, as a result, the PP exam does not accurately measure an examinee's ability, and, therefore, is not a valid exam.19

17 Refer to note 15, supra.

18 Refer to note 15, supra.

19 Refer to note 15, supra.

Petitioner also testified that because statistical indices of reliability and validity are not attributes of an exam, a researcher may select what seems to be an appropriate test for his or her purposes, when, in fact, the selected test does not have any level of reliability or validity. Thus, Petitioner testified, reliability and validity are values that reside in test scoring, not in the test itself. Petitioner testified that validation, in statistics, is the process of accumulating evidence that supports the appropriateness of the inferences that are made of student responses for assessment uses. He testified that validity refers to the degree to which the evidence indicates these interpretations are correct and the manner in which the interpretations are used is appropriate.20

Petitioner testified regarding three types of validity evidence: content, construct, and criterion evidence.21 Specifically, Petitioner testified that content evidence refers to the extent to which an examinee's responses to a given assessment reflect the examinee's knowledge of the content being tested; thus, to the extent an exam inadvertently measures a parameter that is not related to the examinee's knowledge of the content being tested, it is invalid.

20 Petitioner's testimony relied on, or was paraphrased from, a document titled The Scoring Rubric Development. Again, because this topic and document address a matter within the specialized fields of psychometrics, high-stakes testing, and test-scoring statistics, which are areas in which Petitioner was not qualified as an expert, and which are not susceptible to lay opinion testimony, pursuant to sections 90.701 and 90.702, Petitioner's testimony relying on this document, including his testimony regarding content, construct, and criterion-related evidence, is not afforded weight.

21 Refer to note 15, supra. This determination regarding the weight afforded Petitioner's testimony applies to paragraphs 94 through 100 herein.

According to Petitioner, the content-related evidence for the PP exam for each discipline is inconsistent, so that the PP exam for a given discipline does not accurately measure minimal competence for that discipline. Petitioner also testified that the weighting of different topics on the PP exam necessarily creates an advantage for engineers who work in areas more heavily weighted on the exam, while creating a disadvantage for engineers who work in areas that are less heavily weighted on the exam. Petitioner also testified that to accurately determine minimal competence in all engineers, the model law engineer standard should be keyed to, and the content tested on the exam should be directed toward, all engineers, including those having more than four years of experience.

Petitioner also testified that construct-related evidence consists of external benchmarks, such as results and explanations, and of internal evidence of psychological processes, such as reasoning.
Petitioner testified that because multiple-choice exams do not provide evidence of an individual's reasoning process, they do not generate construct-related evidence for purposes of determining exam validity. Petitioner testified that free-response paper-and-pencil exams provide construct-related evidence, so are better indicators of an examinee's knowledge.

Petitioner also testified regarding criterion-related evidence, which relates to the extent to which the results of an assessment, such as the PP exam, correlate with a current or future event. By way of illustration, Petitioner testified that criterion-related evidence considers the extent to which a student's performance may be generalized to other relevant areas. Petitioner testified that an examinee's performance on the PP exam is not generalizable to other relevant activities, so it is impossible to determine whether the exam actually corresponds to minimal competence in the workplace.

In sum, Petitioner testified that the PP exam does not meet content, construct, or criterion-based evidence for purposes of determining whether it is a valid exam. Thus, Petitioner reasons, it is logical to conclude that because the PP exam is not a good discriminator between minimally competent and incompetent engineers, it does not reliably and adequately measure competence.

Petitioner also testified that because passing the PP exam is only one component of licensure, it fails to meet criterion-based validity, in that the exam, by itself, does not certify a passing examinee to practice as a P.E. As Petitioner put it, "you're just passing the exam as part of the requirement for licensure." Petitioner reasoned that if passing the PP exam corresponds to minimal competence, the experience and education requirements of section 471.015(2)(a) are redundant. Also to this point, Petitioner testified that the preapproval process to take the PP exam is directly related to an examinee's actual work experience as an engineer, while taking the exam merely entails answering questions about engineering work. Thus, Petitioner contends, a competent engineer, as shown through Respondent's preapproval process, may nonetheless fail the exam. Petitioner asserts that this further shows that the exam does not accurately measure minimal competence.

Petitioner also testified that, in his view, delaying the licensure of potentially competent engineers because the exam was postponed due to the Covid-19 pandemic does not serve the interest of public health and safety. To that point, he testified that the inability to obtain a variance, which would relieve examinees from having to take and pass the exam under such circumstances, renders the Challenged Rule arbitrary.22

22 Petitioner's point regarding inability to obtain a variance or waiver is addressed in the Conclusions of Law.

Petitioner also contended that passing the PP exam should not be required, because other engineering professional associations—specifically, the European Federation of National Engineering Associations ("FEANI"), which represents engineers in European countries—allow licensure through education and experience requirements, without requiring a professional exam to be taken and passed. However, because section 471.015, which governs the licensure of engineers in the state of Florida, requires a professional licensure examination to be taken and passed as part of the P.E.
licensure requirements, Petitioner's testimony and argument regarding FEANI's practices and requirements take issue with the statute, rather than the Challenged Rule, and, thus, are irrelevant.

Petitioner also contends that the examination fee for the PP exam is arbitrarily set, rendering the Challenged Rule arbitrary. However, as discussed above, the Challenged Rule only addresses determining the passing grade for the PP exam using psychometric methods. The Challenged Rule has nothing to do with establishing or setting an examination fee. Thus, this challenge ground has no basis in fact or law.23

23 Additionally, this challenge ground was not raised in the Rule Challenge Petition, so is not at issue in this proceeding. See § 120.56(1)(b), Fla. Stat.

C. Findings Regarding the Evidence Presented in Respondent's Case

Respondent's Expert Witnesses

Respondent presented the testimony of Timothy Miller, P.E., who serves as Director of Examination Services for NCEES. Miller has held this position for approximately 15 years. His job-related activities and responsibilities include directing exam development, publication, scoring, and fulfillment of the licensing exams for engineers and surveyors; coordinating exam development committees consisting of over 800 volunteer subject matter experts who work on developing each NCEES exam; overseeing the exam development process and providing advice and guidance regarding engineering exam development, administration, production, scoring, analysis, and reporting; serving as a testing process consultant to exam development committees; and other exam-development and administration-related matters.

Before Miller was promoted to his current position, he served as an examination development engineer for NCEES. In that position, Miller was responsible for planning and coordinating engineering exam development, production, administration, scoring analysis, and reporting for certain assigned examinations; serving as a testing consultant working with engineering exam development committee chairs regarding quality and number of exam development volunteers; and overseeing development and administration of the licensing exams in the specific fields of environmental controls systems, metallurgical engineering, and mechanical engineering. Through his experience in these positions with NCEES, Miller is an expert in professional examination development and scoring, particularly with respect to the development and scoring of the NCEES FE and PP examinations. Before being employed with NCEES, Miller practiced civil and structural engineering with several private-sector engineering firms. He has been a professional engineer since 1984, and is licensed as a P.E. in South Carolina, North Carolina, Maryland, Delaware, Pennsylvania, and New Jersey.

Respondent also presented the testimony of Dr. Michelle Rodenberry, P.E., an associate dean and associate professor at the Florida A&M University–Florida State University College of Engineering. Her engineering expertise is in the field of structural engineering—specifically, bridge engineering. Rodenberry was appointed to the Board in 2012, and she served as a Board member until 2018. She is now an emeritus Board member. While on the Board, she served as chair of the education committee, and was involved in reviewing applications for licensure as a P.E. in Florida.
Development, Scoring, and Validation of the PP Exam

The NCEES engineering exams are national licensing exams that are recognized by every engineering licensing entity in each of the U.S. states, as well as by the engineering licensing entities in Washington, D.C.; Puerto Rico; the U.S. Virgin Islands; and the other U.S. territories and protectorates. There are approximately 26 different engineering disciplines, each of which is tested by a separate PP exam specific to that discipline.

In the 1990s, NCEES decided to transition from subjectively-graded pencil-and-paper examinations to an objectively-graded computer-based multiple-choice exam format. Currently, approximately one-third of the PP exams, including environmental engineering, have been converted to a computer-based format, and all but one of the exams in the other disciplines is in the process of being converted. The reason NCEES is transitioning the PP exam from a pencil-and-paper format to a computer-based format consisting of multiple-choice questions is to help eliminate subjectivity in grading, so that the exam papers are consistently graded across groups of examinees. Additionally, a computer-based format consisting of objective multiple-choice questions allows the difficulty of the exam to be psychometrically evaluated for purposes of determining the passing score for a particular administration of the exam. To that point, because computer-based multiple-choice exams are objectively scored, exams offered at different times during the year are able to be compared, or equated, for purposes of setting the passing grade for a particular exam administration.24

24 As Miller explained, "[i]f they were different on a difficulty level, the harder exam, the standard would actually be lowered so that it would be fair across administration so everybody was treated consistently. Or if my exam was less difficult, the standard would be raised. I would have to get more questions right."

Respondent entered into a contract with NCEES in 2009, pursuant to which NCEES provides the FE and PP exams for engineer licensure in Florida. In 2013, the contract was amended to allow NCEES to provide the exams by computer-based testing, using Pearson Vue as its exam administering entity. The FE and PP exams are administered by Pearson Vue at its testing centers.

NCEES develops model laws and rules that represent best practices with respect to state licensing of engineers. The aim of these model laws and rules is to achieve uniformity and consistency throughout the states and the U.S. territories and protectorates in the licensure of professional engineers. A significant benefit of such consistency and uniformity is the resulting "mobility" for licensed professional engineers—that is, the ability to more easily become licensed to practice engineering in multiple states.

The NCEES model laws and rules establish the "model law engineer," which defines and constitutes the standard for minimal competence in a specific engineering discipline for purposes of being licensed as a P.E. in that discipline. The model laws and rules define the "model law engineer" as a person who holds a degree from an engineering educational program accredited by ABET, has four years of active engineering practice experience, and passes the FE and PP exams. The model law engineer standard equates to the competence level of an engineer having four years of engineering experience who is capable of practicing engineering in a manner that protects the public health and safety.
This constitutes the minimum competence level that an applicant must demonstrate for purposes of being licensed as a P.E.25 in the 50 states and the U.S. territories and protectorates. Thus, the NCEES PP exam is constructed to test engineering ability keyed to the model law engineer standard. That is, the PP exam is designed to determine the ability level of an applicant for P.E. licensure for purposes of comparing that ability level to that of an engineer having four years' experience who is able to practice engineering in a manner that protects the public health and safety.

25 Refer to note 12, supra. The term "minimal competence," as used in the Challenged Rule, is specifically keyed to the "model law engineer" standard for purposes of being licensed as a P.E. It is not meant to indicate or imply that an engineer who does not take or pass the PP exam is per se incompetent, such that he or she is not competent to engage in work constituting engineering, as defined in section 471.005(7).

Examinees having four years of engineering experience after graduation have the highest pass rates on the PP exams. Pass rates for examinees with more or less than four years of experience are lower, typically in proportion to the length of time before or after the four-year experience mark when they take the PP exam. Miller explained, credibly and persuasively, that the reason for the drop-off in PP exam performance after the four-year mark is that "life happens." Engineers gain more experience, and many become specialized in a relatively narrow niche, or move into managerial, non-technical positions. Additionally, because the PP exam does, in part, test subjects that one learns in college, the longer an examinee is out of college, the less subject matter recall in certain areas he or she may have.

"Psychometrics" is the specialized field of study concerned with the theory and technique of psychological measurement. Specifically, psychometrics entails the objective measurement of skills and knowledge, abilities, and educational achievement. Among other specialized areas of practice, psychometricians focus on the construction and validation of assessment instruments, and theories, such as item response theory, that relate to psychological measurement. Psychometricians typically have graduate training and possess specialized qualifications that enable them to engage in objective psychological measurement.

PP exams are designed to determine minimal competence in a specific engineering discipline. "Minimal competence" is the minimal amount of knowledge required to practice in that particular engineering discipline in order to protect the public health and safety. For any specific engineering discipline for which it has been determined that a PP exam should be given,26 there is an approximately three-year due diligence period in which subject matter experts in that discipline work to determine the topics that should be tested on the exam. The PP exam for each specific discipline is developed by subject matter experts, who volunteer and meet on a monthly basis to develop, review, and evaluate the questions for the PP exam for that specific discipline.27 The process of determining the specific topics to be tested on a PP exam is termed the Professional Activities and Knowledge Study ("PAKS") process.
As part of the PAKS process, a consulting psychometrician28 employed by Pearson Vue; 15 to 20 engineers who are licensed in another engineering discipline; and subject matter experts who may teach a particular engineering discipline for which the PP exam is being developed, work together to develop consensus regarding the specific topics that engineers having four years of experience practicing in that discipline need to know in order to safely practice engineering in a manner that protects the public health and safety.29 26 For a PP exam to be developed for a new engineering discipline, at least ten NCEES- member state engineering boards must request that such exam be developed, and at least one ABET-accredited program in that specific discipline must exist. 27 Over the years of development and administration of the PP exam, hundreds of licensed engineers have provided input regarding the topics that should be, and are, tested in each discipline and the relative weight given to each topic on the PP exam. 28 Pearson Vue's psychometricians who develop, score, and evaluate NCEES's exams have Ph.D. degrees in psychometrics or statistical analysis. 29 Subject matter experts selected to develop the PP exam questions are chosen based on consideration of the type of practice, such as governmental and private practice; gender; ethnicity; length of time of licensure as a P.E.; and geographic considerations. All subject matter experts must be licensed as a P.E. by a state engineering licensure board in order to The consulting psychometrician builds a questionnaire that lists the specific topics identified by the PAKS committee, and distributes an online survey to engineers who practice in the discipline for which the PP exam is being developed. The survey seeks input regarding the relative importance of each specific topic for purposes of testing to demonstrate minimal competence in the discipline. Based on the survey responses from engineers practicing in the discipline, exam specifications are developed. The exam specifications identify each specific topic to be tested on the PP exam, and the number or percentage of exam questions that will address each specific topic within that discipline. The exam specifications must be approved by an oversight committee. Once the exam specifications have been approved, the subject matter experts for that specific engineering discipline for which the PP exam is being developed prepare the PP exam questions—also termed "items"—and review and evaluate them for clarity, demographic neutrality, and other parameters, so that the items will reliably and validly test engineering ability. In computer-based multiple-choice PP exams, the questions are prepared such that for each question, there is only one correct answer and three other plausible, but incorrect, alternative choices. The individual exam questions are reviewed numerous times by the subject matter experts before they are moved into an exam question bank for use on the PP exam. Once the exam questions have been developed and banked for use on a PP exam, a standard-setting committee, consisting of ten to 15 licensed engineers having diverse backgrounds, experience, and demographic features, reviews the exam to determine the minimum passing score—or "cut score"—on the exam. The cut score equates to the ability level of an engineer serve in this capacity. As noted above, over the years of development and administration of the PP exam, hundreds of engineers have provided input in developing each PP exam. 
having four years of experience who is minimally competent to practice engineering at a level that protects the public health and safety. This method of using subject matter experts to examine the content of each exam question and predict how many minimally-qualified examinees would answer each question correctly is termed the "Modified Angoff Method." The standard-setting committee then takes the exam, and the psychometrician analyzes the data from the committee's exam sitting. Using this data and analysis, the standard-setting committee then reviews, and reaches consensus, regarding each question, for purposes of determining the proportion of minimally competent engineers who would answer that specific question correctly. Based on the information generated by this process, the psychometrician develops the "panel recommended passing score," with a statistical margin of error. The psychometrician presents this recommended passing score to a committee of five persons, consisting of two state licensing board members and three subject matter experts who observed the exam development process. Based on the psychometrician's recommendation, the committee makes the final decision regarding the minimum passing score for the exam. Each PP exam question is developed and evaluated using the process described above, and is placed in bank for use on a PP exam. The psychometrician uses item response theory to "calibrate"—i.e., determine the relative difficulty level of—each exam question.30 An exam question is not banked for use on future sittings of the exam unless it has had at least 200 responses on a previous exam, so that statistics for each item's performance can be generated for purposes of item calibration. 30 Item response theory is one of many psychometric methods, or tools, used to weight exam questions for purposes of creating different forms of exams having the same level of difficulty. Depending on the specific discipline, a question bank for a PP exam may consist of many thousands of questions.31 Using item response theory, the psychometrician converts the passing score to create a scale from -5 to +5, which will equate to the examinee's ability level as measured by the exam. Once the passing score for the PP exam has been determined, different PP exam "forms" are created for administration in different exam sittings. Exam "forms" are essentially different versions of the PP exam that consist of different individual questions of the same difficulty level, as determined using item response theory, for each specific topic on the exam. Thus, if a PP exam was administered, for example, in April and October, the different exam administrations would consist of different forms—meaning that the exam would consist of different questions, but the questions would be of the same difficulty level for a specific topic tested on the exam.32 Additionally, because exam item difficulty has been calibrated using item response theory, different forms of a PP exam can be given during the same exam administration at different locations.33 Importantly, because the difficulty of the exam items has been calibrated using item response theory, the different exam forms are statistically equivalent in difficulty. 31 The only items that will be used on the graded portion of the PP exam are questions that have known statistics such that they have been calibrated for difficulty. 
However, there may be other "pretest" questions on the exam strictly for purposes of gathering statistics regarding performance on the questions for potential inclusion in the exam item bank; these "pretest" exam questions are not graded for purposes of determining the examinee's score on the exam. 32 As Miller explained, for an administration of an exam at different locations at the same time, the form administered at a particular location consists of different questions than the form administered at another location; however, the exam forms are equivalent in terms of the number of questions addressing a particular topic and the difficulty of the items addressing that topic. 33 Using the "linear-on-the-fly" ("LOFT") method to generate different forms of the exam also helps ensures exam security, since persons sitting near each other during an administration of the exam will not have the same exam form. As a result of using these processes, including the Modified Angoff Method, and applying item response theory to calibrate the exam items for purposes of constructing different, but statistically equivalent, forms of the PP exam, examinees are not graded on how they perform against each other, but instead are graded against the cut score set for the exam. To ensure that different forms of the exam are statistically equivalent in difficulty, Pearson Vue uses the LOFT method,34 which employs an algorithm to ensure that, across all of the exam forms, all examinees get the same number of questions of the same level of difficulty on the same topics. The algorithm randomly generates, or assembles from banked exam questions, different exam forms based on the exam specifications (i.e., the specific topics tested and relative weight/number of exam questions for that topic) and the difficulty level of the exam questions, such that the different exam forms generated by LOFT are statistically equivalent to each other. Using item response theory to calibrate specific exam question difficulty based on the statistical probability of being answered correctly enables examinees taking different, but statistically equivalent, forms of the exam to be compared to the passing standard for purposes of determining whether they pass the exam. Thus, examinees are compared to an ability level—here, minimal competence—rather than to each other.35 This method ensures that all examinees take an exam of equivalent difficulty, which, in turn, helps ensure the fairness of the exam. 34 For the engineering disciplines having too small a group of examinees to employ item response theory or LOFT to generate different exam forms, each examinee takes the same exam instead of taking different forms of the exam, and the exam typically is offered only on one day, rather than multiple days, per year. 35 By way of example, Miller explained that if two examinees each answer five questions on the same topic on the exam, and one examinee answers four easier questions correctly and the other examinee answers two comparatively more difficult questions correctly, the examinee answering the two more difficult questions correctly may have a higher ability level on that particular topic, due to the comparative difficulty of the questions that examinee answered correctly. Once a PP exam is administered, Pearson Vue scores the examination and sends NCEES the information regarding whether each examinee has passed or failed the exam. 
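The LOFT assembly just described can be pictured with a short sketch. The Python fragment below is a deliberately simplified illustration, not Pearson Vue's algorithm: it draws the blueprint-required number of items per topic from a calibrated bank and accepts a form only if its mean difficulty falls within a tolerance of the target, so that different forms come out statistically equivalent. The bank structure, blueprint, tolerance, and names are assumptions made for the example.

```python
import random

def assemble_form(bank, blueprint, target_difficulty, tolerance=0.05,
                  max_tries=1000, seed=None):
    """Draw one exam form matching the blueprint and a target mean difficulty.

    bank              -- {topic: [(item_id, calibrated_difficulty), ...]}
    blueprint         -- {topic: number of items required from that topic}
    target_difficulty -- desired mean calibrated difficulty for the form
    """
    rng = random.Random(seed)
    for _ in range(max_tries):
        form = []
        for topic, n_items in blueprint.items():
            form.extend(rng.sample(bank[topic], n_items))
        mean_difficulty = sum(d for _, d in form) / len(form)
        # Accept the form only if it is statistically close to the target.
        if abs(mean_difficulty - target_difficulty) <= tolerance:
            return form, mean_difficulty
    raise RuntimeError("no acceptable form found; enlarge the bank or tolerance")


# Example: a small fake bank yields two forms with different items but
# near-identical overall difficulty.
bank = {t: [(f"{t}-{i}", random.gauss(0, 1)) for i in range(60)]
        for t in ("water", "air", "waste")}
blueprint = {"water": 10, "air": 6, "waste": 4}
form_a, diff_a = assemble_form(bank, blueprint, target_difficulty=0.0, seed=1)
form_b, diff_b = assemble_form(bank, blueprint, target_difficulty=0.0, seed=2)
```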
Pearson Vue also provides each examinee the information regarding his or her performance on the exam compared to the minimum competence standard. The examinee's performance is expressed as a scaled score, for each specific topic tested on the exam, and for the entire exam. Specifically, using psychometric statistical methods, the ability level of the examinee is expressed as "theta," and is placed on a scale of 0 to 15 for each of the specific topics tested. The examinee's overall theta across all specific topics tested is then compared to the "minimal competence" passing standard, which is also expressed as a scaled score using the same 0 to 15 scale. After an exam is taken and scored, the consulting psychometrician analyzes this data, called "response data," for each exam question, for each examinee, to calibrate the items for purposes of determining whether a particular question performs well in discriminating ability level of the examinees. The psychometrician may recommend that an exam question be "retired" because it is not performing as a good discriminator of ability level. Examples are where an exam question is too difficult or too easy, such that it does not discriminate well in determining ability level; where an item takes too long to answer or is ambiguous; where an item has been "overexposed" by having become publicized such that future examinees have access to the question and scores on the question become high; or where an exam contains "bad pair" items, such that the answer to one item may suggest, or lead to, the answer on another similar item. Having a psychometrician involved in tracking and analyzing exam data enables such circumstances and situations—which may influence the scores on a test item for reasons not related to the examinee's ability—to be identified and corrected. Returning to a free response, paper-and-pencil exam format for the P.E. licensing exam would provide a far less objective, fair, and accurate measure for determining minimal competence for purposes of being licensed to practice engineering. It also would negatively affect the ability of licensed engineers to become licensed in other states. Due to the use of psychometric methods in developing and scoring, the PP exams are very reliable across multiple administrations of the exam— to the point that all of NCEES's psychometric-based PP exams score upwards of .9 on a scale of 1.0.36 Psychometric methods, including item response theory, are used in developing, administering, and scoring many different types of high-stakes professional and academic examinations, including medical school admissions examinations, and nursing, medical examiner, internal auditor, and architecture licensure examinations. NCEES audits approximately one-third of the exams administered by Pearson Vue on an annual basis, to independently evaluate the accuracy of the psychometric services provided by Pearson Vue, and to ensure that the exams have been created pursuant to NCEES's guidelines, procedures, and requirements. NCEES also retains independent psychometricians to review Pearson Vue's exam-related reports and analyses, to ensure that Pearson Vue is following standard psychometric rules of good practice. In sum, the use of objective psychometric methods, including the methods discussed above, to develop, score, and evaluate the PP exam ensures that minimal competence, for purposes of licensure as a P.E., is accurately measured by the exam. 
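The post-administration item review described above can be illustrated with classical item statistics. The sketch below computes, for each question, a difficulty value (proportion answering correctly) and a point-biserial discrimination index, and flags items a psychometrician might examine for possible retirement. The thresholds and data layout are invented, and the actual NCEES and Pearson Vue analyses rely on item response theory statistics as well; this is only a rough classical analogue.

```python
from statistics import mean, pstdev

def item_analysis(responses, too_easy=0.95, too_hard=0.20, min_discrimination=0.15):
    """Flag items that may not discriminate well between ability levels.

    responses -- list of examinee response vectors; responses[i][j] is 1 if
                 examinee i answered item j correctly, else 0.
    Returns a list of (item_index, p_value, point_biserial, flags).
    """
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]          # raw score per examinee
    sd_total = pstdev(totals)
    results = []
    for j in range(n_items):
        scores = [r[j] for r in responses]
        p = mean(scores)                          # item difficulty (proportion correct)
        if sd_total == 0 or p in (0.0, 1.0):
            rpb = 0.0
        else:
            # Point-biserial: correlation of item score with total score
            # (uncorrected for the item's own contribution to the total).
            mean_correct = mean(t for t, s in zip(totals, scores) if s == 1)
            rpb = (mean_correct - mean(totals)) / sd_total * (p / (1 - p)) ** 0.5
        flags = []
        if p > too_easy:
            flags.append("too easy")
        if p < too_hard:
            flags.append("too hard")
        if rpb < min_discrimination:
            flags.append("weak discrimination")
        results.append((j, round(p, 2), round(rpb, 2), flags))
    return results
```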
Role of the PP Exam in Licensure of PEs in Florida As discussed above, to be licensed as a P.E. in Florida, an applicant must have a college degree from an ABET-accredited institution, four years of 36 Test reliability refers to the degree of consistency with which a test measures a particular subject matter across different administrations of the test. A test has a high reliability score if it consistently produces similar results under consistent conditions. A 1.0 reliability score reflects perfect consistency in results across different administrations of a test. An acceptable reliability score target for high-stakes tests is .7 or higher. active experience in engineering practice, and have passed the FE and PP exams. Thus, the P.E. exam is a vital component of determining that an engineer licensed as a P.E. to practice in Florida is able to practice at a competence level that protects the public health and safety. Unlike the education and experience requirements for licensure— both of which may entail a great deal of variability in quality and breadth across applicants—the PP exam constitutes an objective, consistent tool37 to measure an applicant's level of competence for purposes of determining whether the applicant possesses the minimal competence needed to practice engineering in a manner that protects the public health and safety. As such, the PP exam constitutes a uniform measure of minimal competency for purposes of licensure as a P.E. in Florida. As discussed above, this does not mean that a person who engages in engineering work but has not passed the PP exam is incompetent; it merely means that he or she has not demonstrated minimal competency on this required objective measure of competency for licensure purposes. As discussed above, the PP exam is specifically designed to ensure that a licensed P.E. is competent to practice over a range of specific areas encompassed within a particular discipline. This is because a licensed P.E. is authorized to practice engineering within any discipline or area, subject to professional and ethical standards. The breadth of the PP exam thus helps ensure minimal competence to practice engineering in a manner that protects the public health and safety. 37 This consistency and uniformity is the direct result of the psychometrically-based exam development, scoring, calibration, and validation methods discussed above. The purpose of P.E. licensure is to inform and protect the public, which is entitled to rely on such licensure as indicating that the licensee is competent to practice engineering.38 Administration of the PP Exam During the Covid-19 Pandemic As discussed above, Petitioner has alleged that the Challenged Rule is arbitrary on the basis that it does not address contingencies for offering the exam if unforeseen circumstances prevent regular administration of the PP exam. Specifically, Petitioner points to the fact that Pearson Vue cancelled the April 2020 PP exam administration due to the Covid-19 pandemic. Pearson Vue has rescheduled the PP exams for various times and at various locations around the country in an effort to make the PP exam available for prospective examinees during the pandemic.39 Pearson Vue also is taking substantial steps to protect persons who have applied to take the PP exams during the Covid-19 pandemic. To that point, Pearson Vue has retrofitted its testing centers to help ensure the safety of the examinees as they sit for the PP exam. 
Specifically, the number of examinees who will be in a testing room has been reduced; masks are required to be worn by examinees and proctors; testing stations are cleaned between each use; some additional test center locations have been added; and some states have relaxed rules to allow examinees to sit for the exam in states other than the one for which they are applying for licensure. At present, the exams cannot be offered over the internet for remote administration. A substantial reason for this is the lack of exam security, which is necessary to protect and maintain the exam's integrity. Additionally, the lack of any guarantee of internet service reliability and functionality for every examinee is a crucial consideration, since failed internet connections could significantly affect the fairness of the exam. 38 As noted above, a person does not have to be licensed as a P.E. to engage in engineering work in Florida. However, if a person wishes to hold himself or herself out to the public as a P.E., then that person must satisfy the requirement to pass the PP exam, which is an indicator of minimal competence for purposes of licensure. 39 Among other things, Petitioner alleges, in paragraph 19 of the Rule Challenge Petition, that the Challenged Rule is invalid under section 120.52(8)(d) because it does not address circumstances where an examination cannot be administered due to force majeure.

Florida Laws (15): 120.52, 120.54, 120.541, 120.542, 120.56, 120.569, 120.57, 120.68, 455.217, 471.005, 471.007, 471.013, 471.015, 90.701, 90.702
Florida Administrative Code (8): 1-1.010, 61G15-18.011, 61G15-19.001, 61G15-20.0010, 61G15-20.002, 61G15-21.001, 61G15-21.004, 61G15-30.002
DOAH Case (1): 20-3075RX
# 1
EDUCATION PRACTICES COMMISSION vs. WILLIE RUTH WILLIAMS, 81-002775 (1981)
Division of Administrative Hearings, Florida Number: 81-002775 Latest Update: Sep. 20, 1982

Findings Of Fact Respondent holds Florida Teacher's Certificate No. 051784, Post Graduate, Rank II, valid until June 30, 1987, covering the areas of math and junior college. During the 1980-1981 school year Respondent was employed as a mathematics teacher at Shanks High School in Quincy, Florida. Respondent has, in fact, been an employee of the Gadsden County School System as a classroom teacher for the past 27 years. On April 9, 1981, Linda Charleston, a student in Respondent's fifth grade remedial math class, was tested on Mini-Skills Test J-24. These so-called "mini-skills tests" are instruments used for the remediation of students in a variety of "skills," and consist of ten multiple-choice questions. For each "skill" tested, there existed two separate forms, one of which could be administered every three to six weeks in the event a student failed when the test was first taken. The tests were given this length of time apart so that students could be prevented from memorizing the ten multiple-choice questions. Judy Parramore, a teacher's aide at James A. Shanks High School who was in charge of administering the mini-skills tests, gave this particular test to Linda Charleston on April 9, 1981, and graded the results of the test. Miss Charleston correctly answered nine of the ten multiple-choice questions on the test given to her on April 9, 1981. Prior to that time. Ms. Parramore had administered the same skills test to Linda Charleston, and the student had on that occasion scored zero. Miss Charleston's answers were identical on both tests. Answers given by the student were the answers to Mini-Skills Test J-24, Form B. Because of this dramatic and incongruous improvement in Miss Charleston's test scores, Ms. Parramore brought the matter to the attention of the school administration. As a result, on April 9, 1981, Carlos Deason, the Shanks High School Principal; Ms. Parramore; and Bettye Ponder, Curriculum Assistant and teacher at Shanks High School, interviewed Miss Charleston. During the course of this interview, Miss Charleston furnished information which implicated Jackie Gibson, a fellow Shanks student, who, according to Miss Charleston, had given her the answers to Mini-Skills Test, J-24, Form B. According to Miss Charleston, Miss Gibson had advised her at the time that she received the answers that Miss Gibson had gotten the answers from Respondent. Mr. Deason, Ms. Parramore, and Ms. Ponder then interviewed Jackie Gibson, who advised them further that Respondent had given her the answers to Mini-Skills Test RC-7, Forms A and B. Miss Gibson also at that time furnished a sheet purportedly containing the answers to the State Student Assessment Test, Part II, Math, for the April 2, 1981, testing session, which she said had been given to her by Brenda Robinson, a fellow student. According to Miss Gibson, Miss Robinson told her that she had received this answer sheet from Respondent. Brenda Robinson was then called into the meeting to be interviewed by Mr. Deason, Ms. Parramore and Ms. Ponder. Miss Robinson at that time advised that she had gone to Respondent's homeroom class on April 2, 1981, the day the SSAT II exam had been administered, and had gotten the answers to that test from Respondent. Finally, Linda Moye, another Shanks student, was interviewed, and also implicated Respondent as having furnished her with answers to unspecified mini- skills tests. 
At the conclusion of the student interviews on April 9, 1981, each of the aforementioned students signed written statements attesting to the facts hereinabove recited. Subsequently, approximately one week later, these students were called back to give sworn statements before a court reporter. Although these statements were not introduced into evidence at the final hearing in this cause, each of the students either recanted his or her earlier statements to school administrators or refused to again discuss the matter. Since giving their initial statements to school administrators, all of the four students mentioned above have at one time or another--including in their testimony at final hearing in this cause--withdrawn their stories implicating Respondent as having furnished answers to either the SSAT II exam or the mini-skills tests. The students explain this change in testimony by accusing the school administrators who took their statements of having threatened them with refusing to allow them to graduate if they refused to identify Respondent as the source of the test answers. There is no credible evidence in this record to support such an accusation. Conversely, Petitioner explains the apparent change of heart to community pressure brought upon the witnesses at Respondent's behest. Again, there is insufficient evidence in this record to establish that fact. The only consistent thread in the testimony of Linda Charleston is her admission that she had answers to certain of the mini-skills tests prior to the time she took them, and that she received those answers from Jackie Gibson. Beyond this, even the most cursory review of the record in this cause will reveal that the remainder of Miss Charleston's testimony is so contradictory and inconsistent as to be totally unworthy of belief in any particular. The testimony of Jackie Gibson is consistent only in that she admits having answers to Mini-Skills Test RC-7, Forms A and B, and that she received a sheet containing answers to the SSAT II, Math, test from Brenda Robinson sometime after that test had been administered on April 2, 1981. At various points in this record, Miss Gibson implicates Respondent as being the source of these answers, and at other times testifies that she received the answers from Bruce Bennett, Respondent's student assistant. Even when implicating Bennett, however, Miss Gibson is unclear as to where Mr. Bennett obtained the answers. In short, because of the inconsistencies, inaccuracies, and contradictions in her testimony, it is specifically found that Miss Gibson's testimony is unworthy of belief beyond the point of establishing that she, in fact, furnished answers to Mini-Skills Test J-24, Form B, and RC-7, Forms A and B, to Linda Charleston, and that Brenda Robinson gave her what she thought to be an answer sheet to the SSAT II, Math, test for safekeeping sometime after that test had been administered. Finally, Linda Moye also denied at the final hearing that Respondent furnished her answers to any mini-skills test. Each of these student witnesses--Linda Charleston, Jackie Gibson, Brenda Robinson, and Linda Moye--were called as Petitioner's witnesses. They were neither declared, nor asked to be declared, "adverse" or "hostile" witnesses. Respondent, testifying in her own defense, denied having furnished answers to any mini-skills tests to any of her students. 
The record in this case establishes that at least one security breach occurred at Shanks High School resulting in the loss or theft of at least one mini-skills test. In addition, the school administration apparently was aware that, prior to the incidents involved in this case, students had answers to various of these tests. In fact, this record establishes that the Respondent mentioned her concerns with security involving the mini-skills test to the principal, Mr. Deason, prior to the instant controversy. However, no effort was made to change the contents of the tests, at least from 1978 through the time of the improprieties alleged to have occurred in this case. There is, therefore, some explanation of record as to how students could have come into possession of these test answers. There is not, however, sufficient, competent, credible evidence to indicate that Respondent at any time, or in any fashion, furnished answers to any of these mini-skills tests to any student at James A. Shanks High School. As previously mentioned, Brenda Robinson, a student in Respondent's fifth period remedial math class, on April 9, 1981, furnished to school administrators what appears from the record to be a key to the SSAT II, Math, test for the April 2, 1981, testing session. These answers were handwritten on a yellow, legal size piece of paper. Miss Robinson gave the answer sheet to Jackie Gibson to keep for her. This sheet was verified by Jim Diamond, general supervisor in charge of guidance and testing for the Gadsden County School Board, and Pat Gwen an employee in the student assessment section of the State Department of Education, as containing a majority of the correct answers to the SSAT II, Math, exam for the April 2, 1981, testing. Alonzo Brown, a student in Respondent's homeroom class, testified that on the morning of April 2, 1981, Respondent told her homeroom class that she expected every student in the class to pass the SSAT test to be administered that day. Mr. Brown further testified that Respondent told her homeroom class that she had the answers to this test, and proceeded to pass out copies of the answers to the test for the students to copy. According to Mr. Brown, he took a copy of the answers home with him and placed them in his dresser drawer. According to Mr. Brown, this answer sheet was later misplaced, and was not produced by Mr. Brown at final hearing. Respondent denies Mr. Brown's versions of the occurrences on April 2, 1981, and the greater weight of the evidence in this cause establishes Respondent was not even in her homeroom on the morning in question. Respondent's homeroom class was regularly assigned to another teacher on Tuesdays and Thursdays, so that it would have been highly unusual for her to have been in the room at all on April 2, 1981, which was a Thursday. In addition, the record clearly establishes that Respondent was involved in both a senior class meeting and a student council meeting on the morning in question during the period when her homeroom met on the morning of April 2, 1981. More important, however, is the question of how Respondent could have come into possession of the answers to the SSAT II, Math, test in the first instance. This examination is altered after each administration, so that no one, having seen a prior examination, would be able to use those answers on a subsequent test. In addition, no answer key to the SSAT II test is ever sent to an individual school district. 
The only keys available to this test were located in Iowa City, Iowa, and in the Department of Education in Tallahassee. As a result, the only feasible means available for obtaining answers to the SSAT II test administered on April 2, 1981, would have been to obtain a copy of the test booklet itself. The test booklets were not received in the Gadsden County School System until the afternoon of April 1, 1981, the day before the test was administered. The tests were immediately secured and locked in the school's main office. Additionally, each of the tests contains an individual serial number, and is sealed prior to distribution to students at the time of testing. Any irregularity in the sealing of test booklets is required to be brought to the attention of school personnel at the time the test is administered. The record in this case contains no evidence of a breach in the security of the SSAT II test, other than the answer sheet which Jackie Gibson gave to school administrators on April 9, 1981. The security of the test was, therefore, obviously breached at some time prior to the time the test was given on April 2, 1981, or the "answer sheet" was made at or after the administration of the test. The latter possibility would, of course, exculpate Respondent of the charge in this case. Even in the former event, however, Respondent's denial of her involvement in any furnishing of answers to this examination is more credible than the accusation that she did so based solely upon answers contained on a piece of yellow legal paper obtained a week after the test was administered, when taken together with the more credible evidence placing Respondent outside her homeroom during the morning of April 2, 1981. During the summer of 1978, Respondent, along with 36 other teachers, graded Criterion Reference Tests which had been administered to students at the conclusion of summer school at Shanks High School. During the grading of these tests, Respondent, in the presence of Katherine Peddie, a fellow teacher, made erasures and pencil marks on students' answer sheets on two or three occasions. Ms. Peddie did not immediately mention anything about this incident to Respondent, nor did she bring it to the attention of any of the monitors present in the room while the tests were being graded. After the test scoring had been completed, however, Ms. Peddie went to Carlos Deason, the principal of Shanks High School, and advised him of what she had observed. It should be noted here that Respondent is not simply charged with making marks and erasures on these students' answer sheets. Instead, she is charged with changing answers on these tests the effect of which changes is alleged to have resulted in ". . . altering the true score on those tests." Even when viewed in its most favorable light, Ms. Peddie's testimony does not establish any alteration in a student's test score. She merely observed some unexplained marks and erasures, the true effect of which is not clear from the record in this case, at least in part due to the fact that Ms. Peddie did not bring anyone's attention to the matter when the actual tests could have been examined for alterations. Indeed, this Hearing Officer is unwilling to draw an inference that any such marks and erasures in fact have had the effect of altering the true score of any student's test, when the record does not clearly establish that fact. 
As indicated earlier in this order, Linda Moye and Brenda Robinson gave statements implicating Respondent to school administrators on April 9, 1981. On Saturday evening, April 11, 1981, Miss Moye and Miss Robinson were questioned about their statements at a meeting of the local NAACP. Subsequently, on the afternoon of Sunday, April 12, 1981, Respondent and Leola Francis, a fellow teacher, spoke to Brenda Robinson about the statement she had given to school administrators on April 9, 1981. During the week following April 9, 1981, Jackie Gibson voluntarily went to Respondent at Shanks High School and asked Respondent to write a note for her to sign recanting her original statement of April 9, 1981. Respondent wrote such a note, which was signed by Jackie Gibson. As previously indicated, each of the students who gave statements to school administrators on April 9, 1981, subsequently recanted those statements either in sworn statements, depositions, or sworn testimony at final hearing. Another student, Barbara Jones, who testified in this proceeding, also received contacts from fellow students and from Leola Francis concerning her testimony in this proceeding. Ms. Jones also, shortly prior to the final hearing in this cause, received two telephone calls threatening her life should she appear to testify against Respondent. Notwithstanding these threatening telephone calls, Ms. Jones did, in fact, appear and testify. It does not appear from the record in this case that the threatening telephone calls affected Ms. Jones' testimony. Finally, Bettye Ponder, a curriculum assistant and teacher at Shanks High School, also received a threatening telephone call in the summer of 1981, and a "T" for traitor was burned in her front yard in August of 1981. Again, notwithstanding these threats, Ms. Ponder did, in fact, appear to testify at final hearing, and this record does not establish that these threats affected Ms. Ponder's testimony. It is clear from the record in this case that this entire episode has generated a great deal of interest and anxiety in the Quincy community. It is equally clean that a number of people, students, teachers, and citizens alike, spoke with several of the witnesses concerning their testimony in this case. This record does not, however, establish that the Respondent either directly or indirectly had any hand in attempting to influence the testimony of any of the students or other witnesses in this cause.

Florida Laws (2): 120.57, 120.68
# 2
PAM STEWART, AS COMMISSIONER OF EDUCATION vs SUSAN REID BRUSS, 14-005129PL (2014)
Division of Administrative Hearings, Florida Filed:Tampa, Florida Oct. 30, 2014 Number: 14-005129PL Latest Update: Sep. 22, 2024
# 3
GARY COOK vs BARBER`S BOARD, 97-001863 (1997)
Division of Administrative Hearings, Florida Filed:Crawfordville, Florida Apr. 15, 1997 Number: 97-001863 Latest Update: Sep. 02, 1997

The Issue The issue in this case is whether Petitioner, Gary Cook, should have received a passing score on the Barber Practical Examination taken by him in November 1996.

Findings Of Fact On or about November 25, 1996, Petitioner, Gary Cook, took the Barber Practical Examination (hereinafter referred to as the "Exam"). The Exam was scored by two examiners: Geri Scott and Don Gibson. The Bureau of Testing of Respondent, the Department of Business and Professional Regulation (hereinafter referred to as the "Department"), subsequently notified Mr. Cook that he had earned a total score of 70 on the Exam. A score of 75 is considered a passing grade. Mr. Cook was notified by the Department that he earned a total score of 14.00 points on the sanitation portion of the Exam. The maximum score which may be earned for the sanitation portion of the Exam is 25.00. On or about December 30, 1996, Mr. Cook requested a formal administrative hearing to contest the determination of his score on the Exam. Mr. Cook challenged his score on the sanitation portion of the Exam. The sanitation portion of the Exam consists of ten criteria for which points may be earned, with the maximum score for each shown in parentheses:
Used proper linen setup for the shampoo (2 points)
Properly stored clean and dirty linen during the shampoo (3 points)
Washed hands before beginning the haircut (2 points)
Used the proper linen setup for the haircut (3 points)
During the haircut, tools were replaced in sanitizer after each use (3 points)
Properly stored clean and dirty linen during the haircut (2 points)
Washed hands before beginning the permanent wave (2 points)
Used the proper linen/cotton wrap setup for the permanent wave (3 points)
Kept tools sanitized during the permanent wave (3 points)
Properly stored clean and dirty linen during the permanent wave (2 points)
Total possible points: 25
The criteria of the sanitation portion of the Exam are designated as "procedures" which candidates are required to meet during the Exam. If both examiners determine that a candidate carried out a procedure, the candidate is awarded the total available points for the procedure. If both examiners determine that a candidate did not carry out the procedure, the candidate is awarded no points for the procedure. Finally, if one examiner determines that a candidate carried out the procedure and the other examiner disagrees, the candidate is awarded half of the available points for the procedure. On the sanitation portion of the Exam, Mr. Cook received no points for procedures B-2, C-2, and C-3. Mr. Cook received half the points available for procedures B-4 and C-4. Mr. Cook specifically alleged that he should have been awarded the maximum points for procedures B-2, B-4, C-2, C-3, and C-4. For procedure B-2, the examiners were to determine whether "[t]he candidate used the proper linen setup for the haircut." This procedure was worth a total of 3 points. Both examiners determined that Mr. Cook had not used the proper linen setup. For purposes of procedure B-2, the haircut includes shaving around the outline of the hair. Therefore, proper linen setup for the shave is a part of the haircut. Mr. Cook did not dispute that he had not used the proper linen setup for the shave portion of the haircut. Mr. Cook suggested that the haircut portion of the Exam did not include the shave. The evidence failed to support this assertion. Rule 61G3-16.001(7)(a)8., Florida Administrative Code, provides that a "haircut" for purposes of barber examinations includes a determination that "[s]ideburns, outline and neckline are clean shaven." See also page 7 of the Candidate Information Booklet, Respondent's Exhibit 3. Mr. Cook failed to prove that he fulfilled the requirements of procedure B-2.
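The two-examiner scoring rule recited in these findings (full credit when both examiners find the procedure was performed, half credit when they split, no credit when neither does) reduces to simple arithmetic. The sketch below illustrates that rule with hypothetical checklist entries for the five challenged procedures; it is not the Department's grading software.

```python
def sanitation_score(max_points, examiner_a, examiner_b):
    """Score sanitation procedures from two examiners' checklists.

    max_points             -- {procedure: maximum available points}
    examiner_a, examiner_b -- {procedure: True if that examiner found the
                               procedure was carried out, else False}
    """
    total = 0.0
    for proc, points in max_points.items():
        votes = int(examiner_a[proc]) + int(examiner_b[proc])
        total += points * votes / 2  # 2 votes -> full, 1 -> half, 0 -> none
    return total


# Hypothetical entries for the five challenged procedures only.
max_points = {"B-2": 3, "B-4": 2, "C-2": 3, "C-3": 3, "C-4": 2}
examiner_a = {"B-2": False, "B-4": True,  "C-2": False, "C-3": False, "C-4": True}
examiner_b = {"B-2": False, "B-4": False, "C-2": False, "C-3": False, "C-4": False}
print(sanitation_score(max_points, examiner_a, examiner_b))  # -> 2.0
```

With one examiner crediting B-4 and C-4 and neither examiner crediting B-2, C-2, or C-3, the rule yields 2.0 of the 13 points at issue, consistent with the half-credit awards described in these findings.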
For procedure B-4, the examiners were to determine whether "[t]he candidate properly stored clean and dirty linen during the haircut." [Emphasis added] This procedure was worth a total of 2 points. One examiner determined that Mr. Cook had not met this criterion. Mr. Cook, therefore, was awarded 1 point for this procedure. The examiner who found that Mr. Cook had not performed procedure B-4 properly determined that Mr. Cook had placed a box of rubber gloves on a bar behind the area in which he was working. The Department has cited no authority which defines the term "linens" as including rubber gloves. The common definition of the term "linens" does not suggest that rubber gloves constitute linens. The term "linen" is defined as follows: "1 a : cloth made of flax and noted for its strength, coolness, and luster b : thread or yarn spun from flax 2 : clothing or household articles made of linen cloth or similar fabric 3 : paper made from linen fibers or with a linen finish." Webster's Ninth New Collegiate Dictionary (1984). Mr. Cook should have received full credit for procedure B-4. Therefore, Mr. Cook should have received one additional point on procedure B-4. For procedure C-2, the examiners were to determine whether "[t]he candidate used the proper linen/cotton wrap setup for the permanent wave." This procedure was worth a total of 3 points. Both examiners determined that Mr. Cook had not met this criterion. Both examiners determined that Mr. Cook had failed to use a proper cotton-wrap setup. Mr. Cook failed to explain what steps he undertook in setting up for the permanent wave. Mr. Cook, therefore, failed to prove that he fulfilled the requirements of procedure C-2. For procedure C-3, the examiners were to determine whether "[t]he candidate kept tools sanitized during the permanent wave." This procedure was worth a total of 3 points. Both examiners determined that Mr. Cook had not met this criterion. Both examiners determined that Mr. Cook had placed rods used for the permanent wave on the back bar. Mr. Cook failed to prove that he did not leave rods on the back bar while performing the permanent wave. Mr. Cook, therefore, failed to prove that he fulfilled the requirements of procedure C-3. For procedure C-4, the examiners were to determine whether "[t]he candidate properly stored clean and dirty linen during the permanent wave." This procedure was worth a total of 2 points. One examiner determined that Mr. Cook had not met this criterion. The examiner who found that Mr. Cook had not met this criterion determined that Mr. Cook had left end-wraps on the back bar. Mr. Cook failed to prove that he did not leave end-wraps on the back bar. Mr. Cook, therefore, failed to prove that he fulfilled the requirements of procedure C-4. All of the criteria for the sanitation portion of the Exam are listed in a Candidate Information Booklet for the Barber Examination. See page 6 of Respondent's Exhibit 3. The booklet also explains the scoring procedure. Mr. Cook proved that he should have been awarded one additional point on the sanitation portion of the Exam. Therefore, Mr. Cook earned a total score of 71 on the Exam. Mr. Cook's score is below a passing score of 75.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that a Final Order be entered by the Department of Business and Professional Regulation, Barbers Board, finding that Gary Cook should have received a total score of 71 on the Barbers Practical Examination of November 1996. DONE AND ENTERED this 2nd day of September, 1997, in Tallahassee, Leon County, Florida. LARRY J. SARTIN Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (904) 488-9675 SUNCOM 278-9675 Fax Filing (904) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 2nd day of September, 1997. COPIES FURNISHED: Gary Cook 202 Mulberry Circle Crawfordville, Florida 32327 R. Beth Atchison, Assistant General Counsel Department of Business and Professional Regulation Northwood Centre 1940 North Monroe Street Tallahassee, Florida 32399-0792 Joe Baker Department of Business and Professional Regulation Board of Barbers 1940 North Monroe Street Tallahassee, Florida 32399-0792 Lynda L. Goodgame, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792

# 4
GARY ROSEN vs DEPARTMENT OF BUSINESS AND PROFESSIONAL REGULATION, 11-002321 (2011)
Division of Administrative Hearings, Florida Filed:Westbay, Florida May 10, 2011 Number: 11-002321 Latest Update: Nov. 17, 2011

The Issue The issue is whether the National Association of Environmentally Responsible Mold Contractors' Initial Mold Assessor and Initial Mold Remediator licensing examinations meet requirements of section 455.217, Florida Statutes, and Florida Administrative Code Rule 61-11.015, such that they should be certified by the State of Florida for use in licensing mold assessors and mold remediators in Florida.

Findings Of Fact The Parties Petitioner is the applicant for certification of the National Association of Environmentally Responsible Mold Contractors' (?NAERMC?) Initial Mold Assessor Exam and Initial Mold Remediator Exam (?NAERMC Exams?) as national exams for use in licensing mold assessors and mold remediators in Florida. Petitioner has extensive academic training and professional experience in mold assessment and mold remediation. He is a Florida-licensed mold assessor and mold remediator, and is certified or accredited by numerous professional mold-related services organizations.3 He has been a full-time professional mold assessor and mold remediator since 2004-2005, having performed over 1,000 mold and construction defect investigations and over 500 mold remediation projects. He has authored numerous texts on mold-related subjects and mold remediation. Petitioner has no formal training or experience in the development of professional licensing examinations. Petitioner's only training in exam development consisted of one ten-hour course offered as part of a U.S. Green Building Council certification program. He also informally reviewed exam development materials provided by Respondent's Examination Development Specialist. Respondent is the state agency statutorily charged with regulating mold-related services and administering the mold- related services licensing program in Florida under chapter 468, part XVI, Florida Statutes. Florida's Mold-Related Services Regulatory Program Mold-related services consist of mold assessment4 and mold remediation,5 which are performed by state-licensed mold assessors and mold remediators. There are two means by which persons may become licensed to provide mold-related services in Florida: initial licensure by examination, and licensure by endorsement. A person desiring to be initially licensed by examination to provide mold-related services in Florida must, among other things, pass a professional licensing examination. By statute, Respondent is required to provide, contract, or approve services for the development, preparation, administration, scoring, score reporting, and evaluation of professional licensing examinations, including mold-related services licensing examinations. Respondent may approve, for use in professional licensing, any national examination that it has certified as meeting the requirements of national examinations and generally accepted testing standards. § 455.217(1), Fla. Stat. Respondent's Evaluation of the NAERMC Exams The NAERMC Exams are examinations that Petitioner offers in connection with courses approved by the State of Texas for training mold assessors and mold remediators, modified to address Florida-specific issues. Petitioner applied to Respondent for certification of the NAERMC Exams for use in licensing mold assessors and mold remediators in Florida. He did this by submitting two completed Exam Evaluation Questionnaires (?EEQ?) to Respondent. The EEQ is an instrument Respondent has developed to determine whether an examination proposed for use in professional licensing meets the requirements of section 455.217(1)(d), Florida Statutes——that is, whether it is a ?national examination? as defined in rule 61-11.015, and whether it has been developed using generally accepted testing standards. Petitioner submitted the first version of his EEQ on or about November 8, 2010. 
Respondent determined, from a review of the EEQ, that the NAERMC Exams did not meet the statutory standards for examinations that may be approved for use in professional licensing in Florida. Respondent sent Petitioner a written analysis and comments regarding the NAERMC Exams' deficiencies. After receiving Respondent's analysis and comments, Petitioner requested and obtained a copy of the completed EEQ that ACAC submitted for its examinations. The ACAC examinations have been approved for use in mold-related services licensing in Florida.6 Petitioner revised his EEQ responses and submitted an amended EEQ for the NAERMC Exams on or about December 8, 2010. Several, although not all, of the revised responses are substantially similar or identical to ACAC's responses. Respondent's analysis of Petitioner's amended EEQ noted the similarity between many of Petitioner's and ACAC's responses. Respondent asserts that Petitioner copied ACAC's responses rather than providing truthful responses that accurately describe the NAERMC Exams. Petitioner denies he copied the ACAC responses and claims that his revised responses reflect updates to the NAERMC Exams he made after having studied the ACAC EEQ responses in order to determine Respondent's exam certification requirements. Respondent determined that the NAERMC Exams do not meet the statutory requirements in section 455.217, and on March 8, 2011, issued a Notice of Intent to Deny, proposing to deny certification of the NAERMC Exams for use in professional licensure of mold assessors and mold remediators in Florida. Statutory Standard for Certification of Professional Licensing Examinations Pursuant to section 455.217(1)(d), Respondent may only approve, for use in professional licensing, national examinations it has certified meet the requirements of generally accepted testing standards and national examinations. Steven Allen, an Examination Development Specialist with Respondent's Bureau of Education and Testing, testified on behalf of Respondent regarding generally accepted testing standards and national examinations. Mr. Allen has a Master's Degree in evaluation and measurement. His employment duties include evaluating exams submitted to Respondent by independent examination providers for certification for use in professional licensing, to determine whether they are national examinations and have been developed using generally accepted testing standards. These duties require Mr. Allen to be fully versed in generally accepted testing standards and the national examination rule, and the application of these standards in certifying professional licensing exams. Mr. Allen was involved in reviewing the EEQs submitted for the NAERMC Exams. Generally Accepted Testing Standards Professional licensing examinations, including examinations for mold-related services licensure, must meet generally accepted testing standards. These standards are well- known, published standards for educational testing and evaluation in the United States that are set by three national organizations.7 All testing organizations engaged in developing high-stakes licensing exams must follow these standards. Exams must be prepared according to these standards to ensure that they are valid and reliable. Exam validity involves determining whether the exam covers a representative sample of the content and skills intended to be measured. Exam reliability means that the exam provides consistent results when measuring a test taker's knowledge, skills, and abilities. 
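Reliability of a dichotomously scored exam is commonly summarized with a coefficient such as KR-20, a special case of Cronbach's alpha on the 0-to-1 scale referenced in these cases. The sketch below is a generic illustration of that statistic with an assumed 0/1 response matrix; it is not tied to any particular examination at issue here.

```python
from statistics import mean, pvariance

def kr20(responses):
    """Kuder-Richardson Formula 20 reliability for right/wrong item scores.

    responses -- list of examinee response vectors of 0/1 item scores.
    Values near 1.0 indicate highly consistent measurement.
    """
    k = len(responses[0])                     # number of items
    totals = [sum(r) for r in responses]      # each examinee's raw score
    var_total = pvariance(totals)
    if var_total == 0:
        raise ValueError("no variance in total scores; KR-20 is undefined")
    # Sum of item variances: p * (1 - p) for each dichotomous item.
    pq_sum = 0.0
    for j in range(k):
        p = mean(r[j] for r in responses)
        pq_sum += p * (1 - p)
    return (k / (k - 1)) * (1 - pq_sum / var_total)
```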
The starting point in developing an exam pursuant to generally accepted testing standards is the performance of a job/task analysis. A job/task analysis entails an analysis and compilation of the knowledge, skills, and abilities to be tested on a particular exam. If a job/task analysis is not accurately performed, the validity of the exam——that is, whether the exam actually measures what it purports to measure——cannot be verified. Therefore, performing a job/task analysis is essential to preparing a valid examination. The first step in a job task analysis consists of the assembly, by the testing organization, of a panel of experts in the particular subject matter that the exam is being developed to test. These subject matter experts must constitute a representative sample of practitioners for the particular profession for which the exam is being developed. Once the subject matter expert panel is assembled, panel members complete an occupational survey instrument to identify the knowledge, skills, and abilities for the particular competency level for which the exam is being developed. For example, for an entry level skills licensing exam, the subject matter expert panelists would complete an occupational survey to identify the knowledge, skills, and abilities that an applicant of minimum competency for licensure must demonstrate in order to be licensed. The end product of the job/task analysis is a collaboratively developed content outline identifying the areas to be tested on the examination, with respective weight assigned to each. Subject matter experts often have differing opinions regarding content that should be tested on an exam. Therefore, obtaining a consensus among subject matter expert panelists regarding the content to be tested is essential to developing a valid exam that tests the content intended to be tested. An individual subject matter expert, working on his or her own, is unable to engage in the collaborative process integral to developing a valid exam. After the job/task analysis is complete, the exam items (questions) are prepared by subject matter experts according to the content outline. Before preparing the items, the subject matter experts are trained to draft items that accurately, reliably, and fairly test the content. After the items have been prepared, they are reviewed by an item review committee. These iterative review processes, conducted by subject matter and psychometrics expert panelists, are essential to developing exams that are valid and reliable. Petitioner did not present evidence showing that he developed the NAERMC Exams using a job/task analysis, as that term is understood in the field of psychometric measurement. Petitioner did not demonstrate that he conducted an occupational survey of subject matter experts. Instead, he compiled content lists that he used in developing mold-related services courses8 and writing books on mold-related topics. These compilations were not developed for licensing examinations,9 and the evidence does not establish that they were developed using the collaborative processes entailed in a psychometrically sound job/task analysis. Petitioner's EEQ response also appears to misrepresent key information regarding the NAERMC Exams. Specifically, Petitioner's response to Item No. 24 of the December 8, 2010, EEQ, addressing job/task analysis performance, states: ?. . . a review committee is formed from among industry experts and stakeholders across the United States.? 
However, at hearing, Petitioner conceded that he is the only expert involved in developing the NAERMC Exams and is the sole member of the ?review committee.? Petitioner's EEQ responses regarding job/task analysis performance conflict with his testimony and, thus, are not credible. For these reasons, it is determined that Petitioner did not present credible, persuasive evidence demonstrating that he performed a job/task analysis in developing the NAERMC Exams. Accordingly, he did not show that the NAERMC Exams are valid. Petitioner also failed to demonstrate that the NAERMC Exams are reliable, as that term is used in psychometric measurement. Exam reliability is demonstrated by providing statistical analyses addressing the long-term performance of individual exam items and of the exam as a whole. In his November 8, 2010, EEQ response, Petitioner stated that he performed an item analysis to identify poorly performing items, but did not keep copies of the analysis. However, in his December 8, 2010, EEQ response, Petitioner provided a statistical analysis for an item and an explanation that substantially mimicked ACAC's response for that item. As a matter of practice in the professional examination industry, exam developers keep and readily provide item reliability analysis information upon request from exam certification entities. The fact that Petitioner initially represented that he did not keep such information, but then soon after provided a response that mimicked ACAC's, undermines the EEQ's credibility and calls into question its accuracy with respect to the NAERMC Exams' reliability. Petitioner's testimony and other evidence in the record also call into question the credibility and accuracy of other responses in the December 8, 2011, EEQ. Specifically, the EEQ asked how many subject matter experts review each exam item for accuracy and relevancy to the practice. Petitioner responded that five experts would review each item; however, at hearing, he was unable to identify any of those experts. Moreover, his EEQ responses directly conflict with a discovery response10 in which he stated that he was the sole subject matter expert for development of the NAERMC Exams. Based on inconsistencies in Petitioner's testimony, EEQ responses, and discovery responses; his failure to perform a psychometrically sound job/task analysis; his lack of significant training in exam development; and his lack of understanding of generally accepted testing standards and their role in preparing valid, reliable exams, it is determined that Petitioner did not provide credible, persuasive evidence showing that the NAERMC Exams meet generally accepted testing standards, as required by section 455.217(1)(d). National Examination To implement the ?national examination? requirement in section 455.217, Respondent has adopted rule 61-11.015, Florida Administrative Code, entitled "Definition of a National Examination." This rule establishes the criteria an exam must meet to be a "national examination" that Respondent may use to test professional licensure applicants. All rule criteria must be met for an exam to be a "national examination." National or Multi-state Professional Organization To be a ?national examination,? the examination must be developed by or for a national or multi-state professional organization. Fla. Admin. Code R. 61-11.015(2)(emphasis added). 
To be a national or multi-state organization, the organization must be generally recognized by practitioners across the nation in the form of representatives from state licensing boards, or must have membership representing a substantial number of the nation's or states' practitioners who have been licensed through the national examination. Fla. Admin. Code R. 61-11.015(3). Petitioner created Certified Mold & Allergen Free, Corp. ("CMAFC") to, among other things, provide online training courses in mold-related services. The courses are offered through Petitioner's CMAFC, NAERMC, and Green Buildings.org websites, and the U.S. Green Building Council ("USGBC") website. The State of Texas approved two CMAFC courses for training persons seeking licensure as mold assessors and mold remediators in Texas. CMAFC training courses have been taken by persons located in states other than Florida. Petitioner also created NAERMC, an "association" that provides free internet-based mold-related services training courses11 and "certification examinations" that test the topics covered in the online courses. Successful completion of the "certification exams" allows one to become certified by NAERMC. "Certification" by NAERMC entitles one to a certificate of accomplishment and a logo symbol that can be placed on business cards. Petitioner is NAERMC's only officer. NAERMC does not have bylaws and does not prepare an annual report. Petitioner testified that anyone who passes the certification exams becomes a NAERMC member, but he did not provide any specific information regarding NAERMC's membership. NAERMC does not conduct membership meetings or provide mailings to its membership. There is no evidence establishing that the NAERMC Exams were developed by a national or multi-state professional organization, as that term is defined in rule 61-11.015(3). Petitioner did not present any evidence showing that NAERMC's membership includes or consists of practitioners across the nation in the form of representatives from state mold-related services licensing boards. Nor did Petitioner present any evidence that NAERMC's membership includes or consists of a substantial number of the nation's or state's mold-related practitioners who have been licensed through the NAERMC Exams. Petitioner also did not present evidence establishing that the NAERMC Exams were developed for a national or multi-state professional organization, as provided in rule 61-11.015(3). The evidence shows only that Petitioner, through his websites, offers mold-related services training courses to persons in multiple states, and that successful completion of the courses and exams offered at the end of the courses entitles one to NAERMC certification and membership. Petitioner testified that the USGBC is a nationwide organization having 40,000 members, and presented evidence showing that some of his CMAFC-copyrighted courses are offered through the USGBC website. However, he did not present any evidence showing that USGBC is generally recognized by practitioners across the nation in the form of representatives from state mold-related services licensing boards, or that USGBC's membership represents a substantial number of the nation's or state's mold-related practitioners who have been licensed through NAERMC's Exams. Moreover, NAERMC's certification examinations are not licensing examinations. Petitioner conceded this point at hearing.
For these reasons, Petitioner failed to establish that the NAERMC Exams were developed by or for a national or multi-state organization, as required by rules 61-11.015(2) and 61-11.015(3). Establishment of Entry Level Standards of Practice To be approved by Respondent as a "national examination," the exam's purpose must be to establish entry level standards of practice that are common to all practitioners in the licensing area. Fla. Admin. Code R. 61-11.015(2)(a). Petitioner did not show that the NAERMC Exams meet this criterion. As previously discussed, performing a psychometrically sound job/task analysis is essential to developing an exam that tests for the content intended to be tested, which here is the knowledge, skills, and abilities that an entry level professional mold assessor or mold remediator should possess. Petitioner, acting as a "committee of one," compiled content lists based on his knowledge of mold-related topics that he used to develop training courses and write books. At hearing, Petitioner referred to these content compilations as a "job/task analysis," but they are not. A "job/task analysis" is a term of art used in psychometric measurement to describe a specific, collaborative process for developing exam content. The evidence does not establish that Petitioner performed a job/task analysis. Petitioner asserts that the NAERMC Exams test entry level skills because he is a mold-related services subject matter expert, so knows what content entry level mold-related services professionals should know. Petitioner misapprehends the importance of generally accepted testing standards in developing exams that accurately test the knowledge, skills, and abilities intended to be tested. Petitioner has no training or experience in licensure examination development, and his testimony that the NAERMC Exams test entry level skills was not persuasive. For these reasons, Petitioner did not demonstrate that the NAERMC Exams' purpose is to establish entry levels of practice common to all mold-related services practitioners, as required by rule 61-11.015(2)(a). Definition of Practice by a National Occupational Survey Rule 61-11.015(2)(b) requires that the practice of the profession at the national level be established through an occupational survey with a representative sample of all practitioners and professional practices. Petitioner did not meet this requirement. Petitioner did not provide evidence establishing that he utilized a survey instrument.12 As previously discussed, Petitioner compiled mold-related content lists that he used to develop training courses and write books. However, these lists do not constitute an occupational survey. Petitioner testified that he was involved with an international organization in preparing standards for mold assessment and in an online community of mold experts. However, he did not present any evidence to show that these entities comprise a representative sample of all mold-related services practitioners, as required by the rule. In sum, Petitioner did not provide credible, persuasive evidence demonstrating that he developed the NAERMC Exams using an occupational survey to define the mold-related services practice at the national level, as required by rule 61-11.015(2)(b). Assessment of Scope of Practice and Entry Skills Rule 61-11.015(2)(c) provides that the licensure examination must assess the scope of practice and the entry skills defined by the national occupational survey.
As previously discussed, Petitioner did not perform an occupational survey in developing the NAERMC Exams, a necessary endeavor to ensure that an exam accurately assesses the content it is intended to assess. Because no occupational survey was performed for the NAERMC Exams, it is not possible to verify that they assess scope of practice and entry level skills, as required by the rule. Accordingly, the NAERMC Exams do not meet this criterion. Oversight and Scoring of the National Examination Rule 61-11.015(4) requires the organization to be the responsible body for overseeing the development and scoring of the national examination. Petitioner is the sole officer of NAERMC. He testified and provided information in the EEQs stating that he alone develops the NAERMC Exams, and that he and his wife hand-score the exams. Respondent did not present evidence showing that these oversight measures are deficient under the rule. Accordingly, Petitioner showed that NAERMC is responsible for overseeing development and scoring of the NAERMC Exams. However, because Petitioner has not established that the NAERMC Exams are "national examinations," this criterion is not met. Examination Development and Scoring Security Rule 61-11.015(5) requires the organization to provide security guidelines for the development and grading of the national examination and to oversee the enforcement of these guidelines. Petitioner testified that the NAERMC Exams are encrypted and electronically stored on Petitioner's computer and a computer located in Nevada. Petitioner is the only person who develops the NAERMC Exams and has access to them. These measures do not conform to standard security measures employed by exam developers in the professional examination industry. Typically, examination papers are inventoried when they are removed from the vault for administration, re-inventoried at the exam site before they are administered, closely monitored during the examination process, then re-inventoried by tracking forms once the exam is completed. Measured against the industry standard, the NAERMC Exams' security measures are deficient. For these reasons, Petitioner did not present credible, persuasive evidence demonstrating that rule 61-11.015(5) is met. Having considered the competent evidence in the record, the undersigned determines, as a matter of ultimate fact, that Petitioner failed to establish, by a preponderance of the evidence, that the NAERMC Exams meet the requirements of section 455.217 and rule 61-11.015.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that Respondent enter a Final Order denying Petitioner's application for certification of the National Association of Environmentally Responsible Mold Contractors' Initial Mold Assessor Exam and Initial Mold Remediator Exam for use in the professional licensing of mold assessors and mold remediators in Florida. DONE AND ENTERED this 24th day of October, 2011, in Tallahassee, Leon County, Florida. S CATHY M. SELLERS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 24th day of October, 2011.

Florida Laws (6) 120.52, 120.569, 120.57, 455.217, 468.8411, 468.8413
# 5
SCHOOL BOARD OF WALTON COUNTY vs ANN FARRIOR, 99-001904 (1999)
Division of Administrative Hearings, Florida Filed:Defuniak Springs, Florida Apr. 23, 1999 Number: 99-001904 Latest Update: Aug. 07, 2000

The Issue The issues to be resolved in this proceeding concern whether the Petitioner school board has good cause to reject the Walton County School superintendent's recommendation of Ann Farrior (Respondent) for renewal of an annual contract to serve in the position of school psychologist.

Findings Of Fact Ann Farrior was employed as a school psychologist by the Walton County School District for the 1998-1999 school year. She was employed on the recommendation of the superintendent and under an annual contract for that school year. Title 20, United States Code, Chapter 33, is known as the Individuals with Disabilities Education Act (IDEA). The intelligence testing and questions regarding assessment and placement of exceptional education students are governed by that federal statute and the rules promulgated thereunder. The federal regulations implementing the IDEA provide certain federal funds to assist in their implementation by local school districts. The Walton County School District receives federal funding to implement the IDEA. The failure to comply with appropriate federal regulations governing testing, assessment, and placement of exceptional education students can result in a loss of such federal funding for the District. The Superintendent, Mr. Bludworth, nominated Ms. Farrior for the school psychologist position at issue for the 1998-1999 school year with the understanding that although she was not certified as a school psychologist, she was eligible to be certified as such. During the course of her employment as a school psychologist that school year, state audit personnel determined that she was not properly credentialed to administer intelligence testing as part of the assessment process for exceptional education students, which is necessary to the formulation of Individualized Educational Plans (IEPs), which is, in turn, a necessary element of the ultimate decision regarding the proper placement of such students in the educational system in a school district. In view of this situation, Mr. Sam Goff of the Bureau of Instructional Support and Community Services of the Department of Education wrote the superintendent on January 20, 1999, outlining specific requirements that the District would have to meet in order to bring itself into compliance with the IDEA as a result of Ms. Farrior's ineligibility to administer intelligence testing as part of the assessment and evaluation process for exceptional students. The superintendent also received notice by memorandum of January 28, 1999, and by letter of January 29, 1999, from the Auditor General's staff and the Auditor General (in evidence as Petitioner Exhibits 4 and 5), that audit findings had determined that the District employed a person as a school psychologist (the Respondent) for whom school district records did not indicate a basis for finding that person qualified for the school psychologist position. The Auditor General's findings noted that the position description for school psychologist employed by the school district included responsibilities for administering testing and assessing placement for all exceptional education students. The preliminary findings noted that the employee, the Respondent, then serving as a school psychologist, possessed only a temporary Florida teaching certificate in "psychology" which had expired on June 30, 1998, and which did not constitute certification as a "school psychologist." District records did not show that the Respondent had renewed her teaching certificate or had otherwise met the minimum job requirements for the school psychologist position.
The Auditor General recommended that the school district document its records with a basis upon which the individual, the Respondent, was determined to be qualified for the school psychologist position or to take appropriate action to provide for a licensed or certified school psychologist for administering testing and for assessing placement for exceptional students. As a result of receiving these communications and preliminary findings, the superintendent met with the Respondent and felt compelled to request her resignation. Nancy Holder had been the school psychologist in the position that Ann Farrior assumed. Early in the 1998-1999 school year, Ms. Holder, who is a certified school psychologist, had been transferred to the position of "Staffing Specialist" upon which occurrence Ann Farrior then occupied the position of school psychologist. Ms. Holder, in her testimony, described the duties of school psychologist as including, in addition to performing intelligence testing of students, testing for academic achievement and personality testing, as well as counseling duties involving students, their parents, and teachers. The school psychologist must also participate in staffing meetings and in the IEP formulation process and resulting decisions regarding placement of exceptional students; she must assist classroom teachers and parents with the particular problems involving both exceptional students and students who do not have exceptionalities or diagnoses. Because of the above-referenced preliminary audit findings by the Department of Education, Ms. Holder was required to assume the additional responsibility of supervising Ms. Farrior's activities for the remainder of her annual contract year as well as undertaking to re-test those students whom Ms. Farrior had previously tested. The school district alternatively obtained a consultant to perform the educational testing that otherwise would have been done by Ms. Farrior as school psychologist had she been qualified under the pertinent regulations to do so. The school district received a statement from the Department of Education's Bureau of Teacher Certification, dated March 22, 1999, concerning the Respondent's eligibility to apply for or to receive certification as a school psychologist. That statement of eligibility noted that the Respondent lacked 27 semester hours of graduate school credit in school psychology, which would necessarily have to include six semester hours of graduate credit in a supervised school psychology internship. Additionally, Ms. Farrior would have to submit a passing score on the state-required teacher certification examination. Ms. Farrior enrolled in an appropriate school psychology internship program for the 1999-2000 school year, but as of the date of the hearing in this case, she still lacked 24 of the required semester hours of graduate credit in school psychology and had not yet submitted a passing score on the Florida State Teacher Certification examination. The Walton County School Board has a written policy adopted August 13, 1996, and in force at times pertinent hereto, which authorizes the superintendent "to select and recommend non-certificated instructional personnel for appointment pursuant to Section 231.1725, Florida Statutes, and State Board of Education Rule 6A-1.0502, when special services are needed to deliver instruction." Section 228.041(9), Florida Statutes, defines the term "instructional personnel" as including "school psychologists."
There is no showing in the evidence of record, however, that "special services" are needed to deliver instruction. That is, although the school psychologist position is statutorily deemed to be in the category of "instructional personnel," it does not involve the teaching of students. Rather, the school psychologist position, which is the subject of this case, involves testing, evaluation, assessment, and assistance in the placement of exceptional students in appropriate courses of instruction. There was no showing that special services were needed to actually deliver instruction, as envisioned by the above-referenced written policy of the School Board concerning the appointment of non-certificated instructional personnel, such as Ms. Farrior. Given the above-referenced audit findings in relation to the controlling federal regulations referenced above and the Board's policy allowing employment of certificated personnel "out-of-field" only in cases where special services are needed to deliver instruction, it has not been demonstrated that the School Board realistically had an option, in the proper exercise of its discretionary authority, to hire Ms. Farrior "out-of-field" as a "school psychologist" based merely on her only certification, which was a temporary certificate authorizing the teaching of psychology (not certification as a school psychologist, which is really a pupil support position). Moreover, the School Board's policy authorizes the employment of teachers for instruction in areas other than that for which they are certificated only in the absence of available qualified, certified instructors. Although the school psychologist position at issue remains unfilled, there is no evidence to demonstrate why it is unfilled and no evidence of record to demonstrate that there are not qualified, certified personnel available to be hired as a school psychologist to fill that position. When the superintendent recommended the Respondent for a second annual contract in April of 1999, he was already aware that she was not qualified to perform the duties of a school psychologist and that the District would have to contract with outside consultants or other qualified persons to at least secure the administration of intelligence and other psychological testing, which testing is a part of the job description and duties of a school psychologist. The then exceptional education director for the District, Ms. Rushing, had suggested to the superintendent that he recommend the Respondent in April of 1999 for the position of "evaluation specialist." This would more accurately represent the actual duties Ms. Farrior had been performing after the Department of Education audit finding that she was not qualified to serve as a school psychologist. Unfortunately, however, there was no authorized position of "evaluation specialist," and the superintendent has no authority to set the qualifications for a particular position or to recommend a person for a position that had not otherwise been approved, nor its qualifications approved, by the School Board.
In summary, as of the date of the hearing, the Respondent was not yet eligible to receive either a regular or temporary certificate from the Department of Education as a school psychologist and still lacked 24 semester hours of graduate credit necessary for such certification; she had not yet passed the Florida State Teacher Certification Examination for school psychologist although she had secured and enrolled in an appropriate internship to satisfy the above-referenced six-hour internship requirement.

Recommendation Having considered the foregoing Findings of Fact, the evidence of record, the candor and demeanor of the witnesses and the pleadings and arguments of the parties, it is, therefore, RECOMMENDED that a final order be entered by the School Board of Walton County rejecting the nomination of Ann Farrior to serve in the position of school psychologist for the school year 1999-2000, because good cause for such action has been demonstrated by a preponderance of the evidence in the manner found and concluded above. DONE AND ENTERED this 16th day of June, 2000, in Tallahassee, Leon County, Florida. P. MICHAEL RUFF Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 16th day of June, 2000. COPIES FURNISHED: Joseph L. Hammons, Esquire Hammons & Whittaker, P.A. 17 West Cervantes Street Pensacola, Florida 32501 George R. Mead, II, Esquire Clark, Pennington, Hart, Larry, Bond, Stackhouse & Stone 125 West Romana Street, Suite 800 Post Office Box 13010 Pensacola, Florida 32591-3010 John F. Bludworth Superintendent of Schools Walton County School District 145 Park Street, Suite 3 DeFuniak Springs, Florida 32433

Florida Laws (2) 120.569, 120.57 Florida Administrative Code (1) 6A-1.0502
# 6
BETTY CASTOR, AS COMMISSIONER OF EDUCATION vs FREEDA BRIDGES, 91-005918 (1991)
Division of Administrative Hearings, Florida Filed:Miami, Florida Sep. 17, 1991 Number: 91-005918 Latest Update: Jul. 13, 1992

Findings Of Fact Respondent holds a valid teaching certificate from the State of Florida, number 512951. Respondent's teaching certificate is valid through June 30, 1993. Respondent is certified to teach elementary education. Respondent is employed by the Dade County Public School Board (the "School Board"). Respondent was employed as a teacher at Palm Springs Elementary School in Dade County, Florida ("Palm Springs") for the school years 1989-1990 and 1990-1991. On or about April 30, 1990, Detective Michael Segarra, a police officer in Pembroke Pines, Florida, was investigating a bank robbery in that city at the site of the robbery. Respondent approached Detective Segarra and gave him relevant information concerning two men who may have committed the robbery. Respondent was riding in an automobile with two men who said they were going to rob a bank. Respondent was able to get out of the car by telling the two men that she wanted to go into a McDonald's restaurant across the street from the bank for some orange juice. The two men let her out of the car, and Respondent hid from them. They returned after the robbery was committed, searched without success for Respondent, and left. Respondent walked across the street and gave Detective Segarra the information she had. Based upon Respondent's unusual demeanor and behavior, Detective Segarra asked if he could inspect Respondent's purse. Respondent consented to the search, and Detective Segarra found nine small plastic bags of cocaine and a small cigarette with 20 grams or less of cannabis. Respondent admitted to Detective Segarra that she had used both controlled substances. Detective Segarra did not arrest Respondent at the time of the consent search because Respondent agreed to help him trace the source of the cocaine and the whereabouts of the two men during the previous day and a half. Detective Segarra questioned Respondent further at the police department, took her written statement, and then dropped her off at her residence. 5/ An Information was filed against Respondent on July 24, 1990, for possession of cocaine and cannabis. Respondent pled guilty to both charges on December 5, 1990. Adjudication of guilt was withheld. Respondent was ordered to pay a fine of $240 and placed on probation for two years. The terms of probation included random drug testing and regular drug evaluations. Respondent violated the terms of her probation by failing to timely pay her fine, by testing positive for cocaine, and by failing to report for regular drug evaluation. She was charged by affidavit dated February 15, 1991, with violating the terms of her probation. On April 5, 1991, Respondent pled guilty to violating her probation and to one count of possession of cocaine. Adjudication of guilt was again withheld, and her probation was revoked. Respondent was sentenced to two years of probation and required to complete a drug rehabilitation program at Mount Sinai Hospital. Respondent was removed from the classroom without pay sometime in August, 1990. She returned to the classroom in February, 1991, and was removed again without pay in April, 1991. Although Respondent has not returned to the classroom, the School Board never terminated her employment. She has remained on leave without pay for approximately a year and a half. 6/ Respondent is the first employee of the School Board to qualify for and participate in the Alternative Discipline Program (the "ADP").
The ADP is designed to rehabilitate employees with superior performance histories who have developed a chemical dependency and return them to the classroom as effective teachers. The program is adopted from a similar program developed at Mount Sinai Hospital for physicians with a chemical dependence. The ADP was developed in consultation with Dr. John Eustace, an addictionologist at Mount Sinai Hospital, and through the combined efforts of the School Board's Employee Assistance Program ("EAP"), the School Board's Office Of Professional Standards, and the United Teachers of Dade (the "UTD"). Respondent entered the ADP on August 15, 1991. The ADP is a two-year program that places qualified employees with a chemical dependence on leave without pay. If the participant has no connection with a chemical substance for a period of two years, there is a very strong possibility of permanent recovery. Approximately 80 percent of the individuals who have no connection with a chemical substance for two years recover permanently. A participant in the ADP is not entitled to utilize hardship benefits or extra pay benefits while on leave without pay but retains other fringe benefits, including hospitalization. During his or her leave, the participant is hospitalized and receives medical treatment. The participant is required to live in a halfway house, then a three-quarter house, and then to participate in programs of recovery such as Alcoholics Anonymous ("AA") or Narcotics Anonymous ("NA"). If the participant completes that part of the program successfully, the participant is entitled to return to the classroom on a part time basis and then on a permanent basis subject to probation for a year or more. During probation, the participant's performance, attendance, and participation in a program of recovery are strictly monitored. The terms of probation require that the participant sign a letter of resignation and waiver of right to appeal any termination of employment if the participant fails to successfully complete the ADP. In order to participate in the ADP, an employee must enter into a written agreement in which he or she agrees to: participate in a drug screening program utilizing random urine and blood testing within 24 hours of notification; abstain from all mood-altering substances, including alcohol, marijuana, crack/cocaine, over-the-counter preparations, stimulants, street drugs, and pharmaceuticals; participate in a structured chemical dependence program recommended by the EAP or designated program administrator; follow all recommendations of the treatment facility, including residential long-term treatment in a halfway house or other appropriate facility; participate in weekly aftercare upon completion of primary care at Mount Sinai Hospital; provide documentation of attendance at a minimum of five meetings a week at an appropriate program of recovery; obtain an AA or NA sponsor and complete a 12-step recovery program; encourage family members to attend their own 12-step support groups; utilize the comprehensive services available through EAP and the hospital for personal, physical, family, and stress-related problems; seek part-time employment upon completion of the structured treatment program only with permission of the program; attend monthly monitoring conferences with a designated fitness supervisor, union representative, and EAP coordinator; and be responsible for all treatment fees not covered by insurance.
A participant in the ADP further agrees to resign their employment and waive their right to appeal in the event the participant fails to successfully complete the terms of the ADP. Respondent is the first School Board employee to qualify for the ADP. Only teachers with good performance records qualify for the ADP. Prior to her substance abuse, Respondent had a good performance record. She was more than acceptable. She had very good performance evaluations and recommendations. Respondent executed the first written agreement utilized in the ADP. The written agreement executed by Respondent is substantially equivalent to but not identical to the form Settlement Agreement developed since Respondent entered the program. The form Settlement Agreement includes a letter of resignation which a participant must sign upon entering the ADP and which becomes effective immediately without appeal if the participant fails to complete the ADP successfully. Respondent has successfully completed the major portion of the ADP. She is currently eligible to return to the classroom as a substitute teacher for three days a week. If she successfully completes her part time employment, she will be eligible to return to full time teaching in August, 1992, on a probationary basis. Respondent will be required to execute a Settlement Agreement prior to returning to full time teaching on a probationary basis. Respondent, with the advice and consent of her attorney, agreed under oath during the formal hearing to immediately and voluntarily relinquish her teaching certificate if she failed to complete the remainder of the ADP. The ADP will not be successful if a participant has his or her teaching certificate revoked or suspended prior to completion of the program. Full time teaching on a probationary basis for at least one year is an integral part of the ADP. If the participant has his or her teaching certificate revoked or suspended, he or she cannot complete the full time probationary phase of the ADP. Revocation of Respondent's teaching certificate would cause her to lose her continuing contract status. If she obtained a teaching certificate following revocation, she would be required to sign an annual contract.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is recommended that Respondent be found guilty of violating Section 231.28(1)(e), Florida Statutes, and that Respondent be placed on probation pursuant to Section 231.262(6)(d), Florida Statutes. It is further recommended that the terms and conditions of Respondent's probation should be the same terms and conditions as those prescribed in the agreement entered into between Respondent and the Employee Assistance Program when Respondent entered the Alternative Discipline Program (the "ADP") and any additional terms and conditions contained in the Settlement Agreement Respondent will be required to enter into upon resumption of full time employment. As a further condition of probation, it is recommended that Respondent be required to successfully complete the ADP and, in the event Respondent fails to do so, voluntarily and immediately resign her employment from the Dade County School Board and surrender her teaching certificate to Petitioner. RECOMMENDED this 19th of February 1992, in Tallahassee, Florida. DANIEL MANRY Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 19th day of February 1992.

Florida Laws (7) 120.57, 458.331, 493.6118, 626.611, 626.621, 943.13, 943.1395 Florida Administrative Code (1) 6B-4.009
# 7
SUSAN E. WILSON vs BOARD OF PROFESSIONAL ENGINEERS, 97-003468 (1997)
Division of Administrative Hearings, Florida Filed:Jacksonville, Florida Jul. 28, 1997 Number: 97-003468 Latest Update: Jan. 27, 1999

The Issue Is Petitioner entitled to one additional point on the October 1996 Professional Civil Engineer Examination so as to achieve a passing score for licensure in Florida?

Findings Of Fact Petitioner took the Civil Engineer Examination given in October 1996. The Department of Business and Professional Regulation's Bureau of Testing notified Petitioner by Examination Grade Report dated February 17, 1997, that she had earned a score of 69.00 on the Civil Engineer Examination. The minimum passing score for the Civil Engineer Examination is 70.00. Petitioner timely requested formal hearing and challenged only Question 120, for which she received no points. Petitioner is trained as a materials engineer. Question 120 is a soils and foundation problem outside her concentrated area of study. It is an open-book examination question. Petitioner selected the correct equation from the applicable manual, but acknowledged that she solved the variables of that equation incorrectly. The National Council of Examiners for Engineering and Surveying (NCEES) produced, distributed, and was responsible for grading the examinations. Petitioner contended that the examiner who graded her answer sheet applied criteria different from the examination criteria published by the NCEES. Petitioner further contended that since one criterion her grader actually used was merely to "write the correct equation," she should be awarded at least one point on that basis. However, a comparison of the actual grader's handwritten "summary" on Petitioner's Solution Pamphlet (Respondent's Exhibit 3) and the NCEES's Solutions and Scoring Plan (Respondent's Exhibit 2) does not bear out Petitioner's theory. It is clear that, with the question divided into five possible parts worth two points' credit each, merely selecting the correct equation from an open text would not amount to two points, or even one point, of credit. I accept as more competent, credible, and persuasive the testimony of Eugene N. Beauchamps, the current Chairman of the NCEES Examination Policy Committee and a Florida licensed Professional Engineer, that the grader's "summary" describes what he actually reviewed in Petitioner's written solution to Question 120 rather than establishing one or more different grading criteria. In order to receive a score of two on Question 120, the candidate was required to demonstrate any one of five requirements listed in the NCEES Solution and Scoring Plan for "2-Rudimentary Knowledge." The first requirement in the NCEES Solution and Scoring Plan (Respondent's Exhibit 2) for receiving a score of two points is, "Determines effective overburden stress at mid-depth of clay layer." The remaining four NCEES scoring criteria required that the examinee: Computes the change in effective stress at mid-depth of the clay layer due to placement of the fill. Computes the primary consolidation settlement, based on a change in effective stress, due to the fill surcharge. Evaluates the Average Degree of Consolidation and the Time Factor. Determines the waiting period after fill placement, recognizing the existence of double-drained conditions. In order to gain two more points (total 4 points) so as to demonstrate "More Than Rudimentary Knowledge But Insufficient to Demonstrate Minimum Competence," Petitioner would have to have met two of the five bulleted criteria. For two more points (total 6 points) for "Minimum Competence," Petitioner would have had to score three bullets. For two more points (total 8 points) for "More than Minimum But Less Than Exceptional Competence," Petitioner would have had to score four bullets.
Finally, to attain "Exceptional Competence" for 10 total points, Petitioner would have had to score all five bullets. In the first correct equation for answering Question 120, "p sub zero" (p naught) equals the present effective overburden pressure, which represents what clay was present before anything was put on top of the clay layer. "P" equals the total pressure acting at mid-height of the consolidating clay layer or the pressure of the dirt and the water in the dirt. "H" equals the thickness of the consolidating clay layer. Petitioner's solution for the first bullet, "determining the effective overburden stress at mid-depth of clay layer," indicated p sub zero (p naught) as the "present effective overburden pressure," but it incorrectly calculated p sub zero equaling 125 pounds multiplied by 13 feet. This is incorrect because the effective overburden pressure would not include 13 feet of fill. The 13 feet of fill is not part of p sub zero, the present effective overburden pressure. Petitioner's solution for the first bullet also multiplied water, represented by 62.4, by 12, which is incorrect. She should have used a multiplier of 10 to receive credit for this problem. The grader indicated the correct equation was used incorrectly by Petitioner because of the two foregoing incorrect calculations. The equation, as Petitioner stated it, was correct and her multiplication was correct. Her solution identified P sub zero as present effective overburden pressure, but present effective overburden pressure would not include the fill. Petitioner had the correct equation for the present effective overburden pressure and her mathematics were correct. However, she did not use the consolidation equation correctly and did not obtain the correct percentage of primary consolidation. As stated, the problem did not consider the fill as part of the present effective overburden pressure. Her solution also contained the correctly written time rate of settlement equation but failed to use it, and no waiting period was determined. The practical result of Petitioner's error could range from a cracked building to a collapsed building, depending upon the degree of error as applied to the site and materials.
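For context only, and not as part of the record: the effective overburden stress described above follows the standard relationship that effective stress equals total stress minus pore-water pressure, excluding any newly placed fill, and the settlement and waiting-period criteria follow the classical one-dimensional consolidation equations. A minimal sketch, using hypothetical numbers that are not the actual parameters of Question 120 (which are not reproduced in this record), is:

\[
\sigma'_0 = \sum_i \gamma_i z_i - \gamma_w z_w
\]

For a hypothetical profile of 10 ft of sand (\(\gamma = 120\ \text{pcf}\)) over clay (\(\gamma_{\text{sat}} = 110\ \text{pcf}\)), with the water table at the top of the clay, the effective stress at a point 6 ft into the clay would be

\[
\sigma'_0 = (120)(10) + (110)(6) - (62.4)(6) \approx 1{,}486\ \text{psf},
\]

and, for a normally consolidated clay of thickness \(H\), the primary consolidation settlement due to a fill surcharge \(\Delta\sigma'\) would follow

\[
S_c = \frac{C_c H}{1 + e_0} \log_{10}\!\left(\frac{\sigma'_0 + \Delta\sigma'}{\sigma'_0}\right),
\]

with the waiting period obtained from the time factor \(T_v = c_v t / H_{dr}^2\), where \(H_{dr}\) is half the layer thickness under double-drained conditions.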

Recommendation Upon the foregoing findings of fact and conclusions of law, it is RECOMMENDED that the Department of Business and Professional Regulation enter a Final Order denying Petitioner's challenge and affirming her score as one point below passing. RECOMMENDED this 3rd day of March, 1998, in Tallahassee, Leon County, Florida. ELLA JANE P. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 3rd day of March, 1998. COPIES FURNISHED: Susan E. Wilson 3581 Jose Terrace Jacksonville, Florida 32217 R. Beth Atchison Assistant General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399 Angel Gonzalez, Executive Director Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399 Lynda L. Goodgame General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399

Florida Laws (1) 120.57
# 8
LAKE COUNTY SCHOOL BOARD vs. LAWRENCE R. CAMPBELL, 81-001087 (1981)
Division of Administrative Hearings, Florida Number: 81-001087 Latest Update: Oct. 23, 1989

The Issue Whether respondent, a junior high school teacher, should be dismissed from employment pursuant to Section 231.36(4), Florida Statutes (1979), on grounds of incompetency and inefficiency--specifically, his alleged failure to use proper testing and grading techniques and procedures.

Findings Of Fact The Respondent Respondent, a classroom teacher under continuing contract, has worked for the Lake County School Board for 24 years; he has taught social studies at Leesburg Junior High School for the last eight years. He is certified to teach social studies to grades 1 through 12. (Testimony of respondent; P-6.) II. June, 1980: Identification of Respondent's Grading and Testing Deficiencies In June, 1980 -- at the conclusion of the 1979-80 school year -- P. Jeffrey Ladd, assistant principal at Leesburg Junior High School, received an inquiry concerning a report card completed by respondent. Upon investigation, Ladd discovered that respondent had failed to calculate a student's final grade pursuant to the School Board Grading Policy No. 7.07 3(d) 3/. At Ladd's request, respondent returned to school and recomputed the grades awarded to his other students; it was then discovered that respondent had similarly miscalculated the grades for all 117 students assigned to him during the school year. (Testimony of Polk; P-4, P-46, P-51.) Ladd then reviewed the examinations which respondent had given and found: (1) one final exam contained a 26-point "gift" to students; (2) generous 10-to-30 point grading curves had been used; (3) 20 out of 74 questions on an eighth-grade exam required simple "unscrambling" of words within the social studies context; (4) many pages on final exams had no marks indicating they had been graded; and (5) the method used in calculating final grades was unclear. (P-51.) On June 10, 1980, Ladd expressed his concern to respondent concerning the miscalculation of final grades and the deficiencies he had found in the final examinations. Respondent's attitude was cooperative; he did not disagree that the problems existed or that they required correcting. Ladd followed up the conference by sending respondent, by registered mail, a written summary of the matters discussed; respondent refused to accept the letter. (Testimony of respondent; P-16, P-51.) When Joseph R. Polk, principal of Leesburg Junior High School, returned from summer vacation, Ladd informed him of respondent's grading miscalculations and respondent's refusal of the registered letter. Polk then met with respondent on July 24, 1980, and went over the grading and testing problems identified in Ladd's registered letter. Because these problems had a far-reaching effect on students, Polk considered them to be serious deficiencies; he told respondent that such mistakes could not be tolerated, that a teacher with respondent's experience should not take them lightly, and that other action, including dismissal, would have to be taken if the problems continued. (Tr. 19, 22.) 4/ Respondent acknowledged his grading deficiencies and stated that he intended to take care of the problem. (Testimony of Polk.) III. Respondent's Subsequent Performance First Nine-Week Period For the purpose of determining whether Respondent's grading and testing deficiencies had been corrected, Polk and Ladd reviewed the next examination given by respondent. Their review at the end of the first nine-week period in the 1980-81 school year indicated that students had done poorly on the test; there were three As, five Bs, eleven Cs, nine Ds, and ninety-one Fs. The test covered material not included in the lesson book; it contained open-ended subjective questions which could have been answered in a number of ways. Examples were: 2. The United States is a ( ) ( ) ( ) ( ). [Answer: (land)(of)(great)(resources).] 6.
The ( )( ) is one of the most beautiful regions of our nation. [Answer: (Pacific)(Coast).] (P-18.) This examination was not an effective measure of the students' progress. On October 20, 1980, Polk and Ladd went over the examination with respondent, pointed out the deficiencies, and recommended that he take a course in tests and measurements to improve his ability to give examinations. The next day, Polk completed and delivered to respondent a teacher Pre-Assessment Form which identified the test deficiencies discussed at the conference and explicitly recommended corrective action: "Take a course in tests and measurements to improve your [respondent's] testing abilities." (P-18.) Respondent was given another nine weeks to show improvement, and warned that failure to correct the area of concern could lead to his dismissal or non-renewal. (Testimony of Polk; P-18, P-51.) Second Nine-Week Period During the next nine-week period, respondent asked Polk and Ladd to review his semester exams before they were administered; Polk agreed. On December 15, 1980, Polk and Ladd met with respondent and reviewed the proposed examinations. The exam questions had been taken directly from a teacher's manual which accompanied the course textbook; Polk and Ladd concluded that the questions were extremely difficult. When Polk asked respondent five of the questions, respondent was unable to give a correct answer. The test also covered material which may not have been taught to the students. Polk and Ladd suggested improvements to respondent, but there was insufficient time to revise the tests since they were to be given the next day. (Testimony of Polk; P-21, P-51.) Respondent taught five classes in U.S. History to eighth graders; students were grouped in these classes according to their ability levels. Ladd and Polk also questioned whether the tests prepared by respondent were adequate to test the abilities of the five different levels of students. (Testimony of Polk; P-51.) On December 17, 1980, respondent administered the tests which had been reviewed earlier by Polk and Ladd. Fifty-two percent of his students -- in all classes -- received a D or F on the exams. One class had been given the test, with answers, three days earlier; such action is unusual and not a sound educational practice. Polk met with respondent that afternoon, told him to stop providing students with examinations in advance, and asked for all his exams so that they could each be checked individually. In January, Polk and Ladd reviewed these examinations to see if respondent's grading problems had been corrected. They found little improvement. All of the exams in two of respondent's classes had been graded incorrectly. Some material on the exams had not been taught in class. On January 15, 1981, Polk wrote respondent a memorandum describing these testing deficiencies, concluding that the problems noted in October, 1980, had not been corrected, and making six specific recommendations for improvement. These recommendations required respondent to observe testing procedures in other social studies classes; to improve his math skills; 5/ to cease providing students with exams in advance; and to take a course in testing and grading techniques, as previously requested in October, 1980. Polk also asked respondent to again submit his examinations for review during the next (third) nine-week grading period. Respondent was reminded that his course planning book should indicate that students had been taught the material included on a test.
The memorandum cautioned that if respondent's testing problems were not corrected by the end of the third nine-week period (March 12, 1981) he might be dismissed or returned to annual contract. (Testimony of Polk; P-26, P-51.) Third Nine-Week Period During the next nine-week period, respondent did not comply with Polk's October, 1980, and January, 1981, requests to take a course on testing and measurement procedures. 6/ In March, respondent did ask Polk for help in establishing such a course at the local teacher's education center; although Polk suggested contacting the education center, respondent did not do so. A college course in testing and grading techniques was available at several schools in the Lake County area during the first quarter of 1980. Notice of the course offered by the University of Central Florida was posted in the junior high school's teachers' lounge. (Testimony of Polk; P-52.) In February and early March, 1981, respondent, as requested, submitted his examinations for prior review. Some improvement was noted by Ladd and Polk. The tests appeared to follow the respondent's planbook and, on their face, were generally acceptable. 7/ (Testimony of Polk; p-27, P-34.) However, the results of the nine-week final examinations showed that respondent had not corrected his deficiencies in grading examinations. Numerous computation errors were detected; 59 out of 121 papers were incorrectly graded. Questions that were incorrect were marked correct -- and vice versa. Also, there was an unusually large number of poor grades: 65 Fs, 18 Ds, 34 Cs, 2 Bs, and 1 A. Assuming that respondent adequately instructed his students, these grades indicate that the tests did not adequately measure what had been taught during the course. (Testimony of Polk; P-36.) Although Polk had suggested in his January memorandum to respondent that he take a course to improve his proficiency in math, respondent did not do so. Such courses were readily available at the Adult Education Center and Sumter Community College. (Testimony of Polk.) Respondent's Actions After Recommendation for Dismissal On March 17, 1981, Polk notified respondent that he would be recommended for dismissal based on incompetency and inefficiency as demonstrated by his failure to use proper testing and grading techniques. Polk concluded that despite repeated efforts to assist 8/ respondent, no meaningful improvement had been made. (Testimony of Polk; P-37.) Thereafter, during June, 1981, respondent took a course on testing and evaluation techniques at Bethune-Cookman College in Daytona Beach. (Testimony of respondent; R-1.)

Recommendation Based on the foregoing findings of fact and conclusions of law, it is RECOMMENDED: That the School Board of Lake County enter a final order dismissing respondent from his employment. DONE AND ORDERED this 11th day of September, 1981, in Tallahassee, Florida. R. L. CALEEN, JR. Hearing Officer Division of Administrative Hearings The Oakland Building 2009 Apalachee Parkway Tallahassee, Florida 32301 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 11th day of September, 1981.

Florida Laws (2) 120.57, 7.07
# 9
ESTHER C. REEDY vs. DEPARTMENT OF EDUCATION, 80-001346 (1980)
Division of Administrative Hearings, Florida Number: 80-001346 Latest Update: Nov. 15, 1990

Findings Of Fact On or about January 30, 1979, Respondent, State of Florida, Department of Education (hereinafter "Department") issued an Announcement of Position Vacancy for Position number 00533 for an Educational Data Analyst I (hereinafter "EDA-I"). The deadline for filing applications was February 20, 1979. The minimum qualifications were: Graduation from an accredited four-year college or university and two years of experience in school administration, teaching or experience directly related to the specific school service program. Professional or technical experience in one of the above areas may be substituted for the required college training on a year-for-year basis. These qualifications, known as class specifications, are issued by the Department of Administration. The advertised position was in the Teacher Certification Section. John Stables was the Administrator for the Teacher Certification Section during all times material hereto. The first step in the selection process begins with the request for announcement of position vacancy. When this is approved, the Department issues an Announcement of Position Vacancy. The Announcement contains a closing date by which all those interested in the position must file their applications. The applications are received in the Department's Personnel Office and are screened to determine whether or not the applicant meets the minimum qualifications as set forth in the Announcement. As the applications are received, they are forwarded to the section where the vacant position is available. After the applications are received by the section that has the vacant position, the applications are reviewed, interviews are conducted, and the top candidates are designated. A specific recommendation is made by the section head, approved by the Division Director, and then reviewed by the Personnel Office for compliance with appropriate rules and to verify that all paperwork has been properly completed. The recommendation is then forwarded to Francis N. Millett, Jr., the Deputy Commissioner of Education, who is also the Department's Equal Employment Opportunity Officer, who reviews it. The recommendation then goes to Commissioner Turlington, who makes the final decision and signs the appointment letter. In the Teacher Certification Section, Patricia Wortham had the duty of receiving all applications and compiling a list of the applicants. It was also her duty to make arrangements for interviews of any applicants that requested an interview. In addition, when the Section had made its recommendation, she typed the Department's form containing statistical information and returned the form with the applications of those who had not been selected to the Personnel Office. In the Teacher Certification Section, Myra Burkhalter, an Educational Consultant III, had the duty of conducting interviews with the applicants in the first instance. Burkhalter had been employed in the Section for approximately ten years and had served as an Educational Consultant III for the last three or four of those years. The Educational Data Analysts I and II were under her general supervision. She was the highest ranking employee in the Section, outranked only by Staples, and was specifically given the task of interviewing applicants. After Burkhalter completed her interview with a particular applicant, she would introduce the applicant to Staples if he were available. 
The EDA-I position is a meticulous job that requires from one to one and a half years of training before the individual is capable of performing the job. There is contact between the EDA-I and persons in educational institutions outside the Department. The analysts review transcripts of persons applying for a teaching certificate in the State of Florida. The duties require counseling and interviewing with teacher-applicants and further require an almost instant recall of all the statutes and State Board of Education teacher certification rules. The analyst reviews the courses and experience of the teacher-applicant and applies the course work and credits against the rules to determine whether the person applying for a certificate meets the minimum qualifications. Some 1,600 to 1,700 institutions from which the certification applicants obtained schooling have to have their accreditation status verified by the analyst in order to determine whether or not those institutions meet the standards set by the State of Florida. Additionally, many applicants have degrees from institutions in countries other than the United States, and the analyst must either know or be able to find information regarding the schools in those foreign countries. Staples and Burkhalter considered interviewing an applicant for an EDA-I position to be imperative. Burkhalter explained to the applicant in some detail the preciseness required in performing the job and the pressures of the job, since there were always teacher certification requests to be analyzed. The year's training procedure, the amount of knowledge that must be acquired by the analyst in order to perform the required functions, and the importance of the screening process of the certification applicants in order to assure that only qualified teachers are certified were explained to the interviewee. Additionally, Burkhalter talked with the EDA-I applicant to determine, as nearly as possible in an interview, how that person would react to the type of work and duties of the position. The communicative skills of the job applicant were discussed. It is absolutely essential that the EDA-I have the ability to communicate both orally and in writing. The reason for the high degree of communication ability is that an EDA-I, after the training period, writes letters to educational institutions concerning transcripts and talks to and corresponds with persons requesting certification. Part of the duties involves contact with the public. An analyst spends one week out of every six or eight weeks at the front desk working with office visitors. Good communicative skills are required in order that the EDA-I can answer questions posed by applicants for teacher certification. The interviewing process is thus required in order to ascertain the applicant's communicative skills. Another purpose of the interview is to determine as nearly as possible the applicant's attitude in interpersonal relationships, since there are approximately eighteen analysts doing the same thing, and it is essential that they work as a team. The job places the EDA-I under considerable pressure in working closely with other people, especially during the training period. The training period, by necessity, requires close supervision of the EDA-I and involves frequent correction of the trainee's work.
Since the Department invests over a year in the training of an EDA-I, it is essential that an applicant's future plans be discussed, particularly how long the applicant intends to remain in Tallahassee, where the job is located, and how long the applicant intends to remain in the position. Inasmuch as the information to be given to and received from the interviewee can only be communicated and evaluated in a face-to-face meeting, it is essential that those applying for the position be interviewed.

The Department has no established policy regarding the conducting of employment interviews. The method utilized is left up to the particular section doing the interviewing. Furthermore, the Department of Administration has promulgated no rules or guidelines requiring that interviews be conducted in a certain manner, that an agency interview a certain number of applicants, or that an agency interview any applicants at all.

Since there were no state or department rules for conducting interviews, it was the practice of the Section to interview those applicants requesting an interview. Since there were many applicants for each EDA-I position, and since most of the applicants met the minimum qualifications, experience had shown that a sufficient number of applicants would request an interview, from which the top four or five names would be submitted to Staples for his recommendation. Staples believed that the fact that a person would call and ask for an interview was indicative of the person's enthusiasm and interest in the job itself. He believed it was a further indication of the person's self-confidence and desire to obtain employment. During the interview process, Burkhalter and Staples endeavored to evaluate whether the applicant would fit into the EDA-I job. Staples and Burkhalter never refused to interview anyone who requested an interview. Additionally, no one was hired who had not been interviewed.

On or about February 16, 1979, Petitioner filed an application with the Personnel Office for the EDA-I Position number 00533. She was born in Puerto Rico, where the main language is Spanish. Her family spoke French and Spanish while she was growing up, and Petitioner speaks Spanish and speaks English with an accent. Petitioner's application was forwarded to the Teacher Certification Section. Twenty-five applications were received for Position number 00533. Eight persons were interviewed by Burkhalter for Position number 00533: five were interviewed in February and March 1979, and three had been interviewed on previous occasions.

Approximately two weeks after Petitioner filed her application at the Personnel Office, she called the Teacher Certification Section inquiring as to what action had been taken with her application. Since the person answering the telephone had no information regarding the applications for the position, Petitioner requested that Staples return her phone call. When she did not receive a return call from Staples, she again called the Teacher Certification Section, again spoke to someone with no information regarding the pending applications, and again requested that Staples return her call. When she did not receive a return phone call from Staples, Petitioner called the Teacher Certification Section a third time. Patricia Wortham, the person in charge of scheduling interviews of applicants, took the third phone call and distinctly remembers her conversation with Petitioner.
Petitioner asked if the position had been filled and why she had not been called for an interview. Wortham explained that the Section did not call applicants to schedule interviews, but rather waited until an applicant requested an interview. Wortham asked Petitioner if she would like to come in for an interview, and Petitioner replied that she did not want an interview. Wortham was surprised by Petitioner's refusal to come in for an interview since, in the seven years that Wortham had worked in that position, she had never had an applicant decline to come in for an interview. Petitioner's telephone conversation with Wortham concerning an interview occurred before anyone had been selected to fill the position. Petitioner was informed that the position had not been filled and that an interview was available. Although Petitioner denies that she was offered an interview, she does admit that during her third phone call to the Teacher Certification Section an interview was discussed.

By the time Petitioner called the Section for the fourth time, the position had been filled, and she was so advised by Burkhalter. Shortly thereafter, she received a letter officially notifying her that a selection had been made.

Margaret Goforth filed an application and met the minimum qualifications for the position. She requested and was granted an interview. Since she was believed to be the best applicant of those interviewed, she was selected. Staples signed the recommendation to hire Goforth on March 16, 1979, and she began work on April 24, 1979.

After Ralph Turlington became Commissioner of Education in 1974, he determined that the Department needed an Equal Employment Opportunity (hereinafter "EEO") policy committee and an EEO officer. The Department subsequently instituted an EEO policy. The purpose of the policy is to provide people of all racial and ethnic backgrounds a greater opportunity to apply for and be selected for positions in the Department. To implement the policy, the Department began to advertise open positions widely so that people meeting the qualifications would have an opportunity to apply. An EDA-I is considered a professional position. The Department sends position vacancy announcements for professional positions to approximately six hundred locations, including universities, community colleges, school districts, minority groups, and affirmative action groups, and also distributes the announcements within the Department.

The purpose of the EEO policy is to ensure that all applications for positions are given equal treatment. The policy sets forth target areas, such as minorities, handicapped persons, and affirmative action groups, in order for these persons to be notified and have the opportunity to apply for positions. The EEO policy does not specify how job applicants are to be interviewed or selected for interviews. The procedure for conducting the interviews and making the final selection is left up to the individual section, provided the procedure used does not discriminate against an applicant.

Recommendation

Based upon the foregoing Findings of Fact and Conclusions of Law, it is, therefore, RECOMMENDED THAT:

A final order be entered by the Florida Commission on Human Relations finding that Esther C. Reedy was not discriminated against on the basis of her age or national origin and dismissing her Petition for Relief with prejudice.

RECOMMENDED this 31st day of August, 1982, in Tallahassee, Florida.

LINDA M. RIGOT, Hearing Officer
Division of Administrative Hearings
The Oakland Building
2009 Apalachee Parkway
Tallahassee, Florida 32301
(904) 488-9675

Filed with the Clerk of the Division of Administrative Hearings this 31st day of August, 1982.

COPIES FURNISHED:

Robert I. Scanlan, Esquire
Post Office Box 10311
Tallahassee, Florida 32302

Gene T. Sellers, Esquire
State Board of Education
Knott Building
Tallahassee, Florida 32301

Aurelio Durana, Esquire
Assistant General Counsel
Florida Commission on Human Relations
2562 Executive Center Circle, East
Suite 100, Montgomery Building
Tallahassee, Florida 32301

Mr. Richard Williams
Executive Director
Florida Commission on Human Relations
2562 Executive Center Circle, East
Suite 100, Montgomery Building
Tallahassee, Florida 32301

