NATIONAL COMPUTER SYSTEMS, INC. vs DEPARTMENT OF EDUCATION, 99-001226BID (1999)
Division of Administrative Hearings, Florida Filed: Tallahassee, Florida Mar. 17, 1999 Number: 99-001226BID Latest Update: Jul. 19, 1999

The Issue The primary issue is whether the process used by the Department of Education (Department) for evaluating and ranking the proposals submitted in response to Request For Proposal (RFP) 99-03 for the Florida Comprehensive Assessment Test (FCAT) administration contract was contrary to the provisions of the RFP in a way that was clearly erroneous, contrary to competition, arbitrary, or capricious.

Findings Of Fact The RFP for the FCAT describes a five stage process for evaluating proposals. In Stage I, the Department’s Purchasing Office determined whether a proposal contained certain mandatory documents and statements and was sufficiently responsive to the requirements of the RFP to permit a complete evaluation. Stage II involved the Department’s evaluation of a bidder’s corporate qualifications to determine whether the bidder has the experience and capability to do the type of work that will be required in administering the FCAT. Stage III was the Department’s evaluation of a bidder’s management plan and production proposal. In Stage IV, the Department evaluated a bidder’s cost proposal. Stage V involved the ranking of proposals based on points awarded in Stages II-IV. If a proposal did not meet the requirements at any one stage of the evaluation process, it was not to be evaluated in the following stage. Instead, it was to be disqualified from further consideration. Stages II and III of the evaluation process were conducted by an evaluation team comprised of six Department employees: Dr. Debby Houston, Ms. Lynn Joszefczyk, Dr. Peggy Stillwell, Dr. Cornelia Orr, Dr. Laura Melvin, and Ms. Karen Bennett. Dr. Thomas Fisher, head of the Department’s Assessment and Evaluation Services Section, and Dr. Mark Heidorn, Administrator for K-12 Assessment Programs within the Department’s Assessment and Evaluation Services Section, served as non-voting co-chairs of the evaluation team. The focus of this proceeding is Stage II of the evaluation process addressing a bidder’s corporate qualifications. RFP Provisions Regarding Corporate Qualification The FCAT administration contractor will be required to administer tests to approximately one and a half million students each year in a variety of subject areas at numerous grade levels. The FCAT program involves a complex set of interrelated work activities requiring specialized human resources, technological systems and procedures. The FCAT must be implemented annually within limited time periods. The FCAT administration contractor must meet critical deadlines for the delivery of test materials to school districts and the delivery of student scores prior to the end of the school year. In developing the RFP, the Department deliberately established a set of minimum requirements for corporate qualifications that a bidder was to demonstrate in order for its proposal to be eligible for further evaluation. The purpose of the RFP’s minimum corporate qualifications requirements was to limit bidding to qualified vendors who have demonstrated prior experience in successfully administering large-scale assessment projects like the FCAT, thereby providing the Department with some degree of assurance that the winning bidder could successfully administer the FCAT. The instructions to bidders regarding the minimum requirements for corporate qualifications are contained in RFP Section 10, which gives directions on proposal preparation. Section 10.1, which lists certain mandatory documents and statements to be included in the bidder’s proposal, requires that a transmittal letter contain "[a] statement certifying that the bidder has met the minimum corporate qualifications as specified in the RFP." These "minimum corporate qualifications" are set forth in RFP Appendix J. RFP Section 10.2 identifies what a bidder is required to include in its proposal with respect to corporate qualifications. 
The first paragraph of Section 10.2 directs a bidder generally to describe its qualifications and experience performing tasks similar to those that it would perform in administering the FCAT, in order to demonstrate that the bidder is qualified where it states: Part II of a bidder’s proposal shall be entitled Corporate Qualifications. It shall provide a description of the bidder’s qualifications and prior experience in performing tasks similar to those required in this RFP. The discussion shall include a description of the bidder’s background and relevant experience that qualifies it to provide the products and services required by the RFP. RFP Section 10.2, however, is not limited to a directive that qualifications and past experience be described generally. Instead, Section 10.2, also communicates, in plain and unambiguous terms, that there are specific minimum corporate qualifications a bidder must demonstrate: The minimum expectations for corporate qualifications and experience are shown in Appendix J. There are two separate sets of factors, one set of eight for the developmental contractor and another set of nine for the administration contractor. Bidders must demonstrate their Corporate Qualifications in terms of the factors that are applicable to the activities for which a bid is being submitted -- development or administration. For each criterion, the bidder must demonstrate that the minimum threshold of experience has been achieved with prior completed projects. (Emphasis added.) Moreover, Section 10.2 singles out for emphasis, in relation to the administration component of the RFP, the importance placed on a bidder’s ability to demonstrate experience processing a large volume of tests: The [bidder’s prior completed] projects must have included work tasks similar to those described herein, particularly in test development or processing a comparable number of tests. The bidder will provide a description of the contracted services; the contract period; and the name, address, and telephone number of a contact person for each of the contracting agencies. This description shall (1) document how long the organization has been providing similar services; (2) provide details of the bidder’s experience relevant to the services required by this RFP; and (3) describe the bidder’s other testing projects, products, and services that are similar to those required by this RFP. (Emphasis added.) The Department thus made clear its concern that bidders demonstrate experience with large-scale projects. RFP Appendix J sets forth nine different criteria (C1 through C9) for the administration contractor. As stated in RFP Section 10.2, "[f]or each criterion, the bidder must demonstrate that the minimum threshold of experience has been achieved with prior completed projects . . . ." (emphasis added). Appendix J contains a chart which lists for each criterion: (1) a summary of the related FCAT work task, (2) the detailed criteria for the bidder’s experience related to that work task, and (3) the necessary documentation a bidder must provide. Criterion C4 and Criterion C6 include work tasks that involve the use of image-based scoring technology. C4 and C6 are the only corporate qualifications criteria at issue in this proceeding. RFP Provisions Involving Corporate Qualifications for Image-Based Scoring "Handscoring" is the test administration activity in which open-ended or performance-based student responses are assessed. 
This practice involves a person reading something the student has written as part of the test, as distinguished from machine scoring multiple choice responses (i.e., the filled-in "bubbles" on an answer sheet). There are two types of handscoring: (1) paper-based handscoring, and (2) image-based handscoring. Paper-based handscoring requires that a student response paper be sent to a reader, who then reviews the student’s response as written on the paper and enters a score on a separate score sheet. Image-based handscoring involves a scanned image of the student’s response being transmitted to a reader electronically. The student’s response is then projected on a computer screen, where the reader reviews it and assigns a score using the computer. The RFP requires that the reading and math portions of the FCAT be handscored on-line using imaging technology beginning with the February 2000 FCAT administration. The RFP provides that the writing portion of the FCAT may be handscored using either the paper-based method or on-line imaging technology during the February 2000 and 2001 FCAT administrations. However, on-line image-based scoring of the writing portion of the FCAT is required for all FCAT administrations after February 2001. An image-based scoring system involves complex computer technology. William Bramlett, an expert in designing and implementing large-scale imaging computer systems and networks, presented unrefuted testimony that an image-based scoring system will be faced with special challenges when processing large volumes of tests. These challenges involve the need to automate image quality control, to manage the local and wide area network load, to assure adequate server performance and storage requirements, and to manage the work flow in a distributed environment. In particular, having an image-based scoring system process an increasing volume of tests is not simply a matter of adding more components. Rather, the system’s basic software architecture must be able to understand and manage the added elements and volume involved in a larger operation. According to Bramlett, there are two ways that the Department could assess the ability of a bidder to perform a large-scale, image-based scoring project such as the FCAT from a technological perspective: (1) have the bidder provide enough technological information about its system to be able to model or simulate the system and predict its performance for the volumes involved, or (2) require demonstrated ability through completion of prior similar projects. Dr. Mark Heidorn, Administrator for Florida’s K-12 Statewide Assessment Programs, was the primary author of RFP Sections 1-8, which describe the work tasks for the FCAT -- the goods and services vendors are to provide and respond to in their technical proposals. Dr. Heidorn testified that in the Department’s testing procurements involving complex technology, the Department has never required specific descriptions of the technology to be used. Instead, the Department has relied on the bidder’s experience in performing similar projects. Thus, the RFP does not specifically require that bidders describe in detail the particular strategies and approaches they intend to employ when designing and implementing an image-based scoring system for FCAT. Instead, the Department relied on the RFP requirements calling for demonstrated experience as a basis to understand that the bidder could implement such an image-based scoring system.
Approximately 717,000 to 828,000 student tests will be scored annually by the FCAT administration contractor using imaging technology. The RFP, however, does not require that bidders demonstrate image-based scoring experience at that magnitude. Instead, the RFP requires bidders to demonstrate only a far less demanding minimum level of experience using image-based scoring technology. Criterion C4 and Criterion C6 in Appendix J of the RFP each require that a bidder demonstrate prior experience administering "a minimum of two" assessment programs using image-based scoring that involved "at least 200,000 students annually." The requirements for documenting a "minimum of two" programs or projects for C4 and C6 involving "at least 200,000 students annually" are material because they are intended to provide the Department with assurance that the FCAT administration contractor can perform the large-scale, image-based scoring requirements of the contract from a technological perspective. Such experience would indicate that the bidder would have been required to address the sort of system issues described by Bramlett. Dr. Heidorn testified that the number 200,000 was used in C4 and C6 "to indicate the level of magnitude of experience which represented for us a comfortable level to show that a contractor had enough experience to ultimately do the project that we were interested in completing." Dr. Fisher, who authored Appendix J, testified that the 200,000 figure was included in C4 and C6 because it was a number judged sufficiently characteristic of large-scale programs to be relevant for C4 and C6. Dr. Fisher further testified that the Department was interested in having information that a bidder’s experience included projects of a sufficient magnitude so that the bidder would have experienced the kinds of processing issues and concerns that arise in a large-scale testing program. The Department emphasized this specific quantitative minimum requirement in response to a question raised at the Bidder’s Conference held on November 13, 1998: Q9: In Appendix J, the criteria for evaluating corporate quality for the administration operations C4, indicates that the bidder must have experience imaging as indicated. Does this mean that the bid [sic] must bid for using [sic] imaging technology for reading and mathematics tests? A: Yes. The writing assessment may be handscored for two years, and then it will be scored using imaging technology. To be responsive, a bid must be for imaging. The corporate experience required (200,000 students annually for which reports were produced in three months) could be the combined experience of the primary contractor and the subcontractors. (Emphasis added.) Criterion C4 addresses the RFP work tasks relating to handscoring, including both the image-based handscoring of the reading and math portions of the FCAT for all administrations and the writing portions of the FCAT for later administrations. The "Work Task" column for C4 in Appendix J of the RFP states: Design and implement efficient and effective procedures for handscoring student responses to performance tasks within the limited time constraints of the assessment schedule. Handscoring involves image-based scoring of reading and mathematics tasks for all administrations and writing tasks for later administrations at secure scoring sites.
Retrieve and score student responses from early district sample schools and deliver required data to the test development contractor within critical time periods for calibration and scaling. The "Necessary Documentation" column for C4 in Appendix J states: Bidder must document successful completion of a minimum of two performance item scoring projects for statewide assessment programs during the last four years for which the bidder was required to perform as described in the Criteria column. (Emphasis added.) The "Criteria" column for C4 in Appendix J, like the related work tasks in the RFP, addresses both image-based handscoring of reading and math, as well as paper-based or image-based handscoring of writing. In connection with all handscoring work tasks, "[t]he bidder must demonstrate completion of test administration projects for a statewide program for which performance items were scored using scoring rubrics and associated scoring protocols." With respect to the work tasks for handscoring the reading and math portions of the FCAT, "[t]he bidder must demonstrate completion of statewide assessment programs involving scoring multiple-choice and performance items for at least 200,000 students annually for which reports were produced in three months." In addition, for the reading and math work tasks, "[e]xperience must be shown in the use of imaging technology and hand-scoring student written responses with completion of scoring within limited time restrictions." This provision dealing with "imaging technology" experience self-evidently addresses the reading and math components, because separate language addresses imaging experience in connection with the writing component. The relevant handscoring experience for the reading and math aspects of the program is experience using image-based technology. By contrast, with respect to the work tasks for scoring the writing portions of the FCAT, "the bidder must also demonstrate completion of statewide assessment programs involving paper-based or imaged scoring student responses to writing assessment prompts for at least 200,000 students annually for which reports were produced in three months." (Emphasis added.) Criterion C6 addresses work tasks relating to designing and implementing systems for processing, scanning, imaging and scoring student responses to mixed-format tests within limited time constraints. The "Work Task" column for C6 in RFP Appendix J states: Design and implement systems for the processing, scanning, imaging, and scoring of student responses to test forms incorporating both multiple-choice and constructed response items (mixed-format) within the limited time constraints of the assessment schedule. Scoring of student responses involves implementation of IRT scoring tables and software provided by the development contractor within critical time periods. The "Necessary Documentation" column for C6 in Appendix J states: Bidder must document successful completion of a minimum of two test administration projects for statewide assessment programs during the last four years in which the bidder was required to perform as described in the Criteria column. (Emphasis added.)
The Criteria column for C6 in Appendix J states: The bidder must demonstrate completion of test administration projects for statewide assessment programs or other large-scale assessment programs that required the bidder to design and implement systems for processing, scanning, imaging, and scoring responses to mixed-format tests for at least 200,000 students annually for which reports were produced in three months. Experience must be shown in use of imaging student responses for online presentation to readers during handscoring. (Emphasis added.) RFP Provisions Per Corporate Qualifications The procedure for evaluating a bidder’s corporate qualifications is described in RFP Section 11.3: The Department will evaluate how well the resources and experience described in each bidder’s proposal qualify the bidder to provide the services required by the provisions of this RFP. Consideration will be given to the length of time and the extent to which the bidder and any proposed subcontractors have been providing services similar or identical to those requested in this RFP. The bidder’s personnel resources as well as the bidder’s computer, financial, and other technological resources will be considered in evaluating a bidder’s qualifications to meet the requirements of this RFP. Client references will be contacted and such reference checks will be used in judging a bidder’s qualifications. The criteria to be used to rate a bidder’s corporate qualifications to meet the requirements of this RFP are shown in Appendix J and will be applied as follows: * * * Administrative Activities. Each of the nine administration activities criteria in Appendix J will be individually rated by members of the evaluation team. The team members will use the rating scale shown in Figure 1 below. Individual team members will review the bidder’s corporate qualifications and rate the response with a rating of one to five. The ratings across all evaluators for each factor will be averaged, rounded to the nearest tenth, and summed across all criteria. If each evaluator assigns the maximum number of points for each criterion, the total number of points will be 45. To meet the requirements of Stage II, the proposal must achieve a minimum rating of 27 points and have no individual criterion for which the number of points averaged across evaluators and then rounded is less than 3.0. Each proposal that receives a qualifying score based on the evaluation of the bidder’s qualifications will be further evaluated in Stage III.
Figure 1 -- Evaluation Scale for Corporate Qualifications:
5 (Excellent): The bidder has demonstrated exceptional experience and capability to perform the required tasks.
4
3 (Satisfactory): The bidder has demonstrated that it meets an acceptable level of experience and capability to perform the required tasks.
2
1 (Unsatisfactory): The bidder either has not established its corporate qualifications or does not have adequate qualifications.
RFP Section 11.3 provides that each of the nine corporate qualifications criteria for administration operations in Appendix J (C1 through C9) will be individually rated by the six members of the evaluation team using a scale of one to five. A rating of three is designated as "satisfactory" which means that "[t]he bidder has demonstrated that it meets an acceptable level of experience and capability to perform the required tasks."
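To make the Stage II arithmetic concrete, the following is a minimal sketch of the scoring procedure quoted above from Section 11.3; the function name and the sample ratings are hypothetical and are not drawn from the record.

```python
# Illustrative sketch (not from the RFP): six evaluators rate each of nine
# criteria (C1-C9) on a 1-5 scale; ratings are averaged per criterion, rounded
# to the nearest tenth, and summed.  A proposal must total at least 27 points
# and have no criterion averaging below 3.0.  All ratings below are invented.

def stage_two_result(ratings_by_criterion):
    """ratings_by_criterion: dict mapping criterion name -> list of six ratings."""
    averages = {
        criterion: round(sum(scores) / len(scores), 1)
        for criterion, scores in ratings_by_criterion.items()
    }
    total = round(sum(averages.values()), 1)   # maximum possible is 9 x 5 = 45
    passes = total >= 27 and all(avg >= 3.0 for avg in averages.values())
    return averages, total, passes

# Hypothetical example: one criterion averages 2.8, so the proposal fails
# Stage II even though its total exceeds 27 points.
sample = {f"C{i}": [3, 3, 3, 3, 4, 4] for i in range(1, 9)}   # average 3.3 each
sample["C9"] = [3, 3, 3, 2, 3, 3]                             # average 2.8
averages, total, passes = stage_two_result(sample)
print(averages["C9"], total, passes)   # 2.8 29.2 False
```

In this hypothetical, the proposal clears the 27-point total but still fails Stage II because one criterion averages below 3.0, which is the mechanism the findings later describe as the "cut score."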
In order to be further evaluated, Section 11.3 provides that there must be no individual corporate qualifications criterion for which the bidder’s proposal receives a score less than 3.0 (average points across evaluators). Dr. Fisher, the primary author of Section 11.3 of the RFP, referred to the 3.0 rating as the "cut score." (Emphasis added.) The RFP’s clear and unambiguous terms thus establish the "minimum threshold" of experience that a bidder "must demonstrate" in its proposal for Criterion C1 through Criterion C9. The "minimum threshold" of experience that a bidder must demonstrate for each criterion is described in Appendix J of the RFP. If a proposal failed to demonstrate that the bidder meets the minimum threshold of experience for a particular criterion in Appendix J, the bidder obviously would not have demonstrated "that it meets an acceptable level of experience and capability to perform the required tasks." Thus, in that setting, an evaluator was to have assigned the proposal a rating of less than "satisfactory," or less than three, for that criterion. (Emphasis added.) The fact that a score less than "3" was expected for -- and would eliminate -- proposals that did not demonstrate the "minimum threshold" of experience does not render meaningless the potential scores of "1" and "2." Those scores may reflect the degree to which a bidder’s demonstrated experience was judged to fall below the threshold. Although some corporate capability minimums were stated quantitatively (i.e., "minimum of two," or "at least 200,000"), others were open to a more qualitative assessment (i.e., "large-scale," "systems," or "reports"). Moreover, a proposal that included demonstrated experience in some manner responsive to each aspect of Appendix J might nevertheless be assigned a score of less than "3," based on how an evaluator assessed the quality of the experience described in the proposal. By the terms of the RFP, however, an average score across evaluators of less than 3 represented essentially a decision that the minimum threshold of experience was not demonstrated. Had the Department truly intended Appendix J to reflect only general targets or guidelines, there were many alternative ways to communicate such an intent without giving mandatory direction about what bidders "must demonstrate" or without establishing quantitative minimums (i.e. "a minimum of two," or "at least 200,000"). RFP Appendix K, for instance, sets forth the evaluation criteria for technical proposals in broad terms that do not require the bidder to provide anything in particular. Even within Appendix J, other than in Criterion C4 and Criterion C6, bidders were to show experience with "large-scale" projects rather than experience at a quantified level. Pursuant to the RFP’s plain language, in order to meet the "minimum threshold" of experience for Criterion C4 and Criterion C6, a bidder "must demonstrate," among other things, successful completion of a "minimum of two" projects, each involving the use of image-based scoring technology in administering tests to "at least 200,000 students annually." Department’s Evaluation of Corporate Qualifications In evaluating Harcourt’s proposal, the Department failed to give effect to the plain RFP language stating that a bidder "must document" successful completion of a "minimum of two" testing projects involving "at least 200,000 students annually" in order to meet the "minimum threshold" of experience for C4 and C6. Dr. 
Fisher was the primary author of Sections 10, 11 and Appendix J of the RFP. He testified that during the Stage II evaluation of corporate qualifications, the evaluation team applied a "holistic" approach, like that used in grading open-ended written responses in student test assessments. Under the holistic approach that Dr. Fisher described, each member of the evaluation team was to study the proposals, compare the information in the proposals to everything contained in Appendix J, and then assign a rating for each criterion in Appendix J based on "how well" the evaluator felt the proposal meets the needs of the agency. Notwithstanding Dr. Fisher’s present position, the RFP’s terms and their context demonstrate that the minimum requirements for corporate qualifications are in RFP Appendix J. During the hearing, Dr. Fisher was twice asked to identify language in the RFP indicating that the Department would apply a "holistic" approach when evaluating corporate qualifications. Both times, Dr. Fisher was unable to point to any explicit RFP language putting bidders on notice that the Department would be using a "holistic" approach to evaluating proposals and treating the Appendix J thresholds merely as targets. In addition, Dr. Fisher testified that the Department did not engage in any discussion at the bidders’ conference about the evaluation method that was going to be used other than drawing the bidders’ attention to the language in the RFP. As written, the RFP establishes minimum thresholds of experience to be demonstrated. Where, as in the RFP, certain of those minimum thresholds are spelled out in quantitative terms that are not open to interpretation or judgment, it is neither reasonable nor logical to rate a proposal as having demonstrated "an acceptable level of experience" when it has not demonstrated the specified minimum levels, even if other requirements with which it was grouped were satisfied. The plain RFP language unambiguously indicates that an analytic method, not a "holistic" method, will be applied in evaluating corporate qualifications. Dr. Fisher acknowledged that, in an assessment using an analytic method, there is considerable effort placed up front in deciding the specific factors that will be analyzed and those factors are listed and explained. Dr. Fisher admitted that the Department went into considerable detail in Appendix J of the RFP to explain to the bidders the minimums they had to demonstrate and the documentation that was required. In addition, Dr. Orr, who served as a member of the evaluation team and who herself develops student assessment tests, stated that in assessments using the "holistic" method there is a scoring rubric applied, but that rubric does not contain minimum criteria like those found in the RFP for FCAT. The holistic method applied by the Department ignores very specific RFP language which spells out minimum requirements for corporate qualifications. Harcourt’s Corporate Qualifications for C4 and C6 Harcourt’s proposal lists the same three projects administered by Harcourt for both Criterion C4 and Criterion C6: the Connecticut Mastery Test ("CMT"), the Connecticut Academic Performance Test ("CAPT") and the Delaware Student Testing Program ("DSTP"). Harcourt’s proposal also lists for Criterion C4 projects administered by its proposed scoring subcontractors, Measurement Incorporated ("MI") and Data Recognition Corporation ("DRC"). However, none of the projects listed for MI or DRC involve image-based scoring.
Thus, the MI and DRC projects do not demonstrate any volume of image-based scoring as required by C6 and by the portion of C4 which relates to the work task for the image-based scoring of the math and reading portions of the FCAT. Harcourt’s proposal states that "[a]pproximately 35,000 students per year in grade 10 are tested with the CAPT." Harcourt’s proposal states that "[a]pproximately 120,000 students per year in grades 4, 6 and 8 are tested with the CMT." Harcourt’s proposal states that "[a]pproximately 40,000 students in grades 3, 5, 8, and 10" are tested with the DSTP. Although the descriptions of the CMT and the CAPT in Harcourt’s proposal discuss image-based scoring, there is nothing in the description of the DSTP that addresses image-based scoring. There is no evidence that the evaluators were ever made aware that the DSTP involved image-based scoring. Moreover, although the Department called the Delaware Department of Education ("DDOE") as a reference for Harcourt’s development proposal, the Department did not discuss Harcourt’s administration of the DSTP (including whether the DSTP involves image-based scoring) with the DDOE. Harcourt overstated the number of students tested in the projects it referenced to demonstrate experience with image-based scoring. Harcourt admitted at hearing that, prior to submitting its proposal, Harcourt had never tested 120,000 students with the CMT. In fact, the total number of students tested by Harcourt on an annual basis under the CMT has ranged from 110,273 in the 1996-97 school year to 116,679 in the 1998-99 school year. Harcourt also admitted at hearing that, prior to submitting its proposal, Harcourt had never tested 35,000 students in grade 10 with the CAPT. Instead, the total number of grade 10 students tested by Harcourt on an annual basis with the CAPT ranged from 30,243 in 1997 to 31,390 in 1998. In addition, Harcourt admitted at hearing that, prior to submitting its proposal, it had conducted only one "live" administration of the DSTP (as distinguished from field testing). That administration of the DSTP involved only 33,051, not 40,000, students in grades 3, 5, 8 and 10. Harcourt itself recognized that "field tests" of the DSTP are not responsive to C4 and C6, as evidenced by Harcourt’s own decision not to include in its proposal the number of students field tested under the DSTP. Even assuming that the numbers in Harcourt’s proposal are accurate, and that the description of the DSTP in Harcourt’s proposal reflected image-based scoring, Harcourt’s proposal on its face does not document any single project administered by Harcourt for C4 or C6 involving image-based testing of more than 120,000 students annually. When the projects are aggregated, the total number of students claimed as tested annually still does not reach the level of "at least 200,000;" it comes to only 195,000, and it reaches that level only once due to the single administration of the DSTP. Moreover, even if that 195,000 were considered "close enough" to the 200,000 level required, it was achieved only one time, while Appendix J plainly directs that there be a minimum of two times that testing at that level has been performed. The situation worsens for Harcourt when using the true numbers of students tested under the CMT, CAPT, and DSTP, because Harcourt cannot document any single image-based scoring project it has administered involving testing more than 116,679 students annually.
Moreover, when the true numbers of students tested are aggregated, the total rises only to 181,120 students tested annually on one occasion, and no more than 141,663 tested annually on any other occasion. Despite this shortfall from the minimum threshold of experience, under the Department’s holistic approach the evaluators assigned Harcourt’s proposal four ratings of 3.0 and two ratings of 4.0 for C4, for an average of 3.3 on C4; and five ratings of 3.0 and one rating of 4.0 for C6, for an average of 3.2 on C6. Applying the plain language of the RFP in Sections 10 and 11 and Appendix J, Harcourt did not demonstrate that it meets an acceptable level of experience and capability for C4 or C6, because Harcourt did not satisfy the minimum threshold for each criterion by demonstrating a minimum of two prior completed projects involving image-based scoring requiring testing of at least 200,000 students annually. Harcourt’s proposal should not have received any rating of 3.0 or higher on C4 or C6 and should have been disqualified from further evaluation due to failure to demonstrate the minimum experience that the Department required in order to be assured that Harcourt can successfully administer the FCAT program. NCS’s Compliance With RFP Requirements Even though the NCS proposal did not meet all of the mandatory requirements, and despite the requirement of Section 11.2 that the proposal be automatically disqualified under such circumstances, the Department waived NCS’s noncompliance as a minor irregularity. The factors in C4 and C6 were set, minimal requirements with which NCS did not comply. For example, one of the two programs NCS submitted in response to Criteria C4 and C6 was the National Assessment of Educational Progress program ("NAEP"). NAEP, however, is not a "statewide assessment program" within the meaning of that term as used in Criteria C4 and C6. Indeed, NCS admitted that NAEP is not a statewide assessment program and that, without consideration of that program, NCS’s proposal is not responsive to Criteria C4 and C6 because NCS has not submitted the required proof of having administered two statewide assessment programs. This error cannot be cured by relying on the additional experience of NCS’s subcontractor because that experience does not show that its subcontractor produced reports within three months, and so such experience does not demonstrate compliance with Criterion C4. The Department deliberately limited the competition for the FCAT contract to firms with specified minimum levels of experience. As opined at final hearing, if the Department in the RFP had announced to potential bidders that the type of experience it asked vendors to describe were only targets, goals and guidelines, and that a failure to demonstrate target levels of experience would not be disqualifying, then the competitive environment for this procurement would have differed since only 2.06 evaluation points (out of a possible 150) separated the NCS and Harcourt scores. Dr. Heidorn conceded that multiple companies with experience in different aspects of the FCAT program -- a computer/imaging company and a firm experienced in educational testing -- might combine to perform a contract like the FCAT. Yet, that combination of firms would be discouraged from bidding because they could not demonstrate the minimum experience spelled out in the RFP. Language in the RFP, indicating the "holistic" evaluation that was to be applied, could have resulted in a different field of potential and actual bidders.
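The figures recited in the preceding paragraphs can be cross-checked with a short sketch; the grouping used to reproduce the 181,120 aggregate is an inference (the findings state only the total), and the variable names are illustrative rather than the Department's or Harcourt's.

```python
# Arithmetic cross-check of the figures recited in the findings (illustrative only).

THRESHOLD = 200_000   # "at least 200,000 students annually" (Criteria C4 and C6)

# Annual volumes as claimed in Harcourt's proposal
claimed = {"CMT": 120_000, "CAPT": 35_000, "DSTP": 40_000}
print(sum(claimed.values()))              # 195,000 -- still short of 200,000

# Highest annual volumes established at hearing; combining them matches the
# 181,120 aggregate in the findings (an assumption about how that figure was
# composed, since the findings state only the total).
actual_best_case = {"CMT": 116_679, "CAPT": 31_390, "DSTP": 33_051}
print(sum(actual_best_case.values()))     # 181,120

# Evaluator ratings assigned under the "holistic" approach
c4 = [3.0, 3.0, 3.0, 3.0, 4.0, 4.0]
c6 = [3.0, 3.0, 3.0, 3.0, 3.0, 4.0]
print(round(sum(c4) / 6, 1), round(sum(c6) / 6, 1))   # 3.3 and 3.2
```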

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that Respondent, State of Florida, Department of Education, enter a Final Order rejecting the bids submitted by Harcourt and NCS for the administration component of the RFP. The Department should then seek new proposals. DONE AND ENTERED this 25th day of May, 1999, in Tallahassee, Leon County, Florida. DON W. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 25th day of May, 1999. COPIES FURNISHED: Karen D. Walker, Esquire Holland and Knight, LLP Post Office Drawer 810 Tallahassee, Florida 32302 Mark D. Colley, Esquire Holland and Knight, LLP Suite 400 2100 Pennsylvania Avenue, Northwest Washington, D.C. 20037 Charles S. Ruberg, Esquire Department of Education The Capitol, Suite 1701 Tallahassee, Florida 32399-0400 Paul R. Ezatoff, Jr., Esquire Christopher B. Lunny, Esquire Katz, Kutter, Haigler, Alderman, Bryant and Yon, P.A. 106 East College Avenue, Suite 1200 Tallahassee, Florida 32302-7741 Tom Gallagher Commissioner of Education Department of Education The Capitol, Plaza Level 08 Tallahassee, Florida 32399-0400 Michael H. Olenick, General Counsel Department of Education The Capitol, Suite 1701 Tallahassee, Florida 32399-0400

Florida Laws (3) 120.57, 287.012, 287.057
# 2
MAGDALENA COSTIN vs FLORIDA ENGINEERS MANAGEMENT CORPORATION, 98-002584 (1998)
Division of Administrative Hearings, Florida Filed: Jacksonville, Florida Jun. 05, 1998 Number: 98-002584 Latest Update: Feb. 23, 1999

The Issue The issue to be resolved is whether Petitioner is entitled to additional credit for her response to question nos. 122 and 222 of the civil engineering examination administered on October 31, 1997.

Findings Of Fact On October 31, 1997, Petitioner took the civil professional engineering licensing examination. A score of 70 is required to pass the test. Petitioner obtained a score of 69. Petitioner challenged the scoring of question nos. 122 and 222. As part of the examination challenge process, Petitioner's examination was returned to the National Council of Examiners for Engineering and Surveying where it was re-scored. In the re-score process, the grader deducted points from Petitioner's original score. Petitioner was given the same raw score of 6 on question number 122; however, on question number 222 her raw score of 4 was reduced to a 2. Petitioner needed a raw score of 48 in order to achieve a passing score of 70; she needed at least three additional raw score points to obtain a passing raw score of 48. Petitioner is entitled to a score of 6 on problem number 122. The solution and scoring plan for that problem required the candidate to obtain a culvert size in the range of 21-36 inches. The Petitioner incorrectly answered 3.1 feet or 37.2 inches. She is not entitled to additional credit for problem number 122 because she answered the question with the wrong size culvert. Problem number 122 required the candidate to use a predevelopment peak flow of 40 cubic feet per second (cfs). Petitioner used 58.33 cfs. She chose the maximum flow rather than the predevelopment peak flow. In solving problem number 122, Petitioner chose a design headwater depth of 4.8 feet. The correct solution required a design headwater depth of 5.7 feet. Petitioner made another mistake in problem number 122; she failed to check the water depth in the downstream swale. Petitioner concedes she was given sufficient information to solve problem number 122. She understood what the question was asking of her. She admits that she did not compute the critical depth of the water and that she did not complete the solution. Question number 222 had three parts. The candidate was required to determine the footing size, to select the reinforcing steel, and to provide a sketch for a concrete column located along the edge of a building. Petitioner understood the question and was provided enough information to solve the problem. Petitioner correctly checked the footing size as required by the first part; however, she did not select the reinforcing steel or show the required sketch. Therefore, Petitioner did not complete enough of the problem to qualify for a score of 4 points. She is entitled to a score of 2 points. The examination questions at issue here were properly designed to test the candidate's competency in solving typical problems in real life. The grader (re-scorer) utilized the scoring plan correctly. Petitioner has been in the United States for approximately eleven years. She lived in Romania before she came to the United States. In Romania, Petitioner used only the metric system in her professional work. While she has used the English system since moving to the United States, Petitioner is more familiar with the metric system. The Principles and Practice examination is an open-book examination. Petitioner took a book entitled the Fundamentals of Engineering Reference Handbook to the examination. When the proctor examined her books, she told the Petitioner she was not permitted to keep the handbook. The proctor took the handbook from the Petitioner. Petitioner protested the confiscation of her reference book because she had used the same book in two previous tests. 
About ten minutes later, the proctor's supervisor returned the book to Petitioner. Petitioner's book was returned at least ten minutes before the test began. She was permitted to use the book during the test. There is no persuasive evidence that the proctor's mistake in temporarily removing Petitioner's reference book caused her to be so upset that she failed the test. Candidates were not permitted to study their books prior to the beginning of the examination. Petitioner may have been nervous when the test began. However, Petitioner received a perfect score of ten points on the first problem she worked, problem number 121.

Recommendation Based upon the findings of fact and conclusions of law, it is RECOMMENDED that the Board of Professional Engineers enter a Final Order confirming Petitioner's score on the examination and dismissing the Petitioner's challenge. DONE AND ENTERED this 13th day of January, 1999, in Tallahassee, Leon County, Florida. SUZANNE F. HOOD Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 13th day of January, 1999. COPIES FURNISHED: Natalie A. Lowe, Esquire Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301 William Bruce Muench, Esquire 438 East Monroe Street Jacksonville, Florida 32202 Lynda L. Goodgame, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792 Dennis Bartin, President Florida Engineers Management Corporation 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (1) 120.57
# 3
UNIVERSITY OF BRIDGEPORT vs DEPARTMENT OF HEALTH, BOARD OF CHIROPRACTIC MEDICINE, 01-004389 (2001)
Division of Administrative Hearings, Florida Filed: Tallahassee, Florida Nov. 08, 2001 Number: 01-004389 Latest Update: Apr. 05, 2002

The Issue The issue in this case is whether Petitioner’s application for continuing education course approval should be granted by the Board of Chiropractic Medicine.

Findings Of Fact Respondent, Board of Chiropractic Medicine, is the state agency responsible for the licensure and regulation of chiropractic medicine in the State of Florida. Section 456.013(6) and Chapter 460, Florida Statutes. The Board has the responsibility to approve continuing education courses sponsored by chiropractic colleges. Section 460.408, Florida Statutes. Continuing education providers established through medical, osteopathic, or chiropractic colleges send their initial courses to the Board for approval. Ordinarily, once the course is approved, they become an approved provider and do not send subsequent continuing education courses to the Board for approval. Petitioner is an approved continuing education course provider. On July 24, 2001, Petitioner submitted an application for an online course to the Board for approval. The submitted course, ChiroCredit.com, is a 13-hour course consisting of nine regular hours, two HIV/AIDS hours, and two risk management hours. With the application, Petitioner submitted a letter dated July 19, 2001, by Drs. Richard Saporito and Paul Powers, Petitioner’s representatives. The letter requested the Board “to review the issue of acceptance of distance based online education credits for Chiropractors continuing education requirements in the State of Florida.” On August 22, 2001, Stephanie Baxley, Regulatory Specialist for the Board, sent a memorandum to Dr. Gene Jenkins, D.C., chair of the Continuing Education Committee, requesting continuing education review. Dr. Jenkins signed and marked the memorandum "approved" on August 29, 2001. On the same date, Dr. Jenkins also indicated approval of an online course offered by another provider, Logan College. Ms. Baxley wrote to Dr. Richard Saporito notifying him that ChiroCredit.com had been approved for continuing education credit. Vicki Grant is a programs operations administrator with the Department of Health. Her responsibilities include managing the licensing and discipline of four professions, including chiropractic medicine. Ms. Grant received a phone call from Dr. Jenkins who informed her that he had made a mistake by indicating approval of the online course offered by Petitioner. In response to his inquiry as to how to proceed, she advised him to notify the continuing education staff, tell them he had made a mistake, and ask that the matter be presented to the full board. She also spoke to Sharon Guilford regarding the matter. Ms. Guilford is Ms. Baxley's supervisor. Sharon Guilford is a program operations administrator with the Department of Health. One of her responsibilities is serving as the administrator for the continuing education section that consists of six professions, including chiropractic medicine. Ms. Guilford and Ms. Grant spoke about Dr. Jenkins' phone call. On September 11, 2001, Ms. Guilford wrote a note on a copy of the August 29, 2001 letter from Ms. Baxley to Dr. Saporito that stated as follows: "Per Dr. Jenkins-course should've never been approved. Send letter correcting the error of approval." On September 11, 2001, Ms. Baxley sent a letter to Dr. Saporito advising him that the approval letter of August 29, 2001, was sent in error and that the Board would take up the matter at their October 2001 meeting.1/ The Board did address the matter at their October 1, 2001 meeting which was held via teleconference. Dr. Saporito and Dr. Paul Powers spoke to the Board on behalf of Petitioner.
During the last part of the Board's consideration of this matter, various board members expressed concern that the Board did not have enough information to vote for an approval of the course and discussed having an opportunity to receive more information. After much discussion, the Board unanimously voted to deny Petitioner's application for approval of the course for continuing education purposes. At the same meeting, the Board also denied an application of Logan College to provide continuing education via an online course. The Notice of Intent to Deny states the grounds for denial: As grounds for denial, the Board found that the course did not meet the requirements of Florida Administrative Code Rule 64B2- 13.004. Specifically, the rule does not contemplate the awarding of credit for virtual courses or those taken online by use of a computer. The Board opined that 'classroom hours' as used in the rule means in-person education and not time spent in front of a computer. The course offered by the applicant is an online offering. Additionally, the Board expressed concerns about the educational merit and security protocols used by online course providers, but welcomes more information regarding these topics. The Board has never approved an online, homestudy, or video-taped presentation for continuing education course credit. The courses presented to the Board by Petitioner and Logan College were the first online courses to be presented for Board approval. The Board interprets its applicable rule, which requires each licensee to obtain 40 classroom hours of continuing education, to require live and in-person classroom hours. Petitioner offered the testimony of two expert witnesses, Dr. Terry Heller and Dr. Joseph Boyle. Dr. Heller has knowledge regarding theories of learning and education, but lacks knowledge about chiropractors, chiropractic education, or chiropractic continuing education and does not appear to be very familiar with Petitioner’s particular online course. Dr. Boyle is familiar with both chiropractic continuing education and Petitioner's course. He disagrees with the Board's interpretation that the term "classroom hours" must mean a lecture or live format. However, Dr. Boyle described the broadest definition of "classroom" to be "anywhere, anyplace, at any pace, anytime." He acknowledged that the Board could set up criteria for online courses that differ from the criteria for traditional classrooms. Respondent’s expert witness, Dr. David Brown, noted that most chiropractors practice in isolation and very few have staff privileges at hospitals. In his opinion, a legitimate policy reason for requiring chiropractors to obtain a certain amount of in-person continuing education is that they can “rub shoulders with their peers” and learn from one another. Dr. Brown noted that many states impose restrictions on the number of online hours that may be taken or on the type of licensees who are eligible to receive credit. Dr. Brown interpreted the word "classroom" within the context of the rule containing the requirement of 40 classroom hours of continuing education to mean ". . . to physically sit in a room, in a classroom type environment which could be an auditorium or some other environment, with your peers who are also taking the class in order to obtain course credit. I think that's a traditional type of view." Dr. Brown's interpretation of "classroom" within the context of the Board's rule is more persuasive than those of Petitioner's experts.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED: That a Final Order be entered denying Petitioner’s application for continuing education course approval.2/ DONE AND ENTERED this 5th day of March, 2002, in Tallahassee, Leon County, Florida. BARBARA J. STAROS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 5th day of March, 2002.

Florida Laws (3) 120.57, 456.013, 460.408
# 4
MICHAEL REGGIA vs. BOARD OF PROFESSIONAL ENGINEERS, 86-001808 (1986)
Division of Administrative Hearings, Florida Number: 86-001808 Latest Update: Sep. 19, 1986

The Issue The issue in this proceeding is whether Michael Reggia meets the Florida licensure requirements for a professional engineer in the field of manufacturing engineering. The issue is specifically whether the practice and principles portion of the licensing exam was valid. Procedural Matters At the final hearing, Petitioner, Michael Reggia, testified in his own behalf and presented the testimony of manufacturing engineer, Howard Bender. Petitioner's exhibits #1 and #2, letters from Martin Marietta Aerospace and Harris Corporation, were rejected as hearsay. Exhibit #3, selected pages from Fundamentals of Engineering, published by the National Council of Engineering Examiners, was admitted without objection. Respondent presented two witnesses: Cass Hurc, P.E. (by deposition, by agreement of the parties) and Allen Rex Smith, Executive Director of the Board of Professional Engineers. Respondent initially submitted four exhibits: #1 and #4 were admitted without objection, #2(a) and #2(b) were admitted over Petitioner's objection, and #3 was withdrawn. The parties requested and were given 20 days to submit post-hearing briefs and proposed orders. On September 15, 1986, Petitioner filed his arguments and summary of the testimony and evidence. Nothing was filed by Respondent.

Findings Of Fact Michael Reggia resides in Titusville and works at the Kennedy Space Center. He is licensed in the state of California as a professional engineer and has practiced in the field of manufacturing engineering. California, like Florida, does not license an individual in a particular discipline of engineering but requires that an individual select an area in which he or she will be tested. Mr. Reggia took the professional engineering license exam in Florida in October 1985. For part two of the examination, Professional Practice and Principles, he chose to be tested in his field of manufacturing engineering. He achieved a score of 64.4; in order to pass, a score of 70 is required. The examination given in Florida is a national examination produced by the National Council of Engineering Examiners (NCEE) for certification or licensure throughout the United States. The October 1985 exam was developed based upon an extensive survey study initiated by NCEE in 1979. A report of that study was published in March 1981 as "A Task Analysis of Licensed Engineers". (Respondent's exhibit #4) The primary purpose of the study was to aid NCEE in developing "... fair, meaningful, uniform, and objective standards with which to measure minimum competency for professional licensure." (exhibit #4, page E1) In drafting an exam the NCEE relies on the societies representing various engineering disciplines to submit examination problems for consideration. The Society of Manufacturing Engineers, through its professional registration committee, provides that service on behalf of the manufacturing engineers. The October 1985 examination for manufacturing engineers did not include questions relating to electrical engineering, which is Mr. Reggia's sub-area of emphasis in the area of manufacturing engineering. Since manufacturing engineering includes overlap into the basic engineering disciplines, Mr. Reggia contends the exam was one-sided and invalid as he felt it concentrated on tool designing and mechanical engineering. Some industries, particularly the aerospace industries, now include a substantial number of electrical engineers on their staff. Engineering is an evolving discipline and manufacturing engineering has undergone changes with new technologies in recent years. One way of addressing the diversity and changes in the field is to provide a two-book exam that would offer the applicant a wider variety of problems from which he or she could select. This has been recommended to the NCEE by the Society of Manufacturing Engineers. Another approach, and the one utilized by the NCEE, is to conduct periodic surveys to determine the tasks which engineers are actually performing and the level of judgement required to perform the tasks effectively. It would be impossible, and perhaps inappropriate, to develop an exam that would test each individual only on his or her particular expertise. In the area of manufacturing engineering the exams developed by NCEE are passed by 65-75 percent of the candidates, a rate which is comparable to that of the mechanical engineers for their exam. Seven out of ten applicants passed the same exam which Mr. Reggia took in October 1985.

Florida Laws (2) 455.213, 455.217
# 5
W. EDWIN CONNERY vs. CONSTRUCTION INDUSTRY LICENSING BOARD, 88-000232 (1988)
Division of Administrative Hearings, Florida Number: 88-000232 Latest Update: Dec. 13, 1988

Findings Of Fact In order for the Petitioner to obtain his license as a building contractor in Florida, he is required to successfully complete a certification examination which consists of three tests. The examination is prepared by the ACSI National Assessment Institute and administered by the Department of Professional Regulation. The June 1987 examination involved a new format, new scoring methods, and areas of competency which had not been tested in previous exams. A post-examination report prepared by the Office of Examination Services of the Department of Professional Regulation reveals that, while forty-seven per cent of the examinees passed at least one part of the examination, only seven per cent passed the entire examination. Historically, pass rates for previous examinations ranged from thirty-five to fifty-five per cent. The reasons given for the low pass rate on this particular exam by the Office of Examination Services were: 1) Candidates are currently required to demonstrate competency in each of the three content areas. If the exam had been graded under the method used in prior exams (compensatory scoring), the pass rate would have increased to twenty-one per cent in this examination. 2) Whenever an examination is significantly changed, the performance of the candidates will decrease until they prepare for the demands of the new examination. 3) There appeared to be a time problem. Many of the candidates did not timely complete the answers to all of the questions in the second and third test. The Petitioner was not prepared for the new format. The review course taken by him shortly before the exam did not alert him to the changes approved by the Board. As a reexamination candidate, his expectations as to exam content were even more entrenched than those of first-time candidates. The Petitioner failed all three tests in the exam. A review of the Petitioner's score sheets on all three tests reveals that he timely completed all of the answers, so the time problem does not appear to have affected his results. If the compensatory scoring method had been used on this exam, as it had been in prior exams, the Petitioner would still not have passed the examination administered in June 1987. The Petitioner did not demonstrate that the Respondent failed to follow standard procedures for conducting or grading the examination. The Petitioner was not treated differently from other candidates who took the examination. Although the content in this exam was different from the preceding exam, the content of the exam had been properly promulgated in Rule 21E-16.001, Florida Administrative Code, as amended May 3, 1987. The Respondent has agreed to allow the Petitioner the opportunity to take the next scheduled examination without charge.
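The practical difference between compensatory scoring and the new requirement that competency be demonstrated in each content area can be illustrated with a brief sketch; the cut scores and sample results below are hypothetical, since the findings report only the resulting pass rates.

```python
# Hypothetical illustration of the two grading methods described in the
# post-examination report.  The cut scores and the candidate's scores are
# invented for illustration; the report gives only aggregate pass rates.

def passes_compensatory(scores, overall_cut):
    # Prior method: one combined score across the three tests, so a strong
    # test can offset a weak one.
    return sum(scores) >= overall_cut

def passes_per_content_area(scores, per_test_cuts):
    # June 1987 method: competency must be shown on each test separately.
    return all(score >= cut for score, cut in zip(scores, per_test_cuts))

candidate = [78, 65, 71]                                   # three test scores (hypothetical)
print(passes_compensatory(candidate, 210))                 # True  (78 + 65 + 71 = 214)
print(passes_per_content_area(candidate, [70, 70, 70]))    # False (65 < 70 on one test)
```

Under compensatory scoring a strong test can offset a weak one; under the June 1987 method a single weak content area is disqualifying, which is consistent with the drop in the overall pass rate described above.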

Florida Laws (3) 120.57, 489.111, 489.113
# 6
ROGER S. EVANS vs BOARD OF PROFESSIONAL ENGINEERS, 91-001580 (1991)
Division of Administrative Hearings, Florida Filed:Tampa, Florida Mar. 12, 1991 Number: 91-001580 Latest Update: Aug. 20, 1991

The Issue Whether Petitioner's application for licensure by examination as an engineering intern should be granted.

Findings Of Fact Prior to his admission to the Mechanical Engineering Program at the University of South Florida on August 30, 1982, Petitioner Evans attended a three-year full-time Mechanical Engineering Diploma Program at the College of Arts, Science and Technology in Kingston, Jamaica. Upon completion of the program, Petitioner was awarded the College Mechanical Engineering Diploma. The diploma from the College of Arts, Science and Technology was conferred in an educational system based upon the English System of Education. The diploma was not a university degree, such as a Bachelor of Science. It is more akin to a certificate from a specialized training program. Such diplomas are often called Associate Degrees when they are issued by junior colleges in the United States. A total of 750 credit hours was transferred from the College of Arts, Science and Technology and applied to the lower-level requirements for the Mechanical Engineering Program when Petitioner was enrolled at the University of South Florida. As with all transfers from other schools of higher education, Petitioner was not given credit for those courses in the grade point average (GPA) he was required to achieve at the university.

Throughout his enrollment at the university prior to the actual award of his Bachelor of Science (BS) degree, Petitioner Evans was in the Mechanical Engineering Program. During the thirteen terms the Petitioner attended the university before he was awarded his BS degree, he repeated the following engineering department courses: EGN 3313 STATICS (3 times); EML 4503 MACH AN & DES 2 (2 times); ENG 4314 AUTO CONTROLS I (3 times); and EML 4106 C THERM SYS & ECO (4 times). Petitioner ultimately achieved an "A" in EGN 3313 STATICS and a "C" in EML 4503 MACH AN & DES 2, as well as in ENG 4314 AUTO CONTROLS I. His final grade in the coursework for EML 4106 C THERM SYS & ECO was a "B".

At all times while Petitioner was in attendance at the university, the Mechanical Engineering Department required students to have a GPA of 2.2 or better in a specific schedule of coursework before a Bachelor of Science in Mechanical Engineering (BSME) degree would be awarded by the faculty of the Department. The curriculum for the Mechanical Engineering Program at the University of South Florida was accredited by the Accreditation Board for Engineering and Technology (ABET) based upon the program requirement that a degree in mechanical engineering would be conferred only on students with a 2.2 or better GPA. The fall term of August 24, 1987 - December 12, 1987, was designated as Petitioner's final term of his senior year as an undergraduate seeking a BSME degree. Although the means used by the Mechanical Engineering faculty to calculate a GPA during this particular time period was unavailable, there is no dispute that the faculty applied its policy and determined that a BSME could not be awarded to Petitioner because he did not meet the academic standard of a 2.2 or better GPA in the scheduled courses. Due to the averaging required to arrive at a GPA, Petitioner's repetition of so many courses lowered his overall GPA even though he successfully completed each course on his final attempt.

When Petitioner was personally informed of the faculty's decision by his assigned faculty adviser, he questioned whether he could retake some of the courses to bring his GPA up to the level demanded by the faculty. This idea was discouraged by his adviser because Petitioner would have to repeat a large number of courses over a lengthy period of time; the averaging techniques used to compute a GPA make such an endeavor very time-consuming, with small results for the effort spent. Based upon the advice he received, Petitioner acquiesced in the faculty's decision to award him a B.S. in Engineering-Option in General and accepted the degree. At the close of his undergraduate academic pursuits, Petitioner had an overall GPA of 2.082 and a GPA in departmental course work of 1.79. This departmental GPA was calculated by eliminating 3 "Fs" from his transcript, per the university's forgiveness policy. All other course repeats lowered his overall GPA and his departmental GPA. In spite of the overall GPA and departmental GPA determination, Petitioner did take, and successfully passed, every course within the curriculum of the Mechanical Engineering Program at the University of South Florida.

The B.S. degree awarded to Petitioner is an alternate degree within the university. It is designed for students who have either completed a specialized program but were unable to meet a faculty's higher GPA standard or for those students who never designated a specialty within the engineering school but met general university degree requirements. This program has never been accredited by ABET. When accreditation was granted, ABET relied upon the faculty's representation that a BSME degree would be awarded only to students who obtained a 2.2 or better GPA in the program. It is unknown whether the program would have been approved if a lower success standard had been set for the students.

On July 9, 1990, Petitioner's application for the Fundamentals Examination was received by the Department. The application was rejected on September 24, 1990, because the Department determined Petitioner did not meet the statutory and rule provisions governing admission to the examination. From August 27, 1984, through December 11, 1987, Petitioner was in the final year of an approved engineering curriculum in a university approved by the Board. He successfully completed the courses in the curriculum, but his GPA in the program was lowered by his numerous repetitions of the same courses before successful completion occurred.
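The effect of course repetition on an averaged GPA, described in the findings above, can be shown with a short illustrative sketch; the credit hours, letter grades, and grade-point values below are hypothetical and are not drawn from Petitioner's transcript.

    # Hypothetical illustration of how repeated attempts lower an averaged GPA
    # even when the final attempt earns a high grade.
    GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

    def gpa(attempts):
        # attempts: list of (credit_hours, letter_grade); every attempt is counted.
        quality_points = sum(hours * GRADE_POINTS[grade] for hours, grade in attempts)
        total_hours = sum(hours for hours, _ in attempts)
        return quality_points / total_hours

    # A three-credit course failed twice before an "A" on the third attempt:
    repeated_course = [(3, "F"), (3, "F"), (3, "A")]
    print(round(gpa(repeated_course), 2))  # 1.33 -- well below a 2.2 standard
    print(round(gpa([(3, "A")]), 2))       # 4.0  -- if only the final attempt counted

Because every attempt remains in the denominator, retaking courses to raise an average is slow work, which is the point the adviser made.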

Recommendation Based upon the foregoing, it is RECOMMENDED: Petitioner's application to take the examination administered by the Department for the Board be denied. DONE and ENTERED this 20th day of August, 1991, in Tallahassee, Leon County, Florida. VERONICA E. DONNELLY Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 20th day of August, 1991. APPENDIX TO RECOMMENDED ORDER Petitioner's proposed findings of fact are addressed as follows: Pages 1-2: Accepted. See Preliminary Statement. Issue I-Page 3: Paragraph one. Accepted. See HO #11. Paragraph two. Accepted. See HO #7. Paragraph three. Accepted. See HO #3. Paragraph four. Accepted. See HO #8. Paragraph five. Accepted. See HO #4, #10, #11 and #12. Paragraph six. Accepted. Paragraph seven. Accepted. See HO #15. Paragraph eight. Accepted. See HO #12. Paragraph nine. Accepted. Paragraph ten. Accepted. Paragraph ten. Rejected. Cumulative. Issue II-Page 7: Paragraph one. Accepted. See HO #13. Issue III-Page 8: Paragraph one. Accepted. Paragraph two. Rejected. Cumulative. Paragraph three. Accepted. Paragraph four. Rejected. Mixed Question of Law and Fact. Witness Incompetent to determine. Paragraph five. Rejected. Cumulative. Respondent's proposed findings of fact are addressed as follows: Accepted. See HO #14. Accepted. See HO #14. Accepted. See HO #1. Accepted. See HO #3 and #4. Accepted. See HO #13. Accepted. See HO #12 and #13. Accepted. See HO #8 and #13. Rejected. Contrary to fact. See HO #5. Accepted. See HO #11. Rejected. Irrelevant. Accepted. Accepted. See HO #10. Rejected. Insufficient facts presented. See HO #8. Accepted. See HO #6. Accepted. Rejected. Irrelevant. Rejected. Irrelevant. Rejected. Improper legal conclusion. Rejected. Contrary to fact. See HO #12. COPIES FURNISHED: Weldon Earl Brennan, Esquire SHEAR NEWMAN HAHN & ROSENKRANZ, P.A. 201 E. Kennedy Boulevard, Suite 1000 Post Office Box 2378 Tampa, Florida 33601 Edwin A. Bayo, Esquire Assistant Attorney General Department of Legal Affairs Suite LL04, The Capitol Tallahassee, Florida 32399-1050 Carrie Flynn, Executive Director Florida Board of Professional Engineers Northwood Centre, Suite 60 1940 North Monroe Street Tallahassee, Florida 32399-0755 Jack McRay, General Counsel Department of Professional Regulation Northwood Centre, Suite 60 1940 North Monroe Street Tallahassee, FL 32399-0792

Florida Laws (5) 120.56, 120.57, 455.11, 471.005, 471.013
# 7
BOARD OF ACCOUNTANCY vs. GARY L. WHEELER, 79-002310 (1979)
Division of Administrative Hearings, Florida Number: 79-002310 Latest Update: Mar. 26, 1980

Findings Of Fact Based upon my observation of the witnesses and their demeanor while testifying, the arguments of counsel, and the entire record compiled herein, the following relevant facts are found. Gary L. Wheeler, Respondent, is a graduate of Bob Jones University, having received a Bachelor of Science degree in accounting therefrom in 1974. On July 27, 1979, Respondent received his California certificate as a certified public accountant. Thereafter, Respondent filed an application to obtain a reciprocal C.P.A. certificate in Florida based on his certificate issued by the State of California (Certificate No. E-28234). His application was denied by the Petitioner on October 26, 1979, for the following reason: Applicant failed to satisfy the requirements set forth in Section 7(3)(b), Chapter 79-202, Laws of Florida, inasmuch as the license issued to Gary L. Wheeler in California is not issued under criteria substantially equivalent to that in effect in Florida at the time the California license was issued.

Bob Jones University was not recognized as an accredited university in Florida by the Board when Respondent received his California certificate, inasmuch as it was not listed among the institutions of postsecondary education by the Council on Postsecondary Accreditation (COPA). During September 1976, Petitioner adopted the COPA list of schools as the schools from which it would accept graduates to sit for its examination. This was done for the avowed purpose of ensuring minimum competence and technical fitness among the ranks of Florida accountants.

Douglas H. Thompson, Jr., the Petitioner's Executive Director since 1968, is the Board's chief operating officer and carries out its functions respecting applications for licensure. As such, Mr. Thompson was the person charged with examining Respondent's application based on his California certificate to determine whether that certificate was issued under criteria "substantially equivalent" to Florida's licensing criteria. Respondent's application was considered by the Board on two (2) occasions and rejected because Respondent's alma mater, Bob Jones University, is not listed by COPA among the accredited schools and universities. See Sections 473.306, 473.307 and 473.308, Florida Statutes, as amended; and Chapter 21A-28.06, Florida Administrative Code.

As an aside, it was noted that the Board, in adopting its procedure for evaluating applicants seeking certificates under the reciprocal qualification guidelines, also adopted other equivalency procedures which provide Respondent an alternative method by which he may obtain a Florida certificate. In this regard, Respondent is only approximately six (6) quarter hours away from obtaining his certificate under the alternative equivalency procedures established by the Board. See Chapters 21A-9.01 through 9.04(4), Florida Administrative Code.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is hereby RECOMMENDED that Respondent's appeal of the Board's action in denying his application for a reciprocal license to practice public accounting based on the issuance of his California certificate be DENIED. DONE AND ORDERED in Tallahassee, Leon County, Florida, this 26th day of March, 1980. JAMES E. BRADWELL Hearing Officer Division of Administrative Hearings Oakland Building 2009 Apalachee Parkway Tallahassee, Florida 32301 (904) 488-9675

Florida Laws (3) 120.57, 473.306, 473.308
# 8
MIAN M. SUBHANI vs DEPARTMENT OF BUSINESS AND PROFESSIONAL REGULATION, FLORIDA ENGINEERS MANAGEMENT CORPORATION, 99-002054 (1999)
Division of Administrative Hearings, Florida Filed:Fort Lauderdale, Florida May 05, 1999 Number: 99-002054 Latest Update: Mar. 06, 2000

The Issue Whether Petitioner is entitled to additional credit for his solutions to four problems on the Principles and Practice of Engineering portion of the engineering licensure examination administered on October 30, 1998, by the National Council of Examiners for Engineers and Surveyors.

Findings Of Fact Based upon the evidence adduced at hearing and the record as a whole, the following findings of fact are made: On October 30, 1998, as part of his effort to obtain a Florida engineering license, Petitioner sat for the Principles and Practice of Engineering Examination (Examination). This is a national examination developed and administered by the National Council of Examiners for Engineers and Surveyors (NCEES). Petitioner chose to be tested in civil engineering. Petitioner received a raw score of 45 on the Examination. For the civil engineering specialization, a raw score of 45 converts to a score of 67. To pass the Examination, a converted score of 70 is needed.

Petitioner formally requested (in writing, by letter dated March 26, 1999) that his solutions to Problems 120, 125, and 222 on the Examination be rescored. Petitioner's written request was made to Natalie Lowe of the Board, who forwarded it to the NCEES. Appended to Petitioner's letter to Ms. Lowe were two pages of "scratch paper" on which Petitioner had written during his post-examination review on March 19, 1999. On the first page were written comments he had made regarding the scoring of Problems 120 and 125. On the second page were the following written comments he had made regarding the scoring of Problems 220 and 222: 220 a, b, & c 2 parts b & c correct. Min. mark I should get[:] At least 5 instead of 2 and maybe 7. There is an error. 222 ok

The NCEES's rescoring of Petitioner's solutions to Problems 120, 125, and 222 resulted in his receiving a raw score of 43 (or a converted score of 65, 5 points less than he needed to pass the Examination). The Board received the NCEES's rescoring results on May 12, 1999. The Board subsequently referred the matter to the Division to conduct an administrative hearing. At the administrative hearing that was held pursuant to the Board's referral, Petitioner challenged the grading of his solutions to Problems 120, 125, and 220 of the Examination, and indicated that he had "no dispute concerning the grading of [his solution to Problem] 222," notwithstanding that he had requested, in his March 26, 1999, letter to Ms. Lowe, that his solution to Problem 222 be rescored. Petitioner explained that he had made this request as a result of inadvertence and that he had actually intended to seek rescoring of his solution to Problem 220, not Problem 222. Problems 120, 125, and 222 were worth ten raw points each.

Problem 120 contained four subparts (or requirements). Petitioner initially received four raw points for his solution to Problem 120. Rescoring did not result in any change to this score. Petitioner solved two subparts of Problem 120 correctly (subparts (a) and (b)). The solutions to the other two subparts of Problem 120 (subparts (c) and (d)), however, were incorrect inasmuch as Petitioner had neglected, in making the lateral force calculations and drawing the diagrams required by these subparts, to include the force attributable to the movement of the groundwater referred to in the problem. Therefore, in accordance with the requirements and guidelines of the NCEES scoring plan for this problem, the highest raw score that he could have received for his solution to this problem was a four, which is the score he received. Problem 125 contained three subparts (or requirements). Petitioner initially received a raw score of two for his solution to Problem 125. Upon rescoring, no change was made to this raw score.
Petitioner correctly solved only one of the three subparts of Problem 125 (subpart (c)). In his solution to subpart (a) of Problem 125, Petitioner did not provide, as required by this subpart, the quantities of water, cement, and aggregate necessary for the project described in the problem. Petitioner's solution to subpart (b) did not describe one of the acceptable slump-increasing methods that the candidates were required to describe in their solutions to this subpart. Accordingly, giving Petitioner a raw score of two for his solution to Problem 125 was consistent with the requirements and guidelines of the NCEES scoring plan for this problem.

Petitioner received a raw score of two for his solution to Problem 220. He did not request, in his March 26, 1999, letter to Ms. Lowe, a rescoring of his solution to this problem, and, as a result, his solution was not rescored. At the administrative hearing, Petitioner testified on his own behalf regarding the scoring of this solution and, during his testimony, contended that the score he received was too low; however, neither a copy of the problem, nor a copy of the NCEES scoring plan for this problem, was offered into evidence. Accordingly, the record is insufficient to support a finding that the score Petitioner received for his solution to Problem 220 was undeservedly low in light of the NCEES scoring plan for this problem.

Petitioner initially received a raw score of eight for his solution to Problem 222. Rescoring resulted in this score being reduced two points to a six. Petitioner did not present any evidence supporting the position (which he advances in his Proposed Recommended Order) that he should have received a higher score for his solution to this problem, and, consequently, Respondent's expert, in his testimony at hearing, did not address the matter. While there were exhibits offered (by Respondent) and received into evidence relating to the scoring of Petitioner's solution to Problem 222, it is not apparent from a review of these exhibits that such scoring deviated from the requirements of the NCEES scoring plan for this problem (which was received into evidence as part of Respondent's Exhibit 12).
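The score arithmetic recited above reduces to a short recap; the sketch below uses only the raw and converted scores reported in the findings, and the NCEES raw-to-converted conversion scale itself is not reproduced or assumed.

    # Recap of the reported scores; only the two reported raw/converted pairs
    # and the 70-point passing mark are used.
    PASSING_CONVERTED = 70
    before_rescore = {"raw": 45, "converted": 67}
    after_rescore = {"raw": 43, "converted": 65}  # after rescoring of Problems 120, 125, and 222

    print(PASSING_CONVERTED - before_rescore["converted"])  # 3 -- converted points short originally
    print(PASSING_CONVERTED - after_rescore["converted"])   # 5 -- points short after rescoring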

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that a final order be entered rejecting Petitioner's challenge to the failing score he received from the NCEES on the Principles and Practice of Engineering portion of the October 30, 1998, engineering licensure examination. DONE AND ENTERED this 20th day of December, 1999, in Tallahassee, Leon County, Florida. STUART M. LERNER Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 20th day of December, 1999.

Florida Laws (5) 120.57, 455.217, 471.013, 471.015, 471.038 Florida Administrative Code (6) 61-11.010, 61-11.012, 61-11.015, 61-11.017, 61G15-21.001, 61G15-21.004
# 9
JAMES R. EASON vs BOARD OF PROFESSIONAL ENGINEERS, 97-003779 (1997)
Division of Administrative Hearings, Florida Filed:Brooksville, Florida Aug. 13, 1997 Number: 97-003779 Latest Update: Mar. 16, 1998

The Issue The issue in this case is whether Petitioner's request for license by endorsement as a professional engineer should be granted.

Findings Of Fact Based upon all of the evidence, the following findings of fact are determined: Petitioner, James R. Eason (Petitioner), is the pavement management coordinator for the Hernando County Public Works Department. He is a registered professional engineer in the State of Georgia, having received Professional Engineering Registration Number 17320 in 1988. In March 1997, Petitioner filed an application with Respondent, Board of Professional Engineers (Board), seeking licensure by endorsement as a professional engineer in this state. On July 1, 1997, the Board issued its preliminary decision in the form of a letter advising Petitioner that his application had been denied. As grounds, the Board stated that Petitioner had received a raw score of 67 with five points awarded for Veterans Preference on the Principles and Practice portion of the examination. The letter further explained that a raw score of 70 or above was required in order for his score on the Georgia examination to be recognized in the State of Florida and that "Chapter 471, F.S. does not provide for awarding of points for Veterans Preference." The denial of the application prompted Petitioner to bring this action. Petitioner is a graduate of, and holds a bachelor's degree in civil engineering from, the Georgia Institute of Technology. He has a record of four years active engineering experience of a character indicating competence to be in responsible charge of engineering. The parties have also stipulated he is of good moral character, and he has never been under investigation in another state for any act which would constitute a violation of Chapters 455 or 471, Florida Statutes. Petitioner passed the Fundamentals portion of the professional engineering examination administered in 1973 by the State of Georgia. He obtained a score of more than 70. In April 1988, Petitioner took the Principles and Practice portion of the examination. A grade of 70 was required to pass the Georgia examination. Petitioner received a grade of 67 on the initial scoring of the Principles and Practice portion of the examination, plus a five-point Veterans Preference credit, for a total grade of 72. The Veterans Preference credit is provided by Georgia law to all candidates who are members or former members of the Armed Forces of the United States and meet certain service requirements. In Petitioner's case, he had served eight years on active duty as a member of the United States Naval Reserve, and he was honorably discharged as a Lieutenant on July 3, 1969, upon expiration of his active duty commitment. At least ninety days of his active duty military service was during wartime or at a time when military personnel were committed by the President of the United States. The examination administered by the State of Georgia in April 1988 was a national examination published by the National Council of Examiners for Engineering and Surveying, and it was identical to the examination administered by the State of Florida at that time. Florida, like Georgia, requires a grade of 70 to pass the examination, but it does not provide a Veterans Credit for service to candidates who are members or former members of the Armed Forces of the United States. Therefore, in the State of Georgia, a veteran can pass the examination with a raw score as low as 65. To this extent, the two examinations are not substantially equivalent. 
Among other things, Petitioner pointed out at hearing that he needed only three points to achieve a passing grade on the Principles and Practice portion of the examination. Therefore, he concluded that the awarding of that amount of extra points for being a veteran amounted to only a single standard deviation, and thus the extra points were immaterial in relation to the overall score. However, the Board does not construe this three-point deficiency as being "immaterial," and had Petitioner received the same score in Florida, he would not have passed the examination.
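The two states' treatment of the Veterans Preference credit, as found above, comes down to a one-line difference in the pass rule; the sketch below simply restates the scores and credit recited in the findings.

    # Comparison of the Georgia and Florida pass rules using the scores found above.
    RAW_SCORE = 67
    VETERANS_CREDIT = 5   # awarded under Georgia law to qualifying veterans
    PASSING = 70

    georgia_pass = (RAW_SCORE + VETERANS_CREDIT) >= PASSING  # True:  67 + 5 = 72
    florida_pass = RAW_SCORE >= PASSING                      # False: no credit is added
    print(georgia_pass, florida_pass)                        # True False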

Recommendation Based on the foregoing findings of fact and conclusions of law, it is RECOMMENDED that the Board of Professional Engineers enter a Final Order denying Petitioner's request for licensure by endorsement as a professional engineer. DONE AND ORDERED this 25th day of November 1997, in Tallahassee, Leon County, Florida. DONALD R. ALEXANDER Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 25th day of November, 1997. COPIES FURNISHED: Joseph M. Mason, Jr., Esquire Post Office Box 1090 Brooksville, Florida 34605-1900 Edwin A. Bayo, Esquire Department of Legal Affairs The Capitol Tallahassee, Florida 32399-1050 Angel Gonzalez, Executive Director Board of Professional Engineers 1940 North Monroe Street Tallahassee, Florida 32399-0755

Florida Laws (2) 120.57, 471.015 Florida Administrative Code (1) 61G15-21.004
# 10
