The Issue Whether the Petitioner's challenge to the licensure examination should be sustained.
Findings Of Fact The Petitioner is a candidate for optometry licensure. He took the examination for licensure in August 1999. The Respondent is the state agency charged with the responsibility of administering license examinations. In September 1999, the results of the August 1999 examination were provided to Petitioner. The examination grade report advised Petitioner that he had failed two portions of the licensure examination. A candidate must pass all portions of the exam to become licensed. As to the clinical portion of the examination, the Petitioner challenged the results due to what he maintained were "discrepancies" in the grading system. As to each question challenged in the clinical portion, the Petitioner cited the differing grades from the two examiners as the basis for his dispute. When the Petitioner received credit for the question from one examiner, he believed he should have received credit from the second as well. The clinical portion of the exam was scored by two examiners who independently reviewed the candidate's work. Typically, the candidate for licensure indicates when the examiner is to evaluate the work by stating "grade me now." As to each task, the candidate receives two scores. The scores are added together and divided by two to reach the overall clinical score. Based upon when the candidate directs the examiner to grade, it is possible to receive conflicting results in the scoring process. It is the overall score that determines whether a candidate receives a passing grade on the clinical portion. According to Dr. Liebetreu, a marginal candidate may well be able to correctly perform the task for one examiner yet do so incorrectly for the second reviewer. The method of scoring therefore gives the marginal candidate some credit. As to the questions challenged in the pharmacological portion of the exam, the Petitioner argued that the questions were misleading or had multiple correct answers. Each question challenged offered one most correct answer that the Petitioner should have selected in order to receive full credit. The Petitioner has failed to establish that the answers he provided were "more correct" than the ones used by the Respondent to grant credit. The photographs used in the examination were of sufficient quality to provide the candidate with appropriate views to answer questions. The questions challenged were not ambiguous or misleading. The candidates were provided adequate time to complete all portions of the examination. Persons scoring the Petitioner's work during the clinical portion of the exam were not permitted to confer. Their scores were to be based solely on the work they observed. The overall scores issued by persons scoring the Petitioner's work were within acceptable statistical standards.
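Because the two-examiner averaging described above is the mechanism the Petitioner disputed, a minimal arithmetic sketch may help; the point values below are invented for illustration, since the order does not reproduce the actual per-task point scale.

```python
def clinical_task_score(examiner_one: float, examiner_two: float) -> float:
    """Average the two independent examiner scores for one clinical task."""
    return (examiner_one + examiner_two) / 2

# A marginal candidate may perform a task correctly for one examiner but
# incorrectly for the second; averaging still yields partial credit.
split_grading = clinical_task_score(1.0, 0.0)  # credit from one examiner only
print(split_grading)  # 0.5 -- some credit, consistent with Dr. Liebetreu's testimony
```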
Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that the Department of Health, Board of Optometry, enter a final order denying the Petitioner's challenge to the August 1999 examination. DONE AND ENTERED this 20th day of June, 2000, in Tallahassee, Leon County, Florida. J. D. PARRISH Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 20th day of June, 2000. COPIES FURNISHED: Navin Singh, O.D., pro se 103 Knights Court Royal Palm Beach, Florida 33411 Amy M. Jones, Esquire Office of the General Counsel Department of Health 2020 Capital Circle Southeast, Bin A02 Tallahassee, Florida 32399-1703 Eric G. Walker, Executive Director Board of Optometry Department of Health 1940 North Monroe Street Tallahassee, Florida 32399-0750 William W. Large, General Counsel Department of Health 4052 Bald Cypress Way Bin A00 Tallahassee, Florida 32399-1701 Angela T. Hall, Agency Clerk Department of Health 2020 Capital Circle Southeast, Bin A02 Tallahassee, Florida 32399-1703
The Issue The issues presented are: (1) whether Respondent wrongfully excluded the Candidate/Petitioner's reference materials from the April 19, 1990, engineering examination, and if so, (2) whether the Candidate/Petitioner received a failing grade because the materials were wrongfully excluded.
Findings Of Fact The Petitioner (#100021) received a score of 69.0 on the Professional Engineer Fundamentals Examination given April 19, 1990. A minimum passing score was 70.0 on the examination, which is written by the National Council of Engineering Examiners and graded by Educational Testing Service. (Transcript Pages 36 and 39) Prior to the April 1990 examination, the Board sent each candidate a letter, dated December 18, 1989 (Exhibit P-1) (Transcript Pages 9 and 12), which said, "No review publications directed principally toward sample questions and their solutions of engineering problems are permitted in the examination room." (Transcript Page 31). The candidates were also provided with a "Candidate Information Booklet" dated January 1990 (Exhibit R-1, Transcript Page 77). The booklet states on page 14, "No books with contents directed toward sample questions or solutions of engineering problems are permitted in the examination room." (Transcript Pages 77 and 96). Petitioner, who had also taken the October 1989 examination, received notice at that examination that the Board of Engineers intended to change the procedure allowing reference materials in the examination. (Transcript Page 89 and Respondent's Exhibit 2.) The Board of Professional Engineers advised the examination supervisor and proctors that no engineering "review" materials would be allowed in the examination, although engineering "reference" materials could be brought into and used for the examination. However, the excluded books included books without "review" in the title, books with "reference" in the title, and books which contained problems and solutions. Before the examination began, Deena Clark, an examination supervisor, read over a loudspeaker system the names of books that would not be permitted (Transcript Page 81). Practice examination and solution manuals were not allowed for use by engineering candidates (Transcript Pages 93 and 94). Schaum's outlines and other materials were also excluded (Transcript Page 91). Also excluded was Lindeburg's 6th edition, "Engineer-In-Training Review Manual." (Transcript Pages 16 and 79). This decision was verified by the Board before the examination began (Transcript Page 81). After the examination had begun, Ms. Clark announced that the candidates could put certain copyrighted materials that had been excluded earlier in a three-ring binder and use them (Transcript Page 85). This was in response to candidates who needed economics tables for the examination. However, no time was provided for the candidates to prepare these references, and only one minute was added to the examination time. (Transcript Page 85). Petitioner did not bring any economic tables to the examination site except those contained in books which were not allowed in the examination. (Transcript Page 19). Petitioner did not remove the economic tables and permitted references from Lindeburg's review manual until lunch, and these tables were not available to him on the morning examination. (Transcript Pages 22 and 88). Of the six engineering economics questions on the morning portion of the examination, the candidate correctly answered four. No data was provided on the nature of these questions. The Candidate correctly answered 53 questions in the morning (weighted x 1) and 23 questions in the afternoon (weighted x 2) for a total of 99 weighted points. He answered eight questions correctly in the "additional" portion of the examination.
The table for eight additional questions correct in the "Scoring Information Booklet" used in determining the candidate's final grade shows the adjusted equated score was 126 and his scaled score was 69. (Page 21 of booklet). Each economics question, converted to the final scoring scale, was worth enough that passage of one more economics question would have resulted in passage of the examination. The exclusion of certain materials from the examination was arbitrary and capricious and was done by a few individuals without any stated objective standard published by the board. Further, the board knew before the examination which books were to be excluded and could have notified examinees of the exact items to be excluded. The Board's generally poor handling of this matter is exemplified in announcing after the examination had begun that items previously excluded could be used if placed in a ring binder, but not allowing any time to prepare such materials. (Tx. pgs. 74-80, 84-86, and 91-97) The Petitioner would have used several tables which were excluded if the announcement had been made before the morning examination began, with time to put the items in acceptable form. After notifications in October 1989, December 1989, and January 1990, Petitioner admitted that he did not call the Board of Professional Engineers to ask for guidance on books that would not be allowed on the April 1990 examination (Transcript Page 29). However, a final decision on books to be excluded was not made until approximately two weeks before the examination. The Petitioner did not show that the two questions which he missed on the Engineering Economics portion of the morning examination were missed for lack of the tables. The examination is a national examination, and there is no evidence that the requirements and limits established by the Board in Florida were applicable nationwide. To alter the national instructions locally potentially adversely affects Florida results.
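The weighted-score arithmetic in these findings can be restated compactly; the sketch below uses only the figures recited above, and the equating step (weighted score to adjusted equated score to scaled score) is noted rather than computed, because the conversion table in the "Scoring Information Booklet" is not reproduced in the order.

```python
# Weighted raw score recited in the findings.
morning_correct = 53    # morning questions, weighted x 1
afternoon_correct = 23  # afternoon questions, weighted x 2
weighted_score = morning_correct * 1 + afternoon_correct * 2
print(weighted_score)   # 99 weighted points

# Per the findings, the booklet's table maps this performance to an adjusted
# equated score of 126 and a scaled score of 69 (70 is passing). One more
# correct morning economics question would have added one weighted point,
# and each economics question was worth enough on the final scale that the
# result would have been a passing score.
```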
Recommendation Based upon the foregoing findings of fact and conclusions of law, it is recommended that the Petitioner be permitted to take the examination without charge on one occasion. RECOMMENDED this 27th day of March, 1991, in Tallahassee, Florida. STEPHEN F. DEAN Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, FL 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 27th day of March, 1991. 1/ The general information provided to examinees by the State Board regarding the values of questions on the examination and scoring is misleading or inaccurate because neither the weighted required score nor the adjusted score was 48% of 80, 280, or any other number related to the scaled score of 70. The manner in which these values are associated with the scaled score of 70 is contrary to the Board's explanation and is not self-evident. This is a potential problem if the matter were formally challenged, and it appears the Board needs to reassess its procedures and instructions. APPENDIX TO RECOMMENDED ORDER, CASE NO. 90-5728 The Petitioner did not submit proposed findings. The Respondent submitted proposed findings which were read and considered. The following proposed findings were adopted or rejected for the reasons stated: 1. Adopted. Issue not fact. 2.-4. Rejected. Preliminary statement not fact. 5.-12. Adopted. 13. Rejected. Preliminary statement not fact. 14. Rejected as irrelevant. 15. Rejected as preliminary statement. 16. Adopted. 17. Adopted. COPIES FURNISHED: Alan K. Garman Civil-Tech, Inc. 3573 Commercial Way Street B Spring Hill, FL 34606 William F. Whitson, Law Clerk Department of Professional Regulation 1940 North Monroe Street Tallahassee, FL 32399-0792 Rex Smith Executive Director Board of Professional Engineers Department of Professional Regulation 1940 North Monroe Street Tallahassee, FL 32399-0792 Jack McRay, General Counsel Department of Professional Regulation 1940 North Monroe Street Tallahassee, FL 32399-0792
Findings Of Fact The Petitioner sat for the October 1993 administration of the licensure examination for Metallurgical Engineering. When his examination was graded, he was assigned a raw score of 45 points. A raw score of 48 points is the minimum passing grade on the subject examination. The Respondent stipulated at hearing that the Petitioner is entitled to one additional raw score point, which brings the Petitioner's total undisputed raw score to 46. One of the essay questions on the subject examination was Item 258. According to the scoring plan for Item 258, an exam-taker could earn from 0 to 10 points in two-point increments depending on the quality of his answer. The scoring plan for Item 258 specifies that 2 points should be awarded for an answer that demonstrates "rudimentary knowledge" and that 4 points should be awarded for an answer that demonstrates "more than rudimentary knowledge but [is] insufficient to demonstrate competence." When the Petitioner's examination was graded the first time, he was awarded 0 points for his answer to Item 258. When the Petitioner's examination was regraded, he was awarded 2 points for his answer to Item 258. 1/ The evidence at hearing establishes that the Petitioner's answer to Item 258 demonstrates more than rudimentary knowledge, but is insufficient to demonstrate competence. 2/ Accordingly, pursuant to the scoring plan for Item 258 the Petitioner is entitled to receive 4 points for his answer to Item 258.
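The raw-score arithmetic behind the recommended result can be shown in a few lines; the sketch assumes, as the 48-point outcome implies, that the 46-point undisputed total already reflects the 2 points awarded for Item 258 on regrade.

```python
undisputed_raw_score = 46  # 45 as graded plus 1 stipulated point
item_258_regrade = 2       # awarded on regrade ("rudimentary knowledge" tier)
item_258_warranted = 4     # "more than rudimentary knowledge" tier

# Moving Item 258 from the 2-point tier to the 4-point tier adds the
# difference to the undisputed total.
final_raw_score = undisputed_raw_score + (item_258_warranted - item_258_regrade)
print(final_raw_score)     # 48 -- the minimum passing raw score
```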
Recommendation On the basis of all of the foregoing, it is RECOMMENDED that a Final Order be issued in this case concluding that the Petitioner is entitled to a raw score of 48 points on the subject examination, which is a passing grade. DONE AND ENTERED this 8th day of March 1995 in Tallahassee, Leon County, Florida. MICHAEL M. PARRISH Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 8th day of March 1995.
The Issue The ultimate issue in this proceeding is whether proposed Florida Administrative Code Rule 61G15-21 is an invalid exercise of delegated legislative authority.
Findings Of Fact Florida Administrative Code Rule 61G15-21.004, in relevant part, states: The criteria for determining the minimum score necessary for passing the Engineering Fundamentals Examination shall be developed through the collective judgment of qualified experts appointed by NCEES to set the raw score that represents the minimum amount of knowledge necessary to pass the examination. The judges shall use a Modified Angoff Method in determining the minimally acceptable raw score necessary to pass the Fundamentals of Engineering Examination. Using the above mentioned Modified Angoff Method, the judges will indicate the probability that a minimally knowledgeable Fundamentals of Engineering examinee would answer any specific questions correctly. The probability of a correct response is then assigned to each question. Each judge will then make an estimate of the percentage of minimally knowledgeable examinees who would know the answer to each question. The totals of each of the judges are added together and divided by the number of judges to determine the overall estimate of the minimum standards necessary. The minimum number of correct answers required to achieve a passing score will take into account the relative difficulty of each examination through scaling and equating each examination to the base examination. The raw score necessary to show competence shall be deemed to be a 70 on a scale of 100. A passing grade on Part Two of the examination is defined as a grade of 70 or better. The grades are determined by a group of knowledgeable professional engineers, who are familiar with engineering practice and with what is required for an applicable engineering task. These professional engineers will establish a minimum passing score on each individual test item (i.e., examination problem). An Item Specific Scoring Plan (ISSP) will be prepared for each examination item based upon the NCEES standard scoring plan outline form. An ISSP will be developed by persons who are familiar with each discipline including the item author, the item scorer, and other NCEES experts. On a scale of 0-10, six (6) will be a minimum passing standard and scores between six (6) and ten (10) will be considered to be passing scores for each examination item. A score of five (5) or lower will be considered an unsatisfactory score for that item and the examinee will be considered to have failed that item. To pass, an examinee must average six (6) or greater on his/her choice of eight (8) exam items, that is, the raw score must be forty-eight (48) or greater based on a scale of eighty (80). This raw score is then converted to a base 100 on which, as is noted above, a passing grade will be seventy (70). The proposed changes to Florida Administrative Code Rule 61G15-21.004, in relevant part, state: The passing grade for the Engineering Fundamentals Examination is 70 or better. The criteria for determining the minimum score necessary for passing the Engineering Fundamentals Examination shall be developed through the collective judgment of qualified experts appointed by NCEES to set the raw score that represents the minimum amount of knowledge necessary to pass the examination. The judges shall use a Modified Angoff Method in determining the minimally acceptable raw score necessary to pass the Fundamentals of Engineering Examination.
Using the above mentioned Modified Angoff Method, the judges will indicate the probability that a minimally knowledgeable Fundamentals of Engineering examinee would answer any specific questions correctly. The probability of a correct response is then assigned to each question. Each judge will then make an estimate of the percentage of minimally knowledgeable examinees who would know the answer to each question. The totals of each of the judges are added together and divided by the number of judges to determine the overall estimate of the minimum standards necessary. The minimum number of correct answers required to achieve a passing score will take into account the relative difficulty of each examination through scaling and equating each examination to the base examination. The raw score necessary to show competence shall be deemed to be a 70 on a scale of 100. The passing grade for the Principles and Practice Examination is 70 or better. A passing grade on Part Two of the examination is defined as a grade of 70 or better. The grades are determined by a group of knowledgeable professional engineers, who are familiar with engineering practice and with what is required for an applicable engineering task. These professional engineers will establish a minimum passing score on each individual test item (i.e., examination problem). An Item Specific Scoring Plan (ISSP) will be prepared for each examination item based upon the NCEES standard scoring plan outline form. An ISSP will be developed by persons who are familiar with each discipline including the item author, the item scorer, and other NCEES experts. On a scale of 0-10, six (6) will be a minimum passing standard and scores between six (6) and ten (10) will be considered to be passing scores for each examination item. A score of five (5) or lower will be considered an unsatisfactory score for that item and the examinee will be considered to have failed that item. To pass, an examinee must average six (6) or greater on his/her choice of eight (8) exam items, that is, the raw score must be forty-eight (48) or greater based on a scale of eighty (80). This raw score is then converted to a base 100 on which, as is noted above, a passing grade will be seventy (70). Petitioner resides in Tampa, Florida. On April 11, 2003, Petitioner took a national examination that Petitioner must pass to be licensed by the state as a professional engineer. On July 1, 2003, Petitioner received a letter from the Board advising Petitioner that he had received a failing grade on the examination. On July 2, 2003, Petitioner unsuccessfully requested the raw scores on his examination from a representative of the National Council of Examiners for Engineering and Surveying (NCEES). The NCEES is the national testing entity that conducts examinations and determines scores for the professional engineer examination required by the state. On July 9, 2003, Petitioner submitted a formal request to the Board for all of the raw scores related to Petitioner "and all past P.E. Exams that the Petitioner had taken." A representative of the Board denied Petitioner's request, explaining that the raw scores are kept by the NCEES and "it is not their policy to release them." The Board's representative stated that the Board was in the process of adopting new rules "that were in-line with the policies of the NCEES."
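The Modified Angoff averaging quoted in both versions of the rule reduces to a short computation; the per-question probabilities below are invented for illustration, and the raw-to-scaled conversion is shown only as the fixed anchor the rule states (48 of 80 raw reported as 70 of 100).

```python
# Each judge estimates, per question, the probability that a minimally
# knowledgeable examinee answers correctly; judges' totals are summed and
# divided by the number of judges to set the minimum passing raw score.
judges_estimates = [
    [0.7, 0.6, 0.8],  # judge 1 (illustrative values only)
    [0.6, 0.5, 0.9],  # judge 2
    [0.8, 0.6, 0.7],  # judge 3
]
totals = [sum(per_question) for per_question in judges_estimates]
minimum_raw_score = sum(totals) / len(judges_estimates)
print(round(minimum_raw_score, 2))  # overall estimate of the minimum standard

# Part Two anchor stated by the rule: average 6 or better on eight items,
# i.e. a raw score of at least 48 out of 80, reported as 70 on a 100-point
# scale. Note 48/80 is 60%, so the 70 is a converted grade, not a percentage.
```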
On July 31, 2003, Petitioner requested the Board to provide Petitioner with any statute or rule that authorized the Board to deny Petitioner's request for raw scores pursuant to Section 119.07(1)(a), Florida Statutes (2003). On the same day, counsel for the Board explained to Petitioner that the Board is not denying the request. The Board is unable to comply with the request because the Board does not have physical possession of the raw scores. Petitioner and counsel for Respondent engaged in subsequent discussions that are not material to this proceeding. On August 6, 2003, Petitioner requested counsel for Respondent to provide Petitioner with copies of the proposed rule changes that the Board intended to consider on August 8, 2003. On August 27, 2003, Petitioner filed a petition with the Board challenging existing Florida Administrative Code Rule 61G15-21.004. The petition alleged that parts of the existing rule are invalid. Petitioner did not file a challenge to the existing rule with DOAH. The Petition for Hearing states that Petitioner is filing the Petition for Hearing pursuant to Subsections 120.56(1) and (3)(b), Florida Statutes (2003). However, the statement of how Petitioner's substantial interests are affected is limited to the proposed changes to the existing rule. During the hearing conducted on January 29, 2004, Petitioner explained that he does not assert that the existing rule is invalid. Rather, Petitioner argues that the Board deviates from the existing rule by not providing examinees with copies of their raw scores and by failing to use raw scores in the determination of whether an applicant achieved a passing grade on the exam. Petitioner further argues that the existing rule benefits Petitioner by purportedly requiring the Board to use raw scores in the determination of passing grades. The elimination of that requirement in the proposed rule arguably will adversely affect Petitioner's substantial interests. The Petition for Hearing requests several forms of relief. The Petition for Hearing seeks an order granting Petitioner access to raw scores, a determination that Petitioner has met the minimum standards required under the existing rule, and an order that the Board grant a license to Petitioner. The Petition for Hearing does not request an order determining that the proposed rule changes constitute an invalid exercise of delegated legislative authority.
Findings Of Fact L.B. Thanki received a degree in Civil Engineering at the University of Durham at King's College, Newcastle Upon Tyne, in the United Kingdom in 1956. Petitioner received a bachelor of law degree from Sardar Patel University (India) in 1967. This degree is the equivalent of two years' study in law. The degree obtained from the University of Durham is not the equivalent of the degree received from an ABET-approved university in the United States because it lacks 16 credit hours in Humanities and Social Sciences. Petitioner presented no evidence that his degree from the University of Durham or the curriculum he completed at any other university included the missing 16 hours in Humanities and Social Sciences. Petitioner presented a certificate (which was not offered into evidence) that he had completed a course in computer services meeting the board's evidentiary requirements for computer skills.
Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that a Final Order be entered denying Petitioner's application for licensure by examination as an engineering intern. RECOMMENDED this 10th day of May, 1991, in Tallahassee, Leon County, Florida. N. AYERS Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 10th day of May, 1991. COPIES FURNISHED: B. Thanki 1106 East Hillsborough Avenue Tampa, Florida 33604 Edwin A. Bayo, Esquire Assistant Attorney General Department of Legal Affairs The Capitol, Suite LL04 Tallahassee, Florida 32399-1050 Carrie Flynn, Acting Executive Director Florida Board of Professional Engineers Northwood Centre, Suite 60 1940 North Monroe Street Tallahassee, Florida 32399-0755 Jack L. McRay, General Counsel Department of Professional Regulation Northwood Centre, Suite 60 1940 North Monroe Street Tallahassee, Florida 32399-0792
The Issue The issue for disposition in this proceeding is whether Petitioner is entitled to a passing grade on the Principles and Practice of Engineering examination administered on October 30, 1998.
Findings Of Fact Petitioner is an applicant for licensure as a professional engineer in the State of Florida. Respondent is a nonprofit corporation created by the Florida Legislature to provide administrative, investigative and prosecutorial services to the Board of Professional Engineers pursuant to Section 471.038, Florida Statutes. On October 30, 1998, Petitioner sat for the Principles and Practice Engineering Examination in electrical engineering. This is a nation-wide examination developed, controlled, and administered by the National Council of Examiners for Engineering and Surveying (NCEES). Petitioner received a raw score of 47 on this examination. For the electrical engineering discipline, a raw score of 47 results in a converted score of 69. A minimum converted score of 70 is required to pass this examination. A raw score of 48 results in a converted score of 70. Petitioner needs 1 raw score point to achieve a passing score on this examination. Petitioner initially challenged the scoring of multiple choice questions nos. 527 and 530. Petitioner had received a raw score of 0 on these two questions. Petitioner requested NCEES to rescore questions nos. 527 and 530, but after the rescoring NCEES determined that he was not entitled to any additional raw score points. Questions nos. 527 and 530 are each worth 1 raw score point. Petitioner's answer to question no. 527 represents the most practical, "real world" answer to this question, as conceded by Respondent's expert, and Petitioner is entitled to 1 raw score point for his answer. Although he made an articulate, reasonable explanation for his answer to question no. 530 and for his challenge to the text of the question, Petitioner has agreed to accept the 1 additional point he needed for a passing score and abandon the challenge to question no. 530 as moot.
Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that a final order be entered granting Petitioner credit for his response to examination question no. 527 and adjusting his examination grade to reflect a passing score. DONE AND ENTERED this 3rd day of August, 1999, in Tallahassee, Leon County, Florida. MARY CLARK Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 3rd day of August, 1999. COPIES FURNISHED: David E. Alley 4827 Springwater Circle Melbourne, Florida 32940 William H. Hollimon, Esquire Ausley & McMullen, P.A. 227 South Calhoun Street Tallahassee, Florida 32302 Dennis Barton, Executive Director Board Professional Engineers Department of Business and Professional Regulation 1208 Hays Street Tallahassee, Florida 32301 Natalie A. Lowe, Esquire Vice President for Legal Affairs Florida Engineers Management Corporation Department of Business and Professional Regulation 1208 Hays Street Tallahassee, Florida 32301 William Woodyard, General Counsel Department of Business and Professional Regulation Northwood Centre 1940 North Monroe Street Tallahassee, Florida 32399-0792
The Issue The issues for determination in this case are: 1) whether the Respondent’s decision to award a contract to operate a juvenile work release halfway house program to the Henry and Rilla White Foundation was clearly erroneous, contrary to competition, arbitrary, or capricious; and 2) whether the award of the contract is void as a matter of law because of procedural violations by the selection committee and the Respondent.
Findings Of Fact Petitioner, JUVENILE SERVICES PROGRAM, INC. (JSP), is a Florida-based private not-for-profit corporation which was founded to serve troubled youths and their families. Respondent, FLORIDA DEPARTMENT OF JUVENILE JUSTICE (DJJ), is the agency of the State of Florida with the statutory authorization for planning, coordinating, and managing programs for the delivery of services within the juvenile justice consortium. Section 20.316, Florida Statutes. RFP #16P05 On September 27, 1996, Respondent DJJ advertised and released a Request For Proposal (RFP) #16P05 to provide a Work Release Halfway House for Delinquent Males in District IX, serving Palm Beach County, Florida. In response to the RFP, four bids were submitted to DJJ by the following parties: the Henry and Rilla White Foundation, Total Recovery, Inc., Psychotherapeutic Services Inc., and Petitioner JSP. The members of the DJJ bid selection committee for the RFP were Jack Ahern, Steve Brown, Jaque Layne, Patricia Thomas, and, from the Office of Budget Finance, Fred Michael Mauterer. The contract manager for the RFP was Diane Rosenfelder. On October 28, 1996, each DJJ evaluator was sent a package consisting of a copy of the RFP, which included the evaluation sheet, a copy of each proposal submitted to DJJ, a conflict of interest questionnaire, a certificate of compliance, a description of the proposal selection process, and instructions. Each package sent to the evaluators had a different colored cover sheet which identified the specific evaluator. After completing the evaluations, each evaluator returned the signed conflict of interest forms and certificates of compliance to Diane Rosenfelder. The evaluations were identified by the color of the cover sheets, as well as the signed conflict of interest forms and certificates of compliance. DJJ initially intended to provide each evaluator with an Award Preference Form, which was to be used in the event the final evaluation scores were very close. The Award Preference Forms, however, were inadvertently omitted from the packages sent to the evaluators. The evaluation process resulted in the Henry and Rilla White Foundation receiving the highest average score of 391.50 points. Petitioner JSP received the second highest average score of 360.50 points. The award of points was determined by each evaluator, as indicated by the evaluator checking the box in Section 5 of the evaluation sheet or by filling in the appropriate point score. The contract manager, Diane Rosenfelder, corrected addition errors on the scoring sheets. The budget part of the evaluation was completed by Fred Michael Mauterer, Senior Management Analyst Supervisor. In accordance with the evaluation scores, DJJ determined that the best response was submitted by the Henry and Rilla White Foundation, which was awarded the contract. On November 8, 1996, Petitioner JSP filed a timely Notice of Protest of the award, which was supplemented on December 9, 1996, with the required posting of a $5000 bond. Alleged Errors and Discrepancies in the Evaluation Process Petitioner JSP alleges that several errors in the evaluation process require that the contract award to the Henry and Rilla White Foundation be set aside and that the RFP be reissued and rebid. Petitioner first alleges that the bid selection committee failed to follow certain instructions during the evaluation process. The instructions were prepared by the contract manager, Diane Rosenfelder. The instructions were not required by rule or policy of DJJ.
The contract manager considered the instructions advisory in nature. The instructions stated that the members of the bid selection committee should not contact each other with respect to the proposals under evaluation. The evaluators, however, were permitted to contact the contract manager, who would record all questions and answers. There were instances in which the contract manager did not record questions from the evaluators to the contract manager. There is no evidence that the evaluators contacted each other regarding the proposals during the evaluation process. The instructions asked the evaluators to explain high or low scores given to the proposals under consideration. None of the evaluators made specific explanations of high or low scores. The contract manager who prepared the instructions considered this instruction discretionary, and there is no evidence that any score given by an individual evaluator was without basis. The evaluators were instructed to provide page numbers from the proposals used to score each item. None of the evaluators complied with this instruction. As indicated above, however, there is no evidence that the actual scores given by the evaluators were without basis. As set forth above, none of the evaluators received the Award Preference Form. This form was to be used in the case of very close scoring of the proposals. The actual scores from the bid selection committee reflected a clear preference for the proposal submitted by the Henry and Rilla White Foundation. Accordingly, there was no demonstrated need for DJJ to rely upon the Award Preference Forms in making its decision to award the contract. The letter of introduction sent to the bid selection committee members from the contract manager stated that the proposal score sheets, the evaluators' award preference, and the best interest of the district would be considered in determining the award. The contract manager considered this statement advisory in nature. DJJ has not promulgated specific standards relating to the best interest of District IX; however, the proposal evaluation forms sent to the bid selection committee inherently include criteria setting out standards for the determination of the best proposal for the district. The evidence reflects that one of the evaluators, Patricia Thomas, erroneously checked the box on each proposal which gave each of the proposals fifty points as certified minority enterprises, and erroneously wrote "50" as a point count on one evaluation score sheet. None of the proposals included a copy of the certification for minority enterprise as required by Section 287.0945, Florida Statutes, and the contract manager recognized that the evaluator had made a mistake in this regard. In response to this error, the contract manager consulted her supervisors. Because each proposal was awarded the same points, DJJ did not consider the evaluator's error as prejudicial to any proposal or to the bid selection process, and did not reject the evaluator's scoring of the proposals. There is no showing that Petitioner JSP was prejudiced by DJJ's decision in this regard. The contract manager added signature lines to the last page of the evaluation sheets. Some of the sheets were returned unsigned from the evaluators. There is no DJJ requirement that the evaluation sheets specifically contain the signatures of the evaluators.
The contract manager did not consider the signature page mandatory, and the proposal evaluation score sheets were clearly identified by both color coding and the conflict of interest forms signed by the evaluators. There is no evidence that the procedural discrepancies affected the substance of the evaluators' scoring of the proposals, nor did the procedural discrepancies prejudice the evaluators' consideration of Petitioner's proposal.
Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that the Respondent enter a final order upholding the proposed agency action to award the contract to the Henry and Rilla White Foundation, and dismissing the Petition filed in this case. DONE and ORDERED this 23rd day of April, 1997, in Tallahassee, Florida. RICHARD HIXSON Administrative Law Judge Division of Administrative Hearings DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (904) 488-9675 SUNCOM 278-9675 Fax Filing (904) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 23rd day of April, 1997. COPIES FURNISHED: Dominic E. Amadio, Esquire Republic Bank Building, Suite 305 100 34th Street North St. Petersburg, Florida 33713 Scott C. Wright, Assistant General Counsel Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100 Calvin Ross, Secretary Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100 Janet Ferris, General Counsel Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100
The Issue The primary issue is whether the process used by the Department of Education (Department) for evaluating and ranking the proposals submitted in response to Request For Proposal (RFP) 99-03 for the Florida Comprehensive Assessment Test (FCAT) administration contract was contrary to the provisions of the RFP in a way that was clearly erroneous, contrary to competition, arbitrary, or capricious.
Findings Of Fact The RFP for the FCAT describes a five stage process for evaluating proposals. In Stage I, the Department’s Purchasing Office determined whether a proposal contained certain mandatory documents and statements and was sufficiently responsive to the requirements of the RFP to permit a complete evaluation. Stage II involved the Department’s evaluation of a bidder’s corporate qualifications to determine whether the bidder has the experience and capability to do the type of work that will be required in administering the FCAT. Stage III was the Department’s evaluation of a bidder’s management plan and production proposal. In Stage IV, the Department evaluated a bidder’s cost proposal. Stage V involved the ranking of proposals based on points awarded in Stages II-IV. If a proposal did not meet the requirements at any one stage of the evaluation process, it was not to be evaluated in the following stage. Instead, it was to be disqualified from further consideration. Stages II and III of the evaluation process were conducted by an evaluation team comprised of six Department employees: Dr. Debby Houston, Ms. Lynn Joszefczyk, Dr. Peggy Stillwell, Dr. Cornelia Orr, Dr. Laura Melvin, and Ms. Karen Bennett. Dr. Thomas Fisher, head of the Department’s Assessment and Evaluation Services Section, and Dr. Mark Heidorn, Administrator for K-12 Assessment Programs within the Department’s Assessment and Evaluation Services Section, served as non-voting co-chairs of the evaluation team. The focus of this proceeding is Stage II of the evaluation process addressing a bidder’s corporate qualifications. RFP Provisions Regarding Corporate Qualification The FCAT administration contractor will be required to administer tests to approximately one and a half million students each year in a variety of subject areas at numerous grade levels. The FCAT program involves a complex set of interrelated work activities requiring specialized human resources, technological systems and procedures. The FCAT must be implemented annually within limited time periods. The FCAT administration contractor must meet critical deadlines for the delivery of test materials to school districts and the delivery of student scores prior to the end of the school year. In developing the RFP, the Department deliberately established a set of minimum requirements for corporate qualifications that a bidder was to demonstrate in order for its proposal to be eligible for further evaluation. The purpose of the RFP’s minimum corporate qualifications requirements was to limit bidding to qualified vendors who have demonstrated prior experience in successfully administering large-scale assessment projects like the FCAT, thereby providing the Department with some degree of assurance that the winning bidder could successfully administer the FCAT. The instructions to bidders regarding the minimum requirements for corporate qualifications are contained in RFP Section 10, which gives directions on proposal preparation. Section 10.1, which lists certain mandatory documents and statements to be included in the bidder’s proposal, requires that a transmittal letter contain "[a] statement certifying that the bidder has met the minimum corporate qualifications as specified in the RFP." These "minimum corporate qualifications" are set forth in RFP Appendix J. RFP Section 10.2 identifies what a bidder is required to include in its proposal with respect to corporate qualifications. 
The first paragraph of Section 10.2 directs a bidder generally to describe its qualifications and experience performing tasks similar to those that it would perform in administering the FCAT, in order to demonstrate that the bidder is qualified where it states: Part II of a bidder’s proposal shall be entitled Corporate Qualifications. It shall provide a description of the bidder’s qualifications and prior experience in performing tasks similar to those required in this RFP. The discussion shall include a description of the bidder’s background and relevant experience that qualifies it to provide the products and services required by the RFP. RFP Section 10.2, however, is not limited to a directive that qualifications and past experience be described generally. Instead, Section 10.2, also communicates, in plain and unambiguous terms, that there are specific minimum corporate qualifications a bidder must demonstrate: The minimum expectations for corporate qualifications and experience are shown in Appendix J. There are two separate sets of factors, one set of eight for the developmental contractor and another set of nine for the administration contractor. Bidders must demonstrate their Corporate Qualifications in terms of the factors that are applicable to the activities for which a bid is being submitted -- development or administration. For each criterion, the bidder must demonstrate that the minimum threshold of experience has been achieved with prior completed projects. (Emphasis added.) Moreover, Section 10.2 singles out for emphasis, in relation to the administration component of the RFP, the importance placed on a bidder’s ability to demonstrate experience processing a large volume of tests: The [bidder’s prior completed] projects must have included work tasks similar to those described herein, particularly in test development or processing a comparable number of tests. The bidder will provide a description of the contracted services; the contract period; and the name, address, and telephone number of a contact person for each of the contracting agencies. This description shall (1) document how long the organization has been providing similar services; (2) provide details of the bidder’s experience relevant to the services required by this RFP; and (3) describe the bidder’s other testing projects, products, and services that are similar to those required by this RFP. (Emphasis added.) The Department thus made clear its concern that bidders demonstrate experience with large-scale projects. RFP Appendix J sets forth nine different criteria (C1 through C9) for the administration contractor. As stated in RFP Section 10.2, "[f]or each criterion, the bidder must demonstrate that the minimum threshold of experience has been achieved with prior completed projects . . . ." (emphasis added). Appendix J contains a chart which lists for each criterion: (1) a summary of the related FCAT work task, (2) the detailed criteria for the bidder’s experience related to that work task, and (3) the necessary documentation a bidder must provide. Criterion C4 and Criterion C6 include work tasks that involve the use of image-based scoring technology. C4 and C6 are the only corporate qualifications criteria at issue in this proceeding. RFP Provisions Involving Corporate Qualifications for Image-Based Scoring "Handscoring" is the test administration activity in which open-ended or performance-based student responses are assessed. 
This practice involves a person reading something the student has written as part of the test, as distinguished from machine scoring multiple choice responses (i.e., the filled-in "bubbles" on an answer sheet). There are two types of handscoring: (1) paper-based handscoring, and (2) image-based handscoring. Paper-based handscoring requires that a student response paper be sent to a reader, who then reviews the student’s response as written on the paper and enters a score on a separate score sheet. Image-based handscoring involves a scanned image of the student’s response being transmitted to a reader electronically. The student’s response is then projected on a computer screen, where the reader reviews it and assigns a score using the computer. The RFP requires that the reading and math portions of the FCAT be handscored on-line using imaging technology beginning with the February 2000 FCAT administration. The RFP provides that the writing portion of the FCAT may be handscored using either the paper-based method or on-line imaging technology during the February 2000 and 2001 FCAT administrations. However, on-line image-based scoring of the writing portion of the FCAT is required for all FCAT administrations after February 2001. An image-based scoring system involves complex computer technology. William Bramlett, an expert in designing and implementing large-scale imaging computer systems and networks, presented unrefuted testimony that an image-based scoring system will be faced with special challenges when processing large volumes of tests. These challenges involve the need to automate image quality control, to manage the local and wide area network load, to assure adequate server performance and storage requirements, and to manage the work flow in a distributed environment. In particular, having an image-based scoring system process an increasing volume of tests is not simply a matter of adding more components. Rather, the system’s basic software architecture must be able to understand and manage the added elements and volume involved in a larger operation. According to Bramlett, there are two ways that the Department could assess the ability of a bidder to perform a large- scale, image-based scoring project such as the FCAT from a technological perspective: (1) have the bidder provide enough technological information about its system to be able to model or simulate the system and predict its performance for the volumes involved, or (2) require demonstrated ability through completion of prior similar projects. Dr. Mark Heidorn, Administrator for Florida’s K-12 Statewide Assessment Programs, was the primary author of RFP Sections 1-8, which describe the work tasks for the FCAT -- the goods and services vendors are to provide and respond to in their technical proposals. Dr. Heidorn testified that in the Department’s testing procurements involving complex technology, the Department has never required specific descriptions of the technology to be used. Instead, the Department has relied on the bidder’s experience in performing similar projects. Thus, the RFP does not specifically require that bidders describe in detail the particular strategies and approaches they intend to employ when designing and implementing an image-based scoring system for FCAT. Instead, the Department relied on the RFP requirements calling for demonstrated experience as a basis to understand that the bidder could implement such an image-based scoring system. 
Approximately 717,000 to 828,000 student tests will be scored annually by the FCAT administration contractor using imaging technology. The RFP, however, does not require that bidders demonstrate image-based scoring experience at that magnitude. Instead, the RFP requires bidders to demonstrate only a far less demanding minimum level of experience using image-based scoring technology. Criterion C4 and Criterion C6 in Appendix J of the RFP each require that a bidder demonstrate prior experience administering "a minimum of two" assessment programs using imaged- based scoring that involved "at least 200,000 students annually." The requirements for documenting a "minimum of two" programs or projects for C4 and C6 involving "at least 200,000 students annually" are material because they are intended to provide the Department with assurance that the FCAT administration contractor can perform the large-scale, image-based scoring requirements of the contract from a technological perspective. Such experience would indicate that the bidder would have been required to address the sort of system issues described by Bramlett. Dr. Heidorn testified that the number 200,000 was used in C4 and C6 "to indicate the level of magnitude of experience which represented for us a comfortable level to show that a contractor had enough experience to ultimately do the project that we were interested in completing." Dr. Fisher, who authored Appendix J, testified that the 200,000 figure was included in C4 and C6 because it was a number judged sufficiently characteristic of large-scale programs to be relevant for C4 and C6. Dr. Fisher further testified that the Department was interested in having information that a bidder’s experience included projects of a sufficient magnitude so that the bidder would have experienced the kinds of processing issues and concerns that arise in a large-scale testing program. The Department emphasized this specific quantitative minimum requirement in response to a question raised at the Bidder’s Conference held on November 13, 1998: Q9: In Appendix J, the criteria for evaluating corporate quality for the administration operations C4, indicates that the bidder must have experience imaging as indicated. Does this mean that the bid [sic] must bid for using [sic] imaging technology for reading and mathematics tests? A: Yes. The writing assessment may be handscored for two years, and then it will be scored using imaging technology. To be responsive, a bid must be for imaging. The corporate experience required (200,000 students annually for which reports were produced in three months) could be the combined experience of the primary contractor and the subcontractors. (Emphasis added.) Criterion C4 addresses the RFP work tasks relating to handscoring, including both the image-based handscoring of the reading and math portions of the FCAT for all administrations and the writing portions of the FCAT for later administrations. The "Work Task" column for C4 in Appendix J of the RFP states: Design and implement efficient and effective procedures for handscoring student responses to performance tasks within the limited time constraints of the assessment schedule. Handscoring involves image-based scoring of reading and mathematics tasks for all administrations and writing tasks for later administrations at secure scoring sites. 
Retrieve and score student responses from early district sample schools and deliver required data to the test development contractor within critical time periods for calibration and scaling. The "Necessary Documentation" column for C4 in Appendix J states: Bidder must document successful completion of a minimum of two performance item scoring projects for statewide assessment programs during the last four years for which the bidder was required to perform as described in the Criteria column. (Emphasis added.) The "Criteria" column for C4 in Appendix J, like the related work tasks in the RFP, addresses both image-based handscoring of reading and math, as well as paper-based or image-based handscoring of writing. In connection with all handscoring work tasks, "[t]he bidder must demonstrate completion of test administration projects for a statewide program for which performance items were scored using scoring rubrics and associated scoring protocols." With respect to the work tasks for handscoring the reading and math portions of the FCAT, "[t]he bidder must demonstrate completion of statewide assessment programs involving scoring multiple-choice and performance items for at least 200,000 students annually for which reports were produced in three months." In addition, for the reading and math work tasks, "[e]xperience must be shown in the use of imaging technology and hand-scoring student written responses with completion of scoring within limited time restrictions." This provision dealing with "imaging technology" experience self-evidently addresses the reading and math components, because separate language addresses imaging experience in connection with the writing component. The relevant handscoring experience for the reading and math aspects of the program is experience using image-based technology. By contrast, with respect to the work tasks for scoring the writing portions of the FCAT, "the bidder must also demonstrate completion of statewide assessment programs involving paper-based or imaged scoring student responses to writing assessment prompts for at least 200,000 students annually for which reports were produced in three months." (Emphasis added.) Criterion C6 addresses work tasks relating to designing and implementing systems for processing, scanning, imaging and scoring student responses to mixed-format tests within limited time constraints. The "Work Task" column for C6 in RFP Appendix J states: Design and implement systems for the processing, scanning, imaging, and scoring of student responses to test forms incorporating both multiple-choice and constructed response items (mixed-format) within the limited time constraints of the assessment schedule. Scoring of student responses involves implementation of IRT scoring tables and software provided by the development contractor within critical time periods. The "Necessary Documentation" column for C6 in Appendix J states: Bidder must document successful completion of a minimum of two test administration projects for statewide assessment programs during the last four years in which the bidder was required to perform as described in the Criteria column. (Emphasis added.)
The Criteria column for C6 in Appendix J states: The bidder must demonstrate completion of test administration projects for statewide assessment programs or other large-scale assessment programs that required the bidder to design and implement systems for processing, scanning, imaging, and scoring responses to mixed-format tests for at least 200,000 students annually for which reports were produced in three months. Experience must be shown in use of imaging student responses for online presentation to readers during handscoring. (Emphasis added.) RFP Provisions Regarding the Evaluation of Corporate Qualifications The procedure for evaluating a bidder's corporate qualifications is described in RFP Section 11.3: The Department will evaluate how well the resources and experience described in each bidder's proposal qualify the bidder to provide the services required by the provisions of this RFP. Consideration will be given to the length of time and the extent to which the bidder and any proposed subcontractors have been providing services similar or identical to those requested in this RFP. The bidder's personnel resources as well as the bidder's computer, financial, and other technological resources will be considered in evaluating a bidder's qualifications to meet the requirements of this RFP. Client references will be contacted and such reference checks will be used in judging a bidder's qualifications. The criteria to be used to rate a bidder's corporate qualifications to meet the requirements of this RFP are shown in Appendix J and will be applied as follows: * * * Administrative Activities. Each of the nine administration activities criteria in Appendix J will be individually rated by members of the evaluation team. The team members will use the rating scale shown in Figure 1 below. Individual team members will review the bidder's corporate qualifications and rate the response with a rating of one to five. The ratings across all evaluators for each factor will be averaged, rounded to the nearest tenth, and summed across all criteria. If each evaluator assigns the maximum number of points for each criterion, the total number of points will be 45. To meet the requirements of Stage II, the proposal must achieve a minimum rating of 27 points and have no individual criterion for which the number of points averaged across evaluators and then rounded is less than 3.0. Each proposal that receives a qualifying score based on the evaluation of the bidder's qualifications will be further evaluated in Stage III. Figure 1 - Evaluation Scale for Corporate Qualifications: 5 (Excellent) - The bidder has demonstrated exceptional experience and capability to perform the required tasks. 4 - (no descriptor; falls between Excellent and Satisfactory). 3 (Satisfactory) - The bidder has demonstrated that it meets an acceptable level of experience and capability to perform the required tasks. 2 - (no descriptor; falls between Satisfactory and Unsatisfactory). 1 (Unsatisfactory) - The bidder either has not established its corporate qualifications or does not have adequate qualifications. RFP Section 11.3 provides that each of the nine corporate qualifications criteria for administration operations in Appendix J (C1 through C9) will be individually rated by the six members of the evaluation team using a scale of one to five. A rating of three is designated as "satisfactory," which means that "[t]he bidder has demonstrated that it meets an acceptable level of experience and capability to perform the required tasks."
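The Stage II aggregation described in Section 11.3 can be sketched directly; the ratings below are invented for illustration and show how a proposal can clear the 27-point total yet still be eliminated by a single criterion averaging below 3.0.

```python
# criterion -> one rating (1-5) from each of the six evaluators
ratings = {
    "C1": [4, 4, 3, 5, 4, 4],
    "C2": [3, 4, 3, 4, 3, 3],
    "C3": [4, 3, 4, 4, 3, 4],
    "C4": [3, 3, 2, 3, 3, 3],  # averages 2.8 -- below the 3.0 cut score
    "C5": [3, 3, 3, 3, 3, 3],
    "C6": [4, 4, 4, 4, 4, 4],
    "C7": [3, 4, 4, 3, 4, 4],
    "C8": [5, 4, 4, 5, 4, 5],
    "C9": [3, 3, 4, 3, 3, 3],
}

# Average each criterion across evaluators, rounded to the nearest tenth,
# then sum across all nine criteria (maximum possible total is 45).
averages = {c: round(sum(s) / len(s), 1) for c, s in ratings.items()}
total = sum(averages.values())

passes_stage_ii = total >= 27 and all(a >= 3.0 for a in averages.values())
print(round(total, 1), passes_stage_ii)  # 32.2 False -- C4's 2.8 eliminates the bid
```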
In order to be further evaluated, Section 11.3 provides that there must be no individual corporate qualifications criterion for which the bidder's proposal receives a score less than 3.0 (average points across evaluators). Dr. Fisher, the primary author of Section 11.3 of the RFP, referred to the 3.0 rating as the "cut score." (Emphasis added.) The RFP's clear and unambiguous terms thus establish the "minimum threshold" of experience that a bidder "must demonstrate" in its proposal for Criterion C1 through Criterion C9. The "minimum threshold" of experience that a bidder must demonstrate for each criterion is described in Appendix J of the RFP. If a proposal failed to demonstrate that the bidder met the minimum threshold of experience for a particular criterion in Appendix J, the bidder obviously would not have demonstrated "that it meets an acceptable level of experience and capability to perform the required tasks." Thus, in that setting, an evaluator was to have assigned the proposal a rating of less than "satisfactory," or less than three, for that criterion. (Emphasis added.) The fact that a score less than "3" was expected for -- and would eliminate -- proposals that did not demonstrate the "minimum threshold" of experience does not render meaningless the potential scores of "1" and "2." Those scores may reflect the degree to which a bidder's demonstrated experience was judged to fall below the threshold. Although some corporate capability minimums were stated quantitatively (e.g., "minimum of two," or "at least 200,000"), others were open to a more qualitative assessment (e.g., "large-scale," "systems," or "reports"). Moreover, a proposal that included demonstrated experience in some manner responsive to each aspect of Appendix J might nevertheless be assigned a score of less than "3," based on how an evaluator assessed the quality of the experience described in the proposal. By the terms of the RFP, however, an average score across evaluators of less than 3 represented essentially a decision that the minimum threshold of experience was not demonstrated. Had the Department truly intended Appendix J to reflect only general targets or guidelines, there were many alternative ways to communicate such an intent without giving mandatory direction about what bidders "must demonstrate" or without establishing quantitative minimums (e.g., "a minimum of two," or "at least 200,000"). RFP Appendix K, for instance, sets forth the evaluation criteria for technical proposals in broad terms that do not require the bidder to provide anything in particular. Even within Appendix J, other than in Criterion C4 and Criterion C6, bidders were to show experience with "large-scale" projects rather than experience at a quantified level. Pursuant to the RFP's plain language, in order to meet the "minimum threshold" of experience for Criterion C4 and Criterion C6, a bidder "must demonstrate," among other things, successful completion of a "minimum of two" projects, each involving the use of image-based scoring technology in administering tests to "at least 200,000 students annually." Department's Evaluation of Corporate Qualifications In evaluating Harcourt's proposal, the Department failed to give effect to the plain RFP language stating that a bidder "must document" successful completion of a "minimum of two" testing projects involving "at least 200,000 students annually" in order to meet the "minimum threshold" of experience for C4 and C6. Dr.
Fisher was the primary author of Sections 10, 11 and Appendix J of the RFP. He testified that during the Stage II evaluation of corporate qualifications, the evaluation team applied a "holistic" approach, like that used in grading open-ended written responses in student test assessments. Under the holistic approach that Dr. Fisher described, each member of the evaluation team was to study the proposals, compare the information in the proposals to everything contained in Appendix J, and then assign a rating for each criterion in Appendix J based on "how well" the evaluator felt the proposal met the needs of the agency. Notwithstanding Dr. Fisher's present position, the RFP's terms and their context demonstrate that the minimum requirements for corporate qualifications are in RFP Appendix J. During the hearing, Dr. Fisher was twice asked to identify language in the RFP indicating that the Department would apply a "holistic" approach when evaluating corporate qualifications. Both times, Dr. Fisher was unable to point to any explicit RFP language putting bidders on notice that the Department would be using a "holistic" approach to evaluating proposals and treating the Appendix J thresholds merely as targets. In addition, Dr. Fisher testified that the Department did not engage in any discussion at the bidders' conference about the evaluation method that was going to be used other than drawing the bidders' attention to the language in the RFP. As written, the RFP establishes minimum thresholds of experience to be demonstrated. Where, as in the RFP, certain of those minimum thresholds are spelled out in quantitative terms that are not open to interpretation or judgment, it is neither reasonable nor logical to rate a proposal as having demonstrated "an acceptable level of experience" when it has not demonstrated the specified minimum levels, even if other requirements with which it was grouped were satisfied. The plain RFP language unambiguously indicates that an analytic method, not a "holistic" method, will be applied in evaluating corporate qualifications. Dr. Fisher acknowledged that, in an assessment using an analytic method, there is considerable effort placed up front in deciding the specific factors that will be analyzed, and those factors are listed and explained. Dr. Fisher admitted that the Department went into considerable detail in Appendix J of the RFP to explain to the bidders the minimums they had to demonstrate and the documentation that was required. In addition, Dr. Orr, who served as a member of the evaluation team and who herself develops student assessment tests, stated that in assessments using the "holistic" method there is a scoring rubric applied, but that rubric does not contain minimum criteria like those found in the RFP for the FCAT. The holistic method applied by the Department ignores very specific RFP language which spells out minimum requirements for corporate qualifications. Harcourt's Corporate Qualifications for C4 and C6 Harcourt's proposal lists the same three projects administered by Harcourt for both Criterion C4 and Criterion C6: the Connecticut Mastery Test ("CMT"), the Connecticut Academic Performance Test ("CAPT"), and the Delaware Student Testing Program ("DSTP"). Harcourt's proposal also lists for Criterion C4 projects administered by its proposed scoring subcontractors, Measurement Incorporated ("MI") and Data Recognition Corporation ("DRC"). However, none of the projects listed for MI or DRC involve image-based scoring.
Thus, the MI and DRC projects do not demonstrate any volume of image-based scoring as required by C6 and by the portion of C4 which relates to the work task for the image-based scoring of the math and reading portions of the FCAT. Harcourt's proposal states that "[a]pproximately 35,000 students per year in grade 10 are tested with the CAPT." Harcourt's proposal states that "[a]pproximately 120,000 students per year in grades 4, 6 and 8 are tested with the CMT." Harcourt's proposal states that "[a]pproximately 40,000 students in grades 3, 5, 8, and 10" are tested with the DSTP. Although the descriptions of the CMT and the CAPT in Harcourt's proposal discuss image-based scoring, there is nothing in the description of the DSTP that addresses image-based scoring. There is no evidence that the evaluators were ever made aware that the DSTP involved image-based scoring. Moreover, although the Department called the Delaware Department of Education ("DDOE") as a reference for Harcourt's development proposal, the Department did not discuss Harcourt's administration of the DSTP (including whether the DSTP involves image-based scoring) with the DDOE. Harcourt overstated the number of students tested in the projects it referenced to demonstrate experience with image-based scoring. Harcourt admitted at hearing that, prior to submitting its proposal, Harcourt had never tested 120,000 students with the CMT. In fact, the total number of students tested by Harcourt on an annual basis under the CMT has ranged from 110,273 in the 1996-97 school year to 116,679 in the 1998-99 school year. Harcourt also admitted at hearing that, prior to submitting its proposal, Harcourt had never tested 35,000 students in grade 10 with the CAPT. Instead, the total number of grade 10 students tested by Harcourt on an annual basis with the CAPT ranged from 30,243 in 1997 to 31,390 in 1998. In addition, Harcourt admitted at hearing that, prior to submitting its proposal, it had conducted only one "live" administration of the DSTP (as distinguished from field testing). That administration of the DSTP involved only 33,051, not 40,000, students in grades 3, 5, 8, and 10. Harcourt itself recognized that "field tests" of the DSTP are not responsive to C4 and C6, as evidenced by Harcourt's own decision not to include in its proposal the number of students field tested under the DSTP. Even assuming that the numbers in Harcourt's proposal are accurate, and that the description of the DSTP in Harcourt's proposal reflected image-based scoring, Harcourt's proposal on its face does not document any single project administered by Harcourt for C4 or C6 involving image-based testing of more than 120,000 students annually. When the projects are aggregated, the total number of students claimed as tested annually still does not reach the level of "at least 200,000"; it comes to only 195,000, and it reaches even that level only once, due to the single administration of the DSTP. Moreover, even if that 195,000 were considered "close enough" to the 200,000 level required, it was achieved only one time, while Appendix J plainly directs that there be a minimum of two times that testing at that level has been performed. The situation worsens for Harcourt when using the true numbers of students tested under the CMT, CAPT, and DSTP, because Harcourt cannot document any single image-based scoring project it has administered involving testing more than 116,679 students annually.
Moreover, when the true numbers of students tested are aggregated, the total rises only to 181,120 students tested annually on one occasion, and no more than 141,663 tested annually on any other occasion. Despite this shortfall from the minimum threshold of experience, under the Department's holistic approach the evaluators assigned Harcourt's proposal four ratings of 3.0 and two ratings of 4.0 for C4, for an average of 3.3 on C4; and five ratings of 3.0 and one rating of 4.0 for C6, for an average of 3.2 on C6. Applying the plain language of the RFP in Sections 10 and 11 and Appendix J, Harcourt did not demonstrate that it meets an acceptable level of experience and capability for C4 or C6, because Harcourt did not satisfy the minimum threshold for each criterion by demonstrating a minimum of two prior completed projects involving image-based scoring requiring testing of at least 200,000 students annually. Harcourt's proposal should not have received any rating of 3.0 or higher on C4 or C6 and should have been disqualified from further evaluation due to failure to demonstrate the minimum experience that the Department required in order to be assured that Harcourt can successfully administer the FCAT program. NCS's Compliance With RFP Requirements Even though the NCS proposal did not meet all of the mandatory requirements, and despite the requirement of Section 11.2 that the proposal be automatically disqualified under such circumstances, the Department waived NCS's noncompliance as a minor irregularity. The factors in C4 and C6 were set, minimal requirements with which NCS did not comply. For example, one of the two programs NCS submitted in response to Criteria C4 and C6 was the National Assessment of Educational Progress ("NAEP") program. NAEP, however, is not a "statewide assessment program" within the meaning of that term as used in Criteria C4 and C6. Indeed, NCS admitted that NAEP is not a statewide assessment program and that, without consideration of that program, NCS's proposal is not responsive to Criteria C4 and C6 because NCS would not have submitted the required proof of having administered two statewide assessment programs. This error cannot be cured by relying on the additional experience of NCS's subcontractor, because that experience does not show that the subcontractor produced reports within three months, and so such experience does not demonstrate compliance with Criterion C4. The Department deliberately limited the competition for the FCAT contract to firms with specified minimum levels of experience. As opined at final hearing, if the Department had announced in the RFP that the types of experience it asked vendors to describe were only targets, goals, and guidelines, and that a failure to demonstrate target levels of experience would not be disqualifying, then the competitive environment for this procurement would have differed, particularly since only 2.06 evaluation points (out of a possible 150) separated the NCS and Harcourt scores. Dr. Heidorn conceded that multiple companies with experience in different aspects of the FCAT program -- a computer/imaging company and a firm experienced in educational testing -- might combine to perform a contract like the FCAT. Yet such a combination of firms would be discouraged from bidding because it could not demonstrate the minimum experience spelled out in the RFP. Language in the RFP indicating that a "holistic" evaluation was to be applied could have resulted in a different field of potential and actual bidders.
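The arithmetic underlying these findings can be restated, for illustration only, in the following short Python sketch. The variable names are hypothetical; the figures are those recited in the findings above.

# Illustrative sketch (Python; figures taken from the findings above).
# Neither the claimed nor the proven annual student counts, even when the
# three projects are aggregated, reach the 200,000-student threshold.

THRESHOLD = 200000

claimed = {"CMT": 120000, "CAPT": 35000, "DSTP": 40000}
proven_best_year = {"CMT": 116679, "CAPT": 31390, "DSTP": 33051}

print(sum(claimed.values()) >= THRESHOLD)           # False: totals 195,000
print(sum(proven_best_year.values()) >= THRESHOLD)  # False: totals 181,120

# The rating averages reported in the findings follow the rounding rule of
# RFP Section 11.3:
print(round((4 * 3.0 + 2 * 4.0) / 6, 1))  # 3.3 (C4)
print(round((5 * 3.0 + 1 * 4.0) / 6, 1))  # 3.2 (C6)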
Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that Respondent, State of Florida, Department of Education, enter a Final Order rejecting the bids submitted by Harcourt and NCS for the administration component of the RFP. The Department should then seek new proposals. DONE AND ENTERED this 25th day of May, 1999, in Tallahassee, Leon County, Florida. DON W. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 25th day of May, 1999. COPIES FURNISHED: Karen D. Walker, Esquire Holland and Knight, LLP Post Office Drawer 810 Tallahassee, Florida 32302 Mark D. Colley, Esquire Holland and Knight, LLP Suite 400 2100 Pennsylvania Avenue, Northwest Washington, D.C. 20037 Charles S. Ruberg, Esquire Department of Education The Capitol, Suite 1701 Tallahassee, Florida 32399-0400 Paul R. Ezatoff, Jr., Esquire Christopher B. Lunny, Esquire Katz, Kutter, Haigler, Alderman, Bryant and Yon, P.A. 106 East College Avenue, Suite 1200 Tallahassee, Florida 32302-7741 Tom Gallagher Commissioner of Education Department of Education The Capitol, Plaza Level 08 Tallahassee, Florida 32399-0400 Michael H. Olenick, General Counsel Department of Education The Capitol, Suite 1701 Tallahassee, Florida 32399-0400
Findings Of Fact In order for the Petitioner to obtain his license as a building contractor in Florida, he is required to successfully complete a certification examination which consists of three tests. The examination is prepared by the ACSI National Assessment Institute and administered by the Department of Professional Regulation. The June 1987 examination involved a new format, new scoring methods, and areas of competency which had not been tested in previous exams. A post-examination report prepared by the Office of Examination Services of the Department of Professional Regulation reveals that, while forty-seven per cent of the examinees passed at least one part of the examination, only seven per cent passed the entire examination. Historically, pass rates for previous examinations ranged from thirty-five to fifty-five per cent. The reasons given by the Office of Examination Services for the low pass rate on this particular exam were:
1) Candidates are currently required to demonstrate competency in each of the three content areas. If the exam had been graded in the same manner as prior exams (compensatory scoring), the pass rate would have increased to twenty-one per cent on this examination.
2) Whenever an examination is significantly changed, the performance of the candidates will decrease until they prepare for the demands of the new examination.
3) There appeared to be a time problem. Many of the candidates did not timely complete the answers to all of the questions in the second and third tests.
The Petitioner was not prepared for the new format. The review course taken by him shortly before the exam did not alert him to the changes approved by the Board. As a reexamination candidate, his expectations as to exam content were even more entrenched than those of first-time candidates. The Petitioner failed all three tests in the exam. A review of the Petitioner's score sheets on all three tests reveals that he timely completed all of the answers, so the time problem does not appear to have affected his results. If the compensatory scoring method had been used on this exam, as it had been in prior exams, the Petitioner would still not have passed the examination administered in June 1987. The Petitioner did not demonstrate that the Respondent failed to follow standard procedures for conducting or grading the examination. The Petitioner was not treated differently from other candidates who took the examination. Although the content of this exam was different from that of the preceding exam, the exam content had been properly promulgated in Rule 21E-16.001, Florida Administrative Code, as amended May 3, 1987. The Respondent has agreed to allow the Petitioner the opportunity to take the next scheduled examination, without charge.
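For illustration only, the following Python sketch contrasts the compensatory scoring method used in prior exams with the new requirement that candidates demonstrate competency in each content area. The pass mark and the scores shown are hypothetical, chosen solely to show how the two methods can produce different outcomes for the same candidate.

# Illustrative sketch (Python; the pass mark and scores are hypothetical).
# Under compensatory scoring, a strong part can offset a weak one; under
# the new method, the candidate must pass each of the three tests.

PASS_MARK = 70  # hypothetical pass mark

def passes_compensatory(scores):
    return sum(scores) / len(scores) >= PASS_MARK

def passes_each_part(scores):
    return all(s >= PASS_MARK for s in scores)

scores = [85, 80, 60]  # hypothetical candidate
print(passes_compensatory(scores))  # True: the average of 75 clears the mark
print(passes_each_part(scores))     # False: the 60 on one test is disqualifying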