MICHAEL REGGIA vs. BOARD OF PROFESSIONAL ENGINEERS, 86-001808 (1986)
Division of Administrative Hearings, Florida Number: 86-001808 Latest Update: Sep. 19, 1986

The Issue The issue in this proceeding is whether Michael Reggia meets the Florida licensure requirements for a professional engineer in the field of manufacturing engineering. Specifically, the issue is whether the practice and principles portion of the licensing exam was valid. Procedural Matters At the final hearing, Petitioner, Michael Reggia, testified in his own behalf and presented the testimony of manufacturing engineer Howard Bender. Petitioner's exhibits #1 and #2, letters from Martin Marietta Aerospace and Harris Corporation, were rejected as hearsay. Exhibit #3, selected pages from Fundamentals of Engineering, published by the National Council of Engineering Examiners, was admitted without objection. Respondent presented two witnesses: Cass Hurc, P.E. (by deposition, by agreement of the parties) and Allen Rex Smith, Executive Director of the Board of Professional Engineers. Respondent initially submitted four exhibits: #1 and #4 were admitted without objection, #2(a) and #2(b) were admitted over Petitioner's objection, and #3 was withdrawn. The parties requested and were given 20 days to submit post-hearing briefs and proposed orders. On September 15, 1986, Petitioner filed his arguments and summary of the testimony and evidence. Nothing was filed by Respondent.

Findings Of Fact Michael Reggia resides in Titusville and works at the Kennedy Space Center. He is licensed in the state of California as a professional engineer and has practiced in the field of manufacturing engineering. California, like Florida, does not license an individual in a particular discipline of engineering but requires that an individual select an area in which he or she will be tested. Mr. Reggia took the professional engineering license exam in Florida in October 1985. For part two of the examination, Professional Practice and Principles, he chose to be tested in his field of manufacturing engineering. He achieved a score of 64.4; in order to pass, a score of 70 is required. The examination given in Florida is a national examination produced by the National Council of Engineering Examiners (NCEE) for certification or licensure throughout the United States. The October 1985 exam was developed based upon an extensive survey study initiated by NCEE in 1979. A report of that study was published in March 1981 as "A Task Analysis of Licensed Engineers". (Respondent's exhibit #4) The primary purpose of the study was to aid NCEE in developing "... fair, meaningful, uniform, and objective standards with which to measure minimum competency for professional licensure." (exhibit #4, page E1) In drafting an exam, the NCEE relies on the societies representing various engineering disciplines to submit examination problems for consideration. The Society of Manufacturing Engineers, through its professional registration committee, provides that service on behalf of the manufacturing engineers. The October 1985 examination for manufacturing engineers did not include questions relating to electrical engineering, which is Mr. Reggia's sub-area of emphasis in the area of manufacturing engineering. Since manufacturing engineering overlaps with the basic engineering disciplines, Mr. Reggia contends the exam was one-sided and invalid, as he felt it concentrated on tool designing and mechanical engineering. Some industries, particularly the aerospace industries, now include a substantial number of electrical engineers on their staffs. Engineering is an evolving discipline, and manufacturing engineering has undergone changes with new technologies in recent years. One way of addressing the diversity and changes in the field is to provide a two-book exam that would offer the applicant a wider variety of problems from which he or she could select. This has been recommended to the NCEE by the Society of Manufacturing Engineers. Another approach, and the one utilized by the NCEE, is to conduct periodic surveys to determine the tasks which engineers are actually performing and the level of judgment required to perform the tasks effectively. It would be impossible, and perhaps inappropriate, to develop an exam that would test each individual only on his or her particular expertise. In the area of manufacturing engineering, the exams developed by NCEE are passed by 65 to 75 percent of the candidates, a rate which is comparable to that of the mechanical engineers for their exam. Seven out of ten applicants passed the same exam which Mr. Reggia took in October 1985.

Florida Laws (2) 455.213, 455.217
ARMANDO PEREZ AND MIGUEL OYARZUN vs. CONSTRUCTION INDUSTRY LICENSING BOARD, 75-001231 (1975)
Division of Administrative Hearings, Florida Number: 75-001231 Latest Update: Jan. 17, 1977

Findings Of Fact On May 23, 1975, the Petitioners herein took the general contractors licensing exam in Miami, Florida. Petitioners failed to achieve a passing score on said examination and thereafter several reviews of the examination questions and answers resulted. The exam in question was administered to approximately 659 examinees. The test was made up of 100 questions and the examinees were allotted 4 hours to complete the exam. The examinees were instructed in the examination booklet to answer as many questions as possible within the time limit and to always select the best possible answer out of the listed choices. The examinees were furnished a list of reference materials by Respondent and they were advised that the exam questions would come from some 18-odd reference books supplied on the reference lists. Petitioner Dennis Milch took the general contractor's examination administered on May 23, 1975. 2/ In preparation for the exam he took a prep course given by Cole Construction and began his preparation approximately two months prior to the exam. He received a score of 67.5 on the exam. He earned a degree from the University of South Florida and took advanced construction courses at FIU, where he earned a degree in marketing. His work experience consisted of serving as an apprentice carpenter for Burke Construction Company in Miami for approximately two months and as a contractor building residential homes in Houston, Texas. Milch voiced his opinion that the exam questions failed to satisfy the statutory requirement of being "objective" within the meaning of Chapter 468, Florida Statutes. Joseph Cole, the founder of Cole Construction College in 1949, testified that he had approximately 30 years' experience in teaching construction, architectural and engineering courses. He had conducted various seminars for students and received a B.S. Degree from the University of Miami. He received a B.S. Degree in biochemistry and civil engineering from the University of Pittsburgh and conducted seminars at the University of Florida in Math, Physics and Engineering. He also conducted seminars at the Markowitz Engineering School. He was licensed in 1947 in Coral Gables, Miami Beach, and in Miami, where he has built approximately 2,000 single-family homes, high-rise buildings and apartments. He aired his opinion that Milch missed approximately 36 questions, of which approximately 24 were what he regarded as "impossible" questions. He expressed his awareness that during the morning session two questions were voided and credits having a point value of 1 point each were given to all examinees. Two questions were also voided from the afternoon session. Thomas H. Hebert, an associate of the testing agency which compiled the exam for the Board, i.e., Bryon, Harlow, Schaefer, Reed and Associates, explained the procedure for compiling the tests for the Board (Respondent). He stated that data is taken from Board references and an exam format is established. Examinees are tested on plan reading and estimating using standard plans for takeoff and specification requirements. The test is first administered to contractors and others who have previously passed the exam. This is done to test time limits, grading procedures, and the like. By so doing, he testified, it is possible to correct deficiencies in the exam. After the examination is compiled and is administered to the agency employees and other contractors who have passed the exam, an item analysis is compiled and computerized.
There are five possible answers for each question. The exam is divided into three segments, i.e., the upper 27 percent, the mid 46 percent, and the lower 27 percent. After the test results are in, the weighting on various questions is checked to see if large numbers of examinees "jump" one question and, further, to see if questions are ambiguous. If a question is found to be ambiguous or to have two correct answers, credit is given for both answers. Thereafter, a discrimination index is compiled based on the lower and upper 27 percent; a brief numerical sketch of the kind of index he described appears after this paragraph. These papers are scrutinized and, if there is a discrepancy in excess of .1 to .8 percent, the question is examined and a solution is arrived at based on the results of that scrutiny. He testified further that if an exam paper is mutilated or is otherwise difficult to machine score, it is hand graded. All exam papers in which the score ranged from 0 to 30 are hand graded, as are those where the score ranges between 60 and 70. Where there is no correct answer to a question, the question is deleted and a new base is established. For example, if a question is deemed faulty, each question has a weight of 1.1 one-hundredths of a point. If a question has three correct answers, points are given for all three answers. It was further brought out during his testimony that it was not necessary for a contractor to pass the certification examination in order to practice contracting in Florida. Evidence reveals that there are two kinds of licenses issued by the Board, i.e., registration and certification. The registration process only requires compliance with local requirements and the filing of a form with the Board, which may be the passing of a local competency exam or simply obtaining a local occupational license. The certification method is optional, and if the contractor passes the certification examination, it is unnecessary for him to take any local examinations. After going over various questions missed by Petitioner Dennis Milch, Petitioners argue that the scope of the Board Certification Examination included questions affecting the business of contracting as well as technical aspects such as, for example, how to nail two boards together to make a safe structure. Florida Statutes Chapter 468.106(2)(a) provides for an examination covering knowledge of basic principles of contracting and construction. Chapter 468.101 declares the purpose of Chapter 468 and states in pertinent part that "any person desiring to obtain a certificate to engage in the business shall be required to establish his competency and qualifications." Hence, the legislature has covered the business of contracting as well as the theory of construction. This serves the purpose of Chapter 468 by making it safer for owners to contract with the contractors and to have assurances that no liens will be placed on their property by subcontractors, that the owners are safe from suit for work that was done on the job, that the payments made on the construction will not be diverted, and that the contractor understands his obligations. This requires general knowledge of the mechanic's lien law, basic contract law and workers' compensation law, all of which were tested by the subject examination.
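The record does not give the formula behind the discrimination index Hebert described; the sketch below assumes the conventional upper/lower 27 percent item discrimination index used in classical item analysis, with invented scores used purely for illustration.

```python
# Hedged sketch of an upper/lower 27 percent item discrimination index, the
# conventional statistic in classical item analysis. The testing agency's exact
# formula is not in the record; the examinee data below are invented.

def discrimination_index(total_scores, answered_correctly, fraction=0.27):
    """total_scores: overall exam score per examinee.
    answered_correctly: True/False per examinee for a single question.
    Returns the proportion correct in the upper group minus the lower group."""
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i], reverse=True)
    n_group = max(1, round(len(total_scores) * fraction))
    upper, lower = ranked[:n_group], ranked[-n_group:]
    p_upper = sum(answered_correctly[i] for i in upper) / n_group
    p_lower = sum(answered_correctly[i] for i in lower) / n_group
    return p_upper - p_lower

# Invented example: ten examinees, one question.
totals = [92, 88, 85, 80, 74, 70, 66, 61, 55, 48]
correct = [True, True, True, True, False, True, False, False, False, False]
print(discrimination_index(totals, correct))  # 1.0 here; values near zero or negative flag a suspect item
```

Under this convention, items on which the upper and lower scoring groups perform about equally, or on which the lower group does better, are the ones pulled for review, which is consistent with the testimony that questions showing an unexpected discrepancy between the two groups were examined and, where warranted, credited or deleted.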
Respecting Petitioners' argument that they were denied certain constitutional guarantees when they were instructed by Respondent to select the best possible answer but that, after the test was administered and Respondent determined that many questions had no best choice, the Board failed to delete such questions from the exam, it was noted that after the Respondent discovered that several test questions were deemed acceptable but that the answers offered did not meet the test of selecting the best possible answer, adjustments were made. In other words, there was no single best possible answer for approximately four questions. Rather than deleting the entire question, Respondent permitted those examinees who selected either the answer originally preferred by the Board or one of the later adopted alternate answers to achieve full credit for such questions and answers. The statute (Chapter 468, F.S.) mandates that the examination be an "objective" written examination. The criterion of objectivity is not met where the examining body is granted the discretion to accept alternate answers to a given question. A "best" answer is something different from an acceptable answer. To give the Board discretion to accept alternate answers would authorize a substitution of standards which is not permitted by Chapter 468, F.S. Once subjectivity comes into play, Respondent becomes vested with almost unbridled discretion in deciding who shall become a certified general contractor. This was prohibited by the legislature by requiring objectivity in setting a uniform minimum test grade. As relates to Petitioner Milch, it was noted that a subsequent review of his exam resulted in his being awarded a half credit for his answer to question number 39, and the Board determined after review that answers B and A were both correct. The net result of this was that his overall score was 68. A review of the court cases revealed that Florida courts have not been involved in the minute details of how examination grades or points are awarded. See the cases of State ex rel. Topp v. Board of Electrical Examiners, 101 So.2d 583 (Fla. App. 1st 1968), and State ex rel. Lane v. Dade County, 258 So.2d 347 (Fla. App. 3rd 1972). These cases generally show that unless there is a clear abuse of discretion, courts will not substitute their judgment for that of the agencies as to how examinations are graded. Petitioners also submit that once regrading had commenced, the Board should have deleted all questions with wrong answers or more than one equally acceptable answer, distributed the weight of the deleted questions proportionately among the remaining questions, and considered passing to be 70 percent of the total points available, rather than 70 cumulative points. During the hearing, Petitioners failed to show how they were injured by the difference in the award of the points for questions deleted. A wrong without damage does not constitute a good cause of action. Based on the evidence presented, it appears that the Petitioners were treated the same as all other examinees. Since the Petitioners have failed to establish that if the assignment of points were different, they would have passed the examination, this argument is moot. Petitioners also allege that they were denied certain constitutional protections by Respondent's failure to adopt and promulgate uniform rules and regulations concerning preparation, administration and review of licensing examinations.
Florida Statutes, Chapter 468, requires Respondent to conduct its affairs pursuant to Florida Statutes Chapter 120. The Administrative Procedure Act sets out specific procedures to be followed by state agencies in adopting, promulgating and enforcing rules. Statutory authority governing the granting of a license should be strictly followed. In this case, there is no evidence of the existence of any unlawful rule or regulation adopted by Respondent to govern any of the variety of issues concerning the licensing of general contractors. Petitioners also submit that the Board should be required to promulgate and enforce rules concerning examinations and appeals or results thereof and cite the reasons for the actions it takes prior to its review of the examination. The Board is not required to adopt rules and regulations in every area in which it is authorized to act by statute. Where the statute is clear, there is no requirement or reason for the Board to adopt rules. Here, the Board provided the applicants with a chance to examine the questions, their papers and grades, and to complain if they wished about the questions, either individually or at board meetings, with the possibility that the fairness of the questions could be resolved quickly and informally and, if necessary, as in this case, without the full panoply of an administrative hearing. By so doing, the Board was clearly following its statutory duty to provide the applicants with a chance to see their examination papers and grades. (F.S. Chapter 466.110). Based on the above, it is concluded that Respondent compiled the May 23, 1975 examination based on objective standards. When the Board determined that certain questions were defective, either because there was more than one answer or for other reasons, the Board reviewed said questions and credited those examinees who failed to properly answer the question. By so doing, Petitioners were treated the same as all examinees who took the exam. Based on the record evidence, it further appears that all the questions meet the statutory test of being objective, and the Board's determination that a cumulative score of 70 percent is necessary to successfully obtain a certification was not shown by any competent or substantial evidence to be an abuse of discretion. It is therefore recommended that the agency's action be sustained and that the petition filed herein be dismissed.

Recommendation Based on the foregoing findings of fact and conclusions of law, it is therefore recommended that the agency's actions be affirmed and the petition filed herein be DISMISSED. DONE and ENTERED this 17th day of January, 1977, in Tallahassee, Florida. JAMES E. BRADWELL, Hearing Officer Division of Administrative Hearings Room 530, Carlton Building Tallahassee, Florida 32304 (904) 488-9675

Florida Laws (1) 119.07
DON BLACKBURN vs BOARD OF PROFESSIONAL ENGINEERS, 90-005731 (1990)
Division of Administrative Hearings, Florida Filed:Fort Myers, Florida Sep. 10, 1990 Number: 90-005731 Latest Update: Nov. 28, 1990

Findings Of Fact Based upon all of the evidence, the following findings of fact are determined: On April 19, 1990, petitioner, Don R. Blackburn, was a candidate on the engineering intern portion of the professional engineer examination given in Miami, Florida. The test was administered by the Department of Professional Regulation (DPR) on behalf of respondent, Board of Professional Engineers (Board). On July 25, 1990, the Board issued a written uniform grade notice advising petitioner that he had received a grade of 66 on the examination. A grade of 70 is necessary to pass this part of the examination. By letter dated August 15, 1990, petitioner requested a formal hearing to contest his score. In his letter, Blackburn generally contended that the examination was unfairly administered because certain books were allowed to be used by some but not all candidates, untrained proctors were given the authority to scan review materials and determine which could or could not be used by the candidates, and because of the chaos and confusion that occurred during the examination, he was unable to attain a score that he otherwise would have been able to achieve. Blackburn is an engineer for Lee County and is seeking to pass the engineering intern portion of the examination. A passing grade on that portion is a prerequisite to sitting for the second part of the professional engineer examination. He has taken the examination on a number of occasions and has gradually improved his score to just short of passing. Indeed, on the October 1989 examination, Blackburn scored a 69, or just one point less than the required 70. Prior to the April 1990 examination, the engineering intern portion of the professional engineer examination was an unrestricted open book examination. This meant candidates could use any and all reference and review materials during the examination. Beginning with the April 1990 examination, the Board imposed certain restrictions on the use of review materials. As early as October 9, 1989, the Board's executive director sent a memorandum to all candidates on the October 1989 examination, including Blackburn, concerning the new restrictions. The memorandum stated in part: Please be advised of certain restrictions listed in the Candidate Information Booklet which will not be implemented until the April 1990 examination. These restrictions are found in the "Examination Administration Information" section and are concerning the following two areas: * * * 2. Books or information containing sample questions or engineering problems may also be brought provided they are bound. Again, the new restrictions listed in the Candidate Information Booklet regarding the above two areas WILL NOT be implemented until the April 1990 examination. All candidates on the April 1990 examination were given a Candidate Information Booklet prepared in January 1990 by DPR's Bureau of Examination Services. On pages 13 and 14 of the booklet was found the following information: This is an open book examination. Candidates may use textbooks, handbooks, notes, and reference materials which are bound, copyrighted and printed. The term "bound" refers to material that is bound permanently, hard or paperback stitched or glued, or spiral, plastic or three-ringed bound. The printed material must remain contained (bound) in its cover during the entire examination. No writing tablets, unbound tablets or unbound "loose notes" will be allowed.
No books with contents directed toward sample questions or solutions of engineering problems are permitted in the examination room. Examinees are not permitted to exchange reference materials or aids during the examination. (Emphasis in original) What the emphasized language meant is that "review" manuals, which contain problems and solutions, were prohibited from use during the examination while "reference" books were not. However, the booklet did not list the specific names of published materials that would be permitted or excluded. In order to ascertain which books he might use on the next examination, on March 27, 1990, Blackburn telephoned the Board in Tallahassee and spoke with a female employee named "B. J.," who advised him that "review publications directed principally towards the solution of engineering problems" would be excluded. When asked if "Lindeburg's Sixth Edition" would be authorized, B. J. told Blackburn she wasn't sure and that it would be left up to the proctors in the room. She did say, however, that a review manual authored by Schaum could be used. The engineering intern examination in April 1990 was administered in two separate rooms at the Radisson Hotel in Miami, Florida. Blackburn was in a "very large" upstairs room with approximately thirty other candidates while a similar number took the examination in a downstairs room. The examination in the upstairs room began at 8:43 a.m. after various instructions were read to the candidates by the examination supervisor, Jeannie Smith, a veteran of twenty years in proctoring and supervising professional examinations. According to Smith, there was "considerable confusion" concerning which books could be used by the candidates, particularly since this was the first examination given with the new restrictions. She also acknowledged that there was "chaos" prior to the beginning of the examination and that this was "extremely upsetting" to the examinees. However, before the examination began, Smith announced on a microphone the names of certain books which the Board had given her that were either prohibited or could be used by candidates. She further advised that if candidates had any questions they were to come to a bulletin board by the microphone where she had posted Xerox copies of the covers of various books. If a book could be used, it had the word "YES" printed on the cover while a "NO" was printed on those covers of books that could not be used. 1/ It is noted that only one cover sheet with a "YES" was posted, that being the Civil Engineering Reference Manual, Fourth Edition, Michael R. Lindeburg. However, at least three candidates who took the examination that morning, including petitioner, did not see the posted materials nor hear the invitation for candidates to come to the bulletin board. One book in issue that was specifically prohibited was Engineer In Training Review Manual, Sixth Edition, Michael R. Lindeburg, which contained 378 solved problems, and thus fell within the general prohibition of review manuals described on page 14 of the Candidate Information Booklet. However, those candidates who had the Seventh Edition of the same book were allowed to keep and use that manual even though it contained 422 solved problems, or some 44 more solved problems than were contained in the prohibited Sixth Edition.
By allowing those candidates who had the Seventh Edition to use it even though it contained "review" materials, DPR violated the instructions contained in the Candidate Information Booklet and gave an advantage to those candidates not enjoyed by others, including petitioner. In addition, at least one other candidate in the upstairs group was allowed to use a prohibited review manual (Schaum's Outline Series, Theory and Problems of Electric Power Systems) but still that candidate did not attain a passing grade. Petitioner also contended that candidates taking the examination in the downstairs room were allowed to use language dictionaries during the morning part of the examination while those upstairs could not. 2/ Petitioner's contention is grounded upon hearsay evidence and accordingly it is found that no competent proof to support this claim was submitted. However, there was obviously some confusion over this matter because, after receiving complaints of this nature from two candidates, Smith telephoned the Board's offices in Tallahassee during the lunch break to ascertain whether such books could be used. Upon learning that they could not, she advised the upstairs group at the beginning of the afternoon session that dictionaries were not allowed. Blackburn also established that during the examination proctors went from desk to desk examining the materials that each candidate had in his possession. If a candidate had what the proctor perceived to be a book containing solutions to problems, the candidate was told to put the book on the floor. In the alternative, the candidates were told that if they tore the offending pages out of the book, they could continue using the remaining materials. Petitioner has complained that the proctors were not engineers and were untrained in determining whether a book was acceptable or not. The Board has conceded that engineers do not proctor examinations but asserted that they are intelligent enough to determine whether books fall within the proscribed category. According to Blackburn's proctor at the examination, George Walton, a retired Coast Guard captain and engineering graduate of the Coast Guard Academy, he relied upon the list of approved and disapproved books supplied by the Board prior to the examination in determining whether materials would be excluded or not. Walton also stated that if he examined a book and found it contained solutions, he would disallow the same unless the offending pages were removed. A DPR expert in testing and measurements, Dr. Joseph A. Klock, examined the pass/fail rate for the examination taken by Blackburn and compared that rate to the October 1989 examination rate. Doctor Klock found no significant difference in the two rates and concluded that there was no statistically significant difference in performance of candidates over those time periods despite the confusion which occurred during the April 1990 examination; one common way of making such a comparison is sketched after this paragraph. Blackburn did not present any evidence to show that if he had used the Seventh Edition of the Engineer In Training Review Manual, he would have been able to achieve more points on a particular problem and thus would have had a passing grade.
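The record does not disclose which statistical test Dr. Klock applied or the underlying pass counts; the sketch below assumes a standard two-proportion z-test with invented figures, solely to illustrate how pass rates from two administrations can be compared.

```python
# Hedged illustration: a conventional two-proportion z-test comparing pass rates
# from two examination administrations. The pass counts below are invented; the
# record does not disclose Dr. Klock's actual method or data.
import math

def two_proportion_z(pass1, n1, pass2, n2):
    p1, p2 = pass1 / n1, pass2 / n2
    p_pool = (pass1 + pass2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Invented figures for two administrations of the same examination.
z = two_proportion_z(pass1=21, n1=30, pass2=19, n2=30)
print(round(z, 2))  # |z| < 1.96 -> no significant difference at the 5 percent level
```

With the invented figures, |z| falls well below 1.96, so the difference in pass rates would not be statistically significant at the five percent level, which is the sort of conclusion described in the testimony.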
Blackburn's principal complaint was that he had spent many hours preparing for the examination in question, that he was forced to guess which books to bring to the examination, and because of the confusion and chaos that took place at the beginning of the examination as well as his awareness that others were using a review manual with solved problems, it was impossible for him to give his best effort on the examination.

Recommendation Based on the foregoing findings of fact and conclusions of law, it is RECOMMENDED that petitioner's request to receive a passing grade on the April 1990 professional engineer examination be DENIED. However, petitioner should be entitled to retake the next examination at no charge. DONE and ENTERED this 28th day of November, 1990, in Tallahassee, Florida. DONALD R. ALEXANDER Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, FL 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 28th day of November, 1990.

Florida Laws (1) 120.57
L. B. THANKI vs BOARD OF PROFESSIONAL ENGINEERS, 91-001545 (1991)
Division of Administrative Hearings, Florida Filed:Tampa, Florida Mar. 08, 1991 Number: 91-001545 Latest Update: May 10, 1991

Findings Of Fact L.B. Thanki received a degree in Civil Engineering from the University of Durham at King's College, Newcastle upon Tyne, in the United Kingdom in 1956. Petitioner received a bachelor of law degree from Sardar Patel University (India) in 1967. This degree is the equivalent of two years' study in law. The degree obtained from the University of Durham is not the equivalent of the degree received from an ABET-approved university in the United States because it lacks 16 credit hours in Humanities and Social Sciences. Petitioner presented no evidence that his degree from the University of Durham or the curriculum he completed at any other university included the missing 16 hours in Humanities and Social Sciences. Petitioner presented a certificate (which was not offered into evidence) that he had completed a course in computer services meeting the board's evidentiary requirements of computer skills.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that a Final Order be entered denying Petitioner's application for licensure by examination as an engineering intern. RECOMMENDED this 10th day of May, 1991, in Tallahassee, Leon County, Florida. N. AYERS Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 10th day of May, 1991. COPIES FURNISHED: B. Thanki 1106 East Hillsborough Avenue Tampa, Florida 33604 Edwin A. Bayo, Esquire Assistant Attorney General Department of Legal Affairs The Capitol, Suite LL04 Tallahassee, Florida 32399-1050 Carrie Flynn, Acting Executive Director Florida Board of Professional Engineers Northwood Centre, Suite 60 1940 North Monroe Street Tallahassee, Florida 32399-0755 Jack L. McRay, General Counsel Department of Professional Regulation Northwood Centre, Suite 60 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (2) 455.11, 471.013
SUSAN E. WILSON vs BOARD OF PROFESSIONAL ENGINEERS, 97-003468 (1997)
Division of Administrative Hearings, Florida Filed:Jacksonville, Florida Jul. 28, 1997 Number: 97-003468 Latest Update: Jan. 27, 1999

The Issue Is Petitioner entitled to one additional point on the October 1996 Professional Civil Engineer Examination so as to achieve a passing score for licensure in Florida?

Findings Of Fact Petitioner took the Civil Engineer Examination given in October 1996. The Department of Business and Professional Regulation's Bureau of Testing notified Petitioner by Examination Grade Report dated February 17, 1997, that she had earned a score of 69.00 on the Civil Engineer Examination. The minimum passing score for the Civil Engineer Examination is 70.00. Petitioner timely requested a formal hearing and challenged only Question 120, for which she received no points. Petitioner is trained as a materials engineer. Question 120 is a soils and foundation problem outside her concentrated area of study. It is an open book examination question. Petitioner selected the correct equation from the applicable manual, but acknowledged that she solved the variables of that equation incorrectly. The National Council of Examiners for Engineering and Surveying (NCEES) produced, distributed, and was responsible for grading the examinations. Petitioner contended that the examiner who graded her answer sheet applied different criteria than the examination criteria published by the NCEES. Petitioner further contended that since one criterion her grader actually used was merely to "write the correct equation," she should be awarded at least one point on that basis. However, a comparison of the actual grader's handwritten "summary" on Petitioner's Solution Pamphlet (Respondent's Exhibit 3) and the NCEES's Solutions and Scoring Plan (Respondent's Exhibit 2) does not bear out Petitioner's theory. It is clear that out of five possible parts of the question, which five parts total two points' credit each, merely selecting the correct equation from an open text would not amount to two points, or even one point, of credit. I accept as more competent, credible and persuasive the testimony of Eugene N. Beauchamps, the current Chairman of the NCEES Examination Policy Committee and a Florida-licensed Professional Engineer, that the grader's "summary" describes what he actually reviewed in Petitioner's written solution to Question 120 rather than establishing one or more different grading criteria. In order to receive a score of two on Question 120, the candidate was required to demonstrate any one of five requirements listed in the NCEES Solution and Scoring Plan for "2-Rudimentary Knowledge." The first requirement in the NCEES Solution and Scoring Plan (Respondent's Exhibit 2) for receiving a score of two points is, "Determines effective overburden stress at mid-depth of clay layer." The remaining four NCEES scoring criteria required that the examinee: Computes the change in effective stress at mid-depth of the clay layer due to placement of the fill. Computes the primary consolidation settlement, based on a change in effective stress, due to the fill surcharge. Evaluates the Average Degree of Consolidation and the Time Factor. Determines the waiting period after fill placement recognizing the existence of double-drained conditions. In order to gain two more points (total 4 points) so as to demonstrate "More Than Rudimentary Knowledge But Insufficient to Demonstrate Minimum Competence," Petitioner would have to have met two of the five bulleted criteria. For two more points (total 6 points) for "Minimum Competence," Petitioner would have had to score three bullets. For two more points (total 8 points) for "More than Minimum But Less Than Exceptional Competence," Petitioner would have had to score four bullets.
Finally, to attain "Exceptional Competence" for 10 total points, Petitioner would have had to score all five bullets. In the first correct equation for answering Question 120, "p sub zero" (p naught) equals the present effective overburden pressure, which represents what clay was present before anything was put on top of the clay layer. "P" equals the total pressure acting at mid-height of the consolidating clay layer, or the pressure of the dirt and the water in the dirt. "H" equals the thickness of the consolidating clay layer. Petitioner's solution for the first bullet, "determining the effective overburden stress at mid-depth of clay layer," indicated p sub zero (p naught) as the "present effective overburden pressure," but it incorrectly calculated p sub zero as equaling 125 pounds multiplied by 13 feet. This is incorrect because the effective overburden pressure would not include 13 feet of fill. The 13 feet of fill is not part of p sub zero, the present effective overburden pressure. Petitioner's solution for the first bullet also multiplied water, represented by 62.4, by 12, which is incorrect. She should have used a multiplier of 10 to receive credit for this problem. The grader indicated the correct equation was used incorrectly by Petitioner because of the two foregoing incorrect calculations. The equation, as Petitioner stated it, was correct and her multiplication was correct. Her solution identified p sub zero as the present effective overburden pressure, but the present effective overburden pressure would not include the fill. Petitioner had the correct equation for the present effective overburden pressure and her mathematics were correct. However, she did not use the consolidation equation correctly, not obtaining the correct percentage of primary consolidation. As stated, the problem did not consider the fill as part of the present effective overburden pressure. Her solution also contained the correctly written time rate of settlement equation but failed to use it, and no waiting period was determined. The practical result of Petitioner's error could range from a cracked building to a collapsed building, depending upon the degree of error and the site and materials involved.
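For reference only, the sketch below works the kind of effective overburden stress computation at issue. The layer depths and unit weights are hypothetical placeholders, since the exam problem's actual values are not reproduced in the record; the essential point carried over from the findings is that newly placed fill is excluded from p sub zero.

```python
# Hedged sketch of the "present effective overburden stress" (p0) at mid-depth
# of a clay layer. All numbers are hypothetical; the record does not reproduce
# the exam problem's actual geometry or unit weights. The point reflected here
# is that newly placed fill is NOT part of p0, and the hydrostatic pore
# pressure is subtracted from the total vertical stress.

GAMMA_WATER = 62.4  # unit weight of water, lb/ft^3

def p0_at_mid_clay(total_unit_weight, depth_to_mid_clay, depth_below_water_table):
    """Effective vertical stress (psf) at mid-depth of the existing clay layer,
    excluding any surcharge from fill placed afterward."""
    total_stress = total_unit_weight * depth_to_mid_clay    # existing soil column only
    pore_pressure = GAMMA_WATER * depth_below_water_table   # hydrostatic water pressure
    return total_stress - pore_pressure

# Hypothetical illustration: 120 pcf soil, mid-clay 15 ft down, water table 5 ft down.
print(p0_at_mid_clay(total_unit_weight=120.0,
                     depth_to_mid_clay=15.0,
                     depth_below_water_table=10.0))  # 1800 - 624 = 1176 psf
```

Under this simplification, counting the new fill in the depth term, or applying the wrong multiplier to the water term, distorts p0 and carries through to the consolidation settlement and waiting period estimates, which is the nature of the error the findings describe.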

Recommendation Upon the foregoing findings of fact and conclusions of law, it is RECOMMENDED that the Department of Business and Professional Regulation enter a Final Order denying Petitioner's challenge and affirming her score as one point below passing. RECOMMENDED this 3rd day of March, 1998, in Tallahassee, Leon County, Florida. ELLA JANE P. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 3rd day of March, 1998. COPIES FURNISHED: Susan E. Wilson 3581 Jose Terrace Jacksonville, Florida 32217 R. Beth Atchison Assistant General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399 Angel Gonzalez, Executive Director Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399 Lynda L. Goodgame General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399

Florida Laws (1) 120.57
NATIONAL COMPUTER SYSTEMS, INC. vs DEPARTMENT OF EDUCATION, 99-001226BID (1999)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Mar. 17, 1999 Number: 99-001226BID Latest Update: Jul. 19, 1999

The Issue The primary issue is whether the process used by the Department of Education (Department) for evaluating and ranking the proposals submitted in response to Request For Proposal (RFP) 99-03 for the Florida Comprehensive Assessment Test (FCAT) administration contract was contrary to the provisions of the RFP in a way that was clearly erroneous, contrary to competition, arbitrary, or capricious.

Findings Of Fact The RFP for the FCAT describes a five stage process for evaluating proposals. In Stage I, the Department’s Purchasing Office determined whether a proposal contained certain mandatory documents and statements and was sufficiently responsive to the requirements of the RFP to permit a complete evaluation. Stage II involved the Department’s evaluation of a bidder’s corporate qualifications to determine whether the bidder has the experience and capability to do the type of work that will be required in administering the FCAT. Stage III was the Department’s evaluation of a bidder’s management plan and production proposal. In Stage IV, the Department evaluated a bidder’s cost proposal. Stage V involved the ranking of proposals based on points awarded in Stages II-IV. If a proposal did not meet the requirements at any one stage of the evaluation process, it was not to be evaluated in the following stage. Instead, it was to be disqualified from further consideration. Stages II and III of the evaluation process were conducted by an evaluation team comprised of six Department employees: Dr. Debby Houston, Ms. Lynn Joszefczyk, Dr. Peggy Stillwell, Dr. Cornelia Orr, Dr. Laura Melvin, and Ms. Karen Bennett. Dr. Thomas Fisher, head of the Department’s Assessment and Evaluation Services Section, and Dr. Mark Heidorn, Administrator for K-12 Assessment Programs within the Department’s Assessment and Evaluation Services Section, served as non-voting co-chairs of the evaluation team. The focus of this proceeding is Stage II of the evaluation process addressing a bidder’s corporate qualifications. RFP Provisions Regarding Corporate Qualification The FCAT administration contractor will be required to administer tests to approximately one and a half million students each year in a variety of subject areas at numerous grade levels. The FCAT program involves a complex set of interrelated work activities requiring specialized human resources, technological systems and procedures. The FCAT must be implemented annually within limited time periods. The FCAT administration contractor must meet critical deadlines for the delivery of test materials to school districts and the delivery of student scores prior to the end of the school year. In developing the RFP, the Department deliberately established a set of minimum requirements for corporate qualifications that a bidder was to demonstrate in order for its proposal to be eligible for further evaluation. The purpose of the RFP’s minimum corporate qualifications requirements was to limit bidding to qualified vendors who have demonstrated prior experience in successfully administering large-scale assessment projects like the FCAT, thereby providing the Department with some degree of assurance that the winning bidder could successfully administer the FCAT. The instructions to bidders regarding the minimum requirements for corporate qualifications are contained in RFP Section 10, which gives directions on proposal preparation. Section 10.1, which lists certain mandatory documents and statements to be included in the bidder’s proposal, requires that a transmittal letter contain "[a] statement certifying that the bidder has met the minimum corporate qualifications as specified in the RFP." These "minimum corporate qualifications" are set forth in RFP Appendix J. RFP Section 10.2 identifies what a bidder is required to include in its proposal with respect to corporate qualifications. 
The first paragraph of Section 10.2 directs a bidder generally to describe its qualifications and experience performing tasks similar to those that it would perform in administering the FCAT, in order to demonstrate that the bidder is qualified, where it states: Part II of a bidder's proposal shall be entitled Corporate Qualifications. It shall provide a description of the bidder's qualifications and prior experience in performing tasks similar to those required in this RFP. The discussion shall include a description of the bidder's background and relevant experience that qualifies it to provide the products and services required by the RFP. RFP Section 10.2, however, is not limited to a directive that qualifications and past experience be described generally. Instead, Section 10.2 also communicates, in plain and unambiguous terms, that there are specific minimum corporate qualifications a bidder must demonstrate: The minimum expectations for corporate qualifications and experience are shown in Appendix J. There are two separate sets of factors, one set of eight for the developmental contractor and another set of nine for the administration contractor. Bidders must demonstrate their Corporate Qualifications in terms of the factors that are applicable to the activities for which a bid is being submitted -- development or administration. For each criterion, the bidder must demonstrate that the minimum threshold of experience has been achieved with prior completed projects. (Emphasis added.) Moreover, Section 10.2 singles out for emphasis, in relation to the administration component of the RFP, the importance placed on a bidder's ability to demonstrate experience processing a large volume of tests: The [bidder's prior completed] projects must have included work tasks similar to those described herein, particularly in test development or processing a comparable number of tests. The bidder will provide a description of the contracted services; the contract period; and the name, address, and telephone number of a contact person for each of the contracting agencies. This description shall (1) document how long the organization has been providing similar services; (2) provide details of the bidder's experience relevant to the services required by this RFP; and (3) describe the bidder's other testing projects, products, and services that are similar to those required by this RFP. (Emphasis added.) The Department thus made clear its concern that bidders demonstrate experience with large-scale projects. RFP Appendix J sets forth nine different criteria (C1 through C9) for the administration contractor. As stated in RFP Section 10.2, "[f]or each criterion, the bidder must demonstrate that the minimum threshold of experience has been achieved with prior completed projects . . . ." (emphasis added). Appendix J contains a chart which lists for each criterion: (1) a summary of the related FCAT work task, (2) the detailed criteria for the bidder's experience related to that work task, and (3) the necessary documentation a bidder must provide. Criterion C4 and Criterion C6 include work tasks that involve the use of image-based scoring technology. C4 and C6 are the only corporate qualifications criteria at issue in this proceeding. RFP Provisions Involving Corporate Qualifications for Image-Based Scoring "Handscoring" is the test administration activity in which open-ended or performance-based student responses are assessed.
This practice involves a person reading something the student has written as part of the test, as distinguished from machine scoring multiple choice responses (i.e., the filled-in "bubbles" on an answer sheet). There are two types of handscoring: (1) paper-based handscoring, and (2) image-based handscoring. Paper-based handscoring requires that a student response paper be sent to a reader, who then reviews the student’s response as written on the paper and enters a score on a separate score sheet. Image-based handscoring involves a scanned image of the student’s response being transmitted to a reader electronically. The student’s response is then projected on a computer screen, where the reader reviews it and assigns a score using the computer. The RFP requires that the reading and math portions of the FCAT be handscored on-line using imaging technology beginning with the February 2000 FCAT administration. The RFP provides that the writing portion of the FCAT may be handscored using either the paper-based method or on-line imaging technology during the February 2000 and 2001 FCAT administrations. However, on-line image-based scoring of the writing portion of the FCAT is required for all FCAT administrations after February 2001. An image-based scoring system involves complex computer technology. William Bramlett, an expert in designing and implementing large-scale imaging computer systems and networks, presented unrefuted testimony that an image-based scoring system will be faced with special challenges when processing large volumes of tests. These challenges involve the need to automate image quality control, to manage the local and wide area network load, to assure adequate server performance and storage requirements, and to manage the work flow in a distributed environment. In particular, having an image-based scoring system process an increasing volume of tests is not simply a matter of adding more components. Rather, the system’s basic software architecture must be able to understand and manage the added elements and volume involved in a larger operation. According to Bramlett, there are two ways that the Department could assess the ability of a bidder to perform a large- scale, image-based scoring project such as the FCAT from a technological perspective: (1) have the bidder provide enough technological information about its system to be able to model or simulate the system and predict its performance for the volumes involved, or (2) require demonstrated ability through completion of prior similar projects. Dr. Mark Heidorn, Administrator for Florida’s K-12 Statewide Assessment Programs, was the primary author of RFP Sections 1-8, which describe the work tasks for the FCAT -- the goods and services vendors are to provide and respond to in their technical proposals. Dr. Heidorn testified that in the Department’s testing procurements involving complex technology, the Department has never required specific descriptions of the technology to be used. Instead, the Department has relied on the bidder’s experience in performing similar projects. Thus, the RFP does not specifically require that bidders describe in detail the particular strategies and approaches they intend to employ when designing and implementing an image-based scoring system for FCAT. Instead, the Department relied on the RFP requirements calling for demonstrated experience as a basis to understand that the bidder could implement such an image-based scoring system. 
Approximately 717,000 to 828,000 student tests will be scored annually by the FCAT administration contractor using imaging technology. The RFP, however, does not require that bidders demonstrate image-based scoring experience at that magnitude. Instead, the RFP requires bidders to demonstrate only a far less demanding minimum level of experience using image-based scoring technology. Criterion C4 and Criterion C6 in Appendix J of the RFP each require that a bidder demonstrate prior experience administering "a minimum of two" assessment programs using image-based scoring that involved "at least 200,000 students annually." The requirements for documenting a "minimum of two" programs or projects for C4 and C6 involving "at least 200,000 students annually" are material because they are intended to provide the Department with assurance that the FCAT administration contractor can perform the large-scale, image-based scoring requirements of the contract from a technological perspective. Such experience would indicate that the bidder would have been required to address the sort of system issues described by Bramlett. Dr. Heidorn testified that the number 200,000 was used in C4 and C6 "to indicate the level of magnitude of experience which represented for us a comfortable level to show that a contractor had enough experience to ultimately do the project that we were interested in completing." Dr. Fisher, who authored Appendix J, testified that the 200,000 figure was included in C4 and C6 because it was a number judged sufficiently characteristic of large-scale programs to be relevant for C4 and C6. Dr. Fisher further testified that the Department was interested in having information that a bidder's experience included projects of a sufficient magnitude so that the bidder would have experienced the kinds of processing issues and concerns that arise in a large-scale testing program. The Department emphasized this specific quantitative minimum requirement in response to a question raised at the Bidder's Conference held on November 13, 1998: Q9: In Appendix J, the criteria for evaluating corporate quality for the administration operations C4, indicates that the bidder must have experience imaging as indicated. Does this mean that the bid [sic] must bid for using [sic] imaging technology for reading and mathematics tests? A: Yes. The writing assessment may be handscored for two years, and then it will be scored using imaging technology. To be responsive, a bid must be for imaging. The corporate experience required (200,000 students annually for which reports were produced in three months) could be the combined experience of the primary contractor and the subcontractors. (Emphasis added.) Criterion C4 addresses the RFP work tasks relating to handscoring, including both the image-based handscoring of the reading and math portions of the FCAT for all administrations and the writing portions of the FCAT for later administrations. The "Work Task" column for C4 in Appendix J of the RFP states: Design and implement efficient and effective procedures for handscoring student responses to performance tasks within the limited time constraints of the assessment schedule. Handscoring involves image-based scoring of reading and mathematics tasks for all administrations and writing tasks for later administrations at secure scoring sites.
Retrieve and score student responses from early district sample schools and deliver required data to the test development contractor within critical time periods for calibration and scaling. The "Necessary Documentation" column for C4 in Appendix J states: Bidder must document successful completion of a minimum of two performance item scoring projects for statewide assessment programs during the last four years for which the bidder was required to perform as described in the Criteria column. (Emphasis added.) The "Criteria" column for C4 in Appendix J, like the related work tasks in the RFP, addresses both image-based handscoring of reading and math, as well as paper-based or image-based handscoring of writing. In connection with all handscoring work tasks, "[t]he bidder must demonstrate completion of test administration projects for a statewide program for which performance items were scored using scoring rubrics and associated scoring protocols." With respect to the work tasks for handscoring the reading and math portions of the FCAT, "[t]he bidder must demonstrate completion of statewide assessment programs involving scoring multiple-choice and performance items for at least 200,000 students annually for which reports were produced in three months." In addition, for the reading and math work tasks, "[e]xperience must be shown in the use of imaging technology and hand-scoring student written responses with completion of scoring within limited time restrictions." This provision dealing with "imaging technology" experience self-evidently addresses the reading and math components, because separate language addresses imaging experience in connection with the writing component. The relevant handscoring experience for the reading and math aspects of the program is experience using image-based technology. By contrast, with respect to the work tasks for scoring the writing portions of the FCAT, "the bidder must also demonstrate completion of statewide assessment programs involving paper-based or imaged scoring student responses to writing assessment prompts for at least 200,000 students annually for which reports were produced in three months." (Emphasis added.) Criterion C6 addresses work tasks relating to designing and implementing systems for processing, scanning, imaging and scoring student responses to mixed-format tests within limited time constraints. The "Work Task" column for C6 in RFP Appendix J states: Design and implement systems for the processing, scanning, imaging, and scoring of student responses to test forms incorporating both multiple-choice and constructed response items (mixed-format) within the limited time constraints of the assessment schedule. Scoring of student responses involves implementation of IRT scoring tables and software provided by the development contractor within critical time periods. The "Necessary Documentation" column for C6 in Appendix J states: Bidder must document successful completion of a minimum of two test administration projects for statewide assessment programs during the last four years in which the bidder was required to perform as described in the Criteria column. (Emphasis added.)
The Criteria column for C6 in Appendix J states: The bidder must demonstrate completion of test administration projects for statewide assessment programs or other large-scale assessment programs that required the bidder to design and implement systems for processing, scanning, imaging, and scoring responses to mixed-format tests for at least 200,000 students annually for which reports were produced in three months. Experience must be shown in use of imaging student responses for online presentation to readers during handscoring. (Emphasis added.) RFP Provisions Per Corporate Qualifications The procedure for evaluating a bidder's corporate qualifications is described in RFP Section 11.3: The Department will evaluate how well the resources and experience described in each bidder's proposal qualify the bidder to provide the services required by the provisions of this RFP. Consideration will be given to the length of time and the extent to which the bidder and any proposed subcontractors have been providing services similar or identical to those requested in this RFP. The bidder's personnel resources as well as the bidder's computer, financial, and other technological resources will be considered in evaluating a bidder's qualifications to meet the requirements of this RFP. Client references will be contacted and such reference checks will be used in judging a bidder's qualifications. The criteria to be used to rate a bidder's corporate qualifications to meet the requirements of this RFP are shown in Appendix J and will be applied as follows: * * * Administrative Activities. Each of the nine administration activities criteria in Appendix J will be individually rated by members of the evaluation team. The team members will use the rating scale shown in Figure 1 below. Individual team members will review the bidder's corporate qualifications and rate the response with a rating of one to five. The ratings across all evaluators for each factor will be averaged, rounded to the nearest tenth, and summed across all criteria. If each evaluator assigns the maximum number of points for each criterion, the total number of points will be 45. To meet the requirements of Stage II, the proposal must achieve a minimum rating of 27 points and have no individual criterion for which the number of points averaged across evaluators and then rounded is less than 3.0. Each proposal that receives a qualifying score based on the evaluation of the bidder's qualifications will be further evaluated in Stage III.
Figure 1. Evaluation Scale for Corporate Qualifications
5 (Excellent): The bidder has demonstrated exceptional experience and capability to perform the required tasks.
4
3 (Satisfactory): The bidder has demonstrated that it meets an acceptable level of experience and capability to perform the required tasks.
2
1 (Unsatisfactory): The bidder either has not established its corporate qualifications or does not have adequate qualifications.
RFP Section 11.3 provides that each of the nine corporate qualifications criteria for administration operations in Appendix J (C1 through C9) will be individually rated by the six members of the evaluation team using a scale of one to five. A rating of three is designated as "satisfactory," which means that "[t]he bidder has demonstrated that it meets an acceptable level of experience and capability to perform the required tasks."
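Because Stage II turns on this arithmetic, the sketch below simply restates the rule quoted from Section 11.3 in code: each criterion's six ratings are averaged and rounded to the nearest tenth, the rounded averages are summed across the nine criteria, and the proposal qualifies only if the total is at least 27 with no rounded criterion average below 3.0. The ratings used are invented.

```python
# Sketch of the Stage II scoring rule quoted from RFP Section 11.3. Per-criterion
# ratings (1-5) from the six evaluators are averaged and rounded to the nearest
# tenth; the rounded averages are summed across the nine criteria; a proposal
# qualifies only if the total is at least 27 and no rounded average is below 3.0.
# The ratings below are invented for illustration.

def stage_two_result(ratings_by_criterion):
    """ratings_by_criterion: list of 9 lists, each holding the 6 evaluators' ratings."""
    rounded_averages = [round(sum(r) / len(r), 1) for r in ratings_by_criterion]
    total = sum(rounded_averages)
    passes = total >= 27 and min(rounded_averages) >= 3.0
    return rounded_averages, total, passes

# Invented example: nine criteria, six evaluators each.
example = [
    [4, 4, 3, 5, 4, 4],  # C1
    [3, 3, 4, 3, 3, 4],  # C2
    [5, 4, 4, 4, 5, 4],  # C3
    [3, 2, 3, 3, 2, 3],  # C4 -- rounds to 2.7, below the 3.0 cut
    [4, 4, 4, 4, 4, 4],  # C5
    [3, 3, 3, 3, 3, 3],  # C6
    [4, 3, 4, 4, 3, 4],  # C7
    [5, 5, 4, 5, 4, 5],  # C8
    [4, 4, 4, 3, 4, 4],  # C9
]
averages, total, passes = stage_two_result(example)
print(averages, total, passes)  # C4's 2.7 disqualifies the proposal even though the total exceeds 27
```

The example illustrates why the 3.0 per-criterion cut score matters independently of the 27-point total: a single criterion averaging below 3.0 eliminates the proposal regardless of how strong the other eight criteria are.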
In order to be further evaluated, Section 11.3 provides that there must be no individual corporate qualifications criterion for which the bidder’s proposal receives a score less than 3.0 (average points across evaluators). Dr. Fisher, the primary author of Section 11.3 of the RFP, referred to the 3.0 rating as the "cut score." (Emphasis added.) The RFP’s clear and unambiguous terms thus establish the "minimum threshold" of experience that a bidder "must demonstrate" in its proposal for Criterion C1 through Criterion C9. The "minimum threshold" of experience that a bidder must demonstrate for each criterion is described in Appendix J of the RFP. If a proposal failed to demonstrate that the bidder meets the minimum threshold of experience for a particular criterion in Appendix J, the bidder obviously would not have demonstrated "that it meets an acceptable level of experience and capability to perform the required tasks." Thus, in that setting, an evaluator was to have assigned the proposal a rating of less than "satisfactory," or less than three, for that criterion. (Emphasis added.) The fact that a score less than "3" was expected for -- and would eliminate -- proposals that did not demonstrate the "minimum threshold" of experience does not render meaningless the potential scores of "1" and "2." Those scores may reflect the degree to which a bidder’s demonstrated experience was judged to fall below the threshold. Although some corporate capability minimums were stated quantitatively (i.e., "minimum of two," or "at least 200,000"), others were open to a more qualitative assessment (i.e., "large-scale," "systems," or "reports"). Moreover, a proposal that included demonstrated experience in some manner responsive to each aspect of Appendix J might nevertheless be assigned a score of less than "3," based on how an evaluator assessed the quality of the experience described in the proposal. By the terms of the RFP, however, an average score across evaluators of less than 3 represented essentially a decision that the minimum threshold of experience was not demonstrated. Had the Department truly intended Appendix J to reflect only general targets or guidelines, there were many alternative ways to communicate such an intent without giving mandatory direction about what bidders "must demonstrate" or without establishing quantitative minimums (i.e. "a minimum of two," or "at least 200,000"). RFP Appendix K, for instance, sets forth the evaluation criteria for technical proposals in broad terms that do not require the bidder to provide anything in particular. Even within Appendix J, other than in Criterion C4 and Criterion C6, bidders were to show experience with "large-scale" projects rather than experience at a quantified level. Pursuant to the RFP’s plain language, in order to meet the "minimum threshold" of experience for Criterion C4 and Criterion C6, a bidder "must demonstrate," among other things, successful completion of a "minimum of two" projects, each involving the use of image-based scoring technology in administering tests to "at least 200,000 students annually." Department’s Evaluation of Corporate Qualifications In evaluating Harcourt’s proposal, the Department failed to give effect to the plain RFP language stating that a bidder "must document" successful completion of a "minimum of two" testing projects involving "at least 200,000 students annually" in order to meet the "minimum threshold" of experience for C4 and C6. Dr. 
Fisher was the primary author of Sections 10, 11 and Appendix J of the RFP. He testified that during the Stage II evaluation of corporate qualifications, the evaluation team applied a "holistic" approach, like that used in grading open-ended written responses in student test assessments. Under the holistic approach that Dr. Fisher described, each member of the evaluation team was to study the proposals, compare the information in the proposals to everything contained in Appendix J, and then assign a rating for each criterion in Appendix J based on "how well" the evaluator felt the proposal met the needs of the agency. Notwithstanding Dr. Fisher’s present position, the RFP’s terms and their context demonstrate that the minimum requirements for corporate qualifications are in RFP Appendix J. During the hearing, Dr. Fisher was twice asked to identify language in the RFP indicating that the Department would apply a "holistic" approach when evaluating corporate qualifications. Both times, Dr. Fisher was unable to point to any explicit RFP language putting bidders on notice that the Department would be using a "holistic" approach to evaluating proposals and treating the Appendix J thresholds merely as targets. In addition, Dr. Fisher testified that the Department did not engage in any discussion at the bidders’ conference about the evaluation method that was going to be used other than drawing the bidders’ attention to the language in the RFP. As written, the RFP establishes minimum thresholds of experience to be demonstrated. Where, as in the RFP, certain of those minimum thresholds are spelled out in quantitative terms that are not open to interpretation or judgment, it is neither reasonable nor logical to rate a proposal as having demonstrated "an acceptable level of experience" when it has not demonstrated the specified minimum levels, even if other requirements with which it was grouped were satisfied. The plain RFP language unambiguously indicates that an analytic method, not a "holistic" method, will be applied in evaluating corporate qualifications. Dr. Fisher acknowledged that, in an assessment using an analytic method, there is considerable effort placed up front in deciding the specific factors that will be analyzed, and those factors are listed and explained. Dr. Fisher admitted that the Department went into considerable detail in Appendix J of the RFP to explain to the bidders the minimums they had to demonstrate and the documentation that was required. In addition, Dr. Orr, who served as a member of the evaluation team and who herself develops student assessment tests, stated that in assessments using the "holistic" method there is a scoring rubric applied, but that rubric does not contain minimum criteria like those found in the RFP for FCAT. The holistic method applied by the Department ignores very specific RFP language which spells out minimum requirements for corporate qualifications. Harcourt’s Corporate Qualifications for C4 and C6 Harcourt’s proposal lists the same three projects administered by Harcourt for both Criterion C4 and Criterion C6: the Connecticut Mastery Test ("CMT"), the Connecticut Academic Performance Test ("CAPT") and the Delaware Student Testing Program ("DSTP"). Harcourt’s proposal also lists for Criterion C4 projects administered by its proposed scoring subcontractors, Measurement Incorporated ("MI") and Data Recognition Corporation ("DRC"). However, none of the projects listed for MI or DRC involve image-based scoring. 
Thus, the MI and DRC projects do not demonstrate any volume of image-based scoring as required by C6 and by the portion of C4 which relates to the work task for the image-based scoring of the math and reading portions of the FCAT. Harcourt’s proposal states that "[a]pproximately 35,000 students per year in grade 10 are tested with the CAPT." Harcourt’s proposal states that "[a]pproximately 120,000 students per year in grades 4, 6 and 8 are tested with the CMT." Harcourt’s proposal states that "[a]pproximately 40,000 students in grades 3, 5, 8, and 10" are tested with the DSTP. Although the descriptions of the CMT and the CAPT in Harcourt’s proposal discuss image-based scoring, there is nothing in the description of the DSTP that addresses image-based scoring. There is no evidence that the evaluators were ever made aware that the DSTP involved image-based scoring. Moreover, although the Department called the Delaware Department of Education ("DDOE") as a reference for Harcourt’s development proposal, the Department did not discuss Harcourt’s administration of the DSTP (including whether the DSTP involves image-based scoring) with the DDOE. Harcourt overstated the number of students tested in the projects it referenced to demonstrate experience with image-based scoring. Harcourt admitted at hearing that, prior to submitting its proposal, Harcourt had never tested 120,000 students with the CMT. In fact, the total number of students tested by Harcourt on an annual basis under the CMT has ranged from 110,273 in the 1996-97 school year to 116,679 in the 1998-99 school year. Harcourt also admitted at hearing that, prior to submitting its proposal, Harcourt had never tested 35,000 students in grade 10 with the CAPT. Instead, the total number of grade 10 students tested by Harcourt on an annual basis with the CAPT ranged from 30,243 in 1997 to 31,390 in 1998. In addition, Harcourt admitted at hearing that, prior to submitting its proposal, it had conducted only one "live" administration of the DSTP (as distinguished from field testing). That administration of the DSTP involved only 33,051, not 40,000, students in grades 3, 5, 8 and 10. Harcourt itself recognized that "field tests" of the DSTP are not responsive to C4 and C6, as evidenced by Harcourt’s own decision not to include in its proposal the number of students field tested under the DSTP. Even assuming that the numbers in Harcourt’s proposal are accurate, and that the description of the DSTP in Harcourt’s proposal reflected image-based scoring, Harcourt’s proposal on its face does not document any single project administered by Harcourt for C4 or C6 involving image-based testing of more than 120,000 students annually. When the projects are aggregated, the total number of students claimed as tested annually still does not reach the level of "at least 200,000"; it comes to only 195,000, and it reaches that level only once due to the single administration of the DSTP. Moreover, even if that 195,000 were considered "close enough" to the 200,000 level required, it was achieved only one time, while Appendix J plainly requires that testing at that level have been performed a minimum of two times. The situation worsens for Harcourt when using the true numbers of students tested under the CMT, CAPT, and DSTP, because Harcourt cannot document any single image-based scoring project it has administered involving testing more than 116,679 students annually. 
Moreover, when the true numbers of students tested are aggregated, the total rises only to 181,120 students tested annually on one occasion, and no more than 141,663 tested annually on any other occasion. Despite this shortfall from the minimum threshold of experience, under the Department’s holistic approach the evaluators assigned Harcourt’s proposal four ratings of 3.0 and two ratings of 4.0 for C4, for an average of 3.3 on C4; and five ratings of 3.0 and one rating of 4.0 for C6, for an average of 3.2 on C6. Applying the plain language of the RFP in Sections 10 and 11 and Appendix J, Harcourt did not demonstrate that it meets an acceptable level of experience and capability for C4 or C6, because Harcourt did not satisfy the minimum threshold for each criterion by demonstrating a minimum of two prior completed projects involving image-based scoring requiring testing of at least 200,000 students annually. Harcourt’s proposal should not have received any rating of 3.0 or higher on C4 or C6 and should have been disqualified from further evaluation due to failure to demonstrate the minimum experience that the Department required in order to be assured that Harcourt could successfully administer the FCAT program. NCS’s Compliance With RFP Requirements Even though the NCS proposal did not meet all of the mandatory requirements, and despite the requirement of Section 11.2 that the proposal be automatically disqualified under such circumstances, the Department waived NCS’s noncompliance as a minor irregularity. The factors in C4 and C6 were set, minimal requirements with which NCS did not comply. For example, one of the two programs NCS submitted in response to Criteria C4 and C6 was the National Assessment of Educational Progress program ("NAEP"). NAEP, however, is not a "statewide assessment program" within the meaning of that term as used in Criteria C4 and C6. Indeed, NCS admitted that NAEP is not a statewide assessment program and that, without consideration of that program, NCS’s proposal is not responsive to Criteria C4 and C6 because NCS has not submitted the required proof of having administered two statewide assessment programs. This error cannot be cured by relying on the additional experience of NCS’s subcontractor because that experience does not show that its subcontractor produced reports within three months, and so such experience does not demonstrate compliance with Criterion C4. The Department deliberately limited the competition for the FCAT contract to firms with specified minimum levels of experience. As opined at final hearing, if the Department in the RFP had announced to potential bidders that the type of experience it asked vendors to describe represented only targets, goals and guidelines, and that a failure to demonstrate target levels of experience would not be disqualifying, then the competitive environment for this procurement would have differed, a difference of consequence because only 2.06 evaluation points (out of a possible 150) separated the NCS and Harcourt scores. Dr. Heidorn conceded that multiple companies with experience in different aspects of the FCAT program -- a computer/imaging company and a firm experienced in educational testing -- might combine to perform a contract like the FCAT. Yet that combination of firms would be discouraged from bidding because they could not demonstrate the minimum experience spelled out in the RFP. Language in the RFP indicating that a "holistic" evaluation would be applied could have resulted in a different field of potential and actual bidders.
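Purely for illustration, the arithmetic underlying the preceding findings can be checked with a short script. The student counts and evaluator ratings are those stated in the findings, and the 200,000-student threshold is drawn from Appendix J; the script itself is an editorial sketch, not evidence in the record.

```python
# Figures taken from the findings above; the 200,000-student threshold is from Appendix J.
THRESHOLD = 200_000

claimed = {"CMT": 120_000, "CAPT": 35_000, "DSTP": 40_000}           # as stated in Harcourt's proposal
actual_best_year = {"CMT": 116_679, "CAPT": 31_390, "DSTP": 33_051}  # as established at hearing

print(sum(claimed.values()), sum(claimed.values()) >= THRESHOLD)                    # 195000 False
print(sum(actual_best_year.values()), sum(actual_best_year.values()) >= THRESHOLD)  # 181120 False

# Ratings nonetheless assigned under the Department's holistic approach.
c4_ratings = [3.0, 3.0, 3.0, 3.0, 4.0, 4.0]
c6_ratings = [3.0, 3.0, 3.0, 3.0, 3.0, 4.0]
print(round(sum(c4_ratings) / len(c4_ratings), 1))  # 3.3 -- at or above the 3.0 cut score
print(round(sum(c6_ratings) / len(c6_ratings), 1))  # 3.2
```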

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that Respondent, State of Florida, Department of Education, enter a Final Order rejecting the bids submitted by Harcourt and NCS for the administration component of the RFP. The Department should then seek new proposals. DONE AND ENTERED this 25th day of May, 1999, in Tallahassee, Leon County, Florida. DON W. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 25th day of May, 1999. COPIES FURNISHED: Karen D. Walker, Esquire Holland and Knight, LLP Post Office Drawer 810 Tallahassee, Florida 32302 Mark D. Colley, Esquire Holland and Knight, LLP Suite 400 2100 Pennsylvania Avenue, Northwest Washington, D.C. 20037 Charles S. Ruberg, Esquire Department of Education The Capitol, Suite 1701 Tallahassee, Florida 32399-0400 Paul R. Ezatoff, Jr., Esquire Christopher B. Lunny, Esquire Katz, Kutter, Haigler, Alderman, Bryant and Yon, P.A. 106 East College Avenue, Suite 1200 Tallahassee, Florida 32302-7741 Tom Gallagher Commissioner of Education Department of Education The Capitol, Plaza Level 08 Tallahassee, Florida 32399-0400 Michael H. Olenick, General Counsel Department of Education The Capitol, Suite 1701 Tallahassee, Florida 32399-0400

Florida Laws (3) 120.57, 287.012, 287.057
# 8
CLARK W. BRIDGMAN vs. BOARD OF PROFESSIONAL ENGINEERS, 87-004993 (1987)
Division of Administrative Hearings, Florida Number: 87-004993 Latest Update: Jun. 30, 1988

The Issue The issue presented for decision herein is whether the Petitioner successfully answered the questions posed on the April, 1987 professional engineer's examination.

Findings Of Fact Petitioner took the April, 1987 professional engineering examination and was advised that he failed the principles and practice portion of the examination. His raw score was 45 points, and the parties stipulated that he needed a minimum raw score of 48 points to pass the examination. In his request for hearing, Petitioner challenged questions 120, 123 and 420. However, during the hearing, he only presented testimony and challenged question 420. Question 420 is worth 10 points and is set forth in its entirety in Petitioner's Exhibit Number 1. For reasons of test security, the exhibit has been sealed. Question 420 requires the examinee to explore the area of "braced excavations" and the principles involved in such excavations. Question 420 requires the examinee to calculate the safety factor for a braced excavation, including the depth of excavation which would cause failure by "bottom heaving". Petitioner, in calculating the safety factor, made a mathematical error when he incorporated the B-prime value calculation into the equation used in making his calculations. Question 420 does not direct the applicant to apply the calculations to either a square excavation or to a rectangular excavation. Petitioner assumed the shape of the excavation to be square and calculated the factor of safety according to that assumption. In assuming the square excavation, Petitioner did not make the more conservative calculation that would be required in making the safety factor calculation for a rectangular excavation. In this regard, an examination of Petitioner's work sheet indicates that he referenced the correct calculation on his work sheet, but the calculation was not transferred to or utilized in the equation. Respondent utilizes the standard scoring plan outline, more commonly known as the Items Specific Scoring Plan (ISSP), which is used by the scorers in grading the exam. The ISSP provides a scoring breakdown for each question so that certain uniform criteria are met by all applicants. For example, four points are given for a correct solution on a specific question regardless of the scorer. These criteria are supplied by the person or persons who prepared the exam. The criteria indicate "in problem-specific terms, the types of deficiencies that would lead to scoring at each of the eleven (0-10) points on the scale". The ISSP awards six points on question 420 when the applicant meets the following standard: "all categories satisfied, applicant demonstrate minimally adequate knowledge in all relevant aspect of the item." The ISSP awards seven points on question 420 when the applicant's answer meets the following standard: "all categories satisfied, obtains solution, but chooses less than optimum approach. Solution is awkward but reasonable". The ISSP awards eight points on question 420 when the applicant's answer meets the following standard: "all categories satisfied. Errors attributable to misread tables or calculating devices. Errors would be corrected by routine checking. Results reasonable, though not correct". The ISSP awards nine points on question 420 when the applicant's answer meets the following standard: "all categories satisfied, correct solution but excessively conservative in choice of working values; or presentation lacking in completeness of equations, diagrams, orderly steps in solution, etc." 
The ISSP criteria for awarding nine points as to question 420 clearly require that the Petitioner calculate the correct solution without mathematical errors. The Petitioner's answer was not correct, regardless of the assumption as to the shape of the excavation, since he made a mathematical error. The ISSP criteria for awarding eight points as to question 420 allow Petitioner to calculate the answer with mathematical errors, with the requirement that the results be reasonable. Petitioner made a mathematical error, although his result was reasonable. His answer therefore fits the ISSP criteria for the award of eight points. Petitioner received six points for his answer to question 420, whereas he is entitled to an award of eight points.
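As an editorial illustration only, and assuming the two additional points on question 420 are added directly to the previously reported raw score, the effect of the regrade can be verified with one line of arithmetic:

```python
# Effect of raising the award on question 420 from six points to eight points,
# assuming the two additional points are added directly to the reported raw score.
raw_score = 45       # raw score originally reported to Petitioner
passing_score = 48   # minimum raw score stipulated by the parties
adjusted_score = raw_score + (8 - 6)
print(adjusted_score, adjusted_score >= passing_score)  # 47 False -- still below a passing score
```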

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that: Respondent enter a Final Order determining that Petitioner failed the principles and practice portion of the April, 1987 engineering examination. RECOMMENDED this 30th day of June 1988, in Tallahassee, Florida. JAMES E. BRADWELL Hearing Officer Division of Administrative Hearings The Oakland Building 2009 Apalachee Parkway Tallahassee, Florida 32301 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 30th day of June, 1988. COPIES FURNISHED: Glen E. Wichinsky, Esquire 900 Glades Road, 5th Floor Boca Raton, Florida 33431 Michael A. Mone', Esquire Department of Professional Regulation 130 North Monroe Street Tallahassee, Florida 32399-0750 Allen R. Smith, Jr. Executive Director Department of Professional Regulation, Board of Professional Engineers 130 North Monroe Street Tallahassee, Florida 32399-0750 William O'Neil, Esquire General Counsel Department of Professional Regulation 130 North Monroe Street Tallahassee, Florida 32399-0750

Florida Laws (3) 120.57, 471.013, 471.015
# 9
CARLOS MARTINEZ MALLEN vs BOARD OF PROFESSIONAL ENGINEERS, 89-005973 (1989)
Division of Administrative Hearings, Florida Filed:Miami Beach, Florida Nov. 01, 1989 Number: 89-005973 Latest Update: Mar. 28, 1990

Findings Of Fact Petitioner, Carlos Martinez Mallen, is an applicant for licensure by endorsement to become a professional engineer in the State of Florida. He filed his application for licensure with the Florida Board of Professional Engineers (hereinafter "Board") in January 1988, relying on the facts that he was licensed in Spain approximately 25 years ago and has approximately 30 years of experience as a professional engineer. The Board subsequently determined that he could not be considered for licensure by endorsement. Petitioner has never taken a licensing examination in the United States which is substantially equivalent to the examination required for licensure by Section 471.013, Florida Statutes, and described in Chapter 21H, Florida Administrative Code. Further, Petitioner has never been licensed in any state or territory of the United States, although he does hold a license to practice engineering in Spain. On the other hand, Petitioner's engineering experience record shows that he has considerable experience in the practice of engineering which would meet the additional experience requirements of Section 471.013, Florida Statutes. The Board, having determined that Petitioner does not qualify for licensure by endorsement, performed an analysis of Petitioner's application to determine whether his degree from the University of Madrid was an engineering degree which might qualify him to sit for the licensure examination and to ascertain if Petitioner could obtain licensure by that alternative method. An analysis was made by the Board's Education Advisory Committee to determine whether the curriculum for Petitioner's degree from the University of Madrid met the requirements of Rule 21H-20.006, Florida Administrative Code. This analysis was specifically directed to determine whether Petitioner's curriculum conformed to the criteria for accrediting engineering programs set forth by the Engineering Accreditation Commission of the Accreditation Board of Engineering and Technology, Inc. (hereinafter "ABET"). The analysis of Petitioner's degree shows that, when compared with ABET criteria, Petitioner's engineering education was deficient by four semester hours in mathematics and included no courses in engineering design, sixteen semester hours of which are required by ABET criteria. Further, Petitioner's education included no computer application of engineering design programs, a requirement mandated by ABET standards. Petitioner has never taken any of these courses subsequent to receiving his degree in Spain. Petitioner's degree, rather than being an engineering degree, is the equivalent of a bachelor's degree in chemistry. Petitioner's degree is significantly deficient in required course areas, so that it does not meet the Board's criteria. Petitioner thus cannot be considered as an applicant for examination since, in order to sit for the professional engineer examination in the State of Florida, one must have an engineering degree which meets standards acceptable to the Board. Finally, Petitioner's background was reviewed to determine whether he could be considered for licensure under a different provision for licensure by endorsement. Petitioner has never held a professional engineer registration or license from another State of the United States. The Board has never interpreted the word "state" found in the statutes and rules regulating the licensure of professional engineers in Florida to include foreign countries. Petitioner is not a graduate of the State University System. 
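As an editorial illustration only, the Education Advisory Committee's comparison of the Madrid curriculum against the ABET criteria reduces to a simple shortfall check. The sketch below uses only the deficiencies stated in the findings; the variable names are hypothetical and form no part of the record.

```python
# Deficiencies identified when Petitioner's curriculum was measured against ABET criteria
# (hypothetical representation; the figures are those stated in the findings above).
math_hours_short = 4               # four semester hours deficient in mathematics
design_hours_short = 16            # sixteen required semester hours of engineering design, none completed
has_computer_design_work = False   # ABET-mandated computer application of design programs was absent

meets_education_criteria = (
    math_hours_short == 0 and design_hours_short == 0 and has_computer_design_work
)
print(meets_education_criteria)  # False -- the degree does not satisfy the Board's education criteria
```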
Petitioner did not notify the Department before July 1, 1984, that he was engaged in engineering work on July 1, 1981, and wished to take advantage of a temporary educational waiver. As a result of the Board's review of all avenues to licensure available to Petitioner, the Board denied Petitioner's application both to sit for the examination to become a professional engineer and to be licensed by endorsement, unless and until he meets the educational requirements to sit for the professional engineer examination.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that a Final Order be entered denying Petitioner's application for licensure by endorsement and further finding that Petitioner's educational background does not meet the requirements necessary to take the examination to become licensed in the State of Florida. DONE AND ENTERED in Tallahassee, Leon County, Florida, this 28th day of March, 1990. LINDA M. RIGOT Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 28th day of March, 1990. APPENDIX TO RECOMMENDED ORDER, CASE NO. 89-5973 Petitioner's proposed paragraphs numbered 0.00, .10, .20, .30, .40, .50, 1.10, 1.20, 2.20, 3.10, 3.20, 3.40, 3.60, 4.10, 4.11, 4.13, 5.00, 5.30, 5.40, 5.41, 5.50, 5.51, 5.52, 6.00, 6.10, 6.20, 6.21, 6.22, 6.23, 6.24, 6.25, 6.26, 7.00, 7.40, and 7.50 have been rejected as not constituting findings of fact but rather as constituting argument or conclusions of law. Petitioner's proposed paragraphs numbered 1.21, 3.00, 4.00, 7.10, 7.20, 7.30, 7.41, 7.42, and 7.43 have been rejected as being contrary to the weight of the evidence in this cause. Petitioner's proposed paragraphs numbered 1.22 and 2.10 have been adopted either verbatim or in substance in this Recommended Order. Petitioner's proposed paragraphs numbered 3.30, 3.50, 3.70, 4.12, 4.20, 5.10, 5.11, and 5.20 have been rejected as being irrelevant to the issues involved in this proceeding. Respondent's proposed findings of fact numbered 1-8 have been adopted either verbatim or in substance in this Recommended Order. COPIES FURNISHED: John J. Rimes, III, Esquire Office of Attorney General Department of Legal Affairs The Capitol Tallahassee, Florida 32399-1050 Carlos Martinez Mallen 33C Venetian Way #66 Miami Beach, Florida 33139 Kenneth E. Easley, General Counsel Department of Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792 Rex Smith, Executive Director Department of Professional Regulation Board of Professional Engineers 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (9) 120.57, 471.005, 471.013, 471.015, 6.10, 7.20, 7.41, 7.43, 7.50