MARK W. NELSON vs FLORIDA ENGINEERS MANAGEMENT CORPORATION, 98-005321 (1998)
Division of Administrative Hearings, Florida Filed:Gainesville, Florida Dec. 07, 1998 Number: 98-005321 Latest Update: Jul. 09, 1999

The Issue Whether Petitioner is entitled to additional credit for his responses to question numbers 21 and 24 of the Principles and Practice of Engineering Examination administered in April 1998.

Findings Of Fact Petitioner took the April 24, 1998 professional engineering licensing examination with an emphasis in civil engineering. A score of 70 is required to pass the test. Petitioner obtained a score of 69. To achieve a score of 70, Petitioner needs a raw score of 48; therefore, Petitioner needs at least one additional raw-score point. Petitioner is challenging question numbers 21 and 24, both of which are multiple-choice questions worth one point each. Exhibit 10 contains a diagram for the candidate's use in answering question numbers 21 and 24. Question 21 requires the examinee to calculate the percentage of wooded land on the diagram. The diagram contains a rectangle labeled "woodlot," and within the rectangle are three non-contiguous areas marked with schematics of trees. Petitioner reduced the percentage of wooded area to conform to only those portions of the "woodlot" marked with tree schematics. Regarding question number 21, Petitioner asserts that, as a matter of drafting convention, because trees were not drawn throughout the woodlot, one may assume that trees exist only where a tree schematic appears. Petitioner's challenge was rejected on the basis of the scorer's opinion that it is standard practice for drawings to be only partially filled with details, and that the most reasonable interpretation of the site plan drawing is that the woodlot fills the entire area enclosed by the rectangle. John Howath, a professional engineer, testified regarding accepted conventions in engineering drawings. In Howath's opinion, the drawing on the examination used inconsistent methodologies and was confusing as to whether all of the area designated by the "woodlot" label or "call out" was in fact wooded. Both Petitioner and Mr. Howath referred to drawings in the Civil Engineering Reference Manual that showed areas on drawings totally covered with visual indications of a particular material or condition.
Peter Sushinsky, a professional engineer, testified as an expert for the Respondent. Mr. Sushinsky acknowledged Petitioner's exhibits; however, he noted that these were only a few examples of the drawings that are available. Mr. Sushinsky referenced construction drawings he had seen in his practice with partial "cross-hatching" just like the diagram on the examination. In sum, Mr. Sushinsky's experience was that a diagram might be totally or partially "cross-hatched." In his opinion it was not a bad diagram, merely one subject to a different interpretation by a small minority of examinees. Question number 24 asked the candidate to calculate the weir peak discharge from the catchment area using the rational formula. Petitioner asserts the question is misleading and should read, "What is the peak discharge from the watershed?" Petitioner bases his assertion on the ground that the rational formula is used to compute discharge from a watershed, not from a weir as the question directs. The scorer did not address Petitioner's concerns, stating only, "It is clear from the item statement that the weir equation is not to be used." However, the question asks the candidate to compute the weir discharge. Jennifer Jacobs, a professor of engineering, testified that the rational formula is used to calculate watershed discharge, not weir discharge. All of the experts agreed that the question was confusing because the rational formula is not used to calculate the discharge from a weir. The Respondent's expert justified the answer deemed correct on the basis that if one uses the rational formula and computes the watershed discharge, one of the answers provided is close to the result. The Respondent's expert calculated the watershed discharge as 230.6 cubic feet per second (cfs); the answer deemed correct was 232 cfs. The expert stated that the weir attenuates flow.
If the weir attenuates flow, one would expect an answer less than 230.6 cfs, not an answer equal to or greater than 230.6 cfs. The amount of attenuation is based upon the physical features of the impoundment area and the mouth of the weir; weir attenuation varies. The only answers smaller than 230.6 are 200 and 32. Is the 232 cfs answer wrong because it does not allow for attenuation by the weir? How much did the weir attenuate the flow? Under these facts, the question is capricious. The Respondent argues that Petitioner did not follow instructions, while acknowledging that the "correct" answer is not the answer to the question that was asked.
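The rational formula that the experts described takes the form Q = C·i·A (a dimensionless runoff coefficient, rainfall intensity in inches per hour, and drainage area in acres), which yields discharge directly in cfs because one acre-inch per hour is very nearly one cfs. A minimal sketch, using hypothetical inputs rather than the actual exam figures, which are not in the record:

```python
def rational_formula_q(c: float, i_in_per_hr: float, area_acres: float) -> float:
    """Peak watershed discharge Q (cfs) via the rational formula Q = C*i*A.

    Relies on the near-identity 1 acre-in/hr ~= 1.008 cfs, so the unit
    conversion factor is customarily taken as 1.0.
    """
    return c * i_in_per_hr * area_acres

# Hypothetical inputs (NOT the exam's actual values): a mostly
# impervious 90-acre catchment under a 3.2 in/hr design storm.
q = rational_formula_q(c=0.8, i_in_per_hr=3.2, area_acres=90.0)
print(round(q, 1))  # -> 230.4 cfs
```

Note that nothing in the formula models an outlet structure; as the experts testified, attenuation by a weir would have to be computed separately from the impoundment and weir geometry.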

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law set forth herein, it is RECOMMENDED: That the Respondent enter a final order awarding Petitioner two raw points and a passing score on the Principles and Practice of Engineering Examination. DONE AND ENTERED this 20th day of May, 1999, in Tallahassee, Leon County, Florida. STEPHEN F. DEAN Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 20th day of May, 1999. COPIES FURNISHED: Mark W. Nelson 720 Northwest 31st Avenue Gainesville, Florida 32609 Natalie A. Lowe, Esquire Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301 Dennis Barton, Executive Director Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301 William Woodyard, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (1) 120.57
KENNETH E. MARSHALL vs CONSTRUCTION INDUSTRY LICENSING BOARD, 97-002368 (1997)
Division of Administrative Hearings, Florida Filed:Fort Lauderdale, Florida May 16, 1997 Number: 97-002368 Latest Update: Jul. 15, 2004

The Issue Whether Petitioner is entitled to additional credit for his responses to Questions 23 and 27 of the Contract Administration section of the General Contractor licensure examination administered in July 1996, and, if so, whether the additional credit would give him a passing grade. Whether Petitioner is entitled to additional credit for his responses to Questions 11, 23, and 35 of the Contract Administration section of the General Contractor licensure examination administered in April 1997 and, if so, whether the additional credit would give him a passing grade.

Findings Of Fact Petitioner took the Contract Administration section of the General Contractor’s licensure examination in July 1996 and in April 1997. Between the two exams, Petitioner passed all sections of the examination except the Contract Administration section. Petitioner’s score on the Contract Administration section of the July 1996 examination, as graded by Respondent’s Bureau of Testing, was 65. His score on the Contract Administration section of the April 1997 examination was 67.5. For both examinations, there were 40 questions on the Contract Administration section. A candidate had to achieve a score of 70 to pass that section of the examination. Because each question was equally weighted, a candidate had to correctly answer 28 questions to earn the passing score. All questions challenged by Petitioner were multiple-choice questions in which the candidate was instructed to give the best answer from four possible choices. Prior to the examinations, the candidates were given a list of approved reference materials, and they were permitted to refer to those reference materials while taking the examinations. Petitioner’s score of 65 on the July 1996 examination was based on the Bureau of Testing’s determination that Petitioner correctly answered 26 of the 40 questions. To earn a passing grade on the Contract Administration section of the July 1996 examination, Petitioner would have to receive credit for correctly answering two additional questions. His score of 67.5 on the April 1997 examination was based on the determination that he correctly answered 27 of the 40 questions. To earn a passing grade on the Contract Administration section of the April 1997 examination, Petitioner would have to receive credit for correctly answering one additional question. QUESTION 23 OF THE JULY 1996 EXAM The correct answer for Question 23 of the July examination is choice “D.” Of the four possible responses, choice “D” is the best answer to the question.
Petitioner’s answer to this question was choice “A.” Petitioner did not receive credit for his response to this question because he did not select the best answer. The answer selected by Petitioner would not be the most accurate and cost-effective because the methodology he selected would not detect errors made by the first person performing the computations. The challenged question is a question that a candidate for licensure should be able to answer. The challenged question is not beyond the scope of knowledge that a candidate for licensure should have. The challenged question is not ambiguous. Petitioner is not entitled to additional credit for his response to Question 23 of the July 1996 exam. QUESTION 27 OF THE JULY 1996 EXAM The correct answer for Question 27 of the July examination is choice “C.” This correct answer is supported by reference materials made available to all candidates. Petitioner’s answer to this question was choice “B.” Petitioner did not receive credit for his response to this question because he did not select the correct answer to the question. The challenged question is a question that a candidate for licensure should be able to answer. The challenged question is not beyond the scope of knowledge that a candidate for licensure should have. The challenged question is not ambiguous. Petitioner is not entitled to additional credit for his response to Question 27 of the July 1996 exam. QUESTION 11 OF THE APRIL 1997 EXAM The correct answer for Question 11 of the April 1997 examination is choice “C.” This correct answer is supported by reference materials made available to all candidates. Petitioner’s answer to this question was choice “D.” Petitioner did not receive credit for his response to this question because he did not select the correct answer to the question. The challenged question is a question that a candidate for licensure should be able to answer. 
The challenged question is not beyond the scope of knowledge that a candidate for licensure should have. The challenged question is not ambiguous. Petitioner is not entitled to additional credit for his response to Question 11 of the April 1997 exam. QUESTION 23 OF THE APRIL 1997 EXAM The best answer for Question 23 of the April 1997 examination is choice “C.” This correct answer is supported by reference materials made available to all candidates. Petitioner’s answer to this question was choice “A.” While there is some support in the reference material for Petitioner's answer, the greater weight of the evidence established that his choice was not the best answer. Petitioner did not receive credit for his response to this question because he did not select the best answer to the question. The challenged question is a question that a candidate for licensure should be able to answer. The challenged question is not beyond the scope of knowledge that a candidate for licensure should have. The challenged question is not ambiguous. Petitioner is not entitled to additional credit for his response to Question 23 of the April 1997 exam. QUESTION 35 OF THE APRIL 1997 EXAM The correct answer for Question 35 of the April 1997 examination is choice “C.” This correct answer is supported by reference materials made available to all candidates. Petitioner’s answer to this question was choice “D.” Petitioner did not receive credit for his response to this question because he did not select the correct answer to the question. The challenged question is a question that a candidate for licensure should be able to answer. The challenged question is not beyond the scope of knowledge that a candidate for licensure should have. The challenged question is not ambiguous. Petitioner is not entitled to additional credit for his response to Question 35 of the April 1997 exam.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that Respondent enter a Final Order that dismisses the challenges brought by Petitioner to Questions 23 and 27 on the July 1996 exam and to Questions 11, 23, and 35 of the April 1997 exam. DONE AND ENTERED this 3rd day of December, 1997, in Tallahassee, Leon County, Florida. CLAUDE B. ARRINGTON Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (904) 488-9675 SUNCOM 278-9675 Fax Filing (904) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 3rd day of December, 1997. COPIES FURNISHED: R. Beth Atchison, Esquire Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0750 Mr. Kenneth Marshall 624 Southwest 11th Court Fort Lauderdale, Florida 33315 John Preston Seiler, Esquire 2900 East Oakland Park Boulevard, No. 200 Fort Lauderdale, Florida 33306 Lynda L. Goodgame, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792 Rodney Hurst, Executive Director Construction Industry Licensing Board 7960 Arlington Expressway, Suite 300 Jacksonville, Florida 32211-7467

Florida Laws (2) 120.57, 489.113
CURTIS LORD vs BOARD OF PROFESSIONAL ENGINEERS, 90-007502 (1990)
Division of Administrative Hearings, Florida Filed:Fort Lauderdale, Florida Nov. 28, 1990 Number: 90-007502 Latest Update: Mar. 14, 1991

The Issue The issue presented is whether Mr. Lord should be granted additional credit for his answer to question number 144 on the April 1990 Professional Engineer licensure examination.

Findings Of Fact Mr. Lord (Candidate #301402) received a score of 66.3 percent on the April 1990 Principles and Practice portion of the Professional Engineer examination. A minimum passing score was 70.0 percent. Mr. Lord challenged the scoring of his response to question number 144. Question number 144 is an essay question involving an assembly-line problem in which four separate stations are used to assemble a product in sequence. A fifth station, capable of performing all operations, can assist in maximizing the number of finished products produced per hour. The correct answer to question number 144 was 100 products per hour, while Mr. Lord's answer was 25 pieces per hour. Mr. Lord received a score of 2 (out of a possible 10) points on question number 144, based on the scoring plan developed for the exam by the National Council of Examiners for Engineering and Surveying (NCEES). Mr. Lord used a method of averaging station assembly times to determine the maximum average number of products each station could produce. The averaging method gave a solution that did not identify the central issue presented by the essay question: identifying and eliminating the bottlenecks in production. Mr. Lord also assumed that the initial four stations could do all operations, thus defining the model inaccurately. This misreading allowed Mr. Lord to use an averaging methodology. Mr. Granata, the Department's expert, testified that it is a coincidence of the numbers that multiplying Mr. Lord's answer (25) by four (the initial number of machines) yields the Board's answer (100). Mr. Greenbaum, Petitioner's expert witness, testified that Petitioner's answer is "unique" and that he, as an expert, would have answered the question using a methodology similar to the one developed by the Department's expert, Mr. Granata, and by the NCEES.
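The grading dispute turns on serial-line throughput: a sequential assembly line can produce no faster than its slowest station, so the accepted method finds the bottleneck, while averaging station times overstates output whenever the stations differ. A minimal sketch contrasting the two approaches, with hypothetical station times since the exam's actual figures are not in the record:

```python
def line_throughput_per_hour(station_seconds: list[float]) -> float:
    """Throughput of a serial assembly line: limited by the slowest
    (bottleneck) station, not by the average station time."""
    return 3600.0 / max(station_seconds)

def averaged_throughput_per_hour(station_seconds: list[float]) -> float:
    """The flawed 'averaging' approach: treats the line as if every
    station worked at the mean rate, overstating throughput whenever
    station times differ."""
    avg = sum(station_seconds) / len(station_seconds)
    return 3600.0 / avg

# Hypothetical four-station line (seconds per unit at each station).
times = [30.0, 45.0, 36.0, 40.0]
print(line_throughput_per_hour(times))      # 80.0 units/hr (bottleneck = 45 s)
print(averaged_throughput_per_hour(times))  # ~95.4 units/hr -- overstated
```

The exam's floating fifth station would be deployed to relieve whichever station is the bottleneck, which is precisely the insight the averaging method misses.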

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is recommended that the challenge to the grading of Mr. Lord's response to question 144 on the April 1990 Professional Engineer licensure examination be dismissed. RECOMMENDED this 14th day of March, 1991, at Tallahassee, Florida. WILLIAM R. DORSEY, JR. Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 14th day of March, 1991. COPIES FURNISHED: William F. Whitson, Law Clerk Department of Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792 Curtis Lord 1416A Old Lystra Road Chapel Hill, NC 27514 Rex Smith, Executive Director Department of Professional Regulation Board of Professional Engineers 1940 North Monroe Street Tallahassee, Florida 32399-0792 Jack McRay, General Counsel Department of Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (1) 120.57
CHRISTOPHER NATHANIEL LOVETT vs DEPARTMENT OF BUSINESS AND PROFESSIONAL REGULATION, BOARD OF PROFESSIONAL ENGINEERS, 03-004013RP (2003)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Oct. 29, 2003 Number: 03-004013RP Latest Update: May 26, 2005

The Issue The ultimate issue in this proceeding is whether proposed Florida Administrative Code Rule 61G15-21 is an invalid exercise of delegated legislative authority.

Findings Of Fact Florida Administrative Code Rule 61G15-21.004, in relevant part, states: The criteria for determining the minimum score necessary for passing the Engineering Fundamentals Examination shall be developed through the collective judgment of qualified experts appointed by NCEES to set the raw score that represents the minimum amount of knowledge necessary to pass the examination. The judges shall use a Modified Angoff Method in determining the minimally acceptable raw score necessary to pass the Fundamentals of Engineering Examination. Using the above-mentioned Modified Angoff Method, the judges will indicate the probability that a minimally knowledgeable Fundamentals of Engineering examinee would answer any specific question correctly. The probability of a correct response is then assigned to each question. Each judge will then make an estimate of the percentage of minimally knowledgeable examinees who would know the answer to each question. The totals for each of the judges are added together and divided by the number of judges to determine the overall estimate of the minimum standards necessary. The minimum number of correct answers required to achieve a passing score will take into account the relative difficulty of each examination through scaling and equating each examination to the base examination. The raw score necessary to show competence shall be deemed to be a 70 on a scale of 100. A passing grade on Part Two of the examination is defined as a grade of 70 or better. The grades are determined by a group of knowledgeable professional engineers who are familiar with engineering practice and with what is required for an applicable engineering task. These professional engineers will establish a minimum passing score on each individual test item (i.e., examination problem).
An Item Specific Scoring Plan (ISSP) will be prepared for each examination item based upon the NCEES standard scoring plan outline form. An ISSP will be developed by persons who are familiar with each discipline, including the item author, the item scorer, and other NCEES experts. On a scale of 0-10, six (6) will be a minimum passing standard, and scores between six (6) and ten (10) will be considered to be passing scores for each examination item. A score of five (5) or lower will be considered an unsatisfactory score for that item, and the examinee will be considered to have failed that item. To pass, an examinee must average six (6) or greater on his/her choice of eight (8) exam items; that is, the raw score must be forty-eight (48) or greater based on a scale of eighty (80). This raw score is then converted to a base 100 on which, as is noted above, a passing grade will be seventy (70). The proposed changes to Florida Administrative Code Rule 61G15-21.004, in relevant part, state: The passing grade for the Engineering Fundamentals Examination is 70 or better. The criteria for determining the minimum score necessary for passing the Engineering Fundamentals Examination shall be developed through the collective judgment of qualified experts appointed by NCEES to set the raw score that represents the minimum amount of knowledge necessary to pass the examination. The judges shall use a Modified Angoff Method in determining the minimally acceptable raw score necessary to pass the Fundamentals of Engineering Examination. Using the above-mentioned Modified Angoff Method, the judges will indicate the probability that a minimally knowledgeable Fundamentals of Engineering examinee would answer any specific question correctly. The probability of a correct response is then assigned to each question. Each judge will then make an estimate of the percentage of minimally knowledgeable examinees who would know the answer to each question.
The totals for each of the judges are added together and divided by the number of judges to determine the overall estimate of the minimum standards necessary. The minimum number of correct answers required to achieve a passing score will take into account the relative difficulty of each examination through scaling and equating each examination to the base examination. The raw score necessary to show competence shall be deemed to be a 70 on a scale of 100. The passing grade for the Principles and Practice Examination is 70 or better. A passing grade on Part Two of the examination is defined as a grade of 70 or better. The grades are determined by a group of knowledgeable professional engineers who are familiar with engineering practice and with what is required for an applicable engineering task. These professional engineers will establish a minimum passing score on each individual test item (i.e., examination problem). An Item Specific Scoring Plan (ISSP) will be prepared for each examination item based upon the NCEES standard scoring plan outline form. An ISSP will be developed by persons who are familiar with each discipline, including the item author, the item scorer, and other NCEES experts. On a scale of 0-10, six (6) will be a minimum passing standard, and scores between six (6) and ten (10) will be considered to be passing scores for each examination item. A score of five (5) or lower will be considered an unsatisfactory score for that item, and the examinee will be considered to have failed that item. To pass, an examinee must average six (6) or greater on his/her choice of eight (8) exam items; that is, the raw score must be forty-eight (48) or greater based on a scale of eighty (80). This raw score is then converted to a base 100 on which, as is noted above, a passing grade will be seventy (70). Petitioner resides in Tampa, Florida.
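The pass criterion quoted above reduces to simple arithmetic: eight items scored 0-10 each, pass with an average of six or greater, i.e., a raw score of 48 or more out of 80. A minimal sketch of that check; note the rule also deems any item scored five or lower "failed," but states the overall pass threshold only as the average, so only the average is implemented here:

```python
def pe_part_two_passes(item_scores: list[int]) -> bool:
    """Pass/fail under the ISSP scheme described in the rule: the
    examinee works eight items, each scored 0-10; passing requires an
    average of six or greater, i.e., a raw score of at least 48/80.
    (The raw-to-base-100 conversion is not specified beyond 48 -> 70,
    so it is not modeled here.)"""
    if len(item_scores) != 8:
        raise ValueError("examinee chooses exactly eight exam items")
    if not all(0 <= s <= 10 for s in item_scores):
        raise ValueError("each item is scored on a 0-10 scale")
    return sum(item_scores) >= 48

print(pe_part_two_passes([6, 6, 6, 6, 6, 6, 6, 6]))  # True  (raw 48)
print(pe_part_two_passes([6, 6, 6, 6, 6, 6, 6, 5]))  # False (raw 47)
```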
On April 11, 2003, Petitioner took a national examination that Petitioner must pass to be licensed by the state as a professional engineer. On July 1, 2003, Petitioner received a letter from the Board advising Petitioner that he had received a failing grade on the examination. On July 2, 2003, Petitioner unsuccessfully requested the raw scores on his examination from a representative of the National Council of Examiners for Engineering and Surveying (NCEES). The NCEES is the national testing entity that conducts examinations and determines scores for the professional engineer examination required by the state. On July 9, 2003, Petitioner submitted a formal request to the Board for all of the raw scores related to Petitioner "and all past P.E. Exams that the Petitioner had taken." A representative of the Board denied Petitioner's request explaining that the raw scores are kept by the NCEES and "it is not their policy to release them." The Board's representative stated that the Board was in the process of adopting new rules "that were in-line with the policies of the NCEES." On July 31, 2003, Petitioner requested the Board to provide Petitioner with any statute or rule that authorized the Board to deny Petitioner's request for raw scores pursuant to Section 119.07(1)(a), Florida Statutes (2003). On the same day, counsel for the Board explained to Petitioner that the Board is not denying the request. The Board is unable to comply with the request because the Board does not have physical possession of the raw scores. Petitioner and counsel for Respondent engaged in subsequent discussions that are not material to this proceeding. On August 6, 2003, Petitioner requested counsel for Respondent to provide Petitioner with copies of the proposed rule changes that the Board intended to consider on August 8, 2003. On August 27, 2003, Petitioner filed a petition with the Board challenging existing Florida Administrative Code Rule 61G15-21.004. 
The petition alleged that parts of the existing rule are invalid. Petitioner did not file a challenge to the existing rule with DOAH. The Petition for Hearing states that it is filed pursuant to Subsections 120.56(1) and (3)(b), Florida Statutes (2003). However, the statement of how Petitioner's substantial interests are affected is limited to the proposed changes to the existing rule. During the hearing conducted on January 29, 2004, Petitioner explained that he does not assert that the existing rule is invalid. Rather, Petitioner argues that the Board deviates from the existing rule by not providing examinees with copies of their raw scores and by failing to use raw scores in the determination of whether an applicant achieved a passing grade on the exam. Petitioner further argues that the existing rule benefits Petitioner by purportedly requiring the Board to use raw scores in the determination of passing grades. The elimination of that requirement in the proposed rule arguably will adversely affect Petitioner's substantial interests. The Petition for Hearing requests several forms of relief: an order granting Petitioner access to raw scores, a determination that Petitioner has met the minimum standards required under the existing rule, and an order that the Board grant a license to Petitioner. The Petition for Hearing does not request an order determining that the proposed rule changes constitute an invalid exercise of delegated legislative authority.

Florida Laws (4) 119.07, 120.56, 120.68, 455.217
KENNETH A. CARPER vs. BOARD OF PROFESSIONAL ENGINEERS, 87-004979 (1987)
Division of Administrative Hearings, Florida Number: 87-004979 Latest Update: Feb. 29, 1988

The Issue The single issue for determination is whether Petitioner is entitled to at least three more points on his response to question #121. If not, he has failed the examination.

Findings Of Fact Kenneth A. Carper graduated summa cum laude with a bachelor's degree from the University of Central Florida. In the nine years since graduation he has worked for an engineering firm, primarily in the area of drainage design. Question #121 is the type of problem he deals with daily. The ultimate objective of the question is to determine whether the flow of an open channel with given specifications is subcritical or supercritical. The question required the computation of the channel's critical depth and normal depth. In the hypothetical situation described by the question, certain extraneous information was given; an appropriate answer required that this "red herring" be ignored. The ISSP is a standardized grading device by which a person subjectively grading a problem will consistently apply a score based upon specified types and numbers of deficiencies. The intent is to reduce the chance of over-leniency or an overly strict approach by different graders. The ISSP developed by the National Council of Engineering Examiners for question #121 provides in pertinent part:

10. QUALIFIED: All CATEGORIES satisfied, correct solution, well organized, all relevant ASPECTS fully addressed. Correct approach; numerical answers correct within rounding errors; conclusion correct; adequate written records. All parts are of equal weight (3 parts).

9. QUALIFIED: All CATEGORIES satisfied, correct solution but excessively conservative in choice of working values; or presentation lacking in completeness of equations, diagrams, orderly steps in solution, etc. All correct, as in 10 above, except for a single math/units error; or inadequate written record.

8. QUALIFIED: All CATEGORIES satisfied, errors attributable to misread table or calculating devices. Errors would be corrected by routine checking. Results reasonable, though not correct. All correct, as in 10 above, except for multiple math/units errors; or inadequate written record; or in combination.

7. QUALIFIED: All CATEGORIES satisfied. Obtains solution, but chooses less than optimum approach. Solution is awkward but reasonable. Same as 8 above, except for more gross errors; or in combination; or a single part of three parts required completely wrong or missing, with the other two parts correct.

6. QUALIFIED: All CATEGORIES satisfied, applicant demonstrates minimally adequate knowledge in all relevant ASPECTS of the item. Multiple math/units/records errors; or in combination; or one part completely missing or wrong, with other errors; or in combination. (Joint Exhibit 1)

The grader of Carper's examination did not testify, but provided notations on the answer sheet. The solution required selection of an appropriate formula, which Carper did; it also required a trial-and-error mathematical computation of the value of "y." In the first part of the question Carper found "y" to be "... between 9.2 and 9.3, say 9.3'." The grader crossed out this answer with the notation, "not an engineering answer - Finish iteration to a 'close enough' final value." The grader's answer was 9.24. In the second part of the question, Carper indicated "y" was "... between 6.8 and 7.0, say 7.0'." The grader's answer was 6.99, and similar notations were made: "not an engineering answer. Finish the iteration." It is apparent that the grader felt that the solution should be carried out to the nearest hundredth place. Yet, in a very similar question (#421), also requiring computation of normal depth, Carper's answer, 4.7', was marked "OK," and he received the full 10 points for his solution. Nothing in the instructions specifically requires a solution to the nearest hundredth; this is left to the judgment of the engineer. "Real world" engineering practice would not require a solution to the nearest hundredth place. The design of a large open channel is substantially less precise than the design of a bridge or multi-story building.
In hydraulics, the practice is often to round up, for example, from 9.8 to 10, as a conservative measure. It is also common to use estimates; for example, the roughness coefficient (the resistance of the channel walls) is a textbook figure rather than one derived from the structure itself. Given the lack of precision inherent in the formula, computation of values beyond the tenths place serves no valid purpose. The sample solution to #121 provided by the grader specifically states "ignore backwater curve." While Carper's solution does ignore the "red herring," his work sheet does not affirmatively note that he did. Respondent claims that the grader could not know whether the backwater curve was properly ignored or just overlooked. At worst, this minor deficiency constitutes an inadequate written record. The appropriate score, based on the ISSP table set forth above, is "9." Carper selected the proper formula, performed the mathematics, and arrived at answers reflecting acceptable engineering practice. The descriptions of deficiencies for scores of less than 9 do not apply to Carper's solution for this question. Respondent's expert conceded that the solution did not contain a mathematics error. In making these findings I have considered and weighed the opinions of the three experts who testified in this proceeding. Both experts presented by Petitioner were qualified, without objection, in the engineering fields of hydraulics, hydrology and water resource management. They both have over 30 years of extensive practical experience in those fields, and they both have lectured or taught in colleges and universities. The weight of their testimony is tempered by their personal knowledge of Petitioner for eight or nine years and by their knowledge of the score he needed to pass the examination. Nothing in the substance of their testimony, however, revealed a bias in favor of their colleague, and their testimony was considered candid and forthright.
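The "iteration" the grader demanded is the standard trial-and-error solution of Manning's equation for normal depth, which has no closed form: one guesses a depth, computes the discharge it would carry, and narrows the bracket until the depth is resolved to the desired precision. A minimal bisection sketch, using hypothetical channel data rather than the exam's:

```python
import math

def manning_q(y: float, b: float, n: float, s: float) -> float:
    """Discharge (cfs) in a rectangular channel of width b (ft) at depth
    y (ft), via Manning's equation with the US customary constant 1.49:
    Q = (1.49/n) * A * R^(2/3) * sqrt(S)."""
    area = b * y
    wetted_perimeter = b + 2.0 * y
    r = area / wetted_perimeter  # hydraulic radius
    return (1.49 / n) * area * r ** (2.0 / 3.0) * math.sqrt(s)

def normal_depth(q_target: float, b: float, n: float, s: float,
                 tol: float = 0.005) -> float:
    """Trial-and-error ('iteration') solution for normal depth by
    bisection: Q grows monotonically with depth, so bracket the root
    and halve until the depth is known to within tol feet."""
    lo, hi = 1e-6, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if manning_q(mid, b, n, s) < q_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical channel (NOT the exam's data): a 20-ft-wide rectangular
# channel, n = 0.015, slope 0.0005, carrying 1000 cfs.
y_n = normal_depth(1000.0, b=20.0, n=0.015, s=0.0005)
print(round(y_n, 2))  # normal depth in ft, resolved to the hundredth
```

Whether to stop this loop at the tenths or the hundredths place is exactly the judgment call the experts disputed; the tolerance is an input, not something the method dictates.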
They would have scored #121 as "9" or "10". Respondent's expert, a consulting engineer employed as an Associate Professor in the University of Florida Civil Engineering Department, did not know Carper, nor was he advised of the score he would need to pass. He would have given Carper a "6" or "7" on question #121, but more likely a 7, based on Carper's failure to carry his answer to "three significant figures." This opinion was not adequately explained in terms of acceptable engineering practice, but rather was based on acceptance of the test grader's judgment. (Joint Exhibit #2, Deposition, p. 29) Respondent's expert was less qualified than Petitioner's experts. His primary experience as a consulting engineer has been in review of the work of others, rather than active design.

Recommendation Based upon the foregoing, it is hereby RECOMMENDED: That a Final Order be entered, awarding Kenneth Carper 9 points for question #121, thereby providing a passing grade for the engineering examination. DONE and RECOMMENDED this 29th day of February, 1988, in Tallahassee, Florida. MARY CLARK Hearing Officer Division of Administrative Hearings The Oakland Building 2009 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 29th day of February, 1988. APPENDIX TO RECOMMENDED ORDER, CASE NO. 87-4979 The following constitute my rulings on the findings of fact proposed by the parties: Petitioner 1-5. Addressed in Background. 6-7. Adopted in paragraph #11. 8. Addressed in Background. Respondent Addressed in Background. Adopted in substance in paragraph #3. Adopted in paragraph #10. Adopted in substance in paragraph #10. Adopted in paragraph #9. Adopted in substance in paragraph #5. Rejected as unsubstantiated speculation. COPIES FURNISHED: Brian E. Currie, Esquire SANDERS, McEWAN, MIMS & MARTINEZ, P.A 108 East Central Boulevard Post Office Box 753 Orlando, Florida 32802-0753 H. Reynolds Sampson, Esquire Department of Professional Regulation 130 North Monroe Street Tallahassee, Florida 32399-0750 Allen R. Smith, Jr. Executive Director Board of Professional Engineers Department of Professional Regulation 130 North Monroe Street Tallahassee, Florida 32399-0750 William O'Neal, Esquire Department of Professional Regulation 130 North Monroe Street Tallahassee, Florida 32399-0750

Florida Laws (1) 120.57
# 6
MAGDALENA COSTIN vs FLORIDA ENGINEERS MANAGEMENT CORPORATION, 98-002584 (1998)
Division of Administrative Hearings, Florida Filed:Jacksonville, Florida Jun. 05, 1998 Number: 98-002584 Latest Update: Feb. 23, 1999

The Issue The issue to be resolved is whether Petitioner is entitled to additional credit for her response to question nos. 122 and 222 of the civil engineering examination administered on October 31, 1997.

Findings Of Fact On October 31, 1997, Petitioner took the civil professional engineering licensing examination. A score of 70 is required to pass the test. Petitioner obtained a score of 69. Petitioner challenged the scoring of question nos. 122 and 222. As part of the examination challenge process, Petitioner's examination was returned to the National Council of Examiners for Engineering and Surveying where it was re-scored. In the re-score process, the grader deducted points from Petitioner's original score. Petitioner was given the same raw score of 6 on question number 122; however, on question number 222 her raw score of 4 was reduced to a 2. Petitioner needed a raw score of 48 in order to achieve a passing score of 70; she needed at least three additional raw score points to obtain a passing raw score of 48. Petitioner is entitled to a score of 6 on problem number 122. The solution and scoring plan for that problem required the candidate to obtain a culvert size in the range of 21-36 inches. The Petitioner incorrectly answered 3.1 feet or 37.2 inches. She is not entitled to additional credit for problem number 122 because she answered the question with the wrong size culvert. Problem number 122 required the candidate to use a predevelopment peak flow of 40 cubic feet per second (cfs). Petitioner used 58.33 cfs. She chose the maximum flow rather than the predevelopment peak flow. In solving problem number 122, Petitioner chose a design headwater depth of 4.8 feet. The correct solution required a design headwater depth of 5.7 feet. Petitioner made another mistake in problem number 122; she failed to check the water depth in the downstream swale. Petitioner concedes she was given sufficient information to solve problem number 122. She understood what the question was asking of her. She admits that she did not compute the critical depth of the water and that she did not complete the solution. Question number 222 had three parts. 
The candidate was required to determine the footing size, to select the reinforcing steel, and to provide a sketch for a concrete column located along the edge of a building. Petitioner understood the question and was provided enough information to solve the problem. Petitioner correctly checked the footing size as required by the first part; however, she did not select the reinforcing steel or show the required sketch. Therefore, Petitioner did not complete enough of the problem to qualify for a score of 4 points. She is entitled to a score of 2 points. The examination questions at issue here were properly designed to test the candidate's competency in solving typical problems in real life. The grader (re-scorer) utilized the scoring plan correctly. Petitioner has been in the United States for approximately eleven years. She lived in Romania before she came to the United States. In Romania, Petitioner used only the metric system in her professional work. While she has used the English system since moving to the United States, Petitioner is more familiar with the metric system. The Principles and Practice examination is an open-book examination. Petitioner took a book entitled the Fundamentals of Engineering Reference Handbook to the examination. When the proctor examined her books, she told the Petitioner she was not permitted to keep the handbook. The proctor took the handbook from the Petitioner. Petitioner protested the confiscation of her reference book because she had used the same book in two previous tests. About ten minutes later, the proctor's supervisor returned the book to Petitioner. Petitioner's book was returned at least ten minutes before the test began. She was permitted to use the book during the test. There is no persuasive evidence that the proctor's mistake in temporarily removing Petitioner's reference book caused her to be so upset that she failed the test. 
Candidates were not permitted to study their books prior to the beginning of the examination. Petitioner may have been nervous when the test began. However, Petitioner received a perfect score of ten points on the first problem she worked, problem number 121.

Recommendation Based upon the findings of fact and conclusions of law, it is RECOMMENDED that the Board of Professional Engineers enter a Final Order confirming Petitioner's score on the examination and dismissing the Petitioner's challenge. DONE AND ENTERED this 13th day of January, 1999, in Tallahassee, Leon County, Florida. SUZANNE F. HOOD Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 13th day of January, 1999. COPIES FURNISHED: Natalie A. Lowe, Esquire Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301 William Bruce Muench, Esquire 438 East Monroe Street Jacksonville, Florida 32202 Lynda L. Goodgame, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792 Dennis Bartin, President Florida Engineers Management Corporation 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (1) 120.57
# 7
ALAN K. GARMAN vs BOARD OF PROFESSIONAL ENGINEERS, 90-005728 (1990)
Division of Administrative Hearings, Florida Filed:Brooksville, Florida Sep. 10, 1990 Number: 90-005728 Latest Update: Mar. 27, 1991

The Issue The issues presented are: (1) whether Respondent wrongfully excluded the Candidate/Petitioner's reference materials from the April 19, 1990 engineering examination, and if so, (2) whether the Candidate/Petitioner received a failing grade because the materials were wrongfully excluded.

Findings Of Fact The Petitioner (#100021) received a score of 69.0 on the Professional Engineer Fundamentals Examination given April 19, 1990. The minimum passing score was 70.0 on the examination, which is written by the National Council of Engineering Examiners and graded by Educational Testing Service. (Transcript Pages 36 and 39) Prior to the April 1990 examination, the Board sent each candidate a letter, dated December 18, 1989 (Exhibit P-1) (Transcript Pages 9 and 12), which said, "No review publications directed principally toward sample questions and their solutions of engineering problems are permitted in the examination room." (Transcript Page 31). The candidates were also provided with a "Candidate Information Booklet" dated January 1990 (Exhibit R-1, Transcript Page 77). The booklet states on page 14, "No books with contents directed toward sample questions or solutions of engineering problems are permitted in the examination room." (Transcript Pages 77 and 96). Petitioner, who also took the October 1989 examination, had received notice at that examination that the Board of Engineers intended to change the procedure allowing reference materials in the examination. (Transcript Page 89 and Respondent's Exhibit 2.) The Board of Professional Engineers advised the examination supervisor and proctors that no engineering "review" materials would be allowed in the examination, although engineering "reference" materials could be brought into and used for the examination. However, the books that were excluded included books without "review" in the title, books with "reference" in the title, and books which contained problems and solutions. Before the examination began, Deena Clark, an examination supervisor, read over a loudspeaker system the names of books that would not be permitted (Transcript Page 81). Practice examination and solution manuals were not allowed for use by engineering candidates (Transcript Pages 93 and 94).
Schaum's outlines and other materials were also excluded (Transcript Page 91). Also excluded was Lindeburg's 6th edition, "Engineer-In-Training Review Manual." (Transcript Pages 16 and 79). This decision was verified by the Board before the examination began (Transcript Page 81). After the examination had begun, Ms. Clark announced that the candidates could put certain copyrighted materials, which had been excluded earlier, in a three-ring binder and use them (Transcript Page 85). This was in response to candidates who needed economics tables for the examination. However, no time was provided for the candidates to prepare these references, and only one minute was added to the examination time. (Transcript Page 85). Petitioner did not bring any economics tables to the examination site except those contained in books which were not allowed in the examination. (Transcript Page 19). Petitioner did not remove the economics tables and permitted references from Lindeburg's review manual until lunch, and these tables were not available to him during the morning examination. (Transcript Pages 22 and 88). Of the six engineering economics questions on the morning portion of the examination, the candidate correctly answered four. No data was provided on the nature of these questions. The Candidate correctly answered 53 questions in the morning (weighted x 1) and 23 questions in the afternoon (weighted x 2) for a total of 99 weighted required points. He answered eight questions correctly in the "addition" portion of the examination. The table for eight additional correct questions in the "Scoring Information Booklet" used in determining the candidate's final grade shows the adjusted equated score was 126 and his scaled score was 69. (Page 21 of booklet). The value of each economics question, converted to the final scoring scale, was such that correctly answering one more economics question would have resulted in passing the examination.
The exclusion of certain materials from the examination was arbitrary and capricious and was done by a few individuals without any stated objective standard published by the Board. Further, the Board knew before the examination which books were to be excluded and could have notified examinees of the exact items to be excluded. The Board's generally poor handling of this matter is exemplified by announcing, after the examination had begun, that items previously excluded could be used if placed in a ring binder, while not allowing any time to prepare such materials. (Tx. pgs. 74-80, 84-86, and 91-97) The Petitioner would have used several of the excluded tables if the announcement had been made before the morning examination began, with time to put the items in acceptable form. After notifications in October 1989, December 1989, and January 1990, Petitioner admitted that he did not call the Board of Professional Engineers to ask for guidance on books that would not be allowed on the April 1990 examination (Transcript Page 29). However, a final decision on books to be excluded was not made until approximately two weeks before the examination. The Petitioner did not show that the two questions which he missed on the Engineering Economics portion of the morning examination were missed for lack of the tables. The examination is a national examination, and there is no evidence that the requirements and limits established by the Board in Florida were applicable nationwide. To alter the national instructions locally potentially adversely affects Florida results.

Recommendation Based upon the foregoing findings of fact and conclusions of law, it is recommended that the Petitioner be permitted to take the examination without charge on one occasion. RECOMMENDED this 27th day of March, 1991, in Tallahassee, Florida. STEPHEN F. DEAN Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, FL 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 27th day of March, 1991. 1/ The general information provided to examinees by the State Board regarding the values of questions on the examination and scoring is misleading or inaccurate because neither the weighted required score nor the adjusted score was 48% of 80, 280, or any other number related to the scaled score of 70. The manner in which these values are associated with the scaled score of 70 is contrary to the Board's explanation and is not self-evident. This is a potential problem if the matter were formally challenged, and it appears the Board needs to reassess its procedures and instructions. APPENDIX TO RECOMMENDED ORDER, CASE NO. 90-5728 The Petitioner did not submit proposed findings. The Respondent submitted proposed findings which were read and considered. The following proposed findings were adopted or rejected for the reasons stated: Adopted. Issue not fact. - 4. Rejected. Preliminary statement not fact. 5. -12. Adopted. Rejected. Preliminary statement not fact. Rejected as irrelevant. Rejected as preliminary statement. Adopted. Adopted. COPIES FURNISHED: Alan K. Garman Civil-Tech, Inc. 3573 Commercial Way Street B Spring Hill, FL 34606 William F.
Whitson, Law Clerk Department of Professional Regulation 1940 North Monroe Street Tallahassee, FL 32399-0792 Rex Smith Executive Director Board of Professional Engineers Department of Professional Regulation 1940 North Monroe Street Tallahassee, FL 32399-0792 Jack McRay, General Counsel Department of Professional Regulation 1940 North Monroe Street Tallahassee, FL 32399-0792

Florida Laws (3) 120.57455.217471.013
# 8
NATIONAL COMPUTER SYSTEMS, INC. vs DEPARTMENT OF EDUCATION, 99-001226BID (1999)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Mar. 17, 1999 Number: 99-001226BID Latest Update: Jul. 19, 1999

The Issue The primary issue is whether the process used by the Department of Education (Department) for evaluating and ranking the proposals submitted in response to Request For Proposal (RFP) 99-03 for the Florida Comprehensive Assessment Test (FCAT) administration contract was contrary to the provisions of the RFP in a way that was clearly erroneous, contrary to competition, arbitrary, or capricious.

Findings Of Fact The RFP for the FCAT describes a five stage process for evaluating proposals. In Stage I, the Department’s Purchasing Office determined whether a proposal contained certain mandatory documents and statements and was sufficiently responsive to the requirements of the RFP to permit a complete evaluation. Stage II involved the Department’s evaluation of a bidder’s corporate qualifications to determine whether the bidder has the experience and capability to do the type of work that will be required in administering the FCAT. Stage III was the Department’s evaluation of a bidder’s management plan and production proposal. In Stage IV, the Department evaluated a bidder’s cost proposal. Stage V involved the ranking of proposals based on points awarded in Stages II-IV. If a proposal did not meet the requirements at any one stage of the evaluation process, it was not to be evaluated in the following stage. Instead, it was to be disqualified from further consideration. Stages II and III of the evaluation process were conducted by an evaluation team comprised of six Department employees: Dr. Debby Houston, Ms. Lynn Joszefczyk, Dr. Peggy Stillwell, Dr. Cornelia Orr, Dr. Laura Melvin, and Ms. Karen Bennett. Dr. Thomas Fisher, head of the Department’s Assessment and Evaluation Services Section, and Dr. Mark Heidorn, Administrator for K-12 Assessment Programs within the Department’s Assessment and Evaluation Services Section, served as non-voting co-chairs of the evaluation team. The focus of this proceeding is Stage II of the evaluation process addressing a bidder’s corporate qualifications. RFP Provisions Regarding Corporate Qualification The FCAT administration contractor will be required to administer tests to approximately one and a half million students each year in a variety of subject areas at numerous grade levels. 
The FCAT program involves a complex set of interrelated work activities requiring specialized human resources, technological systems and procedures. The FCAT must be implemented annually within limited time periods. The FCAT administration contractor must meet critical deadlines for the delivery of test materials to school districts and the delivery of student scores prior to the end of the school year. In developing the RFP, the Department deliberately established a set of minimum requirements for corporate qualifications that a bidder was to demonstrate in order for its proposal to be eligible for further evaluation. The purpose of the RFP’s minimum corporate qualifications requirements was to limit bidding to qualified vendors who have demonstrated prior experience in successfully administering large-scale assessment projects like the FCAT, thereby providing the Department with some degree of assurance that the winning bidder could successfully administer the FCAT. The instructions to bidders regarding the minimum requirements for corporate qualifications are contained in RFP Section 10, which gives directions on proposal preparation. Section 10.1, which lists certain mandatory documents and statements to be included in the bidder’s proposal, requires that a transmittal letter contain "[a] statement certifying that the bidder has met the minimum corporate qualifications as specified in the RFP." These "minimum corporate qualifications" are set forth in RFP Appendix J. RFP Section 10.2 identifies what a bidder is required to include in its proposal with respect to corporate qualifications. The first paragraph of Section 10.2 directs a bidder generally to describe its qualifications and experience performing tasks similar to those that it would perform in administering the FCAT, in order to demonstrate that the bidder is qualified where it states: Part II of a bidder’s proposal shall be entitled Corporate Qualifications. 
It shall provide a description of the bidder’s qualifications and prior experience in performing tasks similar to those required in this RFP. The discussion shall include a description of the bidder’s background and relevant experience that qualifies it to provide the products and services required by the RFP. RFP Section 10.2, however, is not limited to a directive that qualifications and past experience be described generally. Instead, Section 10.2, also communicates, in plain and unambiguous terms, that there are specific minimum corporate qualifications a bidder must demonstrate: The minimum expectations for corporate qualifications and experience are shown in Appendix J. There are two separate sets of factors, one set of eight for the developmental contractor and another set of nine for the administration contractor. Bidders must demonstrate their Corporate Qualifications in terms of the factors that are applicable to the activities for which a bid is being submitted -- development or administration. For each criterion, the bidder must demonstrate that the minimum threshold of experience has been achieved with prior completed projects. (Emphasis added.) Moreover, Section 10.2 singles out for emphasis, in relation to the administration component of the RFP, the importance placed on a bidder’s ability to demonstrate experience processing a large volume of tests: The [bidder’s prior completed] projects must have included work tasks similar to those described herein, particularly in test development or processing a comparable number of tests. The bidder will provide a description of the contracted services; the contract period; and the name, address, and telephone number of a contact person for each of the contracting agencies. 
This description shall (1) document how long the organization has been providing similar services; (2) provide details of the bidder’s experience relevant to the services required by this RFP; and (3) describe the bidder’s other testing projects, products, and services that are similar to those required by this RFP. (Emphasis added.) The Department thus made clear its concern that bidders demonstrate experience with large-scale projects. RFP Appendix J sets forth nine different criteria (C1 through C9) for the administration contractor. As stated in RFP Section 10.2, "[f]or each criterion, the bidder must demonstrate that the minimum threshold of experience has been achieved with prior completed projects . . . ." (emphasis added). Appendix J contains a chart which lists for each criterion: (1) a summary of the related FCAT work task, (2) the detailed criteria for the bidder’s experience related to that work task, and (3) the necessary documentation a bidder must provide. Criterion C4 and Criterion C6 include work tasks that involve the use of image-based scoring technology. C4 and C6 are the only corporate qualifications criteria at issue in this proceeding. RFP Provisions Involving Corporate Qualifications for Image-Based Scoring "Handscoring" is the test administration activity in which open-ended or performance-based student responses are assessed. This practice involves a person reading something the student has written as part of the test, as distinguished from machine scoring multiple choice responses (i.e., the filled-in "bubbles" on an answer sheet). There are two types of handscoring: (1) paper-based handscoring, and (2) image-based handscoring. Paper-based handscoring requires that a student response paper be sent to a reader, who then reviews the student’s response as written on the paper and enters a score on a separate score sheet. Image-based handscoring involves a scanned image of the student’s response being transmitted to a reader electronically. 
The student’s response is then projected on a computer screen, where the reader reviews it and assigns a score using the computer. The RFP requires that the reading and math portions of the FCAT be handscored on-line using imaging technology beginning with the February 2000 FCAT administration. The RFP provides that the writing portion of the FCAT may be handscored using either the paper-based method or on-line imaging technology during the February 2000 and 2001 FCAT administrations. However, on-line image-based scoring of the writing portion of the FCAT is required for all FCAT administrations after February 2001. An image-based scoring system involves complex computer technology. William Bramlett, an expert in designing and implementing large-scale imaging computer systems and networks, presented unrefuted testimony that an image-based scoring system will be faced with special challenges when processing large volumes of tests. These challenges involve the need to automate image quality control, to manage the local and wide area network load, to assure adequate server performance and storage requirements, and to manage the work flow in a distributed environment. In particular, having an image-based scoring system process an increasing volume of tests is not simply a matter of adding more components. Rather, the system’s basic software architecture must be able to understand and manage the added elements and volume involved in a larger operation. According to Bramlett, there are two ways that the Department could assess the ability of a bidder to perform a large- scale, image-based scoring project such as the FCAT from a technological perspective: (1) have the bidder provide enough technological information about its system to be able to model or simulate the system and predict its performance for the volumes involved, or (2) require demonstrated ability through completion of prior similar projects. Dr. 
Mark Heidorn, Administrator for Florida’s K-12 Statewide Assessment Programs, was the primary author of RFP Sections 1-8, which describe the work tasks for the FCAT -- the goods and services vendors are to provide and respond to in their technical proposals. Dr. Heidorn testified that in the Department’s testing procurements involving complex technology, the Department has never required specific descriptions of the technology to be used. Instead, the Department has relied on the bidder’s experience in performing similar projects. Thus, the RFP does not specifically require that bidders describe in detail the particular strategies and approaches they intend to employ when designing and implementing an image-based scoring system for FCAT. Instead, the Department relied on the RFP requirements calling for demonstrated experience as a basis to understand that the bidder could implement such an image-based scoring system. Approximately 717,000 to 828,000 student tests will be scored annually by the FCAT administration contractor using imaging technology. The RFP, however, does not require that bidders demonstrate image-based scoring experience at that magnitude. Instead, the RFP requires bidders to demonstrate only a far less demanding minimum level of experience using image-based scoring technology. Criterion C4 and Criterion C6 in Appendix J of the RFP each require that a bidder demonstrate prior experience administering "a minimum of two" assessment programs using image-based scoring that involved "at least 200,000 students annually." The requirements for documenting a "minimum of two" programs or projects for C4 and C6 involving "at least 200,000 students annually" are material because they are intended to provide the Department with assurance that the FCAT administration contractor can perform the large-scale, image-based scoring requirements of the contract from a technological perspective.
Such experience would indicate that the bidder would have been required to address the sort of system issues described by Bramlett. Dr. Heidorn testified that the number 200,000 was used in C4 and C6 "to indicate the level of magnitude of experience which represented for us a comfortable level to show that a contractor had enough experience to ultimately do the project that we were interested in completing." Dr. Fisher, who authored Appendix J, testified that the 200,000 figure was included in C4 and C6 because it was a number judged sufficiently characteristic of large-scale programs to be relevant for C4 and C6. Dr. Fisher further testified that the Department was interested in having information that a bidder’s experience included projects of a sufficient magnitude so that the bidder would have experienced the kinds of processing issues and concerns that arise in a large-scale testing program. The Department emphasized this specific quantitative minimum requirement in response to a question raised at the Bidder’s Conference held on November 13, 1998: Q9: In Appendix J, the criteria for evaluating corporate quality for the administration operations C4, indicates that the bidder must have experience imaging as indicated. Does this mean that the bid [sic] must bid for using [sic] imaging technology for reading and mathematics tests? A: Yes. The writing assessment may be handscored for two years, and then it will be scored using imaging technology. To be responsive, a bid must be for imaging. The corporate experience required (200,000 students annually for which reports were produced in three months) could be the combined experience of the primary contractor and the subcontractors. (Emphasis added.) Criterion C4 addresses the RFP work tasks relating to handscoring, including both the image-based handscoring of the reading and math portions of the FCAT for all administrations and the writing portions of the FCAT for later administrations. 
The "Work Task" column for C4 in Appendix J of the RFP states: Design and implement efficient and effective procedures for handscoring student responses to performance tasks within the limited time constraints of the assessment schedule. Handscoring involves image-based scoring of reading and mathematics tasks for all administrations and writing tasks for later administrations at secure scoring sites. Retrieve and score student responses from early district sample schools and deliver required data to the test development contractor within critical time periods for calibration and scaling. The "Necessary Documentation" column for C4 in Appendix J states: Bidder must document successful completion of a minimum of two performance item scoring projects for statewide assessment programs during the last four years for which the bidder was required to perform as described in the Criteria column. (Emphasis added.) The "Criteria" column for C4 in Appendix J, like the related work tasks in the RFP, addresses both image-based handscoring of reading and math, as well as paper-based or image-based handscoring of writing. In connection with all handscoring work tasks, "[t]he bidder must demonstrate completion of test administration projects for a statewide program for which performance items were scored using scoring rubrics and associated scoring protocols." With respect to the work tasks for handscoring the reading and math portions of the FCAT, "[t]he bidder must demonstrate completion of statewide assessment programs involving scoring multiple-choice and performance items for at least 200,000 students annually for which reports were produced in three months." In addition, for the reading and math work tasks, "[e]xperience must be shown in the use of imaging technology and hand-scoring student written responses with completion of scoring within limited time restrictions."
This provision dealing with "imaging technology" experience self-evidently addresses the reading and math components, because separate language addresses imaging experience in connection with the writing component. The relevant handscoring experience for the reading and math aspects of the program is experience using image-based technology. By contrast, with respect to the work tasks for scoring the writing portions of the FCAT, "the bidder must also demonstrate completion of statewide assessment programs involving paper-based or imaged scoring student responses to writing assessment prompts for at least 200,000 students annually for which reports were produced in three months." (Emphasis added.) Criterion C6 addresses work tasks relating to designing and implementing systems for processing, scanning, imaging and scoring student responses to mixed-format tests within limited time constraints. The "Work Task" column for C6 in RFP Appendix J states: Design and implement systems for the processing, scanning, imaging, and scoring of student responses to test forms incorporating both multiple-choice and constructed response items (mixed-format) within the limited time constraints of the assessment schedule. Scoring of student responses involves implementation of IRT scoring tables and software provided by the development contractor within critical time periods. The "Necessary Documentation" column for C6 in Appendix J states: Bidder must document successful completion of a minimum of two test administration projects for statewide assessment programs during the last four years in which the bidder was required to perform as described in the Criteria column. (Emphasis added.) 
The Criteria column for C6 in Appendix J states: The bidder must demonstrate completion of test administration projects for statewide assessment programs or other large-scale assessment programs that required the bidder to design and implement systems for processing, scanning, imaging, and scoring responses to mixed-format tests for at least 200,000 students annually for which reports were produced in three months. Experience must be shown in use of imaging student responses for online presentation to readers during handscoring. (Emphasis added.) RFP Provisions Per Corporate Qualifications The procedure for evaluating a bidder’s corporate qualifications is described in RFP Section 11.3: The Department will evaluate how well the resources and experience described in each bidder’s proposal qualify the bidder to provide the services required by the provisions of this RFP. Consideration will be given to the length of time and the extent to which the bidder and any proposed subcontractors have been providing services similar or identical to those requested in this RFP. The bidder’s personnel resources as well as the bidder’s computer, financial, and other technological resources will be considered in evaluating a bidder’s qualifications to meet the requirements of this RFP. Client references will be contacted and such reference checks will be used in judging a bidder’s qualifications. The criteria to be used to rate a bidder’s corporate qualifications to meet the requirements of this RFP are shown in Appendix J and will be applied as follows: * * * Administrative Activities. Each of the nine administration activities criteria in Appendix J will be individually rated by members of the evaluation team. The team members will use the rating scale shown in Figure 1 below. Individual team members will review the bidder’s corporate qualifications and rate the response with a rating of one to five. 
The ratings across all evaluators for each factor will be averaged, rounded to the nearest tenth, and summed across all criteria. If each evaluator assigns the maximum number of points for each criterion, the total number of points will be 45. To meet the requirements of Stage II, the proposal must achieve a minimum rating of 27 points and have no individual criterion for which the number of points averaged across evaluators and then rounded is less than 3.0. Each proposal that receives a qualifying score based on the evaluation of the bidder’s qualifications will be further evaluated in Stage III.

Figure 1: Evaluation Scale for Corporate Qualifications
5 (Excellent): The bidder has demonstrated exceptional experience and capability to perform the required tasks.
4
3 (Satisfactory): The bidder has demonstrated that it meets an acceptable level of experience and capability to perform the required tasks.
2
1 (Unsatisfactory): The bidder either has not established its corporate qualifications or does not have adequate qualifications.

RFP Section 11.3 provides that each of the nine corporate qualifications criteria for administration operations in Appendix J (C1 through C9) will be individually rated by the six members of the evaluation team using a scale of one to five. A rating of three is designated as "satisfactory" which means that "[t]he bidder has demonstrated that it meets an acceptable level of experience and capability to perform the required tasks." In order to be further evaluated, Section 11.3 provides that there must be no individual corporate qualifications criterion for which the bidder’s proposal receives a score less than 3.0 (average points across evaluators). Dr. Fisher, the primary author of Section 11.3 of the RFP, referred to the 3.0 rating as the "cut score." (Emphasis added.) The RFP’s clear and unambiguous terms thus establish the "minimum threshold" of experience that a bidder "must demonstrate" in its proposal for Criterion C1 through Criterion C9. 
The "minimum threshold" of experience that a bidder must demonstrate for each criterion is described in Appendix J of the RFP. If a proposal failed to demonstrate that the bidder meets the minimum threshold of experience for a particular criterion in Appendix J, the bidder obviously would not have demonstrated "that it meets an acceptable level of experience and capability to perform the required tasks." Thus, in that setting, an evaluator was to have assigned the proposal a rating of less than "satisfactory," or less than three, for that criterion. (Emphasis added.) The fact that a score less than "3" was expected for -- and would eliminate -- proposals that did not demonstrate the "minimum threshold" of experience does not render meaningless the potential scores of "1" and "2." Those scores may reflect the degree to which a bidder’s demonstrated experience was judged to fall below the threshold. Although some corporate capability minimums were stated quantitatively (i.e., "minimum of two," or "at least 200,000"), others were open to a more qualitative assessment (i.e., "large-scale," "systems," or "reports"). Moreover, a proposal that included demonstrated experience in some manner responsive to each aspect of Appendix J might nevertheless be assigned a score of less than "3," based on how an evaluator assessed the quality of the experience described in the proposal. By the terms of the RFP, however, an average score across evaluators of less than 3 represented essentially a decision that the minimum threshold of experience was not demonstrated. Had the Department truly intended Appendix J to reflect only general targets or guidelines, there were many alternative ways to communicate such an intent without giving mandatory direction about what bidders "must demonstrate" or without establishing quantitative minimums (i.e. "a minimum of two," or "at least 200,000"). 
RFP Appendix K, for instance, sets forth the evaluation criteria for technical proposals in broad terms that do not require the bidder to provide anything in particular. Even within Appendix J, other than in Criterion C4 and Criterion C6, bidders were to show experience with "large-scale" projects rather than experience at a quantified level. Pursuant to the RFP’s plain language, in order to meet the "minimum threshold" of experience for Criterion C4 and Criterion C6, a bidder "must demonstrate," among other things, successful completion of a "minimum of two" projects, each involving the use of image-based scoring technology in administering tests to "at least 200,000 students annually." Department’s Evaluation of Corporate Qualifications In evaluating Harcourt’s proposal, the Department failed to give effect to the plain RFP language stating that a bidder "must document" successful completion of a "minimum of two" testing projects involving "at least 200,000 students annually" in order to meet the "minimum threshold" of experience for C4 and C6. Dr. Fisher was the primary author of Sections 10, 11 and Appendix J of the RFP. He testified that during the Stage II evaluation of corporate qualifications, the evaluation team applied a "holistic" approach, like that used in grading open-ended written responses in student test assessments. Under the holistic approach that Dr. Fisher described, each member of the evaluation team was to study the proposals, compare the information in the proposals to everything contained in Appendix J, and then assign a rating for each criterion in Appendix J based on "how well" the evaluator felt the proposal meets the needs of the agency. Notwithstanding Dr. Fisher’s present position, the RFP’s terms and their context demonstrate that the minimum requirements for corporate qualifications are in RFP Appendix J. During the hearing, Dr. 
Fisher was twice asked to identify language in the RFP indicating that the Department would apply a "holistic" approach when evaluating corporate qualifications. Both times, Dr. Fisher was unable to point to any explicit RFP language putting bidders on notice that the Department would be using a "holistic" approach to evaluating proposals and treating the Appendix J thresholds merely as targets. In addition, Dr. Fisher testified that the Department did not engage in any discussion at the bidders’ conference about the evaluation method that was going to be used other than drawing the bidders’ attention to the language in the RFP. As written, the RFP establishes minimum thresholds of experience to be demonstrated. Where, as in the RFP, certain of those minimum thresholds are spelled out in quantitative terms that are not open to interpretation or judgment, it is neither reasonable nor logical to rate a proposal as having demonstrated "an acceptable level of experience" when it has not demonstrated the specified minimum levels, even if other requirements with which it was grouped were satisfied. The plain RFP language unambiguously indicates that an analytic method, not a "holistic" method, will be applied in evaluating corporate qualifications. Dr. Fisher acknowledged that, in an assessment using an analytic method, there is considerable effort placed up front in deciding the specific factors that will be analyzed and those factors are listed and explained. Dr. Fisher admitted that the Department went into considerable detail in Appendix J of the RFP to explain to the bidders the minimums they had to demonstrate and the documentation that was required. In addition, Dr. Orr, who served as a member of the evaluation team and who herself develops student assessment tests, stated that in assessments using the "holistic" method there is a scoring rubric applied, but that rubric does not contain minimum criteria like those found in the RFP for FCAT. 
The holistic method applied by the Department ignores very specific RFP language which spells out minimum requirements for corporate qualifications. Harcourt’s Corporate Qualifications for C4 and C6 Harcourt’s proposal lists the same three projects administered by Harcourt for both Criterion C4 and Criterion C6: the Connecticut Mastery Test ("CMT"), the Connecticut Academic Performance Test ("CAPT") and the Delaware Student Testing Program ("DSTP"). Harcourt’s proposal also lists for Criterion C4 projects administered by its proposed scoring subcontractors, Measurement Incorporated ("MI") and Data Recognition Corporation ("DRC"). However, none of the projects listed for MI or DRC involve image-based scoring. Thus, the MI and DRC projects do not demonstrate any volume of image-based scoring as required by C6 and by the portion of C4 which relates to the work task for the image-based scoring of the math and reading portions of the FCAT. Harcourt’s proposal states that "[a]pproximately 35,000 students per year in grade 10 are tested with the CAPT." Harcourt’s proposal states that "[a]pproximately 120,000 students per year in grades 4, 6 and 8 are tested with the CMT." Harcourt’s proposal states that "[a]pproximately 40,000 students in grades 3, 5, 8, and 10" are tested with the DSTP. Although the descriptions of the CMT and the CAPT in Harcourt’s proposal discuss image-based scoring, there is nothing in the description of the DSTP that addresses image-based scoring. There is no evidence that the evaluators were ever made aware that the DSTP involved image-based scoring. Moreover, although the Department called the Delaware Department of Education ("DDOE") as a reference for Harcourt’s development proposal, the Department did not discuss Harcourt’s administration of the DSTP (including whether the DSTP involves image-based scoring) with the DDOE. 
Harcourt overstated the number of students tested in the projects it referenced to demonstrate experience with image-based scoring. Harcourt admitted at hearing that, prior to submitting its proposal, Harcourt had never tested 120,000 students with the CMT. In fact, the total number of students tested by Harcourt on an annual basis under the CMT has ranged from 110,273 in the 1996- 97 school year to 116,679 in the 1998-99 school year. Harcourt also admitted at hearing that, prior to submitting its proposal, Harcourt had never tested 35,000 students in grade 10 with the CAPT. Instead, the total number of grade 10 students tested by Harcourt on an annual basis with the CAPT ranged from 30,243 in 1997 to 31,390 in 1998. In addition, Harcourt admitted at hearing that, prior to submitting its proposal, it had conducted only one "live" administration of the DSTP (as distinguished from field testing). That administration of the DSTP involved only 33,051, not 40,000, students in grades 3, 5, 8 and 10. Harcourt itself recognized that "field tests" of the DSTP are not responsive to C4 and C6, as evidenced by Harcourt’s own decision not to include in its proposal the number of students field tested under the DSTP. Even assuming that the numbers in Harcourt’s proposal are accurate, and that the description of the DSTP in Harcourt’s proposal reflected image-based scoring, Harcourt’s proposal on its face does not document any single project administered by Harcourt for C4 or C6 involving image-based testing of more than 120,000 students annually. When the projects are aggregated, the total number of students claimed as tested annually still does not reach the level of "at least 200,000;" it comes to only 195,000, and it reaches that level only once due to the single administration of the DSTP. 
Moreover, even if that 195,000 were considered "close enough" to the 200,000 level required, it was achieved only one time, while Appendix J plainly directs that there be a minimum of two times that testing at that level has been performed. The situation worsens for Harcourt when using the true numbers of students tested under the CMT, CAPT, and DSTP, because Harcourt cannot document any single image-based scoring project it has administered involving testing more than 116,679 students annually. Moreover, when the true numbers of students tested are aggregated, the total rises only to 181,120 students tested annually on one occasion, and no more than 141,663 tested annually on any other occasion. Despite this shortfall from the minimum threshold of experience, under the Department’s holistic approach the evaluators assigned Harcourt’s proposal four ratings of 3.0 and two ratings of 4.0 for C4, for an average of 3.3 on C4; and five ratings of 3.0 and one rating of 4.0 for C6, for an average of 3.2 on C6. Applying the plain language of the RFP in Sections 10 and 11 and Appendix J, Harcourt did not demonstrate that it meets an acceptable level of experience and capability for C4 or C6, because Harcourt did not satisfy the minimum threshold for each criterion by demonstrating a minimum of two prior completed projects involving image-based scoring requiring testing of at least 200,000 students annually. Harcourt’s proposal should not have received any rating of 3.0 or higher on C4 or C6 and should have been disqualified from further evaluation due to failure to demonstrate the minimum experience that the Department required in order to be assured that Harcourt can successfully administer the FCAT program. 
NCS’s Compliance With RFP Requirements Even though the NCS proposal did not meet all of the mandatory requirements, and despite the requirement of Section 11.2 that the proposal be automatically disqualified under such circumstances, the Department waived NCS’s noncompliance as a minor irregularity. The factors in C4 and C6 were set, minimal requirements with which NCS did not comply. For example, one of the two programs NCS submitted in response to Criteria C4 and C6 was the National Assessment of Educational Progress program ("NAEP"). NAEP, however, is not a "statewide assessment program" within the meaning of that term as used in Criteria C4 and C6. Indeed, NCS admitted that NAEP is not a statewide assessment program and that, without consideration of that program, NCS’s proposal is not responsive to Criteria C4 and C6 because NCS would not have submitted the required proof of having administered two statewide assessment programs. This error cannot be cured by relying on the additional experience of NCS’s subcontractor because that experience does not show that its subcontractor produced reports within three months, and so such experience does not demonstrate compliance with Criterion C4. The Department deliberately limited the competition for the FCAT contract to firms with specified minimum levels of experience. As opined at final hearing, if the Department in the RFP had announced to potential bidders that the type of experience it asked vendors to describe were only targets, goals and guidelines, and that a failure to demonstrate target levels of experience would not be disqualifying, then the competitive environment for this procurement would have differed, since only 2.06 evaluation points (out of a possible 150) separated the NCS and Harcourt scores. Dr. 
Heidorn conceded that multiple companies with experience in different aspects of the FCAT program -- a computer/imaging company and a firm experienced in educational testing -- might combine to perform a contract like the FCAT. Yet, that combination of firms would be discouraged from bidding because they could not demonstrate the minimum experience spelled out in the RFP. Language in the RFP, indicating the "holistic" evaluation that was to be applied, could have resulted in a different field of potential and actual bidders.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that Respondent, State of Florida, Department of Education, enter a Final Order rejecting the bids submitted by Harcourt and NCS for the administration component of the RFP. The Department should then seek new proposals. DONE AND ENTERED this 25th day of May, 1999, in Tallahassee, Leon County, Florida. DON W. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 25th day of May, 1999. COPIES FURNISHED: Karen D. Walker, Esquire Holland and Knight, LLP Post Office Drawer 810 Tallahassee, Florida 32302 Mark D. Colley, Esquire Holland and Knight, LLP Suite 400 2100 Pennsylvania Avenue, Northwest Washington, D.C. 20037 Charles S. Ruberg, Esquire Department of Education The Capitol, Suite 1701 Tallahassee, Florida 32399-0400 Paul R. Ezatoff, Jr., Esquire Christopher B. Lunny, Esquire Katz, Kutter, Haigler, Alderman, Bryant and Yon, P.A. 106 East College Avenue, Suite 1200 Tallahassee, Florida 32302-7741 Tom Gallagher Commissioner of Education Department of Education The Capitol, Plaza Level 08 Tallahassee, Florida 32399-0400 Michael H. Olenick, General Counsel Department of Education The Capitol, Suite 1701 Tallahassee, Florida 32399-0400

Florida Laws (3) 120.57, 287.012, 287.057
STEPHEN A. COHEN vs. BOARD OF ACCOUNTANCY, 80-002332 (1980)
Division of Administrative Hearings, Florida Number: 80-002332 Latest Update: Sep. 16, 1981

Findings Of Fact The Petitioner is a certified public accountant licensed in the State of Pennsylvania, having been licensed in 1961. The Petitioner is seeking licensure as a certified public accountant in Florida pursuant to the provisions of Section 473.308(3)(b), Florida Statutes, and Rule 21A-29.01(1)(b), Florida Administrative Code, that is, he seeks licensure in Florida by endorsement based upon his Pennsylvania licensure without the necessity for taking the Florida examination. At the time of the Petitioner's initial licensing in the State of Pennsylvania in 1961 he met Florida's requirements in the areas of education and experience. The Petitioner currently holds a valid license in Pennsylvania and is licensed in other states. The Board of Accountancy reviewed the Petitioner's application and determined that he met the Florida requirements for education and experience and that he was administered the same examination in Pennsylvania in 1961 that was administered in Florida in 1961, the uniform certified public accountancy examination administered by the American Institute of Certified Public Accountants (AICPA). The Board determined, however, in its non-final order, that the Petitioner did not receive grades on that examination administered in Pennsylvania that would have constituted passing grades in Florida and denied his application. The rules of the Board require that an applicant for licensure as a certified public accountant receive a grade of 75 or above on all parts of an examination administered by the American Institute of Certified Public Accountants. See Rule 21A-28.05(2)(3), Florida Administrative Code. The rules in effect in 1961 also required that a grade of 75 or above be received on all four subjects of the examination in order to achieve licensure in Florida. See Rules of the State Board of Accountancy Relative to Examinations and the Issuance and Revocation of Certificates, Rule 1(f). See also Section 473.10, Florida Statutes (1961). 
The requirement that applicants for licensure by endorsement receive grades on all four areas of the AICPA Exam of 75 or better has been enforced in Florida since the 1930's and has been a requirement embodied in the rules of the Board since 1949. In February, 1961, the Pennsylvania Board of Accountancy, pursuant to a resolution enacted for insular reasons of its own, determined to accept as passing the Petitioner's and other candidates' scores in the Law and Practice portions of the AICPA licensure examination, even though those grades were below the score of 75. The Board thus deemed that the Petitioner passed the examination for purposes of licensure in Pennsylvania with a score of "75" by fiat, even though in fact the Petitioner did not receive an actual score of 75 in those two subject areas as determined by the AICPA which administered and graded the examination. The acceptance of the lower grade on the part of the Pennsylvania Board was not done pursuant to a regrading of the Petitioner's exam in an attempt to correct mistakes or errors in the AICPA's finding regarding his score, but was rather simply due to an arbitrary determination by the Pennsylvania Board that for the Petitioner and certain other Pennsylvania applicants the lower grade in that particular instance would be considered as passing. The Petitioner had no knowledge that the Pennsylvania Board had taken this action in arbitrarily upgrading his scores on two portions of the exam so that he passed the entire exam until he began his application process with the Florida State Board of Accountancy in September, 1980. During its investigation of the Petitioner's application for licensure by endorsement, the Florida Board of Accountancy ascertained that the Petitioner had in fact received grades of 65 in the Law and Practice portions of the Uniform AICPA Examination which were then subsequently arbitrarily raised by resolution of the Pennsylvania Board. 
The Florida Board has at no time accepted as passing grades for a licensure examination those grades by applicants of less than 75 on the AICPA examination. It is true that prior to the Florida Board's becoming aware, in 1973, of the fact that Pennsylvania had arbitrarily raised some grades of its applicants, it did in fact accept some similarly situated candidates for licensure by endorsement in Florida. After becoming aware at that time of this arbitrary grade-raising process, the Board has consistently refused licensure to applicants from other states who actually received less than 75 on the AICPA Examination as determined by the AICPA. For considerations of equity and fairness the Board did, however, allow candidates who had already been licensed in Florida by endorsement prior to the Board's becoming aware of this anomaly to retain their licenses. Since the Petitioner failed to meet the AICPA examination requirement of a grade of 75 or better on all portions of the examination which was set forth and adopted in the Florida rules and statutes in effect at the time of his licensure in Pennsylvania in 1961, his request for licensure by endorsement was denied by the Board's non-final order on December 8, 1980.

Recommendation Having considered the foregoing Findings of Fact and Conclusions of Law, the evidence in the record, the candor and demeanor of the witnesses and the pleadings and arguments of counsel, it is RECOMMENDED that the denial of the Petitioner's application for licensure by endorsement by the Board of Accountancy of the State of Florida be upheld and that the petition be denied. DONE AND ENTERED this 22nd day of June, 1981 in Tallahassee, Leon County, Florida. P. MICHAEL RUFF Hearing Officer Division of Administrative Hearings The Oakland Building 2009 Apalachee Parkway Tallahassee, Florida 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 22nd day of June, 1981. COPIES FURNISHED: George L. Waas, Esquire 1114 East Park Avenue Tallahassee, Florida 32301 John J. Rimes, III, Esquire Assistant Attorney General Suite 1601, The Capitol Tallahassee, Florida 32301

Florida Laws (3) 120.57, 473.306, 473.308
