COURTYARD CENTER, INC. vs DEPARTMENT OF BUSINESS AND PROFESSIONAL REGULATION, 95-001970BID (1995)
Division of Administrative Hearings, Florida Filed: Tallahassee, Florida Apr. 24, 1995 Number: 95-001970BID Latest Update: Nov. 02, 1995

The Issue Whether the Respondent, the Department of Business and Professional Regulation, acted in a fraudulent, dishonest, illegal or arbitrary fashion in its determination to award to the Intervenor, Three Oaks Plaza, Ltd., lease number 790:0056.

Findings Of Fact

General

The Petitioner is Courtyard Plaza, Inc., which is the owner of Courtyard Plaza. The Intervenor is Three Oaks Plaza, Ltd., which is the owner of Three Oaks Plaza. The Respondent is the Department of Business and Professional Regulation, which is a state agency created under Chapter 20, Florida Statutes, and an agency within the context of Chapter 255, Florida Statutes.

Requirements of the RFP

Respondent advertised an RFP pertaining to the provision of leased premises for lease number 790:0056 in a defined geographical area within Jacksonville, Florida. (T. 214). The geographical area for the RFP was approved by the Division of Facilities Management of the Department of Management Services. (Tx. 250, 256). The requirements of the RFP were as follows:

Provision of 15,674 square feet of general office space located in Duval County. (A map accompanied the RFP which further limited the location of the lease to an area roughly bounded by Arlington Expressway to the north, Southside Boulevard to the east, Baymeadows Road to the south, and Phillips Highway-University Boulevard and the St. Johns River to the west.);

Availability of the space by December 1, 1995, for a period of five years with an option to renew for 10/1 years;

Full service utilities, interior and exterior maintenance, janitorial services and supplies;

Availability of 72 off-street [parking] spaces within 150 feet of the facility's main entrance dedicated to the exclusive use of employees and clients of the Department;

Availability of public transportation;

Dining facilities within 2 blocks of the offered facilities;

Building code minimum of 50 pounds per square foot live-load, with 100 pounds per square foot live-load available in some sections to house vaults (applicable to multistory buildings);

Departmental approval of renovation plans and specifications; and

As and for security, locks on all outside/interior office doors, locks on all outside windows, a night light located within 10 feet of any outside door, and night lights in all parking areas nearest the building.

Pre-bid Procedures

Petitioner and Intervenor attended a pre-bid conference held by the Respondent on February 28, 1995. Most of the evaluators were at the pre-bid conference. They were Brad Engleman, Tracy Pyke, Bob Miller, Elizabeth Doyle, Jerry Jenson and Jim Bob Cooper. Other than Engleman and Pyke, who were stationed in Tallahassee, the evaluators served as division chiefs in Jacksonville. There were three proposals submitted in response to the RFP: one from Three Oaks, one from Courtyard, and one from the current landlord, Adnan El-Yazingi. A review by Engleman and Pyke determined that El-Yazingi's proposal was not responsive on the basis that it was outside the geographic area designated in the RFP. On March 29, 1995, the evaluators visited the facilities proposed by Courtyard and Three Oaks during the day. The evaluators visited Courtyard first.

Evaluation Score Sheets

The evaluators used score sheets provided by the Department to evaluate the facilities. The evaluators did not see the bid specifications, and worked from the criteria of the evaluation sheets, which do not relate to the criteria in the RFP. The score sheets for each facility contained the following criteria:

Rental Rates: (a) [39 Points] Based upon total present value for the basic term of the lease using the present value discount rate. (b) [1 Point] For optional renewal terms of the lease.
Location: Environmental factors, including the physical characteristics of the building and the area surrounding it, and efficiency and economic operations in the requested space. (a) [5 Points] Frequency and availability of satisfactory public transportation within proximity of the offered space. (b) [10 Points] Proximity of the offered space to clients to be served by the Department at this facility. (c) [10 Points] Aesthetics of the building, the property the building sits on, and the surrounding neighborhood. (d) [15 Points] Security issues posed by the building, by associated parking and the surrounding neighborhood. The criteria used for the security issue of the neighborhood shall be determined by the agency. Police crime statistics may be incorporated into the criteria.

Facility: (a) [10 Points] Susceptibility of the design of the space offered to efficient layout and good utilization, i.e., ability to house large units together and in close proximity to interdependent units. (b) [10 Points] Susceptibility of the building, parking area and property as a whole to future expansion.

Evaluation

The evaluation score sheets were divided into two parts, the first part dealing with the financial criteria and involving a computation and comparison of the present value of the lease proposal of each bidder, and the second part assessing whether an option period was offered. The first part was worth 39 points, and the second was worth one (1) point. The most cost-effective bid received 39 points, and the value of that bid became the numerator of a fraction in which the value of the other proposal(s) became the denominator. The total number of available points (39) was then multiplied by the aforesaid fraction to score the other proposal(s). If a bidder provided for an additional option period, the bidder received an additional one point. (An illustrative sketch of this computation follows these findings of fact.) Because the computation of the present value was a mechanical act in which no discretion was used by the evaluators, it was computed by Engleman and Pyke in Tallahassee after the evaluators completed the site visits and assessments of each facility. The present value of each of the proposals was computed using a computer program provided by Management Services. Courtyard was determined to be the lowest and best bid, and received 39 points. The value of the Three Oaks proposal was also computed, and Three Oaks received 32 points. Both bidders provided for options. Three Oaks received 33 points for the financial portion of its bid, and Courtyard received 40 points.

On-Site Evaluations

The evaluators based their scoring of the score sheets solely on the on-site visits, and nothing else.

2(a) Availability of Public Transportation [5 points]

The record reveals that public bus transportation is available on a regularly scheduled basis within one half block of both Courtyard and Three Oaks. The evaluators, whose initials are indicated below, scored element 2(a) as follows:

      B.E.  B.M.  J.J.  J.C.  E.D.  T.P.
Ctyd     5     4     4     4     3     5
Oaks     5     5     4     4     5     5

2(b) Proximity to Clients [10 points]

The Courtyard is located almost in the middle of the geographic area designated in the RFP. It is on Beach Boulevard, two miles east of Southside Boulevard, at the intersection of Beach Boulevard and Parental Home Road.
The Courtyard is approximately one-half block from the Beach Boulevard access to the Commodore Point Expressway, which runs from Beach Boulevard northwest to the vicinity of the Gator Bowl, where it splits and runs due west to the vicinity of the city-county buildings downtown, and due north to intersect with the accesses to the Mathews Bridge and the Haines Expressway. Three Oaks is located on the most northerly boundary of the geographic area designated by the RFP. Three Oaks is located on the south side of Arlington Expressway, approximately three miles east of the Mathews Bridge and two miles west of the intersection of Southside Boulevard and Arlington Expressway. From the western end of the Mathews Bridge it is approximately two miles to I-95 via high-speed routes. Southside Boulevard runs south to intersect with I-295/I-95, and north to cross the St. Johns River over the Dames Point Bridge. The evaluators scored element 2(b) as follows:

      B.E.  B.M.  J.J.  J.C.  E.D.  T.P.
Ctyd     6     5     8     6     3     4
Oaks    10    10     8     6     7     8

2(c) Aesthetics of the Building [10 points]

The Courtyard property is a medium-sized office park consisting of two two-story buildings, each with two wings joined at a ninety-degree angle at the main entrance area. The site visit to the Courtyard was conducted by Mary Farwell, President of Petra Management, Inc., the agent for the Petitioner, Courtyard. Ms. Farwell explained that all the buildings had been vacant for the past two years and the landscaping and the buildings had not been kept up, but that the landlord was committed to making improvements to the landscaping and buildings to suit the tenants. She made a presentation in which she showed the evaluators various architectural renderings of modifications to the exterior elevations of the buildings which Courtyard was willing to undertake for a major lessee who would be able to provide input to the design. The Courtyard site had many mature trees and shaded areas around the buildings and parking lots. The parking lots needed maintenance, resealing, and restriping. Ms. Farwell also pointed out an area away from the building being proposed, but on the property, where flooding had occurred during the recent serious storms. She told the evaluators that the Department of Transportation was committed to changing the drainage system which adjoined the property to correct the problem.

Three Oaks proposed space split between two floors in a seven-story office building which is one of three similar buildings located on the property. The space being offered had not been completed, but the evaluators were shown existing completed and occupied space in the building. It was represented that the proposed space would be completed in a comparable fashion. The buildings at Three Oaks were landscaped immediately adjacent to the buildings, but were surrounded by large paved parking lots devoid of trees or landscaping. The evaluators scored element 2(c) as follows:

      B.E.  B.M.  J.J.  J.C.  E.D.  T.P.
Ctyd     7     4     8     6     5     6
Oaks     9     9     9    10     9     9

2(d) Security Issues [15 points]

There were night lights at Courtyard as required by the RFP. The parking lots at The Courtyard were lit with lights mounted on the buildings as required. Because of the growth of the trees surrounding the buildings, these lights did not fully illuminate the parking lot.
This was an oft-mentioned concern of the evaluators; however, this problem was capable of being remedied by the addition of lights and the pruning of trees and other on-site landscaping, which the landlord was committed to do. It was the impression of the evaluators that The Courtyard was in a "worse" neighborhood than Three Oaks. These concerns were not mentioned to the landlord's representative, and crime statistics were not obtained from the local police regarding comparative crime rates. The photographic evidence of The Courtyard presented by both sides reveals no vandalism or graffiti on buildings vacant for two years. The evaluators' ratings of Courtyard on this requirement were highly subjective and clearly beyond the requirements of the RFP. The open parking lots at Three Oaks were lighted, and the evaluators gave great value to the representation that there were private security officers available to assist occupants to their cars. The fact that a bank with its security personnel was located in the building was also brought to the attention of the evaluators. Security personnel were not RFP requirements. The evaluators scored element 2(d) as follows:

      B.E.  B.M.  J.J.  J.C.  E.D.  T.P.
Ctyd    10     7    10     7     5    10
Oaks    13    13    11    12    15    13

3(a) Efficient Layout [10 points]

Ms. Farwell explained to the evaluators that the Courtyard's landlord was prepared to gut the two-story wing making up one-half of the eastern building, which was being proposed for their use, and remodel the interior space to the Department's specifications. Although some of the activities would have had to be separated on the two floors, this was not an operational problem for the activities, which were geographically separated at the time. Ms. Farwell also indicated there had been discussions about providing day care services on site through a local agency. Three Oaks proposed to construct the interior space to the Department's specifications. There were already some elements of the Department located in the building, and this would have been a small benefit for the Department, although the Department's units were independent and used to being geographically separated. Both bidders could provide spaces meeting the vault floor-load requirements. Both bidders could provide the requisite number of parking spaces within the required distance. The evaluators scored element 3(a) as follows:

      B.E.  B.M.  J.J.  J.C.  E.D.  T.P.
Ctyd     9     8     9    10     9    10
Oaks    10     9     9     9     9    10

3(b) Capacity to Expand Space and Parking [10 points]

The Courtyard had three additional unrented buildings the same size as the one presented in its proposal. As the major lessee, the Department would have priority in expanding into the other buildings if required. Parking as required was available and adequate for any future expansion. Three Oaks would have had to move other tenants to accommodate the further expansion of the lease; however, the landlord was willing to move other tenants who were on short-term leases. Parking as required was available and adequate for any future expansion. Both bidders had the office and parking capacity to accommodate further expansion. The evaluators scored element 3(b) as follows:

      B.E.  B.M.  J.J.  J.C.  E.D.  T.P.
Ctyd     8     5     8     9     8     9
Oaks     9     6     8     9     9    10

Bid Criteria from the RFP Not Addressed on the Score Sheet

Both facilities had dining facilities within two blocks. Neither landlord had completed the facilities offered; however, the RFP did not envision completed facilities because it called for Departmental approval of renovation plans and specifications.
Both landlords offered to make their facilities available before December 1, 1995, and the nature of the proposed renovations at both facilities was such that this could be done. No evidence was received specifically on the nature and quality of the janitorial supplies and services; however, both landlords offered to provide these supplies and services together with maintenance of the facilities. Because The Courtyard was unoccupied, the evaluators could not assess the adequacy of the services at that facility. They did evaluate favorably the maintenance and upkeep at Three Oaks. Similarly, no evidence was received specifically on locks on the doors and windows at the facilities; however, both facilities had fixed windows, and the doors were capable of locking.

There is no evidence of fraudulent activity in the exercise of the Respondent's discretion. There is no evidence of dishonesty in the exercise of the Respondent's discretion. There is no evidence of illegal activity in the exercise of the Respondent's discretion. There is evidence that, in assessing the location and security of the properties, the Respondent's evaluators considered additional factors which were not part of the criteria of the RFP.
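The rental-rate scoring described in the Evaluation findings above reduces to a short proportional computation: the proposal with the lowest present value receives the full 39 points, any other proposal receives 39 points scaled by the ratio of the lowest present value to its own, and a proposal offering an option period receives one additional point. The Python sketch below is only an illustration of that arithmetic. The actual present values and any rounding convention are not in the record, so the dollar figures are hypothetical placeholders chosen so that the resulting totals match the 40 and 33 points reported for Courtyard and Three Oaks.

# Illustrative sketch only: the proportional rental-rate scoring described
# in the Evaluation findings. The present values are hypothetical; only the
# resulting point totals (40 and 33) mirror the figures in the findings.

def rental_rate_score(own_present_value, lowest_present_value, offers_option):
    """Score the financial portion of a bid (39 points plus a 1-point option bonus)."""
    proportional = 39 * (lowest_present_value / own_present_value)
    option_bonus = 1 if offers_option else 0
    return round(proportional) + option_bonus  # rounding is an assumption

# Hypothetical present values in dollars (the record does not state them).
courtyard_pv = 1_000_000
three_oaks_pv = 1_220_000
lowest = min(courtyard_pv, three_oaks_pv)

print(rental_rate_score(courtyard_pv, lowest, offers_option=True))   # 40
print(rental_rate_score(three_oaks_pv, lowest, offers_option=True))  # 33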

Recommendation

Based upon the foregoing Findings of Fact and Conclusions of Law set forth herein, it is RECOMMENDED: That the Department not award the contract, and reinitiate the evaluation process using criteria clearly reflective of the requirements stated in the RFP.

DONE and ENTERED this 7th day of July, 1995, in Tallahassee, Florida.

STEPHEN F. DEAN, Hearing Officer
Division of Administrative Hearings
The DeSoto Building
1230 Apalachee Parkway
Tallahassee, Florida 32399-1550
(904) 488-9675

Filed with the Clerk of the Division of Administrative Hearings this 7th day of July, 1995.

APPENDIX

The parties filed proposed findings of fact which were read and considered. The following states which of those findings were adopted, and which were rejected and why:

Petitioner's Recommended Order Findings
Paragraphs 1,2: Paragraph 9
Paragraph 3: Rejected as contrary to most credible evidence.
Paragraph 4: Subsumed in Paragraph 9 et seq.
Paragraph 5a: Subsumed in Paragraphs 10,11.
Paragraph 5b: Subsumed in Paragraph 15 et seq.
Paragraph 5c: Subsumed in Paragraph 19 et seq., which are based upon the more credible evidence.
Paragraph 5d: Subsumed in Paragraph 26 et seq.
Paragraph 6a: Subsumed in Paragraphs 13,14.
Paragraph 6b: Subsumed in Paragraph 15 et seq.
Paragraph 6c: Subsumed in Paragraph 19 et seq.
Paragraph 6d: Subsumed in Paragraph 16 et seq.
Paragraph 7: Subsumed in Paragraphs 13-29.
Paragraphs 8,9: Rejected as Conclusion of Law.

Respondent's Recommended Order Findings
Paragraphs 1-4: Paragraphs 1-4
Paragraph 5: Paragraphs 6,7-11
Paragraph 6: Paragraphs 14,18,25,29,33,37,38.
Paragraph 7: Statement of Case.
Paragraph 8: Paragraph 39.
Paragraphs 9-11: Paragraph 42.
Paragraphs 12-14: Rejected as contrary to best evidence.
Paragraphs 15,16: Irrelevant.
Paragraph 17: Subsumed in 34.
Paragraphs 18-21: Subsumed in 10,11.
Paragraph 22: Paragraphs 9,12.
Paragraph 23: Paragraph 13.
Paragraph 24: Subsumed in 16,17.
Paragraphs 25,26: Subsumed in 15,16.
Paragraph 27: There was testimony to this effect which indicates that the evaluators considered issues which were not part of the RFP.
Paragraph 28: Rejected as contrary to best evidence. See discussion in Paragraph 53.
Paragraph 29: True, but this was not part of the criteria stated in the RFP.
Paragraphs 30-33: Subsumed in Paragraph 30 et seq.
Paragraphs 34-36: Subsumed in Paragraph 26 et seq.
Paragraph 37: Subsumed in Paragraph 30 et seq.
Paragraphs 38,39: Subsumed in Paragraph 34 et seq.
Paragraph 40: Paragraphs 34,35.
Paragraph 41: Paragraph 22.
Paragraph 42: Subsumed in specific findings addressed above.

Intervenor's Recommended Order Findings
Paragraphs 1,2: Paragraph 4.
Paragraph 3: Paragraph 7.
Paragraph 4: Paragraph 6.
Paragraph 5: Paragraphs 8-11.
Paragraph 5(Sic): Paragraphs 8,12.
Paragraph 6: Subsumed in Paragraphs 11,12,38.
Paragraph 7: Irrelevant.
Paragraph 8: Subsumed in Paragraph 38.
Paragraph 9: Adopted in part and rejected in part. See specific findings in paragraphs 1-42.
Paragraph 10: Conclusion of Law.

COPIES FURNISHED:

Mary C. Sorrell, Esquire
2275 Atlantic Blvd.
Jacksonville, FL 32266

William M. Woodyard, Esquire
Stephen Willis, Esquire
Department of Business and Professional Regulation
1940 North Monroe Street
Northwood Centre
Tallahassee, FL 32399-0750

Melissa Fletcher Allaman, Esquire
Thomas M. Ervin, Jr., Esquire
Ervin, Varn, Jacobs, Odom & Ervin
305 South Gadsden Street
Post Office Drawer 1170
Tallahassee, FL 32302

Delane Anderson, Acting Secretary
Department of Business and Professional Regulation
1940 North Monroe Street
Northwood Centre
Tallahassee, FL 32399-0750

Jack McRay, General Counsel
Department of Business and Professional Regulation
1940 North Monroe Street
Northwood Centre
Tallahassee, FL 32399-0750

Florida Laws (4): 120.53, 120.57, 120.68, 255.25
KPMG CONSULTING, INC. vs DEPARTMENT OF REVENUE, 02-001719BID (2002)
Division of Administrative Hearings, Florida Filed: Tallahassee, Florida May 01, 2002 Number: 02-001719BID Latest Update: Oct. 15, 2002

The Issue The issue to be resolved in this proceeding concerns whether the Department of Revenue (Department, DOR) acted clearly erroneously, contrary to competition, arbitrarily or capriciously when it evaluated the Petitioner's submittal in response to an Invitation to Negotiate (ITN) for a child support enforcement automated management system-compliance enforcement (CAMS CE) in which it awarded the Petitioner a score of 140 points out of a possible 230 points and disqualified the Petitioner from further consideration in the invitation to negotiate process.

Findings Of Fact

Procurement Background:

The Respondent, the DOR, is a state agency charged with the responsibility of administering the Child Support Enforcement Program (CSE) for the State of Florida, in accordance with Section 20.21(h), Florida Statutes. The DOR issued an ITN for the CAMS Compliance Enforcement implementation on February 1, 2002. This procurement is designed to give the Department a "state of the art system" that will meet all Federal and State Regulations and Policies for Child Support Enforcement, improve the effectiveness of collections of child support and automate enforcement to the greatest extent possible. It will automate data processing and other decision-support functions and allow rapid implementation of changes in regulatory requirements resulting from revised Federal and State Regulation Policies and Florida initiatives, including statutory initiatives.

CSE services suffer from dependence on an inadequate computer system known as the "FLORIDA System," which was not originally designed for CSE and is housed and administered in another agency. The current FLORIDA System cannot meet the Respondent's needs for automation and does not meet the Respondent's management and reporting requirements or its need for a more flexible system. The DOR needs a system that will ensure the integrity of its data, will allow the Respondent to consolidate some of the "stand-alone" systems it currently has in place to remedy certain deficiencies of the FLORIDA System, and will help the Child Support Enforcement system and program secure needed improvements. The CSE is also governed by Federal Policy, Rules and Reporting requirements concerning performance. In order to improve its effectiveness in responding to its business partners (the court system, the Department of Children and Family Services, the Sheriff's Departments, employers, financial institutions and workforce development boards), as well as to the Federal requirements, it has become apparent that the CSE agency needs a new computer system with the flexibility to respond to the complete requirements of the CSE program.

In order to accomplish its goal of acquiring a new computer system, the CSE began the procurement process. The Department hired a team from the Northrup Grumman Corporation, headed by Dr. Edward Addy, to lead the procurement development process. Dr. Addy began a process of defining CSE needs and then developing an ITN which reflected those needs. The process included many individuals in CSE who would be the daily users of the new system. These individuals included Andrew Michael Ellis, Revenue Program Administrator III for Child Support Enforcement Compliance Enforcement; Frank Doolittle, Process Manager for Child Support Enforcement Compliance Enforcement; and Harold Bankirer, Deputy Program Director for the Child Support Enforcement Program.

There are two alternative strategies for implementing a large computer system such as CAMS CE: a customized system developed especially for CSE or a Commercial Off The Shelf, Enterprise Resource Plan (COTS/ERP) system. A COTS/ERP system is a pre-packaged software program which is implemented as a system-wide solution. Because there is no existing COTS/ERP product for child support programs, the team recognized that customization would be required to make the product fit its intended use.
The team recognized that other system attributes were also important, such as the ability to convert "legacy data" and to address such factors as data base complexity and data base size.

The Evaluation Process:

The CAMS CE ITN put forth a tiered process for selecting vendors for negotiation. The first tier involved an evaluation of key proposal topics. The key topics were the vendor's past corporate experience (past projects) and its key staff. A vendor was required to score 150 out of a possible 230 points to enable it to continue to the next stage or tier of consideration in the procurement process. The evaluation team wanted to remove, at an early stage, vendors who did not have a serious chance of becoming the selected vendor. This would prevent an unnecessary expenditure of time and resources by both the CSE and the vendor.

The ITN required that the vendors provide three corporate references showing their past corporate experience for evaluation. In other words, the references involved past jobs they had done for other entities which showed relevant experience in relation to the ITN specifications. The Department provided forms to the vendors, who in turn provided them to the corporate references that they themselves had selected. The vendors also included in their proposals a summary of their corporate experience drafted by the vendors themselves. Table 8.2 of the ITN provided positive and negative criteria by which the corporate references would be evaluated. The list in Table 8.2 is not meant to be exhaustive and is in the nature of an "included but not limited to" standard. The vendors had the freedom to select references whose projects the vendors believed best fit the criteria upon which each proposal was to be evaluated.

For the key staff evaluation standard, the vendors provided summary sheets as well as résumés for each person filling a lead role as a key staff member on their proposed project team. Having a competent project team was deemed by the Department to be critical to the success of the procurement and implementation of a large project such as the CAMS CE. Table 8.2 of the ITN provided the criteria by which the key staff would be evaluated.

The Evaluation Team:

The CSE selected an evaluation team which included Dr. Addy, Mr. Ellis, Mr. Bankirer, Mr. Doolittle and Mr. Esser. Although Dr. Addy had not previously performed the role of an evaluator, he has responded to several procurements for Florida government agencies. He is familiar with Florida's procurement process and has a doctorate in Computer Science as well as seventeen years of experience in information technology. Dr. Addy was the leader of the Northrup Grumman team which primarily developed the ITN with the assistance of personnel from the CSE program itself. Mr. Ellis, Mr. Bankirer and Mr. Doolittle participated in the development of the ITN as well. Mr. Bankirer and Mr. Doolittle had previously been evaluators in other procurements for Federal and State agencies prior to joining the CSE program. Mr. Esser is the Chief of the Bureau of Information Technology at the Department of Highway Safety and Motor Vehicles and has experience in similar large computer system procurements at that agency. The evaluation team selected by the Department thus has extensive experience in computer technology, as well as knowledge of the requirements of the subject system.
The Department provided training regarding the evaluation process to the evaluators as well as a copy of the ITN, the Source Selection Plan and the Source Selection Team Reference Guide. Section 6 of the Source Selection Team Reference Guide entitled "Scoring Concepts" provided guidance to the evaluators for scoring proposals. Section 6.1 entitled "Proposal Evaluation Specification in ITN Section 8" states: Section 8 of the ITN describes the method by which proposals will be evaluated and scored. SST evaluators should be consistent with the method described in the ITN, and the source selection process documented in the Reference Guide and the SST tools are designed to implement this method. All topics that are assigned to an SST evaluator should receive at the proper time an integer score between 0 and 10 (inclusive). Each topic is also assigned a weight factor that is multiplied by the given score in order to place a greater or lesser emphasis on specific topics. (The PES workbook is already set to perform this multiplication upon entry of the score.) Tables 8-2 through 8-6 in the ITN Section 8 list the topics by which the proposals will be scored along with the ITN reference and evaluation and scoring criteria for each topic. The ITN reference points to the primary ITN section that describes the topic. The evaluation and scoring criteria list characteristics that should be used to affect the score negatively or positively. While these characteristics should be used by each SST evaluator, each evaluator is free to emphasize each characteristic more or less than any other characteristic. In addition, the characteristics are not meant to be inclusive, and evaluators may consider other characteristics that are not listed . . . (Emphasis supplied). The preponderant evidence demonstrates that all the evaluators followed these instructions in conducting their evaluations and none used a criterion that was not contained in the ITN, either expressly or implicitly. Scoring Method: The ITN used a 0 to 10 scoring system. The Source Selection Team Guide required that the evaluators use whole integer scores. They were not required to start at "7," which was the average score necessary to achieve a passing 150 points, and then to score up or down from 7. The Department also did not provide guidance to the evaluators regarding a relative value of any score, i.e., what is a "5" as opposed to a "6" or a "7." There is no provision in the ITN which establishes a baseline score or starting point from which the evaluators were required to adjust their scores. The procurement development team had decided to give very little structure to the evaluators as they wanted to have each evaluator score based upon his or her understanding of what was in the proposal. Within the ITN the development team could not sufficiently characterize every potential requirement, in the form that it might be submitted, and provide the consistency of scoring that one would want in a competitive environment. This open-ended approach is a customary method of scoring, particularly in more complex procurements in which generally less guidance is given to evaluators. 
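The quoted scoring instructions reduce to a weighted sum: each topic receives an integer score from 0 to 10, each score is multiplied by the topic's weight factor, and the weighted scores are added and compared against the 150-of-230 threshold needed to advance. The Python sketch below illustrates only that mechanism; the topic names and weights are hypothetical (the actual weights were carried in the PES workbook and are not reproduced in the record) and were chosen solely so that the maximum possible total is 230.

# Illustrative sketch only: weighted 0-10 scoring against the 150-of-230
# threshold. The topic names and weights below are hypothetical; they are
# chosen so that the maximum possible total is 230 (sum of weights = 23).

TOPIC_WEIGHTS = {
    "corporate_reference_1": 3,
    "corporate_reference_2": 3,
    "corporate_reference_3": 3,
    "key_staff": 14,
}

def weighted_total(raw_scores):
    """Sum integer 0-10 topic scores multiplied by their weight factors."""
    for topic, score in raw_scores.items():
        if not (isinstance(score, int) and 0 <= score <= 10):
            raise ValueError(f"{topic}: scores must be integers from 0 to 10")
    return sum(TOPIC_WEIGHTS[topic] * score for topic, score in raw_scores.items())

sixes = {topic: 6 for topic in TOPIC_WEIGHTS}
sevens = {topic: 7 for topic in TOPIC_WEIGHTS}
print(weighted_total(sixes), weighted_total(sixes) >= 150)    # 138 False
print(weighted_total(sevens), weighted_total(sevens) >= 150)  # 161 True

Under these assumed weights, uniform scores of 6 total 138 and fall short of the threshold, while uniform scores of 7 total 161 and pass, which is consistent with the testimony that an average score of roughly 7 was needed to reach 150 points.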
Providing precise guidance regarding the relative value of any score, regarding the imposition of a baseline score or starting point, from which evaluators were required to adjust their scores, instruction as to weighing of scores and other indicia of precise structure to the evaluators would be more appropriate where the evaluators themselves were not sophisticated, trained and experienced in the type of computer system desired and in the field of information technology and data retrieval generally. The evaluation team, however, was shown to be experienced and trained in information technology and data retrieval and experienced in complex computer system procurement. Mr. Barker is the former Bureau Chief of Procurement for the Department of Management Services. He has 34 years of procurement experience and has participated in many procurements for technology systems similar to CAMS CE. He established that the scoring system used by the Department at this initial stage of the procurement process is a common method. It is customary to leave the numerical value of scores to the discretion of the evaluators based upon each evaluator's experience and review of the relevant documents. According wider discretion to evaluators in such a complex procurement process tends to produce more objective scores. The evaluators scored past corporate experience (references) and key staff according to the criteria in Table 8.2 of the ITN. The evaluators then used different scoring strategies within the discretion accorded to them by the 0 to 10 point scale. Mr. Bankirer established a midrange of 4 to 6 and added or subtracted points based upon how well the proposal addressed the CAMS CE requirements. Evaluator Ellis used 6 as his baseline and added or subtracted points from there. Dr. Addy evaluated the proposals as a composite without a starting point. Mr. Doolittle started with 5 as an average score and then added or subtracted points. Mr. Esser gave points for each attribute in Table 8.2, for key staff, and added the points for the score. For the corporate reference criterion, he subtracted a point for each attribute the reference lacked. As each of the evaluators used the same methodology for the evaluation of each separate vendor's proposal, each vendor was treated the same and thus no specific prejudice to KPMG was demonstrated. Corporate Reference Evaluation: KPMG submitted three corporate references: Duke University Health System (Duke), SSM Health Care (SSM), and Armstrong World Industries (Armstrong). Mr. Bankirer gave the Duke reference a score of 6, the SSM reference a score of 5 and the Armstrong reference a score of 7. Michael Strange, the KPMG Business Development Manager, believed that 6 was a low score. He contended that an average score of 7 was required to make the 150-point threshold for passage to the next level of the ITN consideration. Therefore, a score of 7 would represent minimum compliance, according to Mr. Strange. However, neither the ITN nor the Source Selection Team Guide identified 7 as a minimally compliant score. Mr. Strange's designation of 7 as a minimally compliant score is not provided for in the specifications or the scoring instructions. Mr. James Focht, Senior Manager for KPMG testified that 6 was a low score, based upon the quality of the reference that KPMG had provided. However, Mr. 
Bankirer found that the Duke reference was actually a small-sized project, with little system development attributes, and that it did not include information regarding a number of records, the data base size involved, the estimated and actual costs and attributes of data base conversion. Mr. Bankirer determined that the Duke reference had little similarity to the CAMS CE procurement requirements and did not provide training or data conversion as attributes for the Duke procurement which are attributes necessary to the CAMS CE procurement. Mr. Strange and Mr. Focht admitted that the Duke reference did not specifically contain the element of data conversion and that under the Table 8.2, omission of this information would negatively affect the score. Mr. Focht admitted that there was no information in the Duke Health reference regarding the number of records and the data base size, all of which factors diminish the quality of Duke as a reference and thus the score accorded to it. Mr. Strange opined that Mr. Bankirer had erred in determining that the Duke project was a significantly small sized project since it only had 1,500 users. Mr. Focht believed that the only size criterion in Table 8.2 was the five million dollar cost threshold, and, because KPMG indicated that the project cost was greater than five million dollars, that KPMG had met the size criterion. Mr. Focht believed that evaluators had difficulty in evaluating the size of the projects in the references due to a lack of training. Mr. Focht was of the view that the evaluator should have been instructed to make "binary choices" on issues such as size. He conceded, however, that evaluators may have looked at other criteria in Table 8.2 to determine the size of the project, such as database size and number of users. However, the corporate references were composite scores by the evaluators, as the ITN did not require separate scores for each factor in Table 8.2. Therefore, Mr. Focht's focus on binary scoring for size, to the exclusion of other criteria, mis-stated the objective of the scoring process. The score given to the corporate references was a composite of all of the factors in Table 8.2, and not merely monetary value size. Although KPMG apparently contends that size, in terms of dollar value, is the critical factor in determining the score for a corporate reference, the vendor questions and answers provided at the pre-proposal conference addressed the issue of relevant criteria. Question 40 of the vendor questions and answers, Volume II, did not single out "project greater than five million dollars" as the only size factor or criterion. QUESTION: Does the state require that each reference provided by the bidder have a contract value greater than $5 million; and serve a large number of users; and include data conversion from a legacy system; and include training development? ANSWER: To get a maximum score for past corporate experience, each reference must meet these criteria. If the criteria are not fully met, the reference will be evaluated, but will be assigned a lower score depending upon the degree to which the referenced project falls short of these required characteristics. Therefore, the cost of the project is shown to be only one component of a composite score. Mr. Strange opined that Mr. Bankirer's comment regarding the Duke reference, "little development, mostly SAP implementation" was irrelevant. Mr. 
Strange's view was that the CAMS CE was not a development project and Table 8.2 did not specifically list development as a factor on which proposals would be evaluated. Mr. Focht stated that in his belief Mr. Bankirer's comment suggested that Mr. Bankirer did not understand the link between the qualifications in the reference and the nature of KPMG's proposal. Both Strange and Focht believe that the ITN called for a COTS/ERP solution. Mr. Focht stated that the ITN references a COTS/ERP approach numerous times. Although many of the references to COTS/ERP in the ITN also refer to development, Mr. Strange also admitted that the ITN was open to a number of approaches. Furthermore, both the ITN and the Source Selection Team Guide stated that the items in Table 8.2 are not all inclusive and that the evaluators may look to other factors in the ITN. Mr. Bankirer noted that there is no current CSE COTS/ERP product on the market. Therefore, some development will be required to adapt an off-the-shelf product to its intended use as a child support case management system. Mr. Bankirer testified that the Duke project was a small-size project with little development. Duke has three sites while CSE has over 150 sites. Therefore, the Duke project is smaller than CAMS. There was no information provided in the KPMG submittal regarding data base size and number of records with regard to the Duke project. Mr. Bankirer did not receive the information he needed to infer a larger sized-project from the Duke reference. Mr. Esser also gave the Duke reference a score of 6. The reference did not provide the data base information required, which was the number of records in the data base and the number of "gigabytes" of disc storage to store the data, and there was no element of legacy conversion. Dr. Addy gave the Duke reference a score of 5. He accepted the dollar value as greater than five million dollars. He thought that the Duke Project may have included some data conversion, but it was not explicitly stated. The Duke customer evaluated training so he presumed training was provided with the Duke project. The customer ratings for Duke were high as he expected they would be, but similarity to the CAMS CE system was not well explained. He looked at size in terms of numbers of users, number of records and database size. The numbers that were listed were for a relatively small-sized project. There was not much description of the methodology used and so he gave it an overall score of 5. Mr. Doolittle gave the Duke reference a score of 6. He felt that it was an average response. He listed the number of users, the number of locations, that it was on time and on budget, but found that there was no mention of data conversion, database size or number of records. (Consistent with the other evaluators). A review of the evaluators comments makes it apparent that KPMG scores are more a product of a paucity of information provided by KPMG corporate references instead of a lack of evaluator knowledge of the material being evaluated. Mr. Ellis gave a score of 6 for the Duke reference. He used 6 as his baseline. He found the required elements but nothing more justifying in his mind raising the score above 6. Mr. Focht and Mr. Strange expressed the same concerns regarding Bankirer's comment, regarding little development, for the SSM Healthcare reference as they had for the Duke Health reference. However, both Mr. Strange and Mr. Focht admitted that the reference provided no information regarding training. Mr. 
Strange admitted that the reference had no information regarding data conversion. Training and data conversion are criteria contained in Table 8.2. Mr. Strange also admitted that KPMG had access to Table 8.2 before the proposal was submitted and could have included the information in the proposal. Mr. Bankirer gave the SSM reference a score of 5. He commented that the SAP implementation was not relevant to what the Department was attempting to do with the CAMS CE system. CAMS CE does not have any materials management or procurement components, which was the function of the SAP components and the SSM reference procurement or project. Additionally, there was no training indicated in the SSM reference. Mr. Esser gave the SSM reference a score of 3. His comments were "no training provided, no legacy data conversion, project evaluation was primarily for SAP not KPMG". However, it was KPMG's responsibility in responding to the ITN to provide project information concerning a corporate reference in a clear manner rather than requiring that an evaluator infer compliance with the specifications. Mr. Focht believed that legacy data conversion could be inferred from the reference's description of the project. Mr. Strange opined that Mr. Esser's comment was inaccurate as KPMG installed SAP and made the software work. Mr. Esser gave the SSM reference a score of 3 because the reference described SAP's role, but not KPMG's role in the installation of the software. When providing information in the reference SSM gave answers relating to SAP to the questions regarding system capability, system usability, system reliability but did not state KPMG's role in the installation. SAP is a large enterprise software package. This answer created an impression of little KPMG involvement in the project. Dr. Addy gave the SSM reference a score of 6. Dr. Addy found that the size was over five million dollars and customer ratings were high except for a 7 for usability with reference to a "long learning curve" for users. Data conversion was implied. There was no strong explanation of similarity to CAMS CE. It was generally a small-sized project. He could reason some similarity into it, even though it was not well described in the submittal. Mr. Doolittle gave the SSM reference a score of 6. Mr. Doolittle noted, as positive factors, that the total cost of the project was greater than five million dollars, that it supported 24 sites and 1,500 users as well "migration from a mainframe." However, there were negative factors such as training not being mentioned and a long learning curve for its users. Mr. Ellis gave a score of 6 for SSM, feeling that KPMG met all of the requirements but did not offer more than the basic requirements. Mr. Strange opined that Mr. Bankirer, Dr. Addy and Mr. Ellis (evaluators 1, 5 and 4) were inconsistent with each other in their evaluation of the SSM reference. He stated that this inconsistency showed a flaw in the evaluation process in that the evaluators did not have enough training to uniformly evaluate past corporate experience, thereby, in his view, creating an arbitrary evaluation process. Mr. Bankirer gave the SSM reference a score of 5, Ellis a score of 6, and Addy a score of 6. Even though the scores were similar, Mr. Strange contended that they gave conflicting comments regarding the size of the project. Mr. 
Ellis stated that the size of the project was hard to determine as the cost was listed as greater than five million dollars and the database size given, but the number of records was not given. Mr. Bankirer found that the project was low in cost and Dr. Addy stated that over five million dollars was a positive factor in his consideration. However, the evaluators looked at all of the factors in Table 8.2 in scoring each reference. Other factors that detracted from KPMG's score for the SSM reference were: similarity to the CAMS system not being explained, according to Dr. Addy; no indication of training (all of the evaluators); the number of records not being provided (evaluator Ellis); little development shown (Bankirer) and usability problems (Dr. Addy). Mr. Strange admitted that the evaluators may have been looking at other factors besides the dollar value size in order to score the SSM reference. Mr. Esser gave the Armstrong reference a score of 6. He felt that the reference did not contain any database information or cost data and that there was no legacy conversion shown. Dr. Addy also gave Armstrong a score of 6. He inferred that this reference had data conversion as well as training and the high dollar volume which were all positive factors. He could not tell, however, from the project description, what role KPMG actually had in the project. Mr. Ellis gave a score of 7 for the Armstrong reference stating that the Armstrong reference offered more information regarding the nature of the project than had the SSM and Duke references. Mr. Bankirer gave KPMG a score of 7 for the Armstrong reference. He found that the positive factors were that the reference had more site locations and offered training but, on the negative side, was not specific regarding KPMG's role in the project. Mr. Focht opined that the evaluators did not understand the nature of the product and services the Department was seeking to obtain as the Department's training did not cover the nature of the procurement and the products and services DOR was seeking. However, when he made this statement he admitted he did not know the evaluators' backgrounds. In fact, Bankirer, Ellis, Addy and Doolittle were part of a group that developed the ITN and clearly knew what CSE was seeking to procure. Further, Mr. Esser stated that he was familiar with COTS and described it as a commercial off-the-shelf software package. Mr. Esser explained that an ERP solution or Enterprise Resource Plan is a package that is designed to do a series of tasks, such as produce standard reports and perform standard operations. He did not believe that he needed further training in COTS/ERP to evaluate the proposals. Mr. Doolittle was also familiar with COTS/ERP and believed, based on the amount of funding, that it was a likely response to the ITN. Dr. Addy's doctoral dissertation research was in the area of software re-use. COTS is one of the components that comprise a development activity and re-use. He became aware during his research of how COTS packages are used in software engineering. He has also been exposed to ERP packages. ERP is only one form of a COTS package. In regard to the development of the ITN and the expectations of the development team, Dr. Addy stated that they were amenable to any solution that met the requirements of the ITN. They fully expected the compliance solutions were going to be comprised of mostly COTS and ERP packages. Furthermore, the ITN in Section 1.1, on page 1-2 states, ". . . 
FDOR will consider an applicable Enterprise Resource Planning (ERP) or Commercial Off the Shelf (COTS) based solution in addition to custom development." Clearly, this ITN was an open procurement and to train evaluators on only one of the alternative solutions would have biased the evaluation process. Mr. Doolittle gave each of the KPMG corporate references a score of 6. Mr. Strange and Mr. Focht questioned the appropriateness of these scores as the corporate references themselves gave KPMG average ratings of 8.3, 8.2 and 8.0. However, Mr. Focht admitted that Mr. Doolittle's comments regarding the corporate references were a mixture of positive and negative comments. Mr. Focht believed, however, that as the reference corporations considered the same factors for providing ratings on the reference forms, that it was inconsistent for Mr. Doolittle to separately evaluate the same factors that the corporations had already rated. However, there is no evidence in the record that KPMG provided Table 8.2 to the companies completing the reference forms and that the companies consulted the table when completing their reference forms. Therefore, KPMG did not prove that it had taken all measures available to it to improve its scores. Moreover, Mr. Focht's criticism would impose a requirement on Mr. Doolittle's evaluation which was not supported by the ITN. Mr. Focht admitted that there was no criteria in the ITN which limited the evaluator's discretion in scoring to the ratings given to the corporate references by those corporate reference customers. All of the evaluators used Table 8.2 as their guide for scoring the corporate references. As part of his evaluation, Dr. Addy looked at the methodology used by the proposers in each of the corporate references to implement the solution for that reference company. He was looking at methodology to determine its degree of similarity to CAMS CE. While not specifically listed in Table 8.2 as a similarity to CAMS, Table 8.2 states that the list is not all inclusive. Clearly, methodology is a measure of similarity and therefore is not an arbitrary criterion. Moreover, as Dr. Addy used the same process and criteria in evaluating all of the proposals there was no prejudice to KPMG by use of this criterion since all vendors were subjected to it. Mr. Strange stated that KPMG appeared to receive lower scores for SAP applications than other vendors. For example, evaluator 1 gave a score of 7 to Deloitte's reference for Suntax. Suntax is an SAP implementation. It is difficult to draw comparisons across vendors, yet the evaluators consistently found that KPMG references lacked key elements such as data conversion, information on starting and ending costs, and information on database size. All of these missing elements contributed to a reduction in KPMG's scores. Nevertheless, KPMG received average scores of 5.5 for Duke, 5.7 for SSM and 6.3 for Armstrong, compared with the score of 7 received by Deloitte for Suntax. There is only a gap of 1.5 to .7 points between Deloitte and KPMG's scores for SAP implementations, despite the deficient information within KPMG's corporate references. Key Staff Criterion: The proposals contain a summary of the experience of key staff and attached résumés. KPMG's proposed key staff person for Testing Lead was Frank Traglia. Mr. Traglia's summary showed that he had 25-years' experience respectively, in the areas of child support enforcement, information technology, project management and testing. 
Strange and Focht admitted that Traglia's résumé did not specifically list any testing experience. Mr. Focht further admitted that it was not unreasonable for evaluators to give the Testing Lead a lower score due to the lack of specific testing information in Traglia's résumé. Mr. Strange explained that the résumé was from a database of résumés. The summary sheet, however, was prepared by those KPMG employees who prepared the proposal. All of the evaluators resolved the conflicting information between the summary sheet and the résumé by crediting the résumé as more accurate. Each evaluator thought that the résumé was more specific and expected to see specific information regarding testing experience on the résumé for someone proposed as the Testing Lead person. Evaluators Addy and Ellis gave scores to the Testing Lead criterion of 4 and 5. Mr. Ron Vandenberg (evaluator 8) gave the Testing Lead a score of 9. Mr. Vandenberg was the only evaluator to give the Testing Lead a high score. The other evaluators gave the Testing Lead an average score of 4.2. The Vandenberg score thus appears anomalous. All of the evaluators gave the Testing Lead a lower score as it did not specifically list testing experience. Dr. Addy found that the summary sheet listed 25-years of experience in child support enforcement, information technology, and project management and system testing. As he did not believe this person had 100 years of experience, he assumed those experience categories ran concurrently. A strong candidate for Testing Lead should demonstrate a combination of testing experience, education and certification, according to Dr. Addy. Mr. Doolittle also expected to see testing experience mentioned in the résumé. When evaluating the Testing Lead, Mr. Bankirer first looked at the team skills matrix and found it interesting that testing was not one of the categories of skills listed for the Testing Lead. He then looked at the summary sheet and résumé from Mr. Traglia. He gave a lower score to Traglia as he thought that KPMG should have put forward someone with demonstrable testing experience. The evaluators gave a composite score to key staff based on the criteria in Table 8.2. In order to derive the composite score that he gave each staff person, Mr. Esser created a scoring system wherein he awarded points for each attribute in Table 8.2 and then added the points together to arrive at a composite score. Among the criteria he rated, Mr. Esser awarded points for CSE experience. Mr. Focht and Mr. Strange contended that since the term CSE experience is not actually listed in Table 8.2 that Mr. Esser was incorrect in awarding points for CSE experience in his evaluation. Table 8.2 does refer to relevant experience. There is no specific definition provided in Table 8.2 for relevant experience. Mr. Focht stated that relevant experience is limited to COTS/ERP experience, system development, life cycle and project management methodologies. However, these factors are also not listed in Table 8.2. Mr. Strange limited relevance to experience in the specific role for which the key staff person was proposed. This is a limitation that also is not imposed by Table 8.2. CSE experience is no more or less relevant than the factors posited by KPMG as relevant experience. Moreover, KPMG included a column in its own descriptive table of key staffs for CSE experience. KPMG must have seen this information as relevant if it included it in its proposal as well. 
Inclusion of this information in its proposal demonstrated that KPMG must have believed CSE experience was relevant at the time its submitted its proposal. Mr. Strange held the view that, in the bidders conference in a reply to a vendor question, the Department representative stated that CSE experience was not required. Therefore, Mr. Esser could not use such experience to evaluate key staff. Question 47 of the Vendor Questions and Answers, Volume 2 stated: QUESTION: In scoring the Past Corporate Experience section, Child Support experience is not mentioned as a criterion. Would the State be willing to modify the criteria to include at least three Child Support implementations as a requirement? ANSWER: No. However, a child support implementation that also meets the other characteristics (contract value greater than $5 million, serves a large number of users, includes data conversion from a legacy system and includes training development) would be considered "similar to CAMS CE." The Department's statement involved the scoring of corporate experience not key staff. It was inapplicable to Mr. Esser's scoring system. Mr. Esser gave the Training Lead a score of 1. According to Esser, the Training Lead did not have a ten-year résumé, for which he deducted one point. The Training Lead had no specialty certification or extensive experience and had no child support experience and received no points. Mr. Esser added one point for the minimum of four years of specific experience and one point for the relevance of his education. Mr. Esser gave the Project Manager a score of 5. The Project Manager had a ten-year résumé and required references and received a point for each. He gave two points for exceeding the minimum required informational technology experience. The Project Manager had twelve years of project management experience for a score of one point, but lacked certification, a relevant education and child support enforcement experience for which he was accorded no points. Mr. Esser gave the Project Liaison person a score of According to Mr. Focht, the Project Liaison should have received a higher score since she has a professional history of having worked for the state technology office. Mr. Esser, however, stated that she did not have four years of specific experience and did not have extensive experience in the field, although she had a relevant education. Mr. Esser gave the Software Lead person a score of 4. The Software Lead, according to Mr. Focht, had a long set of experiences with implementing SAP solutions for a wide variety of different clients and should have received a higher score. Mr. Esser gave a point each for having a ten-year résumé, four years of specific experience in software, extensive experience in this area and relevant education. According to Mr. Focht the Database Lead had experience with database pools including the Florida Retirement System and should have received more points. Mr. Strange concurred with Mr. Focht in stating that Esser had given low scores to key staff and stated that the staff had good experience, which should have generated more points. Mr. Strange believed that Mr. Esser's scoring was inconsistent but provided no basis for that conclusion. Other evaluators also gave key staff positions scores of less than 7. Dr. Addy gave the Software Lead person a score of 5. The Software Lead had 16 years of experience and SAP development experience as positive factors but had no development lead experience. 
He had a Bachelor of Science and a Master of Science in Mechanical Engineering and a Master's in Business Administration, which were not good matches in education for the role of a Software Lead person. Dr. Addy gave the Training Lead person a score of 5. The Training Lead had six years of consulting experience, a background in SAP consulting and some training experience but did not have certification or education in training. His educational background also was electrical engineering, which is not a strong background for a training person. Dr. Addy gave the subcontractor managers a score of 5. Two of the subcontractors did not list managers at all, which detracted from the score. Mr. Doolittle gave the Training Lead person a midrange score; he believed that, based on his experience and training, it was an average response. Table 8.2 contained an item in which a proposer could have points deducted from a score if the key staff person's references were not excellent. The Department did not check references at this stage in the evaluation process. As a result, the evaluators simply did not consider that item when scoring. No proposer's score was adversely affected thereby. KPMG contends that checking references would have given the evaluators greater insight into the work done by those individuals and their relevance and capabilities in the project team. Mr. Focht admitted, however, that any claimed effect on KPMG's score is conjectural. Mr. Strange stated that, without reference checks, information in the proposals could not be validated, but he provided no basis for his opinion that reference checking was necessary at this preliminary stage of the evaluation process. Dr. Addy stated that the process called for checking references during the timeframe of oral presentations. The evaluators did not expect the references to change any scores at this point in the process. KPMG asserted that references should be checked to ascertain the veracity of the information in the proposals. However, even if the information in some other proposal was inaccurate, it would not change the outcome for KPMG. KPMG would still not have the required number of points to advance to the next evaluation tier. Divergency in Scores The Source Selection Plan established a process for resolving divergent scores. Any item receiving scores with a range of 5 or more was determined to be divergent. The plan provided that the Coordinator identify divergent scores and then report to the evaluators that there were divergent scores for that item. The Coordinator was precluded from telling an evaluator whether his score was the divergent score, i.e., the highest or lowest score. Evaluators would then review that item, but were not required to change their scores. The purpose of the divergent score process was to have evaluators review their scores to see if there were any misperceptions or errors that skewed the scores. The team wished to avoid having any influence on the evaluators' scores. Mr. Strange testified that the Department did not follow the divergent score process in the Source Selection Plan as the coordinator did not tell the evaluators why the scores were divergent. Mr. Strange stated that the evaluator should have been informed which scores were divergent. The Source Selection Plan merely instructed the coordinator to inform the evaluators of the reason why the scores were divergent. Scores were inherently divergent if there was a five-point score spread; the reason for the divergence was self-explanatory. 
The evaluators stated that they scored the proposals, submitted the scores, and each received an e-mail from Debbie Stephens informing him that there were divergent scores and that he should consider re-scoring. None of the evaluators ultimately changed their scores. Mr. Esser's scores were the lowest of the divergent scores, but he did not re-score his proposals as he had spent a great deal of time on the initial scoring and felt his scores to be valid. Neither witness Focht nor witness Strange for KPMG provided more than speculation regarding the effect of the divergent scores on KPMG's ultimate score and any role the divergent scoring process may have had in KPMG not attaining the 150-point passing score. Deloitte - Suntax Reference: Susan Wilson, a Child Support Enforcement employee connected with the CAMS project, signed a reference for Deloitte Consulting regarding the Suntax System. Mr. Focht was concerned that the evaluators were influenced by her signature on the reference form. Mr. Strange further stated that having someone who is heavily involved in the project sign a reference did not appear to be fair. He was not able to state any positive or negative effect on KPMG by Wilson's reference for Deloitte, however. Evaluator Esser has met Susan Wilson but has had no significant professional interaction with her. He could not recall anything that he knew about Ms. Wilson that would favorably influence him in scoring the Deloitte reference. Dr. Addy also was not influenced by Wilson. Mr. Doolittle has only worked with Wilson for a very short time and did not know her well. He has also evaluated other proposals where department employees were a reference and was not influenced by that either. Mr. Ellis has only known Wilson for two to four months. Her signature on the reference form did not influence him either positively or negatively. Mr. Bankirer had not known Wilson for a long time when he evaluated the Suntax reference. He took the reference at face value and was not influenced by Wilson's signature. It is not unusual for someone within an organization to create a reference for a company that is competing for work to be done for the organization.

Recommendation Having considered the foregoing Findings of Fact, Conclusions of Law, the evidence of record and the pleadings and arguments of the parties, it is, therefore, RECOMMENDED that a final order be entered by the State of Florida Department of Revenue upholding the proposed agency action which disqualified KPMG from further participation in the evaluation process regarding the subject CAMS CE Invitation to Negotiate. DONE AND ENTERED this 26th day of September, 2002, in Tallahassee, Leon County, Florida. P. MICHAEL RUFF Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with Clerk of the Division of Administrative Hearings this 26th day of September, 2002. COPIES FURNISHED: Cindy Horne, Esquire Earl Black, Esquire Department of Revenue Post Office Box 6668 Tallahassee, Florida 32399-0100 Robert S. Cohen, Esquire D. Andrew Byrne, Esquire Cooper, Byrne, Blue & Schwartz, LLC 1358 Thomaswood Drive Tallahassee, Florida 32308 Seann M. Frazier, Esquire Greenburg, Traurig, P.A. 101 East College Avenue Tallahassee, Florida 32302 Bruce Hoffmann, General Counsel Department of Revenue 204 Carlton Building Tallahassee, Florida 32399-0100 James Zingale, Executive Director Department of Revenue 104 Carlton Building Tallahassee, Florida 32399-0100

Florida Laws (3) 120.569, 120.57, 20.21
# 3
GEORGIOS GAITANTZIS vs FLORIDA ENGINEERS MANAGEMENT CORPORATION, 98-004757 (1998)
Division of Administrative Hearings, Florida Filed:Jacksonville, Florida Oct. 26, 1998 Number: 98-004757 Latest Update: Apr. 20, 1999

The Issue Did Petitioner pass the Mechanical Engineers Examination he took on April 24, 1998?

Findings Of Fact On April 24, 1998, Petitioner took the Mechanical Engineers Examination. He received a score of 69 for his effort. A passing score was 70. The Mechanical Engineers Examination was administered under Respondent's auspices. As alluded to in the preliminary statement, Petitioner challenged the score received on problem 146. The maximum score available for that problem was ten points. Petitioner received eight points. In accordance with the National Council of Examiners for Engineering and Surveying Principles and Practice of Engineering Examinations for spring 1998, score conversion table - discipline specific, Petitioner had a raw score of 47, which equated to a converted score of 69, including the eight raw points received for problem 146. In addition, the examination provided a scoring plan for problem 146, which assigns scores in increments of two points from zero to ten. To pass, it would be necessary for Petitioner to receive an incremental increase of two points, raising his score from eight points to ten points. This would give him a raw score of 49 points. According to the score conversion table - discipline specific, that would give Petitioner 71 points. According to the scoring plan for problem 146, to receive the ten points, Petitioner would have to demonstrate: Exceptional competence (it is not necessary that the solution to the problem be perfect) generally complete, one math error. Shows in-depth understanding of cooling load calculation psychrometrics. Problem 146 required Petitioner to: "Determine the required cooling coil supply air quantity (cfm) and the conditions (°F db and °F wb) of the air entering and leaving the coil." Petitioner was provided a psychrometric chart to assist in solving problem 146. The examination candidates were also allowed to bring reference sources to the examination to assist in solving the examination problems. Petitioner brought to the examination the Air-Conditioning Systems Design Manual prepared by the ASHRAE 581-RP Project Team, Harold G. Lorsch, Principal Investigator. Petitioner used that manual to determine the wet-bulb temperature of the air entering the coil. In particular, he used an equation from the manual involving air mixtures. For that part of the solution he arrived at a temperature of 65.6°F wb. According to the problem solution by Respondent's affiliate testing agency (reference: ASHRAE Fundamentals, Chapter 26), the coil entering wet-bulb temperature taken from the psychrometric chart was 66.12°F wb. The scorer, in grading Petitioner's solution for problem 146, placed an "x" by the answer provided, 65.6°F wb, and wrote the words "psychrometric chart." No other entry or comment was made by that scorer in initially reviewing the solution Petitioner provided for that problem. This led to the score of eight. The scoring plan for problem 146 for the April 1998 examination taken by Petitioner describes the score of eight as: MORE THAN MINIMUM BUT LESS THAN EXCEPTIONAL COMPETENCE Either a) Provides correct solution to problem with two math errors or incorrect dry-bulb or wet-bulb for coil entering or leaving conditions or minor total cooling load error, or b) Provides correct solution to items c and d correctly and minor math errors in items a and b of Score 6 below. Petitioner was entitled to review the results of his examination. He exercised that opportunity on September 21, 1998, through a post-examination review session. Petitioner requested and was provided re-scoring of his solution to problem 146. 
According to correspondence from the National Council of Examiners for Engineering and Surveying to the Florida Member Board from Patricia M. Simpson, Assistant Supervisor of scoring services, the score did not change through re-scoring. In this instance, the October 14, 1998 correspondence on re-scoring states, in relation to problem 146: Incorrect methodology used in calculating coil entering wet-bulb temperature. Incorrect coil entering wet-bulb temperature provided. No calculation provided for coil leaving temperature conditions. The coil leaving wet-bulb temperature in Respondent's proposed solution was 53.22°F wb, taken from the psychrometric chart. Petitioner's solution for the coil leaving wet-bulb temperature taken from the psychrometric chart was 53.3°F wb. At hearing, Respondent did not provide an expert to establish the basis for point deduction in the original score and the re-scoring of Petitioner's solution for problem 146. Moreover, Respondent did not present expert witnesses to defend the commentary, the preferred written solution in its examination materials. Consequently, Respondent's preferred solution constitutes hearsay about which no facts may be found accepting the validity of Respondent's proposed solution, as opposed to merely reporting that information.1 By contrast, Petitioner provided direct evidence concerning the solution provided for problem 146 in response to the criticisms of his solution that were unsupported by competent evidence at hearing. More importantly, the criticisms were responded to at hearing by Geoffrey Spencer, P.E., a mechanical engineer licensed to practice in Florida, who was accepted as an expert in that field for purposes of the hearing. As Petitioner explained at hearing, he used the Air-Conditioning Systems Design Manual equation to arrive at the coil entering wet-bulb temperature, which he believed would provide the answer as readily as the use of the psychrometric chart. (Although the psychrometric chart had been provided to Petitioner for solving problem 146, the instructions for that problem did not prohibit the use of the equation or formula.) Petitioner in his testimony pointed out the equivalence of using the psychrometric chart and using the equation. Petitioner deemed the equation to be more accurate than the psychrometric chart. Petitioner had a concern that if the answer on the coil entering wet-bulb temperature was inaccurate, this would present difficulty in solving the rest of problem 146 because the error would be carried forward. Petitioner pointed out in his testimony that the solution for determining the coil entering wet-bulb temperature was set out in his answer. Deriving the answer by use of the formula was more time-consuming but less prone to error, according to Petitioner's testimony. Petitioner points out in his testimony that the answer he derived, 65.6°F wb, is not significantly different from Respondent's proposed solution of 66.12°F wb. (The instructions concerning problem 146 did not explain to what fraction of a degree the candidate had to respond in order to get full credit for that portion of the solution to the problem.) Petitioner in his testimony concerning his solution for the coil leaving wet-bulb temperature indicated that the calculation for arriving at that temperature was taken from the psychrometric chart and is sufficiently detailed to be understood. 
Further, Petitioner testified that the degree of accuracy in which the answer was given as 53.3°F wb, as opposed to Respondent's proposed solution of 53.22°F wb, is in recognition of the use of the psychrometric chart. Petitioner questions whether the proposed solution by Respondent, two decimal points, could be arrived at by the use of the psychrometric chart. In relation to the calculation of the coil entering wet-bulb temperature, Mr. Spencer testified that the formula from the Air-Conditioning Systems Design Manual or the psychrometric chart could have been used. Moreover, Mr. Spencer stated his opinion that the solution for coil entering wet-bulb temperature of 65.6°F wb by Petitioner is sufficiently close to Respondent's proposed solution of 66.12°F wb to be acceptable. Mr. Spencer expressed the opinion that Petitioner had correctly used the formula from the manual in solving the coil entering wet-bulb temperature. Mr. Spencer expressed the opinion that the psychrometric chart is an easier source for obtaining the solution than the use of the formula from the manual. In Mr. Spencer's opinion, the formula shows a more basic knowledge of the physics involved than the use of the psychrometric chart would demonstrate. In relation to the coil leaving wet-bulb temperature, Mr. Spencer expressed the opinion that Petitioner had adequately explained the manner of deriving the answer. Further, Mr. Spencer expressed the opinion that the answer derived was sufficiently accurate. The testimony of Petitioner and opinion of Mr. Spencer is unrefuted and accepted.

Recommendation Upon consideration of the facts found and conclusions of law reached, it is RECOMMENDED: That a final order be entered which finds that Petitioner passed the Florida Board of Professional Engineers April 24, 1998, Mechanical Engineers Examination with a score of 71. DONE AND ENTERED this 22nd day of February, 1999, in Tallahassee, Leon County, Florida. CHARLES C. ADAMS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 22nd day of February, 1999.

Florida Laws (2) 120.569, 120.57
# 4
JOHN D. WATSON vs FLORIDA ENGINEERS MANAGEMENT CORPORATION, 98-004756 (1998)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Oct. 26, 1998 Number: 98-004756 Latest Update: Apr. 20, 1999

The Issue The issue in this case is whether the Petitioner is entitled to additional credit for his response to question number 123 of the Principles & Practice Civil/Sanitary Engineer Examination administered on April 24, 1998.

Findings Of Fact Petitioner took the April 24, 1998, Principles & Practice Civil/Sanitary Engineer examination. A score of 70 is required to pass the exam. Petitioner obtained a score of 69, which is a raw score of 47. In order to achieve a score of 70, Petitioner needs a raw score of 48. Therefore, Petitioner is in need of one (1) additional raw score point. On question number 123, Petitioner received a score of six points out of a possible ten. Question number 123 is scored in increments of two raw points. Two additional raw score points awarded to the Petitioner would equal a raw score of 49, equating to a conversion score of seventy-one, a passing score. The National Council of Examiners for Engineering and Surveying (NCEES), the organization that produces the examination, provides a Solution and Scoring Plan which outlines the scoring process used in question number 123. The Petitioner is not allowed a copy of the examination question or the Solution and Scoring Plan for preparation of the Proposed Recommended Order. Question number 123 has three parts: part A, part B, and part C. For a score of ten on question number 123, the Solution and Scoring Plan states that the solution to part A must be correct within allowable tolerances; the solution to part B must state two variables that affect the answer in part A; and the solution to part C must state that anti-lock brakes do not leave skid marks, thus making it very hard to determine braking distance. For a score of eight points on question number 123, the Solution and Scoring Plan states that part A could contain one error and lists specific allowable errors, and that part B and part C must be answered correctly, showing mastery of the concepts involved. Petitioner made an error in part A which falls into the allowable errors listed in the Solution and Scoring Plan under the eight-point scoring plan. Petitioner answered part B correctly. Petitioner contends that he also answered part C correctly and should be awarded eight points. NCEES marked part C incorrect. Question number 123 is a problem involving a vehicle (vehicle number one) that skids on asphalt and hits another vehicle (vehicle number two). Part C asks "Explain how your investigation of this accident would have changed if vehicle one had been equipped with anti-lock brakes." The Petitioner's answer was as follows: If vehicle one does not "lock" its brakes, its deceleration will be dependent upon its brakes. (Not f). [Judge's note: f is used as the symbol for the co-efficient of friction between the tires and road surface in the problem.] The rate of deceleration (a) must be determined (from testing, mfg, [manufacturer,] etc.) As stated above, the Board accepts a solution that recognizes that the vehicle equipped with anti-lock brakes will not leave skid marks which can be used for computing initial speed using the skid distance equation. The Petitioner's answer pre-supposes that there are no skid marks because the vehicle's wheels do not lock because of the anti-lock brakes; therefore, the co-efficient of friction of the tires, which generates the skid marks, has no effect. The Petitioner introduced a portion of a commonly used manual for examination preparation (Petitioner's Exhibit 1), which states, regarding a vehicle that does not lock its brakes, "its decelerations will be dependent upon its brakes." 
The Board's expert recognized the statement by the Petitioner in response to part C as true, but indicated it was not responsive to the question in that it did not state specifically that the vehicle would not produce skid marks that could be measured for use in the skid distance equation. The solution sheet states regarding part C, "Part C is answered correctly by explaining that anti-lock brakes would not leave skid marks thus making it very hard to determine the braking distance."

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law set forth herein, it is, RECOMMENDED: That the Board of Professional Engineers enter a Final Order giving Petitioner credit for part C on the examination and passing the test. DONE AND ENTERED this 25th day of March, 1999, in Tallahassee, Leon County, Florida. STEPHEN F. DEAN Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 www.doah.state.fl.us Filed with the Clerk of the Division of Administrative Hearings this 25th day of March, 1999. COPIES FURNISHED: Natalie A. Lowe Vice President of Legal Affairs Florida Engineers Management Corporation 1208 Hays Street Tallahassee, Florida 32301 John D. Watson 88 Marine Street St. Augustine, Florida 32084 Dennis Barton, Executive Director Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301

Florida Laws (1) 120.57
# 5
AMERICAN CONTRACT BRIDGE LEAGUE vs. OFFICE OF THE COMPTROLLER AND DEPARTMENT OF REVENUE, 76-001237 (1976)
Division of Administrative Hearings, Florida Number: 76-001237 Latest Update: Mar. 21, 1977

The Issue The issue for determination in this cause is whether petitioner is entitled to a refund in the amount of $6,306.32 paid into the state treasury as sales tax. More specifically, the issue is whether the registration or participation fee charged by petitioner to its members at the 1975 summer national bridge tournament is taxable as an "admission" under Florida Statutes 212.02(16) and 212.04.

Findings Of Fact Upon consideration of the oral and documentary evidence adduced at the hearing, the following relevant facts are found: The petitioner, the American Contract Bridge League, Inc., is a nonprofit corporation incorporated under the laws of New York in 1938. Its membership is approximately 200,000, representing areas all over the North American continent. Its purposes include educational, cultural and charitable pursuits. Among other things, petitioner annually sponsors three national tournaments in various areas of the United States. In August of 1975, petitioner held its summer national tournament at the Americana Hotel in Bal Harbour, Dade County, Florida. Over 1,000 tables for approximately 5,500 members were in operation for the nine-day event. Many of these 5,500 members played in two or more events. In order to participate in each event, the member was required to pay a registration fee ranging from $3.00 to $4.50. No sales tax was included by petitioner in its registration fee. While spectators at the tournament were permitted, it was not intended as a spectator event. No special provision was made for the seating of spectators, whose number rarely exceeded one hundred and who were composed primarily of relatives or friends of the actual players or participants. No admission charges were made to spectators. On previous occasions, petitioner has held bridge events in Florida. On no such occasion has the State of Florida attempted to assess the sales tax on petitioner's registration or participation fees. No other state in which petitioner has held its tournaments has assessed petitioner for sales or other taxes on this fee. The respondent Department of Revenue informed petitioner that the registration fees collected at the 1975 summer national tournament constituted a taxable event, subject to the Florida sales tax, and petitioner, under protest, forwarded a check in the amount of $6,306.32. Thereafter, petitioner applied for a refund pursuant to the provisions of F.S. 215.26. The Comptroller denied the refund application.

Recommendation Based upon the findings of fact and conclusions of law recited above, it is recommended that petitioner's request for a refund in the amount of $6,306.32 be denied. Respectfully submitted and entered this 21st day of March, 1977, in Tallahassee, Florida. DIANE D. TREMOR Hearing Officer Division of Administrative Hearings Room 530 Carlton Building Tallahassee, Florida 32304 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 21st day of March, 1977. COPIES FURNISHED: Comptroller Gerald Lewis The Capitol Tallahassee, Florida 32304 Patricia Turner, Esquire Assistant Attorney General Department of Legal Affairs The Bloxham Building Tallahassee, Florida 32304 Paul J. Levine, Esquire 2100 First Federal Building One Southeast 3rd Avenue Miami, Florida 33131

Florida Laws (3) 212.02, 212.04, 215.26
# 6
VADIM J. ALTSHULER vs BOARD OF PROFESSIONAL ENGINEERS, 98-002342 (1998)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida May 18, 1998 Number: 98-002342 Latest Update: Jan. 27, 1999

The Issue Whether Petitioner is entitled to additional credit for his response to Question Number 146 of the Principles and Practice of Engineering examination administered on October 31 through November 1, 1997.

Findings Of Fact Petitioner took the professional engineering licensing examination with emphasis in mechanical engineering on October 31, 1997. The passing score on the examination was 70. Petitioner obtained a score of 65 and a raw score of 43. A raw score of 48 would have generated a converted score of 70. Petitioner needed at least 5 additional raw score points to achieve a passing grade and a converted score of 70. Out of a possible 10 points on Question Number 146, Petitioner received a score of 4 points. The National Council of Examiners for Engineering and Surveying (NCEES), the organization that produces the examination, provides a Solution and Scoring Plan outlining the scoring process for question 146. Further, NCEES rescored Petitioner's test but found no basis to award additional points. There are 5 categories to question 146. All six elements of question 146 must be completely and correctly answered to receive full credit of 10 points for the question. Instructions for the question provide: A perfect solution is not required, as the examinee is allowed minor psychrometric chart reading errors (two maximum) or minor math errors (two maximum). The total number of minor errors allowed is two. Errors in solution methodology are not allowed. Examinee handles all concepts (i.e., sensible and total heat, sensible heat ratio, coil ADP and BF, adiabatic mixing, and coil heat transfer) correctly. (emphasis supplied.) Testimony at the final hearing of Petitioner's expert in mechanical engineering establishes that Petitioner did not qualify for additional points for answers provided for question 146. Petitioner failed to use the definition of bypass factor indicated in the problem. Instead, Petitioner used the Lindenburg method rather than the Carrier method to calculate the bypass factor. The Carrier method was implied in the problem due to the way the problem was structured. The system outlined in question 146 did not have the special configuration that would be listed if the Lindenburg method were utilized. Petitioner also missed the total coil capacity due to misreading the psychrometric chart. By his own admission at the final hearing, Petitioner misread the data provided because they were printed one right above the other in the question. Petitioner read the point on the psychrometric chart for an outdoor dry bulb temperature at 95 degrees and a 78 percent relative humidity as the outdoor air. The question required a dry bulb temperature of 95 degrees and a wet bulb temperature of 78 degrees. Petitioner's misreading constituted an error in methodology as opposed to a minor chart reading error. Question Number 146 on the examination was properly designed to test the candidate's competency, provided enough information for a qualified candidate to supply the correct answer, and was graded correctly and in accord with the scoring plan.

Recommendation Based on the foregoing, it is, hereby, RECOMMENDED: That a final order be entered confirming Petitioner’s score on the examination question which is at issue in this proceeding. DONE AND ENTERED this 25th day of August, 1998, in Tallahassee, Leon County, Florida. DON W. DAVIS Administrative Law Judge Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (850) 488-9675 SUNCOM 278-9675 Fax Filing (850) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 25th day of August, 1998. COPIES FURNISHED: Natalie A. Lowe, Esquire Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301 Vadim J. Altshuler 9794 Sharing Cross Court Jacksonville, Florida 32257 Dennis Barton, Executive Director Board of Professional Engineers 1208 Hays Street Tallahassee, Florida 32301 Lynda L. Goodgame, General Counsel Department of Business and Professional Regulation 1940 North Monroe Street Tallahassee, Florida 32399-0792

Florida Laws (1) 120.57
# 7
G. H. JOHNSON CONSTRUCTION COMPANY vs COLLIER COUNTY SCHOOL BOARD, 92-003220BID (1992)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida May 26, 1992 Number: 92-003220BID Latest Update: Jul. 20, 1992

Findings Of Fact On March 25, 1992, the School Board of Collier County ("School Board") issued an invitation to bid ("ITB") on the construction of an elementary school in Naples, Florida, identified as Elementary School "D", Bid #84-3/92. Pursuant to School Board Rule No. R-03/89, potential bidders for proposed projects with a construction cost in excess of $50,000 must be prequalified by the School Board. The prequalification procedure is designed to provide the School Board with a responsible successful bidder. The School Board considers prequalification applications at regularly scheduled board meetings. Contractors are required to submit applications at least two weeks prior to the board meeting at which the application receives consideration. By application dated April 13, 1992, and filed April 14, 1992, G. H. Johnson Construction Company ("GHJ") applied to be prequalified by the School Board. The application contains the signed statement by Reza Yazdani, president of GHJ that all statements contained in the application are true and accurate. Question #19 in the prequalification application states "[w]hat are the three largest contracts (dollar amount) ever performed by your organization?" The April 13 application filed by GHJ indicates that the company's three largest contracts were University of Chicago Replacement Hospital ($7,353,000), V. A. Medical Center, Loma Linda, California ($3,810,000), and Cape Canaveral Hospital Phase I, II, & III (($6,000,000). In relevant part, section 6 of School Board Rule No. R-03/89 states: Unless specified exceptions are made by the Board, the contractor shall be qualified to bid or negotiate on projects of equal value and complexity to the largest project previously constructed by him. The Board may qualify contractors for projects the value of which does not exceed that of the largest project previously constructed if the experience record, size and qualifications of staff and other pertinent data regarding the contractor justify such action in the discretion of the Board. However, in no event shall a contractor that has not previously performed work for the Board be granted a Certificate of Qualification which exceeds the smaller of the contractor's largest previous project or 10 times the contractor's net quick assets. In the event use of the largest project to establish the pre-qualified amount for the Certificate of Qualification would preclude the contractor from bidding or negotiating because its work in progress exceeds the dollar amount of the largest project, the criterion of ten times the net quick assets may be used if it would yield a larger face amount for the Certificate of Qualification. (emphasis supplied) The copy of School Board Rule No. R-03/89 provided to GHJ prior to the School Board's consideration of the GHJ prequalification application omitted the portion underlined in the preceding excerpt. Although the result of the error was to garble the meaning of the particular sentence, the first sentence of the referenced excerpt provides that, absent specific exception by the School Board, "the contractor shall be qualified to bid or negotiate on projects of equal value and complexity to the largest project previously constructed by him." Clearly, the value of the largest completed project was of importance in determining a contractor's prequalification amount. There is no evidence that the typographical error caused GHJ to provide incorrect information in the April 13 prequalification application to the School Board. 
There is no evidence that any GHJ representative read the referenced section until two days before the bid submission deadline. On April 22, 1992, a mandatory prebid conference was held. A representative of GHJ was present at the conference. At the conference, School Board representatives stated that a contractor's bid cost could not exceed the bidder's prequalification amount minus the contractor's work in progress. Contractors were invited to inquire as to prequalification amounts. There is no evidence that the GHJ representative sought any information related to prequalification. The standard bid instructions provided to GHJ state that the School Board "will consider base bid and deduct alternates as may produce a net amount which is acceptable" to the Board. The instructions further state that bid documents include any addenda issued prior to the bid submission deadline. On April 27, 1992, the School Board issued Addendum #2 to the ITB. Addendum #2 defines "alternate" as "an amount proposed by Bidders and stated on the Bid form that will be added to or deducted from Base Bid amount if the Owner (School Board) decides to accept a corresponding change in either scope of work or in products, materials, equipment, systems or installation methods described in Contract Documents." The addendum states "[b]asis for selection of Alternate shall not be limited to price". Addendum #2 instructs bidders to add "removal of exotics" as "Alternate No. 1" to the bid proposal. The alternate identifies the "exotics" as melaleuca trees to be removed from approximately 10.5 acres at the site of middle school "BB". The removal of the exotics is required by an Army Corps of Engineers permit issued to the School Board for middle school site work. The general bid instructions require that, not less than seven days prior to the bid deadline, bidders must submit written requests for clarification of any error, ambiguity or inconsistency in the bid proposal. Prior to submission of their bid, GHJ representatives discussed whether add alternate #1 would be considered by the School Board in making the bid award, and, relying solely on the initial bid instructions, determined for themselves that it would not. At no time did GHJ inquire of any School Board representative as to the effect of addendum #2 or the "add alternate #1" on the Board's bid consideration. Based on the information provided in GHJ's April 13 application, the School Board on May 7, 1992, prequalified GHJ for projects not in excess of $7,353,000. The figure is derived directly from GHJ's identification of the three largest jobs completed. The University of Chicago Replacement Hospital's cost of $7,353,000 is the largest of the three jobs cited in the GHJ application for prequalification. There is no evidence that the approved prequalification amount was calculated incorrectly or contrary to the School Board's rule. By "Certificate of Prequalification" and letter of May 8, 1992, the School Board notified GHJ of the prequalification amount of $7,353,000. GHJ had not received the letter prior to the May 12, 1992, bid deadline. On or about May 10, 1992, two days prior to the bid opening, the president of GHJ contacted the School Board to ascertain the approved prequalification amount. The prequalification amount was orally provided to him. At no time prior to the bid opening did GHJ question, challenge or seek to amend the prequalification amount. 
On May 12, 1992, GHJ timely submitted a bid on the project, with a base bid of $7,146,000 and an alternate #1 bid of $50,850. GHJ's base bid was the lowest base bid submitted. The total GHJ bid, including alternate #1, was $7,196,850, the second lowest total bid submitted. The GHJ "Certificate of Current Capacity" submitted as part of the bid proposal identified GHJ's prequalification amount as $7,353,000, total uncompleted work in progress as $1,325,655, and a current capacity (prequalification amount less current uncompleted work) of $6,027,345. Otherwise stated, the GHJ bid of $7,196,850 exceeds the contractor's capacity by $1,169,505. School Board Rule No. R-03/89, Section 2(d), provides as follows: If the bid of any qualified contractor exceeds the difference between the amount stated on the contractor's Certificate of Qualification (as effective on the date of the bid opening) and the contractors work in progress, the bid shall be rejected by the School Board. GHJ asserts that the bid specifications provided only that the award would be made on the basis of the base bid plus "deduct alternates" (of which there were none). Even assuming that the School Board's addendum #2 failed to indicate that factors other than the base bid would be considered, GHJ's base bid of $7,146,000 exceeds GHJ's capacity by $1,118,655. Under the provisions of the rule, the School Board may properly reject the GHJ bid. On May 12, 1992, Carlson Harris General Contractors, Inc., ("CH") timely submitted a bid on the project, with a base bid of $7,163,513 and an alternate #1 bid of $27,115. The total of the CH bid was $7,190,628. The total CH bid was the lowest of the total bids received. The CH "Certificate of Current Capacity" (based on a prequalification amount of $11,201,000), identified total work in progress of $740,830 and a current capacity of $10,460,170. The standard instructions provided to bidders on the project state that the School Board has the "complete and unrestricted right...to reject any and all bids and to waive any informality or irregularity in any bid received." Among other items required by the bidder instructions, each bidder was required to submit a list, signed by the bidder, of subcontractors and major material suppliers. The Petitioner claims that, at the time of submission, and as late as two days after the bid opening, the CH subcontractor list was unsigned. A witness for the Respondent claims that, as of thirty minutes after the bid opening (when he viewed the CH proposal), the list was signed. The School Board official who actually opened and examined the bids did not testify. The testimony of Reza Yazdani is credited and establishes that, at the time of submission, CH's subcontractor list was unsigned. The Petitioner asserts that CH's submission of an unsigned subcontractor list is a material defect which requires that the bid be rejected. The evidence establishes that such is a minor irregularity which does not affect the total cost of the bid or the ability of the School Board to enforce the contract provisions against CH and accordingly may be waived. The instructions also require submission of a bid bond issued by a Florida-licensed surety with a Best's rating of "A" or better who has fulfilled any previous obligation to the School Board. The bond submitted by CH was issued by Employers Reinsurance Corporation and Reliance Insurance Company, and was signed by the surety agents, although not by the CH representatives. 
Employers had a Best rating of "A+13" and Reliance had a Best rating of "A-11". The Petitioner asserts that CH's submission of a bid bond signed by the surety and not by the contractor is a material defect which requires that the bid be rejected. The Petitioner further asserts that Reliance's Best rating of "A-11" fails to meet the requirement that the surety have a Best rating of "A" or better. The evidence fails to establish that the irregularities in the bid bond are material. Employers Reinsurance had a Best rating of "A+13". The bid bond sufficiently protects the ability of the School Board to enforce the bond against the surety should CH fail to perform under the contract. At hearing, GHJ asserted that the School Board had previously contracted with CH and favored CH based on prior performance. There is no evidence that the School Board has previously contracted with CH for any construction project. Subsequent to the bid opening, GHJ amended its application for prequalification to indicate that the University of Chicago Replacement Hospital cost was $11,400,000. Although staff has recommended that GHJ's prequalification amount be amended, the School Board has not taken action on the request. There is no evidence that such amended prequalification amount would be or should be applied retroactively to the bid at issue in this case.

Recommendation Based on the foregoing, it is hereby recommended that the School Board of Collier County enter a Final Order DISMISSING the Petition filed by G. H. Johnson Construction Company, Inc. RECOMMENDED this 29th day of June, 1992, in Tallahassee, Florida. WILLIAM F. QUATTLEBAUM Hearing Officer Division of Administrative Hearings The DeSoto Building 1230 Apalachee Parkway Tallahassee, FL 32399-1550 (904) 488-9675 Filed with the Clerk of the Division of Administrative Hearings this 29th day of June, 1992. APPENDIX TO RECOMMENDED ORDER, CASE NO. 92-3220BID The following constitute rulings on proposed findings of facts submitted by the parties. Petitioner The Petitioner's proposed findings of fact are accepted as modified and incorporated in the Recommended Order except as follows: a. Rejected, irrelevant. Rejected, irrelevant. The prequalification application contains the sworn statement that all statements are true and correct. Absent any indication that the contractor is untruthful, there is no cause for the School Board to independently investigate the contractor's application. Rejected, not supported by the greater weight of credible evidence, which establishes that GHJ representative made no attempt to clarify the garbled language cited. Rejected, not supported by the greater weight of evidence which establishes that the GHJ prequalification amount was based clearly and solely on information supplied by GHJ. Rejected, irrelevant. The rule requires rejection of GHJ's bid. Rejected, not supported by the greater weight of credible and persuasive evidence which establishes that CH's total bid was the lowest of those received. Rejected, not supported by the greater weight of credible and persuasive evidence. Rejected, as argumentative, not finding of fact. Rejected, not supported by the greater weight of evidence which clearly establishes that GHJ was aware of the prequalification amount prior to bid deadline and that the prequalification amount was based on information supplied by GHJ. Rejected. The Petitioner's application for amendment of the prequalification amount is irrelevant to this case. The prequalification amount was based solely on information provided by Petitioner prior to the bid opening. The Petitioner did not seek to challenge the prequalification amount until after the bid opening. There is no evidence that a revised prequalification amount should be applied retroactively. Rejected, irrelevant. There is no evidence that the Board used the prequalification process to prevent GHJ from submitting a bid or to restrict competition. Rejected, argumentative, irrelevant. The prequalification amount was based on GHJ information. The instructions provided to GHJ clearly indicated that the contractor would be qualified to bid or negotiate on projects of equal value and complexity to the largest project previously constructed. Any mistake in providing information to the School Board was on the contractor's part. The alleged action or lack thereof by the Board related to GHJ's subsequent request to amend the prequalification amount is irrelevant. There is no evidence that the Board used the prequalification process to prevent GHJ from submitting a bid or to restrict competition. Accepted as to proposed award to CH General Contractors. Rejected as to allegation that one of CH's subcontractors has indicated an unwillingness to perform. There is no indication that CH has or will suggest an inability to perform obligations under the bid contract. 
Respondent The Respondent's proposed findings of fact are accepted as modified and incorporated in the Recommended Order except as follows: 10. Rejected, unnecessary and uncorroborated hearsay. Although the permit is referenced in Addendum #2, which indicates that a copy of the Corps permit is attached to the addendum), the addendum admitted into evidence does not contain the copy of the Corps permit. 4. Rejected, unnecessary. 25-26. Rejected, irrelevant. Rejected, irrelevant. The timeliness of the Petitioner's protest is not at issue. Rejected. Although correct, the Petitioner's action in seeking amendment of the prequalification amount is irrelevant to this case. The prequalification amount was based solely on information provided by Petitioner prior to the bid opening. The Petitioner did not seek to challenge the prequalification amount until after the bid opening. There is no evidence that a revised prequalification amount should be applied retroactively. COPIES FURNISHED: Thomas L. Richey Superintendent School Board of Collier County 3710 Estey Avenue Naples, FL 33942 Matias Blanco, Jr. Esq. 701 North Franklin Street Franklin Street Mall Tampa, FL 33602 James H. Siesky, Esq. Siesky & Lehman, P.A. 700 Eleventh Street South, Suite 203 Naples, FL 33940-6777

Florida Laws (2) 120.53, 120.57
# 8
JUVENILE SERVICES PROGRAM, INC. vs DEPARTMENT OF JUVENILE JUSTICE, 96-005982BID (1996)
Division of Administrative Hearings, Florida Filed:Tallahassee, Florida Dec. 27, 1996 Number: 96-005982BID Latest Update: May 05, 1997

The Issue The issues for determination in this case are: 1) whether the Respondent’s decision to award a contract to operate a juvenile work release halfway house program to the Henry and Rilla White Foundation was clearly erroneous, contrary to competition, arbitrary, or capricious; and 2) whether the award of the contract is void as a matter of law because of procedural violations by the selection committee and the Respondent.

Findings Of Fact Petitioner, JUVENILE SERVICES PROGRAM, INC. (JSP), is a Florida-based private not-for-profit corporation which was founded to serve troubled youths and their families. Respondent, FLORIDA DEPARTMENT OF JUVENILE JUSTICE (DJJ), is the agency of the State of Florida with the statutory authorization for planning, coordinating, and managing programs for the delivery of services within the juvenile justice consortium. Section 20.316, Florida Statutes. RFP #16P05 On September 27, 1996, Respondent DJJ advertised and released a Request For Proposal (RFP) #16P05 to provide a Work Release Halfway House for Delinquent Males in District IX, serving Palm Beach County, Florida. In response to the RFP, four bids were submitted to DJJ by the following parties: the Henry and Rilla White Foundation, Total Recovery, Inc., Psychotherapeutic Services Inc., and Petitioner JSP. The DJJ bid selection committee of evaluators for the RFP consisted of Jack Ahern, Steve Brown, Jaque Layne, Patricia Thomas, and, from the Office of Budget Finance, Fred Michael Mauterer. The contract manager for the RFP was Diane Rosenfelder. On October 28, 1996, each DJJ evaluator was sent a package consisting of a copy of the RFP, which included the evaluation sheet, a copy of each proposal submitted to DJJ, a conflict of interest questionnaire, a certificate of compliance, a description of the proposal selection process, and instructions. Each package sent to the evaluators had a different colored cover sheet which identified the specific evaluator. After completing the evaluations, each evaluator returned the signed conflict of interest forms and certificates of compliance to Diane Rosenfelder. The evaluations were identified by the color of the cover sheets, as well as the signed conflict of interest forms and certificates of compliance. DJJ initially intended to provide each evaluator with an Award Preference Form, which was to be used in the event the final evaluation scores were very close. The Award Preference Forms, however, were inadvertently omitted from the packages sent to the evaluators. The evaluation process resulted in the Henry and Rilla White Foundation receiving the highest average score of 391.50 points. Petitioner JSP received the second highest average score of 360.50 points. The award of points was determined by each evaluator, as indicated by the evaluator checking the box on Section 5 of the evaluation sheet or by filling in the appropriate point score. The contract manager, Diane Rosenfelder, corrected addition errors on the scoring sheets. The budget part of the evaluation was completed by Fred Michael Mauterer, Senior Management Analyst Supervisor. In accordance with the evaluation scores, DJJ determined that the best response was submitted by the Henry and Rilla White Foundation, which was awarded the contract. On November 8, 1996, Petitioner JSP filed a timely Notice of Protest of the award, which was supplemented on December 9, 1996, with the required posting of a $5000 bond. Alleged Errors and Discrepancies in the Evaluation Process Petitioner JSP alleges that several errors in the evaluation process require that the contract award to the Henry and Rilla White Foundation be set aside and that the RFP be reissued and rebid. Petitioner first alleges that the bid selection committee failed to follow certain instructions during the evaluation process. The instructions were prepared by the contract manager, Diane Rosenfelder. The instructions were not required by rule or policy of DJJ. 
The contract manager considered the instructions advisory in nature. The instructions stated that the members of the bid selection committee should not contact each other with respect to the proposals under evaluation. The evaluators, however, were permitted to contact the contract manager, who would record all questions and answers. There were instances in which the contract manager did not record questions from the evaluators to the contract manager. There is no evidence that the evaluators contacted each other regarding the proposals during the evaluation process. The instructions asked the evaluators to explain high or low scores given to the proposals under consideration. None of the evaluators made specific explanations of high or low scores. The contract manager who prepared the instructions considered this instruction discretionary, and there is no evidence that any score given by an individual evaluator was without basis. The evaluators were instructed to provide page numbers from the proposals used to score each item. None of the evaluators complied with this instruction. As indicated above, however, there is no evidence that the actual scores given by the evaluators were without basis. As set forth above, none of the evaluators received the Award Preference Form. This form was to be used in the case of very close scoring of the proposals. The actual scores from the bid selection committee reflected a clear preference for the proposal submitted by the Henry and Rilla White Foundation. Accordingly, there was no demonstrated need for DJJ to rely upon the Award Preference Forms in making its decision to award the contract. The letter of introduction sent to the bid selection committee members from the contract manager stated that the proposal score sheets, the evaluators' award preferences, and the best interest of the district would be considered in determining the award. The contract manager considered this statement advisory in nature. DJJ has not promulgated specific standards relating to the best interest of District IX; however, the proposal evaluation forms sent to the bid selection committee inherently include criteria setting out standards for the determination of the best proposal for the district. The evidence reflects that one of the evaluators, Patricia Thomas, erroneously checked the box on each proposal which gave each of the proposals fifty points as certified minority enterprises, and erroneously wrote "50" as a point count on one evaluation score sheet. None of the proposals included a copy of the certification for minority enterprise as required by Section 287.0945, Florida Statutes, and the contract manager recognized that the evaluator had made a mistake in this regard. In response to this error, the contract manager consulted her supervisors. Because each proposal was awarded the same points, DJJ did not consider the evaluator's error as prejudicial to any proposal or to the bid selection process, and did not reject the evaluator's scoring of the proposals. There is no showing that Petitioner JSP was prejudiced by DJJ's decision in this regard. The contract manager added signature lines to the last page of the evaluation sheets. Some of the sheets were returned unsigned from the evaluators. There is no DJJ requirement that the evaluation sheets specifically contain the signatures of the evaluators. 
The contract manager did not consider the signature page mandatory, and the evaluation proposal score sheets were clearly identified by both color coding and the certificates of conflict signed by the evaluators. There is no evidence that the procedural discrepancies affected the substance of the evaluator’s scoring of the proposals, nor did the procedural discrepancies prejudice the evaluators’ consideration of Petitioner’s proposal.

Recommendation Based on the foregoing Findings of Fact and Conclusions of Law, it is recommended that the Respondent enter a final order upholding the proposed agency action to award the contract to the Henry and Rilla White Foundation, and dismissing the Petition filed in this case. DONE and ORDERED this 23rd day of April, 1997, in Tallahassee, Florida. RICHARD HIXSON Administrative Law Judge Division of Administrative Hearings DeSoto Building 1230 Apalachee Parkway Tallahassee, Florida 32399-3060 (904) 488-9675 SUNCOM 278-9675 Fax Filing (904) 921-6847 Filed with the Clerk of the Division of Administrative Hearings this 23rd day of April, 1997. COPIES FURNISHED: Dominic E. Amadio, Esquire Republic Bank Building, Suite 305 100 34th Street North St. Petersburg, Florida 33713 Scott C. Wright, Assistant General Counsel Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100 Calvin Ross, Secretary Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100 Janet Ferris, General Counsel Department of Juvenile Justice 2737 Centerview Drive Tallahassee, Florida 32399-3100

Florida Laws (2) 120.57, 20.316
# 9
CAMPBELL THERAPY SERVICES, INC. vs BREVARD COUNTY SCHOOL BOARD, 99-002729BID (1999)
Division of Administrative Hearings, Florida Filed:Viera, Florida Jun. 21, 1999 Number: 99-002729BID Latest Update: Apr. 07, 2000

The Issue The issue in this case is whether Respondent should award a contract to Intervenor to provide physical and occupational therapy services to approximately 1,300 exceptional education students who qualify for such services in 77 public schools in Brevard County, Florida.

Findings Of Fact Intervenor is the incumbent contractor for physical and occupational therapy services provided to Respondent. Intervenor has provided such services to Respondent for approximately six years. On February 24, 1999, Respondent issued its request for proposals ("RFP") for occupational and physical therapy services. The RFP consists of eight unnumbered pages. Ten companies responded to the RFP. However, only the proposals of Petitioner and Intervenor are at issue in this proceeding. A four-member evaluation committee ranked each proposal on the basis of six categories. The six categories were: experience; qualification; recruiting ability; location of office; and responsiveness. The evaluation committee also considered the hourly rate and mileage to be charged by each proposer. The evaluation committee met as a body. Each member of the committee then returned to his or her respective office to complete a scoring sheet. The scoring sheet listed each proposer's name in a column down the left side of the sheet and the six categories for evaluation from left to right across the top of the sheet. A column down the right side of each sheet listed the hourly rate to be charged by the proposer identified in the column down the left side of the sheet. The RFP does not prescribe a scoring formula to be used in completing the scoring sheets. In relevant part, the RFP merely states: . . . The Selection Committee shall rank the firms in order of preference and will submit its recommendation to the Superintendent for his consideration. The [Board] will bear responsibility for the selection of the Contractor and will decide which bid [sic] is most appropriate for Brevard schools and their students. The Superintendent will recommend a therapy service provider which will be presented to the . . . Board for approval at a regular or special Board meeting. RFP at unnumbered page 8. All four members of the evaluation committee ranked Intervenor's proposal first and Petitioner's proposal second. However, the hourly rate in Petitioner's proposal was the lowest of all proposers, at $34.75, and $4.25 less than the $39 hourly rate quoted in the proposal submitted by Intervenor. The proposal submitted by Intervenor charged mileage in addition to the hourly rate while the hourly rate quoted by Petitioner included mileage. Before May 11, 1999, when the Board selected Intervenor as the proposer, the evaluation committee met. The committee asked Respondent's buyer assigned to the contract if the committee was required to recommend the proposal with the lowest price. The buyer advised the committee that the contract was for professional services and did not require the committee to recommend the lowest-priced proposal. The committee determined that Ms. Eva Lewis, one of its members and the Director of Program Support for Exceptional Student Education in Brevard County, should telephone Intervenor and ask if Intervenor would match Petitioner's price. Ms. Lewis telephoned Mr. Rick McCrary, the manager for Intervenor, and asked if Intervenor would accept the contract price of $34.75. After consultation with his superiors, Mr. McCrary agreed to the straight-rate price of $34.75. On May 11, 1999, Ms. Lewis presented the recommendation of the evaluation committee to the Board. The Board asked Ms. Lewis if Intervenor's price was the lowest price. Ms. 
Lewis disclosed that the evaluation committee preferred the proposal submitted by Intervenor, that it had asked Intervenor to lower its price to meet that of Petitioner, and that Intervenor agreed to do so. The Board voted unanimously to select Intervenor as the proposer to be awarded the contract. The parties directed most of their efforts in this proceeding to the issues of whether competitive bidding requirements apply to the proposed agency action and whether the scoring formula used to rank the proposers complied with those requirements. Petitioner asserts that the selection of Intervenor by the Board violates the competitive bidding provisions in Section 120.57(3), Florida Statutes (1997). (All chapter and section references are to Florida Statutes (1997) unless otherwise stated). Intervenor and Respondent contend that Section 120.57(1), rather than Section 120.57(3), controls the Board's selection of Intervenor for the contract. Although the document used by Respondent to obtain proposals from vendors describes itself as an RFP and describes the responses as either proposals or bids, Respondent and Intervenor suggest that the document is not an RFP but merely a "solicitation." Respondent and Intervenor further argue: . . . that the . . . Board . . . did not attempt to comply with the requirements for competitive procurement under Section 120.57(3) or Chapter 287. . . . And . . . that the . . . Board was never required to comply with those statutes. . . . these are contracts for professional, educational and health services, contracts uniquely and specifically exempted from [the] competitive bid procurement process. Transcript ("TR") at 40. It is not necessary to reach the issue of whether Section 120.57(1) or the competitive procurement provisions in Section 120.57(3) and Chapter 287 control Respondent's selection of Intervenor as the proposer to be awarded the contract. In either event, the proposed agency action is contrary to the specifications in the RFP. Assuming arguendo that Section 120.57(3) and Chapter 287 do not apply to the contract at issue in this proceeding, Respondent failed to comply with RFP specifications. As Intervenor and Respondent point out in their joint PRO, Section F.8. of the RFP states: The . . . Board . . . and the selected proposer will negotiate a contract as to terms and conditions for submission to the . . . Board for consideration and approval. In the event an agreement cannot be reached with the selected proposer in a timely manner, then the . . . Board reserves the right to select an alternative proposer. (emphasis supplied) Intervenor and Respondent are also correct that the phrase "negotiate a contract as to terms and conditions" includes terms and conditions such as the contract price. Contrary to the provisions of Section F.8., the Board did not first select a proposer at its meeting on May 11, 1999, and then negotiate a contract price with the selected proposer. Rather, the evaluation committee negotiated a contract price with Intervenor before May 11, 1999, and the Board then selected Intervenor as the successful proposer. The evaluation committee is not the Board and does not have authority to act on behalf of the Board. As the RFP states, the evaluation committee has authority only to: . . . rank the firms in order of preference and . . . submit its recommendation to the Superintendent for his consideration.
The [Board] will bear responsibility for the selection of the Contractor and will decide which bid [sic] is most appropriate for Brevard schools and their students. The Superintendent will recommend a therapy service provider which will be presented to the . . . Board for approval at a regular or special Board meeting. RFP at unnumbered page 8. The last sentence in Section F.8. makes clear that the right to select a proposer is the sole province of the Board and not the evaluation committee. Even if one were to ignore the legal distinctions between the evaluation committee and the Board and the authority of each, the RFP specifications fail to provide adequate notice to potential proposers of the true purpose for the RFP. As Respondent and Intervenor state in their joint PRO: . . . the . . . Board used the proposals it received to test the market for physical and occupational therapy services in Brevard County. The . . . Board then used the information it developed from the proposals as negotiating leverage to obtain a price concession from its incumbent contractor. The . . . Board's negotiation tactics permitted it to secure the superior vendor at the price of an inferior vendor. PRO at 33. The RFP fails to disclose that Respondent intended to use potential proposers to obtain negotiating leverage with the incumbent contractor. The failure of the RFP to disclose its purpose violates fundamental principles of due process, adequate notice, and fairness to potential proposers. It creates a gap between what agency staff knew of the Respondent's intent for the RFP and what potential proposers could know from reading the specifications in the RFP. The failure of the RFP to disclose its true purpose suggests that its authors recognized the chilling effect such a disclosure would have had on the response of potential proposers. The lack of responses from potential proposers, in turn, would have frustrated Respondent's intent to "secure the superior vendor at the price of an inferior vendor." Assuming arguendo that Section 120.57(3) controls the contract award at issue in this proceeding, Respondent's proposed agency action violates relevant provisions in Section 120.57(3)(f). In relevant part, Section 120.57(3)(f) provides: In a competitive-procurement protest, other than a rejection of all bids, the Administrative Law Judge shall conduct a de novo proceeding to determine whether the agency’s proposed action is contrary to the agency’s governing statutes, the agency’s rules or policies, or the bid or proposal specifications. The standard of proof for such proceedings shall be whether the proposed agency action was clearly erroneous, contrary to competition, arbitrary, or capricious. . . . (emphasis supplied) As previously found, the proposed award of the contract to Intervenor is contrary to the RFP specifications, including specifications for the evaluation and selection process described in paragraphs 7 and 17, supra. The proposed agency action is clearly erroneous within the meaning of Section 120.57(3)(f). It violates fundamental notions of due process, adequate notice, and a level playing field for all proposers. All of the proposers who were induced by the terms of the RFP to expend the time, energy, and expense required to prepare and submit proposals were entitled to rely in good faith on the specifications in the RFP and to require Respondent to adhere to its own specifications. The proposed agency action is also contrary to competition within the meaning of Section 120.57(3)(f).
The economic incentive to respond to an RFP would likely diminish over time if the proposed agency action were to persist. Potential proposers would eventually recognize the RFP process as a device intended to reduce the contract price of the incumbent provider rather than as a bona fide business opportunity to gain new market share. Such an economic environment would be unlikely to induce potential proposers to incur the time and expense necessary to prepare and submit proposals. The pool of potential proposers would shrink, and Respondent would lose negotiating leverage with the incumbent vendor; the likely result would be an accretion in costs.

Recommendation Based upon the foregoing Findings of Fact and Conclusions of Law, it is RECOMMENDED that Respondent enter a Final Order finding that the selection of Intervenor for the contract award is contrary to the RFP specifications and contrary to competition.
DONE AND ENTERED this 3rd day of September, 1999, in Tallahassee, Leon County, Florida.
DANIEL MANRY
Administrative Law Judge
Division of Administrative Hearings
The DeSoto Building
1230 Apalachee Parkway
Tallahassee, Florida 32399-3060
(850) 488-9675 SUNCOM 278-9675
Fax Filing (850) 921-6847
www.doah.state.fl.us
Filed with the Clerk of the Division of Administrative Hearings this 3rd day of September, 1999.
COPIES FURNISHED:
Dr. David Sawyer, Superintendent
Brevard County School Board
2700 Judge Fran Jamieson Way
Viera, Florida 32940-6699
Harold Bistline, Esquire
Stromire, Bistline, Miniclier, Miniclier and Griffith
1970 Michigan Avenue, Building E
Cocoa, Florida 32922
Jonathan Sjostram, Esquire
Steel Hector and Davis, LLP
215 South Monroe Street, Suite 601
Tallahassee, Florida 32301
Edward J. Kinberg, Esquire
Edward J. Kinberg, P.A.
2101 South Waverly Place, Suite 200E
Melbourne, Florida 32901

Florida Laws (1) 120.57