
Keys to Successful Radio Proposal Evaluations


The procurement of public safety radio systems by municipal, county and state agencies is a rigorous process governed by state/county legal procurement practices. Due to the highly complex and technology-laden aspects of these communication solutions, the primary procurement vehicle often used is the Request for Proposal (RFP) process. In rare instances, usually when the equipment configuration has been self-designed by the Owner, an alternative Bid process may be used. Outwardly the two processes may appear similar, but in practice they have very significant differences – particularly in the way a Successful Vendor is identified.

Both RFP and Bid projects contain a set of specifications that define the goals and objectives of a given project. Yet, in a Bid project, the Owner specifies the exact configuration of equipment, technology, and services desired whereas in an RFP the Owner specifies a set of minimally-acceptable functional, operational and performance objectives to be met, but the methodology and application of equipment devices is left to the Proposer to determine.

The determination of an RFP’s successful vendor involves the relative evaluation of a proposer’s solution with respect to the specification’s minimum requirements. That is, the various elements, functionality, and services contained within the proposer’s solution are compared to the project specification and are individually ranked as exceeding, meeting, partially meeting or taking exception to requirements. A point-based evaluation, with weights assigned to specific evaluation criteria, is typically utilized in a manner that ensures an objective and unbiased evaluation result.

In the case of Bid procurements, Bidders are unable to take material exceptions to specification requirements and so the determination of award is based on that compliant response having the lowest price. In an RFP process, Proposers may undertake calculated risks by offering exception-based solutions and might even be successful if a sufficient group of other technical factors offset the lower score that results from an assembly of proposal exceptions.

In today’s public safety radio system procurements, the most important performance and reliability aspects lie within the submitted coverage reliability solution. Failure to meet minimally-acceptable coverage requirements is reason to disqualify a Bid response outright – irrespective of cost – and, in the case of an RFP, would result in an evaluation score far below those of coverage-compliant submittals.

Response Evaluation Criteria

RFP evaluations are much more complicated than bid evaluations as the process involves ranking factors other than price and specification compliance. The evaluation problem is further compounded by the inherent difference between technical versus functional RFPs. Responses to a technical RFP, although possibly having some operational and hardware differences, are geared to a single, pre-determined system configuration. By contrast, a functional RFP is entirely open to vendor interpretation of needs. Thus, the system configurations proposed in response to an Owner’s RFP specification may be quite different and vary in technical complexity.

The numerous variables involved in an RFP procurement defy attempts to develop a ‘set-in-stone’ evaluation procedure. Therefore, the following guidelines may be helpful in preparing procedures specific to your procurement’s needs.

The RFP document should clearly state that submitted proposals are to be evaluated and numerically graded with respect to certain pre-determined areas. The order of the evaluation criteria should be listed in a relative order of importance, with the total scaled by percentage to 100. The following is a suggested set of technical/functional evaluation criteria that has evolved over many years:

1. Coverage (25%).

2. Reliability, capacity, and redundancy (15%).

3. Performance, expansion, and interoperability (10%).

4. Bidder Qualifications, RFP deviations (10%).

5. Equipment implementation, installation, repair (12.5%).

6. Training (5%).

7. Warranty (10%).

8. Contract Terms and Conditions (7.5%).

9. Organization, scope, and detail of proposal (5%).

RFPs and Bids can be released by direct mail to pre-qualified candidates and through public advertisement in local newspapers and trade publications. The period for submittal response is specified in the RFP/Bid documents and, depending upon project complexity, a reasonable response period is 60 to 90 days.

It is important to have this information presented early in the RFP document. Doing so sets a procurement tone: (a) the Owner expects a coverage-compliant, reliable, high-performance solution that meets initial needs and can be expanded; (b) the proposed solution must be reliable in terms of both its hardware components and the configuration as a whole; and (c) the Owner has accepted that high performance and reliability do not come cheaply, so vendors should not try to "lowball" the project by supplying only the barest essentials.

With evaluation criteria so structured, the Owner clearly desires the best possible solution. The listed order of evaluation criteria, as suggested, can have a profound effect on the quality of proposal responses. Give this evaluation criteria aspect careful consideration.

Vendor Questions and Observed Strategies

No matter how completely your procurement document is structured, vendors will have questions. Often, the questions have multiple roles which range from clarification of technical or operational issues to one vendor trying to orient or bait a competitor into a less advantageous position. If the project is complex, it may be advantageous to convene one or more pre-submittal conferences. Attendance should be made mandatory to eliminate possible confusion or complaints that important information was unfairly or improperly disseminated.

All questions should be submitted in writing and, if possible, prior to the pre-submittal conference. This will enable you to properly research and directly answer each question orally at the conference and by written addendum. Carefully examine each question for multiple meanings. If you suspect posturing tactics, have questions redefined prior to answering.

It is in your best interest to assure that no one vendor has an unfair advantage over another. Besides, it's your show, so run it your way!

Bid/RFP Evaluation Process

Bid Evaluation

Bid evaluation follows a relatively straightforward procedure. After the bids have been received and the proper information, forms, bid bonds, contractor's license and other information as may be required to qualify the Bidder is submitted and accepted by the governing entity, the first step is to rank each Bid according to price. Begin bid evaluations with the lowest priced bid and inspect it for compliance with all stated bid submittal requirements. If the Bidder has taken no exceptions and its proposed hardware meets or exceeds your project’s specifications, then that Bidder should be declared as the lowest apparent bidder.

If the lowest apparent bidder has taken exceptions that are below the stated specification minimums, that bid should be termed as non-responsive. The next lowest bidder would then be evaluated as above. This process would continue until a compliant Bid is selected.
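The lowest-compliant-bid procedure described above can be sketched in a few lines. This is an illustrative outline only; the bid records and the boolean `compliant` flag are hypothetical stand-ins for the full responsiveness review an agency would actually perform.

```python
# Sketch of the Bid evaluation procedure: rank bids by price, then walk
# up from the lowest bid until a specification-compliant one is found.
# A non-compliant bid is deemed non-responsive and skipped.

def select_lowest_compliant(bids):
    """Return the lowest-priced compliant bid, or None if no bid is responsive."""
    for bid in sorted(bids, key=lambda b: b["price"]):
        if bid["compliant"]:
            return bid
    return None  # no responsive bid was received

# Hypothetical example: the lowest bid took exceptions below the stated
# specification minimums, so award falls to the next lowest compliant bid.
bids = [
    {"bidder": "A", "price": 4_100_000, "compliant": False},
    {"bidder": "B", "price": 4_350_000, "compliant": True},
    {"bidder": "C", "price": 4_600_000, "compliant": True},
]
winner = select_lowest_compliant(bids)  # Bidder B
```

In this sketch, Bidder A is the lowest apparent bidder but is non-responsive, so Bidder B is selected, exactly as the procedure above describes.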

RFP Evaluation

As noted earlier, RFP evaluations are much more complicated than Bid evaluations: the award recommendation process involves ranking factors other than price and compliance, and the evaluation problem is further compounded by the inherent difference between technical and functional RFPs.

Responses to a technical RFP, although possibly having some operational and hardware differences, are geared to a pre-determined system expectation (often contained within a consultant-crafted conceptual solutions report). Conversely, a functional RFP is subject to interpretation of the Owner’s needs, and so the response configurations proposed by competing vendors may be markedly different and vary in technical complexity.

While crafting its RFP Specifications, the Owner (or consultant) would have developed, in a schematic sense, an outline of required response statements, calculations, drawings and all other materials necessary to assure the viability of a submitted solution. Where proposers are required to submit a technical explanation involving multiple topics such as coverage, capacity, audio quality, network resilience, voice encryption, security, etc., the outline should be of sufficient detail that any omission in a proposer's response is plainly evident.

The response outline often serves as a global Quality Control of the specification’s development process and, tangentially, the basis for a proposal evaluation worksheet. Alternately, an Owner might numerically serialize sections, subsections, and paragraphs contained within its RFP specification and correlate these each to items within the project’s proposal evaluation worksheet.

Evaluation Worksheet

A typical worksheet item may have the following general form:

Worksheet Example

Submittal of calculations, tables, or charts used to determine response, blockage, and queue times (value = 50 points)

(Select one)  Selection: __  Total points: __

0 = No submittal

1 = Submittal, major exception

2 = Submittal, minor exception

3 = Meets RFP

4 = Exceeds RFP

In this particular case, an evaluator has a choice of selections whose weight varies from 0, if no submittal was made, to 4, for a submittal exceeding the RFP requirements. A point system is shown where the item itself has a value of 50 points. To determine a total point score for that item, simply multiply the value by the selected weight. In this example, scores can range from zero, for no submittal, to as much as 200 points, for a submittal that exceeds the minimum requirement.
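The value-times-weight arithmetic just described can be written out directly. This minimal sketch mirrors the worksheet example above (a 50-point item with selection weights 0 through 4); the selection labels are taken from the worksheet.

```python
# Worksheet item scoring: total points = item value x selected weight.
# Weights follow the worksheet scale: 0 = no submittal, up to
# 4 = exceeds RFP requirements.

WEIGHTS = {
    "no submittal": 0,
    "submittal, major exception": 1,
    "submittal, minor exception": 2,
    "meets RFP": 3,
    "exceeds RFP": 4,
}

def item_score(value, selection):
    """Total points for one worksheet item."""
    return value * WEIGHTS[selection]

# The 50-point item from the example, graded "meets RFP":
capacity_item = item_score(50, "meets RFP")  # 50 x 3 = 150 points
```

Scores for this item thus range from `item_score(50, "no submittal")` (zero) to `item_score(50, "exceeds RFP")` (200 points), matching the range discussed above.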

Select item values so that the relative order of evaluation importance stated in the RFP is retained. That is, using the evaluation criteria example described earlier, the total possible point score for, say, the Coverage section must be substantially higher than that for Training.

In fact, since Coverage is what drives the cost, complexity, and user expectations of any public safety radio solution, the evaluation points allocated to this item should be an order of magnitude higher than all other items. If, for example, the above-cited capacity item (one criterion level below coverage) had a base value of 50 points, then the Coverage item would have a base value of 500. While other RFP items/criteria may be more closely related with respect to assigned point values, Coverage must always reign superior in the evaluation process.

The total number of raw points within an evaluation worksheet will be on the order of many thousands. The worksheet should therefore be configured to allocate scores into the RFP’s pre-defined evaluation criteria “buckets” and then scale each bucket in accordance with the weight-percentages published within the RFP’s specifications.
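Scaling raw worksheet points into the published weight-percentages might look like the sketch below. The bucket names and weights are taken from the suggested criteria list earlier in this document; the raw scores and maximums are hypothetical.

```python
# Scale each criterion "bucket" of raw worksheet points to its published
# RFP weight. A proposer's scaled score for a bucket is its fraction of the
# maximum possible raw points multiplied by the bucket's weight-percentage,
# so a perfect proposal totals exactly 100.

RFP_WEIGHTS = {                 # suggested criteria weights (percent)
    "coverage": 25.0,
    "reliability/capacity/redundancy": 15.0,
    "training": 5.0,
    # ...remaining criteria, summing to 100 with the rest of the list
}

def scaled_score(raw, raw_max, weight):
    """Fraction of maximum raw points earned, scaled to the bucket weight."""
    return weight * raw / raw_max

# Hypothetical proposer: 4,200 of 5,000 possible raw Coverage points.
coverage = scaled_score(4_200, 5_000, RFP_WEIGHTS["coverage"])  # 21.0 of 25
```

Note how the many-thousand-point raw totals disappear after scaling: only the published percentages matter in the final comparison, which keeps the evaluation consistent with what the RFP promised.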

Aside from the obvious advantages of the evaluation worksheet method and its numeric point system, experience has shown that if an RFP is properly prepared and individual items are sufficiently detailed, scored variations between individual RFP response evaluators will be relatively small. This assumes, of course, that the evaluators are knowledgeable of the technology or have access to a qualified, independent subject matter expert who can provide technical clarification as needed.

Things to Avoid

It is important to minimize subjective evaluations as personal opinions can be widely divergent and could open the door to vendor protests. If evaluation worksheets are properly configured - whereby evaluation items are tied to specific vendor submittals, statements, assertions, graphs, charts, and other required information found within a proposer's submitted response – the fidelity of the evaluation process will be upheld. By keeping subjective evaluations to a minimum (meaning: zero), graded evaluation results will remain consistent and can be readily defended on the basis of fact.

Throughout the technical evaluation of vendor response submittals, it is important to restrict access to cost information. If the indicated cost is too good to be true…well, expect a very bad outcome if your technical evaluation was cost-influenced!

To prevent cost-biased results, the proposal evaluation process should be conducted in two steps, first involving the technical, functional and reliability aspects of each vendor’s proposed solution. Once the technical evaluation is completed, cost is next evaluated with each proposer’s cost individually ranked in proportion to the lowest cost. In order to further prevent ‘low-balling’ of proposal submittals - and assuming a net 100 point award system - the technical portion of an RFP’s evaluation might represent 75% of the final evaluation score, with 25% later allocated to cost.
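One common way to implement the two-step split just described is to scale the technical score to 75 points and award cost points in proportion to the lowest submitted price, so the lowest-cost proposer earns the full 25 and others earn a pro-rated share. This is a sketch of that interpretation; the prices and technical fractions below are hypothetical.

```python
# Two-step evaluation: 75 points technical, 25 points cost.
# Cost points are ranked in proportion to the lowest price received.

TECH_MAX, COST_MAX = 75.0, 25.0

def combined_score(tech_fraction, price, lowest_price):
    """tech_fraction: proposer's technical score as a fraction of maximum."""
    return TECH_MAX * tech_fraction + COST_MAX * lowest_price / price

# Hypothetical: a technically stronger but pricier proposal can still win.
lowest = 4_000_000
a = combined_score(0.92, 4_500_000, lowest)  # 69.0 + ~22.2 = ~91.2
b = combined_score(0.80, 4_000_000, lowest)  # 60.0 + 25.0  = 85.0
```

Because technical merit carries three times the weight of cost, a "lowball" submittal that sacrifices coverage or reliability cannot buy its way to the top score, which is precisely the intent of the 75/25 structure.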

Closing Thoughts

The evaluation process is completed once a single proposal is declared to be the most responsive in meeting needs. If a consultant is evaluating proposals for the Owner, then that vendor securing the highest combined technical and cost score should be recommended for contract negotiations. The Owner should include both the original RFP Specification and the Successful Vendor’s submittal as attachments to the negotiated contract document.

While evaluating proposed solutions, the coverage and related audio quality portions must be thoroughly considered. The reason is simple: funding for radio systems – which in today’s world are extremely complex and costly – is granted by municipal, county, and state governments because coverage saves lives.

Ever read a news story about some City going public or threatening to cite a radio vendor in default while transitioning to a new radio system? If so, the reason wasn’t because their new radios had only 300 talkgroups but the users expected 500. Or the radio’s ergonomics were so-so. Or because the radio’s battery pack lasted 11 hours instead of 12.

Nope…it’s a news story because of inadequate or defective coverage and audio quality.

Throughout my 40+ years in the Industry, I’ve seen cobweb-encased jurisdictional ‘budget wallets’ open boxcar-wide in the aftermath of some unfortunate event where an officer was unable to communicate and get needed life-saving help. Or when a fireman pressed his radio’s emergency button while trapped in an engulfed building --- but no one heard that critical call or the on-scene incident commander misunderstood the situation because the received audio was broken and distorted.

An evaluation process that is heavily weighted toward coverage reliability, audio quality, and capacity will result in a deployed radio solution that keeps people safe….and out of the news for the wrong reasons.

Throughout this discussion, a well-structured set of specifications is viewed as an important pre-requisite of the proposal process. By well-structured, we mean specifications that are clearly understood, detailed, and never slanted toward any one vendor’s technology or solution. Vendors can smell a biased set of specifications a mile away and the only one that loses in that instance is YOU. It is competition, alone, that draws in the best-value set of coverage, reliability, versatility, and performance --- where your user agencies are the clear winners!

Oh, wait…one last thing: Best and Final Offers, commonly referred to as BAFOs.

Occasionally, an Owner gets a wild hair and believes it is allowable in the competitive process to take the best technical aspects of multiple vendor proposals, distill them, and subsequently entice vendors to produce a new (and somehow assumed to be lower) price. Often this occurs seemingly out of the blue, after an RFP’s first round has concluded. The Owner (and others) have seen the results from competing vendors, mulled things over, and sense an opportunity to squeeze the lemon. Unfortunately for the Vendors, they usually don’t know there is to be a second round.

An RFP is supposed to be a ‘one and done’ process. The Vendors independently take their best shot at satisfying a set of specifications – under the fair-play assumption that their individual work products are secure.

BAFOs are especially troubling and grossly unfair to whoever should have been the Successful Vendor. You know…the guy who had the best score in the first round and, in normal circumstances, would have secured the award. Where’s the reward for being an innovative, creative, best-value vendor, only to have competitors get a free shot to correct mistakes, take another bite at the apple, or, worse, scab onto another firm’s innovative hard work? While applying the BAFO hammer, sure, the Owner may have pounded down the initial price of the job. Yet whatever goodwill had developed between Vendor and Owner is assuredly gone, and the project risks enduring a conga line of change orders that recovers what was squeezed out, plus plenty more.

If one is insistent on utilizing the BAFO scheme, at least let the Vendors know in advance so they can consider appropriate safeguards to protect their intellectual property and leave room in their initially offered price to sustain a second round of cost cutting. But be advised: a persistent bleed-out risk is more likely for your project’s budget, with you having to explain cost over-runs to folks who don’t want to hear about your problem.

As sung by Bessie Smith in 1929: Nobody Knows You When You’re Down and Out.

NOTICE: This document and its content, dated March 9, 2023, are the copyright property of Central Electronics, LLC. All rights reserved.

You may not, without our express permission, distribute or commercially exploit the content or publish/link it on any other website or electronic media.


