How to Use Post-Submission Debriefs to Increase Your Win Rate
By Harley Stein, professional oral presentation coach and Partner of Tenzing Consulting

After every proposal, win or lose, we must gather lessons learned from two realities: our customer's reality and our team's reality. Post-submission debriefs, and the actions that follow them, are key to improving win rates.

There are two types of post-submission debriefs: a customer debrief and an in-house debrief. Each type of debrief provides valuable but different lessons learned – and each is key to improving your win rate.

From the customer debrief we can learn how a specific customer evaluates proposals (scoring scheme); what matters to this customer (why they gave us these scores); what we did well (our strengths); and where we need to improve (our weaknesses and our risks).

From the in-house debrief we can learn who in our organization are effective proposal contributors; how efficiently we operated; and the strengths and weaknesses of our proposal process.

Customer Debriefs
There are five keys to learning from customer debriefs:

  • Request a debrief win or lose
  • Know what you want to learn and what you can expect to learn
  • Carefully select attendees
  • Set rules of engagement
  • Communicate the right attitude.

Request a debrief win or lose – This is the first and most important step with regard to customer debriefs. Many companies request a debrief after a loss, but not after a win, yet a debrief after a win is often more valuable than one after a loss.

If we are granted a debriefing on a proposal we won, we are more likely to hear the real reasons why we won. Why? The customer is not worried that we might protest! A debrief after a win is also a good opportunity to learn where we fell short in the proposal because customers can afford to be candid. And in being candid they will often discuss their insights in detail. These insights will help us improve future proposals, both to this customer and others.

The official debrief after a win is really only the first step. Because you now have daily access to your customer, over the course of the contract your leadership can learn what their counterparts thought of your proposal effort.

Know what you want to learn and what you can expect to learn – What we want to learn is pretty straightforward: Why the heck didn’t you select us (or if we won, why did you select us)? What can you expect to learn? That depends primarily on the customer and secondarily on our attendees.

Let’s start with what you typically don’t learn. You typically don’t learn how your competitors scored. Most debriefs focus solely on your company’s bid. The exception here is that some customers will tell us how we ranked among the bidders at the highest scoring level of their scheme. For example, they may say we were second in the technical factor and third in the management factor. Customers do not typically provide details as to why a specific competitor scored first in a specific area.

You typically don’t learn exactly what the customer wants to see. The closest customers will come is to reveal their detailed scoring scheme (see the examples that follow). Remember: customer debriefings are not meant to justify why we were or were not selected.

What can you learn?

You can learn what matters to them; in other words, why they gave us these scores. You can learn what you did well (your strengths) and where you need to improve (your weaknesses and risks). And you can learn all of these if the customer provides you with even hints of their scoring scheme.

Few Federal Government customers use a point scoring scheme anymore, though some still do. Many use some type of adjectival scoring (see Table 1). Others use pluses and minuses to highlight strengths, weaknesses, and risks (see Table 2). Still others use factors for major sections and standards for subsections. Many customers evaluate risk as a separate entity.

So if you get a +++ or a Purple score, it tells you that you likely did exactly what the customer was hoping for. Customers typically provide the color or adjectival score on a significant section – for example, you might get Blue for your technical approach. Customers typically provide the plus-minus score on subsections – for example, your subsection on hiring and maintaining staff, which was within your overall management approach section.

Table 1. Adjectival (or color) scoring scheme.

Purple: Exceptional—Offeror’s proposal demonstrates an EXCEPTIONAL understanding of goals and objectives of the procurement, and approach to satisfying them

  1. One or more major strengths exist
  2. No significant weaknesses exist
  3. Strengths significantly outweigh any weaknesses that exist

Blue: Very Good—Offeror’s proposal demonstrates a VERY GOOD understanding of goals and objectives of the procurement, and approach to satisfying them

  1. Strengths outweigh any weaknesses that exist
  2. Any weaknesses are easily correctable

Green: Acceptable—Offeror’s proposal demonstrates a GOOD understanding of goals and objectives of the procurement, and approach to satisfying them

  1. There may be strengths and/or weaknesses
  2. Weaknesses are not offset by strengths, but the weaknesses do not significantly detract from the offeror’s proposal
  3. Weaknesses are correctable

Yellow: Marginal—Offeror’s proposal demonstrates a FAIR understanding of goals and objectives of the procurement, and approach to satisfying them

  1. Weaknesses outweigh any strengths that may exist
  2. Weaknesses will be difficult to correct

Red: Unacceptable—Offeror’s proposal demonstrates a POOR understanding of goals and objectives of the procurement, and approach to satisfying them

  1. No significant strengths exist, and one or more significant weaknesses exist
  2. Weaknesses clearly outweigh any strengths that may exist
  3. Weaknesses will be very difficult to correct or are not correctable
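If you track debrief outcomes across bids, a color scheme like Table 1 is easy to capture in a few lines of code. The following is a minimal sketch; the section names and the helper function are hypothetical illustrations, not part of any customer's process:

```python
# Record adjectival (color) scores from a debrief and flag sections that
# scored below Acceptable (Green) for lessons-learned follow-up.
RATINGS = ["Red", "Yellow", "Green", "Blue", "Purple"]  # worst to best

def sections_needing_work(scores):
    """Return the sections rated below Green (Acceptable)."""
    return [section for section, color in scores.items()
            if RATINGS.index(color) < RATINGS.index("Green")]

# Hypothetical debrief results for one bid:
debrief_scores = {
    "Technical Approach": "Blue",
    "Management Approach": "Yellow",
    "Past Performance": "Green",
}
print(sections_needing_work(debrief_scores))  # ['Management Approach']
```

Even a simple record like this, kept per customer and per bid, makes trends visible across multiple debriefs.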

Table 2. Plus-minus scoring scheme.

  + + +        Significantly above standards/expectations
  + +          Above standards/expectations
  +            Slightly above standards/expectations
  -            Slightly below standards/expectations
  - -          Below standards/expectations
  - - -        Significantly below standards/expectations

It is fair at a debrief to ask why you received a Yellow score or why you were scored a double-minus. Will you receive an answer that satisfies you and allows you to improve the next time you bid to this customer? It depends on the customer, the contentiousness of the bid, and your attitude at the debrief.

Carefully select attendees – The number of attendees at a customer debrief should be severely limited. The criteria for attending are straightforward.

  1. You must know the proposal well. For example, the capture manager and/or program manager are often our leads at customer debriefs, based on the theory that they know what we bid better than anyone else.
  2. You must bring specific knowledge. For example, the cost lead and the technical lead often attend due to their knowledge of what and how we bid from a cost and technical perspective.
  3. You need an objective outsider. A key attendee is an objective third party, someone without a “dog in the fight.” This person will be open to nuances and statements that the people who participated in the proposal effort might not want to hear.
  4. You need a contracts and/or legal representative. Most companies require a contracts representative, others a legal representative, and some both.

There are always more people who want to attend – who believe they have to attend – than we actually want to bring. The executive in charge of the bid often wants to attend; however, they typically have little to add to the discussion. They simply want to hear first-hand what the customer says. If, on the other hand, the executive played a significant role in the bid, then perhaps they attend instead of a capture manager or program manager. The key is to bring the smallest group necessary; you don’t want to overwhelm the customer with an army. Not much can silence your customer faster than a crowd of attendees.

Set rules of engagement – Before we attend a customer debrief we must pre-meet with attendees to set roles, responsibilities, and conduct. This means determining who leads; who speaks and when; who does what and when; and who is listening and documenting the session. As with all teams, a pre-meeting to set the rules of engagement will enable the team to walk into the customer debrief ready to perform – and not ready to storm instead!

Communicate the right attitude at the debrief – If we walk in with a chip on our shoulders and with a half-dozen corporate lawyers, expect the government to say little or nothing. We’ve clearly shown them we are very unhappy and ready to protest. If we are combative, challenging the points they make, expect the government to say little or nothing. If we are friendly and open, there is a chance the government will be too. Often we have communicated beforehand – unofficially – with our customer to let them know that in requesting a debrief we have no motive other than learning.

As a participant in a debriefing, your role is to listen and learn, not argue. After the debrief, participants should separately document their impressions to preserve objectivity. Once that is done, the participants assemble to share observations, focusing on capturing outcomes. From these reports, a lessons learned document is compiled and shared with the appropriate people and organizations.

An example of how we learned from customer debriefs – We had a government customer that instituted a 1000-point scoring scheme. This scoring scheme allocated roughly 350 points to the specific technical approach. The other 650 points were allocated to mostly boilerplate material that we tailored slightly for each customer and bid: past performance, quality, safety, management approach, HR functions, tools, etc.

In our first bid to this customer we scored over 300 of the 350 points allocated for technical; we were second in this score among all bidders. We did noticeably worse across the boilerplate sections, scoring roughly 500 out of a possible 650 points, and our overall score was a losing score. However, there was a silver lining: this customer provided us with a detailed debrief, walking through every section of the proposal, telling us our score, and telling us why we received that score. They answered several of our clarifying questions, though of course they wouldn’t tell us that a particular approach was the right answer.

Before our next bid to this customer we revised our boilerplate sections based on what we learned at the debrief. Our next bid to them scored about the same in the technical section, and once again we were second technically. This time, though, we scored slightly over 600 points – and we won the bid. We won seven straight bids with this customer, and in only two of those bids were we first technically.
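The arithmetic behind this turnaround is simple enough to sketch. The 350/650 split comes from the debrief described above; the individual scores are the approximate figures we cited, not exact data:

```python
# Sketch of the customer's 1000-point scheme described above:
# roughly 350 points for the technical approach, 650 for the
# "boilerplate" sections (past performance, quality, safety, etc.).
TECHNICAL_MAX = 350
BOILERPLATE_MAX = 650

def total_score(technical, boilerplate):
    """Combine section scores under the 1000-point scheme."""
    assert 0 <= technical <= TECHNICAL_MAX
    assert 0 <= boilerplate <= BOILERPLATE_MAX
    return technical + boilerplate

# Approximate scores from the example: the technical score barely moved,
# but the revised boilerplate added roughly 100 points, enough to win.
first_bid = total_score(technical=300, boilerplate=500)   # losing bid
second_bid = total_score(technical=300, boilerplate=600)  # winning bid
print(first_bid, second_bid)  # 800 900
```

The point the numbers make: with 650 of 1000 points riding on "boilerplate," improving those sections moved our total far more than another point or two of technical brilliance ever could.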

In-House Debriefs
We conduct in-house debriefs to evaluate and improve our capture and proposal process. The goal is to enable us to know, share, and repeat what we do well, and to recognize and improve what we don’t do well. Specifically, we can learn three key things:

  • Who in our organization are effective proposal contributors
  • How efficiently we operated
  • The strengths and weaknesses of our proposal process.

The information for an in-house debrief is often compiled by surveying key members of the proposal team through a combination of a questionnaire, interviews, and meetings. A questionnaire or lessons learned document should be completed by all proposal contributors immediately after proposal submission, before memories fade. This should be a standard document used for all efforts (see Lessons Learned questionnaire appendix), and contributors should be able to complete it anonymously if need be. In addition, in some organizations the proposal manager interviews team members while the proposal effort is still fresh in their minds.

Given the fluid and flexible nature of proposals, in-house debriefs can and should be conducted whenever and however we are able. Good proposal managers and/or capture managers should capture data throughout the proposal process. Some of these managers keep suggestion boxes where any proposal contributor can drop in suggestions at any time. Some proposal contributors have limited roles – perhaps they are color review team members. Waiting for the end of the proposal effort for them to provide feedback might be too late. Suggestion boxes, a brief interview, or the use of tools such as SharePoint make collection of feedback much easier. This multi-channel approach to capturing feedback is particularly useful when contributors are here and gone or when they wish to submit feedback anonymously.

The proposal manager collects the documentation and reviews it with leaders from their organization as well as from capture management. Together they sift through the data and turn it into useful information, which is added to their proposal process database. This information then becomes the basis for the in-house debrief.

One innovative idea is to have an objective third party conduct your in-house feedback process. A third party won’t bristle at the bad and the ugly; will listen to everything; and will maintain objectivity. If your company is serious about instituting real improvements based on real lessons learned, a third party observer can effectively lead in-house debriefs across multiple proposal efforts and help identify systemic trends/issues.

Effective proposal contributors – Growing a corps of good proposal contributors is one way to ensure high-quality proposals. Good capture managers and proposal managers are in the best position to identify the quality of the contributors on their team. A key to growing your in-house proposal resources is requiring that these managers evaluate their team members; for those members who are not yet strong but show promise, they must also identify areas for improvement.

It is not only capture and proposal leadership that conduct evaluations; your proposal team members provide invaluable insight regarding the skills and effectiveness of their teammates as well as their leaders. In this way we grow proposal resources across all aspects of the capture and proposal process.

It is through this identification and evaluation process that our proposal contributors are given more challenges and broader responsibilities, developing from subsection writer to section writer, from book boss to proposal manager. As we grow our proposal resources and they become stronger, our win rate increases.

One key to this improvement is to develop a set of metrics to evaluate proposal contributors. These metrics should measure the various skills required of proposal contributors: writing, graphics, leadership, teamwork, etc.
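As one illustration, a contributor scorecard along these lines can be kept in a few lines of code. This is a hypothetical sketch, assuming the same 1-to-5 scale used in the questionnaire at the end of this article (1 = excellent, 5 = needs a lot of improvement); the skill names and ratings are illustrative:

```python
# Hypothetical contributor scorecard: average each skill's ratings
# across evaluators. Lower is better on the assumed 1-5 scale.
from statistics import mean

def scorecard(ratings):
    """Average each skill's ratings across evaluators."""
    return {skill: round(mean(values), 2) for skill, values in ratings.items()}

# Illustrative evaluations of one contributor by three evaluators:
contributor = {
    "writing": [1, 2, 1],
    "graphics": [3, 3, 2],
    "leadership": [2, 1, 2],
    "teamwork": [1, 1, 1],
}
print(scorecard(contributor))
```

Kept consistently across proposals, even a simple scorecard like this shows who is ready for broader responsibility and where coaching is needed.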

One other group whose effectiveness we want to evaluate and measure is our teammates. What we are hoping for are teammate companies who carry their fair share of the proposal burden, who step up when we need them to step up. These are teammates who typically will contribute when we win the program, and they are teammates we want to partner with in the future.

How efficiently we operated – The key to going from blank paper to a winning proposal in 30 days is how quickly we pass through the four stages of team dynamics: forming, storming, norming, and performing. The more quickly we reach the last stage, the more efficiently we operate. This is a soft skills element; it is all about teamwork and an atmosphere that encourages collaboration.

Our best leaders are adept at molding teams and quickly ushering them from forming to high performing. This is, perhaps, the critical skill that a proposal leader must have. Often a key ingredient here is how efficiently the capture manager and the proposal manager work together. Those with complementary skills and complementary personalities who can build a relationship are typically stronger as a pair leading a team. Those who are oil and water …

The strengths and weaknesses of our proposal process – The final item we can learn about from our in-house debrief is how effective our proposal process was. Debrief documentation should specifically ask about the effectiveness of milestones in the proposal process – kickoff meeting, color reviews, etc. – and should also ask for recommendations of how to improve.

Examples of what we have learned – Across many companies, I have seen numerous lessons learned that improved the proposal process, the strength of contributors, and overall efficiency. Through in-house debriefs we learned:

  • To provide far more examples for our proposal team to use early in the proposal process
  • That assigning the same person as Capture Manager and Proposal Manager is a recipe for failure — add Program Manager to their responsibility and you’ve cooked the losing meal
  • If you haven’t helped the customer shape requirements; if you don’t have an intimate knowledge of them and they of you — NO BID
  • The hardest decision you should have to make is the one not to pursue
  • The days of winning with green proposals are in the rear-view mirror — it takes blue to win.

The customer debrief and the in-house debrief each provide valuable but different lessons learned – and each is key to improving your win rate.


Below is the Proposal Process Lessons Learned Questionnaire.

Proposal Process Lessons Learned Questionnaire

Proposal Title _____________________________________________
Name ________________________________ Role ____________________ Date ___________

Please respond with your evaluation of the proposal development effort. The objective of this questionnaire is for you to evaluate the processes and resources we used, for you to provide constructive criticism of our effort, and for you to recommend improvements.

For each process or resource, provide a numerical value to indicate the effectiveness of the process listed: 1 indicates very effective and 5 indicates that we need a lot of improvement. If you did not participate in a process or did not use a resource, indicate not applicable (N/A).

1. Overall Proposal Quality
Overall Quality (1 Excellent/5 Poor) ______
Quality of the section to which you contributed ______            Volume/Section # _________
Rationale _____________________________________________________________________

Effectiveness        Key Metric

(1 Excellent/5 Poor)           
___________            Was the proposal well organized? Did it follow the RFP instructions?
___________            Was the proposal easy to read? Were there clear win themes and action captions?
___________            Did we succinctly define the problem/requirements we were addressing?
___________            Did we clearly tie the problem/requirements to our solution?
___________            Did we clearly define the customer benefits from what we proposed?
___________            Did we provide proof to substantiate claims?
___________            Did we clearly tell the customer why they should choose us?
___________            Did we address the RFP requirements and evaluation criteria?

If you reviewed the proposal after submission, describe any major errors you found:


2. Processes

Effectiveness       Key Events/Activities

(1 Excellent/5 Poor)
___________            Kickoff Meeting
___________            Proposal Training
___________            Proposal Direction
___________            Capture Team Role Definition
___________            Win Themes/Strategies
___________            Storyboard/Pink Team Review
___________            Red Team Review
___________            Capture Manager Leadership
___________            Proposal Manager Leadership
___________            Proposal Facilitator Coordination
___________            Production Support


3. Team Building

Effectiveness       Key Activities

(1 Excellent/5 Poor)
___________            Did we establish a team relationship?
___________            Did we clearly communicate time frames/due dates?
___________            Was there a Master Schedule?
___________            Did we communicate customer understanding and hot buttons?
___________            Did we hold daily team meetings?
___________            Did we develop an outline?
___________            Did we compile a phone list?
___________            Did we create a compliance checklist?


4. Resources

Effectiveness       Resources

(1 Excellent/5 Poor)
___________            Were supplies readily available?
___________            Did you have access to the equipment you needed?
___________            Did you have access to folders on the network as needed?
___________            Was there a proposal war room?

Please provide recommendations for those areas you score as a 4 or 5. Please describe those processes and resources that you score as a 1 or 2.

Recommendations for process/resource improvements:


Effective processes/resources to replicate/reuse:

Harley Stein is a proposal professional, professional oral presentation coach, and Partner of Tenzing Consulting, specializing in strategies, proposals, presentations, and coaching. Contact Harley at 302-593-6718.

If you don’t have your own FREE subscription to Design To Win, sign up now. Join more than 2000 other proposal professionals who get answers to their most pressing issues and challenges from recognized industry experts – every other month. Plus you’ll have access to all back issues and our growing library of proposal resources.
