PM Foundations – The Project Closure Report

I used to have a boss who, at the end of every project, would say, “Let’s put a bow on this thing.” Corny as that statement is, the project closure report represents the proverbial “bow” on the project. It summarizes the product delivered, the metrics describing the effectiveness of the project delivery process, and the feedback captured from project stakeholders (including recommended next steps for future projects). The document serves two purposes: it communicates a final summary of the project to project stakeholders, and it creates a reference document filed in the project archives (generally maintained by the project office). The communication to project stakeholders focuses on confirming that the project is “done” and assessing the project delivery process, while the archived copy creates a permanent record for reference by future project teams.

Because many readers will be looking for key take-aways about the project, I always include an Executive Summary that provides an overview of the project, highlights key project performance metrics, and summarizes the lessons learned and next steps. The next section of the project closure report summarizes what the project delivered (from the product validation process), comparing what was delivered against the baseline expectations for the scope of the project and product. A listing of the outstanding issues and defects (with a brief description of each) is also included in this section. The primary purpose of this Product Validation Summary is to describe why the project is considered done.

Project Results

The next section details the key project performance metrics, including an assessment (rating) for each type of performance metric (cost, schedule, change, and quality) and a brief explanation of the team’s performance against each metric. The discussion in this section is generally organized by type of performance metric.

Budget Performance

Budget performance metrics describe how effectively the team was able to manage to the baseline project budget. Budget performance metrics include:

  • Final Cost – What was the final actual cost of the project?
  • Breakdown by Component – Breakdown of the final cost by cost category (the same cost categories that have been tracked and reported throughout the project life cycle). It is helpful to provide the breakdown as both total cost by category and as a % of total.
  • Budget Variances – What is the difference between the baseline budgeted costs and the actual cost? This should be detailed in dollars and as a percent of the project budget. Detailing the variance by cost category clarifies the source of the cost variances.
  • Budget Performance Rating – Based upon the % budget variance, assign a Green/Yellow/Red rating to budget performance for the project. The budget rating thresholds are best defined at the beginning of the project in the cost management section of the project management plan (e.g., Green <10%, Yellow 10-20%, Red >20%). A simple calculation sketch follows this list.
  • Explanations for Key Variances – A descriptive narrative explaining the key budget variances. I find it beneficial in the variance description to quantify how much of the total variance each source accounts for (in dollars or as a percent).
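
The sketch below is a minimal illustration (in Python) of the budget variance and rating math described above. The cost categories, dollar figures, and Green/Yellow/Red thresholds are hypothetical examples, not values from any particular project or template.

    # Minimal sketch of the budget variance and rating math described above.
    # Category names, dollar figures, and thresholds are hypothetical examples.

    baseline = {"Labor": 120_000, "Hardware": 40_000, "Software": 25_000, "Training": 15_000}
    actual   = {"Labor": 138_000, "Hardware": 42_000, "Software": 24_000, "Training": 16_000}

    def rating(variance_pct):
        """Assign a Green/Yellow/Red rating from the % budget variance."""
        if variance_pct <= 10:
            return "Green"
        if variance_pct <= 20:
            return "Yellow"
        return "Red"

    baseline_total = sum(baseline.values())
    actual_total = sum(actual.values())
    variance = actual_total - baseline_total
    variance_pct = 100.0 * variance / baseline_total

    print(f"Final cost: ${actual_total:,}   Variance: ${variance:,} ({variance_pct:.1f}%)")
    print(f"Budget performance rating: {rating(variance_pct)}")

    # Variance by cost category, and each category's share of the total variance
    for category in baseline:
        cat_variance = actual[category] - baseline[category]
        share = 100.0 * cat_variance / variance if variance else 0.0
        print(f"  {category}: ${cat_variance:,} ({share:.0f}% of total variance)")

With these example numbers the sketch reports a $20,000 (10.0%) variance and a Green rating, with labor accounting for most of the variance. The same pattern scales to however many cost categories have been tracked on the project.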

Schedule Performance

Schedule performance metrics describe how effectively the team was able to manage to the baseline project schedule. Schedule performance metrics include:

  • Total Duration – What was the actual duration of the project (from the start date to the end date)? Even though people may be close to the project, it is surprising how many do not know the day it started, or how long they have been working on it.
  • Schedule Variances – What is the difference (in days) between the baseline end date and the actual end date? This should be detailed in days and as a percent of the total project duration. To clarify the cause of the variance, it is also helpful to provide the variances (days and percent) for each of the project phases/milestones, as illustrated in the sketch after this list.
  • Schedule Performance Rating – Based upon the % schedule variance, assign a Green/Yellow/Red rating to schedule performance for the project. The schedule rating thresholds are best defined at the beginning of the project in the schedule management section of the project management plan (e.g., Green <10%, Yellow 10-20%, Red >20%).
  • Explanations for Key Variances – A descriptive narrative explaining the key schedule variances. Again, I find it beneficial in the variance description to quantify how much of the total variance each source accounts for (in days or as a percent).
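
A comparable sketch for the schedule side focuses on the phase-level breakdown, expressing each phase’s slip in days and as a percent of the baseline project duration. The phase names and dates are hypothetical.

    # Sketch of the phase-level schedule variance breakdown described above.
    # Phase names and dates are hypothetical; durations are calendar days.
    from datetime import date

    project_start = date(2012, 1, 3)
    phases = [
        # (phase, baseline end date, actual end date)
        ("Requirements", date(2012, 2, 17), date(2012, 2, 24)),
        ("Design",       date(2012, 3, 30), date(2012, 4, 6)),
        ("Build/Test",   date(2012, 6, 15), date(2012, 7, 6)),
        ("Deploy",       date(2012, 6, 29), date(2012, 7, 20)),
    ]

    baseline_duration = (phases[-1][1] - project_start).days
    total_slip = (phases[-1][2] - phases[-1][1]).days

    for name, planned_end, actual_end in phases:
        slip = (actual_end - planned_end).days
        print(f"{name}: {slip:+d} days ({100.0 * slip / baseline_duration:.1f}% of baseline duration)")

    print(f"Total schedule variance: {total_slip:+d} days ({100.0 * total_slip / baseline_duration:.1f}%)")

With these dates the total slip is 21 days, about 11.8% of the 178-day baseline, which would fall in the Yellow band under the example thresholds.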

Change Management

Change management metrics highlight the team’s ability to manage change throughout the project life cycle. Key change metrics include:

  • Total number of changes – Breakdown of the total number of change requests submitted, and the number that were approved and implemented. In addition, it is often helpful to provide a breakdown of the approved/implemented requests in terms of the number that impacted scope, schedule and cost.
  • Impact of the change – Sum of the impact of the approved/implemented change requests on the project schedule (days) and budget (cost). This represents the cumulative impact of change on the project.
  • Highlight of key changes – A detailed description of the key changes that were implemented. In this section it may also be appropriate to highlight changes that were not approved, particularly if they are intended to be deferred to future product releases.
  • Approved change vs. Total variance – A comparison of the total impact of change on the project vs. the total project variances (schedule and budget) provides a good indication of how much of the total project variances change accounts for. In other words, were project variances managed in a proactive manner (via the change management process) or a reactive manner (via after-the-fact variance explanations) throughout the project? The sketch after this list illustrates this comparison.
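
The sketch below illustrates the comparison: it totals the impact of the approved/implemented changes and expresses it as a share of the overall cost and schedule variances. The change request IDs and figures are hypothetical, and the variance totals simply reuse the numbers from the earlier sketches.

    # Sketch of the "approved change vs. total variance" comparison described above.
    # Change request IDs and figures are hypothetical; cost in dollars, schedule in days.
    approved_changes = [
        {"id": "CR-004", "cost": 6_000, "days": 5},
        {"id": "CR-007", "cost": 9_500, "days": 8},
        {"id": "CR-011", "cost": 2_500, "days": 0},
    ]

    total_cost_variance = 20_000     # hypothetical total from the budget sketch
    total_schedule_variance = 21     # days, hypothetical total from the schedule sketch

    change_cost = sum(cr["cost"] for cr in approved_changes)
    change_days = sum(cr["days"] for cr in approved_changes)

    print(f"Approved/implemented change impact: ${change_cost:,} and {change_days} days")
    print(f"Share of cost variance managed via change control: "
          f"{100.0 * change_cost / total_cost_variance:.0f}%")
    print(f"Share of schedule variance managed via change control: "
          f"{100.0 * change_days / total_schedule_variance:.0f}%")

With these figures, roughly 90% of the cost variance and about 62% of the schedule variance were introduced through approved changes, which would suggest the variances were largely managed proactively.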

Other Metrics

There are often other metrics of interest about the project that may not have been within the scope of the normal performance reporting by the project manager. Some examples include:

  • Duration / Effort by Phase – A question that is often asked about projects is: what was the breakdown of time and/or effort across the project phases? Although each project is a little different, this information is useful for assessing the reasonableness of plans for future projects. Another interesting measure is the percent of total effort spent on specific functions (e.g., project management, requirements definition, coding, testing, and training). This type of metric can be used for estimating high-level resource requirements for future projects; a simple breakdown sketch follows this list.
  • Benefits Realized to-date – What benefits have been realized to-date as a result of the project? How do the results compare to the expected benefits identified in the project charter? Because the project closure is likely happening soon after the product is implemented, these may represent “preliminary” results (with an explanation of when more conclusive results will be communicated).
  • Benchmark comparisons – How did the performance on this project compare to other projects (within the project office, company, or compared with industry data)? What were some of the differentiators for this project compared to that data (either positive or negative)?
  • QA Metrics – What were the number and percentage of defects reported within each project phase and test cycle? What was the closure vs. discovery rate for defects? What conclusions can be drawn from the QA metrics?
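
As an example of the effort-by-phase breakdown mentioned above, the sketch below converts raw effort hours into percentages of total project effort. The phase/function names and hours are hypothetical.

    # Sketch of the duration/effort breakdown described above.
    # Phase/function names and hours are hypothetical.
    effort_hours = {
        "Project Management": 420,
        "Requirements Definition": 610,
        "Design": 540,
        "Build/Coding": 1_150,
        "Testing": 680,
        "Training/Deployment": 200,
    }

    total_effort = sum(effort_hours.values())
    for phase, hours in effort_hours.items():
        print(f"{phase}: {hours} hrs ({100.0 * hours / total_effort:.0f}% of total effort)")
    print(f"Total effort: {total_effort} hrs")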

Feedback & Next Steps

The final section of the Project Closure Report summarizes the process and results of gathering stakeholder feedback.

  • Process – Summary of the feedback survey and lessons learned processes. Who was solicited for feedback? What was the response (# of surveys, # of participants)? How was the lessons learned discussion organized and facilitated?
  • Rating Metrics – What was the average score for each question? How many very high or very low scores were there for specific questions? I often include commentary on why a score came out the way it did (particularly when survey comments help explain the scoring).
  • Key Themes – What were some of the common themes from comments on the survey or during the lessons learned session?
  • Opportunities – What are the high-priority opportunities (opportunities to improve, or to leverage what was done well, on future projects / product releases)?
  • Next Steps – What are the action items to ensure that the opportunities are moved forward in the right direction? Where possible, include owner assignments and target dates in this section of the document.

4 Elements of an Effective Project Closure Reporting Process

  1. Limited Original Work – If the project execution and closure activities (project performance reporting, product validation, and stakeholder feedback/lessons learned) have been performed effectively, the project closure report is largely a summarization effort rather than one involving a significant amount of original work.
  2. Executive Summary – Many of the stakeholders consuming the Project Closure Report are looking for the high points: what was delivered, how effectively it was delivered, and what was learned from it. Make sure to include an Executive Summary at the beginning of the report that effectively communicates the information these stakeholders are looking for.
  3. Variance Explanations – Ensure that variance explanations are fact-based and describe the impact each variance source had on the performance metric (e.g., the specific source accounts for x% of the total variance).
  4. Next Steps – The Project Closure Report should clearly link project performance to the lessons learned and the related continuous improvement next steps. Stakeholders will be “scratching their heads” if the lessons learned and next steps are not related to the project performance (cost, schedule, change, or quality).

Note: I have added the Project Closure Summary template that I use to my Templates page.

About Steve Hart
Practice Manager responsible for project leadership & delivery services for the Cardinal Solutions Group in the RTP area. I am a PMP with 25 years in project management and technical leadership roles, and have developed extensive practical knowledge spanning a wide variety of industries and project delivery approaches. As a practicing PMP, I am a member of the North Carolina PMI chapter. I am an avid sports fan, particularly of the Miami RedHawks, Cleveland Indians, Cleveland Browns, and most recently the NC State Wolfpack.
