Post-Implementation Review
A post-implementation review (PIR) evaluates a completed change against its original objectives, documents what occurred during implementation, and captures lessons that inform future changes. The review bridges the gap between change completion and organisational learning by creating structured opportunities to assess outcomes while implementation details remain fresh. PIRs apply to all changes that warrant evaluation, from major infrastructure deployments affecting hundreds of users to targeted application updates that introduced unexpected complexity.
The review produces three outputs: an assessment of whether the change achieved its intended outcomes, a record of issues encountered and how they were resolved, and actionable improvements for change management, release processes, or technical practices. These outputs feed directly into the continual improvement register and knowledge base, transforming individual change experiences into organisational capability.
Prerequisites
Before initiating a post-implementation review, verify the following conditions are met.
- Change record access
  - Read access to the change management system containing the original change request, risk assessment, implementation plan, and any amendments approved during execution. The change record number and final status must be confirmed.
- Implementation documentation
  - Access to implementation logs, deployment records, communication transcripts, and any incident tickets raised during or immediately after the change window. For changes using CI/CD pipelines, access to pipeline execution logs and artifact repositories.
- Monitoring data
  - Performance metrics, availability data, and error rates for affected services covering the baseline period before the change, the implementation window, and the stabilisation period after. A minimum of 7 days of post-implementation data is required for standard changes, 14 days for major changes.
- Stakeholder availability
  - Confirmed attendance from the change owner, technical implementers, service owner, and affected team representatives. For major changes, include the CAB representative and any third-party vendors involved in implementation.
- Success criteria documentation
  - The original success criteria defined in the change request, including specific metrics, thresholds, and verification methods. If criteria were modified during implementation, both original and revised criteria must be available.
- Authority to document
  - Write access to the knowledge management system for creating or updating articles, and to the continual improvement register for logging improvement actions.
Verify change record accessibility before scheduling:
```shell
# Query change management system for record completeness
curl -s "https://itsm.example.org/api/v1/changes/CHG0012847" \
  -H "Authorization: Bearer $ITSM_TOKEN" | \
  jq '{id: .number, status: .state,
    has_plan: (.implementation_plan != null),
    has_risk: (.risk_assessment != null),
    has_backout: (.backout_plan != null)}'
```

Expected output confirming complete record:

```json
{
  "id": "CHG0012847",
  "status": "closed",
  "has_plan": true,
  "has_risk": true,
  "has_backout": true
}
```

Procedure
Schedule the review
Determine review timing based on change category. Standard changes require PIR within 5 working days of closure. Major changes require PIR within 10 working days. Emergency changes require PIR within 3 working days, prioritising rapid learning from urgent situations.
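These deadlines can be computed mechanically when scheduling. A minimal sketch, assuming GNU `date` is available and approximating working days by skipping weekends (the function name and category labels are illustrative, not part of any tooling described in this document):

```shell
# Compute a PIR due date from change closure date and category.
# Assumes GNU date; working days are approximated by skipping
# weekends (public holidays are not accounted for).
pir_deadline() {
  local closure_date=$1 category=$2 days added=0 d
  case "$category" in
    standard)  days=5 ;;
    major)     days=10 ;;
    emergency) days=3 ;;
    *) echo "unknown category: $category" >&2; return 1 ;;
  esac
  d=$closure_date
  while [ "$added" -lt "$days" ]; do
    d=$(date -d "$d + 1 day" +%Y-%m-%d)
    # date +%u: 1 = Monday ... 7 = Sunday; count only Mon-Fri
    [ "$(date -d "$d" +%u)" -le 5 ] && added=$((added + 1))
  done
  echo "$d"
}

pir_deadline 2024-11-08 standard   # closure on a Friday -> 2024-11-15
```

Blocking this date in calendars at change approval time avoids scrambling for participant availability after closure.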
Identify required participants using the change record. Extract the change owner, technical lead, and implementation team members. For changes affecting multiple services, include each service owner. A PIR for a major infrastructure change affecting email, file storage, and authentication services requires representation from all three service areas.
Calculate the meeting duration based on change complexity. Allow 30 minutes for standard changes with no incidents. Allow 60 minutes for standard changes with incidents or deviations. Allow 90 minutes for major changes. Allow 120 minutes for failed changes requiring detailed analysis.
Send calendar invitations with the following information: change number, change title, implementation date, meeting purpose, and pre-read materials location. Include a link to the PIR data collection form if using structured input.
```text
Subject: PIR - CHG0012847 - Email Gateway Migration

Post-implementation review for the email gateway migration completed on 2024-11-08.

Change: CHG0012847
Implementation window: 2024-11-08 22:00-02:00 UTC
Status: Completed with minor incidents

Pre-read materials: https://docs.example.org/pir/CHG0012847/

Please review the implementation summary and come prepared to discuss your observations.
```

- Create the PIR document shell in the knowledge management system, linking it to the change record. This establishes the documentation location before the meeting and ensures the record exists even if the meeting is delayed.
Collect and prepare data
- Extract quantitative implementation metrics from monitoring systems. Gather availability percentages, error rates, response times, and transaction volumes for the affected services. Calculate the metrics for three periods: 7 days before implementation (baseline), the implementation window itself, and 7 days after implementation (stabilisation).
```shell
# Extract availability metrics for email gateway service
# Baseline period
curl -s "https://monitoring.example.org/api/v1/query_range" \
  --data-urlencode "query=avg_over_time(up{service=\"email-gateway\"}[1d])" \
  --data-urlencode "start=2024-11-01T00:00:00Z" \
  --data-urlencode "end=2024-11-08T00:00:00Z" \
  --data-urlencode "step=1d" | jq '.data.result[0].values'
```
```shell
# Post-implementation period
curl -s "https://monitoring.example.org/api/v1/query_range" \
  --data-urlencode "query=avg_over_time(up{service=\"email-gateway\"}[1d])" \
  --data-urlencode "start=2024-11-09T00:00:00Z" \
  --data-urlencode "end=2024-11-16T00:00:00Z" \
  --data-urlencode "step=1d" | jq '.data.result[0].values'
```

- Compile incident and problem records linked to the change. Query the service management system for any incidents raised during the implementation window or referencing the change number in the 14 days following implementation.
```shell
# Find incidents related to change (-G sends the query as GET parameters)
curl -s -G "https://itsm.example.org/api/v1/incidents" \
  -H "Authorization: Bearer $ITSM_TOKEN" \
  --data-urlencode "query=related_change=CHG0012847 OR \
(opened_at>=2024-11-08 AND opened_at<=2024-11-22 AND \
affected_service=email-gateway)" | \
  jq '.result[] | {number, short_description, priority, resolved_at}'
```

Gather qualitative feedback from implementation participants. Send a brief survey or structured questions to technical staff involved in the change, asking about plan accuracy, unexpected challenges, tool effectiveness, and communication quality. Allow 48 hours for responses before the PIR meeting.
Prepare the implementation timeline showing planned versus actual activities. Document each step from the implementation plan alongside what actually occurred, noting start times, completion times, and any deviations.
| Planned activity | Planned time | Actual time | Variance | Notes |
|---|---|---|---|---|
| Begin maintenance window | 22:00 | 22:00 | 0 min | On schedule |
| Disable inbound mail flow | 22:05 | 22:08 | +3 min | DNS propagation slower than expected |
| Database backup | 22:15 | 22:12 | -3 min | Completed early |
| Apply gateway configuration | 22:45 | 23:15 | +30 min | Certificate chain issue required troubleshooting |
| Verification testing | 23:30 | 00:05 | +35 min | Delayed by configuration issue |
| Enable inbound mail flow | 00:00 | 00:45 | +45 min | Cumulative delay |
| End maintenance window | 02:00 | 01:30 | -30 min | Completed within window |

Assemble the data into a PIR preparation pack and distribute to participants 48 hours before the meeting. The pack includes: implementation timeline comparison, incident summary, monitoring metrics comparison, and feedback survey results.
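Variances such as those in the timeline table can be computed from timestamped logs rather than by hand. A minimal sketch, assuming GNU `date`; steps that cross midnight need a full date component, as in the example call:

```shell
# Minutes of variance between planned and actual step times.
# Assumes GNU date; pass full date-time strings so steps that
# cross midnight compare correctly.
variance_minutes() {
  local planned=$1 actual=$2
  echo $(( ($(date -d "$actual" +%s) - $(date -d "$planned" +%s)) / 60 ))
}

variance_minutes "2024-11-08 22:45" "2024-11-08 23:15"   # gateway configuration step: 30
```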
The data collection process draws from multiple sources that together provide a complete implementation picture:
```text
+------------------+   +------------------+   +------------------+
| Change Record    |   | Monitoring       |   | Incident         |
|                  |   | Systems          |   | Records          |
| - Original plan  |   | - Availability   |   | - Related INC    |
| - Risk assessment|   | - Performance    |   | - Impact data    |
| - Approvals      |   | - Error rates    |   | - Resolution     |
| - Amendments     |   |                  |   |                  |
+--------+---------+   +--------+---------+   +--------+---------+
         |                      |                      |
         +----------------------+----------------------+
                                |
                                v
                      +-------------------+
                      | PIR Preparation   |
                      | Pack              |
                      +-------------------+
                                ^
                                |
         +----------------------+----------------------+
         |                      |                      |
+--------+---------+   +--------+---------+   +--------+---------+
| Implementation   |   | Participant      |   | Communication    |
| Logs             |   | Feedback         |   | Records          |
| - Pipeline runs  |   | - Survey results |   | - Bridge calls   |
| - Console output |   | - Observations   |   | - Email threads  |
| - Timestamps     |   | - Suggestions    |   | - Chat logs      |
+------------------+   +------------------+   +------------------+
```

Figure 1: Data sources feeding PIR preparation, combining quantitative metrics with qualitative feedback
Facilitate the review meeting
Open the meeting by stating the change identifier, implementation date, and overall outcome. Confirm the meeting objective: to assess success, document lessons, and identify improvements. Remind participants that the review examines the change and process, not individual performance.
Present the success criteria assessment. Walk through each criterion defined in the original change request, presenting the measured outcome and determining whether the criterion was met, partially met, or not met.
For the email gateway migration example:
| Success criterion | Target | Measured | Assessment |
|---|---|---|---|
| Mail delivery latency | Under 30 seconds | 12 seconds average | Met |
| Spam detection rate | Above 98% | 99.2% | Met |
| False positive rate | Below 0.1% | 0.08% | Met |
| User-reported issues | Fewer than 10 in first week | 3 reported | Met |
| Implementation within window | Complete by 02:00 | Complete at 01:30 | Met |

Review the implementation timeline, highlighting variances between planned and actual execution. For each significant variance (greater than 15 minutes or causing downstream delays), facilitate discussion on the cause and whether it was foreseeable.
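The 15-minute threshold can be applied mechanically to a timeline export before the meeting, so discussion time goes only to the steps that warrant it. A sketch using `awk` over comma-separated step/variance pairs; the input format is an illustrative assumption, with values mirroring the timeline table:

```shell
# Flag timeline steps whose absolute variance exceeds 15 minutes,
# the threshold that warrants discussion in the review.
# Input format (illustrative): step name,variance in minutes
flagged=$(awk -F, '($2 > 15 || $2 < -15) { print $1 ": " $2 " min" }' <<'EOF'
Begin maintenance window,0
Disable inbound mail flow,3
Database backup,-3
Apply gateway configuration,30
Verification testing,35
Enable inbound mail flow,45
End maintenance window,-30
EOF
)
echo "$flagged"
```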
Discuss incidents and issues encountered during implementation. For each incident, capture: what happened, how it was detected, what action was taken, and what prevented earlier detection or avoidance. Do not conduct root cause analysis during the PIR; instead, note items requiring deeper investigation and assign them to Problem Management.
Collect lessons learned using structured categories. Ask participants to identify what went well (to reinforce), what could improve (to change), and what surprised them (to investigate). Document each lesson with enough context for someone unfamiliar with the change to understand.
Identify improvement actions arising from the lessons. Each action requires an owner, target date, and clear definition of done. Limit actions to those within the authority of meeting participants or their direct management chain; escalate broader organisational changes through the continual improvement register.
Summarise the review outcome. State the overall assessment (successful, successful with issues, partially successful, failed), list key lessons, and confirm action owners and dates. Thank participants and confirm the timeline for report distribution.
Document lessons learned
Lessons require categorisation to enable pattern analysis across multiple reviews. The categorisation scheme distinguishes between lessons about the change itself and lessons about the change process:
```text
CHANGE EXECUTION                     CHANGE PROCESS

+---------------------------+        +---------------------------+
| Technical                 |        | Planning                  |
| - Architecture decisions  |        | - Scope definition        |
| - Configuration choices   |        | - Timeline estimation     |
| - Integration points      |        | - Resource allocation     |
| - Testing coverage        |        | - Risk identification     |
+---------------------------+        +---------------------------+
| Operational               |        | Coordination              |
| - Deployment sequence     |        | - Stakeholder engagement  |
| - Monitoring adequacy     |        | - Communication timing    |
| - Rollback readiness      |        | - Approval workflow       |
| - Support preparation     |        | - Vendor coordination     |
+---------------------------+        +---------------------------+
| Environmental             |        | Documentation             |
| - Infrastructure state    |        | - Plan completeness       |
| - Dependency behaviour    |        | - Runbook accuracy        |
| - External factors        |        | - Knowledge capture       |
+---------------------------+        +---------------------------+
```

Figure 2: Lesson categorisation enabling pattern analysis across reviews
Write each lesson as a complete statement that conveys meaning without requiring the full PIR context. Poor: “The certificate issue caused delays.” Better: “TLS certificate chain validation failed because the intermediate certificate was not included in the configuration bundle, causing 30 minutes of troubleshooting during the implementation window.”
Assign a category and subcategory to each lesson using the standard taxonomy. This enables querying lessons by type when planning similar future changes.
Indicate lesson polarity: positive lessons reinforce practices to continue, negative lessons identify practices to change, and neutral lessons note observations without clear directional guidance.
Link lessons to specific change types, technologies, or services where applicable. A lesson about certificate handling in the email gateway migration should be tagged with “TLS”, “email”, and “gateway” to surface when planning related changes.
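Once lessons carry consistent tags, surfacing them during planning becomes a simple filter. A sketch using `jq` over exported lesson metadata; the JSON layout and sample entries are illustrative assumptions, not an API contract:

```shell
# Filter exported lesson metadata by tag when planning a similar change.
# Sample data is illustrative; real lessons come from the knowledge base.
lessons='[
  {"title": "Include intermediate certificate in bundle",
   "tags": ["tls", "email", "gateway"], "polarity": "negative"},
  {"title": "Pre-staged DNS records sped up cutover",
   "tags": ["dns", "email"], "polarity": "positive"}
]'
echo "$lessons" | jq -r '.[] | select(.tags | index("tls")) | .title'
```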
Record lessons in the knowledge management system using the standard article template. Each lesson becomes a searchable knowledge article that can be referenced in future change planning.
```yaml
# Knowledge article metadata for lesson learned
article_type: lesson_learned
source_pir: PIR-2024-0847
source_change: CHG0012847
category: change_execution
subcategory: technical
polarity: negative
tags:
  - tls
  - certificates
  - email
  - gateway
technologies:
  - postfix
  - lets-encrypt
created: 2024-11-18
author: j.smith@example.org
```

Create and track improvement actions
Draft each improvement action with specific, measurable criteria for completion. Vague: “Improve certificate handling.” Specific: “Update the TLS deployment checklist to include certificate chain validation step, and add automated chain verification to the deployment pipeline by 2024-12-15.”
Assign an owner who has authority to complete the action or escalate appropriately. The owner need not perform the work personally but is accountable for completion.
Set a target date based on action complexity and owner capacity. Quick wins (documentation updates, checklist additions) target 2 weeks. Process changes target 4-6 weeks. Tool implementations target 8-12 weeks.
Register actions in the continual improvement register with the PIR reference. This creates visibility for improvement governance and prevents duplicate efforts across reviews.
```shell
# Register improvement action
curl -X POST "https://itsm.example.org/api/v1/improvements" \
  -H "Authorization: Bearer $ITSM_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Add certificate chain validation to deployment pipeline",
    "description": "Automated verification that TLS certificate chains are complete before deployment proceeds",
    "source_type": "pir",
    "source_ref": "PIR-2024-0847",
    "owner": "j.smith@example.org",
    "target_date": "2024-12-15",
    "category": "process",
    "priority": "medium"
  }'
```

- Schedule follow-up verification for each action. Add calendar reminders for the target date plus 5 working days to confirm completion and close the action.
The action tracking workflow ensures improvements progress from identification through implementation:
```text
PIR Meeting
     |
     v
+----------+     +----------+     +----------+     +----------+
| Identify |---->|  Draft   |---->|  Assign  |---->| Register |
|          |     |  Action  |     |  Owner   |     |  in CSI  |
+----------+     +----------+     +----------+     +----+-----+
                                                        |
     +--------------------------------------------------+
     |
     v
+----------+     +----------+     +----------+     +----------+
|  Owner   |---->|   Work   |---->|  Verify  |---->|  Close   |
| Accepts  |     |  Action  |     | Complete |     |  Action  |
+----------+     +----+-----+     +----------+     +----------+
                      |
                      | If blocked
                      v
                 +----------+
                 | Escalate |
                 +----------+
```

Figure 3: Action tracking from identification through verified completion
Finalise and distribute the report
Compile the PIR report within 3 working days of the review meeting. The report consolidates all findings into a single document that serves as the permanent record of the review.
Structure the report with the following sections: executive summary, change overview, success criteria assessment, implementation timeline analysis, incidents and issues, lessons learned, improvement actions, and appendices containing raw data.
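A skeleton with the required sections can be generated up front so nothing is omitted during compilation. A minimal sketch; the file name and Markdown layout are illustrative assumptions, not a mandated report format:

```shell
# Generate a PIR report skeleton containing the required sections.
change=CHG0012847
out="pir-report-${change}.md"
{
  echo "# PIR Report - ${change}"
  for section in "Executive summary" "Change overview" \
      "Success criteria assessment" "Implementation timeline analysis" \
      "Incidents and issues" "Lessons learned" \
      "Improvement actions" "Appendices"; do
    # One placeholder heading per required section
    printf '\n## %s\n\nTBD\n' "$section"
  done
} > "$out"
```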
Review the draft report with the change owner before distribution. Confirm factual accuracy and appropriate tone. PIR reports should be honest without being punitive; the goal is learning, not blame.
Distribute the final report to: all PIR participants, the change owner’s manager, the service owner, and the CAB representative. For major changes, include the IT leadership team.
Archive the report in the knowledge management system linked to the change record. Update the change record with the PIR reference number and overall assessment.
```shell
# Update change record with PIR completion
curl -X PATCH "https://itsm.example.org/api/v1/changes/CHG0012847" \
  -H "Authorization: Bearer $ITSM_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "pir_completed": true,
    "pir_reference": "PIR-2024-0847",
    "pir_outcome": "successful_with_issues",
    "pir_date": "2024-11-18"
  }'
```

- Update relevant knowledge base articles based on lessons learned. If the PIR identified documentation gaps or inaccuracies, create tasks to address them and link to the PIR as justification.
Verification
After completing the PIR process, verify that all outputs exist and are properly linked.
Confirm the PIR document exists and contains required sections:
```shell
# Verify PIR document completeness
curl -s "https://docs.example.org/api/v1/articles/PIR-2024-0847" \
  -H "Authorization: Bearer $DOCS_TOKEN" | \
  jq '{
    exists: (.id != null),
    has_summary: (.content | contains("Executive Summary")),
    has_criteria: (.content | contains("Success Criteria")),
    has_lessons: (.content | contains("Lessons Learned")),
    has_actions: (.content | contains("Improvement Actions")),
    linked_change: .metadata.source_change
  }'
```

Expected output:

```json
{
  "exists": true,
  "has_summary": true,
  "has_criteria": true,
  "has_lessons": true,
  "has_actions": true,
  "linked_change": "CHG0012847"
}
```

Verify improvement actions are registered:
```shell
# Query improvement register for PIR actions (-G sends the query as GET parameters)
curl -s -G "https://itsm.example.org/api/v1/improvements" \
  -H "Authorization: Bearer $ITSM_TOKEN" \
  --data-urlencode "query=source_ref=PIR-2024-0847" | \
  jq '.result[] | {id, title, owner, target_date, status}'
```

Confirm the change record is updated with the PIR reference:
```shell
curl -s "https://itsm.example.org/api/v1/changes/CHG0012847" \
  -H "Authorization: Bearer $ITSM_TOKEN" | \
  jq '{pir_completed, pir_reference, pir_outcome, pir_date}'
```

Expected output showing PIR linkage:

```json
{
  "pir_completed": true,
  "pir_reference": "PIR-2024-0847",
  "pir_outcome": "successful_with_issues",
  "pir_date": "2024-11-18"
}
```

Verify lesson articles created in knowledge base:
```shell
# Count lessons linked to PIR (-G sends the query as GET parameters)
curl -s -G "https://docs.example.org/api/v1/articles" \
  -H "Authorization: Bearer $DOCS_TOKEN" \
  --data-urlencode "query=metadata.source_pir=PIR-2024-0847" | \
  jq '.total_count'
```

The count should match the number of lessons documented in the PIR meeting.
Troubleshooting
| Symptom | Cause | Resolution |
|---|---|---|
| Participants unavailable within required timeframe | Staff on leave, competing priorities, or inadequate notice | Schedule PIR during change planning phase by blocking calendar time at implementation + review interval; escalate to service owner if availability remains blocked |
| Success criteria not documented in change record | Change approved without explicit criteria, or criteria discussed verbally but not recorded | Reconstruct criteria with change owner based on change justification; document gap as lesson learned; update change template to require criteria |
| Monitoring data unavailable for affected services | Services not instrumented, monitoring retention too short, or access restrictions | Use available proxy metrics (user complaints, support tickets); document monitoring gap as improvement action; extend retention for critical services |
| PIR meeting becomes blame session | Participants defensive, focus on individual actions rather than systemic factors | Restate ground rules; redirect from “who” to “what” and “why”; if behaviour continues, pause meeting and resume after private conversation with participants |
| Implementation timeline cannot be reconstructed | Logs deleted, no timestamps recorded, or work performed without documentation | Gather participant recollections; accept reduced accuracy; document logging gap as improvement action |
| Lessons repeat across multiple PIRs | Previous improvement actions not completed, or not addressing root cause | Query improvement register for related open actions; escalate completion blockers; consider whether lesson indicates systemic issue requiring different intervention |
| No improvement actions identified from review | Review too superficial, participants unwilling to critique, or genuinely smooth implementation | Probe with specific questions about risk areas; review against change management maturity criteria; if truly no improvements, document as positive outcome |
| Change owner disputes PIR assessment | Disagreement on criteria interpretation or measurement | Refer to documented criteria; if ambiguous, note dispute in report; escalate to CAB chair for resolution if agreement impossible |
| Actions assigned to people without capacity | Owner overloaded or lacks authority to complete action | Reassign to appropriate owner or escalate to manager for resource allocation; break large actions into smaller deliverables |
| PIR report distribution blocked by sensitivity concerns | Lessons involve personnel issues or vendor disputes | Create redacted version for general distribution; restricted version for relevant managers; legal review if contractual issues involved |
| Knowledge base articles not searchable | Incorrect metadata, missing tags, or indexing delay | Verify article metadata matches taxonomy; manually add tags; allow 24 hours for index refresh; check article visibility permissions |
| Improvement action marked complete but problem recurs | Action addressed symptom rather than cause, or implementation incomplete | Reopen action; conduct deeper analysis potentially involving Problem Management; verify completion criteria were actually met |
Contextual considerations
Organisations with limited IT capacity often struggle to conduct PIRs consistently. A single IT person handling multiple responsibilities may view PIRs as administrative overhead that delays moving to the next urgent task. For these contexts, adopt a lightweight approach: conduct PIR as a 15-minute self-review using a standard checklist, document only the most significant lesson, and register only the highest-impact improvement action. This minimal approach captures value without creating unsustainable process burden.
The checklist for self-review PIRs covers five questions: Did the change achieve its objective? What took longer than expected? What would you do differently? What documentation needs updating? What one improvement would most help the next similar change? Recording answers to these questions takes under 10 minutes and creates sufficient record for future reference.
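The five questions translate directly into a minimal record that takes minutes to complete. A sketch; the file name and layout are illustrative assumptions, and the sample answers reuse details from the email gateway example:

```shell
# Create a minimal self-review PIR record for low-capacity contexts.
change=CHG0012847
cat > "pir-self-review-${change}.md" <<EOF
# Self-review PIR: ${change}

1. Did the change achieve its objective?  yes
2. What took longer than expected?  certificate troubleshooting (+30 min)
3. What would you do differently?  validate the certificate chain in advance
4. What documentation needs updating?  TLS deployment checklist
5. What one improvement would most help the next similar change?  automated chain verification
EOF
```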
Organisations with federated IT structures face coordination challenges when changes span multiple autonomous teams. Each team may have different PIR practices, documentation locations, and improvement registers. For cross-team changes, designate a lead team responsible for facilitating the PIR and consolidating findings. Distribute the final report to all participating teams and register improvement actions in each team’s register as appropriate. Where actions require coordination across teams, escalate to the governance body that spans those teams.
For organisations operating in field environments with connectivity constraints, PIR meetings may need to occur asynchronously. Distribute the PIR preparation pack via email with a deadline for written feedback. Compile responses into a consolidated document and circulate for final comment before finalising. While this approach loses the dynamic discussion of synchronous meetings, it enables participation from staff in locations where video conferencing is unreliable.
Grant-funded changes require additional PIR considerations. Document whether the change delivered the capabilities described in the grant proposal and whether implementation costs aligned with budget. This information supports grant reporting and informs future proposal development. If implementation revealed scope changes or cost variances, note these for finance and grants management teams.