Article updated in March 2026 for the PMBOK® Guide — Eighth Edition.
Perform Risk Analysis in PMBOK 8 — Complete Guide
Formerly known as: Perform Qualitative Risk Analysis + Perform Quantitative Risk Analysis (PMBOK 6) — consolidated into one unified process in PMBOK 8
A major infrastructure project team had a risk register with 47 identified risks. Without a structured analysis process, every risk seemed equally urgent, and the team was paralyzed by the sheer volume. The project manager called a two-day risk workshop and put all 47 risks on the wall. By the end of day one, using a structured probability-impact assessment, the 47 risks had been sorted into a priority order: 6 high-priority risks requiring immediate response planning, 14 medium-priority risks requiring monitoring and contingent responses, and 27 low-priority risks to be tracked but not actively managed. The team spent day two developing detailed response plans for the six high-priority risks. Three of those six risks materialized during the project. All three were managed successfully — because the response plans had been developed when the team was calm and analytical, not reactive and under pressure. The 41 remaining risks, by contrast, required no special response — their prioritization had correctly identified them as manageable within normal project management practice.
Risk identification generates a list. Risk analysis transforms that list into actionable priorities. In PMBOK 8, Perform Risk Analysis is Process 3 of the Risk Domain, and it represents one of the most significant structural changes from PMBOK 6: the consolidation of the two separate analysis processes — Perform Qualitative Risk Analysis and Perform Quantitative Risk Analysis — into a single, unified, iterative process. This consolidation reflects the way risk analysis actually works in practice: qualitative and quantitative analysis are not sequential stages — they are complementary lenses applied iteratively to the same risk register throughout the project.
This complete guide covers every dimension of Perform Risk Analysis as defined in PMBOK 8:
- What it is — definition, the PMBOK 6 consolidation explained, and what changed
- Why use it — direct benefits and the cost of inadequate risk analysis
- Full ITTO — every input, tool, technique, and output explained
- Step-by-step application guide — from risk register to prioritized risk analysis
- When to apply it — triggers and iterative analysis cadence
- Two real-world examples — Project Phoenix (website launch) and Project ProjectAdm (SaaS PM platform)
- Templates and tools — with free downloads
- Five common errors — and how to avoid each one
- Tailoring — predictive, agile, and hybrid approaches
- Process interactions — what feeds into risk analysis and what depends on it
- Quick-application checklist — 10 items you can use today
1. What Is Perform Risk Analysis
Perform Risk Analysis is an iterative process that combines qualitative and quantitative risk analysis activities. Qualitative analysis is conducted throughout the project to evaluate individual project risks by assessing their probability of occurrence and their impact on project objectives. The assessment may also consider other characteristics such as manageability, the timing of possible impacts, relationships with other risks, and common causes or effects. Quantitative analysis is not required on every project; when it is applied, it too is conducted throughout the project, numerically analyzing the combined effect of identified individual project risks and other sources of uncertainty on overall project objectives.
In PMBOK 8, this is Process 3 of the Risk Domain. It follows Identify Risks (which produces the initial risk register) and precedes Plan Risk Responses (which develops the response strategy for prioritized risks). The process is iterative — risk analysis is not performed once at the beginning of the project but revisited continuously as new risks are identified and as the project context evolves.
The PMBOK 6 consolidation: why two processes became one
| Aspect | PMBOK 6 — Perform Qualitative Risk Analysis | PMBOK 6 — Perform Quantitative Risk Analysis | PMBOK 8 — Perform Risk Analysis |
|---|---|---|---|
| Focus | Probability-impact assessment; risk prioritization | Numerical analysis of combined risk effect on project objectives | Both: qualitative prioritization + quantitative numerical analysis in one iterative process |
| Timing | After risk identification; before risk response planning | After qualitative analysis, for high-priority risks; optional | Iterative throughout the project; both dimensions applied continuously as conditions evolve |
| Outputs | Risk register updates (priority, probability, impact, risk score) | Risk register updates (quantitative probability, exposure amount, contingency reserve) | Integrated project document updates including both qualitative and quantitative risk assessments |
| Applicability | Required for all projects | Optional; required for complex or large projects with significant quantitative uncertainty | Qualitative analysis: required for all projects. Quantitative analysis: applied when needed based on project complexity and risk exposure. |
The consolidation in PMBOK 8 reflects the reality of sophisticated risk management practice: experienced risk practitioners do not perform qualitative analysis, file the results, and then separately perform quantitative analysis. They use qualitative assessment to identify the risks that warrant quantitative analysis, apply quantitative tools to the highest-priority risks, and integrate the results into a unified risk priority picture. The two analyses are complementary, not sequential.
2. Why Use Perform Risk Analysis
Risk identification produces a list. Risk analysis produces a strategy. Without analysis, the risk register is a catalog of concerns, not a management instrument.
Direct benefits
- Focuses management attention on the risks that matter: A project with 50 identified risks cannot actively manage all 50. Risk analysis identifies the 5–10 risks that warrant immediate response planning and separates them from the 40–45 that can be tracked with lower intensity. This prioritization is the most direct contribution of risk analysis to project management effectiveness.
- Provides quantified risk exposure for contingency planning: Quantitative risk analysis (Monte Carlo simulation, decision tree analysis, sensitivity analysis) converts probability-and-impact estimates into expected monetary value (EMV) and contingency reserve calculations that are defensible to sponsors and finance committees.
- Identifies risk interactions and correlations: Multiple medium-probability risks that share a common cause may have a combined high probability of materializing together. Risk analysis identifies these correlations, preventing the underestimation of combined risk exposure that results from treating risks as independent.
- Informs schedule and cost risk assessments: Quantitative risk analysis on schedule activities produces probability distributions for project duration and cost that allow the PM to quantify the probability of meeting the project baseline — and to understand what contingency is needed to achieve a target confidence level.
- Supports transparent communication about project risk: Risk analysis results (risk priority, risk scores, quantified exposure) provide the objective data needed for clear risk communication to sponsors, steering committees, and clients. Subjective risk descriptions are difficult to act on; analyzed risk data is actionable.
- Enables evidence-based risk response investment decisions: When the expected monetary value of a risk is quantified, the investment required for a risk response can be evaluated against the exposure being avoided. A $50,000 mitigation measure for a risk with $20,000 EMV is not a good investment; a $5,000 mitigation measure for the same risk is clearly worthwhile. Quantitative analysis makes this trade-off visible and defensible.
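The EMV trade-off described above can be sketched in a few lines. This is a minimal illustration, not an official PMBOK formula beyond EMV = probability × impact; the dollar figures mirror the hypothetical example in the bullet.

```python
# Illustrative EMV-based response investment check (hypothetical figures).

def emv(probability: float, impact: float) -> float:
    """Expected monetary value of a risk: probability x monetary impact."""
    return probability * impact

def mitigation_worthwhile(risk_emv: float, mitigation_cost: float,
                          residual_emv: float = 0.0) -> bool:
    """A mitigation is justified when its cost is below the exposure it removes."""
    return mitigation_cost < (risk_emv - residual_emv)

risk = emv(0.4, 50_000)  # 40% chance of a $50,000 impact -> $20,000 EMV
print(risk)                                  # 20000.0
print(mitigation_worthwhile(risk, 50_000))   # False: $50k to remove $20k exposure
print(mitigation_worthwhile(risk, 5_000))    # True: $5k to remove $20k exposure
```

In practice the residual EMV after mitigation is rarely zero, which is why the function takes it as a parameter.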
The cost of inadequate risk analysis
- Resource waste on low-priority risks: Without prioritization, teams spend equal management effort on all risks — including risks that are highly unlikely or minimally impactful. This wastes resources that should be directed at high-priority risks.
- Under-resourcing of high-priority risks: The corollary of wasting resources on low-priority risks is under-investment in the risks that actually threaten the project. When a high-priority risk materializes without adequate preparation, the response is reactive, expensive, and often insufficient.
- Inaccurate contingency reserves: Without quantitative risk analysis, contingency reserves are typically set as a round percentage of project budget without a defensible basis. The reserve is either too high (wasteful) or too low (inadequate for actual risk exposure).
- Sponsor surprises: When a significant risk materializes without having been analyzed and communicated, the sponsor experiences it as an unexpected crisis rather than a managed project event. This damages trust and creates governance friction that persists beyond the specific risk event.
3. Inputs, Tools & Techniques, and Outputs (ITTO)
The following table presents the complete ITTO of the Perform Risk Analysis process as defined in PMBOK 8 (p. 203):
| Inputs | Tools & Techniques | Outputs |
|---|---|---|
| Risk management plan | Risk probability and impact assessment | Risk register updates |
| Scope, schedule, and cost baselines | Probability and impact matrix | Risk report updates |
| Risk register | Risk categorization | Assumption log updates |
| Assumption log | Simulations (Monte Carlo) | |
| Cost and duration estimates | Sensitivity analysis | |
| Stakeholder register | Decision tree analysis | |
| | Influence diagrams | |
| | Facilitation | |
Inputs explained
Risk management plan: Provides the probability-impact scales, risk categories, assessment methodology, and authority structure that govern the analysis. Risk analysis cannot be conducted consistently without the reference framework established in the risk management plan.
Scope, schedule, and cost baselines: The baselines define the specific project objectives at risk. Risk impact assessments (e.g., “this risk could delay the project by X weeks” or “this risk could increase the budget by $Y”) are only meaningful in the context of the specific project baselines. A 3-week impact on a 12-week project is catastrophic; the same impact on a 2-year program is manageable.
Risk register: The current inventory of identified risks. The risk register is the primary working document for risk analysis — all analysis results are documented back into the register, updating each risk’s probability, impact, priority score, and assessment notes.
Assumption log: Undocumented or unvalidated assumptions in the assumption log represent potential risks. Risk analysis should include a review of the assumption log to identify any assumptions that are both high-impact (if false) and currently unvalidated, converting them into risk register entries.
Cost and duration estimates: Provide the baseline values against which risk impacts are quantified. Without current cost and duration estimates, quantitative risk impact assessments have no measurable baseline to anchor to.
Stakeholder register: Stakeholder risk tolerances documented in the register inform which risk impacts are most significant for this stakeholder community — and therefore which risks should receive the highest analysis priority.
Tools & Techniques explained
Risk probability and impact assessment: The core qualitative analysis technique. For each identified risk, the analysis team assesses the probability of occurrence (using the probability scale from the risk management plan) and the impact on each project objective if it occurs (using the impact scale from the risk management plan). The results are plotted on the probability-impact matrix to assign a risk priority (high/medium/low). This assessment should involve the risk owner, subject-matter experts, and the PM — not be performed unilaterally by the PM alone.
Probability and impact matrix: A visual grid that combines probability and impact scores to produce risk priority ratings. The matrix defines the zones (green/yellow/red) that correspond to low/medium/high priority. Risks falling in the red zone require immediate response planning; risks in the yellow zone require monitoring and contingent plans; risks in the green zone are tracked with lower intensity. The matrix is the primary tool for communicating risk priority to stakeholders.
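The scoring logic behind the matrix can be sketched as follows. This assumes a 5×5 ordinal scale with illustrative zone thresholds (score ≥ 15 red, ≥ 6 yellow); real scales and thresholds come from the project's risk management plan, not from this sketch.

```python
# Minimal sketch of a 5x5 probability-impact scoring scheme.
# Zone thresholds are illustrative assumptions, not PMBOK-defined values.

def risk_score(probability: int, impact: int) -> int:
    """Probability and impact on 1-5 ordinal scales; score = P x I."""
    return probability * impact

def priority_zone(score: int) -> str:
    if score >= 15:
        return "red"     # high priority: immediate response planning
    if score >= 6:
        return "yellow"  # medium priority: monitoring + contingent plans
    return "green"       # low priority: tracked on the watch list

print(priority_zone(risk_score(5, 5)))  # red
print(priority_zone(risk_score(3, 3)))  # yellow
print(priority_zone(risk_score(1, 2)))  # green
```

Encoding the thresholds once, in code or in a spreadsheet, keeps every assessment session consistent with the plan's definitions.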
Risk categorization: Organizing risks by category (from the Risk Breakdown Structure) to identify patterns: multiple risks in the same technical area may indicate a systemic architectural problem rather than isolated risk events. Risk categorization helps the analysis team identify root causes that, if addressed, would eliminate or reduce multiple risks simultaneously.
Simulations (Monte Carlo): A quantitative technique that uses probability distributions for activity durations and costs (rather than single-point estimates) to model the range of possible project outcomes through thousands of simulated project runs. Monte Carlo simulation produces probability distributions for project completion date and cost, showing the PM the probability of meeting the baseline under the current risk profile. For example: “Given the identified risks and their probability-impact profiles, there is a 60% probability of completing on schedule, an 80% probability of completing within 2 weeks of the target date, and a 50% probability of completing within budget.” This output directly informs the contingency reserve calculation and the escalation conversation with the sponsor.
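A hedged sketch of the simulation idea, using Python's standard library: triangular distributions built from three-point estimates, summed over thousands of runs. The activity names, estimates, and 43-day baseline are invented for illustration; real models also handle network logic and correlations, which this toy version omits.

```python
# Toy Monte Carlo schedule simulation over three sequential activities.
import random

activities = {  # (optimistic, most_likely, pessimistic) durations in days
    "design":      (8, 10, 15),
    "build":       (20, 25, 40),
    "integration": (5, 8, 20),
}

def simulate_durations(runs: int = 10_000, seed: int = 42) -> list[float]:
    rng = random.Random(seed)  # fixed seed for a reproducible sketch
    return [
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in activities.values())
        for _ in range(runs)
    ]

durations = sorted(simulate_durations())
p50 = durations[len(durations) // 2]
p80 = durations[int(len(durations) * 0.80)]
baseline = 43  # sum of the most-likely estimates
confidence = sum(d <= baseline for d in durations) / len(durations)
print(f"P50 = {p50:.1f} days, P80 = {p80:.1f} days, "
      f"P(meet {baseline}-day baseline) = {confidence:.0%}")
```

The gap between the P50/P80 results and the single-point baseline is exactly the conversation the technique is meant to trigger with the sponsor.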
Sensitivity analysis: A quantitative technique that identifies which risks have the greatest relative impact on the project objective. Tornado diagrams are the most common visual representation: risks are ranked by their impact range (the spread between their high-impact and low-impact scenarios), producing a visual priority list of the risks that most need attention. Sensitivity analysis answers: “Of all the risks we have identified, which one would hurt us most if it materialized at its maximum impact?”
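The ranking behind a tornado diagram is simple to sketch: sort risks by the spread between their high-impact and low-impact scenarios. The risk names and dollar figures below are invented for illustration.

```python
# Sketch of tornado-diagram ranking: widest impact spread first.

risks = {  # (low-impact scenario $, high-impact scenario $), hypothetical
    "API integration delay": (1_000, 9_000),
    "Asset delivery slip":   (500, 3_500),
    "Hosting upgrade":       (2_000, 6_000),
}

tornado = sorted(risks.items(), key=lambda kv: kv[1][1] - kv[1][0], reverse=True)
for name, (lo, hi) in tornado:
    print(f"{name:24s} spread = ${hi - lo:,}")
# "API integration delay" comes out on top: the widest bar on the diagram.
```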
Decision tree analysis: A quantitative technique for analyzing decisions under uncertainty by mapping decision alternatives, probabilistic outcomes, and associated values in a tree diagram. Decision tree analysis calculates the expected monetary value (EMV) of each decision path, supporting evidence-based selection of risk response strategies (e.g., comparing the EMV of a proactive mitigation investment against the EMV of accepting the risk and using contingency reserves).
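The mitigate-versus-accept comparison mentioned above can be worked through numerically. All figures here are hypothetical, chosen only to show the EMV mechanics of the two decision paths.

```python
# Illustrative decision-tree EMV comparison: pay for mitigation up front
# vs. accept the risk and rely on contingency reserves.

def path_emv(fixed_cost: float, outcomes: list[tuple[float, float]]) -> float:
    """EMV of a decision path: fixed cost + probability-weighted outcome costs."""
    return fixed_cost + sum(p * cost for p, cost in outcomes)

# Option A: invest $8,000 in mitigation; residual 10% chance of a $30,000 impact.
mitigate = path_emv(8_000, [(0.10, 30_000), (0.90, 0)])
# Option B: accept the risk; 40% chance of the full $30,000 impact.
accept = path_emv(0, [(0.40, 30_000), (0.60, 0)])

print(mitigate, accept)                               # 11000.0 12000.0
print("mitigate" if mitigate < accept else "accept")  # mitigate
```

Here mitigation wins by $1,000 of expected cost; with a cheaper risk or a costlier mitigation, the tree would flip to acceptance.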
Influence diagrams: Graphical representations of causal relationships between project variables. When multiple risks share common causes or have correlated effects, influence diagrams reveal the dependency structure that point-in-time risk assessments miss. This is particularly valuable for identifying risk cascades — where a single triggering event generates a chain of secondary risks.
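Why the dependency structure matters can be shown with a toy calculation: two risks sharing a common cause have a far higher joint probability than treating them as independent would suggest. The probabilities below are invented for illustration, and conditional independence given the cause is an assumed simplification.

```python
# Toy sketch: joint probability of two risks sharing a common cause
# vs. the naive independence assumption.

p_cause = 0.30            # P(common cause, e.g. vendor instability)
p_risk_given_cause = 0.80  # P(risk | cause present)
p_risk_no_cause = 0.10     # P(risk | cause absent)

# Marginal probability of each risk alone:
p_risk = p_cause * p_risk_given_cause + (1 - p_cause) * p_risk_no_cause   # 0.31
# Both risks together, conditionally independent given the cause:
p_both = (p_cause * p_risk_given_cause**2
          + (1 - p_cause) * p_risk_no_cause**2)                           # 0.199
# What naive independence would predict:
p_both_if_independent = p_risk**2                                         # ~0.096

print(round(p_risk, 3), round(p_both, 3), round(p_both_if_independent, 3))
```

The shared cause roughly doubles the joint probability versus the independence estimate, which is the underestimation the influence diagram is meant to expose.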
Facilitation: The interpersonal skill of structuring group risk analysis sessions to produce objective, unbiased assessments. Risk workshops are vulnerable to anchoring bias (first assessments anchor subsequent ones), groupthink, and the tendency to dismiss risks that challenge the team’s optimistic narrative. Skilled facilitation actively counters these biases through structured techniques (e.g., anonymous individual assessments before group discussion, devil’s advocate roles, scenario planning).
Outputs explained
Risk register updates: The primary output. Each risk in the register is updated with: probability assessment (using the plan’s scale); impact assessments (for each project objective affected); risk score (probability × impact); priority classification (high/medium/low from the P&I matrix); qualitative assessment notes (context, assumptions, key drivers); quantitative assessments where applicable (EMV, simulation results); and watch list designation for low-priority risks.
↓ Free template available in section 7.
Risk report updates: A summary-level report of the overall project risk profile, including: total number of risks by priority category; trend analysis (are risks increasing or decreasing?); top 5 risks with current status; quantitative risk exposure summary (total EMV, contingency reserve adequacy); and any new risk categories or concentrations identified in the analysis.
Assumption log updates: If the analysis identifies that any documented assumption is both highly uncertain and highly impactful (if false), those assumptions are escalated as risks in the risk register and flagged for priority validation in the assumption log.
4. Step-by-Step Application Guide
Step 1 — Prepare the analysis session
Before the risk analysis session, ensure: the risk register is current (all identified risks are entered); the probability-impact scales from the risk management plan are printed or displayed; and the right people are in the room (risk owners, technical leads for each risk category, PM, and a facilitator). Brief participants on the assessment methodology to ensure consistent application of the probability-impact scales.
Step 2 — Conduct qualitative probability-impact assessments
For each risk in the register, conduct a structured assessment: What is the probability of this risk occurring during the project? What is the impact on each affected project objective (scope, schedule, cost, quality) if it occurs? Use the risk management plan’s defined scales. For each risk, reach a documented consensus (or document dissenting views with rationale). Assign the risk score (probability × impact) and plot on the P&I matrix.
Step 3 — Categorize and identify patterns
Organize the assessed risks by RBS category. Identify categories with disproportionate concentrations of high-priority risks — these categories indicate systemic risk drivers that may be addressable at the root cause level. Document the pattern analysis in the risk report.
Step 4 — Determine which risks require quantitative analysis
Apply quantitative analysis to risks that: are high-priority (red zone in the P&I matrix); have significant schedule or cost impact; affect the project’s critical path; or require a defensible contingency reserve calculation for sponsor or steering committee reporting. The threshold for applying quantitative analysis should be defined in the risk management plan. Not every project requires Monte Carlo simulation — small, simple projects can be managed effectively with qualitative analysis alone.
Step 5 — Conduct quantitative risk analysis
For risks that meet the quantitative analysis threshold: build probability distributions for the affected activities (three-point estimates: optimistic, most likely, pessimistic); run Monte Carlo simulation to produce probability distributions for project duration and cost; conduct sensitivity analysis (tornado diagram) to identify the highest-impact risks; calculate EMV for decision-point risks where response trade-off decisions are needed. Document quantitative results in the risk register and risk report.
Step 6 — Update the risk register and risk report
Update each risk record with the analysis results: qualitative scores, quantitative metrics where applicable, priority classification, and any revised assumptions. Compile the risk report summary: overall risk profile, top risks, exposure metrics, trend analysis. Communicate the analysis results to the sponsor and key stakeholders through the channels defined in the risk management plan.
5. When to Apply the Process
- After each risk identification session: Every time new risks are identified (including at project initiation, at phase gates, and during ongoing monitoring), the newly identified risks must be analyzed before they can be effectively managed.
- Iteratively throughout the project: Risk analysis is not a one-time event. As the project progresses, risk probabilities and impacts change. Risks that were low-probability in planning may become high-probability in execution. Risks that have been successfully mitigated should be downgraded. The risk register must reflect the current risk profile, not the profile from month one.
- Before major risk response decisions: Decision tree analysis and EMV calculations should be conducted before committing to a significant risk response investment, to ensure that the response is proportionate to the risk exposure.
- At phase gates: Comprehensive risk analysis is a mandatory input to phase gate reviews. The sponsor and steering committee need the current risk profile to make an informed decision about proceeding to the next phase.
- After significant project changes: Scope changes, schedule compression, or major resource changes alter the risk landscape. Risk analysis should be repeated after any significant approved change to update the risk register’s accuracy.
6. Practical Examples
Example 1 — Website Launch: Project Phoenix
Context: TechCorp PM Alex Morgan PMP managing Project Phoenix for CEO Sarah Chen. Budget: $72,250. Duration: 90 days. Approach: 2-week agile sprints.
How Perform Risk Analysis was applied:
Alex conducted the initial risk analysis in a 90-minute session with the senior developer and inbound analyst at the start of Sprint 1. The risk register had 12 identified risks from the identification session. Each risk was assessed against the probability-impact scales defined in the risk management plan.
The analysis produced three high-priority risks: (1) HubSpot API documentation is incomplete, causing integration delays — probability: High (50%), impact: High ($5,000–$8,000, 5–10 day delay), risk score: 25; (2) Client fails to deliver brand assets on time — probability: Medium (30%), impact: Medium ($2,000–$3,000, 3–5 days delay), risk score: 9; (3) Hosting infrastructure cannot achieve sub-2-second load time without upgrade — probability: Medium (25%), impact: Medium-High ($3,000–$5,000, 3–7 day delay), risk score: 10.
For risk #1 (HubSpot API), Alex applied a simple EMV calculation: 50% probability × $6,500 expected impact = $3,250 EMV. This validated the decision to include $3,000 in contingency for HubSpot-related risk responses. The sensitivity analysis confirmed that risk #1 was the highest single-risk driver of cost variance on the project.
The seven low-priority risks were placed on the watch list, reviewed weekly at the standup through a brief (2-minute) status check on each item.
The prediction that came true: Risk #1 (HubSpot API documentation) materialized in Sprint 3. The pre-analyzed risk had a documented response: bring in external HubSpot consulting hours, funded from the pre-authorized contingency. Total cost: $650, well below the $3,250 EMV, because the response was implemented before the full impact materialized. The analysis did not prevent the risk; it ensured the response was proportionate, swift, and pre-authorized.
Example 2 — SaaS PM Platform: Project ProjectAdm (Software Development)
Context: Eduardo Montes (CEO/PM) building ProjectAdm over 18 months with 10-person team. GDPR/CCPA compliance. Multi-region AWS architecture.
How Perform Risk Analysis was applied:
Eduardo structured the risk analysis at two levels: an initial comprehensive analysis at project start, and iterative quarterly re-analysis throughout the 18-month project. Each analysis session used qualitative probability-impact assessment for all risks, with quantitative analysis (Monte Carlo and decision tree) applied to the six risks identified as high-priority in the initial assessment.
The most important quantitative analysis concerned the GDPR compliance certification timeline. The certification process had been estimated at 3 weeks, but the risk register included a risk: “GDPR audit preparation takes longer than estimated, delaying the certification and the public launch.” Risk analysis parameters: probability 40%, impact range: 2–12 weeks of delay. The Monte Carlo simulation on the compliance milestone’s timeline produced a probability distribution showing: 60% probability of completing within 3 weeks (the plan baseline), 80% probability of completing within 6 weeks, 95% probability of completing within 10 weeks. The simulation revealed that the plan’s 3-week assumption was optimistic — it represented only the 60th percentile outcome, not the expected outcome.
This insight drove a critical schedule adjustment: Eduardo extended the compliance certification milestone from 3 weeks to 5 weeks in the project plan, reducing the probability of a launch delay from 40% to approximately 15%. The 2-week schedule extension cost $12,000 in additional team time allocated to compliance preparation, but prevented a post-launch GDPR violation risk that had an estimated remediation cost of $45,000–$90,000.
The sensitivity analysis (tornado diagram) across all 28 identified risks revealed that three risks accounted for 72% of the total project cost variance: GDPR compliance timeline (32%), AWS infrastructure cost overrun (24%), and QA capacity constraint (16%). This concentration guided Eduardo to allocate 80% of his risk management attention to these three risks, rather than distributing effort evenly across all 28.
The risk analysis was updated quarterly. In month 12, two previously low-priority risks were elevated to high-priority when new information changed their probability assessments: a competitive product launched by a major vendor changed the “market adoption risk” from low to high, and a change in EU data processing regulations increased the probability of a compliance re-certification requirement. Both risk escalations were captured in the quarterly re-analysis before they materialized as surprises.
Result: The quantitative risk analysis on the GDPR compliance timeline directly prevented a launch delay that would have cost $35,000–$50,000 in extended team costs. The sensitivity analysis’s identification of the three high-impact risks allowed Eduardo to focus risk management resources where they created the most value. ProjectAdm launched on time and within the compliance certification window.
7. Free and Recommended Templates
| Document | Free download |
|---|---|
| Risk Register (Software Development): Risk ID, category, description, probability, impact, risk score, P&I matrix position, owner, response strategy, status | Download free template |
| Risk Report Template: Executive risk summary, top 5 risks, overall exposure, trend analysis, decisions required | Download free template |
| Risk Management Plan Template: P&I scales, risk categories, probability-impact matrix, authority structure | Download free template |
8. Five Common Errors — and How to Avoid Each One
Error 1 — Treating risk analysis as a one-time activity
Why it happens: The risk register is assessed at project initiation and never updated. Six months later, the risk register reflects the project’s risk profile from month one, not month six. High-priority risks that have been resolved are still listed as high priority; new high-priority risks that emerged during execution are not yet in the register.
How to avoid it: Risk analysis is iterative. Build regular risk register reviews into the project management rhythm: a brief qualitative review at each sprint planning or monthly status meeting, and a comprehensive re-analysis at each phase gate or after significant project changes. A risk register that is not current is not a risk management tool — it is a historical document.
Error 2 — Assigning a single impact score to each risk regardless of which objective it affects
Why it happens: A single impact score is assigned to each risk without distinguishing whether the impact affects schedule, cost, quality, or scope. A risk that has a high impact on schedule and a low impact on cost receives the same treatment as a risk that has a high impact on both — which is an incorrect prioritization.
How to avoid it: Use separate impact assessments for each project objective. A risk’s priority should reflect the highest impact across all objectives, or a weighted average if the project’s stakeholders have defined relative weights for different objectives. Multi-objective impact assessment is more complex but produces significantly more accurate risk prioritization.
Error 3 — Applying quantitative analysis to all risks regardless of their priority
Why it happens: The team believes that more quantitative analysis is always better and applies Monte Carlo simulation to all 40 identified risks. The analysis produces a large volume of data that is too complex to interpret and too expensive to maintain. The signal is buried in the noise.
How to avoid it: Apply quantitative analysis selectively to the risks that warrant it: high-priority risks, risks on the critical path, risks requiring significant response investments, and risks where the sponsor or governance committee requires quantified exposure data. Qualitative prioritization is the filter for quantitative analysis, not a substitute for it.
Error 4 — Group bias in risk assessment workshops
Why it happens: The most senior or most vocal person in the risk workshop dominates the assessment. Junior team members defer to the expert’s probability-impact assessments even when their own knowledge suggests a different assessment. The result is a risk analysis that reflects one person’s perspective rather than the team’s collective knowledge.
How to avoid it: Use structured facilitation techniques that prevent anchoring and groupthink: anonymous individual assessments before group discussion (using planning poker-style risk scoring), devil’s advocate roles, and explicit invitation of minority views. The facilitator’s role is to surface the full range of perspectives, not to validate the consensus that would have emerged without structured facilitation.
Error 5 — Not connecting risk analysis to the contingency reserve calculation
Why it happens: The risk analysis and the contingency reserve allocation are treated as separate activities. The contingency reserve is set as a round percentage of the project budget without reference to the analyzed risk exposure.
How to avoid it: The contingency reserve should be derived from the risk analysis: the sum of the EMVs of all identified high-priority risks provides a baseline for the contingency reserve calculation. Monte Carlo simulation provides the probability-based foundation for a more rigorous reserve calculation. A contingency reserve that is calibrated to actual risk exposure is both more defensible to the sponsor and more accurate as a financial protection measure.
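The EMV-sum baseline described above can be computed directly from the register. This is a minimal sketch with hypothetical probability and impact figures (the first row echoes the HubSpot example from section 6); a Monte Carlo-based reserve would replace the simple sum with a percentile of the simulated cost distribution.

```python
# Minimal sketch of an EMV-based contingency reserve baseline.

high_priority = [  # (probability, monetary impact) -- hypothetical figures
    (0.50, 6_500),   # e.g. API documentation risk
    (0.30, 2_500),
    (0.25, 4_000),
]

reserve = sum(p * impact for p, impact in high_priority)
print(f"EMV-based contingency baseline: ${reserve:,.0f}")  # $5,000
```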
9. Tailoring: Predictive, Agile, and Hybrid
| Aspect | Predictive | Agile | Hybrid (ProjectAdm model) |
|---|---|---|---|
| Analysis formality | Formal: structured workshop, documented P&I assessments, risk register updates, risk report | Lightweight: risks assessed informally in sprint planning; priority visible on sprint board | Formal for strategic/compliance risks + lightweight for sprint-level risks |
| Quantitative analysis | Monte Carlo, sensitivity analysis, decision tree for high-priority risks | Rarely applied; sprint velocity variance serves as the primary quantitative risk signal | Monte Carlo for compliance and infrastructure risks + velocity analysis for sprint risks |
| Analysis frequency | Comprehensive at initiation + phase gates; brief reviews at status meetings | Every sprint: risk review in sprint planning and retrospective | Comprehensive quarterly + sprint-level risk reviews in every sprint cycle |
| Risk prioritization tool | P&I matrix with defined risk score thresholds | Team consensus on risk priority; impediment board for active risks | Formal P&I matrix for strategic risks + team consensus for sprint risks |
| Output communication | Formal risk reports to sponsor and steering committee | Risk visible on sprint board; communicated at sprint reviews | Formal quarterly risk report + sprint review risk summary |
10. Process Interactions
| Process | Domain | Relationship to Perform Risk Analysis |
|---|---|---|
| Plan Risk Management | Risk | Provides the probability-impact scales, risk categories, and analysis methodology that govern all risk analysis activities. Risk analysis cannot be conducted consistently without this framework. |
| Identify Risks | Risk | Produces the risk register that is the primary input to risk analysis. Risk identification and risk analysis are iterative — new identified risks feed immediately into analysis; analysis insights often trigger additional risk identification. |
| Plan Risk Responses | Risk | Uses risk analysis outputs (priority classifications, EMV calculations, sensitivity analysis results) as the primary basis for selecting and prioritizing risk response strategies. Response planning without analysis priorities is not evidence-based. |
| Monitor Risks | Risk | Monitoring tracks whether risk analysis assessments are still accurate. When monitoring reveals that a risk’s probability or impact has changed, the risk analysis is revisited and the risk register is updated. |
| Develop Schedule | Schedule | Schedule risk quantification (Monte Carlo on activity durations) provides the probability distributions for project completion date that inform schedule reserve calculations. Risk analysis and schedule development are deeply interdependent in predictive and hybrid approaches. |
| Estimate Costs | Finance | Cost risk quantification (EMV calculations, Monte Carlo on cost estimates) provides the basis for contingency reserve calculations. The contingency reserve in the cost baseline should be derived from risk analysis results. |
11. Quick-Application Checklist
- ☐ Risk management plan’s P&I scales are available and used consistently in all assessments
- ☐ All risks in the register have been assessed with qualitative probability-impact scores
- ☐ Risk scores have been calculated (probability × impact) and risks plotted on the P&I matrix
- ☐ Risks have been prioritized (high/medium/low) from the P&I matrix
- ☐ Risk categories have been analyzed for systemic patterns (multiple risks in the same category)
- ☐ High-priority risks have been evaluated for quantitative analysis requirement
- ☐ Monte Carlo simulation or EMV calculation has been applied to all high-priority risks meeting the quantitative threshold
- ☐ Sensitivity analysis identifies the top 3–5 risk drivers of project cost/schedule variance
- ☐ Contingency reserve has been calculated based on risk analysis results (not as a round percentage)
- ☐ Risk register and risk report have been updated with analysis results and communicated to the sponsor
References
PMBOK Guide 8: The New Era of Value-Based Project Management. Available at: https://projectmanagement.com.br/pmbok-guide-8/
Disclaimer
This article is an independent educational interpretation of the PMBOK® Guide – Eighth Edition, developed for informational purposes by ProjectManagement.com.br. It does not reproduce or redistribute proprietary PMI content. All trademarks, including PMI, PMBOK, and Project Management Institute, are the property of the Project Management Institute, Inc. For access to the complete and official content, purchase the guide from Amazon or download it for free at https://www.pmi.org/standards/pmbok if you are a PMI member.
Free PMBOK 8 Quick Reference Card
All 8 Performance Domains, 12 Principles, and key tools on one printable page. Download it free — no payment required.

