This guide to Expert Judgment covers everything you need to know. Article updated in March 2026 for the PMBOK® Guide — Eighth Edition.
Have you ever been in a planning meeting where someone said “in my experience, this will take about three months” — and everyone accepted it without question? No data, no historical records, no analysis. Just the “expert’s opinion.” Three months later, the project was behind schedule, the budget had been blown, and nobody could explain how that estimate was made — because it was never documented.
This scenario is far more common than it should be. Expert Judgment is simultaneously one of the most powerful and most misused tools in project management. In PMBOK 6, it appeared as a tool in 47 of 49 processes — virtually omnipresent. In PMBOK 7, it was diluted into the principles. Now, in PMBOK 8 (2025), it returns as a formal Tool and Technique, but with a fundamental difference: there is clear guidance on when to use it, how to apply it, and how to document expert judgment so that it delivers real value — rather than serving as a shortcut to avoid analysis.
In this complete guide, you will find:
- What Expert Judgment is and how it is positioned in PMBOK 8
- What changed from PMBOK 6 to PMBOK 8 — and why the change matters
- 10 specific scenarios where Expert Judgment is the right tool
- A step-by-step process for applying it correctly (6 stages)
- Criteria for selecting who truly qualifies as an “expert”
- Practical examples in IT, construction, and marketing
- The cognitive biases that sabotage expert judgment — and how to neutralize them
- How to tailor the tool for predictive, agile, and hybrid projects
- The 5 most common mistakes — and how to avoid them
- A quick-application checklist you can use today
1. WHAT IS EXPERT JUDGMENT AND WHERE IT FITS IN PMBOK 8
Straight to the point
Expert Judgment is an assessment provided by a person or group with proven knowledge, skill, or experience in a specific area relevant to the decision at hand. It is not guesswork and it is not a hunch — it is the structured contribution of someone with demonstrable competence in the subject matter.
In PMBOK 8, Expert Judgment is classified as a Tool and Technique under the Governance Domain, but its application extends across all seven performance domains. It is used when the available information is insufficient for a purely analytical decision, when the context requires experience that is not documented, or when the complexity of the problem demands the combination of multiple specialized perspectives.
Expert Judgment can come from a variety of sources:
- Project team members with specific technical knowledge
- External consultants hired for specialized topics
- Stakeholders with experience in the business domain
- Professional and technical associations (e.g., PMI, IEEE, ASCE)
- Subject Matter Experts (SMEs)
- The PMO (Project Management Office) drawing on data from previous projects
The central point is that Expert Judgment is not a “wildcard” tool that replaces analysis. It is a complementary tool that fills informational gaps when objective data is not available or is insufficient.
2. WHAT CHANGED FROM PMBOK 6 TO PMBOK 8
The journey of Expert Judgment across the last three PMBOK editions is revealing — and understanding this evolution helps you use the tool with greater rigor.
PMBOK 6 (2017): Omnipresence without guidance
In PMBOK 6, Expert Judgment appeared as a tool and technique in 47 of 49 processes — it was the most frequently cited tool in the entire guide. This omnipresence drew legitimate criticism: many professionals began treating “expert judgment” as a generic justification for any decision, without documentation, without criteria for selecting the expert, and without traceability. In practice, “we used expert judgment” became synonymous with “someone decided and nobody questioned it.”
PMBOK 7 (2021): Purposeful de-emphasis
PMBOK 7 adopted a principles-based approach and eliminated the prescriptive list of tools by process. Expert Judgment was no longer formally listed as a tool and was implicitly absorbed into the principles of leadership and decision-making. The intention was sound — to force professionals to think critically about when and how to consult experts — but the result was a gap: without formal guidance, many professionals stopped structuring the collection of expert judgment altogether.
PMBOK 8 (2025): Structured return
PMBOK 8 restores Expert Judgment as a formal Tool and Technique, but with a significant qualitative shift: instead of listing it generically across dozens of processes, the guide now explains when, how, and with what criteria to use it. The focus is on disciplined use — consulting experts with a clear purpose, documenting assumptions and rationale, and combining expert judgment with other analytical tools.
Comparison table: “What changed”
| Aspect | PMBOK 6 (2017) | PMBOK 7 (2021) | PMBOK 8 (2025) |
|---|---|---|---|
| Presence | Listed in 47 of 49 processes as a formal tool | Not formally listed; implicit in the principles | Restored as a formal Tool and Technique in the Governance Domain, with cross-domain application |
| Usage guidance | Generic — “consult experts” without structured criteria | Absorbed into principles of leadership and decision-making | With guidance on when to use, how to select experts, and how to document |
| Documentation | Did not require formal documentation of the rationale | No specific guidance | Emphasizes documentation of assumptions, rationale, and limitations |
| Expert selection | No formal qualification criteria | No formal criteria | Recommends explicit criteria: proven experience, relevance, and absence of conflicts of interest |
| Combination with other tools | Frequently used in isolation | Not directly addressed | Recommends combined use with data analysis, benchmarking, and estimation techniques |
| Risk of misuse | High — “expert judgment” as a wildcard | Reduced (by absence), but without alternative guidance | Mitigated — clear guidance on limitations and biases |
What this means in practice: If in PMBOK 6 it was enough to say “we used expert judgment,” in PMBOK 8 you need to answer: who was consulted, why that person is qualified, what was the rationale behind the recommendation, and how that opinion integrates with other evidence. Expert Judgment has matured — from shortcut to method.
3. WHEN TO USE EXPERT JUDGMENT — 10 SPECIFIC SCENARIOS
Expert Judgment should not be the default tool for every single decision. It is especially valuable in the following scenarios:
- Estimates for projects without historical data: When you are starting a type of project that the organization has never executed (e.g., first cloud migration), there is no historical data. The opinion of someone who has done this type of project in another organization is the best available source.
- Risk assessment in complex domains: Technical, regulatory, or geopolitical risks require specialized knowledge that is not captured in spreadsheets. A geotechnical engineer can assess foundation risk in a way that no quantitative analysis tool can replace.
- Technology or approach selection: When there are multiple viable technical options (e.g., choosing among three CRM platforms), the opinion of someone who has already implemented those platforms in similar contexts saves months of analysis.
- Validation of business case assumptions: Before approving a project, financial assumptions (market size, conversion rate, acquisition cost) need to be validated by someone who knows the market.
- Definition of acceptance criteria: What defines “quality” varies by context. A UX specialist defines usability criteria that a generalist project manager is not in a position to specify.
- Resolution of technical conflicts: When two teams disagree on the technical approach, an external, neutral expert can evaluate both sides and recommend the most appropriate path.
- Make-or-buy analysis: Deciding between building in-house or outsourcing requires knowledge of the vendor market, hidden costs, and internal capabilities — information that frequently resides with experts.
- Compliance and regulatory requirements: Projects in regulated industries (healthcare, financial services, construction) need experts in standards and regulations to ensure compliance.
- Progressive elaboration in agile projects: When the backlog needs to be refined and user stories involve complex technical domains (e.g., machine learning, structural engineering), domain experts are essential.
- Lessons learned from failed projects: Professionals who have experienced similar failures can identify risk patterns that do not appear in objective data — tacit knowledge is irreplaceable in this scenario.
When NOT to use Expert Judgment as the primary tool:
- When sufficient historical data exists for quantitative analysis (e.g., parametric estimates based on previous projects)
- When the decision can be made based on objective, measurable criteria
- When the “opinion” is being used to avoid the work of collecting and analyzing data
- When there is a conflict of interest between the expert and the outcome of the decision
4. HOW TO APPLY EXPERT JUDGMENT CORRECTLY — STEP BY STEP
The difference between using Expert Judgment professionally and using it as a shortcut lies in the process. Follow these 6 stages:
Stage 1 — Define the question with precision
Before consulting any expert, formulate the exact question you need answered. Vague questions produce vague answers.
Wrong: “What do you think about the schedule?”
Right: “Considering that the integration involves 3 legacy APIs and the team has 2 senior developers, how long do you estimate for the integration phase, assuming 80% availability?”
Practical tip: Document the question in writing before the consultation. Include the context, known constraints, and the level of precision expected in the answer.
Stage 2 — Identify and select the experts
Not every senior person is an expert on the subject in question. Use the selection criteria from section 5 of this article to identify who has relevant competence. Ideally, consult between 2 and 5 experts to obtain diverse perspectives.
Practical tip: Create a competency matrix that maps the questions to be answered against the available experts. This prevents consulting the wrong person or overloading a single expert.
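One lightweight way to maintain that matrix is as plain structured data the whole team can query. The sketch below is a minimal, hypothetical illustration in Python; the names and competency tags are invented, and a spreadsheet would serve just as well:

```python
# Hypothetical competency matrix: maps each expert to the topics
# they are qualified to opine on. Names and tags are invented.
COMPETENCIES = {
    "ana.silva": {"cloud migration", "solution architecture"},
    "joao.melo": {"api integration", "legacy systems"},
    "m.tanaka": {"cost estimation", "cloud migration"},
}

def qualified_experts(required_tags: set[str]) -> list[str]:
    """Return the experts whose competencies cover every required tag."""
    return [name for name, tags in COMPETENCIES.items()
            if required_tags <= tags]

# Who can opine on the cloud migration timeline?
print(qualified_experts({"cloud migration"}))
# -> ['ana.silva', 'm.tanaka']
```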
Stage 3 — Structure the opinion-gathering process
Define the consultation format according to the complexity of the question:
- Individual consultation: For specific questions — a 30-to-60-minute meeting with a pre-defined agenda
- Expert panel: For complex decisions with multiple variables — a facilitated meeting with 3-5 experts who debate and converge
- Delphi Technique: For situations where you want to avoid influence among experts — anonymous rounds of estimation with iterative convergence (a round-aggregation sketch follows this list)
- Structured interview: For collecting lessons learned — standardized questions applied to multiple experts
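The Delphi round logic mentioned above can be made concrete. The following is a simplified sketch, not an official PMBOK procedure: it assumes a convergence rule based on relative spread (range divided by median), and the 0.15 threshold is an arbitrary illustrative choice.

```python
import statistics

def round_summary(estimates: list[float]) -> dict:
    """Summarize one anonymous round: median and relative spread."""
    med = statistics.median(estimates)
    spread = (max(estimates) - min(estimates)) / med
    return {"median": med, "relative_spread": spread}

def has_converged(estimates: list[float], threshold: float = 0.15) -> bool:
    """Stop iterating once the relative spread falls below the threshold."""
    return round_summary(estimates)["relative_spread"] < threshold

round_1 = [8, 14, 10, 18, 12]    # months; one anonymous estimate per expert
print(round_summary(round_1))    # median 12, spread ~0.83 -> run another round
round_2 = [13, 14, 13, 14, 13]   # estimates revised after round-1 feedback
print(has_converged(round_2))    # spread ~0.08 -> True, stop
```

Between rounds, the anonymized summary is fed back to the experts, who revise their estimates without knowing who said what.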
Stage 4 — Collect the opinion with rigor
During the consultation, record not only the expert’s conclusion, but also the following (a minimal record structure is sketched after this list):
- The rationale: Why did the expert arrive at that conclusion?
- The assumptions: What conditions must be true for the opinion to be valid?
- The confidence level: Does the expert have high, medium, or low confidence in the answer?
- The limitations: What does the expert not know or cannot assess?
- The alternatives considered: What other options were considered and why were they discarded?
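To make this discipline concrete, here is a minimal sketch of a structured consultation record. The field names and the three-level confidence scale are illustrative assumptions, not PMBOK prescriptions; the point is that the record cannot be created without rationale and assumptions:

```python
from dataclasses import dataclass, field
from enum import Enum

class Confidence(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

@dataclass
class ExpertOpinion:
    """One consultation record: the conclusion plus everything
    needed to revisit it later."""
    expert: str                    # who was consulted
    question: str                  # the precise question asked
    conclusion: str                # the recommendation itself
    rationale: str                 # why the expert reached it
    assumptions: list[str]         # conditions that must hold
    confidence: Confidence         # the expert's own confidence level
    limitations: list[str] = field(default_factory=list)
    alternatives_considered: list[str] = field(default_factory=list)

# Hypothetical record echoing the Stage 1 example question:
opinion = ExpertOpinion(
    expert="ana.silva",
    question="Duration of the integration phase with 3 legacy APIs?",
    conclusion="14-18 months",
    rationale="Each legacy API needs its own test cycle and freeze window.",
    assumptions=["2 senior developers at 80% availability"],
    confidence=Confidence.MEDIUM,
)
```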
Stage 5 — Analyze, triangulate, and document
If you consulted multiple experts, compare the responses. Convergence indicates robustness; divergence indicates uncertainty that needs to be investigated. Combine the expert judgment with other sources of evidence (historical data, benchmarks, quantitative analyses) to create an integrated view.
Document formally:
- Who was consulted and why they were selected
- The question that was formulated
- The responses obtained, including rationale and assumptions
- The conclusion adopted by the project and how it incorporates (or diverges from) the opinion received
Stage 6 — Review and update as the project evolves
Expert opinions are based on assumptions that can change. Schedule periodic reviews to verify whether the assumptions still hold. If the context has changed significantly, consult again — or consult additional experts.
5. WHO QUALIFIES AS AN EXPERT? SELECTION CRITERIA
One of the most common mistakes is confusing seniority with expertise. Not every director is an expert in everything, and not every expert holds a director-level position. The selection criteria should be objective:
| Criterion | What to assess | How to verify |
|---|---|---|
| Relevant experience | Has the person worked on problems similar to the one in question? | Project portfolio, references, professional track record |
| Technical depth | Does the person command the technical aspects of the issue? | Certifications, publications, participation in technical communities |
| Currency of knowledge | Is the knowledge current or outdated? | Recent projects (within the last 2-3 years), continuing education |
| Independence | Does the person have a conflict of interest with the outcome of the decision? | Analysis of ties to vendors, positions in the organization, financial incentives |
| Ability to articulate | Can the person explain the rationale behind their opinion in an understandable way? | Preliminary interview or track record of previous expert opinions |
| Diversity of perspective | Does the person bring a different perspective from the others consulted? | Professional background, education, cultural context |
Practical rule: For each question, seek at least one internal expert (who knows the organizational context) and one external expert (who brings a market perspective). This combination reduces organizational biases.
6. PRACTICAL EXAMPLES BY INDUSTRY
Example 1 — IT Project: ERP System Migration
Situation: A retail company needs to migrate from a legacy on-premise ERP to a cloud ERP. The internal team has never executed a migration of this scale.
How Expert Judgment was used:
- Question 1 — Timeline estimate: They consulted a solutions architect who had led 4 ERP migrations in companies of similar size. Estimate: 14-18 months (the internal team had estimated 8 months with no objective basis).
- Question 2 — Migration strategy: A panel of 3 experts (SAP architect, DBA, and process consultant) debated between a big-bang migration and a module-by-module migration. Consensus: module-by-module migration, starting with finance and inventory.
- Question 3 — Integration risks: An API integration specialist identified 3 critical failure points in the interfaces with the e-commerce and logistics systems that the internal team had not detected.
Result: The schedule was revised to 16 months (instead of 8), integration risks were mitigated in advance, and the module-by-module migration enabled incremental value delivery. The project was completed in 17 months — within the range estimated by the experts.
Example 2 — Construction: Foundation on Unstable Soil
Situation: A construction firm won the bid to build a logistics center in an area with a history of expansive soil. The initial geotechnical data was inconclusive.
How Expert Judgment was used:
- Question 1 — Foundation type: A geotechnical engineer with 25 years of experience with expansive soils recommended deep root pile foundations, ruling out the spread footing option that was the structural designer’s initial preference.
- Question 2 — Contingency budget: Based on 12 previous projects on similar soils, the expert recommended an 18% contingency for the foundation phase (the firm’s standard was 10%).
- Question 3 — Schedule impact: A construction planning consultant warned that the root pile foundation would require specialized equipment with a 45-day lead time — information that prevented a significant delay.
Result: The foundation was completed without pathologies. The additional 8% cost compared to the spread footing was offset by the elimination of structural risks that could have cost 10x more in corrective maintenance.
Example 3 — Marketing: Product Launch in a New Market
Situation: A Brazilian SaaS company decided to launch its product in the Mexican market. The marketing team had no experience in Latin American markets outside Brazil.
How Expert Judgment was used:
- Question 1 — Pricing strategy: They consulted a go-to-market specialist for Latin America who recommended pricing in USD (not Mexican pesos) with a 30% regional discount, based on 6 previous SaaS launches in Mexico.
- Question 2 — Acquisition channels: A growth marketer with experience in the Mexican market recommended prioritizing LinkedIn and in-person events (instead of Google Ads), based on 3x higher conversion rates in that market for B2B.
- Question 3 — Cultural adaptation: A localization consultant warned that direct translation from Portuguese to Mexican Spanish would create cultural friction — the “informal and casual” tone of Brazilian marketing needed to be adjusted to a more formal, institutional tone.
Result: The Mexico launch achieved 120% of the lead target in the first quarter. USD pricing with the regional discount had an 87% acceptance rate. Without expert judgment, the company would have replicated the Brazilian strategy — which, according to the consultant, would have generated less than 40% of the target.
7. LIMITATIONS AND CAUTIONS: BIASES THAT SABOTAGE EXPERT JUDGMENT
Expert Judgment is powerful, but it is subject to cognitive biases that can compromise its quality. Knowing these biases is the first line of defense against them:
Confirmation Bias
What it is: The tendency to seek out and value opinions that confirm what we already believe, while ignoring contradictory evidence.
Example: The project manager believes that the 6-month schedule is feasible. They consult 3 experts, discard the 2 who said “impossible in less than 10 months,” and present only the opinion of the one who said “it’s tight, but possible.”
Countermeasure: Record all opinions, including the divergent ones. Present the full distribution to the decision committee — not just the opinion that confirms the preference.
Authority Bias
What it is: Accepting someone’s opinion because of their title or status, not because of their technical competence.
Example: The IT Director opines on the development timeline estimate, even though he has not written code in the past 15 years. Nobody challenges it because he is the director.
Countermeasure: Evaluate the opinion based on the criterion of relevant technical competence, not job title. Include in the documentation why that person was selected as an expert for that specific question.
Anchoring Effect
What it is: The first piece of information presented disproportionately influences subsequent judgment.
Example: You tell the expert “our initial estimate is $500,000.” The expert, influenced by that number, responds “I think it will be between $450,000 and $550,000” — even though an independent analysis would have indicated $800,000.
Countermeasure: Do not provide prior estimates to the expert. Present the context and the constraints, but let them arrive at the number independently.
Groupthink
What it is: In expert panels, the pressure for consensus can lead to the suppression of divergent opinions.
Example: In a panel of 5 experts, the first 3 agree on an approach. The last 2, who had reservations, do not voice their concerns in order to avoid “being the problem.”
Countermeasure: Use the Delphi Technique (anonymous collection) for critical decisions. In face-to-face panels, explicitly ask each participant to write down their opinion before opening the discussion. Designate a “devil’s advocate” to challenge the consensus.
Availability Bias
What it is: Overvaluing recent or memorable experiences at the expense of a broader statistical analysis.
Example: The expert just came out of a project that failed because of an integration with a particular vendor. They strongly recommend avoiding that vendor — even though the previous project failed due to management failures, not the vendor’s fault.
Countermeasure: Ask the expert to separate personal experience from objective analysis. Ask: “Is this recommendation based on a pattern you observe across multiple projects or on a single specific experience?”
8. TAILORING — HOW TO ADAPT EXPERT JUDGMENT TO YOUR CONTEXT
PMBOK 8 emphasizes the concept of tailoring — adapting practices to the project context. Expert Judgment manifests differently under each approach:
In predictive (traditional) projects
In predictive environments, Expert Judgment is typically collected at formal milestones and documented as input for planning artifacts.
- Formal review panels: Structured meetings with experts to validate schedule, budget, and risk estimates before each phase approval.
- Technical opinions: Formal documents issued by experts to support design decisions, vendor selection, or regulatory approval.
- Delphi Technique: Used in cost and schedule estimates where it is important to avoid mutual influence among experts. Common in large-scale projects (infrastructure, construction, defense).
- Design review committees: Technical experts evaluate the architecture and design of the project at formal gates.
- Documentation: High formality — meeting minutes, signed opinions, records in the project repository.
In agile projects
In agile environments, Expert Judgment is integrated into the workflow in a more organic and frequent manner, but equally intentional.
- Technical Spikes: When the team needs to investigate a technical question before estimating or implementing a story, a domain expert leads the investigation within a 1-to-3-day timebox.
- Backlog refinement: Business domain experts participate in refinement sessions to clarify acceptance criteria and validate priorities.
- Architecture Decision Records (ADRs): Architectural decisions are documented with the rationale and the experts consulted — ensuring traceability without excessive bureaucracy (a minimal ADR template follows this list)
- Pair programming and mob programming: Technical experts work alongside the team, transferring knowledge in a hands-on manner.
- Documentation: Lightweight — ADRs, notes in stories, records in the team wiki.
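As an illustration of the ADR item above, here is a minimal template in the widely used Nygard-style format. The section names are a community convention, not a PMBOK 8 mandate, and the content shown is hypothetical:

```markdown
# ADR-012: Module-by-module ERP migration

## Status
Accepted (2026-02-10)

## Context
The internal team has no prior ERP migration experience. Three external
experts were consulted (hypothetical records EJ-004 to EJ-006).

## Decision
Migrate module by module, starting with finance and inventory, per the
consensus of the expert panel.

## Consequences
Longer total timeline in exchange for incremental value delivery and
lower cutover risk. Revisit if the first module slips by more than one
quarter.
```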
In hybrid projects
Hybrid projects combine both approaches as needed:
- Planning phases: Use the predictive approach — formal expert panels to define architecture, budget, and milestones.
- Iterative execution phases: Use the agile approach — technical spikes, refinements with SMEs, and ADRs.
- Phase gates: Combine both — experts formally assess progress (predictive) and participate in retrospectives (agile) to identify improvements.
- Documentation: Proportional to risk — high-impact decisions receive formal documentation; operational decisions follow the lightweight model.
9. INTERACTIONS WITH OTHER PMBOK 8 TOOLS AND TECHNIQUES
Expert Judgment does not operate in isolation. It becomes more powerful when combined with other PMBOK 8 tools and techniques:
| Tool / Technique | How it relates to Expert Judgment |
|---|---|
| Project Canvas | Experts contribute to filling sections such as Risks, Finances, and Scope on the Canvas, bringing depth that the internal team may lack. |
| Assumptions and Constraints Analysis | Experts validate whether the project assumptions are realistic and whether constraints have been correctly identified. |
| Three-Point Estimating | The optimistic, most likely, and pessimistic values can be defined by experts when there is insufficient historical data (a worked sketch follows the table). |
| SWOT Analysis | Market and technical experts contribute strengths, weaknesses, opportunities, and threats that the internal team may not see. |
| Delphi Technique | A formal method for collecting expert judgment anonymously and iteratively, reducing group and authority biases. |
| Brainstorming | Brainstorming sessions are more productive when they include experts who bring diverse perspectives and deep technical knowledge. |
| Data Analysis / Benchmarking | Expert Judgment complements quantitative data with qualitative interpretation — the expert explains the “why” behind the numbers. |
| Meetings and Workshops | Formats for collecting expert judgment: from one-on-one meetings to structured panels with multiple experts. |
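To make the Three-Point Estimating row concrete: when experts supply the optimistic (O), most likely (M), and pessimistic (P) values, the standard PERT formula weights them as E = (O + 4M + P) / 6, with standard deviation (P - O) / 6. The sketch below uses hypothetical numbers:

```python
def pert_estimate(optimistic: float, most_likely: float,
                  pessimistic: float) -> tuple[float, float]:
    """Return (expected value, standard deviation) under PERT weighting."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6  # conventional PERT sigma
    return expected, std_dev

# Hypothetical expert-elicited duration for an integration phase, in months:
expected, sigma = pert_estimate(optimistic=10, most_likely=14, pessimistic=24)
print(f"Expected: {expected:.1f} months (±{sigma:.1f})")
# -> Expected: 15.0 months (±2.3)
```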
Fundamental principle: Expert Judgment is most reliable when it complements objective data, not when it replaces it. Use it to fill gaps, validate analyses, and interpret results — not as a shortcut to avoid analytical work.
10. 5 COMMON MISTAKES WHEN USING EXPERT JUDGMENT — AND HOW TO AVOID THEM
The apparent simplicity of Expert Judgment conceals pitfalls that compromise decision quality. Here are the 5 most frequent mistakes:
Mistake 1 — Using “expert judgment” as an excuse to skip analysis
Why it happens: The team does not want to bother collecting data, running benchmarks, or performing a quantitative analysis. Instead, they ask someone senior, accept the answer, and record “expert judgment” as the justification. The process becomes theater — the decision was already made, and the “expert” merely legitimized it.
How to avoid it: Before resorting to Expert Judgment, ask: “Is there a way to answer this question with objective data?” If the answer is yes, collect the data first. Use Expert Judgment to interpret the data, not to replace it. Document why the available data is insufficient and why expert judgment is necessary.
Mistake 2 — Consulting only experts who agree with you (confirmation bias)
Why it happens: The project manager already has a preference (consciously or not) and selects experts who are likely to confirm that preference. Or they consult several experts but present only the opinions that support the desired decision.
How to avoid it: Define the expert selection criteria before knowing their opinions. Include at least one expert with a different perspective or from outside the organization. Document all opinions collected — including the divergent ones — and present them to the decision committee. Divergence is valuable information, not a problem to be hidden.
Mistake 3 — Failing to document the expert’s rationale
Why it happens: The team records the conclusion (“the expert recommended 12 months”), but not the rationale (“because the integration with the legacy system requires 3 testing cycles with monthly maintenance windows”). Six months later, when the context changes, nobody knows whether the estimate is still valid — because nobody knows what assumptions supported it.
How to avoid it: For each opinion collected, record the following as mandatory: (1) the conclusion or recommendation, (2) the rationale and assumptions that support it, (3) the expert’s confidence level, and (4) the conditions that would invalidate the opinion. This record transforms a volatile opinion into traceable knowledge.
Mistake 4 — Relying on a single expert
Why it happens: The organization has “that one person” who knows everything. They are consulted for every decision on every project. When they are wrong (or leave the company), the impact is catastrophic — because there is no second point of view and there is no knowledge transfer.
How to avoid it: For high-impact decisions, consult at least 2-3 independent experts. Compare the responses. If there is convergence, your confidence in the decision increases. If there is divergence, you have identified an uncertainty that needs to be explored. Additionally, document the rationale so that the knowledge is not locked within a single individual.
Mistake 5 — Confusing seniority with expertise
Why it happens: Organizational culture values hierarchy. The director’s opinion is accepted without challenge, even when they have no technical experience with the specific question. Meanwhile, the senior analyst who has 8 years of hands-on experience in the subject is not consulted — because they “lack seniority.”
How to avoid it: Define expertise by the relevance of experience, not by job title. A senior developer with 10 database migration projects under their belt is more qualified to opine on that topic than a CTO who has never led a migration. Use the selection criteria table (section 5) to justify why each person was or was not consulted. This depersonalizes the decision and focuses on competence.
11. QUICK-APPLICATION CHECKLIST
Use these 7 items as a reference before considering the Expert Judgment consultation complete:
- Is the question to be answered formulated precisely, with context and constraints documented?
- Have I verified that there is insufficient objective data to answer this question without expert judgment?
- Does the selected expert have proven, relevant experience for this specific question — and not just a senior title?
- Have I consulted at least 2 independent experts for high-impact decisions?
- Have I recorded not only the conclusion, but also the rationale, assumptions, confidence level, and limitations?
- Have I combined expert judgment with other sources of evidence (historical data, benchmarks, quantitative analysis)?
- Have I defined a review trigger — under what conditions will this opinion need to be reassessed?
CONCLUSION
Expert Judgment is one of the most widely used tools in project management — and one of the most frequently misused. In PMBOK 8, it returns as a formal Tool and Technique with an important evolution: it is not enough to “consult an expert”; you need to know when, how, and with what criteria to use expert judgment to generate real value.
Three essential takeaways for practice:
- Expert Judgment is a complement, not a substitute for analysis. Use it to fill informational gaps, validate assumptions, and interpret data — never as a shortcut to avoid analytical work. If data is available, analyze it first.
- Expertise is defined by the relevance of experience, not by title. An expert is someone with proven competence in the specific question under discussion. Define objective selection criteria and document why each person was consulted.
- Document the rationale, not just the conclusion. An opinion without a record of assumptions, rationale, and limitations is an opinion that cannot be revisited, challenged, or updated. Documentation transforms opinion into organizational knowledge.
Next step: The next time a decision on your project requires expert judgment, apply the 7-item checklist from the previous section. Formulate the question in writing before consulting the expert, record the complete rationale (not just the conclusion), and combine the opinion with at least one other source of evidence. This simple exercise will significantly elevate the quality of your decisions based on expert judgment.
Resources for further study
- PMBOK 8 Guide — Complete Overview — understand the full structure of PMBOK 8 and how Expert Judgment connects to domains and processes
- Project Canvas (PMBOK 8) — a tool that frequently draws on Expert Judgment to fill the Risks and Finances sections
- Governance Domain in PMBOK 8 — the domain where Expert Judgment is formally positioned
See all PMBOK 8 articles in the Complete Index
Disclaimer
This article is an independent educational interpretation of the PMBOK® Guide – Eighth Edition, developed for informational purposes by ProjectManagement.com.br. It does not reproduce or redistribute proprietary PMI content. All trademarks, including PMI, PMBOK, and Project Management Institute, are the property of the Project Management Institute, Inc. For access to the complete and official content, purchase the guide from Amazon or download it for free at https://www.pmi.org/standards/pmbok if you are a PMI member.

