Manage Project Execution in PMBOK 8 — Complete Guide

Article updated in March 2026 for the PMBOK® Guide — Eighth Edition.


Formerly known as: Direct and Manage Project Work (PMBOK 6)

The project plan was impeccable. Scope was defined. The schedule had been reviewed and approved by every stakeholder. Resources were confirmed. And then execution began — and within six weeks, the project was in serious trouble. Two key developers were splitting their time between this project and another initiative that had been “temporarily” prioritized by the CTO. A critical dependency on a third-party API had not been escalated because no one was formally tracking the issue log. The project manager was reviewing deliverables in isolation, not feeding work performance data into any control process. By week ten, the team was two sprints behind, the client was receiving conflicting status updates from different team members, and the PM was managing by intuition rather than by data. The problem was not planning failure — it was execution governance failure.

In PMBOK 8, Manage Project Execution is Process 4 of the Governance Domain, and it exists precisely to prevent this scenario. It is the process that transforms an approved project management plan into coordinated, tracked, and documented work — producing deliverables while simultaneously generating the performance data that keeps monitoring and control meaningful. Without it, even the best project plan becomes a document that lives on a server while reality moves in a different direction.

This complete guide covers everything a project manager or PMP candidate needs to understand, apply, and tailor this process:

  • What it is — definition, position in PMBOK 8, and what changed from PMBOK 6
  • Why use it — direct benefits and the cost of skipping it
  • Full ITTO — every input, tool, technique, and output explained
  • Step-by-step application guide — from work authorization to deliverable production
  • When to apply it — triggers and mandatory vs. recommended scenarios
  • Two real-world examples — Project Phoenix (website launch) and Project ProjectAdm (SaaS PM platform)
  • Templates and tools — with free downloads
  • Five common errors — and how to avoid each one
  • Tailoring — predictive, agile, and hybrid approaches
  • Process interactions — what feeds into execution and what depends on it
  • Quick-application checklist — 10 items you can use today

1. What Is the Manage Project Execution Process

Manage Project Execution is the process of leading and performing the work defined in the integrated project management plan, implementing approved changes, managing resources, addressing issues and risks, and producing deliverables that fulfill the project’s objectives. It is the execution engine of the project — the process that converts planning intentions into tangible outputs, while simultaneously generating the work performance data that feeds monitoring, control, and decision-making.

In PMBOK 8, this is Process 4 of 9 in the Governance Domain. Its position reflects the project lifecycle sequence: after the project has been formally initiated (Process 1), plans integrated and aligned (Process 2), and sourcing strategy defined (Process 3), the team now executes. Everything from this point forward depends on how well this process is managed.

The process produces four critical outputs: deliverables, work performance data, the issue log, and change requests. Each is examined in detail in the ITTO section below.

What changed from PMBOK 6 to PMBOK 8

  • Process name: Direct and Manage Project Work (PMBOK 6) → Manage Project Execution (PMBOK 8)
  • Structural location: Executing Process Group, Integration Management (PMBOK 6) → Governance Domain, Process 4 of 9 (PMBOK 8)
  • Emphasis: from directing work and managing resources to produce deliverables, to leading execution holistically, including quality during construction and the integration of corrective and preventive actions
  • Quality integration: previously addressed separately in the quality processes; now explicitly addressed within execution as a dual responsibility covering both process quality and deliverable quality
  • Work performance data: still produced as an output that flows to monitoring, now with stronger emphasis on its role as feedback for lessons learned
  • Principles alignment: not applicable in PMBOK 6, which was principle-agnostic; PMBOK 8 aligns this process with "Focus on Value," "Enable Change," and "Be a Diligent, Respectful, and Caring Steward"

The name change from “Direct and Manage” to “Manage” is not cosmetic. PMBOK 8 emphasizes a leadership-oriented view of execution — the project manager does not only direct tasks, but actively manages the collective intelligence of the team, harmonizes technical and functional activities, and continuously evaluates execution quality against the plan’s objectives.

2. Why Use the Manage Project Execution Process

Without this process, even the most carefully constructed project management plan remains an unfulfilled document. Execution governance is what converts plans into value.

Direct benefits

  • Deliverables are produced systematically: Execution becomes an accountable, traceable process rather than an ad hoc collection of individual efforts. Each deliverable can be traced to a specific work package, assigned to a responsible party, and tracked to completion.
  • Performance data is generated and captured: Work performance data collected during execution is the raw material for every monitoring and control decision. Without it, the Monitor and Control Project Performance process has nothing to measure.
  • Issues are surfaced and managed proactively: The issue log created during execution provides a formal, tracked record of obstacles, blockers, and problems. Without a formal issue log, problems circulate informally in email threads and Slack messages and never get resolved — or get resolved in ways that create new problems.
  • Approved changes are implemented coherently: When changes are approved through the Assess and Implement Changes process, it is Manage Project Execution that incorporates them into the live work stream — ensuring that corrective actions, preventive actions, and defect repairs are applied consistently and documented.
  • Resources are managed actively: Execution is where resource utilization becomes real. The project manager monitors whether resources are performing as planned, addresses conflicts between competing resource demands, and makes adjustments before small imbalances become critical bottlenecks.
  • The team’s collective knowledge is aligned: As PMBOK 8 notes, the collective team members have more knowledge than any single person. Managing execution provides the mechanism to align that knowledge toward the project objectives — through daily coordination, structured retrospectives, and active facilitation of cross-functional collaboration.

The cost of skipping or under-managing this process

  • Invisible execution drift: Without formal work performance data collection, the gap between plan and reality grows silently. By the time it becomes visible, catching up requires extraordinary intervention.
  • Unresolved issues compound: Issues that are not formally logged, owned, and tracked tend to persist and amplify. A resource conflict in week 2 that is handled informally becomes a project delay in week 6 when the same resource is unavailable for a critical milestone.
  • Change chaos: Approved changes that are not formally incorporated into execution result in teams working against different versions of reality — one team member implementing the approved change while another is still following the original plan.
  • Lessons are lost: Work performance data and execution observations that are not captured cannot inform lessons learned or improve future project performance. The organization keeps making the same mistakes because no one ever recorded what actually happened during execution.

3. Inputs, Tools & Techniques, and Outputs (ITTO)

The complete ITTO of the Manage Project Execution process, as defined in PMBOK 8:

Inputs

  • Project management plan (any component)
  • Project documents (change log, lessons learned register, milestone list, project schedule, requirements traceability matrix, risk register)
  • Approved change requests
  • Enterprise environmental factors (EEFs)
  • Organizational process assets (OPAs)

Tools & Techniques

  • Expert judgment
  • Project management information system (PMIS)
  • Meetings (daily coordination meetings)

Outputs

  • Deliverables
  • Work performance data
  • Issue log
  • Change requests

Inputs explained

Project management plan (any component): The integrated project management plan is the primary reference document for all execution activities. It defines what work will be done, how it will be done, who is responsible, what quality standards apply, how resources are managed, and what constitutes acceptable completion for each deliverable. During execution, every work activity should be traceable to a specific component of the project management plan. If an activity cannot be justified by reference to the plan, either the activity is out of scope or the plan needs to be updated via a formal change request.

Project documents: The change log tracks all change requests and their status. The lessons learned register provides guidance from earlier phases or similar projects that should inform current execution. The milestone list identifies critical checkpoints. The project schedule provides the time-phased plan against which execution progress is measured. The requirements traceability matrix ensures that the work being executed can be traced back to documented stakeholder requirements. The risk register identifies the risks that execution activities must be designed to avoid, mitigate, or accept.

Approved change requests: Any changes that have been assessed and approved through the Assess and Implement Changes process (Process 8) must be implemented during execution. This includes corrective actions (to bring performance back to plan), preventive actions (to avoid future deviations), and defect repairs (to fix identified quality issues in deliverables). It is the project manager’s responsibility to ensure that approved changes are communicated to the team, incorporated into the work plan, and reflected in updated documents.

Enterprise Environmental Factors (EEFs): Organizational infrastructure, team culture, geographic distribution, available technology platforms, regulatory requirements, and market conditions all shape how execution is managed in practice. A globally distributed team executing in multiple time zones requires different coordination mechanisms than a co-located team. Regulatory environments may require specific documentation standards for deliverables. EEFs are not inputs to be ignored — they are the operational context within which execution governance must operate.

Organizational Process Assets (OPAs): Templates, standard operating procedures, historical performance data from similar projects, communication protocols, and organizational knowledge repositories. OPAs accelerate execution by providing proven patterns that the team can adapt rather than invent from scratch.

Tools & Techniques explained

Expert judgment: Applied throughout execution to assess the appropriateness of work methods, resolve technical ambiguities, evaluate the quality of in-progress deliverables, and make informed decisions when execution deviates from plan. Expert judgment during execution draws on the project manager’s experience, technical specialists’ domain knowledge, and organizational subject-matter experts who can advise on specific challenges. In PMBOK 8’s leadership-oriented view of execution, expert judgment is not just technical — it includes the judgment required to harmonize diverse perspectives, navigate organizational dynamics, and make trade-off decisions under uncertainty.

Project management information system (PMIS): The PMIS is the technology infrastructure that supports execution management — the platform or set of tools through which the project manager coordinates work, tracks progress, manages issues, and communicates with stakeholders. This may include task management platforms (Jira, Azure DevOps, ClickUp), collaborative workspaces (Confluence, Notion), communication tools (Slack, Teams), or integrated project management suites. The PMIS is not the execution process — it is the tool that makes the process visible, auditable, and scalable. A well-configured PMIS generates work performance data automatically, reducing the administrative burden on the team while increasing the quality of monitoring inputs.

Meetings (daily coordination meetings): Structured, time-boxed meetings that keep the team aligned on current work status, surface blockers and issues, coordinate cross-functional dependencies, and maintain execution momentum. The daily stand-up (from agile/Scrum practice) is one format; the weekly status call, the sprint review, and the phase milestone review are others. What distinguishes effective coordination meetings from ineffective ones is structure: a clear agenda, a defined time limit, a focus on blockers and decisions rather than status reporting, and documented action items with assigned owners and due dates. Meetings without structure generate noise; meetings with structure generate coordination.

Outputs explained

Deliverables: The primary output of execution — the tangible or intangible outputs produced by the project’s work activities that fulfill the requirements defined in the project’s scope baseline. A deliverable may be a software module, a design specification, a tested product feature, a training program, a completed construction phase, or a regulatory submission. Deliverables produced during execution are not automatically “accepted” — they proceed to quality control processes where they are inspected against specifications, and to stakeholder acceptance processes where the customer or sponsor formally validates that each deliverable meets requirements.

Work performance data: The raw, unanalyzed observations and measurements collected during execution — percentage of work completed, costs incurred to date, start and finish dates of activities, number of defects identified, resource utilization rates, test results. Work performance data is the essential input for the Monitor and Control Project Performance process: without it, monitoring is subjective opinion rather than evidence-based assessment. The discipline required to collect work performance data consistently is one of the most important execution habits a project manager can build.

Issue log: A structured register of problems, blockers, and unresolved questions that have arisen during execution and require attention. Each issue entry should include: a unique ID, issue description, date identified, who identified it, assigned owner, target resolution date, priority, and current status. The issue log is not a complaint register — it is a governance instrument that ensures every identified problem has a named owner and a resolution pathway. Issues that are not formally logged tend to be handled informally, inconsistently, or not at all.
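The issue-log fields described above can be modeled as a small data structure. The sketch below is illustrative only: the field names, the `Priority` enum, and the `is_overdue` helper are assumptions for demonstration, not a PMBOK-defined schema.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum


class Priority(Enum):
    CRITICAL = 1
    HIGH = 2
    MEDIUM = 3
    LOW = 4


@dataclass
class Issue:
    """One entry in the issue log, mirroring the fields listed above."""
    issue_id: str
    description: str
    date_identified: date
    identified_by: str
    owner: str                      # every issue needs a named owner
    target_resolution: date
    priority: Priority
    status: str = "Open"

    def is_overdue(self, today: date) -> bool:
        # An open issue past its target resolution date is an escalation candidate.
        return self.status != "Closed" and today > self.target_resolution


issue = Issue(
    issue_id="ISS-007",
    description="Third-party API dependency blocking integration work",
    date_identified=date(2026, 3, 2),
    identified_by="Dev Lead",
    owner="Alex Morgan",
    target_resolution=date(2026, 3, 9),
    priority=Priority.HIGH,
)
print(issue.is_overdue(date(2026, 3, 10)))  # True: still open, past target date
```

Reviewing a filtered list of overdue, open issues at each status meeting turns the log into the governance instrument the text describes, rather than a passive record.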

Change requests: Formal requests initiated during execution to modify any element of the project management plan or its baselines. Change requests generated during execution may request corrective actions (bring performance back to planned performance), preventive actions (reduce the likelihood of future performance problems), defect repairs (address quality failures in deliverables), or updates to project documentation. All change requests flow to the Assess and Implement Changes process for review, impact analysis, and approval or rejection.

4. How to Apply the Process Step by Step

Step 1 — Authorize work according to the project management plan

Before any work begins, ensure that the team understands exactly what is authorized: which work packages, which activities, which resources, and within what constraints. Use the work authorization system (which may be formal or informal, depending on project complexity) to communicate the go-ahead for each work package. In agile contexts, this authorization happens at the sprint planning level — what is committed to the sprint backlog is what is authorized to be executed in that sprint.

Step 2 — Configure the PMIS to support execution tracking

Set up the project management information system to reflect the current execution plan: tasks, assignments, schedules, milestones, and dependencies. Ensure that the team knows how to log their work, report status, and escalate issues within the system. The PMIS should not be an administrative burden imposed on the team — it should be the natural place where work happens and is recorded, reducing reporting overhead rather than increasing it.

Step 3 — Conduct daily coordination meetings

Run structured daily or near-daily coordination sessions focused on three questions: What was completed since the last meeting? What is planned for today? What blockers or issues are preventing progress? Keep these meetings to 15–30 minutes. Their purpose is not status reporting to the PM — it is team coordination. Blockers surfaced in daily meetings feed directly into the issue log and may trigger change requests or risk responses.

Step 4 — Collect and record work performance data continuously

Establish a disciplined practice of recording actual work performance: percentage of each activity completed, actual costs incurred, actual start and finish dates, quality inspection results, resource utilization. This data must be collected consistently — daily or at each sprint boundary — and entered into the PMIS so that it is available as input for monitoring processes. Work performance data collected sporadically or selectively produces misleading performance reports that can mask real problems until they become crises.
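As a concrete illustration of Step 4, the sketch below rolls raw task records (as a PMIS export might provide them) up into summary work performance data. The record fields and the hours-based weighting are assumptions chosen for clarity, not a prescribed format.

```python
# Hypothetical task records exported from a PMIS.
tasks = [
    {"id": "T1", "planned_hours": 40, "actual_hours": 38, "pct_complete": 100},
    {"id": "T2", "planned_hours": 24, "actual_hours": 30, "pct_complete": 100},
    {"id": "T3", "planned_hours": 32, "actual_hours": 10, "pct_complete": 25},
]


def work_performance_data(tasks):
    """Aggregate raw task actuals into summary work performance data."""
    planned = sum(t["planned_hours"] for t in tasks)
    # Weight completion by planned effort so large tasks count proportionally.
    earned = sum(t["planned_hours"] * t["pct_complete"] / 100 for t in tasks)
    actual = sum(t["actual_hours"] for t in tasks)
    return {
        "planned_hours": planned,
        "earned_hours": round(earned, 1),
        "actual_hours": actual,
        "overall_pct_complete": round(100 * earned / planned, 1),
    }


print(work_performance_data(tasks))
# {'planned_hours': 96, 'earned_hours': 72.0, 'actual_hours': 78, 'overall_pct_complete': 75.0}
```

Note what the summary reveals that individual status updates would not: 78 hours have been spent to earn 72 hours of planned work, a small overrun that is easy to correct now and expensive to discover in week ten.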

Step 5 — Manage issues formally through the issue log

For every problem or blocker that cannot be resolved immediately by the team, create an issue log entry: describe the issue, assign an owner, set a target resolution date, and classify the priority. Review the issue log in every status meeting. Issues that are not progressing toward resolution within the expected timeframe should be escalated to the sponsor or relevant authority as defined in the communications management plan.

Step 6 — Implement approved changes into the work stream

When a change request has been assessed and approved (through the Assess and Implement Changes process), immediately incorporate it into the active work plan. Update the relevant sections of the project management plan, communicate the change to the affected team members, adjust task assignments and schedules in the PMIS, and document the implementation in the change log. Do not delay the implementation of approved changes — the longer a team operates against an outdated version of the plan, the more re-work will be required to align reality with the updated baseline.

Step 7 — Capture lessons learned throughout execution

Do not wait for project closure to capture lessons learned. Every issue resolved, every risk that materialized, every process that worked better than expected, and every delivery that underperformed expectation is a lesson. Document these in the lessons learned register throughout execution. Teams that capture lessons continuously produce more actionable, contextually rich knowledge than teams that attempt to reconstruct execution experience at the end of the project from imperfect memory.

5. When to Apply the Process

Mandatory scenarios

  • Any phase of any project where work is being performed: Manage Project Execution is active from the moment execution begins and continues until the last deliverable is produced and accepted. There is no execution scenario where this process is optional.
  • After a change has been approved: Every approved change request triggers a mandatory Manage Project Execution activity: incorporating the change into the active work stream, updating plans and documents, and communicating the change to the team.
  • When a project re-starts after a hold: If a project has been paused, Manage Project Execution must be re-activated with a formal restart: updated plans, refreshed resource assignments, reviewed issue log, and a coordination meeting to align the team on the current state before resuming work.

Recommended scenarios

  • When execution complexity increases: As project scope expands, team size grows, or cross-functional dependencies multiply, the formality and frequency of execution management activities should increase proportionally. A project that started with weekly coordination meetings may require daily stand-ups when three parallel workstreams are running simultaneously.
  • When execution performance deviates from plan: When work performance data shows that actual progress, costs, or quality are diverging from planned baselines, the Manage Project Execution process should immediately trigger issue logging, root cause analysis, and either corrective actions or formal change requests.

Warning signs that execution management is insufficient

  • Team members are unaware of what other workstreams are doing
  • Issues are being resolved informally without documentation or assigned owners
  • No work performance data is being collected; status is reported subjectively
  • Approved changes have not been reflected in updated plans or communicated to the team
  • The lessons learned register has had no entries since project initiation
  • The PM is managing reactively, responding to surprises rather than proactively monitoring conditions

6. Practical Examples

Example 1 — Website Launch: Project Phoenix

Context: Alex Morgan, PMP, is the project manager for Project Phoenix — a full website redesign and CRM integration for TechCorp, led by CEO Sarah Chen. Budget: $72,250. Timeline: 90 days. Team: PM, two developers, one designer, one inbound analyst. Approach: agile with 2-week sprints.

How Manage Project Execution was applied:

From Sprint 1, Alex configured Jira as the PMIS: every task from the approved sprint backlog was entered as a ticket, assigned to a responsible team member, linked to the relevant milestone, and tagged with its story points for velocity tracking. The daily stand-up was structured around three questions (completed yesterday / planned today / blockers), limited to 20 minutes, and held at 9:00 AM every working day. Blockers were escalated immediately to the issue log, not parked for the weekly status call.

In Sprint 2, work performance data showed that actual development velocity was running at 68% of planned velocity — primarily because one developer was spending approximately 35% of their available time on a parallel internal initiative. Alex created an issue log entry, assigned it to herself as owner, and escalated to Sarah Chen in the weekly status report. Within three business days, the parallel initiative was deprioritized, the developer’s allocation was restored to 100%, and the sprint plan was adjusted with a formal change request to extend the Sprint 2 scope commitment by four story points.

By Sprint 4, when the designer completed the homepage and landing page mockups, Alex conducted a structured deliverable review against the requirements traceability matrix. Two pages were missing a mobile-responsive breakpoint that had been explicitly documented in the requirements. Rather than silently asking the designer to fix them, Alex logged a defect repair change request, documented the gap in the issue log, and updated the lessons learned register with an observation about the definition-of-done checklist needing to include a mobile-responsiveness verification step.

Result: Project Phoenix delivered on Day 87 of the 90-day plan — three days ahead of schedule, within budget. Every sprint’s work performance data was preserved and used to generate accurate velocity projections. The issue log documented and resolved seven issues during the project. Three lessons learned entries captured in execution were incorporated into the agency’s standard sprint kickoff checklist.

Example 2 — SaaS PM Platform: Project ProjectAdm

Context: Eduardo Montes (PM/CEO) leads the development of ProjectAdm — a SaaS project management platform aligned to PMBOK 8. Team: 8 developers + 2 designers + 1 QA engineer. Duration: 18 months. Approach: hybrid — predictive for compliance and infrastructure milestones, agile 2-week sprints for product features.

How Manage Project Execution was applied:

The ProjectAdm team used ProjectAdm itself as their PMIS — a deliberate dogfooding strategy that generated authentic product feedback with every sprint. Each compliance milestone (GDPR/CCPA certification, AWS multi-region deployment, PHPUnit test coverage) was managed on a predictive track with formal milestone reviews and documented acceptance criteria. Product feature development was managed on the agile track with 2-week sprints, daily stand-ups, and sprint reviews attended by a rotating group of beta testers from the target PM community.

In Sprint 7, work performance data revealed that the QA engineer was identifying defects faster than the development team was resolving them: the defect backlog had grown from 4 items to 23 items over two sprints. Eduardo logged this as a critical issue, conducted a root cause analysis in the sprint retrospective (root cause: insufficient unit test coverage in the backend modules being developed in Sprints 5 and 6), and raised a change request to pause feature development for one sprint and dedicate all developer capacity to defect resolution and increasing PHPUnit coverage to the 80% baseline defined in the project charter. The change was approved by the co-sponsor. Sprint 8 was a deliberate “quality sprint” — no new features, only defect resolution and test coverage. By Sprint 9, the defect introduction rate had dropped to two per sprint, well within the team’s resolution capacity.

Throughout execution, lessons learned were captured in real-time in the lessons learned register. By the end of the project, 31 actionable lessons had been documented — including the unit test coverage baseline change, the sprint velocity adjustment for GDPR compliance work, and the performance characteristics of the AWS multi-region architecture under load. These lessons informed the product’s internal development methodology documentation, which was published as part of the ProjectAdm onboarding experience for new organizational users.

7. Free and Recommended Templates

Download the free templates for this process and study filled-in examples from two real projects:

  • Project Management Plan: integrated plan with scope, schedule, cost, quality, and resource baselines. Download free template →
  • Issue Log: ID, description, owner, priority, target resolution date, status. Download free template →

Recommended digital tools

  • Jira / Azure DevOps / Linear: For managing work execution in agile and hybrid contexts — sprint backlog, task tracking, defect management, and velocity reporting.
  • ClickUp / Monday.com / Asana: For managing work execution in predictive and hybrid contexts — task dependencies, resource assignments, milestone tracking.
  • Confluence / Notion: For maintaining the lessons learned register and issue log as collaborative, searchable documents with full team access.
  • Google Workspace / Microsoft 365: For work performance data collection in structured spreadsheets with version history and shared access.
  • Power BI / Tableau: For transforming work performance data into visual dashboards that communicate execution status to stakeholders in real time.

8. Five Common Errors — and How to Avoid Each One

Error 1 — Managing by impression rather than by data

Why it happens: Collecting work performance data requires discipline and system configuration effort. When teams are under delivery pressure, the data collection discipline is the first thing to drop. The PM starts reporting status from memory and from brief conversations rather than from measured actuals.

How to avoid it: Establish work performance data collection as a non-negotiable execution practice from Sprint 1 or Week 1. Configure the PMIS to make data entry as frictionless as possible. Make it clear to the team that status reports are generated from PMIS data — which means that if the data is not in the system, the status report is inaccurate, and inaccurate status reports create stakeholder trust problems that take far longer to resolve than the five minutes it takes to update a task status.

Error 2 — Informal issue management

Why it happens: Issues are often identified in conversation, resolved in conversation, and never formally documented. This creates a false sense of efficiency (“we solved it in five minutes”) while systematically destroying the organization’s ability to learn from execution experience.

How to avoid it: Establish a rule: every problem that requires more than one person’s involvement to resolve, or that has a potential impact on schedule, cost, scope, or quality, goes into the issue log. This takes two minutes. The issue log is reviewed in every status meeting. Closed issues are preserved in the log — they become the historical record of how the project navigated its challenges.

Error 3 — Approved changes not communicated to all affected team members

Why it happens: The PM is informed of the approval, updates the plan document, and assumes the team is aware. In distributed teams or fast-moving execution environments, team members continue working against the pre-change version of the plan for days or weeks before someone notices the inconsistency.

How to avoid it: Every approved change triggers a mandatory execution update protocol: update the PMIS, update the relevant plan components, and communicate the change to every team member affected by it in the next coordination meeting or via a direct notification in the project communication channel. Document the date the change was communicated and who received it.

Error 4 — Quality addressed only at deliverable completion, not during construction

Why it happens: Quality control is perceived as a post-production activity — “we’ll test it when it’s done.” PMBOK 8 explicitly states that quality has two dimensions during execution: process quality (are we following the right procedures?) and deliverable quality (are we building to the right specifications?). Addressing only the second dimension after completion means that process failures that could have been caught early are discovered late, when correction is expensive.

How to avoid it: Integrate quality checkpoints into the execution workflow, not only at the end. For software development, this means code reviews, automated testing in the CI/CD pipeline, and regular definition-of-done reviews. For deliverable-based projects, this means progressive quality checks at the 25%, 50%, and 75% completion points rather than a single review at 100%.

Error 5 — Lessons learned captured only at project closure

Why it happens: The lessons learned process is perceived as a “closure activity” — something that happens in the last week of the project. By that time, the team’s memory of execution challenges is incomplete, the emotional context has faded, and the practical detail that would make a lesson actionable is no longer accessible.

How to avoid it: Treat the lessons learned register as an execution artifact, not a closure artifact. Assign a standing agenda item in every sprint review or phase milestone review to capture at least one lesson: what worked well, what should be done differently, and what was discovered about the project’s operating environment that was not known at planning time. A lessons learned register with 20 specific, contextual entries captured throughout execution is ten times more valuable than a register with 5 vague entries captured on the final day.

9. Tailoring: Predictive, Agile, and Hybrid

Predictive approach (Waterfall)

  • Work authorization: Formal work authorization system with documented approvals for each work package. No work begins without explicit authorization.
  • Work performance data: Collected at formal reporting intervals (weekly or bi-weekly), tied to the earned value management system. CPI and SPI are calculated and reported to the sponsor.
  • Issue management: Formal issue log with priority classification (critical, high, medium, low) and escalation thresholds documented in the communications management plan.
  • Coordination meetings: Weekly project status meetings; formal milestone reviews at phase boundaries; steering committee updates at defined intervals.
  • Best suited for: Projects with well-defined scope, fixed budget and schedule commitments, and regulatory or contractual requirements for formal documentation of all execution decisions.
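The CPI and SPI figures reported to the sponsor derive from three execution measurements: planned value (PV), earned value (EV), and actual cost (AC). A minimal sketch in Python, with hypothetical dollar figures:

```python
# Minimal earned value sketch: CPI and SPI from planned value (PV),
# earned value (EV), and actual cost (AC). Figures are hypothetical.

def evm_indices(pv: float, ev: float, ac: float) -> tuple[float, float]:
    """Return (CPI, SPI): cost and schedule performance indices."""
    cpi = ev / ac  # value earned per unit of cost spent
    spi = ev / pv  # value earned relative to value planned to date
    return cpi, spi

# Example: $100k of work planned to date, $80k earned, $90k spent.
cpi, spi = evm_indices(pv=100_000, ev=80_000, ac=90_000)
print(f"CPI = {cpi:.2f}")  # below 1.0: over budget
print(f"SPI = {spi:.2f}")  # below 1.0: behind schedule
```

Values below 1.0 on either index are exactly the kind of work performance information that should trigger a conversation at the next status meeting rather than at the next phase gate.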

Agile approach

  • Work authorization: Sprint commitment. What is committed at sprint planning is authorized for execution within that sprint. No formal work authorization system; the sprint backlog is the authorization mechanism.
  • Work performance data: Collected continuously through the PMIS (story points completed, velocity, defect rate, test coverage). Sprint burndown charts provide real-time visibility to the team. Sprint reviews provide visibility to stakeholders.
  • Issue management: Blockers are surfaced in daily stand-ups and resolved within the sprint if possible. Unresolved blockers are escalated to the product owner or sponsor. The issue log may be lighter — impediment boards in Jira or a dedicated Slack channel — but the principle of named ownership and tracked resolution still applies.
  • Lessons learned: Captured in sprint retrospectives every two weeks, ensuring continuous improvement is embedded in the delivery cadence.
  • Best suited for: Products with emerging requirements, high rate of change, short delivery cycles, and strong stakeholder feedback loops.
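The velocity-based tracking described above supports a rough completion forecast in a few lines. The sprint figures below are hypothetical, for illustration only:

```python
# Sketch: average velocity and a rough release forecast from sprint history.
# Story-point figures are hypothetical, for illustration only.
completed_per_sprint = [21, 25, 23]   # points completed in the last three sprints
velocity = sum(completed_per_sprint) / len(completed_per_sprint)

backlog_points = 180                  # points remaining in the product backlog
sprints_remaining = backlog_points / velocity
print(f"Average velocity: {velocity:.1f} points/sprint")
print(f"Forecast: about {sprints_remaining:.1f} sprints to clear the backlog")
```

A rolling average over the last few sprints dampens one-off anomalies (holidays, team changes) better than a single sprint's number.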

Hybrid approach (the ProjectAdm model)

  • Dual execution tracks: Predictive track for compliance milestones, infrastructure decisions, and regulatory requirements (with formal work authorization, EVM, and milestone reviews); agile track for product feature development (with sprint commitments, daily stand-ups, and sprint reviews).
  • Integrated work performance data: Both tracks feed into a unified project dashboard — the predictive track contributes milestone completion status and cost performance; the agile track contributes velocity, defect rates, and feature completion metrics.
  • Dual issue management: High-severity issues (compliance risk, infrastructure failure, architectural decision reversals) are managed on the formal issue log with escalation thresholds. Sprint-level impediments are managed on the agile impediment board. Both are reviewed in the weekly integration meeting that coordinates across both tracks.
  • Best suited for: Software product development organizations managing simultaneous compliance requirements and iterative feature development. Multi-year programs where some elements are stable and some are highly variable.
| Aspect | Predictive | Agile | Hybrid (ProjectAdm model) |
| --- | --- | --- | --- |
| Work authorization | Formal work authorization system per work package | Sprint commitment at sprint planning | Formal authorization for predictive milestones; sprint commitment for agile features |
| Work performance data | Weekly/bi-weekly; EVM-based CPI/SPI | Continuous; velocity and burndown charts | Integrated dashboard with EVM for predictive track + velocity for agile track |
| Issue management | Formal issue log with escalation thresholds | Impediment board; daily stand-up escalation | Formal log for compliance/infrastructure + impediment board for sprint blockers |
| Lessons learned | At phase gates and project closure | Every sprint retrospective | Sprint retrospectives + formal milestone retrospectives |
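A minimal sketch of how the two hybrid tracks could roll up into one status line for the weekly integration meeting. All field names and thresholds here are illustrative assumptions, not PMBOK terms:

```python
# Sketch: roll both execution tracks into one traffic-light status.
# Metric names, values, and thresholds are illustrative assumptions.
predictive = {"milestones_done": 4, "milestones_total": 7, "cpi": 0.95, "spi": 0.98}
agile = {"velocity": 23.0, "planned_velocity": 25.0, "defect_rate": 0.04}

flags = []
if predictive["cpi"] < 0.90 or predictive["spi"] < 0.90:
    flags.append("predictive")  # cost or schedule index has slipped
if agile["velocity"] < 0.8 * agile["planned_velocity"] or agile["defect_rate"] > 0.05:
    flags.append("agile")       # delivery pace or quality has slipped

status = "at risk: " + ", ".join(flags) if flags else "on track"
print(status)
```

The point is not the specific thresholds but that both tracks are judged from measured data in one place, rather than in two disconnected reports.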

10. Interactions with Other Processes and Domains

Manage Project Execution sits at the center of the project lifecycle — it receives inputs from virtually every planning process and feeds outputs into virtually every monitoring and control process.

What feeds into Manage Project Execution

| Process | Domain | What it provides |
| --- | --- | --- |
| Integrate and Align Project Plans (Process 2) | Governance | The integrated project management plan that authorizes and guides all execution activities |
| Plan Sourcing Strategy (Process 3) | Governance | Vendor contracts, external resource assignments, and procurement deliverable obligations |
| Assess and Implement Changes (Process 8) | Governance | Approved change requests that must be incorporated into active execution |
| Planning processes (all domains) | All domains | Scope baseline, schedule baseline, cost baseline, quality management plan, resource management plan, risk register |

What Manage Project Execution feeds into

| Process | Domain | What it receives |
| --- | --- | --- |
| Monitor and Control Project Performance (Process 7) | Governance | Work performance data as the primary input for performance tracking and reporting |
| Assess and Implement Changes (Process 8) | Governance | Change requests generated during execution that require assessment and decision |
| Manage Quality Assurance (Process 5) | Governance | Execution process data that quality audits and process improvement activities assess |
| Manage Project Knowledge (Process 6) | Governance | Lessons learned and new knowledge generated during execution |
| Scope Control / Quality Control | Scope / Quality | Deliverables produced during execution, submitted for verification and validation |

11. Quick-Application Checklist

Use these 10 items as a completion gate at the start of each execution phase or sprint. For each item where the answer is “no” or “not yet,” treat it as an open action item:

  1. Is the project management plan (or sprint backlog) the authoritative reference for all current execution activities?
  2. Is the PMIS configured and accessible to all team members for task tracking and status updates?
  3. Are daily or near-daily coordination meetings scheduled, time-boxed, and structured around status, plans, and blockers?
  4. Is work performance data being collected consistently and entered into the PMIS at defined intervals?
  5. Is the issue log active and reviewed in every status meeting, with every entry assigned to a named owner and a target resolution date?
  6. Have all approved change requests been incorporated into the work plan and communicated to all affected team members?
  7. Are deliverables being reviewed against the requirements traceability matrix before being submitted for acceptance?
  8. Are quality checkpoints embedded throughout the deliverable construction process, not only at completion?
  9. Is the lessons learned register being updated throughout execution, not only at project closure?
  10. Is the PM reviewing work performance data proactively at each reporting interval and generating status reports from measured actuals?

Scoring: Fewer than 8 confirmed items indicates gaps in execution governance. Each “no” is a specific risk: a PMIS gap means monitoring data will be incomplete; an issue log gap means problems will be resolved informally and won’t be tracked; a lessons learned gap means the organization will repeat the same execution challenges on the next project.
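The gate above can be scored mechanically at the start of each phase or sprint. The abbreviated item names and sample answers below are illustrative:

```python
# Sketch: score the 10-item execution gate and surface the gaps.
# Item names abbreviate the checklist above; answers are sample data.
checklist = {
    "plan is authoritative reference": True,
    "PMIS configured and accessible": True,
    "coordination meetings scheduled": True,
    "work performance data collected": False,
    "issue log active with owners": True,
    "approved changes incorporated": True,
    "deliverables traced to requirements": True,
    "quality checkpoints embedded": False,
    "lessons learned updated in-flight": False,
    "status reported from measured actuals": True,
}

confirmed = sum(checklist.values())
gaps = [item for item, ok in checklist.items() if not ok]
print(f"Confirmed: {confirmed}/10")
if confirmed < 8:
    print("Execution governance gaps:", ", ".join(gaps))
```

Each item in `gaps` becomes an open action item with an owner, exactly as the checklist instructions require.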

Conclusion and Next Steps

Manage Project Execution is where a project’s value is actually created — and where governance either holds or breaks down. The planning processes that precede it define intent; Manage Project Execution determines whether that intent becomes reality. Project Phoenix’s on-time delivery, and ProjectAdm’s quality sprint decision that prevented a defect accumulation crisis, were not accidents of favorable circumstances — they were outcomes of disciplined execution governance: work performance data collected consistently, issues managed formally, approved changes implemented immediately, and lessons captured continuously.

Three takeaways for immediate application:

  • Work performance data is not optional: Every monitoring and control decision depends on the quality of work performance data generated during execution. Collect it consistently, enter it into your PMIS, and use it to report status rather than relying on subjective impressions. The five minutes it takes to update a task status saves hours of stakeholder conflict management.
  • The issue log is the PM’s memory: Informal issue resolution feels efficient in the moment but is invisible to governance, unreplicable for future projects, and unavailable for root cause analysis when the same problem recurs. Log every issue. Assign every issue. Review every issue. Close every issue with a documented resolution.
  • Execution is a leadership act, not just a management act: PMBOK 8 positions Manage Project Execution as a process of leading and performing work. The project manager’s role is not only to track tasks but to align the team’s collective knowledge toward the project’s objectives, resolve the human and organizational factors that block productive work, and create an execution environment where quality and speed reinforce each other rather than trade off against each other.

Your concrete next step: Open your current project’s PMIS. Check: is the work performance data up to date for every active task? Does the issue log have a named owner and resolution date for every open item? When were lessons learned last added to the register? If any of these questions produces a “no,” you have identified your first execution governance improvement action. Address it today — because execution drift compounds every day it goes uncorrected.

See all PMBOK 8 articles in the Complete Index




References

Project Management Institute (PMI). A Guide to the Project Management Body of Knowledge (PMBOK® Guide) – Eighth Edition. Newtown Square, Pennsylvania, USA: Project Management Institute, 2025.

PMBOK Guide 8: The New Era of Value-Based Project Management. Available at: https://projectmanagement.com.br/pmbok-guide-8/

Disclaimer

This article is an independent educational interpretation of the PMBOK® Guide – Eighth Edition, developed for informational purposes by ProjectManagement.com.br. It does not reproduce or redistribute proprietary PMI content. All trademarks, including PMI, PMBOK, and Project Management Institute, are the property of the Project Management Institute, Inc. For access to the complete and official content, purchase the guide from Amazon or download it for free at https://www.pmi.org/standards/pmbok if you are a PMI member.

