Divide And Conquer
For most D&C workloads, use:
- Deque queueing (work closer to the recursion frontier executes first)
- Single-item stealing (reduces cache thrash and mitigates task ping-pong)
Example:

```csharp
PowerPoolOption options = new PowerPoolOption
{
    QueueType = QueueType.Deque,
    StealOneWorkOnly = true,
};
```

- QueueType = Deque favors depth-first execution, which is typically beneficial for D&C recursion trees.
- StealOneWorkOnly = true encourages narrow, predictable stealing steps that align with recursive splits (a minimal pool setup is sketched below).
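As a minimal sketch (assuming the pool accepts the option object at construction, as in the library's basic usage):

```csharp
// Pool tuned for D&C: depth-first local queues plus single-item stealing.
PowerPoolOption options = new PowerPoolOption
{
    QueueType = QueueType.Deque,
    StealOneWorkOnly = true,
};

PowerPool pool = new PowerPool(options);
```

The later examples on this page reference a pool variable configured along these lines.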
WorkPlacementPolicy controls where each work item lands. Set it on each WorkOption (so different branches can use different policies).
Available policies:
- PreferLocalWorker
- If the current thread is a pool worker, place the work into this worker’s local queue.
- Use when you recursively enqueue from inside a worker and want to keep locality.
- PreferIdleThenLocal
- Try to assign to an idle worker. If none, fall back to local (if available).
- Good compromise for spreading work when idle threads exist.
- PreferIdleThenLeastLoaded (default)
- Try idle first; otherwise, choose a worker with the fewest queued items (skipping long-running workers).
- Use when you want global balancing beyond local affinity.
Example:

```csharp
var opt = new WorkOption<long>
{
    ShouldStoreResult = true,
    WorkPlacementPolicy = WorkPlacementPolicy.PreferLocalWorker
};

WorkID id = pool.QueueWorkItem<long>(() => /* compute */, opt);
```

- In recursive methods running inside pool workers, PreferLocalWorker commonly yields the best performance and locality. A sketch of picking the policy per call site follows.
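A hedged sketch of mixing policies across branches (the MakeOption helper and the calledFromWorker flag are illustrative, not part of the library):

```csharp
// Pick a placement policy depending on where the submission originates:
// PreferLocalWorker keeps recursive submissions on the current worker's queue,
// while PreferIdleThenLocal spreads top-level submissions to idle workers first.
WorkOption<long> MakeOption(bool calledFromWorker) => new WorkOption<long>
{
    ShouldStoreResult = true,
    WorkPlacementPolicy = calledFromWorker
        ? WorkPlacementPolicy.PreferLocalWorker
        : WorkPlacementPolicy.PreferIdleThenLocal
};

// Inside a recursive method running on a pool worker:
WorkID innerId = pool.QueueWorkItem<long>(() => 0L /* placeholder subproblem */, MakeOption(calledFromWorker: true));

// From the submitting (non-worker) thread:
WorkID rootId = pool.QueueWorkItem<long>(() => 0L /* placeholder root problem */, MakeOption(calledFromWorker: false));
```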
- StealOneWorkOnly: when true, a stealing worker takes only one work item at a time.
- Recommended for D&C trees to reduce excessive migration of tasks between workers.
Set it on the pool:

```csharp
var options = new PowerPoolOption
{
    StealOneWorkOnly = true
};
```

helpWhileWaiting: when a caller is blocked waiting, it can "help" the pool make progress by executing available work. This reduces idle time and minimizes overhead.
Places to use it:
- PowerPool.Wait(helpWhileWaiting: true)
- Work.Wait(helpWhileWaiting: true)
- Fetch(id, removeAfterFetch: ..., helpWhileWaiting: true)
- Fetch(IEnumerable, ...) (batch fetch; a sketch follows after the example below)
- Group.Wait(helpWhileWaiting: true)
- Group.Fetch(..., helpWhileWaiting: true)
Example:

```csharp
// Wait for the entire pool to finish while helping:
pool.Wait(helpWhileWaiting: true);

// Fetch a result while helping:
var res = pool.Fetch<long>(someId, removeAfterFetch: false, helpWhileWaiting: true);
long value = res.Result;
```

- The current thread attempts to execute work (via work stealing) while it’s otherwise waiting.
- This is especially useful in top-down D&C where the caller submits subproblems and then waits.
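For the Fetch(IEnumerable, ...) overload listed above, a hedged sketch (assuming it mirrors the single-ID overload's parameters and returns one result object per ID; leftId and rightId are illustrative):

```csharp
// Fetch several stored sub-results at once while helping the pool make progress.
var ids = new List<WorkID> { leftId, rightId };
var results = pool.Fetch<long>(ids, removeAfterFetch: false, helpWhileWaiting: true);

long sum = 0;
foreach (var r in results)
{
    sum += r.Result; // each entry exposes Result, as in the single-ID example
}
```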
Below are common patterns for splitting a problem and combining results. These are templates rather than complete programs.
```csharp
long Solve(int l, int r)
{
    if (r - l + 1 <= cutoff) return ComputeDirect(l, r);

    int m = (l + r) >> 1;

    var opt = new WorkOption<long>
    {
        ShouldStoreResult = true,
        WorkPlacementPolicy = WorkPlacementPolicy.PreferLocalWorker
    };

    // Enqueue one side
    WorkID rightId = pool.QueueWorkItem(() => Solve(m + 1, r), opt);

    // Compute the other side inline
    long left = Solve(l, m);

    // Fetch the result and help while waiting
    long right = pool.Fetch<long>(rightId, removeAfterFetch: false, helpWhileWaiting: true).Result;

    return left + right;
}
```

Submitting the root from outside the pool:

```csharp
WorkID rootId = pool.QueueWorkItem<long>(() => Solve(0, n - 1), new WorkOption<long>
{
    ShouldStoreResult = true,
    WorkPlacementPolicy = WorkPlacementPolicy.PreferIdleThenLocal
});

// Help while waiting for all work to complete
pool.Wait(helpWhileWaiting: true);

// Fetch the final result (helpWhileWaiting: true can also be passed here)
long total = pool.Fetch<long>(rootId, removeAfterFetch: false, helpWhileWaiting: true).Result;
```

- If you block within a worker and want to reduce the stall, you can Fetch with helpWhileWaiting: true.
- Combine with WorkPlacementPolicy.PreferLocalWorker to keep spawned subproblems nearby.
For D&C, you'll frequently:
- Store sub-results: set ShouldStoreResult = true in WorkOption so you can Fetch by ID later.
- Optionally use Group to manage a subtree.
Example:

```csharp
string groupName = "AAA";

var opt = new WorkOption<long>
{
    Group = groupName,
    ShouldStoreResult = true,
    WorkPlacementPolicy = WorkPlacementPolicy.PreferLocalWorker
};

WorkID id = pool.QueueWorkItem<long>(() => /* subproblem */, opt);

// Later
var grp = pool.GetGroup(groupName);
grp.Wait(helpWhileWaiting: true);
var results = grp.Fetch<long>(removeAfterFetch: false, helpWhileWaiting: true);
```

- If you do not need to fetch a sub-result later, you can compute it inline instead of enqueuing it.
- To compose results, enqueue the final combine step as a work item (with ShouldStoreResult = true) or compute it inline, depending on your flow; a sketch of the queued combine step follows.
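A hedged sketch of the queued combine step, reusing the group above (the summation is illustrative, and the iteration assumes each fetched entry exposes Result as in the single-ID example):

```csharp
// Queue the combine step itself as work so its result can be fetched by ID later.
WorkID combineId = pool.QueueWorkItem<long>(() =>
{
    // Wait for the subtree's group and fetch its sub-results, helping while blocked.
    var grp = pool.GetGroup("AAA");
    grp.Wait(helpWhileWaiting: true);
    var subResults = grp.Fetch<long>(removeAfterFetch: false, helpWhileWaiting: true);

    long total = 0;
    foreach (var r in subResults)
    {
        total += r.Result; // combine step: here, a simple sum
    }
    return total;
}, new WorkOption<long> { ShouldStoreResult = true });

// The combined value can then be fetched like any other stored result.
long combined = pool.Fetch<long>(combineId, removeAfterFetch: false, helpWhileWaiting: true).Result;
```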
Practical guidelines:
- Use QueueType.Deque + PreferLocalWorker for recursion-heavy tasks.
- Enqueue at most one side of a split, compute the other inline to limit enqueued volume.
- Always set helpWhileWaiting: true on blocking waits and fetches in D&C flows.
- Use StealOneWorkOnly = true to avoid moving too many tasks between workers.
- Use a reasonable cutoff to avoid over-splitting into tiny tasks.
- If the caller is outside the pool, prefer pool.Wait(helpWhileWaiting: true) after submitting the top-level task.
- If waiting inside a worker, use Fetch(..., helpWhileWaiting: true) and PreferLocalWorker to preserve locality.
Do:
- Do set ShouldStoreResult = true for subproblems you will fetch by ID later.
- Do use helpWhileWaiting: true whenever you block in a D&C pipeline.
- Do choose WorkPlacementPolicy deliberately based on where the call originates (inside worker vs outside).
Don't:
- Don't enqueue both branches indiscriminately at every level; prefer "one branch inline + one branch queued."
- Don't set QueueType = FIFO for deep-recursion D&C unless you have a specific reason; Deque is usually better.
- Don't forget to pass helpWhileWaiting: true; otherwise you miss out on cooperative progress.