SPINE has two main orchestration patterns:
| Pattern | How it runs | When to use |
|---|---|---|
| Fan-Out | Parallel | Independent tasks that can run together |
| Pipeline | Sequential | Steps that depend on each other |
Fan-Out runs multiple tasks at once, then combines the results. Good when tasks don’t depend on each other.
               ┌─────────────┐
               │   Parent    │
               │  Envelope   │
               └──────┬──────┘
      ┌───────────────┼───────────────┐
      ▼               ▼               ▼
┌───────────┐   ┌───────────┐   ┌───────────┐
│ Analyst A │   │ Analyst B │   │ Analyst C │
└───────────┘   └───────────┘   └───────────┘
      │               │               │
      └───────────────┼───────────────┘
                      ▼
               ┌─────────────┐
               │  Aggregate  │
               │   Results   │
               └─────────────┘
| Scenario | Example |
|---|---|
| Multi-source research | Query multiple docs at once |
| Parallel analysis | Security, style, and logic review |
| Independent subtasks | Generate tests for multiple functions |
| Competitive evaluation | Compare approaches |
# One entry per agent; all three run in parallel under the parent envelope
tasks = [
  { message: "Analyze source A", agent: "analyst-a" },
  { message: "Analyze source B", agent: "analyst-b" },
  { message: "Analyze source C", agent: "analyst-c" }
]

# max_workers caps concurrency; the return value is the aggregated result
result = fan_out(parent_envelope, tasks, max_workers: 5)
Fan-out with 3 agents:
├── Base: 1x (orchestrator)
├── Agents: 3x (parallel)
├── Aggregation: 0.5x
└── Total: ~4.5x single-agent cost
Trade-off: higher total token cost, but lower wall-clock time because the agents run concurrently.
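For intuition, here is a minimal sketch of the fan-out mechanics in Python. This is an illustration, not the SPINE implementation; `dispatch` and `aggregate` are hypothetical stand-ins for the real agent call and merge step.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for the real agent call and the merge step
def dispatch(agent, message, parent=None):
    return f"[{agent}] result for: {message}"

def aggregate(results):
    return "\n".join(results)

def fan_out(parent_envelope, tasks, max_workers=5):
    def run(task):
        # A real implementation would wrap each call in a child
        # ToolEnvelope linked back to parent_envelope
        return dispatch(task["agent"], task["message"], parent=parent_envelope)

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(run, tasks))  # result order matches tasks

    return aggregate(results)  # the ~0.5x aggregation step from the cost note
```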
Pipeline processes data through stages, where each stage feeds the next. Good for transformations that build on each other.
┌─────────┐     ┌─────────┐     ┌─────────┐     ┌─────────┐
│  Stage  │ ──▶ │  Stage  │ ──▶ │  Stage  │ ──▶ │  Stage  │
│    1    │     │    2    │     │    3    │     │    4    │
└─────────┘     └─────────┘     └─────────┘     └─────────┘
  Analyze         Extract        Transform      Synthesize
| Scenario | Example |
|---|---|
| Document processing | Parse → Extract → Summarize |
| Data transformation | Fetch → Clean → Analyze → Report |
| Build processes | Lint → Test → Build → Deploy |
| Staged analysis | Explore → Plan → Implement → Verify |
steps = [
  # Prompted steps run an agent; each step's output feeds the next
  { name: "analyze", prompt: "You are an analyst." },
  { name: "extract", prompt: "Extract key findings." },
  # A transform step runs a function instead of prompting an agent
  { name: "synthesize", transform: combine_results }
]
result = pipeline(parent_envelope, steps)
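A matching sketch of the pipeline mechanics, under the same caveats: `call_agent` and the `initial_input` parameter are assumptions for illustration, not SPINE's API.

```python
# Hypothetical stand-in for a single prompted agent call
def call_agent(prompt, data, parent=None):
    return f"{prompt} -> processed({data})"

def pipeline(parent_envelope, steps, initial_input=""):
    data = initial_input
    for step in steps:
        if "transform" in step:
            data = step["transform"](data)  # pure-function stage, no agent call
        else:
            data = call_agent(step["prompt"], data, parent=parent_envelope)
    return data  # the final stage's output is the pipeline result
```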
Both patterns wrap operations in a ToolEnvelope for traceability.
┌─────────────────────────────────────────┐
│               ToolEnvelope              │
├─────────────────────────────────────────┤
│ id: "call-abc123"                       │
│ tool: "anthropic:claude-sonnet-4-5"     │
│ trace:                                  │
│   root_id: "task-xyz"                   │
│   parent_id: "orchestrator-001"         │
│   span_id: "subagent-research"          │
│ metadata:                               │
│   tags: ["research", "phase-1"]         │
│   experiment_id: "exp-2025-001"         │
│ timing:                                 │
│   started_at, finished_at, duration_ms  │
│ usage:                                  │
│   input_tokens, output_tokens           │
└─────────────────────────────────────────┘
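Rendered as a Python dataclass, the same structure might look like this; field names follow the diagram, but the types are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Trace:
    root_id: str
    parent_id: str
    span_id: str

@dataclass
class ToolEnvelope:
    id: str
    tool: str
    trace: Trace
    metadata: dict = field(default_factory=dict)  # tags, experiment_id
    timing: dict = field(default_factory=dict)    # started_at, finished_at, duration_ms
    usage: dict = field(default_factory=dict)     # input_tokens, output_tokens
```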
Fan-out:
root_id: "research-task-001"
├── span_id: "orchestrator"
│   ├── span_id: "agent-security"  (parallel)
│   ├── span_id: "agent-style"     (parallel)
│   └── span_id: "agent-logic"     (parallel)
└── span_id: "aggregator"
Pipeline:
root_id: "document-process-001"
├── span_id: "stage-1-analyze"
├── span_id: "stage-2-extract"
├── span_id: "stage-3-transform"
└── span_id: "stage-4-synthesize"
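A sketch of how child spans could be derived, reusing the `Trace` dataclass above. The convention shown (root inherited, `parent_id` set to the caller's span) matches the trees but is an assumption.

```python
def child_trace(parent: Trace, span_id: str) -> Trace:
    # Root is inherited by every span; parent_id links the child to its caller
    return Trace(root_id=parent.root_id, parent_id=parent.span_id, span_id=span_id)

orchestrator = Trace(root_id="research-task-001", parent_id="", span_id="orchestrator")
security = child_trace(orchestrator, "agent-security")

assert security.root_id == "research-task-001"  # shared across the whole task
assert security.parent_id == "orchestrator"     # points at the caller's span
```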
The two patterns compose. A fan-out can run as a single pipeline stage, putting parallel work between sequential steps:
┌─────────┐     ┌─────────────────────────┐     ┌────────────┐
│ Prepare │ ──▶ │      Fan-Out Stage      │ ──▶ │ Synthesize │
└─────────┘     │  ┌───┐  ┌───┐  ┌───┐    │     └────────────┘
                │  │ A │  │ B │  │ C │    │
                │  └───┘  └───┘  └───┘    │
                └─────────────────────────┘
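Continuing from the `fan_out` and `pipeline` sketches above, the hybrid is just a transform step that fans out. Task wording and agent names here are illustrative.

```python
parent_envelope = None  # placeholder; a real run would pass the parent ToolEnvelope

steps = [
    {"name": "prepare", "prompt": "Prepare the inputs."},
    # The middle stage blocks until all three parallel agents finish,
    # then hands the merged output to the final step
    {"name": "fan-out", "transform": lambda data: fan_out(parent_envelope, [
        {"message": f"Analyze this: {data}", "agent": agent}
        for agent in ("analyst-a", "analyst-b", "analyst-c")
    ])},
    {"name": "synthesize", "prompt": "Synthesize the results."},
]
result = pipeline(parent_envelope, steps)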
Conversely, independent pipelines can run in parallel:
                ┌───────────────────┐
        ┌──────▶│ Pipeline 1: Docs  │───────┐
        │       └───────────────────┘       │
┌───────┴───┐   ┌───────────────────┐   ┌───┴───────┐
│ Dispatch  │──▶│ Pipeline 2: Code  │──▶│ Aggregate │
└───────┬───┘   └───────────────────┘   └───┬───────┘
        │       ┌───────────────────┐       │
        └──────▶│ Pipeline 3: Tests │───────┘
                └───────────────────┘
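Again building on the sketches above, one way to run the branches concurrently; the branch contents are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

branches = {
    "docs":  [{"name": "review-docs",  "prompt": "Review the docs."}],
    "code":  [{"name": "review-code",  "prompt": "Review the code."}],
    "tests": [{"name": "review-tests", "prompt": "Review the tests."}],
}

with ThreadPoolExecutor(max_workers=3) as pool:
    futures = {name: pool.submit(pipeline, parent_envelope, steps)
               for name, steps in branches.items()}
    results = {name: future.result() for name, future in futures.items()}

# `results` holds one output per branch, ready for the aggregate step
```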
| Factor | Fan-Out | Pipeline |
|---|---|---|
| Task independence | High | Low |
| Order matters | No | Yes |
| Speed priority | Wall-clock time | Throughput |
| Error isolation | Good | Cascading risk |
| Debugging | Parallel traces | Linear flow |
Fan-Out when:
├── Tasks are independent
├── Need fastest wall-clock time
├── Results can be merged
└── Partial results OK
Pipeline when:
├── Steps depend on each other
├── Order matters
├── Data transforms sequentially
└── Need checkpoints
Every envelope is logged to ./logs/YYYY-MM-DD/*.json.
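A quick way to load a day's envelopes for inspection; the path format comes from above, and one envelope per file is an assumption.

```python
import glob
import json
from datetime import date

# Path format from above; one envelope per file is an assumption
log_dir = f"./logs/{date.today():%Y-%m-%d}"

envelopes = []
for path in sorted(glob.glob(f"{log_dir}/*.json")):
    with open(path) as f:
        envelopes.append(json.load(f))

# Index by span_id to walk a trace tree like the ones above
by_span = {env["trace"]["span_id"]: env for env in envelopes}
```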