
Workflow

Build automated pipelines with reusable components, data flow between nodes, and state management.

Stars: 1,933 · Forks: 367 · Updated: March 4, 2026
SKILL.md Frontmatter
name: Workflow
slug: workflow
version: 1.0.0
description: Build automated pipelines with reusable components, data flow between nodes, and state management.

Architecture

Workflows live in workflows/ with components and flows:

workflows/
├── index.md                 # Inventory with tags
├── components/
│   ├── connections/         # Auth configs
│   ├── nodes/               # Reusable operations
│   └── triggers/            # Event sources
└── flows/{name}/
    ├── flow.md              # Definition
    ├── config.yaml          # Parameters
    ├── run.sh               # Executable
    ├── state/               # Persistent between runs
    └── logs/

Quick Reference

| Topic | File |
| --- | --- |
| Directory layout, naming, formats | structure.md |
| Data passing between nodes | data-flow.md |
| Cursor, seen set, checkpoint | state.md |
| Retry, rollback, idempotency | errors.md |
| Connections, nodes, triggers | components.md |
| Create, test, update, archive | lifecycle.md |

Requirements

  • jq — JSON processing
  • yq — YAML config parsing
  • curl — HTTP requests
  • flock — Lock files to prevent concurrent runs
  • Secrets in macOS Keychain (security find-generic-password)

Data Storage

  • Location: workflows/ in workspace root
  • State: flows/{name}/state/ — cursor.json, seen.json, checkpoint.json
  • Logs: flows/{name}/logs/ — JSONL per run
  • Data: flows/{name}/data/ — intermediate files between nodes
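A minimal sketch of reading and advancing `state/cursor.json` with jq. The field name `last_ts` and the timestamp values are illustrative assumptions, not part of the skill's spec:

```shell
STATE_DIR="state"
CURSOR="$STATE_DIR/cursor.json"
mkdir -p "$STATE_DIR"

# Read the last position, falling back to an epoch default on the first run.
LAST_TS="$(jq -r '.last_ts // "1970-01-01T00:00:00Z"' "$CURSOR" 2>/dev/null \
  || echo "1970-01-01T00:00:00Z")"

# ... fetch only items newer than $LAST_TS ...

# Advance the cursor atomically: write a temp file, then rename over the old one.
NEW_TS="2026-03-04T00:00:00Z"   # in a real flow, taken from the fetched data
jq -n --arg ts "$NEW_TS" '{last_ts: $ts}' > "$CURSOR.tmp" && mv "$CURSOR.tmp" "$CURSOR"
```

The temp-file-then-rename step matters: a crash mid-write leaves the old cursor intact instead of a truncated JSON file.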

Core Rules

1. Data Flow Pattern

Each node writes output to data/{NN}-{name}.json. Next node reads it.

curl ... > data/01-fetch.json
jq '...' data/01-fetch.json > data/02-filter.json

Breaking this pattern = nodes can't communicate.
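A hypothetical run.sh fragment following this convention. The JSON shape and the `active` filter are placeholders; the fetch is simulated with static data so the sketch runs offline:

```shell
mkdir -p data

# Node 01: fetch. In a real flow this would be something like:
#   curl -fsS "$API_URL" > data/01-fetch.json
# Simulated here with static JSON.
printf '[{"id":1,"active":true},{"id":2,"active":false}]' > data/01-fetch.json

# Node 02: filter reads node 01's output file and writes its own.
jq '[.[] | select(.active)]' data/01-fetch.json > data/02-filter.json
```

Each node touches only numbered files in `data/`, never pipes or environment variables, so any node can be re-run in isolation from its predecessor's file.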

2. Error Declaration

Every node in flow.md MUST declare:

  • On error: retry(N) | fail | continue | alert
  • On empty: skip | continue | fail

Undefined behavior = unpredictable workflow.
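One way a node's `retry(N)` declaration could be implemented in run.sh. This is a sketch; the function name and linear backoff are illustrative, not prescribed by the skill:

```shell
run_with_retry() {
  # Usage: run_with_retry <max_attempts> <command...>
  local max="$1"; shift
  local attempt=1
  while ! "$@"; do
    if [ "$attempt" -ge "$max" ]; then
      echo "node failed after $max attempts" >&2
      return 1
    fi
    sleep "$attempt"            # simple linear backoff between attempts
    attempt=$((attempt + 1))
  done
}

# Example: a node declared as "On error: retry(3)"
# run_with_retry 3 curl -fsS "$URL" -o data/01-fetch.json
```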

3. Lock Files

Prevent concurrent runs:

LOCKFILE="/tmp/workflow-${NAME}.lock"
exec 200>"$LOCKFILE"      # open fd 200 on the lock file
flock -n 200 || exit 0    # another run holds the lock: exit quietly

4. State Files

| File | Purpose |
| --- | --- |
| cursor.json | "Where did I leave off?" |
| seen.json | "What have I processed?" |
| checkpoint.json | "Multi-step recovery" |
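A sketch of maintaining `seen.json` as a JSON array of processed IDs. The helper names and the ID format are assumptions for illustration:

```shell
SEEN="state/seen.json"
mkdir -p state
[ -f "$SEEN" ] || echo '[]' > "$SEEN"

# Exit 0 if the ID is already in the seen set.
already_seen() {
  jq -e --arg id "$1" 'index($id) != null' "$SEEN" >/dev/null
}

# Append an ID, dedupe, and replace the file atomically.
mark_seen() {
  jq --arg id "$1" '. + [$id] | unique' "$SEEN" > "$SEEN.tmp" && mv "$SEEN.tmp" "$SEEN"
}

# Example: process each item at most once
# already_seen "item-42" || { process_item "item-42" && mark_seen "item-42"; }
```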

5. Component Reuse

Before creating new connections/nodes/triggers:

ls workflows/components/connections/
ls workflows/components/nodes/

Reuse existing components where possible. When a new workflow adopts a component, update that component's "Workflows Using This" list.


Related: For LLM-driven multi-phase processes, see the cycle skill.