# ShellFlow

AI-agent-native DevOps bash script orchestrator.
ShellFlow is a minimal shell script orchestrator for mixed local and remote execution. You write one shell script, mark execution boundaries with comments, and ShellFlow runs each block in order while resolving remote targets from your SSH configuration.
## Features

- Split a shell script into `@LOCAL` and `@REMOTE` execution blocks.
- Run blocks sequentially by default, or run annotated groups with `--mode parallel`.
- Freeze uppercase prelude assignments once so values such as `BUILD_ID=$(date +%s)` stay consistent across local and remote blocks.
- Declare script parameters with `# @option` and pass them as CLI flags or agent schema values.
- Run focused parts of a playbook with `# @TASK` and `shellflow run --task`.
- Reuse command snippets with `# @MACRO` and `# @HELPER`.
- Add local lifecycle hooks for setup, cleanup, success, and failure handling.
- Pass the previous block's output forward as `SHELLFLOW_LAST_OUTPUT`.
- Export named scalar values from a block into later block environments.
- Emit either a final JSON report or streaming JSON Lines events for agents.
- Support bounded `@TIMEOUT` and `@RETRY` directives without embedding workflow logic.
- Provide non-interactive, dry-run, audit-log, doctor, and agent-run modes for automated execution.
- Resolve remote targets from inline `@SERVER` definitions, `~/.ssh/config`, or a custom SSH config path.
## Installation

Install with uv:

```sh
uv tool install shellflow
shellflow --version
```

Run a playbook:

```sh
shellflow run playbooks/hello.sh
```

Install the agent skill:

```sh
npx skills add longcipher/shellflow
```

This installs the agent skill from this repository for writing and reviewing Shellflow playbooks.

To upgrade to the latest version:

```sh
uv tool upgrade shellflow
```

Install from source:

```sh
git clone https://github.com/longcipher/shellflow.git
cd shellflow
uv sync --all-groups  # uv sync --refresh --reinstall --no-cache
uv tool install --force .
shellflow --version
```

Or use an editable install for development:

```sh
uv pip install -e .
shellflow --version
```

## Playbook markers

Shellflow playbooks are ordinary shell scripts plus comment markers. Marker names are shown uppercase here and should be written that way for readability; the parser accepts marker names case-insensitively.
```sh
# @LOCAL
# @REMOTE <ssh-host>
```

`<ssh-host>` must match an inline `@SERVER` definition or a `Host` entry in your SSH config. Shellflow then connects using the configured host, `HostName`, `User`, `Port`, and identity key values.

Block directives must appear immediately after the `# @LOCAL` or `# @REMOTE <ssh-host>` marker, before the first command in that block:

- `# @TIMEOUT <seconds>`
- `# @RETRY <count>`
- `# @EXPORT NAME=stdout|stderr|output|exit_code`
- `# @SHELL <shell>` - specify `bash`, `zsh`, or `sh`.
- `# @PARALLEL [group]` - mark this block for grouped parallel execution.

`@PARALLEL` may appear immediately before a block marker or as a block directive. It applies only to that one block. Consecutive parallel blocks are grouped together when you run with `--mode parallel`.
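To see why a directive like `@RETRY 2` stays bounded rather than becoming workflow logic, it can be pictured as a simple capped loop. The sketch below is a plain-shell analogy, not ShellFlow's implementation; the `flaky` function and its file-based state are purely illustrative:

```shell
#!/bin/sh
# Plain-shell sketch of bounded "@RETRY 2" semantics: re-run a failing
# block up to the retry limit, stopping at the first success.
state=$(mktemp)
flaky() {
  # Simulated flakiness: fail on the first call, succeed afterwards.
  if [ -s "$state" ]; then
    return 0
  fi
  echo tried > "$state"
  return 1
}

retries=2
attempt=0
while ! flaky; do
  attempt=$((attempt + 1))
  if [ "$attempt" -gt "$retries" ]; then
    echo "failed after $retries retries"
    break
  fi
  echo "retry $attempt"
done
echo "attempts used: $attempt"
rm -f "$state"
```

The loop never runs more than the declared number of retries, so the outer agent keeps control of any larger decision, such as switching strategies after a hard failure.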
Example:
```sh
#!/bin/bash
set -euo pipefail

# @option release-name=
# @option branch=main

# @LOCAL
# @EXPORT VERSION=stdout
echo "$RELEASE_NAME-$BRANCH"

# @REMOTE sui
uname -a

# @LOCAL
echo "remote output: $SHELLFLOW_LAST_OUTPUT"
echo "version = $VERSION"
```

## Options

Declare script parameters near the top of the file:
```sh
# @option staging
# @option branch=main
# @option release-name=
```

- `# @option staging` is boolean. Passing `--staging` sets `STAGING=1`.
- `# @option branch=main` has a default. Passing `--branch develop` sets `BRANCH=develop`.
- `# @option release-name=` is required. Pass `--release-name v1` or set `RELEASE_NAME` in the environment.
- Option names become uppercase environment variables with dashes converted to underscores.
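The dash-to-underscore, uppercase naming rule can be sketched in plain shell; this illustrates the convention only and is not ShellFlow's actual parser:

```shell
#!/bin/sh
# Map an option name like "release-name" to its environment variable:
# uppercase the letters and turn dashes into underscores.
opt="release-name"
var=$(printf '%s' "$opt" | tr 'a-z-' 'A-Z_')
echo "$var"   # prints RELEASE_NAME
```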
Run it:
```sh
shellflow run deploy.sh --branch develop --release-name v2026.05.01 --staging
```

## Shared prelude

Lines before the first block marker are the shared prelude. Uppercase assignments in the prelude are evaluated once locally and then exported into every block with a frozen value:
```sh
BUILD_ID=$(date +%s)

# @LOCAL
echo "$BUILD_ID"

# @REMOTE sui
echo "$BUILD_ID"
```

Both blocks receive the same `BUILD_ID`. Non-assignment prelude lines, such as `set -euo pipefail` and helper functions, are still prepended to each block.
Keep the prelude declarative. Avoid one-time side effects such as cd, rm, or deployment commands before the first marker.
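The freezing behavior can be approximated in plain shell: pick out the uppercase assignment lines, evaluate them once, and export the results. The `grep` pattern here is a rough stand-in for ShellFlow's real rule:

```shell
#!/bin/sh
# Simplified sketch of prelude handling: uppercase assignments are picked
# out and evaluated once; everything else is replayed in every block.
prelude='set -eu
BUILD_ID=release-42
echo "helper lines stay in every block"'

# Select the uppercase assignment lines (approximation of the rule).
frozen=$(printf '%s\n' "$prelude" | grep -E '^[A-Z][A-Z0-9_]*=')
echo "$frozen"          # prints BUILD_ID=release-42

# Evaluate once, then every later sub-shell sees the same frozen value.
eval "export $frozen"
sh -c 'echo "block sees $BUILD_ID"'
```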
## Tasks and macros

Use `# @TASK <name>` to label blocks and `--task` to run only that task:

```sh
# @TASK build
# @LOCAL
echo "build"

# @TASK deploy
# @REMOTE sui
echo "deploy"
```

```sh
shellflow run deploy.sh --task build
```

Use a single-line macro to define a task flow:
```sh
# @MACRO release build deploy smoke-test

# @TASK build
# @LOCAL
echo "build"

# @TASK deploy
# @REMOTE sui
echo "deploy"

# @TASK smoke-test
# @LOCAL
echo "smoke"
```

```sh
shellflow run deploy.sh --task release
```

Macros can also expand command snippets inside a block:
```sh
# @MACRO print_env
# env | sort
# @ENDMACRO

# @LOCAL
print_env
```

Helpers are reusable command snippets. They are expanded when a block contains only the helper name on a line:
```sh
# @HELPER backup_db
# pg_dump "$DATABASE_URL" > backup.sql
# @ENDHELPER

# @LOCAL
backup_db
```

## Hooks

Hooks run locally and share Shellflow's execution context:
- `PRE` - once before all main blocks.
- `BEFORE` - before each main block.
- `AFTER` - after each main block.
- `SUCCESS` - after all main blocks succeed.
- `ERROR` - after a hook or main block fails.
- `FINISHED` - at the end, whether the run succeeds or fails.

`POST` is accepted as an alias for `AFTER`; `FINALLY` is accepted as an alias for `FINISHED`.
```sh
# @HOOK PRE
# echo "prepare"
# @ENDHOOK

# @HOOK ERROR
# echo "rollback or collect diagnostics"
# @ENDHOOK

# @HOOK FINISHED
# echo "cleanup"
# @ENDHOOK
```

## Parallel groups

Mark each block that should join the parallel group:
```sh
# @PARALLEL web
# @REMOTE web-1
systemctl restart nginx

# @PARALLEL web
# @REMOTE web-2
systemctl restart nginx

# @LOCAL
echo "runs after the parallel group"
```

Run with:

```sh
shellflow run restart.sh --mode parallel
```

Without `--mode parallel`, blocks run sequentially even if annotated.
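Conceptually, a parallel group behaves like background jobs followed by a `wait`: all blocks in the group run concurrently, and the next sequential block starts only after the whole group finishes. This plain-shell sketch is an analogy, not ShellFlow's implementation; `restart_host` stands in for the real per-host block:

```shell
#!/bin/sh
# Analogy for a parallel group: concurrent jobs, then a barrier.
tmpdir=$(mktemp -d)
restart_host() {
  # Stand-in for a per-host block such as "systemctl restart nginx".
  echo done > "$tmpdir/$1"
}

restart_host web-1 &
restart_host web-2 &
wait                      # the group boundary: the next block starts only now
echo "runs after the parallel group"
```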
## Remote shells

Use `@SHELL` for remote servers with non-bash default shells.

Shellflow starts remote shells in login mode. For remote zsh and bash blocks, Shellflow also bootstraps `~/.zshrc` or `~/.bashrc` quietly before running your commands, so tools initialized there, such as mise, remain available in non-interactive automation even if the rc file exits non-zero.
```sh
#!/bin/bash

# @REMOTE zsh-server
# @SHELL zsh
# zsh-specific commands work here
reload
compdef

# @REMOTE bash-server
# Default bash shell is used
ls -la
```

Remote verbose tracing uses the shell's `DEBUG` trap and executes the block as one native script. That preserves multi-line Bash and zsh constructs such as `if`/`else`/`fi`, `for` loops, and function definitions.
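The `DEBUG`-trap technique itself can be demonstrated in a few lines of plain bash. This is a generic illustration of the mechanism, not ShellFlow's tracing code:

```shell
#!/bin/sh
# A DEBUG trap fires before each command while the script still runs as
# one native bash program, so multi-line constructs keep working.
trace_log=$(mktemp)
traced=$(mktemp)
cat > "$traced" <<'EOF'
trap 'printf "+ %s\n" "$BASH_COMMAND" >&2' DEBUG
for i in 1 2; do
  echo "step $i"
done
EOF
bash "$traced" 2> "$trace_log"   # stderr collects the per-command trace
cat "$trace_log"
```

Each `echo` inside the loop is reported just before it runs, without splitting the loop into separately executed lines.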
## Remote targets

You can define remote hosts inline:

```sh
# @SERVER sui
# host: 192.168.1.100
# user: deploy
# port: 22
# key: ~/.ssh/id_ed25519

# @REMOTE sui
hostname
```

Inline server definitions are useful for portable playbooks. The `host` field is required; `user`, `port`, and `key` are optional.
Example `~/.ssh/config` entry:

```
Host sui
  HostName 192.168.1.100
  User deploy
  Port 22
  IdentityFile ~/.ssh/id_ed25519
```

With that config, this block is valid:

```sh
# @REMOTE sui
hostname
```

This is intentional:

- Shellflow accepts configured SSH host aliases, not arbitrary comma-separated or free-form targets.
- Unknown remote targets fail early with a clear error before spawning `ssh`.
- You can override the default config path with `--ssh-config`.
## Block isolation

Each block runs in a fresh shell.

- Shell options from the prelude are copied into every block.
- Shell state like `cd`, shell variables, aliases, and `export` commands does not persist across blocks.
- Explicit context values are passed forward through environment variables.
Example:
```sh
# @LOCAL
echo "build-123"

# @LOCAL
echo "last output = $SHELLFLOW_LAST_OUTPUT"
```

Named exports are additive to `SHELLFLOW_LAST_OUTPUT`:
```sh
# @LOCAL
# @EXPORT VERSION=stdout
echo "2026.03.15"

# @REMOTE sui
echo "deploying $VERSION"
echo "last output = $SHELLFLOW_LAST_OUTPUT"
```

Lines before the first marker are treated as a shared prelude and prepended to every executable block:
```sh
#!/bin/bash
set -euo pipefail

# @LOCAL
echo "prelude is active"

# @REMOTE sui
echo "prelude is also active here"
```

Uppercase assignments in the prelude are special: Shellflow evaluates them once locally, freezes the values, and exports them into every block. That keeps release IDs, timestamps, and option-derived values stable across local and remote execution.
## Agent integration

Shellflow is designed to be the execution substrate for an outer agent, not an embedded planner.

- Use `--json` when you want one final machine-readable run report.
- Use `--jsonl` when you want ordered event records while the script runs.
- Use `--no-input` for CI or agent runs where interactive prompts must fail deterministically.
- Use `--dry-run` to preview planned execution without running commands.
- Use `--audit-log <path>` to mirror the structured event stream into a redacted JSONL file.
- Use `agent-run --json-input` when an agent already has the script body and option values in memory.
Recommended agent flow:
1. Generate or select a plain shell script with `@LOCAL` and `@REMOTE` markers.
2. Read `# @option` declarations and provide required values.
3. Add bounded directives only where needed: `@TIMEOUT`, `@RETRY`, and `@EXPORT`.
4. Run with `--json` or `--jsonl`.
5. Let the outer agent decide whether to retry, branch, or stop based on Shellflow's structured result.
Agent-run input:
```sh
shellflow agent-run --json-input '{
  "script": "# @option release-name=\n# @LOCAL\necho \"$RELEASE_NAME\"\n",
  "options": {"release-name": "v2026.05.01"},
  "dry_run": false
}'
```

Shellflow intentionally does not provide:
- Conditional directives such as `@IF stdout_contains=...`
- A workflow DSL or embedded ReAct loop
- Heuristic destructive-command detection
- Multi-host comma expansion inside one `@REMOTE` marker
Those decisions belong in the outer agent or automation layer.
Shellflow's structured modes are designed for LLM agent consumption:
- **Stable run and block identifiers**: JSON and JSONL output include `run_id`, one-based block indexes, and stable `block-N` identifiers.
- **Separated stdout and stderr**: Block reports keep stdout, stderr, combined output, exit code, failure kind, retries, timeouts, and exported values explicit.
- **Command-level remote tracing**: Remote verbose execution uses shell `DEBUG` traps to report the command about to run without breaking multi-line shell syntax.
- **Audit-safe exports**: Audit logs redact exported values whose names look secret-like, such as `TOKEN`, `SECRET`, or `PASSWORD`.
- **Bounded verbose output**: `--output-lines` limits verbose per-command log tails while preserving full block output in structured results.
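A name-based redaction check like the one described for audit logs can be sketched in a few lines of shell. The substring rule below (match on `TOKEN`, `SECRET`, or `PASSWORD` anywhere in the name) is an assumed simplification for illustration, not ShellFlow's exact heuristic:

```shell
#!/bin/sh
# Redact an exported value when its NAME looks secret-like.
redact() {
  name=$1
  value=$2
  case "$name" in
    *TOKEN*|*SECRET*|*PASSWORD*) echo "[redacted]" ;;
    *) echo "$value" ;;
  esac
}

a=$(redact API_TOKEN "abc123")      # secret-like name: value is hidden
b=$(redact VERSION "2026.03.15")    # ordinary name: value passes through
echo "$a $b"   # prints: [redacted] 2026.03.15
```

Keying the redaction on the variable name rather than the value means the audit log never needs to inspect, and therefore never needs to store, the secret itself.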
## CLI

```sh
shellflow run <script>
shellflow run <script> --verbose
shellflow run <script> --output-lines 50
shellflow run <script> --json
shellflow run <script> --jsonl
shellflow run <script> --no-input
shellflow run <script> --dry-run
shellflow run <script> --mode parallel
shellflow run <script> --task <task-or-macro>
shellflow run <script> --audit-log ./audit.jsonl --jsonl
shellflow run <script> --ssh-config ./ssh_config
shellflow run <script> --release-name v1 --branch main
shellflow agent-run --json-input '{"script":"# @LOCAL\necho hi\n"}'
shellflow doctor [script]
shellflow --version
```

Examples:

```sh
shellflow run playbooks/hello.sh
shellflow run playbooks/hello.sh -v
shellflow run playbooks/hello.sh --json
shellflow run playbooks/hello.sh --jsonl --no-input
shellflow run playbooks/hello.sh --dry-run --jsonl
shellflow run playbooks/hello.sh --audit-log ./audit.jsonl --jsonl
shellflow run playbooks/hello.sh --ssh-config ~/.ssh/config.work
shellflow run playbooks/deploy.sh --task release --mode parallel --json
shellflow doctor playbooks/deploy.sh --ssh-config ~/.ssh/config.work
```

Run `shellflow run --help`, `shellflow agent-run --help`, or `shellflow doctor --help` for the exact command options supported by the installed version.
## Development

Useful commands:

```sh
just sync
just test
just bdd
just test-all
just typecheck
just build
just publish
```

Direct verification commands:

```sh
uv run pytest -q
uv run behave features
uv run ruff check .
uv run ty check src tests
uv build
```

## Publishing

Shellflow supports both local publishing and GitHub Actions release publishing.

```sh
just publish
```

`uv publish` uses standard uv authentication mechanisms such as `UV_PUBLISH_TOKEN`, or PyPI trusted publishing when supported by the environment.
The repository includes:

- `.github/workflows/ci.yml` for lint, type-check, test, and build verification.
- `.github/workflows/release.yml` for publishing to PyPI when a tag like `v0.1.0` is pushed.

Recommended release flow:

```sh
git tag v0.1.0
git push origin v0.1.0
```

To use trusted publishing with PyPI:

1. Create a `pypi` environment in GitHub repository settings.
2. Add this repository as a trusted publisher in the PyPI project settings.
3. Push a `v*` tag.

The release workflow then runs verification, builds distributions with `uv build`, and uploads them with `uv publish`.
## Project layout

```
shellflow/
├── src/shellflow.py
├── src/advanced_modes.py
├── src/config.py
├── src/doctor.py
├── src/helpers.py
├── src/hooks.py
├── src/macros.py
├── src/variables.py
├── tests/
├── features/
├── playbooks/
├── pyproject.toml
├── Justfile
└── README.md
```
## License

Apache-2.0
