Dynamic pipelines
When your source code projects are built with Buildkite Pipelines, you can write scripts, in the same language as your source code or any other suitable language, that generate new Buildkite pipeline steps (in either YAML or JSON format), which you can then upload to the same pipeline using the pipeline upload step. These dynamically generated steps run on the same Buildkite agent, as part of the same pipeline build, and appear as their own steps in your pipeline builds. This gives you the flexibility to structure your pipelines however you require.
For example, the following code snippet is an executable shell script that generates a test step for each subdirectory of the test/ directory within your repository:
#!/bin/bash
# exit immediately on failure, or if an undefined variable is used
set -eu
# begin the pipeline.yml file
echo "steps:"
# add a new command step to run the tests in each test directory
for test_dir in test/*/; do
echo " - command: \"run_tests "${test_dir}"\""
done
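Assuming the repository contains test directories named test/integration/ and test/unit/ (hypothetical names), the script above would emit a pipeline like:

```yaml
steps:
  - command: "run_tests test/integration/"
  - command: "run_tests test/unit/"
```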
To use this script, save it to the .buildkite/ directory inside your repository (that is, .buildkite/pipeline.sh), ensure the script file is executable, and then update your pipeline upload step to use the new script:
.buildkite/pipeline.sh | buildkite-agent pipeline upload
When the pipeline's build commences, this step executes the script and pipes the output to the buildkite-agent pipeline upload command. The upload command then inserts the steps from the script into the build immediately after this upload step.
Any running job in a build can call buildkite-agent pipeline upload to add new steps, multiple jobs can call it within the same build, and a single job can call it more than once. Buildkite Pipelines applies default service quotas of 500 jobs per upload, 500 uploads per build, and 4,000 jobs per build. Learn more in Pipelines limits.
Step ordering in the Buildkite interface
If you run the pipeline upload step multiple times in a single command step (for example, by running a script file from a command step, in which the script runs the pipeline upload step multiple times), then each batch of uploaded steps appears in reverse order in the Buildkite interface, such as the Pipeline view (in the sidebar) or Table view of the new build page, as well as the Jobs view of the classic build page. This happens because the upload command inserts its steps immediately after the upload step.
To avoid each of your dynamically generated pipeline upload steps appearing in reverse order, define each of these upload steps in reverse order; that is, the steps being run as part of an upload step that you want to run first should be defined last. Alternatively, you can define explicit dependencies using the depends_on field.
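For example, a sketch using depends_on to make one batch of uploaded steps wait for another (the step key and script names here are hypothetical):

```yaml
steps:
  - command: "generate-first-batch.sh | buildkite-agent pipeline upload"
    key: "first-batch"
  - command: "generate-second-batch.sh | buildkite-agent pipeline upload"
    depends_on: "first-batch"
```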
In the following pipeline.yml example, when the build runs, it executes the .buildkite/pipeline.sh script, and the test steps generated by the script are added to the build before the wait step and the final command step. After the test steps have run, the wait step and the command step run.
steps:
- command: .buildkite/pipeline.sh | buildkite-agent pipeline upload
label: ":pipeline: Upload"
- wait
- command: "other-script.sh"
label: "Run other operations"
Dynamic pipeline templates
If you need to source pipelines from a central catalog, or enforce certain configuration rules, you can either use dynamic pipelines with the pipeline upload command, or write custom plugins and share them across your organization.
To use dynamic pipelines and the pipeline upload command, you'd make a pipeline that looks something like this:
steps:
- command: enforce-rules.sh | buildkite-agent pipeline upload
label: ":pipeline: Upload"
Each team defines their steps in team-steps.yml. Your templating logic is in enforce-rules.sh, which can be written in any language that can pass YAML to the pipeline upload.
In enforce-rules.sh you can add steps to the YAML, require certain versions of dependencies or plugins, or implement any other logic you can program. Depending on your use case, you might want to get enforce-rules.sh from an external catalog instead of committing it to the team repository.
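As a minimal sketch of this idea, the following enforce-rules.sh prepends a mandatory step to each team's pipeline. The team-steps.yml layout, the step names, and the sample content it creates for demonstration are all assumptions, not a prescribed structure:

```shell
#!/bin/bash
set -eu

# For demonstration only: create a sample team-steps.yml like one a team
# might commit (hypothetical content), if it does not already exist.
if [ ! -f team-steps.yml ]; then
  cat > team-steps.yml <<'YAML'
steps:
  - command: "team-test.sh"
    label: "Team tests"
YAML
fi

# Emit a mandatory step that every pipeline must run, then append the
# team's steps, dropping their top-level "steps:" key so the result is
# a single valid pipeline document.
cat <<'YAML'
steps:
  - command: "run-lint.sh"
    label: "Mandatory lint"
YAML
grep -v '^steps:' team-steps.yml
```

Piping this script's output to buildkite-agent pipeline upload uploads the combined pipeline, with the enforced step always first.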
See how Hasura.io used dynamic templates and pipelines to replace their YAML configuration with Go and some shell scripts.
When to use dynamic pipelines
Buildkite Pipelines supports several approaches to varying what runs in a build, ranging from fully static configuration to fully dynamic step generation. The right approach depends on how much the steps need to change from one build to the next.
- Static YAML: Use when the pipeline runs the same steps every time, with if attribute expressions for variation (for example, based on branch, tag, or pull request state). This is the simplest approach and requires no scripting.
- Conditional step execution with if_changed: Use when steps should only run when relevant files change. Add glob patterns to a step definition, and the Buildkite agent compares them against the Git diff at upload time, marking unmatched steps as skipped in the build. See Using if_changed for agent version requirements and supported syntax.
- Dynamic generation: Use when the steps themselves need to be constructed at build time. A generator script runs as a build step, inspects whatever context it needs (such as file changes, dependency graphs, API responses, or shared configuration), and uploads the appropriate steps for that specific build. This is the most flexible approach.
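For example, a static step can be restricted to builds on the main branch with an if expression (the command name here is hypothetical):

```yaml
steps:
  - command: "deploy.sh"
    label: "Deploy"
    if: build.branch == "main"
```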
Buildkite SDK
Learn more about the Buildkite SDK, which makes it easy to script the generation of steps for dynamic pipelines, on the Buildkite SDK page.