buildkite-agent pipeline

The Buildkite Agent's pipeline command allows you to add and replace build steps in the running build. The steps are defined using YAML or JSON and can be read from a file or streamed from the output of a script.

See the Defining your pipeline steps guide for a step-by-step example and list of step types.

Uploading pipelines

Processing of a single pipeline file

The buildkite-agent pipeline upload command only processes a single pipeline file. If multiple files are passed to the command (including by using a wildcard * in the filename), only the first pipeline file is processed; any additional pipeline files provided as arguments are ignored. See Uploading multiple pipeline files for more information.

Usage

buildkite-agent pipeline upload [file] [options...]

Description

Allows you to change the pipeline of a running build by uploading either a YAML (recommended) or JSON configuration file. If no configuration file is provided, the command looks for the file in the following locations:

  • buildkite.yml
  • buildkite.yaml
  • buildkite.json
  • .buildkite/pipeline.yml
  • .buildkite/pipeline.yaml
  • .buildkite/pipeline.json
  • buildkite/pipeline.yml
  • buildkite/pipeline.yaml
  • buildkite/pipeline.json

You can also pipe build pipelines to the command allowing you to create scripts that generate dynamic pipelines. The configuration file has a limit of 500 steps per file. Configuration files with over 500 steps must be split into multiple files and uploaded in separate steps.

Example

$ buildkite-agent pipeline upload
$ buildkite-agent pipeline upload my-custom-pipeline.yml
$ ./script/dynamic_step_generator | buildkite-agent pipeline upload
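
The dynamic_step_generator script in the last example can be any program that writes pipeline YAML to stdout. The following is a minimal sketch only; the services/ directory layout and test script it assumes are purely illustrative:

#!/usr/bin/env bash
# script/dynamic_step_generator (illustrative sketch): emits one command step
# per directory under services/ and writes the pipeline YAML to stdout.
set -euo pipefail

echo "steps:"
for service in services/*/; do
  name="$(basename "$service")"
  echo "  - label: \"Test ${name}\""
  echo "    command: \"cd ${service} && ./run-tests.sh\""
done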

Options

--no-color #

Don't show colors in logging
Environment variable: $BUILDKITE_AGENT_NO_COLOR

--debug #

Enable debug mode. Synonym for `--log-level debug`. Takes precedence over `--log-level`
Environment variable: $BUILDKITE_AGENT_DEBUG

--log-level value #

Set the log level for the agent, making logging more or less verbose. Defaults to notice. Allowed values are: debug, info, error, warn, fatal (default: "notice")
Environment variable: $BUILDKITE_AGENT_LOG_LEVEL

--experiment value #

Enable experimental features within the buildkite-agent
Environment variable: $BUILDKITE_AGENT_EXPERIMENT

--profile value #

Enable a profiling mode, either cpu, memory, mutex or block
Environment variable: $BUILDKITE_AGENT_PROFILE

--agent-access-token value #

The access token used to identify the agent
Environment variable: $BUILDKITE_AGENT_ACCESS_TOKEN

--endpoint value #

The Agent API endpoint (default: "https://agent.buildkite.com/v3")
Environment variable: $BUILDKITE_AGENT_ENDPOINT

--no-http2 #

Disable HTTP2 when communicating with the Agent API.
Environment variable: $BUILDKITE_NO_HTTP2

--debug-http #

Enable HTTP debug mode, which dumps all request and response bodies to the log
Environment variable: $BUILDKITE_AGENT_DEBUG_HTTP

--trace-http #

Enable HTTP trace mode, which logs timings for each HTTP request. Timings are logged at the debug level unless a request fails at the network level in which case they are logged at the error level
Environment variable: $BUILDKITE_AGENT_TRACE_HTTP

--replace #

Replace the rest of the existing pipeline with the steps uploaded. Jobs that are already running are not removed.
Environment variable: $BUILDKITE_PIPELINE_REPLACE

--job value #

The job that is making the changes to its build
Environment variable: $BUILDKITE_JOB_ID

--dry-run #

Rather than uploading the pipeline, it will be echoed to stdout
Environment variable: $BUILDKITE_PIPELINE_UPLOAD_DRY_RUN

--format value #

In dry-run mode, specifies the form to output the pipeline in. Must be one of: json, yaml (default: "json")
Environment variable: $BUILDKITE_PIPELINE_UPLOAD_DRY_RUN_FORMAT

--no-interpolation #

Skip variable interpolation into the pipeline prior to upload
Environment variable: $BUILDKITE_PIPELINE_NO_INTERPOLATION

--reject-secrets #

When true, fail the pipeline upload early if the pipeline contains secrets
Environment variable: $BUILDKITE_AGENT_PIPELINE_UPLOAD_REJECT_SECRETS

--apply-if-changed #

When enabled, steps containing an `if_changed` key are evaluated against the git diff. If the `if_changed` glob pattern matches no files changed in the build, the step is skipped.
Environment variable: $BUILDKITE_AGENT_APPLY_SKIP_IF_UNCHANGED

--git-diff-base value #

Provides the base from which to find the git diff when processing `if_changed`, e.g. origin/main. If not provided, it uses the first valid value of {origin/$BUILDKITE_PULL_REQUEST_BASE_BRANCH, origin/$BUILDKITE_PIPELINE_DEFAULT_BRANCH, origin/main}.
Environment variable: $BUILDKITE_PULL_REQUEST_BASE_BRANCH

--jwks-file value #

Path to a file containing a JWKS. Passing this flag enables pipeline signing
Environment variable: $BUILDKITE_AGENT_JWKS_FILE

--jwks-key-id value #

The JWKS key ID to use when signing the pipeline. Required when using a JWKS
Environment variable: $BUILDKITE_AGENT_JWKS_KEY_ID

--signing-aws-kms-key value #

The AWS KMS key identifier which is used to sign pipelines.
Environment variable: $BUILDKITE_AGENT_AWS_KMS_KEY

--debug-signing #

Enable debug logging for pipeline signing. This can potentially leak secrets to the logs as it prints each step in full before signing. Requires debug logging to be enabled
Environment variable: $BUILDKITE_AGENT_DEBUG_SIGNING

--redacted-vars value #

Pattern of environment variable names containing sensitive values (default: "*_PASSWORD", "*_SECRET", "*_TOKEN", "*_PRIVATE_KEY", "*_ACCESS_KEY", "*_SECRET_KEY", "*_CONNECTION_STRING")
Environment variable: $BUILDKITE_REDACTED_VARS

Pipeline format

The pipeline can be written as YAML or JSON, but YAML is more common for its readability. There are three top-level properties you can specify:

  • The agents attribute - a map of agent characteristics such as os or queue that restrict what agents the command will run on.
  • The env attribute - a map of environment variables to apply to all steps.
  • The steps attribute - an array of build pipeline steps.
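
For example, a minimal pipeline using all three top-level properties might look like this (the queue name, environment variable, and script path are placeholders):

agents:
  queue: "default"

env:
  NODE_ENV: "test"

steps:
  - label: ":hammer: Build"
    command: "scripts/build.sh"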

Insertion order

Steps are inserted immediately following the job performing the pipeline upload. Note that if you perform multiple uploads from a single step, they can appear to be in reverse order, because the later uploads are inserted earlier in the pipeline.
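
For example, if a single step runs the following two uploads (the file names are illustrative), the steps from pipeline-b.yml end up ahead of the steps from pipeline-a.yml, because each upload is inserted immediately after the uploading job:

buildkite-agent pipeline upload pipeline-a.yml
buildkite-agent pipeline upload pipeline-b.yml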

Environment variable substitution

The pipeline upload command supports environment variable substitution using the syntax $VAR and ${VAR}.

For example, the following pipeline substitutes a number of Buildkite's default environment variables into a trigger step:

- trigger: "app-deploy"
  label: ":rocket: Deploy"
  branches: "main"
  async: true
  build:
    message: "${BUILDKITE_MESSAGE}"
    commit: "${BUILDKITE_COMMIT}"
    branch: "${BUILDKITE_BRANCH}"

If you want an environment variable to be evaluated at runtime (for example, using the step's environment variables), ensure you escape the $ character using $$ or \$. For example:

- command: "deploy.sh $$SERVER"
  env:
    SERVER: "server-a"

Escaping the $ character

If you need to prevent substitution, you can escape the $ character by using $$ or \$.

For example, using $$USD and \$USD will both result in the same value: $USD.
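
For example, both of the following steps upload the same command, echo 'Price is $USD' (the echoed text is purely illustrative):

- command: echo 'Price is $$USD'
- command: echo 'Price is \$USD'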

Disabling interpolation

You can disable interpolation with the --no-interpolation flag, which was added in v3.1.1.
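
For example, to upload a pipeline file without expanding any $VAR references it contains:

buildkite-agent pipeline upload .buildkite/pipeline.yml --no-interpolation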

Requiring environment variables

You can set required environment variables using the syntax ${VAR?}. If VAR is not set, the pipeline upload command will print an error and exit with a status of 1.

For example, the following step will cause the pipeline upload to error if the SERVER environment variable has not been set:

- command: "deploy.sh \"${SERVER?}\""

You can set a custom error message after the ? character. For example, the following prints the error message SERVER: is not set. Please specify a server if the environment variable has not been set:

- command: "deploy.sh \"${SERVER?is not set. Please specify a server}\""

Default, blank, and missing values

If an environment variable has not been set it will evaluate to a blank string. You can set a fallback value using the syntax ${VAR:-default-value}.

For example, the following step will run the command deploy.sh staging:

- command: "deploy.sh \"${SERVER:-staging}\""

Environment Variables    Syntax                   Result
(not set)                "${SERVER:-staging}"     "staging"
SERVER=""                "${SERVER:-staging}"     "staging"
SERVER="staging-5"       "${SERVER:-staging}"     "staging-5"

If you need to substitute environment variables containing empty strings, you can use the syntax ${VAR-default-value} (notice the missing :).

Environment Variables    Syntax                   Result
(not set)                "${SERVER-staging}"      "staging"
SERVER=""                "${SERVER-staging}"      ""
SERVER="staging-5"       "${SERVER-staging}"      "staging-5"

Extracting character ranges

You can substitute a subset of characters from an environment variable by specifying a start and end range using the syntax ${VAR:start:end}.

For example, the following step will echo the first 7 characters of the BUILDKITE_COMMIT environment variable:

- command: "echo \"Short commit is: ${BUILDKITE_COMMIT:0:7}\""

If the environment variable has not been set, the range will return a blank string.

Uploading multiple pipelines

While the buildkite-agent pipeline upload command can only process a single file at a time, there are several approaches for uploading multiple pipeline files.

Multiple sequential uploads

You can call buildkite-agent pipeline upload multiple times within the same step to upload multiple pipeline files:

buildkite-agent pipeline upload .buildkite/pipeline1.yml
buildkite-agent pipeline upload .buildkite/pipeline2.yml

Pass multiple files to command

Using the find and xargs commands, you can run the buildkite-agent pipeline upload command once for each matching pipeline file:

find .buildkite/ -type f -iname '*.yaml' -print0 | xargs -0 -n1 buildkite-agent pipeline upload

Combine multiple pipeline files

Since the buildkite-agent pipeline upload command also accepts pipeline YAML on STDIN, you can concatenate the contents of multiple pipeline files and have the combined output processed as a single upload.

Processing of multiple pipeline files

When passing multiple pipeline files into the pipeline upload command, include a --- on the first line of each pipeline file to indicate the beginning of each new pipeline YAML file. This is required to ensure the buildkite-agent is able to correctly process multiple files that have been combined into a single input stream.

Using the following three example pipeline files:

pipeline-start.yml
---
steps:
  - label: "Start of the build"
    command: ./scripts/build-start.sh

pipeline-middle.yml
---
steps:
  - label: "Middle of the build"
    command: ./scripts/build-middle.sh

pipeline-end.yml
---
steps:
  - label: "End of the build"
    command: ./scripts/build-end.sh

Pass the contents of all the pipeline files matching the wildcard * file pattern into the pipeline upload command:

cat .buildkite/pipeline-*.yml | buildkite-agent pipeline upload

Alternatively, you can explicitly list each pipeline file to be passed into the pipeline upload command:

cat .buildkite/pipeline-start.yml .buildkite/pipeline-middle.yml .buildkite/pipeline-end.yml | buildkite-agent pipeline upload

Troubleshooting

Here are some common issues that can occur when uploading a pipeline.

Common errors

Pipeline uploads can be rejected if certain criteria are not met. Here are explanations for why your pipeline upload might be rejected.

Error: The key "duplicate-key-name" has already been used by another step in this build
Reason: This error occurs when you try to upload a pipeline step with a key attribute that matches the key attribute of an existing step in the pipeline. key attributes must be unique for all steps in a build. To resolve this error, either remove the duplicate key or change it to a unique value.

Error: You can only change the pipeline of a running build
Reason: This error occurs when you attempt to upload a pipeline to a build that has already finished. This typically happens when using the --job option with the upload command. To resolve this, ensure the build is still running before uploading, or start a new build.
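
For the duplicate key error above, a sketch of the fix is to give each step its own key (the key names, labels, and commands here are illustrative):

steps:
  - label: "Linux tests"
    key: "test-linux"
    command: ./scripts/test.sh
  - label: "macOS tests"
    key: "test-macos"
    command: ./scripts/test.sh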