dub
REMOVE ME: Reduce test matrix & other debugging
Passed in 25m 6s

Build
vibe-d/vibe.d+examples
vibe-d/vibe.d+tests
ldc-developers/ldc
vibe-d/vibe.d+base
dlang/phobos
dlang/phobos+no-autodecode
sociomantic-tsunami/ocean
sociomantic-tsunami/swarm
sociomantic-tsunami/turtle
dlang/dub
vibe-d/vibe-core+epoll
vibe-d/vibe-core+select
higgsjs/Higgs
rejectedsoftware/ddox
BlackEdder/ggplotd
dlang-community/D-Scanner
dlang-tour/core
d-widget-toolkit/dwt
rejectedsoftware/diet-ng
mbierlee/poodinis
dlang/tools
atilaneves/unit-threaded
gecko0307/dagon
dlang-community/DCD
CyberShadow/ae
jmdavis/dxml
jacob-carlborg/dstep
libmir/mir-algorithm
dlang-community/D-YAML
libmir/mir-random
dlang-community/libdparse
aliak00/optional
dlang-community/dfmt
Abscissa/libInputVisitor
atilaneves/automem
AuburnSounds/intel-intrinsics
DerelictOrg/DerelictFT
DerelictOrg/DerelictGL3
DerelictOrg/DerelictGLFW3
DerelictOrg/DerelictSDL2
dlang-community/containers
dlang/undeaD
DlangScience/scid
ikod/dlang-requests
symmetryinvestments/autowrap
symmetryinvestments/concurrency
symmetryinvestments/excel-d
symmetryinvestments/ldapauth
kaleidicassociates/lubeck
symmetryinvestments/xlsxreader
lgvz/imageformats
libmir/mir
libmir/mir-core
libmir/mir-cpuid
libmir/mir-optim
msoucy/dproto
Netflix/vectorflow
nomad-software/dunit
pbackus/sumtype
PhilippeSigaud/Pegged
repeatedly/mustache-d
s-ludwig/std_data_json
s-ludwig/taggedalgebraic
snazzy-d/sdc
funkwerk-mobility/serialized
funkwerk-mobility/mocked
andrey-zherikov/argparse
Cluster tests 2 failed, main history:
- Unknown error in test-refresh-mv-warmup:
Docker compose failed: docker compose -f/dev/fd/3 --project-directory /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-3a6173ea/materialize/test/test/cluster exec -T testdrive testdrive --kafka-addr=kafka:9092 --schema-registry-url=http://schema-registry:8081 --materialize-url=postgres://materialize@materialized:6875 --materialize-internal-url=postgres://materialize@materialized:6877 --aws-endpoint=http://minio:9000 --var=aws-endpoint=http://minio:9000 --aws-access-key-id=minioadmin --var=aws-access-key-id=minioadmin --aws-secret-access-key=minioadmin --var=aws-secret-access-key=minioadmin --no-reset --default-timeout=360s --persist-blob-url=file:///mzdata/persist/blob --persist-consensus-url=postgres://root@materialized:26257?options=--search_path=consensus --source=/var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-3a6173ea/materialize/test/test/cluster/mzcompose.py:3838
^^^ +++
+++ !!! Error Report
1 errors were encountered during execution
source: /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-3a6173ea/materialize/test/test/cluster/mzcompose.py:3838
- Unknown error in test-storage-controller-metrics:
builtins.AssertionError: got 2.0
Test details & reproducer
Functional tests which require separate clusterd containers (instead of the usual clusterd included in the materialized container).
BUILDKITE_PARALLEL_JOB=1 BUILDKITE_PARALLEL_JOB_COUNT=4 bin/mzcompose --find cluster run default
Restart test failed, main history:
- Unknown error in bound-size-mz-status-history:
Docker compose failed: docker compose -f/dev/fd/3 --project-directory /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-e637fc46/materialize/test/test/restart exec -T testdrive_no_reset testdrive --kafka-addr=kafka:9092 --schema-registry-url=http://schema-registry:8081 --materialize-url=postgres://materialize@materialized:6875 --materialize-internal-url=postgres://materialize@materialized:6877 --aws-endpoint=http://minio:9000 --var=aws-endpoint=http://minio:9000 --aws-access-key-id=minioadmin --var=aws-access-key-id=minioadmin --aws-secret-access-key=minioadmin --var=aws-secret-access-key=minioadmin --no-reset --default-timeout=360s --persist-blob-url=file:///mzdata/persist/blob --persist-consensus-url=postgres://root@materialized:26257?options=--search_path=consensus --source=/var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-e637fc46/materialize/test/test/restart/mzcompose.py:645
^^^ +++
+++ !!! Error Report
1 errors were encountered during execution
source: /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-e637fc46/materialize/test/test/restart/mzcompose.py:645
Test details & reproducer
Testdrive-based tests involving restarting materialized (including its clusterd processes). See cluster tests for separate clusterds, see platform-checks for further restart scenarios.
bin/mzcompose --find restart run default
Cluster tests 3 failed, main history:
- Unknown error in test-workload-class-in-metrics:
psycopg.errors.FeatureNotSupported: log source reads must target a replica
DETAIL: The query references the following log sources: mz_dataflow_operators_per_worker
HINT: Use `SET cluster_replica = <replica-name>` to target a specific replica in the active cluster. Note that subsequent queries will only be answered by the selected replica, which might reduce availability. To undo the replica selection, use `RESET cluster_replica`.
Test details & reproducer
Functional tests which require separate clusterd containers (instead of the usual clusterd included in the materialized container).
BUILDKITE_PARALLEL_JOB=2 BUILDKITE_PARALLEL_JOB_COUNT=4 bin/mzcompose --find cluster run default
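The HINT in the failure above describes the required fix pattern: pin the session to a specific replica before reading per-replica log sources, then unpin. A minimal sketch (the replica name `r1` is an assumption, and the log source is left unqualified exactly as it appears in the error message):

```sql
-- Pin the session to one replica so log source reads are allowed.
-- `r1` is an assumed replica name; substitute a replica of the active cluster.
SET cluster_replica = r1;

SELECT count(*) FROM mz_dataflow_operators_per_worker;

-- Undo the pin; subsequent queries are again answered by any replica.
RESET cluster_replica;
```

While the pin is in effect, only the selected replica answers queries, which is why the HINT warns about reduced availability.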
Cluster tests 1 failed, main history:
- Unknown error in test-replica-metrics:
builtins.AssertionError: unexpected create_instance count: 1.0
Test details & reproducer
Functional tests which require separate clusterd containers (instead of the usual clusterd included in the materialized container).
BUILDKITE_PARALLEL_JOB=0 BUILDKITE_PARALLEL_JOB_COUNT=4 bin/mzcompose --find cluster run default
Testdrive 4 failed, main history:
- Unknown error in session.td:
session.td:18:1: non-matching rows: expected:
[["DateStyle", "ISO, MDY", "Sets the display format for date and time values (PostgreSQL)."], ["IntervalStyle", "postgres", "Sets the display format for interval values (PostgreSQL)."], ["TimeZone", "UTC", "Sets the time zone for displaying and interpreting time stamps (PostgreSQL)."], ["allowed_cluster_replica_sizes", "", "The allowed sizes when creating a new cluster replica (Materialize)."], ["application_name", "", "Sets the application name to be reported in statistics and logs (PostgreSQL)."], ["auto_route_catalog_queries", "on", "Whether to force queries that depend only on system tables, to run on the mz_catalog_server cluster (Materialize)."], ["client_encoding", "UTF8", "Sets the client's character set encoding (PostgreSQL)."], ["client_min_messages", "notice", "Sets the message levels that are sent to the client (PostgreSQL)."], ["cluster", "<VARIES>", "Sets the current cluster (Materialize)."], ["cluster_replica", "", "Sets a target cluster replica for SELECT queries (Materialize)."], ["current_object_missing_warnings", "on", "Whether to emit warnings when the current database, schema, or cluster is missing (Materialize)."], ["database", "materialize", "Sets the current database (CockroachDB)."], ["emit_introspection_query_notice", "on", "Whether to print a notice when querying per-replica introspection sources."], ["emit_plan_insights_notice", "off", "Boolean flag indicating whether to send a NOTICE with JSON-formatted plan insights before executing a SELECT statement (Materialize)."], ["emit_timestamp_notice", "off", "Boolean flag indicating whether to send a NOTICE with timestamp explanations of queries (Materialize)."], ["emit_trace_id_notice", "off", "Boolean flag indicating whether to send a NOTICE specifying the trace id when available (Materialize)."], ["enable_consolidate_after_union_negate", "on", "consolidation after Unions that have a Negated input (Materialize)."], ["enable_rbac_checks", "on", "User facing global boolean flag indicating 
whether to apply RBAC checks before executing statements (Materialize)."], ["enable_reduce_reduction", "on", "split complex reductions in to simpler ones and a join (Materialize)."], ["enable_session_rbac_checks", "off", "User facing session boolean flag indicating whether to apply RBAC checks before executing statements (Materialize)."], ["extra_float_digits", "3", "Adjusts the number of digits displayed for floating-point values (PostgreSQL)."], ["failpoints", "<omitted>", "Allows failpoints to be dynamically activated."], ["force_source_table_syntax", "off", "Force use of new source model (CREATE TABLE .. FROM SOURCE) and migrate existing sources"], ["idle_in_transaction_session_timeout", "2 min", "Sets the maximum allowed duration that a session can sit idle in a transaction before being terminated. If this value is specified without units, it is taken as milliseconds. A value of zero disables the timeout (PostgreSQL)."], ["integer_datetimes", "on", "Reports whether the server uses 64-bit-integer dates and times (PostgreSQL)."], ["is_superuser", "off", "Reports whether the current session is a superuser (PostgreSQL)."], ["max_aws_privatelink_connections", "0", "The maximum number of AWS PrivateLink connections in the region, across all schemas (Materialize)."], ["max_clusters", "10", "The maximum number of clusters in the region (Materialize)."], ["max_connections", "5000", "The maximum number of concurrent connections (PostgreSQL)."], ["max_continual_tasks", "100", "The maximum number of continual tasks in the region, across all schemas (Materialize)."], ["max_copy_from_size", "1073741824", "The maximum size in bytes we buffer for COPY FROM statements (Materialize)."], ["max_credit_consumption_rate", "1024", "The maximum rate of credit consumption in a region. 
Credits are consumed based on the size of cluster replicas in use (Materialize)."], ["max_databases", "1000", "The maximum number of databases in the region (Materialize)."], ["max_identifier_length", "255", "The maximum length of object identifiers in bytes (PostgreSQL)."], ["max_kafka_connections", "1000", "The maximum number of Kafka connections in the region, across all schemas (Materialize)."], ["max_materialized_views", "100", "The maximum number of materialized views in the region, across all schemas (Materialize)."], ["max_mysql_connections", "1000", "The maximum number of MySQL connections in the region, across all schemas (Materialize)."], ["max_network_policies", "25", "The maximum number of network policies in the region."], ["max_objects_per_schema", "1000", "The maximum number of objects in a schema (Materialize)."], ["max_postgres_connections", "1000", "The maximum number of PostgreSQL connections in the region, across all schemas (Materialize)."], ["max_query_result_size", "1GB", "The maximum size in bytes for a single query's result (Materialize)."], ["max_replicas_per_cluster", "5", "The maximum number of replicas of a single cluster (Materialize)."], ["max_result_size", "1GB", "The maximum size in bytes for an internal query result (Materialize)."], ["max_roles", "1000", "The maximum number of roles in the region (Materialize)."], ["max_rules_per_network_policy", "25", "The maximum number of rules per network policies."], ["max_schemas_per_database", "1000", "The maximum number of schemas in a database (Materialize)."], ["max_secrets", "100", "The maximum number of secrets in the region, across all schemas (Materialize)."], ["max_sinks", "25", "The maximum number of sinks in the region, across all schemas (Materialize)."], ["max_sources", "200", "The maximum number of sources in the region, across all schemas (Materialize)."], ["max_sql_server_connections", "1000", "The maximum number of SQL Server connections in the region, across all schemas 
(Materialize)."], ["max_tables", "200", "The maximum number of tables in the region, across all schemas (Materialize)."], ["mz_version", "<VARIES>", "Shows the Materialize server version (Materialize)."], ["network_policy", "default", "Sets the fallback network policy applied to all users without an explicit policy."], ["optimizer_e2e_latency_warning_threshold", "500 ms", "Sets the duration that a query can take to compile; queries that take longer will trigger a warning. If this value is specified without units, it is taken as milliseconds. A value of zero disables the timeout (Materialize)."], ["real_time_recency", "off", "Feature flag indicating whether real time recency is enabled (Materialize)."], ["real_time_recency_timeout", "10 s", "Sets the maximum allowed duration of SELECTs that actively use real-time recency, i.e. reach out to an external system to determine their most recencly exposed data (Materialize)."], ["search_path", "public", "Sets the schema search order for names that are not schema-qualified (PostgreSQL)."], ["server_version", "9.5.0", "Shows the PostgreSQL compatible server version (PostgreSQL)."], ["server_version_num", "90500", "Shows the PostgreSQL compatible server version as an integer (PostgreSQL)."], ["sql_safe_updates", "off", "Prohibits SQL statements that may be overly destructive (CockroachDB)."], ["standard_conforming_strings", "on", "Causes '...' strings to treat backslashes literally (PostgreSQL)."], ["statement_logging_default_sample_rate", "0.01", "The default value of `statement_logging_sample_rate` for new sessions (Materialize)."], ["statement_logging_max_sample_rate", "0.01", "The maximum rate at which statements may be logged. 
If this value is less than that of `statement_logging_sample_rate`, the latter is ignored (Materialize)."], ["statement_logging_sample_rate", "0.01", "User-facing session variable indicating how many statement executions should be logged, subject to constraint by the system variable `statement_logging_max_sample_rate` (Materialize)."], ["statement_timeout", "1 min", "Sets the maximum allowed duration of INSERT...SELECT, UPDATE, and DELETE operations. If this value is specified without units, it is taken as milliseconds."], ["superuser_reserved_connections", "3", "The number of connections that are reserved for superusers (PostgreSQL)."], ["transaction_isolation", "strict serializable", "Sets the current transaction's isolation level (PostgreSQL)."], ["unsafe_new_transaction_wall_time", "", "Sets the wall time for all new explicit or implicit transactions to control the value of `now()`. If not set, uses the system's clock."], ["welcome_message", "on", "Whether to send a notice with a welcome message after a successful connection (Materialize)."]]
got:
[["DateStyle", "ISO, MDY", "Sets the display format for date and time values (PostgreSQL)."], ["IntervalStyle", "postgres", "Sets the display format for interval values (PostgreSQL)."], ["TimeZone", "UTC", "Sets the time zone for displaying and interpreting time stamps (PostgreSQL)."], ["allowed_cluster_replica_sizes", "", "The allowed sizes when creating a new cluster replica (Materialize)."], ["application_name", "", "Sets the application name to be reported in statistics and logs (PostgreSQL)."], ["auto_route_catalog_queries", "on", "Whether to force queries that depend only on system tables, to run on the mz_catalog_server cluster (Materialize)."], ["client_encoding", "UTF8", "Sets the client's character set encoding (PostgreSQL)."], ["client_min_messages", "notice", "Sets the message levels that are sent to the client (PostgreSQL)."], ["cluster", "<VARIES>", "Sets the current cluster (Materialize)."], ["cluster_replica", "", "Sets a target cluster replica for SELECT queries (Materialize)."], ["current_object_missing_warnings", "on", "Whether to emit warnings when the current database, schema, or cluster is missing (Materialize)."], ["database", "materialize", "Sets the current database (CockroachDB)."], ["default_cluster_replication_factor", "1", "Default cluster replication factor (Materialize)."], ["emit_introspection_query_notice", "on", "Whether to print a notice wh [...]
Test details & reproducer
Testdrive is the basic framework and language for defining product tests under the expected-result/actual-result (aka golden testing) paradigm. A query is retried until it produces the desired result.
BUILDKITE_PARALLEL_JOB=3 BUILDKITE_PARALLEL_JOB_COUNT=8 bin/mzcompose --find testdrive run default
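The retry-until-match loop described above can be illustrated with a hypothetical .td fragment: a line starting with `>` issues a query, and the indented-free lines that follow are the expected rows. Testdrive re-runs the query until the actual rows match the expected ones or the default timeout elapses (the table `t` and its contents are invented for this sketch):

```
> CREATE TABLE t (name text)

> INSERT INTO t VALUES ('apple'), ('banana')

> SELECT name FROM t ORDER BY name
apple
banana
```

This is what makes failures like session.td above "non-matching rows" errors: the query kept returning output different from the golden block until the timeout was reached.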
Checks + restart of environmentd & storage clusterd 3 failed, main history:
- Unknown error in workflow-default:
Docker compose failed: docker compose -f/dev/fd/3 --project-directory /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-07b1ac9a/materialize/test/test/platform-checks exec -T testdrive testdrive --kafka-addr=kafka:9092 --schema-registry-url=http://schema-registry:8081 --materialize-url=postgres://materialize@materialized:6875 --materialize-internal-url=postgres://materialize@materialized:6877 --aws-endpoint=http://minio:9000 --var=aws-endpoint=http://minio:9000 --aws-access-key-id=minioadmin --var=aws-access-key-id=minioadmin --aws-secret-access-key=minioadmin --var=aws-secret-access-key=minioadmin --no-reset --materialize-param=statement_timeout='300s' --default-timeout=300s --seed=1 --persist-blob-url=s3://minioadmin:minioadmin@persist/persist?endpoint=http://minio:9000/&region=minio --persist-consensus-url=postgres://root@materialized:26257?options=--search_path=consensus --var=replicas=1 --var=default-replica-size=4-4 --var=default-storage-size=4-1 --source=/var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-07b1ac9a/materialize/test/misc/python/materialize/checks/all_checks/cluster.py:37
^^^ +++
+++ !!! Error Report
1 errors were encountered during execution
source: /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-07b1ac9a/materialize/test/misc/python/materialize/checks/all_checks/cluster.py:37
Test details & reproducer
Write a single set of .td fragments for a particular feature or functionality and then have Zippy execute them in upgrade, 0dt-upgrade, restart, recovery and failure contexts.
BUILDKITE_PARALLEL_JOB=2 BUILDKITE_PARALLEL_JOB_COUNT=6 bin/mzcompose --find platform-checks run default --scenario=RestartEnvironmentdClusterdStorage --seed=0196383b-f390-40b8-bc2b-5d7ced8599a8
Checks without restart or upgrade 3 failed, main history:
- Unknown error in workflow-default:
Docker compose failed: docker compose -f/dev/fd/4 --project-directory /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-8cf661c5/materialize/test/test/platform-checks exec -T testdrive testdrive --kafka-addr=kafka:9092 --schema-registry-url=http://schema-registry:8081 --materialize-url=postgres://materialize@materialized:6875 --materialize-internal-url=postgres://materialize@materialized:6877 --aws-endpoint=http://minio:9000 --var=aws-endpoint=http://minio:9000 --aws-access-key-id=minioadmin --var=aws-access-key-id=minioadmin --aws-secret-access-key=minioadmin --var=aws-secret-access-key=minioadmin --no-reset --materialize-param=statement_timeout='300s' --default-timeout=300s --seed=1 --persist-blob-url=s3://minioadmin:minioadmin@persist/persist?endpoint=http://minio:9000/&region=minio --persist-consensus-url=postgres://root@materialized:26257?options=--search_path=consensus --var=replicas=1 --var=default-replica-size=4-4 --var=default-storage-size=4-1 --source=/var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-8cf661c5/materialize/test/misc/python/materialize/checks/all_checks/cluster.py:37
^^^ +++
+++ !!! Error Report
1 errors were encountered during execution
source: /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-8cf661c5/materialize/test/misc/python/materialize/checks/all_checks/cluster.py:37
Test details & reproducer
Write a single set of .td fragments for a particular feature or functionality and then have Zippy execute them in upgrade, 0dt-upgrade, restart, recovery and failure contexts.
BUILDKITE_PARALLEL_JOB=2 BUILDKITE_PARALLEL_JOB_COUNT=6 bin/mzcompose --find platform-checks run default --scenario=NoRestartNoUpgrade --seed=0196383b-f390-40b8-bc2b-5d7ced8599a8
Checks + restart of environmentd & storage clusterd 5 failed, main history:
- Unknown error in workflow-default:
Docker compose failed: docker compose -f/dev/fd/4 --project-directory /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-957ddb9d/materialize/test/test/platform-checks exec -T testdrive testdrive --kafka-addr=kafka:9092 --schema-registry-url=http://schema-registry:8081 --materialize-url=postgres://materialize@materialized:6875 --materialize-internal-url=postgres://materialize@materialized:6877 --aws-endpoint=http://minio:9000 --var=aws-endpoint=http://minio:9000 --aws-access-key-id=minioadmin --var=aws-access-key-id=minioadmin --aws-secret-access-key=minioadmin --var=aws-secret-access-key=minioadmin --no-reset --materialize-param=statement_timeout='300s' --default-timeout=300s --seed=1 --persist-blob-url=s3://minioadmin:minioadmin@persist/persist?endpoint=http://minio:9000/&region=minio --persist-consensus-url=postgres://root@materialized:26257?options=--search_path=consensus --var=replicas=1 --var=default-replica-size=4-4 --var=default-storage-size=4-1 --source=/var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-957ddb9d/materialize/test/misc/python/materialize/checks/all_checks/webhook.py:140
^^^ +++
+++ !!! Error Report
1 errors were encountered during execution
source: /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-957ddb9d/materialize/test/misc/python/materialize/checks/all_checks/webhook.py:140
Test details & reproducer
Write a single set of .td fragments for a particular feature or functionality and then have Zippy execute them in upgrade, 0dt-upgrade, restart, recovery and failure contexts.
BUILDKITE_PARALLEL_JOB=4 BUILDKITE_PARALLEL_JOB_COUNT=6 bin/mzcompose --find platform-checks run default --scenario=RestartEnvironmentdClusterdStorage --seed=0196383b-f390-40b8-bc2b-5d7ced8599a8
Checks without restart or upgrade 5 failed, main history:
- Unknown error in workflow-default:
Docker compose failed: docker compose -f/dev/fd/4 --project-directory /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-473e37d1/materialize/test/test/platform-checks exec -T testdrive testdrive --kafka-addr=kafka:9092 --schema-registry-url=http://schema-registry:8081 --materialize-url=postgres://materialize@materialized:6875 --materialize-internal-url=postgres://materialize@materialized:6877 --aws-endpoint=http://minio:9000 --var=aws-endpoint=http://minio:9000 --aws-access-key-id=minioadmin --var=aws-access-key-id=minioadmin --aws-secret-access-key=minioadmin --var=aws-secret-access-key=minioadmin --no-reset --materialize-param=statement_timeout='300s' --default-timeout=300s --seed=1 --persist-blob-url=s3://minioadmin:minioadmin@persist/persist?endpoint=http://minio:9000/&region=minio --persist-consensus-url=postgres://root@materialized:26257?options=--search_path=consensus --var=replicas=1 --var=default-replica-size=4-4 --var=default-storage-size=4-1 --source=/var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-473e37d1/materialize/test/misc/python/materialize/checks/all_checks/webhook.py:140
^^^ +++
+++ !!! Error Report
1 errors were encountered during execution
source: /var/lib/buildkite-agent/builds/hetzner-aarch64-16cpu-32gb-473e37d1/materialize/test/misc/python/materialize/checks/all_checks/webhook.py:140
Test details & reproducer
Write a single set of .td fragments for a particular feature or functionality and then have Zippy execute them in upgrade, 0dt-upgrade, restart, recovery and failure contexts.
BUILDKITE_PARALLEL_JOB=4 BUILDKITE_PARALLEL_JOB_COUNT=6 bin/mzcompose --find platform-checks run default --scenario=NoRestartNoUpgrade --seed=0196383b-f390-40b8-bc2b-5d7ced8599a8
Fast SQL logic tests 2 failed, main history:
- Unknown error in test/sqllogictest/cluster.slt:
OutputFailure:test/sqllogictest/cluster.slt:454
expected: Values(["mz_catalog_server", "r1", "2", "mz_probe", "r1", "2", "mz_system", "r1", "2", "quickstart", "r1", "2", "quickstart", "size_1", "1"])
actually: Values(["mz_catalog_server", "r1", "2", "mz_probe", "r1", "2", "mz_system", "r1", "2", "quickstart", "r1", "2", "quickstart", "r2", "2", "quickstart", "size_1", "1"])
actual raw: [Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }]
OutputFailure:test/sqllogictest/cluster.slt:466
expected: Values(["foo", "size_1", "1", "foo", "size_2", "2", "mz_catalog_server", "r1", "2", "mz_probe", "r1", "2", "mz_system", "r1", "2", "quickstart", "r1", "2", "quickstart", "size_1", "1"])
actually: Values(["foo", "size_1", "1", "foo", "size_2", "2", "mz_catalog_server", "r1", "2", "mz_probe", "r1", "2", "mz_system", "r1", "2", "quickstart", "r1", "2", "quickstart", "r2", "2", "quickstart", "size_1", "1"])
actual raw: [Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }]
OutputFailure:test/sqllogictest/cluster.slt:495
expected: Values(["mz_catalog_server", "r1", "2", "mz_probe", "r1", "2", "mz_system", "r1", "2", "quickstart", "r1", "2"])
actually: Values(["mz_catalog_server", "r1", "2", "mz_probe", "r1", "2", "mz_system", "r1", "2", "quickstart", "r1", "2", "quickstart", "r2", "2"])
actual raw: [Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }, Row { columns: [Column { name: "cluster", table_oid: None, column_id: None, type: Text }, Column { name: "replica", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }] }]
OutputFailure:test/sqllogictest/cluster.slt:671
expected: Values(["r1", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "r1", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "r1", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "r1", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "size_1", "1", "1", "18446744073709000000", "18446744073709551615", "1", "1", "size_1_8g", "1-8G", "1", "18446744073709000000", "8589934592", "1", "1", "size_2_2", "2-2", "2", "18446744073709000000", "18446744073709551615", "2", "2", "size_32", "32", "1", "18446744073709000000", "18446744073709551615", "32", "1"])
actually: Values(["r1", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "r1", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "r1", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "r1", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "r2", "2", "1", "18446744073709000000", "18446744073709551615", "2", "1", "size_1", "1", "1", "18446744073709000000", "18446744073709551615", "1", "1", "size_1_8g", "1-8G", "1", "18446744073709000000", "8589934592", "1", "1", "size_2_2", "2-2", "2", "18446744073709000000", "18446744073709551615", "2", "2", "size_32", "32", "1", "18446744073709000000", "18446744073709551615", "32", "1"])
actual raw: [Row { columns: [Column { name: "name", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }, Column { name: "processes", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "cpu_nano_cores", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "memory_bytes", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "workers", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "credits_per_hour", table_oid: None, column_id: None, type: Numeric }] }, Row { columns: [Column { name: "name", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }, Column { name: "processes", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "cpu_nano_cores", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "memory_bytes", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "workers", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "credits_per_hour", table_oid: None, column_id: None, type: Numeric }] }, Row { columns: [Column { name: "name", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }, Column { name: "processes", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, 
schema: "mz_catalog" }) }, Column { name: "cpu_nano_cores", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "memory_bytes", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "workers", table_oid: None, column_id: None, type: Other(Other { name: "uint8", oid: 16464, kind: Simple, schema: "mz_catalog" }) }, Column { name: "credits_per_hour", table_oid: None, column_id: None, type: Numeric }] }, Row { columns: [Column { name: "name", table_oid: None, column_id: None, type: Text }, Column { name: "size", table_oid: None, column_id: None, type: Text }, Column { name: "processes", table_oid: None, column_id: None, type: Ot [...]
- Unknown error in test/sqllogictest/transform/normalize_lets.slt:
Bail:test/sqllogictest/transform/normalize_lets.slt:492 PlanFailure:test/sqllogictest/transform/normalize_lets.slt:492:
db error: ERROR: log source reads must target a replica
DETAIL: The query references the following log sources:
mz_scheduling_elapsed_raw
mz_compute_import_frontiers_per_worker
mz_dataflow_operators_per_worker
HINT: Use `SET cluster_replica = <replica-name>` to target a specific replica in the active cluster. Note that subsequent queries will only be answered by the selected replica, which might reduce availability. To undo the replica selection, use `RESET cluster_replica`.
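The hint above amounts to a session-level workaround. A minimal sketch, assuming a replica named `r1` exists in the active cluster (`r1` is a placeholder, not taken from this log):

```sql
-- Target one replica so introspection (log) sources can be read.
-- 'r1' is a hypothetical replica name; pick one from SHOW CLUSTER REPLICAS.
SET cluster_replica = r1;

-- Queries over log sources such as mz_scheduling_elapsed_raw now succeed,
-- but subsequent queries are answered only by the selected replica.

RESET cluster_replica;  -- undo the replica selection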
Test details & reproducer
Run SQL tests using an instance of Mz that is embedded in the sqllogictest binary itself. Good for basic SQL tests, but it can't interact with sources like MySQL/Kafka; see Testdrive for that.

BUILDKITE_PARALLEL_JOB=1 BUILDKITE_PARALLEL_JOB_COUNT=6 bin/mzcompose --find sqllogictest run fast-tests

Waited 6s
Ran in 5s
Build
echo "--- Print environment"
uname -a
git --version
make --version
${SHELL} --version || true
c++ --version
ld -v
! command -v gdb &>/dev/null || gdb --version
! dmd --version # ensure that no dmd is in the current environment
echo "--- Load CI folder"
# make sure the entire CI folder is loaded
if [ ! -d buildkite ] ; then
  mkdir -p buildkite
  pushd buildkite
  wget https://github.com/dlang/ci/archive/master.tar.gz
  tar xvfz master.tar.gz --strip-components=2 ci-master/buildkite
  rm -rf master.tar.gz
  popd
fi
echo "--- Merging with the upstream target branch"
./buildkite/merge_head.sh
./buildkite/build_distribution.sh
Waited 3s
Ran in 1m 28s
vibe-d/vibe.d+examples# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/vibe-d/vibe.d" && export REPO_DIR="vibe-d-vibe.d+examples" && export REPO_FULL_NAME="vibe-d/vibe.d+examples" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 10s
Ran in 7m 36s
vibe-d/vibe.d+tests# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/vibe-d/vibe.d" && export REPO_DIR="vibe-d-vibe.d+tests" && export REPO_FULL_NAME="vibe-d/vibe.d+tests" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6s
Ran in 5m 5s
ldc-developers/ldc# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/ldc-developers/ldc" && export REPO_DIR="ldc-developers-ldc" && export REPO_FULL_NAME="ldc-developers/ldc" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5s
Ran in 11m 31s
vibe-d/vibe.d+base# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/vibe-d/vibe.d" && export REPO_DIR="vibe-d-vibe.d+base" && export REPO_FULL_NAME="vibe-d/vibe.d+base" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6s
Ran in 5m 18s
dlang/phobos# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang/phobos" && export REPO_DIR="dlang-phobos" && export REPO_FULL_NAME="dlang/phobos" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4s
Ran in 11m 23s
dlang/phobos+no-autodecode# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang/phobos" && export REPO_DIR="dlang-phobos+no-autodecode" && export REPO_FULL_NAME="dlang/phobos+no-autodecode" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 3s
Ran in 1m 9s
sociomantic-tsunami/ocean# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/sociomantic-tsunami/ocean" && export REPO_DIR="sociomantic-tsunami-ocean" && export REPO_FULL_NAME="sociomantic-tsunami/ocean" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7s
Ran in 20m 24s
sociomantic-tsunami/swarm# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/sociomantic-tsunami/swarm" && export REPO_DIR="sociomantic-tsunami-swarm" && export REPO_FULL_NAME="sociomantic-tsunami/swarm" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 2s
Ran in 1m 12s
sociomantic-tsunami/turtle# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/sociomantic-tsunami/turtle" && export REPO_DIR="sociomantic-tsunami-turtle" && export REPO_FULL_NAME="sociomantic-tsunami/turtle" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 3s
Ran in 2m 25s
dlang/dub# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang/dub" && export REPO_DIR="dlang-dub" && export REPO_FULL_NAME="dlang/dub" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6s
Ran in 52s
vibe-d/vibe-core+epoll# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/vibe-d/vibe-core" && export REPO_DIR="vibe-d-vibe-core+epoll" && export REPO_FULL_NAME="vibe-d/vibe-core+epoll" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 8s
Ran in 3m 42s
vibe-d/vibe-core+select# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/vibe-d/vibe-core" && export REPO_DIR="vibe-d-vibe-core+select" && export REPO_FULL_NAME="vibe-d/vibe-core+select" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6s
Ran in 3m 39s
higgsjs/Higgs# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/higgsjs/Higgs" && export REPO_DIR="higgsjs-Higgs" && export REPO_FULL_NAME="higgsjs/Higgs" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 1m 8s
Ran in 1m 4s
rejectedsoftware/ddox# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/rejectedsoftware/ddox" && export REPO_DIR="rejectedsoftware-ddox" && export REPO_FULL_NAME="rejectedsoftware/ddox" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 1m 17s
Ran in 5m 25s
BlackEdder/ggplotd# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/BlackEdder/ggplotd" && export REPO_DIR="BlackEdder-ggplotd" && export REPO_FULL_NAME="BlackEdder/ggplotd" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 1m 24s
Ran in 1m 32s
dlang-community/D-Scanner# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang-community/D-Scanner" && export REPO_DIR="dlang-community-D-Scanner" && export REPO_FULL_NAME="dlang-community/D-Scanner" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 2m 18s
Ran in 21m 13s
dlang-tour/core# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang-tour/core" && export REPO_DIR="dlang-tour-core" && export REPO_FULL_NAME="dlang-tour/core" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 2m 37s
Ran in 1m 27s
d-widget-toolkit/dwt# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/d-widget-toolkit/dwt" && export REPO_DIR="d-widget-toolkit-dwt" && export REPO_FULL_NAME="d-widget-toolkit/dwt" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 3m 4s
Ran in 17s
rejectedsoftware/diet-ng# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/rejectedsoftware/diet-ng" && export REPO_DIR="rejectedsoftware-diet-ng" && export REPO_FULL_NAME="rejectedsoftware/diet-ng" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 3m 29s
Ran in 8s
mbierlee/poodinis# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/mbierlee/poodinis" && export REPO_DIR="mbierlee-poodinis" && export REPO_FULL_NAME="mbierlee/poodinis" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 3m 45s
Ran in 7s
dlang/tools# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang/tools" && export REPO_DIR="dlang-tools" && export REPO_FULL_NAME="dlang/tools" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4m 0s
Ran in 36s
atilaneves/unit-threaded# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/atilaneves/unit-threaded" && export REPO_DIR="atilaneves-unit-threaded" && export REPO_FULL_NAME="atilaneves/unit-threaded" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 3m 57s
Ran in 22s
gecko0307/dagon# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/gecko0307/dagon" && export REPO_DIR="gecko0307-dagon" && export REPO_FULL_NAME="gecko0307/dagon" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4m 0s
Ran in 17s
dlang-community/DCD# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang-community/DCD" && export REPO_DIR="dlang-community-DCD" && export REPO_FULL_NAME="dlang-community/DCD" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4m 14s
Ran in 26s
CyberShadow/ae# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/CyberShadow/ae" && export REPO_DIR="CyberShadow-ae" && export REPO_FULL_NAME="CyberShadow/ae" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4m 26s
Ran in 16s
jmdavis/dxml# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/jmdavis/dxml" && export REPO_DIR="jmdavis-dxml" && export REPO_FULL_NAME="jmdavis/dxml" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4m 29s
Ran in 47s
jacob-carlborg/dstep# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/jacob-carlborg/dstep" && export REPO_DIR="jacob-carlborg-dstep" && export REPO_FULL_NAME="jacob-carlborg/dstep" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4m 55s
Ran in 34s
libmir/mir-algorithm# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/libmir/mir-algorithm" && export REPO_DIR="libmir-mir-algorithm" && export REPO_FULL_NAME="libmir/mir-algorithm" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4m 46s
Ran in 46s
dlang-community/D-YAML# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang-community/D-YAML" && export REPO_DIR="dlang-community-D-YAML" && export REPO_FULL_NAME="dlang-community/D-YAML" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 4m 50s
Ran in 29s
libmir/mir-random# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/libmir/mir-random" && export REPO_DIR="libmir-mir-random" && export REPO_FULL_NAME="libmir/mir-random" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 12s
Ran in 25s
dlang-community/libdparse# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang-community/libdparse" && export REPO_DIR="dlang-community-libdparse" && export REPO_FULL_NAME="dlang-community/libdparse" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 27s
Ran in 15s
aliak00/optional# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/aliak00/optional" && export REPO_DIR="aliak00-optional" && export REPO_FULL_NAME="aliak00/optional" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 27s
Ran in 18s
dlang-community/dfmt# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang-community/dfmt" && export REPO_DIR="dlang-community-dfmt" && export REPO_FULL_NAME="dlang-community/dfmt" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 25s
Ran in 30s
Abscissa/libInputVisitor# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/Abscissa/libInputVisitor" && export REPO_DIR="Abscissa-libInputVisitor" && export REPO_FULL_NAME="Abscissa/libInputVisitor" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 42s
Ran in 6s
atilaneves/automem# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/atilaneves/automem" && export REPO_DIR="atilaneves-automem" && export REPO_FULL_NAME="atilaneves/automem" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 38s
Ran in 16s
AuburnSounds/intel-intrinsics# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/AuburnSounds/intel-intrinsics" && export REPO_DIR="AuburnSounds-intel-intrinsics" && export REPO_FULL_NAME="AuburnSounds/intel-intrinsics" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 38s
Ran in 17s
DerelictOrg/DerelictFT# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/DerelictOrg/DerelictFT" && export REPO_DIR="DerelictOrg-DerelictFT" && export REPO_FULL_NAME="DerelictOrg/DerelictFT" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 48s
Ran in 7s
DerelictOrg/DerelictGL3# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/DerelictOrg/DerelictGL3" && export REPO_DIR="DerelictOrg-DerelictGL3" && export REPO_FULL_NAME="DerelictOrg/DerelictGL3" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 55s
Ran in 7s
DerelictOrg/DerelictGLFW3# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/DerelictOrg/DerelictGLFW3" && export REPO_DIR="DerelictOrg-DerelictGLFW3" && export REPO_FULL_NAME="DerelictOrg/DerelictGLFW3" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 3s
Ran in 7s
DerelictOrg/DerelictSDL2# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/DerelictOrg/DerelictSDL2" && export REPO_DIR="DerelictOrg-DerelictSDL2" && export REPO_FULL_NAME="DerelictOrg/DerelictSDL2" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 2s
Ran in 7s
dlang-community/containers# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang-community/containers" && export REPO_DIR="dlang-community-containers" && export REPO_FULL_NAME="dlang-community/containers" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 0s
Ran in 7s
dlang/undeaD# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/dlang/undeaD" && export REPO_DIR="dlang-undeaD" && export REPO_FULL_NAME="dlang/undeaD" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 3s
Ran in 7s
DlangScience/scid# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/DlangScience/scid" && export REPO_DIR="DlangScience-scid" && export REPO_FULL_NAME="DlangScience/scid" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 5m 57s
Ran in 12s
ikod/dlang-requests# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/ikod/dlang-requests" && export REPO_DIR="ikod-dlang-requests" && export REPO_FULL_NAME="ikod/dlang-requests" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 12s
Ran in 13s
symmetryinvestments/autowrap# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/symmetryinvestments/autowrap" && export REPO_DIR="symmetryinvestments-autowrap" && export REPO_FULL_NAME="symmetryinvestments/autowrap" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 8s
Ran in 2m 24s
symmetryinvestments/concurrency# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/symmetryinvestments/concurrency" && export REPO_DIR="symmetryinvestments-concurrency" && export REPO_FULL_NAME="symmetryinvestments/concurrency" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 10s
Ran in 40s
symmetryinvestments/excel-d# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/symmetryinvestments/excel-d" && export REPO_DIR="symmetryinvestments-excel-d" && export REPO_FULL_NAME="symmetryinvestments/excel-d" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 15s
Ran in 21s
symmetryinvestments/ldapauth# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/symmetryinvestments/ldapauth" && export REPO_DIR="symmetryinvestments-ldapauth" && export REPO_FULL_NAME="symmetryinvestments/ldapauth" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 17s
Ran in 7s
kaleidicassociates/lubeck# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/kaleidicassociates/lubeck" && export REPO_DIR="kaleidicassociates-lubeck" && export REPO_FULL_NAME="kaleidicassociates/lubeck" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 30s
Ran in 28s
symmetryinvestments/xlsxreader# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/symmetryinvestments/xlsxreader" && export REPO_DIR="symmetryinvestments-xlsxreader" && export REPO_FULL_NAME="symmetryinvestments/xlsxreader" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 34s
Ran in 12s
lgvz/imageformats# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/lgvz/imageformats" && export REPO_DIR="lgvz-imageformats" && export REPO_FULL_NAME="lgvz/imageformats" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 32s
Ran in 6s
libmir/mir
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/libmir/mir" && export REPO_DIR="libmir-mir" && export REPO_FULL_NAME="libmir/mir" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 41s
Ran in 15s
libmir/mir-core
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/libmir/mir-core" && export REPO_DIR="libmir-mir-core" && export REPO_FULL_NAME="libmir/mir-core" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 47s
Ran in 13s
libmir/mir-cpuid
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/libmir/mir-cpuid" && export REPO_DIR="libmir-mir-cpuid" && export REPO_FULL_NAME="libmir/mir-cpuid" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 44s
Ran in 9s
libmir/mir-optim
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/libmir/mir-optim" && export REPO_DIR="libmir-mir-optim" && export REPO_FULL_NAME="libmir/mir-optim" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 52s
Ran in 18s
msoucy/dproto
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/msoucy/dproto" && export REPO_DIR="msoucy-dproto" && export REPO_FULL_NAME="msoucy/dproto" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 51s
Ran in 14s
Netflix/vectorflow
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/Netflix/vectorflow" && export REPO_DIR="Netflix-vectorflow" && export REPO_FULL_NAME="Netflix/vectorflow" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 6m 54s
Ran in 11s
nomad-software/dunit
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/nomad-software/dunit" && export REPO_DIR="nomad-software-dunit" && export REPO_FULL_NAME="nomad-software/dunit" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 3s
Ran in 8s
pbackus/sumtype
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/pbackus/sumtype" && export REPO_DIR="pbackus-sumtype" && export REPO_FULL_NAME="pbackus/sumtype" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 10s
Ran in 8s
PhilippeSigaud/Pegged
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/PhilippeSigaud/Pegged" && export REPO_DIR="PhilippeSigaud-Pegged" && export REPO_FULL_NAME="PhilippeSigaud/Pegged" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 7s
Ran in 3m 51s
repeatedly/mustache-d
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/repeatedly/mustache-d" && export REPO_DIR="repeatedly-mustache-d" && export REPO_FULL_NAME="repeatedly/mustache-d" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 5s
Ran in 10s
s-ludwig/std_data_json
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/s-ludwig/std_data_json" && export REPO_DIR="s-ludwig-std_data_json" && export REPO_FULL_NAME="s-ludwig/std_data_json" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 6s
Ran in 10s
s-ludwig/taggedalgebraic
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/s-ludwig/taggedalgebraic" && export REPO_DIR="s-ludwig-taggedalgebraic" && export REPO_FULL_NAME="s-ludwig/taggedalgebraic" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 19s
Ran in 10s
snazzy-d/sdc
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/snazzy-d/sdc" && export REPO_DIR="snazzy-d-sdc" && export REPO_FULL_NAME="snazzy-d/sdc" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 18s
Ran in 31s
funkwerk-mobility/serialized
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/funkwerk-mobility/serialized" && export REPO_DIR="funkwerk-mobility-serialized" && export REPO_FULL_NAME="funkwerk-mobility/serialized" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 16s
Ran in 1m 10s
funkwerk-mobility/mocked
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/funkwerk-mobility/mocked" && export REPO_DIR="funkwerk-mobility-mocked" && export REPO_FULL_NAME="funkwerk-mobility/mocked" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 22s
Ran in 44s
andrey-zherikov/argparse
# don't build everything from the root folder && rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build && export REPO_URL="https://github.com/andrey-zherikov/argparse" && export REPO_DIR="andrey-zherikov-argparse" && export REPO_FULL_NAME="andrey-zherikov/argparse" && echo "--- Load distribution archive" && buildkite-agent artifact download distribution.tar.xz . && tar xfJ distribution.tar.xz && rm -rf buildkite && mv distribution/buildkite buildkite && rm distribution.tar.xz && ./buildkite/build_project.sh
Waited 7m 38s
Ran in 24s
Total Job Run Time: 2h 7m
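Every per-project step above runs the same command, varying only the repository coordinates (REPO_URL, REPO_DIR, REPO_FULL_NAME). As a minimal sketch, the step could be factored into a parameterized function; the function name `run_project_build` is hypothetical (the real pipeline inlines the command for each repository), but the individual commands are taken verbatim from the log:

```shell
#!/bin/sh
# Hypothetical helper mirroring the repeated per-project Buildkite step.
run_project_build() {
    repo_full_name="$1"                                  # e.g. "libmir/mir-core"
    REPO_URL="https://github.com/${repo_full_name}"
    REPO_DIR="$(printf '%s' "$repo_full_name" | tr '/' '-')"   # owner/name -> owner-name
    REPO_FULL_NAME="$repo_full_name"
    export REPO_URL REPO_DIR REPO_FULL_NAME

    # don't build everything from the root folder
    rm -rf buildkite-ci-build && mkdir buildkite-ci-build && cd buildkite-ci-build || return 1

    echo "--- Load distribution archive"
    buildkite-agent artifact download distribution.tar.xz . \
        && tar xfJ distribution.tar.xz \
        && rm -rf buildkite \
        && mv distribution/buildkite buildkite \
        && rm distribution.tar.xz \
        && ./buildkite/build_project.sh
}
```

Note the REPO_DIR derivation: every step in the log uses the owner/name pair with the slash replaced by a hyphen (e.g. `libmir/mir-core` becomes `libmir-mir-core`), which the `tr '/' '-'` above reproduces.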