Updating a pipeline from within itself - is there any way to force newly added jobs to trigger only on HEAD rather than starting at HEAD-1?


#1

Hello!

We have pipelines that update themselves as the first step [1]:

jobs:
- name: update-pipeline
  plan:
  - task: update-pipeline
    config:
      ...
      run:
        path: /bin/sh
        args:
        - -c
        - |-
          set -e
          fly -t ${CONCOURSE_TEAM} login -c ${CONCOURSE_HOST} -n ${CONCOURSE_TEAM} -u ${CONCOURSE_USERNAME} -p ${CONCOURSE_PASSWORD}
          fly -t ${CONCOURSE_TEAM} set-pipeline -p halfpipe-update-test -c pipeline.yml -n

If I add a new job, it will first be run against HEAD-1 before running against HEAD.

I.e.:
If I start at commit 0 with a pipeline like update-pipeline -> a -> b
At commit 1 I commit something that has nothing to do with the pipeline.yml file
At commit 2 I change the pipeline.yml file to add a new job, so the pipeline looks like update-pipeline -> a -> b -> c

I will get the execution order:

  1. update-pipeline (commit 2)
  2. a (commit 2)
  3. c (commit 1)
  4. b (commit 2)
  5. c (commit 2)

Here is a video that hopefully explains it better!

It makes sense that c first runs with commit 1, since it depends on b and they share the same resource, but can I somehow force Concourse to run c with commit 2 and skip commit 1?
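
(For context, job c presumably looks something like the sketch below; the resource name repo is my assumption. The scheduler triggers c with the latest version that has passed b, which at the moment c is added is still commit 1; commit 2 only becomes eligible once b finishes with it.)

- name: c
  plan:
  # trigger: true with passed: [b] means c runs with the newest
  # version of the resource that has already made it through b,
  # including versions that predate this job being added.
  - get: repo
    trigger: true
    passed: [b]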

[1] We generate Concourse pipelines for a lot of teams/projects, and want to make sure they are always up to date.


#2

I wonder if the update pipeline task could pin the git resource version on all the other jobs?
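
If the generating task knows which commit it is running from, one way to do that might be to template a pinned version into every get step when rendering pipeline.yml. A hedged sketch (repo and the pinned_ref var are hypothetical names; the version: field on a get step is standard Concourse config):

jobs:
- name: c
  plan:
  - get: repo
    trigger: true
    passed: [b]
    # Pin this get to the exact commit the update-pipeline task
    # ran against, so a newly added job never starts at HEAD-1.
    version: {ref: ((pinned_ref))}

The update task would then supply the value at set-pipeline time, e.g. fly set-pipeline ... -v pinned_ref=$(git rev-parse HEAD), which seems workable here because the pipeline is re-set on every commit anyway.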


#3

I noticed when running fly set-pipeline that the pipeline can be changed while jobs are running, and that (at least in the UI) jobs then appear to be scheduled in the wrong order. Is this a limitation of the UI, or is the UI correctly depicting what Concourse is doing?

For example, GoCD makes a distinction between a pipeline definition and specific executions of that definition. Pipeline definitions are immutable: uploading a changed definition does not affect currently running executions, which continue to use the previous definition. Only new executions use the new definition.

I can’t work out from the documentation if Concourse works the same way or not.

As one can imagine, the GoCD model makes it much easier to reason about the effect of updating a pipeline definition while builds are running.


#4

I noticed the same thing (the pipeline can be changed while jobs are running), and because of this a build can fail.

Regarding the original question (the auto-updating pipeline): what I do instead is use a separate pipeline with the https://github.com/concourse/concourse-pipeline-resource and live with the fact that the watched pipelines can fail due to a race. Since changes to the pipeline happen rarely, once you understand that the race is possible you can recognize it as the reason for a failure and retrigger. The first few times it happened it left us very confused. See https://github.com/concourse/concourse/issues/1200.
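
Roughly, the separate "meta" pipeline looks like the sketch below (a minimal example based on the resource's documented usage; the target URL, credentials, and the repo resource are placeholders):

resource_types:
- name: concourse-pipeline
  type: docker-image
  source:
    repository: concourse/concourse-pipeline-resource

resources:
- name: pipelines
  type: concourse-pipeline
  source:
    target: https://concourse.example.com
    teams:
    - name: main
      username: ((concourse_username))
      password: ((concourse_password))
- name: repo
  type: git
  source:
    uri: https://github.com/example/repo.git

jobs:
- name: set-pipelines
  plan:
  - get: repo
    trigger: true
  # The put sets (or updates) the watched pipeline from the config
  # file in the repo, instead of the pipeline updating itself.
  - put: pipelines
    params:
      pipelines:
      - name: halfpipe-update-test
        team: main
        config_file: repo/pipeline.yml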

I think that GoCD has a better story in this regard, not only because changes are atomic, as @npryce-springer mentions, but also because GoCD lets you navigate the history of changes to a pipeline, so it is easier to troubleshoot whether a failed build is due to a changed pipeline or to changed source code. This is a feature I would like Concourse to get; see https://github.com/concourse/concourse/issues/1448.