Updating a pipeline from within itself: is there any way to force newly added jobs to trigger only on HEAD rather than on HEAD-1?


#1

Hello!

We have pipelines that update themselves as the first step [1]:

jobs:
- name: update-pipeline
  plan:
  - task: update-pipeline
    config:
      ...
      run:
        path: /bin/sh
        args:
        - -c
        - |-
          set -e
          fly -t ${CONCOURSE_TEAM} login -c ${CONCOURSE_HOST} -n ${CONCOURSE_TEAM} -u ${CONCOURSE_USERNAME} -p ${CONCOURSE_PASSWORD}
          fly -t ${CONCOURSE_TEAM} set-pipeline -p halfpipe-update-test -c pipeline.yml -n

If I add a new job, it will first run with HEAD-1 before running with HEAD.

I.e.:
I start at commit 0 with a pipeline like update pipeline -> a -> b.
At commit 1 I commit something that has nothing to do with the pipeline.yml file.
At commit 2 I change the pipeline.yml file to add a new job, so the pipeline looks like update pipeline -> a -> b -> c.

I will get the execution order

  1. Update pipeline(commit 2)
  2. a(commit 2)
  3. c(commit 1)
  4. b(commit 2)
  5. c(commit 2)
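
For reference, a minimal sketch of what such a pipeline.yml might look like at commit 2 (resource name, URI, and job names are assumed here, not taken from the real pipeline). The key detail is the `passed:` chain: commit 1 already passed b under the old pipeline, so as soon as c appears it sees commit 1 as a valid candidate and builds it before commit 2:

```yaml
resources:
- name: repo
  type: git
  source:
    uri: https://github.com/example/repo.git   # assumed URI

jobs:
- name: update-pipeline
  plan:
  - get: repo
    trigger: true
  # ... task that runs fly set-pipeline, as above ...
- name: a
  plan:
  - get: repo
    trigger: true
    passed: [update-pipeline]
- name: b
  plan:
  - get: repo
    trigger: true
    passed: [a]
- name: c          # newly added at commit 2
  plan:
  - get: repo
    trigger: true
    passed: [b]
```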

Here is a video that hopefully explains it better!

It makes sense that c first runs commit 1, since it has a `passed:` dependency on b and they share the same resource. But can I somehow force Concourse to run c with commit 2 and skip commit 1?

[1] We generate Concourse pipelines for a lot of teams/projects, and want to make sure they are always up to date.


Is Concourse's aim to eliminate snowflaking just a myth?
#2

I wonder if the update pipeline task could pin the git resource version on all the other jobs?
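
One way that idea could be sketched (assuming the update task templates the generated pipeline.yml before running set-pipeline; the `((pinned_ref))` var name is hypothetical): stamp the commit SHA the update task is currently running against into the `version:` field of every get step, which pins each job to that exact commit:

```yaml
jobs:
- name: c
  plan:
  - get: repo
    trigger: true
    passed: [b]
    version: {ref: ((pinned_ref))}  # hypothetical var, filled in with the current SHA by the update task
```

Alternatively, the task could pin the resource itself with `fly pin-resource -r <pipeline>/repo -v ref:<sha>`, which pins it for every job at once, though something would then need to unpin it again for later commits to flow through.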