Trigger jobs conditionally


#1

Hey there,

I am coming from Jenkins and thinking about conditional workflows where I could trigger jobs based on certain parameters. In my use case, we have a monorepo with multiple apps. I don’t want to rebuild everything all the time, so my approach was to generate a list of changed files with “git diff” and, based on that, generate a list of apps that depend on them and need to be re-built. Then I would like to trigger the jobs that correspond to those changed apps. Does anyone have an idea how this could be accomplished with Concourse CI?

If conditions are not a possible solution in this case, I was thinking about having a pipeline with jobs for all the apps and somehow passing them a parameter with the list of affected apps. In the run command I could check if a parameter (variable) corresponding to a particular app exists and then start the build. If it doesn’t, I would just exit 0. All jobs would run every time but only perform a real build when needed. I still need to experiment to see if this is doable.
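To illustrate the exit-early idea, a task could check a variable and skip the real build. This is only a sketch: the `APPS_CHANGED` parameter, the `((apps_changed))` var, and the image are all made up for illustration.

```yaml
jobs:
- name: build-app1
  plan:
  - get: source-code
    trigger: true
  - task: maybe-build
    config:
      platform: linux
      image_resource:
        type: docker-image
        source: {repository: alpine}
      inputs:
      - name: source-code
      params:
        APPS_CHANGED: ((apps_changed))   # e.g. "app1,app3"
      run:
        path: sh
        args:
        - -ec
        - |
          # Only do the real build if app1 is in the changed list
          case ",${APPS_CHANGED}," in
            *,app1,*) echo "building app1" ;;  # real build steps here
            *) echo "app1 unchanged, skipping"; exit 0 ;;
          esac
```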

I would appreciate any suggestions on this. Thanks!


#2

Hi Dan,

you can set ignore_paths with the Git Concourse resource (https://github.com/concourse/git-resource) like so:

resources:
- name: source-code
  type: git
  source:
    uri: git@github.com:concourse/git-resource.git
    branch: master
    ignore_paths:
    - foo/
    - bar/

You can use this to have several Git resources that reflect your conditions (changes to specific parts of the Git repo). Each resource is responsible for triggering a single Job that builds only one specific app.

I recommend using multiple Git resources so you can separate the work into different Jobs and improve the visualization of your Pipeline. It also seems to be an easier way to implement the desired functionality.
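A minimal sketch of that layout, using the resource’s `paths` option so each resource only produces new versions when its app’s directory changes (repo URI, directory names, and job names are illustrative):

```yaml
resources:
- name: app1-source
  type: git
  source:
    uri: git@github.com:example/monorepo.git
    branch: master
    paths: [app1/]   # new versions only for commits touching app1/
- name: app2-source
  type: git
  source:
    uri: git@github.com:example/monorepo.git
    branch: master
    paths: [app2/]

jobs:
- name: build-app1
  plan:
  - get: app1-source
    trigger: true
  # build steps for app1 go here
- name: build-app2
  plan:
  - get: app2-source
    trigger: true
  # build steps for app2 go here
```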

best D


#3

Thanks for your reply. I have seen this feature, which is nice but unfortunately not applicable to my use case. We have multiple apps that depend on each other, so simply looking at a path won’t work. Also, we have common libs that affect certain apps as well. Therefore, I would need to do some processing in a separate job to get a list of changed files and feed it into a tool that identifies what needs to be rebuilt. After that I end up with a list of apps that I need to build, and I need to figure out how to trigger only those builds.

Would it be possible, based on the initial processing, to generate a list of jobs or a separate pipeline on the fly with the apps that need to be built, and trigger it?


#4

If your logic is too complicated to build triggers based on simple modification paths in your GitHub repo, I would suggest another way:

You could use the s3 resource (https://github.com/concourse/s3-resource) with versioned_file; each file, when updated, indicates that one of your applications needs to be pushed. Let’s say you have these s3 files app1 - app10 and the s3 resources in your pipeline as well (s3-resource-app1 - s3-resource-app10).

Furthermore, you should have 10 individual jobs, push-app1 - push-app10, each of them getting triggered when there is a new version of the corresponding s3-resource-appX.
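One such trigger resource and job pair could look like this (bucket and key names are made up; note that versioned_file requires versioning to be enabled on the bucket):

```yaml
resources:
- name: s3-resource-app1
  type: s3
  source:
    bucket: my-trigger-bucket
    versioned_file: triggers/app1
    access_key_id: ((aws_access_key))
    secret_access_key: ((aws_secret_key))

jobs:
- name: push-app1
  plan:
  - get: s3-resource-app1
    trigger: true
  # push steps for app1 go here
```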

Upon changes in your GitHub repo, trigger the Job that analyses the modifications in the repo, and from that Job bump the s3 files (e.g. with a timestamp), depending on which combination of apps you need to push.

You have to do this step from within the Task’s script in order to decide dynamically which s3 file you bump. Make sure your Task’s Docker container includes the AWS CLI.
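The dynamic bump inside the analysis Task could look roughly like this (the bucket name and the `changed-apps` file, one app name per line, are placeholders for whatever your diff analysis produces):

```yaml
run:
  path: sh
  args:
  - -ec
  - |
    # A fresh timestamp makes each bump a distinct file content
    date -u +%Y-%m-%dT%H:%M:%SZ > stamp
    while read -r app; do
      # Overwriting the versioned file creates a new S3 object
      # version, which the s3 resource picks up as a new version
      aws s3 cp stamp "s3://my-trigger-bucket/triggers/${app}"
    done < changed-apps
```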

It’s a bit complicated, but I think you should get the behaviour you desire. The only downside is that your apps might not get pushed at exactly the same time, but rather with a slight time shift (I don’t know if that is a problem).

best D


#5

Thanks a lot for your response. I had a similar idea in mind, and I was able to achieve this workflow using the keyval resource. My pipeline looks very complex now visually :slight_smile: considering that each job needs a dedicated resource, but it is ok for now.

By the way, is there a way to execute a job after all previous jobs finished (either succeeded or failed)?


#6

Hi Dan,

Don’t worry, my pipelines always look very complex; that happens a lot when you deal with real use cases. But you could improve the overview with Concourse groups: https://concourse-ci.org/pipeline-groups.html.
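For example, grouping the per-app jobs gives each app its own tab in the web UI (job names here are illustrative):

```yaml
groups:
- name: app1
  jobs: [build-app1, push-app1]
- name: app2
  jobs: [build-app2, push-app2]
```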

You mean like a barrier synchronisation? Hmm, I am not 100% sure, but I would say no, because any job that produces a new output and is a trigger for the next job will issue a new build of that next job (@vito please correct me if I am wrong).

But it would be nice to have a resource for this which awaits a put from n jobs and then produces a new version so the next job can start (I would actually like to try that out).

best D


#7

Hi D,

As usual, thanks for your response :slight_smile:

So I was able to achieve the conditional triggering by creating keyval resources and linking jobs to them. Basically, I have a “git_detect_changes” job that runs a diff and, based on the changed files, produces the list of apps that need to be built. Then I update the corresponding keyval resources so the linked jobs can be triggered:

    on_success:
      try:
        aggregate:
        - put: app1_keyval
          params:
            file: keyvalout/app1_keyval.properties
        - put: app2_keyval
          params:
            file: keyvalout/app2_keyval.properties

In this case I found that if a file is missing, it won’t fail the job (by using “try”). What I was unable to achieve is what Jenkins has as a “post build step”. We have a manifest file which lists all the images and their versions. After all build jobs finish, I would like a post-build job that pulls the latest keyval versions, updates the manifest and commits it to git. Alternatively, I could do the commits inside the build jobs too, but a post-build job would be cleaner and result in a single commit.
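For context, the detect task could produce those .properties files roughly like this (the diff range and the `map-changes-to-apps` helper are placeholders for the actual dependency analysis):

```yaml
run:
  path: sh
  args:
  - -ec
  - |
    # List files changed in the latest commit (range is illustrative)
    git -C source-code diff --name-only HEAD~1 HEAD > changed-files
    # Map changed files to affected apps with your own tooling, then
    # write one properties file per affected app for the keyval puts
    for app in $(./map-changes-to-apps changed-files); do
      echo "commit=$(git -C source-code rev-parse HEAD)" \
        > "keyvalout/${app}_keyval.properties"
    done
```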


#8

Hi Dan,

I think it will be difficult to achieve this in Concourse. I would go another route: update the list after each Job, only for the specific changes produced by that Job.

To ensure that no two jobs interfere with your manifest at the same time, because of the parallelism that is certainly going on in your pipeline, you could use the serial_groups feature of Concourse: https://concourse-ci.org/jobs.html#job-serial-groups.

Each of your Jobs that pushes an app after being triggered by your git_detect_changes would trigger another Job that updates its part of the manifest. All of the update Jobs should share the same serial group label, to make sure that no two Jobs can modify the manifest at the same time.
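Sketched out, the update Jobs would share one serial group label (job and resource names are illustrative):

```yaml
jobs:
- name: update-manifest-app1
  serial_groups: [manifest]   # only one "manifest" job runs at a time
  plan:
  - get: app1-release
    trigger: true
  # update app1's entry in the manifest, commit, push
- name: update-manifest-app2
  serial_groups: [manifest]
  plan:
  - get: app2-release
    trigger: true
  # update app2's entry in the manifest, commit, push
```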

best D


#9

Hi Dan,

I am very interested in how you got on with your approach, as I am just about to attempt the same thing: we have a similar setup, and rather than building everything on each change, I only want to build the application that was updated. I’d appreciate any advice you have!

Thanks,
Sinéad


#10

Hi Sinead,

Most of the details about the approach I took are already described in the posts above. I am using a keyval resource to update a specific app resource, which then triggers a corresponding job. I am also using serial groups so my update jobs don’t interfere with each other.

Let me know what specific questions/issues you have and I’ll see if I can help.


#11

Hi, great question and something we’ve been banging our heads against for quite some time. We solved our workflow using the gate-resource, which allows us to trigger builds only for those parts of the pipeline that are affected by a change (we use the metadata-resource as a trigger and the gate-resource to track progress through our pipeline).

It’s a neatly working setup, albeit a bit complex. I have written a short summary on the spatial resource flows RFC, but plan on writing it up in a blog post later. For now, I recommend you take a look at the gate-resource and its example to see what you can do with it.