How to pass a folder to a pipeline and task?



I am new to Concourse, so I apologize if this question seems dumb…
I have a folder (/etc/hadoop) on the host that needs to be mounted in the container that runs my task. This folder holds the cluster configuration, and I want to use it to submit jobs to Hadoop.

I know that I can pass files and folders to tasks using:

fly -t tutorial e -c inputs_required.yml -i some-important-input=../task-hello-world

But how do I make this permanent in the pipeline?

My pipeline looks like this:

resources:
- name: repository
  type: git

jobs:
- name: run
  plan:
  - get: repository
    trigger: true

  - task: run
    tags: [hadoop]
    file: repository/ci/tasks/run.yml

And my task “run” looks like this:

platform: linux

image_resource:
  type: docker-image
  source:
    repository: centos
    tag: latest

inputs:
- name: repository

run:
  path: repository/ci/scripts/


Hello, welcome to Concourse!

I suggest rethinking your approach, for the following reason.

Concourse does everything it can to force you to do things in a certain way, with the goal of making builds repeatable and non-flaky. Put another way: builds should be like immutable infrastructure.

If you manage to mount that /etc/hadoop directory in your container, it means the following is possible: you trigger the pipeline twice, with NOTHING changed in the resources used by the pipeline (say, your git source code), yet the results can still vary if, out of band, the contents of /etc/hadoop have changed. Put another way: we have introduced flakiness into the pipeline.

The Concourse way would be to make the contents of /etc/hadoop available as a resource, for example in a git repository, or maybe baked into the container image. The container image itself should be built by a pipeline (Concourse has a docker-image resource for that), or in any case in an automated, scripted way. No manual steps, no flakiness.
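For the first option, a minimal sketch of what that could look like, assuming your Hadoop configuration is versioned in a hypothetical `hadoop-config` git repository (the resource name and URI are placeholders, not something Concourse provides):

```yaml
resources:
- name: repository
  type: git
  source:
    uri: https://example.com/your/repo.git          # placeholder

- name: hadoop-config                               # hypothetical repo holding the /etc/hadoop files
  type: git
  source:
    uri: https://example.com/your/hadoop-config.git # placeholder

jobs:
- name: run
  plan:
  - get: repository
    trigger: true
  - get: hadoop-config        # fetched like any other versioned input
  - task: run
    tags: [hadoop]
    file: repository/ci/tasks/run.yml

# ...and in repository/ci/tasks/run.yml, declare it as a second input:
#
# inputs:
# - name: repository
# - name: hadoop-config       # available at ./hadoop-config inside the task container
```

Your task script can then read the configuration from the `hadoop-config` input directory (e.g. by pointing HADOOP_CONF_DIR at it) instead of relying on a host mount, so every change to the configuration becomes a new, traceable resource version.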

I cannot be more precise because I am not familiar with Hadoop, but I hope you get the gist of the Concourse approach.