I am new to Concourse, so I apologize if this question seems dumb…
I have a folder (/etc/hadoop) on the host that needs to be mounted into the container that runs my pipeline. This folder contains cluster configuration files, and I want to use it to submit jobs to Hadoop.
I know that I can pass files and folders to tasks using:

```shell
fly -t tutorial e -c inputs_required.yml -i some-important-input=../task-hello-world
```
But how do I make this permanent in the pipeline?
My pipeline looks like this:
```yaml
---
resources:
- name: repository
  type: git
  source:
    uri: email@example.com:made-up-repo/made-up-repo.git

jobs:
- name: run
  plan:
  - get: repository
    trigger: true
  - task: run
    tags: [hadoop]
    file: repository/ci/tasks/run.yml
```
And my task `run` looks like this:
```yaml
platform: linux

image_resource:
  type: docker-image
  source:
    repository: centos
    tag: latest

inputs:
- name: repository

run:
  path: repository/ci/scripts/some-random-script.sh
```