
Env variables not available in pipeline options sections, outside of stages

      Environment variables such as ${env.NODE_NAME} are not available in the options {} block before stages. I want to dynamically lock a node using options {} before the first stage starts and keep that node locked for the entire pipeline.

      I have spent hours on this; everything I tried returns null. I know I can lock each stage individually, but that won't work for me since the lock is released at the end of each stage.

      options {
          lock(resource: "${env.NODE_NAME}")
      }
      stages {
          stage("stage name") {
              steps {
                  // .....
              }
          }
      }
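
      A scripted (non-declarative) pipeline evaluates the lock step's arguments at run time rather than before the run, so it can read env.NODE_NAME once a node is allocated. A minimal sketch of that workaround, assuming the Lockable Resources plugin is installed ('some-label' is a placeholder agent label):

      ```groovy
      // Scripted pipeline: unlike declarative options {}, the lock
      // step's resource argument here is computed at run time.
      node('some-label') {
          // env.NODE_NAME is populated once we are inside node {}
          lock(resource: env.NODE_NAME) {
              stage('stage name') {
                  // ... steps run while the node-named lock is held
              }
          }
      }
      ```

      The trade-off is giving up the declarative syntax for the locked portion of the pipeline.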

          [JENKINS-55587] Env variables not available in pipeline options sections, outside of stages

          Luke Lussenden added a comment - - edited

          I use the options-style lock to wrap a few stages in a large pipeline and I have the same problem. Even though the lock is deep within the pipeline, it seems the options are pre-computed before the run, and the env variables are not yet injected. For my use case, I would really like to use the `GIT_URL` provided by the multibranch pipeline project as the source of my lock. I don't mind multiple branches building at once, but there is a section they can't be in at the same time if they are on the same repo.

          stages {
              stage('do stuff 1') { }
              stage('do stuff 2') { }
              stage('locked set') {
                  options { lock(resource: "${GIT_URL}") }
                  stages {
                      stage('first action set') { }
                      stage('second action set') { }
                  }
              }
              stage('do final stuff') { }
          }

          If you have a parameterized build, I believe parameters are available to the options blocks, but environment variables are not.
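
          That distinction matches how declarative pipelines behave: params values are resolved when options are evaluated, while env values are not yet injected. A hedged sketch of a parameter-based lock (LOCK_NAME is a made-up parameter name, not from this issue):

          ```groovy
          pipeline {
              agent any
              parameters {
                  // Hypothetical parameter, e.g. supplied by the upstream trigger
                  string(name: 'LOCK_NAME', defaultValue: 'default-lock',
                         description: 'Name of the resource to lock')
              }
              options {
                  // params.* is usable here, while env.* generally is not
                  lock(resource: "${params.LOCK_NAME}")
              }
              stages {
                  stage('build') {
                      steps {
                          echo "holding lock ${params.LOCK_NAME}"
                      }
                  }
              }
          }
          ```

          One caveat: on the very first run of a job, the parameters directive may not have been registered yet, so the default value might not apply until a second run.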


          Philip Beadling added a comment - - edited

          I have exactly the same issue. I want to lock a single stage, with the lock based on the Git repo address that triggered the build. This allows me to update a second repo containing package files (one file per repo), safe in the knowledge that no other Jenkins process is updating exactly the same file, while other files in the same repo can still be updated at the same time. This means I can be sure that any resulting merge is a fast-forward if two separate files were updated in the package repo at the same time, which allows for a significant parallelization win across different development repos (unrelated projects don't queue to update the package manager).


          From what lglussen is saying, it sounds like it might be possible to include a parameter in the trigger from the development repo on Bitbucket/GitHub, which could then be used to lock the stage. I'll have a look at this (thanks for the tip!).


          Even with this workaround, it's a duplication of data that is already automatically passed to Jenkins as part of the trigger, which isn't ideal, and it's a bit of maintenance effort to update all the triggers to do this. So a proper fix to the original problem would be great!


          I've found a better workaround (for me at least).

          You can parameterize a pipeline if you nest it in a function:

          https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/

           

          Pass in your repoSlug (or whatever) as a key/value pair in the pipelineParams map; this is then accessible from a nested options section in your pipeline.
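
          The pattern from the linked blog post, roughly: a shared-library step takes a closure that populates a pipelineParams map, which is plain Groovy data and therefore already bound when the options block is evaluated. Names below (buildTemplate, repoSlug) are illustrative, not from this issue:

          ```groovy
          // vars/buildTemplate.groovy in a Jenkins shared library
          def call(body) {
              def pipelineParams = [:]
              body.resolveStrategy = Closure.DELEGATE_FIRST
              body.delegate = pipelineParams
              body()  // runs the caller's closure, populating pipelineParams

              pipeline {
                  agent any
                  options {
                      // pipelineParams is ordinary Groovy data, so it is
                      // available here even though env.* is not
                      lock(resource: "${pipelineParams.repoSlug}")
                  }
                  stages {
                      stage('locked work') {
                          steps {
                              echo "locked on ${pipelineParams.repoSlug}"
                          }
                      }
                  }
              }
          }
          ```

          A Jenkinsfile would then call it as `buildTemplate { repoSlug = 'my-org/my-repo' }`.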



          Daniel Klöck added a comment - - edited

          I also have the same issue. The problem is that, as stated in the documentation for stage options:

          Inside a stage, the steps in the options directive are invoked before entering the agent or checking any when conditions.

          Therefore, neither the env variables nor variables from other steps can be used. I guess the needed vars have to be generated within the options block or before entering the agent.
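
          To make the quoted ordering concrete, a minimal illustration of the behavior described in this thread (a sketch, not a verified reproduction):

          ```groovy
          pipeline {
              agent any
              stages {
                  stage('demo') {
                      options {
                          // Evaluated before entering the agent: env.GIT_URL is
                          // typically null here, so this would lock a resource
                          // literally named "null"
                          lock(resource: "${env.GIT_URL}")
                      }
                      steps {
                          // Inside steps the same variable is populated
                          // (for builds checked out from Git)
                          echo "GIT_URL is ${env.GIT_URL}"
                      }
                  }
              }
          }
          ```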

          I wonder what led to the decision of executing the options block before entering the agent...


            Assignee: Unassigned
            Reporter: Chad Geisler (chadg)
            Votes: 9
            Watchers: 12