Allow variables and functions to be defined within pipeline to be used in any stage

      I have several use cases where I may operate on a list or map in different stages of a pipeline. This variable may not be known ahead of time - it could be defined within a stage. I'd like a syntax that allows me to define variables within a pipeline.

      Today, I have to define a shared variable before entering pipeline {}.

      Jenkinsfile
      #!/bin/groovy
      // Define outside of pipeline to make sure accessible in all script {} blocks
      def my_list

      pipeline {
          agent { label 'label' }
          stages {
              stage('stage1') {
                  steps {
                      script {
                          my_list = [1, 2, 3]
                      }
                  }
              }

              stage('stage2') {
                  steps {
                      script {
                          for (int i = 0; i < my_list.size(); i++) {
                              echo "Doing something with ${my_list[i]}"
                          }
                      }
                  }
              }
          }
      }
      

      A simple approach would be to allow "def" to work anywhere within pipeline {}. This avoids introducing much new syntax for pipeline.

      Another option would be to allow the user to provide a special block within pipeline {} for defining non-env/params style variables. This could also help by letting the user use constructs available in script {} to define shared state.

      Jenkinsfile with possible example syntax
      #!/bin/groovy

      pipeline {
          agent { label 'label' }

          define {
              def my_map = [:]  // empty map
              def my_list       // undefined shared variable
          }

          stages {
              stage('stage1') {
                  steps {
                      script {
                          // If def can be anywhere in pipeline {}, drop script {}
                          my_list = [1, 2, 3]
                      }
                  }
              }

              stage('stage2') {
                  steps {
                      script {
                          for (int i = 0; i < my_list.size(); i++) {
                              echo "Doing something with ${my_list[i]}"
                          }
                      }
                  }
              }
          }
      }
      

          [JENKINS-41335] Allow variables and functions to be defined within pipeline to be used in any stage

          Andrew Bayer added a comment -

          I like the define block idea, though I can't yet guarantee I can make that work in implementation. I'll experiment in that direction when I get a chance. I've also long been kicking around the idea of having functions available for setting the value of variables, to deal with = being blocked, but I haven't fallen in love with it enough to actually do it. =)

          Andrew Bayer added a comment -

          hrmpw jamesdumay michaelneale rsandell - something else I'd like your thoughts on.

          Patrick Wolf added a comment -

          Using top-level define and then def inside the block seems a bit redundant.

          This would make the use of Artifactory much simpler. They rely heavily on defining configuration settings. That can easily be done in a script block, but I could see it being used with this, too.
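          For illustration, a minimal sketch of the kind of configuration this refers to, assuming the Artifactory plugin's scripted DSL and a hypothetical server ID 'my-artifactory':

          script {
              // Look up an Artifactory server configured in Jenkins by its ID
              def server = Artifactory.server 'my-artifactory'
              // File Spec describing what to upload and where
              def uploadSpec = '''{
                "files": [
                  { "pattern": "build/libs/*.jar", "target": "libs-release-local/" }
                ]
              }'''
              server.upload spec: uploadSpec
          }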

          Robby Pocase added a comment -

          Not a huge fan of the redundant def either, but I wanted a compromise that could still allow script expressions. An alternative would be using expression (similar to when) to be explicit about where script-style usage is allowed. I'm not overly fond of this syntax either, but it does provide some consistency with other declarative constructs.

          define {
              my_var = expression { return "foo" == "bar" }
          }
          

          Andrew Bayer added a comment -

          Worth mentioning that in any scenario, the def would be redundant no matter what we're doing with the value - we wouldn't literally be calling this Groovy code at parse time; we'd be saying "Ok, here's the variable name, and here's what to set it to - evaluate that value at runtime and set the variable to that result".

          Andrew Bayer added a comment -

          So the way I can see to do this currently is to first have a library step added to Pipeline (i.e., JENKINS-39450). We need that for JENKINS-38110 anyway. With that step, we can add to the Binding at runtime, so we could have something like this:

          define {
              def foo = "bar"

              def someMethod() {
                  return "Hello"
              }
          }
          

          ...and then take the source text for define's block and pass it to the library step with a new LibraryRetriever that can output a file containing a given string rather than fetching from an SCM source as the existing retrievers do. Then we should, if I'm understanding correctly, get the variables and methods available from then on in the execution.

          jglick - does that sound right?

          Andrew Bayer added a comment -

          Hrm - I'm realizing that this would end up in something like vars/foo.groovy, but then the methods and variables could only be accessed like foo.someVar or foo.someMethod(), which wouldn't pass validation.
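          A minimal sketch of the shared-library layout in question, assuming a hypothetical global named foo:

          // vars/foo.groovy in a shared library
          def someMethod() {
              return "Hello"
          }

          // In the Jenkinsfile, access is namespaced through the global:
          // echo foo.someMethod()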

          Sam Van Oort added a comment -

          rpocase If I understand your request correctly, what you are describing is better handled by functional and imperative features in vanilla AKA "scripted" pipeline. It's breaking from the declarative model because of the potential for side effects with variable declarations and shared state.

          Jesse Glick added a comment -

          I have several use cases where I may operate on a list or map on different stages of a pipeline. This variable may not be known ahead of time

          IMO if you are going to rely so heavily on script anyway, dispense with pipeline-model-definition and just write your script directly.
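          For reference, the original two-stage example written directly as a Scripted Pipeline might look something like this (a sketch, assuming an agent labeled 'label'):

          node('label') {
              // Plain Groovy: my_list is an ordinary local variable visible in both stages
              def my_list
              stage('stage1') {
                  my_list = [1, 2, 3]
              }
              stage('stage2') {
                  for (int i = 0; i < my_list.size(); i++) {
                      echo "Doing something with ${my_list[i]}"
                  }
              }
          }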

          This would make the use of Artifactory much simpler.

          The current plugin relies heavily on a DSL, against my advice which is to stick to plain old Step implementations using vanilla structs (or, better yet, wrappers which set the bare minimum environment necessary for non-Jenkins-specific shell/batch scripts to run). Workable in scripted Pipeline but I think it would need to be reconceptualized to be useful in Declarative.

          take the source text for define's block and pass it to the library step with a new LibraryRetriever that can output a file containing a given string

          Uh, or call evaluate?

          Andrew Bayer added a comment -

          jglick Ok, putting aside the variables - thoughts on functions? i.e., JENKINS-41396, which I folded into this.

          Jesse Glick added a comment -

          I think they are both a bad idea, but this is a worse one.

          Liam Newman added a comment -

          jglick
          > IMO if you are going to rely so heavily on script anyway, dispense with pipeline-model-definition and just write your script directly.

          I'm actually finding that Declarative combined with some helpers is really great. The overall pipeline flow goes in the Declarative, and the gritty complex logic goes in the helper methods/Steps.
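          A minimal sketch of that pattern, with a hypothetical helper defined above the pipeline block:

          // Gritty logic lives in a plain Groovy helper...
          def grittyVersionTag(String version, String build) {
              return "v${version}.dev${build}"
          }

          pipeline {
              agent any
              stages {
                  stage('tag') {
                      steps {
                          // ...while the Declarative section stays a readable flow description
                          echo grittyVersionTag('1.2.3', env.BUILD_NUMBER)
                      }
                  }
              }
          }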

          Jason Davis added a comment -

          I agree with Liam Newman (I think). My two cents: Declarative pipelining makes a ton of sense and it feels like a more natural model than Scripted pipelining. In declarative land, though, to get the job done I still end up needing if-then logic, variable assignment (as per this JIRA issue), and shared functions in Groovy scripts – all of which need to be wrapped in script {}. Really unifying the two methods back into one official "way" to pipeline would be great for pipeline development. It'll reduce the maintenance burden on Jenkins jobs down the road too.

          Robby Pocase added a comment -

          What bitwiseman jedavis said. The conveniences of declarative make dropping into script from time to time fine. There will always be a need for programmatic logic. Direct access in declarative would be huge for maintainability.

          R. Tyler Croy added a comment -

          The primary use-case that bitwiseman points out here, as I understand it, is not to add a bunch of scripting and variable silliness into a Declarative Jenkinsfile. But rather, there's no "middle-ground" between Declarative right now and Shared Libraries.

          Some of the folks at Mozilla have used the "loophole" which currently exists, and should be closed, to experiment with shared library ideas inside the Jenkinsfile. I think a methods directive would be helpful to bridge the gap betwixt a puritanical Declarative Pipeline and what should ultimately live in a Shared Library.

          Liam Newman added a comment -

          rtyler
          Yes, that. That is what JENKINS-41396 is about.

          jedavis rpocase
          Would method definitions be enough, do you think?

          Patrick Wolf added a comment -

          I think JENKINS-41396 solves a more concrete problem. I was thinking about helper functions and @NonCPS functions during the JAM this morning. It is possible to write helper functions outside of the pipeline block (and I have done so for some examples), but that means the Jenkinsfile can't be opened in the editor. This would allow the same thing to be done within the pipeline block and be supported in the editor.

          I agree with bitwiseman that sometimes you need a helper function inline in your Pipeline. Throwing everything out and converting the entire Pipeline to Scripted Pipeline isn't a good solution. I see no reason why 99% of use cases can't be solved in Declarative for users that want to use Declarative as the primary structure with some Scripted pieces. If people want to use only Scripted Pipeline because they prefer imperative to declarative programming that is fine but we should not force users to use only Scripted just because a use case isn't fully supported yet.
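          A sketch of the kind of helper meant here, defined outside pipeline {} as is possible today; the @NonCPS annotation keeps non-serializable objects such as regex Matchers out of the CPS-transformed code:

          @NonCPS
          def extractVersion(String text) {
              // Matcher is not serializable, so keep it inside a @NonCPS method
              def m = (text =~ /version=(\S+)/)
              return m ? m[0][1] : 'unknown'
          }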

          Robby Pocase added a comment - - edited

          bitwiseman I use this pattern a lot for dynamically defining parallel steps between stages. My primary usage of def is a stopgap pending parallel/inner-stage improvements. Library methods could likely handle this.

          I think some combination of JENKINS-41334 and comments from JENKINS-39932 makes this a lot better, but neither satisfies cases where what should be defined isn't known until run time (or even until after SCM checkout).
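          For illustration, the stopgap pattern described above - building a parallel map at run time inside a script block, with hypothetical platform labels:

          stage('test') {
              steps {
                  script {
                      def branches = [:]
                      ['linux', 'windows'].each { platform ->
                          branches[platform] = {
                              node(platform) {
                                  echo "Running tests on ${platform}"
                              }
                          }
                      }
                      parallel branches
                  }
              }
          }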

          Gigi Jackson added a comment - - edited

          I like the declarative syntax, but I often find myself adding script{} blocks, or defining helper methods outside the pipeline{}, just to do something like extract a version string from a repo, find previous git tags for generating changelogs, etc.

          You have to enclose even local variables in a script{} block just to assign and use in a later step, so my pipelines are littered with this kind of thing:

          script {
              version = sh(script: "python setup.py --version", returnStdout: true).trim()
              write_revision_file(version)
              sh(". venv/bin/activate; " +
                  $/python setup.py egg_info -b".dev${env.BUILD_NUMBER}" sdist upload -r local rotate -m.tar.gz -k5; /$ +
                  "deactivate")
              tag = "${version}.dev${env.BUILD_NUMBER}"
              sh(/git tag -a v${tag} -m "Jenkins dev publish"/)
              sh("git push origin v${tag}")
          }
          

          I often want to extract something like a version string once, and then use it in multiple later stages for git tagging, substituting a line in a file, yada yada. Just allowing for the definition of a global variable scoped to the Jenkinsfile, without having to write a temp file to the agent workspace or set an env var on the system, would ease friction a lot. These could be set in an early stage (after checkout) and then used in later stages.
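          For reference, assignments to env from a script block do carry over to later stages (values are strings only), which is a common workaround today:

          stage('prep') {
              steps {
                  script {
                      env.VERSION = sh(script: 'python setup.py --version', returnStdout: true).trim()
                  }
              }
          }
          stage('tag') {
              steps {
                  sh "git tag -a v${env.VERSION}.dev${env.BUILD_NUMBER} -m 'Jenkins dev publish'"
              }
          }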

          Edgars Batna added a comment - - edited

          Language features already seep in everywhere (e.g. string manipulation); I don't see how defining local variables and functions does any harm.

          It would then have the potential of replacing any other build/whatever scripts once and for all. Frankly, I'm sick of dealing with unavoidable tools and scripts galore in any Jenkins project (not the fault of Jenkins, ofc).

          Andrew Bayer added a comment -

          So, sorry to say this, but this isn't going to happen. It's not the direction we want to go with Declarative.

          Liam Newman added a comment - - edited

          abayer Could you point to the direction we do want to go? Are there public documents somewhere talking about this?

           Do we need JEPs to talk about these design choices? 

          Andrew Bayer added a comment -

          bitwiseman - in this case, it's more that it was never something that fit into Declarative's model, and I've decided that it's time to just say yeah, it ain't happening, and move on.

          Sam Van Oort added a comment -

          bitwiseman FWIW I consider this the bright red line that says "no you need to be using Scripted for this." This feature would convert Declarative into a more full programming language as opposed to a simple, declarative Pipeline description.

          Francis Therien added a comment -

          svanoort I thought I had read somewhere that the roadmap was for scripted to eventually be deprecated, and that declarative would replace it.

          In any case, I second Liam's point that it would be interesting to see documents explaining this vision. It would help, at the very least, people make long-term decisions for their CI architecture around Jenkins.

          Sam Van Oort added a comment -

          ftherien Caveat: take anything I say here as a personal opinion only. By all means, people should default to Declarative (or Declarative + Shared Libraries as needs grow) because it's easier and shows an opinionated "lit path" to successful CI/CD. And yes, we devote more work to Declarative specifically – I think the split is roughly 30/60/10 for work that is Declarative-only / General Pipeline for both / Scripted-only. That's because Declarative is newer and intended to be the "easy" mechanism that covers 80% of needs, where Scripted lets you roll-your-own and the majority of the features are just General Pipeline Stuff + Groovy.

          But as far as actually deprecating Scripted? You see people propose it from time to time, generally without understanding what it actually means, because Declarative is joined at the hip to Scripted and runs on top of it. For Declarative to replace Scripted, it would have to absorb all the crazy use cases people can implement in Scripted due to its flexibility. Not to mention the number of organizations who would likely drop Pipeline and Jenkins entirely if they made a major investment in Scripted and Shared Libraries and had to walk away from 'em. I'd encourage anybody who thinks it's a good idea to come talk to me personally; I'll be happy to explain the other reasons why it's both extremely technically/architecturally difficult and a very poor idea (not going to run through it here because it's practically a novella).

          Ergo why I push back on trying to insert Scripted-like features such as this into Declarative – we don't want to find ourselves maintaining a new programming language. Better to make Declarative the best config language it can be, and delegate fancy stuff to a proper programming language (Groovy).

          Hope that all makes sense!

          Liam Newman added a comment -

          abayer

          I want to be clear, I personally am willing to accept that this is not only out of scope, but also counter to core design.  My point is I'd like to make sure that core design and direction are documented and understandable.

           

          Florian Miedniak added a comment -

          I very much agree with bitwiseman and ftherien: Having some background information about the underlying design would make it a lot easier to build Jenkins pipelines the "right" way. Luckily, pipelines are actively developed, so it is extremely important for users to know which things they should not use because those will become obsolete in the near future due to changes / advances in design.
          Frankly, today it is sometimes quite challenging to justify the maintenance effort for pipelines if the answer to the question "why did the Jenkins update break our pipe?" is repeatedly "they decided to change it" ...
          IMO, in the recent past Jenkins arrived at a very good way of communicating and handling security-related issues (pre-announcement mails, changelog, ...).
          Having a design outline and a roadmap with milestones would IMO be a good starting point to achieve a similar level of transparency for the design of a complex, production-grade CI system like Jenkins.

          Liam Newman added a comment -

          Bulk closing resolved issues.

            abayer Andrew Bayer
            rpocase Robby Pocase
            Votes: 43
            Watchers: 57