Jenkins / JENKINS-47129

Executing multiple declarative pipelines from a scripted pipeline is no longer possible


Details

    Description

      I work on a very big project, and for us it's not practical to put all tests into one Jenkinsfile. On Jenkins we have a dedicated pipeline job for each one. However, when somebody makes a pull request towards the master branch, we want to load every pipeline into a single job for convenience. This is done with a scripted pipeline which scans for the other Jenkinsfiles and then executes them in parallel with the 'load' step.
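
      For illustration, a minimal sketch of such an aggregator job (the sub-pipeline paths are hypothetical placeholders; in practice they would be discovered by scanning the checkout):

          node {
              checkout scm
              // Hypothetical locations of the per-component Jenkinsfiles.
              def pipelineFiles = ['componentA/Jenkinsfile', 'componentB/Jenkinsfile']
              def branches = [:]
              for (path in pipelineFiles) {
                  def p = path   // capture the loop variable for the closure
                  branches[p] = {
                      // 'load' evaluates the Groovy file; a Declarative pipeline {} block
                      // inside it starts executing as soon as it is loaded.
                      load p
                  }
              }
              parallel branches
          }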

      However, after the recent update of the pipeline model to 1.2, this no longer works and fails with the following error:

      java.lang.IllegalStateException: Only one pipeline { ... } block can be executed in a single run.

      This exception prevents us from using declarative pipelines in more complex setups.

        Activity

          abayer Andrew Bayer added a comment -

          Sorry - until 1.2, loading and executing Declarative Pipelines from shared libraries was officially unsupported. With 1.2, we do support that use case, but with certain restrictions, such as limits on where you can define Declarative Pipelines and a limit of only one executed Declarative Pipeline per run. This is to ensure consistent behavior between the new use case and normal Declarative Pipelines defined in a Jenkinsfile.
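
          For reference, the now-supported pattern is roughly a single Declarative pipeline { ... } block wrapped in a shared library step and invoked once per run (a minimal sketch; the library and step names are illustrative):

              // vars/standardPipeline.groovy in the shared library
              def call() {
                  pipeline {
                      agent any
                      stages {
                          stage('Build') {
                              steps {
                                  sh 'make'
                              }
                          }
                      }
                  }
              }

              // Jenkinsfile in the consuming repository
              @Library('my-shared-lib') _
              standardPipeline()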


          roel0 roel postelmans added a comment -

          We resolved our issue by moving back to scripted pipelines in combination with shared libraries.
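
          A minimal sketch of that approach, assuming a hypothetical vars/runComponentTests.groovy step in the library (plain scripted code is not subject to the one-pipeline-block-per-run limit, so the aggregator can call it repeatedly, including in parallel):

              // vars/runComponentTests.groovy (hypothetical step name)
              def call(String componentDir) {
                  node {
                      checkout scm
                      dir(componentDir) {
                          // Placeholder for whatever the component's tests actually run.
                          sh './run-tests.sh'
                      }
                  }
              }

          The aggregator job then reduces to something like parallel componentA: { runComponentTests('componentA') }, componentB: { runComponentTests('componentB') }.
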
          abhishekmukherg Abhishek Mukherjee added a comment (edited) -

          While it's totally reasonable to change the API, may I voice some concern about a breaking API change like this happening in a minor revision, going from 1.1.9 to 1.2.0? Should this not have been a 2.0.0?

          abayer Andrew Bayer added a comment -

          This was never an officially intended or supported behavior in the first place, so there was no problem in changing it to a defined behavior.

          chantivlad chanti vlad added a comment (edited) -

          Hi abayer, roel0 and abhishekmukherg,

          I am facing the exact same problem. I was hoping to have the following setup:

          • git repo A stores all generic Jenkins pipelines, which are parameterized through specific env variables
          • git repo B has a Jenkinsfile that gets executed pre-merge on changes in repo B. This Jenkinsfile would do some things, then clone repo A, set the env variables properly, load a Jenkinsfile from A and run it as a "sub pipeline".

          Do I understand you correctly that the best way to do this is to keep the generic part in repo A scripted rather than declarative?

          If yes, why this limitation?
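
          For clarity, a minimal sketch of the intended repo B Jenkinsfile (the repository URL, branch, env variable and file names are placeholders):

              node {
                  checkout scm                        // repo B sources
                  // ... repo-B-specific steps ...

                  // Clone repo A with the generic pipelines into a subdirectory.
                  dir('generic-pipelines') {
                      git url: 'https://example.com/org/repo-A.git', branch: 'master'
                  }

                  // Parameterize the generic pipeline through the environment,
                  // then load and run it as a "sub pipeline".
                  withEnv(['TARGET_COMPONENT=repo-B']) {
                      load 'generic-pipelines/Jenkinsfile.generic'
                  }
              }

          If the loaded file is Declarative, the one-pipeline-block-per-run limit applies; a scripted file can be loaded any number of times.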

           

          bitwiseman Liam Newman added a comment -

          Bulk closing resolved issues.


          People

            abayer Andrew Bayer
            roel0 roel postelmans
            Votes: 0
            Watchers: 6
