
Allow specifying commit for multibranch pipeline build

      We have a Jenkins multi-branch pipeline backed by a Jenkinsfile contained in the source repo. For various reasons, we would like to split this huge Jenkinsfile into a few (two or three) smaller, more focused Jenkinsfiles. Each of these Jenkinsfiles will correspond to a different multi-branch pipeline. The intent is that a push to Git will only trigger a build of some master orchestration Jenkinsfile, which in turn triggers the other downstream jobs via the build step (https://jenkins.io/doc/pipeline/steps/pipeline-build-step/). All of these Jenkinsfiles will live together in the source repo.

      However, the problem with this approach is that when you trigger a multi-branch pipeline job, it uses the Jenkinsfile from whatever commit is at the head of the branch in question at that time. This can cause weird problems if, for example, the branch in question gets pushed to after the orchestration build starts but before it triggers the downstream job.

      We would like the ability to have the build step trigger a build for a specific commit hash of the branch in question. In particular, the orchestration build will collect its own commit hash from checkout(scm).GIT_COMMIT and pass it to the build step.
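
      As an illustration of the requested usage, here is a minimal sketch of the orchestration side, assuming a downstream multibranch job named downstream-build and a parameter named UPSTREAM_COMMIT (both hypothetical). Today the commit hash can only travel as an ordinary parameter; the downstream job still loads its Jenkinsfile from the branch head, which is exactly the gap this request is about.

      // Hypothetical orchestration Jenkinsfile (illustration only)
      pipeline {
          agent any
          stages {
              stage('Trigger downstream') {
                  steps {
                      script {
                          // checkout(scm) returns a map of SCM environment variables, including GIT_COMMIT
                          def scmVars = checkout(scm)
                          build job: "../downstream-build/${env.BRANCH_NAME}",
                                parameters: [string(name: 'UPSTREAM_COMMIT', value: scmVars.GIT_COMMIT)]
                      }
                  }
              }
          }
      }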

      From what I understand, the code changes necessary would be right around here: https://github.com/jenkinsci/workflow-multibranch-plugin/blob/f9b5cd015093ae6c03601b4fcb52609edac4f6c3/src/main/java/org/jenkinsci/plugins/workflow/multibranch/SCMBinder.java#L102

          [JENKINS-61654] Allow specifying commit for multibranch pipeline build

          Mike Griffin added a comment -

          I have run into the same problem. I'd like to be able to do this, as it seems a good way to (a) overcome the nested parallel restriction and (b) split up a big declarative pipeline into more manageable pieces.


          Kim K added a comment - edited

          It would be nice if the current documentation mentioned that the commit information is not sent when triggering another job with the build step. I'm attempting to locate a workaround, but it's adding a lot of overhead for something I thought was taken care of for me.


          Kalle Niemitalo added a comment -

          I wonder if the ability to specify a particular commit should require a permission higher than Item.BUILD.

          Kalle Niemitalo added a comment -

          If the commit ID were a new argument to the build pipeline step, I imagine BuildTriggerStepExecution should construct an SCMRevisionAction and add it to List<Action> actions, like it now adds a ParametersAction. That seems doable by calling the SCMRevision fetch(@NonNull String thingName, @CheckForNull TaskListener listener) method of SCMSource and providing the commit ID as thingName.

          This SCMRevisionAction would then be saved to the run and take effect in various places. I hope WorkflowRun would pass the actions to SCMBinder, which would have to be changed to look for an existing SCMRevisionAction. If one is found, SCMBinder would initialize the SCMRevision tip from it instead of calling scmSource.fetch(head, listener) and creating a new SCMRevisionAction.

          Kim K added a comment - edited

          The presence of the hudson.plugins.git.extensions.impl.BuildChooserSetting extension on the scm object seems to prevent using a commit in the branches key. I've been trying to remove that extension in a shared library, but I keep getting NotSerializable errors when modifying the extensions.

          What is "the branches key"?

          IIRC, @NonCPS can fix some NotSerializable errors.

          Kalle Niemitalo added a comment - What is "the branches key"? IIRC, @NonCPS can fix some NotSerializable errors.
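
          As a rough sketch of that @NonCPS suggestion (an assumption, not code from this thread): the extension filtering can be done in a method that runs outside the CPS transform, so the non-serializable extension objects never end up in CPS-serialized pipeline state. The method name is made up, and in a sandboxed Jenkinsfile the property access may need script approval.

          // Hypothetical helper; could live at the top of a Jenkinsfile or in a trusted shared library.
          import hudson.plugins.git.extensions.impl.BuildChooserSetting

          @NonCPS
          def extensionsWithoutBuildChooser(def gitScm) {
              // Drop BuildChooserSetting so a specific commit can be used in the branches key.
              return gitScm.extensions.findAll { !(it instanceof BuildChooserSetting) }
          }

          The returned list could then be passed as the extensions argument of a checkout([$class: 'GitSCM', ...]) call whose branches entry names a specific commit SHA.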

          Kalle Niemitalo added a comment -

          If you use the Git Parameter plugin to define a parameter with type: 'PT_REVISION', and provide the commit ID as that parameter in the build step, will it then load the Jenkinsfile from that commit? I guess it won't.

          Kim K added a comment - edited

          I am in the same boat as rittneje: we use multiple Jenkinsfiles with multiple Multibranch Pipeline jobs, which get triggered from one upstream job with build job: '../subJobA/${BRANCH_NAME}'. I cannot tell whether this is a new bug, because if there are multiple stages on different nodes that check out (with or without skipDefaultCheckout), the checkout will always get the latest commit:

          // pseudocode
          pipeline {
              agent none
              options {
                  skipDefaultCheckout(true)
              }
              stages {
                  stage('Checkout') {
                      agent { label 'windows' }
                      steps {
                          checkout([
                              $class: 'GitSCM',
                              branches: scm.branches,
                              extensions: [[$class: 'CloneOption', depth: 20, noTags: false, reference: '', shallow: true]],
                              userRemoteConfigs: scm.userRemoteConfigs
                          ])
                          // Do long-running work
                      }
                  }

                  // Someone commits between this checkout and the next

                  stage('OtherCheckout') {
                      agent { label 'linux' }
                      steps {
                          // !!!! This will NOT check out the change that was the triggering event
                          checkout([
                              $class: 'GitSCM',
                              branches: scm.branches,
                              extensions: [[$class: 'CloneOption', noTags: false, reference: '/srv/git_cache']],
                              userRemoteConfigs: scm.userRemoteConfigs
                          ])
                      }
                  }
              }
          }
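
          One way to avoid the race shown above is to record the commit seen by the first checkout and pin every later checkout to it via the branches key, building the GitSCM map without reusing scm.extensions so the multibranch job's BuildChooserSetting extension is not carried along. This is only a sketch under those assumptions (the PINNED_COMMIT name is invented), not an official workaround.

          // Sketch: pin later checkouts to the commit recorded by the first one.
          pipeline {
              agent none
              options { skipDefaultCheckout(true) }
              stages {
                  stage('Checkout') {
                      agent { label 'windows' }
                      steps {
                          script {
                              def scmVars = checkout([
                                  $class: 'GitSCM',
                                  branches: scm.branches,
                                  userRemoteConfigs: scm.userRemoteConfigs
                              ])
                              // Remember the exact commit this build is working on.
                              env.PINNED_COMMIT = scmVars.GIT_COMMIT
                          }
                          // Do long-running work
                      }
                  }
                  stage('OtherCheckout') {
                      agent { label 'linux' }
                      steps {
                          // Check out the recorded commit, not whatever the branch head is now.
                          checkout([
                              $class: 'GitSCM',
                              branches: [[name: env.PINNED_COMMIT]],
                              userRemoteConfigs: scm.userRemoteConfigs
                          ])
                      }
                  }
              }
          }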

           


          Kim K added a comment -

          I am using git-plugin:4.7.1 on Jenkins 2.277.2. (Attached: downstreamCheckout.groovy.) I used @NonCPS to return the extensions without BuildChooserSetting and that works, but it doesn't work if you attempt to add to scm.extensions.

          I added my shared library file in vars/downstreamCheckout.groovy that shows a semi-functional version that can be used by downstream jobs, provided the upstream job sets GIT_COMMIT in the environment:

          // Upstream job
          pipeline { .......
              checkout([
                  $class: 'GitSCM',
                  branches: scm.branches,
                  extensions: scm.extensions,
                  userRemoteConfigs: scm.userRemoteConfigs
              ]).each { k, v -> env.setProperty(k, v) }

          // Downstream job
          @Library('JenkinsLib') _
          pipeline { ........
              script {
                  downstreamCheckout()
                  // or specify custom extensions
                  downstreamCheckout(extensions: [[$class: 'CloneOption', depth: 20, noTags: false, reference: '', shallow: true]])

                  // This causes NotSerializableException in the checkout step itself. I can't figure out
                  // how to fix that; @NonCPS doesn't work with steps.
                  //downstreamCheckout(extensions: scm.extensions + [[$class: 'CloneOption', depth: 20, noTags: false, reference: '', shallow: true]])
              }
          

          I'm hopeful that this is a bug and not the intended design (that multibranch pipeline checkouts grab the latest commit no matter what), because if it's a feature, Jenkins cannot be used in a production environment and I will need to move to something else.
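
          The attached file is not reproduced here. Purely as an illustration of the pattern this comment describes, a hypothetical vars/downstreamCheckout.groovy might separate the @NonCPS extension handling from the checkout step call roughly like this (names, defaults, and the way GIT_COMMIT reaches the downstream job are assumptions, not the attachment's contents):

          // Hypothetical vars/downstreamCheckout.groovy - a sketch only, NOT the attached file.
          import hudson.plugins.git.extensions.impl.BuildChooserSetting

          // Runs outside the CPS transform, so the non-serializable extension objects
          // never sit in CPS-serialized pipeline state.
          @NonCPS
          def defaultExtensions(def gitScm) {
              return gitScm.extensions.findAll { !(it instanceof BuildChooserSetting) }
          }

          def call(Map args = [:]) {
              // Use caller-supplied extensions if given; otherwise reuse the job's own
              // extensions minus BuildChooserSetting. (Merging the two lists is what
              // triggered the NotSerializableException mentioned above.)
              def exts = args.extensions ?: defaultExtensions(scm)
              checkout([
                  $class: 'GitSCM',
                  // Check out the commit published by the upstream job (assumed to arrive
                  // as GIT_COMMIT in the environment), falling back to the branch head.
                  branches: env.GIT_COMMIT ? [[name: env.GIT_COMMIT]] : scm.branches,
                  extensions: exts,
                  userRemoteConfigs: scm.userRemoteConfigs
              ])
          }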


          Martin Pokorny added a comment -

          This will be a great feature.

            Assignee: Unassigned
            Reporter: rittneje (Jesse Rittner)
            Votes: 6
            Watchers: 8