Jenkins / JENKINS-59484

EXECUTOR_NUMBER is always expanded to '0' in workspace name when checking out with Perforce to a subdirectory


      When using the p4 plugin to check out to a subdirectory, EXECUTOR_NUMBER is always expanded to '0' in the Perforce workspace name. This breaks concurrent builds that rely on the Perforce workspace root, since every build uses the same workspace and overwrites its root directory.

      When not checking out to a subdirectory, the executor number is expanded correctly. It still doesn't match the result of 'echo ${EXECUTOR_NUMBER}', but at least concurrent builds don't clash over the same workspace.

      This occurs both when using 'checkoutToSubdirectory' in the options block and when wrapping 'checkout scm' in a dir() step.
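
      A quick way to see the mismatch from inside a build is to print the client name the plugin actually resolved next to the executor number. The sketch below assumes that the variables returned by the checkout step include P4_CLIENT (the resolved client/workspace name); whether that variable is published depends on the p4-plugin version, so treat this as an illustration rather than a guaranteed API.

      pipeline
      {
        agent any
        options
        {
          skipDefaultCheckout(true)
        }
        stages
        {
          stage('showClient')
          {
            steps
            {
              script
              {
                dir('test')
                {
                  // checkout returns the SCM-provided variables; P4_CLIENT is assumed
                  // to hold the client name resolved for this checkout
                  def scmVars = checkout scm
                  echo "client: ${scmVars.P4_CLIENT}, executor: ${env.EXECUTOR_NUMBER}"
                }
              }
            }
          }
        }
      }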

       

      To reproduce, I used the jenkins/jenkins:2.176.3 Docker image (the issue also reproduces on 2.150.3 and 2.164.3) with p4-plugin 1.10.3, plus a clean Docker instance of Perforce (https://github.com/p4paul/helix-docker). I created a stream depot with a dummy stream to use for the tests. The workspace name format is still the default jenkins-${NODE_NAME}-${JOB_NAME}-${EXECUTOR_NUMBER}.
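
      For reference, the same format string can also be passed explicitly when syncing with the plugin's p4sync step. This is only a sketch based on the snippet-generator style of the p4-plugin: the parameter names (credential, format, populate, source) may differ between plugin versions, and the credential ID and stream path below are placeholders.

      pipeline
      {
        agent any
        options
        {
          skipDefaultCheckout(true)
        }
        stages
        {
          stage('explicitSync')
          {
            steps
            {
              // explicit sync using the default client name format;
              // 'p4-credential-id' and '//testdepot/teststream' are placeholders
              p4sync(
                charset: 'none',
                credential: 'p4-credential-id',
                format: 'jenkins-${NODE_NAME}-${JOB_NAME}-${EXECUTOR_NUMBER}',
                populate: autoClean(delete: true, replace: true),
                source: streamSource('//testdepot/teststream')
              )
            }
          }
        }
      }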

      I have been able to reproduce it by starting several concurrent builds on the same node (I tried both the master node and a slave node) using the following Jenkinsfiles:

       

      pipeline
      {
        agent any
        options
        {
          checkoutToSubdirectory('test')
        }
        stages
        {
          stage('testStage')
          {
            steps
            {
              script
              {
                sh "echo ${EXECUTOR_NUMBER}"
              }
            }
          }
        }
      }
      

       

      pipeline
      { 
        agent any
        options
        {
          skipDefaultCheckout(true)
        }
        stages
        {
          stage('testStage')
          {
            steps
            { 
              script
              { 
                dir('test')
                {
                  checkout scm
                  sh "echo ${EXECUTOR_NUMBER}"
                }
              }
            }
          }
        }
      }
      

      Example output from three concurrent builds. Note that all three resolve the same client name, jenkins-master-testMultibranch-testStream-0, even though Jenkins gave them separate workspace directories (%402 and %403):

      P4 Task: establishing connection.
      ... server: helixdocker_build.helix_1:1666
      ... node: e75d28131ce5
      (p4):cmd:... p4 where /var/jenkins_home/workspace/testMultibranch_testStream%403/test/Jenkinsfil___
      p4 where /var/jenkins_home/workspace/testMultibranch_testStream%403/test/Jenkinsfile(p4):stop:5
      Building on Node: master
      (p4):cmd:... p4 client -o jenkins-master-testMultibranch-testStream-0
      p4 client -o jenkins-master-testMultibranch-testStream-0
      P4 Task: establishing connection.
      ... server: helixdocker_build.helix_1:1666
      ... node: e75d28131ce5
      (p4):cmd:... p4 where /var/jenkins_home/workspace/testMultibranch_testStream%402/test/Jenkinsfil___
      p4 where /var/jenkins_home/workspace/testMultibranch_testStream%402/test/Jenkinsfile(p4):stop:5
      Building on Node: master
      (p4):cmd:... p4 client -o jenkins-master-testMultibranch-testStream-0
      p4 client -o jenkins-master-testMultibranch-testStream-0
      P4 Task: establishing connection.
      ... server: helixdocker_build.helix_1:1666
      ... node: e75d28131ce5
      (p4):cmd:... p4 where /var/jenkins_home/workspace/testMultibranch_testStream/test/Jenkinsfile
      p4 where /var/jenkins_home/workspace/testMultibranch_testStream/test/Jenkinsfile(p4):stop:5
      Building on Node: master
      (p4):cmd:... p4 client -o jenkins-master-testMultibranch-testStream-0
      p4 client -o jenkins-master-testMultibranch-testStream-0
      

       

       

            Assignee: msmeeth Matthew Smeeth
            Reporter: mbrunton27 Matthew Brunton