• Type: Bug
    • Priority: Minor
    • Resolution: Cannot Reproduce
    • Component: kubernetes-plugin

      After upgrading the kubernetes-plugin from 1.1.3 to 1.1.4, Jenkins is no longer able to build the following Jenkinsfile:

      podTemplate(
        label: 'dockerpod',
        containers: [
          containerTemplate(image: 'docker:17.11.0-ce', name: 'docker', command: 'cat', ttyEnabled: true)
        ],
        volumes: [
          hostPathVolume(hostPath: '/var/run/docker.sock', mountPath: '/var/run/docker.sock')
        ]
      ) {

        podTemplate(
          label: 'mavenpod',
          containers: [containerTemplate(image: 'maven:3.3.9-jdk-8-alpine', name: 'maven', command: 'cat', ttyEnabled: true)]
        ) {

          node('mavenpod') {
            stage('Test') {

              container('docker') {
                sh "echo Running docker"
              }

              container('maven') {
                sh "echo Running maven"
              }

            } // stage

          } // node

          node('dockerpod') {
          }
        } // podTemplate: maven
      } // podTemplate: docker
      

      Output on 1.1.3 (working):

      Started by user admin
      Connecting to xxxxxxx
      Obtained Jenkinsfile from 02d8a240fa366f4dcec4e533982114ea62202bd4
      Running in Durability level: MAX_SURVIVABILITY
      [Pipeline] podTemplate
      [Pipeline] {
      [Pipeline] podTemplate
      [Pipeline] {
      [Pipeline] node
      Still waiting to schedule task
      Waiting for next available executor
      Running on jenkins-slave-d8t8w-cshmg in /home/jenkins/workspace/-spring-boot-java8_fixerror-FBDHQ2DI2UHV7QF37G2A3Z4BXFAJHPBVS5GDMTJKZFHJRAA2TAUA
      [Pipeline] {
      [Pipeline] stage
      [Pipeline] { (Test)
      [Pipeline] container
      [Pipeline] {
      [Pipeline] sh
      [-spring-boot-java8_fixerror-FBDHQ2DI2UHV7QF37G2A3Z4BXFAJHPBVS5GDMTJKZFHJRAA2TAUA] Running shell script
      + echo Running docker
      Running docker
      [Pipeline] }
      [Pipeline] // container
      [Pipeline] container
      [Pipeline] {
      [Pipeline] sh
      [-spring-boot-java8_fixerror-FBDHQ2DI2UHV7QF37G2A3Z4BXFAJHPBVS5GDMTJKZFHJRAA2TAUA] Running shell script
      + echo Running maven
      Running maven
      [Pipeline] }
      [Pipeline] // container
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] node
      Still waiting to schedule task
      jenkins-slave-j4dwc-wx37x is offline
      Running on jenkins-slave-j4dwc-wx37x in /home/jenkins/workspace/-spring-boot-java8_fixerror-FBDHQ2DI2UHV7QF37G2A3Z4BXFAJHPBVS5GDMTJKZFHJRAA2TAUA
      [Pipeline] {
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // podTemplate
      [Pipeline] }
      [Pipeline] // podTemplate
      [Pipeline] End of Pipeline
      
      GitHub has been notified of this commit’s build result
      
      Finished: SUCCESS
      

      Output on 1.1.4 (failing):

      Started by user admin
      Connecting to xxxxxx
      Obtained Jenkinsfile from 02d8a240fa366f4dcec4e533982114ea62202bd4
      Running in Durability level: MAX_SURVIVABILITY
      [Pipeline] podTemplate
      [Pipeline] {
      [Pipeline] podTemplate
      [Pipeline] {
      [Pipeline] node
      Still waiting to schedule task
      Waiting for next available executor
      Running on jenkins-slave-2v2hj-1f6jm in /home/jenkins/workspace/-spring-boot-java8_fixerror-FBDHQ2DI2UHV7QF37G2A3Z4BXFAJHPBVS5GDMTJKZFHJRAA2TAUA
      [Pipeline] {
      [Pipeline] stage
      [Pipeline] { (Test)
      [Pipeline] container
      [Pipeline] {
      [Pipeline] sh
      [-spring-boot-java8_fixerror-FBDHQ2DI2UHV7QF37G2A3Z4BXFAJHPBVS5GDMTJKZFHJRAA2TAUA] Running shell script
      [Pipeline] }
      [Pipeline] // container
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // podTemplate
      [Pipeline] }
      [Pipeline] // podTemplate
      [Pipeline] End of Pipeline
      
      GitHub has been notified of this commit’s build result
      
      java.io.IOException: container [docker] does not exist in pod [jenkins-slave-2v2hj-1f6jm]
      	at org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1.waitUntilContainerIsReady(ContainerExecDecorator.java:401)
      	at org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1.doLaunch(ContainerExecDecorator.java:226)
      	at org.csanchez.jenkins.plugins.kubernetes.pipeline.ContainerExecDecorator$1.launch(ContainerExecDecorator.java:148)
      	at hudson.Launcher$ProcStarter.start(Launcher.java:450)
      	at org.jenkinsci.plugins.durabletask.BourneShellScript.launchWithCookie(BourneShellScript.java:186)
      	at org.jenkinsci.plugins.durabletask.FileMonitoringTask.launch(FileMonitoringTask.java:64)
      	at org.jenkinsci.plugins.workflow.steps.durable_task.DurableTaskStep$Execution.start(DurableTaskStep.java:177)
      	at org.jenkinsci.plugins.workflow.cps.DSL.invokeStep(DSL.java:229)
      	at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:153)
      	at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:108)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:498)
      	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
      	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
      	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1213)
      	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
      	at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:42)
      	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
      	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
      	at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:157)
      	at org.kohsuke.groovy.sandbox.GroovyInterceptor.onMethodCall(GroovyInterceptor.java:23)
      	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:133)
      	at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:155)
      	at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:159)
      	at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:129)
      	at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:129)
      	at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:129)
      	at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:129)
      	at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:129)
      	at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
      	at WorkflowScript.run(WorkflowScript:20)
      	at ___cps.transform___(Native Method)
      	at com.cloudbees.groovy.cps.impl.ContinuationGroup.methodCall(ContinuationGroup.java:57)
      	at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.dispatchOrArg(FunctionCallBlock.java:109)
      	at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.fixArg(FunctionCallBlock.java:82)
      	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      	at java.lang.reflect.Method.invoke(Method.java:498)
      	at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
      	at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
      	at com.cloudbees.groovy.cps.Next.step(Next.java:83)
      	at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:174)
      	at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:163)
      	at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:122)
      	at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:261)
      	at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:163)
      	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:19)
      	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:35)
      	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:32)
      	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox.runInSandbox(GroovySandbox.java:108)
      	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:32)
      	at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:174)
      	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:331)
      	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$200(CpsThreadGroup.java:82)
      	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:243)
      	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:231)
      	at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:64)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:112)
      	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
      	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
      	at java.lang.Thread.run(Thread.java:748)
      Finished: FAILURE
      

      If I change the node label from 'mavenpod' to 'dockerpod', I am able to run the command in the docker container, but then I get an error when running the command in the maven container (sketched below).
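
      A minimal sketch of that variant (hypothetical: identical to the Jenkinsfile above except for the node label; the maven-side error message is paraphrased, not copied from a log):

      // Inside the same nested podTemplate blocks as above; only the node label differs.
      node('dockerpod') {   // was: node('mavenpod')
        stage('Test') {
          container('docker') {
            sh "echo Running docker"   // now succeeds
          }
          container('maven') {
            sh "echo Running maven"    // now fails with an error analogous to the one above
          }
        }
      }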

      Am I doing something wrong, or is this a bug? I want to be able to run commands in both containers.

        Attachments:
          1. config.xml (15 kB, Yngvar Kristiansen)

          [JENKINS-49366] Nested podTemplate stopped working in 1.1.4

          Yngvar Kristiansen created issue -

          Carlos Sanchez added a comment - Probably a dupe of JENKINS-49313
          Carlos Sanchez made changes -
          Link New: This issue duplicates JENKINS-49313 [ JENKINS-49313 ]

          Carlos Sanchez added a comment - Do you have globally defined templates with dockerpod or mavenpod?
          Yngvar Kristiansen made changes -
          Attachment New: config.xml [ 41330 ]

          Yngvar Kristiansen added a comment - I just checked, and I don't have any templates with those names or labels. See the attached config.xml.

          (I was a bit surprised to find some other templates there, prefixed with jenkins-slave-xyz12 and named after earlier, now-terminated pods. But that's perhaps unrelated to this issue.)
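
          One way to double-check this from the Jenkins script console, besides reading config.xml (a sketch assuming the plugin's KubernetesCloud class and its templates list, which are my reading of the plugin's API rather than anything from this thread):

          // Jenkins script console: list the pod templates configured on each Kubernetes cloud.
          import jenkins.model.Jenkins
          import org.csanchez.jenkins.plugins.kubernetes.KubernetesCloud

          Jenkins.instance.clouds.findAll { it instanceof KubernetesCloud }.each { cloud ->
            cloud.templates.each { t ->
              println "cloud=${cloud.name} template=${t.name} label=${t.label}"
            }
          }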


          Vincent Latombe added a comment - yngvark The previous behavior seems like a bug.

          Your first node block runs inside the mavenpod pod template. How would it be possible to switch to a container that doesn't belong to that pod template? Until the node keyword is used, the pod is not even created.

          Something like this works for me and seems more correct:

          podTemplate(
            label: 'dockerpod',
            containers: [
              containerTemplate(image: 'docker:17.11.0-ce', name: 'docker', command: 'cat', ttyEnabled: true),
              containerTemplate(image: 'maven:3.3.9-jdk-8-alpine', name: 'maven', command: 'cat', ttyEnabled: true)
            ],
            volumes: [hostPathVolume(hostPath: '/var/run/docker.sock', mountPath: '/var/run/docker.sock')]
          ) {
            node('dockerpod') {
              stage('Test') {
                container('docker') {
                  sh "docker version"
                }
                container('maven') {
                  sh "mvn -v"
                }
              } // stage
            } // node
          } // podTemplate: docker
          

          Carlos Sanchez made changes -
          Resolution New: Not A Defect [ 7 ]
          Status Original: Open [ 1 ] New: Closed [ 6 ]

          Yngvar Kristiansen added a comment - vlatombe: I see, thanks for the clarification. I tested your approach on versions 1.1.3 and 1.2, and it worked fine.

          The reason I chose nested podTemplates to use multiple containers is this section in the readme: https://github.com/jenkinsci/kubernetes-plugin#nesting-pod-templates

          You can nest multiple pod templates together in order to compose a single one.

          The example below composes two different podTemplates in order to create one with maven and docker capabilities.

          It then lists two podTemplates the same way I did, except it doesn't show how to write the node declaration (my reading of it is sketched below).

          It seems the whole documentation under "Nesting Pod templates" does the same thing I did, including the podTemplate part. I think it should be removed, since it doesn't work (or maybe there is some other use case for nested pod templates?).
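
          For reference, this is how I read that README section, with the missing node declaration filled in (a sketch using the images from my Jenkinsfile; whether the inner pod inherits the outer template's containers is exactly the behavior that changed between 1.1.3 and 1.1.4):

          // Nested pod templates, per the README's "Nesting Pod templates" section.
          podTemplate(label: 'dockerpod', containers: [
            containerTemplate(image: 'docker:17.11.0-ce', name: 'docker', command: 'cat', ttyEnabled: true)
          ]) {
            podTemplate(label: 'mavenpod', containers: [
              containerTemplate(image: 'maven:3.3.9-jdk-8-alpine', name: 'maven', command: 'cat', ttyEnabled: true)
            ]) {
              // On 1.1.3 this pod exposed both containers; on 1.1.4 only 'maven'.
              node('mavenpod') {
                container('docker') { sh 'docker version' }
                container('maven') { sh 'mvn -v' }
              }
            }
          }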

           

          Carlos Sanchez made changes -
          Link New: This issue is duplicated by JENKINS-49700 [ JENKINS-49700 ]

            Assignee: Carlos Sanchez (csanchez)
            Reporter: Yngvar Kristiansen (yngvark)
            Votes: 0
            Watchers: 7
