• Type: Bug
    • Resolution: Won't Fix
    • Priority: Blocker
    • Component: docker-workflow-plugin

      Create a job called "my job" with the following pipeline:

      node ("docker") {
          sh 'pwd'
          img = docker.image("ubuntu")
          img.inside {
              sh 'pwd'
          }
      }
      
      [Pipeline] Allocate node : Start
      Running on docker in /root/workspace/workspace/test-jobs/jnord/docker inside
      [Pipeline] node {
      [Pipeline] sh
      [docker inside] Running shell script
      + pwd
      /root/workspace/workspace/test-jobs/jnord/docker inside
      [Pipeline] sh
      [docker inside] Running shell script
      + docker inspect -f . ubuntu
      .
      [Pipeline] Run build steps inside a Docker container : Start
      $ docker run -t -d -u 0:0 -w "/root/workspace/workspace/test-jobs/jnord/docker inside" -v "/root/workspace/workspace/test-jobs/jnord/docker inside:/root/workspace/workspace/test-jobs/jnord/docker inside:rw" -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** ubuntu cat
      [Pipeline] withDockerContainer {
      [Pipeline] sh
      [docker inside] Running shell script
      sh: 1: cannot create /root/workspace/workspace/test-jobs/jnord/docker inside@tmp/durable-5ea3c644/pid: Directory nonexistent
      sh: 1: cannot create /root/workspace/workspace/test-jobs/jnord/docker inside@tmp/durable-5ea3c644/jenkins-log.txt: Directory nonexistent
      sh: 1: cannot create /root/workspace/workspace/test-jobs/jnord/docker inside@tmp/durable-5ea3c644/jenkins-result.txt: Directory nonexistent
      [Pipeline] } //withDockerContainer
      $ docker stop 260e1e96255fd938e64d6bcd296ce616e01a40ece5ef093dfd97325418247904
      $ docker rm -f 260e1e96255fd938e64d6bcd296ce616e01a40ece5ef093dfd97325418247904
      [Pipeline] Run build steps inside a Docker container : End
      [Pipeline] } //node
      [Pipeline] Allocate node : End
      [Pipeline] End of Pipeline
      ERROR: script returned exit code -2
      Finished: FAILURE
      

          [JENKINS-33632] docker.inside broken with old client versions

          James Nord added a comment - edited

          Note: reproduces with a job without spaces in the name.

          [docker_inside] Running shell script
          sh: 1: cannot create /root/workspace/workspace/test-jobs/jnord/docker_inside@tmp/durable-d8f4d687/pid: Directory nonexistent
          sh: 1: cannot create /root/workspace/workspace/test-jobs/jnord/docker_inside@tmp/durable-d8f4d687/jenkins-log.txt: Directory nonexistent
          sh: 1: cannot create /root/workspace/workspace/test-jobs/jnord/docker_inside@tmp/durable-d8f4d687/jenkins-result.txt: Directory nonexistent
          


          Antonio Muñiz added a comment -

          Maybe related to https://github.com/jenkinsci/docker-workflow-plugin/pull/33?

          Antonio Muñiz added a comment - edited

          I think it will work if you upgrade Pipeline to 1.15 (to pick up this commit). But it's not clear to me who is creating or trying to access the @tmp directory, as you seem to have a set of versions all prior to the @tmp change.
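
          For context: the sh step's durable-task wrapper keeps its control files in a @tmp sibling of the workspace, which is why every failing write above targets a path ending in @tmp/durable-…. A rough sketch of the mechanism (the directory suffix and script name are illustrative placeholders, not the wrapper's real contents):

          # Control files behind a durable sh step, written to the
          # workspace's @tmp sibling (illustrative sketch):
          CONTROL_DIR="$WORKSPACE@tmp/durable-XXXXXXXX"
          echo "$$" > "$CONTROL_DIR/pid"                        # shell pid, for polling
          my_script.sh > "$CONTROL_DIR/jenkins-log.txt" 2>&1    # captured output
          echo "$?" > "$CONTROL_DIR/jenkins-result.txt"         # exit status
          # If the container mounts only the workspace and not workspace@tmp,
          # each of these writes fails with "Directory nonexistent", exactly
          # as in the logs above.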


          James Nord added a comment -

          Fixed by updating docker-workflow from 1.2 to 1.4, as suggested by jglick.


          Jacob Blain Christen added a comment -

          I am seeing this on 1.642.2 with pipeline/workflow updates prior to the 2.0 split and docker-workflow 1.4.

          Vedran Lerenc added a comment -

          Seeing the same issues, even with Jenkins 2.7 (running as a container itself) and the demo image golang:1.5.0, but also my own image:

          Entering stage main
          Proceeding
          [Pipeline] sh
          [service-fabrik-deployments] Running shell script
          + docker inspect -f . golang:1.5.0
          .
          [Pipeline] withDockerContainer
          $ docker run -t -d -u 1000:1000 -w /var/jenkins_home/workspace/service-fabrik-deployments -v /var/jenkins_home/workspace/service-fabrik-deployments:/var/jenkins_home/workspace/service-fabrik-deployments:rw -v /var/jenkins_home/workspace/service-fabrik-deployments@tmp:/var/jenkins_home/workspace/service-fabrik-deployments@tmp:rw -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** golang:1.5.0 cat
          [Pipeline] {
          [Pipeline] sh
          [service-fabrik-deployments] Running shell script
          sh: 1: cannot create /var/jenkins_home/workspace/service-fabrik-deployments@tmp/durable-1db39848/pid: Directory nonexistent
          sh: 1: cannot create /var/jenkins_home/workspace/service-fabrik-deployments@tmp/durable-1db39848/jenkins-log.txt: Directory nonexistent
          sh: 1: cannot create /var/jenkins_home/workspace/service-fabrik-deployments@tmp/durable-1db39848/jenkins-result.txt: Directory nonexistent
          [Pipeline] }
          $ docker stop 478b92add1f1ac18c87bcde97f0d744c88531603d1ee0c68e7831a6ee8d41239
          $ docker rm -f 478b92add1f1ac18c87bcde97f0d744c88531603d1ee0c68e7831a6ee8d41239
          [Pipeline] // withDockerContainer
          [Pipeline] }
          [Pipeline] // node
          [Pipeline] End of Pipeline
          ERROR: script returned exit code -2
          Finished: FAILURE
          

          System Info:

          System Properties
          
          Name	Value
          awt.toolkit	sun.awt.X11.XToolkit
          executable-war	/usr/share/jenkins/jenkins.war
          file.encoding	UTF-8
          file.encoding.pkg	sun.io
          file.separator	/
          hudson.diyChunking	true
          hudson.model.DirectoryBrowserSupport.CSP	
          java.awt.graphicsenv	sun.awt.X11GraphicsEnvironment
          java.awt.headless	true
          java.awt.printerjob	sun.print.PSPrinterJob
          java.class.path	/usr/share/jenkins/jenkins.war
          java.class.version	52.0
          java.endorsed.dirs	/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/endorsed
          java.ext.dirs	/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/ext:/usr/java/packages/lib/ext
          java.home	/usr/lib/jvm/java-8-openjdk-amd64/jre
          java.io.tmpdir	/tmp
          java.library.path	/usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
          java.runtime.name	OpenJDK Runtime Environment
          java.runtime.version	1.8.0_66-internal-b01
          java.specification.name	Java Platform API Specification
          java.specification.vendor	Oracle Corporation
          java.specification.version	1.8
          java.vendor	Oracle Corporation
          java.vendor.url	http://java.oracle.com/
          java.vendor.url.bug	http://bugreport.sun.com/bugreport/
          java.version	1.8.0_66-internal
          java.vm.info	mixed mode
          java.vm.name	OpenJDK 64-Bit Server VM
          java.vm.specification.name	Java Virtual Machine Specification
          java.vm.specification.vendor	Oracle Corporation
          java.vm.specification.version	1.8
          java.vm.vendor	Oracle Corporation
          java.vm.version	25.66-b01
          jna.loaded	true
          jna.platform.library.path	/usr/lib/x86_64-linux-gnu:/lib/x86_64-linux-gnu:/lib64:/usr/lib:/lib
          jnidispatch.path	/tmp/jna--1712433994/jna1486343830700614445.tmp
          line.separator	
          mail.smtp.sendpartial	true
          mail.smtps.sendpartial	true
          os.arch	amd64
          os.name	Linux
          os.version	3.13.0-79-generic
          path.separator	:
          sun.arch.data.model	64
          sun.boot.class.path	/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/resources.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/rt.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/sunrsasign.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jsse.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jce.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/charsets.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/jfr.jar:/usr/lib/jvm/java-8-openjdk-amd64/jre/classes
          sun.boot.library.path	/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/amd64
          sun.cpu.endian	little
          sun.cpu.isalist	
          sun.font.fontmanager	sun.awt.X11FontManager
          sun.io.unicode.encoding	UnicodeLittle
          sun.java.command	/usr/share/jenkins/jenkins.war
          sun.java.launcher	SUN_STANDARD
          sun.jnu.encoding	UTF-8
          sun.management.compiler	HotSpot 64-Bit Tiered Compilers
          sun.os.patch.level	unknown
          svnkit.http.methods	Digest,Basic,NTLM,Negotiate
          svnkit.ssh2.persistent	false
          user.dir	/
          user.home	/var/jenkins_home
          user.language	en
          user.name	jenkins
          user.timezone	Etc/UTC
          Environment Variables
          
          Name	Value
          BASH_FUNC_copy_reference_file%%	() {  f="${1%/}";
          b="${f%.override}";
          echo "$f" >> "$COPY_REFERENCE_FILE_LOG";
          rel="${b:23}";
          dir=$(dirname "${b}");
          echo " $f -> $rel" >> "$COPY_REFERENCE_FILE_LOG";
          if [[ ! -e $JENKINS_HOME/${rel} || $f = *.override ]]; then
          echo "copy $rel to JENKINS_HOME" >> "$COPY_REFERENCE_FILE_LOG";
          mkdir -p "$JENKINS_HOME/${dir:23}";
          cp -r "${f}" "$JENKINS_HOME/${rel}";
          [[ ${rel} == plugins/*.jpi ]] && touch "$JENKINS_HOME/${rel}.pinned";
          fi
          }
          TINI_SHA	066ad710107dc7ee05d3aa6e4974f01dc98f3888
          COPY_REFERENCE_FILE_LOG	/var/jenkins_home/copy_reference_file.log
          HOME	/var/jenkins_home
          HOSTNAME	eedd3e210c78
          JAVA_DEBIAN_VERSION	8u66-b01-1~bpo8+1
          JAVA_OPTS	-Dhudson.model.DirectoryBrowserSupport.CSP=
          JAVA_VERSION	8u66
          JENKINS_HOME	/var/jenkins_home
          CA_CERTIFICATES_JAVA_VERSION	20140324
          JENKINS_SLAVE_AGENT_PORT	50000
          JENKINS_UC	https://updates.jenkins.io
          JENKINS_VERSION	2.7
          LANG	C.UTF-8
          PATH	/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
          PWD	/
          SHLVL	0
          JENKINS_SHA	69e3ab0cc44acc3d711efb7436505e967174d628
          Plugins
          
          Name	Version	Enabled	Pinned
          ace-editor	1.1	true	false
          analysis-core	1.78	true	false
          ant	1.3	true	false
          antisamy-markup-formatter	1.3	true	false
          authentication-tokens	1.2	true	false
          branch-api	1.9	true	false
          build-timeout	1.16	true	false
          checkstyle	3.46	true	false
          cloudbees-folder	5.11	true	false
          cloudfoundry	1.5	true	false
          cobertura	1.9.8	true	false
          credentials	2.0.7	true	false
          credentials-binding	1.7	true	false
          docker-commons	1.3.1	true	false
          docker-custom-build-environment	1.6.5	true	false
          docker-workflow	1.4	true	false
          durable-task	1.10	true	false
          email-ext	2.43	true	false
          envinject	1.92.1	true	false
          external-monitor-job	1.4	true	false
          git	2.4.4	true	false
          git-client	1.19.6	true	false
          git-server	1.6	true	false
          github	1.19.1	true	false
          github-api	1.75	true	false
          github-branch-source	1.7	true	false
          github-organization-folder	1.3	true	false
          gradle	1.24	true	false
          handlebars	1.1.1	true	false
          htmlpublisher	1.11	true	false
          icon-shim	2.0.3	true	false
          javadoc	1.3	true	false
          jquery-detached	1.2.1	true	false
          junit	1.13	true	false
          ldap	1.12	true	false
          mailer	1.17	true	false
          mapdb-api	1.0.9.0	true	false
          matrix-auth	1.4	true	false
          matrix-project	1.7	true	false
          maven-plugin	2.13	true	false
          momentjs	1.1.1	true	false
          nodejs	0.2.1	true	false
          pam-auth	1.2	true	false
          pegdown-formatter	1.3	true	false
          pipeline-build-step	2.1	true	false
          pipeline-input-step	2.0	true	false
          pipeline-rest-api	1.4	true	false
          pipeline-stage-step	2.1	true	false
          pipeline-stage-view	1.4	true	false
          plain-credentials	1.2	true	false
          scm-api	1.2	true	false
          script-security	1.19	true	false
          ssh-credentials	1.12	true	false
          ssh-slaves	1.11	true	false
          structs	1.1	true	false
          subversion	2.5.7	true	false
          timestamper	1.8.2	true	false
          token-macro	1.12.1	true	false
          windows-slaves	1.1	true	false
          workflow-aggregator	2.1	true	false
          workflow-api	2.0	true	false
          workflow-basic-steps	2.0	true	false
          workflow-cps	2.4	true	false
          workflow-cps-global-lib	2.0	true	false
          workflow-durable-task-step	2.0	true	false
          workflow-job	2.2	true	false
          workflow-multibranch	2.6	true	false
          workflow-scm-step	2.0	true	false
          workflow-step-api	2.1	true	false
          workflow-support	2.0	true	false
          ws-cleanup	0.29	true	false
          


          Joost Meijles added a comment -

          Getting the same issue with v2.7 (and Docker v1.11) as well.
          The following code, from the docker-workflow-plugin flow.groovy script:

          node {
              docker.withServer('10.57.3.192:2375') {
                  def maven = docker.image('maven:3.3.9-jdk-8')

                  maven.inside {
                      sh "echo paard"
                  }
              }
          }
          

          gives:

          Running on master in /var/jenkins_home/workspace/test
          [Pipeline] {
          [Pipeline] withDockerServer
          [Pipeline] {
          [Pipeline] sh
          [test] Running shell script
          + docker inspect -f . maven:3.3.9-jdk-8
          .
          [Pipeline] withDockerContainer
          $ docker run -t -d -u 1000:1000 -w /var/jenkins_home/workspace/test -v /var/jenkins_home/workspace/test:/var/jenkins_home/workspace/test:rw -v /var/jenkins_home/workspace/test@tmp:/var/jenkins_home/workspace/test@tmp:rw -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** maven:3.3.9-jdk-8 cat
          [Pipeline] {
          [Pipeline] sh
          [test] Running shell script
          sh: 1: cannot create /var/jenkins_home/workspace/test@tmp/durable-b6571b6c/pid: Directory nonexistent
          sh: 1: cannot create /var/jenkins_home/workspace/test@tmp/durable-b6571b6c/jenkins-log.txt: Directory nonexistent
          sh: 1: cannot create /var/jenkins_home/workspace/test@tmp/durable-b6571b6c/jenkins-result.txt: Directory nonexistent
          
          [Pipeline] }
          $ docker stop 1ec4e5fe48cb461e14b2ca6facccfa1ee94feff74c82045939a4c6dc16368027
          $ docker rm -f 1ec4e5fe48cb461e14b2ca6facccfa1ee94feff74c82045939a4c6dc16368027
          [Pipeline] // withDockerContainer
          [Pipeline] }
          [Pipeline] // withDockerServer
          [Pipeline] }
          [Pipeline] // node
          [Pipeline] End of Pipeline
          ERROR: script returned exit code -2
          Finished: FAILURE
          

          Should this issue be reopened? Or am I missing something in the setup?


          Christopher Eck added a comment - edited

          Also seeing this issue - it's blocking us from migrating our build to Jenkins.
          I expect we're supposed to be seeing --volumes-from instead of -v.

          Jenkins: 2.17
          Docker: 1.10.3

          The demo doesn't mount /var/lib/jenkins or any other host path into the Jenkins host container, and it doesn't appear to use slave containers to run the workflow steps, so I'm not sure how this works in the demo case either...

          I'd like to reopen this issue, as it doesn't appear to be working properly and there are no good docs that I can find on how to set up the Jenkins master and slave containers. I'll give it a day or two and reopen if I don't hear anything.


          Vitor Dantas added a comment -

          I reported a similar issue, JENKINS-37069; the only difference is that the message when trying to access the durable folder is "Permission denied" on my setup instead of "Directory nonexistent".


          Jacob Blain Christen added a comment -

          joostm, docker.inside only works with a local Docker host. If the docker daemon that you've specified via withServer is not actually local to the Jenkins executor process (typically this means the Jenkins slave and Docker daemon share the same PID namespace), it simply will not work. This is because docker.inside bind-mounts the workspace, which of course relies on the workspace filesystem existing on the same system as the docker daemon.
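
          To illustrate the constraint with a sketch (the path and image are illustrative, and this is not the plugin's exact invocation): -v resolves the host-side path on the machine where the daemon runs, not on the client that issued the command.

          # With a remote daemon, the bind mount resolves /path/to/ws on the
          # DAEMON's filesystem. If the workspace exists only on the Jenkins
          # agent, the directory inside the container is empty or missing.
          docker -H tcp://10.57.3.192:2375 run --rm \
              -v /path/to/ws:/path/to/ws:rw ubuntu ls /path/to/ws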

          Christopher Eck added a comment -

          In my environment, we're running the Jenkins master and slaves as containers within Kubernetes. I'm passing /var/run/docker.sock into each slave container so it connects to the local host's Docker daemon. On a slave, I'm attempting to run docker.inside on a newly built container and seeing this behavior. I expect, based on what I read of the demo, that this should work properly.
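
          (The socket pass-through described here usually looks something like the line below; the image name is an illustrative placeholder.)

          # Hand the host daemon's socket to the agent container, so docker
          # commands run on the agent talk to the host's Docker:
          docker run -d -v /var/run/docker.sock:/var/run/docker.sock jenkins-agent-image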

          Danny Waite added a comment -

          Guys, I'm also seeing this issue when running on Kubernetes, as outlined in the following article: https://cloud.google.com/solutions/configuring-jenkins-container-engine

          Any help appreciated, as Christopher mentioned this is a blocking issue.


          Jesse Glick added a comment -

          A recent release allows inside to work when the node is also running in a container, using --volumes-from.
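
          The difference between the two launch modes, sketched with illustrative placeholders ($WS for the workspace path, $AGENT_CID for the agent's container id):

          # Agent directly on the Docker host: bind-mount the workspace,
          # as in the docker run lines quoted earlier in this thread.
          docker run -t -d -v "$WS:$WS:rw" -v "$WS@tmp:$WS@tmp:rw" ubuntu cat

          # Agent itself running inside a container: a host-path bind mount
          # would point at a path the daemon cannot see, so the build
          # container shares the agent container's volumes instead.
          docker run -t -d --volumes-from "$AGENT_CID" ubuntu cat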


          Christopher Eck added a comment -

          Jesse, the problem seems to be that the plugin isn't issuing --volumes-from in this case. It's still attempting to use -v.

          Jesse Glick added a comment -

          chrisleck, if that is the only issue, then PR 56 might have fixed it.


          Christopher Eck added a comment -

          Nice! Looking forward to trying it out. Is there a timeline for the next release?

          Danny Waite added a comment -

          This very simple project (derived from Google's Jenkins/Kubernetes example) appears to be working:

          https://github.com/danny-waite/jenkins-sample-pipeline-app

          Great work, many thanks. I will continue with more detailed testing; I realise this isn't done 'by the book', i.e. the use of `sh` blocks.


          Danny Waite added a comment -

          Spoke too soon, this example does not work:

          https://github.com/danny-waite/gceme


          Jesse Glick added a comment -

          Version 1.9 had a related fix. Please do not reopen without complete, self-contained steps to reproduce from scratch.


          Pierre Beitz added a comment - edited

          Hello jglick,

          I think I found a self-contained example showing that the issue is still around. Using the following versions of Docker:

          • Docker 1.12.2, build bb80604 (on linux).
          • Docker 1.12.3, build 6b644ec, experimental (on windows).

          And using this gist, I ended up with the following error on my two machines:

          $ docker run -t -d -u 1000:1000 -w /var/jenkins_home/jobs/demo/workspace -v /var/jenkins_home/jobs/demo/workspace:/var/jenkins_home/jobs/demo/workspace:rw -v /var/jenkins_home/jobs/demo/workspace@tmp:/var/jenkins_home/jobs/demo/workspace@tmp:rw -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** -e ******** --entrypoint cat ubuntu
          [Pipeline] {
          [Pipeline] sh
          [workspace] Running shell script
          sh: 1: cannot create /var/jenkins_home/jobs/demo/workspace@tmp/durable-252b7e97/pid: Directory nonexistent
          sh: 1: cannot create /var/jenkins_home/jobs/demo/workspace@tmp/durable-252b7e97/jenkins-log.txt: Directory nonexistent
          sh: 1: cannot create /var/jenkins_home/jobs/demo/workspace@tmp/durable-252b7e97/jenkins-result.txt: Directory nonexistent
          

          The Docker Pipeline plugin version is 1.9.1 at the time of this writing, but as you can see in the console dump, the -v flag is used instead of --volumes-from, as if the plugin did not detect it was running inside a container.

          Steps to reproduce:

          • Copy this gist.
          • Set the executable permissions on the scripts.
          • Run the build script and then the run script.
          • The Jenkins instance will be available on port 8080. There is one job in it running the following:
          node ('master') {
              sh 'pwd'
              img = docker.image('ubuntu')
              img.inside {
                  sh 'pwd'
              }
          }
          

          Do you think there is enough to reopen this issue?

          Thanks


          Jesse Glick added a comment -

          The errors from sh are irrelevant; the point is that your docker run command is missing --volumes-from. Offhand I do not know why. Someday when I have time I will try to follow the steps to reproduce and track down the problem.


          Pierre Beitz added a comment -

          Thank you for your answer. I had a quick look at the code and wrote a failing unit test (which you can also find in the gist):

              @Test
              public void test_cgroup_string_matching_bug() {
          
                  final String[] possibleCgroupStrings = new String[] {
                          "14:name=systemd:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "13:pids:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "12:hugetlb:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "11:net_prio:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "10:perf_event:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "9:net_cls:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "8:freezer:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "7:devices:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "6:memory:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "5:blkio:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "4:cpuacct:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "3:cpu:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "2:cpuset:/docker/45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11",
                          "1:name=openrc:/docker"
                  };
          
                  for (final String possibleCgroupString : possibleCgroupStrings) {
                      final Pattern pattern = Pattern.compile(DockerClient.CGROUP_MATCHER_PATTERN);
                      Matcher matcher = pattern.matcher(possibleCgroupString);
                      Assert.assertTrue(matcher.find());
                      Assert.assertEquals("45686cf8ff804c6250e87c02f768f44c63f4d25987e904189ea9156af9f63a11", matcher.group(1));
                  }
          
              }
          

          As you can see, the pattern is not matching one of the known cgroup formats. I have no idea how to proceed from there. Note that I extracted these cgroup strings from Docker running on top of Windows 10. I also have the problem on Linux Mint; I can extract the content of /proc/self/cgroup there too if it helps.
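
          A rough shell equivalent of the detection being discussed (an approximation of the approach, not the plugin's actual CGROUP_MATCHER_PATTERN): scan /proc/self/cgroup for a 64-hex-digit container id.

          # Pull the first 64-hex-digit id out of /proc/self/cgroup.
          # A line such as "1:name=openrc:/docker" carries no id at all
          # and contributes nothing to the match.
          grep -oE '[0-9a-f]{64}' /proc/self/cgroup | head -n 1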


          Jesse Glick added a comment -

          I think the test is not correct. When I amended it as follows

          diff --git a/src/test/java/org/jenkinsci/plugins/docker/workflow/client/DockerClientTest.java b/src/test/java/org/jenkinsci/plugins/docker/workflow/client/DockerClientTest.java
          index 2edee8e..5e665c7 100644
          --- a/src/test/java/org/jenkinsci/plugins/docker/workflow/client/DockerClientTest.java
          +++ b/src/test/java/org/jenkinsci/plugins/docker/workflow/client/DockerClientTest.java
          @@ -94,15 +94,17 @@ public class DockerClientTest {
               public void test_cgroup_string_matching() {
               	
               	final String[] possibleCgroupStrings = new String[] {
          -    		"2:cpu:/docker/3dd988081e7149463c043b5d9c57d7309e079c5e9290f91feba1cc45a04d6a5b",
          -    		"4:cpuset:/system.slice/docker-3dd988081e7149463c043b5d9c57d7309e079c5e9290f91feba1cc45a04d6a5b.scope"
          +            "2:cpu:/docker/3dd988081e7149463c043b5d9c57d7309e079c5e9290f91feba1cc45a04d6a5b",
          +            "4:cpuset:/system.slice/docker-3dd988081e7149463c043b5d9c57d7309e079c5e9290f91feba1cc45a04d6a5b.scope",
          +            "3:cpu:/docker/3dd988081e7149463c043b5d9c57d7309e079c5e9290f91feba1cc45a04d6a5b",
          +            "2:cpuset:/docker/3dd988081e7149463c043b5d9c57d7309e079c5e9290f91feba1cc45a04d6a5b",
               	};
               	
               	for (final String possibleCgroupString : possibleCgroupStrings) {
               		final Pattern pattern = Pattern.compile(DockerClient.CGROUP_MATCHER_PATTERN);
               		Matcher matcher = pattern.matcher(possibleCgroupString);
          -    		Assert.assertTrue(matcher.find());
          -    		Assert.assertEquals("3dd988081e7149463c043b5d9c57d7309e079c5e9290f91feba1cc45a04d6a5b", matcher.group(1));
          +    		Assert.assertTrue(possibleCgroupString, matcher.find());
          +    		Assert.assertEquals(possibleCgroupString, "3dd988081e7149463c043b5d9c57d7309e079c5e9290f91feba1cc45a04d6a5b", matcher.group(1));
           		}
               	
               }
          

          it passes.


          Pierre Beitz added a comment -

          Hello again,

          You are absolutely right, and I went debugging again only to find out... that the issue was on my side.

          To give more context for people coming here: it was in fact the docker client inside my Jenkins docker image that was not correct. I installed it using apt-get install from the official jenkins image, which was a mistake, as I ended up with docker client version 1.6.2 (communicating with a docker daemon on my host running version 1.12...).
          The problem ultimately came from the fact that the docker inspect output format was changed for volumes in docker 1.8 (I'm pretty sure this comes from this PR). As a consequence, the plugin did not find any volumes, falling back to a --volume mount.

          Installing the latest version of docker using curl -sSL https://get.docker.com/ | sh solved the issue. I guess this means the issue can be closed. If you are interested, I made a pull request to add an error in the logs saying: "The docker version is less than v1.8. Running a 'docker.inside' from inside a container will not work."
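
          For anyone hitting the same mismatch, a sketch of the format change described above ($CONTAINER_ID is a placeholder; the template fields are the documented docker inspect ones): container volumes moved from a .Volumes map to a .Mounts list around Docker 1.8.

          # Pre-1.8 clients exposed container volumes as a map:
          docker inspect -f '{{ .Volumes }}' "$CONTAINER_ID"
          # Docker >= 1.8 reports a list of mount structs instead:
          docker inspect -f '{{ .Mounts }}' "$CONTAINER_ID"
          # With the 1.6 client against the 1.12 daemon the volume lookup
          # came up empty here, so the plugin fell back to plain -v mounts.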


          Jesse Glick added a comment -

          There is already a minimum client version specified in the plugin, so this could be updated (or is it the server version that matters?), or we could expand the list of parsable formats. Either way, it would definitely be nice for the plugin to print more helpful diagnostics.


          Pierre Beitz added a comment -

          I had a look at the non-working and working environments and came up with the following:

          working

          Client:
           Version:      1.12.3
           API version:  1.24
          
          Server:
           Version:      1.12.3
           API version:  1.24
          

          non-working

          Client version: 1.6.2
          Client API version: 1.18
          Server version: 1.12.3
          Server API version: 1.24
          

          I guess the client version is the one the plugin should look at, then.
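
          A quick way to compare the two on a box (the Go-template flag is assumed to be available on reasonably modern clients):

          # Client and server versions can differ, as in the output above;
          # per this thread, it is the client version that matters here.
          docker version --format 'client={{.Client.Version}} server={{.Server.Version}}'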


          SCM/JIRA link daemon added a comment -

          Code changed in jenkins
          User: Jesse Glick
          Path:
          src/main/java/org/jenkinsci/plugins/docker/workflow/WithContainerStep.java
          http://jenkins-ci.org/commit/docker-workflow-plugin/98d2af626cca98f05d358c11431c9f0b5600809b
          Log:
          Merge pull request #80 from PierreBtz/dev

          JENKINS-33632 Add a warning if the detected docker version is less than v1.8

          Compare: https://github.com/jenkinsci/docker-workflow-plugin/compare/eb142f64eba3...98d2af626cca


          Jesse Glick added a comment -

          Warning added with https://github.com/jenkinsci/docker-workflow-plugin/pull/80, thanks!

            Assignee: Pierre Beitz (pierrebtz)
            Reporter: James Nord (teilo)
            Votes: 5
            Watchers: 15