Jenkins / JENKINS-462

job hangs on slave


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Fixed
    • Component/s: remoting
    • Labels:
      None
    • Environment:
      Platform: All, OS: All

      Description

      I installed version 1.100 in my environment. The build of one project hangs on
      the slave node after the ANT script finishes. Build artifacts are transferred to
      the master node, but no JUnit or Emma reports are transferred and nothing is
      written to the job's log file or the Tomcat log file. The job just hangs.
      If the build runs on the master node, everything is OK.
      The strange thing is that another project (which also uses Emma and JUnit reports)
      builds on the slave node without problems.

      I will attach two config files: one for the job which hangs and one for the job which works fine.

        Attachments

        1. 111.log
          25 kB
        2. config.xml
          2 kB
        3. config.xml
          6 kB

          Activity

          ramazanyich2 added a comment -

          I finally found out when it happens:
          the job hangs while publishing a large amount of artifacts.
          In my job I have 56 MB of data which is published as artifacts,
          and in the job which doesn't hang it is "only" 20 MB.
          It seems like there is some issue with PipedInputStream for large amounts of data,
          because when I kill the job the exception is always the same (for example, the last stack
          trace which I had; a minimal sketch of this failure mode follows the stack trace):

          FATAL: Error while expanding null
          Error while expanding null
          at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:121)
          at org.apache.tools.ant.taskdefs.Expand.execute(Expand.java:119)
          at hudson.FilePath.readFromTar(FilePath.java:743)
          at hudson.FilePath.copyRecursiveTo(FilePath.java:678)
          at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:65)
          at hudson.model.Build$RunnerImpl.post(Build.java:146)
          at hudson.model.Run.run(Run.java:557)
          at hudson.model.Build.run(Build.java:110)
          at hudson.model.Executor.run(Executor.java:61)
          Caused by: java.io.InterruptedIOException
          at java.io.PipedInputStream.read(PipedInputStream.java:262)
          at java.io.PipedInputStream.read(PipedInputStream.java:305)
          at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:214)
          at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:134)
          at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:87)
          at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
          at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
          at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
          at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
          at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
          at org.apache.tools.tar.TarBuffer.readBlock(TarBuffer.java:257)
          at org.apache.tools.tar.TarBuffer.readRecord(TarBuffer.java:223)
          at org.apache.tools.tar.TarInputStream.read(TarInputStream.java:340)
          at java.io.FilterInputStream.read(FilterInputStream.java:90)
          at org.apache.tools.ant.taskdefs.Expand.extractFile(Expand.java:282)
          at org.apache.tools.ant.taskdefs.Untar.expandStream(Untar.java:142)
          at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:119)
          ... 8 more
          — Nested Exception —
          java.io.InterruptedIOException
          at java.io.PipedInputStream.read(PipedInputStream.java:262)
          at java.io.PipedInputStream.read(PipedInputStream.java:305)
          at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:214)
          at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:134)
          at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:87)
          at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
          at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
          at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
          at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
          at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
          at org.apache.tools.tar.TarBuffer.readBlock(TarBuffer.java:257)
          at org.apache.tools.tar.TarBuffer.readRecord(TarBuffer.java:223)
          at org.apache.tools.tar.TarInputStream.read(TarInputStream.java:340)
          at java.io.FilterInputStream.read(FilterInputStream.java:90)
          at org.apache.tools.ant.taskdefs.Expand.extractFile(Expand.java:282)
          at org.apache.tools.ant.taskdefs.Untar.expandStream(Untar.java:142)
          at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:119)
          at org.apache.tools.ant.taskdefs.Expand.execute(Expand.java:119)
          at hudson.FilePath.readFromTar(FilePath.java:743)
          at hudson.FilePath.copyRecursiveTo(FilePath.java:678)
          at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:65)
          at hudson.model.Build$RunnerImpl.post(Build.java:146)
          at hudson.model.Run.run(Run.java:557)
          at hudson.model.Build.run(Build.java:110)
          at hudson.model.Executor.run(Executor.java:61)

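          A minimal sketch of the suspected failure mode (this is not Hudson's actual
          copyRecursiveTo/remoting code; the class name and sizes below are made up for
          illustration): java.io piped streams have a small fixed internal buffer
          (1024 bytes by default), and PipedOutputStream.write() blocks once that buffer
          is full until another thread drains the PipedInputStream. If the reading side
          stops making progress, the transfer stalls as soon as the payload outgrows the
          buffer, and larger transfers spend far longer in this producer/consumer handoff,
          so they are more likely to hit it. Interrupting the blocked reader then surfaces
          as the InterruptedIOException seen above. The real hang in Hudson may have a
          different proximate cause (remoting runs the two sides on separate threads and
          machines), but the blocking behaviour of the pipe is the same.

          import java.io.IOException;
          import java.io.PipedInputStream;
          import java.io.PipedOutputStream;

          // Illustrative only: shows how a piped-stream copy stalls when the producer
          // and consumer do not run concurrently on separate threads.
          public class PipedStreamStallSketch {
              public static void main(String[] args) throws IOException {
                  PipedOutputStream out = new PipedOutputStream();
                  PipedInputStream in = new PipedInputStream(out); // ~1 KB internal buffer

                  byte[] payload = new byte[64 * 1024]; // stand-in for a large artifact set

                  // Writing from the same thread that is supposed to read later:
                  // the first ~1 KB is buffered, then write() blocks forever because
                  // nothing is draining the pipe.
                  out.write(payload); // never returns
                  out.close();

                  byte[] buf = new byte[8192];
                  while (in.read(buf) != -1) {
                      // would consume the payload, but execution never reaches this loop
                  }
              }
          }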

          ramazanyich2 added a comment -

          Could it be related to this issue http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4028462 ?

          ramazanyich2 added a comment -

          Created an attachment (id=54)
          Tomcat thread stack captured while the job is hanging (generated by calling kill -3)

          Kohsuke Kawaguchi added a comment -

          Thanks. I think I fixed this problem in 1.101.

          If you can try the latest snapshot, that would be wonderful.

          ramazanyich2 added a comment -

          It works with the snapshot:
          Hudson ver. 1.101-SNAPSHOT (private-04/12/2007 02:18-hudson)

          Thanks a lot !


            People

             Assignee:
             Unassigned
             Reporter:
             ramazanyich2
             Votes:
             0
             Watchers:
             0

              Dates

              Created:
              Updated:
              Resolved: