- Bug
- Resolution: Fixed
- Major
- None
- Platform: All, OS: All
I installed version 1.100 in my environment. The build of one project hangs on
a slave node after the ANT script finishes. Build artifacts are transferred to
the master node, but no JUnit or Emma reports are transferred and nothing is
written to the job log or the Tomcat log file. The job just hangs.
If the build runs on the master node, everything is OK.
The strange thing is that another project (which also uses Emma and JUnit
reports) can be built on the slave node without problems.
I will attach two config files: one for the job which hangs and one for the
job which works.
- 111.log (25 kB)
- config.xml (2 kB)
- config.xml (6 kB)
[JENKINS-462] job hangs on slave
When the build runs on the slave, only the following is printed in catalina.out:
Apr 12, 2007 12:04:23 PM hudson.model.Run setResult
INFO: CERTIONE2_CORE #135 : result is set to SUCCESS by hudson.model.Run.run(Run.java:544)
Apr 12, 2007 12:04:23 PM hudson.model.Run run
INFO: CERTIONE2_CORE #135 main build action completed: SUCCESS
and when it runs on the master, the following is in catalina.out:
INFO: CERTIONE2_CORE #136 : result is set to SUCCESS by
hudson.model.Run.run(Run.java:544)
Apr 12, 2007 12:17:41 PM hudson.model.Run run
INFO: CERTIONE2_CORE #136 main build action completed: SUCCESS
Apr 12, 2007 12:18:30 PM hudson.model.Run setResult
INFO: CERTIONE2_CORE #136 : result is set to UNSTABLE by
hudson.tasks.junit.JUnitResultArchiver.perform(JUnitResultArchiver.java:90)
When I kill the job I get the following stack trace (maybe it can help you):
BUILD SUCCESSFUL
Total time: 5 minutes 5 seconds
FATAL: Error while expanding null
Error while expanding null
at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:121)
at org.apache.tools.ant.taskdefs.Expand.execute(Expand.java:119)
at hudson.FilePath.readFromTar(FilePath.java:743)
at hudson.FilePath.copyRecursiveTo(FilePath.java:678)
at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:65)
at hudson.model.Build$RunnerImpl.post(Build.java:146)
at hudson.model.Run.run(Run.java:557)
at hudson.model.Build.run(Build.java:110)
at hudson.model.Executor.run(Executor.java:61)
Caused by: java.io.InterruptedIOException
at java.io.PipedInputStream.read(PipedInputStream.java:262)
at java.io.PipedInputStream.read(PipedInputStream.java:305)
at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:214)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:134)
at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:87)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
at org.apache.tools.tar.TarBuffer.readBlock(TarBuffer.java:257)
at org.apache.tools.tar.TarBuffer.readRecord(TarBuffer.java:223)
at org.apache.tools.tar.TarInputStream.read(TarInputStream.java:340)
at java.io.FilterInputStream.read(FilterInputStream.java:90)
at org.apache.tools.ant.taskdefs.Expand.extractFile(Expand.java:282)
at org.apache.tools.ant.taskdefs.Untar.expandStream(Untar.java:142)
at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:119)
... 8 more
— Nested Exception —
java.io.InterruptedIOException
at java.io.PipedInputStream.read(PipedInputStream.java:262)
at java.io.PipedInputStream.read(PipedInputStream.java:305)
at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:214)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:134)
at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:87)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
at org.apache.tools.tar.TarBuffer.readBlock(TarBuffer.java:257)
at org.apache.tools.tar.TarBuffer.readRecord(TarBuffer.java:223)
at org.apache.tools.tar.TarInputStream.read(TarInputStream.java:340)
at java.io.FilterInputStream.read(FilterInputStream.java:90)
at org.apache.tools.ant.taskdefs.Expand.extractFile(Expand.java:282)
at org.apache.tools.ant.taskdefs.Untar.expandStream(Untar.java:142)
at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:119)
at org.apache.tools.ant.taskdefs.Expand.execute(Expand.java:119)
at hudson.FilePath.readFromTar(FilePath.java:743)
at hudson.FilePath.copyRecursiveTo(FilePath.java:678)
at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:65)
at hudson.model.Build$RunnerImpl.post(Build.java:146)
at hudson.model.Run.run(Run.java:557)
at hudson.model.Build.run(Build.java:110)
at hudson.model.Executor.run(Executor.java:61)
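The frames in this trace suggest (an assumption based on the trace, not a quote of Hudson's actual source) that copyRecursiveTo streams a gzipped tar from the build node through a PipedOutputStream/PipedInputStream pair and expands it on the other end with Ant's Untar. Below is a minimal Java sketch of that pattern, with hypothetical class names: if the writing side of such a pipe stalls or dies mid-stream, the reading side blocks forever inside PipedInputStream.read(), which is consistent with the hang and with the InterruptedIOException that only appears once the job is killed.

import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Hypothetical sketch, not Hudson's code: one thread writes a gzipped stream
// into a PipedOutputStream while another reads the connected PipedInputStream
// and expands it. If the writer stalls or dies mid-stream, the reader blocks
// forever inside PipedInputStream.read().
public class PipedCopySketch {
    public static void main(String[] args) throws Exception {
        PipedOutputStream sink = new PipedOutputStream();
        PipedInputStream source = new PipedInputStream(sink);

        // Writer thread: stands in for the remote side producing the archive.
        Thread writer = new Thread(() -> {
            try (OutputStream out = new GZIPOutputStream(sink)) {
                byte[] chunk = new byte[8192];
                for (int i = 0; i < 1000; i++) {   // ~8 MB of dummy payload
                    out.write(chunk);
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        writer.start();

        // Reader side: stands in for readFromTar()/Untar. It blocks in
        // PipedInputStream.read() whenever the pipe is empty, so a dead or
        // stuck writer leaves this loop waiting forever.
        long total = 0;
        try (InputStream in = new BufferedInputStream(new GZIPInputStream(source))) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                total += n;
            }
        }
        System.out.println("read " + total + " bytes");
        writer.join();
    }
}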
Finally found out when it happens.
The job hangs while publishing a large amount of artifacts.
In my job I have 56 MB of data which is published as artifacts,
and in the job which doesn't hang it is "only" 20 MB.
It seems like there is some issue with PipedInputStream for large amounts of
data, because when I kill the job the exception is always the same (for
example the last stack trace I got):
FATAL: Error while expanding null
Error while expanding null
at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:121)
at org.apache.tools.ant.taskdefs.Expand.execute(Expand.java:119)
at hudson.FilePath.readFromTar(FilePath.java:743)
at hudson.FilePath.copyRecursiveTo(FilePath.java:678)
at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:65)
at hudson.model.Build$RunnerImpl.post(Build.java:146)
at hudson.model.Run.run(Run.java:557)
at hudson.model.Build.run(Build.java:110)
at hudson.model.Executor.run(Executor.java:61)
Caused by: java.io.InterruptedIOException
at java.io.PipedInputStream.read(PipedInputStream.java:262)
at java.io.PipedInputStream.read(PipedInputStream.java:305)
at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:214)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:134)
at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:87)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
at org.apache.tools.tar.TarBuffer.readBlock(TarBuffer.java:257)
at org.apache.tools.tar.TarBuffer.readRecord(TarBuffer.java:223)
at org.apache.tools.tar.TarInputStream.read(TarInputStream.java:340)
at java.io.FilterInputStream.read(FilterInputStream.java:90)
at org.apache.tools.ant.taskdefs.Expand.extractFile(Expand.java:282)
at org.apache.tools.ant.taskdefs.Untar.expandStream(Untar.java:142)
at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:119)
... 8 more
— Nested Exception —
java.io.InterruptedIOException
at java.io.PipedInputStream.read(PipedInputStream.java:262)
at java.io.PipedInputStream.read(PipedInputStream.java:305)
at java.util.zip.InflaterInputStream.fill(InflaterInputStream.java:214)
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:134)
at java.util.zip.GZIPInputStream.read(GZIPInputStream.java:87)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:254)
at java.io.BufferedInputStream.read(BufferedInputStream.java:313)
at org.apache.tools.tar.TarBuffer.readBlock(TarBuffer.java:257)
at org.apache.tools.tar.TarBuffer.readRecord(TarBuffer.java:223)
at org.apache.tools.tar.TarInputStream.read(TarInputStream.java:340)
at java.io.FilterInputStream.read(FilterInputStream.java:90)
at org.apache.tools.ant.taskdefs.Expand.extractFile(Expand.java:282)
at org.apache.tools.ant.taskdefs.Untar.expandStream(Untar.java:142)
at org.apache.tools.ant.taskdefs.Untar.expandResource(Untar.java:119)
at org.apache.tools.ant.taskdefs.Expand.execute(Expand.java:119)
at hudson.FilePath.readFromTar(FilePath.java:743)
at hudson.FilePath.copyRecursiveTo(FilePath.java:678)
at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:65)
at hudson.model.Build$RunnerImpl.post(Build.java:146)
at hudson.model.Run.run(Run.java:557)
at hudson.model.Build.run(Build.java:110)
at hudson.model.Executor.run(Executor.java:61)
Could it be related to this issue
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=4028462 ?
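For reference, the PipedInputStream limitation this comment seems to be pointing at is its small internal buffer (1024 bytes by default): a write() blocks as soon as the buffer is full and no live reader thread drains it. The following hypothetical demo (class name invented) stalls on purpose; it is not a claim about what the linked Sun bug describes, only an illustration of why small transfers may succeed while a large one can get stuck.

import java.io.IOException;
import java.io.PipedInputStream;
import java.io.PipedOutputStream;

// Hypothetical demo: intentionally stalls to show the PipedInputStream buffer
// limit. The default pipe buffer is 1024 bytes; writes beyond that block until
// a reader thread drains the pipe, and here there is none.
public class PipeStallDemo {
    public static void main(String[] args) throws IOException {
        PipedOutputStream out = new PipedOutputStream();
        PipedInputStream in = new PipedInputStream(out); // connected but never read

        out.write(new byte[512]);       // fits into the 1 KB buffer, returns at once
        System.out.println("512 bytes written without a reader");

        System.out.println("writing 64 KB with no reader draining the pipe...");
        out.write(new byte[64 * 1024]); // blocks forever once the buffer is full
        System.out.println("never reached");
    }
}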
Created an attachment (id=54)
Tomcat thread stack while the job is hanging (generated by calling kill -3)
Thanks. I think I fixed this problem in 1.101.
If you can try the latest snapshot, that would be wonderful.
It works with the snapshot:
Hudson ver. 1.101-SNAPSHOT (private-04/12/2007 02:18-hudson)
Thanks a lot !
Created an attachment (id=52)
config which hangs