Jenkins / JENKINS-14362

100% CPU load during org.kohsuke.stapler.compression.CompressionFilter.reportException

    • Type: Bug
    • Resolution: Fixed
    • Priority: Critical
    • Component: core

      Jenkins starts using 100% CPU after a few days. Using jstack I see several threads trying to write compressed output, with stacks that apparently do not change over time:

       - java.util.zip.Deflater.deflate(byte[], int, int) @bci=55, line=322 (Compiled frame; information may be imprecise)
       - java.util.zip.DeflaterOutputStream.deflate() @bci=14, line=176 (Compiled frame)
       - java.util.zip.DeflaterOutputStream.write(byte[], int, int) @bci=108, line=135 (Compiled frame)
       - java.util.zip.GZIPOutputStream.write(byte[], int, int) @bci=4, line=89 (Compiled frame)
       - org.kohsuke.stapler.compression.FilterServletOutputStream.write(byte[], int, int) @bci=7, line=31 (Compiled frame)
       - sun.nio.cs.StreamEncoder.writeBytes() @bci=120, line=220 (Interpreted frame)
       - sun.nio.cs.StreamEncoder.implClose() @bci=84, line=315 (Interpreted frame)
       - sun.nio.cs.StreamEncoder.close() @bci=18, line=148 (Interpreted frame)
       - java.io.OutputStreamWriter.close() @bci=4, line=233 (Interpreted frame)
       - java.io.PrintWriter.close() @bci=21, line=312 (Interpreted frame)
       - org.kohsuke.stapler.compression.CompressionFilter.reportException(java.lang.Exception, javax.servlet.http.HttpServletResponse) @bci=112, line=77 (Interpreted frame)
       - org.kohsuke.stapler.compression.CompressionFilter.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain) @bci=57, line=53 (Compiled frame)
       - winstone.FilterConfiguration.execute(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain) @bci=25, line=194 (Compiled frame)
       - winstone.RequestDispatcher.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse) @bci=48, line=366 (Compiled frame)
       - hudson.util.CharacterEncodingFilter.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain) @bci=43, line=81 (Compiled frame)
       - winstone.FilterConfiguration.execute(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain) @bci=25, line=194 (Compiled frame)
       - winstone.RequestDispatcher.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse) @bci=48, line=366 (Compiled frame)
       - winstone.RequestDispatcher.forward(javax.servlet.ServletRequest, javax.servlet.ServletResponse) @bci=483, line=331 (Compiled frame)
       - winstone.RequestHandlerThread.processRequest(winstone.WebAppConfiguration, winstone.WinstoneRequest, winstone.WinstoneResponse, java.lang.String) @bci=38, line=215 (Compiled frame)
       - winstone.RequestHandlerThread.run() @bci=631, line=138 (Compiled frame)
       - java.util.concurrent.Executors$RunnableAdapter.call() @bci=4, line=471 (Interpreted frame)
       - java.util.concurrent.FutureTask$Sync.innerRun() @bci=29, line=334 (Interpreted frame)
       - java.util.concurrent.FutureTask.run() @bci=4, line=166 (Interpreted frame)
       - winstone.BoundedExecutorService$1.run() @bci=4, line=77 (Compiled frame)
       - java.util.concurrent.ThreadPoolExecutor.runWorker(java.util.concurrent.ThreadPoolExecutor$Worker) @bci=46, line=1110 (Compiled frame)
       - java.util.concurrent.ThreadPoolExecutor$Worker.run() @bci=5, line=603 (Interpreted frame)
       - java.lang.Thread.run() @bci=11, line=679 (Interpreted frame)

      I suspect, but am not 100% sure, that these threads are in an infinite loop (livelocked); I'm struggling to see what else they might be doing.
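
      Since the stacks in repeated jstack dumps look identical, another rough way to confirm a livelock (rather than a merely blocked write) is to sample per-thread CPU time twice from inside the same JVM, for example via the Jenkins script console. The sketch below is illustrative only: the class name, the 10-second window and the 1-second CPU threshold are arbitrary, and it assumes the JVM supports per-thread CPU timing.

          import java.lang.management.ManagementFactory;
          import java.lang.management.ThreadInfo;
          import java.lang.management.ThreadMXBean;
          import java.util.HashMap;
          import java.util.Map;

          // Rough livelock check: sample per-thread CPU time twice and report threads
          // that burned significant CPU in between while sitting in a deflate frame.
          // Must run inside the affected JVM to see its threads; name and thresholds
          // are made up for illustration.
          public class DeflateSpinCheck {
              public static void main(String[] args) throws InterruptedException {
                  ThreadMXBean mx = ManagementFactory.getThreadMXBean();
                  if (!mx.isThreadCpuTimeSupported()) {
                      System.err.println("Per-thread CPU time is not supported on this JVM");
                      return;
                  }
                  Map<Long, Long> before = new HashMap<Long, Long>();
                  for (long id : mx.getAllThreadIds()) {
                      before.put(id, mx.getThreadCpuTime(id)); // nanoseconds, -1 if unavailable
                  }
                  Thread.sleep(10000L); // sampling window: 10 seconds
                  for (long id : mx.getAllThreadIds()) {
                      Long start = before.get(id);
                      long end = mx.getThreadCpuTime(id);
                      if (start == null || start < 0 || end < 0 || end - start < 1000000000L) {
                          continue; // ignore threads that used less than ~1s of CPU in the window
                      }
                      ThreadInfo info = mx.getThreadInfo(id, 10); // top 10 frames are enough here
                      if (info == null) continue;
                      for (StackTraceElement frame : info.getStackTrace()) {
                          if (frame.getClassName().startsWith("java.util.zip.Deflater")) {
                              System.out.printf("%s used %.1fs CPU and is in %s.%s%n",
                                      info.getThreadName(), (end - start) / 1e9,
                                      frame.getClassName(), frame.getMethodName());
                              break;
                          }
                      }
                  }
              }
          }

      If a RequestHandlerThread keeps showing up with fresh CPU time but an unchanged deflate stack, that would support the livelock theory.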

      This JVM was not started with debugging enabled, so I could not attach a debugger for analysis; I have enabled it now. Stack traces are attached as files below.

        1. thread-dump-prod.txt
          134 kB
        2. thread-dump-reproduce-1.txt
          72 kB
        3. thread-dump-reproduce-2.txt
          72 kB
        4. thread-dump-reproduce-3.txt
          72 kB
        5. jenkins.stacktrace.2
          51 kB
        6. jenkins.stacktrace.3
          46 kB
        7. jenkins.stacktrace.1
          53 kB


          Chris Wilson created issue -
          Chris Wilson made changes -
          Attachment New: jenkins.stacktrace.1 [ 22085 ]
          Attachment New: jenkins.stacktrace.2 [ 22086 ]
          Attachment New: jenkins.stacktrace.3 [ 22087 ]
          OHTAKE Tomohiro made changes -
          Link New: This issue is duplicated by JENKINS-14361 [ JENKINS-14361 ]

          David Pärsson added a comment - edited

          I think I have seen this issue (or a very similar one) when trying to generate Performance plugin reports from huge JMeter result files. Jenkins v1.471 and Performance plugin v1.8.


          Bob Lloyd added a comment - edited

          I have the same issue, but cannot track it to a specific job. I am not generating Performance reports (to my knowledge, unless it's happening without my intention). I've attached my log below (though it's pretty much the same as above). I'm running Jenkins 1.491 with Sun JDK 1.6.0_26.

          This happens for me after about 36 hours of uptime for Jenkins. I have ~30 jobs running on 6 servers. One server runs jobs pretty much constantly, while the other servers run much less frequently.

          "RequestHandlerThread541" daemon prio=6 tid=0x4959d800 nid=0x8c4 runnable [0x4b40f000]
          java.lang.Thread.State: RUNNABLE
          at java.util.zip.Deflater.deflateBytes(Native Method)
          at java.util.zip.Deflater.deflate(Unknown Source)

          • locked <0x19974c70> (a java.util.zip.ZStreamRef)
            at java.util.zip.DeflaterOutputStream.deflate(Unknown Source)
            at java.util.zip.DeflaterOutputStream.write(Unknown Source)
            at java.util.zip.GZIPOutputStream.write(Unknown Source)
          • locked <0x19974c80> (a java.util.zip.GZIPOutputStream)
            at org.kohsuke.stapler.compression.FilterServletOutputStream.write(FilterServletOutputStream.java:31)
            at sun.nio.cs.StreamEncoder.writeBytes(Unknown Source)
            at sun.nio.cs.StreamEncoder.implClose(Unknown Source)
            at sun.nio.cs.StreamEncoder.close(Unknown Source)
          • locked <0x19976ce0> (a java.io.OutputStreamWriter)
            at java.io.OutputStreamWriter.close(Unknown Source)
            at java.io.PrintWriter.close(Unknown Source)
          • locked <0x19976ce0> (a java.io.OutputStreamWriter)
            at org.kohsuke.stapler.compression.CompressionFilter.reportException(CompressionFilter.java:77)
            at org.kohsuke.stapler.compression.CompressionFilter.doFilter(CompressionFilter.java:53)
            at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
            at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
            at hudson.util.CharacterEncodingFilter.doFilter(CharacterEncodingFilter.java:81)
            at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
            at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
            at winstone.RequestDispatcher.forward(RequestDispatcher.java:331)
            at winstone.RequestHandlerThread.processRequest(RequestHandlerThread.java:215)
            at winstone.RequestHandlerThread.run(RequestHandlerThread.java:138)
            at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
            at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
            at java.util.concurrent.FutureTask.run(Unknown Source)
            at winstone.BoundedExecutorService$1.run(BoundedExecutorService.java:77)
            at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
            at java.lang.Thread.run(Unknown Source)

          Locked ownable synchronizers:

          • <0x19973bf8> (a java.util.concurrent.locks.ReentrantLock$NonfairSync)


          Jose Sa added a comment -

          I still have this problem occurring at least twice a week, driving CPU to 400% (on a 4-CPU machine) and forcing me to restart the server.

          When it happens I can work around it (temporarily) by using the 'monitoring' plugin: I go to the list of threads, sort the table by execution time, and kill the threads at the top that are actively running "DeflaterOutputStream.deflate". This brings CPU back down to normal values, but a restart is still needed by the end of the day.
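
          To spot the candidate threads without hunting through the monitoring table, something along these lines can list the threads currently inside DeflaterOutputStream.deflate. This is an illustrative sketch only: the class name is made up, and it has to run inside the Jenkins JVM (e.g. adapted for the script console) to see its threads; the actual killing is still done through the Monitoring plugin as described above.

              import java.util.Map;

              // List threads whose stack currently contains DeflaterOutputStream.deflate,
              // so they can be located in the Monitoring plugin's thread table.
              public class ListDeflateThreads {
                  public static void main(String[] args) {
                      for (Map.Entry<Thread, StackTraceElement[]> e : Thread.getAllStackTraces().entrySet()) {
                          for (StackTraceElement frame : e.getValue()) {
                              if ("java.util.zip.DeflaterOutputStream".equals(frame.getClassName())
                                      && "deflate".equals(frame.getMethodName())) {
                                  System.out.println(e.getKey().getName() + "  state=" + e.getKey().getState());
                                  break;
                              }
                          }
                      }
                  }
              }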

          Here is a stack trace collected from our logs:

          WARNING: Untrapped servlet exception
          winstone.ClientSocketException: Failed to write to client
                  at winstone.ClientOutputStream.write(ClientOutputStream.java:41)
                  at winstone.WinstoneOutputStream.commit(WinstoneOutputStream.java:181)
                  at winstone.WinstoneOutputStream.commit(WinstoneOutputStream.java:119)
                  at winstone.WinstoneOutputStream.write(WinstoneOutputStream.java:112)
                  at java.util.zip.GZIPOutputStream.finish(GZIPOutputStream.java:169)
                  at java.util.zip.DeflaterOutputStream.close(DeflaterOutputStream.java:238)
                  at org.kohsuke.stapler.compression.FilterServletOutputStream.close(FilterServletOutputStream.java:36)
                  at net.bull.javamelody.FilterServletOutputStream.close(FilterServletOutputStream.java:46)
                  at java.io.FilterOutputStream.close(FilterOutputStream.java:160)
                  at sun.nio.cs.StreamEncoder.implClose(StreamEncoder.java:320)
                  at sun.nio.cs.StreamEncoder.close(StreamEncoder.java:149)
                  at java.io.OutputStreamWriter.close(OutputStreamWriter.java:233)
                  at java.io.BufferedWriter.close(BufferedWriter.java:266)
                  at org.dom4j.io.XMLWriter.close(XMLWriter.java:286)
                  at org.kohsuke.stapler.jelly.HTMLWriterOutput.close(HTMLWriterOutput.java:70)
                  at org.kohsuke.stapler.jelly.DefaultScriptInvoker.invokeScript(DefaultScriptInvoker.java:56)
                  at org.kohsuke.stapler.jelly.JellyClassTearOff.serveIndexJelly(JellyClassTearOff.java:107)
                  at org.kohsuke.stapler.jelly.JellyFacet.handleIndexRequest(JellyFacet.java:127)
                  at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:563)
                  at org.kohsuke.stapler.Stapler.invoke(Stapler.java:659)
                  at org.kohsuke.stapler.MetaClass$6.doDispatch(MetaClass.java:241)
                  at org.kohsuke.stapler.NameBasedDispatcher.dispatch(NameBasedDispatcher.java:53)
                  at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:574)
                  at org.kohsuke.stapler.Stapler.invoke(Stapler.java:659)
                  at org.kohsuke.stapler.MetaClass$12.dispatch(MetaClass.java:384)
                  at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:574)
                  at org.kohsuke.stapler.Stapler.invoke(Stapler.java:659)
                  at org.kohsuke.stapler.MetaClass$4.doDispatch(MetaClass.java:203)
                  at org.kohsuke.stapler.NameBasedDispatcher.dispatch(NameBasedDispatcher.java:53)
                  at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:574)
                  at org.kohsuke.stapler.Stapler.invoke(Stapler.java:659)
                  at org.kohsuke.stapler.Stapler.invoke(Stapler.java:488)
                  at org.kohsuke.stapler.Stapler.service(Stapler.java:162)
                  at javax.servlet.http.HttpServlet.service(HttpServlet.java:45)
                  at winstone.ServletConfiguration.execute(ServletConfiguration.java:248)
                  at winstone.RequestDispatcher.forward(RequestDispatcher.java:333)
                  at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:376)
                  at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:95)
                  at hudson.plugins.greenballs.GreenBallFilter.doFilter(GreenBallFilter.java:58)
                  at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:98)
                  at net.bull.javamelody.MonitoringFilter.doFilter(MonitoringFilter.java:206)
                  at net.bull.javamelody.MonitoringFilter.doFilter(MonitoringFilter.java:179)
                  at net.bull.javamelody.PluginMonitoringFilter.doFilter(PluginMonitoringFilter.java:86)
                  at org.jvnet.hudson.plugins.monitoring.HudsonMonitoringFilter.doFilter(HudsonMonitoringFilter.java:84)
                  at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:98)
                  at hudson.util.PluginServletFilter.doFilter(PluginServletFilter.java:87)
                  at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
                  at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
                  at hudson.security.csrf.CrumbFilter.doFilter(CrumbFilter.java:47)
                  at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
                  at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:84)
                  at hudson.security.UnwrapSecurityExceptionFilter.doFilter(UnwrapSecurityExceptionFilter.java:51)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
                  at org.acegisecurity.ui.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:166)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
                  at org.acegisecurity.providers.anonymous.AnonymousProcessingFilter.doFilter(AnonymousProcessingFilter.java:125)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
                  at org.acegisecurity.ui.rememberme.RememberMeProcessingFilter.doFilter(RememberMeProcessingFilter.java:142)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
                  at org.acegisecurity.ui.AbstractProcessingFilter.doFilter(AbstractProcessingFilter.java:271)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
                  at org.acegisecurity.ui.basicauth.BasicProcessingFilter.doFilter(BasicProcessingFilter.java:173)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
                  at jenkins.security.ApiTokenFilter.doFilter(ApiTokenFilter.java:63)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
                  at org.acegisecurity.context.HttpSessionContextIntegrationFilter.doFilter(HttpSessionContextIntegrationFilter.java:249)
                  at hudson.security.HttpSessionContextIntegrationFilter2.doFilter(HttpSessionContextIntegrationFilter2.java:66)
                  at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:87)
                  at hudson.security.ChainedServletFilter.doFilter(ChainedServletFilter.java:76)
                  at hudson.security.HudsonFilter.doFilter(HudsonFilter.java:164)
                  at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
                  at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
                  at org.kohsuke.stapler.compression.CompressionFilter.doFilter(CompressionFilter.java:50)
                  at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
                  at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
                  at hudson.util.CharacterEncodingFilter.doFilter(CharacterEncodingFilter.java:81)
                  at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
                  at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
                  at winstone.RequestDispatcher.forward(RequestDispatcher.java:331)
                  at winstone.RequestHandlerThread.processRequest(RequestHandlerThread.java:215)
                  at winstone.RequestHandlerThread.run(RequestHandlerThread.java:138)
                  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
                  at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
                  at java.util.concurrent.FutureTask.run(FutureTask.java:166)
                  at winstone.BoundedExecutorService$1.run(BoundedExecutorService.java:77)
                  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
                  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
                  at java.lang.Thread.run(Thread.java:722)
          Caused by: java.net.SocketException: Broken pipe
                  at java.net.SocketOutputStream.socketWrite0(Native Method)
                  at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:109)
                  at java.net.SocketOutputStream.write(SocketOutputStream.java:153)
                  at winstone.ClientOutputStream.write(ClientOutputStream.java:39)
                  ... 88 more
          


          Bruce Edge added a comment -

          I have the same problem. It triggers soft lockups in the kernel of the Jenkins host:

          901 Jan 30 02:31:58 build kernel: [589969.228931] INFO: rcu_sched detected stall on CPU 3 (t=57719 jiffies)
          902 Jan 30 02:31:58 build kernel: [589969.228939] INFO: rcu_sched detected stall on CPU 2 (t=57719 jiffies)
          903 Jan 30 02:31:58 build kernel: [589969.228929] INFO: rcu_sched detected stall on CPU 1 (t=57719 jiffies)
          904 Jan 30 02:31:58 build kernel: [589969.228944] INFO: rcu_sched detected stall on CPU 4 (t=57719 jiffies)
          905 Jan 30 02:31:58 build kernel: [589969.228939] sending NMI to all CPUs:

          Note the timestamp correlation between the exception below and the stalls above.

          712 Jan 30, 2013 2:31:58 AM org.kohsuke.stapler.compression.CompressionFilter reportException
          6713 WARNING: Untrapped servlet exception
          6714 winstone.ClientSocketException: Failed to write to client
          6715 at winstone.ClientOutputStream.write(ClientOutputStream.java:41)
          6716 at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
          6717 at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
          6718 at winstone.WinstoneOutputStream.commit(WinstoneOutputStream.java:165)
          6719 at winstone.WinstoneOutputStream.flush(WinstoneOutputStream.java:217)
          6720 at winstone.WinstoneOutputStream.close(WinstoneOutputStream.java:227)
          6721 at java.util.zip.DeflaterOutputStream.close(DeflaterOutputStream.java:241)
          6722 at org.kohsuke.stapler.compression.FilterServletOutputStream.close(FilterServletOutputStream.java:36)
          6723 at net.bull.javamelody.FilterServletOutputStream.close(FilterServletOutputStream.java:46)

          This happens most nights at the same time.


          Richard Otter added a comment -

          On my system, just upgraded to 1.492 from the last LTS (1.480.1), this happens whenever I display an EMMA code coverage report.
          The server's java process takes upwards of 80% CPU for at least 20 minutes, until I use the method above to kill the "Active Request" in the Monitoring plugin.
          I've seen this behavior for months, but this JIRA thread shows me how to kill the thread. That's progress.


          Jan Hoppe added a comment -

          Hi,

          we have been running into this problem frequently of late.
          I found a "disable" flag for "reportException".
          Add -Dorg.kohsuke.stapler.compression.CompressionFilter.disabled=true to the VM options in your jenkins.xml (see the example below).
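
          Where exactly that property goes depends on how Jenkins is launched, so the lines below are illustrative only: a plain command-line launch, and a hypothetical <arguments> line for a Windows service install (the arguments other than the -D flag are placeholders; Debian-style packages would use JAVA_ARGS in /etc/default/jenkins instead).

              java -Dorg.kohsuke.stapler.compression.CompressionFilter.disabled=true -jar jenkins.war

              <arguments>-Xrs -Dorg.kohsuke.stapler.compression.CompressionFilter.disabled=true -jar "%BASE%\jenkins.war" --httpPort=8080</arguments>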

          Good luck,
          Jan


          Jose Sa added a comment -

          I added that entry to my startup script 2 weeks ago and likewise have not experienced the 100% CPU symptom since then.


            Assignee: Audrey Azra (lmcazra)
            Reporter: Chris Wilson (gcc)
            Votes: 44
            Watchers: 62

              Created:
              Updated:
              Resolved: