100% CPU load during org.kohsuke.stapler.compression.CompressionFilter.reportException

    • Type: Bug
    • Resolution: Fixed
    • Priority: Critical
    • Component: core

      Jenkins starts using 100% CPU after a few days. Using jstack, I can see several threads trying to write compressed output; their stacks do not appear to change over time:

      - java.util.zip.Deflater.deflate(byte[], int, int) @bci=55, line=322 (Compiled frame; information may be imprecise)
      - java.util.zip.DeflaterOutputStream.deflate() @bci=14, line=176 (Compiled frame)
      - java.util.zip.DeflaterOutputStream.write(byte[], int, int) @bci=108, line=135 (Compiled frame)
      - java.util.zip.GZIPOutputStream.write(byte[], int, int) @bci=4, line=89 (Compiled frame)
      - org.kohsuke.stapler.compression.FilterServletOutputStream.write(byte[], int, int) @bci=7, line=31 (Compiled frame)
      - sun.nio.cs.StreamEncoder.writeBytes() @bci=120, line=220 (Interpreted frame)
      - sun.nio.cs.StreamEncoder.implClose() @bci=84, line=315 (Interpreted frame)
      - sun.nio.cs.StreamEncoder.close() @bci=18, line=148 (Interpreted frame)
      - java.io.OutputStreamWriter.close() @bci=4, line=233 (Interpreted frame)
      - java.io.PrintWriter.close() @bci=21, line=312 (Interpreted frame)
      - org.kohsuke.stapler.compression.CompressionFilter.reportException(java.lang.Exception, javax.servlet.http.HttpServletResponse) @bci=112, line=77 (Interpreted frame)
      - org.kohsuke.stapler.compression.CompressionFilter.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain) @bci=57, line=53 (Compiled frame)
      - winstone.FilterConfiguration.execute(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain) @bci=25, line=194 (Compiled frame)
      - winstone.RequestDispatcher.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse) @bci=48, line=366 (Compiled frame)
      - hudson.util.CharacterEncodingFilter.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain) @bci=43, line=81 (Compiled frame)
      - winstone.FilterConfiguration.execute(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain) @bci=25, line=194 (Compiled frame)
      - winstone.RequestDispatcher.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse) @bci=48, line=366 (Compiled frame)
      - winstone.RequestDispatcher.forward(javax.servlet.ServletRequest, javax.servlet.ServletResponse) @bci=483, line=331 (Compiled frame)
      - winstone.RequestHandlerThread.processRequest(winstone.WebAppConfiguration, winstone.WinstoneRequest, winstone.WinstoneResponse, java.lang.String) @bci=38, line=215 (Compiled frame)
      - winstone.RequestHandlerThread.run() @bci=631, line=138 (Compiled frame)
      - java.util.concurrent.Executors$RunnableAdapter.call() @bci=4, line=471 (Interpreted frame)
      - java.util.concurrent.FutureTask$Sync.innerRun() @bci=29, line=334 (Interpreted frame)
      - java.util.concurrent.FutureTask.run() @bci=4, line=166 (Interpreted frame)
      - winstone.BoundedExecutorService$1.run() @bci=4, line=77 (Compiled frame)
      - java.util.concurrent.ThreadPoolExecutor.runWorker(java.util.concurrent.ThreadPoolExecutor$Worker) @bci=46, line=1110 (Compiled frame)
      - java.util.concurrent.ThreadPoolExecutor$Worker.run() @bci=5, line=603 (Interpreted frame)
      - java.lang.Thread.run() @bci=11, line=679 (Interpreted frame)
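
      The frames above all describe one writer chain being closed. As a rough sketch of that layering (not the actual Stapler code; a ByteArrayOutputStream stands in for the servlet response stream, and Stapler's FilterServletOutputStream wrapper is omitted), a single PrintWriter.close() is enough to end up inside Deflater.deflate():

      import java.io.ByteArrayOutputStream;
      import java.io.OutputStreamWriter;
      import java.io.PrintWriter;
      import java.util.zip.GZIPOutputStream;

      public class CompressedWriterLayering {
          public static void main(String[] args) throws Exception {
              // Stand-in for the servlet's underlying response stream.
              ByteArrayOutputStream responseStream = new ByteArrayOutputStream();

              // Same layering as in the trace:
              // PrintWriter -> OutputStreamWriter (sun.nio.cs.StreamEncoder)
              //   -> GZIPOutputStream -> response stream.
              GZIPOutputStream gzip = new GZIPOutputStream(responseStream);
              PrintWriter writer = new PrintWriter(new OutputStreamWriter(gzip, "UTF-8"));

              writer.println("error page body");

              // close() flushes the buffered characters and finishes the deflater,
              // which is why the dump shows PrintWriter.close() -> StreamEncoder.close()
              // -> GZIPOutputStream.write() -> Deflater.deflate().
              writer.close();

              System.out.println("compressed bytes: " + responseStream.size());
          }
      }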

      I suspect, but am not 100% sure, that these threads are stuck in an infinite loop (livelocked); I'm struggling to see what else these threads could be doing.
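
      One way to confirm that suspicion (not something from the original report) is to compare per-thread CPU time over a short window with java.lang.management.ThreadMXBean inside the affected JVM: a thread that burns essentially the whole window while its stack stays pinned in Deflater.deflate() is busy-spinning rather than blocked. A minimal sketch, assuming thread CPU time measurement is supported on the platform:

      import java.lang.management.ManagementFactory;
      import java.lang.management.ThreadInfo;
      import java.lang.management.ThreadMXBean;

      public class SpinningThreadCheck {
          public static void main(String[] args) throws InterruptedException {
              ThreadMXBean mx = ManagementFactory.getThreadMXBean();
              long[] ids = mx.getAllThreadIds();

              // First sample of per-thread CPU time, in nanoseconds.
              long[] before = new long[ids.length];
              for (int i = 0; i < ids.length; i++) {
                  before[i] = mx.getThreadCpuTime(ids[i]);
              }

              Thread.sleep(5000); // observation window

              for (int i = 0; i < ids.length; i++) {
                  long after = mx.getThreadCpuTime(ids[i]);
                  if (before[i] < 0 || after < 0) {
                      continue; // thread died or CPU time unavailable
                  }
                  long deltaMs = (after - before[i]) / 1000000;
                  // Threads that used most of a 5-second window are spinning.
                  if (deltaMs > 4000) {
                      ThreadInfo info = mx.getThreadInfo(ids[i]);
                      if (info != null) {
                          System.out.println(info.getThreadName() + " used " + deltaMs + " ms CPU");
                      }
                  }
              }
          }
      }

      Roughly the same logic can be pasted into the Jenkins script console (Groovy), with minor syntax adjustments, to check a running instance without restarting it.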

      The JVM was not started with debugging enabled, so I could not attach a debugger for analysis; I have enabled it now. Stack traces are attached as files below.
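
      As a side note, when jstack or a debugger cannot be attached, a dump roughly equivalent to the attached files can also be taken from inside the JVM with plain JDK calls. This is a generic sketch using Thread.getAllStackTraces(), not how the attached dumps were produced:

      import java.util.Map;

      public class InProcessThreadDump {
          public static void main(String[] args) {
              // Print every live thread's stack, similar to jstack output.
              for (Map.Entry<Thread, StackTraceElement[]> e : Thread.getAllStackTraces().entrySet()) {
                  Thread t = e.getKey();
                  System.out.println("\"" + t.getName() + "\" state=" + t.getState());
                  for (StackTraceElement frame : e.getValue()) {
                      System.out.println("    at " + frame);
                  }
                  System.out.println();
              }
          }
      }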

        1. thread-dump-prod.txt
          134 kB
        2. thread-dump-reproduce-1.txt
          72 kB
        3. thread-dump-reproduce-2.txt
          72 kB
        4. thread-dump-reproduce-3.txt
          72 kB
        5. jenkins.stacktrace.2
          51 kB
        6. jenkins.stacktrace.3
          46 kB
        7. jenkins.stacktrace.1
          53 kB

          [JENKINS-14362] 100% CPU load during org.kohsuke.stapler.compression.CompressionFilter.reportException

          Chris Wilson created issue -
          Chris Wilson made changes -
          Attachment New: jenkins.stacktrace.1 [ 22085 ]
          Attachment New: jenkins.stacktrace.2 [ 22086 ]
          Attachment New: jenkins.stacktrace.3 [ 22087 ]
          OHTAKE Tomohiro made changes -
          Link New: This issue is duplicated by JENKINS-14361 [ JENKINS-14361 ]
          evernat made changes -
          Link New: This issue is related to JENKINS-17349 [ JENKINS-17349 ]
          Jesse Glick made changes -
          Assignee New: Jesse Glick [ jglick ]
          Jesse Glick made changes -
          Status Original: Open [ 1 ] New: In Progress [ 3 ]
          Jesse Glick made changes -
          Labels New: lts-candidate performance
          Jesse Glick made changes -
          Priority Original: Major [ 3 ] New: Critical [ 2 ]
          Jesse Glick made changes -
          Resolution New: Fixed [ 1 ]
          Status Original: In Progress [ 3 ] New: Resolved [ 5 ]
          Justin Harringa made changes -
          Attachment New: thread-dump-reproduce-2.txt [ 24109 ]
          Attachment New: thread-dump-reproduce-3.txt [ 24110 ]
          Attachment New: thread-dump-prod.txt [ 24111 ]
          Attachment New: thread-dump-reproduce-1.txt [ 24112 ]
          Jesse Glick made changes -
          Link New: This issue is related to JENKINS-13625 [ JENKINS-13625 ]

            Assignee: Audrey Azra (lmcazra)
            Reporter: Chris Wilson (gcc)
            Votes: 44
            Watchers: 62
