[JENKINS-56595] Regression: higher than usual CPU usage with 2.164.1

    • Type: Improvement
    • Resolution: Unresolved
    • Priority: Major
    • Component/s: core
    • Labels: None

      Ever since upgrading to 2.164.1 (current LTS) from 2.150.3, we have been experiencing higher than usual CPU usage, which has also caused a crash.


          Günter Grodotzki created issue -

          Daniel Beck added a comment -

          This report does not contain nearly enough information for us to investigate further.

          Daniel Beck made changes -
          Resolution New: Incomplete
          Status Original: Open New: Closed
          Ryan Taylor made changes -
          Attachment New: image-2019-04-18-10-06-18-006.png
          Ryan Taylor made changes -
          Attachment New: image-2019-04-18-10-06-42-865.png

          Ryan Taylor added a comment -

          I can also confirm this behavior on 2.164.1

          Symptoms are:

          • Gerrit queue is blocked
          • CPU utilization is all user time and pegged at 100%
          • IO drops to nothing

           

          The UI was completely locked up at ~9:25 and Jenkins was restarted at 9:37. The system CPU spike at 9:33 was from inspecting the Jenkins logs before the reboot.

           

          This happens every few days, so I should be able to capture more information on the next cycle. I assume a thread dump is the best resource to ascertain what's happening here?
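
          A thread dump is indeed the usual first step for a pegged-CPU hang: jstack <pid> or kill -3 <pid> against the Jenkins process captures one without a restart, and several dumps taken 10-30 seconds apart are more useful than a single one, since the genuinely hot threads show up in every sample. As a minimal sketch only, assuming a plain JVM and no Jenkins-specific API, the same information can also be pulled from inside the JVM via the standard java.lang.management classes; the class name below is illustrative.

{code:java}
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

// Sketch: print a thread dump of the current JVM, roughly what jstack produces.
public class ThreadDumpSketch {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        // true, true: include locked monitors and ownable synchronizers,
        // which is what makes lock contention visible in the dump.
        for (ThreadInfo info : threads.dumpAllThreads(true, true)) {
            // Note: ThreadInfo.toString() truncates each stack to 8 frames,
            // so jstack output remains the more complete artifact.
            System.out.println(info);
        }
    }
}
{code}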

          Tyler Pickett made changes -
          Attachment New: jenkinshangWithJstack.6.output.tar.gz

          Tyler Pickett added a comment -

          I was on vacation last week when rtaylor_instructure commented, or he would have included a set of thread traces. I've attached one that I captured from our instance the week before last.

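
          One quick way to triage a capture like the attached jenkinshangWithJstack.6.output.tar.gz is to count threads per state: a pile of RUNNABLE threads spinning points at a hot code path, while most threads BLOCKED behind one lock points at contention. A small sketch, assuming the plain jstack text format; the input path is a placeholder.

{code:java}
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.TreeMap;

// Sketch: count threads per state in a jstack text dump.
public class JstackSummary {
    public static void main(String[] args) throws IOException {
        String path = args.length > 0 ? args[0] : "jstack.output.txt"; // placeholder path
        Map<String, Integer> byState = new TreeMap<>();
        for (String line : Files.readAllLines(Paths.get(path))) {
            line = line.trim();
            // jstack prints e.g. "java.lang.Thread.State: RUNNABLE" under each thread header.
            if (line.startsWith("java.lang.Thread.State:")) {
                String state = line.substring("java.lang.Thread.State:".length()).trim();
                byState.merge(state, 1, Integer::sum);
            }
        }
        byState.forEach((state, count) -> System.out.println(state + ": " + count));
    }
}
{code}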

          Günter Grodotzki added a comment -

          FWIW:

          • we doubled the RAM, moving from a 4 GB to an 8 GB instance (with ~50% of it allocated to the JVM) - it lasted longer without a crash but ultimately gave in after a couple of days
          • doubled the RAM again, from an 8 GB to a 16 GB instance (again with ~50% allocated to the JVM) - so far it's lasting

           

          So for us that was quite a bump in resource requirements. Doubling is something one might expect (also because we were growing, with more repositories) - but a 4x bump seemed a bit out of the norm.

          As long as the JVM has enough RAM, CPU usage stays relatively low - so for now we are not seeing any issues.
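
          "Enough heap keeps the CPU low" would be consistent with the JVM spending most of its time in back-to-back full GCs as the heap approaches -Xmx, rather than with a single hot code path; verbose GC logging (-verbose:gc) on the Jenkins JVM is the straightforward way to confirm or rule that out. As a rough sketch only, heap headroom and cumulative GC effort can also be checked from inside the running JVM:

{code:java}
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// Sketch: report heap headroom and cumulative GC effort for the current JVM.
public class HeapHeadroomSketch {
    public static void main(String[] args) {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();
        long usedMb = heap.getUsed() / (1024 * 1024);
        long maxMb = heap.getMax() / (1024 * 1024); // getMax() is -1 when no -Xmx is set
        System.out.println("Heap: " + usedMb + " MB used of " + maxMb + " MB max");

        // Counts and times are cumulative since JVM start; sampling twice a few
        // minutes apart shows whether collection time is growing rapidly (GC thrashing).
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.println("GC " + gc.getName() + ": " + gc.getCollectionCount()
                    + " collections, " + gc.getCollectionTime() + " ms total");
        }
    }
}
{code}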


          Tyler Pickett added a comment -

          Additional info has been added, and we're willing to collect more data as necessary.

          Tyler Pickett made changes -
          Resolution Original: Incomplete
          Status Original: Closed New: Reopened

            Assignee: Unassigned
            Reporter: Günter Grodotzki (lifeofguenter)
            Votes: 7
            Watchers: 14
