
Memory consumption greater in jdk11 images than in jdk8 images

    • Type: Bug
    • Resolution: Duplicate
    • Priority: Major
    • Component: core

      We noticed our Jenkins deployment in Kubernetes was going over our 4Gi memory resource limit, on existing deployments and on new ones. This was causing restarts every time we viewed a job or sent a remove build command to Jenkins. We first noticed this on August 26th, 2021.

      We usually never hit that limit unless we are doing something abnormal. This issue was happening when nothing was going on with Jenkins, and it persisted after deployment restarts, etc.

      The solution we found was to switch from the LTS image to the 2.306 image. We tried other images and they all had the same problem. Changing helm chart versions did not help either. After the switch to the 2.306 image, we had a 50% reduction in memory consumption and it stayed that way.
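The workaround described above amounts to pinning the image tag rather than tracking the moving `lts` tag. A minimal sketch of what that looks like in the chart's values file — the `controller` key is from the 3.x jenkins helm charts (the 2.x charts used `master`), and the tag shown is the one from this report:

```yaml
# Hedged sketch: pin the Jenkins image tag instead of tracking "lts".
# Key names are from the 3.x jenkinsci/helm-charts values; 2.x used "master".
controller:
  image: "jenkins/jenkins"
  tag: "2.306"
```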

          [JENKINS-66491] Memory consumption greater in jdk11 images than in jdk8 images

          Mark Waite added a comment (edited)

          When you say "Jenkins LTS" does that mean you had installed Jenkins 2.303.1 as the LTS or Jenkins 2.289.3 as the LTS or some other version of LTS? Since you have git plugin 4.7.2 installed and it requires at least 2.263.1, you must be running at least 2.263.1.

          You list the helm chart version as "2.6.4" but I don't see a Helm chart version 2.6.4 in the recent releases of https://github.com/jenkinsci/helm-charts/releases. There is a jenkins-2.6.4 helm chart release from September 2020, but that is almost 12 months ago. Are you running helm chart 2.6.4 or something newer? What Jenkins version is run with that helm chart?

          Does the same problem persist if you use a newer version of the helm chart, like https://github.com/jenkinsci/helm-charts/releases/tag/jenkins-3.5.14 ?


          Neil added a comment (edited)

          markewaite we use the image jenkins/jenkins:lts in our helm chart. That's all I can tell from our config.
          We can upgrade our helm chart to the latest 2.x helm chart release, but we would have to do some conversion work on our config for it to work on the latest 3.x helm chart release.
          When we upgraded to the latest 2.x helm chart release, we did not see any change in memory consumption; only when we changed the image from "lts" to "2.306" did memory consumption go back to normal amounts.


          Mark Waite added a comment

          The image name jenkins/jenkins:lts is ambiguous. You'll need to open the Jenkins web page and read the version number from the running application.

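Besides the web UI footer, the running core version can be read from the `X-Jenkins` response header that Jenkins sends. A hedged sketch — the host in the comment is hypothetical, so the parsing is shown against an embedded sample response to keep it self-contained:

```shell
# Against a live controller: curl -sI https://jenkins.example.com/login  (hypothetical host)
# Jenkins reports its core version in the X-Jenkins response header.
sample_headers='HTTP/1.1 200 OK
X-Jenkins: 2.303.1
Content-Type: text/html;charset=utf-8'

# Extract the header value, stripping any trailing carriage return.
version=$(printf '%s\n' "$sample_headers" \
  | awk -F': ' 'tolower($1) == "x-jenkins" { print $2 }' \
  | tr -d '\r')
echo "$version"   # prints 2.303.1
```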

          Neil added a comment

          From what I see in the bottom right, after I updated the deployment back to the "lts" tag, it is running 2.303.1.


          Mark Waite added a comment

          Switching the Docker image from 2.303.1 jenkins/jenkins:lts to 2.306 jenkins/jenkins:2.306 also switches from Java 11 back to Java 8. You may want to compare jenkins/jenkins:2.303.1-jdk8 with jenkins/jenkins:2.306. It may be that some Java command line option improvements are needed to use Java 11 with a similar memory use as Java 8.

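The kind of "command line option improvements" suggested here would typically target the metaspace and container-aware heap sizing on Java 11. A hedged sketch of candidate flags — the values are placeholders for illustration, not recommendations from this thread:

```
-XX:MaxMetaspaceSize=512m   # cap metaspace growth (the area implicated in JENKINS-63766)
-XX:MaxRAMPercentage=50.0   # size the heap from the container memory limit instead of a fixed -Xmx
```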

          Neil added a comment

          Hi markewaite, finally got around to looking more into this.
          The memory consumption really only changes when we switch from the jdk8 images to the jdk11 images. 2.303.1-jdk8 consumes the same memory as 2.306.

          These are the javaOpts we send in the helm chart.
          I've taken the time to upgrade the helm chart to 3.6.0 now.

          -Xms8192Mi -Xmx7168m -Dkubernetes.websocket.ping.interval=10000 -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.HEARTBEAT_CHECK_INTERVAL=86400 -Dorg.csanchez.jenkins.plugins.kubernetes.PodTemplate.connectionTimeout=1200 -Djenkins.install.runSetupWizard=false
          

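As an aside, the quoted options appear to contain two typos: `Mi` is not a valid HotSpot size suffix (the JVM accepts k/K, m/M, g/G and rejects `-Xms8192Mi` with "Invalid initial heap size"), and even read as `8192m` the initial heap would exceed `-Xmx7168m`, which the JVM also rejects at startup. A corrected line might look like the following — the `-Xms2048m` value is a placeholder, not from the report:

```shell
# Hypothetical corrected javaOpts; -Xms2048m is a placeholder value,
# chosen only so that -Xms does not exceed -Xmx.
JAVA_OPTS='-Xms2048m -Xmx7168m -Dkubernetes.websocket.ping.interval=10000 -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.HEARTBEAT_CHECK_INTERVAL=86400 -Dorg.csanchez.jenkins.plugins.kubernetes.PodTemplate.connectionTimeout=1200 -Djenkins.install.runSetupWizard=false'
echo "$JAVA_OPTS"
```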

          Neil added a comment

          Seems like this issue is very similar to my circumstances: https://issues.jenkins.io/browse/JENKINS-63766


          Basil Crow added a comment

          Most likely a duplicate of JENKINS-63766, assuming that the leak is in the metaspace.

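One way to check whether the leak really is in the metaspace is `jcmd <pid> GC.heap_info`, whose JDK 11 output includes a Metaspace line. A hedged sketch that parses an embedded sample of that line so the snippet is self-contained — the numbers are invented:

```shell
# On a live controller: jcmd "$(pgrep -f jenkins.war)" GC.heap_info
# Sample Metaspace line in the shape JDK 11 prints (numbers invented):
sample=' Metaspace       used 512345K, capacity 600000K, committed 620000K, reserved 1048576K'

# Pull out the "used" figure in KB; watch it grow across samples to spot a leak.
used_kb=$(printf '%s\n' "$sample" | sed -n 's/.*used \([0-9]*\)K.*/\1/p')
echo "$used_kb"   # prints 512345
```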

          Basil Crow added a comment

          It has been two weeks without a response, so I am closing this as a duplicate of JENKINS-63766. If the problem persists, please open a new ticket.


            Assignee: Unassigned
            Reporter: Neil (nsewardep)
            Votes: 0
            Watchers: 3