Jenkins / JENKINS-23706

Plugin Memory Leak : OutOfMemoryError : PermGen space

    • Type: Bug
    • Resolution: Incomplete
    • Priority: Major
    • Component: disk-usage-plugin
    • Environment: -Xmx768m -XX:+CMSClassUnloadingEnabled -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/jenkins/memory.dump -XX:+UseCompressedOops -Djava.awt.headless=true

      I have a situation where I am running out of PermGen space on a regular basis. I can literally watch the PermGen use grow until the server falls over. I have seen numerous other tickets that point toward this actually being a plugin issue, and that the only way to troubleshoot it is with a memory dump. I am of course able to increase the amount of PermGen, but all this does is delay the inevitable. No matter what level I set PermGen to, it eventually falls over.

      I have attached a memory dump.
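Growth of this kind can be confirmed from the command line before the server falls over. A rough diagnostic sketch, assuming a HotSpot JDK 7 master whose service runs as user `jenkins` (the PID lookup and dump path are assumptions, not from this report):

```shell
# Find the Jenkins master PID (assumes the service runs as user "jenkins").
JENKINS_PID=$(pgrep -u jenkins -f jenkins.war | head -n1)

# Watch PermGen utilization every 5 seconds: the P column of
# `jstat -gcutil` is the percentage of permanent generation in use
# (HotSpot JDK 7 and earlier; the permanent generation was removed in JDK 8).
jstat -gcutil "$JENKINS_PID" 5000

# Capture a heap dump on demand for analysis in Eclipse MAT or jhat,
# rather than waiting for -XX:+HeapDumpOnOutOfMemoryError to fire.
jmap -dump:format=b,file=/var/log/jenkins/manual.hprof "$JENKINS_PID"

# Summarize class-loader footprint in PermGen (slow; pauses the JVM).
# Leaked plugin class loaders typically show up as many "dead" loaders here.
jmap -permstat "$JENKINS_PID"
```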

      WARNING: Disk usage plugin fails during build calculation disk space of job **-*********-config-test
      java.io.IOException: remote file operation failed: /var/lib/jenkins/workspace/**-*********-config-test at hudson.remoting.Channel@236ff568:WSCPSLRDREAPP15
      at hudson.FilePath.act(FilePath.java:916)
      at hudson.FilePath.act(FilePath.java:893)
      at hudson.FilePath.exists(FilePath.java:1325)
      at hudson.plugins.disk_usage.DiskUsageUtil.calculateWorkspaceDiskUsageForPath(DiskUsageUtil.java:292)
      at hudson.plugins.disk_usage.DiskUsageBuildListener.onCompleted(DiskUsageBuildListener.java:60)
      at hudson.plugins.disk_usage.DiskUsageBuildListener.onCompleted(DiskUsageBuildListener.java:23)
      at hudson.model.listeners.RunListener.fireCompleted(RunListener.java:199)
      at hudson.model.Run.execute(Run.java:1783)
      at hudson.maven.MavenModuleSetBuild.run(MavenModuleSetBuild.java:529)
      at hudson.model.ResourceController.execute(ResourceController.java:88)
      at hudson.model.Executor.run(Executor.java:234)
      Caused by: java.io.IOException: Remote call on WSCPSLRDREAPP15 failed
      at hudson.remoting.Channel.call(Channel.java:748)
      at hudson.FilePath.act(FilePath.java:909)
      ... 10 more
      Caused by: java.lang.OutOfMemoryError: PermGen space


          David Hoffman added a comment -

          The memory dump I tried to attach was too large. It can be downloaded from https://dl.dropboxusercontent.com/u/8866678/memory.dump.bz2

          Greg horvath added a comment -

          Are you using the TAP plugin? That thing is a beast; we just figured out that it was the cause of multiple Jenkins instances grinding to a halt over the past few days.

          Daniel Beck added a comment -

          To clarify, you did set e.g. -XX:MaxPermSize=256M but it didn't help for long?

          FWIW I'd get rid of Global Build Stats. (Won't help for permgen, but still...)
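The `-XX:MaxPermSize` suggestion would be applied in the Jenkins service defaults. A minimal sketch, assuming a Debian/Ubuntu package install (the `/etc/default/jenkins` path and the 512m value are assumptions, not from this report). Note also that `-XX:+CMSClassUnloadingEnabled`, which the reporter's environment already sets, only takes effect when the CMS collector itself is enabled with `-XX:+UseConcMarkSweepGC`:

```shell
# /etc/default/jenkins (Debian/Ubuntu package layout; path assumed).
# JAVA_ARGS is passed to the Jenkins master JVM at startup.
JAVA_ARGS="-Xmx768m \
  -XX:MaxPermSize=512m \
  -XX:+UseConcMarkSweepGC \
  -XX:+CMSClassUnloadingEnabled \
  -XX:+HeapDumpOnOutOfMemoryError \
  -XX:HeapDumpPath=/var/log/jenkins/memory.dump \
  -XX:+UseCompressedOops \
  -Djava.awt.headless=true"
```

Raising MaxPermSize only buys time if classes leak; class unloading (CMS plus `CMSClassUnloadingEnabled`) is what lets a collection actually reclaim PermGen.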

          Daniel Beck added a comment - To clarify, you did set e.g. -XX:MaxPermSize=256M but it didn't help for long? FWIW I'd get rid of Global Build Stats. (Won't help for permgen, but still...)

          Lucie Votypkova added a comment -

          I was not able to determine the cause of the OOM, so I cannot exclude disk-usage, but I do not think it is the root cause. If common usage of disk-usage caused this error, more people would have the same problem. I cannot identify the root cause from the dump, and your log only proves that disk-usage took the last byte, not who took the most. Do you still have this problem? Which version of Jenkins did you use?

          Daniel Beck added a comment -

          lvotypkova This issue is almost a year old with no response from the reporter; I think it can safely be resolved as Incomplete. It doesn't show an issue in Disk Usage Plugin after all.

            Assignee: Unassigned
            Reporter: David Hoffman (dave_hoffman)
            Votes: 0
            Watchers: 4

              Created:
              Updated:
              Resolved: