
[JENKINS-13617] 64-bit java.lang.OutOfMemoryError: PermGen space

    • Type: Bug
    • Resolution: Fixed
    • Priority: Major
    • Component: core

      Even with -XX:PermSize=512M I still get java.lang.OutOfMemoryError: PermGen space about once a day under light load. Our 32-bit Jenkins has never had this problem, with no special settings. Memory leak?

      Apr 26, 2012 9:56:34 AM winstone.Logger logInternal
      WARNING: Untrapped Error in Servlet
      java.lang.OutOfMemoryError: PermGen space
      at java.lang.Throwable.getStackTraceElement(Native Method)
      at java.lang.Throwable.getOurStackTrace(Throwable.java:591)
      at java.lang.Throwable.printStackTraceAsCause(Throwable.java:529)
      at java.lang.Throwable.printStackTraceAsCause(Throwable.java:545)
      at java.lang.Throwable.printStackTraceAsCause(Throwable.java:545)
      at java.lang.Throwable.printStackTrace(Throwable.java:516)
      at net.bull.javamelody.MonitoringFilter.doFilter(MonitoringFilter.java:224)
      at net.bull.javamelody.MonitoringFilter.doFilter(MonitoringFilter.java:171)
      at net.bull.javamelody.PluginMonitoringFilter.doFilter(PluginMonitoringFilter.java:86)
      at org.jvnet.hudson.plugins.monitoring.HudsonMonitoringFilter.doFilter(HudsonMonitoringFilter.java:84)
      at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:98)
      at hudson.plugins.greenballs.GreenBallFilter.doFilter(GreenBallFilter.java:74)
      at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:98)
      at hudson.util.PluginServletFilter.doFilter(PluginServletFilter.java:87)
      at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
      at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
      at hudson.security.csrf.CrumbFilter.doFilter(CrumbFilter.java:47)
      at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
      at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
      at hudson.security.ChainedServletFilter$1.doFilter(ChainedServletFilter.java:84)
      at hudson.security.ChainedServletFilter.doFilter(ChainedServletFilter.java:76)
      at hudson.security.HudsonFilter.doFilter(HudsonFilter.java:164)
      at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
      at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
      at hudson.util.CharacterEncodingFilter.doFilter(CharacterEncodingFilter.java:81)
      at winstone.FilterConfiguration.execute(FilterConfiguration.java:194)
      at winstone.RequestDispatcher.doFilter(RequestDispatcher.java:366)
      at winstone.RequestDispatcher.forward(RequestDispatcher.java:331)
      at winstone.RequestHandlerThread.processRequest(RequestHandlerThread.java:215)
      at winstone.RequestHandlerThread.run(RequestHandlerThread.java:138)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
      at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
      Apr 26, 2012 9:56:37 AM winstone.Logger logInternal
      WARNING: Untrapped Error in Servlet
      java.lang.OutOfMemoryError: PermGen space
      Apr 26, 2012 9:56:50 AM hudson.triggers.SafeTimerTask run
      SEVERE: Timer task hudson.model.LoadStatistics$LoadStatisticsUpdater@2b1c2043 failed
      java.lang.OutOfMemoryError: PermGen space
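
      A note on the flag in the report above: on HotSpot JVMs, -XX:PermSize only sets the initial size of the permanent generation; the hard cap is -XX:MaxPermSize, which defaults to a much smaller value (on the order of 64-82 MB for Java 6, depending on platform). A minimal launch sketch with both set, where the heap size and war path are placeholders rather than the reporter's actual values:

          java -XX:PermSize=512m -XX:MaxPermSize=512m -Xmx1024m -jar jenkins.war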

        Attachments:
          1. memory.dump.bz2 (8.53 MB)
          2. periodicbackup.hpi (1.67 MB)
          3. periodicbackup.hpi (1.67 MB)
          4. memory.dump.bz2 (8.54 MB)


          Adam Sloan created issue -

          wgracelee added a comment -

          We're getting this error at least once a day since we upgraded to 1.463. On the previous version, 1.456, we hardly ever saw it. The underlying OS is CentOS 5.5, 64-bit.


          wgracelee added a comment -

          And our Jenkins is being run as:
          java -XX:NewSize=256m -XX:MaxNewSize=256m -XX:SurvivorRatio=8 -XX:+UseConcMarkSweepGC -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -Xms... -Xmx... -jar ...

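          For context on that command: -XX:+CMSClassUnloadingEnabled together with -XX:+CMSPermGenSweepingEnabled lets the CMS collector sweep the permanent generation and unload dead classes on Java 6 HotSpot, but neither flag raises the PermGen cap. A sketch of the same command with an explicit cap added (the -Xms/-Xmx ellipses are the reporter's; the 512m value is illustrative):

          java -XX:NewSize=256m -XX:MaxNewSize=256m -XX:SurvivorRatio=8 -XX:+UseConcMarkSweepGC -XX:+CMSPermGenSweepingEnabled -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=512m -Xms... -Xmx... -jar jenkins.war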

          evernat added a comment -

          and you have not set MaxPermSize?


          wgracelee added a comment -

          Nope. I'll set it to 512m and see how it turns out.


          Frédéric Camblor added a comment - edited

          Seems like I'm hitting the same issue, once a day (I restart, which solves the problem until the next day).

          I'm facing it on 1.466.1 (LTS) under Ubuntu, installed via the automatic package installer.


          Michael Pailloncy added a comment -

          I think we're facing the same problem. Our Jenkins crashes every 2 days.
          Configuration:
          Jenkins 1.478 running on AIX 5.3, IBM Java 6, 5 slaves (RHEL-5 and Windows), and nearly 100 active jobs.

          When analyzing memory with Javamelody monitoring, it seems that PermGen (native memory in the case of the IBM JVM) is constantly increasing.

          We have not yet found the cause, but we noticed that it often locks up at fixed hours (4:00 pm, 6:00 pm, ...). Our suspicion is on plugins or features performing heavy tasks periodically.

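          To confirm the trend Javamelody shows without a full profiler, the standard java.lang.management API exposes the same pool. A minimal, self-contained sketch (the class name is mine; the "Perm" filter matches HotSpot's "Perm Gen" pool, and IBM J9 uses different pool names, so the filter would need adjusting there):

          import java.lang.management.ManagementFactory;
          import java.lang.management.MemoryPoolMXBean;
          import java.lang.management.MemoryUsage;

          public class PermGenWatch {
              public static void main(String[] args) {
                  // Walk every memory pool the JVM exposes and report the
                  // permanent-generation ones; max prints -1 if unbounded.
                  for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
                      if (pool.getName().contains("Perm")) {
                          MemoryUsage u = pool.getUsage();
                          System.out.printf("%s: used=%dM committed=%dM max=%dM%n",
                                  pool.getName(),
                                  u.getUsed() >> 20,
                                  u.getCommitted() >> 20,
                                  u.getMax() >> 20);
                      }
                  }
              }
          }

          Run it periodically, and a "used" figure that keeps climbing across full GCs points to a classloader leak rather than an undersized pool.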

          If you suspect you have a leak (in the current case, in the permanent generation), then you can try to use Plumbr with your Jenkins instance: http://plumbr.eu/blog/plumbr-1-1-we-now-find-permgen-leaks

          Disclaimer: I am Plumbr's developer

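          A lighter-weight alternative for HotSpot users: the jmap tool shipped with Java 6/7 has a -permstat mode that prints per-classloader statistics; dead classloaders still holding many classes are the classic signature of a PermGen leak. Replace <pid> with the Jenkins process id:

          jmap -permstat <pid>

          Each row lists the classloader, how many classes it loaded, the bytes they occupy, and whether the loader is still alive.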
          Greg Allen made changes -
          Link: This issue is related to JENKINS-15552

          Steve Roth added a comment -

          me too, on 1.491


            Assignee: Johno Crawford (johno)
            Reporter: Adam Sloan (asloan7)
            Votes: 4
            Watchers: 12