• Type: Bug
    • Resolution: Unresolved
    • Priority: Major
    • Component: maven-plugin
    • Environment: Ubuntu 11.10, Java 1.6

      Was running Jenkins 1.506, then rolled forward to 1.527 (also updated plugins, which I won't list here unless we think a specific one might be causing the problem).

      Didn't change any jobs, just kept running as-is.

      Suddenly, after the server has been up for more than a couple of days, I start getting these messages:
      FATAL: PermGen space
      java.lang.OutOfMemoryError: PermGen space
      at java.lang.ClassLoader.defineClass1(Native Method)
      at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
      at org.apache.tools.ant.AntClassLoader.defineClassFromData(AntClassLoader.java:1128)
      at hudson.ClassicPluginStrategy$AntClassLoader2.defineClassFromData(ClassicPluginStrategy.java:696)
      at org.apache.tools.ant.AntClassLoader.getClassFromStream(AntClassLoader.java:1299)
      at org.apache.tools.ant.AntClassLoader.findClassInComponents(AntClassLoader.java:1355)
      at org.apache.tools.ant.AntClassLoader.findClass(AntClassLoader.java:1315)
      at org.apache.tools.ant.AntClassLoader.loadClass(AntClassLoader.java:1068)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
      at org.apache.http.impl.conn.PoolingClientConnectionManager.leaseConnection(PoolingClientConnectionManager.java:222)
      at org.apache.http.impl.conn.PoolingClientConnectionManager$1.getConnection(PoolingClientConnectionManager.java:199)
      at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:456)
      at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
      at org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
      at org.apache.maven.wagon.shared.http4.AbstractHttpClientWagon.execute(AbstractHttpClientWagon.java:746)
      at org.apache.maven.wagon.shared.http4.AbstractHttpClientWagon.fillInputData(AbstractHttpClientWagon.java:886)
      at org.apache.maven.wagon.StreamWagon.getInputStream(StreamWagon.java:116)
      at org.apache.maven.wagon.StreamWagon.getIfNewer(StreamWagon.java:88)
      at org.apache.maven.wagon.StreamWagon.get(StreamWagon.java:61)
      at org.eclipse.aether.connector.wagon.WagonRepositoryConnector$GetTask.run(WagonRepositoryConnector.java:660)
      at org.eclipse.aether.util.concurrency.RunnableErrorForwarder$1.run(RunnableErrorForwarder.java:67)
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
      at java.lang.Thread.run(Thread.java:679)

      ------------------
      Something changed recently that is slowly using up more and more memory; this wasn't an issue before. I've even tried increasing the amount of PermGen space available, with no measurable effect.
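
      For reference, a minimal sketch of how PermGen is typically sized on a Java 6/7 HotSpot JVM when the war is launched directly; the paths and sizes below are placeholders, not this installation's actual values:

      #!/bin/bash
      # Sketch only: JVM options must come before -jar, otherwise java treats them
      # as application arguments and silently ignores them (see the comments below).
      java -Xmx1024m -XX:PermSize=256m -XX:MaxPermSize=512m \
           -jar /path/to/jenkins.war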

          [JENKINS-19335] Sudden influx of permgen space issues

          Waldek M added a comment -

          Weird. It seems that whatever I set, it gets ignored.
          Can someone please tell me what I'm doing wrong?

          /home/myuser/jdk1.6.0_24/bin/java -Dcom.sun.akuma.Daemon=daemonized -jar /home/myuser/hudson/jenkins.war --daemon --config=/home/myuser/hudson/hudson_1/etc/winstone.properties --logfile=/home/myuser/hudson/hudson_1/log/20130930104251-winstone.log -XX:MaxPermSize=512m -XX:PermSize=256m

          And MaxPermSize is still stuck at 82MB:

          jmap -heap 907
          Attaching to process ID 907, please wait...
          Debugger attached successfully.
          Server compiler detected.
          JVM version is 19.1-b02

          using thread-local object allocation.
          Parallel GC with 18 thread(s)

          Heap Configuration:
          MinHeapFreeRatio = 40
          MaxHeapFreeRatio = 70
          MaxHeapSize = 12660506624 (12074.0MB)
          NewSize = 1310720 (1.25MB)
          MaxNewSize = 17592186044415 MB
          OldSize = 5439488 (5.1875MB)
          NewRatio = 2
          SurvivorRatio = 8
          PermSize = 21757952 (20.75MB)
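
          As a cross-check, a quick way to confirm which values the running JVM actually picked up, assuming the JDK's jinfo and jps tools are on the PATH (907 is the PID from the jmap output above):

          # Print the flag value the live JVM is actually using
          jinfo -flag MaxPermSize 907
          # List Java processes together with the JVM arguments they were started with
          jps -lv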

          A K added a comment -

          http://stackoverflow.com/questions/14762162/how-do-i-give-jenkins-more-heap-space-when-its-running-as-a-daemon-on-ubuntu#comment27039638_14762164

          Waldek M added a comment -

          Thanks, but I don't think that's it; I'm not starting the service - I'm starting the Jenkins war directly with java, setting the parameters as above. Using JAVA_OPTS has no effect.
          Or did you have something specific in mind?

          Waldek M added a comment - edited

          Got it:
          a) I moved the -XX parameters to the front, before the -jar.
          b) I used a newer Java; 1.6.0_24 didn't recognize -XX:MaxPermSize.
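
          Based on the command quoted earlier, a sketch of the corrected ordering; everything after "-jar jenkins.war" is handed to Jenkins as an application argument, so -XX options placed there never reach the JVM:

          # JVM options first, then -jar, then application arguments
          java -XX:PermSize=256m -XX:MaxPermSize=512m \
               -Dcom.sun.akuma.Daemon=daemonized \
               -jar /home/myuser/hudson/jenkins.war \
               --daemon --config=/home/myuser/hudson/hudson_1/etc/winstone.properties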

          Matthew Stevens added a comment -

          This also worked on a Mac Server installation. Let's get that added to the documentation.

          Our script to start Jenkins now:

          #!/bin/bash
          export JENKINS_HOME=/jenkins
          java -DJENKINS_HOME=/jenkins/ -Xmx1536M -XX:MaxPermSize=1536M -jar /jenkins/jenkins.war

          Ronald Jenkins Jr added a comment -

          Also seeing this issue with version 1.534, with the Java 6 default permgen space of 82MB. After startup, permgen usage creeps upward at a slow pace, and actually using Jenkins has a more noticeable impact (see below).

          Using VisualVM to profile Jenkins 1.534, here's what I've noticed:

          • After starting, permgen is around 50MB.
          • Loading the home page for the first time brings it to around 60MB.
          • Running an empty freestyle job raises it slightly, on the order of 1MB.
          • Running a Maven project with Sonar analysis usually pushes permgen to its limit, causing Jenkins to become unresponsive.
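
          Alongside VisualVM, a lightweight way to watch the same growth from a shell is the JDK's jstat tool (a sketch, assuming the Jenkins PID is known):

          # Sample permgen every 10 seconds; on Java 6/7 the PC/PU columns of -gc are
          # permgen capacity/usage in KB, and the P column of -gcutil is percent used
          jstat -gc <jenkins-pid> 10s
          jstat -gcutil <jenkins-pid> 10s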

          Notable plugins on our Jenkins install:

          • Maven Project 2.0
          • Artifactory 2.2.0
          • Sonar 2.1
          • Some internally developed plugins, which I'm pretty sure aren't causing this issue (I'll try disabling them tomorrow)

          I used Eclipse MAT's "Leak Suspects" report, and here's what I got:

          One instance of "org.kohsuke.stapler.WebApp" loaded by "winstone.classLoader.WebappClassLoader @ 0xa06c3d30" occupies 10,064,168 (19.93%) bytes. The memory is accumulated in one instance of "java.util.HashMap$Entry[]" loaded by "<system class loader>".

          Keywords
          winstone.classLoader.WebappClassLoader @ 0xa06c3d30
          java.util.HashMap$Entry[]
          org.kohsuke.stapler.WebApp

          6,778 instances of "java.lang.Class", loaded by "<system class loader>" occupy 9,953,992 (19.71%) bytes.

          Biggest instances:
          • class org.apache.commons.jexl.util.introspection.UberspectImpl @ 0x830d4048 - 4,955,632 (9.81%) bytes.
          • class sun.security.provider.X509Factory @ 0x806c9590 - 643,608 (1.27%) bytes.

          Keywords
          java.lang.Class
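
          For anyone wanting to reproduce this analysis, a sketch of capturing a dump that Eclipse MAT can open, assuming the JDK's jmap tool and a known Jenkins PID (path and filename are placeholders):

          # Write a binary heap dump for MAT's Leak Suspects report
          jmap -dump:format=b,file=/tmp/jenkins-heap.hprof <jenkins-pid>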

          Ronald Jenkins Jr added a comment -

          This issue seems to have gotten better after downgrading from 1.534 to 1.523, but I'm not fully convinced that there is no memory leak in that older version. With minimal plugins installed, garbage collection was still occurring when the permgen ceiling was reached, although it finally did go unresponsive once.

          For now, we've upped our permgen space to 128MB, but I sense we're just postponing the inevitable.

          Daniel Beck added a comment -

          Can the issue be reproduced in recent versions of Jenkins, i.e. 1.554.x or 1.560+?

          George McMullen added a comment -

          I have a similar issue with Jenkins v1.570 on OSX 10.9.3 (Java 1.7.0_25-b15). I have just started to try to debug it. On startup I see that permgen is at 77M; it grows about 2K every few seconds and crashes pretty quickly. This started either when we updated Jenkins to a more recent version, when we updated a plug-in, or when we switched one of the builds for an iOS app to use a workspace instead of a target. Unfortunately, we don't have a clear picture of when the crashes started, so it's difficult to determine what might have been the cause.

          Heap size was 1024M, but I upped it to 1536M with no effect. I'm going to try to up the permgen though I don't know what the default permgen size is, so it will have to be an arbitrary value of 128M or more.

          I've installed the Monitoring plugin, which is at least giving me some data. I installed VisualVM, though it doesn't see the Jenkins process for some reason. I'm also looking at https://wiki.jenkins-ci.org/display/JENKINS/I%27m+getting+OutOfMemoryError to see if there is a way to get a dump after it crashes.

          When I have more data I will post up here.
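
          Two sketches that may help here, assuming a HotSpot 7 JVM: the default permgen ceiling can be read directly instead of guessed, and the JVM can be told to write a dump automatically when the OutOfMemoryError hits:

          # Print the defaults this JVM would use (values are in bytes)
          java -XX:+PrintFlagsFinal -version | grep -i permsize

          # Add to the Jenkins launch options so a dump is written on OutOfMemoryError
          # (the dump path below is a placeholder)
          -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/jenkins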

          Gennady Kupava added a comment -

          We have something like this on 554.2. After a long uptime (a week or so), we suddenly got this error. After a day, with the logs filled with this, Jenkins just hung.

          Aug 7, 2014 12:24:46 AM hudson.model.Run execute
          INFO: latency #193 main build action completed: SUCCESS
          Aug 7, 2014 3:27:25 AM hudson.model.AsyncPeriodicWork$1 run
          INFO: Started Fingerprint cleanup
          Aug 7, 2014 3:27:25 AM hudson.model.FingerprintCleanupThread execute
          INFO: Cleaned up 0 records
          Aug 7, 2014 3:27:25 AM hudson.model.AsyncPeriodicWork$1 run
          INFO: Finished Fingerprint cleanup. 6 ms
          Exception in thread "http-bio-1050-exec-92" java.lang.OutOfMemoryError: PermGen space
          at java.lang.Throwable.getStackTraceElement(Native Method)
          at java.lang.Throwable.getOurStackTrace(Throwable.java:591)
          at java.lang.Throwable.printStackTrace(Throwable.java:510)
          at java.util.logging.SimpleFormatter.format(SimpleFormatter.java:72)
          at org.apache.juli.FileHandler.publish(FileHandler.java:200)
          at java.util.logging.Logger.log(Logger.java:458)
          at java.util.logging.Logger.doLog(Logger.java:480)
          at java.util.logging.Logger.logp(Logger.java:680)
          at org.apache.juli.logging.DirectJDKLog.log(DirectJDKLog.java:185)
          at org.apache.juli.logging.DirectJDKLog.error(DirectJDKLog.java:151)
          at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:260)
          at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
          at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
          at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:168)
          at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:99)
          at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:929)
          at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
          at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:407)
          at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1002)
          at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:585)
          at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:312)
          at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
          at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
          at java.lang.Thread.run(Thread.java:619)
          Exception in thread "http-bio-1050-exec-96" java.lang.OutOfMemoryError: PermGen space
          at java.lang.Throwable.getStackTraceElement(Native Method)
          at java.lang.Throwable.getOurStackTrace(Throwable.java:591)
          at java.lang.Throwable.printStackTraceAsCause(Throwable.java:529)
          at java.lang.Throwable.printStackTrace(Throwable.java:516)
          at java.util.logging.SimpleFormatter.format(SimpleFormatter.java:72)
          at org.apache.juli.FileHandler.publish(FileHandler.java:200)
          at java.util.logging.Logger.log(Logger.java:458)
          at java.util.logging.Logger.doLog(Logger.java:480)
          at java.util.logging.Logger.logp(Logger.java:680)
          at org.apache.juli.logging.DirectJDKLog.log(DirectJDKLog.java:185)
          at org.apache.juli.logging.DirectJDKLog.error(DirectJDKLog.java:151)
          at org.apache.catalina.core.ApplicationContext.log(ApplicationContext.java:742)
          at org.apache.catalina.core.ApplicationContextFacade.log(ApplicationContextFacade.java:325)
          at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:762)

          at org.kohsuke.stapler.Stapler.invoke(Stapler.java:858)
          at org.kohsuke.stapler.MetaClass$12.dispatch(MetaClass.java:390)
          at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:728)
          at org.kohsuke.stapler.Stapler.invoke(Stapler.java:858)
          at org.kohsuke.stapler.MetaClass$6.doDispatch(MetaClass.java:248)
          at org.kohsuke.stapler.NameBasedDispatcher.dispatch(NameBasedDispatcher.java:53)
          at org.kohsuke.stapler.Stapler.tryInvoke(Stapler.java:728)
          at org.kohsuke.stapler.Stapler.invoke(Stapler.java:858)
          at org.kohsuke.stapler.Stapler.invoke(Stapler.java:631)
          at org.kohsuke.stapler.Stapler.service(Stapler.java:225)
          at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
          at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
          at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
          at hudson.util.PluginServletFilter$1.doFilter(PluginServletFilter.java:96)
          at hudson.util.PluginServletFilter.doFilter(PluginServletFilter.java:88)
          at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
          at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
          at hudson.security.csrf.CrumbFilter.doFilter(CrumbFilter.java:48)
          Exception in thread "Ping thread for channel hudson.remoting.Channel@698fe8fc:linux-ny3" Exception in thread "Ping thread for channel hudson.remoting.Channel@436cab6b:linux-nj3" java.lang.OutOfMemoryError: PermGen space
          java.lang.OutOfMemoryError: PermGen space
          Exception in thread "Ping thread for channel hudson.remoting.Channel@6adc0b2f:windows" java.lang.OutOfMemoryError: PermGen space


            Assignee: Unassigned
            Reporter: A K (casualt)
            Votes: 5
            Watchers: 9
