
Too many open files upon HTTP listener init or shutdown

    • Type: Bug
    • Resolution: Fixed
    • Priority: Critical
    • Component: core
    • Environment: Jenkins 1.472 on RHEL 5 x86_64

      Trying to navigate to the Jenkins web site, the browser keeps loading indefinitely. Checking the Jenkins console showed that Jenkins failed with the following:

      Jul 6, 2012 1:49:28 PM winstone.Logger logInternal
      SEVERE: Error during HTTP listener init or shutdown
      java.net.SocketException: Too many open files
              at java.net.PlainSocketImpl.socketAccept(Native Method)
              at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:408)
              at java.net.ServerSocket.implAccept(ServerSocket.java:462)
              at java.net.ServerSocket.accept(ServerSocket.java:430)
              at winstone.HttpListener.run(HttpListener.java:136)
              at java.lang.Thread.run(Thread.java:662)
      

      I have checked the open files, and an unreasonably large number of descriptors are open to various jar files, with jenkins-core being the leader:

      $ ls -la /proc/26867/fd | grep jar | wc -l
      876
      $ ls -la /proc/26867/fd | grep jenkins-core | wc -l
      462
      $ ls -la /proc/26867/fd | grep stapler | wc -l
      173
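
      A minimal diagnostic sketch, assuming the same Jenkins PID (26867) as in the listing above: group the symlink targets in /proc/<pid>/fd (or the regular files reported by lsof) to see which individual jars hold the most descriptors.

      # group open descriptors by target file, most frequent first
      $ ls -l /proc/26867/fd | awk '{print $NF}' | sort | uniq -c | sort -rn | head
      # same idea via lsof, counting only regular files
      $ lsof -p 26867 | awk '$5 == "REG" {print $NF}' | sort | uniq -c | sort -rn | head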

          [JENKINS-14336] Too many open files upon HTTP listener init or shutdown

          Krzysztof Malinowski created issue -

          Christian Holmboe added a comment - edited

          We're seeing the same problem on 1.480 on CentOS 5.5


          Jason Stanley added a comment -

          Also on 1.482 running on CentOS release 5.8

          Oct 22, 2012 12:07:23 PM winstone.Logger logInternal
          SEVERE: Error during HTTP listener init or shutdown
          java.net.SocketException: Too many open files
          at java.net.PlainSocketImpl.socketAccept(Native Method)
          at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:375)
          at java.net.ServerSocket.implAccept(ServerSocket.java:470)
          at java.net.ServerSocket.accept(ServerSocket.java:438)
          at winstone.HttpListener.run(HttpListener.java:136)
          at java.lang.Thread.run(Thread.java:679)
          Oct 22, 2012 12:42:40 PM hudson.model.Run execute


          tsondergaard added a comment -

          Also on 1.485 running on RHEL-6.2


          Jeremiah Roth added a comment -

          Also seeing this on 1.466.2, running in a tomcat6 container on Ubuntu 12.04. However, it happens without the HTTP init/shutdown trigger.

          We have not been able to determine what the trigger is, but I do know that when it happens, our number of open file descriptors jumps quickly (from about 1100 normally to the terminal point of 4097 in just over an hour). The ulimit is set via "tomcat6 soft nofile 10000000" in /etc/security/limits.conf.

          When this increase in Java FDs occurs, it all comes from duplicate instances (thousands) of jenkins-core-1.466.2.jar.

          catalina.out doesn't indicate any problems until the "Too many open files" error starts showing up at about 3900 open file descriptors.
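
          A quick sketch for confirming the limit the running process actually has (which may differ from what /etc/security/limits.conf requests) and how close the process currently is to it, assuming $PID holds the Tomcat/Jenkins process id:

          # effective soft/hard limit for open files
          $ grep 'open files' /proc/$PID/limits
          # current number of open descriptors
          $ ls /proc/$PID/fd | wc -l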


          Peter Kline added a comment -

          Same here for us: Jenkins 1.500 on CentOS 5.8, Java 1.6.38.
          The only solution for us is to restart Jenkins.

          xoan vilas made changes -
          Attachment New: nofopenfiles.png [ 23245 ]
          Attachment New: jenkins.log.old.2013-02-28_13-45-23.gz [ 23246 ]

          xoan vilas added a comment -

          We see the same running Jenkins 1.500 on Red Hat Enterprise Linux Server release 5.6 (Tikanga) and Java 1.4.2.

          Attached the Jenkins log with the list of open files (most of them related to adjunct stuff from jenkins-core) and a screenshot.

          It ramps up in less than one hour, usually after a garbage collection, and at random times without an easy pattern to identify. It happens around two times a week, at any time of day or night, with a low number of sessions as well as with a high one.
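
          A simple sketch for capturing the ramp-up so it can be correlated with the GC and Jenkins logs afterwards, assuming $PID holds the Jenkins process id and /tmp/jenkins-fd-count.log is a hypothetical output path:

          # append a timestamped descriptor count once a minute
          $ while true; do echo "$(date '+%F %T') $(ls /proc/$PID/fd | wc -l)" >> /tmp/jenkins-fd-count.log; sleep 60; done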


          Liya Katz added a comment -

          Same with Jenkins 1.505 on CentOS 5.6 x86_64.


          Jeremiah Roth added a comment -

          We've taken to scheduling therapeutic Tomcat restarts every couple of weeks so as to not have Jenkins crash unexpectedly. We're discussing upgrading from openjdk-6 (1.6.0_24) to openjdk-7 in case it's a Java issue. I'd be interested to know if anyone is already using openjdk-7 and still has this problem.


            Assignee: Kohsuke Kawaguchi (kohsuke)
            Reporter: Krzysztof Malinowski (raspy)
            Votes: 13
            Watchers: 19
