Type: Bug
Resolution: Fixed
Priority: Critical
Environment: Jenkins 1.472 on RHEL 5 x86_64
When navigating to the Jenkins web site, the browser keeps loading indefinitely. Checking the Jenkins console showed that Jenkins failed with the following:
Jul 6, 2012 1:49:28 PM winstone.Logger logInternal
SEVERE: Error during HTTP listener init or shutdown
java.net.SocketException: Too many open files
        at java.net.PlainSocketImpl.socketAccept(Native Method)
        at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:408)
        at java.net.ServerSocket.implAccept(ServerSocket.java:462)
        at java.net.ServerSocket.accept(ServerSocket.java:430)
        at winstone.HttpListener.run(HttpListener.java:136)
        at java.lang.Thread.run(Thread.java:662)
I checked the open files, and an unreasonably large number of descriptors are open to various jar files, with jenkins-core in the lead:
$ ls -la /proc/26867/fd | grep jar | wc -l
876
$ ls -la /proc/26867/fd | grep jenkins-core | wc -l
462
$ ls -la /proc/26867/fd | grep stapler | wc -l
173
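The per-pattern counts above can be generalized into a small helper that reports how many descriptors a process holds open, optionally filtered by jar name (a sketch; `count_fds` is a hypothetical helper, and the PID 26867 above is simply the reporter's Jenkins process):

```shell
# count_fds PID [PATTERN] -- count open file descriptors of a process,
# optionally restricted to entries matching PATTERN (e.g. a jar name).
count_fds() {
    pid=$1
    pattern=${2:-.}
    ls -la "/proc/$pid/fd" 2>/dev/null | grep -c "$pattern"
}

# Example against the current shell; for Jenkins, pass its PID instead.
count_fds $$
count_fds $$ jar
```

Running this periodically (e.g. under watch) makes it easy to see whether the jenkins-core count is climbing toward the nofile limit.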
depends on: JENKINS-20163 java.lang.NoClassDefFoundError: sun/net/www/protocol/jar/JarURLConnection (Resolved)
Also seeing this on 1.466.2, running in a tomcat6 container on Ubuntu 12.04. However, it happens without the HTTP init/shutdown trigger.
We have not been able to determine what the trigger is, but I do know that when it happens, our number of open file descriptors jumps quickly (from about 1100 normally to the terminal point of 4097 in just over an hour). The ulimit is set in /etc/security/limits.conf: tomcat6 soft nofile 10000000
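One thing worth confirming: limits.conf entries only apply to sessions started after the change, so the running tomcat6 process may have a lower effective limit than the file suggests. A minimal check, assuming a Linux /proc filesystem (the PID here is the current shell for illustration; substitute the tomcat6/Jenkins PID):

```shell
# Show the open-file limit the running process actually has; this can
# differ from /etc/security/limits.conf if the process predates the edit.
pid=$$   # substitute the tomcat6/Jenkins PID
grep "Max open files" "/proc/$pid/limits" 2>/dev/null || ulimit -n
```

If the effective soft limit reads 4096, that would line up with the process dying at 4097 descriptors despite the 10000000 configured in limits.conf.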
When this increase in Java FDs occurs, it comes entirely from duplicate open handles (thousands of them) on jenkins-core-1.466.2.jar.
catalina.out doesn't indicate any problems until the "Too many open files" error starts showing up at about 3900 open file descriptors.