• Type: Bug
    • Resolution: Unresolved
    • Priority: Critical
    • Component: timestamper-plugin
    • Labels: None
    • Environment: Jenkins LTS 2.263.4
      timestamper-plugin 1.13
      OpenJDK 1.8.0_282
      CentOS 7.9.2009

      Since updating timestamper-plugin to version 1.13, Jenkins throws IOExceptions for "Too many open files" after some hours and becomes unresponsive.

      Checking with Linux's "lsof -u jenkins" command, we see many open files matching the pattern

      /var/lib/jenkins/jobs/*/builds/*/timestamper/timestamps

      The number of open files matching this pattern increases over time until the limit is reached.

      A Jenkins restart frees the file handles and makes Jenkins responsive again - until the open file limit is reached once more.

      A downgrade of timestamper-plugin to version 1.12 solved the issue for us.
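
      As a rough illustration (hypothetical code, not part of the original report), the same count can be taken from inside the JVM on Linux by reading /proc/self/fd, which mirrors what "lsof -u jenkins" shows from outside the process:

      import java.io.IOException;
      import java.nio.file.Files;
      import java.nio.file.Path;
      import java.nio.file.Paths;
      import java.util.stream.Stream;

      // Hypothetical Linux-only sketch: counts this JVM's open file descriptors
      // that resolve to a Timestamper "timestamps" file.
      public class OpenTimestampsCounter {
          public static void main(String[] args) throws IOException {
              long count;
              try (Stream<Path> fds = Files.list(Paths.get("/proc/self/fd"))) {
                  count = fds.filter(fd -> {
                      try {
                          // Each entry in /proc/self/fd is a symlink to the open file.
                          return Files.readSymbolicLink(fd).toString()
                                  .endsWith("/timestamper/timestamps");
                      } catch (IOException e) {
                          return false; // descriptor was closed while being inspected
                      }
                  }).count();
              }
              System.out.println("Open timestamps handles: " + count);
          }
      }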


          [JENKINS-65738] File handle leak in timestamper plugin

          Basil Crow added a comment -

          Thanks for reporting this, lenslord. I did indeed change the code in TimestampsReader and TimestampsWriter from new FileInputStream and new FileOutputStream to Files.newInputStream and Files.newOutputStream in 1.13. However, I just audited the new code and I do not see any obvious file handle leaks. And interactive testing of 1.13 on 2.277.4 LTS while watching the open file table did not reveal any problems either: I was able to create and run multiple timestamped Freestyle jobs concurrently, viewing the console logs in my browser, and at the conclusion of it all there were no timestamps files left open. Timestamper 1.13 was also released 3 weeks ago, and I find it surprising that no other users have complained besides you. If there were truly a leak that affected all users, I would expect to have heard from more people by now.

          I think whatever is going on must be specific to your Jenkins installation. It's actually normal for the timestamps files to be open while users are viewing console logs, but those files should be closed once the thread handling the HTTP request finishes its processing. I confirmed this interactively with 1.13 and 2.277.4 LTS. I wonder if this might be a red herring and your real problem is with some other open files. Or maybe you are having a problem with some other plugin.

          Since I can't reproduce this based on the information you've given me, I am going to need a little help from your side to get to the bottom of this. Can you run Jenkins with the File Leak Detector agent? Then when you hit a "Too many open files" error, the agent will dump a table to standard error with the list of open files and the stack trace of the thread that opened the file. Send me that complete table and I can possibly help.
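
          To make the open/close pattern in question concrete, here is a minimal sketch; the class, method names, and path below are made up for illustration and are not the plugin's actual code. Whether a file is opened with new FileInputStream or Files.newInputStream, the descriptor is only released when the stream is closed, so any code path that skips the close would show up exactly as the growing list of timestamps entries in lsof:

          import java.io.IOException;
          import java.io.InputStream;
          import java.nio.file.Files;
          import java.nio.file.Path;
          import java.nio.file.Paths;

          public class TimestampsReadSketch {

              // Hypothetical path, mirroring the pattern reported by lsof.
              private static final Path TIMESTAMPS =
                      Paths.get("/var/lib/jenkins/jobs/example/builds/1/timestamper/timestamps");

              // Correct: try-with-resources closes the stream (and releases the
              // file descriptor) even if an exception is thrown while reading.
              static long countBytes() throws IOException {
                  try (InputStream in = Files.newInputStream(TIMESTAMPS)) {
                      long count = 0;
                      while (in.read() != -1) {
                          count++;
                      }
                      return count;
                  }
              }

              // Leaky: the stream returned by Files.newInputStream (just like one
              // from new FileInputStream) is never closed, so each call pins one
              // file descriptor for the life of the process.
              static long countBytesLeaky() throws IOException {
                  InputStream in = Files.newInputStream(TIMESTAMPS);
                  long count = 0;
                  while (in.read() != -1) {
                      count++;
                  }
                  return count; // in.close() is missing
              }
          }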


          Sean Humes added a comment -

          Hi basil,

          I think my team's instance recently saw something similar to this. We also saw Jenkins begin to throw IOExceptions of "too many open files" as we began to scale up the number of jobs running daily (this would occur about every 3-4 days). My team looked through the host machine's open file descriptors, and many of them pointed to the `timestamper/timestamps` file for jobs that had already finished running:

          /<path-to-jenkins>/.jenkins/jobs/<job-name>/builds/263/timestamper/timestamps 

          We had also recently upgraded our Jenkins instance, which updated the Timestamper plugin from 1.12 to 1.13 (the instance itself went from 2.249.1 to 2.303.2). We ended up downgrading the Timestamper plugin back to 1.12, like Michael, and we haven't seen the issue since (about 3 months ago now).

          Unfortunately, I cannot install the File Leak Detector plugin due to our policies for installing new plugins on our instance, but I hope this helps.



          Basil Crow added a comment -

          What job types are these? I seem to recall an open file issue with the Maven job type (Maven plugin), but I am not aware of any issues with Freestyle or Pipeline jobs.


          Sean Humes added a comment -

          The majority of these jobs are Freestyle, with some Matrix jobs. We're planning to migrate all jobs to Pipeline jobs soon. We don't use Maven jobs on our current instance. Let me know if you have any other questions and I'll do my best to answer them.


          Basil Crow added a comment -

          Have you installed the File Leak Detector and debugged the issue?


            Assignee: Unassigned
            Reporter: Michael Meissner (lenslord)
            Votes: 0
            Watchers: 3
