Enable/Disable verbose Git command logging in Jenkins build log

    • Type: Improvement
    • Resolution: Unresolved
    • Priority: Minor
    • Component: git-client-plugin
    • Labels: None

      The git plugin currently prints out the git commands that it executes.

      git rev-parse master^{commit} # timeout=10

      In some cases, these commands can be numerous and thus distract the user from other pertinent information in the build log.

      It would be very valuable to be able to enable or disable this verbose output via an option in the job configuration.

          [JENKINS-24304] Enable/Disable verbose Git command logging in Jenkins build log

          Mark Waite added a comment - edited

          The output happens whenever the plugin needs to convert a name to a SHA1. That happens in several areas (like "Prune stale remote tracking branches").

          Sam Deane added a comment - edited

          This seems to have started happening recently on our server; it wasn't happening like this before.

          All of the log lines are of this form:
          09:41:13 > git rev-parse refs/tags/builds/appstore/3.4/15562^{commit} # timeout=10

          and there are tons of them.

          Worse still, it's taking minutes to process them all.
          Has something changed in the git plugin recently that could have caused this? It's not really an option to remove these tags.

          Sam Deane added a comment -

          To illustrate the impact of this problem, here's the first such line from a recent job:

          09:33:32 > git rev-parse refs/tags/builds/appstore/3.4.1/15745^{commit} # timeout=10

          and here's the last:

          09:41:38 > git rev-parse refs/tags/issues/closed/5510^{commit} # timeout=10

          Note the timestamps. Our job, which usually takes 20 minutes or so, is now taking an extra 8 minutes...
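
          (A rough way to reproduce this overhead outside Jenkins, as a sketch only: assuming a local clone of the same repository, run the following from bash in the workspace. It times one rev-parse process per tag, which is what the log lines suggest the plugin is doing.)

          time git for-each-ref --format='%(refname)' refs/tags |
            while read -r ref; do git rev-parse "${ref}^{commit}" >/dev/null; done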

          Sam Deane added a comment -

          (I realise that the log output and the underlying performance are different issues - both annoying though!)

          Sam Deane added a comment -

          I've just discovered the "Do not fetch tags" option, which may help. I'll try turning it on, but presumably I'll also need to manually remove all the local tags from each of the Jenkins slaves?

          I still don't understand why this has started happening recently, however. Many of the tags have been there for years.
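
          (A minimal sketch of that manual cleanup, assuming a plain git workspace on each slave; note that it deletes every local tag in the clone, so make sure nothing depends on them first.)

          git tag -l | xargs git tag -d     # remove all local tags from this clone
          git fetch --no-tags origin        # fetch future changes without tag refs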

          Mark Waite added a comment -

          samdeane I am not aware of any recent change in the plugin that would affect this behavior. Did you recently upgrade either the git plugin or the git client plugin?

          Did the number of tags (or branches) in your repository increase dramatically recently?

          Did a job setting change recently?

          Did you add a new plugin recently (like the timestamper plugin)?

          Sam Deane added a comment - edited

          > Did you recently upgrade either the git plugin or the git client plugin?

          They're upgraded (by me) periodically. I think the last change was 17 days ago. A number of plugins were updated, including the git plugin, which went from 3.0.1 to 3.2.0.

          I can't say for certain if that coincides with this problem starting.

          > Did the number of tags (or branches) in your repository increase dramatically recently?

          No. There are 2242 tags, but they're largely historical.

          > Did a job setting change recently?

          No.

          > Did you add a new plugin recently (like the timestamper plugin)?

          No, the timestamper plugin has been installed for a long time.

          Sam Deane added a comment -

          Just noticed this in one of the logs, reported directly after the last rev-parse entry. Not sure if it's relevant:

          07:41:09 JENKINS-19022: warning: possible memory leak due to Git plugin usage; see:
          https://wiki.jenkins-ci.org/display/JENKINS/Remove+Git+Plugin+BuildsByBranch+BuildData

          Sam Deane added a comment - edited

          The contents of that link are very unclear.

          Is it supposed to be a script to run as a workaround? Once? Periodically? From the shell? From a job?

          Is it a report that the script itself is the cause of the "leak"?

          It would be helpful if it were explained clearly, with instructions on what to do if you hit this problem.

          The issue that page points to in turn seems to suggest running the script, but even after reading the whole thread of comments (most of which are hidden by default), it's far from clear whether it will help, or whether it's always safe to run, etc.

          Mark Waite added a comment -

          Is it supposed to be a script to run as a workaround? Once? Periodically? From the shell? From a job?

          It is run as a workaround, whenever you encounter the problem. It can be run periodically if you wish. An alternative is to limit the amount of history you retain for your jobs.

          Is it a report that the script itself is the cause of the "leak"?

          No, the script is not the cause of the problem. The git plugin is the cause of the problem.

          It would be helpful if it was explained clearly, with instructions on what to do if you hit this problem.

          I'm not sure I understand. Can you edit that wiki page to better describe it? If a user has many history records in a git job, the git plugin incorrectly stores too much information about that history within each of the individual build records. Those bloated build records are then loaded into memory, which slows Jenkins startup and makes the Jenkins process much larger than necessary.

          The issue that page points to in turn seems to suggest running the script, but even after reading the whole thread of comments (most of which are hidden by default), it's far from clear whether it will help, or whether it's always safe to run, etc.

          If you depend on the information in those bloated build records, then the script is not safe to run. Most people do not depend on the information in those bloated build records.

          Another way to avoid the issue is to limit the number of build records you retain for your jobs. The configuration slicing plugin will allow you to modify the job definitions of all jobs in your system to limit the amount of history you keep for the jobs. That then avoids the problem by removing historical build records which include that duplicated information.
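
          (A rough way to check whether a job is affected, sketched assuming a default JENKINS_HOME of /var/lib/jenkins; build records bloated by the duplicated git data tend to show up as unusually large build.xml files.)

          find /var/lib/jenkins/jobs -name build.xml -size +1M -exec ls -lh {} \;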

            Assignee: Unassigned
            Reporter: Scott Hebert (scoheb)
            Votes: 13
            Watchers: 14