
Logging all UpstreamCause's floods Jenkins in large setups

      This bug is the same as https://issues.jenkins-ci.org/browse/JENKINS-15747, but it happens again for us with Jenkins version 1.548 (while it was working in the meantime).

      In cases where there are multiple paths through the job dependencies, logging all upstream causes generates a huge build log.
      The example job linked with this ticket has a log which is nearly 110 MB in size because of that.

      In the previous ticket the issue was addressed by "not only cap total number of transitive upstream causes but also avoid redundantly storing information about upstream causes listed elsewhere".
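
      As an illustration of those two measures, below is a minimal standalone sketch. It is not the actual Jenkins implementation; the class and field names, the cap value, and the output format are assumptions chosen only to show how capping the total number of transitive upstream causes, and referencing (rather than repeating) upstream builds already listed elsewhere, keeps the log bounded.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

class UpstreamCauseSketch {
    final String project;                      // upstream project name
    final int buildNumber;                     // upstream build number
    final List<UpstreamCauseSketch> upstream;  // transitive upstream causes

    UpstreamCauseSketch(String project, int buildNumber, List<UpstreamCauseSketch> upstream) {
        this.project = project;
        this.buildNumber = buildNumber;
        this.upstream = upstream;
    }

    // Assumed cap on the total number of causes printed; value chosen for illustration only.
    static final int MAX_CAUSES = 25;

    static void print(UpstreamCauseSketch cause, Set<String> seen, int[] budget, int indent) {
        String key = cause.project + " #" + cause.buildNumber;
        String pad = "  ".repeat(indent);
        if (!seen.add(key)) {
            // Already listed elsewhere in this cause tree: reference it instead of repeating the subtree.
            System.out.println(pad + "(already listed above) " + key);
            return;
        }
        if (budget[0] <= 0) {
            return; // total cap reached; remaining causes are omitted
        }
        budget[0]--;
        System.out.println(pad + "Started by upstream project \"" + cause.project
                + "\" build number " + cause.buildNumber);
        for (UpstreamCauseSketch up : cause.upstream) {
            print(up, seen, budget, indent + 1);
        }
    }

    public static void main(String[] args) {
        // Diamond-shaped dependency: the current build was triggered by B and C,
        // and both B and C were themselves triggered by the same build of D.
        UpstreamCauseSketch d = new UpstreamCauseSketch("D", 7, List.of());
        UpstreamCauseSketch b = new UpstreamCauseSketch("B", 3, List.of(d));
        UpstreamCauseSketch c = new UpstreamCauseSketch("C", 4, List.of(d));
        Set<String> seen = new HashSet<>();
        int[] budget = {MAX_CAUSES};
        for (UpstreamCauseSketch cause : List.of(b, c)) {
            print(cause, seen, budget, 0);
        }
    }
}
```

      In the diamond-shaped example in main, build D #7 is printed once and only referenced the second time it is reached, so the output stays small even when many paths lead to the same upstream build.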

          [JENKINS-21605] Logging all UpstreamCause's floods Jenkins in large setups

          Dirk Thomas created issue -
          Jesse Glick made changes -
          Link New: This issue is blocking JENKINS-15747 [ JENKINS-15747 ]

          Dirk Thomas added a comment -

          Is there any chance this critical regression will be addressed? It makes Jenkins basically unusable in large setups, and going back to a one-year-old release where the problem was addressed is not really an option, given the latest security fixes.


          Dirk Thomas added a comment - edited

          Can someone please look into this issue?

          The original ticket (JENKINS-15747) was already fixed, and this regression makes our logging nearly unusable.


          Dirk Thomas added a comment -

          Since you have fixed the original ticket for us, could you please look into fixing this regression?

          Dirk Thomas made changes -
          Assignee New: Jesse Glick [ jglick ]

          Jesse Glick added a comment -

          I do not really have time for it. File a pull request if you know how to fix it, or, if you are a CloudBees support customer, open a ticket.


          Daniel Beck added a comment -

          Could someone please explain to me what the critical bug is here? Checking a few of the linked builds rarely shows more than a dozen or so causes; even the linked build 813 only has a build log of less than 100 KB, and the build index page loads without problems.

          There might well have been a build with a large build log because of that (the now missing original one), but it seems to have been a freak accident, possibly due to a job not getting scheduled, not something that regularly breaks production for you.


          Dirk Thomas added a comment - edited

          The problem is that the repeated logging of all upstream causes can result in extremely large build logs.
          As mentioned in the referenced original ticket, that can be up to 100 MB for a single job.
          Having numerous jobs / builds with such extensive logs basically makes Jenkins unable to operate.

          The bug has been fixed before but has reappeared since then.
          A recent example has "only" 1.5 MB: http://jenkins.ros.org/job/ros-hydro-metapackages_binarydeb_precise_amd64/1102/consoleFull
          Just pick a random upstream build (e.g. the first line `project "ros-hydro-map-server_binarydeb_precise_amd64" build number 52`) which triggered this job and see how many repetitions are in the log.
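
          For anyone who wants to quantify that repetition, here is a small sketch that counts duplicated upstream-cause lines in a saved copy of the console log. The default file name and the matched line format are assumptions based on the example quoted above; this is not part of Jenkins.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Map;
import java.util.TreeMap;

public class CauseRepetitionCounter {
    public static void main(String[] args) throws IOException {
        // Path to a saved copy of the build's console log; the default file name is an assumption.
        Path log = Path.of(args.length > 0 ? args[0] : "consoleFull.txt");
        Map<String, Integer> counts = new TreeMap<>();
        for (String line : Files.readAllLines(log)) {
            String trimmed = line.trim();
            // Upstream-cause lines quote a project name and a build number, e.g.:
            //   project "ros-hydro-map-server_binarydeb_precise_amd64" build number 52
            if (trimmed.contains("project \"") && trimmed.contains("build number")) {
                counts.merge(trimmed, 1, Integer::sum);
            }
        }
        // Report every cause line that appears more than once.
        counts.forEach((cause, n) -> {
            if (n > 1) {
                System.out.println(n + "x  " + cause);
            }
        });
    }
}
```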


            Assignee: Unassigned
            Reporter: Dirk Thomas (dthomas)
            Votes: 5
            Watchers: 5