[JENKINS-18486] When providing multiple jobs to run, support running them one after another and not all together

When I set "Projects to build" to a list of multiple jobs, I'd like the option to run them serially rather than all in parallel.

A use case:

• I set "Projects to build" to jobs A, B, C.
• Job B needs the output of an action from job A, and job C needs output from both A and B.

Hard-coding the chain is not acceptable, as the "Projects to build" list is dynamic and is itself a parameter.


Christian Alexander Wolf added a comment:

Same thing here. I spent quite some time assuming the projects would be executed in order, only to find out that they aren't. Apparently the triggered projects are executed in alphabetical order of their names, so a workaround for now is to ensure that dependent projects have "greater" (later-sorting) names than the projects they depend on.

But I would really like to be able to define the order when creating the initial project in the first place.

Edd Grant added a comment (edited):

+1 It would be a really useful feature to be able to trigger downstream parameterised jobs serially, and optionally to decide not to continue if any of those jobs fail.

[edit:] I've just realised that this can be achieved with the "Trigger/ Call builds on other projects" build step, which allows exactly the configuration I was after: block/ don't block, fail conditionally, etc. So please feel free to discount my +1.
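
For readers who cannot use that build step, a similar serial, stop-on-failure flow can be scripted against the Jenkins remote access API. The sketch below is a minimal illustration only, not part of this plugin: it assumes a placeholder JENKINS_URL, API-token authentication, no CSRF crumb requirement, and parameterless jobs named A, B and C; trigger_and_wait is a hypothetical helper, not an existing API.

```python
# Illustrative sketch: trigger downstream jobs one after another via the
# Jenkins remote access API, stopping the chain on the first failure.
# JENKINS_URL, AUTH and JOBS are placeholders; a CSRF crumb may be required
# on some instances.
import time
import requests

JENKINS_URL = "http://jenkins.example.com"   # placeholder
AUTH = ("user", "api-token")                 # placeholder credentials
JOBS = ["A", "B", "C"]                       # run in this exact order


def trigger_and_wait(job):
    # POST /job/<name>/build responds with a Location header that points
    # at the queue item created for the new build.
    resp = requests.post(f"{JENKINS_URL}/job/{job}/build", auth=AUTH)
    resp.raise_for_status()
    queue_url = resp.headers["Location"].rstrip("/") + "/api/json"

    # Wait until the queue item has left the queue ("executable" appears).
    while True:
        item = requests.get(queue_url, auth=AUTH).json()
        if "executable" in item:
            build_url = item["executable"]["url"].rstrip("/") + "/api/json"
            break
        time.sleep(2)

    # Poll the build until it finishes and return its result
    # (SUCCESS, FAILURE, ABORTED, ...).
    while True:
        build = requests.get(build_url, auth=AUTH).json()
        if not build.get("building"):
            return build.get("result")
        time.sleep(5)


for job in JOBS:
    result = trigger_and_wait(job)
    print(job, result)
    if result != "SUCCESS":
        break   # optionally stop the chain when a job fails
```

Because each job is only triggered after the previous one has finished, the ordering does not depend on queue behaviour, job names, or quiet periods.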

Christopher Shannon added a comment:

My use case involves one job triggering several other jobs that all specify "Build on the same node". I also have dependencies between these jobs, and the current behavior of ordering those triggered jobs effectively at random is not ideal. Note that building everything on the same node (with only one executor) precludes me from using the build step.

In my opinion, these triggered builds should be guaranteed to be queued up in the exact order in which they are specified in the job configuration. If multiple executors are available, they can certainly run in parallel, but everything would naturally run sequentially if the user specifies that everything runs on the same node.

This seems like a very small change that could solve this, and I'm really surprised it isn't already the default behavior.

Christopher Shannon added a comment:

Just a note for folks running into the same situation as me: I was able to work around my issue by setting a "Quiet period" on my post-build parameterized jobs.

I really want Job 1, Job 2, and Job 3 to be kicked off in that order. With default settings, and regardless of how they are ordered in the trigger step, they will be queued up in a seemingly random order by this plugin.

If I set a quiet period of 5 seconds for Job 1, 10 seconds for Job 2, and 15 seconds for Job 3, however, I can ensure that they execute in the order I want. I can actually watch the build queue and see the jobs reorder themselves as their respective quiet periods end.
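
If editing each job's quiet period in its configuration is inconvenient, a similar staggering can be approximated at trigger time over the remote access API, assuming the build-trigger endpoint's delay query parameter is available on the instance. The sketch below is illustrative only; the URL, credentials, job names and delays are placeholders, and parameterized jobs would need the buildWithParameters endpoint instead.

```python
# Illustrative sketch: stagger remotely triggered builds with increasing delays,
# mirroring the 5/10/15-second quiet-period workaround described above.
# JENKINS_URL, AUTH and the job names are placeholders; parameterized jobs
# would use /buildWithParameters rather than /build.
import requests

JENKINS_URL = "http://jenkins.example.com"  # placeholder
AUTH = ("user", "api-token")                # placeholder credentials

for job, delay in [("Job1", 5), ("Job2", 10), ("Job3", 15)]:
    resp = requests.post(
        f"{JENKINS_URL}/job/{job}/build",
        params={"delay": f"{delay}sec"},    # assumes the delay parameter is honoured
        auth=AUTH,
    )
    resp.raise_for_status()
```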

Assignee: huybrechts
Reporter: Eldad Assis
Votes: 3
Watchers: 4