
[JENKINS-43749] Support multiple Jenkinsfiles from the same repository

      This would support scenarios where different "configurations" of a pipeline cannot share the same Jenkinsfile.

      If I had multiple Jenkinsfiles in the repository github.com/apple/swift:

      /Package.jenkinsfile 
      /Incremental.jenkinsfile
      /Incremental-RA.jenkinsfile
      /Assert.jenkinsfile
      /src/…
      

      I would like to create a multibranch Pipeline for each, so I have the resulting structure:

      /Apple
      /Apple/Swift - Package
      /Apple/Swift - Incremental
      /Apple/Swift - Incremental-RA
      /Apple/Swift - Assert
      

      Note that in this example I have an organization folder for github.com/apple, and it creates a multibranch Pipeline for each Jenkinsfile discovered in each repository.

      I have written up examples and use cases in this doc.


          James Dumay added a comment -

          hrmpw this one keeps coming up all over the place (most recently on Gitter), so I thought we should share the internal document more widely with the community.


          Alex Lourie added a comment -

          Just to add a bit more clarity for our use case:

           

          We have the main repository, src. In that repository we have the code for all modules of our app, the code for packaging it into distribution packages, and the tests. We have a few distinct Jenkins jobs in that repo:

          • A job that runs tests. It can accept parameters (such as the branch to check out and run the tests on; the level of tests - small, medium, or large; and components, for specifying a subset of the project to run the tests on). This job is run both manually, as part of some operational procedures, and automatically when executed from another job.
          • A build job that builds artefacts. It is always run manually. It accepts parameters defining the branch, the environment (test/prod), the set of modules to build, and the set of packages to create.
          • A "cron" job that runs subsets of tests periodically. These subsets are very resource-heavy and cannot be run often. It mostly runs automatically on a schedule, but can also be run manually if needed.
          • A subproject in a different repo that is run, automatically or manually, by a build job with a set of dynamic parameters.

           

          At the moment, I cannot see a way to implement these in one Jenkinsfile. In particular, combining a build job (which should only report success/failure) with tests (which should show the results of passed/failed tests) into one pipeline doesn't make sense in our environment.


          Daniel Geißler added a comment -

          It seems that today the only way to get this scenario running is by changing the build behavior entirely, switching based on the job naming pattern (which does feel like bad design).

          Anyway, I am curious about that internal document.


          Patrick Wolf added a comment - edited

          Adding the ability to recognize other files is not difficult. The problem is that most people don't want to run N jobs every time a commit happens. 

          Pipeline was designed to have flow control using when and if to take different actions based on different conditions. The primary reason the Jenkinsfile exists is to tell Jenkins to create a new Pipeline for this branch. What happens in that Pipeline is entirely up to the author of the Pipeline.

          This takes into account build parameters, triggers, branches, upstream stage success, running external jobs, etc.

          E.g.

          when { branch "master" }
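
          To make that concrete, here is a minimal declarative sketch of one Pipeline whose stages are gated by branch and by an assumed MODE parameter (the parameter, stage names, and make targets are all hypothetical):

          // Hypothetical Jenkinsfile sketch: one Pipeline, conditional stages.
          pipeline {
              agent any
              parameters {
                  choice(name: 'MODE', choices: ['Incremental', 'Package', 'Assert'],
                         description: 'Which flavour of the pipeline to run')
              }
              stages {
                  stage('Build') {
                      steps { sh 'make build' }   // runs for every flavour
                  }
                  stage('Package') {
                      when {
                          allOf {
                              branch 'master'
                              expression { params.MODE == 'Package' }
                          }
                      }
                      steps { sh 'make package' }
                  }
                  stage('Assert') {
                      when { expression { params.MODE == 'Assert' } }
                      steps { sh 'make test-assert' }
                  }
              }
          }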

          Pipeline was not meant to replace Freestyle jobs 1:1 but to collapse multiple Freestyle jobs into a single cohesive flow that adapts to the conditions. With all that is possible in Pipeline, a single Jenkinsfile can account for most scenarios.

          Moving all of that flow control outside of the actual Pipeline essentially wraps Pipeline in a Freestyle configuration paradigm, and we lose all of the advantages of Pipeline as Code:

          1. The Pipelines are no longer portable.
          2. The UI becomes more complex by an order of magnitude.
          3. Changes to the flow, triggers, parameters, etc. of the different Pipeline branches lose code review and audit capabilities.

          I understand the desire to have separate build histories and change histories for discrete Pipelines, since those are the primary value-adds of having multiple files. If you don't care about separate build histories you can already use the Jenkinsfile to determine the type of build to perform and then load a specific, discrete Pipeline file from the workspace to execute. You can even set currentBuild.displayName to the type of job that is being run.
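
          For illustration, a minimal scripted sketch of that workaround, reusing the .jenkinsfile names from this ticket (the BUILD_TYPE parameter is an assumption):

          // Hypothetical scripted Jenkinsfile: pick a discrete pipeline file from
          // the workspace and execute it, labelling the run accordingly.
          node {
              checkout scm
              def flavour = params.BUILD_TYPE ?: 'Incremental'   // assumed job parameter
              currentBuild.displayName = "#${env.BUILD_NUMBER} ${flavour}"
              // load evaluates the Groovy file in place, so each .jenkinsfile can
              // carry its own discrete pipeline steps at the top level.
              load "${flavour}.jenkinsfile"                      // e.g. Package.jenkinsfile
          }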

          Just adding multiple Jenkinsfiles as a solution creates more problems than it solves. We need to examine the set of problems that are not satisfied by Jenkinsfile today and come up with an actual solution that also meets the goal of Pipeline-as-code.

           

           


          Alex Lourie added a comment -

           

          hrmpw, I am not planning to break the Pipeline's goals.

           

          It seems to me that there's an inherent assumption in the pipeline workflow - that I should have one Jenkinsfile per repository and that all flow is initiated by commits into the repository. I, however, see pipelines as advanced activities replacing previous Jenkins jobs, and not necessarily Freestyle jobs.

           

          Activities that I'm considering are not necessarily initiated by code changes in the repository:

          • I could be running tests automatically on PR create/update. Definitely a good match for a pipeline that runs automatically on specific commits to a PR.
          • I would like to choose when to build deployment artefacts (jar, binaries, packages, etc.), and from which branch. This decision may originate from code changes in the repository but also from a management decision.
          • I would like to run certain tests either on a schedule (without any code change expected) or by request (for validating a specific change).

           

          I'm not sure I understand the following example: 

          The primary reason the Jenkinsfile exists is to tell Jenkins, create a new Pipeline for this branch. What happens in that Pipeline is entirely up to author of the Pipeline.

           

          Does it mean I'm specifically changing my Jenkinsfile to match a specific branch I'm working on? If so, what happens when I merge my work to the master branch? Will the branch-specific stuff hang around? What if it's not relevant in the master branch? Will I end up with a Jenkinsfile in master with a lot of branch-related specifics?

           

          Also, I am definitely not suggesting having N jobs run per commit. What I was aiming at is basically allowing a custom Jenkinsfile per pipeline, so instead of using <repository>/Jenkinsfile, allow me to use <repository>/custompath/Jenkinsfile. I believe that this falls squarely within the existing pipeline concepts and model. It keeps the Jenkinsfile in the code repository, it allows creating multiple pipelines per repository, it allows keeping a separate history for each pipeline, and it runs one job per pipeline for any commit. More than that, the current Jenkins Pipeline job already supports providing a custom Jenkinsfile name (the Script Path setting in the job configuration) - so why can't Blue Ocean allow the same?
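
          For illustration, that Script Path setting expressed as a hedged Job DSL sketch (the job name, repository URL, and branch are hypothetical):

          // Hypothetical Job DSL sketch: a classic Pipeline job reading a
          // custom-named Jenkinsfile from the repository.
          pipelineJob('swift-package') {
              definition {
                  cpsScm {
                      scm {
                          git {
                              remote { url('https://github.com/apple/swift.git') }
                              branch('*/master')
                          }
                      }
                      scriptPath('Package.jenkinsfile')   // the "Script Path" field in the UI
                  }
              }
          }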

           

          Maybe I'm mistaken in my understanding of how pipelines work or should work, and there's still an option to do what I need with the current model?


          Daniel Geißler added a comment -

          hrmpw there are at least two more scenarios (besides separate build histories) I can think of that are currently unavailable but useful:

          1. multiple distinct projects in one repository (http://stackoverflow.com/questions/43614002/how-to-use-a-jenkins-multibranch-pipeline-with-a-monorepo)
          2. executing steps on different schedules (http://stackoverflow.com/questions/40255572/can-i-run-a-jenkins-pipeline-step-or-stage-on-individual-schedules). I can change the build based on the build trigger, but that may not be sufficient. Let's say you have fast unit tests that should run on every build, integration tests that should run on a daily basis, and performance tests that should run only on weekends; I don't have any idea how to solve that - or did I just miss something? (one possible approach is sketched below)
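
          One conceivable approach, sketched under assumptions (currentBuild.getBuildCauses requires a reasonably recent Pipeline version, Date.format may need script approval in the sandbox, and the make targets are hypothetical):

          // Hypothetical scripted sketch: one Jenkinsfile, test tiers gated by the
          // build cause and the day of week. Assumes the job has a cron trigger,
          // e.g. properties([pipelineTriggers([cron('H 2 * * *')])]).
          node {
              checkout scm
              // non-empty when this run was started by the timer rather than a push
              def timerCauses = currentBuild.getBuildCauses('hudson.triggers.TimerTrigger$TimerTriggerCause')
              boolean timed   = !timerCauses.isEmpty()
              boolean weekend = new Date().format('EEE') in ['Sat', 'Sun']

              stage('Unit tests') { sh 'make unit' }                    // every build
              if (timed) {
                  stage('Integration tests') { sh 'make integration' }  // nightly cron
                  if (weekend) {
                      stage('Performance tests') { sh 'make perf' }     // weekend runs only
                  }
              }
          }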

          You said Pipelines should not replace Freestyle projects, but this is actually the way many users need to start when migrating from their current setup to the new (very useful) pipeline world. Many build managers have to work on monolithic legacy projects that cannot be handled in just one job - not yet.

          But the cool thing is that Jenkins Pipelines initially do not force you into that pattern, at least as long as you do not try to use Multibranch-Pipelines.

          Even in multibranch pipelines we are able to deactivate SCM triggers like this, right?

          properties([overrideIndexTriggers(false)])

          So it would be super helpful to be able to define a Jenkinsfile pattern for multibranch pipelines, to hold all the jobs together while still maintaining the flexibility that scripted build descriptions offer.

          Please do not limit the potential pipelines by forcing people into specific patterns.


          Patrick Wolf added a comment -

          Currently, it is possible to have as many unique "Pipelines" as you want in a repository. I can have the structure described in this ticket just fine:

          /Jenkinsfile
          /Package.jenkinsfile 
          /Incremental.jenkinsfile
          /Incremental-RA.jenkinsfile
          /Assert.jenkinsfile
          /src/…
          

          The Jenkinsfile is the designator telling Jenkins that I want a Pipeline for this repo. When a commit is pushed to this repo it triggers a new run of the Jenkinsfile. All of the logic for what happens when that run is triggered is contained within the Jenkinsfile. With a combination of Scripted Pipeline, the build step, the load step, any parameters passed along, etc., I can run any combination of the Pipelines contained in the other .jenkinsfiles.
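
          For instance, a hypothetical excerpt from such a root Jenkinsfile, using the build step to trigger a separately created sibling job (the job and parameter names are made up):

          // Hypothetical excerpt: trigger an existing sibling Pipeline job
          // with parameters via the build step.
          node {
              if (env.BRANCH_NAME == 'master') {
                  build job: 'Swift - Package',    // assumed name of a separately created job
                        parameters: [string(name: 'ENVIRONMENT', value: 'prod')],
                        wait: false                // set true to gate on its result
              }
          }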

          I can also combine all of those builds into a single Jenkinsfile that has a set of common stages but will skip other stages based on criteria. Pretty much any configuration desired is possible with Pipeline. As I said, it's up to the Pipeline author.

          The one big thing that is missing is separate run history for each of those different runs.

          Elevating all of these various .jenkinsfiles to the same level as the Jenkinsfile means that I have to configure all of the logic of which Pipeline to run based on the given trigger criteria on a configuration screen in Jenkins. The orchestration of the various jobs is no longer part of the code but part of the configuration stored on a single master.

          The goal, as I see it, is not necessarily support for multiple Jenkinsfiles, as siblings, but to allow multiple Pipeline jobs for the same repo, each with discrete histories and the ability to configure and orchestrate all of these various jobs across branches and pull requests using information defined in the repo itself. 

           

           


          Alex Lourie added a comment -

          hrmpw

          The Jenkinsfile is the designator telling Jenkins that I want a Pipeline for this repo. When a commit is pushed to this repo it triggers a new run of the Jenkinsfile. All of the logic for what happens when that run is triggered is contained within the Jenkinsfile.

          Once again, this assumes that:

             a) a commit initiates a pipeline job, which may not always be the case in some workflows, and

             b) the only way to initiate a pipeline job seems to be a commit, which in our workflow is just not the case. Most of our builds are not initiated by a specific repo commit.

          Also, if the Jenkinsfile is the designator telling Jenkins I want a Pipeline for this repo, why not allow a custom-named designator? You still keep the 1 Jenkinsfile <-> 1 Pipeline ratio; just let me pick my own file. This would solve the issue for all of us - you keep using the same file as before (<repo>/Jenkinsfile), and I have 5 in my repo (<repo>/something[1-5]/Jenkinsfile), running 5 different pipelines with a custom file for each.

           

          The one big thing that is missing is separate run history for each of those different runs.

          Yep.

           

          Elevating all of these various .jenkinsfiles to the same level as the Jenkinsfile means that I have to configure all of the logic of which Pipeline to run based on the given trigger criteria on a configuration screen in Jenkins. The orchestration of the various jobs is no longer part of the code but part of the configuration stored on a single master.

          I understand that. Unfortunately, I see no other option when manual runs are expected - other than doing it with multiple pipeline files with "predefined" inputs and flow logic.

           

          The goal, as I see it, is not necessarily support for multiple Jenkinsfiles, as siblings, but to allow multiple Pipeline jobs for the same repo, each with discrete histories and the ability to configure and orchestrate all of these various jobs across branches and pull requests using information defined in the repo itself. 

          That would be great. If that existed, allowing truly separate Pipeline jobs that can be executed independently, it would meet our requirements.


          Rob Coward added a comment - edited

          We have an almost identical use-case to the requester's, although perhaps a little more niche. We have several monolithic repos, each with nested directories containing several Jenkinsfiles representing a number of different pipelines that would be triggered manually (in master), though there is potential for feature branches to want to trigger a CI run of the pipelines.

          My ask is that the resulting folder hierarchy in Jenkins is configurable - either to allow a flat structure, as suggested originally, using "repo - filename" to construct the folder:

          /Apple
          /Apple/Swift - Package
          /Apple/Swift - Incremental
          /Apple/Swift - Incremental-RA
          /Apple/Swift - Assert

          or to take into account the directory in which the Jenkinsfiles are found and have the resulting Jenkins folder structure represent the folder structure in SCM:

          /Apple
          /Apple/Swift/ProjectA/Workflow1
          /Apple/Swift/ProjectA/Workflow2
          /Apple/Swift/ProjectB/WorkflowA
          /Apple/Swift/ProjectC/Team1/PerfPipeline
          /Apple/Swift/ProjectC/Team2/RegressionPipeline

           


          James Dumay added a comment -

          rob_coward that's an interesting proposal. For the time being, I believe that if CloudBees picks this up we will implement the flat structure first. We may be able to consider custom structures in the future.


          Sam Van Oort added a comment -

          Interesting proposal! rob_coward could your use-case be achieved by multiple multibranch projects with different detection rules for each Jenkinsfile type?

          I'm curious where this goes, because it seems like a case where there are many possible ways to achieve similar results (multiple paths of execution within a repo) and it's unclear at this time what would be the best practice to put forward.


          James Dumay added a comment -

          jglick if you don't mind, could we please leave this open until I've had a chat with hrmpw? There are some details here I don't want to be lost if we roll this into the project recognisers ticket (which covers more than just this one).


          Jesse Glick added a comment -

          Nothing is lost in JIRA history.


          James Dumay added a comment -

          I have this referenced in a few public places and referring to it when it is closed would be confusing for those I've referenced it to.


          Nat Sr added a comment -

          Do we have any updates for this ticket? It would be useful if we could get it into Release 1.3.


          Jon-Paul Sullivan added a comment -

          Currently, to try to work around this, I am writing a Jenkinsfile that will search for all *.jenkinsfile files and create jobs with folder/name based off the dirname/basename of the files found, but this seems like something that is intrinsically better done natively in Jenkins.

          My use-case for this is for those jobs that don't necessarily build artefacts or deploy something - jobs that perform operational activities.

          I want, as far as possible, to avoid the need to use the UI for changing jobs, ensuring that all job definitions reside in a git repo. This also helps with the master-repo-with-many-submodules build use-cases that exist too.

          I know I could manually create many pipeline jobs, each referencing a particular Jenkinsfile in a repo, but I want to avoid the overhead of having to do that every time a new job is added.


          Aidan Feldman added a comment -

          > trying to write a Jenkinsfile that will search for all *.jenkinsfile and create jobs with folder/name based off the dirname/basename of files found

          j3p0uk I think the "seed" job of the Job DSL Plugin does something similar - might be worth taking a look. Any chance you could share that code, in progress or when it's in a working state?


          Jon-Paul Sullivan added a comment -

          Taking this as a starting point:

          https://stackoverflow.com/questions/41146952/multi-branch-pipeline-plugin-load-multiple-jenkinsfile-per-branch

          Still working through approvals to get scripts run outside of the Groovy sandbox, etc., as it's not my Jenkins to administer.

          Also, the decision was made to only allow jobs that are auto-generated from pipeline.

          Thanks for the pointer to the Job DSL though, I'll certainly check that out on another Jenkins.
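
          For anyone attempting the same, a rough sketch of what such a seed Jenkinsfile might look like, assuming the Pipeline Utility Steps and Job DSL plugins are installed (the folder name, repository URL, and glob are hypothetical):

          // Rough seed sketch: discover *.jenkinsfile files and generate one
          // Pipeline job per file via the Job DSL plugin.
          node {
              checkout scm
              def files = findFiles(glob: '**/*.jenkinsfile')   // Pipeline Utility Steps
              def dsl = new StringBuilder("folder('generated')\n")
              for (f in files) {
                  def name = f.name - '.jenkinsfile'
                  dsl << """
                  pipelineJob('generated/${name}') {
                      definition {
                          cpsScm {
                              scm { git { remote { url('https://example.com/repo.git') } } }
                              scriptPath('${f.path}')
                          }
                      }
                  }
                  """
              }
              jobDsl scriptText: dsl.toString()                 // Job DSL plugin step
          }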


          Henri Anttila added a comment -

          I have a use case for this too, as I want multiple projects to be built off of the same source code repo or repos. I'm currently doing this by putting all my Jenkinsfiles inside a dedicated Jenkins git repository instead. The same could be accomplished by putting multiple Jenkinsfiles inside a single source code repository, for example platform1.jenkinsfile, platform2.jenkinsfile, etc. I'd love to migrate to Blue Ocean, but it's just not in any way plausible, since it only looks for a file named "Jenkinsfile" in a given repository's branches.


          Jens Beyer added a comment - edited

          I am not sure if I understand your issue correctly, but I had a similar issue (multiple projects in a single repository), and my solution was quite easy.

          The multiple projects' sources were located in subdirectories like:

          /project-base
          /project-app
          /project2-app

          Naturally, I had first thought about one single /Jenkinsfile in a Multibranch Pipeline project.

          My solution now:

          /project-base
               |- Jenkinsfile
          /project-app
               |- Jenkinsfile
          /project2-app
               |- Jenkinsfile

           

          And of course, having three Multibranch Pipeline projects, each with the configuration mode "by Jenkinsfile" (and not "by default Jenkinsfile") and the Script Path set to the full path within the repository (like /project-base/Jenkinsfile for the first one).

           

          This way, I have a simple solution for multiple builds from one repository.
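
          The same setup could presumably also be scripted rather than configured by hand; a hedged Job DSL sketch of one such project, with a made-up job name and repository URL:

          // Hypothetical Job DSL sketch: a multibranch project whose branch-project
          // factory reads a non-default script path (the "by Jenkinsfile" mode above).
          multibranchPipelineJob('project-base') {
              branchSources {
                  git {
                      id('project-base')   // stable, unique source id
                      remote('https://example.com/monorepo.git')
                  }
              }
              factory {
                  workflowBranchProjectFactory {
                      scriptPath('project-base/Jenkinsfile')
                  }
              }
          }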

           

          Hope that helps anyone.

           

          Edit: Just found JENKINS-34561 - which is exactly what my suggestion uses, so this issue should be related to the resolved ones, and also be resolved?


          Jon-Paul Sullivan added a comment -

          beyerj - Yes, manually creating each job pointing to a specific Jenkinsfile is a workaround for the request in this issue, which is to auto-create a folder structure from the discovered files matching a particular pattern.


          Stephen Connolly added a comment -

          From what I can see, this is an ask for a new implementation of a branch factory.

          The current branch factory (in workflow-multibranch) creates a pipeline job for each branch.

          What this is asking for is, instead, to create a computed folder with a pipeline job for each jenkinsfile within the branch.

          I think the APIs should support that if somebody wants to take a stab at it. The only issue I see is that we may need to tweak the branch-api to allow the branch jobs to be a non-job type (i.e. a computed folder).

          This should be implemented in a brand new plugin, not as a PR to an existing plugin. (multipipeline-multibranch?)


          Jesse Glick added a comment -

          If that is indeed the request, I do not think this should be implemented directly, i.e., hard-coded in a plugin. I would rather recommend JENKINS-37220.


          Mark Wright added a comment -

          beyerj in your example, do you use git hooks to trigger builds? If so, how do you prevent changes in one of your sub-projects from triggering a build of another? E.g. how do you prevent a change under /project-app from building /project2-app?

          Or do you just build them all?


          Jens Beyer added a comment -

          kakapo4: Indeed we have the problem that git hooks trigger all builds, and this is some kind of issue, but not too big for us (it rarely leads to intermediate build errors, because the projects depend on each other). We thought about solutions for when the issue becomes bigger (like cancelling builds if their upstream builds are still in the queue, or lockable resources), but at the moment we believe only the separation of the subprojects into their own repositories would help in the long run. Except: if Git polling would honour sparse checkouts (maybe as an additional option, since I can imagine this being quite expensive), this would not be necessary at all.


          Lars Skjærlund added a comment -

          Just found this thread - and I support the purpose as well.

          However, maybe the thread should be renamed "Support multiple Jenkinsfiles from the same Git repository" - as I'm presently doing quite a few things with Subversion that work very well but just cannot be done with Git.

          One use case: I've set up a large system to do automatic updates on a lot of our Debian hosts. Some run daily, some run weekly. They're all scheduled to run between 5 and 6 o'clock in the morning, and as the update includes a reboot, they just cannot be run during normal working hours. From 6 o'clock our normal support staff is on guard to react to failed updates - quite often by rolling back the update, reverting to a VMware snapshot that is created as part of the automatic update. If they do not need to roll back, the snapshot is deleted again by another cron-like job running at 11 o'clock. This has worked quite well for several years.

          Each host has its own Jenkinsfile that - among other things - runs a special test after the upgrade, tailored specifically for that host. The Jenkinsfiles are all stored in a single Subversion repository (well: 25, named A-Z - we have a LOT of hosts), and the Jenkins job is set up as a Multibranch Pipeline. Needless to say, the job has been set up to suppress automatic SCM triggering. Instead, the jobs are triggered by the clock, and can be run manually on demand. The only gotcha so far is that the job must be run manually the first time on a new host - otherwise Jenkins won't pick up the

          properties([pipelineTriggers([cron('H 5 * * *')])])
          

          statement from the Jenkinsfile.

          When checked out from Subversion, a repository looks like this:

          hosts
             |-   host-01.noone.net
                  |-   Jenkinsfile
             |-   host-02.noone.net
                  |-   Jenkinsfile
             |-   host-03.noone.net
                  |-   Jenkinsfile
             |-   etc.
          

          Adding a new host becomes as simple as copy/pasting an existing setup and making the necessary modifications. Because of the nature of Subversion, this structure can be used in a Jenkins multibranch pipeline, which will create a pipeline for each host - named after the host - automatically create new jobs as hosts are added, and disable existing jobs as hosts are deleted from Subversion. Lots of Jenkinsfiles in a single repository - a solution that works very well for us.

          But, unfortunately, we're slowly migrating to Git, and I haven't found a solution with Git that can do the same trick with Jenkins.

          Another use case: We're hosting a lot of Drupal sites and we're building and deploying these as Docker containers. I've created a Jenkinsfile and a multibranch pipeline and the code is stored in Git. My colleagues use this setup the usual way, creating feature branches in Git, merging them into a develop branch, then merging into the master branch. We deploy all the Docker containers of all the branches for testing and approval by the product owner.

          But the deployment jobs? These are pipelines as well - but I do not want deployment to be part of the huge Jenkinsfile in the repo root. I have several reasons for that, but basically we don't want to wait hours for a long build job just to redeploy a few Docker containers. I would very much like to store the deployment jobs in the same Git repo - but with the current Jenkins pipeline functionality, I cannot - or at least I haven't figured out how to do it. So I have to maintain a separate repository for each deployment job - not an optimal situation.

          With Subversion I could easily do it, but as Git doesn't allow checkout of subbranches - or rather, the Jenkins SCM plugins cannot do it - I just cannot do it with Git.

          Just my thoughts.


          Peter Leibiger added a comment -

          larsskj The only way I found was using a different branch per host. It works for me for now, but it doesn't feel very nice.


          Lars Skjærlund added a comment -

          pleibiger: That's actually what I'm doing. I do tell Jenkins that I have a Subversion branch per host - that's why the multibranch pipeline works so well.

          The real difference lies in the nature of Subversion vs. Git: with Subversion, you can check out a complete repository with all branches side by side, and it's very easy to copy code from one branch to another. With Git, you can only have a single branch active at a time, so it becomes very clumsy to copy/paste from one branch to another - and you completely lose the overview at the same time.


          Alex Lourie added a comment -

          larsskj You kind of can check out two branches at once using git worktrees, but that's still not ideal.
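
          For reference, a minimal worktree invocation (the paths and branch name are made up):

          # check out a second branch side by side, without a second clone
          git worktree add ../host-02 host-02
          # list the active worktrees
          git worktree list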

          Supporting more than one Jenkinsfile would definitely be better.

          OTOH, if you don't mind creating jobs manually, you can still create a pipeline and use a custom Jenkinsfile, which you manually point at that hierarchy above. That could also work in your case.


          Lars Skjærlund added a comment -

          alourie: I do mind creating jobs manually: the jobs are not maintained by me, they're maintained on a daily basis by our operations support staff, and they need the simplicity of just having to maintain the list of jobs in Subversion.

          I could create a Subversion post-commit hook that maintained the list of jobs in Jenkins - but that would be very ugly and add a lot of complexity to the solution.

          It would be a lot better if Jenkins pipelines supported more than one Jenkinsfile in Git. They already do with Subversion.


          Peter Leibiger added a comment -

          alourie I just learned about git worktree 20 minutes ago; that makes it a lot easier for me.

          I am not creating jobs manually; I use the Bitbucket Branch Source Plugin, which automatically creates a job per branch that has a Jenkinsfile.

          With git worktree I can now have all my hosts open in IntelliJ at once and copy from one to another.
          Now Gitkraken just needs to support worktrees...

           


          Alex Lourie added a comment -

          pleibiger well, today is not totally lost I guess.

          larsskj yeah, multi-Jenkinsfile support would indeed be nice. Sorry you're stuck with SVN.


          Lars Skjærlund added a comment -

          alourie: I'm not sorry - I like SVN very much - and I'm not very fond of Git...


          Costin Caraivan added a comment -

          I have to say something, because I see developers commenting about Pipeline "purity" and "goals". As a long-time Jenkins user/admin/CI guy, I'd say that you need to know your clients.

           

          I imagine that the shiny Pipeline sells CloudBees contracts, and that's great - we want CloudBees to be successful and able to keep developing Jenkins. However, the truth from the trenches is a bit different: Jenkins is used for shiny pipelines, but it's used just as much as a distributed cron with an execution history.

           

          And because of missing features such as these, the Declarative Pipeline especially can't replace Freestyle jobs - and it should. Otherwise you're asking me (OK) and especially devs or ops guys, who won't ever become Jenkins experts, to learn:

           

          • Groovy (kind of a dead language outside of Jenkins right now)
          • the Scripted Pipeline (a custom Jenkins DSL, not used outside of Jenkins)
          • the Declarative Pipeline (same as the Scripted one, but a bit nicer)

           

          or to fall back to the clunky old manual configuration - or the Job DSL configuration, which doesn't play well with the GitHub/Bitbucket branch source plugins, for example.

           

          Always keep in mind your users and their use cases.

           

          The distributed cron with history is actually Jenkins' main selling point. It can do what less flexible tools such as TeamCity & co. can't. That's how I "sell" Jenkins to people, and that's why they "buy" it (not always with CloudBees support contracts, true, but as they grow that can change).

           

          Anyway, sorry for the rant, but I really think this should be re-prioritized.


          Antti Turpeinen added a comment -

          Just noticed that JENKINS-50328 is related to this issue. If that bug were fixed, could we then create multiple multibranch pipeline projects using the same git repository but with different Jenkinsfiles?

          nikhil kanotra added a comment -

          Hi,

          Any update on this bug?

          For mono-repositories in a micro-services setup, this is a very common scenario: there are multiple Jenkinsfiles, one per folder (microservice).

          The GitHub org webhook doesn't support this, and creating a branch for each and every microservice is not feasible.

          Please fix this with priority if possible.

          Mikhail Naletov added a comment -

          Cool feature.

          I really need the same.

          Aleksei Pushnov added a comment -

          This feature would make it much easier to set up configuration across many environments.

          Marcelo Carlos added a comment - edited

          Are there any plans to add/support this feature?

          Our scenario is fairly similar to the others. We have a monorepo where we store different projects. To better illustrate our scenario and why this is important to us, here is one simplified use-case:

          We have our repo (let's call it "monorepo"). In this repo we have several folders, such as:

          • toolOne (written in golang)
          • docker-images
            • base
            • imageTwo

          Ideally, we'd have three pipelines in the example above, one for `toolOne`, one for `docker-images/base` and one for `docker-images/imageTwo`.

          On top of that, an interesting additional use case is when we want to build `imageTwo`. That project contains a multi-stage Dockerfile which compiles part of the source code of `toolOne` and adds the compiled binary to the `imageTwo` image (that's one of the main reasons we chose the monorepo approach: this way we can easily and quickly handle cross-project/tool dependencies).

          At the moment, we've been setting all of that up using a single Jenkinsfile at the root of the repo and handling builds with lots of build parameters and conditionals, so we can run/skip stages according to the build we want to execute.

          We looked into the "Job DSL Plugin", but we'd rather avoid it, since we've been successfully using Jenkinsfiles in several other projects and it is great to use the same format for all projects and repos.

          However, the scenario above is gradually becoming a blocker, as the pipeline is growing too big due to all the conditionals and stages we have had to define to work around the existing limitation of one Jenkinsfile per repo.

          So, back to the original question, are there any plans you could share about adding/supporting this feature?


          Fajran Rusadi added a comment -

          What I ended up doing to support our monorepo approach was the following:

          • Create a separate (multibranch pipeline) Jenkins job for every individual component in the monorepo
          • All Jenkins jobs use the same repository URL, but each has its own path to its Jenkinsfile. This way each component can have its own distinct build pipeline
          • I disabled the build-triggering function in Jenkins and instead created an additional service to handle build triggers
          • The build trigger service also manages the Jenkins jobs. Whenever it detects a new component, it automatically creates the corresponding Jenkins job with the correct configuration


          When creating multiple jobs off the same GitHub repository, I found we required this plugin to get sane PR checking: https://plugins.jenkins.io/github-scm-trait-notification-context

          The default behaviour can play havoc with CI pipelines.


          Tzach Yarimi added a comment -

          fajran is your service creating jobs inside the multibranch folders? Can you please share some code for doing that?


          Fajran Rusadi added a comment -

          tzach: to clarify, we had a different multibranch pipeline job for each component. What the service does is create a new multibranch pipeline job when it detects a new component added to the repository. The new job is configured to use the Jenkinsfile specific to that new component.

          If the job already exists, it checks whether the branch is already registered; otherwise it updates the job config to add the new branch and starts the scanning process to make the branch job available. Once the branch job is available, the service triggers a build of it.

          Unfortunately I could not share the code.
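
          Since the code above could not be shared, here is a minimal sketch of the job-creating part of such a service, written as a Groovy script for the Jenkins script console. The repository URL, job name, and script path are assumptions, and the classes come from the git and workflow-multibranch plugins:

          import jenkins.branch.BranchSource
          import jenkins.model.Jenkins
          import jenkins.plugins.git.GitSCMSource
          import org.jenkinsci.plugins.workflow.multibranch.WorkflowBranchProjectFactory
          import org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject

          // Hypothetical component detected by the trigger service
          def jobName = 'componentA'
          def scriptPath = 'componentA/Jenkinsfile'

          def jenkins = Jenkins.get()
          // Reuse the job if it already exists, otherwise create it
          def job = jenkins.getItem(jobName) ?: jenkins.createProject(WorkflowMultiBranchProject, jobName)

          // Every job points at the same monorepo (URL is a placeholder)
          def source = new GitSCMSource('https://git.example.com/monorepo.git')
          job.sourcesList.clear()
          job.sourcesList.add(new BranchSource(source))

          // Use the component-specific Jenkinsfile instead of the repository root one
          def factory = new WorkflowBranchProjectFactory()
          factory.scriptPath = scriptPath
          job.setProjectFactory(factory)
          job.save()

          job.scheduleBuild()  // kick off branch indexing so the branch jobs appear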


          Francesco Pretto added a comment - edited

          I'm relatively new to Jenkins, but to me the reason this feature can't just be straightforwardly implemented is that Jenkins lacks the concept of a first-class SCM resource: SCM repositories can be configured in jobs, but they are not treated as shareable resources themselves. If repositories could be configured as shareable resources (with no build steps configurable on them), and jobs could be configured to source from those resources (instead of configuring a separate git/svn connection for each job), it would be easier to implement a single queue of jobs to trigger for SCM changes on a single repository, and easier to filter which pipeline to trigger in the case where Job1 is configured with Jenkinsfile1 and Job2 is configured with Jenkinsfile2.

          The single queue and the filter can probably be implemented with scripting, but to have UI support and a more declarative approach in the upstream pipelines (as opposed to imperatively selecting which job to trigger in the downstream SCM projects), something like what I am suggesting could be needed.


          victor paul added a comment - edited

          fajran: is it possible to share the code with your firm's references removed? This seems to have been an issue with Jenkins for years, and there is no stable solution.


          Francesco Pretto added a comment -

          I don't know if this precisely answers the original issue here, but I managed to have multiple Jenkinsfiles in the same repository/branch with the current feature set; I explained the procedure in this StackOverflow answer. The main idea is having separate Build and SCM projects configured in Jenkins, where the SCM project can be a simple Freestyle project that polls the repository and the Build project is a pipeline triggered by the SCM project. The trickier part was adding the skipDefaultCheckout directive to the Build pipeline, which is strange since I expected the "Lightweight checkout" flag in the pipeline project to actually disable the checkout itself. Is this intended?
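
          For anyone unfamiliar with the directive mentioned above, here is a minimal sketch of where skipDefaultCheckout sits in a declarative pipeline; the build step itself is an assumption:

          pipeline {
              agent any
              options {
                  // Suppress the implicit `checkout scm` that declarative pipelines
                  // otherwise run at the start of each stage
                  skipDefaultCheckout()
              }
              stages {
                  stage('Build') {
                      steps {
                          sh './build.sh'  // hypothetical build command
                      }
                  }
              }
          }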


          Jeffrey Bennett added a comment - edited

          I cannot speak to all the various use-cases that people have brought up, but it seems like a fair number of people looking for this would be satisfied if multibranch pipelines just supported an 'excluded regions' flag. You get that behavior with a regular pipeline, but not with a multibranch pipeline.

          In my situation (and it seems like others have this as well), I have directoryA/Jenkinsfile and directoryB/Jenkinsfile, and I wish to trigger pipelineA when commits are made to directoryA, and pipelineB for directoryB. There are a few solutions floating around (https://stackoverflow.com/questions/49448029/multiple-jenkinsfile-in-one-repository/60316968) that effectively have both pipelineA and pipelineB starting, doing a 'validate' step, and then using pipeline logic to short-circuit, completing early when the commit is on the other side (sketched below). But I'd really prefer that the build NEVER start in the first place for the wrong pipeline. Commit to directoryA -> only pipelineA is triggered. That amounts to pipelineA configuring "directoryB" as an excluded region (and vice versa).

          That said, there are a bunch of use-cases all entwined herein, and this is only one of them. But... it seems like a fairly well-defined one that might mitigate a lot of people's issues.
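
          The short-circuit workaround mentioned in the comment above can be sketched with the declarative changeset condition; the directory layout is taken from the example, while the build command is an assumption. Note that this is exactly the behavior being complained about: the build still starts on every commit, and only its stages are skipped.

          // directoryA/Jenkinsfile - rough sketch of the short-circuit approach
          pipeline {
              agent any
              stages {
                  stage('Build directoryA') {
                      // Run only when something under directoryA changed in this build's changesets
                      when { changeset 'directoryA/**' }
                      steps {
                          sh 'make -C directoryA'  // hypothetical build command
                      }
                  }
              }
          }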

          Tim Black added a comment - edited

          As noted above, there are actually LOTS of reasons why someone would want multiple Jenkins sub-projects nested inside a repo. For example, this issue creates problems for folks who try to avoid submodules and tend to favor monorepos. In one such case, we have a "configuration-management" repo that we use to contain all of our configuration management tools and projects, e.g. packer, ansible, vagrant, etc. Workflows are A LOT simpler with a monorepo for this, but when it comes to standing these processes up in Jenkins CI, we crash into this core problem.

          To support multiple Jenkinsfiles/multibranch projects in the same repo, we currently use jobdsl to explicitly and statically create a multibranch project for each known project location/subfolder. E.g. in this repo tree, we have 2 packer multibranch pipeline projects and 2 ansible multibranch pipeline projects:

          configuration-management
            - packer
              - subproject1
                - Jenkinsfile
                - src
              - subproject2
                - Jenkinsfile
                - src 
            - ansible
              - subproject1
                - Jenkinsfile
                - src
              - subproject2
                - Jenkinsfile
                - src 
          

          We use Folders to mirror the repo structure in Jenkins. The jobdsl to create the items in Jenkins looks like:

          folder("Packer") {
          // TODO: dynamically-create this list by searching repo on all branches
          //       for the superset of `packer/<packer_project_name>` subfolders that
          //       contain a Jenkinsfile: https://issues.jenkins.io/browse/JENKINS-43749
          packer_projects = ['subproject1', 'subproject2']
          for (packer_project in packer_projects)
          {
              multibranchPipelineJob("Packer/${packer_project}") {
                  .
                  .
                  factory {
                      workflowBranchProjectFactory {
                          scriptPath("packer/${packer_project}/Jenkinsfile")
                      }
                  }
              }
          }

          and

          folder("Ansible") {
          // TODO: dynamically-create this list by searching repo on all branches
          //       for the superset of `ansible/<ansible_project_name>` subfolders that
          //       contain a Jenkinsfile: https://issues.jenkins.io/browse/JENKINS-43749
          ansible_projects = ['subproject1', 'subproject2']
          for (ansible_project in ansible_projects)
          {
              multibranchPipelineJob("Ansible/${ansible_project}") {
                  .
                  .
                  factory {
                      workflowBranchProjectFactory {
                          scriptPath("ansible/${ansible_project}/Jenkinsfile")
                      }
                  }
               }
          }
          
          

          This approach works well, except that every time we want to add a new sub-project (an ansible or packer subfolder with a Jenkinsfile, in this case), we have to modify and push the jobdsl. It'd be great if workflow-multibranch could recursively scan for Jenkinsfiles and dynamically do what the jobdsl above is doing for me, as noted by my TODO comments.

          I was planning on making this dynamic by modifying my jobdsl to execute a script that performs the Jenkinsfile scanning for me. I assume this is possible, but it's not a complete solution, since the jobdsl is only updated during Jenkins infrastructure provisioning (infrequently), so it wouldn't detect most cases where someone adds, commits, and pushes a subproject with a Jenkinsfile. We really need this scanning to be done by workflow-multibranch.
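
          As a rough illustration of the dynamic variant described above, the following Job DSL sketch scans a local checkout of the monorepo for Jenkinsfiles and generates the jobs. It assumes REPO_ROOT points at a checkout available to the seed job, and the remote URL is a placeholder; as noted, it still only sees the branch that was checked out, which is why this scanning really belongs in workflow-multibranch:

          def repoRoot = new File(System.getenv('REPO_ROOT'))  // assumed local checkout

          ['packer', 'ansible'].each { tool ->
              def toolName = tool.capitalize()
              folder(toolName)

              // Every <tool>/<subproject> directory containing a Jenkinsfile becomes a job
              new File(repoRoot, tool).listFiles()
                  ?.findAll { it.isDirectory() && new File(it, 'Jenkinsfile').exists() }
                  ?.each { proj ->
                      multibranchPipelineJob("${toolName}/${proj.name}") {
                          branchSources {
                              git {
                                  id("${tool}-${proj.name}")  // stable source id
                                  remote('https://git.example.com/configuration-management.git')  // placeholder
                              }
                          }
                          factory {
                              workflowBranchProjectFactory {
                                  scriptPath("${tool}/${proj.name}/Jenkinsfile")
                              }
                          }
                      }
                  }
          }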


          Francesco Pretto added a comment -

          If it may interest someone, I have an approach (https://stackoverflow.com/a/60316968/213871) that actually supports multiple pipelines in the same repository. It's not exactly a trivial solution (meaning that first-class support in Jenkins would simply be better, and it would be good if this were considered for future releases), but it's very reliable and involves no compromises. I have used it only in single-branch scenarios so far, so I don't know if it can also be used with multibranch. I also talk about it in my previous post (link).


          Edgars Batna added a comment - edited

          So far I have had to work around this on every Jenkins installation I've worked on, so +1. The usual reasons are:

          • dependency build orchestration
          • complex trigger conditions
          • multiple build streams/environments (production/development)


          Philipp Hahn added a comment -

          My company also uses mono-repos and would like to have multiple `Jenkinsfile`s discovered automatically. While researching this myself, I found a blog post by Alexis Gauthiez where he describes their approach using Job DSL.


          Adam added a comment -

          When I ran into this same issue with my team, I also came across the blog by Alexis Gauthiez; however, it seemed a bit overkill for the size of our problem. I ended up going with a different approach: multiple pipelines run in parallel within one Jenkinsfile. I just wrote a blog about that approach too. Hopefully it might be useful to someone.
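
          For the curious, the "multiple pipelines in parallel within one Jenkinsfile" idea looks roughly like the declarative sketch below; the component names and build commands are assumptions, not taken from the blog:

          pipeline {
              agent any
              stages {
                  stage('All components') {
                      parallel {
                          // Each parallel branch acts as a mini-pipeline for one component
                          stage('componentA') {
                              steps { sh 'make -C componentA' }  // hypothetical
                          }
                          stage('componentB') {
                              steps { sh 'make -C componentB' }  // hypothetical
                          }
                      }
                  }
              }
          }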


          NhatKhai Nguyen added a comment -

          I think the best solution is enhancing the declarative pipeline to allow a master Jenkinsfile to check the job's conditions (branch, tag, parameter, upstream, …) and then execute any other declarative Jenkinsfile in the project, under any name, based on the condition and the code in the project.
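
          Something close to this master-Jenkinsfile idea can already be approximated with the existing load step from a scripted wrapper. In the sketch below, the file names and the branch condition are assumptions, and the loaded files are assumed to contain scripted pipeline code, since loading a full declarative pipeline this way has known limitations:

          // Master Jenkinsfile (scripted): pick a pipeline script based on a condition
          node {
              checkout scm
              def target = (env.BRANCH_NAME == 'main') ? 'release.jenkinsfile' : 'ci.jenkinsfile'
              load target  // evaluates the selected Groovy pipeline script from the checkout
          }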


            Assignee: Unassigned
            Reporter: James Dumay
            Votes: 105
            Watchers: 129