
Job DSL analogue of multibranch projects & organization folders

      To provide a more scalable alternative to the endless requested branch-api customizations such as JENKINS-32396, or the tricks based on JENKINS-30519, it would be useful to have a simple way of calling into scm-api implementations from a Job DSL folder (JENKINS-33275). So you could (for example) easily construct the equivalent of a GitHub organization folder, but from a script loop that could do arbitrary customizations to each, such as setting job properties.

      Need to also implement SCMSourceOwner, HeadByItem, etc. so that the branch source knows that a given generated project is associated with a given branch, which is needed for various purposes.

      TBD how webhooks should be integrated. The script needs to somehow indicate that changes to a given repository or organization should trigger a regeneration of the folder. May need to create an alternative to SCMTrigger that is keyed off the same criteria that branch-api branch indexing uses, though there are also use cases for traditional SCM polling—it is a little less efficient, but has more configuration options.

      For Pipeline projects you could use Jenkinsfile, if SCMBinder and SCMVar were generalized a bit, or something else.

      Needs a lot of design work; at this point this is more of a placeholder issue to collect ideas.

          [JENKINS-37220] Job DSL analogue of multibranch projects & organization folders

          Christian Höltje added a comment -

          Use case:

          Joe creates a GitHub Org Folder. His repos foo and bar are scanned and picked up because each contains a Jenkinsfile.

          foo is a huge repository (2 GB), and he wants only the last 10 commits when building (a shallow clone with depth 10) and no tags.

          bar uses lots of submodules but isn't big. He wants the submodule extensions turned on and the GitHub Org credentials used.

          This could be done by allowing extensions to be added at the GitHub Org Folder level, but that would force all builds to have a depth of 10, no tags, and submodules turned on. This could cause people to split their GitHub Orgs up just to accommodate Jenkins (which may not be possible if they are paying for the Orgs, etc.).

          An idea: we could change the scm global variable from an Object to the map form that looks like [$class: 'GitSCM', extensions: ...].
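          For illustration, that map form is what the Pipeline checkout step already accepts, so extensions such as a shallow clone can be spelled out directly. This is only a sketch; the repository URL and branch pattern below are placeholders:

          ```groovy
          // Pipeline script: checkout using the map ("describable") form of GitSCM,
          // adding a shallow clone of depth 10 with no tags via the git plugin's
          // CloneOption extension. URL and branch are hypothetical placeholders.
          checkout([
              $class: 'GitSCM',
              branches: [[name: '*/master']],
              userRemoteConfigs: [[url: 'https://github.com/example-org/foo.git']],
              extensions: [
                  [$class: 'CloneOption', shallow: true, depth: 10, noTags: true]
              ]
          ])
          ```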

          I have some code I run in a trusted pipeline via the src/ directory that almost does this...

           class ScmEditablizer {
               // Convert a read-only GitSCM object into the map form, so that
               // extensions etc. can be added to it. (env is currently unused.)
               static def makeScmEditable(scm, env) {
                   def branches = scm.branches.collect { branch ->
                       [name: branch.getName()]
                   }

                   def userRemoteConfigs = scm.userRemoteConfigs.collect { urc ->
                       [name:          urc.getName(),
                        url:           urc.getUrl(),
                        credentialsId: urc.getCredentialsId(),
                        refspec:       urc.getRefspec()]
                   }

                   return [$class: 'GitSCM', extensions: [], branches: branches, userRemoteConfigs: userRemoteConfigs]
               }
           }
          

          But this doesn't work because the extensions are missing and it doesn't understand Pull Requests (I don't know why exactly).

          This allows getting an "editable" scm object to which extensions, etc. can be added.
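          As a sketch of the intended usage (assuming the class above is loaded from the trusted src/ directory; the CloneOption values are illustrative):

          ```groovy
          // Jenkinsfile sketch: turn the read-only scm object into a map,
          // append an extension, and check it out.
          def editable = ScmEditablizer.makeScmEditable(scm, env)
          editable.extensions << [$class: 'CloneOption', shallow: true, depth: 10, noTags: true]
          checkout(editable)
          ```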

           


          Michael Neale added a comment -

          Is the aim here to "kill job DSL"? 

           

          "have a simple way of calling into scm-api implementations from a Job DSL folder" - interesting use of "simple". I would say "for very advanced users". 


          Jesse Glick added a comment -

          docwhat, your use case is solvable in various ways in organization folders as they stand, since you “only” need to adjust the behavior of the build. Now it happens that currently scm is not “editable”, but that could be addressed, I think. Really you can check out any GitSCM you like; you just lose the feature of picking up the exact commit (including a PR merge with the base branch) that the Jenkinsfile comes from.

          This issue was more about allowing users to programmatically customize things which are inherently aspects of the job, not any individual build. Or even decide when to create a job for a given repo/branch, or create multiple such jobs, etc.

          michaelneale, for years now plenty of Job DSL users have iterated over the GitHub API (via the github-api plugin, or roll-your-own), or something along those lines, to get a DIY version of multibranch. This proposal would (if it is feasible) bring some of the polish of branch-api and some of the features of branch source plugins to this usage style, while retaining its flexibility.
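          That DIY pattern can be sketched in a Job DSL seed script. This is only an illustration: the org name is a placeholder, and real use would need authentication and pagination of the GitHub API:

          ```groovy
          import groovy.json.JsonSlurper

          // Job DSL seed script sketch: iterate an organization's repositories via
          // the GitHub REST API and generate one multibranch project per repo.
          // 'example-org' is hypothetical; unauthenticated calls are rate-limited
          // and only return the first page of repositories.
          def repos = new JsonSlurper().parse(new URL('https://api.github.com/orgs/example-org/repos'))
          repos.each { repo ->
              multibranchPipelineJob(repo.name) {
                  branchSources {
                      git {
                          remote(repo.clone_url)
                      }
                  }
                  // Arbitrary per-repo customization can go here, e.g. job properties.
              }
          }
          ```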


          Costin Caraivan added a comment - edited

          I don't know if the aim here is to "kill job DSL", but it should be

           

          There's a brand new code-oriented approach to Jenkins, the Jenkins Pipeline. There's a brand spanking new UI for it, Blue Ocean. But we can't really use it. Why? Because some things are not quite pipelines, they're more like Jenkins being used as a fancy cron. We would like to store all those in version control. We would rather not make a new repo for each Jenkinsfile. We would rather not make a new branch for each Jenkinsfile.

          We would prefer to plonk the Jenkinsfiles somewhere in git and have new pipelines be generated for them.

           
          Ideally a team wanting to script Jenkins would use only 1.5 Groovy DSLs instead of the 3 Groovy DSLs (!) we have to use right now: declarative (+ scripted only when needed) vs. declarative + scripted + Job DSL...

           

          As a simple solution, maybe it's possible to make the multibranch pipeline accept multiple Jenkinsfiles, ideally with wildcard inclusion. That way every configured job just uses *.jenkinsfile and everything automagically works end-to-end. I have no idea about the full UX impact, though.


          Tim Black added a comment - edited

          ccaraivan, I know a lot of time has passed, but Job DSL and Pipeline serve entirely different purposes. Despite both being "code", their functional overlap is slight.

          jglick, daspilker, I believe my use case is very relevant to this issue.

          As noted here and in the linked issues, there are LOTS of reasons why someone would want multiple Jenkins sub-projects nested inside a repo. Many folks, if they know what's best for them, try to avoid submodules and favor monorepos. In one such case, we have a "configuration-management" repo that we use to contain all of our configuration management tools and projects, e.g. packer, ansible, vagrant, etc. Workflows are A LOT simpler with a monorepo for this, but when it comes to standing these processes up in Jenkins CI, we crash into this core problem.

          To support multiple Jenkinsfiles/multibranch projects in the same repo, we currently use Job DSL to explicitly and statically create a multibranch project for each known project location/subfolder. E.g. in this repo tree, we have 2 packer multibranch pipeline projects and 2 ansible multibranch pipeline projects:

          configuration-management
            - packer
              - subproject1
                - Jenkinsfile
                - src
              - subproject2
                - Jenkinsfile
                - src 
            - ansible
              - subproject1
                - Jenkinsfile
                - src
              - subproject2
                - Jenkinsfile
                - src 
          

          We use Folders to mirror the repo structure in Jenkins. The jobdsl to create the items in Jenkins looks like:

          folder("Packer") {
              // TODO: dynamically create this list by searching the repo on all branches
              //       for the superset of `packer/<packer_project_name>` subfolders that
              //       contain a Jenkinsfile: https://issues.jenkins.io/browse/JENKINS-43749
              def packer_projects = ['subproject1', 'subproject2']
              for (packer_project in packer_projects) {
                  multibranchPipelineJob("Packer/${packer_project}") {
                      .
                      .
                      factory {
                          workflowBranchProjectFactory {
                              scriptPath("packer/${packer_project}/Jenkinsfile")
                          }
                      }
                  }
              }
          }

          and

          folder("Ansible") {
              // TODO: dynamically create this list by searching the repo on all branches
              //       for the superset of `ansible/<ansible_project_name>` subfolders that
              //       contain a Jenkinsfile: https://issues.jenkins.io/browse/JENKINS-43749
              def ansible_projects = ['subproject1', 'subproject2']
              for (ansible_project in ansible_projects) {
                  multibranchPipelineJob("Ansible/${ansible_project}") {
                      .
                      .
                      factory {
                          workflowBranchProjectFactory {
                              scriptPath("ansible/${ansible_project}/Jenkinsfile")
                          }
                      }
                  }
              }
          }

          This approach works well, except that every time we want to add a new sub-project (an ansible or packer subfolder with a Jenkinsfile, in this case), we have to modify and push the Job DSL. It'd be great if workflow-multibranch could recursively scan for Jenkinsfiles and dynamically do what the Job DSL above is doing for me, as noted in my comments.

          I was planning to make this dynamic by modifying my Job DSL to execute a script that performs this Jenkinsfile scanning for me. I assume this is possible, but it's not a complete solution, since the Job DSL is only updated during Jenkins infrastructure provisioning (infrequently), so this wouldn't detect most cases where someone adds, commits, and pushes a subproject with a Jenkinsfile.
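          One way to sketch that scanning step, assuming the seed job runs with the repo already checked out into its workspace, and that script security permits java.io.File access (the WORKSPACE lookup and directory names are illustrative):

          ```groovy
          // Job DSL seed script sketch: discover packer/*/Jenkinsfile under the
          // seed job's workspace and generate one multibranch project per
          // subfolder. Only finds subprojects present on the checked-out branch.
          folder('Packer')

          def ws = new File(binding.variables['WORKSPACE'] ?: '.')
          new File(ws, 'packer').eachDir { dir ->
              if (new File(dir, 'Jenkinsfile').exists()) {
                  multibranchPipelineJob("Packer/${dir.name}") {
                      factory {
                          workflowBranchProjectFactory {
                              scriptPath("packer/${dir.name}/Jenkinsfile")
                          }
                      }
                  }
              }
          }
          ```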


            jamietanna Jamie Tanna
            jglick Jesse Glick
            Votes: 11
            Watchers: 25