
SCM (Git) Will Not Notify Pipeline Before Execution

      I work in Bitbucket, which uses a hook to notify our Jenkins server on commit (or when manually triggered from a pull request). I've noticed that when I make a new pipeline job from SCM (Git), onNotifyCommit won't look at the new job until after I've manually run the job and it has completed.

      A bit more detail on the job:

      • I did specify an empty poll interval, which allowed my other Maven jobs to be properly notified.
      • I haven't specified any additional behaviors.

      Since the SCM information is built directly into the pipeline job, shouldn't onNotifyCommit be able to find and notify the new job without it ever having been executed?

          [JENKINS-38669] SCM (Git) Will Not Notify Pipeline Before Execution

          Dayton Gomez created issue -

          Dayton Gomez added a comment -

          Not really knowing anything at all about the inner workings of Jenkins, I started digging around in the hope of finding a workaround. I came across this line:

          https://github.com/jenkinsci/git-plugin/blob/master/src/main/java/hudson/plugins/git/GitStatus.java#L300

          With some Script Console work, I found that

          scmTriggerItem.getSCMs()

          is empty until the pipeline job has run at least once. After it runs, getSCMs() returns the same object references as

          project.definition.scm

          ... at least in my scenario. The object doesn't seem to change after the first run, at least as far as my little script sees. The data is for sure there.
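          For reference, this is roughly the Script Console (Groovy) check I ran; "my-pipeline" is a placeholder job name, so adjust it to one of your own jobs:

          import jenkins.model.Jenkins
          import jenkins.triggers.SCMTriggerItem

          // Look the pipeline job up the same way GitStatus.onNotifyCommit sees it
          def job = Jenkins.instance.getItemByFullName('my-pipeline')
          def trigger = SCMTriggerItem.SCMTriggerItems.asSCMTriggerItem(job)
          println "getSCMs():      ${trigger.getSCMs()}"   // empty until the job has run once
          println "definition.scm: ${job.definition.scm}"  // the GitSCM configured on the job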


          Mark Waite added a comment -

          Could you be more specific in your description of how to duplicate the problem? In particular, is the job a multi-branch pipeline job, a freestyle pipeline, or a multi-configuration multi-branch pipeline?

          How is the job defined?

          Can you see the same failure from a freshly created Jenkins instance?


          Dayton Gomez added a comment - edited

          Does posting the config.xml help?

          <flow-definition plugin="workflow-job@2.7">
              <actions/>
              <description/>
              <keepDependencies>false</keepDependencies>
              <properties>
                  <hudson.plugins.buildblocker.BuildBlockerProperty plugin="build-blocker-plugin@1.7.3">
                      <useBuildBlocker>false</useBuildBlocker>
                      <blockLevel>GLOBAL</blockLevel>
                      <scanQueueFor>DISABLED</scanQueueFor>
                      <blockingJobs/>
                  </hudson.plugins.buildblocker.BuildBlockerProperty>
                  <jenkins.model.BuildDiscarderProperty>
                      <strategy class="hudson.tasks.LogRotator">
                          <daysToKeep>-1</daysToKeep>
                          <numToKeep>5</numToKeep>
                          <artifactDaysToKeep>-1</artifactDaysToKeep>
                          <artifactNumToKeep>-1</artifactNumToKeep>
                      </strategy>
                  </jenkins.model.BuildDiscarderProperty>
                  <com.sonyericsson.rebuild.RebuildSettings plugin="rebuild@1.25">
                      <autoRebuild>false</autoRebuild>
                      <rebuildDisabled>false</rebuildDisabled>
                  </com.sonyericsson.rebuild.RebuildSettings>
                  <com.synopsys.arc.jenkinsci.plugins.jobrestrictions.jobs.JobRestrictionProperty plugin="job-restrictions@0.4"/>
                  <hudson.plugins.throttleconcurrents.ThrottleJobProperty plugin="throttle-concurrents@1.9.0">
                      <maxConcurrentPerNode>0</maxConcurrentPerNode>
                      <maxConcurrentTotal>0</maxConcurrentTotal>
                      <categories class="java.util.concurrent.CopyOnWriteArrayList"/>
                      <throttleEnabled>false</throttleEnabled>
                      <throttleOption>project</throttleOption>
                      <limitOneJobWithMatchingParams>false</limitOneJobWithMatchingParams>
                      <paramsToUseForLimit/>
                  </hudson.plugins.throttleconcurrents.ThrottleJobProperty>
                  <org.jenkinsci.plugins.workflow.job.properties.PipelineTriggersJobProperty>
                      <triggers>
                          <hudson.triggers.SCMTrigger>
                              <spec/>
                              <ignorePostCommitHooks>false</ignorePostCommitHooks>
                          </hudson.triggers.SCMTrigger>
                      </triggers>
                  </org.jenkinsci.plugins.workflow.job.properties.PipelineTriggersJobProperty>
              </properties>
              <definition class="org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition" plugin="workflow-cps@2.18">
                  <scm class="hudson.plugins.git.GitSCM" plugin="git@3.0.0">
                      <configVersion>2</configVersion>
                      <userRemoteConfigs>
                          <hudson.plugins.git.UserRemoteConfig>
                              <url>ssh://git@our-bitbucket.com/common-libraries.git</url>
                              <credentialsId>xxxxx</credentialsId>
                          </hudson.plugins.git.UserRemoteConfig>
                      </userRemoteConfigs>
                      <branches>
                          <hudson.plugins.git.BranchSpec>
                              <name>*/develop</name>
                          </hudson.plugins.git.BranchSpec>
                      </branches>
                      <doGenerateSubmoduleConfigurations>false</doGenerateSubmoduleConfigurations>
                      <submoduleCfg class="list"/>
                      <extensions/>
                  </scm>
                  <scriptPath>Jenkinsfile</scriptPath>
              </definition>
              <triggers/>
          </flow-definition>
          

          In English: it's just a pipeline job whose script comes from SCM, using a single branch (develop).

          Duplicating it is pretty straightforward:

          • Create two new pipeline jobs from SCM. The Jenkinsfile, as far as my testing has shown, doesn't matter at all, but for testing purposes this is fine:
            node {
                stage("checkout") {
                    checkout scm
                }
            }
            
          • Run ONE of the jobs.
          • Configure the Bitbucket repo with the "Bitbucket Server Webhook to Jenkins" hook to notify on any change. I don't think the hook itself is the failure, since I can watch the Jenkins logs and see onNotifyCommit() logging about other skipped jobs.
          • Push a change to develop in the Bitbucket repo.

          At this point, the one pipeline job that's been executed will get triggered, but the other will not even write any log messages. If you delete the executed job and repeat the steps, the non-executed job will still not be notified.

          Not sure about a fresh Jenkins instance.
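          For anyone trying to reproduce this without Bitbucket, the same notification can be triggered by hand against the git plugin's notifyCommit endpoint. A minimal Groovy sketch; the Jenkins URL here is an assumption, and the repository URL must match the one configured in the jobs:

          // Stand-in for the Bitbucket hook: tell Jenkins the repository changed.
          def jenkinsUrl = 'http://localhost:8080'   // assumed local Jenkins instance
          def repo = URLEncoder.encode('ssh://git@our-bitbucket.com/common-libraries.git', 'UTF-8')
          def response = new URL("${jenkinsUrl}/git/notifyCommit?url=${repo}&branches=develop").text
          println response   // lists which jobs were scheduled and which were skipped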


          Angry Gami added a comment -

          Hi, I have the same problem as dgomez.
          It is ridiculous that this bug still exists. I've checked Jenkins version 2.205 and Git plugin version 4.0.0.
          Here is why:

          package org.jenkinsci.plugins.workflow.job;
          ...
          public final class WorkflowJob extends Job<WorkflowJob,WorkflowRun> implements LazyBuildMixIn.LazyLoadingJob<WorkflowJob,WorkflowRun>, ParameterizedJobMixIn.ParameterizedJob<WorkflowJob, WorkflowRun>, TopLevelItem, Queue.FlyweightTask, SCMTriggerItem, BlockableResume {
          ...
              @Override public Collection<? extends SCM> getSCMs() {
                  WorkflowRun b = getLastSuccessfulBuild();
                  if (b == null) {
                      b = getLastCompletedBuild();
                  }
                  if (b == null) {
                      return Collections.emptySet();
                  }
                  Map<String,SCM> scms = new LinkedHashMap<>();
                  for (WorkflowRun.SCMCheckout co : b.checkouts(null)) {
                      scms.put(co.scm.getKey(), co.scm);
                  }
                  return scms.values();
              }        
          

          Above is the getSCMs method from the WorkflowJob class, and as you can see it returns an empty collection when there is no successful or completed build for this workflow.

          And here is the code from the GitStatus class that is responsible for processing notifications:

          package hudson.plugins.git;
          ...
          public class GitStatus implements UnprotectedRootAction {
          ...
              public static class JenkinsAbstractProjectListener extends Listener {
          ...
                  @Override
                  public List<ResponseContributor> onNotifyCommit(String origin, URIish uri, String sha1, List<ParameterValue> buildParameters, String... branches) {
          ...
                          for (final Item project : jenkins.getAllItems()) {
                              SCMTriggerItem scmTriggerItem = SCMTriggerItem.SCMTriggerItems.asSCMTriggerItem(project);
                              if (scmTriggerItem == null) {
                                  continue;
                              }
                              SCMS: for (SCM scm : scmTriggerItem.getSCMs()) {
          // do stuff
          ...
          

          I.e. if getSCMs() returns an empty collection (and it will if no build has happened yet), nothing will be checked.
          Maybe a better way would be to first check whether the definition of the scmTriggerItem has an SCM attached, with something like this:

          ...
                          for (final Item project : jenkins.getAllItems()) {
                              SCMTriggerItem scmTriggerItem = SCMTriggerItem.SCMTriggerItems.asSCMTriggerItem(project);
                              if (scmTriggerItem == null) {
                                  continue;
                              }
                             Collection<? extends SCM> scms = scmTriggerItem.getSCMs();
                             if (scms.isEmpty()){
                                 if (scmTriggerItem.getDefinition() instanceof org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition) {
                                     CpsScmFlowDefinition fd =  (CpsScmFlowDefinition) scmTriggerItem.getDefinition();
                                     scms = Arrays.asList(fd.getScm());
                                 }                       
                             }
                              SCMS: for (SCM scm : scms) {
          // do stuff
          ...
          

          It is hard to tell whether `org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition` will be available, so maybe the only real way to do that is with some reflection (a rough sketch of that variant is below). Either way, the main idea is NOT to rely on a build having happened at least once in order to discover the SCMs.
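          For illustration only, this is roughly what the reflection variant could look like inside the loop in GitStatus.onNotifyCommit; it avoids any compile-time dependency on workflow-cps by looking the class and methods up by name (it would need java.lang.reflect.Method and java.util.Collections imported):

                          Collection<? extends SCM> scms = scmTriggerItem.getSCMs();
                          if (scms.isEmpty()) {
                              try {
                                  // WorkflowJob#getDefinition(), resolved reflectively so the git plugin
                                  // does not need workflow-job/workflow-cps on its compile classpath.
                                  Method getDefinition = project.getClass().getMethod("getDefinition");
                                  Object definition = getDefinition.invoke(project);
                                  if (definition != null && definition.getClass().getName()
                                          .equals("org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition")) {
                                      Object scm = definition.getClass().getMethod("getScm").invoke(definition);
                                      scms = Collections.singletonList((SCM) scm);
                                  }
                              } catch (ReflectiveOperationException e) {
                                  // Not a pipeline-from-SCM job; keep the (empty) collection from getSCMs().
                              }
                          }
                          SCMS: for (SCM scm : scms) {
          // do stuff
          ...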


          Mark Waite added a comment -

          Thanks for the thorough investigation angrygami! Will you be submitting a pull request to resolve the issue, including tests that show the problem and can be used to confirm that the fix resolves the problem?


          Angry Gami added a comment -

          I don't have a fix yet. The suggested solution above won't compile because the org.jenkinsci.plugins.workflow package is not a dependency of the git plugin project. I'll try to find a working one.

           


          Angry Gami added a comment - edited

          OK, sorry for the delay. I now have my fix working, and it involves changes to three plugins:

          workflow-api

          diff --git a/src/main/java/org/jenkinsci/plugins/workflow/flow/FlowDefinition.java b/src/main/java/org/jenkinsci/plugins/workflow/flow/FlowDefinition.java
          index 55c29c7..ef71822 100644
          --- a/src/main/java/org/jenkinsci/plugins/workflow/flow/FlowDefinition.java
          +++ b/src/main/java/org/jenkinsci/plugins/workflow/flow/FlowDefinition.java
          @@ -29,12 +29,15 @@ import hudson.Util;
           import hudson.model.AbstractDescribableImpl;
           import hudson.model.Action;
           import hudson.model.TaskListener;
          +import hudson.scm.SCM;
           import hudson.util.LogTaskListener;
           
           import javax.annotation.CheckForNull;
           import javax.annotation.Nonnull;
           import java.io.IOException;
           import java.util.List;
          +import java.util.Collection;
          +import java.util.Collections;
           import java.util.logging.Level;
           import java.util.logging.Logger;
           
          @@ -80,4 +83,7 @@ public abstract class FlowDefinition extends AbstractDescribableImpl<FlowDefinit
                   return (FlowDefinitionDescriptor) super.getDescriptor();
               }
           
          +    public Collection<? extends SCM> getSCMs() {
          +        return Collections.emptyList();
          +    }
           }
          

          Here I've added a new method to the FlowDefinition class that can be overridden by subclasses; by default it returns an empty collection. This is meant to let subclasses of FlowDefinition provide the list of SCMs they might be aware of.

          workflow-cps

          diff --git a/src/main/java/org/jenkinsci/plugins/workflow/cps/CpsScmFlowDefinition.java b/src/main/java/org/jenkinsci/plugins/workflow/cps/CpsScmFlowDefinition.java
          index 6e665a8..6fcd729 100644
          --- a/src/main/java/org/jenkinsci/plugins/workflow/cps/CpsScmFlowDefinition.java
          +++ b/src/main/java/org/jenkinsci/plugins/workflow/cps/CpsScmFlowDefinition.java
          @@ -46,6 +46,7 @@ import java.io.FileNotFoundException;
           import java.io.IOException;
           import java.io.InterruptedIOException;
           import java.util.Collection;
          +import java.util.Collections;
           import java.util.List;
           import jenkins.model.Jenkins;
           import jenkins.scm.api.SCMFileSystem;
          @@ -82,6 +83,11 @@ public class CpsScmFlowDefinition extends FlowDefinition {
                   return scm;
               }
           
          +    @Override
          +    public Collection<? extends SCM> getSCMs() {
          +       return Collections.singletonList(scm);
          +    }
          +
               public String getScriptPath() {
                   return scriptPath;
               }
          

          Here I override the getSCMs method from FlowDefinition for the case where we are dealing with a CpsScmFlowDefinition, i.e. a flow definition that is actually known to be based on SCM.
          For definitions based on an inline script we can't tell whether any SCM is used until the first build, and that behavior is preserved.

          workflow-job

          diff --git a/src/main/java/org/jenkinsci/plugins/workflow/job/WorkflowJob.java b/src/main/java/org/jenkinsci/plugins/workflow/job/WorkflowJob.java
          index 57ea992..3d0c84d 100644
          --- a/src/main/java/org/jenkinsci/plugins/workflow/job/WorkflowJob.java
          +++ b/src/main/java/org/jenkinsci/plugins/workflow/job/WorkflowJob.java
          @@ -526,17 +526,23 @@ public final class WorkflowJob extends Job<WorkflowJob,WorkflowRun> implements L
               }
           
               @Override public Collection<? extends SCM> getSCMs() {
          +        Collection<? extends SCM> definedSCMs = definition != null 
          +            ? definition.getSCMs() 
          +            : Collections.emptySet();
                   WorkflowRun b = getLastSuccessfulBuild();
                   if (b == null) {
                       b = getLastCompletedBuild();
                   }
                   if (b == null) {
          -            return Collections.emptySet();
          +            return definedSCMs;
                   }
                   Map<String,SCM> scms = new LinkedHashMap<>();
                   for (WorkflowRun.SCMCheckout co : b.checkouts(null)) {
                       scms.put(co.scm.getKey(), co.scm);
                   }
          +        for (SCM scm : definedSCMs) {
          +            scms.put(scm.getKey(), scm);
          +        }
                   return scms.values();
               }
          

          And finally, here I use the getSCMs method from FlowDefinition to return the known SCMs when no build has happened yet.

          I can try to make pull requests for all of the above, though I don't know how to do that for three projects simultaneously and consistently. Maybe markewaite can help me with that somehow?
          I also don't understand how to write a test that covers three plugins at the same time; a rough sketch of what I have in mind is below.
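          Something along these lines, perhaps — a minimal JenkinsRule sketch (class and job names are made up) that would live in a plugin whose test classpath already pulls in workflow-job, workflow-cps and git, asserting that the configured SCM is visible before any build has run:

          import hudson.plugins.git.GitSCM;
          import org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition;
          import org.jenkinsci.plugins.workflow.job.WorkflowJob;
          import org.junit.Rule;
          import org.junit.Test;
          import org.jvnet.hudson.test.JenkinsRule;

          import static org.junit.Assert.assertFalse;

          public class PipelineFromScmNotifyTest {

              @Rule public JenkinsRule j = new JenkinsRule();

              @Test public void scmIsVisibleBeforeFirstBuild() throws Exception {
                  // A pipeline job whose script comes from SCM, never built.
                  WorkflowJob p = j.jenkins.createProject(WorkflowJob.class, "pipeline-from-scm");
                  p.setDefinition(new CpsScmFlowDefinition(
                          new GitSCM("https://example.com/common-libraries.git"), "Jenkinsfile"));

                  // With the changes above this should no longer be empty,
                  // so GitStatus.onNotifyCommit has something to match against.
                  assertFalse(p.getSCMs().isEmpty());
              }
          }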

