Jenkins / JENKINS-57108

Pipeline-Maven-Plugin - UNSTABLE build with downstream list of parent job

    • Type: Bug
    • Resolution: Unresolved
    • Priority: Major
    • Component: pipeline-maven-plugin
    • Environment: Jenkins ver. 2.164.1
      pipeline-maven-plugin: 3.6.9

      Hi,

      Here is my context.
      I have a pipeline with these main steps:

      • mvn clean deploy
      • mvn test
      • Sonar quality gate

      I chose to set currentBuild.result = 'UNSTABLE' when the last step (the Sonar quality gate) is not OK.
      I want SNAPSHOT dependencies to trigger downstream project builds automatically.
      Everything works well when the quality gate is OK (meaning the job is SUCCESS):
      if my newly successful project A contains a SNAPSHOT dependency B, I can see the new project A listed in the "Downstream Jobs" of dependency B.
      So when I build dependency B, once it finishes, the job for project A is triggered.

      BUT

      if my project A job is UNSTABLE, it is not referenced as a downstream job of dependency B (so building B will not trigger A). I have to get at least one SUCCESS build (e.g. by bypassing the Sonar gate) for it to appear in the "Downstream Jobs" of the parent project.

      I'd like the parent's "Downstream Jobs" list to be updated for a project whose result is SUCCESS or UNSTABLE.

      Thanks
       

      def call(body) {
        // evaluate the body block, and collect configuration into the object
        def pipelineParams = [:]
        body.resolveStrategy = Closure.DELEGATE_FIRST
        body.delegate = pipelineParams
        body()
      
        // Get Artifactory server instance, defined in the Artifactory Plugin administration page.
        def artifactory = Artifactory.server "ARTIFACTORY"
      
        def scmUrl
      
        def trimOrigin = {
          it.startsWith('origin/') ? it.trim() - 'origin/' : it.trim()
        }
      
        node('maven') {
          try {
            stage('Clone sources') {
              // Keep only last 3 builds + disable concurrent builds
              properties([
                  buildDiscarder(
                      logRotator(
                          artifactDaysToKeepStr: '',
                          artifactNumToKeepStr: '',
                          daysToKeepStr: '',
                          numToKeepStr: '3')
                  ),
                  disableConcurrentBuilds()
              ])
      
              testFailure = false
              buildFailure = false
      
              // MULTIBRANCH: Branch is part of the context: so use BRANCH_NAME
              branchTobuild = env.BRANCH_NAME
              echo "branchTobuild=${branchTobuild}"
      
              // Scm url
              scmUrl = scm.getUserRemoteConfigs()[0].getUrl()
      
              // Clean
              step([$class: 'WsCleanup', cleanWhenFailure: false])
      
              // Get code from a Gitlab repository
              git branch: trimOrigin(branchTobuild), credentialsId: 'jenkins', url: scmUrl
      
              shortCommit = sh(returnStdout: true, script: "git log -n1 --pretty=format:'%H'").trim()
      
               // Get deployPath
              deployPath = pipelineParams.deployPath ?: ""
              echo "deployPath:${deployPath}"
      
              // Is this component deployable (if not, no need to display deploy buttons in Slack)
              deployable = pipelineParams.isDeployable ?: true
              echo "deployable:${deployable}"
            }
      
            stage('Maven build') {
              withMaven(maven: 'Maven 3.6.0', options: [junitPublisher(disabled: true)]) {
                try {
                  sh 'mvn -U -T 2 clean deploy -DskipTests -Dmaven.javadoc.skip=true'
                } catch (e) {
                  buildFailure = true
                  throw e
                }
              }
            }
      
            stage('Running tests') {
              try {
                sh 'mvn -T 2 --errors test -DfailIfNoTests=false -Dsurefire.useSystemClassLoader=false'
              } catch (e) {
                // if any exception occurs, mark the build as failed
                testFailure = true
                throw e
              } finally {
                junit(testResults: '**/surefire-reports/*xml', allowEmptyResults: true)
              }
            }
      
            stage('SonarQube analysis') {
              withSonarQubeEnv('Sonar') {
                sh "mvn org.sonarsource.scanner.maven:sonar-maven-plugin:3.6.0.1398:sonar \
                   -Dsonar.sources='.' \
                   -Dsonar.inclusions='pom.xml,src/main/web/**,src/main/java/**' \
                   -Dsonar.exclusions='src/main/web/node_modules/**' \
                   -Dsonar.upsource.url='https://upsource.ehtrace.com' \
                   -Dsonar.upsource.project=${pomArtifactId} \
                   -Dsonar.upsource.revision=${shortCommit} \
                   -Dsonar.upsource.token='***********'"
              }
            }
      
            stage("Notify slack Quality Gate") {
              timeout(time: 1, unit: 'HOURS') {
                // Just in case something goes wrong, pipeline will be killed after a timeout
                def qg = waitForQualityGate() // Reuse taskId previously collected by withSonarQubeEnv
                if (qg.status != 'OK') {
                  currentBuild.result = 'UNSTABLE'
                  echo "Pipeline aborted due to quality gate failure: ${qg.status}"
                  notifySlackStatus('SONAR_QUALITY_GATE_FAILURE')
                } else {
                  currentBuild.result = 'SUCCESS'
                  notifySlackStatus('SUCCESS')
                }
              }
            }
          } catch (Exception e) {
            // if any exception occurs, mark the build as failed
            echo e.message
      
            currentBuild.result = 'FAILURE'
            if (buildFailure == true) {
              notifySlackStatus('BUILD_FAILURE')
            } else if (testFailure == true) {
              notifySlackStatus('TEST_FAILURE')
            } else {
              notifySlackStatus('FAILURE')
            }
            throw e // rethrow so the build is considered failed
          }
        }
      }
      


          Cyrille Le Clerc added a comment -

          This is unexpected behaviour:

          org.jenkinsci.plugins.pipeline.maven.publishers.PipelineGraphPublisher records all the dependencies and generated artifacts of a pipeline. It is executed at the end of the "withMaven(){...}" wrapping step, and it should not be skipped even if the Maven build fails, as long as the step finishes gracefully.
          Please enable FINER logging for org.jenkinsci.plugins.pipeline.maven.publishers.PipelineGraphPublisher, rerun the downstream pipeline A, and look in the console of the new build of A for the message:

          [withMaven] pipelineGraphPublisher - Record dependency: <dependency.getId()>, ignoreUpstreamTriggers: <ignoreUpstreamTriggers>
          

          ref: https://github.com/jenkinsci/pipeline-maven-plugin/blob/pipeline-maven-3.6.11/jenkins-plugin/src/main/java/org/jenkinsci/plugins/pipeline/maven/publishers/PipelineGraphPublisher.java#L172

          You should also rerun the upstream pipeline and look in its build console log for the message:

          [withMaven] pipelineGraphPublisher - Record generated artifact:...
          

          Ref https://github.com/jenkinsci/pipeline-maven-plugin/blob/pipeline-maven-3.6.11/jenkins-plugin/src/main/java/org/jenkinsci/plugins/pipeline/maven/publishers/PipelineGraphPublisher.java#L210
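
          For anyone wondering how to enable that logging, here is a minimal sketch, assuming (as the linked source suggests) that the plugin gates these console messages on the java.util.logging level of the class above; the class name EnableFinerLogging below is purely illustrative. In practice the same thing is usually done from Manage Jenkins » System Log by adding a log recorder for org.jenkinsci.plugins.pipeline.maven.publishers.PipelineGraphPublisher at level FINER, or by running the two logging calls from the Jenkins script console.

          import java.util.logging.Level;
          import java.util.logging.Logger;

          // Illustrative only: raise the logger used by PipelineGraphPublisher to FINER so the
          // "[withMaven] pipelineGraphPublisher - Record ..." lines quoted above are emitted.
          public class EnableFinerLogging {
              public static void main(String[] args) {
                  Logger.getLogger("org.jenkinsci.plugins.pipeline.maven.publishers.PipelineGraphPublisher")
                        .setLevel(Level.FINER);
              }
          }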


          benjamin tocquec added a comment - edited

          OK, I understand your point: if those log messages are present during the build, it should work even if the status is not SUCCESS.

          However, I tried with FINER logging and saw, for the parent:

          [withMaven] pipelineGraphPublisher - Record generated artifact: com.xxx.common:eht-error-codes:jar:3.1.0-SNAPSHOT, version: 3.1.0-20190513.093918-6, executedLifecyclePhases: [clean, initialize, process-resources, compile, process-test-resources, test-compile, test, prepare-package, package, install, deploy], skipDownstreamTriggers: false, lifecycleThreshold:deploy, file: /home/jenkins/agent/workspace/Common_eht-error-codes_develop/target/eht-error-codes-3.1.0-SNAPSHOT.jar
          

          and for the child:

          [withMaven] pipelineGraphPublisher - Record dependency: com.xxx.common:eht-error-codes:jar:3.1.0-SNAPSHOT, ignoreUpstreamTriggers: false
          

          but the result is the same in the "Maven" view:

          • the child has the parent in "Upstream Builds" => OK
          • the parent has nothing in "Downstream Jobs" => KO

          The parent build is SUCCESS.
          The child build is UNSTABLE because of the Sonar gate failure.

          EDIT:
          I just tested with

          currentBuild.result = 'SUCCESS'

          instead of

          currentBuild.result = 'UNSTABLE'

          in my Sonar stage. My child is now present in the "Downstream Jobs" list of the parent!


          Cyrille Le Clerc added a comment -

          I think I understand the problem: listing upstream/downstream jobs is based on "job.last_successful_build_number". This may not be easy to solve.

          See https://github.com/jenkinsci/pipeline-maven-plugin/blob/pipeline-maven-3.6.12/jenkins-plugin/src/main/java/org/jenkinsci/plugins/pipeline/maven/dao/AbstractPipelineMavenPluginDao.java#L883

           protected Map<String, Integer> listUpstreamPipelinesBasedOnMavenDependencies(@Nonnull String downstreamJobFullName, int downstreamBuildNumber) {
                  LOGGER.log(Level.FINER, "listUpstreamPipelinesBasedOnMavenDependencies({0}, {1})", new Object[]{downstreamJobFullName, downstreamBuildNumber});
          
                  String sql = "select  upstream_job.full_name, upstream_build.number\n" +
                          "from JENKINS_JOB as upstream_job\n" +
                          "inner join JENKINS_BUILD as upstream_build on (upstream_job.id = upstream_build.job_id and upstream_job.last_successful_build_number = upstream_build.number)\n" +
                          "inner join GENERATED_MAVEN_ARTIFACT on (upstream_build.id = GENERATED_MAVEN_ARTIFACT.build_id  and GENERATED_MAVEN_ARTIFACT.skip_downstream_triggers = false)\n" +
                          "inner join MAVEN_ARTIFACT on GENERATED_MAVEN_ARTIFACT.artifact_id = MAVEN_ARTIFACT.id\n" +
                          "inner join MAVEN_DEPENDENCY on (MAVEN_DEPENDENCY.artifact_id = MAVEN_ARTIFACT.id and MAVEN_DEPENDENCY.ignore_upstream_triggers = false)\n" +
                          "inner join JENKINS_BUILD as downstream_build on MAVEN_DEPENDENCY.build_id = downstream_build.id\n" +
                          "inner join JENKINS_JOB as downstream_job on downstream_build.job_id = downstream_job.id\n" +
                          "where downstream_job.full_name = ? and downstream_job.jenkins_master_id = ? and  downstream_build.number = ? and upstream_job.jenkins_master_id = ?";
          
                  Map<String, Integer> upstreamJobsFullNames = new HashMap<>();
                  LOGGER.log(Level.FINER, "sql: {0}, jobFullName:{1}, buildNumber: {2}", new Object[]{sql, downstreamJobFullName, downstreamBuildNumber});
          
                  try (Connection cnn = ds.getConnection()) {
                      try (PreparedStatement stmt = cnn.prepareStatement(sql)) {
                          stmt.setString(1, downstreamJobFullName);
                          stmt.setLong(2, getJenkinsMasterPrimaryKey(cnn));
                          stmt.setInt(3, downstreamBuildNumber);
                          stmt.setLong(4, getJenkinsMasterPrimaryKey(cnn));
                          try (ResultSet rst = stmt.executeQuery()) {
                              while (rst.next()) {
                                  upstreamJobsFullNames.put(rst.getString(1), rst.getInt(2));
                              }
                          }
                      }
                  } catch (SQLException e) {
                      throw new RuntimeSqlException(e);
                  }
                  LOGGER.log(Level.FINE, "listUpstreamPipelinesBasedOnMavenDependencies({0}, {1}): {2}", new Object[]{downstreamJobFullName, downstreamBuildNumber, upstreamJobsFullNames});
          
                  return upstreamJobsFullNames;
              }
          


          benjamin tocquec added a comment -

          I don't know how I can help.

          Perhaps change
          https://github.com/jenkinsci/pipeline-maven-plugin/blob/f52b76dfe8a22a22feeb01f11c1cfe44682a5de3/jenkins-plugin/src/main/java/org/jenkinsci/plugins/pipeline/maven/dao/AbstractPipelineMavenPluginDao.java#L1148
          to accept both SUCCESS and UNSTABLE? (but that may cause side effects)

          Or add another column to the table, something like "last_not_failed_build_number"?
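
          For illustration, a rough sketch of what that second suggestion could look like against the query quoted in the previous comment. last_not_failed_build_number is a hypothetical column that updateBuildOnCompletion would have to refresh for SUCCESS and UNSTABLE builds alike; everything else is copied unchanged from the original query.

          // Sketch only, not actual plugin code: listUpstreamPipelinesBasedOnMavenDependencies
          // with the upstream join based on a hypothetical last_not_failed_build_number column.
          String sql = "select upstream_job.full_name, upstream_build.number\n" +
                  "from JENKINS_JOB as upstream_job\n" +
                  "inner join JENKINS_BUILD as upstream_build on (upstream_job.id = upstream_build.job_id and upstream_job.last_not_failed_build_number = upstream_build.number)\n" +
                  "inner join GENERATED_MAVEN_ARTIFACT on (upstream_build.id = GENERATED_MAVEN_ARTIFACT.build_id and GENERATED_MAVEN_ARTIFACT.skip_downstream_triggers = false)\n" +
                  "inner join MAVEN_ARTIFACT on GENERATED_MAVEN_ARTIFACT.artifact_id = MAVEN_ARTIFACT.id\n" +
                  "inner join MAVEN_DEPENDENCY on (MAVEN_DEPENDENCY.artifact_id = MAVEN_ARTIFACT.id and MAVEN_DEPENDENCY.ignore_upstream_triggers = false)\n" +
                  "inner join JENKINS_BUILD as downstream_build on MAVEN_DEPENDENCY.build_id = downstream_build.id\n" +
                  "inner join JENKINS_JOB as downstream_job on downstream_build.job_id = downstream_job.id\n" +
                  "where downstream_job.full_name = ? and downstream_job.jenkins_master_id = ? and downstream_build.number = ? and upstream_job.jenkins_master_id = ?";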


          benjamin tocquec added a comment -

          Any update on this ticket? Do you have any idea how to fix it?
          Thanks


          Stefan Sedelmaier added a comment - edited

          The problem is that even if you change the "Trigger downstream upon result" setting to anything other than stable, the downstream build is still not triggered on an UNSTABLE result.

          A workaround may be to also update LAST_SUCCESSFUL_BUILD_NUMBER on an UNSTABLE result in AbstractPipelineMavenPluginDao#updateBuildOnCompletion:

          if (Result.SUCCESS.ordinal == buildResultOrdinal
              || Result.UNSTABLE.ordinal == buildResultOrdinal) {

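          For illustration, a minimal sketch of the result check behind that workaround, using the buildResultOrdinal value that appears in the snippet above. The class and method below are hypothetical, not plugin source.

          import hudson.model.Result;

          // Sketch only: treat UNSTABLE like SUCCESS when deciding whether a completed build
          // should refresh JENKINS_JOB.LAST_SUCCESSFUL_BUILD_NUMBER (and therefore be eligible
          // to appear as an upstream build for downstream triggering).
          public class DownstreamTriggerResultCheck {

              public static boolean shouldUpdateLastSuccessfulBuildNumber(int buildResultOrdinal) {
                  return Result.SUCCESS.ordinal == buildResultOrdinal
                          || Result.UNSTABLE.ordinal == buildResultOrdinal;
              }
          }
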

          Stefan Sedelmaier added a comment -

          I implemented a fix and created pull request #228 for this issue.

          It uses the "Trigger downstream upon result" configuration setting to determine whether the build counts as successful.


          benjamin tocquec added a comment -

          Thanks a lot, sedstef!
          I hope this PR will be merged soon.


          Maxime Hochet added a comment -

          The worst case is a build whose result is different from SUCCESS (it could even be FAILURE) but which still publishes its artifacts. In that case new snapshot artifacts are available, yet the downstream feature does not work.

          I would take only the publication into account as the condition for triggering other jobs.

          Maybe an administration setting that conditions the downstream feature on the build result level would be the best solution.


          Benoit added a comment -

          > The worst case is a build whose result is different from SUCCESS (it could even be FAILURE) but which still publishes its artifacts. In that case new snapshot artifacts are available, yet the downstream feature does not work.

          Agreed. Fortunately, this is already not the case.

          This is the code that detects the artifacts of a job:
          https://github.com/jenkinsci/pipeline-maven-plugin/blob/master/jenkins-plugin/src/main/java/org/jenkinsci/plugins/pipeline/maven/util/XmlUtils.java#L199
          It only considers artifacts that have been deployed.

          See the difference between a package-phase XML spy report and a deploy-phase one:
          https://github.com/jenkinsci/pipeline-maven-plugin/blob/09eee091029f1cf75ee7da350cee7eb4b7dc7ed5/jenkins-plugin/src/test/resources/org/jenkinsci/plugins/pipeline/maven/maven-spy-package-jar.xml#L396
          https://github.com/jenkinsci/pipeline-maven-plugin/blob/09eee091029f1cf75ee7da350cee7eb4b7dc7ed5/jenkins-plugin/src/test/resources/org/jenkinsci/plugins/pipeline/maven/maven-spy-deploy-jar.xml#L921

          So it seems that whatever your build result is (success, unstable, or even failed), the plugin will not record anything, and thus not trigger anything, if the maven-deploy-plugin is not executed. Conversely, the plugin should record links between jobs as long as artifacts are deployed.

          And then, if the result is acceptable for the user, it should trigger the dependent builds. This, and only this, is where the problem lies.
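
          To make that two-step reasoning concrete, here is a conceptual sketch (hypothetical class, not plugin code): job-to-job links exist only when the maven-deploy-plugin actually ran, and downstream builds are triggered only when, in addition, the upstream result is acceptable (here assumed to mean SUCCESS or UNSTABLE).

          import hudson.model.Result;

          // Sketch of the policy described above; "deployExecuted" stands for
          // "the maven-deploy-plugin ran and artifacts were recorded".
          public class DownstreamTriggerPolicySketch {

              public static boolean shouldTriggerDownstream(boolean deployExecuted, Result upstreamResult) {
                  boolean linksRecorded = deployExecuted; // recording depends on deployment, not on the result
                  boolean resultAcceptable = upstreamResult.isBetterOrEqualTo(Result.UNSTABLE);
                  return linksRecorded && resultAcceptable;
              }
          }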


            Assignee: Unassigned
            Reporter: benjamin tocquec (bentocquec)
            Votes: 3
            Watchers: 6