Jenkins / JENKINS-41249

Job failed despite all stages being successful


    Details

    • Type: Bug
    • Status: Closed
    • Priority: Minor
    • Resolution: Duplicate
    • Component/s: pipeline
    • Labels:
      None
    • Environment:
      Jenkins 2.40
      Pipeline 2.4
      Pipeline: Multibranch 2.9.2
      Description

      My pipeline is failing even though every stage individually passes. Both the Pipeline Steps view in the classic UI and Blue Ocean show all stages as successful, yet the overall build is marked as failed. The log contains no 'ERROR' message at the end, and it clearly shows the stages running to completion.

      I simply get this at the end:

      ....
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] End of Pipeline
      
      GitHub has been notified of this commit’s build result
      
      Finished: FAILURE
      

      You'll notice that my Jenkinsfile has a mechanism for skipping the guts of a stage if it was recorded as passed in a previous build for the same branch/commit combination. When I re-run the failed job, this mechanism kicks in, the lengthy test commands are skipped, and the build is marked as successful.

      This leads me to believe that the 'sh' step may be changing the job status even for a 0 exit code. Is it possible that STDERR output alone might cause the failure condition? This is just a theory; note that none of the 'sh' calls actually result in an 'error' call, either explicitly in my Jenkinsfile or internally.
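      A minimal scripted-pipeline sketch (hypothetical, not part of the Jenkinsfile below) of how the 'sh' step reports status suggests this theory can be checked directly: the step fails a build only on a non-zero exit status, and with returnStatus it never fails the build at all.

```groovy
// Hypothetical sketch of 'sh' step semantics, for checking the theory above.
node {
    // Writes to stderr but exits 0: the step succeeds, and stderr output
    // alone does not touch the build result.
    sh "echo 'noise on stderr' >&2"

    // With returnStatus: true, even a non-zero exit does not fail the
    // build; the exit code is simply returned for inspection, which is
    // exactly how tt_fastlane and tt_sh_status below use it.
    def status = sh script: "exit 3", returnStatus: true
    echo "sh exit status: ${status}"
}
```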

      Jenkinsfile

      IPHONE_7_102 = "iPhone 7 (10.2)"
      IPHONE_6S_93 = "iPhone 6s (9.3)"
      IPAD_AIR_2_102 = "iPad Air 2 (10.2)"
      
      TT_STORAGE_HOST = "tt@10.88.195.2"
      TT_STORAGE_PR_BASE_PATH = null
      TT_TEST_ID = null
      TT_STORAGE_COMMIT_BASE_PATH = null
      
      node("mac") {
        stage("Lint") {
          tt_checkout_scm()
      
          if (tt_step_succeeded_in_previous_job("lint")) {
            println "Skipping lint - linting succeeded in ${TT_TEST_ID}"
          } else {
            tt_prepare_workspace()
      
            dir("TTKit") {
              if (!tt_fastlane("lint")) {
                error "Lint failed"
              }
      
              tt_step_record_success "lint"
            }
          }
        }
      }
      
      stage("Build") {
        def jobs = [:]
        def builds = [
          ["TTKit", ["TTKit Unit Tests", "TTKit Snapshot Tests"]],
          ["Bowtie", ["Bowtie Alpha", "Bowtie Functional Tests", "Bowtie UI Tests", "Bowtie Snapshot Tests"]],
          ["Wingtip", ["Wingtip Alpha", "Wingtip Functional Tests", "Wingtip UI Tests", "Wingtip Snapshot Tests"]]
        ]
      
        for (int i = 0; i < builds.size(); i++) {
          def name = builds[i][0]
          def schemes = builds[i][1]
          def joined_schemes = schemes.join(", ")
      
          jobs[name] = {
            node("mac") {
              if (tt_storage_has_packages_for_schemes(schemes)) {
                println "Skipping build - products already exist for ${joined_schemes} in ${TT_TEST_ID}"
              } else {
                tt_prepare()
      
                dir("TTKit") {
                  def cmd = "build_schemes_package schemes:'${joined_schemes}'"
      
                  if (tt_fastlane(cmd)) {
                    tt_storage_put "fastlane/schemes_package/*.tar"
                    archiveArtifacts artifacts: "fastlane/test_output/build/*.zip"
                  } else {
                    error "Failed: ${cmd}"
                  }
                }
              }
            }
          }
        }
      
        parallel(jobs)
      }
      
      stage("Test") {
        def jobs = [:]
        def tests = [
          ["TTKit - Unit - 10.2", "TTKit Unit Tests", IPHONE_7_102],
          ["TTKit - Unit - 9.3", "TTKit Unit Tests", IPHONE_6S_93],
          ["TTKit - Snapshot - iPhone 7 - 10.2", "TTKit Snapshot Tests", IPHONE_7_102],
      
          ["Bowtie - Unit - 10.2", "Bowtie Alpha", IPHONE_7_102],
          ["Bowtie - Unit - 9.3", "Bowtie Alpha", IPHONE_6S_93],
          ["Bowtie - Snapshot - iPhone 7 - 10.2", "Bowtie Snapshot Tests", IPHONE_7_102],
          ["Bowtie - UI - iPhone 7 - 10.2", "Bowtie UI Tests", IPHONE_7_102],
          ["Bowtie - UI - iPhone 6s - 9.3", "Bowtie UI Tests", IPHONE_6S_93],
      
          ["Wingtip - Unit - 10.2", "Wingtip Alpha", IPHONE_7_102],
          ["Wingtip - Unit - 9.3", "Wingtip Alpha", IPHONE_6S_93],
          ["Wingtip - Snapshot - iPhone 7 - 10.2", "Wingtip Snapshot Tests", IPHONE_7_102],
          ["Wingtip - UI - iPhone 7 - 10.2", "Wingtip UI Tests", IPHONE_7_102],
          ["Wingtip - UI - iPhone 6s - 9.3", "Wingtip UI Tests", IPHONE_6S_93],
          ["Wingtip - UI - iPad Air 2 - 10.2", "Wingtip UI Tests", IPAD_AIR_2_102],
        ]
      
        for (int i = 0; i < tests.size(); i++) {
          def name = tests[i][0]
          def scheme = tests[i][1]
          def device = tests[i][2]
          def package_name = tt_storage_normalize(scheme)
          def test_dir = name.split(" ")[0]
      
          jobs[name] = {
            node("mac") {
              def step_name = "${scheme}_${device}"
      
              if (tt_step_succeeded_in_previous_job(step_name)) {
                println("Skipping test - test succeeded for ${scheme} on ${device} in ${TT_TEST_ID}")
              } else {
                tt_prepare()
      
                dir(test_dir) {
                  def cmd = "test_scheme_package scheme:'${scheme}' device:'${device}'"
                  def report = "fastlane/test_output/test/report.junit"
      
                  tt_storage_get "fastlane/schemes_package/${package_name}.tar"
      
                  if (tt_fastlane(cmd)) {
                    tt_step_record_success step_name
                    junit report
                    archiveArtifacts artifacts: "fastlane/test_output/test/*.zip"
                  } else {
                    if (fileExists(report)) {
                      junit report
                    }
      
                    error "Failed: ${cmd}"
                  }
                }
              }
            }
          }
        }
      
        parallel(jobs)
      }
      
      stage("Cleanup") {
        node("mac") {
          tt_git_clean()
          // tt_storage_cleanup()
        }
      }
      
      ////////////////////////
      //   HELPER METHODS   //
      ////////////////////////
      
      def tt_cocoapods(cmd) {
        withEnv(["COCOAPODS_DISABLE_STATS=1", "COCOAPODS_PARALLEL_CODE_SIGN=1"]) {
          sh cmd
        }
      }
      
      def tt_fastlane(cmd) {
        def status = 1
        withEnv(["FASTLANE_SKIP_UPDATE_CHECK=1", "TT_SKIP_BUILD_PHASE_LINT=1", "JENKINS=1", "COCOAPODS_DISABLE_STATS=1", "COCOAPODS_PARALLEL_CODE_SIGN=1"]) {
          ansiColor("xterm") {
            def newCmd = """#!/bin/bash -le
              # set -x after the shell has started to avoid a lot of useless output, we just want to
              # echo the commands from this point forward.
              set -x
      
              bundle exec fastlane ${cmd}
            """
            status = sh script: newCmd, returnStatus: true
          }
        }
        status == 0
      }
      
      def tt_sh_status(cmd) {
        sh script: cmd, returnStatus: true
      }
      
      def tt_sh_stdout(cmd) {
        output = sh script: cmd, returnStdout: true
        output.trim()
      }
      
      def tt_prepare() {
        tt_checkout_scm()
        tt_prepare_workspace()
      }
      
      def tt_prepare_workspace() {
        tt_git_clean()
        sh "time bundle install"
      
        if (tt_storage_exists("pods.tar")) {
          tt_storage_get("pods.tar")
          sh "tar xf pods.tar; rm pods.tar"
        } else {
          tt_cocoapods "time bundle exec pod install || time bundle exec pod install --repo-update"
          sh "tar cf pods.tar Pods"
          tt_storage_put("pods.tar")
        }
      }
      
      def tt_checkout_scm() {
        checkout scm
        tt_storage_setup()
      }
      
      def tt_storage_touch(file) {
        def target_path = tt_storage_path(file)
        def parts = target_path.split("/")
        def base = TT_STORAGE_COMMIT_BASE_PATH
      
        if (parts.length > 1) {
          base = parts[0..parts.length-2].join("/")
        }
      
        sh "ssh ${TT_STORAGE_HOST} 'mkdir -p ${base}; touch ${target_path}'"
      }
      
      def tt_storage_put(file) {
        def target_path = tt_storage_path(file)
        def parts = target_path.split("/")
        def base = TT_STORAGE_COMMIT_BASE_PATH
      
        if (parts.length > 1) {
          base = parts[0..parts.length-2].join("/")
        }
      
        if (parts[-1].contains("*")) {
          target_path = parts[0..parts.length-2].join("/")
        }
      
        sh """
          ssh ${TT_STORAGE_HOST} 'mkdir -p ${base}'
          time scp ${file} ${TT_STORAGE_HOST}:${target_path}
        """
      }
      
      def tt_storage_get(file) {
        def remote_path = tt_storage_path(file)
        def parts = file.split("/")
      
        if (parts.length > 1) {
          local_dir = parts[0..parts.length-2].join("/")
          sh "mkdir -p ${local_dir}"
        }
      
        sh "time scp ${TT_STORAGE_HOST}:${remote_path} ${file}"
      }
      
      def tt_storage_cleanup() {
        sh "ssh ${TT_STORAGE_HOST} 'rm -r ${TT_STORAGE_PR_BASE_PATH} || true'"
      }
      
      def tt_storage_path(file) {
        "${TT_STORAGE_COMMIT_BASE_PATH}/${file}"
      }
      
      def tt_storage_setup() {
        if (TT_STORAGE_PR_BASE_PATH != null) {
          return
        }
      
        def commit = tt_sh_stdout "git rev-parse --short HEAD"
        TT_TEST_ID = "${env.JOB_NAME}/${commit}"
        TT_STORAGE_PR_BASE_PATH = "/tmp/storage/${env.JOB_NAME}"
        TT_STORAGE_COMMIT_BASE_PATH = "${TT_STORAGE_PR_BASE_PATH}/${commit}"
      }
      
      def tt_storage_all_exist(files) {
        def tests = []
      
        for (int i = 0; i < files.size(); i++) {
          def path = tt_storage_path(files[i])
          tests << "[[ -f ${path} ]]"
        }
      
        def bash_test = tests.join(" && ")
        def status = tt_sh_status "ssh ${TT_STORAGE_HOST} '${bash_test}'"
        return status == 0
      }
      
      def tt_storage_exists(file) {
        def path = tt_storage_path(file)
        def status = tt_sh_status "ssh ${TT_STORAGE_HOST} [[ -f ${path} ]]"
        return status == 0
      }
      
      def tt_storage_has_packages_for_schemes(schemes) {
        def files = []
      
        for (int i = 0; i < schemes.size(); i++) {
          def scheme = schemes[i]
          files << "fastlane/schemes_package/${tt_storage_normalize(scheme)}.tar"
        }
      
        tt_storage_all_exist(files)
      }
      
      def tt_storage_normalize(filename) {
        filename.replaceAll("[\\(\\)]", "").replaceAll("[\\. ]", "_")
      }
      
      def tt_step_succeeded_in_previous_job(step) {
        def file = tt_step_success_filename(step)
        tt_storage_exists(file)
      }
      
      def tt_step_record_success(step) {
        def file = tt_step_success_filename(step)
        tt_storage_touch(file)
      }
      
      def tt_step_success_filename(step) {
        tt_storage_normalize("successful_steps/${step}")
      }
      
      def tt_git_clean() {
        sh "git clean -fxd"
      }
      

            Activity

            ileitch Ian Leitch added a comment -

            I just tried putting:

            currentBuild.result = "SUCCESS"
            

            in the final "Cleanup" stage, but it made no difference.
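
            This matches how currentBuild.result appears to behave (an assumption about Pipeline internals, worth verifying): the result can only be combined toward a worse state, so once FAILURE has been recorded, assigning SUCCESS afterwards is a no-op.

```groovy
// Hypothetical sketch: Pipeline only lets the build result worsen.
node {
    currentBuild.result = "FAILURE"
    currentBuild.result = "SUCCESS" // silently ignored; result stays FAILURE
    echo "result: ${currentBuild.result}"
}
```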

            jeffweiss Jeff Weiss added a comment -

            Ian Leitch I recently had a similar problem which I discovered (read: wasted a full day) to be the result of an archiveArtifacts step that did not find any artifacts to archive.
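
            If an empty artifact glob is indeed the cause here, the archiveArtifacts calls could be hedged, e.g. with the step's allowEmptyArchive parameter, or by archiving only when something matched (a sketch; findFiles comes from the Pipeline Utility Steps plugin, and the glob shown is the one from the Build stage above):

```groovy
// Don't fail the build when the glob matches nothing.
archiveArtifacts artifacts: "fastlane/test_output/build/*.zip",
                 allowEmptyArchive: true

// Or: archive only if at least one file matched.
if (findFiles(glob: "fastlane/test_output/build/*.zip").length > 0) {
    archiveArtifacts artifacts: "fastlane/test_output/build/*.zip"
}
```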

            ileitch Ian Leitch added a comment -

            @jeffweiss I'll double check that, but I seem to remember that archiveArtifacts produces an error in the log if there aren't any artifacts - I'm not seeing that.


      People

      Assignee:
      Unassigned
      Reporter:
      ileitch Ian Leitch
      Votes:
      0
      Watchers:
      4