
[JENKINS-42616] Restarted Jobs consume Nodes when not necessary

    • Type: Bug
    • Resolution: Unresolved
    • Priority: Major
    • Component: workflow-cps-plugin
    • Labels: None
    • Environment: Chrome; Jenkins ver. 2.19.4; Pipeline 2.5

      May be related to JENKINS-40771.

      In the following situation, we have a single job that runs a function which requires a node.

      Upon restart, the job consumes one more node than expected.

      I expect only a single node to be used at any time. Before the restart this is the case; after the restart it is not.

      The additional node is consumed until the parallel call has finished.

      If this additional node happens to be the only available node, then pending jobs will be stuck waiting for an executor.

      Pictures attached.

      To reproduce:

      1. Run the following pipeline script.
      2. Once it is running, restart the server (example restart commands are shown after the script).
      3. Observe the build queue.

      This has been tested both with the Windows service and running from the command line with: "java -jar jenkins.war --httpPort=8082"

      // returns a closure that grabs a node, sleeps, and prints the given text
      def run_function(String words) {
        return {
          node() {
            sleep(300)
            println words
          }
        }
      }

      Map parallel_map
      node() {
        println "Using my first node"
        // populating the map inside this node block triggers the issue after a restart
        parallel_map = [
          'A': run_function("A")
        ]
      }
      parallel parallel_map
      
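      For step 2, the restart can be triggered in the usual ways; for example (the Windows service name and the CLI URL below are assumptions based on the setup described above):

      REM restart the Jenkins Windows service (assumes the default service name "Jenkins")
      net stop Jenkins
      net start Jenkins

      REM or, for the plain "java -jar jenkins.war" run, ask Jenkins to restart itself via its CLI
      java -jar jenkins-cli.jar -s http://localhost:8082/ restart

      Either way, the Pipeline build resumes automatically once Jenkins is back up, which is when the extra node is taken.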

        Attachments:
        1. 42616.PNG (15 kB)
        2. after restart.png (9 kB)
        3. before restart.png (6 kB)


          Matthew Hall added a comment -

          A workaround for this issue is not to populate the map passed to the parallel step inside a node block:

          def run_function(String words) {
            return {
              node() {
                sleep(300)
                println words
              }
            }
          }

          Map parallel_map
          node() {
            println "Using my first node"
          }
          // build the map outside the node block, then run it
          parallel_map = [
            'A': run_function("A")
          ]
          parallel parallel_map
          


            Assignee: Unassigned
            Reporter: Matthew Hall (matthall)
            Votes: 1
            Watchers: 2

              Created:
              Updated: