JENKINS-59328: job hangs although failed after ConcurrentModificationException of ExwsAllocateActionImpl


    Details


      Description

I have a pipeline job that runs some simple checks on all nodes every 5 minutes.
Sometimes (roughly twice a day) this job fails with a ConcurrentModificationException in ExwsAllocateActionImpl (on a Windows node, while accessing a file on the network). This has been happening for as long as I can remember, but I didn't bother because the next build always ran fine again.

But since some updates (in July or August) the build hangs in a strange state after this exception:
somehow the build keeps running even though it has already failed.
As a result the next build doesn't start, because it waits for the previous build to finish (the job is configured to not allow concurrent builds).
I can't abort the hung build either, because it has already failed.
So I have to delete the failed build in order to get the next build running.

The hang no longer occurs since I switched "Pipeline speed/durability override" from
"Performance-optimized" to "Maximum durability".
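
For reference, the durability level can also be pinned per job from the pipeline script instead of the global override. A minimal scripted-pipeline sketch (assuming the standard durabilityHint job property and disableConcurrentBuilds option; the chosen value is just my current workaround):

// Scripted pipeline: pin the durability level for this job only.
properties([
    durabilityHint('MAX_SURVIVABILITY'),   // was 'PERFORMANCE_OPTIMIZED' before the switch
    disableConcurrentBuilds()              // matches "do not allow concurrent builds"
])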

      Log:

      Started by timer
      Running in Durability level: PERFORMANCE_OPTIMIZED
      [Pipeline] Start of Pipeline
      [Pipeline] stage
      [Pipeline] { (master)
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] node
      Running on Jenkins in /rsync/JenkinsJobs/workspace/work_ServerCheck
      [Pipeline] {
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] isUnix
      [Pipeline] sh
      + df /rsync
      [Pipeline] sh
      + uptime
      [Pipeline] isUnix
      [Pipeline] isUnix
      [Pipeline] echo
      master: 37.99999952316284
      [Pipeline] echo
      we have 34710392 bytes left on master
      [Pipeline] fileOperations
      File Delete Operation:
      /rsync/JenkinsJobs/workspace/work_ServerCheck/build_152473_master.txt deleting....
      Success.
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] writeFile
      [Pipeline] exwsAllocate
      Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
      Using Disk allocation strategy: 'Select the Disk with the most usable space'
      Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
      The path on Disk is: unmerged/work_ServerCheck
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
/media/Builds/unmerged/work_ServerCheck/build_152473_master.txt deleting....
Success.
[Pipeline] sleep
Sleeping for 1 sec
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] fileOperations
      File Copy Operation:
      /rsync/JenkinsJobs/workspace/work_ServerCheck/build_152474_master.txt
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }
      [Pipeline] // exws
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] stage
      [Pipeline] { (domvsaf1)
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] node
      Running on domvsaf1 in /rsync/JenkinsJobs/workspace/work_ServerCheck
[Pipeline] {
[Pipeline] sleep
Sleeping for 1 sec
[Pipeline] isUnix
[Pipeline] sh
+ df /rsync
[Pipeline] sh
+ uptime
[Pipeline] isUnix
[Pipeline] isUnix
[Pipeline] echo
domvsaf1: 0.9999999776482582
[Pipeline] echo
we have 39575068 bytes left on domvsaf1
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
/rsync/JenkinsJobs/workspace/work_ServerCheck/build_152473_domvsaf1.txt deleting....
Success.
[Pipeline] writeFile
[Pipeline] exwsAllocate
Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
Using Disk allocation strategy: 'Select the Disk with the most usable space'
Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
The path on Disk is: unmerged/work_ServerCheck
[Pipeline] exws
Searching for disk definitions in the External Workspace Templates from Jenkins global config
Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
/media/Builds/unmerged/work_ServerCheck/build_152473_domvsaf1.txt deleting....
Success.
[Pipeline] }
      [Pipeline] // exws
      [Pipeline] fileOperations
      File Copy Operation:
      [Pipeline] sleep
      Sleeping for 1 sec
      /rsync/JenkinsJobs/workspace/work_ServerCheck/build_152474_domvsaf1.txt
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] stage
      [Pipeline] { (SRV1625u)
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] node
      Running on SRV1625u in /rsync/JenkinsJobs/workspace/work_ServerCheck
      [Pipeline] {
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] isUnix
      [Pipeline] sh
      + df /rsync
      [Pipeline] sh
      + uptime
      [Pipeline] isUnix
      [Pipeline] isUnix
      [Pipeline] echo
      SRV1625u: 0.9999999776482582
      [Pipeline] echo
      we have 19422016 bytes left on SRV1625u
      [Pipeline] fileOperations
      File Delete Operation:
      [Pipeline] sleep
      Sleeping for 1 sec
      /rsync/JenkinsJobs/workspace/work_ServerCheck/build_152473_SRV1625u.txt deleting....
      Success.
      [Pipeline] writeFile
      [Pipeline] exwsAllocate
      Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
      Using Disk allocation strategy: 'Select the Disk with the most usable space'
      Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
      The path on Disk is: unmerged/work_ServerCheck
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
/media/Builds/unmerged/work_ServerCheck/build_152473_SRV1625u.txt deleting....
Success.
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] fileOperations
      File Copy Operation:
      [Pipeline] sleep
      Sleeping for 1 sec
      /rsync/JenkinsJobs/workspace/work_ServerCheck/build_152474_SRV1625u.txt
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] stage
      [Pipeline] { (SRV4708)
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] node
      Running on SRV4708 in /rsync/JenkinsJobs/workspace/work_ServerCheck
[Pipeline] {
[Pipeline] sleep
Sleeping for 1 sec
[Pipeline] isUnix
[Pipeline] sh
+ df /rsync
[Pipeline] sh
+ uptime
[Pipeline] isUnix
[Pipeline] isUnix
[Pipeline] echo
SRV4708: 0.0
[Pipeline] echo
we have 363949504 bytes left on SRV4708
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
/rsync/JenkinsJobs/workspace/work_ServerCheck/build_152473_SRV4708.txt deleting....
Success.
[Pipeline] writeFile
[Pipeline] exwsAllocate
Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
Using Disk allocation strategy: 'Select the Disk with the most usable space'
Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
The path on Disk is: unmerged/work_ServerCheck
[Pipeline] exws
Searching for disk definitions in the External Workspace Templates from Jenkins global config
Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
/media/Builds/unmerged/work_ServerCheck/build_152473_SRV4708.txt deleting....
Success.
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] fileOperations
      File Copy Operation:
      [Pipeline] sleep
      Sleeping for 1 sec
      /rsync/JenkinsJobs/workspace/work_ServerCheck/build_152474_SRV4708.txt
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] stage
      [Pipeline] { (domvsafb1)
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] node
      Running on domvsafb1 in e:\jenkins\workspace\work_ServerCheck
      [Pipeline] {
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] isUnix
      [Pipeline] sh
      + df /cygdrive/e/jenkins/workspace/
      [Pipeline] sh
      + wmic cpu get loadpercentage
      [Pipeline] isUnix
      [Pipeline] isUnix
      [Pipeline] echo
      domvsafb1: 12.0
      [Pipeline] echo
      we have 12600340 bytes left on domvsafb1
      [Pipeline] fileOperations
      File Delete Operation:
      [Pipeline] sleep
      Sleeping for 1 sec
      e:\jenkins\workspace\work_ServerCheck\build_152473_domvsafb1.txt deleting....
      Success.
      [Pipeline] writeFile
      [Pipeline] exwsAllocate
      Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
      Using Disk allocation strategy: 'Select the Disk with the most usable space'
      Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
      The path on Disk is: unmerged/work_ServerCheck
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Searching for disk definitions in the Node config
      Running in \\servername_Builds\unmerged\work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
\\servername_Builds\unmerged\work_ServerCheck\build_152473_domvsafb1.txt deleting....
Success.
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] fileOperations
      File Copy Operation:
      [Pipeline] sleep
      Sleeping for 1 sec
      e:\jenkins\workspace\work_ServerCheck\build_152474_domvsafb1.txt
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Searching for disk definitions in the Node config
      Running in \\servername_Builds\unmerged\work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] stage
      [Pipeline] { (domvsafb2)
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] node
      Running on domvsafb2 in e:\jenkins\workspace\work_ServerCheck
[Pipeline] {
[Pipeline] sleep
Sleeping for 1 sec
[Pipeline] isUnix
[Pipeline] sh
+ df /cygdrive/e/jenkins/workspace/
[Pipeline] sh
+ wmic cpu get loadpercentage
[Pipeline] isUnix
[Pipeline] isUnix
[Pipeline] echo
domvsafb2: 20.0
[Pipeline] echo
we have 6471024 bytes left on domvsafb2
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
e:\jenkins\workspace\work_ServerCheck\build_152473_domvsafb2.txt deleting....
Success.
[Pipeline] writeFile
[Pipeline] exwsAllocate
Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
Using Disk allocation strategy: 'Select the Disk with the most usable space'
[Pipeline] End of Pipeline
java.util.ConcurrentModificationException
at java.util.LinkedList$ListItr.checkForComodification(LinkedList.java:966)
at java.util.LinkedList$ListItr.next(LinkedList.java:888)
at com.thoughtworks.xstream.converters.collections.CollectionConverter.marshal(CollectionConverter.java:73)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller$1.convertAnother(AbstractReferenceMarshaller.java:84)
at hudson.util.RobustReflectionConverter.marshallField(RobustReflectionConverter.java:263)
at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:250)
Caused: java.lang.RuntimeException: Failed to serialize org.jenkinsci.plugins.ewm.actions.ExwsAllocateActionImpl#allocatedWorkspaces for class org.jenkinsci.plugins.ewm.actions.ExwsAllocateActionImpl
at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:254)
at hudson.util.RobustReflectionConverter$2.visit(RobustReflectionConverter.java:222)
at com.thoughtworks.xstream.converters.reflection.PureJavaReflectionProvider.visitSerializableFields(PureJavaReflectionProvider.java:138)
at hudson.util.RobustReflectionConverter.doMarshal(RobustReflectionConverter.java:208)
at hudson.util.RobustReflectionConverter.marshal(RobustReflectionConverter.java:149)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:43)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller$1.convertAnother(AbstractReferenceMarshaller.java:88)
at com.thoughtworks.xstream.converters.collections.AbstractCollectionConverter.writeItem(AbstractCollectionConverter.java:64)
at com.thoughtworks.xstream.converters.collections.CollectionConverter.marshal(CollectionConverter.java:74)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller$1.convertAnother(AbstractReferenceMarshaller.java:84)
at hudson.util.RobustReflectionConverter.marshallField(RobustReflectionConverter.java:263)
at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:250)
Caused: java.lang.RuntimeException: Failed to serialize hudson.model.Actionable#actions for class org.jenkinsci.plugins.workflow.job.WorkflowRun
at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:254)
at hudson.util.RobustReflectionConverter$2.visit(RobustReflectionConverter.java:222)
at com.thoughtworks.xstream.converters.reflection.PureJavaReflectionProvider.visitSerializableFields(PureJavaReflectionProvider.java:138)
at hudson.util.RobustReflectionConverter.doMarshal(RobustReflectionConverter.java:208)
at hudson.util.RobustReflectionConverter.marshal(RobustReflectionConverter.java:149)
at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:43)
at com.thoughtworks.xstream.core.TreeMarshaller.start(TreeMarshaller.java:82)
at com.thoughtworks.xstream.core.AbstractTreeMarshallingStrategy.marshal(AbstractTreeMarshallingStrategy.java:37)
at com.thoughtworks.xstream.XStream.marshal(XStream.java:1026)
at com.thoughtworks.xstream.XStream.marshal(XStream.java:1015)
at com.thoughtworks.xstream.XStream.toXML(XStream.java:988)
at hudson.util.XStream2.toXMLUTF8(XStream2.java:313)
at org.jenkinsci.plugins.workflow.support.PipelineIOUtils.writeByXStream(PipelineIOUtils.java:34)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.save(WorkflowRun.java:1137)
at hudson.BulkChange.commit(BulkChange.java:98)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.notifyListeners(CpsFlowExecution.java:1475)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$3.run(CpsThreadGroup.java:458)
at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$1.run(CpsVmExecutorService.java:37)
at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:131)
at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Finished: FAILURE
[Pipeline] exws
Searching for disk definitions in the External Workspace Templates from Jenkins global config
Searching for disk definitions in the Node config
Running in \\servername_Builds\unmerged\work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
\\servername_Builds\unmerged\work_ServerCheck\build_152473_domvsafb2.txt deleting....
Success.
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] fileOperations
      File Copy Operation:
      [Pipeline] sleep
      Sleeping for 1 sec
      e:\jenkins\workspace\work_ServerCheck\build_152474_domvsafb2.txt
      [Pipeline] exws
      Searching for disk definitions in the External Workspace Templates from Jenkins global config
      Searching for disk definitions in the Node config
      Running in \\servername_Builds\unmerged\work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }

      [Pipeline] // exws
      [Pipeline] sleep
      Sleeping for 1 sec
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] }
      [Pipeline] // stage
      [Pipeline] node
      Running on SRV4708 in /rsync/JenkinsJobs/workspace/work_ServerCheck
[Pipeline] {
[Pipeline] fileOperations
File Create Operation:
[Pipeline] plot
[Pipeline] fileOperations
File Create Operation:
[Pipeline] plot
[Pipeline] archiveArtifacts
Archiving artifacts
[Pipeline] logParser
[Pipeline] }

      [Pipeline] // node
      data_load.csv file already exists, replacing the content with the provided content.
      Creating file: /rsync/JenkinsJobs/workspace/work_ServerCheck/data_load.csv
      data_disk.csv file already exists, replacing the content with the provided content.
      Creating file: /rsync/JenkinsJobs/workspace/work_ServerCheck/data_disk.csv

            Activity

            alexsomai Alexandru Somai added a comment -

I have a feeling that it's also related to the

java.lang.RuntimeException: Failed to serialize

But I have no idea why this is happening. I've looked into the code, and I can't find anything wrong where it calls #getAllocatedWorkspaces.

Roman Zwi, not sure if you are allowed to, but could you also post a snapshot of the pipeline? Maybe something in it is not compatible with the plugin.

Oleg Nenashev, maybe you have some insight on this: have you seen similar errors before, in other plugins? Any hints would be helpful. Thanks!
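
To make the suspected failure mode concrete: the stack trace shows XStream's CollectionConverter iterating the allocatedWorkspaces LinkedList with a fail-fast iterator while the run is being saved, so any append from another thread during that walk produces exactly this exception. A standalone Groovy illustration of that mechanism (names only mirror the field in the trace; this is NOT the plugin's code):

import java.util.concurrent.CountDownLatch

// Stand-alone illustration of a fail-fast LinkedList iterator tripping over a concurrent append.
def allocatedWorkspaces = new LinkedList<String>(['ws-1', 'ws-2', 'ws-3'])
def appended = new CountDownLatch(1)

def walk = allocatedWorkspaces.iterator()   // stand-in for CollectionConverter.marshal walking the list
Thread.start {
    allocatedWorkspaces << 'ws-4'           // stand-in for a concurrent step appending a workspace
    appended.countDown()
}
appended.await()                            // ensure the append happens while the iteration is in progress
println walk.next()                         // throws java.util.ConcurrentModificationException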

            romanz Roman Zwi added a comment -

Well, I will attach the pipeline code, but it was just a quick and dirty hack. Maybe it's of use anyway.

I think there are several problems here:

• the exception itself (obviously)
• I couldn't handle this exception in the pipeline (I tried to catch it, without success; see the sketch below)
• the job hangs afterwards although it has already failed, so it is stuck in a strange state (this is the one that bothers me)

So it seems that one part of the pipeline fails while another part keeps on running and then isn't able to finish.
This is especially strange to me because I don't start anything in parallel (at least I thought so...).
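
For illustration, a guard of roughly this shape (a simplified sketch, not the actual Jenkinsfile) has no effect here, because the ConcurrentModificationException is thrown on the CPS VM thread while Jenkins persists the run (WorkflowRun.save() via CpsFlowExecution.notifyListeners() in the stack trace), i.e. outside the user-level pipeline code, so no try/catch inside the script ever sees it:

// Hypothetical stage body; the node label and step names come from the log, everything else is guessed.
stage('domvsafb2') {
    try {
        node('domvsafb2') {
            // ... sleep / isUnix / sh / writeFile / exwsAllocate / exws { ... } checks ...
        }
    } catch (err) {
        echo "check on domvsafb2 failed: ${err}"
        currentBuild.result = 'UNSTABLE'    // never reached for the serialization failure above
    }
}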

romanz Roman Zwi added a comment - edited

            possibly similar problems:
            JENKINS-31536
            JENKINS-51568
            JENKINS-59083

            romanz Roman Zwi added a comment -

It doesn't hang anymore since the last plugin updates; I guess this is because of the fix for JENKINS-59083.

I still get the exception from time to time, but that is not much of a problem for this job.
It usually comes along with a cron warning like this:
            Trigger hudson.triggers.TimerTrigger.run() triggered by org.jenkinsci.plugins.workflow.job.WorkflowJob@260587ee[work_ServerCheck] spent too much time (40 sec) in its execution, other timers can be affected

             

            Log:

            Started by timer
            Running in Durability level: PERFORMANCE_OPTIMIZED
            [Pipeline] Start of Pipeline
            [Pipeline] stage
            [Pipeline] { (master)
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] node
            Running on Jenkins in /rsync/JenkinsJobs/workspace/work_ServerCheck
            [Pipeline] {
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] isUnix
            [Pipeline] sh
            + df /rsync
            [Pipeline] sh
            + uptime
            [Pipeline] isUnix
            [Pipeline] isUnix
            [Pipeline] echo
            master: 92.00000166893005
            [Pipeline] echo
            we have 24138944 bytes left on master
            [Pipeline] fileOperations
            File Delete Operation:
            /rsync/JenkinsJobs/workspace/work_ServerCheck/build_158550_master.txt deleting....
            Success.
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] writeFile
            [Pipeline] exwsAllocate
            Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
            Using Disk allocation strategy: 'Select the Disk with the most usable space'
            Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
            The path on Disk is: unmerged/work_ServerCheck
            [Pipeline] exws
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
/media/Builds/unmerged/work_ServerCheck/build_158550_master.txt deleting....
Success.
[Pipeline] sleep
Sleeping for 1 sec
[Pipeline] }

            [Pipeline] // exws
            [Pipeline] fileOperations
            File Copy Operation:
            /rsync/JenkinsJobs/workspace/work_ServerCheck/build_158551_master.txt
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] exws
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }
            [Pipeline] // exws
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] }
            [Pipeline] // node
            [Pipeline] }
            [Pipeline] // stage
            [Pipeline] stage
            [Pipeline] { (domvsaf1)
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] node
            Running on domvsaf1 in /rsync/JenkinsJobs/workspace/work_ServerCheck
            [Pipeline] {
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] isUnix
            [Pipeline] sh
            + df /rsync
            [Pipeline] sh
            + uptime
            [Pipeline] isUnix
            [Pipeline] isUnix
            [Pipeline] echo
            domvsaf1: 28.00000011920929
            [Pipeline] echo
            we have 38286796 bytes left on domvsaf1
            [Pipeline] fileOperations
            File Delete Operation:
            [Pipeline] sleep
            Sleeping for 1 sec
            /rsync/JenkinsJobs/workspace/work_ServerCheck/build_158550_domvsaf1.txt deleting....
            Success.
            [Pipeline] writeFile
            [Pipeline] exwsAllocate
            Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
            Using Disk allocation strategy: 'Select the Disk with the most usable space'
            Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
            The path on Disk is: unmerged/work_ServerCheck
            [Pipeline] exws
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
/media/Builds/unmerged/work_ServerCheck/build_158550_domvsaf1.txt deleting....
Success.
[Pipeline] }
            [Pipeline] // exws
            [Pipeline] fileOperations
            File Copy Operation:
            [Pipeline] sleep
            Sleeping for 1 sec
            /rsync/JenkinsJobs/workspace/work_ServerCheck/build_158551_domvsaf1.txt
            [Pipeline] exws
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }

            [Pipeline] // exws
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] }
            [Pipeline] // node
            [Pipeline] }
            [Pipeline] // stage
            [Pipeline] stage
            [Pipeline] { (SRV1625u)
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] node
            Running on SRV1625u in /rsync/JenkinsJobs/workspace/work_ServerCheck
            [Pipeline] {
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] isUnix
            [Pipeline] sh
            + df /rsync
            [Pipeline] sh
            + uptime
            [Pipeline] isUnix
            [Pipeline] isUnix
            [Pipeline] echo
            SRV1625u: 117.99999475479126
            [Pipeline] echo
            we have 19393780 bytes left on SRV1625u
            [Pipeline] fileOperations
            File Delete Operation:
            [Pipeline] sleep
            Sleeping for 1 sec
            /rsync/JenkinsJobs/workspace/work_ServerCheck/build_158550_SRV1625u.txt deleting....
            Success.
            [Pipeline] writeFile
            [Pipeline] exwsAllocate
            Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
            Using Disk allocation strategy: 'Select the Disk with the most usable space'
            Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
            The path on Disk is: unmerged/work_ServerCheck
            [Pipeline] exws
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
/media/Builds/unmerged/work_ServerCheck/build_158550_SRV1625u.txt deleting....
Success.
[Pipeline] }

            [Pipeline] // exws
            [Pipeline] fileOperations
            File Copy Operation:
            [Pipeline] sleep
            Sleeping for 1 sec
            /rsync/JenkinsJobs/workspace/work_ServerCheck/build_158551_SRV1625u.txt
            [Pipeline] exws
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }
            [Pipeline] // exws
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] }
            [Pipeline] // node
            [Pipeline] }
            [Pipeline] // stage
            [Pipeline] stage
            [Pipeline] { (SRV4708)
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] node
            Running on SRV4708 in /rsync/JenkinsJobs/workspace/work_ServerCheck
            [Pipeline] {
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] isUnix
            [Pipeline] sh
            + df /rsync
            [Pipeline] sh
            + uptime
            [Pipeline] isUnix
            [Pipeline] isUnix
            [Pipeline] echo
            SRV4708: 111.00000143051147
            [Pipeline] echo
            we have 364202208 bytes left on SRV4708
            [Pipeline] fileOperations
            File Delete Operation:
            [Pipeline] sleep
            Sleeping for 1 sec
            /rsync/JenkinsJobs/workspace/work_ServerCheck/build_158550_SRV4708.txt deleting....
            Success.
            [Pipeline] writeFile
            [Pipeline] exwsAllocate
            Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
            Using Disk allocation strategy: 'Select the Disk with the most usable space'
            Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
            The path on Disk is: unmerged/work_ServerCheck
            [Pipeline] exws
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] pwd
[Pipeline] fileOperations
File Delete Operation:
[Pipeline] sleep
Sleeping for 1 sec
/media/Builds/unmerged/work_ServerCheck/build_158550_SRV4708.txt deleting....
Success.
[Pipeline] }
            [Pipeline] // exws
            [Pipeline] fileOperations
            File Copy Operation:
            [Pipeline] sleep
            Sleeping for 1 sec
            /rsync/JenkinsJobs/workspace/work_ServerCheck/build_158551_SRV4708.txt
            [Pipeline] exws
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Running in /media/Builds/unmerged/work_ServerCheck
[Pipeline] {
[Pipeline] fileExists
[Pipeline] }

            [Pipeline] // exws
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] }
            [Pipeline] // node
            [Pipeline] }
            [Pipeline] // stage
            [Pipeline] stage
            [Pipeline] { (domvsafb1)
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] node
            Running on domvsafb1 in e:\jenkins\workspace\work_ServerCheck
            [Pipeline] {
            [Pipeline] sleep
            Sleeping for 1 sec
            [Pipeline] isUnix
            [Pipeline] sh
            + df /cygdrive/e/jenkins/workspace/
            [Pipeline] sh
            + wmic cpu get loadpercentage
            [Pipeline] isUnix
            [Pipeline] isUnix
            [Pipeline] echo
            domvsafb1: 12.0
            [Pipeline] echo
            we have 12588140 bytes left on domvsafb1
            [Pipeline] fileOperations
            File Delete Operation:
            [Pipeline] sleep
            Sleeping for 1 sec
            e:\jenkins\workspace\work_ServerCheck\build_158550_domvsafb1.txt deleting....
            Success.
            [Pipeline] writeFile
            [Pipeline] exwsAllocate
            Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
            Using Disk allocation strategy: 'Select the Disk with the most usable space'
            [Pipeline] End of Pipeline
            java.util.ConcurrentModificationException
            at java.util.LinkedList$ListItr.checkForComodification(LinkedList.java:966)
            at java.util.LinkedList$ListItr.next(LinkedList.java:888)
            at com.thoughtworks.xstream.converters.collections.CollectionConverter.marshal(CollectionConverter.java:73)
            at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
            at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
            at com.thoughtworks.xstream.core.AbstractReferenceMarshaller$1.convertAnother(AbstractReferenceMarshaller.java:84)
            at hudson.util.RobustReflectionConverter.marshallField(RobustReflectionConverter.java:263)
            at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:250)
            Caused: java.lang.RuntimeException: Failed to serialize org.jenkinsci.plugins.ewm.actions.ExwsAllocateActionImpl#allocatedWorkspaces for class org.jenkinsci.plugins.ewm.actions.ExwsAllocateActionImpl
            at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:254)
            at hudson.util.RobustReflectionConverter$2.visit(RobustReflectionConverter.java:222)
            at com.thoughtworks.xstream.converters.reflection.PureJavaReflectionProvider.visitSerializableFields(PureJavaReflectionProvider.java:138)
            at hudson.util.RobustReflectionConverter.doMarshal(RobustReflectionConverter.java:208)
            at hudson.util.RobustReflectionConverter.marshal(RobustReflectionConverter.java:149)
            at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
            at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
            at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:43)
            at com.thoughtworks.xstream.core.AbstractReferenceMarshaller$1.convertAnother(AbstractReferenceMarshaller.java:88)
            at com.thoughtworks.xstream.converters.collections.AbstractCollectionConverter.writeItem(AbstractCollectionConverter.java:64)
            at com.thoughtworks.xstream.converters.collections.CollectionConverter.marshal(CollectionConverter.java:74)
            at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
            at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
            at com.thoughtworks.xstream.core.AbstractReferenceMarshaller$1.convertAnother(AbstractReferenceMarshaller.java:84)
            at hudson.util.RobustReflectionConverter.marshallField(RobustReflectionConverter.java:263)
            at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:250)
            Caused: java.lang.RuntimeException: Failed to serialize hudson.model.Actionable#actions for class org.jenkinsci.plugins.workflow.job.WorkflowRun
            at hudson.util.RobustReflectionConverter$2.writeField(RobustReflectionConverter.java:254)
            at hudson.util.RobustReflectionConverter$2.visit(RobustReflectionConverter.java:222)
            at com.thoughtworks.xstream.converters.reflection.PureJavaReflectionProvider.visitSerializableFields(PureJavaReflectionProvider.java:138)
            at hudson.util.RobustReflectionConverter.doMarshal(RobustReflectionConverter.java:208)
            at hudson.util.RobustReflectionConverter.marshal(RobustReflectionConverter.java:149)
            at com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:69)
            at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:58)
            at com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:43)
            at com.thoughtworks.xstream.core.TreeMarshaller.start(TreeMarshaller.java:82)
            at com.thoughtworks.xstream.core.AbstractTreeMarshallingStrategy.marshal(AbstractTreeMarshallingStrategy.java:37)
            at com.thoughtworks.xstream.XStream.marshal(XStream.java:1026)
            at com.thoughtworks.xstream.XStream.marshal(XStream.java:1015)
            at com.thoughtworks.xstream.XStream.toXML(XStream.java:988)
            at hudson.util.XStream2.toXMLUTF8(XStream2.java:313)
            at org.jenkinsci.plugins.workflow.support.PipelineIOUtils.writeByXStream(PipelineIOUtils.java:34)
            at org.jenkinsci.plugins.workflow.job.WorkflowRun.save(WorkflowRun.java:1143)
            at hudson.BulkChange.commit(BulkChange.java:98)
            at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.notifyListeners(CpsFlowExecution.java:1475)
            at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$3.run(CpsThreadGroup.java:458)
            at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$1.run(CpsVmExecutorService.java:37)
            at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:131)
            at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
            at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
            at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
            at java.util.concurrent.FutureTask.run(FutureTask.java:266)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
            at java.lang.Thread.run(Thread.java:745)
            Finished: FAILURE
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Searching for disk definitions in the Node config
            Running in \\servername_Builds\unmerged\work_ServerCheck
            File Delete Operation:
            Sleeping for 1 sec
            \\servername_Builds\unmerged\work_ServerCheck\build_158550_domvsafb1.txt deleting....
            Success.
            File Copy Operation:
            Sleeping for 1 sec
            e:\jenkins\workspace\work_ServerCheck\build_158551_domvsafb1.txt
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Searching for disk definitions in the Node config
            Running in \\servername_Builds\unmerged\work_ServerCheck
            Sleeping for 1 sec
            Sleeping for 1 sec
            Running on domvsafb2 in e:\jenkins\workspace\work_ServerCheck
            Sleeping for 1 sec
            + df /cygdrive/e/jenkins/workspace/
            + wmic cpu get loadpercentage
            domvsafb2: 20.0
            we have 9333040 bytes left on domvsafb2
            File Delete Operation:
            Sleeping for 1 sec
            e:\jenkins\workspace\work_ServerCheck\build_158550_domvsafb2.txt deleting....
            Success.
            Disk allocation strategy was not provided as step parameter. Fallback to the strategy defined in the Jenkins global config
            Using Disk allocation strategy: 'Select the Disk with the most usable space'
            Selected Disk ID 'S_Produkte__Builds' from the Disk Pool ID 'ArchivedBuilds'
            The path on Disk is: unmerged/work_ServerCheck
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Searching for disk definitions in the Node config
            Running in \\servername_Builds\unmerged\work_ServerCheck
            File Delete Operation:
            Sleeping for 1 sec
            \\servername_Builds\unmerged\work_ServerCheck\build_158550_domvsafb2.txt deleting....
            Success.
            File Copy Operation:
            Sleeping for 1 sec
            e:\jenkins\workspace\work_ServerCheck\build_158551_domvsafb2.txt
            Searching for disk definitions in the External Workspace Templates from Jenkins global config
            Searching for disk definitions in the Node config
            Running in \\servername_Builds\unmerged\work_ServerCheck
            Sleeping for 1 sec
            Running on domvsaf1 in /rsync/JenkinsJobs/workspace/work_ServerCheck
            File Create Operation:
            File Create Operation:
            Archiving artifacts
            data_load.csv file already exists, replacing the content with the provided content.
            Creating file: /rsync/JenkinsJobs/workspace/work_ServerCheck/data_load.csv
            data_disk.csv file already exists, replacing the content with the provided content.
            Creating file: /rsync/JenkinsJobs/workspace/work_ServerCheck/data_disk.csv

             

             


              People

              Assignee:
              alexsomai Alexandru Somai
              Reporter:
              romanz Roman Zwi
Votes:
0
Watchers:
1
