JENKINS-45774: Cannot find files from reports generated in parallel

      Relevant `Jenkinsfile` part:

      pipeline {
          stages {
              stage('PythonChecks') {
                  steps {
                      parallel(
                          some_para_pep8_step: {
                                  sh 'pycodestyle src/whmonit || exit 0'
                          },
                      )
                  }
                  post {
                      always {
                          warnings(
                              canRunOnFailed: true,
                              categoriesPattern: '',
                              consoleParsers: [[parserName: 'Pep8']],
                              defaultEncoding: '',
                              excludePattern: '',
                              healthy: '1',
                              includePattern: '',
                              messagesPattern: '',
                              unHealthy: '2',
                              unstableTotalAll: '3',
                          )
                      }
                  }
              }
          }
      }
      

      Stack trace displayed when trying to open any file in the report:

      Copying the source file '[some_para_pep8_step] src/whmonit/client/agent.py' from the workspace to the build folder '9f0d4020.tmp' on the Jenkins master failed.
      Seems that the path is relative, however an absolute path is required when copying the sources.
      Is the file 'agent.py' contained more than once in your workspace?
      Is the file '[some_para_pep8_step] src/whmonit/client/agent.py' a valid filename?
      If you are building on a slave: please check if the file is accessible under '$JENKINS_HOME/[job-name]/[some_para_pep8_step] src/whmonit/client/agent.py'
      If you are building on the master: please check if the file is accessible under '$JENKINS_HOME/[job-name]/workspace/[some_para_pep8_step] src/whmonit/client/agent.py'
      java.io.IOException: Failed to copy [some_para_pep8_step] src/whmonit/client/agent.py to /var/lib/jenkins/jobs/PM_master_TEST/builds/145/workspace-files/9f0d4020.tmp
        at hudson.FilePath.copyTo(FilePath.java:2003)
        at hudson.plugins.analysis.util.Files.copyFilesWithAnnotationsToBuildFolder(Files.java:80)
        at hudson.plugins.analysis.core.HealthAwareRecorder.copyFilesWithAnnotationsToBuildFolder(HealthAwareRecorder.java:351)
        at hudson.plugins.analysis.core.HealthAwarePublisher.perform(HealthAwarePublisher.java:91)
        at hudson.plugins.analysis.core.HealthAwareRecorder.perform(HealthAwareRecorder.java:298)
        at org.jenkinsci.plugins.workflow.steps.CoreStep$Execution.run(CoreStep.java:80)
        at org.jenkinsci.plugins.workflow.steps.CoreStep$Execution.run(CoreStep.java:67)
        at org.jenkinsci.plugins.workflow.steps.SynchronousNonBlockingStepExecution$1$1.call(SynchronousNonBlockingStepExecution.java:49)
        at hudson.security.ACL.impersonate(ACL.java:260)
        at org.jenkinsci.plugins.workflow.steps.SynchronousNonBlockingStepExecution$1.run(SynchronousNonBlockingStepExecution.java:46)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)
      Caused by: java.io.IOException: remote file operation failed: [some_para_pep8_step] src/whmonit/client/agent.py at hudson.remoting.Channel@21ac9f5d:cislave2: java.nio.file.NoSuchFileException: [some_para_pep8_step] src/whmonit/client/agent.py
        at hudson.FilePath.act(FilePath.java:994)
        at hudson.FilePath.act(FilePath.java:976)
        at hudson.FilePath.copyTo(FilePath.java:2024)
        at hudson.FilePath.copyTo(FilePath.java:2000)
        ... 14 more
      Caused by: java.nio.file.NoSuchFileException: [some_para_pep8_step] src/whmonit/client/agent.py
        at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
        at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
        at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
        at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
        at java.nio.file.Files.newByteChannel(Files.java:361)
        at java.nio.file.Files.newByteChannel(Files.java:407)
        at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384)
        at java.nio.file.Files.newInputStream(Files.java:152)
        at hudson.FilePath$41.invoke(FilePath.java:2027)
        at hudson.FilePath$41.invoke(FilePath.java:2024)
        at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2750)
        at hudson.remoting.UserRequest.perform(UserRequest.java:181)
        at hudson.remoting.UserRequest.perform(UserRequest.java:52)
        at hudson.remoting.Request$2.run(Request.java:336)
        at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:68)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:748)
        at ......remote call to cislave2(Native Method)
        at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1554)
        at hudson.remoting.UserResponse.retrieve(UserRequest.java:281)
        at hudson.remoting.Channel.call(Channel.java:839)
        at hudson.FilePath.act(FilePath.java:987)
        ... 17 more
      

      Notice how the name of the parallel branch (`some_para_pep8_step`) gets prepended to the file path, so the plugin cannot find the file.

      The problem disappears as soon as I run the checks without parallel, but that's a very suboptimal workaround.
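
      For reference, the workaround is essentially the same stage with the `parallel` wrapper removed (trimmed here to the relevant part):

          stage('PythonChecks') {
              steps {
                  // Same check, just not wrapped in parallel(...); there is no branch
                  // label to prepend, so the warnings plugin finds the files.
                  sh 'pycodestyle src/whmonit || exit 0'
              }
              post {
                  always {
                      warnings(
                          canRunOnFailed: true,
                          consoleParsers: [[parserName: 'Pep8']],
                          healthy: '1',
                          unHealthy: '2',
                          unstableTotalAll: '3'
                      )
                  }
              }
          }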

          Karol Woźniak added a comment -

          It's not only pep8, though; pylint, for example, does the same thing. Couldn't this be handled above the level of the specific parser(s)?

          Ulli Hafner added a comment -

          I don't think this can be done generically; maybe some parsers require text in brackets. But using a common base class for the affected parsers would be fine.

          Can you add one of the lines from the console log that caused the problem?

          Karol Woźniak added a comment -

          But the part in brackets is not returned by the tool (pep8, pylint, etc.); it's literally the name of the task defined in the parallel call. I guess it's added somewhere along the way so that plugins can tell which of the tasks produced which log line, if they need to.

          The output of `pycodestyle` in the console log looks like this (exactly the same with and without parallel):

          [PM_master_TEST] Running shell script
          + pycodestyle src/whmonit
          src/whmonit/client/agent.py:13:1: E402 module level import not at top of file
          src/whmonit/client/agent.py:14:1: E402 module level import not at top of file
          src/whmonit/client/agent.py:15:1: E402 module level import not at top of file
          
          Maybe there's a way to remove this piece only when supplying paths to the part of the code that's responsible for copying the files? I haven't seen the code, so this is just thinking out loud...

          Ulli Hafner added a comment -

          Aha, that is strange. So there is text that is not shown in the console log view but is scanned by the parser? I think I need to write an ATH test so that I can reproduce it on my machine...

          Karol Woźniak added a comment -

          Yep, this is pretty much what happens. I was kind of surprised too :-]. Let me know if I can help with anything; we'd like to see this resolved, as running it sequentially slows down our builds quite a bit.

          Karol Woźniak added a comment -

          drulli: Sorry to bother you, but is there any progress on this? Maybe you could point me in a direction so I could try to fix it?

          Ulli Hafner added a comment - edited

          Actually, I have not looked into this issue since you reported it, sorry. I chatted with abayer about the problem right after you created the bug and he wanted to look into it, but it seems he has also forgotten about it...

          A simple solution would be to make the parser more intelligent: the regular expression should ignore that part of the message. (You need to write a small test case with your log file and adapt the parser accordingly.)
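
          For illustration only, roughly the kind of parser tweak meant here (a sketch, not the plugin's actual Pep8 pattern): let the file-name group tolerate an optional leading "[branch] " label and drop it, so the extracted path matches what is actually in the workspace.

              import java.util.regex.Pattern

              // Illustrative pattern: an optional "[branch-name] " prefix (as added for
              // parallel branches) is matched by a non-capturing group and discarded,
              // so group(1) contains only the real file path.
              Pattern pep8Line = Pattern.compile('^(?:\\[[^\\]]+\\] )?(.+):(\\d+):(\\d+): (\\w\\d+) (.+)$')

              def line = '[some_para_pep8_step] src/whmonit/client/agent.py:13:1: E402 module level import not at top of file'
              def m = pep8Line.matcher(line)
              assert m.matches()
              assert m.group(1) == 'src/whmonit/client/agent.py'   // prefix stripped
              assert m.group(4) == 'E402'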

          Currently I'm converting all parsers to the new API, but that branch is not yet ready for a release. So maybe it makes sense to wait a little longer until that part is done.

          Karol Woźniak added a comment -

          Thank you for the response.

          Yes, this would probably be the simplest road, but it does not sound like a correct solution. As I've mentioned earlier, this is not a tool-specific issue, so we'd have to work around it like this in every parser [that we use].

          I'd hope that it can be tracked down (or up?) to where this actually happens.

          For now, we're just not using parallel there, so it's not a deal breaker, but it is annoying.

          Ulli Hafner added a comment -

          I think that the best solution for this problem would be to strip all [tags] from the console log before the lines are passed to the parsers.
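
          A rough sketch of that idea, assuming the label always looks like a bracketed branch name followed by a space at the start of a line (which is what the stack trace above suggests):

              // Sketch of a generic pre-processing step: strip a leading "[branch] " label
              // from each console line before the line is handed to the individual parsers.
              String stripParallelLabel(String line) {
                  return line.replaceFirst(/^\[[^\]]+\] /, '')
              }

              assert stripParallelLabel('[some_para_pep8_step] src/whmonit/client/agent.py:13:1: E402 msg') ==
                      'src/whmonit/client/agent.py:13:1: E402 msg'
              // Lines without a label pass through unchanged.
              assert stripParallelLabel('src/whmonit/client/agent.py:13:1: E402 msg') ==
                      'src/whmonit/client/agent.py:13:1: E402 msg'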

          Ulli Hafner added a comment -

          As far as I understand, the parallel prefix is no longer shown in the console log. Can you confirm?

            Assignee: Ulli Hafner (drulli)
            Reporter: Karol Woźniak (kenjitakahashi)
            Votes: 1
            Watchers: 3