s3Upload with includePathPattern does not upload files

    • Type: Bug
    • Resolution: Unresolved
    • Priority: Major
    • Environment: Jenkins 2.73.1, pipeline-aws-plugin 1.15

      Thanks for releasing the 1.15 version with the includePathPattern option to s3Upload()!

      Unfortunately, it doesn't work for me - no files are uploaded to S3.

      See the following pipeline:

      node {
        sh """
          mkdir -p test test2
          echo foo > test/bar.txt
          echo foo > test2/baz.txt
        """
        s3Upload(bucket: bucket, path: 'test-pattern/', includePathPattern: '*/*.txt')
        s3Upload(bucket: bucket, path: 'test-filename/test/bar.txt', file: 'test/bar.txt')
      }

      Only the test-filename folder is created in the bucket, no test-pattern folder. The output is as follows:

      Started by user anonymous
      [Pipeline] node
      Running on ESC (sir-4cs867nq) in /home/ubuntu/workspace/Test
      [Pipeline] {
      [Pipeline] sh
      [Test] Running shell script
      + mkdir -p test test2
      + echo foo
      + echo foo
      [Pipeline] s3Upload
      Uploading */*.txt to s3://$bucket/test-pattern/ 
      Upload complete
      [Pipeline] s3Upload
      Uploading file:/home/ubuntu/workspace/Test/test/bar.txt to s3://$bucket/test-filename/test/bar.txt 
      Finished: Uploading to $bucket/test-filename/test/bar.txt
      Upload complete
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] End of Pipeline
      Finished: SUCCESS

      I hope that I'm not too stupid for pattern matching ;) (the same happens with includePathPattern: '**/*').
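      For comparison, the pattern-based call above would be expected to have roughly the same effect as the following explicit single-file uploads (only a sketch; the paths are taken from the reproduction above):

      // Expected equivalent of the includePathPattern call, spelled out per file
      s3Upload(bucket: bucket, path: 'test-pattern/test/bar.txt', file: 'test/bar.txt')
      s3Upload(bucket: bucket, path: 'test-pattern/test2/baz.txt', file: 'test2/baz.txt')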

          [JENKINS-47046] s3Upload with includePathPattern does not upload files

          Steffen Gebert created issue -
          Jacob Sohn made changes - Priority: Minor → Major

          Jacob Sohn added a comment -

          Can confirm: includePathPattern and excludePathPattern in conjunction with workingDir do not upload anything, as shown in the example above, and the "file" parameter cannot be used together with them.

          Bumping up the priority.

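          For reference, the combination described above looks roughly like this (bucket name, working directory, and patterns are assumed for illustration only, not taken from an actual setup):

          // Hypothetical example of pattern uploads relative to a workingDir
          s3Upload(bucket: 'my-bucket',               // assumed bucket name
                   workingDir: 'test',                // patterns are resolved relative to this directory
                   includePathPattern: '**/*.txt',    // include all .txt files below workingDir
                   excludePathPattern: '**/old*.txt', // but skip anything matching old*.txt
                   path: 'test-pattern/')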

          Steffen Gebert added a comment -

          A new feature that's broken is definitely not a major issue for me, especially as a replacement with findFiles is possible.

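          A minimal sketch of that findFiles-based workaround (assuming the Pipeline Utility Steps plugin is available and reusing the directories from the reproduction above):

          node {
              // collect the files the pattern should have matched
              def files = findFiles(glob: '*/*.txt')
              for (f in files) {
                  // upload each match individually via the working "file" parameter
                  s3Upload(bucket: bucket, path: "test-pattern/${f.path}", file: f.path)
              }
          }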

          Jacob Sohn added a comment -

          Although it's a new feature, the documentation does not reflect the current state of the s3Upload usage example. I myself was caught off guard when the plugin was installed on a stable-branch Jenkins installation with pipeline-aws-plugin version 1.15.


          Thorsten Hoeger added a comment -

          Hi, do you both happen to use a slave setup?

          It does work on my setup, but I am not using slaves.


          Steffen Gebert added a comment -

          Yes, we run on slaves exclusively.


          Jacob Sohn added a comment -

          Yes, the s3Upload portion runs on build slaves.

          When used with the file parameter, however, s3Upload on a slave works as expected.


            Assignee: Thorsten Hoeger (hoegertn)
            Reporter: Steffen Gebert (stephenking)
            Votes: 9
            Watchers: 16
