JENKINS-47046

s3Upload with includePathPattern does not upload files


      Description

      Thanks for releasing the 1.15 version with the includePathPattern option to s3Upload()!

      Unfortunately, it doesn't work for me - no files are uploaded to S3.

      See the following pipeline:

      node {
        sh """
          mkdir -p test test2
          echo foo > test/bar.txt
          echo foo > test2/baz.txt
        """
        s3Upload(bucket: bucket, path: 'test-pattern/', includePathPattern: '*/*.txt')
        s3Upload(bucket: bucket, path: 'test-filename/test/bar.txt', file: 'test/bar.txt')
      }

      Only the test-filename folder is created in the bucket, but no test-pattern folder. The output is as follows:

      Started by user anonymous
      [Pipeline] node
      Running on ESC (sir-4cs867nq) in /home/ubuntu/workspace/Test
      [Pipeline] {
      [Pipeline] sh
      [Test] Running shell script
      + mkdir -p test test2
      + echo foo
      + echo foo
      [Pipeline] s3Upload
      Uploading */*.txt to s3://$bucket/test-pattern/ 
      Upload complete
      [Pipeline] s3Upload
      Uploading file:/home/ubuntu/workspace/Test/test/bar.txt to s3://$bucket/test-filename/test/bar.txt 
      Finished: Uploading to $bucket/test-filename/test/bar.txt
      Upload complete
      [Pipeline] }
      [Pipeline] // node
      [Pipeline] End of Pipeline
      Finished: SUCCESS

      I hope that I'm not too stupid for pattern matching (the same happens with includePathPattern: '**/*').


          Activity

          jonvnieu Jonas Van Nieuwenberg added a comment -

          Hi, I'm facing the same issue in a master/slave setup.

          Something peculiar I noticed while goofing around with the settings: if I set the path parameter to be an actual file name (instead of a directory), then the matched files do seem to be uploaded, but obviously they all have the provided file name.
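          A minimal sketch of what that looks like (the bucket name and object key below are placeholders, not taken from the comment):

          node {
            // path points at a single object key instead of a "folder" prefix
            s3Upload(bucket: 'my-bucket', path: 'test-pattern/out.txt', includePathPattern: '*/*.txt')
            // observed result (per the comment above): the matched files are uploaded,
            // but each one ends up as s3://my-bucket/test-pattern/out.txt
          }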

          ssbarnea Sorin Sbarnea added a comment - edited

          I can confirm this bug: the `path` parameter is fully ignored when includePathPattern is a pattern, making it impossible to upload packages to specific target locations inside the bucket.

          I am still trying to find a way to work around this bug, but no solution yet. Anyone?
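          One possible direction, sketched here on the assumption that the findFiles step (Pipeline Utility Steps plugin) is available: enumerate the files yourself and call s3Upload once per file with an explicit target key, which keeps control over the location inside the bucket. The bucket name, prefix, and glob are placeholders.

          node {
            def files = findFiles(glob: '*/*.txt')
            files.each { f ->
              // keep the workspace-relative path under a fixed prefix in the bucket
              s3Upload(bucket: 'my-bucket', file: f.path, path: "releases/${f.path}")
            }
          }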

          schollii Oliver Schoenborn added a comment - edited

          The only thing that works for me is
           

          s3Upload( 
            bucket: 'BUCKET', 
            path: "PATH_TO_FOLDER", // no trailing slash 
            file: "FOLDER", 
            workingDir: "PARENT_OF_FOLDER" 
          )

           
          With includePathPattern, the only pattern that partially worked was "**/*.yaml"; other patterns like "*.yaml" and "**/*" did not work. I say partially because it uploaded only one file even though there were several (there is a bug related to this).
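          Spelled out as a call, that is presumably something like the following (bucket name and path prefix are placeholders):

          s3Upload(
            bucket: 'BUCKET',
            path: 'reports/',
            includePathPattern: '**/*.yaml' // the only pattern that partially worked here
          )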

          Also, findFiles is an option, as documented in https://github.com/jenkinsci/pipeline-aws-plugin/issues/83.

          wimnat Rob White added a comment -

          Sad to see no update on this.

          I can confirm it's an issue for me and yes, only on my slaves.

          djlm Demetrio Lopez added a comment -

          My workaround is uploading file by file:

           

          pipeline {
            agent any
            environment {
              // pipeline-wide variables
              AWS_ACCOUNT_ID = '01010101010101010'
              REGION = 'eu-north-1'
              ROLE = 'MyIamRole'
              EXTERNAL_ID = 'MyExternalId'
              BUCKET = 'my-artifacts'
              PROJECT = 'my-project'
            }
            stages {
              stage('Build app and upload artifacts to S3') {
                agent {
                  label 'my-slave-with-maven'
                }
                steps {
                  // build source code
                  dir('./SourceCode') {
                    sh 'mvn -B clean package'
                  }
                  script {
                    // upload the resulting JARs to S3 one by one
                    def jar_files = findFiles(glob: "**/SourceCode/${PROJECT}/target/*.jar")
                    jar_files.each {
                      echo "JAR found: ${it}"
                      withAWS(externalId: "${EXTERNAL_ID}", region: "${REGION}", role: "${ROLE}", roleAccount: "${AWS_ACCOUNT_ID}") {
                        s3Upload(file: "${it}", bucket: "${BUCKET}", path: "${PROJECT}/", acl: 'BucketOwnerFullControl')
                      }
                    }
                  }
                }
              }
            }
          }
          

            People

            Assignee:
            hoegertn Thorsten Hoeger
            Reporter:
            stephenking Steffen Gebert
            Votes:
            9
            Watchers:
            16
