• Type: Bug
    • Resolution: Unresolved
    • Priority: Minor
    • Component: s3-plugin
    • Labels: None
    • Environment: Jenkins ver. 2.217
      S3 publisher ver. 0.11.3

      After upgrading the S3 publisher plugin to version 0.11.3, we found that uploads with gzipFiles set to true fail with the following message:

      ERROR: Failed to upload files
      com.amazonaws.SdkClientException: Failed to mark the file position
       at com.amazonaws.internal.ResettableInputStream.mark(ResettableInputStream.java:148)
       at com.amazonaws.internal.SdkFilterInputStream.mark(SdkFilterInputStream.java:114)
       at com.amazonaws.util.LengthCheckInputStream.mark(LengthCheckInputStream.java:116)
       at com.amazonaws.internal.SdkFilterInputStream.mark(SdkFilterInputStream.java:114)
       at com.amazonaws.services.s3.internal.MD5DigestCalculatingInputStream.mark(MD5DigestCalculatingInputStream.java:94)
       at com.amazonaws.internal.SdkFilterInputStream.mark(SdkFilterInputStream.java:114)
       at com.amazonaws.internal.SdkFilterInputStream.mark(SdkFilterInputStream.java:114)
       at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1081)
       at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:784)
       at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:752)
       at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:726)
       at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:686)
       at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:668)
       at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:532)
       at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:512)
       at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5052)
       at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4998)
       at com.amazonaws.services.s3.AmazonS3Client.access$300(AmazonS3Client.java:394)
       at com.amazonaws.services.s3.AmazonS3Client$PutObjectStrategy.invokeServiceCall(AmazonS3Client.java:5940)
       at com.amazonaws.services.s3.AmazonS3Client.uploadObject(AmazonS3Client.java:1808)
       at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1768)
       at com.amazonaws.services.s3.transfer.internal.UploadCallable.uploadInOneChunk(UploadCallable.java:131)
       at com.amazonaws.services.s3.transfer.internal.UploadCallable.call(UploadCallable.java:123)
       at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:143)
       at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:48)
       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
       at java.lang.Thread.run(Thread.java:748)
      Caused by: java.nio.channels.ClosedChannelException
       at sun.nio.ch.FileChannelImpl.ensureOpen(FileChannelImpl.java:110)
       at sun.nio.ch.FileChannelImpl.position(FileChannelImpl.java:253)
       at com.amazonaws.internal.ResettableInputStream.mark(ResettableInputStream.java:146)
       ... 28 more

      If we revert the plugin to 0.11.2, uploads work as expected.
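For context on the root cause: `Failed to mark the file position` is thrown when the SDK's ResettableInputStream calls FileChannel.position() on a channel that is already closed (presumably the temporary gzipped file is closed or cleaned up before the SDK marks the stream for upload; that ordering is an inference from the trace, not confirmed in the plugin source). A minimal sketch of that failure mode, using only java.nio and no SDK code:

```java
import java.io.IOException;
import java.nio.channels.ClosedChannelException;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class ClosedChannelDemo {
    public static void main(String[] args) throws IOException {
        // Stand-in for the temporary gzipped artifact the plugin uploads.
        Path tmp = Files.createTempFile("artifact", ".gz");
        FileChannel channel = FileChannel.open(tmp, StandardOpenOption.READ);

        // The channel is closed early -- e.g. temp-file cleanup racing the upload.
        channel.close();

        try {
            // ResettableInputStream.mark() ultimately boils down to this call,
            // which per the FileChannel contract throws once the channel is closed.
            channel.position();
        } catch (ClosedChannelException e) {
            System.out.println("caught " + e.getClass().getName());
        }
        Files.deleteIfExists(tmp);
    }
}
```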

          [JENKINS-60916] S3 upload fails with gzipFiles set to true

          +1.

          Same issue here with 0.11.3 on Jenkins 2.204.1. Since we build a new Jenkins image each month, I'm not sure whether the job was broken by the Jenkins upgrade or the plugin upgrade, but based on when the plugin was released I'd bet on the plugin: https://github.com/jenkinsci/s3-plugin/releases/tag/s3-0.11.3.


          I've recently bumped into this issue as well, and have a probable fix ready in https://github.com/jenkinsci/s3-plugin/pull/127


          Oren Magid added a comment -

          We are using 0.11.5 and getting a somewhat different error message. We haven't used the gzip feature with an earlier version. Does this look like it's the same issue?

          (Also, will this set the Content-Encoding header on these files to "Gzip" in S3? If not, this won't work for us; there doesn't appear to be another way to set the header, though there was another pull request, closed without being merged, that would have allowed it.)

          ERROR: Failed to upload files
          com.amazonaws.ResetException: The request to the service failed with a retryable reason, but resetting the request input stream has failed. See exception.getExtraInfo or debug-level logging for the original failure that caused this retry.; If the request involves an input stream, the maximum stream buffer size can be configured via request.getRequestClientOptions().setReadLimit(int)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutor.resetRequestInputStream(AmazonHttpClient.java:1465)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1266)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1139)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:796)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:764)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:738)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:698)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:680)
           at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:544)
           at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:524)
           at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5052)
           at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4998)
           at com.amazonaws.services.s3.AmazonS3Client.access$300(AmazonS3Client.java:394)
           at com.amazonaws.services.s3.AmazonS3Client$PutObjectStrategy.invokeServiceCall(AmazonS3Client.java:5940)
           at com.amazonaws.services.s3.AmazonS3Client.uploadObject(AmazonS3Client.java:1808)
           at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1768)
           at com.amazonaws.services.s3.transfer.internal.UploadCallable.uploadInOneChunk(UploadCallable.java:131)
           at com.amazonaws.services.s3.transfer.internal.UploadCallable.call(UploadCallable.java:123)
           at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:143)
           at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:48)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
          Caused by: java.nio.channels.ClosedChannelException
           at sun.nio.ch.FileChannelImpl.ensureOpen(FileChannelImpl.java:110)
           at sun.nio.ch.FileChannelImpl.position(FileChannelImpl.java:276)
           at com.amazonaws.internal.ResettableInputStream.reset(ResettableInputStream.java:174)
           at com.amazonaws.internal.SdkFilterInputStream.reset(SdkFilterInputStream.java:120)
           at com.amazonaws.util.LengthCheckInputStream.reset(LengthCheckInputStream.java:126)
           at com.amazonaws.internal.SdkFilterInputStream.reset(SdkFilterInputStream.java:120)
           at com.amazonaws.services.s3.internal.MD5DigestCalculatingInputStream.reset(MD5DigestCalculatingInputStream.java:105)
           at com.amazonaws.internal.SdkFilterInputStream.reset(SdkFilterInputStream.java:120)
           at com.amazonaws.event.ProgressInputStream.reset(ProgressInputStream.java:168)
           at com.amazonaws.internal.SdkFilterInputStream.reset(SdkFilterInputStream.java:120)
           at com.amazonaws.http.AmazonHttpClient$RequestExecutor.resetRequestInputStream(AmazonHttpClient.java:1463)
           ... 23 more
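The ResetException is the same underlying failure surfacing one step later: on a retryable error the SDK rewinds the request stream to a mark taken before the first attempt, and with ResettableInputStream that reset goes through FileChannel.position(), which throws once the channel is closed. A small sketch of the mark/reset contract the SDK relies on, in plain java.io with no AWS types:

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class MarkResetDemo {
    public static void main(String[] args) throws IOException {
        BufferedInputStream body = new BufferedInputStream(
                new ByteArrayInputStream("payload".getBytes(StandardCharsets.UTF_8)));

        body.mark(1024);   // the SDK marks the request body before an attempt
        body.read();       // a failed attempt consumes part of the stream...
        body.read();
        body.reset();      // ...and the retry rewinds to the mark; with
                           // ResettableInputStream this rewind is a
                           // FileChannel.position() call, which fails if the
                           // channel has been closed in the meantime

        System.out.println("first byte after reset: " + (char) body.read());
    }
}
```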

          ...
          Caused by: java.nio.channels.ClosedChannelException at sun.nio.ch.FileChannelImpl.ensureOpen(FileChannelImpl.java:110)
          ...
          

          That indeed looks like this issue: the transport channel is closed before the data is (fully) transferred.

          Also, will this set the Content-Encoding header on these files to "Gzip" in S3?

          I do believe that it does set the headers correctly, as I haven't had problems downloading the uploaded files through the S3 UI.
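On the header question generally: for clients to decompress transparently, the object body must be gzip data (RFC 1952, leading magic bytes 1f 8b) and the object's Content-Encoding metadata must be gzip — with the AWS Java SDK v1 that header is set via ObjectMetadata.setContentEncoding("gzip") before the upload (whether the plugin does exactly that is not confirmed here). A sketch of the compression side only, in plain Java:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipBodyDemo {
    public static void main(String[] args) throws IOException {
        byte[] plain = "hello".getBytes(StandardCharsets.UTF_8);

        // Compress the payload, as the plugin does when gzipFiles is true.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write(plain);
        }
        byte[] compressed = buf.toByteArray();

        // RFC 1952 magic bytes 1f 8b: clients only decompress this
        // transparently when the object is also served with
        // Content-Encoding: gzip.
        System.out.printf("magic: %02x%02x%n",
                compressed[0] & 0xff, compressed[1] & 0xff);

        // Round-trip to confirm the payload survives.
        byte[] back = new GZIPInputStream(
                new ByteArrayInputStream(compressed)).readAllBytes();
        System.out.println(new String(back, StandardCharsets.UTF_8));
    }
}
```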


          Oren Magid added a comment -

          Awesome! Do you know if this will be fixed soon? Just trying to figure out how to get gzipped versions of our files up on S3.


          Do you know if this will be fixed soon?

          I'm not part of the Jenkins organization, and have no write access to the plugin... There's an MR which I expect fixes the issue, but there has been no action on Jenkins' part.

          Just trying to figure out how to get gzipped versions of our files up on S3.

          I'm just waiting for a fix, and staying on 0.11.2 until then (everything up to and including 0.11.2 works): the security risks have been accepted by our security officer. Your mileage may vary, though.


          Oren Magid added a comment -

          Ah, I see. Thanks for your response!


          Steven added a comment -

          We are experiencing the same issue. Can this please be bumped up from severity 'Minor'? 
          This is forcing us to keep using 0.11.2 which has known security issues. 


            jimilian Alexander A
            spectre683 Berin Babcock-McConnell
            Votes: 2
            Watchers: 7