- Type: Bug
- Resolution: Fixed
- Priority: Major
- Component/s: artifact-manager-s3-plugin
- Labels: None
- Environment: Jenkins 2.176.1, artifact-manager-s3-plugin 1.6
- Released As: 1.7
When restarting a multi-branch pipeline job from a stage, I noticed that the archived artifacts are not copied to the new pipeline run if the job name contains a forward slash. The archived artifacts are stored in S3 via the Artifact Manager S3 plugin. In the S3 bucket the / in the job name is encoded as %2F, and the artifact is successfully archived; I can see it in our AWS account.
09:48:25 Started by user <redacted>
09:48:25 Restarted from build #1, stage Build environment
14:48:25 [2019-10-28T14:48:25.754Z] Connecting to https://github.<redacted>.com/api/v3 using <redacted>/****** (Token used for integration with <redacted> GitHub Enterprise.)
09:48:26 Obtained Jenkinsfile from a1590de81a30e783b4327b83e5276c462f652086
09:48:27 GitHub has been notified of this commit's build result
09:48:27 org.jclouds.blobstore.KeyNotFoundException: /artifacts/shuttle-migrate/unstash-test/user%2Ftest/2/artifacts/test.txt not found in container <redacted>.s3.amazonaws.com: The specified key does not exist.
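The failing key in the log above has the branch's forward slash stored as %2F. A minimal sketch of how URL-encoding the branch name produces such a key, and why a lookup using the literal path would miss it (the key builder here is a hypothetical illustration, not the plugin's actual code):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class BranchKeyDemo {
    // Hypothetical key builder: prefix, URL-encoded branch name, build number, file.
    // The real plugin's key layout may differ; this only illustrates the %2F effect.
    static String s3Key(String prefix, String branch, int build, String file) {
        return prefix + "/" + URLEncoder.encode(branch, StandardCharsets.UTF_8)
                + "/" + build + "/artifacts/" + file;
    }

    public static void main(String[] args) {
        String stored = s3Key("artifacts/shuttle-migrate/unstash-test", "user/test", 2, "test.txt");
        String literal = "artifacts/shuttle-migrate/unstash-test/user/test/2/artifacts/test.txt";
        // The slash in the branch name becomes %2F in the stored key...
        System.out.println(stored);
        // ...so a lookup built from the literal branch path points at a different key.
        System.out.println(stored.equals(literal)); // false
    }
}
```

Any mismatch of this kind between the key used at archive time and the key used at copy time would produce exactly the KeyNotFoundException seen above.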
This same pipeline (attached to the issue) works when I restart from a stage if the job name does not have a forward slash in it. The archived artifacts are copied to the new pipeline run and the pipeline continues successfully.
09:41:06 Started by user <redacted>
09:41:06 Restarted from build #1, stage Build environment
14:41:06 [2019-10-28T14:41:06.519Z] Connecting to https://github.<redacted>.com/api/v3 using <redacted>/****** (Token used for integration with <redacted> GitHub Enterprise.)
09:41:07 Obtained Jenkinsfile from d21269a70265fb22ca154c1faaa628882047e6ce
09:41:07 Copied 2 artifact(s)/stash(es) from https://<redacted>.s3.amazonaws.com/artifacts/shuttle-migrate/unstash-test/test-branch/1/ to https://<redacted>.s3.amazonaws.com/artifacts/shuttle-migrate/unstash-test/test-branch/2/
We'd like to be able to restart an older pipeline so that we can redeploy older builds in the event of a rollback.
I found this resolved issue that seems related to what I've experienced, but it appears to be specific to manually downloading artifacts via the S3 HTTP URL.
https://issues.jenkins-ci.org/browse/JENKINS-52151
I also see that that issue was resolved in the 1.2 release; we're running the latest version but are still seeing the branch name issue.
https://wiki.jenkins.io/display/JENKINS/Artifact+Manager+S3+Plugin
To test this I created a pipeline, which is attached to this issue, and did the following:
- Run a successful build that archives an artifact.
- Start a new build by restarting from the first stage of the successful build above.
Restarting from an older pipeline worked when I ran it on jobs based on the following branch names:
- master
- user-branch
- user-test
The same steps failed with the "The specified key does not exist." error when running them on jobs based on the following branch names:
- test/user/branch-test
- user/test
- user/branch
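The distinguishing property of the failing names is that URL-encoding changes them, while the working names pass through unchanged. A quick check (assuming the stored key uses the URL-encoded branch name, as the %2F in the failing key suggests):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class BranchNameCheck {
    // True if URL-encoding leaves the branch name unchanged
    // (i.e. it contains no '/' or other characters URLEncoder escapes).
    static boolean encodesCleanly(String branch) {
        return URLEncoder.encode(branch, StandardCharsets.UTF_8).equals(branch);
    }

    public static void main(String[] args) {
        String[] working = {"master", "user-branch", "user-test"};
        String[] failing = {"test/user/branch-test", "user/test", "user/branch"};
        for (String b : working) System.out.println(b + " -> " + encodesCleanly(b)); // all true
        for (String b : failing) System.out.println(b + " -> " + encodesCleanly(b)); // all false
    }
}
```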
I'm guessing some form of encoding or hashing of the branch name is used to build the artifact key for the S3 lookup, and the forward slash causes a mismatch between the key that was stored and the key being looked up.
- relates to: JENKINS-50591 "Files and paths with ampersand cause exception" (Resolved)