Type: New Feature
Resolution: Done
Priority: Major
JENKINS-27295 discusses getting values from various kinds of parameters. Handling FileParameterValue is another matter. buildEnvironment (what is called today) is useless since it gives only a file name. createBuildWrapper is the way this works in a freestyle project, but this cannot work in a workflow; even if it were to return a SimpleBuildWrapper (JENKINS-24673) it is not clear where that would be called, since we can only use it from a workspace. getValue as currently implemented is useless since a FileItem does not have whitelisted methods, and anyway we would not want the flow itself to be copying streams to the workspace; this needs to be done by infrastructure. The only way forward I can see at the moment is for getValue to return a SimpleBuildWrapper, so that your flow could read
node {
    wrap([$delegate: parameters.myFileParam]) {
        sh 'cat myFileParam'
    }
}
depends on:
- JENKINS-24673 SimpleBuildWrapper (Resolved)
- JENKINS-27295 Boolean parameters injected as String (Resolved)
is duplicated by:
- JENKINS-47333 file parameter not working in pipeline job (Closed)
- JENKINS-51245 file parameter issue in jenkins pipeline (Closed)
is related to:
- JENKINS-29289 InputStep doesn't support File Parameters (Resolved)
- JENKINS-47333 file parameter not working in pipeline job (Closed)
relates to:
- JENKINS-12699 Temp upload files from file param not removed after transferred to slave (In Review)
[JENKINS-27413] Handle file parameters
Also need to check how we could send file parameters from a Workflow build, using the build step. Presumably you would want the actual file to come from the contextual workspace.
Is there any way to work around this issue while keeping everything in a pipeline job?
I have a process where we run a job to pre-process some data, then save the result as an artifact; we then use the Promoted Builds plugin to indicate that the result was OK (after a manual review), and to trigger a second job to load that data using an ETL tool. The second job, then, has to use the Copy Artifact plugin.
While this works, it feels less direct than I'd like it to be; also, the user has to go to the second job to see the actual result of the ETL load.
I've just run into this issue; in fact I've run into it at an even earlier level: the file just doesn't seem to be uploaded at all (you can't view it in the "parameters" page of the build; the file name and link are there, but clicking it gives an error rather than the file contents).
This is a major issue, because I need to set up try-before-commit functionality.
I'm in the same boat as Jon Hodgson: trying to set up try-before-commit. All my jobs are pipelines, and most of them are pretty complex, so I can't go back to freestyle jobs... I'm stuck with missing file upload functionality.
For the sake of documentation, if someone knows of a workaround please post it here.
I bent over backwards trying to get it to work with string parameters, but between Python (which I use for my launch script), Groovy, and HTTP all trying to be helpful, something always got auto-converted (early tests would seem great, then I would try a real-world diff and there would be something in there that broke, such as accented characters in files that had different formats).
In the end I went with a simple FTP server on the same machine as Jenkins, uploading the file with a name that I pass as a string parameter.
It's not the neatest and tidiest solution, but it works, at least... well, so far.
FYI, if you only need to deal with manual input, a file parameter can be caught in an 'input' step and then passed on:
stage('file input') {
    node {
        // Get file using input step, will put it in build directory
        def inputFile = input message: 'Upload file', parameters: [file(name: 'data.txt')]
        // Read contents and write to workspace
        writeFile(file: 'data.txt', text: inputFile.readToString())
        // Stash it for use in a different part of the pipeline
        stash name: 'data', includes: 'data.txt'
    }
}
stage('do something with data') {
    node {
        // Unstash the file into an 'input' directory in the workspace
        dir('input') {
            unstash 'data'
        }
        // do something useful
        sh "ls -lR input/"
    }
}
Normally you would have the 'input' step outside of a node, but since we want to put our file in a workspace and stash (and / or archive) it, we'll need the node here.
Thank you Johan Wärlander for this solution, which works well with text files. But I didn't find any simple solution for binary files; the encoding parameter of writeFile is ignored (https://issues.jenkins-ci.org/browse/JENKINS-27094).
Here is one solution I've found, using the FilePath API directly:
stage('file input') {
    node('master') {
        def inputFile = input message: 'Upload file', parameters: [file(name: 'data.zip')]
        unzip dir: '', glob: '', zipFile: inputFile.remote
    }
}
But there are two drawbacks:
- This works only on the master node (I didn't manage to access the FilePath channel of the workspace from the Pipeline plugin)
- The getRemote() method from FilePath is not whitelisted
Using FilePath.copyTo() or FilePath.unzip() instead of unzip has similar drawbacks.
Any suggestions?
arthaud, does Base64 encoding change anything in that scenario? E.g. something like:
writeFile(file: 'data.zip.b64', text: inputFile.read().getBytes().encodeBase64().toString())
..and then stash it, and do the decoding / unzipping in a separate node section?
I don't know if it makes a difference w/ master vs slave node execution, as we only have a master node so far in our setup.
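For what it's worth, a minimal sketch of that full round trip, under the same assumptions as the snippets above (an input-step file parameter named data.zip, and a Unix node with base64 and unzip available), might look like this; it is illustrative rather than tested:
stage('file input') {
    node {
        def inputFile = input message: 'Upload file', parameters: [file(name: 'data.zip')]
        // Base64-encode the raw bytes so the binary content survives writeFile / stash as text
        writeFile(file: 'data.zip.b64', text: inputFile.read().getBytes().encodeBase64().toString())
        stash name: 'data', includes: 'data.zip.b64'
    }
}
stage('decode and unzip') {
    node {
        unstash 'data'
        // Decode back to binary on whatever node this section runs on
        sh 'base64 -d data.zip.b64 > data.zip && unzip data.zip'
    }
}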
I personally gave up trying to pass the file as a parameter; as a text file it always got messed up somewhere along the line (it's a diff, so although it's text the encoding can vary depending on the source files).
So I switched to creating a zip with a URI as the filename, uploading it with FTP, and passing the URI as a parameter.
It's perhaps not as elegant as doing it all in jenkins, but it works.
Base64 encoding/decoding should do the trick on any master or slave node. Thank you!
I bit the bullet and created a simple (in the end) pipeline library to work around this issue. The library provides a new "step" - unstashParam - that saves the file build parameter into the workspace. Not (yet) thoroughly tested, but it seems to work fine with text files at least.
You may find it here: https://bitbucket.org/janvrany/jenkins-27413-workaround-library
Hope you would find it useful.
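For anyone trying it, usage looks roughly like the following sketch, based on the library's README (the parameter name FILE is just an example):
node {
    // Saves the file build parameter named 'FILE' into the workspace
    // and returns its workspace-relative path
    def fileInWorkspace = unstashParam 'FILE'
    sh "cat ${fileInWorkspace}"
}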
"UnstashParam" does not work when node is "master". I did not succeed in getting workspace object in that case.
zendev: Just for the record: more on https://bitbucket.org/janvrany/jenkins-27413-workaround-library/issues/2/does-not-working-on-master-node
Any news on this?
For the record, I noticed that the file provided through the parameter is only missing when using Pipeline from SCM - when using a Pipeline script, the file is present and this trivial pipeline completes successfully:
pipeline {
    agent any
    parameters {
        file description: 'Blah', name: 'custom.xml'
    }
    stages {
        stage("cat-file") {
            steps {
                powershell 'Get-Content .\\custom.xml'
            }
        }
    }
}
Took a while to find out that this was an issue. Plus one for getting this fixed in an upcoming release.
UnstashParam worked for me. Thanks.
Can confirm that this is still broken - my minimal Jenkinsfile below yields "echo null":
pipeline {
    agent any
    parameters {
        file(name: 'ZIP')
    }
    stages {
        stage('Test file') {
            steps {
                sh "echo ${params.ZIP}"
            }
        }
    }
}
Works in a freestyle project, but not in a pipeline.
Tried with `ls -l .` with no success, as if there is no file at all.
File parameters are not currently supported. I am not particularly inclined to fix this per se since support for them is quite complicated architecturally—the implementation for freestyle projects does not generalize naturally and relies on a special file upload web request handler. Simpler, safer, and more flexible would be to introduce a plugin defining a Base64-encoded string parameter type, with a form submission GUI and CLI that make it convenient to obtain the value from a local file, as well as a SimpleBuildWrapper (usable as a Pipeline block-scoped step) that decodes the value to a local temporary file within its body.
My 5 cents: this code shows how to load a properties file into the environment. In the example the properties come from a String, but you can also load a file from disk with the readProperties pipeline utility step:
node {
    def configProperties = """
        VAR01 = value1
        VAR02 = value2
    """
    def config = readProperties(text: configProperties)
    config.each { k, v ->
        env."${k}" = v
    }
    sh 'export'
}
If there is no plan to support the file parameter option, it would be nice to remove it from the documentation here: https://jenkins.io/doc/book/pipeline/syntax/#parameters
Avoid actively breaking it though, please, as we do actually use this functionality with the help of our global library.
def inputGetFile(String savedfile = null) {
    def filedata = null
    def filename = null
    // Get file using input step, will put it in build directory
    // the filename will not be included in the upload data, so optionally allow it to be specified
    if (savedfile == null) {
        def inputFile = input message: 'Upload file', parameters: [file(name: 'library_data_upload'), string(name: 'filename', defaultValue: 'uploaded-file-data')]
        filedata = inputFile['library_data_upload']
        filename = inputFile['filename']
    } else {
        def inputFile = input message: 'Upload file', parameters: [file(name: 'library_data_upload')]
        filedata = inputFile
        filename = savedfile
    }
    // Read contents and write to workspace
    writeFile(file: filename, encoding: 'Base64', text: filedata.read().getBytes().encodeBase64().toString())
    // Remove the file from the master to avoid stuff like secret leakage
    filedata.delete()
    return filename
}
And this library code is used with this jenkinsfile snippet:
stage('request a file with chosen filename') {
    node {
        deleteDir()
        uploaded_file = library.inputGetFile('my-file-here')
        sh "file '${uploaded_file}'"
        sh "ls -la ."
    }
}
Another plugin idea, useful for uploads too large to be reasonably handled as Base64 and environment variables: a parameter type which lets you upload a file to an S3 (or MinIO) bucket. Could either upload from the master upon form/CLI submission; or, more efficiently, provide a special UI & API endpoint to allocate a blob and then hand you a presigned URL good for upload only, valid for an hour. The parameter value would be a presigned URL good for download only, again valid only for a day or so. Similarly for Azure etc.
(No need to delete the blob in a RunListener after the build completes. That would break Replay / Rebuild and, as with artifact-manager-s3, would force the master to have a more dangerous IAM permission. Anyway S3 can be configured to automatically expunge or archive old blobs.)
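To make the idea concrete, consuming such a parameter might look like the sketch below; everything here is hypothetical (DATA_URL stands in for the proposed presigned-URL parameter value, and the fetch assumes a Unix agent with curl):
pipeline {
    agent any
    parameters {
        // Hypothetical: the proposed parameter type would deliver a presigned,
        // download-only URL as the parameter value
        string(name: 'DATA_URL', description: 'Presigned URL of the uploaded blob')
    }
    stages {
        stage('fetch upload') {
            steps {
                // No credentials needed while the presigned URL remains valid
                sh 'curl -fSL -o data.bin "$DATA_URL"'
            }
        }
    }
}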
"I am not particularly inclined to fix this per se"
That is very disappointing, and a stumbling block for those trying to convert their freestyle jobs to declarative pipeline jobs. I really hope you change your mind on this.
At the very least, the documentation should mention that it isn't supported.
Just a notice:
I have moved my workaround library with the `unstashParam` step to GitHub; the new location is
https://github.com/janvrany/jenkinsci-unstashParam-library
I also renamed it, as this issue seems unlikely to be fixed.
Hi hogarthj - does the solution work for binary/zip files? I tried using the Groovy script and the respective one in a Jenkins pipeline, but it doesn't seem to work for an uploaded zip ("Can not open file 'fileName.zip' as archive") or binary file. I would like to know if this has been considered for future releases?
vinwin: The unstashParam workaround works great for us, including zip files. We are using a Windows instance of Jenkins (in case that is significant).
I don't quite understand if this is just not implemented yet, or if it's impossible to implement at all. If it is possible, we might be able to fund the development of this feature, as we need it too. As I am still rather new to the Jenkins community: Can anyone recommend me any companies that could do that?
schmandr as to technical design, see my comments of 2018-10-24 and 2019-06-21. I am interpreting the requirement more broadly than the original statement: a user or script should be able to trigger a Pipeline job with a build parameter that includes the contents of a (possibly large, possibly binary, but not secret) file, using any common mechanism (GUI, HTTP POST, build-job CLI command, build step), in such a way that some convenient (Scripted and/or Declarative) syntax may be used to retrieve that file in a designated workspace location at a designated time, including in future rebuilds.
jglick
Still prototyping? Now in https://github.com/jenkinsci/file-parameters-plugin
bitwiseman the basics work fine. I (or somebody) need to do a bit more testing, and work on design of usage from the input and build steps (or a decision that integration with these steps will not be supported). On my back burner.
I have released an initial version of the File Parameter plugin. I think it covers the basic use cases for Pipeline users. Use its issue tracker as needed.
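For reference, basic usage along the lines of the plugin's documentation looks like this (the parameter name FILE is just an example; the plugin also offers a stashedFile parameter type for larger uploads):
pipeline {
    agent any
    parameters {
        // The uploaded file travels as a Base64-encoded string parameter
        base64File 'FILE'
    }
    stages {
        stage('use file') {
            steps {
                // Decodes the parameter to a temporary file exposed via $FILE
                withFileParameter('FILE') {
                    sh 'cat $FILE'
                }
            }
        }
    }
}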
Hello jglick, first of all, thanks for implementing the new plugin! Is there any chance your File Parameter plugin will get promoted to the CloudBees Update Center of verified and trusted open-source plugins, since it is currently the only working workaround for the broken file parameter?
Also, I guess that if this issue is closed with the resolution that people should use the new File Parameter plugin, then the original (broken for years and still not working) file parameter should be removed from Jenkins, to prevent confusion and stop people wasting time on workarounds that do not work...
Is there any chance your File Parameter plugin will get promoted to the CloudBees Update Center
If you are a CloudBees customer, open an RFE via the normal process. I am not personally in charge of such decisions.
the original […] file parameter should be removed from Jenkins [core]
Well, I do not think we can remove it. In principle it could be detached to a (deprecated) plugin. Certainly we could consider adding help text and the like in core suggesting use of https://plugins.jenkins.io/file-parameters/ in its place, once it has gotten some more serious usage to shake out design problems.
jglick this is a cool plugin, thank you!
How is the intended handling of non-mandatory large files used with stashedFile in declarative?
- If no file is provided and the build is triggered manually, unstashing works, but the file is 0 bytes. For stages depending on the file, I'm using the following (unfortunately platform-dependent) when-condition: when { expression { sh(returnStatus: true, script: '[ -s FPGA.zip ]') == 0 } }
- If no file is provided and the build is triggered by the SCM, unstashing fails because the whole parameter seems to be missing, so I need to wrap the unstash in a catchError.
I think that's quite cumbersome error handling for what is basically the same case (no file provided).
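One way to fold both cases into a single pattern might be the following sketch (the parameter/stash name FPGA.zip is taken from the comment above, catchError comes from Pipeline: Basic Steps, and the shell test assumes a Unix agent):
stage('unstash optional file') {
    steps {
        // On SCM-triggered builds the parameter (and thus the stash) may be missing entirely
        catchError(buildResult: 'SUCCESS', stageResult: 'SUCCESS') {
            unstash 'FPGA.zip'
        }
    }
}
stage('use file') {
    // On manual builds with no upload, the unstashed file exists but is 0 bytes
    when { expression { sh(returnStatus: true, script: '[ -s FPGA.zip ]') == 0 } }
    steps {
        sh 'unzip -o FPGA.zip'
    }
}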
reneb for any bugs or RFEs please use GitHub Issues, and be sure to provide a complete, self-contained, reproducible test case.
I am not sure I understand the issue's description but we do encounter a problem with workflow and file parameters.
We observe that the file parameter is not taken into account at all in the workflow build, which basically makes the file parameter completely useless for such a job.
The same setup works for a standard job.
Are we doing something wrong, or is it a bug in the workflow plugin, as this current issue seems to indicate?