JENKINS-27295 discusses getting values from various kinds of parameters. Handling FileParameterValue is another matter. buildEnvironment (what is called today) is useless since it gives only a file name. createBuildWrapper is the way this works in a freestyle project, but this cannot work in a workflow; even if it were to return a SimpleBuildWrapper (JENKINS-24673) it is not clear where that would be called, since we can only use it from a workspace. getValue as currently implemented is useless since a FileItem does not have whitelisted methods, and anyway we would not want the flow itself to be copying streams to the workspace; this needs to be done by infrastructure. The only way forward I can see at the moment is for getValue to return a SimpleBuildWrapper, so that your flow could read

      node {
        wrap([$delegate: parameters.myFileParam]) {
          sh 'cat myFileParam'
        }
      }
      

          [JENKINS-27413] Handle file parameters

          Yann Rouillard added a comment -

          I am not sure I understand the issue's description, but we do encounter a problem with workflow and file parameters.

          We observe that the file parameter is not taken into account at all in the workflow:

          • there is no environment variable defined at all concerning the parameter,
          • the file uploaded is not copied into the workflow workspace.

          Which basically makes the file parameter completely useless for a job.
          The same setup works for a standard job.

          Are we doing something wrong, or is it a bug in the workflow plugin as this current issue seems to indicate?

          Jesse Glick added a comment -

          Also need to check how we could send file parameters from a Workflow build, using the build step. Presumably you would want the actual file to come from the contextual workspace.


          Johan Wärlander added a comment -

          Is there any way to work around this issue while keeping everything in a pipeline job?

          I have a process where we run a job to pre-process some data, then save the result as an artifact; we then use the Promoted Builds plugin to indicate that the result was OK (after a manual review), and to trigger a second job to load that data using an ETL tool. The second job, then, has to use the Copy Artifact plugin.

          While this works, it feels less direct than I'd like it to be; also, the user has to go to the second job to see the actual result of the ETL load.

          Jon Hodgson added a comment -

          I've just run into this issue; in fact I've run into it at an earlier stage, in that the file just doesn't seem to be uploaded at all (you can't view it in the "parameters" page of the build: the file name and link are there, but clicking it gives an error rather than the file contents).

          This is a major issue, because I need to set up try-before-commit functionality.


          Oren Chapo added a comment -

          I'm in the same boat as Jon Hodgson: trying to set up try-before-commit. All my jobs are pipelines, and most of them are pretty complex, so I can't go back to freestyle jobs... I'm stuck with missing file upload functionality.
          For the sake of documentation, if someone knows of a workaround please post it here.


          Jon Hodgson added a comment -

          I bent over backwards trying to get it to work with string parameters, but between Python (which I use for my launch script), Groovy and HTTP trying to be helpful, something always got auto-converted (early tests would seem great, then I would try a real-world diff and there would be something in there that broke it, such as accented characters in files that had different formats).

          In the end I went with a simple FTP server on the same machine as Jenkins, uploading a file whose name I pass to the job as a string parameter.

          It's not the most neat and tidy solution, but it works at least... well so far.


          Johan Wärlander added a comment -

          FYI, if you only need to deal with manual input: the file parameter can be caught in an 'input' step, and then passed on:

          stage('file input') {
            node {
              // Get file using input step, will put it in build directory
              def inputFile = input message: 'Upload file', parameters: [file(name: 'data.txt')]
              // Read contents and write to workspace
              writeFile(file: 'data.txt', text: inputFile.readToString())
              // Stash it for use in a different part of the pipeline
              stash name: 'data', includes: 'data.txt'
            }
          }

          stage('do something with data') {
            node {
              // Unstash the file into an 'input' directory in the workspace
              dir('input') {
                unstash 'data'
              }
              // do something useful
              sh "ls -lR input/"
            }
          }

          Normally you would have the 'input' step outside of a node, but since we want to put our file in a workspace and stash (and/or archive) it, we'll need the node here.

          Geoffrey Arthaud added a comment - - edited

          Thank you Johan Wärlander for this solution, which works well with text files. But I didn't find any simple solution for binary files; the encoding parameter of writeFile is ignored (https://issues.jenkins-ci.org/browse/JENKINS-27094).

          Here is one solution I've found, using the FilePath API directly:

          stage('file input') {
            node('master') {
              def inputFile = input message: 'Upload file', parameters: [file(name: 'data.zip')]
              unzip dir: '', glob: '', zipFile: inputFile.remote
            }
          }
          

          But there are two drawbacks:

          • This works only on the master node (I didn't manage to access the FilePath channel of the workspace from the Pipeline plugin)
          • The getRemote() method from FilePath is not whitelisted

          Using FilePath.copyTo() or FilePath.unzip() instead of unzip has similar drawbacks.

          Any suggestions?


          Johan Wärlander added a comment -

          arthaud, does Base64 encoding change anything in that scenario? E.g. something like:

          writeFile(file: 'data.zip.b64', text: inputFile.read().getBytes().encodeBase64().toString())

          ...and then stash it, and do the decoding/unzipping in a separate node section?

          I don't know if it makes a difference w/ master vs slave node execution, as we only have a master node so far in our setup.
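          The decode side of that suggestion can be sketched as follows. This is an untested sketch: it assumes a stash named 'data-b64' containing data.zip.b64 was created in an earlier stage, that the workflow-basic-steps version in use honors the Base64 encoding of writeFile (unlike the older versions affected by JENKINS-27094), and that the Pipeline Utility Steps plugin is installed for unzip:

          stage('decode data') {
            node {
              // Unstash the Base64 text produced in the earlier stage
              unstash 'data-b64'
              // writeFile with encoding 'Base64' decodes the text back into a binary file
              writeFile(file: 'data.zip', encoding: 'Base64', text: readFile('data.zip.b64'))
              // unzip comes from the Pipeline Utility Steps plugin
              unzip zipFile: 'data.zip'
            }
          }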

          Jon Hodgson added a comment -

          I personally gave up trying to pass the file as a parameter; as a text file it always got messed up somewhere along the line (it's a diff, so although it's text, the encoding can vary depending on the source files).

          So I switched to creating a zip with a URI as the filename, uploading it with FTP, and passing the URI as a parameter.

          It's perhaps not as elegant as doing it all in Jenkins, but it works.


          Geoffrey Arthaud added a comment -

          Base64 encoding/decoding should do the trick on any master or slave node. Thank you!

          Jan Vrany added a comment -

          I bit the bullet and created a simple (in the end) pipeline library to work around this issue. The library provides a new "step" - unstashParam - that saves the file build parameter into the workspace. Not (yet) thoroughly tested, but it seems to work fine with text files at least.

          You may find it here: https://bitbucket.org/janvrany/jenkins-27413-workaround-library

          Hope you would find it useful.


          Zuzik Zuzikovitch added a comment - - edited

          janvrany thanks a lot for `unstashParam`, tested and it works just fine.


          Cyril Guillon added a comment -

          "UnstashParam" does not work when node is "master". I did not succeed in getting workspace object in that case.


          Jan Vrany added a comment -

          zendev: Just for the record, more on https://bitbucket.org/janvrany/jenkins-27413-workaround-library/issues/2/does-not-working-on-master-node

          Michal Zatloukal added a comment -

          Any news on this?

          For the record, I noticed that the file provided through the parameter is only missing when using Pipeline from SCM - when using Pipeline script, the file is present and this trivial pipeline completes successfully:

          parameters {
            file description: 'Blah', name: 'custom.xml'
          }
          pipeline {
            agent any
            stages {
              stage ("cat-file") {
                  steps {
                      powershell 'Get-Content .\\custom.xml'
                  }
              }
            }
          }

          Chris Snyder added a comment -

          Took a while to find out that this was an issue. Plus one for getting this fixed in an upcoming release. 

          UnstashParam worked for me. Thanks.


          Emmanuel Goh added a comment - - edited

          Can confirm that this is still broken - my minimal Jenkinsfile below yields "echo null":
          pipeline {
            agent any
            parameters {
              file(name: 'ZIP')
            }
            stages {
              stage('Test file') {
                steps { 
                  sh "echo ${params.ZIP}" 
                }
              }
            }
          }
          


          Bauyrzhan Makhambetov added a comment -

          Works in a freestyle project, but in a pipeline it is not working.

          Tried with `ls -l .` with no success, as if there is no file at all.

          Jesse Glick added a comment -

          File parameters are not currently supported. I am not particularly inclined to fix this per se since support for them is quite complicated architecturally—the implementation for freestyle projects does not generalize naturally and relies on a special file upload web request handler. Simpler, safer, and more flexible would be to introduce a plugin defining a Base64-encoded string parameter type, with a form submission GUI and CLI that make it convenient to obtain the value from a local file, as well as a SimpleBuildWrapper (usable as a Pipeline block-scoped step) that decodes the value to a local temporary file within its body.
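          Pending such a plugin, the proposal can be approximated today with an ordinary string parameter. A hypothetical sketch (the parameter name FILE_B64 is invented here; the user would paste the output of `base64 < myfile` when triggering the build, and the workflow-basic-steps version in use must honor the encoding parameter of writeFile):

          pipeline {
            agent any
            parameters {
              // Hypothetical: the file content is supplied as a Base64 string
              string(name: 'FILE_B64', defaultValue: '', description: 'Base64-encoded file content')
            }
            stages {
              stage('materialize file') {
                steps {
                  // Decode the parameter into a binary file in the workspace
                  writeFile(file: 'upload.bin', encoding: 'Base64', text: params.FILE_B64)
                }
              }
            }
          }

          This only suits small files, since the value travels as a form field and an in-memory string.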


          Bauyrzhan Makhambetov added a comment -

          Thank you for pointing that out.

          Ivan Fernandez Calvo added a comment - - edited

          My five cents: this code shows how to load a properties file into the environment. In the example the properties come from a String, but you can also load a file from disk with the readProperties Pipeline Utility step:

          node {
             def configProperties = """
             VAR01 = value1
             VAR02 = value2
             """
             def config = readProperties(text: configProperties)
             config.each { k, v ->
                 env."${k}" = v
             }
             sh 'export'
          }
          


          Alex Haynes added a comment -

          If there is no plan to support the file parameter option, it would be nice to remove it from the documentation here: https://jenkins.io/doc/book/pipeline/syntax/#parameters


          Jesse Glick added a comment -

          deseao good point! You can help.


          James Hogarth added a comment - - edited

          Please avoid actively breaking it though, as we do actually use this functionality with the help of our global library.

          def inputGetFile(String savedfile = null) {
              def filedata = null
              def filename = null
              // Get file using input step, will put it in build directory
              // the filename will not be included in the upload data, so optionally allow it to be specified
              if (savedfile == null) {
                  def inputFile = input message: 'Upload file', parameters: [file(name: 'library_data_upload'), string(name: 'filename', defaultValue: 'uploaded-file-data')]
                  filedata = inputFile['library_data_upload']
                  filename = inputFile['filename']
              } else {
                  def inputFile = input message: 'Upload file', parameters: [file(name: 'library_data_upload')]
                  filedata = inputFile
                  filename = savedfile
              }
              // Read contents and write to workspace
              writeFile(file: filename, encoding: 'Base64', text: filedata.read().getBytes().encodeBase64().toString())
              // Remove the file from the master to avoid stuff like secret leakage
              filedata.delete()
              return filename
          }

          And this library code is used with this jenkinsfile snippet:

          stage('request a file with chosen filename') {
              node {
                  deleteDir()
                  uploaded_file = library.inputGetFile('my-file-here')
                  sh "file '${uploaded_file}'"
                  sh "ls -la ."
              }
          }

          Jesse Glick added a comment -

          Another plugin idea, useful for uploads too large to be reasonably handled as Base64 and environment variables: a parameter type which lets you upload a file to an S3 (or MinIO) bucket. Could either upload from the master upon form/CLI submission; or, more efficiently, provide a special UI & API endpoint to allocate a blob and then hand you a presigned URL good for upload only, valid for an hour. The parameter value would be a presigned URL good for download only, again valid only for a day or so. Similarly for Azure etc.

          (No need to delete the blob in a RunListener after the build completes. That would break Replay / Rebuild and, as with artifact-manager-s3, would force the master to have a more dangerous IAM permission. Anyway S3 can be configured to automatically expunge or archive old blobs.)
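          Purely as an illustration (no such plugin exists yet), consuming a presigned-URL parameter from a Pipeline could look like this, with FILE_URL being a hypothetical parameter name:

          pipeline {
            agent any
            parameters {
              // Hypothetical: the parameter value would be a presigned download URL
              string(name: 'FILE_URL', defaultValue: '', description: 'Presigned S3 download URL')
            }
            stages {
              stage('fetch upload') {
                steps {
                  // Declarative exposes parameters as environment variables
                  sh 'curl -fSL -o upload.bin "$FILE_URL"'
                }
              }
            }
          }

          This keeps the large payload out of Jenkins entirely; only the URL is stored with the build.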


          Chris Frolik added a comment - - edited

          "I am not particularly inclined to fix this per se"

          That is very disappointing, and a stumbling block for those trying to convert their freestyle jobs to declarative pipeline jobs. I really hope you change your mind on this.

          At the very least, the documentation should mention that it isn't supported.


          Mark Waite added a comment -

          Removed from documentation on jenkins.io July 25, 2019 by PR-2388


          Jan Vrany added a comment -

          Just a notice:
          I have moved my workaround library with the `unstashParam` step to GitHub; the new location is

          https://github.com/janvrany/jenkinsci-unstashParam-library

          I also renamed it, as this issue seems unlikely to ever be "fixed".


          Vin Win added a comment -

          Hi hogarthj - Does the solution work for binary/zip files? I tried using the Groovy script & the respective one in a Jenkins pipeline, but it doesn't seem to work for zip ("Can not open file 'fileName.zip' as archive") or binary files uploaded. Would like to know if this is being considered for future releases?


          Paul Sharpe added a comment -

          vinwin: The unstashParam workaround works great for us, including zip files. We are using a windows instance of Jenkins (in case that is significant).


          Andreas Schmid added a comment -

          I don't quite understand if this is just not implemented yet, or if it's impossible to implement at all. If it is possible, we might be able to fund the development of this feature, as we need it too. As I am still rather new to the Jenkins community: can anyone recommend me any companies that could do that?

          Jesse Glick added a comment -

          schmandr as to technical design, see my comments of 2018-10-24 and 2019-06-21. I am interpreting the requirement more broadly than the original statement: a user or script should be able to trigger a Pipeline job with a build parameter that includes the contents of a (possibly large, possibly binary, but not secret) file, using any common mechanism (GUI, HTTP POST, build-job CLI command, build step), in such a way that some convenient (Scripted and/or Declarative) syntax may be used to retrieve that file in a designated workspace location at a designated time, including in future rebuilds.


          Jesse Glick added a comment -

          Prototyping in: https://github.com/jglick/alt-file-parameter-plugin

          Liam Newman added a comment -

          jglick Still prototyping? Now in https://github.com/jenkinsci/file-parameters-plugin

          Jesse Glick added a comment -

          bitwiseman the basics work fine. I (or somebody) need to do a bit more testing, and work on design of usage from the input and build steps (or a decision that integration with these steps will not be supported). On my back burner.


          Jesse Glick added a comment -

          I have released an initial version of the File Parameter plugin. I think it covers the basic use cases for Pipeline users. Use its issue tracker as needed.
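          Per the plugin's documentation, the two new parameter types can be used roughly as below (check the plugin README for the current syntax; base64File suits small files, stashedFile larger ones):

          pipeline {
            agent any
            parameters {
              base64File 'SMALL_FILE'
              stashedFile 'LARGE_FILE'
            }
            stages {
              stage('use uploaded files') {
                steps {
                  // withFileParameter materializes the Base64 value as a temporary file,
                  // whose path is exposed in the $SMALL_FILE environment variable
                  withFileParameter('SMALL_FILE') {
                    sh 'cat "$SMALL_FILE"'
                  }
                  // stashedFile parameters are retrieved like an ordinary stash
                  unstash 'LARGE_FILE'
                  sh 'ls -l LARGE_FILE'
                }
              }
            }
          }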


          David Navrkal added a comment - - edited

          Hello jglick, first of all, thanks for implementing the new plugin! Is there any chance your File Parameter plugin will get promoted to the CloudBees Update Center of verified and trusted open-source plugins, since it is currently the only working workaround for the broken file parameter?

          As well, I guess if this issue is closed with the resolution that people should use the new File Parameter plugin, then the original (for years broken and still not working) file parameter should be removed from Jenkins, to prevent confusing people and to stop them wasting time on non-existent workarounds...


          Jesse Glick added a comment -

          Is there any chance your File Parameter plugin will get promoted to the CloudBees Update Center

          If you are a CloudBees customer, open an RFE via the normal process. I am not personally in charge of such decisions.

          the original […] file parameter should be removed from Jenkins [core]

          Well, I do not think we can remove it. In principle it could be detached to a (deprecated) plugin. Certainly we could consider adding help text and the like in core suggesting use of https://plugins.jenkins.io/file-parameters/ in its place, once it has gotten some more serious usage to shake out design problems.


          Rene Buergel added a comment - - edited

          jglick this is a cool plugin, thank you!

          What is the intended handling of non-mandatory large files used with stashedFile in Declarative?

          • If no file is provided and the build is triggered manually, unstashing works, but the file is 0 Bytes.
            For stages depending on the file, I'm using the following (unfortunately platform-dependent) when-condition:
            when { expression { sh(returnStatus:true, script: '[ -s FPGA.zip ]' ) == 0 } }
            
          • If no file is provided and the build is triggered by the SCM, unstashing fails because the whole parameter seems to be missing, so I need to wrap the unstash into a catchError

          I think that's quite cumbersome error handling for what is basically the same case (no file provided).


          Jesse Glick added a comment -

          reneb for any bugs or RFEs please use GitHub Issues, and be sure to provide a complete, self-contained, reproducible test case.

