  Jenkins / JENKINS-43758

Parameters disappear from pipeline job after running the job


    Details

    • Type: Bug
    • Status: Reopened
    • Priority: Major
    • Resolution: Unresolved
    • Component/s: job-dsl-plugin
    • Labels:
      None
    • Environment:
      Jenkins ver. 2.53
      Build Pipeline Plugin 1.5.6
      Pipeline

      Description

      Steps to reproduce

      1. I created a Pipeline job.
      2. During creation I checked the "This project is parameterized" checkbox and added two Choice parameters.
      3. I ran the job and it failed.
      4. I checked the job configuration: the parameters are no longer there and the "This project is parameterized" checkbox is no longer checked.
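
      For reference, roughly the same setup expressed as a Job DSL script (a minimal sketch: the job in this report was configured through the UI, and the job name, parameter names and choices below are placeholders):

      pipelineJob('example-parameterized-job') {
        // Two Choice parameters, as in the report; names and values are made up
        parameters {
          choiceParam('ENVIRONMENT', ['dev', 'staging', 'prod'], 'Target environment')
          choiceParam('REGION', ['eu-west-1', 'us-east-1'], 'Deployment region')
        }
        definition {
          cps {
            // Inline pipeline script; in the report, the first run failed and the parameters were gone afterwards
            script('''
              pipeline {
                agent any
                stages {
                  stage('Build') {
                    steps {
                      echo "ENVIRONMENT=${params.ENVIRONMENT} REGION=${params.REGION}"
                    }
                  }
                }
              }
            '''.stripIndent())
            sandbox()
          }
        }
      }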

            Activity

            tsurankov added a comment (edited)

            I have created a workaround based on the approach taken from this post:

            https://issues.jenkins.io/browse/JENKINS-44681?focusedCommentId=304082&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-304082
            The script saves the existing properties of the pipeline job, if there are any, and then recreates them in the Job DSL block.

            import jenkins.model.Jenkins
            import hudson.model.Item
            import hudson.model.Items

            // Capture the properties (including parameter definitions) of the existing
            // job, if a previous seed run has already created it.
            def jobProperties
            Item currentJob = Jenkins.instance.getItemByFullName('_test')
            if (currentJob) {
              jobProperties = currentJob.@properties
            }

            pipelineJob('_test') {
              displayName('_test')
              description('Test job')
              disabled(false)
              definition {
                cpsScm {
                  scm {
                    git {
                      remote {
                        url('***')
                        credentials('***')
                      }
                      branch('master')
                      extensions { }
                    }
                  }
                  scriptPath('***')
                }
              }
              // Re-inject the saved properties as raw XML so they survive the seed run.
              if (jobProperties) {
                configure { root ->
                  def properties = root / 'properties'
                  jobProperties.each { property ->
                    String xml = Items.XSTREAM2.toXML(property)
                    def jobPropertiesPropertyNode = new XmlParser().parseText(xml)
                    properties << jobPropertiesPropertyNode
                  }
                }
              }
            }
            
            Edouard Lavaud added a comment (edited)

            Ok, this is really major for us, so I ran a number of tests to try to reproduce the issue.

            All tests are based on a declarative pipeline script, coming from SCM (git), containing "options { skipDefaultCheckout() }" (see the sketch at the end of this comment).

             

            1. Re-adding the parameters manually solves the issue

            If:

            • I create a new pipeline with "COPY FROM"
            • I trigger it manually; the job deletes the parameters configured in the job UI
            • Then I restore my configuration via the Job Config History plugin
            • I trigger it again; the parameters disappear again
            • over and over and over ...

            However, as soon as I re-enter the parameters manually (not via the config history), everything is fine: the parameters do not disappear anymore.

            2. Issue when "COPY FROM"

            If I create a new pipeline with "COPY FROM", I have the issue.

            If I create a completely new pipeline with exactly the same settings, including the same pipeline script from the same git repository and the same tag, there are no issues...

             

            So I guess this comes from merge conflicts in the job configuration.

            It seems random because it depends on the last change date.

            The workarounds here re-add the properties with the latest change date.

             

            Does that make any sense? 
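
            For reference, a minimal sketch of the kind of Jenkinsfile described at the top of this comment (the stage and its steps are placeholders, not taken from the actual setup):

            pipeline {
              agent any
              options {
                // Disable the implicit SCM checkout that declarative pipelines normally perform
                skipDefaultCheckout()
              }
              stages {
                stage('Checkout') {
                  steps {
                    // Check out explicitly, since the default checkout is skipped
                    checkout scm
                  }
                }
              }
            }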

             

             

            Varun added a comment (edited)

            I'm facing the same issue. I tried tsurankov's approach, but it didn't work for me.

            This is what my job(s) look like:

            import javaposse.jobdsl.dsl.DslFactory
            
            def repositories = [
                    [
                            id         : 'jenkins-test',
                            name       : 'jenkins-test',
                            displayName: 'Jenkins Test',
                            repo       : 'ssh://<JENKINS_BASE_URL>/<PROJECT_SLUG>/jenkins-test.git'
                    ]
            ]
            
            DslFactory dslFactory = this as DslFactory
            
            repositories.each { repository ->
                pipelineJob(repository.name) {
                    parameters {
                        stringParam("BRANCH", "master", "")
                    }
                    logRotator{
                        numToKeep(30)
                    }
                    authenticationToken('<TOKEN_MATCHES_WITH_THE_BITBUCKET_POST_RECEIVE_HOOK>')
                    displayName(repository.displayName)
                    description("Builds deploy pipelines for ${repository.displayName}")
                    definition {
                        cpsScm {
                            scm {
                                git {
                                    branch('${BRANCH}')
                                    remote {
                                        url(repository.repo)
                                        credentials('<CREDENTIAL_NAME>')
                                    }
                                    extensions {
                                        localBranch('${BRANCH}')
                                        wipeOutWorkspace()
                                        cloneOptions {
                                            noTags(false)
                                        }
                                    }
                                }
                            }
                            scriptPath('Jenkinsfile')
                        }
                    }
            
                }
            }
            

            This works as expected the first time, but after a build is triggered, the parameters disappear.

             

            I've tried a lot of things but nothing worked so far. Would appreciate any help on this. Thanks.
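
            One thing that may be worth checking (a sketch only, not a confirmed fix): when the Jenkinsfile referenced by scriptPath is declarative, each run re-applies the parameters declared in the Jenkinsfile and can drop parameters that exist only in the job configuration, so declaring the same BRANCH parameter in the Jenkinsfile keeps it in place. Assuming a declarative Jenkinsfile:

            pipeline {
              agent any
              parameters {
                // Same parameter as in the Job DSL seed script above
                string(name: 'BRANCH', defaultValue: 'master', description: '')
              }
              stages {
                stage('Build') {
                  steps {
                    echo "Building branch ${params.BRANCH}"
                  }
                }
              }
            }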

            Kevin Mayfield added a comment

            I generate the pipeline job with Job DSL, and the job has parameters and properties defined. The Pipeline script does not define parameters. In my case it includes a Groovy library which references parameters and environment (properties) from the job:

            String email = pipeline.params.email.trim()
            // and
            pipeline.archiveArtifacts artifacts: pipeline.env.STAGE_NAME + '/*.out, '
            

            If the library fails with an error, then the parameters are still defined in the pipeline job definition.
            But once the pipeline starts running, the parameters and properties on the pipeline job definition are gone.

            They are there for the first run, but the job definition has already been wiped of parameters and properties.
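
            For a scripted pipeline (or a shared library driving one), the corresponding pattern is to (re)declare the parameters through the properties step at the top of the run, so the build defines them instead of wiping them. A minimal sketch, assuming an 'email' string parameter as in the snippet above:

            // Scripted pipeline; the parameter name matches the library snippet above
            properties([
              parameters([
                string(name: 'email', defaultValue: '', description: 'Notification address')
              ])
            ])

            node {
              String email = params.email.trim()
              echo "Will notify: ${email}"
            }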

            Kevin Mayfield added a comment

            I previously included some Groovy class code in the Pipeline annotated with @NonCPS.
            After removing it, I think my pipeline job parameters are staying defined.
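
            For context, @NonCPS marks helper methods that run outside the Pipeline's CPS transform and must not call pipeline steps; a generic illustration (not the code from this job):

            @NonCPS
            List<String> parseLines(String text) {
              // Plain Groovy only in here: no sh/echo or other pipeline steps
              return text.readLines().collect { it.trim() }
            }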


              People

              Assignee:
              Unassigned
              Reporter:
              Maciej Gawinecki (dzieciou)
              Votes:
              31
              Watchers:
              42
