Type: Bug
Resolution: Unresolved
Priority: Major
Labels: None
Environment: Jenkins ver. 2.53, Build Pipeline Plugin 1.5.6
Component: Pipeline
Steps to reproduce
- I created a Pipeline job
- During creation I checked the "This project is parameterized" checkbox and added two Choice parameters
- I ran the job and it failed
- I checked the configuration of the job: the parameters are no longer there and the "This project is parameterized" checkbox is no longer checked. (A minimal Jenkinsfile that reproduces this kind of wipe is sketched below.)
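For reference, a minimal sketch of a Jenkinsfile that can reproduce this kind of wipe, based on the behavior described in the comments below (this is an illustrative script, not the reporter's exact one):

// Any pipeline run that (re)declares job properties overwrites what was
// configured in the job UI. An empty properties() call is the extreme case:
// once the job runs, the UI-defined parameters are removed.
properties([])

node {
    stage('Build') {
        // after the first run, the Choice parameters added in the UI are gone
        echo "params: ${params}"
    }
}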
[JENKINS-43758] Parameters disappear from pipeline job after running the job
Another super annoyed user here (sorry to say that, but that's just the truth).
We are setting up most of our jobs via JCasC (which wraps JobDSL) and every single time we execute our JCasC yaml files, all properties that are defined by the respective pipeline scripts are lost: parameters, triggers, sidebar links etc.
Losing parameters of jobs that are triggered not by human project members but by other systems/scripts (e.g. Pull Request Notifier for Bitbucket Server) is especially painful.
Jobs that are frequently triggered by human project members will sooner or later get their parameters back, because someone will eventually just click "Build Now", but jobs triggered from outside will simply never run again (rejected because of "unknown" parameters?).
Every single time we execute our JCasC scripts we have to go through a list of jobs and "fix" them by clicking "Build Now". Yes, we could write a script for that, but some jobs don't have parameters.
Instead they need to have their SCM polling re-initialized, and since some of those jobs run for many hours, we need to abort them right away. Writing a script for all those cases feels like investing too much time on the wrong end of the problem.
I am willing to contribute a fix but where to start? What is the right approach? Should we start with an opt-in to preserve (instead of wipe) parameters, triggers etc.?
Like others have mentioned here, it would be very useful if we could append to existing properties inside a pipeline script.
We should be able to run `properties` more than once with an append option.
Should I open a separate issue specifically to track this request?
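In the meantime, here is a rough sketch of how such an "append" could be emulated today in a Scripted Pipeline. This is an assumption-laden workaround, not an existing API: the EXTRA parameter is just an example, and the reflective access may need script-approval signatures when running in the sandbox.

import hudson.model.ParametersDefinitionProperty
import hudson.model.StringParameterDefinition

// read the parameter definitions currently attached to the job
def job = currentBuild.rawBuild.parent
def existing = job.getProperty(ParametersDefinitionProperty)?.parameterDefinitions ?: []

// append one more definition and re-submit the whole list, so the single
// properties() call below does not wipe what was already there
def merged = existing + new StringParameterDefinition('EXTRA', '', 'appended parameter')
properties([parameters(merged)])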
We struggled with this problem too. For now, we wrote a little script to start jobs and put it in the /var/jenkins_home/init.groovy.d folder, which contains Groovy scripts that are executed after the Jenkins Docker instance starts. After that, we added a stage to the pipeline script of each relevant job to stop it after Jenkins restarts. I know this is hacky, but at least it works for now until this issue is resolved.
Additionally, using the queue function of jobDsl didn't always work, so we gave up on it. It seems there is a race condition.
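For reference, a minimal sketch of such an init.groovy.d script, with hypothetical job names standing in for the real list of affected jobs:

// /var/jenkins_home/init.groovy.d/trigger-jobs.groovy
import jenkins.model.Jenkins
import org.jenkinsci.plugins.workflow.job.WorkflowJob

// schedule one build of each affected pipeline job after startup,
// so the pipeline script re-registers its parameters and triggers
['deploy-app', 'nightly-tests'].each { name ->  // hypothetical job names
    def job = Jenkins.instance.getItemByFullName(name, WorkflowJob)
    job?.scheduleBuild2(0)  // quiet period of 0 seconds
}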
famod, I'm facing exactly the same issue. May I know if you have found any solution?
Hey team, is there a fix or workaround available for this?
I have created a workaround based on the approach taken from this post:
https://issues.jenkins.io/browse/JENKINS-44681?focusedCommentId=304082&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-304082
The script saves the existing properties of a pipeline job, if there are any, and then recreates them in the Job DSL block:
import jenkins.model.Jenkins
import hudson.model.Item
import hudson.model.Items

// capture the current properties of the job before the seed run recreates it
def jobProperties
Item currentJob = Jenkins.instance.getItemByFullName('_test')
if (currentJob) {
    jobProperties = currentJob.@properties
}

pipelineJob('_test') {
    displayName('_test')
    description('Test job')
    disabled(false)
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('***')
                        credentials('***')
                    }
                    branch('master')
                    extensions {
                    }
                }
            }
            scriptPath('***')
        }
    }
    // re-attach the captured properties by serializing each one back
    // into the job's config XML
    if (jobProperties) {
        configure { root ->
            def properties = root / 'properties'
            jobProperties.each { property ->
                String xml = Items.XSTREAM2.toXML(property)
                def jobPropertiesPropertyNode = new XmlParser().parseText(xml)
                properties << jobPropertiesPropertyNode
            }
        }
    }
}
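Note that the currentJob.@properties direct field access bypasses the getter, so if the seed job runs inside the script security sandbox, an administrator will likely have to approve that signature; running the Job DSL seed job with the sandbox disabled avoids this.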
Ok, this is really major for us, so I ran a bunch of tests to try to reproduce the issue.
All tests are based on a declarative pipeline script coming from SCM (git) and containing "options { skipDefaultCheckout() }".
1. Re-adding the parameters manually solves the issue
If:
- I create a new pipeline with "Copy from"
- I trigger it manually; the job deletes the parameters I configured in the job UI
- Then I restore my configuration via the "Job Config History" plugin
- I trigger it again; the parameters disappear again
- over and over and over...
However, as soon as I re-enter the parameters manually (not via the config history), all is good: the parameters do not disappear anymore.
2. The issue appears when using "Copy from"
If I create a new pipeline with "Copy from", I have the issue.
If I create a completely new pipeline with exactly the same settings, including the same pipeline script coming from the same git repository at the same tag, there are no issues...
So I guess this comes from merge conflicts in the job configuration.
It seems random because it depends on the last change date.
The workarounds here re-add the properties with the latest change date.
Does that make any sense?
I'm facing the same issue. I tried tsurankov's approach but it didn't work for me.
This is how my job(s) looks like:
import javaposse.jobdsl.dsl.DslFactory

def repositories = [
    [
        id         : 'jenkins-test',
        name       : 'jenkins-test',
        displayName: 'Jenkins Test',
        repo       : 'ssh://<JENKINS_BASE_URL>/<PROJECT_SLUG>/jenkins-test.git'
    ]
]

DslFactory dslFactory = this as DslFactory

repositories.each { repository ->
    pipelineJob(repository.name) {
        parameters {
            stringParam("BRANCH", "master", "")
        }
        logRotator {
            numToKeep(30)
        }
        authenticationToken('<TOKEN_MATCHES_WITH_THE_BITBUCKET_POST_RECEIVE_HOOK>')
        displayName(repository.displayName)
        description("Builds deploy pipelines for ${repository.displayName}")
        definition {
            cpsScm {
                scm {
                    git {
                        branch('${BRANCH}')
                        remote {
                            url(repository.repo)
                            credentials('<CREDENTIAL_NAME>')
                        }
                        extensions {
                            localBranch('${BRANCH}')
                            wipeOutWorkspace()
                            cloneOptions {
                                noTags(false)
                            }
                        }
                    }
                }
                scriptPath('Jenkinsfile')
            }
        }
    }
}
This works as expected the first time. But after a build is triggered, the parameters disappear.
I've tried a lot of things but nothing worked so far. Would appreciate any help on this. Thanks.
I generate the Pipeline job with the DSL, and the Pipeline job has parameters and properties defined. The Pipeline script does not define parameters. In my case it includes a Groovy library which does reference parameters and environment (properties) from the job:
String email = pipeline.params.email.trim()
// and
pipeline.archiveArtifacts artifacts: pipeline.env.STAGE_NAME + '/*.out'
If the library fails with an error, then the parameters are still defined in the Pipeline job definition.
But once the pipeline starts running, the parameters and properties on the pipeline job definition are gone.
They are there for the first run, but the job definition has already been wiped of parameters and properties.
I tried switching to properties([ parameters([]) ]) and that didn't work at first.
Then at the bottom of my pipeline I saw some additional "properties([])" lines in later stages. I removed those, and the params seem to be staying in the UI.
At this point I am reluctant to touch it to try the more standard "parameters { }" definition.
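For anyone hitting the same thing, a minimal sketch of the shape that ended up working for me, with example parameter names: a single properties() call at the very top of the Jenkinsfile and no further properties() calls anywhere below it.

// declare all parameters once, at the top of the Jenkinsfile
properties([
    parameters([
        string(name: 'BRANCH', defaultValue: 'master', description: 'branch to build'),
        booleanParam(name: 'DRY_RUN', defaultValue: false, description: 'skip the deploy stage')
    ])
])

node {
    stage('Build') {
        echo "building ${params.BRANCH}, dry run: ${params.DRY_RUN}"
    }
    // later stages must not contain another properties([]) call,
    // or the parameters above get wiped again on the next run
}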
So if you're not using Job DSL, please open a separate JIRA. If you're using the properties step in Scripted Pipeline or the parameters directive in Declarative, those do try to preserve job properties and build parameters defined outside of the pipeline, but Job DSL is still going to wipe out whatever is in the properties and parameters when it runs its seed job. Also, don't ever call the properties step more than once in a pipeline - you're going to run into a bunch of potential pitfalls there.
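For comparison, a minimal sketch of the Declarative form the previous comment refers to: a single parameters directive in the Jenkinsfile (the parameter name is an example).

pipeline {
    agent any
    parameters {
        string(name: 'BRANCH', defaultValue: 'master', description: 'branch to build')
    }
    stages {
        stage('Build') {
            steps {
                echo "building ${params.BRANCH}"
            }
        }
    }
}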