[JENKINS-63384] load step fails to bind "complex" @Field defined variables

Type: Task
Resolution: Unresolved
Priority: Critical
For variables annotated with @Field that are initialized with more complex values, the load step fails to bind the variable into the Script class.
For example:
// a.groovy
import groovy.transform.Field

@Field def LOADED_BUILD_NUMBER = "${env.BUILD_NUMBER}"

return this

// Jenkinsfile
node() {
    def a = load('a.groovy')
    echo("${env.BUILD_NUMBER}")
    echo("${a.LOADED_BUILD_NUMBER}")
}
This example will fail. However, if you replace "${env.BUILD_NUMBER}" with a simple value such as `3`, the load step will succeed.
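For comparison, a minimal sketch of the passing variant described above, where the @Field initializer is a plain literal rather than a GString:

// a.groovy - this variant loads and binds successfully
import groovy.transform.Field

@Field def LOADED_BUILD_NUMBER = 3

return this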
This seems to be related to the security update in workflow-cps v2.64 and the subsequent regression fix for @Field in v2.71.
The cumbersome part is the `evaluate` step. State can't traverse from the caller's context into the DSL being `evaluate`d. No binding is allowed to be passed in, and no custom `GroovyShell` or similar is allowed by CPS. That is why a singleton/`@Field` was needed in the first place. Since those do not work, it is not possible to save state for custom project factories and custom pipeline generators. The only case where state can be saved at the moment is primitive inline Jenkinsfile pipelines that at best may delegate to some custom lib steps.
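For illustration, a minimal sketch of the limitation (the variable name is hypothetical): a local variable in the calling code is not visible to the evaluated DSL, and `evaluate` provides no way to pass one in.

// Inside a loader step or script body: a local variable defined here...
def loaderState = [repo: 'git@example.com:org/repo.git', branch: 'main']

// ...is not visible inside the evaluated DSL; accessing it there fails with a
// MissingPropertyException, since evaluate() takes only the script text and
// offers no parameter for extra bindings.
evaluate("node() { echo loaderState.repo }")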
Let me unpack it a bit for more context.
Consider a custom project factory like https://plugins.jenkins.io/inline-pipeline/ - a custom recognizer (maybe a YAML file) instead of a Jenkinsfile, with a default inline Jenkinsfile hardcoded in the job:
library 'some-loader-lib@v1'
loaderStep(this)
The `loaderStep` in that library does a `readTrusted` of a file (YAML or otherwise) and schedules a pod to process that input and generate an actual Jenkinsfile, which the loader then `evaluate`s. The external process that generates the Jenkinsfile has no JVM context, so there is certain state the loader library needs to read (for example Git data like repo path/branch etc.) and it has to be preserved for both the pipeline generator and the actual generated pipeline. Also consider all the scripts or other configuration files the pipeline generator needs to send back to the loader to be made available inside the `evaluate`d DSL - I can't embed too much into the DSL string itself due to the infamous "Method code too large" error. Right now I am solving this via a library singleton that the loader writes to and that the steps called from `evaluate` can read - but it doesn't survive controller restarts.
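A minimal sketch of what such a library singleton could look like, assuming a `Context` class under the library's `src/` directory (the shape shown here is illustrative, not the actual implementation):

// src/org/example/Context.groovy in the shared library (package name is hypothetical)
package org.example

class Context implements Serializable {
    private static Context instance

    Map input = [:]      // data the loader collects for the generator (e.g. Git info)
    Map templates = [:]  // heavy resources the generator sends back

    // Lazily created JVM-wide singleton: written by the loader, read by the steps
    // called from the evaluated DSL. As noted above, this static state does not
    // survive a controller restart.
    static synchronized Context it() {
        if (instance == null) {
            instance = new Context()
        }
        return instance
    }

    void save(Map data) {
        templates.putAll(data)
    }
}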
Is it maybe time to stop fighting CPS and write a plugin that extends FlowDefinition?
https://www.jenkins.io/doc/developer/extensions/workflow-api/
llibicpep, still sounds like you are overthinking this.
there is certain state the loader library needs to read (example Git data like repo path/branch etc)
is not (mutable) state, it is information which you can pass e.g. using withEnv.
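A sketch of that approach, with hypothetical variable names: the loader exposes the values as environment variables around the evaluate call, and steps invoked from the evaluated DSL read them back via `env`.

// In the loader, wrapping the evaluation (names are hypothetical):
withEnv(["GENERATOR_REPO=${repoUrl}", "GENERATOR_BRANCH=${branchName}"]) {
    evaluate(output.dsl)
}

// In a library step called from the evaluated DSL:
def call() {
    echo "building ${env.GENERATOR_REPO} at ${env.GENERATOR_BRANCH}"
}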
I can't embed too much into the DSL string itself due to infamous "Method code too large".
You cannot embed too much into a single method. You can define as many methods as you like so long as each is a reasonable size.
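For illustration, a generated script can split its body across several methods so that no single method (including the implicit top-level script body) hits the JVM's per-method bytecode limit; the method names here are hypothetical:

// Generated DSL: the top-level body just dispatches to smaller generated methods.
def buildStage() {
    echo 'many generated build steps would go here'
}

def testStage() {
    echo 'many generated test steps would go here'
}

node() {
    buildStage()
    testStage()
}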
Anyway, it is hard to follow what you are really trying to accomplish, or how you thought you should accomplish it, but this is all a topic for the users’ list or Discourse or Stack Overflow or something. So far as @Field is concerned, the status remains: this sounds like a bug in a corner case; it is unlikely anyone is going to work on it due to the effort and the risk of other regressions.
jglick - it is fairly simple actually: the pipeline generator needs to contribute data back to the state initiated in the loader, to be consumed from the generated Jenkinsfile. As simple as that. And yes, it IS one method - the `evaluate`d method - so the entire Jenkinsfile should fit in it. Think of all the static resources in the pipeline that you'd need to pass from the generator via the loader to the actual steps.
Project factory:
library 'some-loader-lib@v1'
loaderStep(this)
Loader:
def call(def jenkinsfile) {
    def context = Context.it() // data in singleton
    def output
    podTemplate(yaml: """
apiVersion: v1
kind: Pod
metadata:
spec:
  containers:
  - name: generator
    image: ...
    command:
    - sleep
    args:
    - 5m
""") {
        node(POD_LABEL) {
            container('generator') {
                writeJSON(file: 'input.json', pretty: 4, json: context.input)
                output = readJSON(returnPojo: true, text: sh(returnStdout: true, script: 'cat input.json | generator').trim())
            }
        }
    }
    context.save(output.templates) // a whole bunch of heavy string resources like scripts and config files
    context.save(output.etc)       // and some other stuff that might not be as heavy
    library output.lib             // a library with a collection of custom steps the generator tells the loader to load
    evaluate(output.dsl)           // the string evaluated here calls steps from the above library, and those steps need access to the context, including the templates etc.
}
Does it make sense now?
duemir - with all the time I have already wasted fighting artificial issues and limitations, sorry, no. I am looking for ways to get off of Jenkins at this point ASAP - there are so many great open-source Kubernetes-native alternatives out there today, like Tekton, that are not flawed and do not make it their goal to artificially limit and cripple the user experience so that a commercial offering really shines compared to open source.
IIUC just use
evaluate(output.dsl)(context)
where the output of the DSL is something like
{ context ->
// as before
}
I probably should not be surprised, but that didn't work either. Obviously it works for a very simple case, but if `output.dsl` actually calls any library steps, which in turn try to access `context`, they have no access.
Obviously, it would work if the `context` were passed as an argument to every library step called inside `output.dsl`, but at that point it defeats the purpose of the library, and all I get is spaghetti code with methods (steps) that need 50 arguments. Not to mention that I would need to refactor the library in a backward-incompatible way, changing all the step signatures.
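To make that concrete, a sketch with a hypothetical library step `deployTemplate`: the closure parameter `context` is visible inside the evaluated closure itself, but the step's own `call` method cannot see it unless it is handed over explicitly.

// Generated DSL (output.dsl): returns a closure that the loader invokes with the context.
return { context ->
    node() {
        echo "loader passed ${context.templates.size()} templates"   // works: closure parameter
        deployTemplate()          // fails: the step body cannot see 'context'
        deployTemplate(context)   // works, but every step signature would have to change
    }
}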
Surprisingly, the `binding` property is not the same inside and outside of a step, although the `evaluate` method explicitly passes in the parent `binding`. So my attempts to use `binding` to communicate between my abstraction layers are failing too.
Whatever the first real top-level Groovy code to be run consists of, it should initialize the state map.
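One reading of that suggestion, as a sketch: initialize the state map in the inline Jenkinsfile that the project factory hardcodes (the first real top-level Groovy code to run) and hand it down explicitly; the extra `state` argument to `loaderStep` is hypothetical.

// Inline Jenkinsfile hardcoded in the job - the first top-level Groovy code to run:
library 'some-loader-lib@v1'

def state = [:]           // the state map, initialized up front
loaderStep(this, state)   // hypothetical two-argument variant: the loader receives the map
                          // explicitly and can hand it on to the generated DSL and its steps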