Type: Task · Resolution: Unresolved · Priority: Critical
For more complex variables that are annotated with `@Field`, the `load` step fails to bind the variable into the Script class.
For example:
```groovy
// a.groovy
import groovy.transform.Field

@Field def LOADED_BUILD_NUMBER = "${env.BUILD_NUMBER}"
return this

// Jenkinsfile
node() {
    def a = load('a.groovy')
    echo("${env.BUILD_NUMBER}")
    echo("${a.LOADED_BUILD_NUMBER}")
}
```
This example will fail. However, if you replace `${env.BUILD_NUMBER}` with a simple literal such as `3`, the load step will succeed.
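A possible workaround, assuming the bug only affects the initial binding of the field: initialize the `@Field` with a simple literal and set the complex value after `load` returns. This is only a sketch; the `init` helper method is hypothetical, not part of the issue.

```groovy
// a.groovy
import groovy.transform.Field

@Field def LOADED_BUILD_NUMBER = null  // simple initializer, so binding succeeds

// hypothetical helper: populate the field after the script is loaded
def init(script) {
    LOADED_BUILD_NUMBER = script.env.BUILD_NUMBER
}

return this

// Jenkinsfile
node() {
    def a = load('a.groovy')
    a.init(this)                       // set the complex value post-load
    echo("${a.LOADED_BUILD_NUMBER}")
}
```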
This seems to be related to the security update in workflow-cps v2.64 and the subsequent regression fix for `@Field` in v2.71.
[JENKINS-63384] load step fails to bind "complex" @Field defined variables
I just had a brilliant idea to implement InvisibleAction in my library and use it to store state... and what do you think? Yeah that's right - they took care of that too!
https://www.jenkins.io/blog/2018/03/15/jep-200-lts/
Well done Jenkins! Well done.
I'm feeling like Alice falling down the rabbit hole. For the sake of testing, I tried executing this:
```groovy
def f = ExtensionList.lookup(jenkins.security.CustomClassFilter.Static.class)[0]
f.overrides.put('my.JenkinsfileContext', true)
```
which is equivalent to `-Dhudson.remoting.ClassFilter=my.JenkinsfileContext`, so I don't have to restart my Jenkins for testing.
And then I had this in my library:
```groovy
@NonCPS
def diag() {
    echo "isBlacklisted(String) = ${hudson.remoting.ClassFilter.DEFAULT.isBlacklisted('my.JenkinsfileContext').toString()}"
    echo "isBlacklisted(Class) = ${hudson.remoting.ClassFilter.DEFAULT.isBlacklisted(my.JenkinsfileContext.class).toString()}"
}
```
It reports
```
[Pipeline] echo
isBlacklisted(String) = false
[Pipeline] echo
isBlacklisted(Class) = false
```
But, I still get
```
java.lang.UnsupportedOperationException: Refusing to marshal my.JenkinsfileContext for security reasons; see https://jenkins.io/redirect/class-filter/
```
I am looking at https://github.com/jenkinsci/jenkins/blob/jenkins-2.289.2/core/src/main/java/hudson/util/XStream2.java#L561 and I can clearly see
```java
return ClassFilter.DEFAULT.isBlacklisted(name) || ClassFilter.DEFAULT.isBlacklisted(type);
```
How in the world does my `my.JenkinsfileContext` still end up in this `BlacklistedTypesConverter`??? How many more circles of Hell will I have to go through for such a simple, basic thing as preserving my Groovy lib state between Jenkins restarts?
I am raising the priority of this ticket. This does NOT conform to "Minor". This is fundamental, basic functionality that is broken.
jglick, oleg_nenashev, abayer, dnusbaum - tagging as participants on both SECURITY-1336 and JEP-200.
We need a way to preserve Groovy lib state across Jenkins restarts within the same pipeline run. One way or another, this is a basic requirement. The whole CPS survivability makes no sense if my Groovy lib is inoperable after a restart. From my perspective, fixing the `@Field` annotation sounds like the right thing to do. But if the proper way to preserve state is through InvisibleAction, we need to be able to implement the action in the lib, which I'm not sure is even feasible - it won't be on the classpath outside of the `WorkflowRun` (probably that's why `BlacklistedTypesConverter` still caught me above). Meanwhile, any other ideas/workarounds are highly appreciated.
llibicpep I think you are overthinking this. Do not write a plugin. If @Field is broken in this context, just do not use it unless and until the bug is fixed. (Obviously it had never occurred to anyone to write a functional test with load + @Field.)
a way to preserve groovy lib state across Jenkins restarts within the same pipeline run
Just have the Jenkinsfile pass a Map or other stateful object to the library or load’ed script? Or just use a local variable?
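A minimal sketch of that suggestion (the file name `a.groovy` and the `doWork` step are illustrative, not from the issue): a plain `Map` created in the Jenkinsfile is CPS-serializable, so it survives restarts as long as it is reachable from pipeline-local variables, and it can be handed to the loaded script explicitly.

```groovy
// Jenkinsfile
def state = [:]                // plain Map, serialized with the rest of the CPS program state
node() {
    def a = load('a.groovy')
    a.doWork(state)            // pass the stateful object explicitly instead of using @Field
    echo("build seen by script: ${state.buildNumber}")
}

// a.groovy
def doWork(Map state) {
    state.buildNumber = env.BUILD_NUMBER   // mutate the shared map
}
return this
```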
jglick - that's what I am already doing. The problem is with saving it, or a pointer to it. The library doesn't have access to local variables defined in the Jenkinsfile context.
Consider I have something like this
```groovy
myLib(this, 'pipeline.yaml')
```
As my Jenkinsfile. Yes, I was thinking about extending it to:
```groovy
def state = [:]
myLib(this, 'pipeline.yaml', state)
```
But this library generates a pipeline (as a String) and `evaluate`s it. I tried to evaluate it with a custom GroovyShell, passing a Binding inside - CPS is not happy about that. Otherwise, my dynamic Jenkinsfile has no access to that state object. When other steps are called from that dynamic Jenkinsfile, they don't have access to it either. I was trying to solve it with a singleton, and it works - up until the controller restarts. `@Field` doesn't work at all.
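For context, the rejected attempt probably looked something like this (a reconstruction of what the comment describes, not the actual code from the issue; the CPS interpreter refuses to run pipeline code evaluated through a plain `GroovyShell`):

```groovy
import groovy.lang.Binding
import groovy.lang.GroovyShell

// attempted: inject state into the generated pipeline via a custom Binding
// (this is what CPS rejects)
def binding = new Binding([state: state])
def shell = new GroovyShell(this.class.classLoader, binding)
shell.evaluate(dslText)   // dslText: the generated pipeline source as a String
```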
Even if I didn't have the dynamic Jenkinsfile generation, I would then have to recursively pass the `state` object to every step and method in my library, which is an inconvenience at the very least.
recursively pass state object to every step and method in my library
Yes do this.
I can't do this. There is no Jenkinsfile. And I can't pass `state` to the `evaluate` function.
Whatever the first real top-level Groovy code to be run consists of, it should initialize the state map.
The cumbersome part is the `evaluate` step. The state can't traverse from the caller's context into the DSL being `evaluate`d. No binding is allowed to be passed in, and no custom `GroovyShell` or similar is allowed by CPS. That is why the singleton/`@Field` was needed in the first place. Since they do not work, it is therefore not possible to save state for custom project factories and custom pipeline generators. The only case it can save state for at the moment is primitive inline Jenkinsfile pipelines that at best may delegate to some custom lib steps.
Let me unpack it a bit for more context.
Consider custom project factory like https://plugins.jenkins.io/inline-pipeline/ - custom recognizer (maybe a yaml file) instead of a Jenkinsfile with default inline Jenkinsfile hardcoded in the job:
```groovy
library 'some-loader-lib@v1'
loaderStep(this)
```
The `loaderStep` in that library `readTrusted`s a file (YAML or otherwise) and schedules a pod to process that input, which generates an actual Jenkinsfile that the loader then `evaluate`s. The external process that generates the Jenkinsfile does not have JVM context, so there is certain state the loader library needs to read (for example Git data like repo path/branch), and it has to be preserved for both the pipeline generator and the actual generated pipeline. Also consider all the scripts or other configuration files the pipeline generator needs to send back to the loader to be made available inside the `evaluate`d DSL - I can't embed too much into the DSL string itself due to the infamous "Method code too large". Right now I am solving this via a library singleton that the loader writes to and the steps from `evaluate` can read - but it doesn't survive controller restarts.
Is it maybe a time to stop fighting CPS and write a plugin that extends a FlowDefinition ?
https://www.jenkins.io/doc/developer/extensions/workflow-api/
llibicpep still sounds like you are overthinking this.
there is certain state the loader library needs to read (example Git data like repo path/branch etc)
is not (mutable) state, it is information which you can pass e.g. using withEnv.
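A sketch of the `withEnv` suggestion (the variable names are illustrative): immutable facts such as repo path and branch can be exposed to the evaluated pipeline as environment variables, which are persisted with the run and need no custom binding.

```groovy
// loader step: expose read-only facts via environment variables
withEnv(["REPO_PATH=${repoPath}", "REPO_BRANCH=${repoBranch}"]) {
    evaluate(output.dsl)   // the generated pipeline reads env.REPO_PATH / env.REPO_BRANCH
}
```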
I can't embed too much into the DSL string itself due to infamous "Method code too large".
You cannot put too much into a single method. You can define as many methods as you like, so long as each is a reasonable size.
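In other words, the generator could emit the DSL split into small methods rather than one monolithic script body, so no single method hits the bytecode size limit. A sketch (the stage names are made up):

```groovy
// generated DSL: each stage in its own method, called from a thin run()
def buildStage()  { stage('Build')  { sh 'make build' } }
def testStage()   { stage('Test')   { sh 'make test' } }
def deployStage() { stage('Deploy') { sh 'make deploy' } }

def run() {
    buildStage()
    testStage()
    deployStage()
}

run()
```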
Anyway, it is hard to follow what you are really trying to accomplish, or how you thought you should accomplish it, but this is all a topic for the users’ list or Discourse or Stack Overflow or something. So far as @Field is concerned, the status remains: this sounds like a bug in a corner case; it is unlikely anyone is going to work on it due to the effort and the risk of other regressions.
jglick - it is fairly simple, actually: the pipeline generator needs to contribute stuff back to the state initiated in the loader, to be consumed from the generated Jenkinsfile. As simple as that. And yes, it IS one method - the `evaluate` call - so the entire Jenkinsfile should fit. Think of all the static resources in the pipeline that you'd need to pass from the generator via the loader to the actual steps.
Project factory:
```groovy
library 'some-loader-lib@v1'
loaderStep(this)
```
Loader:
```groovy
def call(def jenkinsfile) {
    def context = Context.it() // data in singleton
    def output
    podTemplate(yaml: """
        apiVersion: v1
        kind: Pod
        metadata:
        spec:
          containers:
          - name: generator
            image: ...
            command:
            - sleep
            args:
            - 5m
    """) {
        node(POD_LABEL) {
            container('generator') {
                writeJSON(file: 'input.json', pretty: 4, json: context.input)
                output = readJSON(returnPojo: true, text: sh(returnStdout: true, script: 'cat input.json | generator').trim())
            }
        }
    }
    context.save(output.templates) // a whole bunch of heavy string resources like scripts and config files
    context.save(output.etc)       // and some other stuff that might not be as heavy
    library output.lib             // a library with a collection of custom steps the generator tells the loader to load
    evaluate(output.dsl)           // this string, evaluated here, will call steps from the above, and those steps need access to the context, including templates etc.
}
```
Does it make sense now?
duemir, with all the time I have already wasted fighting artificial issues and limitations - sorry, no. I am looking for ways to get off of Jenkins ASAP at this point - there are so many great open-source, Kubernetes-native alternatives out there today, like Tekton, that are not flawed and do not make it their goal to artificially limit and cripple the user experience so that a commercial offering really shines compared to open source.
IIUC just use

```groovy
evaluate(output.dsl)(context)
```

where the output of the DSL is something like

```groovy
{ context ->
    // as before
}
```
I probably should not be surprised, but that didn't work either. Obviously it works for a very simple case, but if `output.dsl` actually calls any lib steps, which in turn try to access `context`, they have no access.
Obviously, it would work if the `context` were used as an argument inside `output.dsl` every time it calls any library step, but at that point it defeats the purpose of the library, and all I get is spaghetti code with methods (steps) that need 50 arguments. Not to mention that I would need to refactor the library in a backward-incompatible way, changing all the step signatures.
Surprisingly, the `binding` property is not the same inside and outside of a step, although the `evaluate` method explicitly passes in the parent `binding`. So my attempt to use `binding` to communicate between my abstraction layers is failing too.
These "security" fixes start driving me crazy.
There seems to be no way to preserve the state across Jenkins restarts anymore in my library.
None of the classical Java singleton patterns (which rely on static fields) work, because CPS does not serialize static fields.
The `@Singleton` annotation doesn't work because the AST transformation defines a `getInstance()` method without a `@NonCPS` annotation, making the singleton unusable from my `@NonCPS`-annotated code.
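A hand-rolled singleton can dodge the `@Singleton` AST transform issue by putting `@NonCPS` on the accessor itself, though, as noted above, the static field is still not serialized by CPS, so the instance is lost on a controller restart anyway. A sketch (the class name is illustrative):

```groovy
import com.cloudbees.groovy.cps.NonCPS

class Context implements Serializable {
    private static Context instance   // static: NOT serialized by CPS, lost on restart
    Map data = [:]

    // @NonCPS accessor, callable from other @NonCPS-annotated code
    @NonCPS
    static Context it() {
        if (instance == null) {
            instance = new Context()
        }
        return instance
    }
}
```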
Now this - I can't preserve state in a global `@Field` variable anymore.
Really starting to feel like the whole objective here is to make users' lives harder...