nfalco All pipelines are scanned. My Jenkins is actually a clean install that doesn't persist any state: it runs in a container without any volumes, so whenever Jenkins restarts, the entire JENKINS_HOME is destroyed (which happens constantly because I'm testing this in my local environment). On startup all pipelines are created automatically through a Groovy script, and a scan is triggered for each of them. Note that I don't use Organization Folders, only individual Multibranch Pipeline jobs, which get scanned when they are created and through Bitbucket (Data Center) webhooks.
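For reference, the creation script is roughly like the sketch below (the job name, project/repo slugs, server URL and credentials id are placeholders, not my real values):

```groovy
import jenkins.model.Jenkins
import jenkins.branch.BranchSource
import org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject
import com.cloudbees.jenkins.plugins.bitbucket.BitbucketSCMSource

def jenkins = Jenkins.get()

// Create the Multibranch Pipeline job if it does not exist yet
def job = jenkins.getItem('my-repo') as WorkflowMultiBranchProject
if (job == null) {
    job = jenkins.createProject(WorkflowMultiBranchProject, 'my-repo')
}

// Point the branch source at the Bitbucket Data Center repository
def source = new BitbucketSCMSource('MY_PROJECT', 'my-repo')  // repoOwner, repository
source.serverUrl = 'https://bitbucket.example.com'
source.credentialsId = 'bitbucket-credentials'

job.sourcesList.clear()
job.sourcesList.add(new BranchSource(source))
job.save()

// Kick off the initial branch indexing ("scan") right after creation
job.scheduleBuild()
```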
I did notice that if I dump() the return value of "checkout scm", the variables are there as entrySets; they are just not in the environment when I run sh("printenv"), nor when I later use the Script Console to dump the build's EnvActionImpl. The pipeline that worked with your sample was one I had created manually, so maybe there is a difference between a manually created pipeline and one created through Groovy, but I haven't had a chance to do a deep dive yet. I still intend to do that soon, but I'm not sure when I will have time. For now I'm just grabbing the information from the BitbucketSCMSource, like I did before these environment variables were added.
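For completeness, the diagnostic and the workaround look roughly like this scripted-pipeline sketch (names are placeholders; the rawBuild/getParent calls need in-process script approval):

```groovy
import com.cloudbees.jenkins.plugins.bitbucket.BitbucketSCMSource

node {
    // The map returned by the checkout step does contain the Bitbucket entries...
    def checkoutVars = checkout scm
    echo "checkout returned: ${checkoutVars}"

    // ...but they do not show up in the build environment
    sh 'printenv | sort'

    // Workaround: read the values straight from the job's BitbucketSCMSource
    def multibranch = currentBuild.rawBuild.getParent().getParent()
    def source = multibranch.getSCMSources().find { it instanceof BitbucketSCMSource }
    echo "repo: ${source.repoOwner}/${source.repository} on ${source.serverUrl}"
}
```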
I only created this ticket to log what I came across; I'm happy with my alternative solution. Thank you for the effort on this plugin and for maintaining it!
Please open the config.xml of a workflow job on the Jenkins master, for example $JENKINS_HOME/jobs/<organization job name>/master/config.xml.
It should look something like this:
If not, then run a "Scan Organization Folder Now" and check again.
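If browsing the file system is inconvenient, you can dump the same config.xml from the Script Console; a quick sketch (the job path is the same placeholder as above):

```groovy
import jenkins.model.Jenkins

// Print the on-disk configuration of the branch job without leaving the browser
def job = Jenkins.get().getItemByFullName('<organization job name>/master')
println(job.configFile.asString())
```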