[JENKINS-67857] Updated plugin is creating a new directory with a random directory name, even while using the NullSCM class.

    • Type: Bug
    • Resolution: Not A Defect
    • Priority: Blocker
    • Component: workflow-cps-plugin
    • Labels: None

      I want to use a Jenkinsfile from the local file system, and for that I'm using the hudson.scm.NullSCM class with the workflow-cps plugin. In earlier versions, using this class didn't create any new directory. After the latest version upgrade it performs a checkout, creates a new directory with a random directory name, and looks for the script in that location.
      I want a workaround or fix for this issue, as it's a blocker for a Jenkins plugin that I'm working on.
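
      For context, a minimal sketch of the kind of job definition described above, as it might be created from the Jenkins script console (the job name and script path are illustrative, not taken from the report):

      import hudson.scm.NullSCM
      import org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition
      import org.jenkinsci.plugins.workflow.job.WorkflowJob

      // Hypothetical job name; 'Jenkinsfile' is resolved relative to the checkout directory
      // that workflow-cps creates for the (Null)SCM checkout.
      def job = jenkins.model.Jenkins.get().createProject(WorkflowJob, 'local-jenkinsfile-job')
      job.definition = new CpsScmFlowDefinition(new NullSCM(), 'Jenkinsfile')
      job.save()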


          Bernhard M added a comment - edited

          I have the same problem: I have a pipeline checking out the Jenkinsfile and trying to access other files from the checkout.

          Why should the script from the Jenkinsfile not be able to access other files from its own checkout (if "Lightweight checkout" is not enabled)?

          So I think the solution is for the plugin to provide an official way to access the checkout.

          As a workaround, you can downgrade workflow-cps (Pipeline: Groovy) to https://updates.jenkins-ci.org/download/plugins/workflow-cps/2648.va9433432b33c/workflow-cps.hpi


          Samarth Agarwal added a comment -

          bmaehr Thanks for your reply. I am already using the downgraded version of workflow-cps for now. But I have my own plugin in Jenkins that uses the workflow-cps plugin to get the Jenkinsfile script from the Jenkins workspace <job_name>@script directory.

          And users who have the latest Jenkins with the latest plugins are facing issues.

          We have to make our plugin compatible with the latest Jenkins, so is there any workaround for the problem I mentioned in the ticket?


          Bernhard M added a comment -

          dnusbaum Any updates on this?


          Devin Nusbaum added a comment -

          I want to use Jenkinsfile from local system

          I don't think there is any way to get this to work directly anymore, and I don't think it was ever intended to be supported. If your goal is to reuse the same Jenkinsfile across Pipelines, the closest supported equivalent would be to use a Pipeline library as I recommended in JENKINS-67879. We could potentially add a way to disable the related security fixes to make this work again, but note that if you have a Jenkins instance where not all users are admins, the vulnerabilities would effectively give all of those users admin permission.
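
          A Pipeline library would be wired up roughly like this (a minimal sketch; the library name 'my-shared-library' and the buildApp step are assumptions, and the library itself must first be configured under Global Pipeline Libraries):

          // Loads the shared library configured on the Jenkins instance.
          @Library('my-shared-library') _

          // buildApp() would be defined in vars/buildApp.groovy inside the library,
          // so the same logic is reused by every Pipeline instead of copying a Jenkinsfile around.
          buildApp()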

          Why should the script from the jenkinsfile not be able to access other files from the own checkout (if the "Lightweight checkout" is not enabled)?

          If you need to use the checkout, I would use the checkout step (ideally with a lightweight checkout of the Jenkinsfile if supported by your SCM plugin to avoid double checkouts), combined with the dir step if you need to put the checkout in a specific directory. The security fix only hashes the checkout directories that get automatically created on the controller to support heavyweight checkouts (to avoid directory reuse across SCMs), which is required for SCM plugins that do not support lightweight checkout. If your Pipeline is currently accessing the files from the heavyweight checkout on the Jenkins controller, then you are either running builds on the controller (very unsafe!), or are doing things in a trusted Pipeline library that are probably better done with a sh step that runs on an agent.
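
          In a scripted Pipeline that combination looks roughly like this (a minimal sketch; 'sources' and 'version.txt' are illustrative names):

          node {
              dir('sources') {                           // dir step: place the checkout in a specific directory
                  checkout scm                           // checkout step: same SCM that provided the Jenkinsfile
                  def version = readFile 'version.txt'   // files are read from the agent workspace, not the controller
                  echo "artifact version: ${version}"
              }
          }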

          If you cannot use lightweight checkout for the Jenkinsfile and really need to look at the heavyweight checkout directories on the controller, you can examine the contents of the <hash>-scm-key.txt files at the same level as the checkout directories to look for a specific SCM and then use the hash directory with the same name. Also, note that the directories are not randomized, they are a hashed version of the SCM key and a hidden key in Jenkins, so once the directory has been created once, the hash will always be the same for that SCM on that Jenkins instance going forward.
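
          For illustration, locating the checkout directory for a given SCM via those key files could look roughly like this (a minimal sketch, e.g. for the script console; the controller paths and the SCM key text are assumptions):

          def scriptDir = new File('/var/lib/jenkins/workspace/my-job@script')  // hypothetical controller path
          def wantedKey = 'https://example.com/repo.git'                        // hypothetical SCM key contents

          // Each <hash>-scm-key.txt sits next to the checkout directory named <hash>.
          def keyFile = scriptDir.listFiles().
                  findAll { it.name.endsWith('-scm-key.txt') }.
                  find { it.text.trim() == wantedKey }
          def checkoutDir = keyFile ? new File(scriptDir, keyFile.name - '-scm-key.txt') : null
          println checkoutDir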

          But I have my own plugin in Jenkins that uses the workflow-cps plugin for getting Jenkinsfile script from jenkins workspace <job_name>@script directory.

          Do you have a link to this plugin? There may be better APIs to use depending on what exactly you are doing.


          Bernhard M added a comment -

          dnusbaum Thank you for your response.

          I'm not reusing checkouts, and I still think the pipeline should have access to its own files. Is that not the reason to do a non-lightweight checkout?

          Yes, I'm running the builds on the controller - like, in my experience, most users do. All the security concerns are no issue for many companies using Jenkins, because in many cases the users managing/setting up Jenkins are the same users defining/running the build scripts. I understand that it is necessary to support the secured setups, but forcing customers to use the more complex/expensive setup with controller and nodes is the wrong direction.

          Nevertheless, the action I perform that needs access to the files (finding out version numbers of artifacts) is, to my understanding, something that in fact should run on the controller.

          Would it be possible to provide a variable or environment variable in the pipeline containing information about the checkout directory?


          Bernhard M added a comment -

          Hint: a temporary workaround I started to use: manually create, in each workspace's @script directory, symbolic links for the files needed during execution, pointing to the corresponding files in the directory with the random name (as sketched below).
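
          A rough sketch of that workaround (run on the controller, e.g. from the script console; all paths and file names are hypothetical, and <hash> stands for the generated directory name):

          import java.nio.file.Files
          import java.nio.file.Paths

          def scriptDir = Paths.get('/var/lib/jenkins/workspace/my-job@script')  // hypothetical controller path
          // Link @script/version.properties to the same file inside the hashed checkout directory,
          // so the file stays reachable under a stable path.
          Files.createSymbolicLink(scriptDir.resolve('version.properties'),
                                   scriptDir.resolve('<hash>/version.properties'))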


          Anna Freiholtz added a comment -

          My organization's problem with the random directory name is primarily the length of the name. Our controller cannot do a lightweight checkout of the repo, so it creates a non-lightweight checkout on the controller and then runs the job on a node. But since our controller runs on Windows and the hash directory name is 65 characters long, we hit the maximum path length on Windows and cannot complete the checkout. (Large repo with several layers of directories.)

          I guess we can work around it by skipping the checkout via Jenkins git and doing it in a script/build step, but it doesn't really feel right...


          Pascal Jacob added a comment -

          My workaround: when using a declarative pipeline instead of a scripted pipeline, Jenkins will also check out the script repo into your workspace. This way, the auto-generated subdirectory can be ignored, so you can just import your libs using:

          def myVar = load "./<path_to_script>";

          Instead of:

          def rootDir = pwd();
          def myVar = load "${rootDir}@script/<path_to_script>";
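
          In context, that workaround would look roughly like this in a declarative Jenkinsfile (a minimal sketch; the script path is illustrative):

          pipeline {
              agent any
              stages {
                  stage('Load utilities') {
                      steps {
                          script {
                              // Resolved against the workspace checkout that declarative Pipelines perform,
                              // so the hashed @script subdirectory never has to be referenced.
                              def myVar = load './scripts/utils.groovy'
                          }
                      }
                  }
              }
          }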


          Jesse Glick added a comment -

          the length of the name. Our controller cannot do a lightweight checkout of the repo

          Sounds like JENKINS-67836.


          Devin Nusbaum added a comment - edited

          See this comment for discussion of some supported alternatives if the related changes break your use case.

          See JENKINS-67836 if the length of the directory names is causing issues for you.


            Assignee: Unassigned
            Reporter: Samarth Agarwal (sam_nagarro)
            Votes: 4
            Watchers: 8
