Type: Bug
Resolution: Fixed
Priority: Minor
Labels: None
Jenkins 2.204.5
Installed via helm chart version "1.9.21"
Installed plugins:
- blueocean:1.22.0
- command-launcher:1.4
- config-file-provider:3.6.3
- configuration-as-code:1.36
- credentials-binding:1.21
- file-leak-detector:1.6
- git:4.2.0
- hashicorp-vault-plugin:3.2.0
- jdk-tool:1.4
- job-dsl:1.76
- kubernetes:1.24.1
- matrix-auth:2.5
- oic-auth:1.7
- pipeline-aws:1.39
- pipeline-github-lib:1.0
- pipeline-utility-steps:2.5.0
- workflow-aggregator:2.6
Running on AWS EKS. Kubernetes version: v1.14.8
Released As: hashicorp-vault-plugin v3.3.0
Hi,
Provisioning build agents on Kubernetes somehow results in "Too many open files" errors.
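To rule out a merely low limit (as opposed to a genuine leak), it helps to read the master process's open-file limits straight from /proc, since `ulimit -n` in a shell inside the container only reports that shell's limits. A minimal diagnostic sketch; the PID defaults to the current shell so it runs standalone, and inside the pod you would pass the java process's PID (7 in the ps output below):

```shell
#!/bin/sh
# Print the soft and hard "open files" limits of a running process.
# Reading /proc/<pid>/limits reflects the target process itself, not the
# shell you happen to be in. PID defaults to this shell for a standalone
# run; pass the Jenkins master's PID inside the pod.
PID="${1:-$$}"
grep "Max open files" "/proc/$PID/limits"
```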
$ ps aux
USER       PID %CPU %MEM     VSZ     RSS TTY   STAT START TIME  COMMAND
jenkins      1  0.0  0.0    1148       4 ?     Ss   Mar06  0:11 /sbin/tini -- /usr/local/bin/jenkins.sh --argumentsRealm.passwd.admin=${MASTER_ADMIN_PASSWORD} --argumentsRealm.roles.admin=admin --httpPort=8080
jenkins      7  1.1  7.1 3841028 1167032 ?     Sl   Mar06 82:21 java -Duser.home=/var/jenkins_home -Dpermissive-script-security.enabled=true -Xms512m -Xmx1024m -XX:MaxMetaspaceSize=200m -XX:CompressedClassSpaceSize=100m -Djenkins.model.Jenkins.slaveAgentPort=50000 -jar /
jenkins  15138  0.0  0.0   19976    3656 pts/0 Ss+  14:46  0:00 bash
jenkins  15922  0.1  0.0   19972    3436 pts/1 Ss   14:57  0:00 bash
jenkins  15927  0.0  0.0   38384    3264 pts/1 R+   14:58  0:00 ps aux

$ ls -la /proc/7/fd | head -n 20
total 0
dr-x------. 2 jenkins jenkins  0 Mar 11 14:46 .
dr-xr-xr-x. 9 jenkins jenkins  0 Mar 11 14:46 ..
lrwx------. 1 jenkins jenkins 64 Mar 11 14:46 0 -> /dev/null
l-wx------. 1 jenkins jenkins 64 Mar 11 14:46 1 -> pipe:[295935440]
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10 -> /dev/urandom
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 100 -> /var/jenkins_home/war/WEB-INF/lib/localizer-1.26.jar
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 1000 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10000 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10001 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10002 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10003 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10004 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10005 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10006 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10007 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10008 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10009 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 1001 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token
lr-x------. 1 jenkins jenkins 64 Mar 11 14:46 10010 -> /run/secrets/kubernetes.io/serviceaccount/..2020_03_06_12_39_51.580529302/token

$ ls -la /proc/6/fd | cut -d ' ' -f11 | grep -i token | wc -l
66108
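For anyone reproducing this, a quick way to see which target dominates a process's fd table (here, the serviceaccount token) is to group the /proc/<pid>/fd symlinks by destination. A diagnostic sketch assuming a Linux /proc; the PID defaults to the current shell so it runs standalone:

```shell
#!/bin/sh
# Group a process's open file descriptors by symlink target and print the
# most frequent targets first. With the leak above, the serviceaccount
# token path would top this list by tens of thousands.
PID="${1:-$$}"

# Every entry in /proc/<pid>/fd is a symlink; its target is the last
# field of `ls -l`. Count duplicate targets, largest count first.
ls -l "/proc/$PID/fd" | awk '/->/ {print $NF}' | sort | uniq -c | sort -rn | head -n 10
```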
If I start a build, there are a few leaked file descriptors:
jenkins@jenkins-579698569c-nh4g8:/$ ls -la /proc/6/fd | cut -d ' ' -f11 | grep -i token | wc -l
64725   # before build
jenkins@jenkins-579698569c-nh4g8:/$ ls -la /proc/6/fd | cut -d ' ' -f11 | grep -i token | wc -l
64735
jenkins@jenkins-579698569c-nh4g8:/$ ls -la /proc/6/fd | cut -d ' ' -f11 | grep -i token | wc -l
64741
jenkins@jenkins-579698569c-nh4g8:/$ ls -la /proc/6/fd | cut -d ' ' -f11 | grep -i token | wc -l
64741
jenkins@jenkins-579698569c-nh4g8:/$ ls -la /proc/6/fd | cut -d ' ' -f11 | grep -i token | wc -l
64741   # after build ends
An example of the kinds of jobs we're using:
#!/usr/bin/groovy
// load pipeline functions
// Requires pipeline-github-lib plugin to load library from github
@Library('github.com/lachie83/jenkins-pipeline@dev')
@Library('github.com/comquent/imperative-when@9ee7fbb323f2b106c4404473cfca50a3948fe1a6') _

library identifier: 'plugin@master', retriever: modernSCM(
    [$class: 'GitSCMSource',
     remote: 'git@gitserver.mydomain.com/jenkins-plugin',
     credentialsId: 'creds'])

def pipeline = new io.estrado.Pipeline()
def label = "${env.BUILD_TAG}".toLowerCase().replaceAll(/[^-\w]/, '-')

podTemplate(label: label,
    containers: [
        containerTemplate(name: 'helm', image: 'image', command: 'cat', ttyEnabled: true),
    ],
    imagePullSecrets: ['harbor'],
    volumes: [
        hostPathVolume(mountPath: '/var/run/docker.sock', hostPath: '/var/run/docker.sock'),
        persistentVolumeClaim(claimName: 'jenkins-maven-repo', mountPath: '/root/.m2/repository/')
    ]) {
    node(label) {
        checkout scm

        // read in required jenkins workflow config values
        def inputFile = readFile('Jenkinsfile.json')
        def config = new groovy.json.JsonSlurperClassic().parseText(inputFile)
        println "pipeline config ==> ${config}"

        // continue only if pipeline enabled
        if (!config.pipeline.enabled) {
            println "pipeline disabled"
            return
        }

        // set additional git envvars for image tagging
        pipeline.gitEnvVars()

        // If pipeline debugging enabled
        if (config.pipeline.debug) {
            println "DEBUG ENABLED"
            println "pipeline config ==> ${config}"
            sh "env | sort"
        }

        chartFiles = findFiles(glob: 'charts/*/Chart.yaml')

        stage('lint helm charts') {
            chartFiles.each { chartFile ->
                directory = chartFile.path.minus('/Chart.yaml')
                container('helm') {
                    pipeline.helmLint(directory)
                }
            }
        }

        stage('publish helm charts') {
            when(BRANCH_NAME == 'master') {
                chartFiles.each { chartFile ->
                    directory = chartFile.path.minus('/Chart.yaml')
                    chartName = directory.split('/').last()
                    specificConfig = config
                    specificConfig.chart_repo.repo = chartName
                    specificConfig.chart_repo.directory = directory + '/'
                    container('helm') {
                        String chart_version = helm.getChartVersion(config, env.BRANCH_NAME)
                        helm.packageChart(config, chart_version)
                        helm.uploadToHarborChartMuseum(config, chart_version)
                    }
                }
            }
        }
    }
}
I know that this might not be enough to debug the problem fully, so please let me know what kind of info is required.