  Jenkins / JENKINS-61389

Pipeline Matrix returns a "Method code too large!" error on a really short pipeline


    Details


      Description

      The following pipeline causes a "Method code too large!" error when you try to execute it. It seems related to the environment variables and the number of axes, but I don't know why.

      11:51:01  org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
      11:51:01  General error during class generation: Method code too large!
      11:51:01  
      11:51:01  java.lang.RuntimeException: Method code too large!
      11:51:01  	at groovyjarjarasm.asm.MethodWriter.a(Unknown Source)
      11:51:01  	at groovyjarjarasm.asm.ClassWriter.toByteArray(Unknown Source)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit$17.call(CompilationUnit.java:827)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1065)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:603)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
      11:51:01  	at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
      11:51:01  	at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
      11:51:01  	at groovy.lang.GroovyShell.parseClass(GroovyShell.java:688)
      11:51:01  	at groovy.lang.GroovyShell.parse(GroovyShell.java:700)
      11:51:01  	at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.doParse(CpsGroovyShell.java:142)
      11:51:01  	at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.reparse(CpsGroovyShell.java:127)
      11:51:01  	at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.parseScript(CpsFlowExecution.java:561)
      11:51:01  	at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.start(CpsFlowExecution.java:522)
      11:51:01  	at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:327)
      11:51:01  	at hudson.model.ResourceController.execute(ResourceController.java:97)
      11:51:01  	at hudson.model.Executor.run(Executor.java:428)
      11:51:01  
      11:51:01  1 error
      11:51:01  
      11:51:01  	at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:603)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
      11:51:01  	at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
      11:51:01  	at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
      11:51:01  	at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
      11:51:01  	at groovy.lang.GroovyShell.parseClass(GroovyShell.java:688)
      11:51:01  	at groovy.lang.GroovyShell.parse(GroovyShell.java:700)
      11:51:01  	at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.doParse(CpsGroovyShell.java:142)
      11:51:01  	at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.reparse(CpsGroovyShell.java:127)
      11:51:01  	at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.parseScript(CpsFlowExecution.java:561)
      11:51:01  	at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.start(CpsFlowExecution.java:522)
      11:51:01  	at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:327)
      11:51:01  	at hudson.model.ResourceController.execute(ResourceController.java:97)
      11:51:01  	at hudson.model.Executor.run(Executor.java:428)
      11:51:01  Finished: FAILURE
      
      #!/usr/bin/env groovy
      
      pipeline {
        agent { label 'linux' }
        stages {
          stage('Tests'){
            matrix {
              agent { label 'linux' }
              environment {
                TMPDIR = "/tmp"
                REUSE_CONTAINERS = "true"
                HOME = "/tmp"
                CONFIG_HOME = "/tmp"
                EC_WS ="/tmp/aaaa"
                VENV = "/tmp/.venv"
                PATH = "/tmp/aaaa/.ci/scripts:/tmp/bin:aaaa/bin:aaaa/.ci/scripts:${env.PATH}"
                CLUSTER_CONFIG_FILE="aaaaa/tests/environments/elastic_cloud.yml"
                ENABLE_ES_DUMP = "true"
              }
              axes {
                axis {
                    name 'TEST'
                    values 'all', 'dotnet', 'go', 'java', 'nodejs', 'python', 'ruby', 'rum'
                }
                axis {
                    name 'ELASTIC_STACK_VERSION'
                    values '8.0.0-SNAPSHOT', '7.7.0-SNAPSHOT', '7.6.1-SNAPSHOT', '6.8.7-SNAPSHOT'
                }
              }
              stages {
                stage('Prepare Test'){
                  steps {
                    echo "Running tests - ${ELASTIC_STACK_VERSION} x ${TEST}"
                  }
                }
              }
            }
          }
        }
      }
      
      

        Attachments

          Issue Links

            Activity

            Ivan Fernandez Calvo added a comment - edited

            Here is another case where it makes no sense to say the method is too large: the following pipeline also returns the "Method code too large!" error.

            pipeline {
              agent { label 'ubuntu && immutable' }
              environment {
                BASE_DIR = 'src/github.com/elastic/beats'
                JOB_GCS_BUCKET = 'bucket'
                JOB_GCS_CREDENTIALS = 'creds'
                DOCKERELASTIC_SECRET = 'secret'
                DOCKER_REGISTRY = 'docker.example.com'
                SNAPSHOT = "true"
              }
              options {
                timeout(time: 3, unit: 'HOURS')
                buildDiscarder(logRotator(numToKeepStr: '20', artifactNumToKeepStr: '20', daysToKeepStr: '30'))
                timestamps()
                ansiColor('xterm')
                disableResume()
                durabilityHint('PERFORMANCE_OPTIMIZED')
                disableConcurrentBuilds()
              }
              triggers {
                issueCommentTrigger('(?i)^\\/packaging$')
              }
              stages {
                stage('Build Packages'){
                  matrix {
                    agent { label 'ubuntu && immutable' }
                    axes {
                      axis {
                        name 'PLATFORMS'
                        values (
                          '+linux/armv7',
                          '+linux/ppc64le',
                          '+linux/s390x',
                          '+linux/mips64',
                          '+darwin',
                          '+windows/386',
                          '+windows/amd64'
                        )
                      }
                      axis {
                        name 'BEATS_FOLDER'
                        values (
                          'auditbeat',
                          'filebeat',
                          'heartbeat',
                          'journalbeat',
                          'metricbeat',
                          'packetbeat',
                          'winlogbeat',
                          'x-pack/auditbeat',
                          'x-pack/filebeat',
                          'x-pack/functionbeat',
                          'x-pack/heartbeat',
                          'x-pack/journalbeat',
                          'x-pack/metricbeat',
                          'x-pack/packetbeat',
                          'x-pack/winlogbeat'
                        )
                      }
                    }
                    stages {
                      stage('Package'){
                        options { skipDefaultCheckout() }
                        environment {
                          HOME = "${env.WORKSPACE}"
                        }
                        steps {
                          echo "Running tests"
                        }
                      }
                    }
                  }
                }
              }
            }
            

            Moving the agent declaration from the matrix level down to the stage resolves the issue, but declaring it at the matrix level should be supported:

            pipeline {
              agent { label 'ubuntu && immutable' }
              environment {
                BASE_DIR = 'src/github.com/elastic/beats'
                JOB_GCS_BUCKET = 'bucket'
                JOB_GCS_CREDENTIALS = 'creds'
                DOCKERELASTIC_SECRET = 'secret'
                DOCKER_REGISTRY = 'docker.example.com'
                SNAPSHOT = "true"
              }
              options {
                timeout(time: 3, unit: 'HOURS')
                buildDiscarder(logRotator(numToKeepStr: '20', artifactNumToKeepStr: '20', daysToKeepStr: '30'))
                timestamps()
                ansiColor('xterm')
                disableResume()
                durabilityHint('PERFORMANCE_OPTIMIZED')
                disableConcurrentBuilds()
              }
              triggers {
                issueCommentTrigger('(?i)^\\/packaging$')
              }
              stages {
                stage('Build Packages'){
                  matrix {
                    axes {
                      axis {
                        name 'PLATFORMS'
                        values (
                          '+linux/armv7',
                          '+linux/ppc64le',
                          '+linux/s390x',
                          '+linux/mips64',
                          '+darwin',
                          '+windows/386',
                          '+windows/amd64'
                        )
                      }
                      axis {
                        name 'BEATS_FOLDER'
                        values (
                          'auditbeat',
                          'filebeat',
                          'heartbeat',
                          'journalbeat',
                          'metricbeat',
                          'packetbeat',
                          'winlogbeat',
                          'x-pack/auditbeat',
                          'x-pack/filebeat',
                          'x-pack/functionbeat',
                          'x-pack/heartbeat',
                          'x-pack/journalbeat',
                          'x-pack/metricbeat',
                          'x-pack/packetbeat',
                          'x-pack/winlogbeat'
                        )
                      }
                    }
                    stages {
                      stage('Package'){
                        agent { label 'ubuntu && immutable' }
                        options { skipDefaultCheckout() }
                        environment {
                          HOME = "${env.WORKSPACE}"
                        }
                        steps {
                          echo "Running tests"
                        }
                      }
                    }
                  }
                }
              }
            }
            
            Henry Borchers added a comment

            I'm running into this now as well

            Rens Groothuijsen added a comment

            It does seem to be related to having a large number of axis values, and I assume Groovy can't compile all of that at once. When I remove any one of the axis values from your script, it does run as expected.

            Liam Newman added a comment

            Rens Groothuijsen, Ivan Fernandez Calvo:
            This is not due to the size of the Jenkinsfile but to a JVM limit on the size of the compiled pipeline. There is a workaround available in Declarative Pipeline that can be turned on using the feature flag:
            org.jenkinsci.plugins.pipeline.modeldefinition.parser.RuntimeASTTransformer.SCRIPT_SPLITTING_TRANSFORMATION=true
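            As background, flags of this form are JVM system properties, so they are typically passed to the Jenkins controller at startup with -D. A minimal sketch follows; the exact environment variable name (JAVA_OPTS vs. JENKINS_JAVA_OPTS) and the jenkins.war path depend on how Jenkins is installed, so treat them as assumptions for illustration:

            ```shell
            # Sketch: enable the script-splitting workaround by passing the system
            # property to the Jenkins controller JVM at startup. Paths and the
            # JAVA_OPTS variable name vary by installation method.
            export JAVA_OPTS="$JAVA_OPTS -Dorg.jenkinsci.plugins.pipeline.modeldefinition.parser.RuntimeASTTransformer.SCRIPT_SPLITTING_TRANSFORMATION=true"
            java $JAVA_OPTS -jar jenkins.war
            ```

            A controller restart is needed for a startup flag like this to take effect.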

            Please see JENKINS-37984. I'm resolving this as a duplicate of that issue.

            Ivan Fernandez Calvo added a comment

            This workaround has a side effect to take into account before enabling the flag:

            WARNING: this is a global change. If you install this version and have SCRIPT_SPLITTING_TRANSFORMATION=true, any declarative pipeline Jenkinsfile with a local variable declaration outside the pipeline block will fail. You may switch back to allowing locally declared variables by setting SCRIPT_SPLITTING_ALLOW_LOCAL_VARIABLES=true.
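            To illustrate the failure mode, here is a hypothetical minimal Jenkinsfile (not taken from this issue): with SCRIPT_SPLITTING_TRANSFORMATION enabled, a variable declared with def above the pipeline block is rejected, and the standard fix is to promote it to a script field with Groovy's @Field annotation.

            ```groovy
            // Fails with SCRIPT_SPLITTING_TRANSFORMATION=true:
            // def buildLabel = 'linux'   // local variable outside the pipeline block

            // Works: declare it as a script field instead.
            import groovy.transform.Field

            @Field def buildLabel = 'linux'

            pipeline {
              agent { label buildLabel }
              stages {
                stage('Example') {
                  steps {
                    echo "Running on ${buildLabel}"
                  }
                }
              }
            }
            ```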
            
            Liam Newman added a comment

            Ivan Fernandez Calvo:
            Yes. This is meant to give users clearer feedback when using script splitting; a number of people were unaware they were doing something that was defeating script splitting. Also, the fix for the error is to add @Field to any def variableName-style variables, which is very easy and also a best practice.
            Finally, if you set SCRIPT_SPLITTING_ALLOW_LOCAL_VARIABLES=true you can silence the error.

            Have you tried the change? I have heard very little feedback.

            Ivan Fernandez Calvo added a comment - edited

            No, we worked around it by moving the logic to functions or steps in a pipeline shared library. For the most complex pipeline we have (a big monorepo), we ended up creating an opinionated YAML file that we interpret from a Jenkinsfile, which removes the code-size limit.

            https://github.com/elastic/beats/pull/20104


              People

              Assignee: Unassigned
              Reporter: Ivan Fernandez Calvo (ifernandezcalvo)
              Votes: 1
              Watchers: 6
