Jenkins / JENKINS-61051

Jobs are started on master instead of EC2 slaves randomly

      Description

      The Jenkins master runs on Amazon Linux 2. Jenkins uses the EC2 plugin to create slaves whenever needed, and many jobs are assigned to slaves using labels.

      Since upgrading to EC2 plugin 1.49 (and to Jenkins 2.217, which contains remoting 4.0), some jobs - seemingly at random - are started on the master node instead of on the started slaves. The EC2 slave is started, but the workspace is created on the master (in the user's home directory, which should have been used on the slave). The job's console log claims it is running on the slave, but it is not.

      This may not be related to the EC2 plugin at all, as I don't see any change related to this problem in the 1.49 release history.

      Attachment: I took a screenshot of a node's script console page while - according to the Jenkins logs - it was being used for a build. I queried the hostname, and although the node's name suggests it is a slave node, the hostname belongs to the master. And of course the workspace was created on the master.
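      For reference, a hostname check of this kind can be run from the affected node's script console (Manage Jenkins → Nodes → node → Script Console); this is a minimal sketch, not necessarily the exact script used for the screenshot:

      ```groovy
      // Run in the node's script console: this executes on the agent's JVM
      // (via its remoting channel), so a healthy node should print the
      // agent's hostname, not the master's.
      println InetAddress.localHost.hostName
      ```

      If the printed hostname matches the master's, the node's channel is effectively running on the master, which is consistent with the workspace appearing there.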


          Activity

          gaborv Gabor V created issue -
          gaborv Gabor V made changes -
          Field Original Value New Value
          Attachment Screenshot 2020-02-11 at 13.59.23.png [ 50188 ]
          Description (edited: added the attachment paragraph describing the script console screenshot)
          gaborv Gabor V made changes -
          Assignee FABRIZIO MANFREDI [ thoulen ] Jeff Thompson [ jthompson ]
          gaborv Gabor V made changes -
          Component/s remoting [ 15489 ]
          Labels agents ec2 plugin slave agents ec2 plugin remoting slave
          gaborv Gabor V made changes -
          Description (edited: noted the upgrade to Jenkins 2.217, which contains remoting 4.0)
          jthompson Jeff Thompson made changes -
          Assignee Jeff Thompson [ jthompson ]
          gaborv Gabor V made changes -
          Assignee Francis Upton [ francisu ]

            People

            Assignee:
            francisu Francis Upton
            Reporter:
            gaborv Gabor V
            Votes:
            1
            Watchers:
            4

              Dates

              Created:
              Updated: