
Only last stage data is sent to influx db from jenkins pipeline

    • Type: Bug
    • Resolution: Unresolved
    • Priority: Major
    • Components: influxdb-plugin, pipeline
    • Labels: None
    • Environment: Jenkins pipeline 2.150.2
      InfluxDB 1.6.2

      I have the code below, which reads each stage's data from a Jenkins pipeline project's JSON output and sends each stage's data to InfluxDB.

      Issue: only the last stage's data ends up in InfluxDB, even though I can see the loop iterating over each stage.

       

      Any suggestions would be helpful.
      // Methods for InfluxData begin
      // Maps for field type columns
      myDataField1 = [:]
      myDataField2 = [:]
      myDataField3 = [:]
      // Maps for custom field measurements
      myCustomDataFields1 = [:]
      myCustomDataFields2 = [:]
      myCustomDataFields3 = [:]
      // Maps for tag type columns
      myDataTag1 = [:]
      myDataTag2 = [:]
      myDataTag3 = [:]
      // Maps for custom tag measurements
      myCustomDataTags1 = [:]
      myCustomDataTags2 = [:]
      myCustomDataTags3 = [:]

      @NonCPS
      def pushStageData() {
          // Requires: import groovy.json.JsonSlurperClassic
          def url_string = "${JENKINS_URL}job/ENO_ENG_TP/job/R421/13/wfapi/describe"
          def replaced = url_string.replaceAll(' ', '%20')
          def get = new URL(replaced).openConnection()
          get.addRequestProperty("User-Agent", "Mozilla/4.0")
          get.addRequestProperty("Authorization", "Basic dZXZvceDIwMTk=")
          // Fetch the contents of the endpoint URL
          def jsonText = get.getInputStream().getText()
          // Convert the text into a JSON object using JsonSlurperClassic
          def jsonObject = new JsonSlurperClassic().parseText(jsonText)
          // Extract the details of all the stages present in that particular build number
          for (int i = 0; i < jsonObject.stages.size() - 1; i++) { // size-1 to ignore the post stage
              // Populate the field type columns and push them to the map myDataField1
              myDataField1['result'] = jsonObject.stages[i].status
              myDataField1['duration'] = jsonObject.stages[i].durationMillis
              myDataField1['stage_name'] = jsonObject.stages[i].name
              // Populate the tag type columns and push them to the map myDataTag1
              myDataTag1['result_tag'] = jsonObject.stages[i].status
              myDataTag1['stage_name_tag'] = jsonObject.stages[i].name
              // Assign field type columns to the measurement called CustomData
              myCustomDataFields1['CustomData'] = myDataField1
              // Assign tag type columns to the measurement called CustomData
              myCustomDataTags1['CustomData'] = myDataTag1
              // Push the data into the InfluxDB instance
              try {
                  step([$class: 'InfluxDbPublisher', target: 'jenkins_data', customPrefix: null, customDataMapTags: myCustomDataTags1])
              } catch (err) {
                  println("pushStageData exception: " + err)
              }
          }
      }
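      Note that, as pointed out in the comments below, the publisher call above passes only customDataMapTags, so the field maps are collected but never written. A minimal sketch of the call with both maps wired up (parameter names as exposed by InfluxDbPublisher; the measurement name 'CustomData' is taken from the code above):

      ```groovy
      // Write both the custom field map and the custom tag map in one call.
      // Without customDataMap, the fields gathered in myCustomDataFields1 are never sent.
      step([$class           : 'InfluxDbPublisher',
            target           : 'jenkins_data',
            customPrefix     : null,
            customDataMap    : myCustomDataFields1,
            customDataMapTags: myCustomDataTags1])
      ```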
      

       

          [JENKINS-57240] Only last stage data is sent to influx db from jenkins pipeline

          sridatta s created issue -

          Aleksi Simell added a comment - - edited

          Time is the primary key in InfluxDB. Right now you write all points at essentially the same instant in your job instead of spreading them out as you receive the data. This causes InfluxDB to overwrite any point that has the exact same timestamp as an earlier one (this is a feature of InfluxDB itself, not of the plugin).

          A quick workaround is to sleep for a millisecond after each write to force a new timestamp.
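
          In Scripted Pipeline the workaround can be applied with the built-in sleep step, e.g. (a sketch; the publisher call is the one from the description above):

          ```groovy
          // After each InfluxDbPublisher write, pause 1 ms so the next point
          // gets a distinct timestamp and is not overwritten.
          step([$class: 'InfluxDbPublisher', target: 'jenkins_data', customDataMapTags: myCustomDataTags1])
          sleep(time: 1, unit: 'MILLISECONDS')
          ```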


          sridatta s added a comment -

          Thank you aleksisimell.

          But this seems to be a different issue. I have 20 stages in my Jenkins pipeline, yet only the first stage's data is sent to InfluxDB, and the function above exits after one loop iteration. Any suggestions? Sorry for my earlier interpretation.


          Aleksi Simell added a comment -

          Based on your paste, you're collecting data to `myDataField1` and `myCustomDataFields1`, but never writing those to InfluxDB.

          Is the job completing successfully? If it is, then the only thing I can think of right away is that `jsonObject.stages` does not contain 20 objects, but only 2. But I have no experience with JsonSlurper.


          sridatta s added a comment - - edited

          aleksisimell Thank you for the suggestion. The job completes successfully, I see 20 objects in the JSON, and jsonObject.stages.size() returns 20. Is it related to the @NonCPS annotation? If I comment it out, the loop iterates over all stages, but I get a java.io.NotSerializableException.

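          The serialization error is consistent with CPS-transformed pipeline code trying to checkpoint non-serializable values such as the open URLConnection and the parser's lazy maps. A common pattern (a sketch under that assumption, not code from this thread) is to keep the @NonCPS method free of pipeline steps: it only fetches and parses, returning a plain serializable list, and step() is called from regular pipeline code:

          ```groovy
          import groovy.json.JsonSlurperClassic

          // @NonCPS method: no pipeline steps in here. It fetches and parses the
          // wfapi JSON and returns a plain (serializable) list of maps.
          @NonCPS
          def collectStageData(String urlString) {
              def conn = new URL(urlString.replaceAll(' ', '%20')).openConnection()
              conn.addRequestProperty('User-Agent', 'Mozilla/4.0')
              def stages = new JsonSlurperClassic().parseText(conn.getInputStream().getText()).stages
              def result = []
              for (int i = 0; i < stages.size() - 1; i++) { // size-1 to ignore the post stage
                  result << [name    : stages[i].name.toString(),
                             status  : stages[i].status.toString(),
                             duration: stages[i].durationMillis]
              }
              return result
          }

          // Regular (CPS) pipeline code: safe to call step() from here, once per stage.
          def pushStageData(String urlString) {
              for (stage in collectStageData(urlString)) {
                  step([$class: 'InfluxDbPublisher', target: 'jenkins_data',
                        customDataMapTags: [CustomData: [stage_name_tag: stage.name,
                                                         result_tag    : stage.status]]])
              }
          }
          ```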

          Aleksi Simell added a comment -

          In your global configuration, have you checked the "Job scheduled time as timestamp" checkbox? If so, you will have only one timestamp, which causes your data to be overwritten every time you write to InfluxDB. Unchecking it and adding a short wait will force a new timestamp for each write you do.


          z cloud added a comment -

          aleksisimell, according to sridattasp's case, each build ends up with multiple duplicate rows in the "jenkins_data" table. Is there any way to generate a single record in the "jenkins_data" table each time, and multiple records in the "jenkins_custom_data" table?


          Aleksi Simell added a comment -

          zyun823 That is currently not possible. You're only able to add keys and values to a single "jenkins_custom_data" measurement. I can check how big the workload would be to change the functionality so that you would be able to send multiple measurements for the same custom data in a single call from InfluxDbPublisher.


          z cloud added a comment -

          aleksisimell, thank you for your reply; I look forward to this feature. It would be great if customDataMap and customDataMapTags supported a list of maps.

          For example:

          stageDataMapTagsList = []
          stageDataMapTags = [:]
          stageDataMapTagsList.add(stageDataMapTags)
          customDataMapTags['stage'] = stageDataMapTagsList


            Assignee: Aleksi Simell (aleksisimell)
            Reporter: sridatta s (sridattasp)
            Votes: 1
            Watchers: 5