I broke item 5 into four parts; please find my comments below:
> 5) On [5], tests marked as "TODO known breakage" are also marked red, i.e. as failures, but again the total number of failed tests oddly is correct. But the number of skipped tests is not. The header says "628 files 9549 tests, 8967 ok, 130 not ok, 1 skipped, 0 Bail Out!", but as you can see from the yellow entries in the table there are far more than 1 skipped tests.
Hmmm, weird. I imported test-results/t0000-basic.sh into my dev workspace (hope you don't mind), and the plug-in correctly reported "7 skipped". Looking at the code, though, I found a bug in the TapResult#tally() method:
[snippet]
if (plan.isSkip()) {
    this.skipped = testResults.size();
} else {
    for (TestResult testResult : testResults) {
        if (isSkipped(testResult)) {
            skipped += 1;
        } else if (isFailure(testResult)) {
            failed += 1;
        } else {
            passed += 1;
        }
    }
}
[/snippet]
[zooming-in]
if (plan.isSkip()) {
    this.skipped = testResults.size();
}
[/zooming-in]
I hope you see what I'm highlighting here: if a plan is marked with SKIP for any reason, the accumulated skip count is reset to that set's size instead of being incremented. Say you had 1 test in that Test Set: the report would then incorrectly display only 1 skipped test.
Normal test run (after importing your TAP stream):
60 tests, 52 ok, 1 not ok, 7 skipped, 0 Bail Out!.
Running again, after adding a Test Set with 10 tests whose plan marks them all as SKIP:
70 tests, 52 ok, 1 not ok, 10 skipped, 0 Bail Out!.
After fixing the tally method (now it's: skipped += testResults.size();):
70 tests, 52 ok, 1 not ok, 17 skipped, 0 Bail Out!.
Hopefully that fixes your issue. I tried to find a plan with SKIP in your build... but there were too many files to look through.
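To make the fix concrete, here's a minimal, self-contained sketch of the corrected tally logic. TallySketch and its String-based result markers are hypothetical stand-ins for the plugin's real TapResult/TestResult/Plan types, only meant to show why accumulation (+=) matters across multiple Test Sets:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for TapResult#tally(); not the plugin's real types.
class TallySketch {
    int passed, failed, skipped;

    // Called once per Test Set: planIsSkip mirrors plan.isSkip(),
    // results stand in for the set's TestResult list.
    void tally(boolean planIsSkip, List<String> results) {
        if (planIsSkip) {
            // The bug was: skipped = results.size();
            // which overwrote the count accumulated from earlier sets.
            skipped += results.size();
        } else {
            for (String r : results) {
                if (r.equals("skip"))      skipped += 1;
                else if (r.equals("fail")) failed  += 1;
                else                       passed  += 1;
            }
        }
    }

    public static void main(String[] args) {
        TallySketch t = new TallySketch();
        t.tally(false, Arrays.asList("ok", "skip", "fail")); // 1 ok, 1 skip, 1 fail
        t.tally(true,  Arrays.asList("ok", "ok"));           // whole set skipped: +2
        t.tally(true,  Arrays.asList("ok"));                 // +1, must not reset the count
        System.out.println(t.skipped); // prints 4 with the fix (the old code would print 1)
    }
}
```

With the old assignment, the last SKIP-planned set would clobber everything tallied before it, which is exactly the "1 skipped" symptom in your header.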
> Also, whenever there's a comment in the raw TAP file, you make it part of the table. I think the plugin should simply ignore lines starting with "#" in the input files and not include them into the table. A good example on that page is at "test-results/t5801-remote-helpers.sh" where you see a lot of comments which should not be visible at all, IMHO.
Deal, I'll add a new feature while working on issue JENKINS-18703 or JENKINS-17804.
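For the comment filtering, something along these lines should do; this is only an illustrative sketch (TapCommentFilter is a hypothetical helper, not existing plugin code), and note it drops only whole-line "#" comments, so trailing directives like "# SKIP" or "# TODO" on a test line are untouched:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: drop TAP diagnostic lines ("# ...") before
// building the results table, as requested for t5801-remote-helpers.sh.
class TapCommentFilter {
    static List<String> stripComments(List<String> tapLines) {
        List<String> kept = new ArrayList<>();
        for (String line : tapLines) {
            // Only lines that *start* with "#" are comments; a "#" later in
            // the line may be a SKIP/TODO directive and must be kept.
            if (!line.trim().startsWith("#")) {
                kept.add(line);
            }
        }
        return kept;
    }
}
```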
> The bottom of the page finally contains the details about parse errors, which is great. Maybe the "1 parse error(s)" from the overview page could link here? In any case I like this result page much better than [4], and I'd like to suggest to completely drop [4] and just fix the issues with [5] instead. This will also lead to less confusion for the user which TAP result page to look at.
Good idea, mate. I added an <a name='parseErrors' /> anchor just before the Parse errors section. Now if you type
http://localhost:8080/job/test-17941/3/tapResults/#parseErrors
in your browser URL you'll see the parse errors. I've also added a link on the build page that goes directly to this section.
>> 2) Slightly below, there's the text "This build contains 628 TAP test sets. Click here for details" and "1 parse error(s)". Again I'm not sure what causes the parse error, whether it's a bug in the TAP plugin or a bug in Git's test suite output. In any case it would be nice to see the name of the (sub-)test where the error occurred here.
> I believe that information is available in the build summary (the "Click here" link points to that page). I'm not sure we should add that information to the project page, since it can contain a long list when many TAP streams failed to be parsed. Perhaps it could be added as a feature... since other users could find it useful to have a list of failed tests... or bailed-out tests.
I agree that the information itself should not be duplicated on that page. What I was thinking about was more like adding an anchor like
https://qa.nest-initiative.org/view/msysGit/job/msysgit-mingwGitDevEnv-test/125/tapResults/#parse_errors
and linking the "1 parse error(s)" text on
https://qa.nest-initiative.org/view/msysGit/job/msysgit-mingwGitDevEnv-test/125/
to that anchor, so you can directly jump to that section of the TAP results.
> I think the plug-in was right to fail to parse it. However, I had to manually enter the file name in the browser URL to view the file that the plug-in failed to parse.
I agree that it's correct to report a parse error in this case. Thanks for the suggestion; I'll add a direct link to the offending file.