
Simon Peyton Jones via ghc-devs writes:
> | Hmm, it really should be a distinct block of output. Are you saying
> | that you are seeing lines of unrelated output interspersed in the
> | performance metrics table?
> Yes, just so. I attach the tail of the validate run.
Ahh, I see what is happening here. This is due to interleaving of stdout (where the performance metrics are printed) and stderr (where exceptions are printed). This is indeed quite unfortunate. I'm not entirely sure how best to address this.
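To make the interleaving concrete, here is a minimal standalone sketch (in Python, as the testsuite driver is; none of it is the driver's actual code). Redirecting both streams to the same log file lets the stderr line land ahead of, or inside, the table still sitting in stdout's buffer.

    import sys

    # Standalone sketch of the interleaving problem; this is not GHC
    # testsuite code.  Run it as:  python3 interleave.py > validate.log 2>&1
    # The stderr line will usually land before (or inside) the metrics
    # table, because stdout is block-buffered when redirected to a file
    # while stderr is flushed line by line.

    def print_metrics_table() -> None:
        print("Performance Metrics (local run):")
        for i in range(5):
            # These lines sit in stdout's buffer until it fills or is flushed.
            print(f"    T{i:04d}  runtime/bytes allocated   {1_000_000 + i}")

    def report_exception() -> None:
        # Written to stderr, so it reaches the log immediately and can
        # appear in the middle of the still-buffered table above.
        print("Exception: no baseline found for this test", file=sys.stderr)

    print_metrics_table()
    report_exception()

    # One way to keep the table contiguous: flush stdout before anything
    # is written to stderr, or write the table to a dedicated file.
    sys.stdout.flush()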
> | The suggested command will fetch up-to-date performance metrics from
> | the metrics repository (populated by CI). If you then run the
> | testsuite again you will see output comparing each test's output to
> | the baseline from CI. For instance,
> Aha, I'll try that. I missed those words. But actually the words don't
> say "run validate again to see the differences". They say "a baseline
> may be recovered from ci results once fetched", which is a deeply
> cryptic utterance.
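For context, once a baseline has been fetched, what the testsuite reports is essentially a per-metric tolerance check against that baseline. The sketch below only illustrates the idea; the names, the flat 2% tolerance, the example test, and the message wording are invented, and it is not the driver's actual code.

    from dataclasses import dataclass
    from typing import Optional

    # Illustration only: a per-test perf metric compared against a CI
    # baseline with a percentage tolerance.  Names, tolerance and message
    # wording are made up; this is not the GHC testsuite driver's code.

    @dataclass
    class PerfResult:
        test: str
        metric: str     # e.g. "bytes allocated"
        value: float

    def compare_to_baseline(result: PerfResult,
                            baseline: Optional[float],
                            tolerance_pct: float = 2.0) -> str:
        if baseline is None:
            # Without a fetched baseline there is nothing to compare
            # against, which is why the raw numbers alone say so little.
            return (f"{result.test} [{result.metric}]: no baseline; "
                    f"fetch the CI metrics to enable comparison")
        change = 100.0 * (result.value - baseline) / baseline
        verdict = ("within tolerance" if abs(change) <= tolerance_pct
                   else "unexpected change")
        return (f"{result.test} [{result.metric}]: {result.value:.0f} "
                f"({change:+.1f}% vs baseline, {verdict})")

    r = PerfResult("T9961", "bytes allocated", 410_000_000)
    print(compare_to_baseline(r, baseline=400_000_000))  # +2.5%, unexpected change
    print(compare_to_baseline(r, baseline=None))         # the no-baseline case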
> Also, does it make sense to spit out all these numbers if they are
> useless?
We started emitting these metrics at the end of the log because they can be quite handy when diagnosing performance changes in CI. However, we have recently started dumping the metrics to a file, which is uploaded as an artifact from the CI job, so perhaps this is no longer necessary.
> Better, perhaps in the overall SUMMARY (at the end of validate) to:
>
> - Have a section for perf tests: along with "Unexpected passes" and
>   "Unexpected failures" we can have "Unexpected perf failures".
>
> - In that section, if there is no baseline data, put the words that
>   explain how to get it.
A fair suggestion. I will try to get to this soon.

Cheers,

- Ben
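As a rough, purely hypothetical sketch of what such a summary section could look like (not an implementation in the testsuite driver; the layout and the hint wording are invented):

    from typing import List

    # Hypothetical sketch of the summary section suggested above, sitting
    # alongside "Unexpected passes" / "Unexpected failures".  Not code
    # from the GHC testsuite driver; the hint text is invented.

    def perf_summary(perf_failures: List[str],
                     missing_baselines: List[str],
                     fetch_hint: str) -> str:
        lines = ["Unexpected perf failures:"]
        if perf_failures:
            lines += [f"    {t}" for t in perf_failures]
        else:
            lines.append("    (none)")
        if missing_baselines:
            lines.append("")
            lines.append("No baseline data for the following perf tests:")
            lines += [f"    {t}" for t in missing_baselines]
            lines.append("")
            lines.append(fetch_hint)
        return "\n".join(lines)

    print(perf_summary(
        perf_failures=[],
        missing_baselines=["T9961 (bytes allocated)"],
        fetch_hint=("To establish a baseline, fetch the CI performance "
                    "metrics and re-run the testsuite.")))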