Skip to content

Conversation


@ekouts ekouts commented Jul 14, 2025

Fixes #3460.


codecov bot commented Jul 14, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 91.22%. Comparing base (8000b58) to head (efb61fc).

Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #3516      +/-   ##
===========================================
- Coverage    91.23%   91.22%   -0.01%     
===========================================
  Files           62       62              
  Lines        13464    13473       +9     
===========================================
+ Hits         12284    12291       +7     
- Misses        1180     1182       +2     


@vkarak vkarak changed the title Add option to log sanity results in perfloggers [feat] Add option to log sanity results in perfloggers Jul 17, 2025

@vkarak vkarak left a comment

I think it should be even simpler. Since log_performance() is already called on sanity failures for performance tests, what if we simply rename it to log_result() and remove the is_performance_check() check from log_performance()?

I think this should pretty much get us to what we want.

@github-project-automation github-project-automation bot moved this from Todo to In Progress in ReFrame Backlog Jul 18, 2025
@ekouts ekouts commented Jul 18, 2025

I think it should be even simpler. Since log_performance() is already called on sanity failures for performance tests, what if we simply rename it to log_result() and remove the is_performance_check() check from log_performance()?

I think this should pretty much get us to what we want.

But could it be an issue for users if we add more entries in their logging, without giving the option to disable? For CSCS it's of course fine, but I added the option so that we keep compatibility.

@vkarak vkarak commented Jul 18, 2025

I think it should be even simpler. Since log_performance() is already called on sanity failures for performance tests, what if we simply rename it to log_result() and remove the is_performance_check() check from log_performance()?
I think this should pretty much get us to what we want.

But could it be an issue for users if we add more entries in their logging, without giving the option to disable? For CSCS it's of course fine, but I added the option so that we keep compatibility.

I don't think it's such a big issue, as it is not breaking anything. Besides, the perflogs are per test, so there will not be new entries in existing files. So I'd rather go with this change than add yet another configuration parameter.

@ekouts ekouts commented Jul 18, 2025

I think it should be even simpler. Since log_performance() is already called on sanity failures for performance tests, what if we simply rename it to log_result() and remove the is_performance_check() check from log_performance()?
I think this should pretty much get us to what we want.

But could it be an issue for users if we add more entries in their logging, without giving the option to disable? For CSCS it's of course fine, but I added the option so that we keep compatibility.

I don't think it's such a big issue, as it is not breaking anything. Besides, the perflogs are per test, so there will not be new entries in existing files. So I'd rather go with this change than add yet another configuration parameter.

Ok! I will make the change :)

@ekouts ekouts changed the title [feat] Add option to log sanity results in perfloggers [feat] Log all tests in perfloggers, not only performance tests Jul 18, 2025
Comment on lines +966 to +974
if not self.check.perfvalues:
self.extra['check_perf_var'] = "$sanity_dummy"
self.extra['check_perf_value'] = None
self.extra['check_perf_ref'] = None
self.extra['check_perf_lower_thres'] = None
self.extra['check_perf_upper_thres'] = None
self.extra['check_perf_unit'] = None
self.extra['check_perf_result'] = None
self.log(level, msg)


I'd like to do this a bit more natively, if possible. This feels like we are hacking the perfvalues to make logging work for sanity-only tests.


Actually this is not needed at all. It's very easy to achieve what we want by simply removing the following check:

diff --git a/reframe/core/logging.py b/reframe/core/logging.py
index e166f857b..b44b1d8d8 100644
--- a/reframe/core/logging.py
+++ b/reframe/core/logging.py
@@ -931,7 +931,7 @@ class LoggerAdapter(logging.LoggerAdapter):
         )
 
     def log_performance(self, level, task, msg=None, multiline=False):
-        if self.check is None or not self.check.is_performance_check():
+        if self.check is None:
             return
 
         _, part, env = task.testcase


So I think the only changes needed are the following:

  1. Remove the check above
  2. Rename log_performance() to log_result()

The only question remaining is whether we need a configuration option for this.
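
For reference, a minimal sketch of how the method could look once both changes are applied. Only the signature and the remaining None guard come from the diff above; the class context and the elided body are assumptions for illustration, not the actual ReFrame implementation.

import logging

class LoggerAdapter(logging.LoggerAdapter):
    # Sketch only: the rest of ReFrame's LoggerAdapter is omitted here.

    def log_result(self, level, task, msg=None, multiline=False):
        # Formerly log_performance(); the `not self.check.is_performance_check()`
        # part of the guard is dropped, so sanity-only tests are logged as well.
        if self.check is None:
            return

        _, part, env = task.testcase
        ...  # build the log record and emit it, as log_performance() did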

Projects
Status: In Progress
Development

Successfully merging this pull request may close these issues.

Log results of tests that are not performance tests