Emit a warning when Miri is used with optimizations #2814
Conversation
Should we also tell the user that this will not significantly improve the performance of running Miri except in microbenchmarks?
@@ -43,7 +43,7 @@ function run_tests {
 # optimizations up all the way, too).
 # Optimizations change diagnostics (mostly backtraces), so we don't check
 # them. Also error locations change so we don't run the failing tests.
-MIRIFLAGS="${MIRIFLAGS:-} -O -Zmir-opt-level=4" MIRI_SKIP_UI_CHECKS=1 ./miri test -- tests/{pass,panic}
+MIRIFLAGS="${MIRIFLAGS:-} -O -Zmir-opt-level=4 -Cdebug-assertions=yes" MIRI_SKIP_UI_CHECKS=1 ./miri test -- tests/{pass,panic}
Why add the flag back here? I guess some tests rely on it? Please add a comment.
Yeah. I considered doing something more clever, such as cfg(not(debug_assertions)) on the test cases, but the tests pretty explicitly call debug_assert!, so it seemed very circular.
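For background, debug_assert! only expands to a runtime check when debug assertions are compiled in, and -O builds disable them by default; -Cdebug-assertions=yes re-enables them so the optimized test run still exercises those checks. A minimal, made-up illustration (not a test from the actual suite):

```rust
// Hypothetical example: a check that only exists at runtime when debug
// assertions are compiled in.
fn compute_len() -> usize {
    "miri".len()
}

fn main() {
    let len = compute_len();
    // With `-O` alone this expands to nothing, so an optimized test run would
    // silently stop exercising the assertion; `-Cdebug-assertions=yes` keeps
    // it active even in an optimized build.
    debug_assert!(len > 0, "expected a non-empty string");
    println!("len = {len}");
}
```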
e0863ef to a67fa85
@bors r+
☀️ Test successful - checks-actions
This may address #2797 by clarifying to the user what is going on and what the consequences of their choices are.
Fixes #2797
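As a rough, hypothetical sketch of the kind of check such a warning implies (not the actual Miri implementation, which hooks into Miri's rustc driver rather than scanning MIRIFLAGS as a string):

```rust
// Hypothetical, self-contained sketch: decide whether the user asked for
// optimizations so a warning could be printed. The flag names `-O` and
// `-Zmir-opt-level=N` come from the diff above; everything else here is
// illustrative only.
fn optimizations_requested(miriflags: &str) -> bool {
    miriflags.split_whitespace().any(|flag| {
        flag == "-O"
            || flag
                .strip_prefix("-Zmir-opt-level=")
                .and_then(|level| level.parse::<u32>().ok())
                .map_or(false, |level| level > 0)
    })
}

fn main() {
    let flags = std::env::var("MIRIFLAGS").unwrap_or_default();
    if optimizations_requested(&flags) {
        // Illustrative wording only; the real warning text is defined in Miri.
        eprintln!(
            "warning: optimizations do not make Miri meaningfully faster and \
             can change diagnostics and error locations"
        );
    }
}
```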