[Backend Tester] Add flow-specific skipped tests, skip argmin/max on Core ML due to native crashes, float16/64 on Vulkan #13992
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/13992

Note: Links to docs will display an error until the docs builds have been completed.

⏳ No Failures, 74 Pending as of commit 88dc7d0 with merge base 48ba0a6.

This comment was automatically generated by Dr. CI and updates every 15 minutes.
    CoreMLTester, minimum_deployment_target=minimum_deployment_target
),
quantize=quantize,
skip_patterns=["test_argmin", "test_argmax"],
How does this play when doing a cross-delegate %-pass-like comparison?
It's not directly reported as a fail. It's not ideal, but native crashes are unrecoverable without process isolation, so I've been accounting for this in the reported numbers. My plan is to switch to pytest with parallel execution to provide process isolation. But for now, hiding those 2 failing tests lets the run at least complete in CI.
def wrapped_test(self):
    with TestContext(test_name, test_base_name, flow.name, params):
        if flow.should_skip_test(test_name):
            raise unittest.SkipTest(f"Skipping test due to matching flow {flow.name} skip patterns")
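For illustration, the flow-level skip check above could be backed by something like the following minimal sketch. The `Flow` dataclass, the substring-matching logic, and the example names are assumptions made for this sketch; only the `skip_patterns` / `should_skip_test` names mirror the snippets in this PR.

```python
# Hypothetical sketch of a flow that skips tests whose names match
# any of its configured patterns. The matching rule (simple substring
# containment) is an assumption, not necessarily what the PR implements.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Flow:
    name: str
    skip_patterns: List[str] = field(default_factory=list)

    def should_skip_test(self, test_name: str) -> bool:
        # Skip when any configured pattern occurs in the test name.
        return any(pattern in test_name for pattern in self.skip_patterns)


coreml = Flow(name="coreml", skip_patterns=["test_argmin", "test_argmax"])
print(coreml.should_skip_test("test_argmin_int32"))  # → True
print(coreml.should_skip_test("test_add"))           # → False
```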
will this be counted and reported as SKIPPED in the report?
Yeah, it's reported as skipped.
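This is standard `unittest` behavior: raising `unittest.SkipTest` inside a test body records the test under `skipped` rather than `failures`. A minimal self-contained demonstration (the `Demo` class and message are illustrative, not from the PR):

```python
import os
import unittest


class Demo(unittest.TestCase):
    def test_flow_skip(self):
        # Raising SkipTest inside the test body marks it as skipped,
        # not failed, in the unittest result.
        raise unittest.SkipTest("Skipping test due to matching flow skip patterns")


with open(os.devnull, "w") as devnull:
    result = unittest.TextTestRunner(stream=devnull).run(
        unittest.defaultTestLoader.loadTestsFromTestCase(Demo)
    )

print(len(result.skipped), len(result.failures))  # → 1 0
```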
stamping to unblock.
Argmin and argmax are crashing on Core ML. Native crashes are hard to recover from. In the medium term, I'd like to look at using pytest-xdist, which, by nature of its process isolation, can recover. But for now, I'm adding a way to skip tests on specific backend flows to allow running the remaining tests. I've also skipped Vulkan float16 and float64 tests, as they crash when using SwiftShader (used in the CI for software execution of the Vulkan compute shaders). This should be revisited in the future.
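For context on the pytest-xdist plan: each xdist worker is a separate process, so a native crash takes down only that worker, and crashed workers can be replaced. A hypothetical invocation might look like the following (the test path is illustrative; only the `-n` and `--max-worker-restart` flags are real pytest-xdist options):

```shell
# Hypothetical: run the suite under pytest-xdist for process isolation.
pip install pytest pytest-xdist

# -n 4: four worker processes.
# --max-worker-restart=8: replace crashed workers up to 8 times
# before aborting the run. The path below is illustrative only.
pytest -n 4 --max-worker-restart=8 backends/test/suite/
```

With this setup, a test that segfaults is reported as a failure in that worker rather than killing the whole run.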
There is also a little technical debt in the code duplication between operator and model tests. This will also be cleaned up shortly.