fix: enable parallel test execution with pytest-xdist in CI workflow #620
base: main
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

@@ Coverage Diff @@
##            main    #620   +/- ##
===================================
  Coverage     83%     83%
===================================
  Files         43      43
  Lines       6609    6610    +1
===================================
+ Hits        5492    5500    +8
+ Misses      1117    1110    -7
.github/workflows/ci-testing.yml (Outdated)

- run: coverage run --source litdata -m pytest tests -v --durations=100
+ run: |
+   pip install pytest-xdist
+   coverage run --source litdata -m pytest tests -v --durations=100 -n auto

Suggested change:
- coverage run --source litdata -m pytest tests -v --durations=100 -n auto
+ coverage run --source litdata -m pytest tests -v --durations=100 -n 2

Let's start with just 2 workers; we can bump the scale later.
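With pytest-xdist, `-n auto` resolves to roughly the machine's CPU count, while a fixed `-n 2` keeps the worker count deterministic across CI runners. A minimal sketch of that resolution — the `xdist_workers` helper and its `cap` parameter are illustrative stand-ins, not part of pytest-xdist:

```python
import os

def xdist_workers(requested, cap=None):
    """Resolve a pytest-xdist -n value to a worker count.

    "auto" maps to the machine's CPU count (roughly what pytest-xdist
    does by default); the optional cap is an illustrative extra, not a
    real pytest-xdist option, to keep CI runs deterministic.
    """
    workers = (os.cpu_count() or 1) if requested == "auto" else int(requested)
    return min(workers, cap) if cap is not None else workers

print(xdist_workers("2"))            # -> 2: a fixed count, as the review suggests
print(xdist_workers("auto", cap=2))  # never more than 2, regardless of CPUs
```

A fixed count also makes timing-sensitive tests behave the same on a 2-core runner and a 16-core laptop.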
@@ -556,6 +556,8 @@ def _terminate(self) -> None:
      if self.remover and self.remover.is_alive():
          self.remover.join()

+     sleep(5)  # Give some buffer time for file creation/deletion

I'd rather have a loop that checks that the file was actually created/deleted.
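The fixed `sleep(5)` can be replaced by a polling loop that returns as soon as the file appears (or disappears), bounded by a timeout. A sketch, assuming a hypothetical `wait_for_path` helper:

```python
import os
import time

def wait_for_path(path, exists=True, timeout=5.0, interval=0.05):
    """Poll until `path` exists (or disappears, if exists=False).

    Returns True as soon as the condition holds, False on timeout.
    Unlike a fixed sleep(5), this returns immediately once the file
    system has caught up, so the test only pays for the time it needs.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.exists(path) == exists:
            return True
        time.sleep(interval)
    # one last check, in case the condition flipped right at the deadline
    return os.path.exists(path) == exists

# e.g. wait_for_path("/tmp/chunk-0.bin", exists=False)  # wait for deletion
```

This keeps the fast path fast while still tolerating slow CI file systems up to the timeout.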
.github/workflows/ci-testing.yml (Outdated)

@@ -47,7 +47,7 @@ jobs:

  - name: Tests
    working-directory: tests
- run: pytest . -v --cov=litdata --durations=100
+ run: pytest . -v --cov=litdata --durations=100 -n auto --reruns 2

Suggested change:
- run: pytest . -v --cov=litdata --durations=100 -n auto --reruns 2
+ run: pytest . -v --cov=litdata --durations=100 -n auto

Rerun only specific tests, so that if something is really crashing, the run won't take twice as long.
@pytest.mark.flaky(reruns=5)
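The `flaky` marker comes from the pytest-rerunfailures plugin: the marked test is re-run up to `reruns` times and only reported as failed once that budget is exhausted, so a rerun policy can be scoped to known-flaky tests instead of the whole suite. A minimal sketch of that retry semantics — `run_with_reruns` is a hypothetical stand-in, not the plugin's API:

```python
def run_with_reruns(test, reruns):
    """Mimic pytest-rerunfailures: call `test` until it passes or the
    rerun budget is exhausted. Returns (passed, attempts)."""
    for attempt in range(1, reruns + 2):  # 1 initial run + `reruns` retries
        try:
            test()
            return True, attempt
        except Exception:
            if attempt == reruns + 1:
                return False, attempt

calls = {"n": 0}
def flaky_test():
    calls["n"] += 1
    assert calls["n"] >= 3  # fails twice, passes on the third call

print(run_with_reruns(flaky_test, reruns=5))  # -> (True, 3)
```

A genuinely broken test still fails after `reruns + 1` attempts, which is why scoping the marker matters: an unmarked crash fails once instead of eating the whole rerun budget.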
Hi, these are just attempts to see if the tests pass. The final PR will be updated to match the standard.
A weird thing I've seen: after the tests have completed, the run still hangs for some reason. I'm currently investigating the cause.
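One common way to investigate a run that hangs after the tests finish is to dump every thread's stack and list any non-daemon threads that would keep the interpreter alive. This is only a debugging suggestion, not part of this PR; a sketch using the standard-library faulthandler:

```python
import faulthandler
import sys
import threading

def dump_hang_state():
    """Print the stack of every live thread, then flag non-daemon
    threads (other than main) that would keep the interpreter alive
    after the test session ends."""
    # faulthandler writes to a real file descriptor, so sys.stderr works
    # in CI but an in-memory StringIO would not.
    faulthandler.dump_traceback(file=sys.stderr, all_threads=True)
    for t in threading.enumerate():
        if t is not threading.main_thread() and not t.daemon:
            print(f"non-daemon thread still alive: {t.name}", file=sys.stderr)

# e.g. call this from an atexit handler, or send SIGABRT to the hung
# process after enabling faulthandler, to see where it is stuck
dump_hang_state()
```

Leftover non-daemon worker threads (uploader/remover style background workers) are a frequent cause of exactly this "tests passed but the process won't exit" symptom.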
@deependujha how much is missing?
Before submitting
What does this PR do?
Partial work on #612; follow-up work from #608 and #338.
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃