
Conversation

RohanDisa

  • Convert df_test.py from unittest to pytest with fixtures and markers
  • Convert df_integrationtest.py to pytest format with integration markers
  • Update sql_test.py with pytest fixtures and modern assertions
  • Add pytest configuration to pyproject.toml
  • Format all test files with pyink
  • Add @pytest.mark.unit, @pytest.mark.integration, @pytest.mark.gcs markers
  • Maintain existing project structure and Google-style docstrings

Tests now use modern pytest features while respecting project guidelines.

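As an illustration of the kind of unittest-to-pytest conversion described in the list above (the `square` function, fixture, and test names here are hypothetical stand-ins, not the project's actual tests):

```python
import unittest

import pytest


def square(x):
    # Stand-in for a function under test; purely illustrative.
    return x * x


# Before: unittest style, with camelCase assertion methods.
class SquareTest(unittest.TestCase):

    def test_square(self):
        self.assertEqual(square(3), 9)


# After: pytest style, with a fixture and a plain assert.
@pytest.fixture
def value():
    return 3


def test_square(value):
    assert square(value) == 9
```

The pytest version drops the class boilerplate, and plain `assert` gives pytest's assertion rewriting a chance to print rich failure diffs.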
Owner

@alxmrs alxmrs left a comment


A few nits. I'd also like to confirm that all of the tests were ported; the way I'm diffing the changes is a bit confusing, and I just want to be certain.

@@ -1,415 +0,0 @@
[deleted file: a 415-line SVG flame graph stack visualization generated with inferno/FlameGraph]
Owner


Thanks for deleting these; they've gone stale.

# -------------------------
class TestExplode:

@pytest.mark.unit
Owner


How do you intend to use this marker? What if we considered "unit" tests as the default type of test? Then, we could omit adding these markers to every test.
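If unmarked tests are treated as unit tests by default, one way to make that explicit without annotating every test is a small collection hook in `conftest.py`. This is only a sketch, assuming `integration` is the one marker that needs distinguishing:

```python
import pytest


def pytest_collection_modifyitems(items):
    """Treat any test without an explicit integration marker as a unit test."""
    for item in items:
        if not any(item.iter_markers(name="integration")):
            item.add_marker(pytest.mark.unit)
```

With this in place, `pytest -m unit` and `pytest -m integration` partition the suite cleanly while individual test functions stay unmarked.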

ds = next(iter(explode(air_dataset)))
iselection = {dim: slice(0, s) for dim, s in ds.sizes.items()}
self.assertEqual(self.air.isel(iselection), ds)
assert air_dataset.isel(iselection).equals(ds)
Owner


Nice! This is slick.

# -------------------------
# Test Classes
# -------------------------
class TestExplode:
Owner


I am ok with omitting test classes! I'm happy to have everything just be functions, too. I leave the decision up to you.

LIMIT 30
"""
).to_pandas()
@pytest.fixture
Owner


Should we move this fixture to the top of the file?

from .df import explode, read_xarray, block_slices, from_map, pivot, from_map_batched


# -------------------------
Owner


(optional): I don't think we need these section comments, but I defer to your judgement if we should keep them (I do see some of their benefit).

pyproject.toml Outdated

[tool.setuptools_scm]

[tool.pytest.ini_options]
Owner


I think this config or the CI config needs to be updated in order to ensure tests run properly.
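For reference, markers like the ones under discussion are typically registered in this section so pytest does not warn about unknown marks. A minimal sketch, not the project's actual configuration:

```toml
[tool.pytest.ini_options]
markers = [
    "unit: fast, isolated tests (the default)",
    "integration: slower tests that touch external systems",
    "gcs: tests that require Google Cloud Storage access",
]
```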

@alxmrs
Owner

alxmrs commented Sep 28, 2025

High level: Thank you for your contribution! This is such an improvement to how testing was done before :)

- Assumed tests to be default unit tests, so removed explicit markers.
- Omitted classes for a cleaner, more readable structure.
- Added all missing test cases that were previously skipped in df_test.
- Moved all fixtures to the top for better organization.
- Reverted to the original pyproject.toml, which is sufficient for running tests.
@RohanDisa
Author

Hi @alxmrs, I have made the requested changes. When you get a chance, could you please take another look? Thanks!

Owner

@alxmrs alxmrs left a comment


This is such an improvement! Thank you for your contribution.

I have one last high level piece of feedback. You're free to fix this and then merge, or merge and then follow up with it in a future PR/issue. It's not a deal breaker to me.

class Era5TestCast(unittest.TestCase):

def test_open_era5(self):
@pytest.mark.integration
Owner


One potential use for this marker would be to exclude integration tests from normal CI runs (and maybe run them in a separate action called "integration" or "slow tests"). The way we handle this today is the file naming pattern "_integrationtest.py". I'm happy to do one of the following, but not both:

  • Moving this test to df_test.py and keeping the integration test marker
  • Removing the marker and keeping the file naming pattern to control which tests are run

I leave it up to you to choose which of these patterns for managing slow tests to use. For now, I'm happy to have integration tests not run in CI (they aren't run there today). Running them in CI would be a good thing to do in the future; I'll file an issue about that soon.

Owner


(My apologies for not bringing this up earlier).

Author


Hey @alxmrs, I have gone with option 1!

@alxmrs
Owner

alxmrs commented Oct 2, 2025

Before merging, please make sure all the checks pass.


@pytest.mark.integration
@pytest.mark.gcs
def test_open_era5():
Owner


I like this option, but now there needs to be something in the Actions runner to exclude this test, because it will take a long time (and it may not even work; I think it requires additional GCP setup). Please exclude tests marked with integration in the CI before we merge.
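One way to honor that request in the GitHub Actions workflow; the step name and surrounding workflow file are assumptions, though the `-m` expression is standard pytest marker selection:

```yaml
# Hypothetical step in the existing test workflow
- name: Run unit tests
  run: pytest -m "not integration"
```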

@RohanDisa
Author

Hey @alxmrs, I have fixed the linting errors. Do tell me if any other changes are needed from my side.

@RohanDisa
Author

I noticed that all five tests failed with the same runner shutdown error:
Error: The runner has received a shutdown signal. This can happen when the runner service is stopped, or a manually started runner is canceled.
This seems to be an infrastructure issue rather than a problem with the code. Please let me know if there’s anything you would like me to check or adjust.

@alxmrs
Owner

alxmrs commented Oct 12, 2025

I just reran the tests. But, reading the logs, it still does look like at least one test is failing.

Do tests pass locally? If so, I wonder what the differences between that and CI are. Would you like assistance trying to get the tests to pass?
