Two moons test fit #175
Conversation
commit 320f1ae (lars <[email protected]>, Tue Jun 18 16:23:01 2024 +0200)
    fix two moons simulated dtype

commit 27f99cd (lars <[email protected]>, Tue Jun 18 16:09:45 2024 +0200)
    fix data modification for tensorflow compiled mode

commit c8060fc (merge: 3150d11 e2355de; lars <[email protected]>, Tue Jun 18 15:35:59 2024 +0200)
    Merge remote-tracking branch 'origin/streamlined-backend' into streamlined-backend

commit 3150d11 (lars <[email protected]>, Tue Jun 18 15:35:52 2024 +0200)
    add JAX Approximator
    finalize all Approximators

commit e2355de (Chase Grajeda <[email protected]>, Tue Jun 18 22:15:37 2024 +0900)
    Configurator Unit Tests (bayesflow-org#174)
    * First additions: Added __init__.py for test module. Added test_configurators.py. Added basic fixtures and construction tests.
    * Remaining tests: Added remaining unit tests.
    * Added conftest: Separated fixtures and placed them in conftest.py.
    * Added requested changes: Added batch_size, set_size, and num_features parameterizations in conftest.py. Combined repetitive fixtures in conftest.py. Combined repetitive tests in test_configurators.py. Parameterized Configurator initialization in conftest.py. Parameterized parameter selection in conftest.py. Removed initialization tests in test_configurators.py. Added summary_inputs and summary_conditions to parameters. Changed instances of '==None' to 'is None'. Removed 'config=Configurator' instances in test_configurators.py.
- Added test for post-training loss < pre-training loss to test_fit.py::test_fit
- Added test in test_fit.py::test_fit for vanishing weights
- Added test to test_fit.py verifying the simulator produces random and consistent data
- Added MMD test to test_two_moons.py; added MMD method to utils/ops.py; added test_dataset to test_two_moons/conftest.py
- Added auto-formatting changes from ruff
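The loss-comparison pattern from the first bullet (assert that training reduces the loss) can be sketched on a toy model. Everything below, the linear model, the synthetic data, and the learning rate, is an illustrative stand-in, not the PR's actual approximator or dataset:

```python
import numpy as np

# Toy stand-in: 1-D linear regression trained by gradient descent.
rng = np.random.default_rng(42)
x = rng.normal(size=128)
y = 2.0 * x + 0.1 * rng.normal(size=128)

def loss(w):
    # mean squared error of the linear model y_hat = w * x
    return np.mean((w * x - y) ** 2)

w = 0.0
pre_loss = loss(w)
for _ in range(100):
    grad = np.mean(2.0 * (w * x - y) * x)
    w -= 0.1 * grad
post_loss = loss(w)

# The assertion mirrors the test's intent: training must reduce the loss.
assert post_loss < pre_loss
```

The same shape of assertion applies regardless of model: record the metric before `fit`, record it after, and compare.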
Great changes, thank you. I added some comments for future improvement, but I will merge this as-is. Also, make sure to pass the linter check in the future (you can do this automatically by installing the pre-commit hook with `conda env update -f environment.yaml && pre-commit install && pre-commit run --all-files`.)
@@ -7,3 +7,30 @@ def isclose(x1, x2, rtol=1e-5, atol=1e-8):


def allclose(x1, x2, rtol=1e-5, atol=1e-8):
    return keras.ops.all(isclose(x1, x2, rtol, atol))


def max_mean_discrepancy(x, y):
In the future, let's put these in `bayesflow/metrics` directly. This time, I will move this and make it a bit more customizable.
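For reference, a maximum mean discrepancy estimator of the kind added in the diff can be sketched in NumPy. The Gaussian kernel and the `scale` bandwidth below are illustrative assumptions, not the PR's actual implementation:

```python
import numpy as np

def max_mean_discrepancy(x, y, scale=1.0):
    # Biased MMD^2 estimate with a Gaussian kernel; `scale` is an
    # assumed bandwidth, not taken from the PR.
    def gaussian_kernel(a, b):
        # pairwise squared Euclidean distances between rows of a and b
        sq_dists = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
        return np.exp(-sq_dists / (2.0 * scale**2))

    return (
        gaussian_kernel(x, x).mean()
        - 2.0 * gaussian_kernel(x, y).mean()
        + gaussian_kernel(y, y).mean()
    )

rng = np.random.default_rng(0)
samples = rng.normal(size=(256, 2))
shifted = samples + 3.0

mmd_same = max_mean_discrepancy(samples, samples)  # exactly 0 for identical samples
mmd_far = max_mean_discrepancy(samples, shifted)   # clearly positive
```

This is why MMD works as a convergence check for the two-moons test: samples drawn from the trained model should yield an MMD against the reference data close to zero, while a poorly fit model yields a clearly positive value.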
# Test model weights have not vanished
for layer in approximator.layers:
    for weight in layer.weights:
        assert not keras.ops.any(keras.ops.isnan(weight)).numpy()
`tensor.numpy()` is not backend-agnostic and also not necessary here (when necessary, use `keras.ops.convert_to_numpy` instead).
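The check the reviewer has in mind can be sketched with NumPy stand-ins. The `weights` list below is hypothetical; in the actual test the tensors come from `approximator.layers` / `layer.weights`, and with Keras 3 the commented `keras.ops.convert_to_numpy` call would make the conversion backend-agnostic:

```python
import numpy as np

# Hypothetical stand-in for a model's weight tensors.
weights = [np.ones((4, 4)), np.zeros((4,))]

for weight in weights:
    # In the real test: weight = keras.ops.convert_to_numpy(weight)
    # would work across TensorFlow, JAX, and PyTorch backends;
    # calling .numpy() only works on some of them.
    assert not np.any(np.isnan(weight)), "model weights contain NaNs"
```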
pre_loss = approximator.compute_metrics(train_dataset.data)["loss"]
pre_val_loss = approximator.compute_metrics(validation_dataset.data)["loss"]
x_before = approximator.inference_network(inf_vars, conditions=inf_conds)
mmd_before = max_mean_discrepancy(x_before, y)
inconsistent naming
approximator.compile(jit_compile=jit_compile, loss=keras.losses.KLDivergence())
inf_vars = approximator.configurator.configure_inference_variables(test_dataset.data)
inf_conds = approximator.configurator.configure_inference_conditions(test_dataset.data)
y = test_dataset.data["x"]
relies on the internal structure of the dataset fixture, which is not good. Use `test_batch = test_dataset[0]; observables = test_batch["x"]` instead.
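The point of the suggestion, accessing batches through the dataset's public indexing rather than its internal `.data` attribute, can be illustrated with a toy dataset. The class below is a hypothetical stand-in, not BayesFlow's actual dataset implementation:

```python
import numpy as np

class ToyDataset:
    # Hypothetical stand-in: data is stored internally and exposed only
    # through __getitem__, so tests do not depend on the storage layout.
    def __init__(self, data):
        self._data = data  # internal layout; may change without notice

    def __getitem__(self, index):
        # return one batch as a dict of arrays
        return {key: value[index] for key, value in self._data.items()}

dataset = ToyDataset({"x": np.arange(24).reshape(2, 3, 4)})

# Preferred access pattern from the review comment:
test_batch = dataset[0]
observables = test_batch["x"]
```

If the dataset later changes how it stores data internally, tests written against `__getitem__` keep working, while tests reaching into `.data` break.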
Merged 8b47c03 into bayesflow-org:streamlined-backend
Adds tests to verify convergence in test_two_moons/test_two_moons.py and test_fit.