Unsupervised Domain Adaptation for Robust Amortized Bayesian Inference

This repository contains the code for running and reproducing the experiments from the paper Does Unsupervised Domain Adaptation Improve the Robustness of Amortized Bayesian Inference? A Systematic Evaluation, published in Transactions on Machine Learning Research (an arXiv version is also available).

The acronym JET, found throughout the repository, stands for "Joint Embedding Training", an early working name for the considered methods before we switched to the more general term "NPE-UDA" (Neural Posterior Estimation with Unsupervised Domain Adaptation). NPE-UDA methods augment NPE with UDA techniques that align the summary statistics of simulated and observed data during training (i.e., map their embeddings into a joint space) to achieve more robust inference in the real world. As our experiments show, the success of such methods depends strongly on the type of misspecification causing the sim2real gap. The code depends on the BayesFlow library, which provides the neural network architectures and training utilities.
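
To make the alignment idea concrete, below is a minimal, self-contained sketch of a maximum mean discrepancy (MMD) penalty on summary embeddings, one commonly used alignment criterion in unsupervised domain adaptation. It is only an illustration of the general principle, assuming NumPy; the function and variable names (gaussian_kernel, mmd_penalty, sim, obs) are placeholders and not part of this repository's or BayesFlow's API.

# Illustrative sketch only (hypothetical names, not the repository's code):
# a Gaussian-kernel MMD penalty measuring how far apart the embedding
# distributions of simulated and observed data are. NPE-UDA-style training
# would add such a penalty to the usual NPE loss during optimization.
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    """Pairwise Gaussian (RBF) kernel matrix between the rows of x and y."""
    sq_dists = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def mmd_penalty(sim_embeddings, obs_embeddings, bandwidth=1.0):
    """Biased estimate of the squared MMD between two embedding samples."""
    k_ss = gaussian_kernel(sim_embeddings, sim_embeddings, bandwidth)
    k_oo = gaussian_kernel(obs_embeddings, obs_embeddings, bandwidth)
    k_so = gaussian_kernel(sim_embeddings, obs_embeddings, bandwidth)
    return k_ss.mean() + k_oo.mean() - 2.0 * k_so.mean()

# Example: embeddings produced by a summary network for a batch of simulated
# and a batch of observed data sets (here replaced by random numbers).
rng = np.random.default_rng(0)
sim = rng.normal(size=(64, 8))           # simulated-data embeddings
obs = rng.normal(loc=0.5, size=(64, 8))  # observed-data embeddings
print(f"MMD penalty: {mmd_penalty(sim, obs):.4f}")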

Installation

First, create a new Python 3.11 virtual environment, e.g. using conda or micromamba:

conda create --name "npe-uda" -c conda-forge python=3.11 -y
conda activate npe-uda

Next, install pip-tools, which will be used to install the requirements from the lock file:

pip install pip-tools

Then, run the following command to sync the dependencies from requirements.txt:

pip-sync

Note that the requirements.txt file is compiled for a Linux environment; on other operating systems, it may be necessary to rerun the compilation via pip-compile requirements.in before syncing.

Finally, set the Keras backend environment variable to tensorflow, which is currently required for the NPE-DANN method:

conda env config vars set KERAS_BACKEND=tensorflow
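
As a session-local alternative to persisting the variable in the conda environment, the backend can also be selected from Python before Keras is first imported. This is standard Keras 3 behavior rather than anything specific to this repository:

# Must run before the first import of keras in the process.
import os
os.environ["KERAS_BACKEND"] = "tensorflow"

import keras
print(keras.backend.backend())  # prints "tensorflow" if the backend was picked up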

Cite

The article can be cited as:

@article{elsemueller2025does,
  title={Does Unsupervised Domain Adaptation Improve the Robustness of Amortized Bayesian Inference? A Systematic Evaluation},
  author={Lasse Elsem{\"u}ller and Valentin Pratz and Mischa von Krause and Andreas Voss and Paul-Christian B{\"u}rkner and Stefan T. Radev},
  journal={Transactions on Machine Learning Research},
  year={2025},
  url={https://openreview.net/forum?id=ewgLuvnEw6},
}
