
Commit d29ff5a

added documentation of constrain transform
1 parent 9deda9d commit d29ff5a

2 files changed: +41 -0 lines changed


bayesflow/adapters/transforms/as_set.py

Lines changed: 7 additions & 0 deletions
@@ -8,6 +8,13 @@ class AsSet(ElementwiseTransform):
     The `.as_set(["x", "y"])` transform indicates that both `x` and `y` are treated as sets.
     That is, their values will be treated as *exchangeable*, so that they imply the same inference regardless of the values' order.
     This would be useful in a linear regression context where we can index the observations in arbitrary order and always get the same regression line.
+
+    Usage:
+
+        adapter = (
+            bf.Adapter()
+            .as_set(["x", "y"])
+        )
     """

     def forward(self, data: np.ndarray, **kwargs) -> np.ndarray:
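
As an aside, here is a minimal NumPy sketch of the exchangeability idea the docstring above describes; it is an illustration only and not part of the BayesFlow code. A permutation-invariant summary of a set-valued variable (here simply the mean) is unchanged when the observations are shuffled, so inference built on such summaries does not depend on the order of the values.

    # Illustration only: exchangeable ("set") variables imply order-invariant inference.
    import numpy as np

    rng = np.random.default_rng(seed=0)
    x = rng.normal(size=10)

    # Shuffling the observations leaves a permutation-invariant summary unchanged.
    summary_original = x.mean()
    summary_shuffled = rng.permutation(x).mean()
    assert np.isclose(summary_original, summary_shuffled)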

bayesflow/adapters/transforms/constrain.py

Lines changed: 34 additions & 0 deletions
@@ -15,6 +15,40 @@


 @serializable(package="bayesflow.adapters")
 class Constrain(ElementwiseTransform):
+    """
+    Constrains neural network predictions of a data variable to specified bounds.
+
+    Parameters:
+        String containing the name of the data variable to be transformed, e.g. "sigma". See examples below.
+
+    Named Parameters:
+        lower: Lower bound for the named data variable.
+        upper: Upper bound for the named data variable.
+        method: Method by which to map the unconstrained network predictions into the specified bounds. Choose from
+            - Double bounded methods: sigmoid, expit (default: sigmoid)
+            - Lower bound only methods: softplus, exp (default: softplus)
+            - Upper bound only methods: softplus, exp (default: softplus)
+
+
+
+    Examples:
+        Let sigma be the standard deviation of a normal distribution; then sigma must always be greater than zero.
+
+        Usage:
+            adapter = (
+                bf.Adapter()
+                .constrain("sigma", lower=0)
+            )
+
+        Suppose p is the parameter of a binomial distribution, which must lie in [0, 1]; then we would constrain the neural network estimate of p as follows.
+
+        Usage:
+            adapter = (
+                bf.Adapter()
+                .constrain("p", lower=0, upper=1, method="sigmoid")
+            )
+    """
+
     def __init__(
         self, *, lower: int | float | np.ndarray = None, upper: int | float | np.ndarray = None, method: str = "default"
     ):
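
As an aside, here is a minimal sketch of what the constraint methods listed above do conceptually; the exact scaling and numerical details of BayesFlow's Constrain transform may differ, so treat this as an illustration under those assumptions. A double-bounded method squashes an unconstrained prediction into (lower, upper), while a lower- or upper-bounded method maps it onto a half-open range.

    # Illustration only, not BayesFlow's implementation.
    import numpy as np

    def double_bounded(z, lower, upper):
        # "sigmoid"/"expit"-style map: squashes z into (lower, upper)
        return lower + (upper - lower) / (1.0 + np.exp(-z))

    def lower_bounded(z, lower):
        # "softplus"-style map: sends z into (lower, inf)
        return lower + np.log1p(np.exp(z))

    def upper_bounded(z, upper):
        # mirrored "softplus": sends z into (-inf, upper)
        return upper - np.log1p(np.exp(z))

    z = np.array([-3.0, 0.0, 3.0])
    print(double_bounded(z, 0.0, 1.0))  # values in (0, 1), e.g. for a probability p
    print(lower_bounded(z, 0.0))        # values > 0, e.g. for a standard deviation sigma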
