Commit 4f43287

Merge commit with 2 parents: ff7b123 + e6c0f06


README.md

Lines changed: 17 additions & 18 deletions
@@ -1,25 +1,26 @@
 BayesianNonparametrics.jl
 ===========
-BayesianNonparametrics.jl is a Julia package implementing state-of-the-art Bayesian nonparametric models for medium-sized unsupervised problems. The software package brings Bayesian nonparametrics to non-specialists allowing the widespread use of Bayesian nonparametric models. Emphasis is put on consistency, performance and ease of use allowing easy access to Bayesian nonparametric models inside Julia.
+[![Build Status](https://travis-ci.org/OFAI/BayesianNonparametrics.jl.svg?branch=master)](https://travis-ci.org/OFAI/BayesianNonparametrics.jl)
+[![Coverage Status](https://coveralls.io/repos/github/OFAI/BayesianNonparametrics.jl/badge.svg?branch=master)](https://coveralls.io/github/OFAI/BayesianNonparametrics.jl?branch=master)
+
+*BayesianNonparametrics* is a Julia package implementing state-of-the-art Bayesian nonparametric models for medium-sized unsupervised problems. The package brings Bayesian nonparametrics to non-specialists, enabling the widespread use of Bayesian nonparametric models. Emphasis is put on consistency, performance and ease of use, allowing easy access to Bayesian nonparametric models inside Julia.
 
-BayesianNonparametrics.jl allows you to
+*BayesianNonparametrics* allows you to
 
 - explain discrete or continuous data using Dirichlet Process Mixtures or Hierarchical Dirichlet Process Mixtures
 - analyse variable dependencies using the Variable Clustering Model
 - fit multivariate or univariate distributions for discrete or continuous data with conjugate priors
 - compute point estimates of Dirichlet Process Mixture posterior samples
 
-Requirements
-------------
-* julia version 0.6
-* packages listed in REQUIREMENTS file
+#### News
+*BayesianNonparametrics* is Julia 0.7 / 1.0 compatible
 
 Installation
 ------------
-You can clone the package into your running julia 0.5 installation using
+You can install the package into your running Julia installation using Julia's package manager:
 
 ```julia
-Pkg.add("BayesianNonparametrics")
+pkg> add BayesianNonparametrics
 ```
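Once added, the package is loaded like any other Julia package:

```julia
using BayesianNonparametrics
```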
 
 Documentation
@@ -29,7 +30,7 @@ Documentation is available in Markdown:
 
 Example
 -------
-The following example illustrates the use of BayesianNonparametrics.jl for clustering of continuous observations using a Dirichlet Process Mixture of Gaussians.
+The following example illustrates the use of *BayesianNonparametrics* for clustering of continuous observations using a Dirichlet Process Mixture of Gaussians.
 
 After loading the package:
 
@@ -46,7 +47,7 @@ we can generate a 2D synthetic dataset (or use a multivariate continuous dataset
 and construct the parameters of our base distribution:
 
 ```julia
-μ0 = vec(mean(X, 1))
+μ0 = vec(mean(X, dims = 1))
 κ0 = 5.0
 ν0 = 9.0
 Σ0 = cov(X)
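# Note: these four values follow the standard Normal-inverse-Wishart
# parameterisation of a Gaussian base measure (an assumption here; the diff
# does not name the base measure explicitly):
#   Σ ~ InverseWishart(ν0, Σ0),   μ | Σ ~ Normal(μ0, Σ / κ0)
# so κ0 and ν0 control how tightly the prior concentrates around μ0 and Σ0.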
@@ -65,7 +66,7 @@ which is in this case a Dirichlet Process Mixture. Each model has to be initialised
 modelBuffer = init(X, model, KMeansInitialisation(k = 10))
 ```
 
-The resulting buffer object can now be used to apply posterior inference on the model given $X$. In the following we apply Gibbs sampling for 500 iterations without burn in or thining:
+The resulting buffer object can now be used to apply posterior inference on the model given `X`. In the following we apply Gibbs sampling for 500 iterations without burn-in or thinning:
 
 ```julia
 models = train(modelBuffer, DPMHyperparam(), Gibbs(maxiter = 500))
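# Since no burn-in or thinning is used, all 500 sampler states are returned
# in iteration order; early samples can be dropped by hand if desired, e.g.
# (assuming `models` behaves like a plain Julia vector of posterior samples):
posterior = models[101:end]  # discard the first 100 samples as burn-in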
@@ -74,19 +75,19 @@ models = train(modelBuffer, DPMHyperparam(), Gibbs(maxiter = 500))
 You should see the progress of the sampling process in the command line. After applying Gibbs sampling, it is possible to explore the posterior samples based on their posterior densities,
 
 ```julia
-densities = Float64[m.energy for m in models]
+densities = map(m -> m.energy, models)
 ```
 
 the number of active components,
 
 ```julia
-activeComponents = Int[sum(m.weights .> 0) for m in models]
+activeComponents = map(m -> sum(m.weights .> 0), models)
 ```
 
 or the groupings of the observations:
 
 ```julia
-assignments = [m.assignments for m in models]
+assignments = map(m -> m.assignments, models)
 ```
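From these summaries a single representative sample can be picked directly, e.g. the sample with the highest posterior density (plain Julia below, assuming `models` is an ordinary vector; this is not part of the package API):

```julia
# index of the posterior sample with the largest density
best = models[argmax(densities)]
bestAssignments = best.assignments
```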
 
 The following animation illustrates posterior samples obtained by a Dirichlet Process Mixture:
@@ -111,12 +112,10 @@ end
 and find the optimal partition which minimizes the lower bound of the variation of information:
 
 ```julia
-mink = minimum([length(m.weights) for m in models])
-maxk = maximum([length(m.weights) for m in models])
+mink = minimum(length(m.weights) for m in models)
+maxk = maximum(length(m.weights) for m in models)
 (peassignments, _) = pointestimate(PSM, method = :average, mink = mink, maxk = maxk)
 ```
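`PSM` is presumably the posterior similarity matrix; its construction is elided from this diff, but the underlying quantity can be sketched in plain Julia (a sketch of the idea only, not the package's API):

```julia
# PSM[i, j] estimates the posterior probability that observations i and j
# share a cluster: the fraction of samples assigning them to the same group.
N = length(first(assignments))
PSM = zeros(N, N)
for a in assignments
    PSM .+= (a .== permutedims(a))
end
PSM ./= length(assignments)
```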
 
 The grouping which minimizes the lower bound of the variation of information is illustrated in the following image:
 ![alt text](pointestimate.png "Point Estimate")
-
-[![Build Status](https://travis-ci.org/OFAI/BayesianNonparametrics.jl.svg?branch=master)](https://travis-ci.org/OFAI/BayesianNonparametrics.jl)
