# User API

Documentation for `ForneyLab.jl`'s public API.

See Internal API for internal package docs.

## Model specification

`@RV` provides a convenient way to add `Variable`s and `FactorNode`s to the graph.

Examples:

```
# Automatically create new Variable x, try to assign x.id = :x if this id is available
@RV x ~ GaussianMeanVariance(constant(0.0), constant(1.0))

# Explicitly specify the id of the Variable
@RV [id=:my_y] y ~ GaussianMeanVariance(constant(0.0), constant(1.0))

# Automatically assign z.id = :z if this id is not yet taken
@RV z = x + y

# Manual id assignment
@RV [id=:my_u] u = x + y

# Just create a variable
@RV x
@RV [id=:my_x] x
```
source

`FactorGraph`: a factor graph consisting of factor nodes and edges.

source

A `Variable` encompasses one or more edges in a `FactorGraph`.

source

Return the currently active `FactorGraph`. Create one if there is none.

source

### Factor nodes

Description:

```
An addition constraint factor node.

f(out,in1,in2) = δ(in1 + in2 - out)
```

Interfaces:

```
1. out
2. in1
3. in2
```

Construction:

``Addition(out, in1, in2, id=:some_id)``
source
``-(in1::Variable, in2::Variable)``

A subtraction constraint based on the addition factor node.

source

Description: Bernoulli factor node

```
out ∈ {0, 1}
p ∈ [0, 1]

f(out, p) = Ber(out|p) = p^out (1 - p)^{1 - out}
```

Interfaces:

```
1. out
2. p
```

Construction:

``Bernoulli(id=:some_id)``

source

Description: Beta factor node

```
Real scalars
a > 0
b > 0

f(out, a, b) = Beta(out|a, b) = Γ(a + b)/(Γ(a) Γ(b)) out^{a - 1} (1 - out)^{b - 1}
```

Interfaces:

```
1. out
2. a
3. b
```

Construction:

``Beta(id=:some_id)``

source
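As a sketch of how these distribution nodes compose into a model, consider a hypothetical Beta-Bernoulli coin-flip example using the `@RV` and `placeholder` syntax documented on this page (ids and parameter values are illustrative):

```julia
using ForneyLab

g = FactorGraph()

@RV p ~ Beta(1.0, 1.0)   # uniform prior on the coin-flip probability
@RV y ~ Bernoulli(p)     # observation model
placeholder(y, :y)       # y will be supplied as data at execution time
```

Non-`Variable` arguments such as `1.0` are cast to `Variable`s automatically.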

Description: Categorical factor node

```
The categorical node defines a one-dimensional probability
distribution over the normal basis vectors of dimension d

out ∈ {0, 1}^d where Σ_k out_k = 1
p ∈ [0, 1]^d, where Σ_k p_k = 1

f(out, p) = Cat(out | p)
          = Π_i p_i^{out_i}
```

Interfaces:

```
1. out
2. p
```

Construction:

``Categorical(id=:some_id)``

source

`@ensureVariables(...)` casts all non-`Variable` arguments to `Variable` through `constant(arg)`.

source

Description:

```
A factor that clamps a variable to a constant value.

f(out) = δ(out - value)
```

Interfaces:

``1. out``

Construction:

```
Clamp(out, value, id=:some_id)
Clamp(value, id=:some_id)
```
source

`constant` creates a `Variable` which is linked to a new `Clamp`, and returns this variable.

``y = constant(3.0, id=:y)``
source

`placeholder(...)` creates a `Clamp` node and registers this node as a data placeholder with the current graph.

```
# Link variable y to buffer with id :y,
# indicate that Clamp will hold Float64 values.
placeholder(y, :y, datatype=Float64)

# Link variable y to index 3 of buffer with id :y.
# Specify the data type by passing a default value for the Clamp.
placeholder(y, :y, index=3, default=0.0)

# Indicate that the Clamp will hold an array of size `dims`,
# with Float64 elements.
placeholder(X, :X, datatype=Float64, dims=(3,2))
```
source

The `@composite` macro allows for defining custom (composite) nodes. Composite nodes allow for the implementation of custom update rules that may be computationally more efficient or convenient. A composite node can be defined with or without an internal model. For detailed usage instructions we refer to the `composite_nodes` demo.

source

Description: Contingency factor node

```
The contingency distribution is a multivariate generalization of
the categorical distribution. As a bivariate distribution, the
contingency distribution defines the joint probability
over two unit vectors. The parameter p encodes a contingency matrix
that specifies the probability of co-occurrence.

out1 ∈ {0, 1}^d1 where Σ_j out1_j = 1
out2 ∈ {0, 1}^d2 where Σ_k out2_k = 1
p ∈ [0, 1]^{d1 × d2}, where Σ_jk p_jk = 1

f(out1, out2, p) = Con(out1, out2 | p)
                 = Π_jk p_jk^{out1_j * out2_k}

A Contingency distribution over more than two variables requires
higher-order tensors as parameters; these are not implemented in ForneyLab.
```

Interfaces:

```
1. out1
2. out2
3. p
```

Construction:

``Contingency(id=:some_id)``

source

Description: Dirichlet factor node

```
Multivariate:
f(out, a) = Dir(out|a)
          = Γ(Σ_i a_i)/(Π_i Γ(a_i)) Π_i out_i^{a_i - 1}
where 'a' is a vector with every a_i > 0

Matrix variate:
f(out, a) = Π_k Dir(out_*k|a_*k)
where 'a' represents a left-stochastic matrix with every a_jk > 0
```

Interfaces:

```
1. out
2. a
```

Construction:

``Dirichlet(id=:some_id)``

source

Description:

```
out = in1'*in2

in1: d-dimensional vector
in2: d-dimensional vector
out: scalar

           in2
            |
    in1     v    out
    ------>[⋅]------>

f(out, in1, in2) = δ(out - in1'*in2)
```

Interfaces:

```
1. i[:out]
2. i[:in1]
3. i[:in2]
```

Construction:

``DotProduct(out, in1, in2, id=:my_node)``
source

Description:

```
An equality constraint factor node

f([1],[2],[3]) = δ([1] - [2]) δ([1] - [3])
```

Interfaces:

``1, 2, 3``

Construction:

```
Equality(id=:some_id)
```

The interfaces of an Equality node have to be connected manually.
source

Description:

```
Maps a location to a scale parameter by exponentiation

f(out,in1) = δ(out - exp(in1))
```

Interfaces:

```
1. out
2. in1
```

Construction:

``Exponential(out, in1, id=:some_id)``
source

Description:

```
A gamma node with shape-rate parameterization:

f(out,a,b) = Gam(out|a,b) = 1/Γ(a) b^a out^{a - 1} exp(-b out)
```

Interfaces:

```
1. out
2. a (shape)
3. b (rate)
```

Construction:

``Gamma(out, a, b, id=:some_id)``
source

Description:

```
A Gaussian with mean-precision parameterization:

f(out,m,w) = 𝒩(out|m,w) = (2π)^{-D/2} |w|^{1/2} exp(-1/2 (out - m)' w (out - m))
```

Interfaces:

```
1. out
2. m (mean)
3. w (precision)
```

Construction:

``GaussianMeanPrecision(out, m, w, id=:some_id)``
source

Description:

```
A Gaussian with mean-variance parameterization:

f(out,m,v) = 𝒩(out|m,v) = (2π)^{-D/2} |v|^{-1/2} exp(-1/2 (out - m)' v^{-1} (out - m))
```

Interfaces:

```
1. out
2. m (mean)
3. v (covariance)
```

Construction:

``GaussianMeanVariance(out, m, v, id=:some_id)``
source

Description:

```
A Gaussian mixture with mean-precision parameterization:

f(out, z, m1, w1, m2, w2, ...) = 𝒩(out|m1, w1)^z_1 * 𝒩(out|m2, w2)^z_2 * ...
```

Interfaces:

```
1. out
2. z (switch)
3. m1 (mean)
4. w1 (precision)
5. m2 (mean)
6. w2 (precision)
...
```

Construction:

``GaussianMixture(out, z, m1, w1, m2, w2, ..., id=:some_id)``
source

Description:

```
A Gaussian with weighted-mean-precision parameterization:

f(out,xi,w) = 𝒩(out|w^{-1} xi, w^{-1}), where xi = w*m is the weighted mean
```

Interfaces:

```
1. out
2. xi (weighted mean, w*m)
3. w (precision)
```

Construction:

``GaussianWeightedMeanPrecision(out, xi, w, id=:some_id)``
source

Description:

```
A log-normal node with location-scale parameterization:

f(out,m,s) = logN(out|m, s) = 1/out (2π s)^{-1/2} exp(-1/(2s) (log(out) - m)^2)
```

Interfaces:

```
1. out
2. m (location)
3. s (squared scale)
```

Construction:

``LogNormal(out, m, s, id=:some_id)``
source

Description:

```
For continuous random variables, the multiplication node acts
as a (matrix) multiplication constraint, with node function

f(out, in1, a) = δ(out - a*in1)
```

Interfaces:

```
1. out
2. in1
3. a
```

Construction:

``Multiplication(out, in1, a, id=:some_id)``
source

Description:

```
Nonlinear node modeling a nonlinear relation. Updates for
the nonlinear node are computed through local linearization.

f(out, in1) = δ(out - g(in1))
```

Interfaces:

```
1. out
2. in1
```

Construction:

``Nonlinear(out, in1, g::Function, J_g::Function, id=:my_node)``
source

Description: Poisson factor node

```
Real scalars
l > 0 (rate)

f(out, l) = Poisson(out|l) = 1/(out!) l^out exp(-l)
```

Interfaces:

```
1. out
2. l
```

Construction:

``Poisson(id=:some_id)``

source

Description: Constrains a continuous, real-valued variable with a binary (boolean) variable.

``f(bin, real) = σ(bin⋅real)``

Interfaces:

```
1. bin
2. real
```

Construction:

``Sigmoid(id=:some_id)``

source

Description:

```
The transition node models a transition between discrete
random variables, with node function

f(out, in1, a) = Cat(out | a*in1)

where a is a left-stochastic matrix (columns sum to one)
```

Interfaces:

```
1. out
2. in1
3. a
```

Construction:

``Transition(out, in1, a, id=:some_id)``
source

Description:

```
A Wishart node:

f(out,v,nu) = W(out|v, nu) = B(v, nu) |out|^{(nu - D - 1)/2} exp(-1/2 tr(v^{-1} out))
```

Interfaces:

```
1. out
2. v (scale matrix)
3. nu (degrees of freedom)
```

Construction:

``Wishart(out, v, nu, id=:some_id)``
source

## Scheduling

A `MarginalSchedule` defines the update order for marginal computations.

source

A `RecognitionFactorization` holds a collection of (non-overlapping) recognition factors that specify the recognition factorization over a factor graph, as used for variational inference.

source

A `Schedule` defines the update order for message computations.

source

Return the currently active `RecognitionFactorization`. Create one if there is none.

source

`expectationPropagationSchedule()` generates an expectation propagation message passing schedule.

source

`sumProductSchedule()` generates a sum-product message passing schedule that computes the marginals for each of the argument variables.

source

`variationalExpectationPropagationSchedule()` generates an expectation propagation message passing schedule that is limited to the `recognition_factor`. Updates on EP sites are computed with an `ExpectationPropagationRule`.

source

`variationalSchedule()` generates a variational message passing schedule that computes the marginals for each of the recognition distributions in the recognition factor.

source

## Algorithm generation

Create a sum-product algorithm to infer marginals over `variables`, and compile it to Julia code.

source
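A typical end-to-end sketch, assuming the conventional ForneyLab workflow in which the generated source defines a `step!` function (the model, the placeholder id `:y`, and the observed value are illustrative):

```julia
using ForneyLab

g = FactorGraph()

# Model: x ~ N(0, 1); y ~ N(x, 1), with y observed
@RV x ~ GaussianMeanVariance(0.0, 1.0)
@RV y ~ GaussianMeanVariance(x, 1.0)
placeholder(y, :y)

algo = sumProductAlgorithm(x)  # Julia source code as a String
eval(Meta.parse(algo))         # defines step!()

data = Dict(:y => 0.5)
marginals = step!(data)        # marginals[:x] holds the posterior over x
```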

The `freeEnergyAlgorithm` function accepts a `RecognitionFactorization` and returns (if possible) Julia code for computing the variational free energy with respect to the argument recognition factorization and corresponding `FactorGraph` (model).

source


Create a variational algorithm to infer marginals over a recognition distribution, and compile it to Julia code.

source
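A hedged sketch of the variational workflow, assuming the constructor is named `variationalAlgorithm` (as in ForneyLab's demos) and that evaluating the generated source defines per-factor `step…!` functions; all ids and parameter values are illustrative:

```julia
using ForneyLab

g = FactorGraph()

# Model: unknown mean m and precision w for an observation y
@RV m ~ GaussianMeanVariance(0.0, 100.0)
@RV w ~ Gamma(0.01, 0.01)
@RV y ~ GaussianMeanPrecision(m, w)
placeholder(y, :y)

# Recognition distribution factorized as q(m)q(w)
q = RecognitionFactorization(m, w, ids=[:M, :W])

algo = variationalAlgorithm(q)  # defines stepM!() and stepW!() when evaluated
fe   = freeEnergyAlgorithm(q)   # defines freeEnergy() when evaluated
eval(Meta.parse(algo))
eval(Meta.parse(fe))
```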

Create a variational EP algorithm to infer marginals over a recognition distribution, and compile it to Julia code.

source

## Algorithm execution

Encodes a message, which is a probability distribution with a scaling factor.

source

`PointMass` is an abstract type used to describe point mass distributions. It never occurs in a `FactorGraph`, but it is used as a probability distribution type.

source

Encodes a probability distribution as a `FactorNode` of type `family` with fixed interfaces.

source
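For instance, a sketch assuming the keyword-argument constructor style used throughout ForneyLab (parameter values are illustrative):

```julia
using ForneyLab

# A univariate Gaussian distribution with fixed parameters
d = ProbabilityDistribution(Univariate, GaussianMeanVariance, m=0.0, v=1.0)

# The same distribution wrapped as a message
msg = Message(Univariate, GaussianMeanVariance, m=0.0, v=1.0)

# A point mass (delta) distribution
pm = ProbabilityDistribution(Univariate, PointMass, m=1.5)
```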

## Helper

Matrix inversion using the Cholesky decomposition; on failure, retries with added regularization (`1e-8*I`).

source

Helper function to construct a 1×1 `Matrix`.

source

Duplicate a method definition with the order of the first two arguments swapped. This macro is used to duplicate methods that are symmetrical in their first two input arguments, but require explicit definitions for the different argument orders. Example:

```
@symmetrical function prod!(x, y, z)
    ...
end
```
source

ensureMatrix: cast input to a Matrix if necessary

source

isApproxEqual: check approximate equality

source

isRoundedPosDef: checks whether the input matrix is positive definite, rounding first to prevent the floating-point precision problems that `isposdef()` suffers from.

source

`leaftypes(datatype)` returns all subtypes of `datatype` that are leafs in the type tree.

source

trigammaInverse(x): solve `trigamma(y) = x` for `y`.

Uses Newton's method on the convex function 1/trigamma(y). Iterations converge monotonically. Based on the trigammaInverse implementation in the R package "limma" by Gordon Smyth: https://github.com/Bioconductor-mirror/limma/blob/master/R/fitFDist.R

source
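The iteration can be sketched as follows; this is an illustrative reimplementation (not ForneyLab's internal code), assuming the `SpecialFunctions` package for `polygamma`, where `polygamma(1, y)` is the trigamma function and `polygamma(2, y)` its derivative:

```julia
using SpecialFunctions  # provides polygamma

# Solve trigamma(y) = x for y via Newton's method on the convex
# function 1/trigamma(y), following the limma approach.
function trigamma_inverse_sketch(x; iters=50, rtol=1e-10)
    y = 0.5 + 1/x  # starting value
    for _ in 1:iters
        tri = polygamma(1, y)                  # trigamma(y)
        dif = tri*(1 - tri/x)/polygamma(2, y)  # Newton step on 1/trigamma
        y += dif
        abs(dif) < rtol*y && return y
    end
    return y
end
```

As a sanity check, `polygamma(1, trigamma_inverse_sketch(x))` should recover `x` to numerical precision.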