Message passing is an algorithm for computing marginal probabilities given a structured probabilistic model (a graph).
[Factor graph]: A factor graph is a bipartite graph. One type of node is a random variable (an unknown we would like to know). The other type of node is a factor, which represents an interaction between random variables, or evidence. The global probability must be representable as a product of local factors, i.e.,
p(x1, x2, ..., xn) = f1(x_V1) f2(x_V2) ... fm(x_Vm), where each x_Vi is a subset of the variables {x1, ..., xn}.
The interpretation of the world (the guessing of the random variables) depends on the factors. So if we multiply the factors together,
p = f1 * f2 * ... * fm,
we can calculate the marginal probabilities by probability propagation.
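As a quick illustration, here is a minimal sketch in Python/NumPy (the factor tables f1 and f2 are made-up toy numbers, not from any particular model) of a joint distribution built as a product of local factors, with a marginal computed by brute-force summation:

import numpy as np

# Toy factor graph over three binary variables x1, x2, x3:
# p(x1, x2, x3) is proportional to f1(x1, x2) * f2(x2, x3)
f1 = np.array([[1.0, 2.0],
               [3.0, 1.0]])   # f1[x1, x2]
f2 = np.array([[2.0, 1.0],
               [1.0, 4.0]])   # f2[x2, x3]

# Build the (unnormalized) joint as the product of the local factors
joint = f1[:, :, None] * f2[None, :, :]   # shape (x1, x2, x3)
joint /= joint.sum()                      # normalize

# Marginal p(x2): sum out x1 and x3
p_x2 = joint.sum(axis=(0, 2))
print("p(x2):", p_x2)

Brute-force summation like this is exponential in the number of variables; probability propagation exploits the graph structure to avoid that blow-up.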
[Bayesian Network]:
The joint probability of a network can be represented as a product of local conditional probabilities. The basic concept is built on sufficient statistics (equivalently, conditional independence).
[e.g.] Three random variables,
x -> y -> z: p(z|x,y) = p(z|y). Given y, knowing x is of no further use; that is, knowing y is sufficient to predict z. (See Wikipedia on sufficient statistics.)
The upshot is that we can factorize the distribution: p(x,y,z) = p(x) p(y|x) p(z|y).
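As a sanity check, here is a small sketch (Python/NumPy, with made-up conditional probability tables) verifying that in the chain x -> y -> z the conditional p(z|x,y) indeed equals p(z|y):

import numpy as np

# Made-up CPTs for the chain x -> y -> z (all variables binary)
p_x = np.array([0.6, 0.4])                 # p(x)
p_y_given_x = np.array([[0.7, 0.3],        # p(y|x), rows indexed by x
                        [0.2, 0.8]])
p_z_given_y = np.array([[0.9, 0.1],        # p(z|y), rows indexed by y
                        [0.4, 0.6]])

# Joint p(x, y, z) = p(x) p(y|x) p(z|y)
joint = (p_x[:, None, None]
         * p_y_given_x[:, :, None]
         * p_z_given_y[None, :, :])

# p(z|x,y): normalize the joint over z for each fixed (x, y)
p_z_given_xy = joint / joint.sum(axis=2, keepdims=True)
for x in range(2):
    for y in range(2):
        assert np.allclose(p_z_given_xy[x, y], p_z_given_y[y])
print("p(z|x,y) equals p(z|y) for every (x, y): y is sufficient to predict z")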
If we know p(x,y,z), we can calculate all the marginal distributions such as p(x), p(y), p(x,z). If we want to calculate p(z|x=x'), we can start with
p(z|x=x') = sum_{y} p(x=x', y, z) / sum_{y,z} p(x=x', y, z).
p(z, y, x=x') = p(z|y, x=x') p(y|x=x') p(x=x') = p(z|y) p(y|x') p(x').
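Putting the formula to work, here is a minimal sketch (same made-up CPTs as above, repeated so the snippet runs on its own) that computes p(z|x=x') by summing out y and normalizing:

import numpy as np

# Made-up CPTs for the chain x -> y -> z (all variables binary)
p_x = np.array([0.6, 0.4])
p_y_given_x = np.array([[0.7, 0.3], [0.2, 0.8]])
p_z_given_y = np.array([[0.9, 0.1], [0.4, 0.6]])
joint = (p_x[:, None, None] * p_y_given_x[:, :, None]
         * p_z_given_y[None, :, :])         # p(x, y, z)

x_prime = 0
# p(z|x=x') = sum_y p(x', y, z) / sum_{y,z} p(x', y, z)
numerator = joint[x_prime].sum(axis=0)      # sum over y, giving p(x', z)
p_z_given_xp = numerator / numerator.sum()  # the denominator is p(x')
print("p(z|x=x'):", p_z_given_xp)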
A Bayesian network can therefore be factorized into localized factors f1, f2, ..., fm.
A Bayesian network is a special case of a factor graph.
A Markov random field is also a special case of a factor graph.
[Probability Propagation]
To calculate the marginal probabilities of a factor graph, the probability propagation algorithm (also known as belief propagation or the sum-product algorithm) can be adopted.
z -->
y --> f --> x: the edge from a factor node f to a variable node x. The message sent along this edge combines the factor with the incoming messages from the other neighboring variables (here y and z) and sums them out:
mu_{f->x}(x) = sum_{y,z} f(x,y,z) mu_{y->f}(y) mu_{z->f}(z).
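Here is a minimal sketch of that factor-to-variable message (Python/NumPy; the factor table and the incoming messages are made-up placeholders): multiply the factor by the messages arriving from all of its other neighboring variables, then sum those variables out:

import numpy as np

# Hypothetical factor f(x, y, z) over three binary variables
f = np.array([[[0.9, 0.1], [0.4, 0.6]],
              [[0.3, 0.7], [0.5, 0.5]]])   # f[x, y, z]

# Made-up incoming variable-to-factor messages
msg_y_to_f = np.array([0.3, 0.7])          # mu_{y->f}(y)
msg_z_to_f = np.array([0.5, 0.5])          # mu_{z->f}(z)

# Factor-to-variable message:
# mu_{f->x}(x) = sum_{y,z} f(x,y,z) mu_{y->f}(y) mu_{z->f}(z)
msg_f_to_x = np.einsum('xyz,y,z->x', f, msg_y_to_f, msg_z_to_f)
print("mu_{f->x}:", msg_f_to_x)

Iterating such messages in both directions over every edge propagates the probabilities through the graph; on a tree, one forward and one backward sweep yields all the exact marginals.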