
Mixture Of Multivariate Gaussian Distribution Tensorflow Probability

As said in the title, I am trying to create a mixture of multivariate normal distributions using the TensorFlow Probability package. In my original project, I am feeding the weights of …

Solution 1:

When the components are the same type, MixtureSameFamily should be more performant.

With MixtureSameFamily you pass only a single Categorical instance (with .batch_shape [b1, b2, ..., bn]) and a single MultivariateNormalDiag instance (with .batch_shape [b1, b2, ..., bn, num_components]).
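A minimal sketch of that shape layout, with made-up placeholder values for the batch size, number of components, weights, locations, and scales:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# A batch of 3 mixtures, each with 2 bivariate Gaussian components.
batch_size, num_components, event_dim = 3, 2, 2

logits = tf.zeros([batch_size, num_components])                    # mixture weights
locs = tf.random.normal([batch_size, num_components, event_dim])   # component means
scales = tf.ones([batch_size, num_components, event_dim])          # diagonal std devs

mixture = tfd.MixtureSameFamily(
    mixture_distribution=tfd.Categorical(logits=logits),       # batch_shape [3]
    components_distribution=tfd.MultivariateNormalDiag(        # batch_shape [3, 2], event_shape [2]
        loc=locs, scale_diag=scales))

print(mixture.batch_shape, mixture.event_shape)   # [3] [2]
samples = mixture.sample(5)                        # shape [5, 3, 2]
log_probs = mixture.log_prob(samples)              # shape [5, 3]
```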

For only two classes, I wonder if Bernoulli would work?

Solution 2:

It seems you provided a mis-shaped input to tfp.distributions.Categorical. Its probs parameter should have shape [batch_size, cat_size], while the one you provide is [cat_size, batch_size, 1]. So try parameterizing probs with tf.concat([mix, 1 - mix], 1).

There may also be a problem with your log_std, which doesn't have the same shape as l1 and l2. In case MultivariateNormalDiag doesn't broadcast it properly, try specifying its shape as (None, 2), or tile it so that its first dimension matches that of your location parameters. A sketch of both fixes follows.
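A minimal sketch of the two fixes above. Here `mix`, `l1`, `l2`, and `log_std` are stand-ins for the question's tensors (their exact origin is assumed): `mix` holds the first component's weight per example with shape [batch_size, 1], `l1`/`l2` are component locations of shape [batch_size, 2], and `log_std` is a shared per-dimension log standard deviation.

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

batch_size = 4
mix = tf.random.uniform([batch_size, 1])
l1 = tf.random.normal([batch_size, 2])
l2 = tf.random.normal([batch_size, 2])
log_std = tf.zeros([2])

# probs must be [batch_size, cat_size]: one row of component weights per example.
cat = tfd.Categorical(probs=tf.concat([mix, 1.0 - mix], axis=1))

# Tile log_std so its leading dimension matches the locations, rather than
# relying on broadcasting inside MultivariateNormalDiag.
scale = tf.exp(tf.tile(log_std[tf.newaxis, :], [batch_size, 1]))  # [batch_size, 2]

components = [
    tfd.MultivariateNormalDiag(loc=l1, scale_diag=scale),
    tfd.MultivariateNormalDiag(loc=l2, scale_diag=scale),
]

mixture = tfd.Mixture(cat=cat, components=components)
print(mixture.log_prob(tf.zeros([batch_size, 2])))  # per-example log density
```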
