How do you sample uniformly from the unit simplex? Specifically, I would like a uniform sample from the set X = { (x_1, x_2, ..., x_D) | 0 <= x_i <= 1, x_1 + x_2 + ... + x_D = 1 }, where D is the dimension of the simplex.

The first class of approaches one might try are "rejection" sampling methods. For example, sample a point from the unit hypercube, and reject points that don't lie in the simplex. This of course gets worse and worse as the dimension increases: the expected number of rejections per valid point keeps growing. Another type of approach is what is called a "projection" method: for example, sample a point from the unit hypercube, and take the point where the ray from the origin through the sampled point meets the simplex. The problem with these approaches is that they generate non-uniform distributions, and you then have to figure out a way of correcting for the distortion. An interesting hybrid is the idea of sampling from the unit sphere, and then squaring each coordinate (the map x_i -> x_i^2) to carry the point onto the simplex. However, this mapping is also non-uniform, and generating a uniform sample efficiently from the sphere is itself non-trivial.
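To make the rejection idea concrete: you can't reject directly onto the simplex itself (it has measure zero inside the hypercube), but the standard workaround is to sample the first D-1 coordinates and let the last one take up the slack. A minimal sketch (the function name is mine); the acceptance probability of 1/(D-1)! makes the blow-up with dimension explicit:

```python
import random

def simplex_rejection(d, rng=random):
    """Uniform sample from the (d-1)-dimensional unit simplex by rejection.

    Draw d-1 coordinates uniformly from the hypercube and accept only when
    their sum is at most 1; the last coordinate takes up the slack.
    The acceptance probability is 1/(d-1)!, so this degrades factorially
    with dimension.
    """
    while True:
        x = [rng.random() for _ in range(d - 1)]
        s = sum(x)
        if s <= 1.0:
            return x + [1.0 - s]
```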

Generating a point from the D-dimensional simplex is equivalent to sampling D-1 points from the unit interval, sorting them, and then taking the lengths of the intervals between adjacent points (with 0 and 1 as the endpoints). The distribution of the sorted values (the order statistics) can be derived from the distribution from which the points are sampled. In general, if the points are drawn from a distribution with pdf f and cdf F, then the pdf of the kth smallest of n samples is given by

f_(k)(x) = n [ n-1 choose k-1 ] f(x) F(x)^(k-1) (1-F(x))^(n-k)

For the uniform distribution on [0,1], the kth order statistic follows a Beta distribution with parameters k and n-k+1. Of course, what we want are the differences between adjacent order statistics.

It turns out that there is an elegant way of doing this. Generate IID samples from an exponential distribution (which you can do by sampling U uniformly from (0,1] and returning -log(U)). Take D such samples, and then normalize them so they sum to 1. Voila: the resulting list of numbers is a uniform sample from the simplex.
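A minimal sketch of this exponential trick in Python (the function name is mine; using 1 - random() keeps the argument of the log strictly positive):

```python
import math
import random

def simplex_exponential(d, rng=random):
    """Uniform sample from the unit simplex via normalized exponentials.

    e_i = -log(U_i) for U_i uniform on (0,1], then divide by the sum.
    """
    e = [-math.log(1.0 - rng.random()) for _ in range(d)]
    total = sum(e)
    return [v / total for v in e]
```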

A similar idea works to sample from the unit sphere. Sample elements IID from a normal distribution, and normalize so the resulting vector has norm 1.
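The same normalization trick for the sphere, sketched in Python (the function name is mine):

```python
import math
import random

def sphere_point(d, rng=random):
    """Uniform sample from the unit sphere in R^d: draw d IID standard
    normals and scale the vector to unit Euclidean norm."""
    g = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(v * v for v in g))
    return [v / norm for v in g]
```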

For more on this, check out Luc Devroye's book, 'Non-Uniform Random Variate Generation'. It was originally published by Springer, and is now out of print. He has the entire book available on his website.

Update: As one of the commenters pointed out, uniform sampling from the simplex is a special case of sampling from a Dirichlet distribution (whose main claim to fame is as the conjugate prior for a multinomial distribution). To sample from the Dirichlet distribution, you can take normalized samples from an appropriately constructed gamma distribution, which for the case of sampling from the simplex reduces to sampling from the exponential distribution as above.
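For the general Dirichlet case, Python's standard library already has a gamma sampler, so a sketch (the function name is mine) is only a few lines; with all parameters equal to 1, the gamma variates reduce to exponentials and this becomes exactly the simplex sampler described above:

```python
import random

def dirichlet_sample(alphas, rng=random):
    """Sample from Dirichlet(alphas) by drawing independent Gamma(alpha_i, 1)
    variates and normalizing them to sum to 1.

    With alphas = (1, ..., 1) each gamma variate is an exponential, which
    recovers uniform sampling from the simplex.
    """
    g = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(g)
    return [v / total for v in g]
```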

FYI,

Steven Thomas Smith

See the paper "Mean and variance CFAR performance via fiber integration" for an application of this idea.

Personally, I like the QR decomposition of a Gaussian matrix to get a uniform sample of the unitary/orthogonal group.
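A sketch of that QR approach using NumPy (the function name is mine). One known subtlety: the raw Q from a QR decomposition is not exactly Haar-distributed; multiplying its columns by the signs of R's diagonal entries fixes this (Mezzadri's recipe):

```python
import numpy as np

def random_orthogonal(n, rng=None):
    """Haar-distributed random orthogonal matrix via QR of a Gaussian matrix.

    The sign correction on R's diagonal removes the sign ambiguity in the
    QR factorization; without it the distribution is slightly biased.
    """
    rng = np.random.default_rng() if rng is None else rng
    a = rng.standard_normal((n, n))
    q, r = np.linalg.qr(a)
    # Scale column i of Q by the sign of r_ii to make the result Haar.
    q = q * np.sign(np.diag(r))
    return q
```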


Also see:

Aleks

Rubin, DB (1981) The Bayesian bootstrap. Ann. Statist. 9, 130-134.

The problem is how to generate samples from the Dirichlet(1,...,1) distribution (which is exactly the uniform distribution on the simplex). The paper recommends generating n-1 random numbers from the uniform distribution on [0,1], sorting them, and taking the differences between successive values (with zero and one as the endpoints). This guarantees that they all sum to 1.
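Rubin's spacings recipe, sketched in Python (the function name is mine):

```python
import random

def simplex_spacings(n, rng=random):
    """Uniform point on the (n-1)-dimensional simplex via spacings:
    sort n-1 uniforms, prepend 0 and append 1, and take the successive
    differences. The n gaps sum to 1 by construction."""
    cuts = sorted(rng.random() for _ in range(n - 1))
    cuts = [0.0] + cuts + [1.0]
    return [b - a for a, b in zip(cuts, cuts[1:])]
```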


I should also mention that there is a C++ library on www.boost.org which provides various distributions and random number generators, including the uniform distribution on the sphere, but not the uniform distribution on the standard simplex. That would be a worthwhile addition; I'll see if I can make a suggestion.

Herve Bronnimann

The URL is http://boost.org/libs/random/index.html

And while you're there, check out the other libraries in Math/Numerics and Algorithms.

It's really a great piece of work, and very useful. Many generators are provided; the Mersenne Twister is a bit expensive but provides much better randomness than a linear congruential generator. Much of the knowledge is taken from Knuth's TAOCP Vol. 2.


I was a tad disenchanted with the Boost graph libraries: their performance seemed a bit slow, especially in comparison to some hacked-up stuff I wrote in Python. I guess the sampling libraries won't have that much overhead (the graph libraries were heavily templated), so they might work well.

Suresh

Thanks for the tip!


I think Jeff Vitter's "Faster Methods for Random Sampling" (1984) uses the exponential-RVs method, where the application is jumping through a database to get a random sample.

Ken Clarkson

Those must be some high-dimensional simplices, to want to avoid sorting.

(Sorry, Suresh, I should be reading.)


Suresh -- Any thoughts on how to implement this in Mathematica code? I have an almost identical problem to address.

Rgds,

A

Sorry, no clue. I don't know Mathematica at all.

I believe this code does it in Mathematica:

simplexPoint[k_Integer] := Normalize[-Log@RandomReal[{0, 1}, k], Total]

or, trusting Mathematica to generate Exponentials (why not?):

simplexPoint[k_Integer] := Normalize[RandomReal[ExponentialDistribution[2], k], Total]

Either way, in 10-space, for example:

simplexPoint[10]

{0.0312847, 0.0193921, 0.0125146, 0.071352, 0.00115483, 0.167582, 0.0487961, 0.208346, 0.0438064, 0.395771}

Bobby