Saturday, January 13, 2007

Computing the size of a convex hull

We know that in the comparison model, computing a planar convex hull takes Ω(n log n) time (or Ω(n log h), if you're picky) in any reasonable algebraic decision tree model. The proof (at least for n log n) is a simple reduction from sorting. There's a more involved proof by Yao showing that even generating the set of vertices of the convex hull takes at least n log n time (i.e., ordering the vertices isn't the only hard part). Yao's proof is in the quadratic test model: tests on objects are allowed to be signs of quadratic polynomials, and such tests characterize all known convex hull algorithms. [Addendum: Ben-Or generalizes these results to higher-degree algebraic tests as well.]

My question is: is it known whether estimating the size of the convex hull is also lower bounded by n log n (or n log h) in this model? It seems like this should be true: I don't see an obvious way of using a size-estimation oracle to actually compute the hull (or its vertices) in linear time, but I don't know if there's a proof of this. Yao's proof doesn't appear to yield any direct insight into this question.

It's worth pointing out, in light of recent results by Chan (and by Pătraşcu on point location), that we can solve the 2D convex hull problem in sub-(n log n) time: pick your favorite machine model, sort the x-coordinates in sub-(n log n) time, and then run the Graham scan in linear time. Chan's paper achieves sub-(n log n) bounds for the 3D convex hull and related problems.
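To make the sorted-input trick concrete, here is a sketch of that linear-time scan (Andrew's monotone-chain form of the Graham scan; the function names are mine, and degenerate inputs are glossed over):

```python
def cross(o, a, b):
    """Sign of (a - o) x (b - o): positive for a left turn at a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def hull_of_sorted(pts):
    """Convex hull of points already sorted by x (ties by y), in one
    linear pass -- Andrew's monotone-chain variant of the Graham scan."""
    if len(pts) <= 2:
        return list(pts)
    lower = []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    upper = []
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # each chain's endpoints repeat the other's, so drop one from each
    return lower[:-1] + upper[:-1]
```

Once the x-coordinates are sorted in sub-(n log n) time in whatever model you picked, each point is pushed and popped at most once, so the scan itself is O(n).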


  1. Maybe I'm missing something, but isn't this Example 5 in the Ben-Or result (which applies to order d decision trees)? He attributes the quadratic decision tree lower bound to Yao.

  2. Since you mentioned 3D hulls, there's a more recent result by Timothy and me that achieves significantly better bounds:

    Voronoi Diagrams in n · 2^O(√(lg lg n)) Time

  3. Bad geometer! No biscuit!

    In the comparison model, computing the convex hull takes at least Ω(n^213738912379) time, for the trivial reason YOU CAN'T DO IT AT ALL. Comparisons simply don't give you enough information to compute convex hulls.

    Yao proved the first Ω(n log n) lower bound for convex hulls by considering a model that is actually powerful enough to solve the problem!

    The answer to your question is no. It's fairly easy to compute a (1+ε)-approximation of the area in O(n/poly(ε)) time, using standard (i.e., Sarielesque) approximation techniques.
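    To illustrate the area (not cardinality) point: a crude version of the extent-based idea takes the extreme point in each of k evenly spaced directions and measures the hull of just those points. This is a simplification of my own, not the coreset construction itself; the real machinery is what turns k = poly(1/ε) directions into a (1+ε) guarantee.

    ```python
    import math

    def approx_hull_area(pts, k):
        """Area of the polygon spanned by the extreme points along k
        evenly spaced directions. It is inscribed in the true hull, so
        it underestimates the area, converging as k grows."""
        extremes = []
        for i in range(k):
            th = 2 * math.pi * i / k
            d = (math.cos(th), math.sin(th))
            extremes.append(max(pts, key=lambda p: p[0] * d[0] + p[1] * d[1]))
        # as the direction rotates, the extreme point walks around the
        # hull, so deduplicating in order leaves a convex polygon
        ring = []
        for p in extremes:
            if p not in ring:
                ring.append(p)
        # shoelace formula over the ring
        area = 0.0
        for a, b in zip(ring, ring[1:] + ring[:1]):
            area += a[0] * b[1] - a[1] * b[0]
        return abs(area) / 2.0
    ```

    One O(nk) pass; note this says nothing about the number of hull vertices, which is the question the post actually asks.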

  4. So much for trying to post something technical. Let's try again.

    As Jeff points out, I misspoke: it's not a "comparison model", but a quadratic decision tree model in which the lower bound holds.

    Ben-Or does appear to say that checking whether a convex hull has n vertices has a lower bound; but I am confused, because the result cites Yao's result for quadratic decision trees, and Yao's result isn't about the size, but about the specification of the actual vertex set.

    Finally, when I say "size", I mean "cardinality". I'm not talking about approximating the area, which as Jeff points out can be done by using approximate extent technology.

  5. This question seems hard. You can get a good estimate of the number of vertices in the first O(log n) levels by randomly choosing each point with probability 1/log n and then computing the convex hull of the sample, using the Clarkson technique to bound the number of vertices. You get an Omega-type bound, which might be useful for something, or maybe not. See the work of Mulmuley on ray shooting.

    Now, does this have any connection to your question?
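    A toy version of this sampling estimate, with an exact hull counter as the subroutine (both helpers below are mine; Clarkson's analysis is what would turn the sampled count into an actual bound):

    ```python
    import math
    import random

    def hull_size(points):
        """Exact number of hull corners via the monotone-chain scan
        (strict turns, so collinear points are not counted as corners)."""
        pts = sorted(set(points))
        if len(pts) <= 2:
            return len(pts)
        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
        hull = []
        for chain in (pts, pts[::-1]):
            start = len(hull)
            for p in chain:
                while len(hull) - start >= 2 and cross(hull[-2], hull[-1], p) <= 0:
                    hull.pop()
                hull.append(p)
            hull.pop()  # each chain's last point starts the other chain
        return len(hull)

    def sampled_hull_size(pts, seed=0):
        """Keep each point independently with probability 1/log n and
        count the hull corners of the sample -- the estimate sketched
        above, with no accuracy claim attached here."""
        n = len(pts)
        keep = 1.0 / max(math.log(n), 1.0)
        rng = random.Random(seed)
        sample = [q for q in pts if rng.random() < keep]
        return hull_size(sample) if sample else 0
    ```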


  6. Hello,
    one could consider a simple, extended comparison-based model where, for example, in addition to the usual comparisons between real numbers, one can perform the following type of operation: given three points p1, p2 and p3, test whether p3 lies above, below or on the line defined by p1 and p2 (if this line is not vertical, of course). Such a model suffices to compute convex hulls in O(n log n) time. It should not be difficult to show that in such a model you need Omega(n log n) time in the worst case to determine the number of corners of the convex hull by standard methods.

    If the only concern is the number of corners of the convex hull, then we can, for example, consider the special case where all input points lie on the parabola y = x^2. Then the number of corners of the convex hull is equal to n iff no two input points have the same x-coordinate. Hence, we derive an Omega(n log n) lower bound from the element-uniqueness problem.

    With regards,
    - Christos Levcopoulos
    Lund University, Sweden
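    This parabola reduction is easy to check by machine. The sketch below is my code, not from the comment: it lifts x-values to y = x^2 and compares the corner count against n, using an exact hull counter with strict turns.

    ```python
    def hull_size(points):
        """Number of hull corners via the monotone-chain scan (strict
        turns, so collinear points are not counted as corners)."""
        pts = sorted(set(points))
        if len(pts) <= 2:
            return len(pts)
        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
        hull = []
        for chain in (pts, pts[::-1]):
            start = len(hull)
            for p in chain:
                while len(hull) - start >= 2 and cross(hull[-2], hull[-1], p) <= 0:
                    hull.pop()
                hull.append(p)
            hull.pop()  # each chain's last point starts the other chain
        return len(hull)

    def elements_unique(xs):
        """The reduction: lift each x to the parabola y = x^2. Every
        distinct lifted point is a hull corner, so the corner count
        equals n exactly when no two x-values coincide."""
        return hull_size([(x, x * x) for x in xs]) == len(xs)
    ```

    So any algorithm that counts hull corners in this model decides element uniqueness, which forces the Omega(n log n) bound.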

  7. ... and perhaps I should clarify: we DON'T have to check that the points lie on the parabola y = x^2. As far as our model is concerned, we will only see that whenever we check whether a point p3 lies above or below a line (p1,p2), the result is consistent with the respective x-coordinates of the points p1, p2 and p3. So the lower bound holds for our model.

    - Christos Levcopoulos
    Lund University

  8. By the way, the element uniqueness problem has an Omega(n log n) lower bound even in the algebraic decision tree model, so the same idea would carry over to the number of corners of the convex hull.

    - Christos Levcopoulos, Lund, Sweden

  9. The Ω(n log n) lower bound for "Are all n points extreme?" that Christos describes also holds if you assume (or require) that the points are distinct, by reduction from the integer element uniqueness problem. The idea is to put the points almost on the parabola. Put the ith point at coordinates (x_i, x_i^2 + iε), for some sufficiently small rational ε. Then the convex hull has n vertices if and only if the (integer) x-coordinates are distinct.

    In the algebraic decision tree model, you can reduce from the standard (real) element uniqueness problem by taking ε to be a formal, symbolic infinitesimal.

  10. I can't put my finger on what I find distasteful about the reduction from element uniqueness. Maybe it's the way it conflates finding the set of vertices with finding their number. I'd have preferred a general bound saying that checking whether there are h vertices is hard, or even distinguishing between k and k+1 vertices, for k = ω(1) and o(log n).

  11. BTW, what is a good introductory text for the algebraic decision tree model, and for the other models of computation?

  12. I don't know if there is one. I'm still waiting for Jeff to write his monograph on lower bounds: hopefully then we'll have a nice reference ;)

  13. It would be really helpful if the bloggers gave full references; that would make it easier for people like me, who are not experts, to follow the discussion. For example, what is the reference for "Example 5 in the Ben-Or result (which applies to order d decision trees)"?

    I know, silly question. But a silly question is better than never knowing the answer :-)

  14. I've added a reference to the Ben-Or paper (the ACM site was down, so the link is from DBLP).

