Ruminations on computational geometry, algorithms, theoretical computer science and life
Wednesday, June 30, 2004
Depth thru Breadth
Chandra mentioned Avi Wigderson's invited STOC talk on whether theory is still a unified discipline. Avi's talk slides are now online (PPT alas!), and I recommend going through them, if only to get a magnificent bird's-eye view of what has been accomplished in the theory of computation, and how it all connects up.
What is STOC ?
From Googlism, answers to the question: What is STOC ?
The Factual:
stoc is acm symposium on the theory of computing
stoc is the largest acm symposium
stoc is considered one of the most prestigious conferences in computer science
stoc is traditionally held in a different location each year
The Legal:
stoc is not responsible or liable
The Predictive:
stoc is still growing
stoc is rising
stoc is bottoming
A bit confused, that one..
The Reflective:
stoc is a good source
stoc is pretty much for fun
stoc is for you
and finally, The Mystical:
stoc is god
Tuesday, June 29, 2004
Copy Protection: the next generation...
From Salon.com:
The Bush administration is offering a novel reason for denying a request seeking the Justice Department's database on foreign lobbyists: Copying the information would bring down the computer system.
"Implementing such a request risks a crash that cannot be fixed and could result in a major loss of data, which would be devastating," wrote Thomas J. McIntyre, chief in the Justice Department's office for information requests.
I don't know whether to find this amusing as a computer scientist, or to be very worried as a foreign visitor whose records all sit inside one of these systems. Maybe the open government advocates forgot to offer up some delightful dragon dainties.
Lower bounds comeback ?
Maybe I am reading too much into titles, but it seems like hardness results are back in business again. From a highly unprofessional inspection of the FOCS list:
An Optimal Randomised Cell Probe Lower Bound for Approximate Nearest Neighbour Searching
Hardness of Approximating the Shortest Vector Problem in Lattices
Ruling Out PTAS for Graph Min-Bisection, Densest Subgraph and Bipartite Clique
On the (Im)possibility of Cryptography with Imperfect Randomness
The Hardness of Metric Labeling
Hardness of buy-at-bulk network design
Optimal Inapproximability Results for MAX-CUT and Other 2-variable CSPs?
If I get a chance I will comment on some of the interesting geometry papers out there (have been pinging authors for copies).
Academics and the media
I've always wondered how reporters know whom to ask for a pithy quote on a technical topic. This article from the Christian Science Monitor (via ALDaily) illuminates the matter.
However, the following caught my eye:
For scholars, press interviews can bring fringe benefits or advance a career. Although news clips never substitute for publishing in scholarly journals, they can help the teaching reputation of a tenure-seeking professor who consistently attracts students drawn to a celebrity. Beyond that, Williams says, comes the satisfaction of knowing, "I can reach a million people, and I love that." He admits, "There's maybe an ego satisfaction involved."
You'd think the blogosphere could take care of that problem....
What is a good review ?
In a comment on my previous post, David Molnar asks:
Could you be more specific about the "many other conferences" that give better feedback than theory?
Well, I don't want to name names :). The reason is that every time I have, the usual reaction has been "but our papers are much higher quality than theirs", which I think misses the point and clouds the issue.
Let me say though what I consider a "good" set of reviews: the areas I have in mind exemplify this to varying degrees. Note that I don't necessarily think there is any such thing as a good review per se; I am more interested in a good set of reviews. This good set would contain:
* At least one review that comments on the merits of the work and how it fits into the broader picture (or not, in case the reviewer doesn't like the paper).
* Some review that tracks detailed issues (even typos, minor errors etc): not all reviews need to do this, but it is useful for the author to know that someone read it thoroughly.
* Something that comments on why the reviewer likes the paper. This is an underrated feature: it's ok to learn from criticism, but without any sense of whether the reviewer actually appreciates my work, it is hard to get a sense of perspective on the criticism.
Overall, a sense that the reviewer understands the paper and assesses it from that perspective. Personally, I can accept a scathing review when the reviewer demonstrates an understanding of my work; it is harder to accept one when I see glaring gaps in the reviewer's comprehension.
My comments about "other areas" come from submitting papers there and looking at the extensive reviews that I get back, and comparing them with reviews from theory conferences, where I sometimes count myself lucky to get even one line in total from all three reviews. They also come from the reaction of some of my coauthors (who are not used to theory reviews).
As I said earlier, a lot of this can be explained by the mechanics of the review process: it is a lot easier to review 10 papers thoroughly than 60.
Monday, June 28, 2004
SODA 2005
A friendly reminder. SODA submissions are due a week from today (Jul 5 @ 5 pm EDT). If you are working on submissions, do register them in advance. More instructions are here.
The Importance of Operations Research
The Boston Globe has a careful and well thought out article by Virginia Postrel on the importance of operations research over the years. She finesses the tiresome theory-practice debates rather well, by describing how the theoretical development of the field was boosted by the emergence of large data sources on which to apply the field's arsenal of algorithms.
I found the following line hilarious:
"In some OR journals today, the only empirical data are, `Date of submission' and `date of acceptance."'
Via Instapundit
On gaming program committees
Lance Fortnow has an interesting take on conference acceptance procedures today. He argues:
Two easy ways to improve your paper but lessen your chances of acceptance at a conference: Add more results and simplify your proofs. Adding a result could only increase the usefulness of a paper but program committees see many results in a paper and conclude that none of them could be very strong. One of our students a few years ago had a paper rejected at STOC, he removed one of his two main theorems and won the best student paper award at FOCS.
Given the same theorem, the community benefits from a simple proof over a complicated proof. Program committees look for hard results so if they see a very simple proof, it can count against you.
I mostly agree with his take on how PCs view papers: simple proofs can indeed be looked down upon. The interesting question, though, is this: since PC members are the same people who, when not on PCs, suffer from this very problem as authors, what is it about PC membership that causes their judgement to skew ?
The usual argument is load: PC members in algorithms conferences typically review far more submissions than PC members in any other conference mainly because of two inter-related reasons:
1. Our PCs are small
2. PC members are not permitted to submit papers to the conference
(note that (2) more or less forces (1)).
Or could one argue that it is the right and appropriate thing for PC members to prune papers in this fashion ? And that it is the authors' responsibility to make the best case for their submission in a system which will always be imperfect ? One might think that this reasoning would encourage, rather than discourage, simple proofs, because these are easier to understand and lead to a better exposition in a conference setting.
It seems to me that one reason an elegant proof might be looked down upon in comparison to a more technical, grungy proof is that if the reviewer is not intimately familiar with the area of the paper, they might not appreciate the value of the elegant result, or be aware of how hard it was to achieve such an understanding of a problem etc. This doesn't sound like a problem that can be fixed easily, unless every paper can be reviewed by an expert in that specific area, which seems difficult to manage.
I would like to venture the slightly controversial claim that theory (STOC/FOCS/SODA/etc) committees are not as rigorous in providing feedback and comprehensive reviews as many other conferences. There are many good reasons why this is the case, and I don't think one can fault reviewers who do the best they can under severe load, but the fact remains, and it would be nice to see more discussion of this in business meetings or even in informal forums in the community.
Although this is somewhat removed from the original point about reviews themselves, I feel that feedback itself is a method for ensuring accountability and openness. A reviewer who has to write a detailed explanation of what they like/don't like in a paper will automatically do a more thorough job of reviewing it. Again, this is not a matter of harassing reviewers: structural changes will have to be made in how theory committees are set up to make this practical.
Friday, June 25, 2004
On Apollonian supergaskets, or how I always wanted to use 'syzygy' in a sentence.
Friday at the AT&T Math Seminar, Allan Wilks gave a beautiful talk on 'The Apollonian Supergasket', a structure that contains all strongly integral Apollonian packings via a set of simple operations. This posting is drawn heavily from his talk: all mistakes are mine :)
Apollonian Gaskets:
Apollonian circle packings come from the Apollonius Problem:
Given three objects, draw a circle tangent to all three.
When the objects are circles themselves, and are already mutually tangent, there are exactly two solutions to this problem (an inner and an outer solution), and these are called Soddy circles [1].
It is very easy to compute the two Soddy circles. If we denote by bᵢ = 1/rᵢ the bend of a circle of radius rᵢ, then
2(b₁² + b₂² + b₃² + b₄²) = (b₁ + b₂ + b₃ + b₄)²
This is known as the Descartes Circle Theorem, and the four bends together are called a Descartes quadruple. What is intriguing is that if we wish to determine the circle centers, we can do so as well. If the centers are represented as complex numbers zᵢ, then Colin Mallows has shown that the products bᵢzᵢ also form a Descartes quadruple. An important note is that bends can be negative: if three mutually tangent circles are touched by a circle that includes all of them, its bend has a negative sign (you can think of this on the sphere to see why: the "sense" of the circle is inverted).
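To make the theorem concrete, here is a small Python sketch (my own, not from the talk) that uses the Descartes Circle Theorem to compute the two possible bends of a fourth tangent circle, and Mallows' complex version to recover the corresponding bend-center products; the function names are mine.

    import cmath

    def fourth_bends(b1, b2, b3):
        # Descartes Circle Theorem, solved for the fourth bend:
        # b4 = b1 + b2 + b3 +/- 2*sqrt(b1*b2 + b2*b3 + b3*b1).
        # For genuine mutually tangent triples the quantity under the root is non-negative.
        s = b1 + b2 + b3
        root = 2 * (b1 * b2 + b2 * b3 + b3 * b1) ** 0.5
        return s + root, s - root

    def fourth_centers(bz1, bz2, bz3):
        # Mallows' complex Descartes theorem: the products b_i * z_i (z_i = center as a
        # complex number) satisfy the same quadratic relation, so the two candidates for
        # b4 * z4 are given by the same formula, now over the complex numbers.
        s = bz1 + bz2 + bz3
        root = 2 * cmath.sqrt(bz1 * bz2 + bz2 * bz3 + bz3 * bz1)
        return s + root, s - root

    # Three mutually tangent unit circles (bend 1 each): the inner Soddy circle has
    # bend 3 + 2*sqrt(3) ~ 6.46, the enclosing one bend 3 - 2*sqrt(3) ~ -0.46.
    print(fourth_bends(1.0, 1.0, 1.0))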
Now take four mutually tangent circles, and pick one. For the other three, find the "second" mutually tangent circle. Add that in and repeat. This yields a packing of the plane, called the Apollonian Gasket. Depending on the starting set of four circles, one can obtain many different kinds of gaskets.
The Supergasket:
Now comes the stunning part [2]. A set of two simple inversion operations allows us to move from one Descartes configuration to another. One operation we have already seen: if we consider the two solutions to the 3-circle Apollonian problem, they invert to each other with respect to a fixed circle. This gives us one new circle for a fixed set of three.
The other inversion operator works as follows. Take one of the Soddy circles, and invert its three touching circles into it. This yields another 4-circle configuration inside the Soddy circle.
There are a total of eight such operations (4 for each kind of inversion). Together, these operations generate a special group called a Coxeter group with appropriate syzygies. Among the properties of these operations are:
- Each pair of circles thus generated are tangent or disjoint
- If the starting configuration is (strongly) integral, so are all resulting configurations.
- The resulting set of configurations contains all strongly integral Apollonian packings
That last statement is the one I find so amazing. Essentially there is a single universal packing that contains all Apollonian packings (in the integral case). This is a truly mysterious structure.
Also, there are beguiling symmetries if we look at outer circle configurations (configurations where there is one outer circle and three inner circles). The mod 2 bend groups are symmetric with different axes of symmetry, and all outer circles having the same bends form symmetric patterns in the supergasket.
Notes:
1. Apollonian circle problems were discussed in Sangaku (Japanese Temple Geometry) that I had talked about earlier. More on Sangaku here (via GJ).
2. When Allan was giving this talk, he said 'beautiful' so many times he had to restrain himself; now as I try to describe this in words, I understand his problem :)
References:
This is based on work by Jeff Lagarias, Ron Graham, Colin Mallows, Allan Wilks and Catherine Yan.
Papers: (at www.arxiv.org)
- Beyond the Descartes Circle Theorem
- Apollonian Circle Packings: Geometry and Group Theory:
I: The Apollonian Group
II: Super-Apollonian Group and Integral Packings
III: Higher Dimensions
- Apollonian Circle Packings: Number Theory
Einstein, Poincaré, and Picasso
An interesting review of cubism in the Guardian talks of the influence Henri Poincaré had on Picasso.
Cubism was never a style in that sense. It was an inquiry. Picasso and Braque were lucky enough to be young - Picasso was 28 in 1909, Braque 27 - at a time of intellectual revolution. Habits of perception and assumptions about the nature of things that had been stable since the 17th century were falling away. Arthur I Miller's 2001 book Einstein, Picasso: Space, Time and the Beauty that Causes Havoc demonstrates how uncannily Picasso's discovery of cubism parallelled Einstein's contemporary theories of special and general relativity. In science, mathematics and philosophy, the laws of a clockwork universe established by Sir Isaac Newton in the Baroque age were giving way before the first world war to extraordinary notions - that time and space are one, that light waves curve, that no two observers ever see exactly the same thing.
Picasso and Einstein, Miller has shown, were both influenced by the French thinker Henri Poincaré, who published his book La Science et l'Hypothèse in 1902.
...
Picasso learned about his ideas through the mathematician Maurice Princet, who was a regular at Montmartre cafe tables. Picasso's friend André Salmon wrote that Princet "preoccupies himself especially with painters who disdain ancient perspective. He praises them for no longer trusting the illusory optics of not long ago... "
Thursday, June 24, 2004
Knot or not ?
A recent posting on comp.theory prompted this note:
given the 3 dimensional layout of a rope, I would like to know if
pulling the strings at both ends would result is knot or not.
MathWorld had this to say:
There is no general algorithm to determine if a tangled curve is a knot or if two given knots are interlocked. Haken (1961) and Hemion (1979) have given algorithms for rigorously determining if two knots are equivalent, but they are too complex to apply even in simple cases (Hoste et al. 1998).
Note that the question MathWorld asks appears to be slightly more general (asking if a given curve is a knot or not). The question by the comp.theory poster (I imagine) only asks about a motion pulling in two opposite directions (although this is probably not the most precise way to put it).
I imagine that the answer to the question posted above is well known: I'd be interested in a pointer if anyone has one.
There is a plethora of links on the web concerning knot theory. The Geometry Junkyard has a knot theory page, and a far more comprehensive set of links can be found at Knots On The Web.
Lindstrom's Lemma
I recently came across an interesting combinatorial lemma on graphs. Let G be a planar DAG, and designate the 2n boundary nodes as sources and sinks (consecutively, so walking around the boundary you see first the n sources and then the n sinks). Construct the n × n matrix M where m_ij = the number of distinct paths from source i to sink j. Then
Lindstrom's Lemma:
Each minor D_{I,J} of M is the number of families of non-intersecting paths from the sources indexed by I to the sinks indexed by J.
Thus, every minor of M is non-negative. A matrix having this property is said to be totally non-negative.
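Here is a small Python sketch (a toy example of my own, not from the source below) that builds the path-count matrix M for a tiny planar DAG and checks the 2 × 2 case of the lemma; the graph and function names are mine.

    # Toy planar DAG: sources s1, s2 and sinks t1, t2 on the boundary, one interior vertex m.
    edges = {
        "s1": ["t1", "m"],
        "s2": ["m", "t2"],
        "m":  ["t1", "t2"],
        "t1": [],
        "t2": [],
    }

    def count_paths(u, v, memo=None):
        # Number of distinct directed paths from u to v (the graph is a DAG, so this terminates).
        if memo is None:
            memo = {}
        if u == v:
            return 1
        if (u, v) not in memo:
            memo[(u, v)] = sum(count_paths(w, v, memo) for w in edges[u])
        return memo[(u, v)]

    sources, sinks = ["s1", "s2"], ["t1", "t2"]
    M = [[count_paths(s, t) for t in sinks] for s in sources]
    print(M)  # [[2, 1], [1, 2]]

    # The only non-trivial minor here is det(M) = 3; by the lemma it counts the
    # vertex-disjoint pairs of paths s1 -> t1, s2 -> t2 (and indeed there are 3 such pairs).
    print(M[0][0] * M[1][1] - M[0][1] * M[1][0])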
Source: Combinatorics and Graph Theory, by Mark Skandera
Wednesday, June 23, 2004
Mathematicians are a machine....
More visa woes...
I think Sariel needs to send the USCIS some more virgin red dragon blood.
From the AP Wire:
Foreigners with worker visas must reapply for them overseas when they expire, the State Department said Wednesday.
Department spokesman Richard Boucher said that in the past, foreigners with worker visas were able to reapply for visas in the United States.
The reason for the switch, he said, is that U.S. embassies abroad are better equipped than government offices in the United States to interview and fingerprint the growing number of visa applicants.
"These people can stay as long as they want. They can leave when they want," Boucher said. "But when they come back, instead of getting a visa here in advance, they will have to get one overseas at one of our embassies and consulates and then come back."
Tuesday, June 22, 2004
Foreign Students and Visa Issues
For a long time, Lance Fortnow has been warning us about the problems facing foreign students and scholars (of which your humble blogger is one) trying to visit the US, stay here, and travel to scholarly meetings. Nitish Korula, an Indian student who recently got admission to the Ph.D. program at UIUC, has been chronicling his suffering as he navigates the student visa process.
Although this may not be news to those of us in research/academics, it's nice to see discussion of this problem spreading beyond the confines of academe. Today at Talking Points Memo, one of the most widely read blogs on the Web, John Judis discusses this issue.
Lance mentioned the cold war and how countries tried to keep their scientists from coming here: this quote is from Judis's post:
During the Cold War, American officials discovered that one of the best ways to promote democratic capitalism at the expense of communism was by luring foreign students to American colleges. Some of these foreign graduates returned home to become the leaders of reform movements in their countries. Others stayed in the United States and contributed their skills to the great postwar boom. The same reasoning that prevailed during the Cold War should prevail during the war on terror. The United States should be eager, one would imagine, to expose students from abroad to democracy and religious pluralism, as well as to take advantage of their skills.
More on Origami
I had posted a note on computational origami a couple of months ago. The NYT has an article today on David Huffman (of Huffman coding) and his fascination with computational origami. The Geometry Junkyard has a page on origami.
Monday, June 21, 2004
STOC 2004: A Report (Part III)
This is the third post in Chandra Chekuri's STOC 2004 report: Part I and Part II were posted earlier.
A Probabilistic Lemma due to Uri Feige
One of the cute results in STOC is the following probabilistic bound due to Feige.
Let X1, X2, ..., Xn be independent positive random variables, each with mean 1. Note that we make no other assumptions about their distributions: their variances can be arbitrary, and in particular they need not be iid. Let X = X1 + X2 + ... + Xn. By linearity of expectation, µ, the mean of X, is n.
A typical question in bounding the deviation of sums of random variables is the following.
What is the probability that X deviates significantly from its mean?
We have a variety of bounds including the Markov inequality, the Chebyshev inequality and of course the Chernoff-Hoeffding bounds. Of these the Markov inequality is the simplest and pretty much makes no assumptions other than that the rv is non-negative:
Pr[X > t µ] <= 1/t for a non-negative rv X.
Phrasing it in a different form, Pr[X <= t µ] >= 1-1/t.
The Chebyshev inequality is useful when we have a bound on the variance of X, but in our case we don't. Chernoff bounds are the most useful for sums of bounded independent random variables. Feige's bound is quite different. He showed that Pr[X < n+1] > 1/13. In other words, there is a constant probability that X exceeds its expected value by less than an additive 1. Note that the Markov inequality would only give a bound of 1/(n+1), which is very far from the truth in this case.
Feige uses the above bound for an application to estimate the density of a graph but it is not the most compelling one (as he himself states). He anticipates more interesting applications in the future.
An Open Problem:
The bound Feige shows is 1/13, but the worst case example he knows has a bound of 1/e, where e is the base of the natural logarithm. It is the obvious one: each Xi takes the value n+1 with probability 1/(n+1) and 0 otherwise. Therefore the probability that X is less than n+1 tends to 1/e. Feige conjectures that this is the right bound. The problem is all yours.
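A quick numeric check of that extremal example (my own sketch, in Python): X < n+1 exactly when every Xi is 0, so the probability is (n/(n+1))^n, which indeed approaches 1/e.

    import math

    # Extremal example for Feige's bound: each X_i is n+1 with probability 1/(n+1)
    # and 0 otherwise, so E[X_i] = 1. Then X < n+1 iff all the X_i are 0.
    for n in (10, 100, 1000, 10000):
        p = (n / (n + 1)) ** n
        print(n, round(p, 5))
    print("1/e =", round(1 / math.e, 5))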
Sunday, June 20, 2004
Calling India
WARNING: Non-geometry info ahead...
Reliance India is now offering 12c/min calling card service to India. This is significantly better than what current calling card operators are giving, and is way better than what the big telcos like AT&T et al are offering (much as it hurts me to say so :)
I just signed up, so I don't know what the call quality is like yet, but given that it is a Reliance endeavour, it should be good.
Saturday, June 19, 2004
More on loops and strings
Readers of this blog will recognize my unhealthy fascination with loop quantum gravity and string theory. In my browsings of arxiv.org, I discovered a section of the physics area on popular writings in physics (what a concept: why can't we do this in theory CS too?). There's a nice article there by Rüdiger Vaas called 'The Duel: Strings Meet Loops' that explains the conflicts between the two approaches to unifying quantum physics and general relativity.
A related article is a Socratic dialogue between a fictitious professor and student on the pros and cons of string theory and loops. Bear in mind that this was written by Carlo Rovelli, one of the fathers of loop quantum gravity, so it may be somewhat biased :)
And for those of you who say 'Bah ! Humbug ! Why should I care about such nonsense', I say 'Don't you want to understand John Baez's talk at SODA ?'
Automaton Simulator
Someone emailed me, asking if I knew of any packages that simulated various kinds of automata for visualizing their behaviour. I did some sniffing around, and found this package.
It simulates DFAs, NFAs, DPDAs, and Turing machines. From the interface, it doesn't appear to be that useful for large scale automaton construction, but is easy to play with interactively. Runs in Java.
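For readers who just want to see what such a simulator does under the hood, here is a minimal DFA simulator of my own in Python (it has nothing to do with the package above); the example machine accepts binary strings with an even number of 1s.

    # A hypothetical toy DFA: states "even"/"odd" track the parity of the number of 1s seen.
    dfa = {
        "start": "even",
        "accept": {"even"},
        "delta": {
            ("even", "0"): "even", ("even", "1"): "odd",
            ("odd", "0"): "odd",   ("odd", "1"): "even",
        },
    }

    def run_dfa(dfa, word):
        # Feed the word symbol by symbol through the transition function.
        state = dfa["start"]
        for symbol in word:
            state = dfa["delta"][(state, symbol)]
        return state in dfa["accept"]

    print(run_dfa(dfa, "1011"))  # False: three 1s
    print(run_dfa(dfa, "1001"))  # True: two 1s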
Blog Page Update
I now have a (long overdue) collection of links to what I call "theory blogs", which I loosely define as pages with RSS feeds that are primarily theory-CS related. It's on the right side of the page, below my user profile. If there are other links I have missed, feel free to inform me (I won't guarantee I'll add them in though :) )
Friday, June 18, 2004
STOC 2004: A Report (Part II)
Chandra Chekuri continues his report on STOC 2004.
Today's topic is Jonathan Kelner's paper, which won a best student paper award at STOC a few days ago.
Many people know about the planar separator theorem: every planar graph has a vertex separator (a set of vertices whose deletion separates the graph into two pieces of roughly equal size) of size O(√n). For a bounded degree planar graph the same theorem also shows that there is a cut of sparsity O(1/√n). These results extend to graphs that can be embedded on a surface of genus g - vertex separators of size O(√(gn)) and cuts of sparsity O(√(g/n)) exist. Gilbert, Hutchinson, and Tarjan showed how to find the separator (cut) if the embedding of the graph in a genus g surface is given as part of the input. The catch is that given a graph G, the problem of finding the genus of the graph is NP-hard. It has been an open problem if one could find the separator and cut without finding an explicit embedding.
Another related question is the following. A popular graph partitioning technique used in practice is that of spectral partitioning. Spielman and Teng showed that this algorithm is also good in theory for planar graphs by demonstrating that the algorithm always yields a cut of ratio O(1/√n). They conjectured that the spectral method also works for higher genus graphs and thus pointed a way to overcome the problem of not knowing the genus explicitly.
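For readers unfamiliar with the method, here is a minimal sketch (mine, not from the paper) of the spectral partitioning heuristic referred to above: compute the eigenvector for the second-smallest eigenvalue of the graph Laplacian (the Fiedler vector) and split the vertices at its median value.

    import numpy as np

    def spectral_cut(adjacency):
        # Spectral partitioning heuristic: take the eigenvector for the second-smallest
        # eigenvalue of the Laplacian L = D - A and split the vertices at its median value.
        A = np.asarray(adjacency, dtype=float)
        L = np.diag(A.sum(axis=1)) - A
        _, vecs = np.linalg.eigh(L)        # eigenpairs in ascending order of eigenvalue
        fiedler = vecs[:, 1]
        left = fiedler <= np.median(fiedler)
        return np.where(left)[0], np.where(~left)[0]

    # Example: two triangles {0,1,2} and {3,4,5} joined by the single edge 2-3.
    # The heuristic recovers the sparse cut separating the two triangles.
    A = np.zeros((6, 6))
    for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
        A[u, v] = A[v, u] = 1.0
    print(spectral_cut(A))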
Kelner's paper resolves the above two problems by proving that the spectral method yields a cut of sparsity O(√(g/n)). The high level scheme is from Spielman and Teng. Kelner generalizes the scheme using some nice results from algebraic geometry. This result is beautiful because of the connection between combinatorial objects (graphs) and continuous mathematics (ed: Aha !! ).
Here is a brief description of the method. Spielman and Teng use the Koebe-Andreev-Thurston theorem which shows that planar graphs can be obtained as a contact graph of disjoint circles on the plane (or equivalently on the surface of a sphere in 3 dimensions) where there is an edge between two vertices u and v iff the circles corresponding to u and v touch tangentially.
Using this circle packing theorem, Spielman and Teng show via a simple but clever argument that the second eigenvalue of the Laplacian of the graph is O(1/n).
The result for genus g graphs would follow from the same approach if it can be shown that the graph can be obtained by the adjacency graph of circles touching on the sphere with the proviso that any point on the sphere is contained in at most O(g) circles - in other words we allow overlapping circles now. Unfortunately this cannot be done. To overcome this Kelner uses a theorem of He and Schramm that shows that genus g graphs can be obtained by touching circles where the circles are disjoint and embedded on a Riemannian surface of genus g. The technical part of the proof involves mapping the Riemannian surface back onto the sphere via conformal mappings (mappings that preserve angles). Difficulties arise because the original mapping to the Riemannian surface has branch points which are points of singularity and the mapping from the surface to the sphere does not maintain conformality at these branch points. The number of branch points is however limited to O(g) and by using a fine subdivision of the graph it is shown that these singularities do not play a major role.
The above description is at a very high level. Kelner's paper is short - it uses several important known results from mathematics in a direct and effective way. It is impressive that he knows enough math to find these results and put them together - most people in the audience at STOC, myself included, have a limited background in the continuous mathematics needed for this proof. I wish that I had taken more classes in pure math ... (ed: Amen..)
References:
J. R. Gilbert, J. P. Hutchinson, and R. E. Tarjan. A separator theorem for graphs of bounded genus. J. Algorithms, 5:391-407, 1984.
Thursday, June 17, 2004
Blogging etiquette
Most blogs, even today, are essentially online diaries; in fact the proliferation of blogs is directly due to this, since anyone can write a diary, even if not anyone can create web pages. However, more and more, one finds specialized "topic" blogs: political blogs being the vast majority, but also for topics like music, photography, the media, and even academic disciplines like economics, philosophy, and of course computer science.
As I write entries, I am conscious of the fact that my posts are a form of public announcement, and as such should be governed by some of the rules of normal academic public discourse. When I chat with people at conferences about problems and results, I have to be careful in what I say about these conversations, in the same way that I would be careful in one-on-one discussions. However, it seems to me that if I hear about a new result, and the result is not confidential, telling a colleague at lunch is still different to posting it on a blog, mainly because of the illusion that I can control the rate of dissemination of information. Moreover, as with normal conversations, I wouldn't want people to think that everything they told me could end up as a blog entry at some point.
A discussion on these lines has been going on in some of the philosophy blogs. I think this kind of discussion will probably happen more and more as more disciplines turn to blogs as a means for information dissemination and general discussion.
Geometry Junkyard RSS feed.
Start your aggregators ! David Eppstein's Geometry Junkyard page (which I have mentioned in the past) now has its own RSS feed.
Wednesday, June 16, 2004
From the "NP != Non-Polynomial" department...
Was browsing some science news, and came upon this tidbit (emphasis added) in a PR release for a SIGGRAPH paper on mesh simplification:
Computer scientists have struggled with the problem of finding an optimal mix of large and small elements for years. In 1998, theoreticians proved that the problem was ''NP hard'' -- that no general solution exists that can be solved by a computer in finite length of time.
Sigh...
Later on, the article goes on to say:
''This is not a hack,'' says another expert in the field [.. snip ..], using the term for a makeshift, unsystematic improvisation. ''It has a strong formal basis. You can make up extreme cases that will trick it, but for ordinary shapes, it works remarkably well.''
That's funny. Strong formal basis, but has extreme cases that can trick it ? If it sounds like one and smells like one....
STOC 2004: A Report
I am happy to introduce my very first guest blogger, Chandra Chekuri from Bell Labs. Chandra was recently at the ACM Symposium on the Theory of Computing (STOC) in Chicago, and has this report from the conference:
This year's STOC was in Chicago as most of the readers of this blog already know. It was the first time in Chicago for me. The hotel was downtown, right across from Grant Park near Lake Michigan. I went early on Saturday and had a chance to walk around, check out the Blues Festival in the park, and walk the Magnificent Mile. I didn't have much time to do anything else touristy but had a very positive impression of the city. Maybe I should also visit during the winter to have a balanced view!
I felt that the papers in this STOC were particularly good. In addition to the usual breadth of topics, several papers in each topic were quite interesting. The two best paper award winners and the two best student papers were very well received.
Best Papers:
* Multilinear Formulas for Permanent and Determinant are of Super-Polynomial Size, by Ran Raz.
* Expander Flows, Geometric Embeddings and Geometric Partitioning, by Sanjeev Arora, Satish Rao, and Umesh Vazirani.
Best student papers:
* Spectral Partitioning, Eigenvalue Bounds, and Circle Packings for Graphs of Bounded Genus, by Jonathan Kelner
* Lower bounds for local search by Quantum Arguments, by Scott Aaronson.
Interestingly, there was one algorithms paper and one lower bounds paper in each category. Algorithmic game theory, quantum algorithms/complexity, and sub-linear time algorithms (including property testing) are still relatively new topics alongside the more established areas.
The invited talks reflected the invasion by the new areas. The most interesting of the invited talks was by Avi Wigderson, who was asked by Laszlo Babai to comment on whether the theory of computing is still a unified discipline given the many diverse sub-areas that are represented at STOC/FOCS. Avi gave a very nice talk on this. He showed how communication complexity is central to many of the topics. He also connected interactive proofs to many areas and finally concluded with what he thought was the glue between the areas. He promised to put up his slides on his web page soon and you can see them for yourself (ed: will add a link when it is available). One of the glues that he mentioned is people like Les Valiant, who have contributed deeply to many different areas.
Most of the business meeting was taken up by the discussion on moving the STOC special issue from JCSS to other journals. You can read more about this on Lance Fortnow's and Jeff Erickson's blogs. I am all for moving away from for-profit journals (especially Elsevier and Kluwer) to either society journals (ACM, SIAM, IEEE, Math Programming, INFORMS etc) or free electronic journals. Laszlo Babai very recently started a new free electronic journal, ToC (Theory of Computing). With several very highly respected journals such as JACM and SICOMP available as society journals, will authors send their good papers to ToC? Or will this journal turn out to be like the Chicago Journal of TCS, which was started by Babai many years ago and failed to catch on? I think times have changed and there is a higher probability of ToC being successful. One thing that did bother me was that Babai chose to have most of the STOC'04 committee (of which he was the chair) as editors. The STOC'04 committee is filled with many fantastic young researchers; however, some of the comments that Babai made and the way he went about choosing the editors make one feel that ToC is his own show. In my opinion a respected journal should have a certain reputation that is independent of the editor-in-chief, even when the editor-in-chief is well known and respected throughout the community. I wish the best for ToC and I hope the editors lead the way in making it a success.
STOC 2005 is in Baltimore, and there was a bid from U. Washington faculty Paul Beame and Anna Karlin (delivered by Venkat Guruswami) to host STOC 2006 in Seattle. Vladlen Koltun compared the registration fees and hotel rates at STOC with those of SoCG and wondered why we don't make more of an effort to make the whole thing cheaper. This issue keeps coming up, but people ignore it; my take is that the people who care about it the most are not present at the business meeting! Another potential reason is that we have several conferences (at least three major ones: STOC/FOCS/SODA) each year, and it is difficult to find volunteers to make all of this work out cheaply. That brings up the next conference, FOCS 2004, to be held in Rome, Italy. The accepted papers list will be out soon, June 25th according to the call for papers.
Another post will detail some of the papers and ideas that I liked the most at STOC.
Tuesday, June 15, 2004
Successful ideas...
In science, successful ideas are those that are pervasive and invasive, are invitingly elegant and methodical, are open to extensions and variants, capture an objective necessity, and answer a widespread but diffuse sense of dissatisfaction in the scientific community.
Christos Papadimitriou, in NP-Completeness: A Retrospective (via the P-NP page)
geeks and fonts...
A couple of years ago, when Agrawal, Kayal and Saxena announced that PRIMES is in P, an entry was posted on Slashdot, and much of the early discussion centered not on the proof itself, but on how to convert PS to PDF, and the relative merits thereof (this was prompted by the fact that the authors had linked to a PS file online).
A few days ago, Lance Fortnow posted a note about one of his favorite theorems in complexity, a result by Trevisan connecting extractors and pseudo-random number generators. He got 9 comments; a fair number. All of them were devoted to the technical details of PS to PDF conversions !!
I am as happy as the next person to jump into a lecture on using type1 fonts with latex, but....
Don Quixote enterprise ?
Via /., news of the first quantum computer simulation tool. However, what I find most interesting is this excerpt (emphasis mine):
Classical simulation of quantum processes seams like a Don Quixote enterprise - this is well known. But imagine, you may test all the quantum algorithms you have in mind now - not tomorrow when everybody is able to press them in hardware. Let's discover the quantum ways by simulation now - to reach the world of Quantum Computing today.
Something must have been lost in translation...
Losing a Ph.D for discredited papers...
From today's NYT:
A German university has revoked the doctoral degree of the former Bell Labs scientist who claimed a series of research breakthroughs, then was fired two years ago when it was discovered that he had manipulated data and fabricated results.
The physicist, J. Hendrik Schön, 33, did not commit misconduct in his doctoral research at the University of Konstanz, an investigation there found last year. But on Friday, the university said it had a legal right to rescind a degree when the recipient behaved "unworthily" of it.
I have mixed feelings about this: on the one hand it appears that he did fabricate data at Bell Labs. However, to rescind his Ph.D seems arbitrary, even if the university "had a legal right" to rescind it.
Monday, June 14, 2004
RSS for papers
The time has come for RSS feeds on subjects at a much finer granularity than what arxiv.org provides. Suppose I want to track papers on art gallery problems. Today it is not hard to imagine a spider that crawls well-known sites like the SoCG and SODA conference pages, finds paper titles, and determines which papers might be relevant to an art-gallery-problem search. One could use Feedster right now for such a search, but searching over the entire web will generate a lot of junk (or nothing at all).
Seems like it would not be hard given current web technology....
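To make the filtering step concrete, here is a minimal sketch, assuming the feedparser library and a purely hypothetical feed URL; a real spider would crawl the conference pages themselves and do something smarter than keyword matching:

# Minimal sketch: filter a paper feed for titles/abstracts matching a topic.
# Assumes the feedparser library; the feed URL below is hypothetical.
import feedparser

KEYWORDS = ("art gallery", "guarding", "visibility")

def matching_titles(feed_url, keywords=KEYWORDS):
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(k in text for k in keywords):
            yield entry.get("title", "(untitled)")

for title in matching_titles("http://example.org/socg-accepted-papers.rss"):
    print(title)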
Thursday, June 10, 2004
SoCG: The Business Meeting
Read all about it here
SoCG: The Cruise
Last night our conference banquet was on a cruise boat that went up and down the Hudson and East River. It's a good way to see the less serious side of CG researchers. Bernard Chazelle is an accomplished guitarist, and during the cruise we had a number of solo pianists (jazz/classical) performing for our benefit.
Wednesday, June 09, 2004
Tverberg Points
Inspired by Algorithms for Center and Tverberg Points by Pankaj K. Agarwal, Micha Sharir, and Emo Welzl:
Usually, checking if a point satisfies property X is easier than finding a point that satisfies property X. Tverberg points are an example of a rather bizarre reversal of this situation.
Suppose I take a collection of points in the plane, and partition it into subsets cleverly so that the convex hulls of the subsets all intersect in at least one point. Such a point is called a Tverberg Point. It was shown by Tverberg that such a point always exists. However, the following problem has unknown complexity in three dimensions:
Given a point p and a set of points P, is p a Tverberg point of P?
The Tverberg point is one among a family of nonparametric notions of "deep" points in a point set. Other notions are the centerpoint (a point that "sees" the maximum number of points no matter which direction it looks in), and the Radon point (a Tverberg point where you partition P into two pieces).
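To make the easy direction concrete, here is a minimal sketch (assuming numpy/scipy, and not taken from the Agarwal-Sharir-Welzl paper; the names are mine) that verifies a given partition as a certificate that p is a Tverberg point, via an LP feasibility test per part. The hard part, of course, is deciding whether any such partition exists.

# Sketch: verify a Tverberg-point *certificate*. Given p and a partition of P,
# check that p lies in the convex hull of every part (LP feasibility per part).
import numpy as np
from scipy.optimize import linprog

def in_convex_hull(p, pts):
    """Is p a convex combination of the rows of pts?"""
    pts = np.asarray(pts, dtype=float)
    n = pts.shape[0]
    # Find lambda >= 0 with lambda^T * pts = p and sum(lambda) = 1.
    A_eq = np.vstack([pts.T, np.ones(n)])
    b_eq = np.append(np.asarray(p, dtype=float), 1.0)
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    return res.success

def verifies_tverberg_partition(p, parts):
    return all(in_convex_hull(p, part) for part in parts)

# Two crossing segments in the plane: both hulls contain the point (1, 1).
print(verifies_tverberg_partition((1, 1), [[(0, 0), (2, 2)], [(0, 2), (2, 0)]]))  # True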
Problem 2: Coins...
Heard around the dinner table....
I am given n piles of coins, with the knowledge that the total number of coins is 1 + 2 + 3 + ... + n. I repeatedly take one coin from each pile and make a new pile out of them. If the ith pile has i coins, this operation leaves the configuration unchanged. Prove that no matter what configuration of piles I start with, I always end up at this stable configuration.
Answer in a week, or thereabouts...
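In the meantime, here is a quick simulation (not a proof, and the helper names are mine) to check the claim on small cases:

# Simulate the move "take one coin from every pile and form a new pile"
# and check that we eventually reach the staircase configuration 1, 2, ..., n.
def step(piles):
    new_pile = len(piles)                      # one coin taken from each pile
    piles = [p - 1 for p in piles if p > 1]    # piles of size 1 disappear
    piles.append(new_pile)
    return sorted(piles)

def reaches_staircase(piles, n, max_steps=10000):
    target = list(range(1, n + 1))
    piles = sorted(piles)
    for _ in range(max_steps):
        if piles == target:
            return True
        piles = step(piles)
    return False

# n = 4: ten coins split arbitrarily into piles of 7, 2 and 1.
print(reaches_staircase([7, 2, 1], 4))   # True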
SoCG...Live !
Am at SoCG right now: will attempt to report on the activities as they happen...
Traditional dinner table conversation at conference dinners involves either gossip or the tossing around of problems. Sometimes one even solves them ! I had mentioned the turnpike problem a while ago, and its variant, the "sum-turnpike" problem. Thanks to Vladlen Koltun and Shankar Krishnan, there is now an efficient poly-time algorithm for the sum-turnpike problem.
Monday, June 07, 2004
Alan Turing: 50th Death Anniversary
Alan Turing died 50 years ago today. We often use the word genius lightly: for him, it is fitting...
Cryptonomicon is set partly in the Enigma period of WW II, and has many very accurate discussions of Turing machines and the halting problem. Turing is a character in the novel as well.
SODA 2005 invited speakers
Some of you may recall my musings on quantum gravity and the discrete nature of spacetime. I am delighted to note that at the upcoming Symposium on Discrete Algorithms, we will have an invited talk on loop quantum gravity by John Baez of UC Riverside.
Prof. Baez writes an occasional column called This Week's Finds in Mathematical Physics.
Micha Sharir and Uri Feige will also be speaking at SODA; two better speakers one will be hard-pressed to find...
FSTTCS Announcement
FSTTCS is the premier Indian conference on theory/algorithms/fundamental aspects of computer science. It's always in winter, so the weather is nice :), and this year it is in Chennai, in the south. The deadline for submission is coming up (Jun 21). More details are in the call for papers
Saturday, June 05, 2004
Passwords as one-way functions
This is an interesting article: the premise is that a password should be something that is individualized, but not conscious:
In tests of the picture version, users went through a two-step process to get a set of user certificates, or unconscious passwords. Users were first shown a set of 100 to 200 pictures randomly selected from a database of 20,000 pictures. Pictures were organized in groups of 2 to 9 pictures with a common theme, and each user was certified on one picture from a given theme group. The user then practiced choosing certificate images from entire theme groups.
Later, in lieu of passwords, users identified most of a short series of certificate images. To guard against eavesdropping, each certificate picture is only used once, and the user retrains when they run low.
Subjects were able to recognize previously seen pictures with better than 90 percent accuracy for up to three months. According to the researchers' calculations, the chances that a user who guesses correctly four times in a row is an imposter is less than 1,000th of one percent.
Friday, June 04, 2004
Our ranks are growing :)
Just discovered Jeff Erickson's not-new blog (he has posts dating back to Sep 2003, but most activity appears to be recent). Congratulations on getting tenure !
How did I discover it ? Sitemeter: a great place to satisfy one's inner narcissist.
419 scams and world affairs
I have decided that instead of actually trying to read newspapers to figure out what is going on in the world, I will just monitor my 419 scam mail. So far in the past few years, I have got 419 scam variants from major hotspots in the world:
Nigeria of course
Liberia
Sierra Leone
Haiti
And now Zimbabwe.
Thursday, June 03, 2004
Upcoming events: sausage and soda
- The Symposium on Computational Geometry will be in NYC from Jun 9-11.
- The ACM Symposium on the Theory of Computing (one of the main theory conferences) will be in Chicago from Jun 13-15, where you can also visit the original theory blogger :)
- The paper submission deadline for the Symposium on Discrete Algorithms (SODA) is approaching (July 5). Consider this your first warning...
Math meets Hollywood
(Via slashdot...)
A little-known fact about "Futurama", FOX's cancelled cartoon sitcom about a pizza delivery boy stuck in the future, is that many of the writers had advanced degrees in math or related fields. Read this article for more info.
A personal connection is to Jeff Westbrook, one of the writers. He used to be at AT&T Labs, and is the only theoretician I know* who writes scripts for Hollywood. We even wrote a paper together, so I guess my Hollywood number is 1 !
* Len Adleman consulted on 'A Beautiful Mind' and I believe even "decorated" a blackboard for a scene in the movie.
Princeton math oral exams
Via Michael Nielsen, the Princeton Math Department Graduate Student Guide to Generals (their oral exams).
I found the following excerpt hilarious: this is the start of an orals transcript, with a committee consisting of John Conway, Andrew Wiles, and Charles Fefferman (speakers are marked by the first letter of their last names):
C: "So who is the chair of this committee?"
F (scratching head): "I think it's one of you two."
W: "Is it me?"
C: "I don't know... Perhaps it's me."
W: "How will this be determined?"
F: "Does it matter??"
C: "Let me find out from the office downstairs."
(Just as he was about to leave, I informed Prof. Conway he was chair.
A similar discussion then ensued as to what my special topics were!!
It was quite entertaining. But at last, we were all ready to begin.)
C: "So who wants to ask the first question?"
(Everyone kept looking at Conway. (He was chair after all.) )
C: "Hey, why is everyone looking at me?!?"
(Another entertaining dispute followed.
Finally, Prof. Fefferman agreed to start.)
Wednesday, June 02, 2004
The Strong Law of Truly Misquoted Mathematics
On the one hand it is nice when technical terms from the world of math/CS sneak into common usage. On the other hand it can be rather grating. If I had a dime for every time someone used "exponential growth" to mean "something big", I could probably buy a single share of Google stock...
This excerpt from a Slate article on cell phone usage is another example:
...the cell-phone growth projections are based on premises that are more redolent of 1990s-era techno-enthusiasm than the current decade's putative realism. Some savvy observers of the telecom world detect the beginnings of a bubble. "At some point, we're going to run into the law of large numbers and it's going to be a nasty setup," said Cody Willard, general partner at CL Willard Capital and a columnist for TheStreet.com. "I just don't think we're quite there yet."
I have to ask: exactly what law is Mr. Willard scared of running into ? Let's consider the options:
The Weak Law of Large Numbers:
The probability that the average of n i.i.d. random variables deviates from their common mean by more than any fixed amount tends to zero as n grows.
or even:
The Strong Law of Large Numbers: Not only is the probability of a deviation small; the average itself converges to the mean almost surely.
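(For the record, here is a tiny simulation, nothing more, of what the weak law actually delivers: the running average of fair coin flips settles near 1/2 as the sample grows.)

# The average of n fair coin flips concentrates around 1/2 as n grows.
import random

random.seed(1)
for n in (10, 1000, 100000):
    avg = sum(random.random() < 0.5 for _ in range(n)) / n
    print(n, avg)   # the averages drift toward 0.5 as n increases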
Somehow I don't see how either of these fits into an analysis of the global cell phone market. But wait! There are some more intriguing possibilities.
The Frivolous Theorem of Arithmetic: Almost all natural numbers are really large.
Possible, but Mr. Willard did after all quote a law of large numbers. We should grant him at least some degree of precision in his statements.
The Strong Law of Small Numbers: There aren't enough small numbers to meet the many demands made of them.
This seems somewhat more plausible: after all, soon we might run out of small numbers to describe cell phone usage, at which point the market might collapse.
But I think the real answer lies in a deeper analysis of his prognostication. Note if you will the use of the phrase 'nasty setup'. This clearly indicates his invocation of:
The Law of Truly Large Numbers: With a large enough sample, any outrageous thing is likely to happen.
This indeed settles it. As cell phone usage increases, something bad is going to happen. Now we know why they get paid the big bucks on Wall Street.
Floating point arithmetic is not associative...
Robust computation is of paramount importance, especially in computational geometry, because of the algebraic calculations at the heart of most geometric algorithms and the catastrophic effects of precision errors: you lose incidences, points disappear, algorithms crash, the universe slips into a black hole....
An old article by David Goldberg talks of the perils of floating point computations. Worth reading...
What Every Computer Scientist Should Know about Floating Point Arithmetic
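To see the non-associativity for yourself, a two-line experiment in any IEEE-754 double arithmetic (Python shown here) suffices:

# Floating point addition is not associative: 1.0 is far smaller than the
# spacing between consecutive doubles near 1e20, so it is absorbed when
# added to -1e20 before the cancellation happens.
a, b, c = 1e20, -1e20, 1.0
print((a + b) + c)   # 1.0
print(a + (b + c))   # 0.0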
Tuesday, June 01, 2004
Page Numbering Planar Graphs
In honor of the recently passed Graph Drawing deadline....
Suppose we place all the vertices of a graph on the spine of a book, and partition the edge set into sets such that each set can be drawn on a single page without crossings. The page number of a graph is the minimum number of pages required to draw the graph in this way. Clearly an outerplanar graph has page number 1.
A well-known result by Yannakakis states that any planar graph can be drawn with page number 4, and that this is necessary. A very interesting related result by Lenwood Heath in SODA 1991 states that the edges of a planar graph can be partitioned into two parts, each inducing an outerplanar graph.
It is NP-hard to determine whether a given planar graph has page number 2 or not.
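As with Tverberg points, verifying a certificate here is easy: with the vertices fixed in order along the spine, two edges on the same page cross exactly when their endpoints interleave. A minimal sketch (my own names, not from any of the papers above):

# Check a k-page book embedding: vertices are identified with their positions
# on the spine; two edges on the same page cross iff their endpoints interleave.
def crosses(e, f):
    (a, b), (c, d) = sorted(e), sorted(f)
    return a < c < b < d or c < a < d < b

def valid_book_embedding(pages):
    """pages: list of edge lists; each edge is a pair of spine positions."""
    return all(not crosses(e, f)
               for page in pages
               for i, e in enumerate(page)
               for f in page[i + 1:])

# C4 with spine order 0,1,2,3: the edge (0,3) nests over (1,2), so a single
# page suffices -- consistent with C4 being outerplanar (page number 1).
print(valid_book_embedding([[(0, 1), (1, 2), (2, 3), (0, 3)]]))   # True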