Thursday, February 23, 2006

Fractal cornrows

So all this while, Muthu was really making nonverbal statements about the fractal nature of the universe.


Friday, February 17, 2006

Chazelle throws down the gauntlet

From physorg.com:
...mathematics produces the equivalent of one-liners – equations that are pithy, insightful, brilliant. Computer science is more like a novel by Tolstoy: it is messy and infuriatingly complex. But that is exactly what makes it unique and appealing – computer algorithms are infinitely more capable of capturing nuances of complex reality in a way that pure mathematics cannot.
Read the whole interview. I don't know about Einsteins of computer science: arguably we've already seen fundamental tectonic shifts in the way we think about theoryCS. But it is telling that some of the bedrock developments in algorithms came way before computers were fashionable; the whole of the 70s was a golden era for the study of basic algorithms. It makes this quote seem oh-so-true, and yet tragic, in that we are still trying to make an argument that we've lived and breathed for so many years.
Theoretical computer science would exist even if there were no computers.
The more I think about this line, the more I appreciate its deceptive subversiveness. In one line, it turns on its head most of the received understanding about the nature of computer science and the nature of algorithms. How indeed could theoretical computer science exist without computers ? This statement can only make sense once you realize that the theory of computation is not about computers, but about the process of computing, about deduction and formal reasoning, and fundamentally about efficiencies: how quickly, how succinctly, how accurately ?

The confluence of computer science and programming has brought our field great riches and much attention, and for that we should be grateful. But it is time for our field to take its rightful place as the language of quantitative and effective science, as Bernard rightly puts it.

(HT: Neighbourhood of Infinity)

Update: Bernard kindly points me to an essay that expands upon the things he says in the interview. It's a freewheeling ride that smacks of crazy guitar riffs and strange rhythms. Don't believe me ? Read it for yourself. Here's a sampler:
Moore's Law has ruled the roost for the last 40 years. All the oohs and aahs you hear about the digital revolution are nothing but the squeals humans emit when tickled pink by Moore's Law. From the nice (medical imaging, e-commerce, whole-genome sequencing) to the vital (Xbox, IM, iPod), its rule has been a veritable ticklefest. Moore's Law has been the sizzling cauldron in which savvy cooks have whipped up a dazzling variety of tasty dishes. Without it, the Information Superhighway would be a back alley to Snoozeville; the coolest thing about a computer would still be the blinking lights.
Update II: Lance and Ernie weigh in as well.


Thursday, February 16, 2006

A message from the TCS funding committee

Via Sanjeev Arora, a note from the TCS funding committee. A modified version of this will appear in the March issue of SIGACT News. A summary:
  • There will be a call for proposals in the NSF theory program this spring, and grant sizes are expected to be larger than before. So please apply and send good proposals.
  • The report outlines things you can do to help improve funding for TCS (please read and act upon). It also describes initiatives launched by the Karp committee to help bring more funding to TCS.
  • Please feel free to forward this email to all interested parties. We have also started a new moderated mailing list, tcs-funding (posts will automatically be cc'ed to the older theory-advocacy list). Posting is by permission only, so the list is guaranteed to be spam-free: only news related to grants and funding.


Wednesday, February 15, 2006

And it's OUT !!!!

54, count 'em, 54 juicy sausages at the 2006 sausage fair in Sedona. Alas, I submitted a hot dog.

Some notables:
  • (discussed earlier) Minimum Weight Triangulation is NP-hard. Wolfgang Mulzer, Guenter Rote.
  • On the worst case complexity of the k-means method. David Arthur, Sergei Vassilvitskii. They show an exponential lower bound for the convergence of k-means. An intricate and interesting construction.
  • On the ICP Algorithm. Esther Ezra, Micha Sharir, Alon Efrat. This paper analyzes a rather well known algorithm for the registration of point sets, the so-called "Iterative Closest Point" or ICP algorithm.
It's worth noting that the latter two papers are both about a rigorous analysis of well known heuristics. Sariel has more notables.
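(For readers who haven't run into it, the k-means method analyzed by Arthur and Vassilvitskii is Lloyd's alternating iteration. Here is a minimal Python sketch of it on made-up data; the exponential lower bound above concerns how many rounds the while loop below can take before the centers stabilize.)

```python
import numpy as np

def kmeans(points, k, seed=0):
    """Lloyd's iteration: alternate nearest-center assignment and
    centroid updates until the centers stop moving. The number of
    rounds this loop takes is what the lower bound is about."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)]
    while True:
        # Assignment step: label each point with its nearest center.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each center to the mean of its cluster
        # (keeping the old center if its cluster is empty).
        new_centers = np.array([points[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            return labels, centers
        centers = new_centers

# Toy input: three planar Gaussian blobs.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2))
                 for c in [(0, 0), (3, 0), (0, 3)]])
labels, centers = kmeans(pts, k=3)
```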

Usual disclaimer: I hold no grudges against papers not mentioned. There are only so many hours in the day :). Am happy to mention papers that others find noteworthy: use the comments or email me.

Update (2/16): 142 papers were submitted.

p.s. for those not in the know, if you say SoCG fast enough, it becomes 'sausage'.

p.p.s. For Mr Googlebot, "SoCG 2006 accepted papers".


Tuesday, February 14, 2006

Latest Baez

Is up, and this time it's on complexity theory ! It even contains a post-within-a-post, by Scott Aaronson. Baez covers crypto and pseudorandomness, and has a ton of pointers as usual. The most hilarious link:
From their home page:
...researchers can't just "think" of deterministic numbers; the human brain is an incredibly complex system that is poorly understood. In addition, medical research shows that our brains can be influenced by electromagnetic radiation, including cellular phones and even microwaves. It obviously wouldn't do to have important issues of national security corrupted by stray radio waves! Luckily, computers can help us solve this problem.




Monday, February 13, 2006

Concentration of Measure

Devdatt Dubhashi and Alessandro Panconesi have a first draft of a monograph on concentration of measure, essentially the study of tail bounds for random variables: given a random variable X, how likely is it to deviate far from its expectation E[X] ?

Their goal is to write a "user-friendly" guide to measure concentration. From the preface:
Our main goal here is to give an exposition of the basic methods for measure concentration in a manner which makes it accessible to the researcher in randomized algorithms and enables him/her to quickly start putting them to work in his/her application.
The book starts with the basic Chernoff-Hoeffding bounds and variants. It then delves deeper into "more recent" technology like Azuma's inequality, martingales, the method of bounded differences, and more. Talagrand's inequality comes next, followed by the log-Sobolev inequality.
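(To get a feel for the simplest of these, here is a small numerical sketch, mine rather than the book's: for X the number of heads in n fair coin flips, the Chernoff-Hoeffding bound gives Pr[|X - E[X]| >= t] <= 2 exp(-2t^2/n), and a quick simulation shows how the empirical tail sits below the bound.)

```python
import numpy as np

# X = number of heads in n fair coin flips, so E[X] = n/2.
# Hoeffding: Pr[|X - E[X]| >= t] <= 2 * exp(-2 * t**2 / n).
n, trials = 1000, 200_000
rng = np.random.default_rng(0)
X = rng.binomial(n, 0.5, size=trials)
dev = np.abs(X - n / 2)

for t in (20, 40, 60, 80):
    empirical = np.mean(dev >= t)       # observed tail probability
    bound = 2 * np.exp(-2 * t**2 / n)   # Chernoff-Hoeffding bound
    print(f"t={t:3d}  empirical {empirical:.6f}  bound {bound:.6f}")
```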

What the draft does really well is connect the dots between the techniques and how they are applied. With exercises, problems, and worked-out examples, it would work well as the main text for a course, or as a companion text. I should add that it is very well written, balancing exposition and rigor at the level that I like (your mileage may vary).

It's a book that's really needed. There are a lot of sophisticated methods for analyzing tail bounds, and a text written like this can potentially make martingales and Talagrand's inequality the kind of as-natural-as-breathing analysis tools that Chernoff bounds currently are.

Now I can only hope the authors manage to finish the draft. If you like the manuscript, maybe you should email them with some encouragement !



Grants.gov

USACM discusses a Washington Post article about the new Grants.gov system for viewing and filing funding applications. Apparently the new system requires you to download a client program to view and submit applications. The kicker is that this is a Windows-only app; they recently started using Citrix as a way of dealing with this problem, but there are still no clean solutions.

The Washington Post article explicitly refers to Mac users (because of the preponderance of Macs in the natural sciences, I imagine), but the problems affect Linux users as well. I have never had to use this system, and since the NSF also allows submission via FastLane, I doubt many of my readers have either. Still, I find it rather amazing that the government felt the need to design a special application just to allow researchers to connect securely to the main funding servers.

Update: Dave Schroeder of UWisc says:
The University of Wisconsin has released a standalone package for using Grants.gov on Mac OS X as a service to the community. The package uses Citrix client software and a special settings file to access the central Citrix server provided by Grants.gov, allowing users to access and use the PureEdge software via the remote Windows machine running Citrix server software:

http://apple.doit.wisc.edu/grants.gov/

Wednesday, February 08, 2006

String theory and NP-hardness

Via Peter Woit, a paper that proves the NP-hardness of a problem relating to the (in)famous string theory "landscape". Specifically,
We study the computational complexity of the physical problem of finding vacua of string theory which agree with data, such as the cosmological constant, and show that such problems are typically NP hard.
I mention this because it appears to be among the first few examples of NP-hardness impinging on the frontiers of physics. The paper has a very lucid explanation of some of the places in string theory where NP-hardness and other complexity notions pop up.

Some skepticism as to the relevance of this proof, here.

Tuesday, February 07, 2006

February is the cruelest month

I was recently grumbling about conflicting deadlines: conference A announces results well after conference B's submission deadline has passed. STOC and SoCG have an ongoing conflict that I have always felt "conflicted" about. In fairness though, with the number of conferences just in the broad area of algorithms/theory, it's hard not to have conflicts. This got me thinking: exactly how bad are the collisions ?

I decided to make a somewhat ad hoc list of conferences in the general scope of algorithms and theory (i.e., topics in these conferences will often show up in STOC/FOCS/SODA), and look at their submission deadlines (a man's got to do something while waiting for his sausage...). The resulting timeline is reproduced below; the blue bar indicates the span of time from submission to result announcement, and the red dot indicates the actual date of the conference.

It is not surprising to see how academic schedules appear to influence conference scheduling (travelling in the summer is so much easier), but I did expect to see a slightly more equitable distribution of deadlines over the year. Indeed, February is the cruelest month.
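(If you want to produce such a chart yourself, a few lines of matplotlib suffice. The dates below are made up for illustration; they are not the actual 2006 deadlines.)

```python
import matplotlib.pyplot as plt
from datetime import date

# Hypothetical (name, submission deadline, notification, conference) tuples.
confs = [
    ("Conf A", date(2006, 2, 1), date(2006, 4, 15), date(2006, 6, 10)),
    ("Conf B", date(2006, 2, 20), date(2006, 5, 1), date(2006, 8, 5)),
    ("Conf C", date(2006, 7, 10), date(2006, 9, 20), date(2006, 12, 15)),
]

fig, ax = plt.subplots()
for i, (name, sub, notify, conf) in enumerate(confs):
    # Blue bar: span from submission deadline to result announcement.
    ax.plot([sub, notify], [i, i], color="blue", linewidth=6)
    # Red dot: the conference itself.
    ax.plot([conf], [i], "ro")
ax.set_yticks(range(len(confs)))
ax.set_yticklabels([name for name, *_ in confs])
fig.autofmt_xdate()
plt.show()
```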

Notes and caveats:
  • I used the most recent conference dates; there are variations from year to year, but not significant ones.
  • WADS and SWAT are really the same conference, but alternate years.
  • RANDOM and APPROX operate together.
  • I wonder why ISAAC and FSTTCS, which are both Asian theory conferences, schedule themselves to collide that way.
  • As some pointed out, these are not conflicts per se; the number of people submitting to (say) COLT and CPM at the same time is probably small. I was more intrigued by the density of deadlines in the early part of the year.
Not only does most travel happen in the summer; I imagine that much research must happen then as well. It's a good time to get manuscripts ready before the fall onslaught begins.



Sunday, February 05, 2006

Welcome To The Machine

S. Muthukrishnan, also known as Muthu, also known as the person who knows everyone in theoryCS*, has decided to eat some pizza. Welcome !!!



* A little known corollary is that if Muthu doesn't know you, you may not actually exist.

Thursday, February 02, 2006

I didn't realize finding airfares was that hard.

But it does make me feel a lot better. From a talk announcement over at [LB,UB]:
At any moment there are between 2,000 and 10,000 commercial airliners in the sky, part of a dense network that provides, for example, more than 100,000 practical paths from Boston to the San Francisco area every day. At its core, finding a sequence of flights that meets the user’s stated time constraints is a path-planning problem which can be solved with well-known techniques. But the airfare search problem is much more complex than that. In fact, the airlines’ price structure is so rich that finding the cheapest price for a simple round-trip journey is in the general case provably undecidable. Even if one bounds the size of solutions to a small number of flights there may be more than 10^20 reasonable answers to a simple travel query. The problem is compounded by the fact that airline revenue management systems are constantly and dynamically adjusting the prices for each flight along a discretized scale.
Seriously though, how is this even possible ? There are a finite number of routes at any given time, and I am assuming that no one pays me to fly (I'm ignoring voluntary bumps of course), so the total path length must increase if I take longer and more byzantine routes...
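(For contrast, the "well-known techniques" part of the quote, finding a sequence of flights satisfying time constraints, really is easy. Here is a minimal Dijkstra-style earliest-arrival search over a made-up timetable; it's the fare rules layered on top of such itineraries that produce the undecidability.)

```python
import heapq

# Toy timetable (all flights made up): origin, departure, destination,
# arrival, with times in minutes after midnight.
flights = [
    ("BOS", 8 * 60, "ORD", 10 * 60),
    ("BOS", 9 * 60, "JFK", 10 * 60),
    ("ORD", 11 * 60, "SFO", 14 * 60),
    ("JFK", 12 * 60, "SFO", 15 * 60),
]

def earliest_arrival(start, goal, ready, min_connect=30):
    """Dijkstra-style search on (time, airport) states: always expand
    the earliest state, so the first time we reach the goal is optimal."""
    heap = [(ready, start)]
    best = {start: ready}
    while heap:
        t, airport = heapq.heappop(heap)
        if airport == goal:
            return t
        layover = 0 if airport == start else min_connect
        for origin, dep, dest, arr in flights:
            # A flight is usable if it leaves our airport after we can board.
            if origin == airport and dep >= t + layover:
                if arr < best.get(dest, float("inf")):
                    best[dest] = arr
                    heapq.heappush(heap, (arr, dest))
    return None

print(earliest_arrival("BOS", "SFO", ready=7 * 60))  # 840, i.e. a 2pm arrival
```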

Update (2/3): Michael Mitzenmacher indicates that there is more to the problem than meets the eye. In fact, he goes further:
If you have any smart students who are looking for a job in a "real-world" company, I'd strongly recommend they look at ITA software. Obviously I've drunk the Kool-Aid, but I think they'll continue to be an innovative, leading company in the travel space. And heck, how many companies do you know that even think to advertise themselves by giving a talk about the undecidable problems they are tackling!



STOC Numerology

Via Piotr via Sariel, I hear that there were 288 submissions to STOC this year (and 78 accepted), making a net acceptance rate of 27.1%. Comparing this to Jeff's charts from last year, it looks like both submissions and acceptances held steady.
