A binary search tree is probably the most basic data structure in computer science, and is one of the first structures you come across when learning algorithms. Given a set of keys and an order defined on them, a binary search tree maintains the property that the key of a node is greater than all keys in its left subtree and less than all keys in its right subtree.
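For concreteness, here is a minimal sketch (in Python, with node field names of my own choosing) of how that ordering invariant drives a search: at each node you discard one entire subtree and descend into the other.

    class Node:
        """A BST node: all keys in the left subtree are smaller,
        all keys in the right subtree are larger."""
        def __init__(self, key, left=None, right=None):
            self.key = key
            self.left = left
            self.right = right

    def search(root, key):
        """Walk down from the root, using the ordering invariant
        to prune one subtree at every step."""
        node = root
        while node is not None:
            if key == node.key:
                return node
            node = node.left if key < node.key else node.right
        return None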
One of the most important outstanding problems in data structures is designing an optimal BST. Structures like red-black trees and splay trees can be shown to take O(log n) time (amortized) per insert/delete/find operation, and this is tight, in the sense that there exist sequences of updates of length m for which Omega(m log n) tree operations are required. For charging purposes, a single 'splay' or 'rotation' takes one unit of time.
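As a rough illustration of that cost model, a rotation is just a constant-size pointer rearrangement that preserves the BST ordering, which is why it is charged one unit of time. A sketch of a right rotation, assuming the Node class from the earlier snippet (this is generic textbook code, not taken from any of the papers discussed here):

    def rotate_right(y):
        """Right-rotate around y: its left child x becomes the new
        subtree root. Only a constant number of pointers change,
        and the in-order sequence of keys is unchanged."""
        x = y.left
        y.left = x.right
        x.right = y
        return x  # new root of this subtree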
However, if we restrict ourselves to a static universe, with only searches, can we do better with respect to the optimal offline solution? The Dynamic Optimality conjecture, first stated by Sleator and Tarjan, posits that splay trees cost only a constant factor more than an optimal offline algorithm. Note that this optimal algorithm must be allowed to change the tree (hence the term 'Dynamic'); if the optimal offline algorithm is forced to choose a fixed tree, then splay trees are already known to be asymptotically optimal.
A new FOCS 2004 paper by Demaine, Harmon, Iacono and Patrascu takes a crack at the Dynamic Optimality conjecture, presenting an algorithm that comes within an O(log log n) factor of the optimal offline solution.