Using Tarjan's Red Rule for Fast Dependency Tree Construction
We focus on the problem of efficiently learning dependency trees. It is well known that, given the pairwise mutual information coefficients, a minimum-weight spanning-tree algorithm solves this problem exactly and in polynomial time. However, for large datasets it is the construction of the matrix of these coefficients that dominates the running time. We have developed a new spanning-tree algorithm that exploits partial knowledge about edge weights. The partial knowledge we maintain is a probabilistic confidence interval on each coefficient, derived by examining only a small sample of the data. The algorithm flags when an interval needs to be shrunk, which translates into inspecting more data for that particular attribute pair. Experimental results show running time that is near-constant in the number of records, without significant loss in the accuracy of the generated trees. Interestingly, our spanning-tree algorithm is based solely on Tarjan's red-edge rule, which is generally considered a guaranteed recipe for bad performance.
Neural Information Processing Systems
Dec-31-2003
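The abstract's central idea, applying Tarjan's red rule (the heaviest edge on any cycle cannot belong to a minimum spanning tree) when edge weights are only known up to confidence intervals, can be illustrated with a minimal sketch. This is not the authors' implementation; the `Edge` class, the `try_red_rule` helper, and the negated-mutual-information weights below are illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the paper's code): the red rule applied
# to edges whose weights are known only as confidence intervals. Weights stand
# in for negated mutual information estimates, so the minimum-weight spanning
# tree corresponds to the Chow-Liu dependency tree.

from dataclasses import dataclass

@dataclass
class Edge:
    u: str
    v: str
    lo: float  # lower confidence bound on the edge weight
    hi: float  # upper confidence bound on the edge weight

def try_red_rule(cycle):
    """Given the edges of a cycle, return ('red', edge) if some edge is provably
    the heaviest (red rule: exclude it from the tree), or ('refine', edges)
    listing edges whose intervals must be tightened, i.e. more records inspected
    for those attribute pairs, before the rule can fire."""
    for e in cycle:
        others = [f for f in cycle if f is not e]
        # e is provably heaviest if its lower bound beats every other upper bound.
        if all(e.lo > f.hi for f in others):
            return ('red', e)
    # No edge dominates: report the intervals that overlap the top candidate.
    top = max(cycle, key=lambda e: e.hi)
    overlapping = [e for e in cycle if e.hi >= top.lo]
    return ('refine', overlapping)

if __name__ == "__main__":
    # A triangle over three attributes; weights are interval estimates of -I(X;Y).
    cycle = [Edge('A', 'B', -0.90, -0.70),
             Edge('B', 'C', -0.60, -0.40),
             Edge('A', 'C', -0.35, -0.20)]
    print(try_red_rule(cycle))  # A-C is provably heaviest, so it is colored red
```

In the 'refine' case, a driver loop would examine additional records for the flagged attribute pairs to shrink their intervals, then retry the rule; this is one plausible way to realize the "inspect more data only when needed" behavior the abstract describes.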