

In this section, we first prove that running constant-approximation algorithms on fairlets gives good solutions for the value objective, and then give constant-approximation algorithms for both the revenue and the value objectives in the weighted hierarchical clustering problem, as mentioned in Corollaries 9 and 12. That is, we give a weighted version of average-linkage, for both the weighted revenue and the weighted value objectives, and a weighted $(\epsilon/n)$-locally densest cut algorithm, which works for the weighted value objective. Both proofs are easily adapted from the previous proofs in [19] and [33].

A.1 Running constant-approximation algorithms on fairlets

In this section, we prove Theorem 10, which says that if we run any $\beta$-approximation algorithm for the upper bound on weighted value on the fairlet decomposition, we get a fair tree with minimal loss in the approximation ratio. For the remainder of this section, fix any hierarchical clustering algorithm $A$ that is guaranteed, on any weighted input $(V, d, m)$, to construct a hierarchical clustering with objective value at least $\beta \cdot m(V)\, d(V)$ for the value objective.

In the following definition, we transform the point set into a new set of points that are weighted. We will analyze $A$ on this new set of points. We then show how we can relate this to the objective value of the optimal tree on the original set of points.

We begin by observing that the objective value $A$ receives on the instance $V(Y)$ is large compared to the weights in the original instance. On the instance $V(Y)$, the algorithm $A$ has a total weighted objective of at least $\beta(1-\epsilon)\, n\, d(V)$.
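To see why a bound of this form is plausible, the following sketch traces the chain of inequalities under two illustrative assumptions that are not restated from the omitted definition of $V(Y)$: each fairlet $Y_i$ of the decomposition $\mathcal{Y} = \{Y_1, \dots, Y_k\}$ is contracted to a single point of weight $|Y_i|$, with distances between contracted points given by sums of inter-fairlet distances, and the total intra-fairlet distance is at most $\epsilon \cdot d(V)$. This is a sketch under these stated assumptions, not the paper's own derivation.
\begin{align*}
  m(V(\mathcal{Y})) &= \sum_{i=1}^{k} |Y_i| \;=\; n, \\
  d(V(\mathcal{Y})) &= \sum_{i < j} d(Y_i, Y_j) \;=\; d(V) - \sum_{i=1}^{k} d(Y_i) \;\ge\; (1-\epsilon)\, d(V), \\
  \text{objective of } A \text{ on } V(\mathcal{Y}) &\ge \beta \cdot m(V(\mathcal{Y}))\, d(V(\mathcal{Y})) \;\ge\; \beta(1-\epsilon)\, n\, d(V),
\end{align*}
where $d(Y_i, Y_j) = \sum_{u \in Y_i,\, v \in Y_j} d(u, v)$ denotes the total inter-fairlet distance and $d(Y_i) = \sum_{\{u, v\} \subseteq Y_i} d(u, v)$ the total intra-fairlet distance, and the last line applies the assumed guarantee of $A$ on weighted inputs.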