As a foundation for optimization, convexity is useful beyond the classical settings of Euclidean and Hilbert spaces. The broader arena of nonpositively curved metric spaces, which includes manifolds like hyperbolic space, as well as metric trees and more general CAT(0) cubical complexes, supports primal tools like proximal operations for geodesically convex functions. However, the lack of linear structure in such spaces complicates dual constructions like subgradients. To address this hurdle, we introduce a new type of subgradient for functions on Hadamard spaces, based on Busemann functions. Our notion supports generalizations of classical stochastic and incremental subgradient methods, with guaranteed complexity bounds. We illustrate with subgradient algorithms for $p$-mean problems in general Hadamard spaces, in particular computing medians in BHV tree space.
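To fix ideas, the simplest Hadamard space is $\mathbb{R}^n$, where the $p$-mean problem with $p=1$ is the classical geometric-median problem and the subgradient method the abstract generalizes takes its familiar form. The sketch below is illustrative only, assuming the Euclidean special case with a standard diminishing step size; the function name and parameters are ours, not the paper's.

```python
# Illustrative Euclidean special case: the geometric median
# (the p = 1 mean) of points in R^2 via the classical subgradient
# method x_{k+1} = x_k - alpha_k * g_k with alpha_k = c / sqrt(k).
# All names and constants here are assumptions for the sketch.
import math

def median_subgradient(points, iters=2000, step=1.0):
    """Minimize f(x) = sum_i ||x - a_i|| over x in R^2."""
    x = list(points[0])
    avg = list(x)  # running (ergodic) average of the iterates
    for k in range(1, iters + 1):
        g = [0.0, 0.0]
        for a in points:
            d = math.hypot(x[0] - a[0], x[1] - a[1])
            if d > 1e-12:  # a valid subgradient at a_i itself is 0
                g[0] += (x[0] - a[0]) / d
                g[1] += (x[1] - a[1]) / d
        alpha = step / math.sqrt(k)  # diminishing step size
        x[0] -= alpha * g[0]
        x[1] -= alpha * g[1]
        avg = [(avg[j] * (k - 1) + x[j]) / k for j in range(2)]
    return avg

# By symmetry, the median of these four points is (1, 0).
pts = [(0.0, 0.0), (2.0, 0.0), (1.0, 3.0), (1.0, -3.0)]
m = median_subgradient(pts)
```

In a general Hadamard space the linear update `x - alpha * g` is unavailable, which is exactly the gap the paper's Busemann-function subgradients are designed to close: the descent direction is replaced by motion along a geodesic ray.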