Jonathan Kadmon, Stanford University
Often, large, high-dimensional datasets collected across multiple modalities can be organized as a higher-order tensor. Low-rank tensor decomposition then arises as a powerful and widely used tool to discover simple low-dimensional structures underlying such data. However, we currently lack a theoretical understanding of the algorithmic behavior of low-rank tensor decompositions. We derive Bayesian approximate message passing (AMP) algorithms for recovering arbitrarily shaped low-rank tensors buried within noise, and we employ dynamic mean field theory to precisely characterize their performance. Our theory reveals the existence of phase transitions between easy, hard, and impossible inference regimes, and displays an excellent match with simulations. Moreover, it reveals several qualitative surprises compared to the behavior of symmetric, cubic tensor decomposition. Finally, we compare our AMP algorithm to the most commonly used algorithm, alternating least squares (ALS), and demonstrate that AMP significantly outperforms ALS in the presence of noise.
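To make the problem setting concrete, the following is a minimal sketch (not the paper's AMP algorithm) of the baseline it compares against: recovering a planted rank-1 factor from a noisy 3-way tensor via alternating least squares. The tensor shape, signal strength, noise scaling, and iteration count are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2, n3 = 30, 40, 50  # arbitrary asymmetric tensor shape (assumption)

# Planted unit-norm factors defining a rank-1 signal tensor.
u = rng.standard_normal(n1); u /= np.linalg.norm(u)
v = rng.standard_normal(n2); v /= np.linalg.norm(v)
w = rng.standard_normal(n3); w /= np.linalg.norm(w)

signal = 5.0  # signal-to-noise strength (assumed parameter)
T = signal * np.einsum('i,j,k->ijk', u, v, w)
T += rng.standard_normal((n1, n2, n3)) / np.sqrt(n1)  # additive Gaussian noise

# ALS: cyclically update each factor as the least-squares fit
# holding the other two fixed; for rank 1 this is a tensor contraction.
a = rng.standard_normal(n1)
b = rng.standard_normal(n2)
c = rng.standard_normal(n3)
for _ in range(50):
    a = np.einsum('ijk,j,k->i', T, b, c) / ((b @ b) * (c @ c))
    b = np.einsum('ijk,i,k->j', T, a, c) / ((a @ a) * (c @ c))
    c = np.einsum('ijk,i,j->k', T, a, b) / ((a @ a) * (b @ b))

# Overlap of the estimate with the planted factor (sign-invariant).
overlap = abs(a @ u) / np.linalg.norm(a)
print(f"overlap with planted factor u: {overlap:.3f}")
```

At this (favorable) signal strength ALS recovers the planted factor with high overlap; the paper's point is that as noise grows, AMP continues to succeed in regimes where ALS degrades.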