L1-NORM TENSOR DATA ANALYSIS: THEORY AND ALGORITHMS

Background: Modern datasets are collected across diverse sensing modalities and are naturally stored and processed in the form of N-way arrays, also known as tensors. Tucker tensor decomposition is a standard method for the analysis and compression of tensor data. Tucker finds a plethora of applications across fields of science and engineering, such as communications, data analytics, machine intelligence, pattern recognition, and computer vision. From an algorithmic standpoint, Tucker is typically implemented by means of the Higher-Order Singular Value Decomposition (HOSVD) algorithm or the Higher-Order Orthogonal Iterations (HOOI) algorithm.
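
As a concrete reference point, a minimal HOSVD sketch in Python/NumPy is given below (illustrative only; the function names and the example tensor are ours, not taken from any specific implementation): each mode's orthonormal basis is formed from the leading left singular vectors of the corresponding mode unfolding, and the core tensor is then obtained by multi-way projection.

    # Minimal HOSVD sketch (illustrative; not a specific implementation from this work).
    import numpy as np

    def unfold(X, mode):
        # Mode-n unfolding: rows index mode `mode`, columns enumerate the remaining modes.
        return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

    def hosvd(X, ranks):
        # One orthonormal basis per mode, from the leading left singular vectors
        # of the corresponding mode unfolding.
        factors = []
        for n, d in enumerate(ranks):
            U, _, _ = np.linalg.svd(unfold(X, n), full_matrices=False)
            factors.append(U[:, :d])
        # Core tensor: multi-way projection of X onto the spans of the bases.
        G = X.copy()
        for n, U in enumerate(factors):
            G = np.moveaxis(np.tensordot(U.T, G, axes=(1, n)), 0, n)
        return G, factors

    # Example: multilinear-rank-(3, 3, 3) approximation of a random 10x12x14 tensor.
    X = np.random.randn(10, 12, 14)
    G, (U1, U2, U3) = hosvd(X, (3, 3, 3))
    X_hat = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)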

Motivation: Conventional Tucker decomposition minimizes the L2-norm of the residual error of the low-rank approximation obtained by multi-way projection of the original tensor onto the spans of N sought-after orthonormal bases; equivalently, Tucker maximizes the L2-norm of this multi-way projection. Due to its reliance on the L2-norm, Tucker decomposition, whether implemented by means of HOSVD or HOOI, has been shown to be very sensitive to faulty (outlying) entries in the processed tensor. This outlier sensitivity is commonly attributed to the L2/Frobenius-norm formulation itself.
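
In standard Tucker notation (the symbols below are ours, for illustration), for an N-way tensor $\mathcal{X} \in \mathbb{R}^{D_1 \times \cdots \times D_N}$ and orthonormal bases $\mathbf{U}_n \in \mathbb{R}^{D_n \times d_n}$, the conventional formulation reads

\[
\min_{\{\mathbf{U}_n:\ \mathbf{U}_n^\top \mathbf{U}_n = \mathbf{I}_{d_n}\}} \big\| \mathcal{X} - \mathcal{G} \times_1 \mathbf{U}_1 \times_2 \cdots \times_N \mathbf{U}_N \big\|_F^2,
\qquad \mathcal{G} = \mathcal{X} \times_1 \mathbf{U}_1^\top \times_2 \cdots \times_N \mathbf{U}_N^\top,
\]

which is equivalent to the projection-maximization form

\[
\max_{\{\mathbf{U}_n:\ \mathbf{U}_n^\top \mathbf{U}_n = \mathbf{I}_{d_n}\}} \big\| \mathcal{X} \times_1 \mathbf{U}_1^\top \times_2 \cdots \times_N \mathbf{U}_N^\top \big\|_F^2 .
\]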

Contributions: In this line of research, we set the theoretical foundations and develop algorithms for reliable L1-norm-based tensor analysis. Our contributions are as follows. We present the generalized L1-Tucker decomposition for N-way tensors. For the special case of (N=3)-way tensors, we show that rank-1 L1-Tucker2 can be cast as combinatorial optimization and, therefore, solved exactly. Moreover, for N=3, we propose two algorithms that efficiently approximate the solution to general-rank L1-Tucker2: one that follows an alternating-maximization approach and another that conducts optimal single bit-flips combined with subspace deflation. Thereafter, we consider general N-way tensors and develop two new algorithmic frameworks for the solution of L1-Tucker/L1-Tucker2, namely L1-norm Higher-Order SVD (L1-HOSVD) and L1-norm Higher-Order Orthogonal Iterations (L1-HOOI). We provide a complete convergence analysis for L1-HOOI, as well as complexity analyses for L1-HOSVD and L1-HOOI. Finally, we develop a dynamic L1-Tucker solver, suitable for streaming tensor measurements, which is capable of (i) identifying and suppressing outliers on the fly and (ii) dynamically updating the decomposition when the underlying structure of the data changes. We conduct extensive numerical studies on both synthetic and real-world tensor datasets. Our studies illustrate that the L1-Tucker/L1-Tucker2 solvers (1) attain performance similar to that of standard Tucker solvers when the data are nominal/clean and (2) exhibit sturdy resistance to outliers, significantly outperforming standard Tucker solvers when the processed tensor is corrupted.
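
For illustration, the sketch below follows the spirit of L1-HOSVD under the common L1-Tucker formulation in which the Frobenius norm of the multi-way projection is replaced by its entry-wise L1-norm: each mode's basis is obtained from an L1-PCA of the corresponding mode unfolding, and the core is then formed by multi-way projection. The inner L1-PCA step shown here is a simple alternating fixed-point approximation, not the exact, bit-flipping, or L1-HOOI solvers developed in this work; all function names are ours.

    # Illustrative L1-HOSVD-style sketch (approximate; not the exact algorithms of this work).
    import numpy as np

    def unfold(X, mode):
        # Mode-n unfolding: rows index mode `mode`, columns enumerate the remaining modes.
        return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

    def l1_pca_approx(A, d, iters=100, seed=0):
        # Approximate argmax_{Q: Q^T Q = I} ||A^T Q||_1 (entry-wise L1-norm) by
        # alternating between B = sign(A^T Q) and the Procrustes-optimal Q for fixed B.
        rng = np.random.default_rng(seed)
        Q, _ = np.linalg.qr(rng.standard_normal((A.shape[0], d)))
        for _ in range(iters):
            B = np.sign(A.T @ Q)
            B[B == 0] = 1.0
            U, _, Vt = np.linalg.svd(A @ B, full_matrices=False)
            Q_new = U @ Vt
            if np.allclose(Q_new, Q):
                break
            Q = Q_new
        return Q

    def l1_hosvd_sketch(X, ranks):
        # One L1-PCA basis per mode unfolding, then the core by multi-way projection.
        factors = [l1_pca_approx(unfold(X, n), d) for n, d in enumerate(ranks)]
        G = X.copy()
        for n, U in enumerate(factors):
            G = np.moveaxis(np.tensordot(U.T, G, axes=(1, n)), 0, n)
        return G, factors

In this sketch, robustness comes from the L1 objective, under which each entry contributes linearly rather than quadratically, limiting the influence of outliers on the basis updates; an L1-HOOI-style refinement would additionally cycle over the modes, updating each basis with the others held fixed, until the objective stops increasing.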

