An Online and Generalized Non-negativity Constrained Model for Large-scale Sparse Tensor Estimation on Multi-GPU

Publication date: Available online 22 February 2020
Source: Neurocomputing
Author(s): Linlin Zhuo, Kenli Li, Hao Li, Jiwu Peng, Keqin Li

Abstract: Non-negative Tensor Factorization (NTF) models are effective and efficient at extracting useful knowledge from various types of probabilistic distributions with multi-way information. Current NTF models are mostly designed for problems in computer vision that involve the whole Matricized Tensor Times Khatri-Rao Product (MTTKRP). Meanwhile, a Sparse NTF (SNTF) proposed to solve the problem of sparse Tensor Factorization (TF) can produce large-scale intermediate data. A Single-thread-based SNTF (SSNTF) model has been proposed to address the non-linear time and space overhead caused by this large-scale intermediate data; however, SSNTF is not a generalized model. Furthermore, the above methods cannot describe stream-like data from industrial applications on mainstream processors, e.g., Graphics Processing Units (GPUs) and multi-GPU systems, in an online way. To address these two issues, a Generalized SSNTF (GSSNTF) is proposed, which extends SSNTF to the Euclidean distance, Kullback-Leibler (KL) divergence, and Itakura-Saito (IS) divergence. GSSNTF involves only the feature elements rather than the whole factor matrices, and avoids forming large-scale intermediate matrices while preserving convergence and accuracy. Furthermore, GSSNTF can merge the new data into the state-of-the-art built tree dataset for sparse ...
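The key idea of operating "only on the feature elements rather than the whole factor matrices" can be illustrated with a minimal sketch: an element-wise SGD update for a 3-way sparse non-negative CP factorization under the Euclidean loss. This is not the authors' GSSNTF algorithm; the function name, learning rate, and update scheme are illustrative assumptions. Each observed entry touches only one row of each factor matrix, so no dense MTTKRP intermediate is ever materialized.

```python
import numpy as np

def sparse_ntf_sgd(entries, shape, rank, lr=0.05, epochs=200, seed=0):
    """Hypothetical element-wise sparse NTF sketch (Euclidean loss).

    entries: list of ((i, j, k), value) for observed tensor cells.
    Only the factor rows touched by each observed entry are updated,
    avoiding any large-scale intermediate matrix.
    """
    rng = np.random.default_rng(seed)
    # Non-negative factor matrices for a 3-way CP model.
    A, B, C = (rng.random((n, rank)) + 0.1 for n in shape)
    for _ in range(epochs):
        for (i, j, k), x in entries:
            # Predicted entry: sum over rank of A[i,r] * B[j,r] * C[k,r].
            err = x - np.dot(A[i] * B[j], C[k])
            # Gradients involve only the three rows this entry touches.
            ga = err * B[j] * C[k]
            gb = err * A[i] * C[k]
            gc = err * A[i] * B[j]
            # Project back onto the non-negative orthant after each step.
            A[i] = np.maximum(A[i] + lr * ga, 0.0)
            B[j] = np.maximum(B[j] + lr * gb, 0.0)
            C[k] = np.maximum(C[k] + lr * gc, 0.0)
    return A, B, C
```

Because each update reads and writes only three factor rows, this access pattern also maps naturally onto per-entry GPU threads, which is the setting the paper targets. Extending to KL- or IS-divergence would change only the per-entry gradient terms.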