Title | Randomization or Condensation?: Linear-Cost Matrix Sketching Via Cascaded Compression Sampling |
Publication Type | Conference Paper |
Year of Publication | 2017 |
Authors | Zhang, Kai, Liu, Chuanren, Zhang, Jie, Xiong, Hui, Xing, Eric, Ye, Jieping |
Conference Name | Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining |
Publisher | ACM |
Conference Location | New York, NY, USA |
ISBN Number | 978-1-4503-4887-4 |
Keywords | cascaded compression sampling, composability, compressive sampling, Cyber-physical systems, low-rank decomposition, matrix sketching, nystrom method, privacy, pubcrawl, random projection, randomized algorithms, resilience, Resiliency |
Abstract | Matrix sketching aims to find compact representations of a matrix that preserve most of its properties, and it is a fundamental building block in modern scientific computing. Randomized algorithms represent the state of the art and have attracted broad interest from machine learning, data mining, and theoretical computer science. However, they still require access to the entire input matrix to produce the desired factorizations, which can be a major computational and memory bottleneck in truly large problems. In this paper, we uncover an interesting theoretical connection between matrix low-rank decomposition and lossy signal compression, based on which a cascaded compression sampling framework is devised to approximate an m-by-n matrix in only O(m+n) time and space. Indeed, the proposed method accesses only a small number of matrix rows and columns, which significantly improves the memory footprint. Meanwhile, by sequentially teaming two rounds of approximation procedures and upgrading the sampling strategy from a uniform probability to more sophisticated, encoding-oriented sampling, significant algorithmic boosting is achieved to uncover more granular structures in the data. Empirical results on a wide spectrum of real-world, large-scale matrices show that, using only linear time and space, the accuracy of our method rivals that of state-of-the-art randomized algorithms consuming a quadratic, O(mn), amount of resources. (An illustrative sketch of such two-round sampling follows this record.)
URL | http://doi.acm.org/10.1145/3097983.3098050 |
DOI | 10.1145/3097983.3098050 |
Citation Key | zhang_randomization_2017 |
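The abstract describes a two-round strategy: probe the matrix with uniformly sampled rows and columns, then resample with data-dependent probabilities, so that only O(m+n) entries of the matrix are ever read. The NumPy sketch below illustrates that general idea with a CUR-style approximation; the function name `cascaded_cur_sketch`, the squared-norm second-round probabilities, and all parameters are illustrative assumptions, not the paper's actual cascaded compression sampling algorithm.

```python
# Illustrative two-round CUR-style sketch (an assumption for exposition,
# not the paper's exact cascaded compression sampling algorithm).
import numpy as np


def cascaded_cur_sketch(A, c=20, r=20, seed=None):
    """Return C, U, R with A ~= C @ U @ R using two sampling rounds."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    c, r = min(c, n), min(r, m)

    # Round 1: uniform row/column sampling (a coarse probe of A).
    cols0 = rng.choice(n, size=c, replace=False)
    rows0 = rng.choice(m, size=r, replace=False)
    C0 = A[:, cols0]   # m x c sampled columns
    R0 = A[rows0, :]   # r x n sampled rows

    # Round 2: data-dependent sampling probabilities estimated only from
    # the round-1 submatrices (a norm-based stand-in for the paper's
    # encoding-oriented sampling), keeping the cost linear in m + n.
    p_col = (R0 ** 2).sum(axis=0)
    p_col /= p_col.sum()
    p_row = (C0 ** 2).sum(axis=1)
    p_row /= p_row.sum()
    cols1 = rng.choice(n, size=c, replace=False, p=p_col)
    rows1 = rng.choice(m, size=r, replace=False, p=p_row)

    C = A[:, cols1]                 # refined column sample (m x c)
    R = A[rows1, :]                 # refined row sample    (r x n)
    W = A[np.ix_(rows1, cols1)]     # intersection block    (r x c)
    U = np.linalg.pinv(W)           # core matrix           (c x r)
    return C, U, R


# Usage on a synthetic low-rank matrix; only sampled rows and columns of A
# are read inside the sketch (the error check below is for testing only).
A = np.random.randn(2000, 30) @ np.random.randn(30, 1500)
C, U, R = cascaded_cur_sketch(A, c=60, r=60, seed=0)
err = np.linalg.norm(A - C @ U @ R) / np.linalg.norm(A)
print(f"relative approximation error: {err:.3e}")
```

Computing the second-round probabilities from the round-1 submatrices C0 and R0, rather than from the full matrix, is what keeps the sampling cost at O(m+n); the paper's encoding-oriented sampling is more refined than this squared-norm proxy.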