Research Article
Open access
Published on 23 October 2023

Research and analysis of matrix multiplication in distributed learning algorithms

Xuze Zhou 1, *
  • 1 Hangzhou Dianzi University

* Author to whom correspondence should be addressed.

https://doi.org/10.54254/2755-2721/14/20230753

Abstract

Matrix multiplication has become increasingly important and is applied in many fields, so the need to improve its speed and efficiency keeps growing. In this paper, the author analyzes relevant theories of matrix multiplication as well as the advantages and disadvantages of several applications based on it. It turns out that matrix multiplication still has much room for development, because current methods have many defects and are far from perfect. For example, multiplying large matrices still takes a great deal of time. However, it is encouraging that some improvements have already been achieved that help optimize efficiency, and more advanced methods are beginning to appear. For instance, combining matrix multiplication with AI provides a new direction for future research and development. Consequently, it is foreseeable that significant progress in optimizing matrix multiplication will be made in the future.
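The claim that large matrices take a long time to multiply follows from the cubic cost of the schoolbook algorithm. The sketch below is illustrative only, not code from the paper: it assumes NumPy and a hypothetical naive_matmul helper, and contrasts the O(n^3) triple loop with an optimized BLAS call of the kind that accelerators and distributed schemes aim to speed up further.

```python
# Illustrative sketch (not from the paper): the schoolbook algorithm performs
# n*n*n multiply-adds, which is why large matrices dominate training time.
import numpy as np

def naive_matmul(A, B):
    """Schoolbook O(n^3) multiplication of square matrices A and B."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for k in range(n):          # loop order keeps row accesses contiguous
            for j in range(n):
                C[i, j] += A[i, k] * B[k, j]
    return C

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, B = rng.random((128, 128)), rng.random((128, 128))
    # An optimized BLAS call (A @ B) computes the same result far faster
    # than the pure-Python triple loop above.
    assert np.allclose(naive_matmul(A, B), A @ B)
```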

Keywords

matrix multiplication, theory, accelerator

Cite this article

Zhou, X. (2023). Research and analysis of matrix multiplication in distributed learning algorithms. Applied and Computational Engineering, 14, 18-24.

Data availability

The datasets used and/or analyzed during the current study are available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceedings of the 5th International Conference on Computing and Data Science

Conference website: https://2023.confcds.org/
ISBN: 978-1-83558-019-6 (Print) / 978-1-83558-020-2 (Online)
Conference date: 14 July 2023
Editors: Alan Wang, Marwan Omar, Roman Bauer
Series: Applied and Computational Engineering
Volume number: Vol. 14
ISSN: 2755-2721 (Print) / 2755-273X (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish in this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).