
Portfolio Optimization Strategy Based on Four Deep Learning Models
1 Xiamen University
* Author to whom correspondence should be addressed.
Abstract
Deep learning techniques have provided a fresh outlook on the evergreen problem of portfolio optimization in finance. This article selects the stocks of Google, Tesla, Tractor Supply Company, Analog Devices, and Duke Energy Corporation, and deploys four deep learning models to estimate their returns and covariances. For each deep learning model, the mean-variance model then uses the predicted outcomes to generate a target portfolio. Finally, the returns of each portfolio are compared against those of the market benchmark (S&P 500). The findings demonstrate that the proposed target portfolios outperform the benchmark across multiple financial metrics. This study highlights the promising applications of deep learning in the financial sector, offering insights into innovative portfolio allocation strategies for risk-averse investors who aim to achieve stable, positive returns even in turbulent market conditions.
Keywords
RNNs, self-attention, transformer, portfolio optimization, mean-variance
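The pipeline described in the abstract feeds each model's predicted returns and covariances into a mean-variance optimizer. Below is a minimal sketch of that final allocation step, assuming one model's predictions are already in hand; all numbers are illustrative placeholders, not the paper's data, and the unconstrained closed-form solution stands in for whatever solver the authors used.

```python
import numpy as np

# Illustrative placeholders for one model's predictions -- NOT the paper's data.
tickers = ["GOOGL", "TSLA", "TSCO", "ADI", "DUK"]
mu = np.array([0.12, 0.20, 0.10, 0.11, 0.07])   # predicted annual returns (assumed)
vol = np.array([0.25, 0.50, 0.22, 0.28, 0.15])  # assumed annual volatilities
corr = np.full((5, 5), 0.3)                     # assumed uniform pairwise correlation
np.fill_diagonal(corr, 1.0)
Sigma = np.outer(vol, vol) * corr               # predicted covariance matrix

def mean_variance_weights(mu, Sigma, risk_aversion=3.0):
    """Maximize mu'w - (risk_aversion/2) w'Sigma'w, then normalize to sum to 1."""
    w = np.linalg.solve(risk_aversion * Sigma, mu)  # unconstrained optimum
    return w / w.sum()                              # enforce the budget constraint

w = mean_variance_weights(mu, Sigma)
print(dict(zip(tickers, np.round(w, 3))))
```

In the paper's setting this step would be repeated once per deep learning model, producing one target portfolio per model. A production version would typically add no-short-sale or turnover constraints and solve the resulting quadratic program numerically.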
Cite this article
Zhang, E. (2023). Portfolio Optimization Strategy Based on Four Deep Learning Models. Advances in Economics, Management and Political Sciences, 47, 295-302.
Data availability
The datasets used and/or analyzed during the current study are available from the authors upon reasonable request.
Disclaimer/Publisher's Note
The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
About volume
Volume title: Proceedings of the 2nd International Conference on Financial Technology and Business Analysis
© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication, with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (see Open access policy for details).