Research Article
Open access
Published on 26 November 2024

Depression Detection Method Using Multimodal Social Media Data: An Integrative Review

Jiawen Ma 1,*
  • 1 School of Information Science and Engineering, Yanshan University

* Author to whom correspondence should be addressed.

https://doi.org/10.54254/2755-2721/106/20241291

Abstract

An increasing number of people suffer from depression as chronic stress levels rise. With the advent of Web 2.0, individuals are increasingly inclined to express their emotions on social media, offering new opportunities for depression prediction. Researchers have developed various single-modal methods for early-stage depression prediction. More recently, multimodal social media data has been used to improve the accuracy of depression detection. These methods extract multidimensional information, such as text, language, and images, from social media users and integrate these diverse modalities to assess the risk or severity of depression, significantly improving prediction precision. However, this research is still in its early stages, facing challenges such as limited datasets and many areas requiring further improvement. To help researchers better understand and refine multimodal approaches, this review summarizes emerging research directions in the use of multimodal techniques for depression prediction on social media. It also compares different depression detection methods, datasets, and the modalities used in multimodal approaches, analyzing their strengths and limitations, and it concludes with suggestions for future research.
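
To make the fusion step concrete, the sketch below shows one common way such methods combine modalities: embeddings extracted separately from a user's text, audio, and images are projected into a shared space, concatenated, and passed to a classifier (late fusion). This is a minimal illustration in PyTorch; the class name, embedding dimensions, and concatenation-based fusion are assumptions for exposition, not the architecture of any particular study covered in this review.

import torch
import torch.nn as nn

class LateFusionDepressionClassifier(nn.Module):
    """Fuses pre-extracted text, audio, and image embeddings by concatenation.

    All dimensions are illustrative assumptions, not values from the review.
    """

    def __init__(self, text_dim=768, audio_dim=128, image_dim=512,
                 hidden_dim=256, num_classes=2):
        super().__init__()
        # Project each modality into a shared hidden space before fusing.
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        self.audio_proj = nn.Linear(audio_dim, hidden_dim)
        self.image_proj = nn.Linear(image_dim, hidden_dim)
        # Late fusion: concatenate the projected features, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(3 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden_dim, num_classes),  # e.g., depressed vs. control
        )

    def forward(self, text_emb, audio_emb, image_emb):
        fused = torch.cat([
            torch.relu(self.text_proj(text_emb)),
            torch.relu(self.audio_proj(audio_emb)),
            torch.relu(self.image_proj(image_emb)),
        ], dim=-1)
        return self.classifier(fused)

# Usage with random stand-ins for features extracted per user (batch of 4):
model = LateFusionDepressionClassifier()
logits = model(torch.randn(4, 768), torch.randn(4, 128), torch.randn(4, 512))
print(logits.shape)  # torch.Size([4, 2])

Concatenation-based late fusion is only the simplest baseline; more elaborate schemes (e.g., attention-based or hierarchical fusion that weights modalities per user) build on the same idea of mapping heterogeneous features into a common space before classification.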

Keywords

depression detection, multimodal social media data, deep learning, feature fusion.

Cite this article

Ma, J. (2024). Depression Detection Method Using Multimodal Social Media Data: An Integrative Review. Applied and Computational Engineering, 106, 44-51.

Data availability

The datasets used and/or analyzed during the current study are available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceedings of the 2nd International Conference on Machine Learning and Automation

Conference website: https://2024.confmla.org/
ISBN: 978-1-83558-707-2 (Print) / 978-1-83558-708-9 (Online)
Conference date: 21 November 2024
Editor: Mustafa ISTANBULLU
Series: Applied and Computational Engineering
Volume number: Vol. 106
ISSN: 2755-2721 (Print) / 2755-273X (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish in this series agree to the following terms:
1. Authors retain copyright and grant the series the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of its authorship and initial publication in this series.
2. Authors may enter into separate, additional contractual arrangements for the non-exclusive distribution of the series' published version of the work (e.g., posting it to an institutional repository or publishing it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their websites) prior to and during the submission process, as this can lead to productive exchanges, as well as earlier and greater citation of the published work (see the open access policy for details).