Volume 82

Published in September 2024

Volume title: Proceedings of the 2nd International Conference on Machine Learning and Automation

Conference website: https://www.confmla.org/
ISBN: 978-1-83558-565-8 (Print) / 978-1-83558-566-5 (Online)
Conference date: 21 November 2024
Editor: Mustafa ISTANBULLU
Research Article
Published on 26 July 2024 DOI: 10.54254/2755-2721/82/2024GLG0054
Advanced AI and ML techniques in cybersecurity: Supervised and unsupervised learning, reinforcement learning, and neural networks in threat detection and response
Xianghui Meng

This paper explores the pivotal role of AI and ML in transforming cybersecurity. Highlighting the integration of sophisticated techniques such as deep learning for intrusion detection and reinforcement learning for adaptive threat modeling, it emphasizes the shift towards AI-driven cybersecurity solutions. The study analyzes the impact of supervised and unsupervised learning on threat detection accuracy and the capability of neural networks to identify threats in real time. It shows how these methodologies strengthen digital defenses against complex cyber threats, covering both the theoretical underpinnings and the practical applications of AI and ML in cybersecurity. The paper also discusses open challenges and future directions, contributing insights into the evolving landscape of cybersecurity technologies and setting the stage for understanding the unique contributions and potential of AI and ML in strengthening cybersecurity measures.
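
To make the contrast the abstract draws more concrete, the sketch below illustrates the two detection paradigms it names: a supervised classifier trained on labelled traffic and an unsupervised anomaly detector that never sees labels. This is not code from the paper; the synthetic flow features, the model choices (scikit-learn's RandomForestClassifier and IsolationForest), and all numbers are assumptions made purely for illustration.

```python
# Illustrative sketch (not from the paper): supervised vs. unsupervised threat
# detection on synthetic network-flow features. Data and thresholds are invented.
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic "flow records": [bytes_sent, packets, duration, distinct_ports]
normal = rng.normal(loc=[5e4, 40, 2.0, 3], scale=[1e4, 10, 0.5, 1], size=(2000, 4))
attack = rng.normal(loc=[5e5, 900, 0.2, 60], scale=[1e5, 200, 0.1, 10], size=(100, 4))
X = np.vstack([normal, attack])
y = np.array([0] * len(normal) + [1] * len(attack))  # 1 = malicious

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

# Supervised detection: learn the attack signature from labelled flows.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print("supervised accuracy:", clf.score(X_te, y_te))

# Unsupervised detection: flag statistical outliers without labels (anomaly-based IDS).
iso = IsolationForest(contamination=0.05, random_state=0)
iso.fit(X_tr)                          # labels are never used here
anomalous = iso.predict(X_te) == -1    # -1 marks predicted anomalies
print("unsupervised detection rate on attacks:", anomalous[y_te == 1].mean())
```

In practice the unsupervised detector is typically valued for surfacing previously unseen attack patterns that a supervised model, limited to its training labels, would miss.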

Meng, X. (2024). Advanced AI and ML techniques in cybersecurity: Supervised and unsupervised learning, reinforcement learning, and neural networks in threat detection and response. Applied and Computational Engineering, 82, 1-5.
Research Article
Published on 10 September 2024 DOI: 10.54254/2755-2721/82/2024GLG0055
Efficient resource allocation in cloud computing environments using AI-driven predictive analytics
Haotian Zheng, Kangming Xu, Mingxuan Zhang, Hao Tan, Hanzhe Li

This paper proposes an innovative AI-driven approach for efficient resource allocation in cloud computing environments using predictive analytics. The study addresses the critical challenge of optimizing resource utilization while maintaining high quality of service in dynamic cloud infrastructures. A hybrid predictive model combining XGBoost and LSTM networks is developed to forecast workload patterns across various time horizons. The model leverages historical data from a large-scale cloud environment encompassing 1000 servers and over 52 million data points. A dynamic resource scaling algorithm is introduced, which integrates the predictive model's outputs with real-time system state information to make proactive allocation decisions. The proposed framework incorporates advanced techniques such as workload consolidation, resource oversubscription, and elastic resource pools to maximize utilization efficiency. Experimental results demonstrate significant improvements in key performance indicators: resource utilization rises from 65% to 83%, SLA violation rates fall from 2.5% to 0.8%, and energy efficiency improves, with PUE dropping from 1.4 to 1.18. Comparative analysis shows that the proposed model outperforms existing methods in both prediction accuracy and resource allocation efficiency. The study contributes to the field by presenting a comprehensive, AI-driven solution that addresses the complexities of modern cloud environments and paves the way for more intelligent and autonomous cloud resource management systems.
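
As one possible reading of the hybrid XGBoost + LSTM forecaster and the scaling rule built on top of it, the sketch below blends a gradient-boosted regressor with a small PyTorch LSTM on a synthetic utilization trace and maps the blended forecast to a toy scaling decision. Everything here is an assumption made for illustration: the look-back window, the 50/50 blend, the thresholds, and the synthetic data stand in for details the abstract does not give.

```python
# Illustrative sketch (not the authors' code): a hybrid workload forecaster that
# blends XGBoost with an LSTM, then maps the forecast to a proactive scaling choice.
import numpy as np
import torch
import torch.nn as nn
from xgboost import XGBRegressor

WINDOW = 24  # hypothetical look-back window (hours)

def make_windows(series, window=WINDOW):
    """Turn a 1-D utilization series into (lag-window, next-step) training pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X.astype(np.float32), y.astype(np.float32)

class LSTMForecaster(nn.Module):
    """Single-layer LSTM that predicts the next utilization value."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):               # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)

# Synthetic daily-periodic CPU-utilization trace standing in for the cluster data.
rng = np.random.default_rng(0)
t = np.arange(2000)
series = 0.6 + 0.2 * np.sin(2 * np.pi * t / 24) + 0.05 * rng.standard_normal(len(t))
X, y = make_windows(series)

# Tree component: XGBoost regression on raw lag features.
xgb = XGBRegressor(n_estimators=200, max_depth=4, learning_rate=0.1)
xgb.fit(X, y)

# Sequence component: LSTM trained on the same windows.
model = LSTMForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
xb = torch.from_numpy(X).unsqueeze(-1)   # (N, window, 1)
yb = torch.from_numpy(y)
for _ in range(50):                       # short full-batch loop, for illustration only
    opt.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()
    opt.step()

# Hybrid forecast: simple average of the two predictors (one possible blend).
last = X[-1:]
pred = 0.5 * xgb.predict(last)[0] + 0.5 * model(torch.from_numpy(last).unsqueeze(-1)).item()

# Toy proactive scaling rule on top of the forecast (thresholds are invented).
if pred > 0.80:
    print(f"forecast {pred:.2f}: scale out before the peak arrives")
elif pred < 0.40:
    print(f"forecast {pred:.2f}: consolidate workloads / release capacity")
else:
    print(f"forecast {pred:.2f}: keep current allocation")
```

A real deployment would, as the abstract notes, also fold real-time system state (current load, SLA headroom) into the decision before committing to a scaling action.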

Zheng, H.; Xu, K.; Zhang, M.; Tan, H.; Li, H. (2024). Efficient resource allocation in cloud computing environments using AI-driven predictive analytics. Applied and Computational Engineering, 82, 6-12.