Volume 82
Published in September 2024
Volume title: Proceedings of the 2nd International Conference on Machine Learning and Automation
In the rapidly advancing fields of AI and ML, this paper examines their pivotal role in transforming cybersecurity. It highlights the integration of sophisticated techniques such as deep learning for intrusion detection and reinforcement learning for adaptive threat modeling, emphasizing the shift toward AI-driven cybersecurity solutions. The study analyzes the impact of supervised and unsupervised learning on threat detection accuracy, as well as the capabilities of neural networks for real-time threat identification, and shows how these methodologies strengthen digital defenses against complex cyber threats. Both the theoretical underpinnings and the practical applications of AI and ML in cybersecurity are covered, along with open challenges and future directions, contributing insights into the evolving landscape of cybersecurity technologies and setting the stage for understanding the unique contributions and potential of AI and ML in strengthening cybersecurity measures.
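The abstract does not specify the detection models used; as a minimal illustrative sketch of the unsupervised threat-detection idea it mentions, the following flags anomalous network flows with an Isolation Forest. The features (bytes sent, packet count, connection duration) and all data are invented for this example.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated "normal" traffic: three flow features clustered around
# typical values (bytes sent, packets, duration in seconds).
normal = rng.normal(loc=[500.0, 40.0, 2.0],
                    scale=[50.0, 5.0, 0.3],
                    size=(500, 3))

# Two simulated attack flows with extreme feature values.
attacks = np.array([[5000.0, 400.0, 0.05],
                    [4500.0, 350.0, 0.02]])

# Fit on unlabeled "normal" traffic only; contamination is the
# assumed fraction of outliers in the training data.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal)

# predict() returns 1 for inliers and -1 for anomalies.
print(model.predict(attacks))
```

In practice the features would come from flow logs or packet captures rather than synthetic draws, and the contamination rate would be tuned on historical incident data.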
This paper proposes an innovative AI-driven approach for efficient resource allocation in cloud computing environments using predictive analytics. The study addresses the critical challenge of optimizing resource utilization while maintaining high quality of service in dynamic cloud infrastructures. A hybrid predictive model combining XGBoost and LSTM networks is developed to forecast workload patterns across various time horizons. The model leverages historical data from a large-scale cloud environment encompassing 1000 servers and over 52 million data points. A dynamic resource scaling algorithm is introduced, which integrates the predictive model's outputs with real-time system state information to make proactive allocation decisions. The proposed framework incorporates advanced techniques such as workload consolidation, resource oversubscription, and elastic resource pools to maximize utilization efficiency. Experimental results demonstrate significant improvements in key performance indicators: resource utilization increases from 65% to 83%, the SLA violation rate falls from 2.5% to 0.8%, and energy efficiency improves, with PUE dropping from 1.4 to 1.18. Comparative analysis shows that the proposed model outperforms existing methods in both prediction accuracy and resource allocation efficiency. The study contributes to the field by presenting a comprehensive, AI-driven solution that addresses the complexities of modern cloud environments and paves the way for more intelligent and autonomous cloud resource management systems.