Research Article
Open access
Published on 8 November 2024
Peng, S. (2024). Deep learning-based real-time ray tracing technology in games. Applied and Computational Engineering, 101, 124-131.

Deep learning-based real-time ray tracing technology in games

Siqi Peng 1,*
  • 1 Dundee International Institution, Central South University, Changsha, China

* Author to whom correspondence should be addressed.

https://doi.org/10.54254/2755-2721/101/20240992

Abstract

In recent years, deep learning-based techniques have revolutionized real-time ray tracing for gaming, significantly enhancing visual fidelity and rendering performance. This paper reviews several state-of-the-art methods, including Generative Adversarial Networks (GANs) for realistic shading, neural temporal adaptive sampling, subpixel sampling reconstruction, and neural scene representation. Key findings highlight improvements in perceived realism, temporal stability, image fidelity, and computational efficiency. Techniques such as neural intersection functions and spatiotemporal reservoir resampling further optimize rendering speed and memory usage, while adaptive sampling and neural denoising with layer embeddings reduce noise and enhance image clarity. Collectively, these advancements make real-time ray tracing feasible for high-fidelity gaming applications, offering enhanced graphics without compromising performance. This analysis underscores the critical role of deep learning in overcoming traditional ray tracing challenges, paving the way for more immersive and responsive gaming experiences. These innovations also suggest a promising future for integrating advanced ray tracing techniques into a broader range of interactive media, balancing visual excellence with operational efficiency.

Keywords

Deep learning, Real-time ray tracing, Image processing, Computer graphics.



Data availability

The datasets used and/or analyzed during the current study are available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceedings of the 2nd International Conference on Machine Learning and Automation

Conference website: https://2024.confmla.org/
ISBN: 978-1-83558-691-4 (Print) / 978-1-83558-692-1 (Online)
Conference date: 12 January 2025
Editor: Mustafa ISTANBULLU
Series: Applied and Computational Engineering
Volume number: Vol. 101
ISSN: 2755-2721 (Print) / 2755-273X (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish with this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).