
Research on ink color matching method based on CNN-Transformer model
1 Wenzhou University
* Author to whom correspondence should be addressed.
Abstract
This paper proposes a CNN-Transformer hybrid model for ink formulation prediction, named CTNet. The model leverages Convolutional Neural Networks (CNN) to extract local features from the spectral reflectance of sample surfaces and incorporates the self-attention mechanism of the Transformer to achieve efficient mapping between color and formulation. In addition, Bayesian optimization is introduced for hyperparameter tuning, further enhancing model performance. Experimental results demonstrate that CTNet outperforms CNN, RNN, LSTM, and the standard Transformer model in terms of Mean Absolute Error (MAE), achieving higher prediction accuracy. This provides an effective solution for high-precision and automated ink color matching, showing promising potential for industrial applications.
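To make the abstract's architecture concrete, the following is a minimal, hypothetical sketch of what a CNN-Transformer hybrid of this kind could look like: a 1D convolutional front-end extracts local features along the wavelength axis of a reflectance curve, a Transformer encoder applies self-attention over the resulting feature sequence, and a linear head regresses one concentration per ink. All layer sizes, layer counts, and names (`CTNetSketch`, `n_wavelengths=31`, `n_inks=8`, `d_model=64`) are illustrative assumptions, not the authors' actual CTNet configuration.

```python
import torch
import torch.nn as nn

class CTNetSketch(nn.Module):
    """Hypothetical CNN-Transformer hybrid: reflectance -> ink formulation."""
    def __init__(self, n_wavelengths=31, n_inks=8, d_model=64):
        super().__init__()
        # CNN front-end: local feature extraction along the spectral axis
        self.conv = nn.Sequential(
            nn.Conv1d(1, d_model, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Transformer encoder: self-attention over the feature sequence
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Regression head: one predicted concentration per ink
        self.head = nn.Linear(d_model, n_inks)

    def forward(self, x):                    # x: (batch, n_wavelengths)
        h = self.conv(x.unsqueeze(1))        # (batch, d_model, n_wavelengths)
        h = self.encoder(h.transpose(1, 2))  # (batch, n_wavelengths, d_model)
        return self.head(h.mean(dim=1))      # (batch, n_inks)

model = CTNetSketch()
reflectance = torch.rand(4, 31)  # 4 samples, 31 spectral bands
pred = model(reflectance)        # shape: (4, 8)
```

The pooling over the wavelength dimension before the head is one design choice among several; a learned `[CLS]`-style token would be an equally plausible alternative.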
Keywords
ink color matching, CNN, Transformer, Bayesian optimization
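For reference, the Mean Absolute Error by which the abstract compares models is simply the average of the absolute differences between predicted and true values; over formulation vectors it can be sketched as:

```python
def mae(predicted, actual):
    """Mean Absolute Error between two equal-length sequences of values."""
    assert len(predicted) == len(actual)
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)

# Illustrative values only: predicted vs. true ink ratios for one sample
print(round(mae([0.5, 0.3, 0.2], [0.4, 0.35, 0.25]), 4))  # → 0.0667
```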
Cite this article
Lei, Y. (2025). Research on ink color matching method based on CNN-Transformer model. Advances in Engineering Innovation, 16(4), 106-111.
Data availability
The datasets used and/or analyzed during the current study are available from the corresponding author upon reasonable request.
Disclaimer/Publisher's Note
The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
About volume
Journal: Advances in Engineering Innovation
© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication, with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (see Open access policy for details).