Plasma etching is a critical unit operation in semiconductor fabrication, where real-time control of etch rate is essential for maintaining uniformity, dimensional accuracy, and process stability. This talk presents a machine learning (ML) framework that enables real-time optimization of etch rates using a neural network surrogate model trained on physics-informed synthetic data. The model predicts etch rate from seven key process parameters and achieves high predictive accuracy, with R² values above 0.98.
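As a minimal illustration of this kind of surrogate (a sketch, not the authors' implementation), the snippet below trains a small neural network on synthetic data with seven inputs. The parameter names, bounds, and synthetic response used here are illustrative assumptions only.

```python
# Sketch: neural-network surrogate for etch rate trained on synthetic data.
# The seven parameters, their ranges, and the response function are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 5000
# Hypothetical process parameters (e.g., RF power, pressure, gas flows, bias, temperature).
X = rng.uniform(
    low=[100, 5, 10, 5, 20, 50, 20],           # assumed lower bounds
    high=[1500, 100, 200, 100, 200, 300, 80],  # assumed upper bounds
    size=(n, 7),
)
# Placeholder "physics-informed" response; a real dataset would come from a
# calibrated plasma/etch simulator rather than this toy expression.
y = (0.05 * X[:, 0] + 2.0 * np.log1p(X[:, 2]) - 0.3 * X[:, 1]
     + 0.01 * X[:, 0] * X[:, 3] / 100 + rng.normal(0.0, 1.0, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
surrogate.fit(X_tr, y_tr)
print("R^2 on held-out data:", r2_score(y_te, surrogate.predict(X_te)))
```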
To enable dynamic process tuning, the surrogate model is paired with a differential evolution optimization algorithm that determines the input conditions required to meet a specified target etch rate. Benchmark testing shows that the optimization can be completed in under one second on standard CPU hardware, making the solution practical for integration with existing tool control systems. The framework also supports feedback from in-situ metrology to enable closed-loop operation.
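Continuing the sketch above, one plausible way to pair such a surrogate with differential evolution is to minimize the squared error between the predicted and target etch rates over bounded process settings. The target value and bounds below are assumptions.

```python
# Sketch: invert the surrogate with differential evolution to reach a target etch rate.
from scipy.optimize import differential_evolution

target_etch_rate = 120.0  # hypothetical target, in the surrogate's output units
bounds = [(100, 1500), (5, 100), (10, 200), (5, 100), (20, 200), (50, 300), (20, 80)]

def objective(x):
    # Squared error between the predicted and target etch rate for one candidate recipe.
    pred = surrogate.predict(x.reshape(1, -1))[0]
    return (pred - target_etch_rate) ** 2

result = differential_evolution(objective, bounds, maxiter=100, tol=1e-8, seed=0)
print("Recommended settings:", result.x)
print("Predicted etch rate:", surrogate.predict(result.x.reshape(1, -1))[0])
```

Differential evolution is gradient-free and respects box bounds on the process parameters, which makes it a natural choice for inverting a black-box surrogate of this kind.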
This presentation will outline the model development workflow, optimization methodology, and integration architecture. Broader use cases across other semiconductor processes will also be discussed. The results demonstrate how targeted machine learning applications can enhance process control, equipment efficiency, and yield in advanced semiconductor manufacturing.