Deep Learning Loss Function and Data Optimization for A.I. Detection of Thin Film Grain Boundaries
We systematically generated training datasets for grain-boundary prediction by cropping smaller patches from existing micrographs, then evaluated the effect of this patch-based data on model performance. The resulting dataset yielded lower training loss and faster convergence. However, the model’s predictions remained noisy within the grain interiors rather than being concentrated along the boundaries.
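As a rough illustration of the patch-based dataset generation described above, the sketch below crops fixed-size patches from a micrograph and its aligned boundary mask on a regular grid. The patch size, stride, and file names (`micrograph.png`, `boundary_mask.png`) are illustrative assumptions, not details of our actual pipeline.

```python
# Minimal sketch: crop aligned image/mask patches for grain-boundary training data.
# Patch size, stride, and file names are illustrative assumptions.
import numpy as np
from PIL import Image


def extract_patches(image: np.ndarray, mask: np.ndarray,
                    patch_size: int = 128, stride: int = 64):
    """Slide a window over the micrograph and its boundary mask,
    returning aligned (image_patch, mask_patch) training pairs."""
    pairs = []
    h, w = image.shape[:2]
    for y in range(0, h - patch_size + 1, stride):
        for x in range(0, w - patch_size + 1, stride):
            img_patch = image[y:y + patch_size, x:x + patch_size]
            mask_patch = mask[y:y + patch_size, x:x + patch_size]
            pairs.append((img_patch, mask_patch))
    return pairs


if __name__ == "__main__":
    # Load a grayscale micrograph and its binary grain-boundary mask.
    micrograph = np.asarray(Image.open("micrograph.png").convert("L"))
    boundary_mask = np.asarray(Image.open("boundary_mask.png").convert("L"))
    patches = extract_patches(micrograph, boundary_mask)
    print(f"Extracted {len(patches)} patch pairs")
```

Overlapping crops (stride smaller than the patch size) multiply the number of training examples from a limited set of micrographs, which is consistent with the lower training loss and faster convergence noted above.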
