About this Abstract
Meeting | 2023 TMS Annual Meeting & Exhibition
Symposium | Advanced Characterization Techniques for Quantifying and Modeling Deformation
Presentation Title | Using Deep Learning to Reconstruct Grains from Simulated Far-Field Diffraction Data
Author(s) | Ashley Lenau, Yuefeng Jin, Ashley Bucsek, Stephen Niezgoda
On-Site Speaker (Planned) | Ashley Lenau
Abstract Scope | Far-Field High-Energy Diffraction Microscopy (ff-HEDM) is invaluable for quantifying the orientation and elastic strain within the bulk of a 3D polycrystalline sample. However, it has limited ability to capture the grain morphology or the orientation and strain gradients needed to model the mechanical behavior of metallic materials. In this presentation, we demonstrate a deep learning framework that reconstructs the 3D grain shape from the diffraction spots of a single grain. The network is based on Pix2Vox, which uses an encoder-decoder structure to convert multiple 2D images of an object into a 3D volumetric reconstruction. Unlike standard Pix2Vox, which uses a single encoder-decoder for all 2D images, our network uses an independent encoder for each diffraction spot. Ground-truth grain shapes are generated with DREAM.3D, and the simulated ff-HEDM data are produced by a virtual diffractometer. While the framework is still in the early stages of development, we demonstrate high-fidelity 3D grain reconstructions (a schematic sketch of the multi-encoder architecture follows this listing).
Proceedings Inclusion? | Planned:
Keywords | Machine Learning, Characterization, Modeling and Simulation
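
The abstract describes a Pix2Vox-style encoder-decoder in which each diffraction spot is processed by its own independent encoder before a shared decoder produces a 3D grain volume. The code below is a minimal, hypothetical PyTorch sketch of that idea, not the authors' implementation: the module names (SpotEncoder, GrainReconstructor), the 64x64 spot-image size, the 256-dimensional latent features, the simple concatenation fusion, and the 32^3 voxel output are all assumptions made for illustration.

```python
# Hypothetical sketch (not the authors' code): a Pix2Vox-style network with one
# independent 2D encoder per diffraction spot and a shared 3D decoder.
import torch
import torch.nn as nn


class SpotEncoder(nn.Module):
    """Encodes a single 2D diffraction-spot image into a latent feature vector."""

    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(128, latent_dim)

    def forward(self, x):  # x: (B, 1, H, W)
        return self.fc(self.conv(x).flatten(1))  # (B, latent_dim)


class GrainReconstructor(nn.Module):
    """Maps N diffraction-spot images of one grain to a voxelized grain shape."""

    def __init__(self, num_spots: int, latent_dim: int = 256):
        super().__init__()
        # One encoder per diffraction spot (unlike standard Pix2Vox, which shares one).
        self.encoders = nn.ModuleList(SpotEncoder(latent_dim) for _ in range(num_spots))
        self.fc = nn.Linear(num_spots * latent_dim, 128 * 4 * 4 * 4)
        self.decoder = nn.Sequential(  # upsample a 4^3 seed to a 32^3 voxel grid
            nn.ConvTranspose3d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose3d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose3d(32, 1, 4, stride=2, padding=1),
            nn.Sigmoid(),  # per-voxel occupancy probability
        )

    def forward(self, spots):  # spots: (B, num_spots, 1, H, W)
        feats = [enc(spots[:, i]) for i, enc in enumerate(self.encoders)]
        z = self.fc(torch.cat(feats, dim=1)).view(-1, 128, 4, 4, 4)
        return self.decoder(z).squeeze(1)  # (B, 32, 32, 32)


if __name__ == "__main__":
    model = GrainReconstructor(num_spots=10)
    spots = torch.rand(2, 10, 1, 64, 64)  # batch of simulated spot images
    voxels = model(spots)                 # predicted grain shapes
    print(voxels.shape)                   # torch.Size([2, 32, 32, 32])
```

Giving each spot its own encoder lets the network learn spot-specific feature extraction, at the cost of fixing the number of diffraction spots when the model is constructed. The plain concatenation fusion here is a stand-in; standard Pix2Vox uses a learned context-aware fusion to merge per-view features.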