Virtual microstructure design for steels using generative adversarial networks

By: Jin‐Woong Lee, Nam Hoon Goo, Woon Bae Park, Myungho Pyo, Kee‐Sun Sohn

Format: Article
Published: Wiley, 2021-01-01

Description

Abstract The prediction of macro-scale materials properties from microstructures, and vice versa, should be a key part of modeling quantitative microstructure-physical property relationships (PMPRs). It would be helpful if the microstructural input and output were in the form of visual images rather than parameterized descriptors. However, a typical supervised learning technique alone is insufficient to build a model with real-image output. A generative adversarial network (GAN) is required to treat visual images as output for a promising PMPR model. Recently developed deep-learning-based GAN techniques such as the deep convolutional GAN (DCGAN), the cycle-consistent GAN (CycleGAN), and conditional-GAN-based image-to-image translation (Pix2Pix) could be of great help via the creation of realistic microstructures. In this regard, we generated virtual micrographs for various types of steels using a DCGAN, a CycleGAN, and Pix2Pix, and confirmed that the generated micrographs are qualitatively indistinguishable from the ground truth.
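For context, all three GAN variants named in the abstract build on the same adversarial training principle: a generator G maps noise (or a source image) to a synthetic micrograph while a discriminator D learns to distinguish generated images from real ones. The standard minimax objective (from the original GAN formulation, not specific to this article) can be written as:

```latex
\min_G \max_D \; V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}\!\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z(z)}\!\left[\log\!\left(1 - D(G(z))\right)\right]
```

Here $p_{\text{data}}$ is the distribution of real micrographs and $p_z$ is the noise prior; DCGAN instantiates $G$ and $D$ as convolutional networks, while CycleGAN and Pix2Pix replace the noise input $z$ with a source-domain image and add cycle-consistency or paired reconstruction losses, respectively.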