This article was submitted to Computational Materials Science, a section of the journal Frontiers in Materials
This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Digital twins are emerging as powerful tools for supporting innovation as well as optimizing the in-service performance of a broad range of complex physical machines, devices, and components. A digital twin is generally designed to provide an accurate in-silico representation of the form (i.e., appearance) and the functional response of a specified (unique) physical twin. This paper offers a new perspective on how the emerging concept of digital twins could be applied to accelerate materials innovation efforts. Specifically, it is argued that the material itself can be considered as a highly complex multiscale physical system whose form (i.e., details of the material structure over a hierarchy of material length scales) and function (i.e., response to external stimuli, typically characterized through suitably defined material properties) can be captured suitably in a digital twin. Accordingly, the digital twin can represent the evolution of structure, process, and performance of the material over time, with regard to both process history and in-service environment. This paper establishes the foundational concepts and frameworks needed to formulate and continuously update both the form and function of the digital twin of a selected material physical twin. The form of the proposed material digital twin can be captured effectively using the broadly applicable framework of n-point spatial correlations, while its function at the different length scales can be captured using homogenization and localization process-structure-property surrogate models calibrated to collections of available experimental and physics-based simulation data.
Recent forward-looking roadmaps (
Materials, in their own right, represent highly complex multiscale and multi-physics systems. Their production and in-service responses are controlled by a wide range of phenomena occurring at length scales ranging from the atomic to the macroscale and an equally broad range of associated time scales.
A schematic depiction of the multiscale and multi-physics nature of material systems and their relationship with the component performance. A comprehensive understanding of material performance requires a complete hierarchical representation of structural/chemical features, the relationship between those features and material properties, and the mechanisms that drive their evolution either through processing or service history. All arrows represent scale bridging, i.e., upscaling via homogenization and downscaling via localization.
However, this task faces many hurdles. The most significant of these hurdles comes from the fact that the relevant data, even for a selected single material system, is necessarily generated by distributed teams of researchers with the requisite expertise. For example, on the experimental front, materials data comes from a wide range of imaging modalities (e.g., optical microscopy, scanning and transmission electron microscopy, various diffraction and spectroscopic techniques, X-ray tomography, atomic force microscopy) (
Modeling and experimental tools typically used to obtain relevant materials data at different length and time scales. Examples of modeling tools include Density Functional Theory (DFT), Molecular Dynamics (MD), Accelerated MD (AMD), Dislocation Dynamics (DD), kinetic Monte Carlo (kMC), Crystal Plasticity Finite Element Modeling (CPFEM), FEM, and extended FEM (xFEM). Examples of experimental tools include Atomic Force Microscopy (AFM), High Resolution Transmission Electron Microscopy (HRTEM),
As already noted, the perspectives presented above build on multiple national and international initiatives. Specifically, ICME (
The main components of the proposed roadmap for building digital twins for material systems.
Digital twins of macroscale engineered components and machines typically aim to represent a uniquely identified single physical twin. For example, a digital twin might target a specific turbine engine in service on an airplane. However, in building digital twins for a material system, it becomes intractable to consider each individual material sample as the physical twin. This is not only because of the large number of distinct material samples that can be produced for a nominally specified chemical composition and processing history, but also because non-destructive characterization techniques are not yet mature enough to evaluate both the three-dimensional structure of the material and its properties of interest. Furthermore, even with the use of destructive techniques for materials characterization, one can only hope to establish distributions that adequately quantify the material structure and properties in a stochastic framework (i.e., accounting for the significant uncertainty associated with these quantities for any given material sample). Given these considerations, it is readily apparent that digital twins for material systems can, at present, only be established in a stochastic framework. In other words, we propose here that digital twins of material systems should aim to produce multiple instantiations (as many as needed) sampled from the distributions of possible material structures and their associated properties (with both structure and properties defined over a hierarchy of material length scales). Therefore, in our proposed framework, we will associate the digital twins of the material system with the nominal chemical composition and processing/service history that created the material samples. In doing so, we will implicitly define the material by the controllable details (each of which is identified with aleatoric uncertainty) of the generative process used to create the material samples (i.e., instantiations of the physical twin). This, we believe, will result in a much more pragmatic approach to building digital twins for material systems, one that will have high value for the design and in-service prognosis of engineered components and devices.
The mathematical framework underpinning the digital twins for material systems should address two main needs: (i) the statistical quantification of the material structure over a hierarchy of material length scales (i.e., the form of the material), and (ii) the prediction of the material's response to imposed stimuli at those length scales (i.e., its function).
The term
A digital twin of a material system should be able to instantiate a representative volume of the material with sufficient statistical sampling of all the relevant lower length-scale structural features and their spatial arrangements. Given the roughly eight orders of magnitude in length scales (from ∼Å to ∼cm) involved, it should become clear that such instantiations cannot be deterministic or unique. Therefore, what is required here is the ability to produce multiple instantiations that reflect as accurately as possible the inherent stochasticity of the material structure for a given nominal composition and process history. Laplace conjectured that, by knowing the position and momentum of every atom, we could anticipate the behavior of the material (
A comprehensive and systematic framework available today that is capable of providing the requisite feature engineering capabilities for the material structure is the framework of n-point spatial correlations (
At its core, MKS defines and utilizes a material structure function (
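For concreteness, the most commonly used member of this family, the 2-point spatial correlations, can be written in discretized form as shown below. The notation here is illustrative: $m_s^n$ denotes the volume fraction of local state $n$ in spatial bin $s$ of the material structure function, $S$ is the total number of spatial bins, and $f_{\mathbf{r}}^{nn'}$ captures the probability of finding local states $n$ and $n'$ at the tail and head of a vector $\mathbf{r}$ placed randomly in the microstructure (periodicity is assumed here for simplicity).

$$ f_{\mathbf{r}}^{\,nn'} \;=\; \frac{1}{S} \sum_{s=1}^{S} m_{s}^{\,n}\, m_{s+\mathbf{r}}^{\,n'} $$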
The MKS framework described in
The MKS workflow for feature engineering of material structure. In this example, we start with microstructures belonging to three distinct classes (corresponding to vertical, horizontal, or random ellipses), with one example of each class shown on the left. Their corresponding 2-point features are shown in the middle and reflect a large number of statistics (including volume fractions, size and shape distributions) for each microstructure. The low-dimensional representations of the microstructure statistics are shown on the right, in the subspace of the first two PC scores. The clusters in the PC plots successfully classify the microstructures in the three classes. The intra-class variance between microstructures within each class can also be quantified from the PC representations.
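As a minimal sketch of this feature-engineering workflow, the snippet below computes periodic 2-point autocorrelations for an ensemble of two-phase microstructures using FFTs and then projects the resulting statistics into a low-dimensional PC subspace. The grid size, the random ensemble, the assumption of periodic boundaries, and the use of NumPy/scikit-learn are illustrative choices only, not the specific implementation used in the cited work.

```python
import numpy as np
from sklearn.decomposition import PCA

def two_point_autocorrelation(microstructure):
    """Periodic 2-point autocorrelation of a binary (0/1) 2-D microstructure."""
    m = microstructure.astype(float)
    ft = np.fft.fftn(m)
    # Correlation theorem: autocorrelation = IFFT(FT * conj(FT)) / number of voxels
    corr = np.fft.ifftn(ft * np.conj(ft)).real / m.size
    return np.fft.fftshift(corr)  # center the zero vector for easier inspection

# Illustrative ensemble: 30 random two-phase microstructures on a 64 x 64 grid
rng = np.random.default_rng(0)
ensemble = (rng.random((30, 64, 64)) < 0.4).astype(float)

# Stack each microstructure's 2-point statistics into one feature vector per sample
features = np.stack([two_point_autocorrelation(m).ravel() for m in ensemble])

# Low-dimensional representation (PC scores) of the microstructure statistics
pca = PCA(n_components=2)
pc_scores = pca.fit_transform(features)
print(pc_scores.shape)  # (30, 2): one point per microstructure in the PC subspace
```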
As stated earlier in
It is also noted that there are a number of other options based on neural networks that allow one to combine feature engineering and property prediction into a single-step framework. These approaches offer attractive avenues when one is interested in a limited number of properties as targets. If one insists on decoupling the form and function of the digital twins (as we have argued here), then it is imperative to pursue feature engineering of the material structure independently from establishing property predictors (discussed in the next section). In this context, it should be recognized that the autoencoder-decoder networks (
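For readers who wish to experiment with network-based feature engineering while still keeping it separate from property prediction, the sketch below shows a toy convolutional autoencoder (in PyTorch) whose latent vector plays a role analogous to the PC scores discussed above. The architecture, layer sizes, and the 64 x 64 single-channel input are arbitrary assumptions made purely for illustration.

```python
import torch
import torch.nn as nn

class MicrostructureAutoencoder(nn.Module):
    """Toy convolutional autoencoder; the latent vector serves as a set of
    low-dimensional, unsupervised structure features (analogous to PC scores)."""
    def __init__(self, latent_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, x):
        z = self.encoder(x)          # unsupervised structure features
        return self.decoder(z), z

model = MicrostructureAutoencoder()
x = torch.rand(4, 1, 64, 64)         # batch of single-channel microstructure images
reconstruction, features = model(x)
print(features.shape)                # torch.Size([4, 8])
```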
Reliable prediction of the effective properties of a given material sample is a challenging task. At a high level, the main options are to either measure experimentally the properties of interest or to leverage known physics (often delivered in physics-based simulation packages) to estimate their values. Both approaches face hurdles when one desires to produce a multiscale digital twin for materials. On the experimental front, the effort and cost involved in measuring all of the properties of interest along with the related information (e.g., anisotropy, variances) over the multiple material length scales of interest are often prohibitive. On the modeling front, there is substantial uncertainty in the model forms and/or parameter values used in the physics-based models. It is therefore clear that neither approach by itself is optimal for obtaining the requisite information. In this regard, the recent emergence and successful application of materials data analytics tools has opened up new avenues for addressing these gaps.
Recently (
A Bayesian framework has the potential to address scale-bridging with uncertain physics. The proposed Bayesian framework will be described next using the structure-property (SP) linkages as an example. However, they will be formulated such that they can also be easily applied to capturing process-structure linkages (PS). Typically, SP linkages are formulated to take structure variables as inputs and predict property values as output. The mapping implied in these linkages can be expressed as
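One plausible, purely illustrative way to write this mapping (the symbols below are ours, not necessarily the authors' notation) uses $\mathbf{m}$ for the structure variables, $\boldsymbol{\theta}$ for the uncertain material physics parameters (e.g., CRSS values), $p$ for the property of interest, and $\epsilon$ for a model/measurement error term; the Bayesian update of the physics parameters given observed data $\mathcal{D}$ then follows directly from Bayes' rule:

$$ p = \mathcal{F}(\mathbf{m}, \boldsymbol{\theta}) + \epsilon, \qquad P(\boldsymbol{\theta} \mid \mathcal{D}) \;\propto\; P(\mathcal{D} \mid \boldsymbol{\theta})\, P(\boldsymbol{\theta}). $$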
Schematic illustration of the scale-bridging between the response of an individual grain and the response of a polycrystalline aggregate. At the grain scale, the structure-property linkage is formulated to take grain orientation (structure variable) and critical resolved shear strength(s) (CRSS; physics variables) as input and predict the overall property of interest (e.g., indentation yield strength of grains of different orientations). This linkage is used with both experimental and modeling datasets to extract a posterior on the CRSS for a given material system (see
In establishing the material physics parameters, one has to exploit all of the available data, collected from disparate sources (e.g., experiments and physics-based simulations). Machine learning of
As noted above, the practical implementation of
An example application of the proposed Bayesian methodology is depicted in
An example application of the Bayesian update strategy for the fusion of experimental and simulation datasets from indentation of α-Ti grains in a polycrystalline sample (
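A minimal numerical sketch of this kind of update is given below: a grid posterior over a single CRSS-like parameter is sharpened by a few synthetic indentation "measurements" through a Gaussian likelihood built around a placeholder structure-property surrogate. The surrogate form, the noise level, and the data values are all invented for illustration and have no connection to the actual datasets referenced above.

```python
import numpy as np

# Grid over a single uncertain physics parameter (e.g., a CRSS-like value, in MPa)
theta = np.linspace(100.0, 500.0, 1001)
prior = np.exp(-0.5 * ((theta - 300.0) / 80.0) ** 2)   # broad Gaussian prior
prior /= np.trapz(prior, theta)

def surrogate_property(theta, orientation_factor):
    """Placeholder structure-property linkage: property scales linearly with theta."""
    return orientation_factor * theta

# Synthetic "measurements": (orientation factor, measured property, noise std)
observations = [(2.2, 660.0, 30.0), (2.0, 610.0, 30.0), (2.4, 700.0, 30.0)]

posterior = prior.copy()
for factor, measured, sigma in observations:
    predicted = surrogate_property(theta, factor)
    likelihood = np.exp(-0.5 * ((measured - predicted) / sigma) ** 2)
    posterior *= likelihood
    posterior /= np.trapz(posterior, theta)   # renormalize after each update

mean = np.trapz(theta * posterior, theta)
std = np.sqrt(np.trapz((theta - mean) ** 2 * posterior, theta))
print(f"posterior mean = {mean:.1f}, std = {std:.1f}")
```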
Cyberinfrastructure supports the acquisition, storage, management, and fusion of data within a collaborative, but distributed, research environment. The creation of a robust cyberinfrastructure is critical to the realization of a digital twin, as digital twins exist at the confluence of multiple disparate data streams (e.g., simulation data, experimental data, real time sensor data). These data streams present challenges in managing both the variety and volume of data ingested, as well as any associated metadata needed to ensure high utility of the data for future use. Challenges in the variety of data come from the
Material structure measurements capture the
The external stimuli (e.g., thermo-chemo-mechanical loading) driving material structure evolution need to be tracked through the use of suitable sensors. Sensors generally transduce various forms of energy (
Examples of energy forms that drive changes in material state and the transducers employed to observe the corresponding exposure history.

| Stimuli | Application examples | Sensor examples |
|---|---|---|
| Mechanical | Vibration, shock, sound/phonon, stress, strain | Strain gauges, piezoelectric, magnetostrictive, eddy current, accelerometer, capacitive |
| Electrical | Current, magnetic fields | Voltage sensors, current sensors, resistance sensors, power sensors, Hall-effect sensors, giant magnetoresistance sensors, fluxgate sensors |
| Radiant energy | Gamma, X-ray, UV, infrared, visible light, microwave, radio waves | Photoresistors (LDR), photodiodes, phototransistors, charge-coupled devices, gamma-ray detectors, microwave sensors, CMOS detectors |
| Thermal | Convective, conductive, latent | Thermocouples, RTDs, thermistors, infrared, semiconductor sensors |
| Chemical | Gases, liquids, solids, ions, isotopes, etc. | Hygrometer, gas sensor, pH sensor |
| Nuclear | Neutron, beta, alpha, proton | Gas-filled proportional detectors, ionization chambers, Geiger-Mueller tubes, scintillators, solid-state detectors |
| Gravitational | Weight | See mechanical sensors |
The high volume and high variety of materials data quickly outpace the rudimentary data organization techniques typically used by humans (project-specific folder structures, ad hoc organization, or note taking). We therefore require more sophisticated data management tools to manage the storage and organization of the materials data relevant to the digital twin. In their most basic forms, data management tools act as
In order to truly realize FAIR data principles for materials data, we need to adopt emerging software tools in ontologies and linked data.
A recently proposed materials ontology (
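As a small illustration of what linked materials data can look like in practice, the sketch below uses the rdflib package to record a sample, one of its micrographs, and a processing attribute as RDF triples. The namespace and the class/property names (e.g., MaterialSample, characterizes) are invented for this example and are not terms from the cited ontology.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

# Hypothetical namespace; real deployments would reuse terms from a published ontology
MAT = Namespace("http://example.org/materials#")

g = Graph()
g.bind("mat", MAT)

sample = URIRef("http://example.org/samples/Ti64-heat-A-001")
micrograph = URIRef("http://example.org/data/Ti64-heat-A-001/ebsd-scan-01")

g.add((sample, RDF.type, MAT.MaterialSample))
g.add((sample, MAT.nominalComposition, Literal("Ti-6Al-4V")))
g.add((sample, MAT.annealingTemperature, Literal(950.0, datatype=XSD.double)))
g.add((micrograph, RDF.type, MAT.Micrograph))
g.add((micrograph, MAT.characterizes, sample))
g.add((micrograph, MAT.modality, Literal("EBSD")))

print(g.serialize(format="turtle"))  # human-readable linked-data view of the records
```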
There currently exist several software packages that can be used to support the mathematical framework proposed in
AI tools support digital twins beyond the needs of the mathematical framework alone. AI-based segmentation strategies have gained traction, and Bayesian CNNs have recently been used to characterize the segmentation uncertainty in materials images (
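One simple way to approximate the behavior of a Bayesian segmentation network is Monte Carlo dropout: dropout is kept active at inference time and the spread of repeated stochastic predictions is read as a per-pixel uncertainty estimate. The toy fully convolutional network below sketches that idea; the architecture, dropout rate, and number of forward passes are arbitrary assumptions, and this is not the specific approach of the cited work.

```python
import torch
import torch.nn as nn

class TinySegmenter(nn.Module):
    """Toy fully convolutional segmenter with dropout, for MC-dropout uncertainty."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Dropout2d(p=0.2),
            nn.Conv2d(16, n_classes, 1),
        )

    def forward(self, x):
        return self.net(x)

model = TinySegmenter()
model.train()  # keep dropout active at inference time (Monte Carlo dropout)

image = torch.rand(1, 1, 64, 64)      # one micrograph-like input
with torch.no_grad():
    probs = torch.stack(
        [torch.softmax(model(image), dim=1) for _ in range(20)]  # 20 stochastic passes
    )

mean_probs = probs.mean(dim=0)         # per-pixel class probabilities
uncertainty = probs.var(dim=0).sum(1)  # per-pixel predictive variance across passes
print(mean_probs.shape, uncertainty.shape)  # (1, 2, 64, 64) (1, 64, 64)
```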
The ability to use a digital twin to provide an accurate picture of the corresponding physical twin at any given point in time is expected to significantly improve the guidance to subject-matter experts towards rational (and optimized) material/process improvements. Additionally, predictions of component performance can drive upstream changes in design or manufacturing process. To date, the development of detection and prognosis-driven planning strategies has largely focused on tuning individual process parameters, such as temperature or material composition, despite the need for efficient strategies that select multiple interdependent variables to substantially accelerate and improve scientific discovery. Digital twins open up new opportunities to enable such strategies and accelerate autonomous experimental design and exploration. Autonomous experiments are emerging in materials research, accelerating materials design and discovery (
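A minimal sketch of the closed loop that such autonomous strategies rely on is shown below: a Gaussian-process surrogate is fit to the (process parameter, property) pairs measured so far, and an expected-improvement criterion selects the next experiment. The one-dimensional process variable, the simulated "experiment", and the scikit-learn Gaussian process are stand-ins for a real digital-twin-in-the-loop setup.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def run_experiment(x):
    """Stand-in for a real experiment: a noisy property measured at one process setting."""
    return -(x - 0.6) ** 2 + 0.05 * np.random.randn()

rng = np.random.default_rng(1)
X = rng.random((3, 1))                                    # three initial experiments
y = np.array([run_experiment(x[0]) for x in X])

candidates = np.linspace(0.0, 1.0, 201).reshape(-1, 1)    # process settings to consider
for _ in range(10):                                       # ten autonomous selection rounds
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-3)
    gp.fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    best = y.max()
    z = (mu - best) / np.maximum(sigma, 1e-9)
    ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, run_experiment(x_next[0]))

print(f"best process setting found so far: {X[np.argmax(y), 0]:.3f}")
```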
One particular application domain of interest for digital twins is material/process exploration in additive manufacturing, a field with origins in rapid prototyping. There are extensive model-based simulations of the additive manufacturing process, ranging from powder packing through the entire laser-matter interaction and solidification process, that can be taken as inputs to the Bayesian update strategy described in
Digital twins of the components in devices have enabled the in-service monitoring, prognosis, and design of complex systems. This work proposes both the conceptual framework and the cyberinfrastructure required to extend the concept of digital twins to the material level. Digital twins for materials provide a statistical in-silico representation of both material structure and performance. The proposed framework consists of a materials representation based on n-point spatial correlations and PCA, a performance prediction framework centered around a two-step Bayesian approach, and a cyberinfrastructure that leverages new material ontologies for the management of multimodal materials data. Together, these foundational elements offer new opportunities for extending current digital twins to include important details of the material over a multitude of structure length scales (from the macroscale to the atomistic).
All authors listed have made a substantial, direct, and intellectual contribution to the work and approved it for publication.
MB and SRK acknowledge support from NSF DMREF Award# 2119640. This work was performed, in part, at the Center for Integrated Nanotechnologies, an Office of Science User Facility operated for the U.S. Department of Energy. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE-NA0003525. This paper describes objective technical results and analysis. Any subjective views or opinions that might be expressed in the paper do not necessarily represent the views of the U.S. Department of Energy or the United States Government.
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors, and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
In PSP linkages, one associates a material structure with an instant in time. The structure is then assumed to be completely responsible for the properties exhibited by the sample. In any imposed process, the structure is assumed to evolve with time. As the structure evolves, its associated properties are also expected to evolve.