A self-driving microscope and the Atomic Forge

MATERIAL MATTERS

By Ondrej Dyck, Stephen Jesse, and Sergei V. Kalinin
Center for Nanophase Materials Sciences, Oak Ridge National Laboratory, USA

The electron microscope predates the transistor and the charge-coupled device (CCD). The opportunity to integrate these advancements into the electron microscope was seized, and the result is the modern (scanning) transmission electron microscope (STEM). Real-time analysis became possible and is now routine. Consider the effort one would have had to undertake to perform a Fourier transform in the days when images were acquired by exposing photographic plates. Indeed, it was real-time data analysis that enabled the measurement and correction of aberrations in modern instrumentation.1
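To appreciate how routine such analysis has become, here is a minimal sketch, assuming a Python environment with NumPy (not any particular microscope's software), of computing the power spectrum of an acquired frame in a few lines; the synthetic frame simply stands in for detector output.

```python
import numpy as np

def power_spectrum(image: np.ndarray) -> np.ndarray:
    """Return the centered log power spectrum of a 2D image."""
    fft = np.fft.fftshift(np.fft.fft2(image))
    return np.log1p(np.abs(fft) ** 2)

if __name__ == "__main__":
    # Stand-in for a frame streamed from the detector.
    rng = np.random.default_rng(0)
    frame = rng.normal(size=(512, 512))
    spectrum = power_spectrum(frame)
    print(spectrum.shape, spectrum.dtype)
```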

What are the next opportunities on the horizon for the STEM? With the impressive progress made in the fields of deep learning, computer vision, and automation, we posit that the next revolution in microscopy will stem from the integration of these tool sets: a self-driving microscope. Such a machine will “understand” what it is looking at and automatically document features of interest. The microscopist will have high-level tools to tell the microscope to “look for distortions at that interface” or “obtain a tomographic reconstruction of this structure.” The microscope will know what various features look like by referencing databases, or it can be shown examples on the fly.

Consider the trainable Weka segmentation plugin for ImageJ.2,3 This plugin allows the user to highlight regions of an image to inform the computer which features belong in which categories. As examples are added, the computer becomes increasingly accurate at classifying the rest of the image automatically.

Advances in deep learning to interpret atomically resolved images are already beginning. For example, Ziatdinov et al. applied deep convolutional neural networks to automate the detection of molecular orientation, defect identification, and classification.4,5

[Figure: Using the Atomic Forge, Oak Ridge National Laboratory (ORNL) researchers brought two, three, and four silicon atoms together to build clusters (green) and make them rotate within a layer of graphene (blue). Photo credit: ORNL.]
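To make the example-driven workflow concrete, here is a minimal sketch, assuming a Python environment with NumPy, SciPy, and scikit-learn, of the idea behind trainable segmentation: a handful of user-labeled pixels train a classifier that then labels the rest of the image. The features, labels, and parameters are illustrative assumptions, not the Weka plugin's actual interface or the networks used in the cited studies.

```python
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def pixel_features(image: np.ndarray) -> np.ndarray:
    """Stack simple per-pixel features: intensity, Gaussian blurs, gradient magnitudes."""
    feats = [image]
    for sigma in (1, 2, 4):
        smoothed = ndimage.gaussian_filter(image, sigma)
        feats.append(smoothed)
        gy, gx = np.gradient(smoothed)
        feats.append(np.hypot(gx, gy))
    return np.stack(feats, axis=-1)  # shape (H, W, n_features)

def train_and_segment(image: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """labels: 0 = unlabeled, 1..K = user-drawn class annotations."""
    feats = pixel_features(image)
    mask = labels > 0
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(feats[mask], labels[mask])
    return clf.predict(feats.reshape(-1, feats.shape[-1])).reshape(image.shape)

if __name__ == "__main__":
    # Synthetic image with a brighter "feature" region and sparse user labels.
    rng = np.random.default_rng(0)
    image = rng.normal(size=(128, 128))
    image[32:96, 32:96] += 2.0
    labels = np.zeros_like(image, dtype=int)
    labels[40:56, 40:56] = 1   # a few pixels marked as "feature"
    labels[0:16, 0:16] = 2     # a few pixels marked as "background"
    segmentation = train_and_segment(image, labels)
    print(np.unique(segmentation, return_counts=True))
```

As more annotated pixels are added, the classifier's predictions over the unlabeled regions improve, which is the same feedback loop the plugin exposes interactively.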

Maksov et al. illustrated the untangling of lattice dynamics involving the interaction of thousands of defects using deep learning and unsupervised unmixing strategies,6 and Vasudevan et al. showed strategies for automated classification of Bravais lattice symmetries.7 There is no fundamental impediment to implementing tools such as these in the microscope to tell the computer what to pay attention to and where to gather data. As these fields progress and as the techniques are refined and proven, integration into the microscope itself becomes increasingly attractive and powerful.

A parallel development is the recognition that the STEM can be used to tailor materials at the atomic level, termed the Atomic Forge.8 The electron beam can alter materials, but historically, steps have been taken t