Scanning Transmission Electron Microscopy (STEM) is a well-established method for probing the physical properties of complex nanostructures. However, one major drawback is that acquiring very high-resolution images induces effects such as radiolysis and knock-on damage [1]. Lowering the electron beam dose has been shown to reduce sample damage; however, it also weakens the signal-to-noise ratio (SNR), degrading the final quality of the image [2].
Convolutional Neural Networks (CNNs), a class of feed-forward deep learning models, are a powerful tool for image-signal problems such as denoising [3]. Recently, CNN-based processing of aberration-corrected STEM images has shown promising results at resolutions below 0.1 nm. Specifically, improved SNR without high-dose electron beams has been achieved both with CNN models pretrained on large microscopy datasets and with context-driven models that learn from the given image [4]. These methods demonstrate that CNNs can reduce sample damage by improving the quality of low-dose STEM images below 0.1 nm.
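To make the denoising approach concrete, the sketch below shows a minimal residual CNN denoiser in the style of DnCNN, written in PyTorch. This is an illustrative toy model under assumed settings (layer count, feature width, synthetic Gaussian noise standing in for low-dose shot noise), not the specific architecture used in the cited works.

```python
import torch
import torch.nn as nn

class DenoiseCNN(nn.Module):
    """Minimal residual denoiser: the network predicts the noise
    component, which is subtracted from the noisy input."""
    def __init__(self, channels=1, features=32, depth=4):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU()]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1), nn.ReLU()]
        layers.append(nn.Conv2d(features, channels, 3, padding=1))
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        # Residual learning: output = input - predicted noise
        return x - self.body(x)

model = DenoiseCNN()
clean = torch.rand(8, 1, 64, 64)                 # stand-in for high-dose images
noisy = clean + 0.1 * torch.randn_like(clean)    # synthetic low-dose noise
loss = nn.functional.mse_loss(model(noisy), clean)
loss.backward()
```

In practice the training pairs would be simulated or experimentally acquired low-/high-dose STEM image pairs rather than random tensors.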
Inspired by the successful use of deep convolutional networks for noise reduction outside of STEM [5], we propose hybrid deep CNNs that combine partially pretrained models, trained on both synthetic and real high-dose data, to improve the SNR of low-dose data. We aim to create a robust network that is portable to tasks beyond denoising by using the high-dose data to retrain the network under a variety of conditions [6] (such as hysteresis, defocus, and image blur).
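The partial-pretraining idea can be sketched as a standard transfer-learning step: freeze the early feature-extraction layers of a pretrained network and fine-tune only the later layers on high-dose data. The backbone below is a hypothetical stand-in (in practice it would be loaded from a checkpoint), and the layer split, learning rate, and data are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical pretrained backbone (stand-in for a model pretrained on
# large microscopy datasets; normally loaded from a saved checkpoint).
pretrained = nn.Sequential(
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 1, 3, padding=1),
)

# Freeze the early feature-extraction layers...
for p in pretrained[:2].parameters():
    p.requires_grad = False

# ...and fine-tune only the remaining layers on high-dose data.
optimizer = torch.optim.Adam(
    (p for p in pretrained.parameters() if p.requires_grad), lr=1e-4)

high_dose = torch.rand(4, 1, 64, 64)                      # assumed targets
low_dose = high_dose + 0.2 * torch.randn_like(high_dose)  # assumed inputs
loss = nn.functional.mse_loss(pretrained(low_dose), high_dose)
loss.backward()
optimizer.step()
```

Retraining only the unfrozen layers is what would let the same backbone be repurposed for related restoration tasks (e.g. defocus or blur correction) given suitable high-dose training pairs.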