Axel Munk, Georg August University Göttingen and Max Planck Institute for Biophysical Chemistry
Conventional light microscopes have been used for centuries to study small length scales, down to about 250 nanometers. At this resolution level, images are blurred and noisy, and the data can often be well approximated by a Gaussian or Poisson model. Such models have been the focus of a multitude of statistical deconvolution techniques over the past decades.
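To make the Poisson imaging model concrete, here is a minimal one-dimensional sketch: point sources are blurred by a point spread function (PSF) and observed as Poisson photon counts, then recovered by Richardson-Lucy iterations, the classical EM algorithm for Poisson deconvolution. All names, parameter values, and the Gaussian PSF shape are illustrative assumptions, not the specific methods discussed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: true intensity f with three point sources,
# blurred by a Gaussian PSF, observed with Poisson photon noise.
n = 200
x = np.arange(n)
f_true = np.zeros(n)
f_true[[60, 90, 140]] = [400.0, 250.0, 300.0]  # three point sources

sigma = 5.0  # PSF width (assumed value for illustration)
psf = np.exp(-0.5 * ((x - n // 2) / sigma) ** 2)
psf /= psf.sum()

def blur(f):
    # Circular convolution with the (centered) PSF via FFT.
    return np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(np.roll(psf, -n // 2))))

# Observed data: Poisson counts around the blurred signal plus a
# small constant background.
y = rng.poisson(blur(f_true) + 0.1).astype(float)

# Richardson-Lucy iterations (multiplicative EM updates for the
# Poisson likelihood); the clip guards against FFT round-off.
f_hat = np.full(n, y.mean())
for _ in range(200):
    ratio = y / np.maximum(blur(f_hat), 1e-12)
    f_hat = np.maximum(f_hat * blur(ratio), 0.0)

print(float(np.abs(blur(f_hat) - y).mean()))  # residual fit to the data
```

The multiplicative update keeps the estimate nonnegative, which is one reason Richardson-Lucy is a natural baseline for photon-count data.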
However, such conventional light microscopes have an intrinsic physical limit of resolution, which was broken recently with the advent of modern superresolution fluorescence nanoscopy techniques (nanoscopy), recognized with the 2014 Nobel Prize in Chemistry. Nowadays, nanoscopy is an indispensable tool in medical and biological research for studying the structure, communication, and dynamics of living cells. Current experimental advances push to the physical limits of nanoscopic imaging, where the quantum nature of photons becomes predominant. Consequently, nanoscopy is inherently random, and we argue that this challenges established methods of data analysis, such as deconvolution for conventional light microscopy. In this talk we discuss several examples from nanoscopy where new statistical methodology is required, including multiscale testing and statistical image registration. Finally, we address the task of quantifying the number of molecules at a given spot. We argue that it becomes necessary to model the temporal transitions of quantum states of fluorescent molecules in order to achieve the highest resolution and statistical precision possible. This leads to novel issues in the analysis of hidden Markov models.
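The hidden Markov viewpoint can be illustrated with a generic two-state blinking model: a fluorophore switches between a dark and a bright state according to a Markov chain, and each time bin yields a Poisson photon count whose rate depends on the hidden state. The sketch below simulates such a chain and evaluates its likelihood with the standard scaled forward algorithm. All parameter values are assumptions for illustration, not the model developed in the talk.

```python
import numpy as np
from math import lgamma

# Illustrative two-state HMM for fluorophore blinking:
# hidden state 0 = dark, 1 = bright; emissions are Poisson counts.
A = np.array([[0.95, 0.05],   # dark -> {dark, bright}
              [0.10, 0.90]])  # bright -> {dark, bright}
pi = np.array([0.5, 0.5])     # initial state distribution
rates = np.array([0.5, 8.0])  # Poisson emission rates (dark, bright)

def pois_logpmf(k, lam):
    # log of the Poisson pmf: k*log(lam) - lam - log(k!)
    return k * np.log(lam) - lam - lgamma(k + 1)

def log_likelihood(counts):
    """Scaled forward algorithm for the Poisson-emission HMM."""
    alpha = pi * np.exp([pois_logpmf(counts[0], r) for r in rates])
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for k in counts[1:]:
        alpha = (alpha @ A) * np.exp([pois_logpmf(k, r) for r in rates])
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

# Simulate a blinking trace from the model above.
rng = np.random.default_rng(1)
T = 500
states = np.zeros(T, dtype=int)
states[0] = rng.choice(2, p=pi)
for t in range(1, T):
    states[t] = rng.choice(2, p=A[states[t - 1]])
counts = rng.poisson(rates[states])

print(log_likelihood(counts))
```

Because the photon counts are strongly bimodal, the HMM fits such a trace far better than an i.i.d. Poisson model with a single rate; inference on the transition probabilities is what carries the information about molecular counts and state dynamics.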