From Photonics to AI: A Holistic Framework for Next-Generation 4D Fluorescence Microscopy
- Carmona, Javier
- Advisor(s): Arisaka, Katsushi
Abstract
Recording the neural activity of biological organisms is paramount to understanding how they, and by extension we, process the world. Fluorescence microscopy has served as the standard tool for recording this neural activity because it can capture large populations of neurons simultaneously. Recent efforts in fluorescence microscopy have concentrated on imaging large-scale volumes; however, most of these efforts have been limited by spatiotemporal and bandwidth constraints.
I present a novel system called Transverse-Sheet Illumination Microscopy (TranSIM), which captures axially separated planes onto multiple two-dimensional sCMOS sensors at near diffraction-limited resolution of 1.0 μm, 1.4 μm, and 4.3 μm in x, y, and z, respectively. The parallel use of sensors reduces the bandwidth bottlenecks typically found in other systems. TranSIM captures data over large volumetric fields of view of up to 748 × 278 × 100 μm³ at 100 Hz. Moreover, I was able to capture smaller fields of view of 374 × 278 × 100 μm³ at a faster volumetric rate of 200 Hz. Additionally, I found that the system's versatile design allowed the vertical magnification to be changed programmatically rather than requiring a change of objectives. With this baseline system, I was able to record intricate neuronal communication in both larval and adult-stage fruit flies. Moreover, I was able to reconstruct the complex physiological deformation of larval-stage zebrafish hearts.
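As a rough illustration of why parallel sensors relieve the readout bottleneck, the sketch below estimates the sustained data rate for a full volume and shows how distributing the axially separated planes across several cameras divides the per-sensor load. The sensor format, bit depth, and plane count are assumed typical sCMOS values chosen for illustration only; they are not TranSIM's actual specifications.

```python
# Back-of-the-envelope bandwidth estimate. The sensor format, bit depth,
# and number of planes are ASSUMED typical values, not TranSIM's specs.
PIXELS_X, PIXELS_Y = 2048, 2048   # assumed sCMOS sensor format
BIT_DEPTH_BITS = 16               # assumed bits per pixel
PLANES_PER_VOLUME = 9             # assumed number of axially separated planes
VOLUME_RATE_HZ = 100              # volumetric rate reported for the large FOV

bytes_per_plane = PIXELS_X * PIXELS_Y * BIT_DEPTH_BITS / 8
total_rate_bytes = bytes_per_plane * PLANES_PER_VOLUME * VOLUME_RATE_HZ

# A single camera would have to sustain the full rate; distributing the
# planes over N sensors divides the per-sensor (and per-link) bandwidth.
for n_sensors in (1, 3, 9):
    per_sensor_gb = total_rate_bytes / n_sensors / 1e9
    print(f"{n_sensors} sensor(s): {per_sensor_gb:.2f} GB/s each")
```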
Despite its advantages, TranSIM acquires sparsely sampled volumetric data, necessitating computational reconstruction techniques to infer the missing planes. To address this, I leveraged deep learning-based volumetric reconstruction methods to enhance data continuity. I first explored three-dimensional convolutional neural networks (3D-CNNs) with self-attention mechanisms, which effectively capture spatial dependencies and refine structural detail across planes. These time-independent networks performed well on static volumes but are potentially limited when capturing temporally evolving neural dynamics. To overcome this limitation, I further investigated four-dimensional recurrent neural networks (4D-RNNs), which integrate temporal dependencies alongside spatial information. By incorporating recurrent components in the form of long short-term memory (LSTM) units, these networks improved the temporal coherence of the reconstructions, particularly in dynamic imaging experiments.
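To make the time-independent reconstruction approach concrete, below is a minimal PyTorch-style sketch of a 3D-CNN with a single self-attention block that upsamples a sparsely sampled stack along the axial dimension. The layer sizes, plane counts, and module names are illustrative assumptions, not the architecture actually used in this work.

```python
# Minimal sketch only: a 3D convolutional encoder-decoder with one
# self-attention block that infers missing axial planes. All sizes and
# names here are illustrative assumptions, not the dissertation's model.
import torch
import torch.nn as nn

class SelfAttention3D(nn.Module):
    """Non-local self-attention over flattened voxel positions."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv3d(channels, channels // 2, kernel_size=1)
        self.key   = nn.Conv3d(channels, channels // 2, kernel_size=1)
        self.value = nn.Conv3d(channels, channels, kernel_size=1)
        self.scale = (channels // 2) ** -0.5

    def forward(self, x):
        b, c, d, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (B, DHW, C/2)
        k = self.key(x).flatten(2)                     # (B, C/2, DHW)
        v = self.value(x).flatten(2).transpose(1, 2)   # (B, DHW, C)
        attn = torch.softmax(q @ k * self.scale, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, c, d, h, w)
        return x + out                                 # residual connection

class SparsePlaneReconstructor(nn.Module):
    """Maps a sparsely sampled stack (few z-planes) to a dense volume."""
    def __init__(self, in_planes=8, out_planes=32, feat=32):
        super().__init__()
        # Encode with a lateral stride of 2 so the attention map stays small.
        self.encoder = nn.Sequential(
            nn.Conv3d(1, feat, kernel_size=3, stride=(1, 2, 2), padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(feat, feat, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.attn = SelfAttention3D(feat)
        # Upsample along z to fill in missing planes and restore lateral size.
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=(out_planes / in_planes, 2.0, 2.0),
                        mode="trilinear", align_corners=False),
            nn.Conv3d(feat, feat, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv3d(feat, 1, kernel_size=3, padding=1),
        )

    def forward(self, sparse_volume):      # (B, 1, in_planes, H, W)
        return self.decoder(self.attn(self.encoder(sparse_volume)))

# Example: 8 measured planes upsampled to a 32-plane dense volume.
model = SparsePlaneReconstructor()
dense = model(torch.randn(1, 1, 8, 32, 32))
print(dense.shape)                         # torch.Size([1, 1, 32, 32, 32])
```

A recurrent variant along the lines described above would wrap a similar spatial backbone in a convolutional LSTM so that each reconstructed volume is conditioned on the preceding time points.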
These results highlight the potential of artificial neural networks to significantly enhance TranSIM's imaging capabilities, enabling accurate volumetric reconstructions from sparse data while preserving both spatial and temporal fidelity. This advancement paves the way for more efficient high-speed volumetric fluorescence microscopy, facilitating the study of large-scale neural networks in living organisms in unprecedented detail. Altogether, I have demonstrated how TranSIM coupled with neural networks can serve as the framework for next-generation 4D microscopy.