Research

Our research covers optics, signal processing, algorithms, and system design. Optical waves can be treated in a signal-processing framework, where optical beams are represented as complex fields decomposed into spatial frequencies and temporal frequencies (color). Coherence is the second-order correlation function of a statistical optical beam. We aim to measure and control all of these parameters to build better imaging systems.

Phase imaging

Light is a wave, having both an amplitude and a phase. Our eyes and cameras, however, detect only intensity, so they cannot measure phase directly. Phase is especially important in biological imaging, where cells are typically transparent (i.e. nearly invisible) yet impose phase delays. 3D phase imaging also reveals volumetric refractive index, which is helpful for studying thick transparent samples such as embryos. When we can acquire quantitative phase information, we recover useful shape and density maps. We develop phase imaging methods that combine simple experimental setups with efficient algorithms, and that can be applied in optical, X-ray, and neutron imaging.
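As a rough illustration of one of the through-focus approaches referenced below, here is a minimal sketch of transport-of-intensity (TIE) phase retrieval from two defocused images. It assumes uniform in-focus intensity and paraxial propagation; the function and parameter names are illustrative stand-ins, not the lab's code.

```python
import numpy as np

def tie_phase(I_minus, I_plus, I0, dz, wavelength, pixel_size, eps=1e-3):
    """Phase from two defocused images via a Fourier-space solve of the TIE:
    dI/dz = -(1/k) div(I grad(phi)), assuming uniform in-focus intensity I0."""
    k = 2 * np.pi / wavelength
    dIdz = (I_plus - I_minus) / (2 * dz)                  # finite-difference axial derivative
    ny, nx = dIdz.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    lap_symbol = -(2 * np.pi) ** 2 * (FX ** 2 + FY ** 2)  # Fourier symbol of the Laplacian
    rhs = -(k / I0) * dIdz                                # Poisson equation: lap(phi) = rhs
    phi_hat = np.fft.fft2(rhs) / (lap_symbol - eps)       # Tikhonov-regularized inverse Laplacian
    phi = np.real(np.fft.ifft2(phi_hat))
    return phi - phi.mean()                               # phase is defined up to a constant
```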

Phase from chromatic aberrations: L. Waller, S. Kou, C. Sheppard, G. Barbastathis, Optics Express 18(22), 22817-22825 (2010).
Phase from through-focus: L. Waller, M. Tsang, S. Ponda, G. Barbastathis, Optics Express 19 (2011).
Z. Jingshan, J Dauwels, M. A. Vasquez, L. Waller, Optics Express 21(15), 18125-18137 (2013).
Z. Jingshan, L. Tian, J. Dauwels, L. Waller, Biomedical Optics Express 6(1), 257-265 (2014).
Differential phase contrast: Z.F. Phillips, M. Chen, L. Waller, PLOS ONE 12(2), e0171228 (2017).
M. Kellman, M. Chen, Z.F. Phillips, M. Lustig, L. Waller, Biomed. Optics Express 9(11), 5456-5466 (2018).
M. Chen, Z.F. Phillips, L. Waller, Optics Express 26(25), 32888-32899 (2018).
3D phase imaging: L. Tian and L. Waller, Optica 2, 104-111 (2015).
M. Chen, L. Tian, and L. Waller, Biomedical Optics Express 7(10), 3940-3950 (2016).
Applications in lithography: A. Shanker, M. Sczyrba, B. Connolly, F. Kalk, A. Neureuther, L. Waller, in SPIE Photomask Technology (2013).
A. Shanker, M. Sczyrba, B. Connolly, F. Kalk, A. Neureuther, L. Waller, in SPIE Advanced Lithography paper 9052-49, February 2014, San Jose, CA.
R. Claus, A. Neureuther, P. Naulleau, L. Waller, Optics Express 23(20), 26672-26682 (2015).

LED array microscope

We work on a simple microscope hack: the lamp of a regular microscope is replaced with an LED array, enabling many new capabilities. By computational illumination, we achieve brightfield, darkfield, phase contrast, super-resolution, and 3D phase imaging.

L. Tian, X. Li, K. Ramchandran, L. Waller, Biomedical Optics Express 5(7), 2376-2389 (2014).
L. Tian, J. Wang, L. Waller, Optics Letters, 39(5), 1326-1329 (2014).
L. Tian, Z. Liu, L. Yeh, M. Chen, Z. Jingshan, L. Waller, Optica 2(10), 904-911 (2015).
Z.F. Phillips, R. Eckert, L. Waller, OSA Imaging and Applied Optics; paper IW4E.5 (2017).
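As a rough illustration of computational illumination with the LED array, here is a hedged sketch of how LED patterns might be assigned for brightfield, darkfield, and DPC imaging based on each LED's illumination angle relative to the objective NA. The array geometry numbers below are hypothetical placeholders, not our hardware's values.

```python
import numpy as np

def led_patterns(n_leds=15, led_pitch=4.0, array_height=60.0, na_obj=0.25):
    """Boolean LED masks for brightfield, darkfield, and left/right DPC half-circles."""
    idx = np.arange(n_leds) - n_leds // 2
    X, Y = np.meshgrid(idx * led_pitch, idx * led_pitch)   # LED positions (mm)
    r = np.hypot(X, Y)
    na_illum = np.sin(np.arctan2(r, array_height))         # illumination NA of each LED
    brightfield = na_illum <= na_obj                        # inside the objective NA
    darkfield = na_illum > na_obj                           # outside the objective NA
    dpc_left = brightfield & (X < 0)
    dpc_right = brightfield & (X >= 0)
    return brightfield, darkfield, dpc_left, dpc_right
```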

We used this system to create the Berkeley Single Cell Computational Microscopy (BSCCM) dataset, which contains over 12,000,000 images of 400,000 individual white blood cells under different LED array illumination patterns, paired with a variety of fluorescence measurements.

H. Pinkard, C. Liu, F. Nyatigo, D. Fletcher, L. Waller (2024).

DiffuserCam: Lensless imaging

DiffuserCam is a lensless camera made of a piece of bumpy plastic, called a diffuser, placed in front of a bare image sensor. Since there are no other elements in the system, the camera is cheap, compact, and easy to build. We’ve demonstrated that this simple system can be used for both 2D photography and 3D image reconstruction from a single acquisition.
Check out our project page for more information.
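As a rough sketch (not the actual DiffuserCam pipeline), the 2D image formation can be approximated as convolution of the scene with the diffuser’s caustic point spread function, and reconstruction as a regularized deconvolution. The PSF and solver below are illustrative stand-ins for the calibrated PSF and the solvers described in our papers.

```python
import numpy as np

def forward(scene, psf):
    """Shift-invariant forward model: measurement = scene circularly convolved with the PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(psf)))

def reconstruct(measurement, psf, n_iter=200):
    """Projected gradient descent on ||A x - b||^2 with a nonnegativity constraint."""
    H = np.fft.fft2(psf)
    x = np.zeros_like(measurement, dtype=float)
    step = 1.0 / (np.max(np.abs(H)) ** 2 + 1e-12)          # crude Lipschitz-based step size
    for _ in range(n_iter):
        resid = np.real(np.fft.ifft2(H * np.fft.fft2(x))) - measurement
        grad = np.real(np.fft.ifft2(np.conj(H) * np.fft.fft2(resid)))   # A^T (A x - b)
        x = np.clip(x - step * grad, 0.0, None)             # project onto x >= 0
    return x
```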

Lensless single-shot video and hyperspectral imaging

Building on our DiffuserCam work, we’ve extended our lensless camera to two new dimensions: wavelength and time.

Single-shot Hyperspectral: Spectral DiffuserCam combines a spectral filter array and a diffuser to capture single-shot hyperspectral volumes with 64 spectral bands in a single image, with higher resolution than would be possible with a lens-based system.

Check out the Spectral DiffuserCam project page for more information.
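A hedged sketch of the Spectral DiffuserCam forward model under simplifying assumptions: each wavelength channel is blurred by the diffuser PSF and then sampled by its spectral filter mask on the sensor. Shapes and names are illustrative only.

```python
import numpy as np

def spectral_measurement(cube, psf, filter_masks):
    """cube: (L, H, W) hyperspectral scene; filter_masks: (L, H, W) per-band
    on-sensor filter responses; returns one (H, W) encoded measurement."""
    Hf = np.fft.fft2(psf)
    meas = np.zeros(cube.shape[1:])
    for band in range(cube.shape[0]):
        blurred = np.real(np.fft.ifft2(Hf * np.fft.fft2(cube[band])))
        meas += filter_masks[band] * blurred               # diffuser blur, then spectral filter sampling
    return meas
```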

Single-shot Video: In the time dimension, the diffuser encodes different time points by leveraging the camera’s rolling shutter, enabling recovery of 40 video frames at over 4,500 frames per second from a single image. Both systems are cheap, compact, and easy to build.
N. Antipa, P. Oare, E. Bostan, R. Ng, L. Waller, Video from stills: Lensless imaging with rolling shutter, IEEE ICCP (2019).
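A hedged sketch of the rolling-shutter forward model under similar simplifying assumptions: each sensor row integrates light over a different time window, so one raw image encodes a sum of row-masked, PSF-blurred video frames. Frame shapes and timing below are illustrative, not the calibrated system values.

```python
import numpy as np

def rolling_shutter_measurement(frames, psf, rows_per_frame):
    """frames: (T, H, W) video; returns one (H, W) rolling-shutter encoded image."""
    T, H, W = frames.shape
    Hf = np.fft.fft2(psf)
    meas = np.zeros((H, W))
    for t in range(T):
        blurred = np.real(np.fft.ifft2(Hf * np.fft.fft2(frames[t])))
        row_mask = np.zeros((H, 1))
        row_mask[t * rows_per_frame:(t + 1) * rows_per_frame] = 1.0   # rows exposed at time t
        meas += row_mask * blurred
    return meas
```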

Data-driven adaptive microscopy

Using a combination of physics-based modeling and machine learning, microscopes can adaptively image samples based on feedback from the sample itself.

Single shot autofocus

By adding just one or a few LEDs as an illumination source, a physics-based neural network can focus a microscope from a single image of the sample. [Tutorial]

H. Pinkard, Z. Phillips, A. Babakhani, D. Fletcher, L. Waller, “Deep learning for single-shot autofocus microscopy,” Optica 6, 794-797 (2019).
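As an illustrative stand-in for the physics-based network (not the published method), the sketch below extracts simple Fourier-domain features from a single image and fits a regularized linear regressor mapping features to defocus distance. All function names and parameters are hypothetical.

```python
import numpy as np

def defocus_features(img, n_bins=32):
    """Radially binned log-magnitude spectrum of a single image (illustrative features)."""
    F = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    ny, nx = img.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(y - ny / 2, x - nx / 2)
    bins = np.linspace(0, r.max(), n_bins + 1)
    feats = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (r >= lo) & (r < hi)
        feats.append(np.log1p(F[mask].mean()) if mask.any() else 0.0)
    return np.array(feats)

def fit_focus_model(images, defocus_um, ridge=1e-3):
    """Ridge-regression stand-in for the trained network: features -> defocus distance.
    Predict later with: np.append(defocus_features(img), 1.0) @ w"""
    A = np.stack([defocus_features(im) for im in images])
    A = np.hstack([A, np.ones((A.shape[0], 1))])           # bias column
    y = np.asarray(defocus_um, dtype=float)
    w = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ y)
    return w
```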

Learned adaptive illumination multiphoton microscopy

Using a physics-based neural network, the shape of a sample can be used to predict the laser power needed to image a highly scattering sample with multiphoton microscopy. This enables physiologically accurate imaging of developing immune responses at an organ-wide scale with cellular resolution. [Tutorial]

H. Pinkard, H. Baghdassarian, A. Mujal, et al., Learned adaptive multiphoton illumination microscopy for large-scale immune response imaging, Nature Communications 12, 1916 (2021).

Structured illumination & imaging with scattering

We adapt the existing framework for structured illumination (SI) super-resolution microscopy to SI imaging with unknown patterns. This allows super-resolution fluorescence reconstruction of biological samples illuminated with unknown and uncalibrated patterns, and has applications when imaging through aberrations or turbid media. It also enables high-throughput fluorescence imaging at high resolution across a large field of view. We further develop new scattering models and apply them to reconstruct increasingly complex samples.
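For intuition, here is a minimal sketch of the structured-illumination forward model under idealized assumptions: multiplying the fluorescent sample by an illumination pattern aliases high spatial frequencies into the detection passband. With unknown patterns, both the sample and the patterns must be estimated jointly (not shown here); the pattern and OTF are illustrative placeholders.

```python
import numpy as np

def sim_measurement(sample, pattern, otf):
    """One SI measurement: (fluorescent sample x illumination pattern), low-pass
    filtered by the detection OTF (supplied here in the Fourier domain)."""
    return np.real(np.fft.ifft2(np.fft.fft2(sample * pattern) * otf))
```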

Machine learning for computational imaging

Computational imaging lives at the intersection of hardware and software. Machine learning gives us the opportunity to jointly optimize both. We investigate machine learning techniques for efficiently designing computational imaging systems that maximize overall performance.
[Figures: machine-learned LED illumination patterns, reconstruction results, and training loss]

Fourier ptychography algorithms

The algorithms behind achieving large field-of-view and high resolution are rooted in phase retrieval. We use large-scale nonlinear, non-convex optimization, much like training neural networks in machine learning, but with new challenges specific to imaging applications.

L. Yeh, J. Dong, Z. Jingshan, L. Tian, M. Chen, G. Tang, M. Soltanolkotabi, L. Waller, Optics Express 23(26), 33212-33238 (2015).

R. Eckert, Z.F. Phillips, L. Waller, Applied Optics 57(19), 5434-5442 (2018).
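As a hedged, simplified illustration of the underlying phase retrieval, here is one ePIE-style amplitude-replacement pass over the LED measurements. The papers above use more robust nonlinear solvers and also estimate the pupil; the names and indexing conventions here are illustrative only.

```python
import numpy as np

def fp_update(obj_spectrum, pupil, shifts, measurements, step=1.0):
    """One sequential pass over the LED measurements, updating the object spectrum.
    shifts: (row, col) corners of each pupil-sized window in the high-res spectrum;
    measurements: corresponding low-resolution intensity images."""
    m = pupil.shape[0]
    for (cy, cx), meas in zip(shifts, measurements):
        patch = obj_spectrum[cy:cy + m, cx:cx + m]
        low_res_field = np.fft.ifft2(patch * pupil)                       # simulated low-res field
        corrected = np.sqrt(meas) * np.exp(1j * np.angle(low_res_field))  # replace amplitude, keep phase
        new_patch = np.fft.fft2(corrected)
        # ePIE-style update within the pupil support
        obj_spectrum[cy:cy + m, cx:cx + m] += (
            step * np.conj(pupil) * (new_patch - pupil * patch) / ((np.abs(pupil) ** 2).max() + 1e-12)
        )
    return obj_spectrum
```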

Computational CellScope

In collaboration with the CellScope group at Berkeley (Fletcher Lab), we work on computational illumination and algorithms for cell-phone based microscopes – in particular, the CellScope.

Z. Phillips, M. D’Ambrosio, L. Tian, J. Rulison, H. Patel, N. Sadras, A. Gande, N. Switz, D. Fletcher, L. Waller, PLOS ONE (2015).

[Video: 10x, NA ~0.25, 10 Hz, C. elegans (Dillin lab)]

Real-time multi-mode microscope system

We show here a single-camera imaging system that can simultaneously acquire brightfield, darkfield, and differential phase contrast (DPC) images in real time. Our method uses a programmable LED array as the illumination source, which provides flexible patterning of illumination angles. We achieve a frame rate of 50 Hz at 2560 x 2160 pixels and 100 Hz at 1920 x 1080 pixels, with speed limited only by the camera.
Z. Liu, L. Tian, S. Liu, and L. Waller, “Real-time brightfield, darkfield, and phase contrast imaging in a light-emitting diode array microscope,” Journal of Biomedical Optics 19(10), 106002 (2014).
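For intuition, a minimal sketch of how the four half-circle LED images might be combined into brightfield and DPC images; the small constant is only there to avoid division by zero, and the names are illustrative.

```python
import numpy as np

def dpc_composites(i_top, i_bottom, i_left, i_right, eps=1e-6):
    """Brightfield and two DPC images from four half-circle illumination images."""
    brightfield = i_top + i_bottom                        # equivalent to full-circle illumination
    dpc_tb = (i_top - i_bottom) / (i_top + i_bottom + eps)
    dpc_lr = (i_left - i_right) / (i_left + i_right + eps)
    return brightfield, dpc_tb, dpc_lr
```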

Coherence engineering – phase space/statistical optics – light fields

Partially coherent beams do not have a well-defined phase. Light at any point in space doesn’t travel in a single direction (with a single spatial frequency), but rather has a statistical distribution of directions (spatial frequencies). This means that a spatially partially coherent 2D beam can only be fully described by a 4D coherence function, akin to a covariance matrix. We aim to measure these 4D correlations with high resolution in all four dimensions, which creates engineering challenges for data management and compression. Basically, it’s big data for imaging.

L. Waller, G. Situ, J. W. Fleischer, Nature Photonics 6, 474-479 (2012).
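For intuition about the dimensionality, here is a hedged sketch that computes a discrete Wigner distribution of a 1D field: even a 1D field requires a 2D phase-space function, so a 2D beam requires a 4D one. Normalization and sampling conventions are glossed over, and the function name is illustrative.

```python
import numpy as np

def wigner_1d(field):
    """Discrete Wigner distribution W(x, u) of a 1D complex field (N x N output)."""
    n = field.size
    shifts = np.arange(n) - n // 2
    W = np.zeros((n, n))
    for i in range(n):
        # symmetric correlation E(x + s) E*(x - s), circular indexing for simplicity
        corr = field[(i + shifts) % n] * np.conj(field[(i - shifts) % n])
        W[i] = np.real(np.fft.fftshift(np.fft.fft(np.fft.ifftshift(corr))))
    return W
```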
