Deep neural networks for aberration compensation in digital holographic imaging of the retina

Abstract

In computational imaging by digital holography, the lateral resolution of retinal images is limited to about 20 microns by the aberrations of the eye. To overcome this limitation, the aberrations have to be canceled. Digital aberration compensation can be performed by post-processing of full-field digital holograms. Aberration compensation was demonstrated from wavefront measurements obtained by reconstruction of digital holograms in subapertures, and by measurement of a guide-star hologram. Yet, these wavefront measurement methods have limited accuracy in practice. For holographic tomography of the human retina, image reconstruction was demonstrated by iterative digital aberration compensation, through minimization of the local entropy of speckle-averaged tomographic volumes. However, image-based aberration compensation is time-consuming, which prevents real-time image rendering. We are investigating a new digital aberration compensation scheme based on a deep neural network to circumvent the limitations of these aberration correction methods. To train the network, 28,000 anonymized eye fundus images from patients of the Quinze-Vingts hospital in Paris were collected, and synthetic interferograms were generated digitally by simulating the propagation of these fundus images, recorded with standard cameras. With a U-Net architecture, we demonstrate defocus correction of these complex-valued synthetic interferograms. Other aberration orders will be corrected with the same method, to improve the lateral resolution to the diffraction limit in digital holographic imaging of the retina.
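
To make the data-simulation step concrete, the following is a minimal sketch, not the authors' actual pipeline, of how a defocus aberration can be imposed on a fundus image by applying a Zernike defocus phase (Z_2^0) in the Fourier (pupil) plane of the field. The use of NumPy, the grid size, the random placeholder image, and the defocus coefficient are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): simulate a defocus-aberrated
# complex field from a fundus image by multiplying its angular spectrum
# by a Zernike Z_2^0 defocus phase. No hard circular pupil aperture is
# applied, for simplicity.
import numpy as np

def apply_defocus(field, defocus_coeff_rad):
    """Apply a Zernike defocus phase (Z_2^0, coefficient in radians)
    to the angular spectrum of a square complex field."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n)                        # normalized spatial frequencies
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    rho2 = (FX**2 + FY**2) / fx.max()**2          # squared radius, ~1 at the pupil edge
    zernike_defocus = np.sqrt(3.0) * (2.0 * rho2 - 1.0)   # Z_2^0 polynomial
    pupil_phase = np.exp(1j * defocus_coeff_rad * zernike_defocus)
    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * pupil_phase)

# Usage: treat a (placeholder) fundus image as the object field and defocus it.
rng = np.random.default_rng(0)
fundus = rng.random((256, 256))                   # stand-in for a real fundus image
aberrated = apply_defocus(fundus.astype(np.complex128), defocus_coeff_rad=8.0)
```

Pairs of such aberrated fields and their aberration-free counterparts could then serve as input/target examples for supervised training.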
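The correction network itself can be sketched as follows, assuming PyTorch and a two-channel encoding of the complex field (real and imaginary parts stacked along the channel axis). The layer widths, depth, and class name SmallUNet are illustrative assumptions and do not reproduce the architecture used in the study.

```python
# Minimal U-Net sketch (illustrative, not the reported architecture):
# maps a defocus-aberrated complex field, encoded as 2 real channels,
# to a corrected field with the same encoding.
import torch
import torch.nn as nn

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
    )

class SmallUNet(nn.Module):
    def __init__(self, ch=2):
        super().__init__()
        self.enc1 = conv_block(ch, 32)
        self.enc2 = conv_block(32, 64)
        self.bottleneck = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.out = nn.Conv2d(32, ch, 1)           # back to real + imaginary channels

    def forward(self, x):
        e1 = self.enc1(x)                          # encoder with skip connections
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.out(d1)

# Usage: complex field -> 2-channel tensor -> network -> complex field.
field = torch.randn(1, 256, 256, dtype=torch.complex64)    # placeholder input
x = torch.stack([field.real, field.imag], dim=1)            # shape (N, 2, H, W)
corrected = SmallUNet()(x)
corrected_field = torch.complex(corrected[:, 0], corrected[:, 1])
```

The two-channel real/imaginary encoding is one common way to feed complex-valued holographic data to a real-valued convolutional network; other encodings (amplitude/phase, or fully complex-valued layers) are equally possible.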