AI technique generates clear images of thick biological samples without additional hardware
Concept and simulations illustrating deep learning-based aberration compensation. Credit: Nature Communications (2025). DOI: 10.1038/s41467-024-55267-x

Depth degradation is a problem biologists know all too well: The deeper you look into a sample, the fuzzier the image becomes. A worm embryo or a piece of tissue may only be tens of microns thick, but the bending of light causes microscopy images to lose their sharpness as the instruments peer beyond the top layer.

To deal with this problem, microscopists add technology to existing microscopes to cancel out these distortions. But this technique, called adaptive optics, requires time, money, and expertise, making it available to relatively few biology labs.

Now, researchers at HHMI's Janelia Research Campus and collaborators have developed a way to make a similar correction, but without using adaptive optics, adding additional hardware, or taking more images. A team from the Shroff Lab has developed a new AI method that produces sharp microscopy images throughout a thick biological sample.

The paper is published in the journal Nature Communications.

To create the new technique, the team first figured out a way to model how the image was degraded as the microscope imaged deeper into a uniform sample. They then applied this model to clear, undegraded images from the shallow side of the same sample, distorting them the way the deeper images were distorted. Finally, they trained a neural network to reverse the distortion, resulting in a clear image throughout the entire depth of the sample.
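The core idea is self-supervised training data: synthetically degrade the sharp, shallow slices with a depth-dependent model, then learn to invert that degradation. The paper's actual degradation model and network are more sophisticated; purely as an illustration of the pair-generation step (not the authors' code), here is a minimal NumPy sketch in which the aberration is approximated by a Gaussian blur whose width grows with imaging depth:

```python
import numpy as np

def gaussian_kernel(sigma):
    """1-D Gaussian kernel, normalized to sum to 1."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def blur2d(img, sigma):
    """Separable Gaussian blur of a 2-D image ('same'-size output)."""
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def make_training_pairs(volume, max_sigma=2.0):
    """Degrade each z-slice with a blur that grows with depth.

    volume: (nz, ny, nx) stack of sharp slices.
    Returns (degraded, clean) arrays suitable for supervised training
    of a restoration network.
    """
    nz = volume.shape[0]
    degraded = np.empty_like(volume, dtype=float)
    for z in range(nz):
        # Deeper slices get stronger blur (hypothetical linear ramp).
        sigma = 1e-3 + max_sigma * z / max(nz - 1, 1)
        degraded[z] = blur2d(volume[z].astype(float), sigma)
    return degraded, volume
```

A network trained on such (degraded, clean) pairs can then be applied to real deep slices, where no ground-truth sharp image exists.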

Researchers have developed a new AI method that produces sharp microscopy images throughout thick biological samples. This video shows how the method, DeAbe, restores highly dynamic time-lapse images of live roundworms expressing a GCaMP6 marker targeted to neurons, acquired with instant structured illumination microscopy. The top video shows the raw images; the bottom video shows the restoration after DeAbe, which allowed the researchers to resolve structural details in the nerve ring that were obscured in the raw data. Credit: Guo et al.

Not only does the method produce better-looking images, but it also enabled the team to count cells in worm embryos more accurately, trace vessels and tracts in whole mouse embryos, and examine mitochondria in pieces of mouse liver and heart.

The new deep learning-based method requires no equipment beyond a standard microscope, a computer with a graphics card, and a short tutorial on how to run the code, making it far more accessible than traditional adaptive optics techniques.

The Shroff Lab is already using the new technique to image worm embryos, and the team plans to further develop the model to make it less dependent on the structure of the sample so the new method can be applied to less uniform samples.

More information: Min Guo et al, Deep learning-based aberration compensation improves contrast and resolution in fluorescence microscopy, Nature Communications (2025). DOI: 10.1038/s41467-024-55267-x

Journal information: Nature Communications

Citation: AI technique generates clear images of thick biological samples without additional hardware (2025, January 7) retrieved 8 January 2025 from https://phys.org/news/2025-01-ai-technique-generates-images-thick.html
