AI Accelerates Nanoparticle Research

University of Konstanz

Nanoparticle researchers spend most of their time on one thing: counting and measuring nanoparticles. At each step, they have to check their results, usually by analyzing microscopic images of hundreds of nanoparticles packed tightly together. Counting and measuring them takes a long time, but this work is essential for the statistical analyses needed to plan the next, suitably optimized nanoparticle synthesis.

Alexander Wittemann is a professor of colloid chemistry at the University of Konstanz. He and his team repeat this process every day. "When I worked on my doctoral thesis, we used a large particle counting machine for these measurements. It was like a cash register, and, at the time, I was really happy when I could measure three hundred nanoparticles a day", Wittemann remembers. However, reliable statistics require thousands of measurements for each sample. Today, the increased use of computer technology means the process can move much more rapidly. At the same time, the automated methods are very prone to errors, and many measurements still need to be conducted, or at least double-checked, by the researchers themselves.

A correct count – even with complex particles

During the coronavirus pandemic, good fortune brought Wittemann into contact with his doctoral student Gabriel Monteiro, who not only has knowledge of programming and AI, but also has connections to computer scientists. Wittemann and Monteiro developed a program based on Meta's open-source AI technology "Segment Anything Model" (SAM). The program enables the AI-supported counting of nanoparticles in a microscopic image and the subsequent automatic measurement of each individual particle.
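The measurement step that follows segmentation can be illustrated independently of the model itself. Below is a minimal sketch, assuming a segmentation model such as SAM has already returned one boolean mask per detected particle; the helper `measure_particles`, the pixel scale, and the toy square mask are all invented for illustration and are not from the published code:

```python
import numpy as np

# Hypothetical helper: given one boolean mask per particle (as a
# segmentation model like SAM would produce) and the pixel size of the
# micrograph, compute per-particle statistics.
def measure_particles(masks, nm_per_px):
    results = []
    for m in masks:
        area_px = int(m.sum())                    # pixels covered by the mask
        area_nm2 = area_px * nm_per_px ** 2       # physical area in nm^2
        # Equivalent circular diameter: d = 2 * sqrt(A / pi)
        d_nm = 2.0 * np.sqrt(area_nm2 / np.pi)
        results.append({"area_nm2": area_nm2, "diameter_nm": d_nm})
    return results

# Toy example: a single 10x10 px square "particle" at 2 nm per pixel.
mask = np.zeros((32, 32), dtype=bool)
mask[5:15, 5:15] = True
stats = measure_particles([mask], nm_per_px=2.0)
```

Aggregating such per-particle records over thousands of masks is what yields the size distributions the article describes.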

"For clearly definable particles, the 'watershed method' has worked quite well so far. Our new method, however, can also automatically count particles that have a dumbbell or caterpillar shape, consisting of strings of two or three overlapping spheres", Wittemann explains. "This saves a massive amount of time", he adds. "In the time it would usually take to complete a particle synthesis and make the corresponding time-consuming measurements, we can now concentrate on particle syntheses and examining them under the microscope, while the AI system takes care of most of the rest. This last step is now possible in a fraction of the time it used to require. This means we can complete eight to ten particle analyses in the time we used to need for one."

The AI measurements are not only more efficient but also more reliable: the method recognizes individual particle fragments more accurately and measures them more precisely than other approaches, including measurements made by humans. As a result, subsequent experiments can be adapted and carried out more precisely, leading to faster success in the test series.

The research team has published the new AI routine, along with the required code and data from the study, open access on GitHub and KonData for other researchers to use and discuss.

Key facts:

  • The study: Monteiro, G. A. A., Monteiro, B. A. A., dos Santos, J. A., & Wittemann, A. (2025). Pre-trained artificial intelligence-aided analysis of nanoparticles using the segment anything model. Scientific Reports, 15(1), 2341. DOI: 10.1038/s41598-025-86327-x
  • Professor Alexander Wittemann is a professor of colloid chemistry at the University of Konstanz. His main research areas include nanoparticle fabrication, supracolloidal assembly, functional and responsive nanostructures, and morphological properties of nanoparticles.
/Public Release. This material from the originating organization/author(s) may be of a point-in-time nature and has been edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).