AI Hair Analysis Method Promises Better Health Research

WSU

PULLMAN, Wash. - A new application that uses artificial intelligence may revolutionize the way scientists study hair and could lead to the development of health diagnostics based solely on hair.

The AI model speeds up and streamlines the hair quantification process, allowing a microscope to scan slides and collect images of hundreds of hairs at a time. In a matter of seconds, it can capture an abundance of high-resolution data, which is then processed by a deep learning algorithm that records the color, shape, width and length of each individual hair. Researchers tested it on mouse fur, but it could be applied to hair from any species, including humans.
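The release does not describe the pipeline's code, but the per-hair measurement step might look something like the following sketch. It uses classical OpenCV segmentation as a stand-in for the team's deep learning model; the function name, area threshold and measurement choices are illustrative assumptions, not the study's actual implementation.

```python
# A minimal sketch of per-hair feature extraction, using classical OpenCV
# segmentation in place of the paper's deep learning model (the actual model
# and its API are not described in the release).
import cv2
import numpy as np

def measure_hairs(image_path: str) -> list[dict]:
    """Segment individual hair fibers and report color, width and length."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Hairs are darker than the slide background; invert after Otsu thresholding.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    fibers = []
    for c in contours:
        if cv2.contourArea(c) < 50:  # skip dust and debris (assumed cutoff)
            continue
        # The long side of the minimum-area bounding box approximates length,
        # the short side width (adequate for fairly straight fibers).
        (_, _), (w, h), _ = cv2.minAreaRect(c)
        length, width = max(w, h), min(w, h)
        mask = np.zeros(gray.shape, dtype=np.uint8)
        cv2.drawContours(mask, [c], -1, 255, thickness=cv2.FILLED)
        b, g, r, _ = cv2.mean(img, mask=mask)  # mean color over this fiber only
        fibers.append({"length_px": length, "width_px": width,
                       "mean_color_bgr": (b, g, r)})
    return fibers
```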

Research behind the application, conducted and developed by scientists at Washington State University's College of Veterinary Medicine, was published in the Journal of Investigative Dermatology.

"In many ways an individual's hair is somewhat a reflection of health, and if you start separating them out with tweezers, which a lot of hair scientists do, you can make some really interesting discoveries, but you're doing this manually, right underneath the microscope," Ryan Driskell associate professor and principal investigator of the research said. "So, the idea was what happens if you can make a computer program do that for you?"

The concept for the application was dreamt up by Jasson Makkar, a molecular biosciences graduate student at WSU who was tasked with the monotonous job of manually separating thousands of hairs for various research projects focused on hair and skin in Driskell's lab.

To bring that idea to life, Makkar trained an AI computer vision model to identify hair using WSU's high-performance computing cluster, Kamiak. With the help of the Aperio GT450 microscope at the Washington Animal Disease Diagnostic Laboratory, high-resolution imaging of the hair fibers was automated.

The application has many potential uses, including in forensics and the hair-product industry, but allowing scientists to assess the health of a person or animal through their hair is perhaps the most significant, Makkar said.

He said that by establishing longitudinal data points for what healthy hair looks like in each species, a scale could be created for human doctors and veterinarians to grade overall health based on hair. Different conditions, such as hormonal imbalances or nutritional deficiencies, alter hair growth in ways that can be detected and potentially used for diagnosis.
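No scoring method is given in the release, but one hedged sketch of such a scale would compare a subject's measurements against a species-specific reference distribution built from healthy individuals. The feature choice (hair width), thresholds and grade labels below are illustrative assumptions, not the study's method.

```python
# A hedged sketch of a hair-based health grade: compare one subject's hair
# widths to a healthy reference cohort via mean absolute z-score.
import numpy as np

def health_grade(subject_widths: np.ndarray,
                 reference_widths: np.ndarray) -> str:
    """Grade a subject's hair-width distribution against healthy norms."""
    mu, sigma = reference_widths.mean(), reference_widths.std()
    # How far, on average, the subject deviates from the healthy distribution.
    z = np.abs((subject_widths - mu) / sigma).mean()
    if z < 1.0:
        return "within normal range"
    elif z < 2.0:
        return "mildly atypical; follow up"
    return "markedly atypical; possible hormonal or nutritional cause"
```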

The new technology could not only identify the species a hair is derived from but also shed light on age, health, and ethnicity in humans, which could aid criminal investigations.

"There's this methodology in law enforcement agencies that utilizes hair fiber classification as a forensic tool in criminal investigations," Driskell said. "This methodology has been somewhat controversial because much of this work was performed by forensic technicians visually identifying hair types found at a crime scene and then cross-referencing them against a limited database of hair types across all mammals."

Driskell added that the technology allows scientists not only to perform highly accurate cross-referencing of hair fibers in an unbiased manner but also to generate a large enough database to accurately quantify hair types from different individuals and possibly different anatomical positions.

Using these same tools, Makkar said, the application can also assess the effects of various hair products on hair.

"Take a swatch of hair, apply the cosmetic that you're testing to it and then look at it with our deep hair phenomics tool and see how it changes," Makkar said.

The data generated in this study is available through an interactive webtool at skinregeneration.org.
