New AI Model Boosts Woody Breast Detection in Chickens

University of Arkansas System Division of Agriculture

FAYETTEVILLE, Ark. — It's called "woody breast," and for consumers it can mean a chewier chicken sandwich. For the industry, it can mean up to $200 million in annual yield loss.

Work done by the Arkansas Agricultural Experiment Station is making woody breast easier to detect in chicken meat, with detection accuracy of up to 95 percent.

The development could help improve quality assurance and customer confidence in one of the state's most economically important agricultural products. What allows researchers to see inside the meat is a combination of a hyperspectral camera, which examines the meat through various energy wavelengths, and machine learning to interpret what the camera sees.

"We've been able to improve accuracy of detection of woody breast by utilizing machine learning to analyze complex data from images with a hyperspectral camera," said Dongyi Wang, an assistant professor in the biological and agricultural engineering department for the experiment station, the research arm of the University of Arkansas System Division of Agriculture.

"The next step will be trying to integrate the system online and make this beneficial for stakeholders," Wang said, noting this specific application of image analysis had not been done before.

Loss in premium meat

"Woody breast" meat is harder and chewier than normal chicken breast, but it is still safe to eat, according to Casey Owens, professor of poultry processing and products for the experiment station and a co-author of the study. When detected by processors, either by humans or by computer-assisted imaging technology, she said the meat is diverted from whole-breast packaging for further processing into products including chicken nuggets and patties.

The loss in premium as a whole-muscle product accounts for yield loss as high as $200 million in Arkansas and over $1 billion in direct and indirect costs annually across the United States poultry industry, Owens added. Up to 20 percent of chicken breast meat can have the defect, which is more common in larger birds of 8 to 9 pounds versus 6- to 7-pound birds.

Hyperspectral imaging

Hyperspectral imaging is a rapid, non-invasive way to capture detailed data about objects and their composition. This data can be used to classify food products according to food quality, consumer preferences and other product requirements.

But hyperspectral images come with tons of data. That's where machine learning comes in.

Chaitanya Pallerla, a food science graduate student who has been working on the project for the past two years with Wang as his adviser, said the new machine learning model is called NAS-WD. When correlated with known data about the "woodiness" of chicken breasts, the model allows for deeper and wider analysis of hyperspectral images to identify the defect.

"In hyperspectral imaging, there are common machine learning models being used, but we were able to develop a new model that could be well-suited for correlating more than two variables," Pallerla said. "We kind of took two different models, made a few changes, and put them together to detect patterns better and correlate the hyperspectral data with hardness of the chicken meat."
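The idea of combining two models can be sketched as a wide-and-deep network: a linear "wide" branch operating directly on the spectrum alongside a nonlinear "deep" branch, with their features merged to feed both a severity classifier and a hardness regressor. The layer sizes, band count, and branch design below are illustrative assumptions for a minimal NumPy sketch, not the published NAS-WD architecture.

```python
# Minimal wide-and-deep sketch for hyperspectral data (illustrative only;
# band count, layer sizes, and weights are assumptions, not NAS-WD).
import numpy as np

rng = np.random.default_rng(0)

N_BANDS = 224      # hypothetical number of spectral bands per sample
N_CLASSES = 3      # woody breast severity levels, as in the study

def relu(x):
    return np.maximum(0.0, x)

# Wide branch: one linear map straight from the raw spectrum.
W_wide = rng.normal(0, 0.05, (N_BANDS, 16))

# Deep branch: a small two-layer MLP for nonlinear spectral patterns.
W1 = rng.normal(0, 0.05, (N_BANDS, 64))
W2 = rng.normal(0, 0.05, (64, 16))

# Shared heads: severity classification and hardness regression.
W_cls = rng.normal(0, 0.05, (32, N_CLASSES))
W_reg = rng.normal(0, 0.05, (32, 1))

def forward(spectra):
    """spectra: (batch, N_BANDS) -> (class probabilities, hardness score)."""
    wide = spectra @ W_wide                  # linear features
    deep = relu(relu(spectra @ W1) @ W2)     # nonlinear features
    joint = np.concatenate([wide, deep], axis=1)
    logits = joint @ W_cls
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)  # softmax over severity levels
    hardness = joint @ W_reg                 # shear-force-style hardness value
    return probs, hardness

batch = rng.random((4, N_BANDS))             # four synthetic spectra
probs, hardness = forward(batch)
print(probs.shape, hardness.shape)           # (4, 3) (4, 1)
```

In a trained system, both heads would be fit jointly so the shared features correlate the spectral data with both the defect class and the measured hardness, which is the multi-variable correlation Pallerla describes.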

The results of their research were published in the journal Artificial Intelligence in Agriculture under the title "Neural network architecture search enabled wide-deep learning (NAS-WD) for spatially heterogenous property awared chicken woody breast classification and hardness regression."

The results showed that NAS-WD can classify three woody breast defect levels with an overall accuracy of 95 percent, outperforming traditional models such as the Support Vector Machine and the Multilayer Perceptron, which offered 80 percent and about 73 percent accuracy, respectively.

Wang said the study offers an example of how to use new algorithms to mine data and dig into key information. The form of hyperspectral imaging used in the research is called "push broom," which takes an image of several objects once every 40 seconds, compared to a more common industry method of a "snapshot," which takes an image of individual objects as fast as every 30 milliseconds. The "snapshots" have a lower resolution than the "push broom" method, but software upgrades may one day provide higher resolution for "snapshot" images, Pallerla said.

Wang said his team is working on deploying this technology in a real-time system.

The study was supported in part by the Agriculture and Food Research Initiative, project award nos. 2023-70442-39232 and 2024-67022-42882, from the U.S. Department of Agriculture's National Institute of Food and Agriculture.

To learn more about Division of Agriculture research, visit the Arkansas Agricultural Experiment Station website.
