
If you track your calories, a new AI tool could make your life much easier. Before enjoying a meal, all you need to do is scan the food with your smartphone, and the tool will reveal how many calories it contains, along with its fat content and other nutrients.
Developed by researchers at New York University (NYU) Tandon School of Engineering, this new AI food scanner could help people with diabetes, obesity, and other health conditions manage their diet effectively.
“Traditional methods of tracking food intake rely heavily on self-reporting, which is notoriously unreliable. Our system removes human error from the equation,” said Prabodh Panindre, lead researcher and an associate research professor at NYU.
Challenges with an AI food scanner

The motivation for the AI food scanner comes from a problem that is common but has been overlooked for years. For more than a decade, the NYU team has studied the health and work-related challenges firefighters face, and some of its findings were surprising.
The studies reveal that up to 88 percent of full-time firefighters and 87 percent of volunteer firefighters are obese. This not only makes their work harder but also puts them at higher risk of developing heart disease and other health problems. The NYU team wanted a practical, easy way to help firefighters manage their diet and weight.
That is when the idea of an AI food scanner took shape. However, scanning the nutrients in a meal is not as simple as scanning a document or a barcode.
“The sheer visual diversity of food is staggering. Unlike manufactured objects with standardized appearances, the same dish can look dramatically different based on who prepared it,” Sunil Kumar, one of the study authors and a professor of Mechanical Engineering at NYU Abu Dhabi, said.
For instance, “a burger from one restaurant bears little resemblance to one from another place, and homemade versions add another layer of complexity,” Kumar added.
There have been many attempts at building an AI food scanner, but most struggled to work out how much food was actually on the plate. That is a major problem, because portion size drives the calorie and nutrient calculations: get the portion wrong, and the nutrition data will be wrong too.
Another issue with such food-tracking systems is that real-time operation demands a lot of processing power. Some even relied on remote cloud servers, which made them slow and raised data-privacy concerns. The NYU team claims its AI tool overcomes these limitations and provides accurate nutritional analysis.
An automated web-based food-tracking tool
The researchers selected millions of food images, then used an algorithm to group these images into similar food types, remove rarely seen items, and adjust category importance. This processing resulted in a refined dataset with 95,000 food images covering 214 categories. The AI tool was trained using these images.
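To make the curation step concrete, here is a minimal sketch of that kind of pipeline: grouping labeled images by food type, dropping rarely seen categories, and re-weighting the remaining classes. The function names and the cutoff threshold are illustrative assumptions, not the NYU team’s actual code.

```python
# Illustrative dataset-curation sketch: group labeled food images,
# remove rare categories, and compute class weights so common dishes
# do not dominate training. Threshold and names are assumptions.
from collections import Counter, defaultdict

MIN_IMAGES_PER_CATEGORY = 50  # assumed cutoff for "rarely seen" items

def curate(labeled_images):
    """labeled_images: iterable of (image_path, category) pairs."""
    # 1. Group images by food category.
    groups = defaultdict(list)
    for path, category in labeled_images:
        groups[category].append(path)

    # 2. Remove categories with too few examples.
    groups = {c: paths for c, paths in groups.items()
              if len(paths) >= MIN_IMAGES_PER_CATEGORY}

    # 3. Adjust category importance via inverse-frequency class weights.
    counts = Counter({c: len(paths) for c, paths in groups.items()})
    total = sum(counts.values())
    weights = {c: total / (len(counts) * n) for c, n in counts.items()}

    return groups, weights
```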
“One of our goals was to ensure the system works across diverse cuisines and food presentations. We wanted it to be as accurate with a hot dog—280 calories according to our system—as it is with baklava, a Middle Eastern pastry that our system identifies as having 310 calories and 18 grams of fat,” Panindre said.
The tool relies on a volumetric computation function, a technique that determines how much area a particular food item occupies on the plate. Knowing the area alone isn’t enough, though, because different foods have different densities. For example, a spoonful of rice and a spoonful of peanut butter take up the same volume but have very different weights and calorie counts.
The volumetric computation function therefore correlates the area a food covers with known food density and macronutrient data. That lets the AI system estimate not just what the food is, but also how much of it is present and what that portion is worth nutritionally. This is how the system provides highly accurate meal assessments without requiring manual input.
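The rough idea can be shown in a few lines. The sketch below goes from the area a food covers to an estimated mass via density, and from mass to calories and fat. The depth assumption and the per-food density and nutrient values are rough reference numbers chosen for illustration; the paper’s actual volumetric computation is not reproduced here.

```python
# Illustrative area -> portion -> nutrients estimate. All constants below
# are rough reference values, not the study's data.
FOOD_DATA = {
    # density (g/cm^3), kcal per gram, fat grams per gram
    "rice":          {"density": 0.80, "kcal_per_g": 1.3, "fat_per_g": 0.003},
    "peanut_butter": {"density": 1.10, "kcal_per_g": 5.9, "fat_per_g": 0.50},
}

def estimate_nutrition(food, area_cm2, assumed_depth_cm=1.5):
    """Estimate mass, calories, and fat from the area a food occupies."""
    info = FOOD_DATA[food]
    volume_cm3 = area_cm2 * assumed_depth_cm      # area -> volume (assumed depth)
    mass_g = volume_cm3 * info["density"]         # volume -> mass via density
    return {
        "mass_g": round(mass_g, 1),
        "calories_kcal": round(mass_g * info["kcal_per_g"], 1),
        "fat_g": round(mass_g * info["fat_per_g"], 1),
    }

# The same 20 cm^2 patch of plate yields very different nutrition:
print(estimate_nutrition("rice", 20))
print(estimate_nutrition("peanut_butter", 20))
```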
To cut processing demands and deliver quick, real-time analysis, the researchers built the AI tool as a web application rather than an app that runs locally on the device. It integrates technologies such as YOLOv8 and ONNX Runtime, which provide powerful image recognition with high efficiency.
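For readers curious what that pairing looks like in practice, the snippet below shows the standard route from a YOLOv8 detector to an ONNX graph served with ONNX Runtime. It is only a guess at the kind of pipeline the article refers to: the stock yolov8n.pt checkpoint stands in for the team’s food-detection weights, and their actual serving setup is not described in the article.

```python
# Standard YOLOv8 -> ONNX export, then inference with ONNX Runtime.
# yolov8n.pt is a stock checkpoint used here as a placeholder.
import numpy as np
import onnxruntime as ort
from ultralytics import YOLO

# Export the detector to ONNX (writes yolov8n.onnx alongside the weights).
YOLO("yolov8n.pt").export(format="onnx")

# Run the exported graph. The same ONNX model can be served from a
# lightweight web backend, or in the browser via the onnxruntime-web package.
session = ort.InferenceSession("yolov8n.onnx")
input_name = session.get_inputs()[0].name             # typically "images"
dummy = np.zeros((1, 3, 640, 640), dtype=np.float32)  # 640x640 RGB input
outputs = session.run(None, {input_name: dummy})
print([o.shape for o in outputs])                     # raw detection tensors
```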
When the study authors tested the scanner on a variety of dishes, such as pizza, hot dogs, and idli sambhar (a South Indian dish), the tool delivered accurate nutritional analyses.
“The AI can accurately locate and identify food items approximately 80% of the time, even when they overlap or are partially obscured,” the NYU team notes.
The web-based AI tool is currently at the proof-of-concept stage. The researchers plan to further improve its performance and soon make it available for public use on a large scale.
The study has been published by IEEE.