Snap, Scan, and Know: AI Turns Meal Photos into Instant Nutrition Facts

Forget food diaries—NYU Tandon’s breakthrough AI system lets you photograph your meal and instantly get calorie and macronutrient estimates. With advanced deep learning and volumetric analysis, this tech simplifies diet tracking for better health.

Research: Deep Learning Framework for Food Item Recognition and Nutrition Assessment. Image Credit: Pixel-Shot / Shutterstock

Snap a photo of your meal, and artificial intelligence instantly tells you its calorie count, fat content, and nutritional value - no more food diaries or guesswork.

This futuristic scenario is now much closer to reality, thanks to an AI system developed by NYU Tandon School of Engineering researchers. This system promises a new tool for the millions of people who want to manage their weight, diabetes, and other diet-related health conditions.

The technology, detailed in a paper presented at the 2025 6th International Conference on Mobile Computing and Sustainable Informatics (ICMCSI), uses advanced deep-learning algorithms to recognize food items in images and calculate their nutritional content, including calories, protein, carbohydrates, and fat.

For over a decade, NYU's Fire Research Group, which includes the paper's lead author Prabodh Panindre and co-author Sunil Kumar, has studied critical firefighter health and operational challenges. Several research studies show that 73-88% of career and 76-87% of volunteer firefighters are overweight or obese, facing increased cardiovascular and other health risks that threaten operational readiness. These findings directly motivated the development of their AI-powered food-tracking system.

"Traditional methods of tracking food intake rely heavily on self-reporting, which is notoriously unreliable," said Panindre, Associate Research Professor of NYU Tandon School of Engineering's Department of Mechanical Engineering. "Our system removes human error from the equation."

Despite the concept's apparent simplicity, developing reliable food recognition AI has stumped researchers for years. Previous attempts have struggled with three fundamental challenges the NYU Tandon team appears to have overcome.

"The sheer visual diversity of food is staggering," said Kumar, Professor of Mechanical Engineering at NYU Abu Dhabi and Global Network Professor of Mechanical Engineering at NYU Tandon. "Unlike manufactured objects with standardized appearances, the same dish can look dramatically different based on who prepared it. A burger from one restaurant bears little resemblance to one from another place, and homemade versions add another layer of complexity."

Earlier systems also failed to estimate portion sizes, a crucial factor in nutritional calculations. The NYU team's advance is a volumetric computation function that uses advanced image processing to measure the exact area each food item occupies on a plate.

The system correlates the area occupied by each food item with density and macronutrient data to convert 2D images into nutritional assessments. This integration of volumetric computations with the AI model enables precise analysis without manual input, solving a longstanding challenge in automated dietary tracking.
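
The paper's exact conversion code isn't reproduced in this article, but the pipeline it describes (segmented area to volume, volume to mass, mass to nutrients) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function name, the pixel-to-centimeter scale, the assumed food height, and the density and per-100-gram nutrient values in FOOD_DB are all hypothetical.

```python
# Hypothetical sketch of the area-to-nutrition conversion described above.
# Assumptions (not from the paper): a calibrated pixel-to-cm scale, a fixed
# assumed food height, and per-100 g density/macronutrient lookup values.

FOOD_DB = {
    # name: (density g/cm^3, kcal, protein g, carbs g, fat g per 100 g)
    "pizza": (0.55, 266, 11.0, 33.0, 10.0),  # illustrative values only
}

def nutrition_from_mask(food: str, mask_pixels: int,
                        cm_per_pixel: float, height_cm: float) -> dict:
    """Convert a segmented 2D food area into estimated nutrition facts."""
    density, kcal, protein, carbs, fat = FOOD_DB[food]
    area_cm2 = mask_pixels * cm_per_pixel ** 2   # 2D area on the plate
    volume_cm3 = area_cm2 * height_cm            # assumed uniform height
    mass_g = volume_cm3 * density                # volume -> mass
    scale = mass_g / 100.0                       # scale per-100 g values
    return {
        "mass_g": round(mass_g, 1),
        "calories": round(kcal * scale),
        "protein_g": round(protein * scale, 1),
        "carbs_g": round(carbs * scale, 1),
        "fat_g": round(fat * scale, 1),
    }

print(nutrition_from_mask("pizza", mask_pixels=52000,
                          cm_per_pixel=0.05, height_cm=1.5))
```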

The third major hurdle has been computational efficiency. Previous models required too much processing power to be practical for real-time use, often necessitating cloud processing that introduced delays and privacy concerns.

The researchers used a powerful image-recognition technology called YOLOv8 with ONNX Runtime (a tool that helps AI programs run more efficiently) to build a food-identification program that runs on a website instead of as a downloadable app. This allows people to simply visit the website using their phone's web browser to analyze meals and track their diet.
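
The team's trained model and website code aren't public, but the pattern the paper names, a YOLOv8 detector exported to ONNX and executed with ONNX Runtime, looks roughly like this in Python. The checkpoint and file names and the 640x640 input size are assumptions for illustration, not the authors' actual artifacts.

```python
# Rough sketch of the YOLOv8 -> ONNX Runtime pattern named in the paper.
import numpy as np
import onnxruntime as ort
from ultralytics import YOLO

# One-time export: convert a trained YOLOv8 checkpoint to ONNX.
YOLO("food_detector.pt").export(format="onnx")  # writes food_detector.onnx

# Inference with ONNX Runtime, the lightweight engine that makes
# web-based, near-real-time deployment practical.
session = ort.InferenceSession("food_detector.onnx",
                               providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

# A meal photo resized to the model's input resolution, scaled to [0, 1],
# in NCHW layout; real code would letterbox-resize an actual image here.
image = np.random.rand(1, 3, 640, 640).astype(np.float32)
outputs = session.run(None, {input_name: image})
print(outputs[0].shape)  # raw detections, to be decoded and NMS-filtered
```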

When tested on a pizza slice, the system calculated 317 calories, 10 grams of protein, 40 grams of carbohydrates, and 13 grams of fat - nutritional values that closely matched reference standards. It performed similarly well when analyzing more complex dishes such as idli sambhar, a South Indian specialty featuring steamed rice cakes with lentil stew, for which it calculated 221 calories, 7 grams of protein, 46 grams of carbohydrates, and just 1 gram of fat.
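
As a quick sanity check, both sets of reported figures are internally consistent with the standard Atwater factors of 4 kcal per gram of protein, 4 kcal per gram of carbohydrate, and 9 kcal per gram of fat:

```python
# Verify the reported macros against the standard Atwater factors
# (4 kcal/g protein, 4 kcal/g carbohydrate, 9 kcal/g fat).
def atwater_kcal(protein_g: float, carbs_g: float, fat_g: float) -> float:
    return 4 * protein_g + 4 * carbs_g + 9 * fat_g

print(atwater_kcal(10, 40, 13))  # pizza slice: 317 kcal, as reported
print(atwater_kcal(7, 46, 1))    # idli sambhar: 221 kcal, as reported
```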

"One of our goals was to ensure the system works across diverse cuisines and food presentations," said Panindre. "We wanted it to be as accurate with a hot dog - 280 calories according to our system - as it is with baklava, a Middle Eastern pastry that our system identifies as having 310 calories and 18 grams of fat."

The researchers solved data challenges by combining similar food categories, removing food types with too few examples, and emphasizing underrepresented foods during training, as sketched below. These techniques helped them pare a far larger initial pool of images down to a balanced set of 95,000 instances across 214 food categories.
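
The article doesn't give the team's exact merge rules or cutoffs, but the three balancing steps it describes follow a common pattern. In the sketch below, the merge map and the minimum-count threshold are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch of the dataset-balancing steps described above:
# merge similar categories, drop sparse classes, oversample the rest.
import random
from collections import Counter

MERGE = {"cheese_pizza": "pizza", "pepperoni_pizza": "pizza"}  # assumed map
MIN_COUNT = 50  # assumed threshold for dropping sparse classes

def balance(samples: list[tuple[str, str]]) -> list[tuple[str, str]]:
    """samples: (image_path, label) pairs -> merged, filtered, oversampled."""
    # 1) Merge visually similar categories under one label.
    merged = [(path, MERGE.get(lbl, lbl)) for path, lbl in samples]
    # 2) Drop classes with too few examples to learn from.
    counts = Counter(lbl for _, lbl in merged)
    kept = [s for s in merged if counts[s[1]] >= MIN_COUNT]
    # 3) Oversample minority classes up to the largest class size.
    by_class: dict[str, list] = {}
    for s in kept:
        by_class.setdefault(s[1], []).append(s)
    target = max(len(group) for group in by_class.values())
    balanced = []
    for group in by_class.values():
        balanced.extend(group)
        balanced.extend(random.choices(group, k=target - len(group)))
    return balanced
```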

The technical performance metrics are impressive: the system achieved a mean Average Precision (mAP) score of 0.7941 at an Intersection over Union (IoU) threshold of 0.5. For non-specialists, this means the AI can accurately locate and identify food items approximately 80% of the time, even when they overlap or are partially obscured.
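
For readers who want the metric itself: Intersection over Union divides the overlap area of a predicted and a ground-truth bounding box by the area of their union, and the 0.5 threshold means a detection counts as correct only when that ratio is at least one half. A minimal implementation:

```python
# Intersection over Union (IoU): overlap area of two boxes divided by the
# area of their union. Boxes are (x1, y1, x2, y2) corner coordinates.
def iou(a: tuple, b: tuple) -> float:
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# A prediction "counts" at the paper's threshold when iou(...) >= 0.5.
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 0.33... -> would not count
```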

The system has been deployed as a web application that works on mobile devices, making it potentially accessible to anyone with a smartphone. The researchers describe their current system as a "proof-of-concept" that could be refined and expanded for broader healthcare applications.

In addition to Panindre and Kumar, the paper's authors are Praneeth Kumar Thummalapalli and Tanmay Mandal, both master's degree students in NYU Tandon's Department of Computer Science and Engineering.

Journal reference:
  • P. Panindre, P. K. Thummalapalli, T. Mandal and S. Kumar, "Deep Learning Framework for Food Item Recognition and Nutrition Assessment," 2025 6th International Conference on Mobile Computing and Sustainable Informatics (ICMCSI), Goathgaun, Nepal, 2025, pp. 1648-1653, doi: 10.1109/ICMCSI64620.2025.10883519, https://ieeexplore.ieee.org/abstract/document/10883519
