Wearables + Words: Estimating Calories with NPLM
What if your watch could help estimate your daily calories—not just count steps? Researchers introduce the Nutrition Photoplethysmography Language Model (NPLM), which blends photoplethysmography (PPG) heart signals from consumer wearables with free-text meal descriptions so a single model can reason about both physiology and food.
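The post doesn't detail the architecture, but the core idea of fusing a learned PPG embedding with meal-text tokens in one model can be sketched. Below is a minimal, hypothetical PyTorch sketch; the PPGEncoder/FusionModel names, layer sizes, and calorie-regression head are illustrative assumptions, not the authors' design:

```python
import torch
import torch.nn as nn

class PPGEncoder(nn.Module):
    """Hypothetical 1-D conv encoder mapping a raw PPG window to one embedding."""
    def __init__(self, embed_dim: int = 256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(64, embed_dim, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time -> one vector per window
        )

    def forward(self, ppg: torch.Tensor) -> torch.Tensor:
        # ppg: (batch, 1, samples) -> (batch, embed_dim)
        return self.conv(ppg).squeeze(-1)

class FusionModel(nn.Module):
    """Prepend the PPG embedding to meal-text token embeddings, then regress
    calories from a small Transformer encoder (assumed fusion scheme)."""
    def __init__(self, vocab_size: int = 30_000, embed_dim: int = 256):
        super().__init__()
        self.ppg_encoder = PPGEncoder(embed_dim)
        self.token_embed = nn.Embedding(vocab_size, embed_dim)
        layer = nn.TransformerEncoderLayer(embed_dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.calorie_head = nn.Linear(embed_dim, 1)

    def forward(self, ppg: torch.Tensor, meal_tokens: torch.Tensor) -> torch.Tensor:
        ppg_emb = self.ppg_encoder(ppg).unsqueeze(1)   # (B, 1, D)
        text_emb = self.token_embed(meal_tokens)       # (B, T, D)
        fused = torch.cat([ppg_emb, text_emb], dim=1)  # (B, 1+T, D)
        pooled = self.encoder(fused).mean(dim=1)       # (B, D)
        return self.calorie_head(pooled).squeeze(-1)   # (B,) kcal estimates

model = FusionModel()
ppg = torch.randn(2, 1, 1024)                    # two fake 1024-sample PPG windows
meal_tokens = torch.randint(0, 30_000, (2, 16))  # two fake 16-token meal texts
print(model(ppg, meal_tokens).shape)             # torch.Size([2])
```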
Trained on data from 19,340 participants and 1.1 million meal–PPG pairs, NPLM improved daily caloric-intake prediction by 11% over text-only baselines. Accuracy largely held up even when 80% of the meal text was removed, and an independent validation study (n=140) with controlled dining replicated the gains.
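The 80% robustness result implies something like word-level dropout on the meal text. Here is a hedged sketch of such an ablation; the drop_meal_words helper and its sampling rule are assumptions, and the paper may ablate differently:

```python
import random

def drop_meal_words(description: str, drop_frac: float = 0.8, seed: int = 0) -> str:
    """Illustrative ablation: randomly drop a fraction of the words in a
    meal description, keeping the survivors in their original order."""
    rng = random.Random(seed)
    words = description.split()
    if not words:
        return description
    keep = max(1, round(len(words) * (1.0 - drop_frac)))  # always keep >= 1 word
    kept = sorted(rng.sample(range(len(words)), keep))
    return " ".join(words[i] for i in kept)

# An 11-word description keeps only 2 words at drop_frac=0.8;
# the model would still see the PPG signal plus this sparse text.
print(drop_meal_words("grilled chicken salad with avocado, olive oil and a slice of sourdough"))
```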
Why it matters: integrating real-world physiology with what we eat could enable low-effort, noninvasive dietary monitoring at scale. It’s early-stage research—not a medical tool—but it shows how wearables can move from counting to understanding.
Authors: Kyle Verrier, Achille Nazaret, Joseph Futoma, Andrew C. Miller, Guillermo Sapiro
Paper: https://arxiv.org/abs/2511.19260v1
Register: https://www.AiFeta.com
#AI #health #wearables #nutrition #machinelearning #LLM #PPG #calories #arxiv #research