Dear Google: I want my smart glasses to become my food logbook

Ray Ban Meta Live Translation (3 of 3)

C. Scott Brown / Android Authority

I don’t count myself as a fitness-tracking nerd who needs a smartwatch strapped to his wrist every waking second just to log every muscle twitch. I wear mine only while working out, or occasionally when I’m away from my phone for a bit — like when it’s in another room — to avoid missing calls or notifications. That’s it. That’s the entire purpose of a smartwatch for me.

While my watch can track everything from my broken sleep to biking laps — you know, the calories I burn — without me lifting a finger, it can’t as seamlessly track the calories that go into me. I’ve always struggled with nutrition tracking. Even though dozens of apps let you log your daily meals, the friction remains.

The only wearable that could bridge this wide gap in fitness tracking, I foresee, is smart glasses. And even though I’m not too fond of making every wearable “smart,” if glasses could handle nutrition tracking for me, I’d happily throw my wallet at them.


Manual logging is urghhhhhh

The calorie tracking app offers dedicated plans for managing diet and exercise.

Kaitlyn Cimino / Android Authority

The last thing I want to do is pull out my phone to log my meal while I’m still chewing or right after I’m done eating. But I know that if I don’t do it right then, I’ll either forget to do it later or forget the specifics of what I ate. And any such gaps make the rest of the data meaningless — all that carefully logged information suddenly becomes unreliable. That, in turn, makes me slack off even more, and soon the whole tracking habit derails.

So, it’s either I track everything meticulously and religiously, or not at all. Anything in between is just meaningless, sporadic data with little to no use, except maybe for spotting my pattern of failure.

Apps have tried to make things easier with barcode scanning for snacks, saved meals, AI photo recognition, and so on. Sure, that helps a bit, especially the last one. But I still have to remember to log, pull out my phone at the dinner table, and risk being judged for snapping a picture of my food like I’m about to post it on Instagram.

How do I tell the app that the snack whose barcode I just scanned was shared by five people, and one friend grabbed most of it? Or how do I know how much and what kind of oil went into a restaurant dish?

And yes, I may be lazy too. But it gets worse as the logging process grows more complex. How do I tell the app that the snack whose barcode I just scanned was shared by five people, and one friend grabbed most of it? Or how do I know how much and what kind of oil went into a restaurant dish? Worse still, Indian food alone can break any algorithm — there are so many ingredients and variations that even our own brains struggle to make sense of it all.

Our wearables, meanwhile, have gotten incredibly good at automating other forms of tracking. You just wear your smartwatch, and it automatically detects your workout, logs your activity, maps your route with GPS — all without a prompt. That seamlessness makes tracking effortless, even enjoyable. Nutrition tracking, on the other hand, still feels like paperwork. The real problem isn’t a lack of apps; it’s the lack of a wearable designed to make it easy. And I can already see which one could.

Smart glasses + AI = a deadly combination 🔥

BleeqUp Ranger smart glasses 1

Harley Maranan / Android Authority

Smart glasses, resting on your nose, can see what you see, hear what you hear, and respond to what you say. That’s sci-fi-level stuff, already brought into our present, maybe a little too fast.

You might want your smart glasses to translate signs while traveling, record memories through your literal eyes, or answer random questions about the plants and dogs you spot on your walk. But not me. I just want them to fix nutrition tracking. It’s a big part of overall fitness, yet the most neglected one.

Just like your smartwatch automatically tracks your run, your glasses could automatically track your meals.

Modern smart glasses are practically made for this. They see what you see, so — just like your smartwatch automatically tracks your run — your glasses could automatically track your meals. No barcode scanning, no typing, no photo-taking required. They could see your plate, estimate the portion size, identify the restaurant, maybe even cross-check reviews to infer cooking styles or visible oil levels, and estimate your calories.

And if something needs fine-tuning, you could just tell it, “It’s my cheat day, I added more cream.” With advanced LLMs humming in the background, AI would instantly know what to do with that data. This would take the burden of logging off me and put it where it belongs: on the algorithm. You know, the kind of thing we actually need artificial intelligence for — not to create abstract art or fake music.
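To make that flow concrete, here’s a toy sketch of the idea. Everything in it is hypothetical — the class, the field names, and the crude keyword matching (a real system would hand the voice note to an LLM and sync the result to a health platform) — but it shows the shape of the pipeline: a visual calorie estimate that a spoken correction can adjust after the fact.

```python
from dataclasses import dataclass, field

@dataclass
class MealEntry:
    """Hypothetical meal log entry a pair of smart glasses might produce."""
    dish: str
    estimated_kcal: float                     # from an assumed on-device vision model
    corrections: list = field(default_factory=list)

    def apply_voice_note(self, note: str) -> None:
        # Naive keyword matching stands in for an LLM interpreting the note.
        note_lower = note.lower()
        if "more cream" in note_lower:
            self.estimated_kcal *= 1.2        # richer version: bump estimate ~20%
            self.corrections.append(note)
        if "shared" in note_lower:
            self.estimated_kcal /= 2          # split dish: assume half was eaten
            self.corrections.append(note)

# The glasses "see" the plate and log an initial guess...
entry = MealEntry(dish="pasta alfredo", estimated_kcal=600.0)
# ...then the wearer corrects it out loud.
entry.apply_voice_note("It's my cheat day, I added more cream")
print(round(entry.estimated_kcal))  # 720
```

The point of the sketch is the division of labor: the vision side produces a rough number with zero effort from the wearer, and the voice side only fine-tunes it — the burden of getting the log started never falls on the human.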

Meta Ray Ban Display 2

Right now, the only mainstream AI smart glasses are the Meta Ray-Bans, which recently gained a screen, and Google is reportedly rebooting its Glass project. But neither tackles nutrition tracking yet, as Meta seems more focused on general-purpose use and media creation than fitness.

You could sort of do something similar with Gemini Live using your phone’s camera, but there’s no proper integration with Google Fit, Fitbit, or any third-party health platforms. What we need is deep, ecosystem-level integration: a way to connect visual recognition, voice commands, and nutrition tracking into one coherent system that just works.

I’m fairly sure that as the category matures and smart glasses become mainstream, companies will eventually focus on this and make fitness tracking a lot more cohesive.

AI isn’t perfect — but it’s the only way forward

smart glasses from above

Adam Birney / Android Authority

Food tracking is far more nuanced than fitness tracking, which has clear metrics and limited variables. Nutrition changes from house to house and person to person. That’s overwhelming even for AI. No matter how smart it gets, AI will always lack the human context behind our meals: why we eat what we do, how much, and with whom. It’ll always be a step or two behind. But even that would still be a massive leap from where we are today.

The future I imagine has the main smart device on your wrist (and not in your palm!) and a pair of smart eyes sitting in front of your real ones, bringing visual context to health data. Positioning smart glasses as fitness companions that can automatically track your meals and calorie intake? That’s just me giving away a billion-dollar idea for free.
