Food photography and carb estimation

Prompted by @RogerType1’s comment about Bitesnap in the “MDI - sorry for complaining/venting” thread, I started looking at “food photography” apps that can estimate carb (and other nutrition) content.

I found Bitesnap, Suggestic and Undermyfork, the last of these via the “Food logging app — testers wanted!” thread. All three are available on iPhone, and the first is available on Android too.

I now have all of them installed, but of course, while my family are chronic food photographers, opening three apps and taking a picture of every meal in each is something I probably won’t manage to do. So I’m interested in other people’s mileage with these apps.

Do any of them work for restaurant meals? You never know how much sugar the chef has added to even the most apparently carb-free dishes, but counting the croutons in a chef’s salad is probably possible, so that might work.

What about integration with other apps, particularly on iPhone, where Apple provides a proprietary framework for nutrition data with Apple Health? UnderMyFork is the only one of the three that asks to access Apple Health, but it apparently only reads blood glucose data; it doesn’t seem to write carb information, though maybe that comes later.

UnderMyFork sounds like the only one of the apps I found that is targeted at diabetics, as opposed to general monitoring of nutrition intake. It supports the Dexcom G6, which I use, and has a really simple setup: just a BG range, whereas the other apps want to know about health goals and the like.

Suggestic seems more oriented to telling me what to eat than telling me what I’m eating. It also requires an account to be set up, which suggests they want to sell my data. UnderMyFork says it keeps the data only on the phone; BiteSnap is less clear, but it isn’t asking for any permissions up front. Presumably both will ask to access the camera when I first take a picture.

My out-of-the-box take is that Suggestic has an overcomplicated UI and doesn’t do the same thing as the other apps; it takes pictures of written menus, not actual food. I suspect it will be the first one I uninstall.

4 Likes

I just use xDrip for insulin on board, predicted glucose after meals and insulin, etc., but when I’m not sure, I’ll use Bitesnap and usually estimate the portions correctly… sometimes I’ll have to take another correction, or “feed the insulin” if I’ve taken too much. I tried Daily Plate a long time ago, but Bitesnap is my go-to for quick estimated carbs… I once saw a type 1 diabetic child’s mother screaming at Sam’s Club cafe workers because they couldn’t tell her how many carbs were in her child’s Icee and pizza… I was estimating 80 for the Icee and 80 for the 2 slices of pizza, but decided not to intervene or show her how Bitesnap worked… I’d rather use an educated guess than add a scale to the plethora of items in my cargo pants pockets.

5 Likes

Hi, I’m with the Undermyfork team — thanks for trying our app! Our team would love to hear more feedback from you, and we hope you’ll stick with it :slight_smile: Have you used the Insights in the app yet? Are they working for you?

I must say, Undermyfork specifically doesn’t aim to estimate carbs by photo (the state of image-recognition tech isn’t there yet), and it doesn’t use a carb-counting approach to guide you.

The app was designed as a simple personal handbook of your meals in different contexts, so you can browse it and see which meals (+ insulin + any other details tagged on the meal) work best for you in terms of time in range. We believe this approach is less burdensome for daily management, as long as you can do basic carb estimation for the right insulin dosing, which I guess is the first thing most people learn when they are first diagnosed.

Plus, not all carbs are equal, and fat and protein raise blood sugar too, at a different pace — so a personal postprandial time-in-range for each meal might be a more accurate metric for ‘what to eat and how to dose for it’ guidance.
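For anyone curious what a postprandial time-in-range metric boils down to, here is a minimal sketch. The function name, sample readings, and the 70–180 mg/dL bounds are illustrative assumptions, not anything from the app itself:

```python
# Sketch: postprandial time-in-range for a single meal.
# Assumes evenly spaced CGM readings (e.g. every 5 minutes, as the
# Dexcom G6 reports) taken in a window after the meal. The 70-180 mg/dL
# bounds are a common default target range, not an app-specific value.

def postprandial_tir(readings_mg_dl, low=70, high=180):
    """Fraction of post-meal readings inside [low, high]."""
    if not readings_mg_dl:
        return 0.0
    in_range = sum(1 for r in readings_mg_dl if low <= r <= high)
    return in_range / len(readings_mg_dl)

# One hour of 5-minute readings after two hypothetical meals:
meal_a = [110, 120, 135, 150, 165, 175, 182, 190, 185, 176, 168, 155]
meal_b = [105, 112, 118, 125, 130, 128, 122, 118, 114, 110, 108, 105]

print(f"meal A TIR: {postprandial_tir(meal_a):.0%}")  # 75% - spiked over 180
print(f"meal B TIR: {postprandial_tir(meal_b):.0%}")  # 100% - stayed in range
```

Comparing that percentage across repeated instances of the same meal (plus dose and tags) is the kind of “which meals work best for me” insight described above.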

2 Likes

UPDATED: I got the SugarMate and UnderMyFork handling of the Dexcom G6 mixed up; the affected sections below have been corrected:

I’ve ended up with four apps plus the Dexcom G6 and Omnipod View apps. They all have some elements of overlap but really each performs a different function:

  1. SugarMate. This is the closest I can find to an xDrip+ replacement. It’s handling my logging and prediction, although its prediction isn’t anywhere near the xDrip+ standard. Unfortunately it relies on a Dexcom Share login, so it fails totally if there is no Internet coverage; SugarMate couldn’t get my BG readings for 13 hours yesterday because I was on an international flight. This is precisely when I do need it, while trying to deal with 13 hours of sitting down and being fed carbs!

  2. UnderMyFork. This falls between SugarMate and BiteSnap (below). It logs insulin but doesn’t do any prediction, and it logs, and photographs, carbs but doesn’t do any estimation. The glucose logging uses Apple Health to get the readings from the Dexcom G6, so it should be much more reliable than SugarMate; however, it sometimes lags on the readings by several hours. I don’t know if this is because of UnderMyFork itself or because the G6 update to Apple Health lags.

  3. BiteSnap. Just handles the carb aspect: it takes pictures of food :-), guesses the content (like UnderMyFork), but then guesstimates the nutritional values. It isn’t particularly designed for diabetics, but most of the time its carb estimates match mine and my wife’s, and it does protein and fat as well.

  4. Suggestic. A diet-helper application. I deleted it because it doesn’t do anything useful for me, but it may well be very helpful to a newly diagnosed T1 or T2 diabetic; it makes suggestions as to what to eat (and what not to eat) and has a possibly appropriate “low carb” diet (or you could do Paleo or Keto).

There is a fifth squeaky wheel in the picture: Apple Health. It is an important part of the potential system because it is basically an OS-provided database of health data (effectively my medical records). It has enormous potential. The problem is that, apart from the Dexcom G6 app, the applications seem to fail to contribute data. Standardization required.

These are very much first impressions; I’ve only been running BiteSnap and UnderMyFork for 48 hours! So far I can’t do without SugarMate until I find something better (please port xDrip+ to iOS and implement an Apple Health data source!). BiteSnap has real potential because of the prediction:

I think BiteSnap disproves that, though I accept it’s not infallible; T1Ds will always be better at measuring the amount of carbs in food than anything else, and photographic analysis can’t normally work out how much sugar was added to things that should not have any sugar added. More on this when I’ve seen more BiteSnap analyses.

1 Like

I’ve tried BiteSnap, and agree, it doesn’t always recognize the food, especially homemade food with lots of ingredients. It comes pretty close on the carb estimate once the ingredients are manually entered. I really only tested it on my standard breakfast and dinner meals. I like that it gives the nutritional breakdown of ALL ingredients, especially nice if you are tracking them :slight_smile: I know I lack potassium, for instance, so this gives me a tracking mechanism for it.

2 Likes

I certainly agree on the advantage of getting the extra nutritional information, and BiteSnap produces an enormous list, so it should meet all needs.

Both BiteSnap and UnderMyFork seem to have similar recognition accuracy, and both seem to use “cloud”-based (neural net) recognition services. Neither can do recognition unless it is connected to the internet.

This is just image recognition, but the state of the art in computer vision (image recognition is only one part of it) makes it easy to extend recognition to quantification: recognizing something and then estimating the quantity of that thing. Indeed, recognition will always be sloppy because of the issues you raise; “white sludge” might be yoghurt or it might be creamed potato. However, quantification should be accurate; the computer will know it is a cup of white sludge :wink:
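To make the recognition-versus-quantification split concrete, here is a toy sketch: once you have a label and a volume, the carb estimate is just a lookup and some arithmetic. The food table, density and carb figures below are illustrative placeholders, not values from any of these apps:

```python
# Sketch: turning a recognition label plus an estimated volume into a
# carb estimate. Recognition picks the label; quantification supplies
# the volume; a nutrition table does the rest. All numbers illustrative.

FOOD_DB = {
    # label: (grams per ml, carbs per 100 g)
    "yoghurt":        (1.03, 5.0),
    "creamed potato": (1.05, 15.0),
}

def estimate_carbs(label, volume_ml):
    """Carbs in grams for volume_ml of the recognized food."""
    density, carbs_per_100g = FOOD_DB[label]
    grams = volume_ml * density
    return grams * carbs_per_100g / 100

# A 240 ml cup of "white sludge": the quantity is the same either way,
# but the carb answer depends entirely on which label is chosen.
print(round(estimate_carbs("yoghurt", 240), 1))         # 12.4 g
print(round(estimate_carbs("creamed potato", 240), 1))  # 37.8 g
```

Which is exactly the point: the quantification step is reliable, and a user correcting “white sludge” to the right label fixes the rest.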

2 Likes

That’s a great point!! I guess as long as the user knows what “white sludge” is, it could be named anything, as long as the nutrition breakdown is accurate, of course :slight_smile: