I’m new here, so first let me introduce myself. My name is Jeff and I am blind and a T1. My doctor was going to prescribe the FreeStyle Libre, but it is not covered by my health plan. The Dexcom is, so tomorrow he is going to show me how to use it with my iPhone. I have some questions. First, is the sensor hard to insert? Is it hard to connect the transmitter? Has the app gotten more accessible with VoiceOver for iOS? If all I have to do is have someone read me the ID number on my transmitters every 3 or 6 months, then that’s what I want, something I can mostly do on my own.
Thank you all in advance,
Welcome. Trust at first, but always be sceptical:
Hey Siri! What’s my BG?
I think if that doesn’t work tomorrow your doc has answered your question. I, for one, would like to know what the result was; please tell us how it goes down.
UPDATE: I just tried it with the Dexcom G6 app. Siri is useless; it’s just a spoken Google search. However, the screen reader (Accessibility/Spoken Content) does work with the Dexcom app. The default app display only has one speakable item on it, but it’s the right one. Reading the whole screen says:
“128 milligrams per deciliter and is constant, trend graph.”
The speech controller pops up over the BG reading, but if you use “touch to speak item” it is possible to touch it (the speech controller scrolls out of the way) and it just says “128 milligrams per deciliter and is constant.” I.e., the useless “trend graph” speech is omitted.
I can’t answer the accessibility questions, but we have some members with sight problems, so I am sure they will be by with better answers than I. I can unequivocally say the insertion of the Dexcom G6 sensors is light years ahead of the G5. Night and day easier.
In my opinion, Dexcom is far superior to Libre, and I have used them both. I am mostly talking about accuracy but could go into other reasons as well. And the G6 sensor is far easier and simpler to install than the G5. I have had the G5 for 18 months now and still can’t install it easily. We are talking about Dexcom products. Dexcom also provides a mobile app for your phone. I would recommend Dexcom products, but with the xDrip+ app. That app is light years better than the Dexcom app and will even read you your BG every five minutes or at whatever frequency you choose. But xDrip+ runs on Android, not iOS. My recommendation is Dexcom: the G6 sensor (which everyone will be moving to) and the xDrip+ app if you have Android. Best of luck.
I also love the G6, but I’ve never used the Libre. The alarms on the G6 are incredibly helpful at night, and the Libre on its own does not have those.
The G6 sensor is easy to put on; however, it does require that you take a picture of a 4-digit code with your phone or enter the code yourself. If you’re able to photograph the code with your phone, then it seems like it could work well for you. I’m sure that if you contacted Dexcom, they could provide you with some advice on how to make that work.
Also, the app has an option to photograph the transmitter’s code as well, rather than entering it manually.
Hi @jworthin, welcome to FUD. I also have T1D and have been legally blind since birth. I do have some residual vision but use a cane, braille, and a screen reader in addition to low vision aids.
I haven’t used the Dexcom G6, but have used the G5 and Libre. For me, the Dexcom was much more accurate than the Libre. Both the Dexcom and Libre apps are accessible with VoiceOver except for the trend graphs and some of the statistics. But current blood sugar, trend arrows, menus, and setup are accessible other than the transmitter ID on the Dexcom.
The app does have a feature to take a picture of the transmitter ID, but this is not accessible (at least not with the G5 app, maybe the G6 app has made it accessible). If you want to be independent with the transmitter ID, you could see if Seeing AI or Be My Eyes or Aira could help.
I have never tried the Siri feature with these apps. Alarms go to the lock screen or Notification Center and can be read as usual, like other notifications, with VoiceOver.
I never had a problem inserting the sensor or transmitter by touch. The G5 transmitter was easy to orient to snap into the sensor, and I assume the G6 would also be easy to orient tactually. I’ve heard the G6 is even easier than the G5 was.
I hope this helps, and good luck! Feel free to connect any time about diabetes and blindness issues.
Ha! There are a variety of apps that work with the Dexcom G6 on an iPhone. One of them is an app called “SugarMate,” which can be installed from the Apple App Store (lots of others can’t, because Apple classes them as medical apps and requires the app developer to do a whole lot of stuff before it will allow them).
Now, SugarMate does have Siri support and can be programmed to interact via voice for a number of things, including entering stuff like carb intake. This has to be turned on in Settings/Siri Shortcuts, but then you get access to a whole load of logging commands and one to read your current sugar levels. The text you speak can be edited, and each item can be turned on or off.
So now when I say “Hey Siri! What’s my blood glucose?” Siri comes back and responds (after a delay and with periodic failures):
SugarMate says: “It’s 119 and steady, last checked 5 minutes ago.”
That’s far more useful than the screen reader because it works without touching the phone. However, SugarMate requires an internet connection and sometimes loses readings; it uses Dexcom Share, it doesn’t communicate directly with the G6 app. Your doc may well set up Dexcom Share anyway; it is one of the ways of getting reports from the G6. If not, I recommend it anyway; with your permission it allows other people to access your blood sugar data, and it is conceivable that there may be additional assistive technology that uses it.
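For anyone curious how an app turns a raw CGM reading into that spoken sentence, here is a rough sketch of the idea. All of the names and trend labels below are my own illustration, not SugarMate’s or Dexcom’s actual code:

```python
# Illustrative sketch of composing a spoken BG announcement, similar
# in spirit to SugarMate's "It's 119 and steady" phrasing.
# All names here are hypothetical, not from any real app.

def describe_trend(trend: str) -> str:
    """Map a CGM trend label to a spoken word or phrase."""
    phrases = {
        "flat": "steady",
        "fortyFiveUp": "rising slightly",
        "singleUp": "rising",
        "doubleUp": "rising fast",
        "fortyFiveDown": "falling slightly",
        "singleDown": "falling",
        "doubleDown": "falling fast",
    }
    return phrases.get(trend, "trend unknown")

def spoken_reading(mg_dl: int, trend: str, minutes_ago: int) -> str:
    """Compose the sentence a voice assistant would speak."""
    return (f"It's {mg_dl} and {describe_trend(trend)}, "
            f"last checked {minutes_ago} minutes ago.")

print(spoken_reading(119, "flat", 5))
# It's 119 and steady, last checked 5 minutes ago.
```

The point is that the app only needs the number, the trend, and the reading’s age; everything else is canned text, which is why the spoken wording can usually be edited in the app’s settings.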
I fairly often see sighted people say that Siri is better than a screen reader for accessing these types of diabetes apps, with which I wholeheartedly disagree. Siri gives you access to one single feature of a supported app. A screen reader gives you access to an entire OS and lets you use any app or feature that a sighted person would use, as long as it’s designed to be accessible. Siri might be handy whether you’re sighted or visually impaired, but it’s no replacement for a full-fledged screen reader for using and accessing an app.
Besides, I’m posting this using a braille display connected over Bluetooth. I’m not touching my phone, and speech is turned off (but VoiceOver is running, because it’s what sends information to my braille display). That is WAY better than Siri in my book, and by far my preferred way of accessing computer content. I’m able to check my BG silently in work meetings using this technology; no way would I want to have to ask Siri what my blood sugar is and have her announce it to everyone in the middle of a meeting.
So far as the sensor ID (the calibration code) is concerned, the QR code on the G6 is on one of the adhesive cover papers on the massive insertion device. The dialog for sensor setup gets to a point where you need to enter the code, and there is a “take a photo” button on that screen. I imagine that if you can insert the sensor by touch, it would be easy to take the photo before removing the adhesive covers.
The same code is on the bottom left of the packaging for the sensor plus insertion device. The packaging is asymmetric, so working out which way to hold it should be easy. Of course, in both cases it is necessary to wave the phone around a bit to get the QR code in view; that will require practice and could be very frustrating. However, the app grabs the code as soon as it is in the right place; no button press required.
The transmitter ID, which only has to be entered once every 90-100 days, is only on the packaging, and they keep changing the package. It might be possible to sweep over the whole package, but it would be tedious. I think it is there as a QR code, but I can’t remember the details.
The transmitter ID is displayed on the Settings screen with the G6 app. I can’t find the sensor code; xDrip+ does show it on the status screen, but the G6 receiver never did and so far as I can see the app doesn’t either.
Well yes, I was just comparing the built-in audio screen reader with Siri for the purpose of rapidly finding my blood sugar; this is something I do several times an hour and when I do it I want minimal interaction with the phone. Indeed, Apple doesn’t allow me to have any interaction when driving.
I did miss something in the G6 app; it has a single Siri shortcut for reading out my blood glucose. This is better than using the SugarMate approach because it doesn’t depend on an internet connection. Neither does the Spike app, for that matter, because it takes over the whole G6 management; it can’t co-exist with the G6 app.
Thanks for this information. I wasn’t aware that there were QR codes on the boxes nor that the G6 sensor and transmitter had an ID code (for the G5, only the transmitter had an ID code). For the G5 app, I can’t remember if the picture was automatically taken or not. I just remember that I got to a point where I could not move on, either because I couldn’t get it to take a picture or because there was a button I needed to press to continue that wasn’t accessible with VoiceOver. The only way I got it to work was to exit out and enter the transmitter ID manually.
If entering the ID is still inaccessible and it’s something that needs to be done every 10 days in addition to every three months, that could up the annoyance factor hugely. Hopefully Dexcom has made this part fully accessible. If one can tactually locate the QR codes on the sensor and transmitter that need scanning, and the picture is automatically taken when the code comes into the frame, and there are no accessibility traps where one can’t continue, then that would be great!
With xDrip+, you don’t need to access your phone to hear your blood sugar. You simply set it to tell you every 5 minutes, every 30 minutes, or not at all, and it will just tell you. Of course, Siri tells you when you ask it. Both sound great.
Having an option to announce your blood sugar at timed intervals or on request is handy, especially if doing something like exercising or driving (which, obviously, anyone who is blind can’t do). But there are many situations where I just would not want a timed announcement or any type of announcement at all… like while doing any number of things at work (a meeting, workshop, consulting), while watching a movie, while at a social event, or while on public transit… So options are good, but screen reader access is always a necessity (e.g., if an app only had Siri-type access, I would never use nor recommend it, because that is such extremely limited functionality if it’s the only feature that can be used, and it needs to be set up by a sighted person). I have never used xDrip for Android, but xDrip for iPhone is quite accessible with VoiceOver; I tried SugarMate once and have a sense that perhaps it was not great for accessibility, or there was some other reason I gave up on it.
Hello all, I want to thank each and every one of you for responding to my question. I went to my doctor. I am now officially on the decks calm and so far so good. I inserted it myself, and in 10 days will do it again. I especially want to thank Jen. It’s good to get the perspective of somebody else who is in a similar position to me. Thank you all again, and I will continue to update you all on how things are going. If I run into any issues I will definitely let you know.
Love the autocorrect of Dexcom into “decks calm”; it took me a second, but I finally got it.
And more interesting, I think the title “descomunal” is another such translation.
That was very definitely my imagination. I just tried it (a wardrobe malfunction removed my G6 sensor a day early). So far as I can see, it is fundamentally impossible to capture the sensor code unless I can see the phone screen.
Of course I am sighted, and therefore severely disabled with my eyes closed, but all the same there seems to be a fundamental UI bug here: after pressing the “Take Photo” button for entering the sensor code (I am assuming this is actually accessible via VoiceOver), the camera focus and exposure are fixed. This means that it is extraordinarily unlikely that it will be possible to grab the QR code; the sensor has to be in exactly the right position before the “Take Photo” button is pressed.
For a sighted user it is easy to see this and to “fix” it by pressing the focus box the UI presents for the QR code when the sensor can be seen to be in approximately the correct position; this redoes the focus and exposure. But with my eyes closed I couldn’t possibly see what was going wrong.
I rate this as a total disaster area. I’m very interested in what you say, @jworthin, 10 days down the road when you try to get the code off a new sensor. I’m also interested in whether there is any way for a braille screen reader to give any information about take-a-photo screens like this; I guess that’s a question for @Jen. There are fixed UI items (a dotted sensor outline and the QR code grab box), but I don’t think those are going to appear in VoiceOver, and even if they did, could that possibly help?
Yes, this pretty much matches my experience trying to take a picture with the G5 app. I have some vision and still couldn’t figure out what was going wrong.
The dotted lines and stuff on the Dexcom app will not help VoiceOver at all. However, Apple’s default Camera app provides spoken feedback about what is in the viewfinder and the options available to be adjusted, and can be used by someone relying on VoiceOver. There are also many apps designed specifically for users who are blind or visually impaired that rely on image capture (e.g., KNFB Reader, Seeing AI, Voice Dream Scanner, many others) that provide spoken feedback about whether the item of interest is in frame or not and what adjustments need to be made (e.g., “right side of page is not visible,” so then you know to move the camera to the right).
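Apps like those compute that kind of guidance from where the detected item sits in the camera frame. As a rough illustration of the idea only (purely hypothetical code, not taken from Seeing AI, KNFB Reader, or the Dexcom app), the core logic can be as simple as comparing the detected code’s bounding box against the frame:

```python
# Toy sketch of the framing guidance an accessible scanner could speak,
# computed from a detected QR code's bounding box within the camera frame.
# Entirely hypothetical; not from any real app.

def framing_hint(frame_w, frame_h, box):
    """box = (x, y, w, h) of the detected code inside the camera frame.
    Returns a short spoken hint, or None when the code is well framed."""
    x, y, w, h = box
    cx, cy = x + w / 2, y + h / 2          # centre of the detected code
    hints = []
    if cx < frame_w * 0.3:                 # code sits in the left third,
        hints.append("move camera left")   # so pan left to centre it
    elif cx > frame_w * 0.7:
        hints.append("move camera right")
    if cy < frame_h * 0.3:
        hints.append("tilt camera up")
    elif cy > frame_h * 0.7:
        hints.append("tilt camera down")
    if w < frame_w * 0.2:                  # code too small in the frame
        hints.append("move closer")
    return ", ".join(hints) if hints else None

# A code that is centred and large enough needs no correction:
print(framing_hint(1000, 1000, (400, 400, 250, 250)))  # None
```

Each hint would then be handed to the screen reader as a spoken announcement, repeated as the camera moves, until the code is framed well enough to capture automatically.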
So this section could definitely be made fully accessible if Dexcom decided it was a priority. Convincing companies that accessibility is a priority is often an uphill battle. It would still be worth submitting feedback to them about this, because otherwise they may not even know the problem exists. (I try to always submit feedback when I find accessibility issues, even though in the vast majority of cases they sadly never get fixed…)
Also, just to clarify, the screen reader on iOS/iPadOS is VoiceOver. That screen reader can speak content aloud or send content to a braille display, but it’s all being done through the same screen reader. Braille displays are just like computer monitors: they will only display information sent to them. So the information VoiceOver is able to present via speech and braille would be basically identical, only the medium (audio or tactile) would be different.
I believe so too; speech-to-text can come up with some strange translations. Could “Descomunal” actually be a poor rendering of the word “Dexcom”? If anyone agrees with me, perhaps someone can edit the title of this post.