ST App 1.6.67 VoiceOver Fails

The lightbulb came on for me today as to why the SmartThings app is so difficult to use with a voice reader: it doesn’t follow the most basic UI design principles for voice use.

Let’s say there are two actionable elements, like two device tiles on a screen.

What I expect to hear is:

  1. Name of the actionable element

  2. Current status information

  3. The type of element it is, like “header,” “button,” “switch button,” etc.
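For contrast, here is a minimal SwiftUI sketch of a tile annotated so VoiceOver announces things in exactly that order: unique name, then current status, then element type. This is purely illustrative, not SmartThings’ actual code; the device name is hypothetical.

```swift
import SwiftUI

// Hypothetical device tile. VoiceOver reads: the label ("Bedroom Lamp"),
// then the value ("Off"), then the trait ("Button") - name, status, type.
struct DeviceTile: View {
    let name: String              // e.g. "Bedroom Lamp" - a unique name
    @State private var isOn = false

    var body: some View {
        Button(action: { isOn.toggle() }) {
            Text(name)
        }
        .accessibilityLabel(name)                 // 1. name of the element
        .accessibilityValue(isOn ? "On" : "Off")  // 2. current status
        // 3. the "Button" type is announced automatically from the trait
    }
}
```

The key point is that the status lives in the accessibility *value*, not the label, so the reader always speaks it between the name and the element type.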

But the SmartThings app frequently puts two actionable elements into a single tile. That’s visually clear, but really confusing when you are listening, particularly since one of the elements is often just given its generic name, like “action power on” or “scene.”

Also in some cases you hear the type of element before you hear the current status information, which is also confusing.

Here is an example. There is a tile for the Bedtime scene. Apparently there are two actionable elements: the first one lets you edit the scene, the second one lets you run it. I don’t know how clear that is to someone just looking at it, but it’s totally confusing using a screen reader. What you hear is:

“Bedtime. Button.”
“Scene. Button.”

A person using a voice reader has no idea that that “scene” is the Bedtime scene. Nor that choosing “Bedtime” takes you to an edit screen and doesn’t actually run the scene. Nor that this particular “scene” is in any way different from the other seven identical “scene” buttons on the same page. :scream:

Here’s the image.


In some ways it’s even more confusing for devices. That’s because the devices have an “action power on” generic control, but you can also tap the name. So this is what you hear:

“Atomic. off. Button.”
“Action Power on. Button.”

Someone using a screen reader has no way of knowing that that action button goes with the Atomic device. In fact, they have no way of knowing that the device is actually off, because part of the button’s name is “power on.” Seriously, that’s the generic name that is used a zillion times on the page.

Here’s the image:

It just seems really clear that they designed the UI for sighted people and then ran some auto tool to add VoiceOver controls. But they didn’t design it for use by someone with a screen reader, and they definitely didn’t test it for accessibility.


In contrast, let’s take a look at the Hue app. It’s by no means a perfect VoiceOver implementation. But it’s a solid B, and miles better than SmartThings.

Here’s a very similar kind of tile with two actionable elements. If you select just the name, it opens up the detail screen. Or you can use the switch button as a quick control. Here’s the picture.

What makes this better? The unique name and the status are repeated for the switch button. So what you hear is:

“Family Room. All lights on.
Family Room. All lights on. Switch button.”

And if you select the switch button, you will get a status update to “all lights off.”

Like I said, it’s not perfect. You do have to do a second gesture to realize that you could tap the first element. But at least you aren’t confused by the second element, and you get all the information you need to decide whether you want to activate it, as well as the status update if you do. And every element on the page has a unique spoken name.
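The Hue-style pattern could be sketched in SwiftUI roughly like this (purely illustrative; the names and layout are assumptions, not Hue’s actual code). The point is that both actionable elements carry the room name and the status, so nothing on the page is announced as a bare “Scene” or “Button”:

```swift
import SwiftUI

// Hypothetical two-element tile: the name opens the detail screen,
// the toggle is the quick control. Both repeat the unique room name
// and the current status, so each gets a distinct spoken announcement.
struct RoomTile: View {
    let room = "Family Room"
    @State private var lightsOn = true

    var status: String { lightsOn ? "All lights on" : "All lights off" }

    var body: some View {
        HStack {
            // Element 1: "Family Room. All lights on. Button."
            Button(room) { /* open the detail screen */ }
                .accessibilityLabel(room)
                .accessibilityValue(status)

            // Element 2: "Family Room. All lights on. Switch button."
            Toggle("", isOn: $lightsOn)
                .accessibilityLabel(room)
                .accessibilityValue(status)
        }
    }
}
```

Because the value is bound to the toggle state, flipping the switch also produces the “all lights off” status update described above.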



Maybe lawyers would get their attention?

One of my former employers basically treated accessibility this way until an ADA lawsuit happened… :sunglasses:

Not my style for something like this when there are accessible alternatives available.

If I were required to use this app for work, I would definitely be filing an HR action.


Yeah, I know JD, you’re a nice guy. I just imagine if I had to rely on screen reader tech and it couldn’t work out of sheer laziness… Well, spitting nails wouldn’t cover the emotion quite well enough.


I don’t think it’s laziness. I think it’s being focused on features rather than solutions. Something we’ve seen Samsung do elsewhere. :wink:


@JDRoberts, let’s call it what it is: discrimination. If a person with a VoiceOver solution cannot use the app in as understandable and confident a way as a person with sight, that is discrimination.

Not technically, because the company has to do something extra to make it possible. So it’s “failure to accommodate” rather than actual discrimination.

That’s not illegal in the US for commercial products unless they are required for a job or education, in which case it is the responsibility of the employer/school to make things work.

The number of products in the world that I can’t use is a really long list. Like, you know, a fork. Or a zipper. Seriously, it is not the responsibility of every single product manufacturer to make things work for tetraplegics. And that even includes app writers. :wink:

However, when a company puts out a statement like this

They are making a promise that the current smartthings app definitely doesn’t keep. So they should be held accountable for that. They don’t get to have it both ways, take public credit for accessibility and then not do the work.


I cannot remember exactly when, but about 10 years ago there was a club in Budapest which offered free entrance to girls (of course, for boys there was an entrance fee). A boy took the club to court, arguing that this practice was discrimination. He won it, no question.

Releasing software to the public and not treating people equally is discrimination. If you advertise that you do treat them equally (the link you posted), but you still don’t provide equivalent means for everyone, that is still discrimination.

The EU is actually funding a lot of work to make things compliant for people with disabilities. Google supports people with disabilities by collecting and providing information on available parking places and entrances.

I also wanted to add that breaking widgets is one of the worst things you can do from an accessibility standpoint. And they seriously broke the widgets this time.

First of all, it’s really unclear how the widgets are supposed to work now. Sometimes they activate a scene, which is what they did before. Sometimes they just open the SmartThings app, which, since it is not voice-friendly, is tragic.

Also, it seems like sometimes it opens the app and activates the scene, and sometimes it doesn’t, but I can’t tell for sure since I can’t see very well.

In addition, it appears that they have suddenly dropped the number of allowable widgets from 8 to 4. That’s another major issue for many people who were relying on widgets.

Less than one year ago, Samsung announced new widget features with a flourish and an indication that they understood what widgets were for:

  • Add a widget – for additional simplicity, the new SmartThings scene widget, introduced in Autumn last year, allows users to quickly trigger scenes – like “bedtime” and “leaving the house” – without entering the SmartThings app.

Samsung’s SmartThings Just Got Smarter, with New App and Ecosystem Updates – Samsung Newsroom U.K.

Now that’s all gone. :disappointed_relieved:

Also, I haven’t checked for sure, but based on what some other people have written in the forum it sounds like you have to do some long press actions to assign scenes to widgets now, and that has so far not been working with the voice navigation. But I haven’t tried it myself.

Widgets are really helpful for many different kinds of accessibility challenges, from cognitive to hand function. And of course for those who use voice readers.

This is one I think could definitely be reported to the Samsung accessibility team, surely they have some standards for widget use. :thinking:

Edited to update: @Automated_House points out below that the change in widget behavior (but not the change in the number of scenes per widget) is due to some Apple changes for iOS 14. I’ve never run into that on anything else because I mostly use Siri shortcuts, which can be added like a widget to the home screen. So the accessibility answer at this point should be to add Shortcuts functionality to SmartThings, and then we would have the equivalent of the old widgets. See more discussion below.


This could come into play as SmartThings inks more and more multi-tenant deals and offers IoT as an appliance preventive-maintenance service. That means there are tenants and service employees who have to use the app for their life/job. If Samsung HQ were paying attention, they would get this sorted out ASAP, before someone who requires VoiceOver needs the SmartThings app to do their job.

This is Apple’s fault. The June 2021 app update moved from the “old” iOS widgets to the new iOS 14 widgets. They’re nice because they can be placed on the home screen instead of only the Today View, but one big limitation is that interacting with them requires launching the widget’s app. The old widgets let you take actions without opening the app; the new iOS 14 ones don’t.
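For anyone curious what that limitation looks like in code, here is a rough WidgetKit-era sketch (the URL scheme and names are hypothetical, not SmartThings’ actual implementation). The only interaction an iOS 14 widget view can offer is a deep link that launches its app:

```swift
import SwiftUI
import WidgetKit

// Hypothetical scene widget view. Under iOS 14 WidgetKit, a tap cannot
// run an action in place the way the old Today-extension widgets could;
// it can only deep-link into the app via widgetURL, and the app then
// has to handle the URL and run the scene itself.
struct SceneWidgetView: View {
    var body: some View {
        Text("Bedtime")
            .widgetURL(URL(string: "smartthings://scene/bedtime"))
    }
}
```

So the “open the app first” behavior people are seeing is baked into the widget framework itself, not a choice the app could avoid under iOS 14.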


Interesting. I haven’t run into this before because I don’t actually use actionable widgets for anything other than SmartThings. For pretty much everything else I use Siri shortcuts. But as far as I know, Samsung still hasn’t added Shortcuts support to the app, right?

I know I can use webhooks and IFTTT, but not doing that.

So if the answer is “Apple has changed the functionality of widgets with iOS 14,” then the next sentence should be “but you can use Siri shortcuts if that helps.” Only, SmartThings doesn’t do that, I believe.

Also, it’s not Apple who cut the max number of scenes from 8 to 4, is it?


Correct. It would be nice if they added it, for scenes at a minimum.

Nope, that was a SmartThings design choice. The bottom row of the widget is fixed and corresponds to the 4 tabs at the bottom of the app. Only the top row can be filled with scenes and only 4 scenes will fit.


I asked Mark what he does, since he doesn’t use a voice reader but he does use an Apple Watch, and he says that you can add a shortcut button to the home screen or to the watch, so essentially it’s a super-powered widget. Which is cool, I didn’t even know that. Here’s one from Switchbot:

Yeah, apparently Apple has given itself some secret code that allows Shortcuts to be run from the widget without launching the Shortcuts app.


The Android Auto and Android setups have similar dysfunctions. Sometimes widgets fire; sometimes you have to enter the app first (probably for authentication) before the widget starts working. In AA, the location-based pop-up scene has similar issues. I wouldn’t bet that this is only iOS.

And seeing how much interest there is in the subject from SmartThings, I wouldn’t have high hopes that it will ever be fixed, unless, as @Automated_House said, one of the employers takes a stab at this circus of Samsung’s.

As the level of commitment is that low, I cannot really imagine how the Matter implementation will work. With so many things the platform lacks and hasn’t fully developed/documented, I foresee that it will be disastrous. Can you imagine the new protocol and “Certified by SmartThings”? They will tell the other partners to build their own integration if they want to be part of the ecosystem. (Not to mention the hardware requirements… who is going to do that, Aeotec?)

I just cannot foresee anything good coming out of this soon. But you can share how much you like your new TV from the app; that’s far more important.

Note that the article is on how to add a Shortcut to the home screen. The secret of how to run Shortcuts from the Shortcuts widget is under “Run Shortcuts from a Widget.” Whether the Shortcuts app opens or not depends on what is in the particular shortcut. The cases that require the Shortcuts app to open are documented here.

I learned a couple of things digging up those restrictions - thanks, @Automated_House!


I had to use the app yesterday to change some Automations, because my wife broke one of the roller blinds. To be honest, it is confusing even when you can see things. I was adjusting the Automations at night, so I didn’t want to open the blinds, and I had no clue where I should tap or long-press to edit an Automation rather than trigger it.

This UI just socks (!sucks! bloody autocorrect on a Samsung phone… Hahaha). It has the same logic as the voice assistant integrations. All in…
