iOS 1.6.33 dropped. Seeing some huge speed improvements. Device handlers that never worked before are now working for me; the Aeotec HEMv1, for sure, has never worked until now. Also seeing better results on activity…
You can now make lighting groups as well, which is great for fixtures or rooms with multiple lights!
And somehow Nest cam feeds are showing up. Not sure if they’re live or snapshots, but none of my cams are shared or public, yet I see images in the app.
SHM UI is different too. Will play with it more on the train.
@JDRoberts, they changed the How to Use menu. Looks like it’s text now instead of pictures. Wonder if it’s better for you?
They added Security Mode Reminders, I believe. I don’t remember seeing them previously, or maybe I just missed them in earlier versions.
Could hardly be worse than the picture of text in the previous version, so probably.
But I haven’t checked it yet; I will today when John gets here.
Landscape mode? Or maybe that’s coming in 2.0…
Here’s the significant part of the update notes:
Lighting groups are a nice and often-requested feature. There is very minimal popcorn effect when using them, so I wonder if they implemented zigbee multi-broadcast (I think that’s what it’s called). The groups aren’t available to automations, though.
Here’s the new SmartThings Home Monitor card. Shall we still use the SHM acronym, or STHM now?
I think we should definitely start using STHM, particularly if they are not renaming the one in the V2 Classic app. Having two different names would actually help a lot.
Light Groups not being available to automations is… weird. But at least it’s a step forward.
In zigbee, the official term is “groupcast.” I have no idea what SmartThings calls it. Sometimes you’ll see the acronym UBG, which means unicast/broadcast/groupcast are all supported.
Interestingly, groupcast is one of those terms which was first used informally in the developer community, and then a couple of years later adopted officially by the standards body.
Zigbee has a really interesting mechanism for groups, quite different from most other protocols. The coordinator doesn’t know which devices are in which groups. The underlying mechanism is the regular broadcast (“tell everybody”) transmission, but with a group ID attached.
It is then up to the individual device to know which groups are relevant to it.
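If you like to think in code, here’s a toy sketch in Swift (purely illustrative, nothing like a real zigbee stack) of that division of labor: the coordinator just broadcasts a frame with a group ID attached, and each device checks its own local group table.

```swift
// Toy model of zigbee groupcast. Purely illustrative; real zigbee frames,
// APS group tables, and radio behavior are all far more involved.

struct GroupcastFrame {
    let groupID: UInt16   // e.g. 0x0001 = "living room lights"
    let command: String   // e.g. "on", "off"
}

final class Device {
    let name: String
    // Each device keeps its OWN group table; the coordinator never sees it.
    private var groupTable: Set<UInt16> = []

    init(name: String) { self.name = name }

    // The device itself decides which groups are relevant to it.
    func join(group: UInt16) { groupTable.insert(group) }

    // Every device on the network receives every broadcast...
    func receive(_ frame: GroupcastFrame) {
        // ...but only acts if the attached group ID is in its local table.
        guard groupTable.contains(frame.groupID) else { return }
        print("\(name): executing \"\(frame.command)\"")
    }
}

// The coordinator just "tells everybody"; it has no idea who is listening.
func broadcast(_ frame: GroupcastFrame, to network: [Device]) {
    network.forEach { $0.receive(frame) }
}

let livingRoom: UInt16 = 0x0001
let bulbA = Device(name: "Bulb A"); bulbA.join(group: livingRoom)
let bulbB = Device(name: "Bulb B"); bulbB.join(group: livingRoom)
let hall  = Device(name: "Hall bulb")  // not in the group, ignores the frame

broadcast(GroupcastFrame(groupID: livingRoom, command: "on"),
          to: [bulbA, bulbB, hall])
```

The tradeoff is that the coordinator stays simple and groups can be any size, at the cost of every device having to receive (and usually discard) broadcasts for groups it doesn’t belong to.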
It’s sort of a weird concept until you walk through a couple of examples. Here’s an old paper from 2010, but it happens to describe the technical aspects really well.
https://www.edn.com/Pdf/ViewPdf?contentItemId=4204960
The Popcorn Effect
Normally you would not see a popcorn effect if you were using zigbee groupcast, because the command goes out to the entire network at the same time. Sometimes there’s a little lag on a multi-endpoint device while that device figures out whether both endpoints are in the same group, but it should normally be quicker than the human eye can detect.
The popcorn effect is usually a giveaway that the system is not using groupcast, but rather issuing individual commands. That’s why, in the past, a group created in the Hue app would have no popcorn effect while a scene with the same devices in SmartThings would.
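To make the contrast concrete, here’s a rough Swift sketch. The 80 ms per-command latency is a made-up number, but it shows why individual commands stagger visibly while a single groupcast doesn’t.

```swift
// Made-up latency purely for illustration.
let perCommandLatencyMs = 80.0
let bulbCount = 6

// Individual commands: each bulb turns on one latency step after the last,
// which is the staggered "popcorn" your eyes can see.
for bulb in 1...bulbCount {
    print("Bulb \(bulb) turns on at ~\(Double(bulb) * perCommandLatencyMs) ms")
}

// Groupcast: one frame for everyone, so all bulbs act at roughly the same time.
print("All \(bulbCount) bulbs turn on together at ~\(perCommandLatencyMs) ms")
```

With six bulbs that’s roughly 400 ms between the first and last light on the individual-command path, which is exactly the kind of stagger you can see.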
I don’t like being so cynical about this app, but why is it so bad? What is taking so long to make it work properly?
Successfully moved over one of my classic routines now that the custom automation creator has the once-a-day feature.
And two years later VoiceOver navigation is still so broken that it is obvious that no one ever tested this blindfolded. Nor did they consult any accessibility experts.
First category: unclear on the concept.
Second category: somebody found an accessibility template. But again, never tested the results.
Many selections are not marked as selectable, making the navigation very confusing.
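For any developers reading along, here’s a hypothetical UIKit sketch (not the app’s actual code) of what “marked as selectable” means in practice: VoiceOver reads accessibilityTraits, so setting .selected is what makes it announce the state.

```swift
import UIKit

// Hypothetical example, not the SmartThings app's real code: how a row that
// acts as a selectable option should describe itself to VoiceOver.
final class ModeRow: UIView {
    func configure(title: String, isChosen: Bool) {
        // Expose this view as a single element to VoiceOver.
        isAccessibilityElement = true
        accessibilityLabel = title

        // .button tells VoiceOver the row can be activated; adding .selected
        // makes it announce "selected", so the user isn't left guessing.
        accessibilityTraits = isChosen ? [.button, .selected] : [.button]
    }
}
```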
There are lots of places where you get stuck on a page and there’s no way to figure out how to get off of it.
Not surprising. That was even happening a short while ago for non-accessibility access. If you navigated to a device it could not display for whatever reason (e.g., a custom handler), you got a white screen of death with no way to navigate back. That’s not even alpha quality.
If you are looking at the page and you have full use of your eyes and hands, you can navigate most of it using VoiceOver.
But of course, if you have full use of your eyes and hands, you’re not likely to be using VoiceOver!
If you were trying to navigate this with screen curtain turned on, you’d never make it.
And for those who want to experience what this is like:
1. Turn VoiceOver on.
2. Triple-tap anywhere on the screen with three fingers. This will turn screen curtain on but leave VoiceOver live.
3. Start using VoiceOver navigation:
https://www.applevis.com/guides/ios-voiceover/complete-list-ios-gestures-available-voiceover-users
4. Triple-tap with three fingers again to turn screen curtain off whenever you want to check where you are.
This will make it (ahem) blindingly obvious whether the navigation on a particular screen was designed around visual cues, which this app definitely was.