How Should ST Architecture Handle Composite Devices?

Personally, I think this may be part of a 10-year-old issue with SmartThings: the architecture has just never effectively dealt with the concept of endpoints. There have been multiple stumbles over the years because devices certified under an independent third-party standard, which worked fine with other controllers, didn’t work with SmartThings because SmartThings didn’t handle multi-endpoint devices correctly. SmartThings also typically picks an arbitrary maximum number of supported endpoints that has nothing to do with the actual maximum in the independent specification.

Whether it’s something very simple like a light with its own built-in motion sensor, a thermostat with a built-in temperature sensor, or even just a triple-gang light switch, or something a little more complicated like a ceiling fan with a dimmable light or a true multisensor with four or five different types of sensors included, SmartThings has simply failed to recognize that different endpoints might be different device classes, or even endpoints with the same device class but different settings. They did it with Z-Wave, they did it with Zigbee, they did it with Wi-Fi, and now they’re doing it again with Matter.

As an engineer, it’s annoying to see this same pattern being repeated over and over. :thinking:

6 Likes

Up to a point. Matter drivers, in particular, seem to already be as modular as they want them to be, and could be more so if they were exploiting more library code. Default handlers don’t seem to be the in thing yet for Matter.

I just feel there is something less than ideal about the way fingerprinting and device profiles work. Having to ship implicit profiles for every required combination of capabilities in the driver packages just seems nonsensical.

Rather than a driver-specific fingerprint file matching a fingerprint to an implicit profile name, I’d like to see a global fingerprint database matching a fingerprint to an actual real device profile, with the ability to override the profile ID as well as the driver at the device level.

In reality, though, the controller platform will have resource limits and design constraints, and then there is the problem of presenting an arbitrary device in a UI in a useful fashion. So there quickly comes a point where the manufacturer needs to manage expectations and get more involved with the platform.

1 Like

Yes, while the driver might have all the handlers necessary for a composite device, you still have to have a statically defined profile that has the correct capabilities defined for that composite device. And unfortunately, the device profile also plays a role in the presentation for the mobile app (which I suspect is the larger issue).

Currently for Matter, there are several ways a fingerprint can match for a device:

  • a specific vendorId

    - id: "Eve/Energy/US"
      deviceLabel: Eve Energy
      vendorId: 0x130A
      deviceProfileName: power-energy-powerConsumption

  • a specific vendorId and productId

    #ThirdReality
    - id: "ThirdReality/WiFi"
      deviceLabel: THIRDREALITY Night Light
      vendorId: 0x1407
      productId: 0x1088
      deviceProfileName: light-color-level-illuminance-motion-1000K-15000K

  • generic profiles based on one or more deviceTypes from the endpoints of the device

    - id: "matter/colorTemperature/light"
      deviceLabel: Matter Color Temperature Light
      deviceTypes:
        - id: 0x010C # Color Temperature Light
      deviceProfileName: light-level-colorTemperature
    - id: "matter/colorTemperature/light/2"
      deviceLabel: Matter Color Temperature Light
      deviceTypes:
        - id: 0x0100 # OnOff Light
        - id: 0x0101 # Dimmable Light
        - id: 0x010C # Color Temperature Light
      deviceProfileName: light-level-colorTemperature

But, at the end of the day, all of those profiles are static and every time there is a different combination of capabilities, a new profile must be added to the driver.

Ideally, the driver should be able to identify the capabilities of the device by looking at the deviceTypes from the endpoints, which are already part of the information the driver receives from the device when it is commissioned:

[{"endpointId":0x00,"deviceTypes":[{"deviceTypeId":0x16}]},{"endpointId":0x01,"deviceTypes":[{"deviceTypeId":0x10D}]},{"endpointId":0x02,"deviceTypes":[{"deviceTypeId":0x106}]},{"endpointId":0x03,"deviceTypes":[{"deviceTypeId":0x107}]}]

Then it should dynamically assemble/determine which capability handler(s) are necessary to support the device, composite or otherwise, as well as provide information to the mobile app on how to present those capabilities. Until something like this happens, Matter on ST will suffer from all the ills we currently experience for Z-Wave, Zigbee, and Wi-Fi devices that rely on fingerprinting and static profiling to understand a device’s capabilities.
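Conceptually, something like this; a hedged Lua sketch where the DEVICE_TYPE_CAPABILITIES table and pick_capabilities helper are illustrative, not part of the current libraries, and the device type IDs are the ones from the dump above:

```lua
-- Illustrative mapping from Matter device types (per the Matter Device
-- Library) to SmartThings capabilities; not something the libraries do today.
local DEVICE_TYPE_CAPABILITIES = {
  [0x010D] = { "switch", "switchLevel", "colorControl", "colorTemperature" }, -- Extended Color Light
  [0x0106] = { "illuminanceMeasurement" },                                    -- Light Sensor
  [0x0107] = { "motionSensor" },                                              -- Occupancy Sensor
}

-- Walk the endpoints reported at commissioning and collect the capabilities
-- the device should expose, instead of matching a hard-coded profile name.
-- (0x0016, the Root Node on endpoint 0, simply has no entry in the table.)
local function pick_capabilities(endpoints)
  local caps, seen = {}, {}
  for _, ep in ipairs(endpoints) do
    for _, dt in ipairs(ep.deviceTypes) do
      for _, cap in ipairs(DEVICE_TYPE_CAPABILITIES[dt.deviceTypeId] or {}) do
        if not seen[cap] then
          seen[cap] = true
          caps[#caps + 1] = cap
        end
      end
    end
  end
  return caps
end
```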

Application Device Types for Matter 1.2 can be found here: Matter Device Library.

3 Likes

It really is a pity that Samsung has added this whole “device profile” UI layer to the SmartThings app. As everyone has said, it really defeats the purpose of Matter (and other standards) that explicitly declare a device’s capabilities when it is onboarded. It also forces companies to pay for WWST certification if they want to make it easy for their customers to have a pleasant experience using their device with SmartThings. Maybe we found the reason for its existence…

6 Likes

At least for Matter, I’d like to see the device profile be a dynamically generated construct created at the time of device commissioning. If that could be done, it might not require any changes to the mobile app, which I think is always an obstacle to getting certain platform changes implemented.

One other thing that should not be overlooked is that an additional function of fingerprints is to identify which driver should be selected even before the device profile is identified (since the fingerprints.yml file that maps fingerprint to device profile is associated with the driver). Revamping that would likely be a much larger undertaking that could require fundamental changes to the onboarding process.

It’s a shame that when the Edge architecture was being created there wasn’t a shift from device drivers loaded onto the hub to capability drivers loaded onto the hub. A device could then be associated with the capability drivers for the capabilities identified during onboarding. That would have been a more dynamic and modular approach than what we have now.

4 Likes

I believe the manufacturer should only get involved to provide a deeper integration, custom features / settings / presentations, etc.

Take the Third Reality night light: it does not use any custom UI features, has no custom settings, private clusters or anything. It just adds the motionSensor and illuminanceMeasurement capabilities and uses standard Matter clusters for occupancy and illuminance.

It feels like no device-specific support should have been necessary, let alone modifying the driver for lights to include support for sensors when there are already drivers that handle the sensors.

For devices like that, as others have said, a dynamically generated profile would be better. Has a light? Cool, load the driver for lights and add the capabilities used by each cluster. Has a motion sensor? Add the motionSensor capability and load the matter sensor driver to handle the events and the capability.

Most devices would be covered by that approach; the ones that aren’t, or that require a tighter integration to provide extra features, could still create custom drivers or go the WWST way, like the Eve Energy, which uses private clusters for power monitoring.

2 Likes

What I notice about the Matter drivers is that register_for_default_handlers doesn’t appear to be used, though it does seem to exist. So the drivers are full of code that is already in the libraries. It seems to me that really next to nothing extra will be needed; they just aren’t that far along yet. They are also probably hampered by hubs having been on different library versions.
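For comparison, this is roughly how the Zigbee drivers pull in the library defaults; a hedged sketch that assumes the Matter equivalent under st.matter.defaults exposes the same register_for_default_handlers(template, capabilities) helper (the exact module path and signature aren’t confirmed here):

```lua
local capabilities = require "st.capabilities"
local MatterDriver = require "st.matter.driver"
-- Assumed module path, mirroring st.zigbee.defaults; the Matter defaults live
-- under st/matter/defaults in the firmware libraries.
local defaults = require "st.matter.defaults"

local driver_template = {
  supported_capabilities = {
    capabilities.switch,
    capabilities.switchLevel,
    capabilities.colorTemperature,
  },
  -- No hand-written attribute/command handlers here: the idea is that the
  -- library defaults already cover the standard clusters for these capabilities.
}

-- Fill any handler slots left empty in the template with the library defaults
defaults.register_for_default_handlers(driver_template, driver_template.supported_capabilities)

local matter_light = MatterDriver("matter-light-defaults-sketch", driver_template)
matter_light:run()
```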

It has an occupancy sensor in Matter terms. So should that be a Motion Sensor or an Occupancy Sensor in SmartThings terms?

There can be a many-to-one mapping between SmartThings capabilities and protocol capabilities, and that is where I believe device profiles play an important role.

The Location owner should have the ultimate control over how to profile a device. They might want it to be considered as an occupancy sensor in one place and a motion sensor in another. They might also want to determine which elements of a device should be supported.

That’s not to say it shouldn’t be possible to automatically create a profile for a device. Just that your mileage may vary.

4 Likes

:exploding_head: You’re right! They do exist under st\matter\defaults and indeed the code is pretty much the same and modular, as it should be!

Right now, the same code for illuminance sensors is in three places: the default handler (which is not used), the matter-sensor driver, and the matter-switch driver to account for lights with illuminance sensors.

Default handlers will play an important role for composite devices in the future, as well as for type-independent features like power monitoring; otherwise it’s going to be a copy-paste feast resulting in bloated drivers, difficult maintenance and higher memory usage.

3 Likes

I think this is a really interesting discussion, but it had started to take over a thread from a developer asking for support on the integration of a specific composite Matter device. So to keep from flooding that topic, I have opened this new one for discussion of the ST architecture in general.

Carry on. :sunglasses:

5 Likes

It’s happening: the driver for Matter switches / plugs / lights, which initially focused on OnOff, Level, Color, etc. for lights and later had to add support for motion and illuminance sensors for the Third Reality night light, is going to absorb the whole matter-button driver for the Inovelli switches, handling all the typical button events like tap, double tap and hold. There’s a branch to combine the drivers.

I guess eventually the matter-switch driver will absorb more drivers and become “the matter driver”. That’s what I meant by “monolithic” in my original reply, before the posts were split. Driver and profile composition seems like a must-have feature for Matter.

1 Like

I wonder what the pros and cons are of multiple drivers versus a single ‘bigger’ driver.

2 Likes

A bigger driver is harder to maintain, loads more unneeded stuff into memory, will need more updates, etc. There are no pros; it’s just forced by the SmartThings architecture, since the only way to handle a composite device is not by composing handlers but by merging all the handlers into one big driver.

Of course, none of that matters to the end user, but… here we love custom drivers, and a custom feature is now more difficult to include and maintain.

Let’s say you want to add an innocent function for Matter lights, like brighten / dim, which is a native Step command but is not exposed by SmartThings:

  • If composition of drivers and profiles were allowed for a device, you would just implement the handler for your custom capability, send the Matter command to increase or decrease brightness by a certain amount, and call it a day (see the sketch after this list). Then share it and people will enjoy it. Since it’s such a basic feature it will hardly break, so you won’t need further updates, and it will work with any device that has a light, present or future.
  • If the way forward is a “bigger” driver with no composition at all, to add the feature you have to modify and publish the whole Matter driver. Now it’s not a small handler, not even a handler for lights; it’s a driver handling lots of device types that happen to include a light. If you want it to keep working with any light you’ll have to keep updating the whole driver, even if your small piece of code that adds the feature has not changed.
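For illustration, a hedged sketch of what that small handler could look like with the current Edge Lua API; the custom capability and its brighten command are hypothetical, and the LevelControl Step binding is assumed to follow the usual generated command pattern:

```lua
local clusters = require "st.matter.clusters"

local STEP_SIZE = 25 -- out of 254, roughly a 10% brightness change

-- Handler for a hypothetical "brighten" command of a custom capability.
-- Assumes the generated LevelControl.Step command takes (device, endpoint_id,
-- StepMode, StepSize, TransitionTime, OptionsMask, OptionsOverride) like the
-- other generated cluster commands; check the library version you target.
local function handle_brighten(driver, device, cmd)
  -- assumes the driver registered a component-to-endpoint mapping, as the
  -- stock matter-switch driver does
  local endpoint_id = device:component_to_endpoint(cmd.component)
  device:send(clusters.LevelControl.server.commands.Step(
    device, endpoint_id,
    0,         -- StepMode: 0 = up, 1 = down (per the Matter spec)
    STEP_SIZE,
    0,         -- TransitionTime: 0 = as fast as possible
    0, 0))     -- OptionsMask / OptionsOverride
end
```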
2 Likes

Hi @mocelet

The default Zigbee, Z-Wave and, I imagine, Matter firmware libraries contain everything necessary for most devices to work, even composite ones with multiple endpoints, without having to add any custom code.

The problem is that you have to choose a profile to show in the app when the device is paired.

Many profiles could be chosen automatically by combining deviceIdentifiers, zigbeeProfile, and the clusters exposed by the device.

The problem is that sometimes manufacturers do not strictly follow the standard use of each cluster. For example, there are manufacturers that use cluster 0006 (On/Off) to make an open/close sensor.
Others use the occupancy cluster to detect movement instead of occupancy. What profile does the driver assign to it?
In these cases the default handlers do not work and the device needs a custom handler, with a subdriver as the best solution.

Other times a single cluster serves different capabilities: 0500 (IAS Zone) is used for open/close, motion, water leak, smoke, CO, tamper, … sensors.

Cluster 0300 (Color Control) is used for both color temperature and color. Which profile does the driver assign: only color temperature, only color, or both?

There are many more similar cases, also in Z-Wave with command classes like SensorMultilevel, for example, which sends information used by many different capabilities.

Something similar could be done to what the SmartThings page does when searching for a stock driver for a device: ask what capabilities your device has when you are going to pair it, and then assign the profile compatible with the exposed clusters and the user’s information.

On the other hand, large drivers are no longer such a problem: since the version 9 libraries you can use the lazy_load function for subdrivers. Only the subdrivers for the devices actually paired with the driver are loaded into hub memory, even though the driver supports hundreds of different devices.
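For reference, roughly the pattern the stock drivers use for that; a sketch from memory, so the exact lazy_load_sub_driver helper and the version module usage should be checked against the library version you build against:

```lua
local ZigbeeDriver = require "st.zigbee"

-- Load a subdriver lazily on API 9+ hubs, eagerly on older firmware.
local function lazy_load_if_possible(sub_driver_name)
  local version = require "version" -- assumed to be provided in the driver sandbox
  if version.api >= 9 then
    -- assumed v9 helper: returns a stub that only require()s the subdriver
    -- when one of its devices is actually paired with the driver
    return ZigbeeDriver.lazy_load_sub_driver(sub_driver_name)
  else
    return require(sub_driver_name)
  end
end

local driver_template = {
  supported_capabilities = { --[[ ... ]] },
  sub_drivers = {
    -- hypothetical subdriver module names
    lazy_load_if_possible("vendor-a"),
    lazy_load_if_possible("vendor-b"),
  },
}
```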

5 Likes

Hi Mariano, good points there. Having to choose profiles instead of creating them is indeed one of the main issues when it comes to composite devices.

Drivers like matter-sensor actually do that, but since profiles cannot be dynamically created they have a whole bunch of manually created profiles like “motion-battery”, “motion-contact”, “motion-illuminance”, “motion-illuminance-temperature”, “motion-contact-battery”, etc., and the driver builds the profile name string according to what the device supports.
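Roughly what that looks like in driver code; a simplified sketch, not the actual matter-sensor source, where device:try_update_metadata is the standard way a driver switches a device to another pre-defined profile:

```lua
-- Build a profile name such as "motion-illuminance-battery" from what the
-- device reports, then switch to that pre-defined profile.
-- (The supports_* flags would come from the endpoint/cluster information.)
local function update_profile(device, supports_illuminance, supports_temperature, supports_battery)
  local profile_name = "motion"
  if supports_illuminance then profile_name = profile_name .. "-illuminance" end
  if supports_temperature then profile_name = profile_name .. "-temperature" end
  if supports_battery then profile_name = profile_name .. "-battery" end
  -- This only works if a profile with exactly this name ships in the driver package
  device:try_update_metadata({ profile = profile_name })
end
```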

However, that requires creating every possible profile combination and, even then, a device like the Third Reality night light arrives and months have to pass for explicit support to be added.

Even if a competitor launched a similar night light, with the same features, it would not work in SmartThings because the profile is coupled to the product ID instead of offering generic support for a composite device like that. Unless the manufacturer paid for the WWST certification to add the fingerprint, it might not get support at all.

2 Likes

I was just curious and tested the default handlers with a Matter light. While they almost work, there are bugs like not being able to change the colour temperature or the brightness not updating.

For instance:

[string “st/matter/defaults/switchLevel.lua”]:25: attempt to index a number value (field ‘value’)

because the code uses “ib.data.value.value” when the correct expression is “ib.data.value”.
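For what it’s worth, the fix looks tiny; a sketch of the corrected level attribute handler, modeled on the equivalent code in the stock matter-switch driver (the exact rounding may differ):

```lua
local capabilities = require "st.capabilities"

-- Corrected CurrentLevel attribute handler: ib.data.value is already the
-- number, so indexing it again (ib.data.value.value) fails.
local function level_attr_handler(driver, device, ib, response)
  if ib.data.value ~= nil then
    -- Matter CurrentLevel is 0-254; the switchLevel capability expects 0-100
    local level = math.floor((ib.data.value / 254.0 * 100) + 0.5)
    device:emit_event_for_endpoint(ib.endpoint_id, capabilities.switchLevel.level(level))
  end
end
```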

That’s why they don’t use them, and it will probably stay that way for a long time if they want to keep compatibility with the current v9 libraries, just as they’ve defined Matter 1.2 clusters directly in the drivers because those won’t be available in the API until v10.

1 Like

Yet another example of the composite device debacle…sigh

3 Likes