I’ve noticed a difference between the two… The old motion sensors trigger within < 1 second. The new ones, at least the two (now 3) I have tested, trigger between 2.5 and 3.5 seconds. Is there something that I can do to make them act like the old ones? I got 5 of them from ST a couple of weeks ago. For my use cases they are unusable.
1. Turn the lights on when I start to walk up the stairs. I am 3/4 of the way up before they trigger.
2. Turn the lights on when I walk into a room. I’ve tripped over something halfway into the room before they turn on.
You’d think they would have compared this new version to the Iris as a benchmark, or anything else for that matter. IMHO the Iris is easily the best motion sensor I’ve used. Even my crappy GoControl motion sensors aren’t that slow.
I assume they are using the same DTH? Unless there’s some config snippet that can be sent to adjust the delay, which would have to come from ST, I’d say you’re on the path to an RMA.
They are using the same DH. That is the first thing I looked at; I should have mentioned that. There also haven’t been any changes to the code recently that would impact this, only a change related to Groovy 2.4.4, which in theory should also impact the old ones…
Well, support chat / phone just shut down. I knew getting these things from ST and not Amazon was going to come back and bite me; there’s no clear path to an RMA. Amazon is much easier.
I wonder if the delay is to address the complaints about false alarms associated with the sensitivity of the incumbent models? I certainly would prefer the delayed ones if it reduces the number of false alarms I have when I leave SHM on. Maybe get a combination of them, some for lighting, some for security. (Obviously assuming this delay is indeed intentional and not just a slower device.)
Update, support has asked me how I tested them. Here was my response…
I tested two sensors, not part of any automation other than the same lighting automation: when motion is detected, turn the light on; when motion stops, turn the light off. For my testing I did not have them mounted on a wall or ceiling. See my scenarios below:
I tested several scenarios with the same result each time.
A - Pick each one up off my desk while face down and time how long until the light turns on.
B - Put each one in an 8 x 8 box with a lid, take the lid off, and time how long it took for the light to come on.
C - Put each one on the end of my desk, facing away from me. Moved my hand in front of the sensor, timed how long it took each one to turn on the lights.
In each scenario, the light with the old-style sensor would come on in less than 1 second. The new-style sensor took 2.5 seconds or more each time.
I should note that I first noticed the problem when I mounted one to use in a real automation. In an attempt to see why it wasn’t performing as expected, I ran tests A-C.
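For anyone who wants to reproduce the test, the lighting automation above is just the standard motion-to-switch pattern. A minimal sketch in classic SmartThings Groovy (the app name and `motionHandler` are mine, not anything from ST; the `subscribe` and capability names are the standard classic API):

```groovy
definition(
    name: "Motion Light Latency Test",
    namespace: "example",
    author: "forum user",
    description: "Turn a light on with motion, off when motion stops",
    category: "Convenience")

preferences {
    section("Devices") {
        input "motionSensor", "capability.motionSensor", title: "Motion sensor"
        input "light", "capability.switch", title: "Light"
    }
}

def installed() { initialize() }
def updated() { unsubscribe(); initialize() }

def initialize() {
    // Fires on every change of the sensor's "motion" attribute
    subscribe(motionSensor, "motion", motionHandler)
}

def motionHandler(evt) {
    if (evt.value == "active") {
        light.on()
    } else if (evt.value == "inactive") {
        light.off()
    }
}
```

Since both sensor generations go through the same DH and the same subscription path, any extra 2+ seconds between physical motion and the "active" event points at the device itself (or its reporting config), not the automation.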
After months of my motion sensors working fine (though my multipurpose and arrival sensors definitely were not working fine), we just had a false alarm last night due to a motion sensor.
It picked up motion (though there was absolutely none) and triggered our SHM alarm. No pets, no air motion in the vicinity of the sensor, and securely mounted to the wall. Absolutely no chance there was any motion. Yet our alarm sounded.
Seeing the exact same thing here with the new 2016 sensors vs. the 2015 motion sensors. Along with the much-delayed response time, I also don’t see the same range with the new sensors.
Have two side by side in a lighting automation, and the 2015 sensor detects motion much more quickly and reliably than the 2016. Oftentimes the 2016 module never detects motion at all, even after the 2015 sensor registered it seconds before.
It is so bad that as I was deploying the first couple of 2016 modules, I thought I was having issues with the Zigbee connections in that portion of the house. Tried two different sensors and then finally moved a 2015 sensor to the same location and verified the issue.
That sent me to the forums, where I ran across this post. Will be contacting SmartThings support shortly, so the answer that no one else is reporting the issue shouldn’t apply much longer. Looking like the 12 I just received will need to go back unless they can address the issue.