From the research I’ve done, I cannot find a single sensor that is reliable. Most seem to be ±2 °C (which is really huge) or more. I don’t want to be setting a thermostat to 72 when it thinks it’s 70 but it’s actually 74.
The first one I bought is the Aeon Labs unit, and it’s way off: 2-3 degrees.
Additionally, why no calibration capability? If I could use a known accurate reading to calibrate in software, that would be wonderful.
Most mesh network temperature sensors are not intended for continual monitoring, but rather for interval sampling, and an accuracy variance of 2 degrees Fahrenheit is pretty typical.
There are WiFi temperature monitors intended for continuous monitoring in facility management, which are typically accurate to within 0.2 degrees Celsius (around a third of a degree Fahrenheit), but they also typically cost about $300.
There are also some high-accuracy Zigbee temperature sensors used in zone deployments, but again, $250 per device is typical.
After using the Fibaro for weeks now, I think it’s the “closest” that you can get with easily available and affordable (???) sensors. But I am comparing its temperature against two CT100s, three ecobee remote sensors, and a non-functioning Lennox/Honeywell thermostat’s readings, none of which may be accurate by itself, but the readings are all very close to the Fibaro’s (±1). The Aeons are way, way off.
Yeah, my Aeon MultiSensor is +6 °F off… It’s pretty bad, but at least it’s repeatably off. I’m writing a thermostat app that allows you to enter a + or - offset for each temperature sensor you put into your equation.
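A per-sensor offset scheme like the one described could be sketched in a few lines of Python (this is just an illustration, not the actual app; the sensor names and offset values are made up):

```python
# Hypothetical per-sensor calibration offsets, in degrees F,
# determined by comparing each sensor to a trusted reference.
OFFSETS_F = {
    "aeon_multisensor_cellar": -6.0,  # this one reads ~6 F high
    "fibaro_living_room": 1.0,        # this one reads ~1 F low
}

def corrected_temp(sensor_id: str, raw_f: float) -> float:
    """Apply the sensor's known offset to a raw reading.

    Sensors without a recorded offset pass through unchanged.
    """
    return raw_f + OFFSETS_F.get(sensor_id, 0.0)

print(corrected_temp("aeon_multisensor_cellar", 76.0))  # -> 70.0
```

This only works because the error is repeatable; a fixed offset does nothing for a sensor whose error wanders.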
I just don’t understand why, knowing the problem is so prevalent, there isn’t a built-in calibration setting for these devices.
To JD’s comment, I know thermostats have accurate readings, that’s not really the point. The point is to have accurate supplemental sensors that can be used to fine tune temperatures in rooms where the thermostat isn’t present.
Frankly, I’d like to see companies like Nest create, say, a “Nest node” device that could be placed in other areas to supplement the main Nest with accurate temperature readings. I’d be fine with plugging the device into a constant power source as well.
I think the SmartSense temperature sensor has an offset you can set at the device level. Personally, I think it should be built into any analog value at the “Capability” level. I’m trying to see if I can get some help with that for my Aeon, but I’m having some trouble. It’s probably something I’m doing wrong.
Calibration wouldn’t help, though, if the error is random rather than a fixed offset; the readings would just be equally inaccurate around a new baseline.
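A quick simulation makes that distinction concrete: a fixed bias can be calibrated out, but random scatter just moves to a new baseline. (Minimal Python sketch; the bias and noise figures are invented for illustration.)

```python
import random

random.seed(42)  # reproducible illustration

TRUE_TEMP = 70.0   # degrees F, the "real" room temperature
BIAS = 6.0         # systematic error: an offset CAN remove this
NOISE = 2.0        # random error (+/- range): an offset CANNOT

readings = [TRUE_TEMP + BIAS + random.uniform(-NOISE, NOISE)
            for _ in range(1000)]

# "Calibrate" by estimating the bias against a trusted reference,
# then subtracting it from every reading.
estimated_bias = sum(readings) / len(readings) - TRUE_TEMP
calibrated = [r - estimated_bias for r in readings]

mean_err = sum(calibrated) / len(calibrated) - TRUE_TEMP
max_err = max(abs(r - TRUE_TEMP) for r in calibrated)
# The average error is now ~0, but individual readings still
# scatter by up to about +/- NOISE around the true temperature.
print(round(mean_err, 6), round(max_err, 2))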
There are wildly varying reviews for the kumostats (“wireless tag”) sensors. If you get one that works, you’re very happy. If shipping is delayed or the tag has a problem, customer service seems spotty at best. Cost is obviously excellent ($45 for midrange accuracy per tag, plus the $60 base station reader), size and weatherproofing are great. I looked into them as a presence sensor, but the customer service issues put me off.
The ecobee3 remote sensors are very accurate; I have been checking the temperatures on those. I haven’t actually used the ecobee3 itself though, as it is hooked to AC only and not heating. I reiterate, the readings are very, very close to the Fibaro’s.
I say “can.” Netatmo has a website showing Netatmo stations at various locations, and you would be surprised how far off they are from each other. I have put considerable effort into calibrating mine. Basically, you provide a list of measurements to Netatmo and they do some magic to calibrate the device. The first device was so far off that they were unable to calibrate it, and it had to be returned. The second worked quite well.
I then compared it to two different types of Zigbee sensors from a well-known vendor and graphed the results. Those results were not pretty.
My conclusion was that I could not use those devices for any application that requires reliable temperature measurements. Something coarse like “warn me when it is over 100 °F” would probably still work.
Most of the $50 Zigbee sensors available for interval monitoring should be accurate to within ±2 degrees Celsius, or around ±3.6 degrees Fahrenheit. That’s usually fine for things like “near freezing” or “above 80-85 degrees,” which is what many industrial and agricultural applications are looking for.
It’s just that you typically have to spend more like $250 to get a “high accuracy” temperature sensor that has an accuracy within 1/2 a degree Fahrenheit. They exist, but they’re not usually sold for home automation except as part of an actual thermostat.
So it all comes down to the use case.
I understand that for zone control of central/heating cooling you’d like more accuracy. From an engineering standpoint, “accurate to within ± 2 degrees” is still considered “reliable” if it hits its published range. It may not be the right tool for the job you want done, but that’s a different issue.
±2 degrees Celsius is completely unacceptable for a home, IMO. That’s ±3.6 °F, so two such sensors could disagree by as much as 7 degrees Fahrenheit. I can’t think of any scenario where that would be acceptable for adjusting the temperature of a home.
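For anyone checking the arithmetic: a temperature *span* (as opposed to an absolute reading) converts from Celsius to Fahrenheit by a factor of 9/5. A one-liner makes the numbers explicit:

```python
def c_span_to_f(delta_c: float) -> float:
    """Convert a temperature difference (span) from Celsius to Fahrenheit."""
    return delta_c * 9 / 5

print(c_span_to_f(2))  # max error of one +/-2 C reading -> 3.6 F
print(c_span_to_f(4))  # two sensors at opposite extremes  -> 7.2 F
```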
I also disagree that it is considered reliable. By whom? The manufacturers? Engineering is an exact science; if you are that far off, your engineering is faulty. If you are going to sell a $50+ sensor and claim it’s usable in a home for measuring temperature, it should be MUCH more accurate, IMO; otherwise it’s completely useless.
They should have a label: “not suitable for home temperature adjustment.”
I agree it’s not appropriate for central thermostat adjustment, but each person’s priorities are different. For some, a range of 2 degrees Fahrenheit is fine. That’s why it’s important to review the published specs before purchase to make sure a device fits your particular use case.
Even a 5 degree range is still very usable for monitoring a greenhouse, an outdoor shed, even a garage in many cases. It’s also more than adequate for using small heaters to prevent pipe freeze in unoccupied vacation homes without having to turn on the central heat.
I was an engineer, and “reliability” has a very specific meaning in engineering: it means “performs consistently to the published specifications.” If they publish ±2 degrees Celsius and they perform within that range, then that’s a reliable device from an engineering standpoint.
I believe the term you are looking for is “accurate.” Not “reliable.”
If you want something that performs reliably and with more accuracy, say to ±0.2 degrees Celsius, you can get that, but you do have to spend more money. That’s typical.
I agree that a device with an accuracy worse than ±1 degree Fahrenheit shouldn’t be used to control a central HVAC unit. But those aren’t the only home automation uses for a temperature sensor.
On a related note: what sensors have people been using that are “reliable” in the sense that they don’t produce bogus readings occasionally?
My Aeon Multi has this issue: once every few weeks I’m notified that the temperature in the cellar is too high, and the value is obviously due to a bit error somewhere in the device/stack. Last time it was 598 degrees! I can live with ±2 degrees, but occasional random bogus values are harder to accept.
@georgeh Even at the risk of getting dinged on the head by several friends in the community: Aeon Multis are simply horrible. First-hand experience. Fibaros are better, but I am doubting them a little these days, with motion sensors getting stuck pretty often, and, well, that may be because of the present state of the platform. Now I risk getting dinged by ST.
Update: I’ve installed a Fibaro Multi in the room (along with the Aeon). For the first few weeks it seemed great. Now I’m again getting occasional bogus high-temperature alerts from the Fibaro, so again it is not sufficiently “reliable” to satisfy my use case. At least I now have two sensors to cross-check.
So I guess what I need is a “smarter” extreme-temperature notification app: either check that both sensors read too high before notifying me, or filter out “obviously invalid” temperature readings by using multiple readings before deciding there’s a real problem.
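Both ideas can be combined: drop physically implausible spikes, take the median of each sensor’s recent readings, and alert only when every sensor agrees. A rough Python sketch (the thresholds and sensor names are illustrative, not from any real app):

```python
from statistics import median

PLAUSIBLE_F = (-40.0, 140.0)  # anything outside is a glitch (e.g. 598 F)
ALERT_ABOVE_F = 85.0          # hypothetical "cellar too hot" threshold

def plausible(reading: float) -> bool:
    lo, hi = PLAUSIBLE_F
    return lo <= reading <= hi

def too_hot(recent_by_sensor: dict[str, list[float]]) -> bool:
    """Alert only if every sensor's median of recent plausible
    readings exceeds the threshold."""
    medians = []
    for readings in recent_by_sensor.values():
        good = [r for r in readings if plausible(r)]
        if good:
            medians.append(median(good))
    return bool(medians) and all(m > ALERT_ABOVE_F for m in medians)

# A single bit-error spike no longer triggers a false alert:
print(too_hot({"aeon": [68.0, 598.0, 69.0],
               "fibaro": [68.5, 69.0, 69.5]}))  # -> False
```

The median is the key piece: one bogus sample in a window of three or five readings never becomes the reported value, so the glitch is absorbed even before the cross-sensor check.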
What I don’t understand is why, knowing how inaccurate pretty much every sensor out there is, SmartThings doesn’t provide a calibration setting/offset in software. I’m pretty sure that, given the typical range of temperatures in a home, an offset calibrating a given sensor would remain accurate across that range.