Best way to collect logs from the hub?

Mine is still sending logs off to my Splunk with no issues here. Sorry to hear that you’re having issues; let me know what I can do to help figure this out.


Thank you Jason; I was able to resolve it after power cycling both my router and ST Hub.


This is great. I can verify this code works today in SmartThings without modification (phew!). I wanted to share some of the gotchas I faced, probably due to changes in the Elastic Stack (I’m running 5.5.0) rather than the ST platform itself.

This is still a work in progress. Here’s my current logstash config:

# Open TCP port 1234 and listen for data
input {
  tcp {
    port => 1234
    type => "smartthings"
  }
}

filter {
  if [type] == "smartthings" {
    json {
      source => "message"
    }

    # Only want likely valid events (JSON), so drop all others (like HTTP headers)
    if "_jsonparsefailure" in [tags] {
      drop { }
    }

    # Fix bug where event variables are not replaced with real values in descriptionText field
    if "{{" in [descriptionText] {
      mutate {
        gsub => [
          "descriptionText", "{{ device.displayName }}", "%{displayName}",
          "descriptionText", "{{ value }}", "%{value}",
          "descriptionText", "{{ linkText }}", "%{displayName}"
        ]
        add_tag => [ "descriptionText_bug" ]
      }
    }

    # Replace the logstash received @timestamp with the actual event time.
    # The event dates are in UTC, so pin the timezone rather than relying on the host's.
    date {
      match => [
        "date",
        "E MMM dd HH:mm:ss 'UTC' yyyy"
      ]
      timezone => "UTC"
    }

  }
}

output {
  if [type] == "smartthings" {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "logstash-smartthings"
    }
  }
}

NOTES: I had issues specifying the json codec in my input: I would receive the header data (useless) but not the actual messages. Moving the JSON parsing to a filter fixed this. I’m also running the Elastic Stack (Elasticsearch, Logstash, Kibana) plus nginx as a front end to Kibana on a 2GB RAM, 1 vCPU Ubuntu 16 VM, in case anyone is curious how low you can go on specs :wink: Not sure I’d go lower than that, but I may try tuning further; I’ve been curious for a while how lightweight I can get Elastic.
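For reference, this is roughly the input I started with. The hub’s POST includes HTTP header lines that aren’t JSON, so with the codec applied at the input the real messages never made it through cleanly:

# What I tried first (didn't work for me): decode JSON at the input
input {
  tcp {
    port => 1234
    type => "smartthings"
    codec => json   # chokes on the HTTP header lines that arrive on the same connection
  }
}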

FIXED: I’ve fixed this in my logstash config above, but I’m including it here for reference. On a few of my devices, descriptionText is mangled: the variable placeholders are sent instead of the displayName and value, even though the event clearly knows those values because it includes them individually in the message. A logstash filter can detect and fix it, so it’s not a huge deal, but it’s weird. Almost all of my devices report fine; I’ll have to see if there’s a common theme among the ones that don’t. Here’s an example message:

{"date":"Wed Jul 12 18:09:22 UTC 2017","name":"temperature","displayName":"Garage Interior","device":"Garage Interior","deviceId":"","value":"82","isStateChange":"true","id":"","description":"temperature: 28.05","descriptionText":"{{ device.displayName }} was {{ value }}°F","installedSmartAppId":"null","isoDate":"2017-07-12T18:09:22.532Z","isDigital":"false","isPhysical":"false","location":"","locationId":"*","unit":"F","source":"DEVICE","program":"SmartThings"}

I’m not sure if you’re still doing this since it’s so long ago, but here are my comments anyway :slight_smile:

Elasticsearch will try to match datatypes to a basic degree, but it’s not perfect. Also, if your index is named ‘logstash-*’, logstash applies its own pre-defined template to your data, which is good for general data, but ideally you’d develop a mapping of your own and use that to define your datatypes; I haven’t done this myself yet. Another issue is that the field ‘value’ is not actually an integer: I’m getting both integers and strings coming across. What I’m planning to do initially is add a filter that inspects the ‘value’ field and populates one of two additional fields (‘value_int’, ‘value_str’) depending on what it is, as sketched below. I haven’t looked at the rest of the data to see what’s needed, but I’m most interested in visualizing that field, along with some of the booleans.
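Something like this is what I have in mind (untested sketch; I’ve converted to float rather than integer since readings like temperatures come across with decimals):

filter {
  if [type] == "smartthings" {
    # If "value" looks numeric, copy it into value_int and convert it;
    # otherwise keep it in a separate string field. Two mutate blocks because
    # add_field runs after convert within a single mutate.
    if [value] =~ /^-?[0-9]+(\.[0-9]+)?$/ {
      mutate { add_field => { "value_int" => "%{value}" } }
      mutate { convert => { "value_int" => "float" } }
    } else {
      mutate { add_field => { "value_str" => "%{value}" } }
    }
  }
}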

I’d recommend against splitting events into multiple indexes; Mapping Types would be the way to go (I think) and something I’ll look at eventually. Indexes are akin to databases, and types are kind of like tables. If you split, you’ll be heavily restricted trying to visualize your data in Kibana because it can only visualize one index pattern at a time (you can add visualizations from multiple indexes to a dashboard), so you’d have to use an index pattern of ‘*’ to create charts that combine multiple device types. If I do get to a custom mapping, wiring it in would look roughly like the output below.
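My understanding is that the elasticsearch output can push a hand-written template for you, along these lines (the template file path and name here are placeholders, and I haven’t actually done this yet):

output {
  if [type] == "smartthings" {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "logstash-smartthings"
      # Point at a custom template that defines the field datatypes
      template           => "/etc/logstash/templates/smartthings.json"
      template_name      => "smartthings"
      template_overwrite => true
    }
  }
}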

Thanks for the SmartApp; it works great, and the “local” option works as expected on my local Splunk instance without having to punch holes through the firewall.
A question I have: how do I poll the status and attributes of all devices periodically, for example the temperature of all sensors? So far, I’ve only been able to capture events for contact sensors and switches.
Thanks again.
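In case scheduled polling turns out to be the answer, this is the rough direction I had in mind (untested sketch; the temperatures input name and the sendToSplunk helper are my guesses at the app’s internals, not its actual code):

def initialize() {
    // Ask for a reading every 15 minutes instead of waiting for change events
    runEvery15Minutes(pollTemperatures)
}

def pollTemperatures() {
    temperatures?.each { dev ->
        def temp = dev.currentValue("temperature")
        if (temp != null) {
            // sendToSplunk is a stand-in for however the app actually ships events
            sendToSplunk([
                date: new Date().toString(),
                name: "temperature",
                displayName: dev.displayName,
                value: temp.toString(),
                source: "POLL"
            ])
        }
    }
}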

So my temp sensors log like this

{"date":"Mon Sep 11 13:09:09 UTC 2017","name":"temperature","displayName":"West Side Motion","device":"West Side Motion","deviceId":"5df62ea8-d840-4ce8-9afb-a9cf0f9f2259","value":"66.0","isStateChange":"true","id":"66357180-96f2-11e7-9d81-0a8594209a64","description":"zw device: 2A, command: 3105, payload: 01 2A 02 94","descriptionText":"West Side Motion temperature is 66.0°F","installedSmartAppId":"null","isoDate":"2017-09-11T13:09:09.887Z","isDigital":"false","isPhysical":"false","location":"Home","locationId":"652fa5ce-fb35-430e-ad5a-0aedfcf30dd2","unit":"F","source":"DEVICE",}

Where the value is the current temp that it’s reporting. All of my Aeon Labs multisensors and my thermostat log the temp like that. Let me know if that gets you to where you want to be :slight_smile:

Thanks, my temp sensors are a bit temperamental; they are the Samsung SmartThings motion sensor and the multipurpose sensor.
Sometimes it took a few hours to get a new reading. I noticed that forcing a “poll” and “refresh” command from WebCore might have increased the responsiveness; I will continue to observe.
Thanks for the great app again; local logging is really useful and increases security without punching firewall holes and doing NAT.


I have a few temperature and humidity sensors; the app captures temperature readings to Splunk locally without issues.
However, humidity readings aren’t captured.
From my quick look at the source code, it looks like a humidityHandler method has not been implemented.
I have added the following code, but it still doesn’t seem to work. Any assistance appreciated:

<snip>
// forward humidity events through the app's existing generic handler
def humidityHandler(evt) {
    genericHandler(evt)
}
<snip>
def doSubscriptions() {
    <snip>
    subscribe(humidities, "humidity", humidityHandler)
}
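For completeness, this assumes the app already defines a humidities device input. If it doesn’t, something along these lines is also needed in the preferences section (the title text is just my guess):

preferences {
    section("Log these humidity sensors:") {
        input "humidities", "capability.relativeHumidityMeasurement", title: "Humidity Sensors", multiple: true, required: false
    }
}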

Just FYI, the above code modification seems to have worked; it did take a few attempts enabling/disabling on the SmartThings Apps page to get it to take effect. Humidity values are now logged to the local Splunk!

I’ll update the code with this in the repo and push out the update. Thanks for this :slight_smile:

OK, I’ve updated the code on GitHub, but it’s not showing as an update in my SmartApps repo integration, so if you’re not seeing it there, please do a manual update from GitHub. I also approved a pull request with some spelling fixes that I hadn’t noticed earlier.

I just opened some issues on GitHub. Illuminance is not reporting, and I can’t seem to get the app to update using the GitHub integration in the ST API.

Also, batteries have NEVER reported, and our little hack that we used to get locks to report continues to bother me.

Any thoughts from the community on these topics would be appreciated :slight_smile:

Hey there @rlyons20, I’ll take a look. I too noticed the other day that lux isn’t reporting; I’ll see what I can figure out. I even looked at the ST docs about lux and compared them to what we have in the code, and as best as I can tell it’s the correct string. Thanks.

I am open to all suggestions here on how to get ST to log the current lux reading. I’m going to tag @Mike_Maxwell here and see if he has a good code example, because what I’ve tried so far just isn’t reporting back the lux readings on any of my devices. I also cleaned up the code by removing some unnecessary entries. Let me know if the locks and batteries are logging; if not, I’ll continue to plug away. I wasn’t logging battery values before, but I am now, so I won’t know until one of mine reports in.

The device input name doesn’t match the name supplied to subscribe

Hey there Mr. Maxwell, thanks for dropping in. I looked at the docs for the lux and as best as I can tell the code is correct. Can you perhaps give me a little nudge on what you mean? Thanks.
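To spell out the kind of mismatch being described: the identifier in the input declaration and the one passed to subscribe() have to be the same, otherwise the subscription never fires. A minimal sketch of a matching pair (the names here are illustrative, not necessarily what’s in the repo):

preferences {
    section("Log these lux sensors:") {
        input "illuminances", "capability.illuminanceMeasurement", title: "Illuminance Sensors", multiple: true, required: false
    }
}

def doSubscriptions() {
    // "illuminances" must match the input name above; "illuminance" is the
    // attribute defined by capability.illuminanceMeasurement
    subscribe(illuminances, "illuminance", illuminanceHandler)
}

def illuminanceHandler(evt) {
    genericHandler(evt)
}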

Can anyone help with easy instructions on getting bkeifer’s code to logstash?
I am specifically looking to understand what port and host are required in the input and output of the logstash config file. None of the samples given in the above posts work.
And ST live logging gives java.net.SocketException: Connection reset at line 164 (genericHandler). (I have port-forwarded the logstash port 9300.)

I can’t get this to post to Splunk. Running tcpdump, I don’t see any POSTs hitting the Splunk server. Grabbing the JSON from the debug logs, I can curl it to Splunk and see the events. I’ve tried using IP/hostname and gone over the code, but I can’t see any bugs. Using Splunk Light locally. Would love to get this working!

I’m a dummy; I missed where you said this only works over HTTP. I am loving this.


Anyone care to help?

I have installed Splunk and enabled the HTTP Event Collector (see pictures).

I generated a token with json as the sourcetype and set up the app with the local IP of the Splunk server and port 8088, but when I search for sourcetype="smartthings" I get nothing.

Any tips?