I’ve been using a SmartApp I wrote to subscribe to all of my devices and ship the events off to a Logstash server, using Kibana as the front-end. This gets me all of the historical data I want and the ability to make pretty pictures out of it as well.
The link is broken (404 error). Was it removed on purpose?
Nope, just broke during a repo re-org. The link above should work now.
Anyone know of a cloud version of the Logstash functionality? I’m hoping to inspire, or be inspired, to create a cloud logger for the community to catch recent poltergeist issues.
I’m open to other methods as well. I realize you could leave the logger open in ST, but it’s hard to keep that up for history (and it crashes a fair bit lately). You could probably also send debug output as events or as texts, but I feel like that might punish the platform and cause unnecessary load.
Some quick research:
https://www.loggly.com/
https://papertrailapp.com/
Will add more as time allows…
Has anyone written a SmartApp logger that talks to the SmartThings Web Services directly? It could interface with pretty much any server-side language or database of choice.
I use Initial State, but it’s data-oriented, not log-oriented. I believe Thinglayer is the same (I tried it out before too).
Maybe I’m missing a functionality?
Ohhhhh…
What logs are you trying to capture … the output of Live Logging, or the output of Events?
Not sure if there is any way to continuously stream Live Logging to a collection service. Perhaps some clever JavaScript could scrape the output.
As for Events, the two example services can (theoretically?) capture all Events, not just State changes and Value reporting. Message Events are just text values + “descriptionText”, after all.
I’ve seen a couple of cases where if I could go back a day in live logging, I’d be able to tell what happened.
Even if I had to put lines in code, it would be useful as a trace.
I’m using Logstash on one of my own servers to do this. In any SmartApp where I need to retain history, I add the following method and use it to log messages instead of log.debug():
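(The method itself isn’t reproduced above, so here’s a minimal sketch of what such a helper could look like, assuming a Logstash HTTP input like the ones shown later in this thread. The host, port, and field names are placeholders, not the author’s exact code.)
def logToLogstash(String msg) {
    // Hypothetical SmartApp helper (Groovy): ship a log line to a
    // Logstash "http" input instead of (or alongside) log.debug().
    def payload = [
        program: app.name,                  // which SmartApp produced the entry
        message: msg,
        date   : new Date().toString()
    ]
    try {
        // httpPostJson serializes the map body as JSON for us
        httpPostJson(uri: "http://logstash.example.com:9992", body: payload)
    } catch (e) {
        log.error "Could not ship log entry: ${e}"
    }
}
Then anywhere you’d normally call log.debug("something happened"), call logToLogstash("something happened") instead.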
I also wrote another SmartApp that subscribes to all of my things and logs their events to the same Logstash server for easier correlation with the log entries:
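(Again a sketch rather than the original code: a bare-bones forwarder, assuming the same hypothetical Logstash HTTP input. The device inputs and subscribed attributes are illustrative; subscribe to whatever your devices actually expose.)
preferences {
    section("Devices to log") {
        input "sensors", "capability.sensor", multiple: true, required: false
        input "switches", "capability.switch", multiple: true, required: false
    }
}

def installed() { initialize() }

def updated() {
    unsubscribe()
    initialize()
}

def initialize() {
    // One subscription per attribute of interest
    subscribe(sensors, "temperature", forwardEvent)
    subscribe(sensors, "motion", forwardEvent)
    subscribe(switches, "switch", forwardEvent)
}

def forwardEvent(evt) {
    // Mirror the fields seen in the JSON samples elsewhere in this thread
    def payload = [
        date         : evt.isoDate,
        name         : evt.name,
        value        : evt.value,
        displayName  : evt.displayName,
        deviceId     : evt.deviceId,
        isStateChange: evt.isStateChange()
    ]
    try {
        httpPostJson(uri: "http://logstash.example.com:9992", body: payload)
    } catch (e) {
        log.error "Could not ship event: ${e}"
    }
}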
Cloud version? I don’t want to run my own server. Been there before.
No idea. If I were looking for such a thing, I’d google “hosted logstash”. You’ll probably need some sort of front-end on it too. Kibana’s fairly popular, graylog2 is out there as well.
Continuing the discussion from Best way to collect logs from the hub?:
Using your SmartApp, I see it attempting to hit my server but I get the following error:
10:01:45 PM: error org.apache.http.conn.ConnectTimeoutException: Connect to 10.0.0.20:9992 [/10.0.0.20] failed: connect timed out @ line 163
10:01:20 PM: debug JSON: {"date":"Wed Dec 02 03:01:19 UTC 2015","name":"power","displayName":"Aeon Home Energy Meter","device":"Aeon Home Energy Meter","deviceId":"":"DEVICE"}
I am able to connect to my server with telnet and I see the data arrive, e.g.:
telnet 10.0.0.20 9992
testing 123
and the following appears in the log:
{"message":"testing 123\r","@version":"1","@timestamp":"2015-12-02T03:04:47.342Z","host":"172.17.42.1:43307","type":"syslog"}
my logstash.conf for the TCP section is as follows:
tcp {
  port => 9992
  type => syslog
}
Can you post your TCP section?
thanks
I realized that I was giving an internal IP address while the traffic comes from the cloud. I updated it with an external address and it seems to be going through.
I still get the following error:
java.net.SocketTimeoutException: Read timed out @ line 163
thanks
Getting the same thing. If I use the hostname of my Raspberry Pi (which is hosting this ELK stack), I get a ‘blacklisted’ error. Using the IP, I’m getting this one.
For the record, this is the logstash.conf I’m using:
> input {
>   syslog {
>     type => syslog
>     port => 9992
>     codec => "json"
>   }
> }
> filter {
>   json {
>     # parse JSON in "message" field,
>     # put resulting structure in "data" field
>     source => "message"
>   }
> }
> output {
>   stdout { }
>   elasticsearch { hosts => ["localhost:9200"] }
>   # elasticsearch {
>   #   cluster => "SmartThings"
>   # }
> }
Here’s what I’m using:
input {
  http {
    host => "x.x.x.x"
    port => "xxxxx"
    codec => "json"
  }
}
output {
  elasticsearch {
    host => "localhost"
    index => "st_events"
  }
}
Here is my logstash config file (this is working right now):
input {
  http {
    port => 9992
    codec => "json"
  }
}
filter {
  json {
    # parse JSON in "message" field,
    # put resulting structure in "data" field
    source => "message"
  }
}
output {
  stdout { }
  elasticsearch { hosts => ["localhost:9200"] }
}
I’m noticing that Elasticsearch and Kibana have no idea what the types of the logged fields are. Maybe I’m using it wrong, but my impression was that it would infer each field’s data type so the data is usable. It gets the dates well enough, but I’ve found it necessary to mutate the incoming data in logstash.conf before it will draw anything. I was also looking into multiple indices: right now everything gets lumped together, and ideally I’d like an st_events index broken down by device, then value and unit, hierarchically. Anyway, my logstash.conf filter section is below.
> filter {
>   mutate {
>     convert => {
>       "name"                => "string"
>       "displayName"         => "string"
>       "device"              => "string"
>       "deviceId"            => "string"
>       "value"               => "integer"
>       "isStateChange"       => "boolean"
>       "installedSmartAppId" => "string"
>       "isDigital"           => "boolean"
>       "isPhysical"          => "boolean"
>       "location"            => "string"
>       "locationId"          => "string"
>       "descriptionText"     => "string"
>       "description"         => "string"
>       "id"                  => "string"
>       "unit"                => "string"
>       "source"              => "string"
>     }
>   }
> }
Basically, I’m not interested in counts; I want useful graphs where multiple pieces of information can be layered on top of one another (e.g. correlating switches turning on with electricity usage).
Edit: I’ve tried the following, which routes each major event type to an index of its own. Any thoughts @btk?
> output {
>   stdout{}
>   if [name] == "power" {
>     elasticsearch {
>       hosts => ["localhost:9200"]
>       index => "power"
>     }
>   } else if [name] == "contact" {
>     elasticsearch {
>       hosts => ["localhost:9200"]
>       index => "contact"
>     }
>   } else if [name] == "switch" {
>     elasticsearch {
>       hosts => ["localhost:9200"]
>       index => "switch"
>     }
>   } else if [name] == "temperature" {
>     elasticsearch {
>       hosts => ["localhost:9200"]
>       index => "temperature"
>     }
>   } else if [name] == "motion" {
>     elasticsearch {
>       hosts => ["localhost:9200"]
>       index => "motion"
>     }
>   } else if [name] == "lock" {
>     elasticsearch {
>       hosts => ["localhost:9200"]
>       index => "lock"
>     }
>   } else if [name] == "humidity" {
>     elasticsearch {
>       hosts => ["localhost:9200"]
>       index => "humidity"
>     }
>   }
> }
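(A thought from outside the thread: Logstash supports field references in the index option, so the whole if/else chain above can usually be collapsed to something like the sketch below. One caveat: an event with no name field would land in a literal “%{name}” index, so a guard conditional may still be worth keeping.)
output {
  stdout {}
  elasticsearch {
    hosts => ["localhost:9200"]
    # the index name is expanded from the event's "name" field at write time
    index => "%{name}"
  }
}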
Brian,
Just a small update: Initial State just came out with a log export function. Initial State might now appeal to loggers AND people who want to show off cool graphs and gauges.