Spark Core/Photon Integration Idea

Long time listener, first time caller… :slight_smile:

After looking at a variety of options, a month ago I decided to give SmartThings a try for my home automation system. Beyond the Z-Wave and ZigBee support, the primary attraction for me was its extensibility model for new devices and custom apps, because I know I will want to create and integrate various things that aren’t available commercially.

I also recently started playing with the Spark Core cloud-integrated microcontroller board. The simplicity of how one exposes variables, functions and events from the device, and the fact that it is basically an Arduino, makes it perfect for a whole host of custom sensors, actuators, etc.

The Spark boards expose themselves via a token-protected REST endpoint. After only a few minutes of digging, I found some pretty good examples of Smart Device Types that could read values or call functions via that endpoint. My experiments seemed to work well, but I found a few issues:

  1. Cloud-Connected Device Types only support polling for their data. Honestly, this sucks pretty bad if you want to use a Spark board as a sensor. One-minute latency is terrible, especially given that Spark supports an actual eventing model (using SSE).
  2. Although the Spark device is available on the LAN, and ST supports device event notifications for LAN-Connected devices, the only eventing mechanism supported by ST is UPnP, which is a terrible protocol to implement and much heavier than is actually required for this use case.
  3. The docs for anything other than Z-Wave or ZigBee HA devices are pretty poor. One complete example of a Service Manager + Device Type for each of Cloud- and LAN-Connected devices is really needed. Digging through the templates in the Web IDE is not a good solution.
  4. Creating a custom device type each time I create something with the Spark is annoying.

So I started thinking about how to make a general, reusable system for Cloud-Connected devices, for use by my custom Spark devices.

The following diagram shows what I’m thinking about at a high level; more detail follows in the text.

Since the Spark publishes via the cloud, my plan is to use the Cloud-Connected Device Type model. As I said above, the examples I found for the device type are pretty self-explanatory and “just work”. But I wanted to add a Service Manager app to take care of the OAuth login / access-token part.
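For reference, the polling side of those device-type examples boils down to roughly this (a sketch only — the device ID, access token, and the “temperature” variable are placeholders of my own; the Spark cloud returns a variable’s value in the `result` field):

    // Sketch of a poll() in a Cloud-Connected Spark device handler.
    // deviceId, accessToken and "temperature" are placeholders.
    def poll() {
        def params = [
            uri: "https://api.spark.io/v1/devices/${deviceId}/temperature",
            query: [access_token: accessToken]
        ]
        httpGet(params) { response ->
            // the Spark cloud wraps the variable value in "result"
            sendEvent(name: "temperature", value: response.data.result)
        }
    }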

To solve the polling/eventing issue, I am thinking about using the mappings{} section to add an HTTP endpoint that the actual Spark device could call, triggering a refresh from the cloud. Spark supports publishing events, but they are SSE, which ST doesn’t support. In the diagram above, I’m showing the endpoint on the Service Manager since I could, in theory, have dozens of Spark boards running, and I’m not sure having dozens of different endpoints (one for each device handler instance) is a good idea. I’m also not sure if I can use mappings{} in a device type (would love confirmation or an example).
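On the Service Manager side, the endpoint I have in mind would look roughly like this (a sketch; the path and handler name are my own inventions, and it assumes the SmartApp has OAuth enabled so the Spark can reach the endpoint with an access token):

    // Sketch: an HTTP endpoint on the Service Manager that a Spark board
    // could POST to, triggering a refresh of the matching child device.
    mappings {
        path("/notify/:deviceId") {
            action: [POST: "handleSparkNotification"]
        }
    }

    def handleSparkNotification() {
        // look up the child device handler for this Spark board
        def child = getChildDevice(params.deviceId)
        child?.refresh()
    }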

Another thing I’m planning is to use a convention that maps the ST Capabilities model to Spark device types. So if I had a Spark Core that was behaving like a switch, the Service Manager would instantiate a “Spark-Switch” device handler. That handler would know that it could expect to find a string property in the Spark cloud called “switch” with possible values “on” and “off”, as well as two functions, on() and off(), matching the definition of ST’s capability.switch.
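A hypothetical “Spark-Switch” handler following that convention might look like this (sketch only — the namespace, helper, and URL wiring are illustrative; by convention the firmware would register Spark.variable("switch", …) plus Spark.function("on", …) and Spark.function("off", …)):

    // Sketch of a convention-based "Spark-Switch" device handler.
    metadata {
        definition(name: "Spark-Switch", namespace: "example", author: "me") {
            capability "Switch"
        }
    }

    def on() {
        callSparkFunction("on")
        sendEvent(name: "switch", value: "on")
    }

    def off() {
        callSparkFunction("off")
        sendEvent(name: "switch", value: "off")
    }

    // Hypothetical helper: POST to the Spark cloud's function endpoint
    private callSparkFunction(name) {
        httpPost(uri: "https://api.spark.io/v1/devices/${deviceId}/${name}",
                 body: [access_token: accessToken])
    }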

Using this convention-based approach should make it very simple to create new (or even compound) device types, and also make the coding on the Spark side more consistent.

I’ve only started prototyping this approach, but I think it should work. I still need to create a working example of the notification REST API and confirm that I can configure the Spark to call it, but I think that is achievable.

If you have any thoughts or suggestions about this approach, please let me know. If it works, we might all be able to use these killer little inexpensive ($19 for the Photon) boards for custom devices, without the nightmare that is the current state of ZigBee HA / XBee.*

Thanks! I’ll post more on this thread as I proceed.

* [rant]I have to rant here for a bit. The amazing thing about ST is the extensibility and integration opportunities, but ZigBee HA is totally unapproachable for the hobby/maker community. Sure, there is the ST Shield, but a) it seems to be constantly out of stock and b) it is based on the Arduino Uno footprint (big). I would love to see ST provide a more maker-friendly mechanism for LAN- and Cloud-connected devices so I don’t have to do all of this. In particular, it would be nice to have ST support Server-Sent Events (SSE) or HTTP callbacks that aren’t UPnP. I need to be able to do them for LAN or Cloud devices. ZigBee and Z-Wave are great for consumer/store-bought devices, but even that space is seeing more and more LAN- and Cloud-based connectivity. Of course, if there was a reasonable way to connect XBee modules without having to hack our way into the ZigBee HA Profile, that would be nice too.[/rant]


Excited to see this and would love to hear more details on what you are having the Spark power. I’m waiting for mine to show up so I can get started with it. I’m also playing with a blink(1) device.

I don’t see anything wrong with this approach. If power is not an issue, working with WiFi is definitely easier than with ZigBee for DIY projects. In fact, there already exist a couple of projects utilizing the Spark Core. Check this out:



Xbee modules are relatively simple to connect to ST. I wouldn’t call it hacking.
A couple of fairly simple command exchanges to connect in then it’s off and running…
Xbee… "I’m here."
ST… "Oh, OK… What endpoints do you have?"
Xbee… "Endpoint x."
ST… "What does x do?"
Xbee… "Receives these clusters and sends these ones." (fingerprint)

Job done!


Yeah, it’s not rocket science for sure, except an XBee module will set you back around $35, plus you still need a microcontroller. With the Spark Photon you get a Cortex-M3 plus WiFi for $19.


Good call. They aren’t cheap for sure.

@geko, I agree that the WiFi power draw on the Spark is a bit of a drag, but for places where you can plug in (which is many places in a home), they’re quite nice. And quite cheap ($19 for Photon, $10 for the bare chip). Thanks for posting those other articles in case others find this thread. Of course I read them as getting started material for me, but didn’t think to reference them here.

Question for you: Do you know the answer to my question about using mappings{} in the device handler? Does it work? Is it a good/bad idea to do it that way?

@Falcogeorge, I assume you’re referring to the steps you posted on your blog here:

While it isn’t rocket science, it certainly isn’t simple, especially for someone like me who is new to the ZigBee world. I’ve got an S2 XBee on order from SparkFun and was planning to try out your instructions, but even you have to admit that it is a pretty hairy set of steps to go through. :smile:

What would be really nice is to have a tool that I can use to configure the module correctly, but I’m far from having that level of XBee/ZigBee understanding.

Thanks to you both for your replies and guidance.

It did not work for me, although I have not spent a lot of time on it. The problem is that accessing REST endpoints requires an authentication key, and the OAuth workflow is only documented for apps. It may be that child devices would accept the key generated for the parent app; I don’t know. Some official clarification would definitely be helpful. But since you’re going to have a Service Manager app anyway, I’d just use it as a multiplexer for the upstream data. That way you only need one key to access all devices.


Yes please! As someone who works in the developer tools space, I have to say the docs are a little disappointing.

Yeah, that’s what I was thinking, hence my API interface on the Service Manager app in the diagram.


The Zigbee/Xbee steps I blogged are a bit clumsy mainly because it’s done very manually through XCTU which is really a configuration tool for the Xbee.
I think it’s a good way to look at the nuts and bolts of how it works.

What it really needs is for someone to write an Arduino library for ZigBee commands, and then it would become a lot easier. It did cross my mind at one stage, but I figured that by the time I got to it, somebody else would have done it, and done a much better job of it.
Fingers crossed.

It’s been a learning exercise for me too, but like all this stuff, once you get the basics it’s really relatively simple. It’s the documentation that ZigBee puts out that is really bad.


Your writeup is very helpful. Thank you for taking the time to document it. I have a few XBee modules left over from previous projects. Who knows, they may come in handy one day. :smile:


When Spark launches Webhooks (making HTTPS web requests based on core variables, events, functions, etc.), life will be a lot easier between the two platforms.

Also remember that Spark just integrated IFTTT.

While waiting for Webhooks, and before the IFTTT announcement, I have recently been working on a Spark serial integration with @ogiewon’s ST_Anything project. I can then do publish/subscribe to cores in the field.


Has ST made any public announcements on WebHooks support? A quick search of the forum didn’t turn anything up.

Looks like in about two weeks…

It’s still far from a complete project, but I have a Spark Core/Photon setup with SmartThings using Webhooks. For triggering time-sensitive things (1-3 second delay) from the Spark to SmartThings, it uses the Webhook. It uses Spark.variables for less time-sensitive things that SmartThings is going to poll anyway. It uses Spark.functions to trigger things on the Spark device.

So far I just have the smartapp for connecting and making the webhook and talking to existing virtual device types like “Simulated Motion Sensor”. I have also made a device type to show the status and RSSI WiFi signal strength of the Spark.

I haven’t actually figured out the best way to lay all this out as a flexible and easy to configure and use thing yet but it is all working together right now. I’ll keep working on this as time allows but if anyone wants to see what I have so far you can check it out here.
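For the curious, the webhook-creation part of the SmartApp boils down to roughly this (a sketch; the event name and callback URL are placeholders, and the field names are my reading of the /v1/webhooks API):

    // Sketch: create a Spark webhook that POSTs to the SmartApp's
    // endpoint whenever the core publishes a matching event.
    def createWebhook() {
        httpPost(
            uri: "https://api.spark.io/v1/webhooks",
            body: [
                event: "smartthings",       // event name the firmware publishes
                url: state.callbackUrl,     // SmartApp endpoint URL (placeholder)
                requestType: "POST",
                access_token: state.sparkToken
            ]
        )
    }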


I know this is an older topic, but have you heard of anybody bringing sensor data from a Photon/Core into ST? I’m thinking about temperature sensing for each room I put my blinds in, since I have a ton of spare I/O, but I haven’t seen anybody deal with listening for events from a Photon/Core and getting an ST device to update for use in a SmartApp.

@keithcroshaw Sure, here is what I have so far, it is still not what I would consider done and all the devicetypes are not functional yet. Sending temperature data to SmartThings should be pretty easy but I have not added that yet. I have been using it for my old alarm system wired sensors (motions, door contacts and a glass break) for months now and I haven’t had any problems.

A SmartApp that will connect to your Particle account and let you pick the Core/Photon from your list of registered devices. It then creates an auth token to use and the webhooks needed to notify SmartThings of events on the Core/Photon. You configure the Core/Photon in the SmartApp by choosing what SmartThings device type is on each pin, and then pick the device to update or respond to. (The devices currently need to be made separately in the IDE and should be the Simulated Switch or other Simulated device types.) After you configure the SmartApp, it triggers an update to the Core/Photon, sends the new config, and reboots it.

The Core/Photon runs this firmware, which looks to SmartThings for its config and is updated from the SmartApp, so there is no need to modify the firmware.

I also made a quick optional device type for the Core/Photon itself that shows online/RSSI info. You need to add the device type and then create a new device from the IDE. Then edit the preferences of the device, get your device ID and auth token from the Particle Build page, and put rssi in the Spark variable field.


Wow, that’s pretty awesome! I was hoping just to listen at the device type level, like I have for my command below:

There’s always time to dissect what you have here though.

@AndyRawson, that looks awesome and is just what I’m after to start playing with and understanding Webhooks from Particle. But I’ve already hit a snag trying to run the SmartApp, probably my ignorance! When I try to install the SmartApp, it gets stuck loading page 2 of the preferences (where it goes off to get an OAuth token). The simulator seems to say: Bad Request @ line 66, which for me is:
httpPost(uri: “”,
So I thought maybe I needed to change to the Particle naming convention and tried the new OAuth address, which I believe is:
httpPost(uri: “”,
But I get the same result.

Any ideas?
I’m a hardware guy by trade, so quite a novice when it comes to the software side! I’ve done the usual: modified someone’s device type handler so I can call functions on my Photon, like a simple on/off switch. I’ve also got it so my hub can poll a variable from the Photon, but now I’m interested in instant notifications from the Photon, like motion detection via webhooks. This seems like everything I need and more! Thanks

Sorry it took a bit to get back to this and sort it out. It looks like the problem is that sending particle:particle (or spark:spark) in the URI no longer works from SmartThings. I have found that Base64-encoding it and passing it as an Authorization header does work.

   if (!state.sparkToken) {
       // build an HTTP Basic Authorization header from the client credentials
       def clientAuth = "particle:particle"
       clientAuth = "Basic " + clientAuth.encodeAsBase64().toString()
       def params = [
           headers: [Authorization: clientAuth],
           uri: "",
           body: [grant_type: "password",
                  username: sparkUsername,
                  password: sparkPassword]
       ]
       try {
           httpPost(params) { response ->
               // the OAuth password grant returns the token as access_token
               state.sparkToken = response.data.access_token
           }
       } catch (e) {
           log.error "error: $e"
       }
   }

There are still several pieces of this SmartApp that weren’t finished, and I have not had time to spend on it. The whole thing needs a redesign since it kind of got out of hand while I was working on it, but the basics are there.