[NO LONGER SUPPORTED] Ask Alexa

The current version is 2.2.4a and was released Friday. It is rather stable, and the main thread for that version is here: [DEPRECATED] Ask Alexa 2.2.4

Thinking of it as a THREE-part setup is probably more accurate:

  • First, set up the app within SmartThings: configure it, then get your setup variables.
  • Set up your AWS Lambda code (a sketch of the setup variables is below this list).
  • Set up the Developer Area with the variables from the first step.
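
For reference, those setup variables end up near the top of the Lambda code. A minimal sketch of what that section looks like, with placeholder values (your real application ID, token, and endpoint URL come straight from Settings>>Setup Variables in the SmartApp, so the exact variable names and URL below are only illustrative):

// Setup variables pasted from the SmartApp's Settings>>Setup Variables page.
// The values below are placeholders, not real credentials.
var STappID = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx';   // SmartApp installation ID (shown under About)
var STtoken = 'xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx';   // OAuth access token (can be reset under Settings>>Security)
var url = 'https://graph.api.smartthings.com:443/api/smartapps/installations/'; // endpoint shown on the Setup Variables page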

I am here, along with over a thousand others that have installed this, ready to help.

The documents are in the order you should proceed, and if you follow them exactly it SHOULD work :slight_smile: http://thingsthataresmart.wiki/index.php?title=Ask_Alexa

If you run into ANY issues, I recommend going to the 2.2.4 thread above, where you will get some help.

Good luck!

Thanks for that.
Can’t wait to get it set up :slight_smile:. I think I saw your post on an Alexa UK Facebook group, but I was waiting until I bought SmartThings (last week).

@MichaelS

The live logging was blank when I pressed the Ask Alexa-ST integration. I also got an error message.

Can you send me a screen shot of your app?
Also, what is the full error you are receiving? Things I would check right away are:

  • What is “lights” in terms of your app? A device group, or an individual unit? Ask Alexa only allows you to check items you set up in your app. If ‘lights’ is not a device or a macro, you will get an error like this.

  • If you have a valid item called ‘lights’, did you add it to your developer slots? By default, Alexa knows nothing about your environment; you need to ‘train’ it with the nouns you will be using. Going to Settings>>Setup Variables gives you what you need to train the development area (see the example after this list).
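
For example, the device slot you fill in on the developer site just ends up holding the names you want to speak, something like the values below (the slot name and the sample values here are illustrative only; copy exactly what the Setup Variables page generates for your environment):

LIST_OF_DEVICES
lights
kitchen light
bedroom lamp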

Is this the only command that doesn’t work, or is the whole thing not working? There was an outage yesterday:

Is that when you tested it?

Were you able to get this working? I tested this a little last night and will again tonight, but I didn’t see any issues.

Hello Michael,

Now it is crashing again, even though I only have “line in” configured in the stations; it also crashes with the “mood music” app!

I think it is an issue with my Sonos device!

Thanks for the attention! :slight_smile:

Hmm
I will look at that, but I think you are right.

“Lights” is an actual device, and it was added to the “device type” slot in the developer slots.

When I ask Alexa to ask ST to turn on the lights, her answer was “There was an error with the Ask Alexa SmartApp execution. If this continues, please contact the author of the SmartApp.”

The whole thing is not working. If there was an outage yesterday, I will try again later.

For the Service Simulator in the developer test section, the “Submit for Certification” button was never clickable.

Which screen shot do you need? The Alexa app in ST?

Let’s start with the basics: if you say “Alexa, ask SmartThings for help,” what does it say?

I will let you know in an hour when I get home.

Alexa said, “There was a problem with the requested skill’s response.”

Ok
While you probably did this already, I recommend doing the following:

Go to About and ensure the area for the application ID is filled in.
Go to Settings>>Setup Variables and copy the appropriate lines from the web site into your Lambda code.

Give that test another try and watch the live logging to see if anything comes up when you say the command. If not, I recommend resetting the token (under Settings>>Security) and doing the above steps again.
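
If you want a quick sanity check in the Lambda itself, you can temporarily log whether the setup variables actually made it into the code (empty or missing values are the usual culprit). A rough sketch you could paste just below where those variables are defined in index.js, assuming the STappID/STtoken/url naming used above:

// Temporary debug logging: prints true/false for each setup variable so the
// Lambda log output shows whether the values were actually pasted in.
// This only proves they are non-empty, not that they are correct; remove it
// once things are working.
console.log('STappID set: ' + (typeof STappID === 'string' && STappID.length > 0));
console.log('STtoken set: ' + (typeof STtoken === 'string' && STtoken.length > 0));
console.log('url set: ' + (typeof url === 'string' && url.indexOf('https') === 0));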

Let me know if that helps.

Hi Michael,

I deleted the existing Lambda and created a new skill. It is working now. Cheers :smile:

Great work! I look forward to your feedback!

Apologies, as this is probably a dumb question. I have “Ask Alexa” working fine with my Echos and ST devices. When I ask “Ask Alexa” to run an ST routine, it responds with “I am executing the XXX routine”.
My question is: where is that response text set up, and can I modify it? I have just spent some time searching everywhere, but without success!

There are no dumb questions; it probably means I could improve the documentation :slight_smile:

In this case, however, it is quite hidden.
Before I answer, what would you be interested in changing it to? From an end-user perspective, would your expectation (or hope) be that this output is configurable? The reason I ask is that if you update the code on your own (which you are able to do), any customization you make will be lost and will have to be re-created when you upgrade. While you are not obligated to upgrade, sometimes it is unavoidable in terms of getting bug fixes or feature updates, which over the next few revisions will be considerable.

Finally, and this is where I could do a better job documenting, you can create a custom response right now. It is not really intuitive, but it does work:

  • Create a new Control macro
  • Call the macro any name you want (you can customize it to feel like you are answering a question, or anything else)
  • Use it to activate a routine
  • In the custom acknowledgement area, customize your response (for example, “Starting movie time now”).

Again, have at it if you want to modify the code; just understand that I also use different methods to get the output results. In other words, if you pull a thread in the tapestry, the whole thing might fall apart. If you are in construction, this would be analogous to ‘load-bearing’ code. In this case, however, you are safe modifying line 1527 of version 2.2.4a.

Thanks a lot Michael - that all makes sense now and I’ll have a play around with Macros and custom acknowledgments to achieve what I was after. I see no reason to modify the upgradeable code now.

Again, I feel some of the problem here is that Ask Alexa is rather complex. In a good way, it has multiple ways to do the same thing; in a bad way, there are a LOT of ways to do things people aren’t even aware of. I will be addressing this in the coming months with a video tutorial showing you ‘pro tips’ for the app!

Hi MichaelS, I upgraded to the latest version, and initially I had the same problem because I forgot to update the Lambda code with my STappID, token, and URL. I’ve corrected that, and I can ask for help and it tells me all the help stuff, but when I try to initiate a voice macro or other command it just does the blinky blink for a second or two, then the lights turn off and nothing happens. I’m not sure what I did wrong or how to find out.

I tried re-pasting the Lambda code and hitting test, and I get this error:

{
  "errorMessage": "RequestId: 849347d7-20ba-11e7-9134-99f3ddadc35f Process exited before completing request"
}

I’m not sure if the Lambda code is supposed to test correctly or not. Where else should I look?

Thanks!

Also, under the log output I see this:

START RequestId: e3cb0b04-20ba-11e7-a94c-1524f9c01eb0 Version: $LATEST
2017-04-14T02:34:32.496Z e3cb0b04-20ba-11e7-a94c-1524f9c01eb0 TypeError: Cannot read property 'type' of undefined
at IncomingMessage.<anonymous> (/var/task/index.js:62:30)
at emitOne (events.js:77:13)
at IncomingMessage.emit (events.js:169:7)
at IncomingMessage.Readable.read (_stream_readable.js:360:10)
at flow (_stream_readable.js:743:26)
at resume (_stream_readable.js:723:3)
at nextTickCallbackWith2Args (node.js:437:9)
at process._tickDomainCallback (node.js:392:17)
END RequestId: e3cb0b04-20ba-11e7-a94c-1524f9c01eb0
REPORT RequestId: e3cb0b04-20ba-11e7-a94c-1524f9c01eb0 Duration: 834.12 ms Billed Duration: 900 ms Memory Size: 128 MB Max Memory Used: 17 MB
RequestId: e3cb0b04-20ba-11e7-a94c-1524f9c01eb0 Process exited before completing request