Examples?
I keep asking her what a whale sounds like and she says, "I didn’t understand the question I heard."
I guess the Alexa team doesn’t watch GH commercials.
bamarayne
(Jason "The Enabler" as deemed so by @Smart)
ask her what a blue whale sounds like… you’ll get a good laugh…
Seems some work for me and some don’t. Looks like they’re trying to emulate GH in conversational skills.
For entertainment, ask what the fox says, or tell her "knock knock." There are multiple responses to both. Sometimes she gets stuck repeating the same response. Also ask what sound a horse makes.
Now that is just crazy! I tried your exact phrasing and she gleefully answered back correctly, just like my high school science teacher. What’s next for Alexa, Yodish phrasing…
Alexa, "Waters freezing point, what is it"? She should have responded, "Hum, by that question, I'm not sure what you meant"
Since Amazon has spent the past 15 years selling us stuff while Google has spent that time developing computer intelligence, your question is kinda silly. It is INCREDIBLY DIFFICULT to get “basic smarts” into computer systems. I dunno, have you been following the development of “basic smarts” for SmartThings? Been installing stuff off GitHub to do something as simple as brightening the living room lights, turning the stereo to the right station, and turning the heat up a degree? Been through the failed rules engines of the past, and struggled with CoRE and its pistons?
If it’s as easy as you claim, why can’t we just type a simple sentence into ST and have ‘smarts’ that write the rules for us?
Dang @Glen_King: did you have your morning coffee before that reply? Really, “silly”?
I’m just saying that the Echo could simply provide some response to basic questions, much like Siri, GH, Yahoo, and Bing do, by issuing a query to any of the web search engines and reading back a short text-to-speech answer. As @bamarayne pointed out, they have some of this done, but they’re still working on the intent language, since similar questions can be asked in many different ways.
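To make that concrete, here’s a minimal sketch (Python, with made-up intent names, sample phrasings, and a stubbed `answer_from_search()` fallback; none of this is Amazon’s actual skill API) of the "many phrasings, one intent" problem plus the web-search fallback I’m describing:

```python
# Hypothetical illustration only: real Alexa skills declare their sample
# utterances in the interaction model; this just shows the idea.

SAMPLE_UTTERANCES = {
    "AnimalSoundIntent": [
        "what does a whale sound like",
        "what sound does a whale make",
        "what noise does the whale make",
    ],
    "FreezingPointIntent": [
        "what is the freezing point of water",
        "waters freezing point what is it",  # the Yoda-ish phrasing from the thread
    ],
}

MIN_OVERLAP = 3  # require a few shared words before claiming a match


def resolve_intent(utterance: str) -> str | None:
    """Naive matcher: score each sample phrasing by word overlap."""
    words = set(utterance.lower().split())
    best_intent, best_score = None, 0
    for intent, samples in SAMPLE_UTTERANCES.items():
        for sample in samples:
            score = len(words & set(sample.split()))
            if score >= MIN_OVERLAP and score > best_score:
                best_intent, best_score = intent, score
    return best_intent


def answer_from_search(query: str) -> str:
    # Stand-in for the fallback suggested above: query a web search engine
    # and read a short answer back via text-to-speech.
    return f"Here's what I found on the web for: {query}"


def handle(utterance: str) -> str:
    intent = resolve_intent(utterance)
    if intent is None:
        return answer_from_search(utterance)
    return f"(dispatch to the {intent} handler)"


print(handle("what sound does a blue whale make"))  # matches AnimalSoundIntent
print(handle("how tall is the Eiffel Tower"))       # no intent, falls back to search
```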
As far as CoRE and rules go, that application has been a HUGE leap forward into 4GL/5GL-style GUI visual programming, giving so many people additional NEW and IMPROVED home automation capabilities without having to do any 3GL coding with Groovy, JavaScript, OAuth, etc…
Yeah. Nonetheless, you can’t type in “SmartThings, I want the kitchen lights to go on automatically if my wife’s presence sensor shows up any time between 5pm and 7pm AND the kids’ sensors are present, but NOT if the kids’ sensors are not present, but the living room should go on when she gets in regardless of whether the kids are home or not”. Simple request. Very simple. You and I understand that intuitively… not ‘intuitively’ really, as we’ve spent decades using our non-digital brains to learn and process speech. But try to have a computer understand you as you speak it. Which is why you have to spend ten minutes in CoRE setting it up.
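For the curious, here’s roughly what that one “simple” sentence turns into once the conditions are written out explicitly. The device names and helper functions are hypothetical stand-ins, not real SmartThings or CoRE calls, but it shows why the setup takes ten minutes:

```python
# Sketch of the spoken request as explicit rule logic (assumed device names).
from datetime import datetime, time

# Pretend device states; a real hub would report these.
PRESENCE = {"wife": True, "kid1": True, "kid2": False}
LIGHTS = {"kitchen": False, "living_room": False}


def is_present(sensor: str) -> bool:
    return PRESENCE.get(sensor, False)


def in_window(start: time, end: time, now: time | None = None) -> bool:
    now = now or datetime.now().time()
    return start <= now <= end


def turn_on(light: str) -> None:
    LIGHTS[light] = True


def on_wife_arrives() -> None:
    # Living room: on whenever she arrives, kids home or not.
    turn_on("living_room")

    # Kitchen: only between 5pm and 7pm, and only if ALL the kids are home.
    if in_window(time(17, 0), time(19, 0)) and all(
        is_present(kid) for kid in ("kid1", "kid2")
    ):
        turn_on("kitchen")


on_wife_arrives()
print(LIGHTS)  # kitchen stays off here because kid2 is away
```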
I did not mean to attack you. I just find the presumption that Alexa should already be able to EASILY grasp these things - when Alexa does not have the massive power of the Google search/intelligence engine behind it - to be, well… what’s a less robust word for ‘silly’?
Approximately four million new Alexa devices have come online over the past few months. Those slow moments might be server/system load issues.
I’m finding it quite responsive. The responses to ‘native’ (“Alexa, turn off the ____”) commands are almost always within about a second or less, and the IFTTT commands take about 3-5 seconds. I’m finding it’s forcing me to enunciate better lol
Check out a new device coming to market called WooHoo by Smartbeings. Facial recognition, voice recognition, and a lot more. Why didn’t Amazon or Google make this?