Apple Applies for Patent on AI-based "Hyperlocal Geopresence" for Home Automation

Oooooh, shiny! :heart_eyes:

First, the usual reminder: you can’t patent an idea. You have to demonstrate an actual, practical, working application. It’s that specific method or device you’re patenting, not the general idea.

OK, this is awesome. While everybody else has been talking for years about using heartbeats or the like for microlocation inside the home, Apple is taking a completely different, and what looks like a very doable, approach: existing hardware, right now, but new AI software.

Their idea is that there would be known home automation devices (they specifically mention sensors) already reporting to HomeKit. These could be using different protocols (they mention Wi-Fi, Bluetooth, and GPS).

Now they’re going to use something similar to Siri suggestions. This is all done locally, neither processed nor stored in the cloud, so there are minimal privacy issues.

Your HomeKit application will know which devices you normally use when you enter a specific, very small area of the home. And it will pop up those devices in the app as you walk around the home with either your phone or a wearable like an Apple Watch, waiting for you to do something with them.

They also make suggestions based on what you usually do.

And… you would be able to add these actions to HomeKit automations. Which is where we get true microlocation. Note that while part of this is based on iBeacon algorithms, it doesn’t require iBeacon hardware. It can work with existing hardware by evaluating signal strength to the known devices and triangulating against other known devices. This is really cool.
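The signal-strength idea can be sketched roughly. This is not the patent’s actual method, just a minimal illustration of how distance can be estimated from a received signal strength (RSSI) reading with a log-distance path-loss model, and how readings from three known devices could fix a 2D position; `tx_power_dbm`, the path-loss exponent, and the function names are all assumptions for illustration:

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate distance (meters) from an RSSI reading using the
    log-distance path-loss model. tx_power_dbm is the expected RSSI
    at 1 m (a per-device calibration constant); the path-loss exponent
    is ~2 in free space and higher indoors."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def trilaterate(anchors, distances):
    """Solve for a 2D position given three known device positions and
    estimated distances, by subtracting the circle equations to get a
    2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a = [[2 * (x2 - x1), 2 * (y2 - y1)],
         [2 * (x3 - x1), 2 * (y3 - y1)]]
    b = [d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2,
         d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2]
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    x = (b[0] * a[1][1] - b[1] * a[0][1]) / det
    y = (a[0][0] * b[1] - a[1][0] * b[0]) / det
    return x, y
```

In practice indoor RSSI is noisy, which is presumably why the patent leans on clustering and learned usage patterns rather than raw distance alone.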

  1. A method of suggesting one or more accessory devices controlled by an application executing on a mobile device, the method comprising performing by the mobile device: measuring one or more sensor values to determine a data point at each of a plurality of first times, the one or more sensor values measured using one or more sensors of the mobile device, wherein the one or more sensor values are determined from one or more signals emitted by corresponding one or more accessory devices; clustering data points within a threshold distance of each other to create a plurality of clusters, wherein the plurality of clusters correspond to one or more locations in a physical location; determining one or more physical locations based on the clustered data points; and associating one or more accessory devices with each of the one or more physical locations based on the one or more signals emitted by the one or more accessory devices.
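The “clustering data points within a threshold distance of each other” step in that claim could look something like this greedy sketch. The claim doesn’t name an algorithm; `cluster_points` and the threshold value are purely illustrative:

```python
import math

def cluster_points(points, threshold=1.0):
    """Greedy clustering sketch: each 2D data point joins the first
    cluster whose centroid lies within `threshold`, otherwise it starts
    a new cluster. Each resulting cluster stands in for one small
    physical location in the home."""
    clusters = []  # each cluster: {"points": [...], "centroid": (x, y)}
    for p in points:
        for c in clusters:
            cx, cy = c["centroid"]
            if math.hypot(p[0] - cx, p[1] - cy) <= threshold:
                c["points"].append(p)
                n = len(c["points"])
                c["centroid"] = (sum(q[0] for q in c["points"]) / n,
                                 sum(q[1] for q in c["points"]) / n)
                break
        else:
            clusters.append({"points": [p], "centroid": p})
    return clusters
```

Accessories could then be associated with each cluster based on which device signals dominate the readings taken inside it, per the last step of the claim.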

They are literally counting the steps that the person takes in order to estimate distances within the space.
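Step counting for distance estimation is the core of pedestrian dead reckoning. A minimal sketch, assuming an average stride length and a compass heading per detected step (`dead_reckon` and `stride_m` are hypothetical names, not from the patent):

```python
import math

def dead_reckon(start, step_headings, stride_m=0.7):
    """Pedestrian dead-reckoning sketch: advance the position by one
    stride per detected step, along that step's compass heading
    (radians). `stride_m` is an assumed average stride length."""
    x, y = start
    for heading in step_headings:
        x += stride_m * math.cos(heading)
        y += stride_m * math.sin(heading)
    return x, y
```

On a real device the step events and headings would come from the motion coprocessor and compass, which is why this works with hardware the phone and watch already have.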

Thus, improvements for identifying an accessory device that should be controlled upon determining a particular location of a mobile device of a user are desired. Specifically, it is desired for the home application on the mobile device to be able to determine a location of the user and suggest a corresponding accessory device that a user may want to control, or automatically operate a corresponding accessory device based on the location of the mobile device of the user.

And, yes, they do mention different user profiles as part of the system. :sunglasses:


Sounds like they are monitoring everything you do with your device and merging those actions with environmental signals, much as ultrasonic beacons are used to determine location/occupancy.

I find it very creepy but interesting.


Your own phone is doing the monitoring; the data isn’t available to the cloud, which is one of the big differences in Apple’s approach to home automation in general.