iBeacons using a user's mobile to trigger a digital sign - ios

I am trying to figure out a way to use iBeacons to trigger a physical Digital Screen. Has anyone done this or seen this?
What I would like to do is this: when a customer gets close to a digital screen, an iBeacon would be triggered and would load an ad hosted on a website onto the digital TV screen. I know you can do this on the phone screen, but can the trigger load content onto digital signage? If so, what would be needed? I already have an app that is triggering API calls. I assume we need some kind of computer hooked to the screen that receives the trigger and then displays the content, but I'm having a hard time wrapping my head around what is needed and have failed to see this done anyplace.
Any help is appreciated.

There are a number of architectures that would support this. Below is a simple approach that I have prototyped before:
Make an electronic billboard display out of a computer or tablet that has:
Internet connectivity to refresh an ad from a remote web app.
A web browser that opens to the ad display URL in the web app mentioned above.
Transmits as a beacon with a known identifier. You can make the tablet or computer send the beacon transmission with its onboard Bluetooth interface, or you can simply plug a USB Bluetooth beacon into an open port.
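If the billboard device is an iPad, a minimal sketch of that last point (transmitting as an iBeacon via CoreBluetooth and CoreLocation) might look like the following; the UUID, major/minor values, and identifier are placeholders, not values from the question:

import CoreBluetooth
import CoreLocation

// Advertises this device as an iBeacon so nearby phones can detect it.
// The UUID, major, minor, and identifier below are placeholders.
class BeaconTransmitter: NSObject, CBPeripheralManagerDelegate {
    private var peripheralManager: CBPeripheralManager!
    private let region = CLBeaconRegion(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        major: 1,
        minor: 1,
        identifier: "billboard-1")

    func start() {
        peripheralManager = CBPeripheralManager(delegate: self, queue: nil)
    }

    // Called when the Bluetooth radio changes state; begin advertising once powered on.
    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        guard peripheral.state == .poweredOn else { return }
        let beaconData = region.peripheralData(withMeasuredPower: nil) as? [String: Any]
        peripheral.startAdvertising(beaconData)
    }
}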
Build a server-side app that does the following:
Has a number of configurable advertisements that display in a web browser.
Has a mapping of user identifiers to advertisements (basically the logic of which users are shown which ads).
Exposes a web service to register new mobile app installations that provide the user identifier, along with any user info (name, address, etc.) needed to target ads.
Exposes a web service to accept notifications of which nearby users have detected the beacon.
Automatically refreshes the advertisement to target whichever user is nearby, or shows a default otherwise.
Build a mobile app that:
Upon installation, calls the web service on the server-side app mentioned above to send the user info (name, address, etc.) needed to target advertisements.
Detects the beacon mentioned above using CoreLocation (iOS) or the Android Beacon Library (Android).
When the beacon is detected, calls the web service on the server-side app with the user identifier to tell it that the user is nearby.
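A minimal sketch of that detection step on iOS, assuming a hypothetical /api/nearby endpoint on the server-side app; the UUID, server URL, and user identifier are placeholders:

import Foundation
import CoreLocation

// Detects the billboard's beacon and notifies the server that this user is nearby.
// The UUID, server URL, and user identifier are placeholders for illustration.
// Requires the NSLocationAlwaysAndWhenInUseUsageDescription key in Info.plist.
class BeaconDetector: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let region = CLBeaconRegion(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "billboard-1")
    private let userId = "user-123"

    func start() {
        locationManager.delegate = self
        locationManager.requestAlwaysAuthorization()
        locationManager.startMonitoring(for: region)
    }

    // Fired when the phone enters the beacon's region.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        notifyServer()
    }

    // Tells the server-side app that this user is near the billboard.
    private func notifyServer() {
        var request = URLRequest(url: URL(string: "https://example.com/api/nearby")!)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try? JSONSerialization.data(
            withJSONObject: ["userId": userId, "beaconId": "billboard-1"])
        URLSession.shared.dataTask(with: request).resume()
    }
}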

Related

How to send direct command from Google Home to custom smart device without app name?

I am trying to build a custom IoT device that will be controlled via a Google Home device and will serve people with disabilities.
The device itself is a Tiva C LaunchPad that I program from scratch, meaning I will have full control over it.
In my vision, the user will say something like: "Ok Google, press play button", and as a result, the Google Home device will send a direct command of press_play_button to the IoT device, preferably via the local network.
I found the Google Actions SDK, along with the Local SDK extension, but if I understood correctly, I have to be in the app mode first ("OK Google, play {app_name}") before pronouncing the action I want, which is inconvenient.
Is there any way to achieve my requirement?
If not, I may give up on local network control and use some sort of webhook to send HTTP requests to my smart device, and in that case I wonder whether MQTT would be more suitable.
Thanks.
The Local SDK is an extension to the Smart Home API. If your device matches up with the device types and traits that the Smart Home API supports, then you can use that to control your device.
It has support for media players, so things like play/stop should be possible.
I have built generic Smart Home control using MQTT to reach the device, but you have to provide an HTTP endpoint for the Google system to interface with. This takes a little thought, as you have to map MQTT's asynchronous approach onto HTTP's synchronous nature.

Retrieving phone geolocation when user tap NFC chip?

I am developing software to remotely manage NFC tags - change their actions and track campaign performance such as:
Number of taps on individual NFC chips
Date and time of taps on individual NFC chips
Location of NFC chips when tapped
When the user taps the NFC tag, they are directed to our server, which then redirects the phone's browser to the designated destination set in the campaign.
I have completed all of the above except for the location part, as it gives me an inaccurate location.
At the moment, I'm using the site http://ipinfodb.com + their API to get the phone's IP address and, supposedly, its location, but it's inaccurate.
Another, more accurate approach would be to "get" the phone's location via GPS, but I'm not sure how to approach this.
Any suggestions would be much appreciated.
Probably an easier solution would be to get the data out of an analytics backend such as Google Analytics (relying on their accuracy ...)
Otherwise you need to ask the user's permission, but that needs to be done on the website acting as the redirection target, as otherwise you would break the flow.

Using a Website to Recognize Google Beacons

As of KitKat 4.4, the required proximity API is baked into the Nearby functionality of the Android OS. This means that Android devices no longer require an application to detect and interact with beacons.
iOS, however, still requires either an app or the Chrome browser to do so with Google's beacons.
My question: With current technology, if a website is designed using Google's PWA standards, can it have the ability to detect and interact with beacons in the same fashion that an application would (regardless of the browser being used)?
Follow-up, if YES, would it be able to perform these tasks while open in the background?
The short answer is no, you generally cannot interact with beacons from web apps. This is true even on Android devices that use the Chrome browser. On Android, you can launch a web app on beacon detection using Nearby, but only if the user taps the Nearby notification.
Here's the longer explanation:
Android devices do support Google Nearby which allows you to send a notification to a user when your beacon is detected that can (a) launch a native app, (b) launch the Google Play store to install an app, or (c) launch a URL in the default browser.
When launching a URL, the URL can point to a web app and may include a URL parameter that tells it that the web app was launched by the beacon detection through Nearby. But once the launch is complete, the web app's interaction with beacons is over.
In order to have dynamic interaction with beacons, there must be web APIs that give the web app callbacks when beacons are detected. These currently do not exist. There is hope for this in the future using Web Bluetooth APIs (See: https://webbluetoothcg.github.io/web-bluetooth/), however they do not currently support scanning for arbitrary Bluetooth advertisements needed to detect beacons.

Can you communicate with nearby devices using a website?

Can a website help a user communicate with nearby devices via Bluetooth/WLAN without downloading software?
1. The user requests that something be done on their device (which could be, for example, a wirelessly connected printer or a Bluetooth keyboard).
2. The site, which contains a repository of relevant actions, sends specific instructions for that device to the user's own machine.
3. Those instructions are then relayed to the correct device (with the user's permission) via the user's device's WLAN or existing Bluetooth connection.
Part 3 is what I'm not sure of: is there a mechanism by which a website can contribute to a wireless/Bluetooth connection held locally?
It is not possible. The user's browser can't interact with the hardware used for wireless networking.
You would have to get the user to install some custom software to do this.
You would have to submit the "commands" first, then have the device make requests to the website server, i.e., check for any pending "commands" for the device, and then process them locally. A website is not a "thing" that can directly interface with a hardware device.
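A rough sketch of that polling pattern, written in Swift for illustration; the endpoint URL, device identifier, and command format are all made up:

import Foundation

// A pending command fetched from the web server; the shape is made up for illustration.
struct PendingCommand: Decodable {
    let id: String
    let action: String
}

// Asks the server for any pending commands for this device and processes them locally.
func pollForCommands() {
    let url = URL(string: "https://example.com/api/devices/device-42/commands")!
    URLSession.shared.dataTask(with: url) { data, _, error in
        guard let data = data, error == nil,
              let commands = try? JSONDecoder().decode([PendingCommand].self, from: data)
        else { return }
        for command in commands {
            // Execute the command against the local hardware here.
            print("Processing \(command.action) (id: \(command.id))")
        }
    }.resume()
}

// Check for new commands every 10 seconds.
_ = Timer.scheduledTimer(withTimeInterval: 10, repeats: true) { _ in
    pollForCommands()
}
RunLoop.main.run()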

Can an iPhone application running in the background transfer data via USB interface?

There are two iPhone applications: one running in the foreground and the other running in the background. Is there any way to get the background application to send data over USB without coming into the foreground? Ideally we want to keep the foreground app in the foreground while the background app processes some data. Once the data is processed, it will inform the foreground app that the data has been processed.
No, it cannot. It cannot even do this without the use of private frameworks, unless you're in the Made for iPhone program. If you are, then your organization will know, based on the documentation made available to you, what you can and cannot access, when and how.
Should you be in the Made for iPhone program, and are unclear as to what you have access to and when, contact the person in your organization who is the technical contact with Apple for this program, they will be able to give you the details.
If the task is started while the app is in the foreground and you call the appropriate beginBackgroundTask/endBackgroundTask methods, you should be able to have it continue running after the app is backgrounded.
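For example, a minimal sketch of that pattern in Swift; processData() here is a placeholder for whatever work the app needs to finish:

import UIKit

// Placeholder for whatever work needs to finish after the app is backgrounded.
func processData() { /* ... */ }

// Request extra execution time so a task started in the foreground can keep
// running for a while after the app moves to the background.
func processDataInBackground() {
    var taskId: UIBackgroundTaskIdentifier = .invalid
    taskId = UIApplication.shared.beginBackgroundTask {
        // Called if the allotted background time is about to expire.
        UIApplication.shared.endBackgroundTask(taskId)
        taskId = .invalid
    }

    DispatchQueue.global(qos: .background).async {
        processData()
        UIApplication.shared.endBackgroundTask(taskId)
        taskId = .invalid
    }
}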
Note that access to USB is restricted (see jer's answer) and that there's no officially sanctioned way to communicate between different apps on the same device. Also, you can only buy/download one app at a time in the App Store and I can't see Apple approving an app that required you to download a second app for it to work. So you may have bigger problems to solve first.
It would help significantly if you told us what you actually wanted to achieve. For example, "I want MyApp on the user's phone to communicate with MyApp on the user's computer".
The absolute easiest way to send data between the phone and a computer is to require that they're both on the same Wi-Fi network. Several iPhone apps incorporate a web server (this was the easiest way of "file sharing" before OS 3.2), and many more iPhone apps connect to a computer running server software.
Your other options, more or less:
Reverse-engineer the Bluetooth side of GameKit and reimplement it on the computer-side. I'm not aware of anyone who's done this. Loosely, I think it's IP over Bluetooth PAN plus some sort of Bluetooth service discovery.
Audio input/output, e.g. the headphone jack or certain pins on the dock connector. I'm not entirely sure how the mic side works (the resistance was a bit high for a carbon mic when I checked), but you might get lucky and find a way to turn it into "line in" or find "line in" pins on the dock connector.
A webcam pointing at the iDevice screen (and the iDevice camera pointing at the computer screen). Ewwwww.
Join the MFi program.
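Going back to the Wi-Fi option above, a minimal sketch of the phone side opening a TCP connection to server software on the same network; the address, port, and payload are placeholders:

import Network

// Opens a TCP connection from the phone to a companion server on the same
// Wi-Fi network and sends a small payload. Address, port, and payload are
// placeholders for illustration.
let connection = NWConnection(host: "192.168.1.50", port: 8080, using: .tcp)

connection.stateUpdateHandler = { state in
    if case .ready = state {
        let payload = "processed-data".data(using: .utf8)!
        connection.send(content: payload, completion: .contentProcessed { error in
            if let error = error {
                print("Send failed: \(error)")
            }
            connection.cancel()
        })
    }
}

connection.start(queue: .main)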
