Everywhere I read that it is not possible to write data via NFC in iOS, whether in Swift or Objective-C. What about handling a contactless payment process? Is that even possible without using Apple Pay? If not, is there any workaround that enables "NFC writing" on iOS, since Core NFC doesn't provide it yet?
Apple keeps the NFC functionality closed, so only Apple Pay is allowed to use it. Other than reading NDEF tags, no app can use the phone's hardware either as an NFC reader/writer (for example, to top up the balance of a public transit contactless card) or to emulate a contactless card, whether purely in software or using the phone's embedded secure element.
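For completeness, reading NDEF tags is the one part Core NFC does expose (since iOS 11). Here is a minimal read-only sketch; the TagReader class and the logging are just illustrative, and the app needs the NFC tag-reading capability plus an NFCReaderUsageDescription entry in Info.plist:

```swift
import CoreNFC

// Minimal NDEF *read* sketch (iOS 11+).
final class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    private var session: NFCNDEFReaderSession?

    func beginScanning() {
        guard NFCNDEFReaderSession.readingAvailable else { return }
        session = NFCNDEFReaderSession(delegate: self,
                                       queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                print("Payload:", String(data: record.payload, encoding: .utf8) ?? "<binary>")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        // Called on user cancellation, timeouts, and read errors.
        print("Session ended:", error.localizedDescription)
    }
}
```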
As far as I know (Apple, as always, does not say much), the latest iPhones' hardware is perfectly capable of all of the above, but a commercial decision keeps it unavailable to third-party developers.
Looks like it will be possible with iOS 13: https://developer.apple.com/documentation/corenfc/creating_nfc_tags_from_your_iphone
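If that holds, writing should look roughly like the sketch below. This is hedged on the iOS 13 Core NFC additions (NFCNDEFTag and writeNDEF); the TagWriter class and the text payload are made up for illustration, and the tag must be NDEF-formatted and writable:

```swift
import CoreNFC

// Minimal NDEF *write* sketch for iOS 13+.
final class TagWriter: NSObject, NFCNDEFReaderSessionDelegate {
    private var session: NFCNDEFReaderSession?

    func beginWriting() {
        session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                       invalidateAfterFirstRead: false)
        session?.alertMessage = "Hold your iPhone near a writable tag."
        session?.begin()
    }

    // iOS 13+: gives direct access to the detected tag objects.
    func readerSession(_ session: NFCNDEFReaderSession, didDetect tags: [NFCNDEFTag]) {
        guard let tag = tags.first,
              let payload = NFCNDEFPayload.wellKnownTypeTextPayload(
                  string: "Hello from iOS 13", locale: .current) else {
            session.invalidate(errorMessage: "Nothing to write.")
            return
        }
        let message = NFCNDEFMessage(records: [payload])

        session.connect(to: tag) { error in
            if let error = error {
                session.invalidate(errorMessage: error.localizedDescription)
                return
            }
            tag.queryNDEFStatus { status, _, _ in
                guard status == .readWrite else {
                    session.invalidate(errorMessage: "Tag is not writable.")
                    return
                }
                tag.writeNDEF(message) { error in
                    session.alertMessage = error == nil ? "Written!" : "Write failed."
                    session.invalidate()
                }
            }
        }
    }

    // Still required by the protocol, but unused when didDetect(tags:) is implemented.
    func readerSession(_ session: NFCNDEFReaderSession, didDetectNDEFs messages: [NFCNDEFMessage]) {}

    func readerSession(_ session: NFCNDEFReaderSession, didInvalidateWithError error: Error) {}
}
```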
I am looking to develop an iOS application that communicates with an iXpand flash drive over the Lightning port, using the iXpand SDK. The flash drive is listed in Apple's public accessory database. Do I need to join the MFi Program to deploy the app on the App Store?
I can answer this, as I have dealt with the same case:
You do not need an MFi license, as you are not the manufacturer of the iXpand flash drive hardware.
The SDK does not communicate with Western Digital to fetch any information.
In my case, Apple did not ask for MFi details for the iXpand. You may just need to provide hardware details for using the Lightning port.
Last but not least, we can never be 100% sure about Apple's review process. They can accept or reject for any reason, but you have a good chance of being accepted if you have a valid reason to use external hardware.
I'm working on an iOS application (for iOS 13+) that should use the most secure BLE pairing method, Out of Band (OOB) pairing, which usually uses an NFC tag to store the BLE address and temporary key (TK).
Unfortunately, either my Google skills suck, or there is literally no useful information out there on OOB pairing in iOS.
From my understanding, Apple already uses an OOB technique to pair with the Apple Watch (through an image rather than NFC, with a fallback to a passkey).
During my research I've found a single Stack Overflow link,
Bluetooth “out of band” (OOB) pairing on iOS?,
which raises the same question, but it's from September 2015, so the answer there is (hopefully) outdated!
Also, having had a look at the CoreBluetooth API, there seems to be no mention of OOB pairing in it.
> Does anyone have more (up-to-date) information on this topic?
Any link/clue/explanation is much appreciated!
(I really can't believe that this hasn't been implemented in iOS as of today.)
In 2015 Uber was fingerprinting iPhones to reduce fraud in China.
What methods were they using to do this? Was it as simple as recording the serial number in a database? Were they using private methods?
From the NYT article:
"To halt the activity, Uber engineers assigned a persistent identity to iPhones with a small piece of code, a practice called “fingerprinting.” Uber could then identify an iPhone and prevent itself from being fooled even after the device was erased of its contents.
There was one problem: Fingerprinting iPhones broke Apple’s rules. Mr. Cook believed that wiping an iPhone should ensure that no trace of the owner’s identity remained on the device.
So Mr. Kalanick told his engineers to “geofence” Apple’s headquarters in Cupertino, Calif., a way to digitally identify people reviewing Uber’s software in a specific location. Uber would then obfuscate its code for people within that geofenced area, essentially drawing a digital lasso around those it wanted to keep in the dark. Apple employees at its headquarters were unable to see Uber’s fingerprinting."
Found more info and a potential method in this article.
Will Strafach examined a 2014 build of the Uber iOS app and found it using private APIs, via IOKit, to pull the device serial number from the device registry.
https://twitter.com/chronic/status/856250223777206273
There might be more, but this alone is a blatant violation of App Store policy. Strafach confirms that the technique Uber was using no longer works in iOS 10.
GitHub project here: https://github.com/erica/uidevice-extension/blob/7adc1d13946fca6fcb4b5f0b6e45911ab4a9a671/UIDevice-IOKitExtensions.m
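For the curious, the linked code boils down to querying the IOPlatformExpertDevice registry entry. The sketch below shows the equivalent lookup on macOS, where IOKit is a public framework; on iOS the same symbols are private, which is exactly why shipping this in an App Store app broke Apple's rules:

```swift
import IOKit

// macOS-only sketch of the registry lookup the linked project performs.
// On iOS the IOKit symbols are private and using them violates App Store policy.
func platformSerialNumber() -> String? {
    let service = IOServiceGetMatchingService(kIOMasterPortDefault,
                                              IOServiceMatching("IOPlatformExpertDevice"))
    guard service != 0 else { return nil }
    defer { IOObjectRelease(service) }

    // "IOPlatformSerialNumber" is the registry key holding the device serial.
    let property = IORegistryEntryCreateCFProperty(service,
                                                   "IOPlatformSerialNumber" as CFString,
                                                   kCFAllocatorDefault, 0)
    return property?.takeRetainedValue() as? String
}
```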
I was also curious about this. I read a tweet that basically hints they exploited IOKit's registries to do it.
It seems they pulled some identifier from IOKit's internals and saved it on their end to identify the device.
https://twitter.com/chronic/status/856250223777206273
I'm developing a custom electronic device - think of it as a special kind of data logger, and I need to connect a computer to it to configure it and to extract the data.
I know I can do this without too much trouble on a PC, but I'd like to use an iOS device to do this.
Two questions:
Can I do this with a regular dock connector / USB cable? Will the EA framework let me do all the communicating?
Once I have extracted the data, what's the best way to get it off the iPad? Email it, save it to Dropbox, or something else?
Thanks!
AFAIK, you need to join the MFi program to make USB accessories for iPad/iPhone. That will give you all the technical resources needed.
As for data transfer, there are only "opinions"; I say the more sending options, the better. Just don't force the user to choose more than once; make it changeable in settings instead.
If you're doing very light communication, you might be able to get away with using the headphone jack.
Apps communicate with the headphone port through the various audio frameworks on iOS. AVFoundation is a high-level, abstract framework for various audio operations, but for fine-tuning the communication to a device over this interface, you will likely end up using the C-language, callback-based Audio Queue Services API to do audio I/O.
This is nice because your device can be cross-platform (iOS, Android, Mac/PC) as long as you write the corresponding software, and because you don't need to go through Apple's MFi approval program. Think of the Square credit card reader.
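As a rough illustration of this approach (using the simpler AVFoundation route rather than the Audio Queue Services API described above), the sketch below taps the microphone/headset input with AVAudioEngine. The HeadphoneJackLink name and the decoding step are placeholders, and the app needs an NSMicrophoneUsageDescription entry in Info.plist:

```swift
import AVFoundation

// Sketch: pull raw PCM samples from the mic/headset input. A production
// Square-style "audio modem" would demodulate these buffers (and likely use
// Audio Queue Services or a render callback for tighter latency control).
final class HeadphoneJackLink {
    private let engine = AVAudioEngine()

    func start() throws {
        let audioSession = AVAudioSession.sharedInstance()
        try audioSession.setCategory(.playAndRecord, mode: .default, options: [])
        try audioSession.setActive(true)

        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        // Each buffer holds raw samples coming from the device plugged into the jack.
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            guard let samples = buffer.floatChannelData?[0] else { return }
            // Decode your protocol here, e.g. detect FSK tones frame by frame.
            _ = samples
        }
        try engine.start()
    }

    func stop() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
    }
}
```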
You will have to write the communication stack between the device and your iOS device, but yes, you can.
There are very few docs about using the EA framework. All the juicy parts are in the MFi program, but Apple is very strict about granting access to it.
So if you succeed, sharing a tutorial will make you an EA hero ;)
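To give an idea of what the External Accessory side looks like, here is a hedged sketch. The com.example.datalogger protocol string and the LoggerLink class are made up; the real protocol string must match what the (MFi-approved) accessory declares and what you list under UISupportedExternalAccessoryProtocols in Info.plist:

```swift
import ExternalAccessory

// Sketch: open a session to an accessory connected over the dock/Lightning port.
final class LoggerLink: NSObject, StreamDelegate {
    private var session: EASession?

    func connect() {
        let accessories = EAAccessoryManager.shared().connectedAccessories
        guard let accessory = accessories.first(where: {
            $0.protocolStrings.contains("com.example.datalogger")
        }) else { return }

        session = EASession(accessory: accessory, forProtocol: "com.example.datalogger")

        let streams: [Stream?] = [session?.inputStream, session?.outputStream]
        for stream in streams.compactMap({ $0 }) {
            stream.delegate = self
            stream.schedule(in: .current, forMode: .default)
            stream.open()
        }
    }

    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        switch eventCode {
        case .hasBytesAvailable:
            // Read the logger's responses / data dump here.
            break
        case .hasSpaceAvailable:
            // Write configuration commands here.
            break
        default:
            break
        }
    }
}
```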
As for sharing your data, IMHO email + CSV is a winning combo.
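For the email + CSV route, a minimal sketch using MFMailComposeViewController (the subject, file name, and helper function are illustrative):

```swift
import UIKit
import MessageUI

// Attach exported CSV data to a mail compose sheet. The presenting view
// controller must also adopt MFMailComposeViewControllerDelegate and dismiss
// the sheet in mailComposeController(_:didFinishWith:error:).
func presentCSVEmail(csv: Data,
                     from viewController: UIViewController & MFMailComposeViewControllerDelegate) {
    guard MFMailComposeViewController.canSendMail() else { return }

    let mail = MFMailComposeViewController()
    mail.mailComposeDelegate = viewController
    mail.setSubject("Data logger export")
    mail.addAttachmentData(csv, mimeType: "text/csv", fileName: "export.csv")
    viewController.present(mail, animated: true)
}
```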
If you want to plug something into the dock connector, you want to have a look at https://developer.apple.com/programs/mfi/
Is it possible that Apple does, or will, provide an API for Siri? It would be great if I could be sipping my coffee and say,
User: Hey Siri, could you please open Angry Birds, Level 4, and throw the first bird for me? Make sure you hit at least one green pig or it's coming out of your paycheck.
Siri: Yes sure, I will do that for you.
Is this possible? And do you think Apple will provide this to us?
THIS IS NO LONGER ACCURATE:
There is no API and there is no indication of it changing anytime soon. There are private headers that you can look at by decompiling the SDK. This is a great synopsis:
Quora
You can be clever like Remember The Milk, though; this is as close as it gets:
http://www.rememberthemilk.com/services/siri/
In iOS 10, Apple announced an API for Siri called SiriKit. However, you can only use it as an app extension, and only if your app implements one of the following types of services:
Audio or video calling
Messaging
Payments
Searching photos
Workouts
Ride booking
Climate and radio
SiriKit is a way for you to make your content available through Siri. It also lets you add support for your services to the Maps app. To support SiriKit, you use the Intents framework and Intents UI framework to implement one or more extensions that you then include inside your iOS app. When the user requests specific types of services through Siri or Maps, the system uses your extensions to provide those services.
This means SiriKit cannot be used for the scenario mentioned in the question, or in the ways many of us would like.
Source: Apple Docs for SiriKit
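To make the restriction concrete, here is a hedged sketch of what SiriKit does let you do: an Intents extension for the Messaging domain, one of the services listed above. The handler class and its logic are illustrative only:

```swift
import Intents

// Sketch of a SiriKit Intents extension handling "send a message" requests.
// This runs as an app extension, not as general-purpose Siri scripting.
final class IntentHandler: INExtension, INSendMessageIntentHandling {

    // handle(intent:completion:) is required; resolve/confirm steps are optional.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the message content off to the containing app's messaging service here.
        let response = INSendMessageIntentResponse(code: .success, userActivity: nil)
        completion(response)
    }

    // Ask Siri to prompt for the message text if it is missing.
    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))
        } else {
            completion(.needsValue())
        }
    }
}
```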
When the iPhone was first released, there was absolutely no public talk from Apple about custom app development. The delayed release of the SDK gave them plenty of time to get public feedback on the iPhone user experience and make the SDK ready for public use.
It seems likely that they're taking a similar approach with Siri.
Not yet. If you want it, file a feature request at bugreport.apple.com, and briefly describe what you want it for. The more people ask for it, the more likely it is to happen.