Is offline dictation officially supported by Apple?

I cannot seem to find any official documentation on this. This is what I have found so far using not-so-reliable means:
Voice to text used to require an internet connection prior to iOS 9.
In iOS 9, Apple experimented with offline dictation for certain devices.
In a later revision of iOS 9, Apple removed offline dictation.
Apple re-introduced offline dictation in iOS 10. I have checked this on about a dozen iOS 10 devices: the ones with an A9 processor show this option and the ones without don't.
But I would like to get some official documentation on this before telling our clients that iOS now supports offline voice to text.

According to https://support.apple.com/en-us/HT208343
On iPhone 6s or later, iPhone SE, and iPad Pro, you can dictate without being connected to the Internet. Earlier models of iPhone and iPad require an Internet connection.
So you are correct that devices with an A9 or later processor can use dictation offline.
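If you need a programmatic answer inside your own app (as opposed to the system keyboard's dictation), the closest public API I know of is the Speech framework's on-device flag. Note this covers SFSpeechRecognizer, not keyboard dictation, and it only exists on iOS 13 and later; a minimal sketch:

import Speech

// Checks the Speech framework's in-app recognizer, NOT the system
// keyboard's dictation feature. supportsOnDeviceRecognition requires
// iOS 13 or later; on earlier systems this simply returns false.
func canRecognizeSpeechOffline(locale: Locale = Locale(identifier: "en-US")) -> Bool {
    guard let recognizer = SFSpeechRecognizer(locale: locale) else { return false }
    if #available(iOS 13.0, *) {
        return recognizer.supportsOnDeviceRecognition
    }
    return false
}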

Related

How to use Sinch VOIP with iOS 13 or higher?

I have an app on Android and iOS that uses Sinch video calls, but since the iOS 13 launch the video that iOS 13 sends is bad; it looks like static. I've discovered that if I use the iPhone in landscape orientation during the video call, the transmitted video is normal. This problem happens even with the Sinch demo. Any ideas on how to solve this?
I've tried: using multiple connections, updating the framework to the latest version, testing on multiple devices, and modifying the frame from the callAsyncLocalVideoFrameHandler.
I expect to be able to send good video from devices with iOS 13 or higher and iPadOS.
We are aware of the issue; it's caused by a compatibility issue between iOS 13's H.264 codec and the WebRTC version we use in our SDK.
We are working to fix that. There will be a new major iOS SDK release to support iOS 13; at the moment we do not recommend that customers release new versions of their apps with iOS 13/Xcode 11 support.
Your current apps will still work on devices running iOS <12 and the new iOS 13, as long as they are not built with Xcode 11.
iOS 13 also brings new requirements for using VoIP Push on Apple devices; this will change how customers integrate any VoIP SDK, if you use VoIP push notifications (see the sketch below).
See note and links on our website.
https://www.sinch.com/docs/resources/downloads/index_vvv.html#info-ios-13-voip-push-changes
Jorge Siqueira - Sinch Voice & Video.
I also see problems with iOS 13 and Sinch video calls. I have tried going back and building with Xcode 10.3, but that does not help either. iOS 12 still works, as you say.
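For reference, the iOS 13 VoIP Push change mentioned in the answer boils down to this: every incoming VoIP push must be reported to CallKit before the PushKit completion handler runs, or the system will terminate the app. A minimal sketch of that requirement follows; the app name and handle value are placeholders, and a real integration would also implement CXProviderDelegate and hand the call over to the Sinch SDK:

import PushKit
import CallKit

// Sketch of the iOS 13 VoIP push requirement: report every incoming
// VoIP push to CallKit before calling the completion handler.
// "MyApp" and the handle value are placeholders.
final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    private let registry = PKPushRegistry(queue: .main)
    private let provider = CXProvider(configuration: CXProviderConfiguration(localizedName: "MyApp"))

    func start() {
        registry.delegate = self
        registry.desiredPushTypes = [.voIP]
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Forward pushCredentials.token to the VoIP backend here.
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // On iOS 13 the call must be reported to CallKit right here,
        // before completion() runs, or the app gets terminated.
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "caller")
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in
            completion()
        }
    }
}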

iOS 9 and 10, Bluetooth connection with a Bluetooth 3 device

I have a request to implement an iOS application which will be able to communicate with a Bluetooth device. The chip on that device has Bluetooth 3. The app will be privately distributed, so we don't need App Store review and we can use anything; the only requirement is no iPhone jailbreaking. The purpose of the application is reading diagnostic data from the device and sending a few commands to it, like turn left, turn right, setValue. Nothing special (it is a special valve which can be remotely controlled).
Now, if I have learned correctly, the options for using Bluetooth on iOS are:
CoreBluetooth: library for Bluetooth 4.0 LE devices. It cannot be used, because the Bluetooth chip is 3.0.
ExternalAccessory: the device must be MFi certified if this option is to be used. It is unlikely that the hardware producer will go through MFi certification, so at the moment this is not an option (a sketch of what this path would look like follows below).
Bluetooth.framework: a private framework which sounded like a good option (I managed to make it run on iOS 9 and iOS 10), but communication with devices no longer works on iOS 10. On iOS 9.3.5 on an iPhone 5 I made it work, but iOS 9 only is not an option.
BTstack: available via Cydia, but if I have learned correctly this package requires jailbreaking the iPhone. Sadly, again not an option.
The situation clearly does not look good here. I have also read rumors that it might be possible to use the HID Bluetooth profile for communication (has anyone tried that?). Is there any C Bluetooth library which could run on an iOS device and work with iOS 10?
Thanks for any ideas.
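For completeness, if the vendor ever did complete MFi certification, the ExternalAccessory path would look roughly like this; the protocol string "com.example.valve" is a placeholder, not a real identifier:

import ExternalAccessory

// Lists connected MFi accessories and picks the one declaring the
// (placeholder) protocol string the valve would expose.
// Reads and writes would then go through
// EASession(accessory:forProtocol:) and its streams.
func findValveAccessory() -> EAAccessory? {
    return EAAccessoryManager.shared().connectedAccessories
        .first { $0.protocolStrings.contains("com.example.valve") }
}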

iOS CLLocation difference between device models

I'm using the Google Maps API for iOS to essentially highlight a width of coverage using the GMSPolygon.
Running the iOS Simulator, I get odd behavior. Using an iOS 9.3 build for all of the testing, the map properly highlights coverage using the location simulated by the Simulator on the iPhone 5S, 6, and newer.
You can see this in this screen capture that I did: 5S Highlighting
Now, when using the iPhone 4S and 5, it does not draw the GMSPolygons! iPhone 5 Not Highlighting
A few things to note: I made sure that location permissions were authorized in all testing, and the same "City Run" simulation was used in all testing. Again, the highlighting works fine on the 5S and newer devices running 9.3, but not on the 4S and 5 using 9.3. Is there a fundamental API change in CLLocationManager between the builds for these devices?
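For context, this is roughly how such a coverage polygon gets built and attached with the Google Maps SDK for iOS; the coordinates, offset, and colors below are placeholders rather than the asker's actual code:

import GoogleMaps
import CoreLocation
import UIKit

// Builds a simple rectangular "coverage" polygon around a center point
// and attaches it to the map. The 0.001-degree offset is a placeholder.
func addCoveragePolygon(around center: CLLocationCoordinate2D, on mapView: GMSMapView) {
    let offset = 0.001
    let path = GMSMutablePath()
    path.add(CLLocationCoordinate2D(latitude: center.latitude - offset, longitude: center.longitude - offset))
    path.add(CLLocationCoordinate2D(latitude: center.latitude - offset, longitude: center.longitude + offset))
    path.add(CLLocationCoordinate2D(latitude: center.latitude + offset, longitude: center.longitude + offset))
    path.add(CLLocationCoordinate2D(latitude: center.latitude + offset, longitude: center.longitude - offset))

    let polygon = GMSPolygon(path: path)
    polygon.fillColor = UIColor.blue.withAlphaComponent(0.2)
    polygon.strokeColor = .blue
    polygon.strokeWidth = 2
    polygon.map = mapView   // nothing is drawn until the polygon is attached to a map
}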
As discussed in the Simulator User Guide, there are some hardware and API differences in Simulator which may affect your app when testing in Simulator.
In addition to that, please note these hardware features that are not simulated as of iOS 8.2:
Motion support (accelerometer and gyroscope)
Audio and video input (camera and microphone)
Proximity sensor
Barometer
Ambient light sensor
Aside from those, there are also API differences, wherein Simulator APIs don't have all the features that are available on a device, such as these:
Receiving and sending Apple push notifications
Privacy alerts for access to Photos, Contacts, Calendar, and Reminders
The UIBackgroundModes key
Handoff support
Please try going through the Simulator User Guide for more information.
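If you want to isolate whether the older-device simulators deliver simulated location updates at all (as opposed to a drawing problem in the Google Maps layer), a small diagnostic delegate along these lines can help; the class name and log output are only illustrative:

import CoreLocation

// Logs every simulated location update. If updates arrive here on the
// iPhone 4S/5 simulators but the GMSPolygon still never appears, the
// problem is in the map-drawing code rather than in CLLocationManager.
final class LocationProbe: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        for location in locations {
            print("update:", location.coordinate.latitude, location.coordinate.longitude)
        }
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        print("location error:", error)
    }
}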

Is UIDevice multitaskingSupported checking necessary nowadays?

I checked this wiki and learned that multitasking has been supported since the 3rd-generation iOS devices (the iPod touch (3rd generation) and iPhone 3GS, I guess).
So that means all iOS devices nowadays support multitasking, since the deployment target is 4.3 and the devices running that version of iOS are newer than "the 3rd generation".
So there is no need to check [UIDevice currentDevice].multitaskingSupported, am I right?
You are correct. The last iPhone that didn't support multitasking was the iPhone 3G.
And since new apps these days shouldn't support anything older than iOS 7 (or, rarely, iOS 6), there is no reason to check for multitasking.
And Apple only accepts apps that support iOS 4.3 or later. Any device that didn't support multitasking could never be updated to 4.3.
Any device a modern app will be running on these days will support multitasking.
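For completeness, this is the check being discussed; on any hardware capable of running iOS 4.3 or later the property is always true, so any fallback branch behind it is effectively dead code:

import UIKit

// The legacy multitasking check. On every device that can run iOS 4.3+
// this returns true, so modern apps can simply drop the check.
func multitaskingIsAvailable() -> Bool {
    return UIDevice.current.isMultitaskingSupported
}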

iOS 7.1 update breaks External Accessory inputStream UART read ability

We have an MFi-approved accessory device. Our protocol assumes continuous reads from the accessory in UART mode. It was working perfectly before iOS 7.1, but after testing on iOS 7.1 it is not working properly.
In the ATS test suite log we can see successful eap records and some AccessoryDataTransfer acked by the iPhone.
We can reproduce the problem using EADemo. EADSessionController stops reading data after some open/close cycles.
Only killing EADemo and relaunching allows us to read some data.
Configuration:
iPhone 4 and newer (iOS 7.0), any iPod (iOS 6.x) - OK
iPhone 4s, iPhone 5 (iOS 7.1) - Failed
What happens with EA framework in iOS 7.1?
Apple just released update 7.1.2, claiming they solved the issue:
"• Fixes a bug with data transfer for some 3rd party accessories, including bar code scanners"
