In Apple's iOS 13 feature list page they have the following blurb:
Image Capture API
The Image Capture API allows developers to leverage the Camera
Connection Kit to import photos directly into their apps.
I've been looking but I can't seem to find any actual documentation about this change, and where it exists in the API. I also remember hearing a second or two talk about it in the keynote/state of the union in WWDC 19, but again no details in any session I've found so far.
It seems like you would be able to plug a camera or its SD card into the USB-C/Lightning port on the iOS device and access it from within a third-party app. I know you can import to the system photo library, but that has been around for years. I also know about the ExternalAccessory framework for MFi hardware, but I don't see any significant changes to it, and it doesn't seem to expose the described functionality.
I do see that UIDocumentPicker can be shown, and it allows the user to select a location that may be on a connected USB device. While that could work, it's not camera specific and would be quite error prone if the user doesn't select a valid camera location.
Does anybody know where I can find more info about this change, or how you can programmatically access the camera's filesystem? The camera will have the standard camera folder structure (DCIM and so on), so it is recognized as a camera filesystem by many Mac apps.
You're looking for the ImageCaptureCore framework. This is the same framework that exists on macOS for importing from SD Cards and Cameras. It is now available in iOS 13.2.
Update:
The ImageCaptureCore API is now working as of iOS 13.2.
However, be warned that as of iOS/iPadOS 13.1 Beta 3 (17A5837a) I have not been able to get it working yet (reported to Apple FB6799036). It is now listed with an asterisk on the iPadOS Features page indicating that it will be "Coming later this year".
I'm able to start an ICDeviceBrowser, but I see permissions errors when a device is connected and don't get any delegate messages. So there may be some permission or entitlement that is needed before it starts working.
Unfortunately there is no documentation or sample code (even for macOS) on Apple's developer site. But the framework does exist in the iOS 13 SDK and you can look at the header files there.
We use this framework in our macOS app and using just the headers to figure things out isn't too bad. You'd start by creating an ICDeviceBrowser (ICDeviceBrowser.h), setting its delegate, and then starting the browser:
@interface CameraManager : NSObject <ICDeviceBrowserDelegate>
{
    ICDeviceBrowser* _deviceBrowser;
}
@end

@implementation CameraManager

- (id)init
{
    self = [super init];
    if (self) {
        _deviceBrowser = [[ICDeviceBrowser alloc] init];
        _deviceBrowser.delegate = self;
        [_deviceBrowser start];
    }
    return self;
}

...

@end
You should then start receiving delegate messages when a camera device is connected:
- (void)deviceBrowser:(ICDeviceBrowser*)browser didAddDevice:(ICDevice*)addedDevice moreComing:(BOOL)moreComing;
- (void)deviceBrowser:(ICDeviceBrowser*)browser didRemoveDevice:(ICDevice*)removedDevice moreGoing:(BOOL)moreGoing;
When you get a didAddDevice: message you'll then want to use the ICDevice (ICDevice.h) and ICCameraDevice (ICCameraDevice.h) APIs to set a delegate and start a session. Once the session has started you'll start receiving delegate messages:
- (void)deviceBrowser:(ICDeviceBrowser*)browser didAddDevice:(ICDevice*)addedDevice moreComing:(BOOL)moreComing
{
    if ((addedDevice.type & ICDeviceTypeMaskCamera) == ICDeviceTypeCamera)
    {
        ICCameraDevice* camera = (ICCameraDevice *) addedDevice;
        camera.delegate = self;
        [camera requestOpenSession];
        // probably want to save 'camera' to a member variable
    }
}
You can use the delegate method:
- (void)cameraDevice:(nonnull ICCameraDevice *)camera
         didAddItems:(nonnull NSArray<ICCameraItem *> *)items;
To get a list of items as they are enumerated by the API or wait for:
- (void)deviceDidBecomeReadyWithCompleteContentCatalog:(ICDevice*)device;
And then use the .contents property on the ICCameraDevice to get all of the contents.
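For illustration, here is a minimal Swift sketch of walking the catalog once it is complete. It assumes a camera saved from the didAddDevice: callback above; exact Swift signatures may vary between SDK versions.

import ImageCaptureCore

// Walk the item tree reported by the camera once the complete content
// catalog is available (after deviceDidBecomeReadyWithCompleteContentCatalog:).
func logContents(of camera: ICCameraDevice) {
    for item in camera.contents ?? [] {
        if let file = item as? ICCameraFile {
            print("File: \(file.name ?? "?"), \(file.fileSize) bytes")
        } else if let folder = item as? ICCameraFolder {
            print("Folder: \(folder.name ?? "?")")
        }
    }
}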
From there you can use the ICCameraDevice to request thumbnails, metadata, and to download specific files. I'll leave that as an exercise for the reader.
As I mentioned above, this doesn't seem to be working in iOS/iPadOS 13.1 Beta 3. Hopefully this will all start working soon, as I'd really like to start testing it myself.
This is now working in iOS 13.2.
According to these SO questions: UIImagePickerController not asking for permission and No permission to pick a photo from the photo library
If you want to select a single image on iOS, you don't have to ask for permission to do it, as the app doesn't actually access the gallery.
However, I can't find a way of doing this in Flutter. Packages like ImagePicker always ask for permission.
Has anyone succeeded in picking an image in Flutter on iOS without asking for permission?
From Apple's documentation:
PHPickerViewController is a new picker that replaces UIImagePickerController. Its user interface matches that of the Photos app, supports search and multiple selection of photos and videos, and provides fluid zooming of content. Because the system manages its life cycle in a separate process, it’s private by default. The user doesn’t need to explicitly authorize your app to select photos, which results in a simpler and more streamlined user experience.
This library uses PHPickerViewController as seen here
The old UIImagePickerController allowed this on older iOS versions, but it has been deprecated since iOS 14.
The Flutter ImagePicker plugin uses PHPickerViewController in its iOS code (I checked their code on GitHub), and it allows the user to pick an image without requesting permissions. I highly recommend using that plugin.
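For reference, here is a minimal sketch of the native API the plugin wraps, PHPickerViewController (iOS 14+); the class and delegate names are from PhotosUI, the rest is illustrative.

import PhotosUI
import UIKit

// Present the system photo picker; no photo-library permission prompt is
// shown because the picker runs in a separate process.
class PickerPresenter: NSObject, PHPickerViewControllerDelegate {
    func present(from viewController: UIViewController) {
        var config = PHPickerConfiguration()
        config.filter = .images       // only offer images
        config.selectionLimit = 1     // single selection
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        viewController.present(picker, animated: true)
    }

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        guard let provider = results.first?.itemProvider,
              provider.canLoadObject(ofClass: UIImage.self) else { return }
        provider.loadObject(ofClass: UIImage.self) { image, _ in
            // Use the picked UIImage here.
        }
    }
}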
Try file_picker. It should work for you, as it supports all platforms (including iOS and macOS) and various file types; you can specify custom file types, limit the number of selections, and pick files from the cloud (Google Drive, Dropbox, iCloud).
First, add the latest file_picker as a dependency in your pubspec.yaml file.
Import it in whichever file you want to use it in, and you are good to go.
For picking a single file, use this code:
import 'dart:io';
import 'package:file_picker/file_picker.dart';

FilePickerResult? result = await FilePicker.platform.pickFiles();
if (result != null) {
  File file = File(result.files.single.path!);
} else {
  // User canceled the picker
}
Files with an extension filter:
FilePickerResult? result = await FilePicker.platform.pickFiles(
  type: FileType.custom,
  allowedExtensions: ['jpg', 'pdf', 'doc'],
);
You can find detailed usage examples and the full documentation on the file_picker package page on pub.dev.
I am trying to create an iPhone app that records not only the app's screen but, if put into the background, everything on the screen, including other apps. This is how recording from Control Center works. The difference is that I want to get access to the video immediately without user intervention, with the user's consent of course.
I've implemented code using ReplayKit2 on iOS 12 that uses an embedded Broadcast Upload Extension. I have not found any examples online that work like this.
I posted the code on Bitbucket: https://bitbucket.org/breelig/replaykitbroadcasttofile/src/master/
The closest similar question I found on SO is: ReplayKit stops screen recording in background mode of the application or outside the app?
Update
Based on the good responses by @KaneCheshire and @AndreyA. below, and other random sources, I was able to develop a solution that works. Please see the code in my Bitbucket link above.
From the docs:
Apps on a user's device can share the recording function, with each app having its own instance of RPScreenRecorder. Your app can record the audio and video inside of the app, along with user commentary through the microphone.
The only other way to record the screen is through a Broadcast Upload Extension, which requires the user to initiate it through Control Center.
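For the in-app case the docs describe, a minimal RPScreenRecorder sketch (error handling elided) might look like this:

import ReplayKit

// In-app recording only: this captures your own app's screen, not other apps.
let recorder = RPScreenRecorder.shared()
recorder.isMicrophoneEnabled = true  // include user commentary
recorder.startRecording { error in
    if let error = error {
        print("Could not start recording: \(error)")
    }
}

// Later, stop and offer the user a preview to save or share:
recorder.stopRecording { previewController, error in
    // Present previewController (an RPPreviewViewController) if non-nil.
}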
I faced almost the same problem you did, and it's absolutely lacking any kind of guide or documentation.
The way I resolved it was to set my preferred extension to nil, which makes RPSystemBroadcastPickerView show all available extensions, including the system screen recorder:
override func viewDidLoad() {
    super.viewDidLoad()
    let broadcastPicker = RPSystemBroadcastPickerView(frame: CGRect(x: 100, y: 100, width: 80, height: 80))
    broadcastPicker.preferredExtension = nil
    view.addSubview(broadcastPicker)
}
I have also found one thing that turned out to be useful: the Twilio lib and its example, https://github.com/twilio/video-quickstart-swift/tree/master/ReplayKitExample. These guys have done decent work in the area of video/audio capturing, and we can learn from their experience.
Your exact preferredExtension is the broadcast upload extension's bundle identifier. When you set pickerView.preferredExtension to exactly that bundle identifier, your app will be shown in the recording app list, as sketched below. Hope this helps!
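A minimal sketch (the bundle identifier below is a hypothetical placeholder for your own extension's identifier):

import ReplayKit
import UIKit

class BroadcastViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 100, y: 100, width: 80, height: 80))
        // Hypothetical bundle identifier of your broadcast upload extension;
        // with this set, the picker shows only your extension.
        picker.preferredExtension = "com.example.MyApp.BroadcastExtension"
        view.addSubview(picker)
    }
}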
There is some API that allows third-party apps to be displayed as an option in the iOS native Phone app:
As one can see, there are Viber and WhatsApp, and selecting one will open the third-party app.
Which API is being used here? I have no clue what to search for in order to get information on how to integrate my VoIP app with the native iPhone app. I suppose it is some kind of extension. Any help on keywords to search for, and any examples, is much appreciated.
Here is how to activate this option in the native apps:
First, one has to use the CallKit framework. iOS will generate those options (called handles) automatically. The property responsible for that is supportedHandleTypes with the CXHandleTypePhoneNumber handle type.
CXProviderConfiguration *config = [[CXProviderConfiguration alloc] initWithLocalizedName:NSLocalizedString(@"myAppName", @"")];
config.supportedHandleTypes = [NSSet setWithObjects:@(CXHandleTypePhoneNumber), nil];
The handle in the native app will appear after this configuration is applied. I called this in my app delegate:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
If the video handle should appear as well, the configuration should include:
config.supportsVideo = YES;
There are two other handle types, CXHandleTypeGeneric and CXHandleTypeEmailAddress, but I could not figure out what changes those handles cause. Please do comment if you know what they do.
PS: I was using CXHandleTypeGeneric before, and I could not see handles for my app. As soon as I changed it to CXHandleTypePhoneNumber, the handle was added automatically in the contact details.
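For reference, the same configuration sketched in Swift ("MyVoIPApp" is a placeholder name):

import CallKit

// Configure a CXProvider whose phone-number handle shows up in the
// native contact details, with video calling offered as well.
func makeProvider() -> CXProvider {
    let config = CXProviderConfiguration(localizedName: "MyVoIPApp")
    config.supportedHandleTypes = [.phoneNumber]  // adds the handle to contact details
    config.supportsVideo = true                   // also offer a video-call handle
    return CXProvider(configuration: config)
}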
I am writing an app that includes text-to-speech using AVSpeechSynthesizer. The code for generating the utterance and using the speech synthesizer has been working fine.
let utterance = AVSpeechUtterance(string: text)
utterance.voice = currentVoice
speechSynthesizer.speak(utterance)
Now with iOS 11, I want to match the voice to the one selected by the user in the phone's Settings app, but I do not see any way to get that setting.
I have tried getting the list of installed voices and looking for one that has a quality of .enhanced, but sometimes there is no enhanced voice installed, and even when there is, it may or may not be the voice selected by the user in the Settings app.
static var enhanced: AVSpeechSynthesisVoice? {
    for voice in AVSpeechSynthesisVoice.speechVoices() {
        if voice.quality == .enhanced {
            return voice
        }
    }
    return nil
}
The questions are twofold:
How can I determine which voice has been selected by the user in the Settings app?
Why on some iOS 11 phones that are using the new Siri voice am I not finding an "enhanced" voice installed?
I suppose that if there were a method available for selecting the same voice as in the Settings app, it would be shown in the documentation for the AVSpeechSynthesisVoice class under the Finding Voices topic. Jumping to the definition of AVSpeechSynthesisVoice in code, I couldn't find any other methods to retrieve voices.
Here is my workaround for getting an enhanced voice in the app I am working on:
Enhanced versions of voices are probably not present on new iOS devices by default in order to save storage. Iterating through the available voices on my brand-new iPhone, I only found default-quality voices, such as: [AVSpeechSynthesisVoice 0x1c4e11cf0] Language: en-US, Name: Samantha, Quality: Default [com.apple.ttsbundle.Samantha-compact]
I found this article on how to enable additional VoiceOver voices and downloaded the one named "Samantha (Enhanced)" among them. Checking the list of available voices again, I noticed the following addition:
[AVSpeechSynthesisVoice 0x1c4c03060] Language: en-US, Name: Samantha (Enhanced), Quality: Enhanced [com.apple.ttsbundle.Samantha-premium]
I was then able to select an enhanced voice in Xcode. Given that the AVSpeechSynthesisVoice.currentLanguageCode() method exposes the currently selected language, I ran the following code to select the first enhanced voice I could find, or the available default if no enhanced version was present. (The code below is from a custom VoiceOver class I am creating to handle all speech in my app; the snippet updates its voice variable.)
var voice: AVSpeechSynthesisVoice!

for availableVoice in AVSpeechSynthesisVoice.speechVoices() {
    // Look for the enhanced version of the currently selected language voice
    // among the available voices; usually there is only one.
    if availableVoice.language == AVSpeechSynthesisVoice.currentLanguageCode() &&
       availableVoice.quality == AVSpeechSynthesisVoiceQuality.enhanced {
        self.voice = availableVoice
        print("\(availableVoice.name) selected as voice for uttering speeches. Quality: \(availableVoice.quality.rawValue)")
    }
}
if let selectedVoice = self.voice {
    // Successfully unwrapped: the loop above found an enhanced voice.
    print("The following voice identifier has been loaded: ", selectedVoice.identifier)
} else {
    // No enhanced voice was found: load any voice that matches the device's
    // current language selection.
    self.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
}
I am also hoping Apple will expose a method to directly load the selected voice, but I hope this workaround can serve you in the meantime. I guess Siri's enhanced voice is downloaded on the go, so maybe this is the reason it takes so long to answer my voice commands :)
Best regards.
It looks like the new Siri voice in iOS 11 isn't part of the AVSpeechSynthesis API, and isn't available to developers.
In macOS 10.13 High Sierra (which also gets the new voice), there seems to be a new SiriTTS framework that's probably related to this functionality, but it's in PrivateFrameworks so it doesn't have a developer API.
I'll try to provide a more detailed answer. AVSpeechSynthesizer cannot use the Siri voice. Apple has locked this voice down to ensure privacy, as a malicious app could impersonate Siri and get private information that way.
Apple hasn't changed this for years, but there is an ongoing initiative regarding it. We already know that privacy-sensitive features in iOS can be accessed with user permission, and there is no reason the Siri voice couldn't be accessed with user permission as well. You may vote for this to happen using this petition, and with some hope Apple may implement it in the future: https://www.change.org/p/apple-apple-please-allow-3rd-party-apps-to-use-siri-voices-for-improved-accessibility
This SO post addresses how to customize the UIActivityViewController by excluding services like AirDrop or printing.
It also mentions this Apple doc, which highlights the stock services supported, but how do we identify other supported endpoints like Line and other messaging apps?
Specifically:
(1) Do Skype, Kakao, Line, Viber, WeChat, Kik, WhatsApp, and Facebook Messenger (not Facebook proper) have endpoints?
(2) What are those endpoints?
You can't do that currently on iOS 7, because no application can talk directly to other applications yet, for security reasons. One of the highlights of the last WWDC was the introduction of extensions in iOS 8, which will make this possible; you can read how in the Creating Action Extensions example.
There are however attempts at fixing this. A notable example is IntentKit, which works by having a repository of known apps.
What is IntentKit?
IntentKit is an open-source iOS library that makes it easier to link to other apps. It's sort of like Android's Intents or Windows Phone's Contracts.
Another example of such an attempt is OvershareKit.
Why OvershareKit?
Sharing is far too cumbersome to implement on iOS. UIActivityViewController is too limiting, and rolling your own library is too time-consuming. Most devs end up settling for underwhelming sharing options for lack of the time or inclination to make something better.
OvershareKit makes it trivial to add rich sharing options to your iOS apps.
How to know if an application is installed?
Even though you can't discover them, if you know the application you're looking for and what kind of URL scheme it responds to, you can check whether your app is able to open that kind of URL.
That's what IntentKit is for: it's a repository of knowledge about applications, the URL schemes they respond to, and the kinds of actions they can perform.
For example, you can check if Facebook is installed by checking if you can open a fb:// URL.
BOOL isFacebookInstalled = [[UIApplication sharedApplication] canOpenURL:[NSURL URLWithString:@"fb://"]];
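The same check in Swift, for reference. Note that since iOS 9 the scheme must also be declared in your app's Info.plist for the check to succeed:

import UIKit

// Since iOS 9, "fb" must be listed under LSApplicationQueriesSchemes in
// Info.plist, or canOpenURL(_:) will always return false.
let isFacebookInstalled = UIApplication.shared.canOpenURL(URL(string: "fb://")!)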
About IntentKit's inner workings
Internally, IntentKit will check for that same thing, as you can see in INKActivity's implementation:
- (BOOL)canPerformCommand:(NSString *)command {
    if (!self.actions[command]) { return NO; }
    if (self.presenter) {
        return [self.presenter canPerformAction:command];
    } else {
        NSURL *url = [NSURL URLWithString:[self.actions[command] urlScheme]];
        return [self.application canOpenURL:url];
    }
}
Info about requested UIActivity services:
Skype uses the "skype:" URI, more info in the official documentation
Kakao & Line, with DCActivity (there seems to be an official API for Kakao, but the documentation is in Korean)
Line, with LINEActivity
WeChat, with WeixinActivity (there's also an official API with which you can make your own UIActivity)
WhatsApp uses the "whatsapp:" URI, more info on the official FAQ, there are also various UIActivity implementations for WhatsApp, take a look at them in cocoapods.com
Facebook Messenger uses the "fb-messenger:" URI, more info in this other answer by tia, also see workarounds.
Kik has a public API, but no SDK nor UIActivity implementation that I know of. Also, see workarounds.
Viber has no SDK nor public API, see workarounds.
Workarounds
Most of these services are based on known protocols, or slight variations of them. For example, you can use XMPP (aka Jabber) to directly send messages to a Facebook IM or Kik account; some people say that Viber seems to use a modification of SIP for signaling with VoIP phones. So you could work around some SDK/API limitations by using the underlying mechanisms.
SDK or API?
If all you need is to send a message to those services, I'd argue that you don't really need to communicate with the installed application via an SDK or URL schemes. I haven't been able to test the Big Emoji app you mentioned, as it just crashes on iOS 8, but if it's using the services' APIs, you could easily work it out by using Charles or Wireshark.
Presumably they are adding a bunch of their own custom actions, as described in this answer.
There is no central repository of third-party sharing support before iOS 8. You can check for other apps' presence by using URL schemes. To do this, you'll have to look at each app's documentation, figure out what schemes they accept, and then do something like this:
NSArray* items = /* stuff you want to share */;
NSMutableArray* activities = [NSMutableArray array];

if ([UIApplication.sharedApplication canOpenURL:[NSURL URLWithString:@"whatsapp://url"]])
{
    UIActivity* activity = /* create activity for whatsapp */;
    [activities addObject:activity];
}
if ([UIApplication.sharedApplication canOpenURL:[NSURL URLWithString:@"facebook://url"]])
{
    UIActivity* activity = /* create activity for facebook */;
    [activities addObject:activity];
}
// ... repeat for other services ...

UIActivityViewController *activityVC = [[UIActivityViewController alloc] initWithActivityItems:items applicationActivities:activities];
// show the VC however appropriate.
In addition to @NinoScript's answer, you can find the URL schemes for iOS apps (inside the .plist files) provided by IntentKit, as he mentioned.
Here is a summarized list from the project:
1Password: ophttps://{{{url}}}
Chrome: googlechromes://{{{url}}}
Gmail: googlegmail:///co?to={{recipient}}
Google Maps: comgooglemaps://?q={{query}}
Google+: gplus://plus.google.com/{{userId}}
Safari: http://{{{url}}}
For the full list of URL schemes, search the git project.
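As an illustration, here is a Swift sketch of filling one of the templates above (Google Maps' {{query}} placeholder; the query string is just an example):

import UIKit

// Percent-encode the query and substitute it into the scheme template.
// On iOS 9+, "comgooglemaps" must also be listed under
// LSApplicationQueriesSchemes in Info.plist for canOpenURL(_:) to succeed.
let query = "coffee near me".addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed) ?? ""
if let url = URL(string: "comgooglemaps://?q=\(query)"),
   UIApplication.shared.canOpenURL(url) {
    UIApplication.shared.open(url)
}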