Export audio files via “Open In:” from Voice Memos app - iOS

I have the exact same issue as "Paul" posted here: Can not export audiofiles via "open in:" from Voice Memos App - no answers have yet been posted on this topic.
Essentially what I'm trying to do is simple:
After having recorded a Voice Memo on iOS, I select "Open With" and from the popup that is shown I want to be able to select my app.
I've tried everything I can think of and experimented with LSItemContentTypes without success.
Unfortunately I don't have enough reputation to comment on the existing post above, and I'm getting quite desperate for a solution to this. Any help is hugely appreciated, even just to know whether it's doable or not.
Thanks!

After some experimentation and much guidance from this blog post ( http://www.theappguruz.com/blog/share-extension-in-ios-8 ), it appears that it is possible to do this using a combination of app extensions (specifically an Action Extension) and app groups. I'll describe the first part which will enable you to get your recording from Voice Memos to your app extension. The second part -- getting the recording from the app extension to the containing app (your "main" app) -- can be done using app groups; please consult the blog post above for how to do this.
Create a new target within your project for the app extension, by selecting File > New > Target... from Xcode's menu. In the dialog box that prompts you to "Choose a template for your new target:" choose the "Action Extension" and click "Next".
CAUTION: Do not choose the "Share Extension" as is done in the blog post example above. That approach is more appropriate for sharing with another user or posting to a website.
Fill in the "Product Name:" for your Action Extension, e.g., MyActionExtension. Also, for "Action Type:" I selected "Presents User Interface" because this is the way Dropbox appears to do it. Selecting this option adds a view controller (ActionViewController) and storyboard (Maininterface.storyboard) to your app extension. The view controller is a good place to provide feedback to the user and to give the user an opportunity to rename the audio file before exporting it to your app.
Click "Finish." You will be prompted to "Activate “MyActionExtension” scheme?". Click "Activate" and this new scheme will be made active. Building it will build both the action extension and the containing app.
Click the disclosure triangle for the "MyActionExtension" folder in the Project Navigator (Cmd-0) to reveal the newly-created storyboard, ActionViewController source file(s), and Info.plist. You will need to customize these files for your needs. But for now ...
Build and run the scheme you just created. You will be prompted to "Choose an app to run:". Select "Voice Memos" from the list and click "Run". (You will probably need a physical device for this; I don't think the simulator has Voice Memos on it.) This will build and deploy your action extension (and its containing app) to your device, and then launch "Voice Memos" on your device. If you now make a recording with "Voice Memos" and then attempt to share it, you should see your action extension (with a blank icon) in the bottom row. If you don't see it there, tap on the "More" button in that row and set the switch for your action extension to "On". Tapping on your action extension will just bring up an empty view with a "Done" button. The template code looks for an image file, and finding none does nothing. We'll fix this in the next step.
Edit ActionViewController.swift to make the following changes:
6a. Add import statements for AVFoundation and AVKit near the top of the file. (The Action Extension template should already import MobileCoreServices, which provides the kUTType... identifiers used below; add that import too if it is missing.)
// the next two imports are only necessary because (for our sample code)
// we have chosen to present and play the audio in our app extension.
// if all we are going to be doing is handing the audio file off to the
// containing app (the usual scenario), we won't need these two frameworks
// in our app extension.
import AVFoundation
import AVKit
6b. Replace the entirety of override func viewDidLoad() {...} with the following:
override func viewDidLoad() {
    super.viewDidLoad()

    // Get the item[s] we're handling from the extension context.
    // For example, look for an image and place it into an image view.
    // Replace this with something appropriate for the type[s] your extension supports.
    print("self.extensionContext!.inputItems = \(self.extensionContext!.inputItems)")

    var audioFound: Bool = false
    for inputItem: AnyObject in self.extensionContext!.inputItems {
        let extensionItem = inputItem as! NSExtensionItem
        for attachment: AnyObject in extensionItem.attachments! {
            print("attachment = \(attachment)")
            let itemProvider = attachment as! NSItemProvider
            if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMPEG4Audio as String)
                //|| itemProvider.hasItemConformingToTypeIdentifier(kUTTypeMP3 as String)
                // the audio format(s) we expect to receive and that we can handle
            {
                itemProvider.loadItemForTypeIdentifier(kUTTypeMPEG4Audio as String,
                    options: nil, completionHandler: { (audioURL, error) in
                        NSOperationQueue.mainQueue().addOperationWithBlock {
                            if let audioURL = audioURL as? NSURL {
                                // in our sample code we just present and play the audio in our app extension
                                let theAVPlayer: AVPlayer = AVPlayer(URL: audioURL)
                                let theAVPlayerViewController: AVPlayerViewController = AVPlayerViewController()
                                theAVPlayerViewController.player = theAVPlayer
                                self.presentViewController(theAVPlayerViewController, animated: true) {
                                    theAVPlayerViewController.player!.play()
                                }
                            }
                        }
                })
                audioFound = true
                break
            }
        }
        if (audioFound) {
            break // we only handle one audio recording at a time, so stop looking for more
        }
    }
}
6c. Build and run as in the previous step. This time, tapping on your action extension will bring up the same view controller as before but now overlaid with the AVPlayerViewController instance containing and playing your audio recording. Also, the two print() statements I've inserted in the code should give output that looks something like the following:
self.extensionContext!.inputItems = [<NSExtensionItem: 0x127d54790> - userInfo: {
NSExtensionItemAttachmentsKey = (
"<NSItemProvider: 0x127d533c0> {types = (\n \"public.file-url\",\n \"com.apple.m4a-audio\"\n)}"
);
}]
attachment = <NSItemProvider: 0x127d533c0> {types = (
"public.file-url",
"com.apple.m4a-audio"
)}
Make the following changes to the action extension's Info.plist file:
7a. The Bundle display name defaults to whatever name you gave your action extension (MyActionExtension in this example). You might wish to change this to Save to MyApp. (By way of comparison, Dropbox uses Save to Dropbox.)
7b. Insert a line for the key CFBundleIconFile, set its Type to String (2nd column), and set its value to MyActionIcon or some such. You will then need to provide the corresponding 5 icon files. In our example, these would be: MyActionIcon.png, MyActionIcon@2x.png, MyActionIcon@3x.png, MyActionIcon~ipad.png, and MyActionIcon@2x~ipad.png. (These icons should be 60x60 points for iPhone and 76x76 points for iPad. Only the alpha channel is used to determine which pixels are gray; the RGB channels are ignored.) Add these icon files to your app extension's bundle, NOT the containing app's bundle.
7c. At some point you will need to set the value for the key NSExtension > NSExtensionAttributes > NSExtensionActivationRule to something other than TRUEPREDICATE. If you want your action extension to only be activated for audio files, and not for video files, pdf files, etc., this is where you would specify such a predicate.
The above takes care of getting the audio recording from Voice Memos to your app extension. Below is an outline of how to get the audio recording from the app extension to the containing app. (I'll flesh it out later, time permitting.) This blog post ( http://www.theappguruz.com/blog/ios8-app-groups ) might also be useful.
Set up your app to use App Groups. Open the Project Navigator (Cmd-0) and click on the first line to show your project and targets. Select the target for your app, click on the "Capabilities" tab, look for the App Groups capability, and set its switch to "On". Once the various entitlements have been added, click on the "+" sign to add your App Group, giving it a name like group.com.mycompany.myapp.sharedcontainer. (It must begin with group. and should probably use some form of reverse-DNS naming.)
Repeat the above for your app extension's target, giving it the same name as above (group.com.mycompany.myapp.sharedcontainer).
Now you can write the url of the audio recording to the app group's shared container from the app extension side. In ActionViewController.swift, replace the code fragment that instantiates and presents the AVPlayerViewController with the following:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
sharedContainerDefaults?.setURL(audioURL, forKey: "SharedAudioURLKey")
sharedContainerDefaults?.synchronize()
Similarly, you can read the url of the audio recording from the containing app's side using something like this:
let sharedContainerDefaults = NSUserDefaults.init(suiteName:
"group.com.mycompany.myapp.sharedcontainer") // must match the name chosen above
let audioURL :NSURL? = sharedContainerDefaults?.URLForKey("SharedAudioURLKey")
From here, you can copy the audio file into your app's sandbox, e.g., your app's Documents directory or your app's NSTemporaryDirectory(). Read this blog post ( http://www.atomicbird.com/blog/sharing-with-app-extensions ) for ideas on how to do this in a coordinated fashion using NSFileCoordinator.
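For example, here is a rough sketch of such a coordinated copy, written in the same Swift 2-era style as the fragments above and continuing from the audioURL just read out of the shared defaults. The destination file name and the error handling are only illustrative.
if let audioURL = audioURL {
    let fileManager = NSFileManager.defaultManager()
    let documentsURL = fileManager.URLsForDirectory(.DocumentDirectory,
        inDomains: .UserDomainMask).first!
    let destinationURL = documentsURL.URLByAppendingPathComponent(
        audioURL.lastPathComponent ?? "SharedRecording.m4a")

    // coordinate the read so we don't collide with the extension
    // (or anyone else) that may still be touching the file
    var coordinatorError: NSError?
    NSFileCoordinator(filePresenter: nil).coordinateReadingItemAtURL(audioURL,
        options: .WithoutChanges, error: &coordinatorError) { (readURL) in
        do {
            try fileManager.copyItemAtURL(readURL, toURL: destinationURL)
        } catch {
            print("could not copy the recording: \(error)")
        }
    }
    if let coordinatorError = coordinatorError {
        print("file coordination failed: \(coordinatorError)")
    }
}
The same coordinated pattern works on the extension side too, if you decide to copy the file into the shared container itself rather than just passing its URL.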
References:
Creating an App Extension
Sharing Data with Your Containing App

Related

Find and share a downloaded video on Flutter iOS without going through the picker?

I have a Flutter app that can view mp4 files from a URL. (Using a video controller playing directly from the URL.) I want the user to be able to share them if they wish. As best I can tell, the file has to actually exist on the device, so for now I have broken the steps down into: download the file, then invoke share.
I'm using this guide: https://retroportalstudio.medium.com/saving-files-to-application-folder-and-gallery-in-flutter-e9be2ebee92a
I need this to work on iOS and Android. The problem is that on iOS, neither the filename I get from the dio downloader nor the one from ImageGallerySaver seems to "work" when passed to the system ShareSheet.
I'm using the Flutter packages dio, share_plus, cross_file, and image_gallery_saver, as I've seen recommended in various places.
File saveFile = File(directory.path + "/$fileName");
developer.log("starting download...");
await dio.download(url, saveFile.path,
    onReceiveProgress: (value1, value2) {
  developer.log("got progress " + value1.toString());
  setState(() {
    downloadProgress = value1 / value2;
  });
});
_permaFile = saveFile.path;
if (Platform.isIOS) {
  var galleryResult = await ImageGallerySaver.saveFile(saveFile.path,
      isReturnPathOfIOS: true);
  developer.log("gallery save result = " + galleryResult.toString());
  _permaFile = galleryResult['filePath'];
}
After getting a directory we use dio to download the file, do some log chirping, and then save the name to an object member called _permaFile.
Then the share button triggers:
void _shareAction() async {
  final box = context.findRenderObject() as RenderBox?;
  final files = <XFile>[];
  if (_permaFile == null) {
    return;
  }
  developer.log("sharing file: " + _permaFile.toString());
  files.add(XFile(_permaFile!));
  await Share.shareXFiles(files,
      text: "Event",
      // subject: "Subject for Event",
      sharePositionOrigin: box!.localToGlobal(Offset.zero) & box.size);
}
This works on an Android device: after I download, I hit share and can share the video to a third-party app like WhatsApp.
On iOS the ShareSheet is invoked, but when I share I only get the text "Event", not the video file that should go along with it.
Note that I have tried both results: setting _permaFile to what comes back from ImageGallerySaver, and also just using what the dio downloader gives back.
Note also that the ImageGallerySaver seems to work: the video really does land in the iOS video library. If I go into the Photos app I can share from there to WhatsApp and have the video get sent.
In each case I get errors like this:
[ShareSheet] error fetching item for URL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:/// : (null)
[ShareSheet] error fetching file provider domain for URL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:/// : (null)
[ShareSheet] error loading metadata for
documentURL:file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 --
file:/// error:Error Domain=NSFileProviderInternalErrorDomain Code=0
"No valid file provider found from URL
file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:///."
UserInfo={NSLocalizedDescription=No valid file provider found from URL
file:/var/mobile/Media/DCIM/100APPLE/IMG_0021.MP4 -- file:///.}
In order to test this further I built the share_plus demo app:
https://github.com/fluttercommunity/plus_plugins/tree/main/packages/share_plus/share_plus
I modified it to share videos to see what was different. The share plus example (sp_example) works for sharing videos that have been selected by the picker.
For this reason I think the problem is something I'm missing about iOS video filenames/formats, and possibly a built-in conversion step that happens.
Here are what the filenames look like that I see in my app:
dio download result:
file:///var/mobile/Containers/Data/Application/223BF2B9-DDF0-490E-932F-09D5F03B98B3/Library/Caches/test.mp4
ImageGallerySaver result:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0019.MP4
This is what video filenames look like when they are picked and shared in sp_example:
/private/var/mobile/Containers/Data/Application/E5CB4D7C-6CDF-4AA2-8134-C4322ED7C886/tmp/trim.E6633D68-44E3-4853-A29E-A71AC95A0913.MOV
Note that it has been converted to a MOV extension, and the user gets a trim step right in the picker, which results in trim appearing in the name.
For my purposes I don't want to go through the picker; the user is already on the screen showing the video and shouldn't have to re-pick it. So where do I get the post-conversion iOS filename that references what I just saved?

Choose destination when saving file

Is it possible to let the user choose a destination for the file they want to download, something like the DocumentPicker you can use when choosing a file to upload?
I want something like this:
Yes, for iOS 13 and later you can ask the user to select a directory via UIDocumentPickerViewController. You'll get back security-scoped URL(s) for the directories selected by the user.
Details here: https://developer.apple.com/documentation/uikit/view_controllers/providing_access_to_directories
I've pasted the sample code from that page below, but you'll want to read the documentation carefully because security-scoped URLs require careful handling :)
If you need to support iOS 12 or earlier, the user can only select files, so I'm unclear on a clean way to do this (but we're on iOS 14 and iOS 15 is about to come out, so hopefully you don't have to support anything before iOS 13).
Here's the sample code from the link above showing how this is done:
// Create a document picker for directories.
let documentPicker =
    UIDocumentPickerViewController(forOpeningContentTypes: [.folder])
documentPicker.delegate = self
// Set the initial directory.
documentPicker.directoryURL = startingDirectory
// Present the document picker.
present(documentPicker, animated: true, completion: nil)
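And here is a minimal sketch of the delegate side, showing the security-scoped handling mentioned above. The class name SaveViewController, the downloadedData property, and the MyDownload.pdf file name are placeholders for this example.
import UIKit
import UniformTypeIdentifiers

final class SaveViewController: UIViewController, UIDocumentPickerDelegate {

    // placeholder: whatever data your app wants to write to the chosen folder
    var downloadedData = Data()

    func chooseDestination() {
        let documentPicker = UIDocumentPickerViewController(forOpeningContentTypes: [.folder])
        documentPicker.delegate = self
        present(documentPicker, animated: true, completion: nil)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController,
                        didPickDocumentsAt urls: [URL]) {
        guard let directory = urls.first,
              directory.startAccessingSecurityScopedResource() else { return }
        defer { directory.stopAccessingSecurityScopedResource() }

        // write the file into the directory the user chose
        let destination = directory.appendingPathComponent("MyDownload.pdf")
        do {
            try downloadedData.write(to: destination)
        } catch {
            print("Failed to save file: \(error)")
        }
    }
}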

On Xcode 11, how can I configure an Intent to run in the background?

TL;DR
On iOS 13 and Xcode 11, how can I configure an Intent to run in the background and just return the result, so it can be used as the input for other actions in the Shortcuts app?
Details of what I'm trying to achieve
My app has a list of songs that I want to expose via Shortcuts (actually, metadata about the song, not the song itself). The idea is to give advanced users a way to integrate this database of music with other apps or actions they want. For example, one may find it useful to get a list of upcoming music for the next month and use it to create Calendar events for each song. Having access to this list in the Shortcuts app can enable them to do this.
I have created an Intent called "List All Unread Music Releases" and defined its response as a list of objects that contains information about each song. The problem is, when I go to the Shortcuts app, create a new shortcut using this Intent, and run it, it opens my app instead of running in the background.
Steps I've done to create and configure Intents
Here's a high-level description of what I did to configure Intents in the project. The next section will have the actual source code and screenshots.
Created a new SiriKit Intent Definition File.
Created a new Intent.
Defined its Title, Description, and Parameters, and disabled the "Intent is eligible for Siri Suggestions" checkbox.
Defined the response property as an Array (because it's going to be a list of songs), and configured the Output to be this array property.
Created a new Intents Extension, with the checkbox "Include UI Extension" disabled. The idea here is to process the user request in the background and return a list with the results - no UI required.
In the Intents Extension target, defined the IntentsSupported array inside Info.plist with the name of the intent created in step 2.
Made the IntentHandler class implement the protocol generated for the intent created in step 2.
Code samples and screenshots
My SiriKit Intent Definition File and the GetUnreadReleases Intent:
The GetUnreadReleases Intent response:
The Intents Extension IntentHandler class:
import Intents

class IntentHandler: INExtension, GetUnreadReleasesIntentHandling {

    func handle(intent: GetUnreadReleasesIntent, completion: @escaping (GetUnreadReleasesIntentResponse) -> Void) {
        let response = GetUnreadReleasesIntentResponse(code: .success, userActivity: nil)
        let release1 = IntentRelease(identifier: "1", display: "Teste 1")
        release1.name = "Name test 1"
        release1.artist = "Artist test 1"
        response.releases = [release1]
        completion(response)
    }

    func resolveMediaType(for intent: GetUnreadReleasesIntent, with completion: @escaping (IntentMediaTypeResolutionResult) -> Void) {
        if intent.mediaType == .unknown {
            completion(.needsValue())
        } else {
            completion(.success(with: intent.mediaType))
        }
    }

    override func handler(for intent: INIntent) -> Any {
        // This is the default implementation. If you want different objects to handle different intents,
        // you can override this and return the handler you want for that particular intent.
        return self
    }
}
The Intents Extension Info.plist file:
Conclusion
So, I would like this intent to run in the background, assemble the list of songs based on the user defined parameters, and return this list to be used as an input to other actions in the Shortcuts app.
It looks like previous versions of the Intents editor (Xcode < 11 / iOS < 13.0) had a checkbox "Supports background execution" that did just that, but I can't find it anymore on Xcode 11.
Thanks to edford from Apple Developer Forums, I was able to make it work. In the intents definition file, the "Intent is eligible for Siri Suggestions" checkbox must be checked for the background execution to work.

iOS Notification Content Extension - How to pass data to app?

I wrote a custom Notification Content Extension for my Push Notifications like this:
The thing is, whenever the user is on a certain item in the carousel, I want the GO TO APP button to send a String to the app when it's opening, and from there, handle that string to move the user to the correct ViewController.
I already have the handling part inside the app, I just need to know how to pass that String from the Notification Content Extension to the container app.
Thanks! :)
Enable App Groups in Capabilities and use a suite NSUserDefaults to write the key in the extension and read it in the app:
NSUserDefaults *defaults = [[NSUserDefaults alloc] initWithSuiteName:@"group.com.company.appName"];
// Write in extension
[defaults setObject:@"anyThing" forKey:@"sharedContent"];
// Read in app
[defaults objectForKey:@"sharedContent"];
If your app is configured for Universal Links or you have defined a Custom URL Scheme for your app, you can also open your app's URL (e.g. with data in query parameters) by calling
extensionContext?.open(url)
in your NotificationViewController.
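For example, a rough sketch of that approach inside the content extension. The myapp scheme, the showItem host, and the id query item are made up here; use whatever URL your containing app actually registers and knows how to parse.
import UIKit
import UserNotifications
import UserNotificationsUI

class NotificationViewController: UIViewController, UNNotificationContentExtension {

    // placeholder: track which carousel item is currently showing
    var currentItemID = ""

    func didReceive(_ notification: UNNotification) {
        // configure the carousel from the notification payload here
    }

    // wired to the "GO TO APP" button in the content extension's storyboard
    @IBAction func goToAppTapped(_ sender: Any) {
        var components = URLComponents()
        components.scheme = "myapp"        // must match your custom URL scheme
        components.host = "showItem"
        components.queryItems = [URLQueryItem(name: "id", value: currentItemID)]
        if let url = components.url {
            // opens the containing app; handle the URL there to route the user
            extensionContext?.open(url, completionHandler: nil)
        }
    }
}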
iOS 13, Swift 5.
Based on the answer by Sh_Khan, here is the Swift syntax. Obviously I have added App Groups as a capability to the target of the main app and the target of the extension, naming the group "group.ch.Blah" for this example.
Writing to your app group (saving a string, in our case); I needed to set the type as Any because String is not a type that is available in groups.
let localK = getPrivateKey64() as Any
let defaults = UserDefaults.init(suiteName: "group.ch.Blah")
defaults?.set(localK, forKey: "privateK")
Reading the string back from your app group; it needs to be cast back to a String.
let defaults = UserDefaults.init(suiteName: "group.ch.Blah")
let localK = defaults?.object(forKey: "privateK") as? String
Worked perfectly with a notification service extension.

App launched with custom URL scheme. How do I return data to the calling app when done?

I am taking an Android programming course at my university, but the teacher has allowed me to do iOS as long as I implement the same projects. This project involves two apps. The first app is a color picker from a previous assignment. The second app calls the color picker and allows the user to choose a color; when done, the color is returned to the second app to be displayed.
I have defined a custom URL scheme in my ColorPicker which works fine. In my second app I have a changeColor button that has the following IBAction method.
- (IBAction)colorChangePressed:(UIButton *)sender {
    UIApplication *test = [UIApplication sharedApplication];
    BOOL found = [test openURL:[NSURL URLWithString:@"colorPicker://"]];
    if (found) NSLog(@"Resource was found");
    else NSLog(@"unable to locate resource");
}
This indeed launches the color picker app and it behaves as expected. My question is: after the color has been selected, how do I return to the calling app with the selected color? I will add a Finished button in my color picker to be tapped when the user is done selecting the color, and I will capture the values I need, but I can't figure out how to get this data back to the calling app. Is there some protocol/delegate pattern I need to implement?
The complete code is on GitHub at https://github.com/jnels124/CS390H
Thanks in advance for any insight as to how to solve my problem.
You need both apps to have unique schemes. Encode the scheme of app1 and use it as part of the app1->app2 URL. When app2 is finished, build an app2->app1 URL and use it to open app1, sending it the required information (encoded).
It is similar to putting a String extra into the app2 Intent with the name of the app1 Intent on Android, but instead of an Intent you use a URL and parse it as needed.
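Here is a rough sketch of that round trip in Swift (the question's code is Objective-C, but the idea is identical). The colorCaller scheme, the colorPicked host, and the red/green/blue parameter names are invented for this example; the calling app must register that scheme and parse the same keys.
import UIKit

// Color-picker app side: called when the user taps "Finished".
func returnColorToCaller(red: CGFloat, green: CGFloat, blue: CGFloat) {
    var components = URLComponents()
    components.scheme = "colorCaller"      // registered by the calling app
    components.host = "colorPicked"
    components.queryItems = [
        URLQueryItem(name: "red", value: "\(red)"),
        URLQueryItem(name: "green", value: "\(green)"),
        URLQueryItem(name: "blue", value: "\(blue)")
    ]
    if let url = components.url {
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    }
}

// Calling app side: parse the returned values in the app delegate.
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ app: UIApplication, open url: URL,
                     options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
        guard url.scheme == "colorCaller",
              let items = URLComponents(url: url, resolvingAgainstBaseURL: false)?.queryItems
        else { return false }
        let red = items.first { $0.name == "red" }?.value
        let green = items.first { $0.name == "green" }?.value
        let blue = items.first { $0.name == "blue" }?.value
        print("Picker returned r=\(red ?? "?") g=\(green ?? "?") b=\(blue ?? "?")")
        return true
    }
}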
I defined a custom scheme in the other project as stated in the first answer, but I was unsure how to generate the query string in the called URL and return it to the calling application to be parsed. I had this resolved in the following post:
Syntax for passing NSArray to other application with custom URL Scheme
