Creating Architecture for Sharing Photos with Apple Watch (watchOS 2) - iOS

I'm trying to figure out the proper approach for sharing 10+ photos from an iOS app to an Apple Watch app using watchOS 2.
I want to transfer these images in the background so that the user doesn't have to open the iOS app in order to view the photos.
I've tried querying photos from Facebook and sending them to the watch via transferUserInfo(), but the payload is too large:
FBSDKGraphRequest(graphPath: "me/photos?limit=2", parameters: ["fields": "name, source"]).startWithCompletionHandler({ (connection, result, error) -> Void in
    if (error != nil) {
        print(error.description)
    }
    else {
        var arr = [NSData]()
        for res in result["data"] as! NSArray {
            if let string = res["source"] as? String {
                if let url = NSURL(string: string) {
                    if let data = NSData(contentsOfURL: url) {
                        arr.append(data)
                    }
                }
            }
        }
        print(arr)
        if arr.count > 0 {
            self.session.transferUserInfo(["image": arr])
        }
    }
})
Any ideas how I should go about doing this?

The proper method is mentioned in the WCSession documentation:
Use the transferFile:metadata: method to transfer files in the background. Use this method in cases where you want to send more than a dictionary of values. For example, use this method to send images or file-based documents.
The images will be asynchronously delivered to the watch on a background thread. session:didReceiveFile: will be called when the watch successfully receives an image.
Make sure to include (date) metadata with the image, and remove any existing images from the watch which are no longer a part of the ten most recent Facebook uploads.
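A minimal sketch of both sides (in current Swift, assuming the images have already been saved to local files and the session has been activated):

import WatchConnectivity

// iOS side: queue each image file for background transfer.
// `localImageURLs` is a hypothetical array of file URLs already on disk.
func sendImagesToWatch(_ localImageURLs: [URL]) {
    let session = WCSession.default
    guard session.activationState == .activated else { return }
    for url in localImageURLs {
        // Date metadata lets the watch decide which images to prune later.
        session.transferFile(url, metadata: ["date": Date()])
    }
}

// watchOS side (WCSessionDelegate): called for each successfully received file.
func session(_ session: WCSession, didReceive file: WCSessionFile) {
    // file.fileURL is only valid for the duration of this call,
    // so copy the image somewhere permanent.
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    try? FileManager.default.copyItem(at: file.fileURL,
                                      to: documents.appendingPathComponent(file.fileURL.lastPathComponent))
}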

Related

How to fetch live photo or video from PHPickerViewController delegate

Apple's new iOS 14 PHPickerViewController delegate receives only an NSItemProvider. The 2020 WWDC video shows how to get from there to a UIImage:
let prov = result.itemProvider
prov.loadObject(ofClass: UIImage.self) { im, err in
    if let im = im as? UIImage {
        DispatchQueue.main.async {
            // display the image here
        }
    }
}
But what if the user chose a live photo or a video? How do we get those?
Live photos are easy; do it exactly the same way. You can do that because PHLivePhoto, like UIImage, is a class that adopts NSItemProviderReading. So:
let prov = result.itemProvider
prov.loadObject(ofClass: PHLivePhoto.self) { livePhoto, err in
    if let photo = livePhoto as? PHLivePhoto {
        DispatchQueue.main.async {
            // display the live photo here
        }
    }
}
Videos are harder. The problem is that you do not want to be handed the data; it does you no good and is likely to be huge. You want the data saved to disk so that you can access the video's URL, just as UIImagePickerController used to do. You can in fact ask the item provider to save its data for you, but the file it saves is deleted as soon as the completion handler returns. My solution is to access the URL inside a .sync call:
let prov = result.itemProvider
prov.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, err in
    if let url = url {
        DispatchQueue.main.sync {
            // display the video here
        }
    }
}

How to use my view controllers and other classes in the Share Extension? iOS | Swift 4

I am creating a chat application. Users can share images from other applications to my application. I have added a Share Extension to show my app in the native share sheet, and I'm getting the selected data in the didSelectPost method. From there I want to show a list of the users to whom the image can be forwarded. For this, I want to reuse a view controller that already exists in the main app target.
override func didSelectPost() {
    // This is called after the user selects Post. Do the upload of
    // contentText and/or NSExtensionContext attachments.
    if let content = self.extensionContext!.inputItems[0] as? NSExtensionItem {
        let contentType = kUTTypeImage as String
        // Verify the provider is valid
        if let contents = content.attachments as? [NSItemProvider] {
            for attachment in contents {
                if attachment.hasItemConformingToTypeIdentifier(contentType) {
                    attachment.loadItem(forTypeIdentifier: contentType, options: nil) { (data, error) in
                        guard let url = data as? URL,
                              let imageData = try? Data(contentsOf: url) else { return }
                        // Here I'm navigating to my view controller, let's say: ForwardVC
                    }
                }
            }
        }
    }
}
I don't want to recreate the same screen in the Share Extension. Apart from these view controllers, I have many more classes and wrappers that I want to use within the share extension, such as SocketManager, Webservices, etc. Please suggest an approach to achieve this.
P.S.: I've tried adding the required view controllers to multiple targets and using the same pods in the Share Extension. With this approach I'm facing a lot of issues, as many of the methods and pods are not extension compliant. Also, is this even the right way to do it?

UNNotificationServiceExtension's didReceive not called

I followed these steps to get rich push notifications:
Created a Notification Service Extension, with its plist:
NotificationService's didReceive:
override func didReceive(_ request: UNNotificationRequest, withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
    func failEarly() {
        contentHandler(request.content)
    }
    self.contentHandler = contentHandler
    bestAttemptContent = (request.content.mutableCopy() as? UNMutableNotificationContent)
    // Get the custom data from the notification payload
    if let data = request.content.userInfo as? [String: AnyObject] {
        // Grab the attachment
        // let notificationData = data["data"] as? [String: String]
        if let urlString = data["attachment-url"] as? String, let fileUrl = URL(string: urlString) {
            // Download the attachment
            URLSession.shared.downloadTask(with: fileUrl) { (location, response, error) in
                if let location = location {
                    // Move the temporary file to remove the .tmp extension
                    let tmpDirectory = NSTemporaryDirectory()
                    let tmpFile = "file://".appending(tmpDirectory).appending(fileUrl.lastPathComponent)
                    let tmpUrl = URL(string: tmpFile)!
                    try? FileManager.default.moveItem(at: location, to: tmpUrl)
                    // Add the attachment to the notification content
                    if let attachment = try? UNNotificationAttachment(identifier: "video", url: tmpUrl, options: nil) {
                        self.bestAttemptContent?.attachments = [attachment]
                    } else if let attachment = try? UNNotificationAttachment(identifier: "image", url: tmpUrl, options: nil) {
                        self.bestAttemptContent?.attachments = [attachment]
                    } else if let attachment = try? UNNotificationAttachment(identifier: "audio", url: tmpUrl, options: nil) {
                        self.bestAttemptContent?.attachments = [attachment]
                    } else if let attachment = try? UNNotificationAttachment(identifier: "image.gif", url: tmpUrl, options: nil) {
                        self.bestAttemptContent?.attachments = [attachment]
                    }
                }
                // Serve the notification content
                self.contentHandler!(self.bestAttemptContent!)
            }.resume()
        }
    }
}
Configured the App ID and provisioning profile for the extension.
The rich notification arrives correctly:
But here are the issues I am facing:
didReceive is not getting called. To debug it, I attached the service extension process to the app target and ran the app.
Note: the extension is launched as soon as the notification arrives, but didReceive is not called:
On opening the push notification (which has a video attachment), nothing happens. Ideally the video should play.
If I have to open and play the video, do I have to do something explicitly, or will the extension take care of that?
Payload:
{
    aps = {
        alert = "This is what your message will look like! Type in your message in the text area and get a preview right here";
        badge = 1;
        "mutable-content" = 1;
        sound = default;
    };
    "attachment-url" = "https://www.sample-videos.com/video/mp4/720/big_buck_bunny_720p_1mb.mp4";
    deeplinkurl = "";
    "message_id" = 1609;
}
I did try going through the following posts, but they didn't help:
iOS10 UNNotificationServiceExtension not called
NotificationServiceExtension not called
UNNotificationServiceExtension not working on iPhone 5 (iOS 10)
Good news! Your service extension is indeed being called - the image on your notification is evidence of that. What is probably happening here is that you are unable to debug the extension using the workflow you are used to with applications.
Debugging notification extensions is not like debugging an app. Extensions are plug-ins to an iOS process outside your application. Just setting a breakpoint is not a reliable way to debug them. Instead:
Debugging A Notification Service Extension
Launch the app from Xcode or the device
In Xcode, select Attach To Process or PID By Name... from the Debug menu
Enter the name of your notification extension
Trigger a notification (by sending a push, etc.).
When the notification is delivered the service extension should launch in to the debugger. Service extensions are only relevant to remote (push) notifications, so you will need a device to troubleshoot them.
Debugging A Notification Content Extension
There are at least two ways. The steps shown above for a service extension also work for a content extension. The second method is more familiar but less reliable.
Select the extension scheme in Xcode using the toolbar
In the Product menu, select Edit Scheme...
Set the Executable to the parent application.
Set a breakpoint inside the content extension.
Now build and run your extension. It will launch the parent application.
Trigger a notification that will cause the content extension to load.
It's worth noting that adding logging with the unified logging framework can be very useful for debugging and troubleshooting as well.
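A minimal sketch of that (the subsystem string here is hypothetical):

import os.log

let log = OSLog(subsystem: "com.example.NotificationService", category: "push")

// Inside didReceive(_:withContentHandler:):
os_log("didReceive called for request %{public}@", log: log, type: .debug, request.identifier)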
Why The Video May Not Be Playing
iOS limits the size of content that can be presented in notifications. This is described in the documentation for UNNotificationAttachment. For video it is generally 50Mb. Make sure your video is as small as you can make it in terms of bytes, and of course provide a video that is sized appropriately for the device it will be played on. Do not try to play a 1080p video in a notification that is 400 points wide!
In practice it is almost always better to use HLS instead of downloading video, and present it in a content extension.
Another thing in your code that may be problematic is the identifiers you are assigning to your attachments. Identifiers should be unique. Typically this would be a reverse-domain notation string like your bundle ID followed by a UUID string. You could also use the original URL of the content followed by a UUID string. If you provide an empty string iOS will create a unique identifier for you.
With the user notifications framework, non-unique identifiers (for notifications, attachments, etc.) tend to cause difficult-to-track-down issues inside the framework. For example, they can cause an attached watchOS device to crash.
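As a sketch of the fix (reusing the tmpUrl from the question's code; the bundle-ID-plus-UUID convention is an assumption):

// Bundle ID plus a UUID keeps attachment identifiers unique.
let identifier = "\(Bundle.main.bundleIdentifier ?? "app").\(UUID().uuidString)"
if let attachment = try? UNNotificationAttachment(identifier: identifier, url: tmpUrl, options: nil) {
    bestAttemptContent?.attachments = [attachment]
}
// Alternatively, pass an empty string and iOS generates a unique identifier for you.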
If you want to implement "auto play" for your video - it is not clear from your question whether that is what you are describing - you will need to implement your own player functionality in a content extension.
If you are going to do that, again, HLS is the preferred way to display video in a notification. It usually uses less RAM, offers a better user experience and tends to be more stable.
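For reference, a rough sketch of HLS playback inside a content extension's view controller (the stream URL is hypothetical; AVPlayer handles .m3u8 playlists natively):

import AVKit

// Streamed with HLS rather than downloaded as a whole file.
if let url = URL(string: "https://example.com/stream/index.m3u8") {
    let player = AVPlayer(url: url)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = view.bounds
    view.layer.addSublayer(playerLayer)
    player.play()
}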

Load image from Firestore and save it to cache in Swift

I'm working on an app that has to load some images and data from the server on every launch (to make sure it's using up-to-date info). I'm using Firestore as a DB and currently store each image as a URL pointing to Firebase Storage.
Is it somehow possible to store an actual image in Firestore? And how can I cache loaded image? Either from
UIImage(contentsOf: URL)
or from Firestore?
Try SDWebImage, an asynchronous image downloader with cache support, implemented as a UIImageView category: http://cocoadocs.org/docsets/SDWebImage. It's really easy to use.
I don't know if it's the most efficient way of solving my problem, but I did it the following way:
In my Firestore DB I stored references to images kept in Cloud Storage. When the app starts for the first time, it downloads those files using the default methods AND saves the images in the app's container (Documents folder) using Swift's FileManager().
The next time the app starts, it goes through the references array and skips any file that is already in the app's container.
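A minimal sketch of that approach (assuming a `references` array of Cloud Storage paths, a flattened-path naming scheme for local files, and a 5 MB per-image cap):

import FirebaseStorage

func cacheImages(references: [String]) {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    for path in references {
        // Derive a local file name from the storage path.
        let localURL = documents.appendingPathComponent(path.replacingOccurrences(of: "/", with: "_"))
        // Skip anything cached on a previous launch.
        guard !FileManager.default.fileExists(atPath: localURL.path) else { continue }
        Storage.storage().reference(withPath: path).getData(maxSize: 5 * 1024 * 1024) { data, error in
            guard let data = data else { return }
            try? data.write(to: localURL)
        }
    }
}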
You could use the bytes type in Firestore (see a list of types) to save whatever binary data you want (use NSData on iOS), but this is almost certainly not what you actually want to do. The limit for the size of an entire document is 1 MB, and images can easily exceed that. Also, you'll be paying the cost of downloading that image to the client any time that document is read, which could be wasteful.
You'll be far better off storing the actual file data in Cloud Storage (using the Firebase SDK on the client), storing a reference or URL to it in the document, and fetching the image from there only when needed.
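For instance, the Firestore document would hold only the path, not the image bytes (the field names and path here are hypothetical):

import FirebaseFirestore

// Store the Cloud Storage path; fetch the image separately, only when needed.
Firestore.firestore().collection("images").addDocument(data: [
    "storagePath": "photos/profile_123.jpg",
    "createdAt": FieldValue.serverTimestamp()
])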
You could use https://github.com/pinterest/PINRemoteImage; this framework uses https://github.com/pinterest/PINCache:
import PINRemoteImage

extension UIImageView {
    public func setImageFrom(urlString: String!, animated: Bool = false) {
        guard let urlString = urlString else {
            return
        }
        guard let url = URL(string: urlString) else {
            return
        }
        layer.removeAllAnimations()
        pin_cancelImageDownload()
        image = nil
        if !animated {
            pin_setImage(from: url)
        } else {
            pin_setImage(from: url, completion: { [weak self] result in
                guard let _self = self else { return }
                _self.alpha = 0
                UIView.transition(with: _self, duration: 0.5, options: [], animations: { () -> Void in
                    _self.image = result.image
                    _self.alpha = 1
                }, completion: nil)
            })
        }
    }
}
Usage:
UIImageView().setImageFrom(urlString: "https://ssssss")

Excessive memory usage when uploading assets (images, videos) to Firebase in Swift?

Suppose I have an array of UIImage called photos that are to be uploaded to Firebase Storage. I want to do the following things:
Upload them to Firebase Storage
Get the paths of the uploaded photos and store them in an array called uploadedAssets (paths, not download URLs; a path looks like "photos/folder_name/photo_id"), where "folder_name" is randomly generated and "photo_id" is an integer representing the order of the photos
Call a Cloud Function and pass uploadedAssets to it. The server then uses the paths to find all the pictures and generates a thumbnail for each one.
Finally, store the original photos' download URLs and the thumbnails' download URLs in the database.
I have something that works, but it uses far too much memory (300+ MB when uploading only 4 pictures):
// Swift
let dispatchGroup = DispatchGroup()
let dispatchQueue = DispatchQueue(label: "AssetQueue")
var uploadedAssets = [String]()
let folderName: String = UUID().uuidString

dispatchQueue.async {
    for i in 0..<photos.count {
        dispatchGroup.enter()
        let photo: UIImage = photos[i]
        let fileName: String = "\(folderName)/\(i)"
        let assetRef = Storage.storage().reference().child("photos/\(fileName)")
        let metaData = StorageMetadata()
        metaData.contentType = "image/jpg"
        if let dataToUpload = UIImageJPEGRepresentation(photo, 0.75) {
            assetRef.putData(
                dataToUpload,
                metadata: metaData,
                completion: { (_, error) in
                    uploadedAssets.append("photos/\(fileName)")
                    dispatchGroup.leave()
                }
            )
        }
    }
}

dispatchGroup.notify(queue: dispatchQueue) {
    Alamofire.request(
        "https://<some_url>",
        method: .post,
        parameters: [
            "uploadedAssets": uploadedAssets
        ]
    )
}
The code that generates thumbnails runs on the server side and is therefore, in my opinion, irrelevant, so I won't post it here. The above code snippet consumes 300+ MB of memory when there are 4 photos to upload. After the photos are uploaded, memory usage stays at 300+ MB and never drops. When I upload more, say another 4 photos, it can even go up to 450+ MB. I know that's not normal, but I can't figure out why it happens.
