Image attachment from local notification is not shown in UNNotificationContentExtension - iOS

I've been working on the rich notification experience introduced in iOS 10 and I'm stuck on passing images as attachments to a UNNotificationContentExtension.
Here's my ContentExtension:
class NotificationViewController: UIViewController, UNNotificationContentExtension {

    @IBOutlet weak var attachmentImage: UIImageView!

    func didReceive(_ notification: UNNotification) {
        if let attachment = notification.request.content.attachments.first {
            if attachment.url.startAccessingSecurityScopedResource() {
                attachmentImage.image = UIImage(contentsOfFile: attachment.url.path)
                attachment.url.stopAccessingSecurityScopedResource()
            }
        }
    }
}
As a tutorial, I've been following the Advanced Notifications video from WWDC.
I've checked that the UIImage I'm assigning to the UIImageView:
- is not nil
- has a proper CGSize (191x191)
and that attachment.url.path equals /var/mobile/Library/SpringBoard/PushStore/Attachments/<bundle of app>/<...>.png
Here's how I send the local notification from the app:
let content = UNMutableNotificationContent()
content.title = "Sample title"
content.body = "Sample body"
content.categoryIdentifier = "myNotificationCategory"

let attachment = try! UNNotificationAttachment(identifier: "image",
                                               url: Bundle.main.url(forResource: "cat", withExtension: "png")!,
                                               options: nil)
content.attachments = [attachment]

let request = UNNotificationRequest(identifier: requestIdentifier, content: content, trigger: nil)
UNUserNotificationCenter.current().delegate = self
UNUserNotificationCenter.current().add(request) { (error) in
    if error != nil {
        // handle the error
    }
}
"cat.png" is just a dummy resource I've added to proj.
As you can see, notification shows the pic, so I assume, that I'm sending it correctly, but in the expanded state(in NotificationViewController) I've never succeed at showing the same image.
What am I doing wrong?
Thanks!

When you create a UIImage with contentsOfFile, the UIImage object initially reads only the image header, which carries basic information such as the image size.
So, try moving stopAccessingSecurityScopedResource to [NotificationViewController dealloc].
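A minimal Swift sketch of that first approach (the securedAttachmentURL property and the exact class layout are my own assumptions; the point is simply to keep security-scoped access open until the controller is torn down):

import UIKit
import UserNotifications
import UserNotificationsUI

class NotificationViewController: UIViewController, UNNotificationContentExtension {

    @IBOutlet weak var attachmentImage: UIImageView!
    // Hypothetical property: keeps the security-scoped URL alive until deinit.
    private var securedAttachmentURL: URL?

    func didReceive(_ notification: UNNotification) {
        guard let attachment = notification.request.content.attachments.first else { return }
        if attachment.url.startAccessingSecurityScopedResource() {
            securedAttachmentURL = attachment.url
            // The lazily loaded pixel data stays readable because access is still open.
            attachmentImage.image = UIImage(contentsOfFile: attachment.url.path)
        }
    }

    deinit {
        securedAttachmentURL?.stopAccessingSecurityScopedResource()
    }
}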
Or use the following:
Objective-C:
NSData *imageData = [NSData dataWithContentsOfURL:imageURL];
UIImage *image = [UIImage imageWithData:imageData];
Swift:
let imageData = NSData(contentsOf: attachment.url)
let image = UIImage(data: imageData! as Data)
There is no documentation saying that contentsOfFile reads only the image header. But when I run the following code:
NSString *docFolderPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSString *pngPath = [docFolderPath stringByAppendingPathComponent:@"test.png"];
UIImage *image = [UIImage imageWithContentsOfFile:pngPath];
[[NSFileManager defaultManager] removeItemAtPath:pngPath error:nil];
imageView.image = image;
An error occurs:
ImageIO: createDataWithMappedFile:1322: 'open' failed '/Users/fanjie/Library/Developer/CoreSimulator/Devices/FFDFCA06-A75E-4B54-9FC2-8E2AAE3B1405/data/Containers/Data/Application/E2D26210-4A53-424E-9FE8-D522CFD4FD9E/Documents/test.png'
error = 2 (No such file or directory)
So I concluded that UIImage's contentsOfFile reads only the image header up front; the pixel data is loaded later, when the image is actually drawn.

Thanks to @jeffery,
Here is the exact code for the image shown in the notification extension:
if let attachment = notification.request.content.attachments.first {
    if attachment.url.startAccessingSecurityScopedResource() {
        let data = NSData(contentsOfFile: attachment.url.path)
        self.attachmentImage?.image = UIImage(data: data! as Data)
        attachment.url.stopAccessingSecurityScopedResource()
    }
}

Related

Save image to documents folder using Share Extension

My goal (besides learning how to write an iOS app extension) is to allow a user to share an image using the share button from a variety of apps, including Photos, and automatically rename it. Then I want to save the image to the app's Documents folder for further use.
I'm having some problems getting the actual didSelectPost portion working, since it seems that, unlike the Objective-C examples I've seen, the loadItem operation returns an NSURL instead of a UIImage. When attempting to copy the NSURL to my app's Documents folder I get an error:
Error Domain=NSCocoaErrorDomain Code=260 "The file “IMG_0941.JPG”
couldn’t be opened because there is no such file."
UserInfo={NSFilePath=file:///var/mobile/Media/PhotoData/OutgoingTemp/B79263E5-9512-4317-9C5D-817D7EBEFA9A/RenderedPhoto/IMG_0941.JPG,
NSUnderlyingError=0x283f89080 {Error Domain=NSPOSIXErrorDomain Code=2
"No such file or directory"}}
This happens when I push the share button on a photo in the "photos" app, tap my extension and then press the "post" button.
I get the same error whether it's running in the simulator or on a real device.
Here's my hacked together progress so far:
override func didSelectPost() {
    // This is called after the user selects Post. Do the upload of contentText and/or NSExtensionContext attachments.
    let inputItem = extensionContext?.inputItems.first as! NSExtensionItem
    let attachment = inputItem.attachments!.first!
    if attachment.hasItemConformingToTypeIdentifier(kUTTypeJPEG as String) {
        attachment.loadItem(forTypeIdentifier: kUTTypeJPEG as String, options: nil) { data, error in
            var image: UIImage?
            if let someUrl = data as? NSURL {
                do {
                    // a ends up being nil in both of these cases
                    let a = NSData(contentsOfFile: someUrl.absoluteString!)
                    image = UIImage(data: a as! Data)
                    // let a = try Data(contentsOf: someUrl)
                    // image = UIImage(contentsOfFile: someUrl.absoluteString)
                } catch {
                    print(error)
                }
            } else if let someImage = data as? UIImage {
                image = someImage
            }
            if let someImage = image {
                guard let compressedImagePath = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask).first?.appendingPathComponent("theimage.jpg", isDirectory: false) else {
                    return
                }
                let compressedImageData = someImage.jpegData(compressionQuality: 1)
                guard (try? compressedImageData?.write(to: compressedImagePath)) != nil else {
                    return
                }
            } else {
                print("Bad share data")
            }
        }
    }
    // Inform the host that we're done, so it un-blocks its UI. Note: Alternatively you could call super's -didSelectPost, which will similarly complete the extension context.
    self.extensionContext!.completeRequest(returningItems: [], completionHandler: nil)
}
Notice I'm casting the data variable to an NSURL. I've tried to cast it to a UIImage instead, but that throws an exception.
I have some other things I'd like to do to the image, like reading its EXIF data, but for now this is what I have. Any suggestions would be great as I'm really struggling to wrap my head around and learn this environment.
Similar but unsuccessful posts I've tried, notice they are all Objective-C:
iOS Share Extension issue when sharing images from Photo library
Share image using share extension in ios8
How to add my app to the share sheet action
[edit] Matched the layout of one of the better answers, still with no luck.
I have reviewed your code and found the mistakes. NSData(contentsOfFile:) expects a plain filesystem path, but someUrl.absoluteString returns the URL string including the file:// scheme (you can see it in the NSFilePath of your error), so the load fails; use the URL's path instead. Checking for kUTTypeImage rather than kUTTypeJPEG also covers non-JPEG images.
Replace your code with this:
func share() {
    let inputItem = extensionContext!.inputItems.first! as! NSExtensionItem
    let attachment = inputItem.attachments!.first as! NSItemProvider
    if attachment.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
        attachment.loadItem(forTypeIdentifier: kUTTypeImage as String, options: [:]) { (data, error) in
            var image: UIImage?
            if let someURl = data as? URL {
                image = UIImage(contentsOfFile: someURl.path)
            } else if let someImage = data as? UIImage {
                image = someImage
            }
            if let someImage = image {
                guard let compressedImagePath = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask).first?.appendingPathComponent("shareImage.jpg", isDirectory: false) else {
                    return
                }
                let compressedImageData = UIImageJPEGRepresentation(someImage, 1)
                guard (try? compressedImageData?.write(to: compressedImagePath)) != nil else {
                    return
                }
            } else {
                print("bad share data")
            }
        }
    }
}
I have the same issue. The solution I was able to implement:
1. Get the URL to the image. This URL is useless, because I get a 260 error when I try to load the image from it. Interestingly, this started after some recent updates; it worked before.
2. Get the file name (with extension) from this URL.
3. Iterate over all images in the user's photo library and find the one whose name matches the name from the URL.
4. Extract the image data.
- (void)didSelectPost {
    for (NSItemProvider *itemProvider in ((NSExtensionItem *)self.extensionContext.inputItems[0]).attachments) {
        // get type of file extension (jpeg, file, url, png ...)
        NSArray *registeredTypeIdentifiers = itemProvider.registeredTypeIdentifiers;
        if ([itemProvider hasItemConformingToTypeIdentifier:registeredTypeIdentifiers.firstObject]) {
            [itemProvider loadItemForTypeIdentifier:registeredTypeIdentifiers.firstObject options:nil completionHandler:^(id<NSSecureCoding> item, NSError *error) {
                NSData *imgData;
                NSString *imgPath = ((NSURL *)item).absoluteString;
                if (imgPath == nil)
                    imgPath = [NSString stringWithFormat:@"%@", item];
                NSCharacterSet *set = [NSCharacterSet URLHostAllowedCharacterSet];
                NSString *imgPathEscaped = [imgPath stringByAddingPercentEncodingWithAllowedCharacters:set];
                NSString *fileName = [imgPath lastPathComponent];
                NSError *error2 = nil;

                // try to load from the file path
                __block NSData *data2 = [NSData dataWithContentsOfFile:imgPath options:NSDataReadingUncached error:&error2];
                if (data2 == nil) // try to load as a URL
                    data2 = [NSData dataWithContentsOfURL:[NSURL URLWithString:imgPath] options:NSDataReadingUncached error:&error2];
                if (data2 == nil) // all failed, so try the hacky way
                {
                    NSString *searchFilename = [fileName lowercaseString];
                    PHFetchResult *results = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
                    [results enumerateObjectsUsingBlock:^(PHAsset *obj, NSUInteger idx, BOOL * _Nonnull stop) {
                        NSArray *resources = [PHAssetResource assetResourcesForAsset:obj];
                        NSString *fileName2 = [NSString stringWithFormat:@"%@", ((PHAssetResource *)resources[0]).originalFilename].lowercaseString;
                        if ([fileName2 isEqual:searchFilename])
                        {
                            NSLog(@"found %@", fileName2);
                            PHImageManager *mgr = [PHImageManager defaultManager];
                            PHImageRequestOptions *options = [[PHImageRequestOptions alloc] init];
                            options.synchronous = YES;
                            [mgr requestImageDataForAsset:obj options:options resultHandler:^(NSData * _Nullable imageData33, NSString * _Nullable dataUTI, UIImageOrientation orientation, NSDictionary * _Nullable info)
                            {
                                // imageData33 is your image
                                data2 = imageData33;
                            }];
                        }
                    }];
                }
            }];
        }
    }
    // Inform the host that we're done, so it un-blocks its UI. Note: Alternatively you could call super's -didSelectPost, which will similarly complete the extension context.
    [self.extensionContext completeRequestReturningItems:@[] completionHandler:nil];
}
// Creates (if needed) and returns the Documents/hsafetyPhoto folder path.
func getPhotofolder() -> String {
    let fileManager = FileManager.default
    let paths = (NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString).appendingPathComponent("hsafetyPhoto")
    if !fileManager.fileExists(atPath: paths) {
        try! fileManager.createDirectory(atPath: paths, withIntermediateDirectories: true, attributes: nil)
    } else {
        print("Directory already created.")
    }
    return paths
}

// Saves the image as a JPEG inside the photo folder; returns true on success.
func saveImageDocumentDirectory(photo: UIImage, photoUrl: String) -> Bool {
    let fileManager = FileManager.default
    let paths = (Utility.getPhotofolder() as NSString).appendingPathComponent(photoUrl)
    print("image's path \(paths)")
    if !fileManager.fileExists(atPath: paths) {
        let imageData = UIImageJPEGRepresentation(photo, 0.5)
        fileManager.createFile(atPath: paths, contents: imageData, attributes: nil)
        return fileManager.fileExists(atPath: paths)
    } else {
        print("file already exists, overwriting: \(paths)")
        let imageData = UIImageJPEGRepresentation(photo, 0.5)
        fileManager.createFile(atPath: paths, contents: imageData, attributes: nil)
        return fileManager.fileExists(atPath: paths)
    }
}

// Loads an image from the Documents directory into the image view, falling back to a default image.
func showimage(image_name: String) {
    let documentsUrl = URL(fileURLWithPath: NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0])
    let imgUrl = documentsUrl.appendingPathComponent(image_name)
    if FileManager.default.fileExists(atPath: imgUrl.path) {
        do {
            let data = try Data(contentsOf: imgUrl)
            self.imageView.image = UIImage(data: data)
        } catch {
            print(error)
        }
    } else {
        self.imageView.image = UIImage(named: "default.jpg") // Display any default image
    }
}
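For context, a minimal usage sketch of these helpers (my own example: the Utility namespace for the save function, the sharedImage variable, and the file name are assumptions, not part of the original answer):

// Hypothetical call site: save a shared UIImage, then display it again later.
let fileName = "shareImage.jpg"
if Utility.saveImageDocumentDirectory(photo: sharedImage, photoUrl: fileName) {
    // showimage resolves relative to Documents, so include the folder created by getPhotofolder().
    showimage(image_name: "hsafetyPhoto/" + fileName)
}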

Importing an image using Action Extension - URL to a local Image works but not with actual image data

My iOS app (Swift 3) needs to import images from other apps using an Action Extension. I'm using the standard Action Extension template code, which works just fine for apps like iOS Mail and Photos where the image shared is a URL to a local file. But for certain apps where the image being shared is the actual image data itself, my action extension code isn't getting the image.
for item: Any in self.extensionContext!.inputItems {
    let inputItem = item as! NSExtensionItem
    for provider: Any in inputItem.attachments! {
        let itemProvider = provider as! NSItemProvider
        if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) { // we'll take any image type: gif, png, jpg, etc.
            // This is an image. We'll load it, then place it in our image view.
            weak var weakImageView = self.imageView
            itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil, completionHandler: { (imageURL, error) in
                OperationQueue.main.addOperation {
                    if let strongImageView = weakImageView {
                        if let imageURL = imageURL as? NSURL {
                            strongImageView.image = UIImage(data: NSData(contentsOf: imageURL as URL)! as Data)
                            let imageData = NSData(contentsOf: imageURL as URL)! as Data
                            self.gifImageView.image = UIImage.gif(data: imageData)
                            let width = strongImageView.image?.size.width
                            let height = strongImageView.image?.size.height
                            // ... my custom logic
                        }
                    }
For reference, I reached out to the developer for one of the apps where things aren't working and he shared this code on how he is sharing the image to the Action Extension.
//Here is the relevant code. At this point the scaledImage variable holds a UIImage.
var activityItems = Array<Any?>()
if let pngData = UIImagePNGRepresentation(scaledImage) {
activityItems.append(pngData)
} else {
activityItems.append(scaledImage)
}
//Then a little later it presents the share sheet:
let activityVC = UIActivityViewController(activityItems: activityItems,applicationActivities: [])
self.present(activityVC, animated: true, completion: nil)
Figured it out, thanks to this post which explains the challenge quite well: https://pspdfkit.com/blog/2017/action-extension/ . In summary, we don't know whether the sharing app is giving us a URL to an existing image or just raw image data, so we need to modify the out-of-the-box action extension template code to handle both cases.
for item: Any in self.extensionContext!.inputItems {
    let inputItem = item as! NSExtensionItem
    for provider: Any in inputItem.attachments! {
        let itemProvider = provider as! NSItemProvider
        if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) { // we'll take any image type: gif, png, jpg, etc.
            // This is an image. We'll load it, then place it in our image view.
            weak var weakImageView = self.imageView
            itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil, completionHandler: { (imageURL, error) in
                OperationQueue.main.addOperation {
                    if let strongImageView = weakImageView {
                        if let imageURL = imageURL as? NSURL {
                            // Case 1: the share is a URL to an image file on disk.
                            strongImageView.image = UIImage(data: NSData(contentsOf: imageURL as URL)! as Data)
                            let imageData = NSData(contentsOf: imageURL as URL)! as Data
                            self.gifImageView.image = UIImage.gif(data: imageData)
                            let width = strongImageView.image?.size.width
                            let height = strongImageView.image?.size.height
                            // ... my custom logic
                        } else {
                            // Case 2: the share is the raw image data itself.
                            guard let imageData = imageURL as? Data else { return } // can we cast to image data?
                            strongImageView.image = UIImage(data: imageData)
                            // custom logic
                        }
                    }
                }
            })
        }
    }
}

How to get Exif data from local image using Swift?

Say we have an image imageName.jpg in Assets.xcassets and we put that image in a UIImage like this:
let imageForExif: UIImage = UIImage(named: "imageName.jpg")!
How can we get (extract) the EXIF data from imageForExif using Swift?
Try this...
if let imageData = UIImageJPEGRepresentation(imageForExif, 1.0) {
    let imageCFData = imageData as CFData
    if let imageSource = CGImageSourceCreateWithData(imageCFData, nil),
       let metaDict: NSDictionary = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) {
        let exifDict: NSDictionary = metaDict.object(forKey: kCGImagePropertyExifDictionary) as! NSDictionary
        print(exifDict)
    }
}
But you may have very limited data available, since UIImageJPEGRepresentation re-encodes the image and the freshly created JPEG data doesn't carry the original file's metadata. Hope it helps.
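If the image is also available as a loose file in the app bundle (an assumption on my part, since asset-catalog entries don't expose a file URL), a sketch like this reads the properties straight from the file and keeps the original EXIF:

import ImageIO

// Sketch: read properties directly from the file instead of re-encoding the UIImage.
// "imageName.jpg" as a bundle resource is an assumption, not part of the original question.
if let url = Bundle.main.url(forResource: "imageName", withExtension: "jpg"),
   let imageSource = CGImageSourceCreateWithURL(url as CFURL, nil),
   let properties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) as? [CFString: Any],
   let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any] {
    print(exif) // original EXIF keys/values from the file on disk
}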

How to get static image from google maps in iOS

I use the Google Maps API for iOS. I want to get a static image of a particular city and put it in a UIImageView. How can I do that?
The reply of @Ankit is right, but @Alexsander asked in Swift, so:
var staticMapUrl: String = "http://maps.google.com/maps/api/staticmap?markers=color:blue|\(self.staticData.latitude),\(self.staticData.longitude)&\("zoom=13&size=\(2 * Int(mapFrame.size.width))x\(2 * Int(mapFrame.size.height))")&sensor=true"
var mapUrl: NSURL = NSURL(string: staticMapUrl.stringByAddingPercentEscapesUsingEncoding(NSUTF8StringEncoding))!
var image: UIImage = UIImage.imageWithData(NSData.dataWithContentsOfURL(mapUrl))
var mapImage: UIImageView = UIImageView(frame: mapFrame)
For Swift 4:
let staticMapUrl: String = "http://maps.google.com/maps/api/staticmap?markers=\(self.finalLatitude),\(self.finalLongitude)&\("zoom=15&size=\(2 * Int(imgViewMap.frame.size.width))x\(2 * Int(imgViewMap.frame.size.height))")&sensor=true"
let mapUrl: NSURL = NSURL(string: staticMapUrl)!
self.imgViewMap.sd_setImage(with: mapUrl as URL, placeholderImage: UIImage(named: "palceholder"))
NSString *staticMapUrl = [NSString stringWithFormat:@"http://maps.google.com/maps/api/staticmap?markers=color:blue|%@,%@&%@&sensor=true", self.staticData.latitude, self.staticData.longitude, [NSString stringWithFormat:@"zoom=13&size=%dx%d", 2*(int)mapFrame.size.width, 2*(int)mapFrame.size.height]];
NSURL *mapUrl = [NSURL URLWithString:[staticMapUrl stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]];
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:mapUrl]];
UIImageView *mapImage = [[UIImageView alloc] initWithFrame:mapFrame];
This should help.
Using Swift 3:
let lat = ..
let long = ..
let staticMapUrl: String = "http://maps.google.com/maps/api/staticmap?markers=color:red|\(lat),\(long)&\("zoom=13&size=\(2 * Int(cell.imgAddress.frame.size.width))x\(2 * Int(cell.imgAddress.frame.size.height))")&sensor=true"
let url = URL(string: staticMapUrl.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed)!)
do {
    let data = try NSData(contentsOf: url!, options: NSData.ReadingOptions())
    cell.imgAddress.image = UIImage(data: data as Data)
} catch {
    cell.imgAddress.image = UIImage()
}
Try this. Note that you have to get an API key from Google Cloud.
let API_Key = "" // your API key
let staticMapUrl = "https://maps.googleapis.com/maps/api/staticmap?center=Brooklyn+Bridge,New+York,NY&zoom=13&size=\(2 * Int(imgBanner.frame.size.width))x\(2 * Int(imgBanner.frame.size.height))&maptype=roadmap&key=\(API_Key)"
let mapUrl: NSURL = NSURL(string: staticMapUrl)!
self.imgBanner.sd_setImage(with: mapUrl as URL, placeholderImage: UIImage(named: "palceholder"))
Always check the link in a browser first to verify that it works and that the image is actually visible.

Load app icon from xcassets

I'd like to display the app icon inside my app. The icon is in the default assets catalog (Images.xcassets).
How do you load it? I tried the following and they all return nil:
image = [UIImage imageNamed:@"AppIcon"];
image = [UIImage imageNamed:@"icon"];
image = [UIImage imageNamed:@"icon-76"];
image = [UIImage imageNamed:@"icon-60"];
Other images in the assets catalog work as expected.
By inspecting the bundle I found that the icon images were renamed as:
AppIcon76x76~ipad.png
AppIcon76x76@2x~ipad.png
AppIcon60x60@2x.png
And so on.
Thus, using [UIImage imageNamed:@"AppIcon76x76"] or similar works.
Is this documented somewhere?
I recommend retrieving the icon's name by inspecting the Info.plist, since there's no guarantee how the icon files are named:
NSDictionary *infoPlist = [[NSBundle mainBundle] infoDictionary];
NSString *icon = [[infoPlist valueForKeyPath:@"CFBundleIcons.CFBundlePrimaryIcon.CFBundleIconFiles"] lastObject];
imageView.image = [UIImage imageNamed:icon];
In this case we're fetching the last image name in the CFBundleIconFiles array, which has the largest resolution. Change this if you need a smaller resolution.
Following Ortwin's answer, a Swift 4 approach:
func getHighResolutionAppIconName() -> String? {
    guard let infoPlist = Bundle.main.infoDictionary else { return nil }
    guard let bundleIcons = infoPlist["CFBundleIcons"] as? NSDictionary else { return nil }
    guard let bundlePrimaryIcon = bundleIcons["CFBundlePrimaryIcon"] as? NSDictionary else { return nil }
    guard let bundleIconFiles = bundlePrimaryIcon["CFBundleIconFiles"] as? NSArray else { return nil }
    guard let appIcon = bundleIconFiles.lastObject as? String else { return nil }
    return appIcon
}
Then it can be used like:
if let imageName = getHighResolutionAppIconName() {
    myImageView.image = UIImage(named: imageName)
}