I'm developing an app that allows users to edit photos using PhotoKit. I was previously saving the edited photo to disk as a JPEG. I would like to avoid converting to JPEG and have implemented the changes needed to do that. It works great for photos taken with the camera, but if you try to edit a screenshot, the PHPhotoLibrary.sharedPhotoLibrary().performChanges block fails and logs: The operation couldn’t be completed. (Cocoa error -1.). I am not sure why this causes the performChanges block to fail. What have I done wrong here?
I've created a sample app (available to download) that demonstrates the problem, and I've included the relevant code below. The app attempts to edit the newest photo in your photo library. If it succeeds, it will prompt for access to edit the photo; otherwise nothing will happen and you'll see the console log. To reproduce the issue, take a screenshot, then run the app.
Current code that works with screenshots:
let jpegData: NSData = outputPhoto.jpegRepresentationWithCompressionQuality(0.9)
let contentEditingOutput = PHContentEditingOutput(contentEditingInput: self.input)
var error: NSError?
let success = jpegData.writeToURL(contentEditingOutput.renderedContentURL, options: NSDataWritingOptions.AtomicWrite, error: &error)
if success {
    return contentEditingOutput
} else {
    return nil
}
Replacement code that causes screenshots to fail:
let url = self.input.fullSizeImageURL
let orientation = self.input.fullSizeImageOrientation
var inputImage = CIImage(contentsOfURL: url)
inputImage = inputImage.imageByApplyingOrientation(orientation)
let outputPhoto = createOutputImageFromInputImage(inputImage)!

let originalImageData = NSData(contentsOfURL: self.input.fullSizeImageURL)!
let imageSource = CGImageSourceCreateWithData(originalImageData, nil)
let dataRef = CFDataCreateMutable(nil, 0)
// CGImageSourceGetType automatically selects JPG, PNG, etc. based on the original format
let destination = CGImageDestinationCreateWithData(dataRef, CGImageSourceGetType(imageSource), 1, nil)

struct ContextStruct {
    static var ciContext: CIContext? = nil
}

if ContextStruct.ciContext == nil {
    let eaglContext = EAGLContext(API: .OpenGLES2)
    ContextStruct.ciContext = CIContext(EAGLContext: eaglContext)
}

let cgImage = ContextStruct.ciContext!.createCGImage(outputPhoto, fromRect: outputPhoto.extent())
CGImageDestinationAddImage(destination, cgImage, nil)

if CGImageDestinationFinalize(destination) {
    let contentEditingOutput = PHContentEditingOutput(contentEditingInput: self.input)
    var error: NSError?
    let imageData: NSData = dataRef
    let success = imageData.writeToURL(contentEditingOutput.renderedContentURL, options: .AtomicWrite, error: &error)
    if success {
        // it does succeed
        return contentEditingOutput
    } else {
        return nil
    }
}
The problem happens because adjusted photos are always saved as JPEG files, while screenshots are in fact PNG files.
It occurred to me while I was debugging your sample project: in the PhotoEditor, contentEditingOutput.renderedContentURL is a URL to a JPG file, while if you examine the result of CGImageSourceGetType(imageSource) it is clear that it's a PNG (it returns the PNG UTI: public.png).
So I went and read the documentation for renderedContentURL, which states that if editing a photo asset, the altered image is written in JPEG format - which clearly won't work if your image is a PNG. This leads me to think that Apple doesn't support editing PNG files, or doesn't want you to. Go figure..
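For anyone hitting this today, a minimal sketch of a workaround, assuming iOS 10+ for writeJPEGRepresentation (this is my code, not the asker's): re-encode the edited image as JPEG regardless of the source UTI, so screenshots (public.png) go through the same path as camera photos.

import Photos
import CoreImage

// Sketch: always hand PhotoKit a JPEG, since renderedContentURL expects one
// even when the source asset is a PNG screenshot.
func makeContentEditingOutput(input: PHContentEditingInput,
                              editedImage: CIImage) -> PHContentEditingOutput? {
    let output = PHContentEditingOutput(contentEditingInput: input)
    // Remember to set output.adjustmentData before committing the edit.
    do {
        try CIContext().writeJPEGRepresentation(
            of: editedImage,
            to: output.renderedContentURL,
            colorSpace: editedImage.colorSpace ?? CGColorSpaceCreateDeviceRGB())
        return output
    } catch {
        print("Failed to write JPEG: \(error)")
        return nil
    }
}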
Related
I am trying to add IPTC, TIFF and EXIF data to a PHAsset. When I apply the changes with the following code snippet, I get the error shown below:
guard let ciImage = CIImage(contentsOf: input.fullSizeImageURL!, options: [.applyOrientationProperty: true]) else {
    fatalError("Not able to create CIImage from input")
}

// Write the edited image as a JPEG.
do {
    try CIContext().writeJPEGRepresentation(of: ciImage,
                                            to: output.renderedContentURL,
                                            colorSpace: ciImage.colorSpace!,
                                            options: [CIImageRepresentationOption(rawValue: kCGImageDestinationLossyCompressionQuality as String): 1.0])
} catch let error {
    fatalError("Can't apply metadata to the image: \(error).")
}
PHPhotoLibrary.shared().performChanges({
    let request = PHAssetChangeRequest(for: self.asset!)
    request.contentEditingOutput = output
}, completionHandler: { success, error in
    if !success {
        print("Can't edit the asset: \(error?.localizedDescription ?? "unknown error")")
    }
})
Error:
[PhotoKit] Original resource choice is only valid for an unadjusted base version
What am I doing wrong? Is there a better way to add IPTC metadata to a PHAsset, or to a UIImage file?
I found the issue: it was in code that ran before I created the CIImage shown in my question. I deleted that code, and instead of overwriting the binary data of the CIImage, I now just use the CIImage to store my changes in the PHAsset / photo library.
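In case it helps others, here is a minimal sketch of that approach, assuming iOS 11+ and reusing ciImage and output from the question; the keyword values are placeholders, and whether every IPTC field survives the JPEG writer is worth verifying:

import CoreImage
import ImageIO

// Sketch: merge an IPTC dictionary into the image's properties so the JPEG
// writer can embed it as metadata, instead of rewriting the file's bytes.
var properties = ciImage.properties
properties[kCGImagePropertyIPTCDictionary as String] = [
    kCGImagePropertyIPTCKeywords as String: ["keyword1", "keyword2"] // placeholders
]
let taggedImage = ciImage.settingProperties(properties)

// Same write call as in the question, now carrying the metadata.
try CIContext().writeJPEGRepresentation(of: taggedImage,
                                        to: output.renderedContentURL,
                                        colorSpace: taggedImage.colorSpace!)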
I have an application where users can upload multiple images; all the images are stored on a server and displayed in a web view in my iOS application.
Everything used to work just about fine until iOS 10, but then we started seeing some images not being displayed. After a little debugging, we found that the problem is caused by Apple's new image format (HEIC).
I tried changing back to the native UIImagePicker (which picks only one image) and those images are displayed, as Apple, I guess, converts the image from HEIC to JPG when the user picks it. But this is not the case when I use 3rd party libraries, which I need in order to implement a multiple image picker.
Though we are hard at work on a server-side conversion process so that users who have not updated the app don't run into trouble, I also want to see if there is any way to convert the image format locally in my application.
There's a workaround to convert HEIC photos to JPEG before uploading them to the server:
NSData *jpgImageData = UIImageJPEGRepresentation(image, 0.7);
If you use PHAsset, then, in order to get the image object, you'll need to call this method from PHImageManager:
- (PHImageRequestID)requestImageForAsset:(PHAsset *)asset targetSize:(CGSize)targetSize contentMode:(PHImageContentMode)contentMode options:(nullable PHImageRequestOptions *)options resultHandler:(void (^)(UIImage *__nullable result, NSDictionary *__nullable info))resultHandler;
On the server side, you can also use this API or this website directly.
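For reference, a hedged Swift sketch of the same idea (the helper name and quality factor are illustrative): request a UIImage for the PHAsset via PHImageManager, then re-encode it as JPEG before uploading.

import Photos
import UIKit

// Sketch: fetch a full-size UIImage for the asset, then JPEG-encode it.
func jpegData(for asset: PHAsset, completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true      // allow iCloud downloads
    options.deliveryMode = .highQualityFormat  // single full-quality callback
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: PHImageManagerMaximumSize,
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        completion(image?.jpegData(compressionQuality: 0.7))
    }
}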
I've done it this way,
let newImageSize = Utility.getJpegData(imageData: imageData!, referenceUrl: referenceUrl!)
/// Converts HEIC image data to JPEG format.
public static func getJpegData(imageData: Data, referenceUrl: NSURL) -> Data? {
    // Confirm the source file is still readable before decoding.
    guard (try? Data(contentsOf: referenceUrl as URL)) != nil,
          let image = UIImage(data: imageData) else {
        return nil
    }
    // Decoding to UIImage and re-encoding strips the HEIC container.
    return image.jpegData(compressionQuality: 1.0)
}
In Swift 3, given the input path of an existing HEIF picture and an output path where the JPG file should be saved:
func fromHeicToJpg(heicPath: String, jpgPath: String) -> UIImage? {
    // UIImage(named:) looks up asset-catalog names; use contentsOfFile for a path on disk.
    guard let heicImage = UIImage(contentsOfFile: heicPath),
          let jpgImageData = UIImageJPEGRepresentation(heicImage, 1.0) else {
        return nil
    }
    FileManager.default.createFile(atPath: jpgPath, contents: jpgImageData, attributes: nil)
    return UIImage(contentsOfFile: jpgPath)
}
It returns the UIImage read back from jpgPath, or nil if something went wrong.
I have found the existing answers to be helpful but I have decided to post my take on the solution to this problem as well. Hopefully it's a bit clearer and "complete".
This solution saves the image to a file.
// Requires: import Photos, UIKit, UniformTypeIdentifiers (iOS 14+ for UTType)
private let fileManager: FileManager

func save(asset: PHAsset, to destination: URL) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true
    asset.requestContentEditingInput(with: options) { input, info in
        guard let input = input, let url = input.fullSizeImageURL else {
            return // you might want to handle this case
        }
        do {
            try self.copy(input, at: url, to: destination)
            // success!
        } catch {
            // failure, handle the error!
        }
    }
}

private func copy(
    _ input: PHContentEditingInput, at url: URL, to destination: URL
) throws {
    let uniformType = input.uniformTypeIdentifier ?? ""
    switch uniformType {
    case UTType.jpeg.identifier:
        // Copy JPEG files directly
        try fileManager.copyItem(at: url, to: destination)
    default:
        // Convert HEIC/PNG and other formats to JPEG and save to file
        let image = UIImage(data: try Data(contentsOf: url))
        guard let data = image?.jpegData(compressionQuality: 1) else {
            return // you might want to handle this case
        }
        try data.write(to: destination)
    }
}
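A hypothetical call site, purely for illustration (the destination file name and the fetch are mine, not part of the answer):

// Hypothetical usage: export a photo as a JPEG in the temporary directory.
let destination = FileManager.default.temporaryDirectory
    .appendingPathComponent("exported.jpg")
if let asset = PHAsset.fetchAssets(with: .image, options: nil).firstObject {
    save(asset: asset, to: destination)
}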
I have an iMessage app that displays some remote content using SDWebImage. The images are downloaded and cached on disk. After choosing an image, I want to attach it to the message as a plain UIImage (not an MSMessage).
Here's the code I'm using
// image is already downloaded
let cache = SDImageCache.shared()
let key = remoteImageUrl
let fileUrlString = cache.defaultCachePath(forKey: key)!
let fileUrl = URL(string: fileUrlString)!

// image holds the correct UIImage
let image = UIImage(contentsOfFile: fileUrlString)

activeConversation?.insertAttachment(fileUrl, withAlternateFilename: "a funny gif", completionHandler: { (error) in
    // error is nil here
    print("error: \(error)")
})
Here's what the message looks like
It seems like the Messages framework can't find the image at that path.
Note: after tapping send, the iMessage app crashes with "MobileSMS quit unexpectedly."
I found out that I needed to use
let fileUrl = URL(fileURLWithPath: fileUrlString)
Hope this helps someone else
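The underlying difference: URL(string:) expects a scheme-qualified URL string, so a bare filesystem path yields a URL without the file:// scheme, which Messages can't resolve, while URL(fileURLWithPath:) builds a proper file URL. A tiny illustration (the path is made up):

let path = "/var/tmp/funny.gif"            // hypothetical cache path
URL(string: path)?.absoluteString          // "/var/tmp/funny.gif" (no scheme)
URL(fileURLWithPath: path).absoluteString  // "file:///var/tmp/funny.gif"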
I am developing a share extension for photos for my iOS app. Inside the extension, I am able to successfully retrieve the UIImage object from the NSItemProvider.
However, I would like to be able to share the image with my container app, without having to store the entire image data inside my shared user defaults. Is there a way to get the PHAsset of the image that the user has chosen in the share extension (if they have picked from their device)?
The documentation on the photos framework (https://developer.apple.com/library/ios/documentation/Photos/Reference/Photos_Framework/) has a line that says "This architecture makes it easy, safe, and efficient to work with the same assets from multiple threads or multiple apps and app extensions."
That line makes me think there is a way to share the same PHAsset between an extension and its container app, but I have yet to figure out how. Is there a way to do that?
This only works if the NSItemProvider gives you a URL with the format:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0007.PNG
which is not always true for all your assets, but if it returns a URL as:
file:///var/mobile/Media/PhotoData/OutgoingTemp/2AB79E02-C977-4B4A-AFEE-60BC1641A67F.JPG
then PHAsset will never find your asset. Furthermore, the latter is a copy of your file, so if you happen to have a very large image/video, iOS will duplicate it in that OutgoingTemp directory. Nowhere does the documentation say when it's going to be deleted; hopefully soon enough.
I think this is a big gap Apple has left between sharing extensions and the PHPhotoLibrary framework. Apple should create an API to close it, and soon.
You can get the PHAsset if the image is shared from the Photos app. The item provider will give you a URL that contains the image's filename; you use this to match the PHAsset.
/// Assets handled through handleImageItem:completionHandler:
private var handledAssets = [PHAsset]()

/// Key is the matched asset's original file name without suffix. E.g. IMG_193
private lazy var imageAssetDictionary: [String: PHAsset] = {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssetsWithOptions(options)
    var assetDictionary = [String: PHAsset]()
    for i in 0 ..< fetchResult.count {
        let asset = fetchResult[i] as! PHAsset
        let fileName = asset.valueForKey("filename") as! String
        let fileNameWithoutSuffix = fileName.componentsSeparatedByString(".").first!
        assetDictionary[fileNameWithoutSuffix] = asset
    }
    return assetDictionary
}()
...
provider.loadItemForTypeIdentifier(imageIdentifier, options: nil) { imageItem, _ in
    if let image = imageItem as? UIImage {
        // handle UIImage
    } else if let data = imageItem as? NSData {
        // handle NSData
    } else if let url = imageItem as? NSURL {
        // Prefix check: image is shared from Photos app
        if let imageFilePath = url.path where imageFilePath.hasPrefix("/var/mobile/Media/") {
            for component in imageFilePath.componentsSeparatedByString("/") where component.containsString("IMG_") {
                // photo: /var/mobile/Media/DCIM/101APPLE/IMG_1320.PNG
                // edited photo: /var/mobile/Media/PhotoData/Mutations/DCIM/101APPLE/IMG_1309/Adjustments/FullSizeRender.jpg
                // Cut the file's suffix if present, leaving a name like IMG_1309.
                let fileName = component.componentsSeparatedByString(".").first!
                if let asset = imageAssetDictionary[fileName] {
                    handledAssets.append(asset)
                    imageCreationDate = asset.creationDate
                }
                break
            }
        }
    }
}
I hope someone can help me out. I've already searched Stack Overflow and Google but couldn't find the right solution.
I have a very simple app which takes a photo (using the standard iOS camera through UIImagePickerController) and then saves it to the file system with a very low quality factor (let thumbNailData = UIImageJPEGRepresentation(image, 0.02)). After that I display the images in a collection view using Core Data; I only save the filename in Core Data, not the image itself, which lives only in the filesystem.
When I run the app it shows a memory usage of no more than 15 MB and CPU usage around 1-2%. Everything runs fine, but after adding 6-7 photos I get strange errors like memory warnings, a lost connection to my iPhone, and this one:
Communications error: <OS_xpc_error: <error: 0x198adfa80> { count = 1, contents =
"XPCErrorDescription" => <string: 0x198adfe78> { length = 22, contents = "Connection interrupted"
So I am really stuck, as I thought I had made it very lightweight, and then I get these errors...
I already submitted a note-taking app to the App Store which has much more functionality than this one, and it runs very stably...
Any ideas?
Here is some of my code:
// Here I load the picture names into an array to display in the collection view
func loadColl() {
    let appDelegate = (UIApplication.sharedApplication().delegate as AppDelegate)
    let context: NSManagedObjectContext = appDelegate.managedObjectContext!
    let fetchRequest: NSFetchRequest = NSFetchRequest(entityName: "PictureNote")
    var error: NSError?
    var result = context.executeFetchRequest(fetchRequest, error: &error) as [PictureNote]
    for res in result {
        println(res.latitude)
        println(res.longitude)
        println(res.longDate)
        println(res.month)
        println(res.year)
        println(res.text)
        pictures.append(res.thumbnail)
    }
}
// Here is the code to display in the collection view
func collectionView(collectionView: UICollectionView, cellForItemAtIndexPath indexPath: NSIndexPath) -> UICollectionViewCell {
    let myImageCell: ImageCell = myCollectionView.dequeueReusableCellWithReuseIdentifier("imageCell", forIndexPath: indexPath) as ImageCell
    myImageCell.imageCell.image = self.loadImageFromPath(fileInDocumentsDirectory(self.pictures[indexPath.row]))
    return myImageCell
}
// Here is the code to load the pictures from disk
func loadImageFromPath(path: String) -> UIImage {
    let image = UIImage(contentsOfFile: path)
    if image == nil {
        self.newAlert("Error loading your image, try again", title: "Notice")
    }
    return image!
}
// Here is my saving code
func saveNewEntry() {
    var unique = NSUUID().UUIDString
    var imageTitle = "Picture\(unique).jpg"
    var image = self.pickedImageView.image
    var path = fileInDocumentsDirectory(imageTitle)
    var thumbImageTitle = "ThumbPicture\(unique).jpg"
    var thumbPath = fileInDocumentsDirectory(thumbImageTitle)
    if self.saveImage(image!, path: path) == true && self.saveThumbImage(image!, thumbPath: thumbPath) == true {
        // Create the saving context
        let context = (UIApplication.sharedApplication().delegate as AppDelegate).managedObjectContext!
        let entityOne = NSEntityDescription.entityForName("PictureNote", inManagedObjectContext: context)
        let thisTask = PictureNote(entity: entityOne!, insertIntoManagedObjectContext: context)
        // Get all the values here
        self.getMyLocation()
        var theDate = NSDate()
        thisTask.month = Date.toString(date: theDate, format: "MM")
        thisTask.year = Date.toString(date: theDate, format: "yyyy")
        thisTask.longDate = theDate
        thisTask.longitude = getMyLocation()[1]
        thisTask.latitude = getMyLocation()[0]
        thisTask.text = self.noteTextView.text
        thisTask.imageURL = imageTitle
        thisTask.thumbnail = thumbImageTitle
        thisTask.id = unique
        // Saving to CoreData
        if context.save(nil) {
            self.newAlert("Saving your note was successful!", title: "Notice")
            self.noteTextView.text = ""
        } else {
            self.newAlert("Error saving your note, try again", title: "Notice")
        }
    } else {
        self.newAlert("Error saving your image, try again", title: "Notice")
    }
    self.pickedImageView.image = UIImage(named: "P1000342.jpg")
}
I am really thankful for every suggestion... if you need more code, just let me know.
I notice that you are using a dramatically reduced quality factor in conjunction with UIImageJPEGRepresentation. If this is an attempt to reduce the memory involved, all that does is reduce the size of the resulting NSData you write to persistent storage; loading that image into the image view will still require something on the order of (4 × width × height) bytes (note: that's the dimensions of the image, not the image view). Thus a 3264 × 2448 image from an iPhone takes up about 30 MB, regardless of the quality factor employed by UIImageJPEGRepresentation.
Usually I will make sure my collection/table view uses a thumbnail representation (either the thumbnail property of the ALAsset, or I resize the default representation myself). If I'm caching this thumbnail anywhere (such as the persistent storage suggested by your question), I can then use a high-quality representation of the thumbnail (I use PNG because it's lossless with compression, but JPEG with a 0.7-0.9 quality factor is a reasonably faithful version, too).
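A minimal sketch of that resizing approach in current Swift (my code, not part of the original answer): ImageIO can produce a downscaled image directly from the file, so the full-size bitmap never has to sit in memory alongside your collection view.

import ImageIO
import UIKit

// Build a small thumbnail straight from the file on disk, avoiding the
// ~30 MB decoded bitmap of a full 3264 x 2448 photo.
func thumbnail(atPath path: String, maxPixelSize: Int) -> UIImage? {
    let url = URL(fileURLWithPath: path) as CFURL
    guard let source = CGImageSourceCreateWithURL(url, nil) else { return nil }
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true, // honor EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}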