I have an iOS app in which there are 2 ways the user can get a picture:
Select it from the photos library (UIImagePickerController)
Click it from a custom-made camera
Here is my code for clicking the image from the custom camera (this is within a custom class called Camera, which is a subclass of UIView):
func clickPicture(completion: @escaping (UIImage) -> Void) {
    guard let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) else { return }
    videoConnection.videoOrientation = .portrait
    stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
        guard let buffer = sampleBuffer else { return }
        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        let dataProvider = CGDataProvider(data: imageData! as CFData)
        let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
        let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)
        completion(image)
    })
}
Here is how I click the image within the ViewController:
@IBAction func clickImage(_ sender: AnyObject) {
    cameraView.clickPicture { (image) in
        // use "image" variable
    }
}
Later, I attempt to upload this picture to the user's iCloud account using CloudKit. However, I receive an error saying the record is too large. I then came across this SO post, which says to use a CKAsset. However, the only constructor for a CKAsset requires a URL.
Is there a generic way I can get a URL from any UIImage? Otherwise, how can I get a URL from the image I clicked using my custom camera? (I have seen other posts about getting a URL from a UIImagePickerController.) Thanks!
CKAsset represents an external file (image, video, binary data, etc.), which is why it requires a URL as its init parameter.
In your case I would recommend the following steps to upload a large image to CloudKit:
Save UIImage to local storage (e.g. documents directory).
Initialize CKAsset with path to image in local storage.
Upload asset to Cloud.
Delete the image from local storage once the upload has completed.
Here is some code:
// Save image.
let path = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first!
let filePath = "\(path)/MyImageName.jpg"
try? UIImageJPEGRepresentation(image, 1)!.write(to: URL(fileURLWithPath: filePath))
let asset = CKAsset(fileURL: URL(fileURLWithPath: filePath))
// Upload asset here.
// Delete image.
do {
try FileManager.default.removeItem(atPath: filePath)
} catch {
print(error)
}
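For completeness, here is a minimal sketch of the "Upload asset here." step, continuing from the snippet above. The record type "Photo" and the field name "image" are placeholders, not anything defined by CloudKit or the original post. Running the delete from the completion handler keeps it in line with step 4 (delete only after the upload has finished):
// Upload: attach the asset to a record and save it to the private database.
let record = CKRecord(recordType: "Photo")   // placeholder record type
record["image"] = asset
CKContainer.default().privateCloudDatabase.save(record) { _, error in
    if let error = error {
        print("Upload failed: \(error)")
        return
    }
    // Upload finished: it is now safe to delete the local copy.
    try? FileManager.default.removeItem(atPath: filePath)
}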
I have a portrait image with depth data, and after some processing I want to save a copy of it to the user's photo album with the depth data preserved (UIImage is not an option in this case). For this task I am using writeJPEGRepresentation(), which seems to save the modified image with the depth info successfully somewhere; however, it does not show up in the photo album.
To make it appear in the photo album, I then tried the performChanges() function of PHPhotoLibrary.
This time it did appear in the album, but it was the original image, not the modified one!?
Any help highly appreciated. Thanks.
Here is the code:
func saveWithDepth(image: CIImage) {
    do {
        let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)
        let depthdata = DepthData
        let url = Url
        try Context.writeJPEGRepresentation(of: image, to: url!, colorSpace: colorSpace!,
                                            options: [CIImageRepresentationOption.avDepthData: depthdata!])

        PHPhotoLibrary.shared().performChanges({
            let options = PHAssetResourceCreationOptions()
            let creationRequest = PHAssetCreationRequest.forAsset()
            creationRequest.addResource(with: .alternatePhoto, fileURL: url!, options: options)
        }, completionHandler: { success, error in
            if !success {
                print("AVCam couldn't save the movie to your photo library: \(String(describing: error))")
            }
        })
    } catch {
        print("failed")
    }
}
I think the problem is that JPEG can't store depth data (as far as I know). HEIF would be the format you should use for that. Maybe you can try something like this:
func saveWithDepth(image: CIImage, depthData: AVDepthData, context: CIContext) {
    let colorSpace = CGColorSpace(name: CGColorSpace.sRGB)!
    // heifRepresentation(of:format:colorSpace:options:) returns nil if encoding fails
    guard let imageData = context.heifRepresentation(of: image,
                                                     format: .BGRA8,
                                                     colorSpace: colorSpace,
                                                     options: [CIImageRepresentationOption.avDepthData: depthData]) else {
        print("Couldn't create HEIF representation")
        return
    }
    PHPhotoLibrary.shared().performChanges({
        let options = PHAssetResourceCreationOptions()
        let creationRequest = PHAssetCreationRequest.forAsset()
        creationRequest.addResource(with: .photo, data: imageData, options: options)
    }, completionHandler: { success, error in
        if !success {
            print("Couldn't save the photo to your photo library: \(String(describing: error))")
        }
    })
}
A few remarks:
I assume depthData is actually a meaningful value? (See the sketch after these remarks for where it could come from.)
You can create and later pass the image data directly to the creationRequest. Then you don't need to save a file to some intermediate location (which you would need to delete afterward).
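As a hypothetical call site (not part of the original question), the pieces could come straight out of an AVCapturePhoto in the capture delegate; this matches the parameter-taking version of saveWithDepth sketched above and assumes the AVCapturePhotoOutput was configured with isDepthDataDeliveryEnabled = true:
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    // Pull the encoded image, a CIImage, and the depth data out of the capture.
    guard let data = photo.fileDataRepresentation(),
          let ciImage = CIImage(data: data),
          let depthData = photo.depthData else { return }
    saveWithDepth(image: ciImage, depthData: depthData, context: CIContext())
}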
Problem: Loading a UIImage from a URL obtained from a PHAsset returns nil.
Context: I have a UI component that works with image URLs and is populated in different parts of the app by different providers. In a specific case I receive a PHAsset.
Code for requesting the PHAsset URL:
let inputRequestOptions = PHContentEditingInputRequestOptions()
inputRequestOptions.isNetworkAccessAllowed = true

asset.requestContentEditingInput(with: inputRequestOptions, completionHandler: { (edittingInput, _) in
    guard let url = edittingInput?.fullSizeImageURL else {
        return
    }

    switch asset.mediaType {
    case .image: editMediaContentContainerPresenter.assets = [.image(url)]
    case .video: editMediaContentContainerPresenter.assets = [.movie(url)]
    default:
        debugPrint("unsupported asset")
    }
})
Code for loading the URL into a UIImage inside my UI component:
func image(for url: URL, maximumPixelSize: CGFloat) -> UIImage? {
    let options = [kCGImageSourceCreateThumbnailWithTransform: true,
                   kCGImageSourceCreateThumbnailFromImageAlways: true,
                   kCGImageSourceThumbnailMaxPixelSize: maximumPixelSize] as CFDictionary

    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
        return nil
    }
    guard let imageReference = CGImageSourceCreateThumbnailAtIndex(source, 0, options) else {
        return nil
    }

    return UIImage(cgImage: imageReference, scale: UIScreen.main.scale, orientation: .up)
}
Special considerations: I need to transform the PHAsset into a URL because the current UI component works with many other providers and parts of the app, and all of them resolve to URLs. Sometimes loading from that URL works, but at most once; more often it fails to load a UIImage from the provided URL.
Question: How can I obtain a valid URL?
Solutions already considered, but to be avoided:
loading the image with PHAsset requests and then saving it separately to a local URL
passing the UIImage instead of URLs
Now that AssetsLibrary has been deprecated, we're supposed to use the Photos framework, specifically PHPhotoLibrary, to save images and videos to a user's camera roll.
Using ReactiveCocoa, such a request would look like:
func saveImageAsAsset(url: NSURL) -> SignalProducer<String, NSError> {
    return SignalProducer { observer, disposable in
        var imageIdentifier: String?

        PHPhotoLibrary.sharedPhotoLibrary().performChanges({
            let changeRequest = PHAssetChangeRequest.creationRequestForAssetFromImageAtFileURL(url)
            let placeholder = changeRequest?.placeholderForCreatedAsset
            imageIdentifier = placeholder?.localIdentifier
        }, completionHandler: { success, error in
            if let identifier = imageIdentifier where success {
                observer.sendNext(identifier)
            } else if let error = error {
                observer.sendFailed(error)
                return
            }
            observer.sendCompleted()
        })
    }
}
I created a gif from a video using Regift, and I can verify that the gif exists inside my temporary directory. However, when I go to save that gif to the camera roll, I get a mysterious error: NSCocoaErrorDomain -1 (null), which is really super helpful.
Has anyone ever experienced this issue?
You can try this.
let data = try? Data(contentsOf: /*Your-File-URL-Path*/)

PHPhotoLibrary.shared().performChanges({
    PHAssetCreationRequest.forAsset().addResource(with: .photo, data: data!, options: nil)
}, completionHandler: nil)
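Since NSCocoaErrorDomain -1 hides the real problem, a variant with a completion handler will at least surface the underlying error. This is only a sketch; the saveGIF helper and gifURL parameter are hypothetical names, not from the original post:
func saveGIF(at gifURL: URL) {
    guard let data = try? Data(contentsOf: gifURL) else {
        print("Could not read GIF data from \(gifURL)")
        return
    }
    PHPhotoLibrary.shared().performChanges({
        PHAssetCreationRequest.forAsset().addResource(with: .photo, data: data, options: nil)
    }, completionHandler: { success, error in
        if !success {
            // The Error here is usually more descriptive than the bare NSCocoaErrorDomain code.
            print("Saving GIF failed: \(String(describing: error))")
        }
    })
}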
I'm creating a video app where a user records a video and adds some additional information. The user can then view the video and its information on a separate screen.
On the information screen I'm trying to display a still from the video. This works if I open the information screen within the same session, but when I recompile and run the application the stills no longer appear. I get the following error:
NSUnderlyingError=0x162b9350 "The operation couldn’t be completed. No such file or directory", NSLocalizedDescription=The requested URL was not found on this server.
The video is on the device somewhere, as I can view it via the built-in 'Photos' app.
I'm using the following code to save the URL string and to generate the preview image.
Saving the video url
let videoURL = info[UIImagePickerControllerMediaURL] as! NSURL
let videoData = NSData(contentsOfURL: videoURL)
let path = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as! String
let dataPath = path.stringByAppendingPathComponent("-cached.MOV")
videoData.writeToFile(dataPath, atomically: false)
NSUserDefaults.standardUserDefaults().setObject(dataPath, forKey: "dataPath")
videoNote.url = dataPath
Creating preview image
// filePathLocal == videoNote.url
func videoSnapshot(filePathLocal: NSString) -> UIImage? {
    let vidURL = NSURL(fileURLWithPath: filePathLocal as String)
    let asset = AVURLAsset(URL: vidURL, options: nil)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true

    let timestamp: CMTime = asset.duration
    var error: NSError?

    if let imageRef = generator.copyCGImageAtTime(timestamp, actualTime: nil, error: &error) {
        return UIImage(CGImage: imageRef)
    } else {
        print("Image generation failed with error \(error)")
        return nil
    }
}
Any help much appreciated
Although I never managed to get the code from the videoSnapshot() method to work, I found that I could use the following to generate a preview shot. The downside is that there is no control over the timestamp the image is taken from.
ALAssetsLibrary().assetForURL(nsURL, resultBlock: { (asset) -> Void in
    if let ast = asset {
        let image = UIImage(CGImage: ast.defaultRepresentation().fullResolutionImage().takeUnretainedValue())
        // use "image" here
    }
}, failureBlock: { error in
    print("Could not load asset: \(error)")
})
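If you do need control over the timestamp, the AVAssetImageGenerator route from videoSnapshot() can be written in current Swift to grab a frame at an explicit time. This is only a sketch, and it still assumes the file URL is valid at the moment it runs:
func snapshot(of videoURL: URL, atSeconds seconds: Double = 1.0) -> UIImage? {
    let asset = AVURLAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Request a frame at an explicit time instead of asset.duration.
    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    guard let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}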
I've got a completed app that takes photos and puts them in custom albums. I can name each album and I can retrieve all the images perfectly. However what I really need is to be able to name the individual photos (or use some kind of metadata) so that I can show them at appropriate times inside the app.
I know it can be done if you are storing the photos in the app's documents directory but I've had to move away from that and go with the device's photo library.
Has anyone got any ideas around how to do this?
P.S. I am using Objective-C, not Swift.
You can do this in two ways:
1- Saving the photo in a temporary directory. Example:
var fileManager = NSFileManager()
var tmpDir = NSTemporaryDirectory()
let filename = "YourImageName.png"
let path = tmpDir.stringByAppendingPathComponent(filename)
var error: NSError?
let imageData = UIImagePNGRepresentation(YourImageView.image!)
fileManager.removeItemAtPath(path, error: nil)
println(NSURL(fileURLWithPath: path))

if imageData.writeToFile(path, atomically: true) {
    println("Image saved.")
} else {
    println("Image not saved.")
}
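To show that image later, you can read it straight back from the same path (a sketch; note the temporary directory can be purged by the system, so don't rely on it for long-term storage):
if let savedImage = UIImage(contentsOfFile: path) {
    YourImageView.image = savedImage
}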
2- Saving using Photos Framework. Example:
if let image: UIImage = YourImageView.image {
    let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
    dispatch_async(dispatch_get_global_queue(priority, 0), {
        PHPhotoLibrary.sharedPhotoLibrary().performChanges({
            let createAssetRequest = PHAssetChangeRequest.creationRequestForAssetFromImage(image)
            let assetPlaceholder = createAssetRequest.placeholderForCreatedAsset
            if let albumChangeRequest = PHAssetCollectionChangeRequest(forAssetCollection: self.assetCollection, assets: self.photosAsset) {
                albumChangeRequest.addAssets([assetPlaceholder])
            }
        }, completionHandler: { (success, error) in
            dispatch_async(dispatch_get_main_queue(), {
                NSLog("Adding Image to Library -> %@", (success ? "Success" : "Error!"))
            })
        })
    })
}
You can check this project sample.
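Photos itself does not store a per-asset "name", so one workable pattern (a sketch in Swift, not part of the answer above; photoNames and the sample string are placeholders) is to keep the created asset's localIdentifier, map your own metadata to it, and fetch by identifier when you need to display the photo:
// Persist this dictionary yourself (plist, Core Data, etc.), keyed by localIdentifier.
var photoNames: [String: String] = [:]

var placeholderIdentifier: String?
PHPhotoLibrary.shared().performChanges({
    let request = PHAssetChangeRequest.creationRequestForAsset(from: image)
    placeholderIdentifier = request.placeholderForCreatedAsset?.localIdentifier
}, completionHandler: { success, _ in
    if success, let identifier = placeholderIdentifier {
        photoNames[identifier] = "Kitchen - after"   // your own "name"/metadata for this photo
    }
})

// Later, fetch the assets back by identifier to display them.
let result = PHAsset.fetchAssets(withLocalIdentifiers: Array(photoNames.keys), options: nil)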