I am developing a share extension for photos for my iOS app. Inside the extension, I am able to successfully retrieve the UIImage object from the NSItemProvider.
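For reference, retrieving the image from the item provider looks roughly like this (a simplified sketch in current Swift syntax, not my exact code; the loaded item can come back as a UIImage, NSData, or a file URL depending on the host app):

import MobileCoreServices
import UIKit

// Grab the first image attachment handed to the share extension.
if let extensionItem = extensionContext?.inputItems.first as? NSExtensionItem,
    let provider = extensionItem.attachments?.first as? NSItemProvider,
    provider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
    provider.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil) { item, error in
        if let image = item as? UIImage {
            // use the image
        }
    }
}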
However, I would like to be able to share the image with my container app, without having to store the entire image data inside my shared user defaults. Is there a way to get the PHAsset of the image that the user has chosen in the share extension (if they have picked from their device)?
The documentation on the photos framework (https://developer.apple.com/library/ios/documentation/Photos/Reference/Photos_Framework/) has a line that says "This architecture makes it easy, safe, and efficient to work with the same assets from multiple threads or multiple apps and app extensions."
That line makes me think there is a way to share the same PHAsset between the extension and the container app, but I have yet to figure out how. Is there a way to do that?
This only works if the NSItemProvider gives you a URL with the format:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0007.PNG
which is not always true for all your assets, but if it returns a URL as:
file:///var/mobile/Media/PhotoData/OutgoingTemp/2AB79E02-C977-4B4A-AFEE-60BC1641A67F.JPG
then PHAsset will never find your asset. Furthermore, the latter is a copy of your file, so if you happen to share a very large image or video, iOS will duplicate it in that OutgoingTemp directory. The documentation doesn't say when it will be deleted; hopefully soon enough.
I think this is a big gap Apple has left between share extensions and the PHPhotoLibrary framework. Apple should create an API to close it, and soon.
You can get a PHAsset if the image is shared from the Photos app. The item provider will give you a URL that contains the image's file name; you can use this to match the PHAsset.
/// Assets handled through handleImageItem:completionHandler:
private var handledAssets = [PHAsset]()

/// Key is the matched asset's original file name without suffix, e.g. IMG_193.
private lazy var imageAssetDictionary: [String : PHAsset] = {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssetsWithOptions(options)
    var assetDictionary = [String : PHAsset]()
    for i in 0 ..< fetchResult.count {
        let asset = fetchResult[i] as! PHAsset
        let fileName = asset.valueForKey("filename") as! String
        let fileNameWithoutSuffix = fileName.componentsSeparatedByString(".").first!
        assetDictionary[fileNameWithoutSuffix] = asset
    }
    return assetDictionary
}()
...
provider.loadItemForTypeIdentifier(imageIdentifier, options: nil) { imageItem, _ in
    if let image = imageItem as? UIImage {
        // handle UIImage
    } else if let data = imageItem as? NSData {
        // handle NSData
    } else if let url = imageItem as? NSURL {
        // Prefix check: the image was shared from the Photos app
        if let imageFilePath = url.path where imageFilePath.hasPrefix("/var/mobile/Media/") {
            for component in imageFilePath.componentsSeparatedByString("/") where component.containsString("IMG_") {
                // photo: /var/mobile/Media/DCIM/101APPLE/IMG_1320.PNG
                // edited photo: /var/mobile/Media/PhotoData/Mutations/DCIM/101APPLE/IMG_1309/Adjustments/FullSizeRender.jpg
                // Cut the file's suffix, if any, to get a file name like IMG_1309.
                let fileName = component.componentsSeparatedByString(".").first!
                if let asset = imageAssetDictionary[fileName] {
                    handledAssets.append(asset)
                    imageCreationDate = asset.creationDate
                }
                break
            }
        }
    }
}
I am trying to load an SCNParticleSystem from a downloaded bundle, but I am not able to load it.
Path for the resource:
file:///var/mobile/Containers/Data/Application/A91E9970-CDE1-43D8-B822-4B61EFC6149B/Documents/so/solarsystem.bundle/Contents/Resources/
let objScene = SCNParticleSystem(named: "stars", inDirectory: directory)
This object is nil.
This is a legitimate problem, since SceneKit does not provide an out-of-the-box solution for initializing particle systems from files outside the main bundle (the only init method, SCNParticleSystem.init(named:inDirectory:), implies that .scnp files live in the main bundle).
Luckily for us .scnp files are just encoded/archived SCNParticleSystem instances that we can easily decode/unarchive using NSKeyedUnarchiver:
extension SCNParticleSystem {
    static func make(fromFileAt url: URL) -> SCNParticleSystem? {
        guard let data = try? Data(contentsOf: url),
            let object = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data),
            let system = object as? SCNParticleSystem else { return nil }
        return system
    }
}
If you do not need to support iOS 9 and iOS 10, you can use NSKeyedUnarchiver.unarchivedObject(ofClass: SCNParticleSystem.self, from: data), which was introduced in iOS 11.0, instead of NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(_:) and the type cast.
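For completeness, a minimal sketch of that iOS 11+ variant; it is the same idea as above with only the unarchiving call swapped (the method name makeSecure is just illustrative):

import SceneKit

extension SCNParticleSystem {
    // Sketch: decode a .scnp file from an arbitrary URL using the iOS 11+ secure unarchiving API.
    @available(iOS 11.0, *)
    static func makeSecure(fromFileAt url: URL) -> SCNParticleSystem? {
        do {
            let data = try Data(contentsOf: url)
            return try NSKeyedUnarchiver.unarchivedObject(ofClass: SCNParticleSystem.self, from: data)
        } catch {
            return nil
        }
    }
}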
Another issue that you're most likely to encounter is missing particle images. That is because, by default, SceneKit will look for them in the main bundle. As of the current versions of iOS (iOS 12) and Xcode (Xcode 10), particle images in .scnp files (the particleImage property) are String values naming texture files in the main bundle (that might change, but probably won't; in any case there's not much else we could use).
So my suggestion is to take that filename and look for the texture file with the same name in the same directory where the .scnp file is:
extension SCNParticleSystem {
    static func make(fromFileAt url: URL) -> SCNParticleSystem? {
        guard let data = try? Data(contentsOf: url),
            let object = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data),
            let system = object as? SCNParticleSystem else { return nil }
        if let particleImageName = system.particleImage as? String {
            let particleImageURL = url
                .deletingLastPathComponent()
                .appendingPathComponent(particleImageName)
            if FileManager.default.fileExists(atPath: particleImageURL.path) {
                system.particleImage = particleImageURL
            }
        }
        return system
    }
}
You can just set the URL of the image file and SceneKit will handle it from there.
As a little side note, the recommended directory for downloadable content is the Application Support directory, not Documents.
Application Support: Use this directory to store all app data files except those associated with the user’s documents. For example, you might use this directory to store app-created data files, configuration files, templates, or other fixed or modifiable resources that are managed by the app. An app might use this directory to store a modifiable copy of resources contained initially in the app’s bundle. A game might use this directory to store new levels purchased by the user and downloaded from a server.
(from File System Basics)
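If it helps, a small sketch of resolving (and creating, if needed) an Application Support subdirectory before moving the downloaded bundle into it; the "DownloadedBundles" folder name is just an example:

import Foundation

// Sketch: locate Application Support and make sure a subdirectory for downloaded content exists.
func downloadsDirectory() throws -> URL {
    let appSupport = try FileManager.default.url(for: .applicationSupportDirectory,
                                                 in: .userDomainMask,
                                                 appropriateFor: nil,
                                                 create: true)
    let downloads = appSupport.appendingPathComponent("DownloadedBundles", isDirectory: true)
    try FileManager.default.createDirectory(at: downloads, withIntermediateDirectories: true, attributes: nil)
    return downloads
}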
I don't have enough rep to add a comment, so I'm adding it as an answer.
The answer by Lësha Turkowski certainly works, but I had issues with loading the particle images using only the NSURL.
All particles were appearing as squares, which matches the documented default:
If the value is nil (the default), SceneKit renders each particle as a
small white square (colorized by the particleColor property).
SCNParticleSystem particleImage
In the documentation it says: "You may specify an image using an NSImage (in macOS) or UIImage (in iOS) instance, or an NSString or NSURL instance containing the path or URL to an image file."
Instead of using the NSURL, I ended up using a UIImage, and it loaded up fine.
extension SCNParticleSystem {
    static func make(fromFileAt url: URL) -> SCNParticleSystem? {
        guard let data = try? Data(contentsOf: url),
            let object = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data),
            let system = object as? SCNParticleSystem else { return nil }
        if let particleImageName = system.particleImage as? String {
            let particleImageURL = url
                .deletingLastPathComponent()
                .appendingPathComponent(particleImageName)
            if FileManager.default.fileExists(atPath: particleImageURL.path) {
                // load up the NSURL contents in a UIImage
                let particleUIImage = UIImage(contentsOfFile: particleImageURL.path)
                system.particleImage = particleUIImage
            }
        }
        return system
    }
}
I found out that sometimes, when dragging an SCNParticleSystem file into your project (probably from a different project), a silent error can happen due to some bugs in Xcode. As a result you can't get a reference to an instance of your SCNParticleSystem.
Solution: check your target's build settings. Both the SCNParticleSystem file AND the associated image file should be listed there, and then you should get a valid reference.
I am saving images to a custom album after either selection or camera capture completes. Obviously, after camera capture there is only one image, but when a user selects images in the gallery picker, saving that image to the custom album in the completion handler ALWAYS creates a duplicate. Both in the gallery and in the root photo album. Everywhere, it seems. I cannot check the ID to see if it was created before, because a new ID is created with the placeholder.
Is there a way to get the base image reference ID so that I can associate EVERY image with the original? As I understand it, iOS (I hate iOS, btw) saves only one actual image and the rest are just pointers to the original image object. If that is the case, I would expect there to be a way to get a solid reference to the original image, and from there I could easily manage assets created from that base image.
public static func addNewImage(_ image: UIImage, toAlbum albumName: String, imageID: String?, onSuccess success: @escaping (String) -> Void, onFailure failure: @escaping (Error?) -> Void) {
    guard let album = self.getAlbum(withName: albumName) else {
        failure(SDPhotosHelper.albumNotFoundError)
        return
    }
    var localIdentifier = String()
    if imageID != nil {
        if self.hasImageInAlbum(withIdentifier: imageID!, fromAlbum: albumName) {
            failure(SDPhotosHelper.albumNotFoundError)
            return
        }
    }
    PHPhotoLibrary.shared().performChanges({
        let albumChangeRequest = PHAssetCollectionChangeRequest(for: album)
        let assetCreationRequest = PHAssetChangeRequest.creationRequestForAsset(from: image)
        //assetCreationRequest.location = "";
        let placeHolder = assetCreationRequest.placeholderForCreatedAsset
        albumChangeRequest?.addAssets([placeHolder!] as NSArray)
        if placeHolder != nil {
            localIdentifier = (placeHolder?.localIdentifier)!
        }
    }) { (didSucceed, error) in
        OperationQueue.main.addOperation({
            didSucceed ? success(localIdentifier) : failure(error)
        })
    }
}
No one assisted with this, but luckily I was able to find the solution. For anyone who comes across this, or the similar question that was also sitting around with the crickets (Choosing a picture causes resave to camera roll), here is a solution.
The code I have above CREATES A NEW ASSET. It is useful only for saving the image to your custom album after the user has taken a picture with the camera; it is for brand new assets.
However, for existing assets, you do not want to create a new asset. Instead, you want to add the existing asset to the custom album. To do this, you need a different method. Here is the code I created and it seems to be working. Keep in mind that you will have to get the asset ID FIRST, so that you can send it to your method and access the existing asset.
So, in your imagePickerController, you have to determine whether the user chose an existing image or whether the method is being called from a new camera action.
let pickerSource = picker.sourceType
switch pickerSource {
case .savedPhotosAlbum, .photoLibrary:
    if let refURL = info[UIImagePickerControllerReferenceURL] as? NSURL,
        let refURLString = refURL.absoluteString {
        /* refURLString looks something like
           assets-library://asset/asset.JPG?id=82A6E75C-EA55-4C3A-A988-4BF8C7F3F8F5&ext=JPG */
        let refID = extractAssetID(from: refURLString) // hypothetical helper to pull the "id" query param; a sketch follows below
        /* The above gets you the asset ID. You can also get the asset directly,
           but that is only available in iOS 11+. */
        MYPHOTOHELPERCLASS.transferImage(toAlbum: "myalbumname", withID: refID!, ...)
    }
case .camera:
    ...
}
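The id-extraction step above is left as a placeholder; here is one way such a helper could look (the name extractAssetID(from:) is purely illustrative), using URLComponents to pull the id query parameter out of the assets-library URL:

import Foundation

// Hypothetical helper: pull the "id" query parameter out of an
// assets-library://asset/asset.JPG?id=...&ext=JPG style URL string.
func extractAssetID(from urlString: String) -> String? {
    guard let components = URLComponents(string: urlString) else { return nil }
    return components.queryItems?.first(where: { $0.name == "id" })?.value
}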
Now, in your photo helper class (or in any function anywhere, whatever), to EDIT the asset instead of creating a new one, this is what I have. I am assuming the changeRequest variable can be omitted; I was just playing around until I got this right. Going through the completely ridiculous Apple docs, I was at least able to notice that there were other methods to play with. I found that the NSFastEnumeration parameter can be an NSArray of PHAssets, and not just placeholder PHObjectPlaceholder objects.
public static func transferImage(toAlbum albumName: String, withID imageID: String, onSuccess success: @escaping (String) -> Void, onFailure failure: @escaping (Error?) -> Void) {
    guard let album = self.getAlbum(withName: albumName) else {
        // ... failure here, albumNotFoundError
        return
    }
    if self.hasImageInAlbum(withIdentifier: imageID, fromAlbum: albumName) {
        // ... failure here, image already exists in the album, do not make another
        return
    }
    let theAsset = self.getExistingAsset(withLocalIdentifier: imageID)
    if theAsset == nil {
        // ... failure, no asset for asset id
        return
    }
    PHPhotoLibrary.shared().performChanges({
        let albumChangeRequest = PHAssetCollectionChangeRequest(for: album)
        let changeRequest = PHAssetChangeRequest(for: theAsset!) // probably unnecessary, see note above
        let enumeration: NSArray = [theAsset!]
        let cnt = album.estimatedAssetCount
        if cnt == 0 {
            albumChangeRequest?.addAssets(enumeration)
        } else {
            albumChangeRequest?.insertAssets(enumeration, at: [0])
        }
    }) { (didSucceed, error) in
        OperationQueue.main.addOperation({
            didSucceed ? success(imageID) : failure(error)
        })
    }
}
So it is pretty much the same, except that instead of creating an asset creation request and generating a placeholder for the created asset, you use the existing asset ID to fetch the existing asset and add that existing asset to the addAssets/insertAssets NSArray parameter instead of a newly created one.
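The getExistingAsset(withLocalIdentifier:) helper used above isn't shown; a minimal sketch of what it might look like, assuming it simply fetches by local identifier:

import Photos

// Hypothetical helper: fetch an existing PHAsset by its local identifier.
public static func getExistingAsset(withLocalIdentifier identifier: String) -> PHAsset? {
    let result = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil)
    return result.firstObject
}

Note that the id pulled from an assets-library URL is not necessarily a full PHAsset local identifier; if the match fails, the deprecated PHAsset.fetchAssets(withALAssetURLs:options:) or the iOS 11+ UIImagePickerControllerPHAsset info key are possible alternatives.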
I want to retrieve the image that is stored in the storage for a user and place it next to his name in a custom UITableViewCell. The problem is that the table view loads before the images are done downloading (I think?), causing the application to crash because the image array is nil. So what is the correct way to load the table view? For the user experience, I think it is important that the cell image is shown even if the images aren't done downloading, presenting a default image that is saved in the assets. I thought about making an array of UIImages that point to the default loading asset and swapping in the profile picture once it is done downloading, but I really have no clue how to do that. This is what I have so far for downloading the image:
let storage = FIRStorage.storage()
let storageRef = storage.reference(forURL: "link.appspot.com")

channelRef?.observeSingleEvent(of: .value, with: { (snapshot) in
    if let snapDict = snapshot.value as? [String: AnyObject] {
        for each in snapDict {
            let UIDs = each.value["userID"] as? String
            if let allUIDS = UIDs {
                let profilePicRef = storageRef.child((allUIDS) + "/profile_picture.png")
                profilePicRef.data(withMaxSize: 1 * 500 * 500) { data, error in
                    if let error = error {
                        // handle the download error
                    }
                    if data != nil {
                        self.playerImages.append(UIImage(data: data!)!)
                    }
                }
            }
            let userNames = each.value["username"] as? String
            if let users = userNames {
                self.players.append(users)
            }
        }
    }
    self.tableView.reloadData()
})
This is in the cellForRow method:
cell.playersImage.image = playerImages[indexPath.row] as UIImage
My rules (I haven't changed them from the defaults):
service firebase.storage {
  match /b/omega-towers-f5beb.appspot.com/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
Thank you.
Regarding user experience, you are correct: it is standard to have some sort of default image when loading an image from a URL. A great library for image caching and using default assets in their place is AlamofireImage.
Vandan Patel's answer is correct in saying you need to ensure your array is not nil when loading the tableview. You will be given a completion block to handle any extra work you would like to do with your image, using the AlamofireImage library.
This is all assuming you are getting a correct image URL back for your Firebase users.
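A rough sketch of what that could look like in cellForRowAt, assuming you have a download URL per user and a bundled placeholder asset; the class, arrays, and asset name here are illustrative, and in AlamofireImage 3.x the UIImageView helper is af_setImage(withURL:placeholderImage:) (newer versions rename it to af.setImage):

import AlamofireImage
import UIKit

// Sketch: show a bundled placeholder right away and let AlamofireImage swap in
// (and cache) the downloaded profile picture when it arrives.
class PlayersViewController: UITableViewController {

    var players = [String]()          // user names
    var playerImageURLs = [URL]()     // hypothetical per-user download URLs

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "PlayerCell", for: indexPath)
        cell.textLabel?.text = players[indexPath.row]
        if indexPath.row < playerImageURLs.count {
            cell.imageView?.af_setImage(withURL: playerImageURLs[indexPath.row],
                                        placeholderImage: UIImage(named: "default_profile"))
        } else {
            cell.imageView?.image = UIImage(named: "default_profile")
        }
        return cell
    }
}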
You should call tableView.reloadData() when the images are done downloading. One important thing: initialize your playerImages as playerImages = [UIImage]() instead of playerImages: [UIImage]!. If it's empty, it will just be an empty array rather than nil.
Update:
if let players = playerImages {
    //code
}
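Building on that, a defensive version of the cellForRow assignment from the question that falls back to a bundled default image while a download is still in flight (the "default_profile" asset name is just an example):

// Sketch: only index into playerImages when the image for this row has
// actually finished downloading; otherwise show a default image.
if indexPath.row < playerImages.count {
    cell.playersImage.image = playerImages[indexPath.row]
} else {
    cell.playersImage.image = UIImage(named: "default_profile")
}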
Has anyone figured out how to extract the video portion from a Live Photo? I'm working on an app to convert Live Photos into a GIF, and the first step is to get the video file from the Live Photo. It seems like it should be possible, because if you plug in your phone to a Mac you can see the separate image and video files. I've kinda run into a brick wall in the extraction process, and I've tried many ways to do it and they all fail.
The first thing I did was obtain a PHAsset for what I think is the video part of the Live Photo, by doing the following:
if let livePhoto = info["UIImagePickerControllerLivePhoto"] as? PHLivePhoto {
    let assetResources = PHAssetResource.assetResourcesForLivePhoto(livePhoto)
    for assetRes in assetResources {
        if (assetRes.type == .PairedVideo) {
            let assets = PHAsset.fetchAssetsWithLocalIdentifiers([assetRes.assetLocalIdentifier], options: nil)
            if let asset = assets.firstObject as? PHAsset {
To convert the PHAsset to an AVAsset I've tried:
asset.requestContentEditingInputWithOptions(nil, completionHandler: { (contentEditingInput, info) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let movieUrl = url.absoluteString + ".mov"
        let avAsset = AVURLAsset(URL: NSURL(fileURLWithPath: movieUrl), options: nil)
        debugPrint(avAsset)
        debugPrint(avAsset.duration.value)
    }
})
I don't think this one works, because the debug print of duration.value gives 0.
I've also tried it without the ".mov" addition and it still doesn't work.
I also tried:
PHImageManager.defaultManager().requestAVAssetForVideo(asset, options: nil, resultHandler: { (avAsset, audioMix, info) -> Void in
    debugPrint(avAsset)
})
And the debugPrint(avAsset) prints nil so it doesn't work.
I'm kind of afraid they might have made this impossible to do; it seems like I'm going in circles, since the PHAsset I got still seems to be a Live Photo and not actually a video.
Use the PHAssetResourceManager to get the video file from the PHAssetResource.
PHAssetResourceManager.defaultManager().writeDataForAssetResource(assetRes,
    toFile: fileURL, options: nil, completionHandler: { error in
        // Video file has been written to the path specified via fileURL
})
NOTE: The Live Photo specific APIs were introduced in iOS 9.1
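For context, a fuller (hedged) sketch of that approach in current Swift syntax, assuming you already have the Live Photo's PHAsset and a writable destination URL:

import Photos

// Sketch: find the paired-video resource of a Live Photo asset and write it to disk.
func exportPairedVideo(of asset: PHAsset, to destinationURL: URL,
                       completion: @escaping (Error?) -> Void) {
    let resources = PHAssetResource.assetResources(for: asset)
    guard let videoResource = resources.first(where: { $0.type == .pairedVideo }) else {
        completion(nil) // no paired video found; handle however is appropriate
        return
    }
    PHAssetResourceManager.default().writeData(for: videoResource,
                                               toFile: destinationURL,
                                               options: nil,
                                               completionHandler: completion)
}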
// Suppose you have a PHAsset instance (you can get it via [PHAsset fetchAssetsWithOptions:...])
PHAssetResource *videoResource = nil;
NSArray *resourcesArray = [PHAssetResource assetResourcesForAsset:asset];
const NSInteger livePhotoAssetResourcesCount = 2;
const NSInteger videoPartIndex = 1;

if (resourcesArray.count == livePhotoAssetResourcesCount) {
    videoResource = resourcesArray[videoPartIndex];
}

if (videoResource) {
    NSString * const fileURLKey = @"_fileURL";
    NSURL *videoURL = [videoResource valueForKey:fileURLKey];
    // load the video URL using AVKit or AVFoundation
}
I accidentally did. I have an iOS app called Goodreader (available in the App Store) which features a Windows-like file manager. When importing a live photo, it saves it as a folder ending in .pvt containing the JPG and MOV files. There is only one caveat: you need to open the live photo from within the Messages app after you've sent it to yourself or somebody else to see the "import to Goodreader" option; it does not appear from the Photos app.
I recently saw this project in which a user can tap on a GIF from a custom keyboard and see a "copied" tooltip appear. I have one question:
How does one reproduce the tooltip shown in the project's GIF tutorial?
Could anyone give me some sample code to work with? I understand how to use UIPasteboard and its functions, but I can't seem to get it to work when I put the UTI type "public.png" in this function. (I noticed that in Objective-C it's @"public.png", but I used "public.png"; I couldn't find a source online for this.)
let imageURL = NSString(string:NSBundle.mainBundle().pathForResource("test", ofType: "png")!)
var data = NSData(contentsOfURL: NSURL(string:imageURL)!)
UIPasteboard.generalPasteboard().setData(data!, forPasteboardType: "public.png")
Try using this code:
let image = UIImage(named: "myimage.png")
UIPasteboard.generalPasteboard().image = image;
You can find out how this works here.
Hope this helps
Swift 5.1
UIPasteboard.general.image = image
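Since the question is about copying a GIF from a custom keyboard, note that assigning to .image pastes a still image; to keep the animation you would copy the raw GIF data under the GIF UTI instead. A hedged sketch (the bundled file name is illustrative; kUTTypeGIF resolves to "com.compuserve.gif"):

import MobileCoreServices
import UIKit

// Sketch: copy GIF data so it pastes as an animated GIF rather than a flattened image.
if let gifURL = Bundle.main.url(forResource: "test", withExtension: "gif"),
    let gifData = try? Data(contentsOf: gifURL) {
    UIPasteboard.general.setData(gifData, forPasteboardType: kUTTypeGIF as String)
}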
When using UIPasteboard.generalPasteboard().image = image, it seems the image is not copied to the pasteboard. Instead, try the following code; it also shows how you can replace the "public.png" string:
// The pasteboard is nil if full access is not granted
// 'image' is the UIImage you are about to copy to the pasteboard
if let pb = UIPasteboard.generalPasteboard() {
    let type = UIPasteboardTypeListImage[0] as! String
    if !type.isEmpty {
        pb.setData(UIImagePNGRepresentation(image), forPasteboardType: type)
        if let readData = pb.dataForPasteboardType(type) {
            let readImage = UIImage(data: readData, scale: 2)
            println("\(image) == \(pb.image) == \(readImage)")
        }
    }
}