Parse array images saving and fetching - iOS

I have a mosaic app that takes photos of various sizes and breaks them into smaller photos. Depending on the size of the photo, the number of smaller photos can vary. I have an NSMutableArray named imageNameList2 that holds all of the smaller images taken from the larger image. For this example the images are pulled from the image assets list to make the question easier to answer.
Here is imageNameList (the array that holds the names of all the smaller images):
var imageNameList: [String] {
    var imageNameList2: [String] = []
    for i in 0...149 {
        let imageName = String(format: "pic_%03d", i)
        imageNameList2.append(imageName)
    }
    return imageNameList2
}
What I'd like to do is have a continue button that will save all these images in order as PFFiles, or whatever format works best with Parse, and have another button called retrieve that will fetch all these photos back from Parse. I basically have a Parse Server that utilizes the Parse frameworks to help speed up the backend process. Can you please show me how I would save and retrieve this NSMutableArray when there is a different number of stored images each time?

I think you're trying to do something like this. This is just an example. There's a lot of work to be done but hopefully this will get you started. I did not run or test this code.
The idea is to save your images as PFFiles, and create a 'tile' PFObject for each file. Then save all the 'tile' PFObjects to a 'tiles' key of the image PFObject. Then recall the image when you need it by objectId.
Good luck.
let appleTiles = ["apple1", "apple2", "apple3"]
let orangeTiles = ["orange1", "orange2", "orange3", "orange4", "orange5"]
func usage() {
    // don't literally run these synchronously like this
    post(appleTiles)
    post(orangeTiles)
    download()
}
func post(_ tileNames: [String]) {
    let image = PFObject(className: "Image")
    let tilesPF = tileNames.map { name -> PFObject in
        let data = UIImagePNGRepresentation(UIImage(named: name)!)!
        let file = PFFile(data: data)
        let tile = PFObject(className: "Tile")
        tile["tile"] = file
        return tile
    }
    image["tiles"] = tilesPF
    image.saveInBackground { success, error in
        // you'll want to save the objectId of the PFObject if you want to retrieve a specific image later
    }
}
func download() {
    let query = PFQuery(className: "Image")
    // add this if you have a specific image you want to get
    query.whereKey("objectId", equalTo: "someObjectId")
    query.findObjectsInBackground { result, error in
        // this is probably close to how you'd unwrap everything but again, I didn't test it so...
        if let objects = result, let first = objects.first, let tiles = first["tiles"] as? [PFObject] {
            tiles.forEach { tile in
                let file = tile["tile"] as? PFFile
                // now you have an individual PFFile for a tile, do something with it
            }
        }
    }
}

Related

Adding image from Firebase to UITableViewCell

I want to retrieve the image that is stored in the storage for a user and place it next to his name in a custom UITableViewCell. The problem now is that the tableview loads while the images aren't done downloading (I think?), causing the application to crash because the image array is nil. So what is the correct way to load the tableview? For the user experience, I think it is important that the cell's image is shown even if the images aren't done downloading, presenting a default image that is saved in the assets. I thought about making an array of UIImages that points to the default loading asset and swapping in the profile picture when it is done downloading, but I really have no clue how to do that. This is what I have so far for downloading the image:
let storage = FIRStorage.storage()
let storageRef = storage.reference(forURL: "link.appspot.com")

channelRef?.observeSingleEvent(of: .value, with: { (snapshot) in
    if let snapDict = snapshot.value as? [String: AnyObject] {
        for each in snapDict {
            let UIDs = each.value["userID"] as? String
            if let allUIDS = UIDs {
                let profilePicRef = storageRef.child((allUIDS) + "/profile_picture.png")
                profilePicRef.data(withMaxSize: 1 * 500 * 500) { data, error in
                    if let error = error {
                    }
                    if (data != nil) {
                        self.playerImages.append(UIImage(data: data!)!)
                    }
                }
            }
            let userNames = each.value["username"] as? String
            if let users = userNames {
                self.players.append(users)
            }
        }
    }
    self.tableView.reloadData()
})
This is in the cellForRow
cell.playersImage.image = playerImages[indexPath.row] as UIImage
My rules (I haven't changed them from the defaults):
service firebase.storage {
  match /b/omega-towers-f5beb.appspot.com/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
Thank you.
Regarding user experience, you are correct. It is standard to have some sort of default image when loading an image from a URL. A great library for image caching and showing a default asset in its place is AlamofireImage.
Vandan Patel's answer is correct in saying you need to ensure your array is not nil when loading the tableview. You will be given a completion block to handle any extra work you would like to do with your image, using the AlamofireImage library.
This is all assuming you are getting a correct image URL back for your Firebase users.
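As a rough illustration of that pattern (untested, and assuming the cell exposes the playersImage image view from the question, a bundled asset named "default_profile", and an array of URL strings called profilePictureURLs - the last two names are placeholders), cellForRowAt could hand the work to AlamofireImage like this:
import AlamofireImage

// In cellForRowAt: show the bundled placeholder immediately, then let
// AlamofireImage download, cache, and swap in the profile picture when it arrives.
let placeholder = UIImage(named: "default_profile")            // illustrative asset name
if let url = URL(string: profilePictureURLs[indexPath.row]) {  // illustrative array of URL strings
    cell.playersImage.af_setImage(withURL: url, placeholderImage: placeholder)
} else {
    cell.playersImage.image = placeholder
}
This way the tableview can be reloaded as soon as the usernames arrive; the images fill in cell by cell as they finish downloading.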
You should call tableView.reloadData() when the images are done downloading. One important thing: initialize playerImages as playerImages = [UIImage]() instead of playerImages: [UIImage]!. That way, even if it's empty, your array won't be nil.
Update:
if let players = playerImages {
    //code
}
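A hedged sketch of that sequencing (untested, reusing the names from the question's snippet): a DispatchGroup counts the pending Firebase downloads, and the table is reloaded once, after the last completion block has fired.
let downloadGroup = DispatchGroup()

channelRef?.observeSingleEvent(of: .value, with: { snapshot in
    guard let snapDict = snapshot.value as? [String: AnyObject] else { return }
    for each in snapDict {
        guard let user = each.value as? [String: AnyObject],
              let uid = user["userID"] as? String else { continue }
        let profilePicRef = storageRef.child(uid + "/profile_picture.png")
        downloadGroup.enter()                              // one pending download
        profilePicRef.data(withMaxSize: 1 * 500 * 500) { data, error in
            if let data = data, let image = UIImage(data: data) {
                self.playerImages.append(image)
            }
            downloadGroup.leave()                          // done, whether it succeeded or failed
        }
    }
    // Reload exactly once, after every download has called leave().
    downloadGroup.notify(queue: .main) {
        self.tableView.reloadData()
    }
})
The username handling from the question slots into the same loop unchanged; the only difference is that the reload now waits for the group.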

Swift - How to retrieve multiple images at once (GCD)?

Let me give you some insight into my application itself.
In short, I am creating a social-networking app. Each post consists of an image, a profile picture, and a caption, and each post exists in my MySQL database. I am using my own framework to retrieve each post. However, once I retrieve a post I still have to retrieve the profile picture and image using the URLs which I retrieved from the database. I would like to retrieve all images at once rather than one after another in sequential order.
As of now, there are about 5 posts in the database. Loading the necessary images for one post takes about 4 seconds. Right now I am loading the images for one post and then retrieving the next in sequential order, so this whole process takes around 20 seconds. Say I have 50 posts; then it will take an extremely long time to load them all. I have some knowledge of GCD (Grand Central Dispatch queues), but I don't know how to implement it in my app.
Here is my code for retrieving my posts and images:
ConnectionManager.sharedInstance.retrievePosts(UserInformationInstance.SCHOOL) {
    (result: AnyObject) in
    if let posts = result as? [[String: AnyObject]] {
        print("Retrieved \(posts.count) posts.")
        for post in posts {
            let postIDCurrent = post["id"] as? Int
            var UPVOTES = 0;
            var UPVOTED: Bool!
            var query = ""
            if let profilePictureCurrent = post["profile_picture"] {
                // Loading profile picture image
                let url = NSURL(string: profilePictureCurrent as! String)
                let data = NSData(contentsOfURL: url!)
                let image = UIImage(data: data!)
                UserInformationInstance.postsProfilePictures.append(image!)
                print("added profile pic")
            } else {
                print("error")
            }
            if let postPictureCurrent = post["image"] {
                if (postPictureCurrent as! String != "") {
                    // Loading image associated with post
                    let url = NSURL(string: postPictureCurrent as! String)
                    let data = NSData(contentsOfURL: url!)
                    let image = UIImage(data: data!)
                    let imageArray: [AnyObject] = [postIDCurrent!, image!]
                    UserInformationInstance.postsImages.append(imageArray)
                    print("added image pic")
                }
            } else {
                print("error")
            }
            UserInformationInstance.POSTS.append(post)
        }
    } else {
        self.loadSearchUsers()
    }
}
So my question is, how can I retrieve all the images at the same time instead of retrieving one after the other?
It would be great if someone could give an explanation as well as some code :)
I would recommend revising your approach. If your server is fine - not busy and easily reachable - so that downloading is limited by the device's network bandwidth (X Mbps), then it does not matter whether you download the images concurrently or sequentially.
Let me show this. If you download 10 files of size Y MB simultaneously, each file downloads at one tenth of the bandwidth, so the total time is the same as downloading them one after another:
X/10 - download speed per file
Time = Amount / Speed
T = Y / (X/10) = 10 * Y / X
Now if you are downloading sequentially:
T = 10 * (Y / X) = 10 * Y / X
I would recommend showing posts immediately once you have retrieved them from storage, then start the image downloads asynchronously and set each image once it has finished downloading. That's standard practice in the industry; consider the Facebook, Twitter, and Instagram apps.
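A minimal sketch of that pattern (Swift 3 syntax, untested; loadImage and the image view parameter are illustrative names): render the post right away, then kick off an asynchronous download and update the image view when the data arrives.
import UIKit

// Illustrative helper: download an image off the main thread and set it on
// the given image view once it has arrived.
func loadImage(from urlString: String, into imageView: UIImageView) {
    guard let url = URL(string: urlString) else { return }
    URLSession.shared.dataTask(with: url) { data, _, error in
        guard let data = data, error == nil, let image = UIImage(data: data) else { return }
        DispatchQueue.main.async {
            imageView.image = image   // UI work goes back on the main queue
        }
    }.resume()
}
In a real table or collection view you would also guard against cell reuse (check that the cell still represents the same post before assigning the image) and cache downloaded images so scrolling back does not re-download them.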

Fetching CKAsset Image From CloudKit is Very Slow

I am using CloudKit as a server backend for my iOS application. I'm using it to house some relatively static data along with a handful of images (CKAsset). I ran into a problem when the time came for me to actually fetch those assets from the public database. They load at an excruciatingly slow speed.
My use case is to load an image into every cell inside of a collection view. The images are only 200kb in size, but the fetch process took an average of 2.2 seconds for the download to complete and set the image in a cell. For comparison, I took URLs of similar sized stock images and loaded them in using NSURLSession. It took a mere 0.18 - 0.25 seconds for each image to load.
I have tried multiple different ways of downloading the images from CK: direct fetch of the record, query, and operation query. All of them have similar results. I am also dispatching back to the main queue within the completion block prior to setting the image for the cell.
My database is set up to have a primary object with several fields of data. I then set up a backwards-reference style system for the photos, where each photo just has a reference to a primary object. That way I can load the photos on demand without bogging down the main data.
It looks something like this:
Primary Object:
title: String, startDate: Date
Photo Object:
owner: String(reference to primary object), image: Asset
Here is an example request where I tried to directly fetch one of the photos:
let publicDb = CKContainer.defaultContainer().publicCloudDatabase
let configRecordId = CKRecordID(recordName: "e783f542-ec0f-46j4-9e99-b3e3ez505adf")
publicDb.fetchRecordWithID(configRecordId) { (record, error) -> Void in
    dispatch_async(dispatch_get_main_queue()) {
        guard let photoRecord = record else { return }
        guard let asset = photoRecord["image"] as? CKAsset else { return }
        guard let photo = NSData(contentsOfURL: asset.fileURL) else { return }
        let image = UIImage(data: photo)!
        cell.cardImageView.image = image
    }
}
I can't seem to figure out why these image downloads are taking so long, but it's really quite the showstopper if I can't get them to load in a reasonable amount of time.
Update: I tried the fetch operation with a smaller image, 23kb. The fetch was faster, anywhere from 0.3 - 1.1 seconds. That's better, but still doesn't meet the expectation that I had for what CloudKit should be able to provide.
I am using CKQueryOperation. I found that once I added the following line to my code that downloading CKAssets sped up by about a factor of 5-10x.
queryOperation.qualityOfService = .UserInteractive
Here is my full code:
func getReportPhotos(report: Report, completionHandler: (report: Report?, error: NSError?) -> ()) {
    let photo: Photo = report.photos![0] as! Photo
    let predicate: NSPredicate = NSPredicate(format: "recordID = %@", CKRecordID(recordName: photo.identifier!))
    let query: CKQuery = CKQuery(recordType: "Photo", predicate: predicate)
    let queryOperation: CKQueryOperation = CKQueryOperation()
    queryOperation.query = query
    queryOperation.resultsLimit = numberOfReportsPerQuery
    queryOperation.qualityOfService = .UserInteractive
    queryOperation.recordFetchedBlock = { record in
        photo.date = record.objectForKey("date") as? NSDate
        photo.fileType = record.objectForKey("fileType") as? String
        let asset: CKAsset? = record.objectForKey("image") as? CKAsset
        if asset != nil {
            let photoData: NSData? = NSData(contentsOfURL: asset!.fileURL)
            let photo: Photo = report.photos![0] as! Photo
            photo.image = UIImage(data: photoData!)
        }
    }
    queryOperation.queryCompletionBlock = { queryCursor, error in
        dispatch_async(dispatch_get_main_queue(), {
            completionHandler(report: report, error: error)
        })
    }
    publicDatabase?.addOperation(queryOperation)
}
There seems to be something slowing down your main thread, which introduces a delay in executing the capture block of your dispatch_async call. Is it possible that your code calls this record-fetching function multiple times in parallel? That would cause the NSData(contentsOfURL: asset.fileURL) processing to hog the main thread and introduce cumulative delays.
In any case, if only as good practice, loading the image with NSData should be performed in the background and not on the main thread.
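A minimal sketch of that adjustment, based on the fetch shown in the question (Swift 2 / pre-iOS 10 GCD syntax, untested): read and decode the asset's file on a background queue, and only hop back to the main queue to touch the cell.
publicDb.fetchRecordWithID(configRecordId) { record, error in
    guard let photoRecord = record else { return }
    guard let asset = photoRecord["image"] as? CKAsset else { return }
    // Do the file read and image decoding off the main thread...
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0)) {
        guard let photoData = NSData(contentsOfURL: asset.fileURL),
              image = UIImage(data: photoData) else { return }
        // ...and come back to the main queue only for the UI update.
        dispatch_async(dispatch_get_main_queue()) {
            cell.cardImageView.image = image
        }
    }
}
If the cell can be reused before the download completes, you would also want to check that it still corresponds to the same record before assigning the image.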

Get PHAsset from iOS Share Extension

I am developing a share extension for photos for my iOS app. Inside the extension, I am able to successfully retrieve the UIImage object from the NSItemProvider.
However, I would like to be able to share the image with my container app, without having to store the entire image data inside my shared user defaults. Is there a way to get the PHAsset of the image that the user has chosen in the share extension (if they have picked from their device)?
The documentation on the photos framework (https://developer.apple.com/library/ios/documentation/Photos/Reference/Photos_Framework/) has a line that says "This architecture makes it easy, safe, and efficient to work with the same assets from multiple threads or multiple apps and app extensions."
That line makes me think there is a way to share the same PHAsset between the extension and the container app, but I have yet to figure out how. Is there a way to do that?
This only works if the NSItemProvider gives you a URL with the format:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0007.PNG
which is not always true for all your assets, but if it returns a URL as:
file:///var/mobile/Media/PhotoData/OutgoingTemp/2AB79E02-C977-4B4A-AFEE-60BC1641A67F.JPG
then PHAsset will never find your asset. Furthermore, the latter is a copy of your file, so if you happen to have a very large image/video, iOS will duplicate it in that OutgoingTemp directory. Nowhere does the documentation say when it will be deleted; hopefully soon enough.
I think this is a big gap Apple has left between share extensions and the PHPhotoLibrary framework. Apple should create an API to close it, and soon.
You can get a PHAsset if the image is shared from the Photos app. The item provider will give you a URL that contains the image's filename; you use this to match the PHAsset.
/// Assets that are handled through handleImageItem:completionHandler:
private var handledAssets = [PHAsset]()

/// Key is the matched asset's original file name without suffix. E.g. IMG_193
private lazy var imageAssetDictionary: [String: PHAsset] = {
    let options = PHFetchOptions()
    options.includeHiddenAssets = true
    let fetchResult = PHAsset.fetchAssetsWithOptions(options)
    var assetDictionary = [String: PHAsset]()
    for i in 0 ..< fetchResult.count {
        let asset = fetchResult[i] as! PHAsset
        let fileName = asset.valueForKey("filename") as! String
        let fileNameWithoutSuffix = fileName.componentsSeparatedByString(".").first!
        assetDictionary[fileNameWithoutSuffix] = asset
    }
    return assetDictionary
}()
...
provider.loadItemForTypeIdentifier(imageIdentifier, options: nil) { imageItem, _ in
    if let image = imageItem as? UIImage {
        // handle UIImage
    } else if let data = imageItem as? NSData {
        // handle NSData
    } else if let url = imageItem as? NSURL {
        // Prefix check: image is shared from the Photos app
        if let imageFilePath = url.path where imageFilePath.hasPrefix("/var/mobile/Media/") {
            for component in imageFilePath.componentsSeparatedByString("/") where component.containsString("IMG_") {
                // photo: /var/mobile/Media/DCIM/101APPLE/IMG_1320.PNG
                // edited photo: /var/mobile/Media/PhotoData/Mutations/DCIM/101APPLE/IMG_1309/Adjustments/FullSizeRender.jpg
                // Cut the file's suffix if present to get a name like IMG_1309.
                let fileName = component.componentsSeparatedByString(".").first!
                if let asset = imageAssetDictionary[fileName] {
                    handledAssets.append(asset)
                    imageCreationDate = asset.creationDate
                }
                break
            }
        }
    }
}

How do I change this code to query by creation date with Swift from Parse data?

I have this code to retrieve images from Parse data.
How do I query it by creation date? I tried using findObjectsInBackgroundWithBlock but it won't work with a PFFile. What do I do?
Also, as a side question, sometimes when I place the images in a UIImageView they are upside down or sideways. Why?
if let objects = objects as? [PFObject] {
    for object in objects {
        if let userPicture = object.valueForKey("Image") as? PFFile {
            userPicture.getDataInBackgroundWithBlock({ (imageData: NSData?, error: NSError?) -> Void in
                if (error == nil) {
                    let image = UIImage(data: imageData!)
                    self.ImageArray.insert(image!, atIndex: 0)
                }
                else {
                    self.alert("Error: \(error!) \(error!.userInfo!)", Message: "Make sure you have a secure internet connection")
                }
                dispatch_async(dispatch_get_main_queue()) {
                    self.collectionView.reloadData()
                    println("Finished Pictures")
                }
            })
        }
    }
}
You have a set of objects which all reference an image file, but probably a set of other things too, and the image can likely be changed. So the object creation date isn't the same as the image creation date, and the image file doesn't know (or at least doesn't expose) its creation date. You're also currently always adding the images to the start of the array, so the position will be set by how big each image is (and therefore how long it takes to download). Also, trying to download lots of images at the same time could just mean you get lots of timeouts.
So, really, you should have a column on your object which holds the date at which the image was updated, and sort the objects by that date. As you download the images, place each one into the image array at the same index as the owning object in its array (pad the array out with NSNull so you know what's going on).
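A rough sketch of that approach (untested, Swift 2 syntax to match the question; the class name "Post", the date column "imageUpdatedAt", and an ImageArray declared as [AnyObject] so it can hold NSNull placeholders are all illustrative assumptions):
let query = PFQuery(className: "Post")            // illustrative class name
query.orderByDescending("imageUpdatedAt")         // custom date column you maintain yourself
// query.orderByDescending("createdAt") works too if the object's creation date is good enough
query.findObjectsInBackgroundWithBlock { objects, error in
    if let objects = objects as? [PFObject] where error == nil {
        // Pad the image array so each image can later be written to its owner's index.
        self.ImageArray = [AnyObject](count: objects.count, repeatedValue: NSNull())
        for (index, object) in objects.enumerate() {
            if let file = object.valueForKey("Image") as? PFFile {
                file.getDataInBackgroundWithBlock { data, error in
                    if let data = data, image = UIImage(data: data) {
                        self.ImageArray[index] = image   // keep the image at its owner's index
                        dispatch_async(dispatch_get_main_queue()) {
                            self.collectionView.reloadData()
                        }
                    }
                }
            }
        }
    }
}
In cellForItemAtIndexPath you can then check whether the element at that index is still NSNull and show a placeholder until the real image lands.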
