Issue converting NSData to UIImage via Parse.com download - ios

I'm having a strange issue. My app works fine in the iPhone 6 simulator, but not in any other simulator or on my device. Here's the main code:
query.findObjectsInBackgroundWithBlock {
    (objects, error) -> Void in
    if (error == nil) {
        let imageObjects = objects as! [PFObject]
        if let myObjects = objects {
            for object in myObjects {
                let myTitle = object["imageName"] as! NSString
                println(myTitle)
                let thumbNail = object["imageFile"] as! PFFile
                // println(thumbNail)
                thumbNail.getDataInBackgroundWithBlock({
                    (imageData: NSData?, error: NSError?) -> Void in
                    if (error == nil) {
                        self.imageArray.append(imageData!)
                        let image = UIImage(data: imageData!)
                        self.imageView.contentMode = .ScaleAspectFit
                        self.imageView.image = image
                    }
                }) // getDataInBackgroundWithBlock - end
            } // for - end
        }
    } // end of if
    else {
        println("Error in retrieving \(error)")
    }
} // findObjectsInBackgroundWithBlock - end
I think this line is the culprit:
let image = UIImage(data: imageData!)
But I'm not sure why. image comes back nil on every device and simulator except the iPhone 6 simulator. Any suggestions would be awesome.
Thanks
NOTE: the image displays just fine in the iPhone 6 simulator. I'm using a storyboard and have unchecked Auto Layout.

There shouldn't really be a situation where the app only works on one simulator setting. Resetting your simulator should solve the problem.

One possibility is that you previously downloaded this data successfully and Parse cached the response on your device, but you've since changed the PFObject remotely and removed the image data, or changed the code. Because Parse.com caches images, that could explain why it only works in this one simulator.
Use the debugger, or failing that NSLog, to determine what is nil; first find out whether imageData is coming back nil. Also, is there a reason you're assuming imageData is non-nil? I'm referring to the imageData! force unwraps -- that seems dangerous. Just because there's no error doesn't mean imageData is non-nil.
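For example, something along these lines (an untested sketch of the same block) avoids the force unwraps and makes the failure point visible:
thumbNail.getDataInBackgroundWithBlock({ (imageData: NSData?, error: NSError?) -> Void in
    // Unwrap the data and the decoded image instead of forcing with !
    if let data = imageData, image = UIImage(data: data) {
        self.imageArray.append(data)
        self.imageView.contentMode = .ScaleAspectFit
        self.imageView.image = image
    } else {
        println("imageData was nil or not a decodable image: \(error)")
    }
})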

Related

how to remove thumbnailData in CSSearchableItemAttributeSet

My indexed data doesn't have a photo, so I don't want to show one; I only want the title and description.
I have tried setting:
attributes.thumbnailData = nil
But it still shows a blank image.
static func setupSearchableContentForSpotlight() {
    let realm = try! Realm(configuration: Constants.realmConfigration.wordsConfigration)
    var words: Results<Word>!
    words = realm.objects(Word.self)
    var searchableItems = [CSSearchableItem]()
    words?.forEach { word in
        let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeData as String)
        attributes.title = word.defination
        attributes.contentDescription = word.meaning
        attributes.thumbnailData = nil
        let searchableVegetable = CSSearchableItem(uniqueIdentifier: nil, domainIdentifier: nil, attributeSet: attributes)
        searchableItems.append(searchableVegetable)
    }
    CSSearchableIndex.default().indexSearchableItems(searchableItems) { (error) -> Void in
        print("indexing completed")
        UserDefaults.standard.set(true, forKey: "spotLightIndexed")
        if let error = error {
            print(error.localizedDescription)
        }
    }
}
I expect the result to show only, for example:
exams
الماس
without this blank image.
Note: the app already has display icons.
An image is always shown next to a Core Spotlight result. If you don't supply a thumbnail image, then iOS will show your app icon.
Ensure that you have the right size icons in your icon set. You need a 40x40 icon at standard, 2x and 3x resolutions.
The problem was solved by restarting my iPhone.
Extra tip: if you are dealing with Core Spotlight, restart your iPhone after significant changes.

Retrieving an image to my UIImageView

I upload the image to my Parse database from my Android application, and I want to show it on my iPhone.
When I run it, Xcode tells me: 'Could not cast value of type 'NSConcreteData' to 'PFFile''.
Is there any way to get this image and show it on my iPhone (in a UIImageView)?
Any idea to help me?
I'll be very grateful!!
The problem is, as the Compiler tells you, that you're trying to cast NSData to PFFile.
Just use:
if let userImageProfilePicture = event["profile_picture"] as? PFFile {
    userImageProfilePicture.getDataInBackgroundWithBlock({ (data, error) -> Void in
        // handle here with your userAuth == true block
        if let data = data where error == nil {
            let image = UIImage(data: data)
            // assign `image` to your UIImageView here
        }
    })
}

Fetching CKAsset Image From CloudKit is Very Slow

I am using CloudKit as a server backend for my iOS application. I'm using it to house some relatively static data along with a handful of images(CKAsset). I ran into a problem when the time came for me to actually fetch those assets from the public database. They load at an excruciatingly slow speed.
My use case is to load an image into every cell inside of a collection view. The images are only 200kb in size, but the fetch process took an average of 2.2 seconds for the download to complete and set the image in a cell. For comparison, I took URLs of similar sized stock images and loaded them in using NSURLSession. It took a mere 0.18 - 0.25 seconds for each image to load.
I have tried multiple different ways of downloading the images from CK: direct fetch of the record, query, and operation query. All of them have similar results. I am also dispatching back to the main queue within the completion block prior to setting the image for the cell.
My database is setup to have a primary object with several fields of data. I then setup a backwards reference style system for the photos, where each photo just has a reference to a primary object. That way I can load the photos on demand without bogging down the main data.
It looks something like this:
Primary Object:
title: String, startDate: Date
Photo Object:
owner: String(reference to primary object), image: Asset
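For context, a photo record gets created roughly like this (a simplified sketch; the local file URL is just a placeholder):
let localImageURL = NSURL(fileURLWithPath: "/path/to/photo.jpg") // placeholder path to the image file
let photoRecord = CKRecord(recordType: "Photo")
photoRecord["owner"] = "primary-record-name" as NSString // back-reference to the primary object, stored as a String
photoRecord["image"] = CKAsset(fileURL: localImageURL)
CKContainer.defaultContainer().publicCloudDatabase.saveRecord(photoRecord) { record, error in
    // handle success / error
}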
Here is an example of a request I tried in order to directly fetch one of the photos:
let publicDb = CKContainer.defaultContainer().publicCloudDatabase
let configRecordId = CKRecordID(recordName: "e783f542-ec0f-46j4-9e99-b3e3ez505adf")
publicDb.fetchRecordWithID(configRecordId) { (record, error) -> Void in
    dispatch_async(dispatch_get_main_queue()) {
        guard let photoRecord = record else { return }
        guard let asset = photoRecord["image"] as? CKAsset else { return }
        guard let photo = NSData(contentsOfURL: asset.fileURL) else { return }
        let image = UIImage(data: photo)!
        cell.cardImageView.image = image
    }
}
I can't seem to figure out why these image downloads are taking so long, but it's really quite the showstopper if I can't get them to load in a reasonable amount of time.
Update: I tried the fetch operation with a smaller image, 23kb. The fetch was faster, anywhere from 0.3 - 1.1 seconds. That's better, but still doesn't meet the expectation that I had for what CloudKit should be able to provide.
I am using CKQueryOperation. I found that once I added the following line to my code, downloading CKAssets sped up by roughly a factor of 5-10x.
queryOperation.qualityOfService = .UserInteractive
Here is my full code:
func getReportPhotos(report: Report, completionHandler: (report: Report?, error: NSError?) -> ()) {
    let photo: Photo = report.photos![0] as! Photo
    let predicate: NSPredicate = NSPredicate(format: "recordID = %@", CKRecordID(recordName: photo.identifier!))
    let query: CKQuery = CKQuery(recordType: "Photo", predicate: predicate)
    let queryOperation: CKQueryOperation = CKQueryOperation()
    queryOperation.query = query
    queryOperation.resultsLimit = numberOfReportsPerQuery
    queryOperation.qualityOfService = .UserInteractive
    queryOperation.recordFetchedBlock = { record in
        photo.date = record.objectForKey("date") as? NSDate
        photo.fileType = record.objectForKey("fileType") as? String
        let asset: CKAsset? = record.objectForKey("image") as? CKAsset
        if asset != nil {
            let photoData: NSData? = NSData(contentsOfURL: asset!.fileURL)
            let photo: Photo = report.photos![0] as! Photo
            photo.image = UIImage(data: photoData!)
        }
    }
    queryOperation.queryCompletionBlock = { queryCursor, error in
        dispatch_async(dispatch_get_main_queue(), {
            completionHandler(report: report, error: error)
        })
    }
    publicDatabase?.addOperation(queryOperation)
}
There seems to be something slowing down your main thread, which introduces a delay in executing the block you dispatch with dispatch_async. Is it possible that your code calls this record-fetching function multiple times in parallel? That would cause the NSData(contentsOfURL: asset.fileURL) processing to hog the main thread and introduce cumulative delays.
In any case, if only as a good practice, loading the image with NSData should be performed in the background and not on the main thread.
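A rough sketch of that pattern, adapted to the fetch example from the question (untested):
publicDb.fetchRecordWithID(configRecordId) { (record, error) -> Void in
    guard let photoRecord = record, asset = photoRecord["image"] as? CKAsset else { return }
    // Read and decode the asset file off the main thread...
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        guard let data = NSData(contentsOfURL: asset.fileURL), image = UIImage(data: data) else { return }
        // ...and only touch UIKit back on the main queue.
        dispatch_async(dispatch_get_main_queue()) {
            cell.cardImageView.image = image
        }
    }
}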

Parse:Swift: getDataInBackground, but data doesn't exist

I have a problem with Parse. I was saving profile pics for my users, which worked perfectly. Then I wanted to show the profile pic inside an image view when the user looks at his own profile. The problem is that when a user hasn't set a profile pic before, my app crashes, because there is no data to be found when getDataInBackgroundWithBlock runs. How can I solve this? Here is the code that should display the corresponding profile pic inside profilePicImageView. I tried it without the else statement before; same problem :/
let profileImage = currentUser?.objectForKey("profilePic") as? PFFile
profileImage!.getDataInBackgroundWithBlock({ (imageData: NSData?, error: NSError?) -> Void in
    if (error == nil) {
        let image: UIImage = UIImage(data: imageData!)!
        self.profilePicImageView.image = image
    } else {
        self.profilePicImageView.image = nil
    }
})
ParseUI has a class named PFImageView, which is a subclass of UIImageView. Instead of using UIImageView, you can do something like:
1. Set the class to PFImageView in Interface Builder
2. Connect an outlet from that PFImageView named profilePicImageView
3. Change your code to:
self.profilePicImageView.image = <Any placeholder>
if let profileImage = currentUser?.objectForKey("profilePic") as? PFFile {
    self.profilePicImageView.file = profileImage
    self.profilePicImageView.loadInBackground()
}
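If you'd rather keep a plain UIImageView, a defensive version of your original code could look like this (just a sketch):
if let profileImage = currentUser?.objectForKey("profilePic") as? PFFile {
    profileImage.getDataInBackgroundWithBlock({ (imageData: NSData?, error: NSError?) -> Void in
        if let data = imageData where error == nil {
            self.profilePicImageView.image = UIImage(data: data)
        } else {
            self.profilePicImageView.image = nil // or a placeholder image
        }
    })
} else {
    // the user has no profile pic saved yet
    self.profilePicImageView.image = nil
}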

Unable to edit screenshots, performChanges block fails

I'm developing an app that allows users to edit photos using PhotoKit. I was previously saving the edited photo to disk as a JPEG. I would like to avoid converting to JPEG and have implemented the modifications to do that. It works great for photos taken with the camera, but if you try to edit a screenshot, the PHPhotoLibrary.sharedPhotoLibrary().performChanges block fails and logs "The operation couldn't be completed. (Cocoa error -1.)". I am not sure why this causes the performChanges block to fail. What have I done wrong here?
I've created a sample app available to download that demonstrates the problem, and I've included the relevant code below. The app attempts to edit the newest photo in your photo library. If it succeeds it will prompt for access to edit the photo, otherwise nothing will happen and you'll see the console log. To reproduce the issue, take a screenshot then run the app.
Current code that works with screenshots:
let jpegData: NSData = outputPhoto.jpegRepresentationWithCompressionQuality(0.9)
let contentEditingOutput = PHContentEditingOutput(contentEditingInput: self.input)
var error: NSError?
let success = jpegData.writeToURL(contentEditingOutput.renderedContentURL, options: NSDataWritingOptions.AtomicWrite, error: &error)
if success {
    return contentEditingOutput
} else {
    return nil
}
Replacement code that causes screenshots to fail:
let url = self.input.fullSizeImageURL
let orientation = self.input.fullSizeImageOrientation
var inputImage = CIImage(contentsOfURL: url)
inputImage = inputImage.imageByApplyingOrientation(orientation)
let outputPhoto = createOutputImageFromInputImage(inputImage)!

let originalImageData = NSData(contentsOfURL: self.input.fullSizeImageURL)!
let imageSource = CGImageSourceCreateWithData(originalImageData, nil)
let dataRef = CFDataCreateMutable(nil, 0)
let destination = CGImageDestinationCreateWithData(dataRef, CGImageSourceGetType(imageSource), 1, nil) // getType automatically selects JPG, PNG, etc. based on original format

struct ContextStruct {
    static var ciContext: CIContext? = nil
}

if ContextStruct.ciContext == nil {
    let eaglContext = EAGLContext(API: .OpenGLES2)
    ContextStruct.ciContext = CIContext(EAGLContext: eaglContext)
}

let cgImage = ContextStruct.ciContext!.createCGImage(outputPhoto, fromRect: outputPhoto.extent())
CGImageDestinationAddImage(destination, cgImage, nil)

if CGImageDestinationFinalize(destination) {
    let contentEditingOutput = PHContentEditingOutput(contentEditingInput: self.input)
    var error: NSError?
    let imageData: NSData = dataRef
    let success = imageData.writeToURL(contentEditingOutput.renderedContentURL, options: .AtomicWrite, error: &error)
    if success {
        // it does succeed
        return contentEditingOutput
    } else {
        return nil
    }
}
The problem happens because adjusted photos are always saved as JPG files, and screenshots are in fact PNG files.
It occurred to me while I was debugging your sample project: in the PhotoEditor, contentEditingOutput.renderedContentURL is a URL to a JPG, while if you examine the result of CGImageSourceGetType(imageSource) it is clear that it's a PNG (it returns the PNG UTI: public.png).
So I went and read the documentation for renderedContentURL, which states that if editing a photo asset, the altered image is written in JPEG format - which clearly won't work if your image is a PNG. This leads me to think that Apple doesn't support editing PNG files, or doesn't want you to. Go figure.
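If that's the case, one possible workaround (a sketch based on the code above, not an officially documented fix) is to force the destination type to JPEG instead of reusing the source's type, so PNG screenshots get transcoded before being written to renderedContentURL:
// Instead of CGImageDestinationCreateWithData(dataRef, CGImageSourceGetType(imageSource), 1, nil),
// always write JPEG, since renderedContentURL is documented to expect JPEG output.
// kUTTypeJPEG comes from MobileCoreServices.
let destination = CGImageDestinationCreateWithData(dataRef, kUTTypeJPEG, 1, nil)
CGImageDestinationAddImage(destination, cgImage, nil)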
