Memory warning on iOS 8 app - but usage is low

I hope someone can help me out; I have already searched Stack Overflow and Google but couldn't find the right solution.
I have a very simple app that takes a photo (using the standard iOS camera through UIImagePickerController) and then saves it to the file system at a very low quality - let thumbNailData = UIImageJPEGRepresentation(image, 0.02). After that I display the images in a collection view backed by Core Data - I only store the filename in Core Data, not the image; the image lives only in the file system.
When I run the app it shows a memory usage of no more than 15 MB and CPU usage around 1-2%. Everything runs fine, but after adding 6-7 photos I get strange errors like a memory warning, a lost connection to my iPhone, and this one:
Communications error: <OS_xpc_error: <error: 0x198adfa80> { count = 1, contents =
"XPCErrorDescription" => <string: 0x198adfe78> { length = 22, contents = "Connection interrupted"
So I am really stuck: I thought I had made this very lightweight, and then I get these errors...
I already submitted a note-taking app to the App Store that has far more functionality than this one, and that one runs very stably...
Any ideas?
Here is some of my code:
// Here I load the picture names into an array to display in the collection view
func loadColl() {
    let appDelegate = UIApplication.sharedApplication().delegate as AppDelegate
    let context: NSManagedObjectContext = appDelegate.managedObjectContext!
    let fetchRequest: NSFetchRequest = NSFetchRequest(entityName: "PictureNote")
    var error: NSError?
    let result = context.executeFetchRequest(fetchRequest, error: &error) as [PictureNote]
    for res in result {
        println(res.latitude)
        println(res.longitude)
        println(res.longDate)
        println(res.month)
        println(res.year)
        println(res.text)
        // Only the thumbnail filename is stored, not the image itself
        pictures.append(res.thumbnail)
    }
}
// Here is the code to display in the collection view
func collectionView(collectionView: UICollectionView, cellForItemAtIndexPath indexPath: NSIndexPath) -> UICollectionViewCell {
    let myImageCell: ImageCell = myCollectionView.dequeueReusableCellWithReuseIdentifier("imageCell", forIndexPath: indexPath) as ImageCell
    myImageCell.imageCell.image = self.loadImageFromPath(fileInDocumentsDirectory(self.pictures[indexPath.row]))
    return myImageCell
}
// Here is the code to load the pictures from disk
func loadImageFromPath(path: String) -> UIImage? {
    let image = UIImage(contentsOfFile: path)
    if image == nil {
        self.newAlert("Error loading your image, try again", title: "Notice")
    }
    // Return the optional rather than force-unwrapping, which would
    // crash whenever the file is missing
    return image
}
// Here is my saving code
func saveNewEntry() {
    let unique = NSUUID().UUIDString
    let imageTitle = "Picture\(unique).jpg"
    let image = self.pickedImageView.image
    let path = fileInDocumentsDirectory(imageTitle)
    let thumbImageTitle = "ThumbPicture\(unique).jpg"
    let thumbPath = fileInDocumentsDirectory(thumbImageTitle)
    if self.saveImage(image!, path: path) == true && self.saveThumbImage(image!, thumbPath: thumbPath) == true {
        // Create the saving context
        let context = (UIApplication.sharedApplication().delegate as AppDelegate).managedObjectContext!
        let entityOne = NSEntityDescription.entityForName("PictureNote", inManagedObjectContext: context)
        let thisTask = PictureNote(entity: entityOne!, insertIntoManagedObjectContext: context)
        // Get all the values here (fetch the location once and reuse it)
        let location = self.getMyLocation()
        let theDate = NSDate()
        thisTask.month = Date.toString(date: theDate, format: "MM")
        thisTask.year = Date.toString(date: theDate, format: "yyyy")
        thisTask.longDate = theDate
        thisTask.longitude = location[1]
        thisTask.latitude = location[0]
        thisTask.text = self.noteTextView.text
        thisTask.imageURL = imageTitle
        thisTask.thumbnail = thumbImageTitle
        thisTask.id = unique
        // Saving to Core Data
        if context.save(nil) {
            self.newAlert("Saving your note was successful!", title: "Notice")
            self.noteTextView.text = ""
        } else {
            self.newAlert("Error saving your note, try again", title: "Notice")
        }
    } else {
        self.newAlert("Error saving your image, try again", title: "Notice")
    }
    self.pickedImageView.image = UIImage(named: "P1000342.jpg")
}
I am really thankful for every suggestion... if you need more code, just let me know.

I notice that you are using a dramatically reduced quality factor with UIImageJPEGRepresentation. If this is an attempt to reduce the memory involved, all it does is reduce the size of the resulting NSData you write to persistent storage; loading that image into an image view will still require something on the order of 4 × width × height bytes (note: that's the dimensions of the image, not the image view). Thus the 3264 × 2448 image from an iPhone takes up roughly 30 MB (3264 × 2448 × 4 bytes) per image, regardless of the quality factor employed by UIImageJPEGRepresentation.
Usually I make sure my collection/table view uses a thumbnail representation (either the thumbnail property of the ALAsset, or I resize the default representation myself). If I'm caching this thumbnail anywhere (such as the persistent storage your question suggests), I can then use a high-quality representation of the thumbnail (I use PNG because it's lossless with compression, but JPEG with a 0.7-0.9 quality factor is a reasonably faithful version, too).
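For example, here is a minimal sketch of building a true thumbnail before caching it; the 200 × 200 target size and the makeThumbnail name are illustrative assumptions, not code from the question:
func makeThumbnail(image: UIImage, size: CGSize) -> UIImage {
    // Draw the full-size image into a small bitmap context;
    // scale 0.0 means "use the device's screen scale"
    UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
    image.drawInRect(CGRect(origin: CGPointZero, size: size))
    let thumb = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return thumb
}
// Cache this small bitmap (e.g. as PNG); decoding it later costs
// 4 x 200 x 200 bytes instead of 4 x 3264 x 2448
let thumb = self.makeThumbnail(image!, size: CGSize(width: 200, height: 200))
let thumbNailData = UIImagePNGRepresentation(thumb)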

Related

how to remove thumbnailData in CSSearchableItemAttributeSet

My indexed data doesn't have a photo, so I don't want to show one; I only want the title and description.
I have tried to set:
attributes.thumbnailData = nil
But it still shows a blank image.
static func setupSearchableContentForSpotlight() {
    let realm = try! Realm(configuration: Constants.realmConfigration.wordsConfigration)
    let words = realm.objects(Word.self)
    var searchableItems = [CSSearchableItem]()
    words.forEach { word in
        let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeData as String)
        attributes.title = word.defination
        attributes.contentDescription = word.meaning
        attributes.thumbnailData = nil
        let searchableVegetable = CSSearchableItem(uniqueIdentifier: nil, domainIdentifier: nil, attributeSet: attributes)
        // Note: the original code never appended the item,
        // so an empty array was being indexed
        searchableItems.append(searchableVegetable)
    }
    CSSearchableIndex.default().indexSearchableItems(searchableItems) { error in
        print("indexing completed")
        UserDefaults.standard.set(true, forKey: "spotLightIndexed")
        if let error = error {
            print(error.localizedDescription)
        }
    }
}
I expect the result to show only, for example:
exams
الماس
without this blank image.
Note: the app already has its display icons.
An image is always shown next to a Core Spotlight result. If you don't supply a thumbnail image, then iOS will show your app icon.
Ensure that you have the right size icons in your icon set. You need a 40x40 icon at standard, 2x and 3x resolutions.
The problem was solved by restarting my iPhone.
Extra tip: if you are dealing with Core Spotlight, restart your iPhone after significant changes.
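Beyond restarting, it can also help to wipe the index and rebuild it so stale thumbnails are dropped; a minimal sketch, assuming rebuilding the whole index is acceptable:
CSSearchableIndex.default().deleteAllSearchableItems { error in
    if error == nil {
        // Rebuild from scratch once the old entries are gone
        setupSearchableContentForSpotlight()
    }
}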

Parse array images saving and fetching

I have a mosaic app that takes photos of various sizes and breaks them into smaller photos. Depending on the size of the photo, the number of smaller photos can vary. Now I have an NSMutableArray named imageNameList2 that holds all of the smaller images taken from the larger image. For this example I load the images from the asset catalog to make the question easier to answer.
Here is the imageNameList (NSMutableArray that holds all the smaller images)
var imageNameList: [String] {
    var imageNameList2: [String] = []
    for i in 0...149 {
        let imageName = String(format: "pic_%03d", Int(i))
        imageNameList2.append(imageName)
    }
    return imageNameList2
}
What I'd like to do is have a continue button that saves all these images, in order, as PFFiles (or whatever format works best with Parse), and another button called retrieve that fetches all these photos from Parse. I basically have a Parse Server backend that uses the Parse frameworks to speed up the backend work. Can you please show me how I would save and retrieve this NSMutableArray when there is a different number of stored images each time?
I think you're trying to do something like this. This is just an example. There's a lot of work to be done but hopefully this will get you started. I did not run or test this code.
The idea is to save your images as PFFiles, and create a 'tile' PFObject for each file. Then save all the 'tile' PFObjects to a 'tiles' key of the image PFObject. Then recall the image when you need it by objectId.
Good luck.
let appleTiles = ["apple1", "apple2", "apple3"]
let orangeTiles = ["orange1", "orange2", "orange3", "orange4", "orange5"]

func usage() {
    // don't literally run these synchronously like this
    post(appleTiles)
    post(orangeTiles)
    download()
}

func post(_ tileNames: [String]) {
    let image = PFObject(className: "Image")
    // Wrap each tile image in a PFFile, and each PFFile in a "Tile" PFObject
    let tilesPF = tileNames.map { name -> PFObject in
        let data = UIImagePNGRepresentation(UIImage(named: name)!)!
        let file = PFFile(data: data)!
        let tile = PFObject(className: "Tile")
        tile["tile"] = file
        return tile
    }
    image["tiles"] = tilesPF
    image.saveInBackground { success, error in
        // you'll want to save the objectId of the PFObject if you want to
        // retrieve a specific image later
    }
}

func download() {
    let query = PFQuery(className: "Image")
    // add this if you have a specific image you want to get
    query.whereKey("objectId", equalTo: "someObjectId")
    query.findObjectsInBackground { objects, error in
        // this is probably close to how you'd unwrap everything, but again,
        // I didn't test it so...
        if let image = objects?.first, let tiles = image["tiles"] as? [PFObject] {
            tiles.forEach { tile in
                let file = tile["tile"]
                // now you have an individual PFFile for a tile, do something with it
            }
        }
    }
}

Swift - How to retrieve multiple images at once (GCD)?

Let me give you some insight into my application itself.
In short, I am creating a social-networking app. Each post consists of an image, a profile picture, and a caption, and each post lives in my MySQL database. I am using my own framework to retrieve each post. However, once I retrieve a post I still have to retrieve the profile picture and the image using the URLs I got from the database. I would like to retrieve all the images at once rather than one after another.
As of now, there are about 5 posts in the database. Loading the necessary images for one post takes about 4 seconds, and right now I load the images for one post and then retrieve the next, in sequential order, so the whole process takes around 20 seconds. Say I had 50 posts: it would take an extremely long time to load them all. I have some knowledge of GCD (Grand Central Dispatch), but I don't know how to implement it in my app.
Here is my code for retrieving my posts and images:
ConnectionManager.sharedInstance.retrievePosts(UserInformationInstance.SCHOOL) {
    (result: AnyObject) in
    if let posts = result as? [[String: AnyObject]] {
        print("Retrieved \(posts.count) posts.")
        for post in posts {
            let postIDCurrent = post["id"] as? Int
            var UPVOTES = 0;
            var UPVOTED: Bool!
            var query = ""
            if let profilePictureCurrent = post["profile_picture"] {
                // Loading profile picture image (synchronous, blocking)
                let url = NSURL(string: profilePictureCurrent as! String)
                let data = NSData(contentsOfURL: url!)
                let image = UIImage(data: data!)
                UserInformationInstance.postsProfilePictures.append(image!)
                print("added profile pic")
            } else {
                print("error")
            }
            if let postPictureCurrent = post["image"] {
                if (postPictureCurrent as! String != "") {
                    // Loading image associated with post (also synchronous)
                    let url = NSURL(string: postPictureCurrent as! String)
                    let data = NSData(contentsOfURL: url!)
                    let image = UIImage(data: data!)
                    let imageArray: [AnyObject] = [postIDCurrent!, image!]
                    UserInformationInstance.postsImages.append(imageArray)
                    print("added image pic")
                }
            } else {
                print("error")
            }
            UserInformationInstance.POSTS.append(post)
        }
    } else {
        self.loadSearchUsers()
    }
}
So my question is, how can I retrieve all the images at the same time instead of retrieving one after the other?
It would be great if someone could give an explanation as well as some code :)
I would recommend revising your approach. If your server is fine (not busy and well reachable), so that downloading is limited by the device's network adapter bandwidth (X Mbps), then it does not matter whether you download the images concurrently or sequentially.
Let me show this. Downloading 10 files of size Y MB simultaneously takes as long as downloading one file, because the download speed per file becomes 10 times slower:
X/10 - download speed per file
Time = Amount / Speed
T = Y / (X/10) = 10 * Y / X
Now if you download sequentially:
T = 10 * (Y / X) = 10 * Y / X
I would recommend showing the posts immediately once you have retrieved them from storage, then kicking off the image downloads asynchronously and setting each image as it arrives. That's the best practice in the industry; consider the Facebook, Twitter, and Instagram apps.
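A minimal sketch of that pattern, using the same pre-Swift 3 GCD APIs as the question; loadImageAsync, profilePictureURLString, and cell.profileImageView are illustrative assumptions, not names from the original code:
func loadImageAsync(urlString: String, completion: (UIImage?) -> Void) {
    // Do the blocking network read off the main thread
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
        var image: UIImage?
        if let url = NSURL(string: urlString) {
            if let data = NSData(contentsOfURL: url) {
                image = UIImage(data: data)
            }
        }
        // Hand the result back on the main thread for UI work
        dispatch_async(dispatch_get_main_queue()) {
            completion(image)
        }
    }
}
// Usage: show the post right away, fill the image in when it arrives
loadImageAsync(profilePictureURLString) { image in
    cell.profileImageView.image = image
}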

Fetching CKAsset Image From CloudKit is Very Slow

I am using CloudKit as a server backend for my iOS application. I'm using it to house some relatively static data along with a handful of images (CKAsset). I ran into a problem when the time came to actually fetch those assets from the public database: they load at an excruciatingly slow speed.
My use case is to load an image into every cell inside of a collection view. The images are only 200kb in size, but the fetch process took an average of 2.2 seconds for the download to complete and set the image in a cell. For comparison, I took URLs of similar sized stock images and loaded them in using NSURLSession. It took a mere 0.18 - 0.25 seconds for each image to load.
I have tried multiple different ways of downloading the images from CK: direct fetch of the record, query, and operation query. All of them have similar results. I am also dispatching back to the main queue within the completion block prior to setting the image for the cell.
My database is set up to have a primary object with several fields of data. I then set up a backwards-reference style system for the photos, where each photo just has a reference to a primary object. That way I can load the photos on demand without bogging down the main data.
It looks something like this:
Primary Object:
title: String, startDate: Date
Photo Object:
owner: String (reference to primary object), image: Asset
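For illustration only, a hedged sketch of creating records with that shape; the record type names and imageFileURL are assumptions based on the description, not code from the question:
let primary = CKRecord(recordType: "Primary")
primary["title"] = "Some title"
primary["startDate"] = NSDate()

let photo = CKRecord(recordType: "Photo")
// Backwards reference: the photo points at its owning primary object
// (a CKReference would also work in place of the plain string)
photo["owner"] = primary.recordID.recordName
photo["image"] = CKAsset(fileURL: imageFileURL)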
Here is an example request where I tried to directly fetch one of the photos:
let publicDb = CKContainer.defaultContainer().publicCloudDatabase
let configRecordId = CKRecordID(recordName: "e783f542-ec0f-46j4-9e99-b3e3ez505adf")
publicDb.fetchRecordWithID(configRecordId) { (record, error) -> Void in
    dispatch_async(dispatch_get_main_queue()) {
        guard let photoRecord = record else { return }
        guard let asset = photoRecord["image"] as? CKAsset else { return }
        guard let photo = NSData(contentsOfURL: asset.fileURL) else { return }
        let image = UIImage(data: photo)!
        cell.cardImageView.image = image
    }
}
I can't seem to figure out why these image downloads are taking so long, but it's really quite the showstopper if I can't get them to load in a reasonable amount of time.
Update: I tried the fetch operation with a smaller image, 23 KB. The fetch was faster, anywhere from 0.3 - 1.1 seconds. That's better, but it still doesn't meet the expectation I had for what CloudKit should be able to provide.
I am using CKQueryOperation. I found that once I added the following line to my code, downloading CKAssets sped up by a factor of about 5-10x.
queryOperation.qualityOfService = .UserInteractive
Here is my full code:
func getReportPhotos(report: Report, completionHandler: (report: Report?, error: NSError?) -> ()) {
    let photo: Photo = report.photos![0] as! Photo
    let predicate: NSPredicate = NSPredicate(format: "recordID = %@", CKRecordID(recordName: photo.identifier!))
    let query: CKQuery = CKQuery(recordType: "Photo", predicate: predicate)
    let queryOperation: CKQueryOperation = CKQueryOperation()
    queryOperation.query = query
    queryOperation.resultsLimit = numberOfReportsPerQuery
    queryOperation.qualityOfService = .UserInteractive
    queryOperation.recordFetchedBlock = { record in
        photo.date = record.objectForKey("date") as? NSDate
        photo.fileType = record.objectForKey("fileType") as? String
        let asset: CKAsset? = record.objectForKey("image") as? CKAsset
        if asset != nil {
            let photoData: NSData? = NSData(contentsOfURL: asset!.fileURL)
            let photo: Photo = report.photos![0] as! Photo
            photo.image = UIImage(data: photoData!)
        }
    }
    queryOperation.queryCompletionBlock = { queryCursor, error in
        dispatch_async(dispatch_get_main_queue(), {
            completionHandler(report: report, error: error)
        })
    }
    publicDatabase?.addOperation(queryOperation)
}
There seems to be something slowing down your main thread, which introduces a delay in executing the capture block of your dispatch_async call. Is it possible that your code calls this record-fetching function multiple times in parallel? That would cause the NSData(contentsOfURL: asset.fileURL) processing to hog the main thread and introduce cumulative delays.
In any case, if only as good practice, loading the image with NSData should be performed in the background and not on the main thread.
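Applied to the fetch from the question, a minimal sketch of that practice (same publicDb, configRecordId, and cell as above; fetchRecordWithID already calls its completion off the main queue, so the file read can stay there):
publicDb.fetchRecordWithID(configRecordId) { (record, error) -> Void in
    // Still on CloudKit's background queue: do the disk read here
    guard let photoRecord = record else { return }
    guard let asset = photoRecord["image"] as? CKAsset else { return }
    guard let photoData = NSData(contentsOfURL: asset.fileURL) else { return }
    guard let image = UIImage(data: photoData) else { return }
    dispatch_async(dispatch_get_main_queue()) {
        // Only the UIKit update touches the main thread
        cell.cardImageView.image = image
    }
}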

Unable to edit screenshots, performChanges block fails

I'm developing an app that allows users to edit photos using PhotoKit. I was previously saving the edited photo to disk as a JPEG. I would like to avoid converting to JPEG and have implemented the changes needed to do that. It works great for photos taken with the camera, but if you try to edit a screenshot, the PHPhotoLibrary.sharedPhotoLibrary().performChanges block fails and logs The operation couldn’t be completed. (Cocoa error -1.). I am not sure why this causes the performChanges block to fail; what have I done wrong here?
I've created a sample app available to download that demonstrates the problem, and I've included the relevant code below. The app attempts to edit the newest photo in your photo library. If it succeeds it will prompt for access to edit the photo, otherwise nothing will happen and you'll see the console log. To reproduce the issue, take a screenshot then run the app.
Current code that works with screenshots:
let jpegData: NSData = outputPhoto.jpegRepresentationWithCompressionQuality(0.9)
let contentEditingOutput = PHContentEditingOutput(contentEditingInput: self.input)
var error: NSError?
let success = jpegData.writeToURL(contentEditingOutput.renderedContentURL, options: NSDataWritingOptions.AtomicWrite, error: &error)
if success {
    return contentEditingOutput
} else {
    return nil
}
Replacement code that causes screenshots to fail:
let url = self.input.fullSizeImageURL
let orientation = self.input.fullSizeImageOrientation
var inputImage = CIImage(contentsOfURL: url)
inputImage = inputImage.imageByApplyingOrientation(orientation)
let outputPhoto = createOutputImageFromInputImage(inputImage)!

let originalImageData = NSData(contentsOfURL: self.input.fullSizeImageURL)!
let imageSource = CGImageSourceCreateWithData(originalImageData, nil)
let dataRef = CFDataCreateMutable(nil, 0)
// getType automatically selects JPG, PNG, etc. based on the original format
let destination = CGImageDestinationCreateWithData(dataRef, CGImageSourceGetType(imageSource), 1, nil)

struct ContextStruct {
    static var ciContext: CIContext? = nil
}
if ContextStruct.ciContext == nil {
    let eaglContext = EAGLContext(API: .OpenGLES2)
    ContextStruct.ciContext = CIContext(EAGLContext: eaglContext)
}

let cgImage = ContextStruct.ciContext!.createCGImage(outputPhoto, fromRect: outputPhoto.extent())
CGImageDestinationAddImage(destination, cgImage, nil)

if CGImageDestinationFinalize(destination) {
    let contentEditingOutput = PHContentEditingOutput(contentEditingInput: self.input)
    var error: NSError?
    let imageData: NSData = dataRef
    let success = imageData.writeToURL(contentEditingOutput.renderedContentURL, options: .AtomicWrite, error: &error)
    if success {
        // it does succeed
        return contentEditingOutput
    } else {
        return nil
    }
}
The problem happens because adjusted photos are always saved as JPEG files, while screenshots are in fact PNG files.
It occurred to me while I was debugging your sample project: in the PhotoEditor, contentEditingOutput.renderedContentURL is a URL to a JPG, while if you examine the result of CGImageSourceGetType(imageSource) it is clearly a PNG (it returns the PNG UTI: public.png).
So I went and read the documentation for renderedContentURL, which states that if you are editing a photo asset, the altered image is written in JPEG format - which clearly won't work if your image is a PNG. This leads me to think that Apple doesn't support editing PNG files, or doesn't want you to. Go figure.
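Based on that reading, a hedged workaround sketch: always finalize the destination as JPEG so the output matches what renderedContentURL expects (kUTTypeJPEG comes from MobileCoreServices; cgImage and dataRef are from the question's code):
import MobileCoreServices

let dataRef = CFDataCreateMutable(nil, 0)
// Force JPEG output instead of mirroring CGImageSourceGetType(imageSource),
// which returns public.png for a screenshot
let destination = CGImageDestinationCreateWithData(dataRef, kUTTypeJPEG, 1, nil)
CGImageDestinationAddImage(destination, cgImage, nil)
if CGImageDestinationFinalize(destination) {
    // write dataRef to contentEditingOutput.renderedContentURL as before
}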
