Downloading images using AlamofireImage - iOS

I am trying to download images into the table view on the home page of a project. I have retrieved the URLs from the API and they are valid (printed to the console).
I am then dumping these URLs into an array: let imageURL = [String]()
I looped through the URLs and requested the images through AlamofireImage, but they are not getting displayed:
for url in self.imagesURL {
    ItemImages.removeAll()
    Alamofire.request(url).responseImage(completionHandler: { (response) in
        guard let image = response.result.value else { return }
        self.ItemImages.append(image)
        completed(true)
    })
}
Next I am looking to convert these URLs into images to be displayed. If someone could help me with where to move forward next, or point out where I am going wrong, that would be great.
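Two likely issues stand out: ItemImages.removeAll() runs on every loop iteration, wiping images appended by earlier completions, and nothing reloads the table once the downloads finish. Here is a minimal sketch of one way to wire this up, assuming it lives in the view controller that owns ItemImages and a tableView outlet (the DispatchGroup and the method name are illustrative, not from the original post):

// Sketch only: assumes self.ItemImages backs the table view's data source.
func loadImages(from urls: [String]) {
    ItemImages.removeAll()                    // clear once, before the batch starts
    let group = DispatchGroup()

    for url in urls {
        group.enter()
        Alamofire.request(url).responseImage { response in
            // Keep only responses that decoded into an image
            if let image = response.result.value {
                self.ItemImages.append(image)
            }
            group.leave()
        }
    }

    group.notify(queue: .main) {
        // Every request has finished (successfully or not); refresh the table
        self.tableView.reloadData()
    }
}

Alamofire delivers responseImage completions on the main queue by default, so appending to ItemImages here does not need extra synchronization.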

Related

Extensive memory usage when uploading assets (images, videos) to Firebase in Swift?

Suppose I have an array of UIImage called photos that are to be uploaded to Firebase Storage. I wish to do the following things:
Upload them to Firebase Storage
Get the paths of the uploaded photos and store them in an array called uploadedAssets (paths, not download URLs; they look like this: "photos/folder_name/photo_id"), where "folder_name" is randomly generated and "photo_id" is an integer representing the order of the photos
Call a Cloud Function and pass uploadedAssets to it. The server then uses the paths to find all the pictures and generates a thumbnail for each one.
Finally, store the original photos' download URLs and the thumbnails' download URLs in the database.
I have something that works, but it uses too much memory (300+ MB when uploading only 4 pictures):
// Swift
let dispatchGroup = DispatchGroup()
let dispatchQueue = DispatchQueue(label: "AssetQueue")
var uploadedAssets = [String]()
let folderName: String = UUID().uuidString

dispatchQueue.async {
    for i in 0..<photos.count {
        dispatchGroup.enter()
        let photo: UIImage = photos[i]
        let fileName: String = "\(folderName)/\(i)"
        let assetRef = Storage.storage().reference().child("photos/\(fileName)")
        let metaData = StorageMetadata()
        metaData.contentType = "image/jpg"
        if let dataToUpload = UIImageJPEGRepresentation(photo, 0.75) {
            assetRef.putData(
                dataToUpload,
                metadata: metaData,
                completion: { (_, error) in
                    uploadedAssets.append("photos/\(fileName)")
                    dispatchGroup.leave()
                }
            )
        } else {
            // Balance the enter() above even when JPEG encoding fails
            dispatchGroup.leave()
        }
    }
}

dispatchGroup.notify(queue: dispatchQueue) {
    Alamofire.request(
        "https://<some_url>",
        method: .post,
        parameters: [
            "uploadedAssets": uploadedAssets
        ]
    )
}
The code that generates thumbnails runs on the server side and is, in my opinion, irrelevant here, so I won't post it. So, the above snippet consumes 300+ MB of memory when there are 4 photos to upload. After those photos have uploaded successfully, the memory usage stays at 300+ MB and never drops. When I try to upload more, say another 4 photos, it can even go up to 450+ MB. I know that's not normal, but I can't seem to figure out why this is happening.
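No answer is recorded for this question in the thread, but a common first step for this kind of growth is to wrap the per-photo JPEG encoding in an autoreleasepool, so each temporary compressed buffer can be released before the next iteration instead of accumulating for the whole loop. A sketch under that assumption, reusing the same hypothetical photos array and folder layout as above:

dispatchQueue.async {
    for i in 0..<photos.count {
        // Drain per-iteration temporaries (the JPEG-encoded data in particular)
        // instead of keeping every compressed copy alive for the whole loop.
        autoreleasepool {
            dispatchGroup.enter()
            let fileName = "\(folderName)/\(i)"
            let assetRef = Storage.storage().reference().child("photos/\(fileName)")
            let metaData = StorageMetadata()
            metaData.contentType = "image/jpg"
            if let dataToUpload = UIImageJPEGRepresentation(photos[i], 0.75) {
                assetRef.putData(dataToUpload, metadata: metaData) { _, _ in
                    uploadedAssets.append("photos/\(fileName)")
                    dispatchGroup.leave()
                }
            } else {
                dispatchGroup.leave()
            }
        }
    }
}

Firebase still has to hold each Data buffer until its upload completes, so this only bounds the temporaries; uploading from files on disk (putFile) instead of in-memory Data is the other lever if the numbers stay high.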

Replace UIWebView's (loaded through a web service) <img> tag source with a local (cached) source

I am working on an app (Swift) where I need to load a web page inside a UIWebView. Inside that UIWebView there is an <img src="http://www.example.com/uploads/43454.jpg" /> element.
All works fine in this scenario, but the problem is that my 43454.jpg image can be 5-10 megabytes every time. So when the UIWebView loads, it keeps loading the image for about 2 minutes. Plus this <img> tag can have random sources, i.e. 22234.jpg, 98734.jpg, 33123.jpg, and so on.
So to tackle this situation I am trying to use the following approach:
List all possible images that we need to show in the UIWebView, then download and cache them (I used the Kingfisher library for this purpose) at application startup.
When my UIWebView loads my URL initially, it has nothing in its <img> element's src attribute, but it does have a data-image-name="22234.jpg" attribute-value pair.
Now when the UIWebView finishes loading its contents, get the image name from the data-image-name attribute.
Check for that image in the cache, and update the <img> element's src attribute with that image from the cache.
This way the UIWebView won't be downloading the image over and over again.
Note: I am assuming that UIWebView automatically manages its resource cache. All other file types (*.js, *.css) are being properly cached and are not loaded over and over again. The same does not happen for images; I don't know why.
If this approach seems okay, then how should I accomplish it (Swift 2.2)?
Any quick help will be much appreciated. Thanks
I had exactly this same scenario in one of my projects. I am going to paste my code here; maybe it helps you figure out your solution.
As soon as my app loads, I create an array of image URLs and pass it to the Kingfisher library to download and cache all the images (to disk).
for url in response {
    let URL = NSURL(string: url as! String)!
    self.PlanImagesArray.append(URL)
}

let prefetcher = ImagePrefetcher(
    urls: self.PlanImagesArray,
    optionsInfo: nil,
    progressBlock: { (skippedResources, failedResources, completedResources) -> () in
        print("progress resources are prefetched: \(completedResources)")
        print("progress resources are failedResources: \(failedResources)")
        print("progress resources are skippedResources: \(skippedResources)")
    },
    completionHandler: { (skippedResources, failedResources, completedResources) -> () in
        print("These resources are prefetched: \(completedResources)")
        print("These resources are failedResources: \(failedResources)")
        print("These resources are skippedResources: \(skippedResources)")
        self.activityLoadingIndicator.stopAnimating()
    })
prefetcher.start()
On my web view screen I initially loaded the web view, and after that used the following code to check for the particular image in the cache; if found, I converted it into a base64 string and put it back into the src attribute of the element.
func webViewDidFinishLoad(webView: UIWebView) {
    //UIApplication.sharedApplication().networkActivityIndicatorVisible = false
    print("webViewDidFinishLoad")
    let xlinkHref = webView.stringByEvaluatingJavaScriptFromString("document.getElementById('background_image').getAttribute('xlink:href')")
    //print("xlink:href before = \(webView.stringByEvaluatingJavaScriptFromString("document.getElementById('background_image').getAttribute('xlink:href')"))")
    if xlinkHref == "" {
        let imageCacheKey = webView.stringByEvaluatingJavaScriptFromString("document.getElementById('background_image').getAttribute('data-cache-key')")
        let imageCachePath = ImageCache.defaultCache.cachePathForKey(imageCacheKey! as String)
        var strBase64: String = ""
        // Inject a helper so attributes can be set from native code below
        webView.stringByEvaluatingJavaScriptFromString(
            "var script = document.createElement('script');" +
            "script.type = 'text/javascript';" +
            "script.text = \"function setAttributeOnFly(elemName, attName, attValue) { " +
            "document.getElementById(elemName).setAttribute(attName, attValue);" +
            "}\";" +
            "document.getElementsByTagName('head')[0].appendChild(script);"
        )!
        if imageCachePath != "" {
            ImageCache.defaultCache.retrieveImageForKey(imageCacheKey! as String, options: nil) { (imageCacheObject, imageCacheType) -> () in
                // JPEG quality is expressed in the 0.0...1.0 range
                let imageData: NSData = UIImageJPEGRepresentation(imageCacheObject!, 1.0)!
                strBase64 = "data:image/jpg;base64," + imageData.base64EncodedStringWithOptions(.EncodingEndLineWithCarriageReturn)
                webView.stringByEvaluatingJavaScriptFromString("setAttributeOnFly('background_image', 'xlink:href', '\(strBase64)');")!
            }
        } else {
            webView.stringByEvaluatingJavaScriptFromString("setAttributeOnFly('background_image', 'xlink:href', '\(imageCacheKey!)');")!
        }
    }
}
Hope it helps.
It sounds like you're basically saying "I know what the images are ahead of time, so I don't want to wait for the UIWebView to load them from the server every time. I want the HTML to use my local copies instead."
While it's a hack, my first thought was:
1. Have 2 UIWebViews: one is visible to the user and the other is hidden.
2. "Trigger" the real page in the hidden UIWebView. When the HTML reaches you...
3. Create a new string that is a copy of that HTML, but with the image tags that say <img src="http://... replaced with <img src="file://... pointing to the correct image(s) on disk.
4. Now tell the visible UIWebView to load the HTML string you built in step 3 (a sketch follows this list).
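A rough sketch of steps 3 and 4, assuming the hidden web view has already handed you the page HTML in a variable called pageHTML and that the cached file's path comes from Kingfisher; the variable names and the example image name are illustrative, written in the same Swift 2.2 style as the question:

// Point the image source at the cached copy on disk, then load the
// modified HTML into the visible web view.
let cachedImagePath = ImageCache.defaultCache.cachePathForKey("22234.jpg")
let localHTML = pageHTML.stringByReplacingOccurrencesOfString(
    "http://www.example.com/uploads/22234.jpg",
    withString: "file://\(cachedImagePath)")

// baseURL keeps the page's relative resources resolving against the original site.
visibleWebView.loadHTMLString(localHTML, baseURL: NSURL(string: "http://www.example.com"))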

Adding image from Firebase to UITableViewCell

I want to retrieve the image that is stored in a user's storage and place it next to his name in a custom UITableViewCell. The problem now is that the table view loads before the images are done downloading (I think?), causing the application to crash because the image array is nil. So what is the correct way to load the table view? I think, for the user experience, it is important that the cell is shown even if the images aren't done downloading, presenting a default image that is saved in the assets. I thought about making an array of UIImages that links to the default loading asset and swapping in the profile picture when it is done downloading, but I really have no clue how to do that. This is what I have so far for downloading the image:
let storage = FIRStorage.storage()
let storageRef = storage.reference(forURL: "link.appspot.com")
channelRef?.observeSingleEvent(of: .value, with: { (snapshot) in
    if let snapDict = snapshot.value as? [String: AnyObject] {
        for each in snapDict {
            let UIDs = each.value["userID"] as? String
            if let allUIDS = UIDs {
                let profilePicRef = storageRef.child((allUIDS) + "/profile_picture.png")
                profilePicRef.data(withMaxSize: 1 * 500 * 500) { data, error in
                    if let error = error {
                    }
                    if data != nil {
                        self.playerImages.append(UIImage(data: data!)!)
                    }
                }
            }
            let userNames = each.value["username"] as? String
            if let users = userNames {
                self.players.append(users)
            }
        }
    }
    self.tableView.reloadData()
})
This is in the cellForRow
cell.playersImage.image = playerImages[indexPath.row] as UIImage
My rules; I haven't changed them from the defaults:
service firebase.storage {
  match /b/omega-towers-f5beb.appspot.com/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
Thank you.
Regarding user experience, you are correct. It is standard to have some sort of default image while loading an image from a URL. A great library for image caching and showing a default asset in its place is AlamofireImage.
Vandan Patel's answer is correct in saying you need to ensure your array is not nil when loading the table view. The AlamofireImage library gives you a completion block to handle any extra work you would like to do with your image.
This is all assuming you are getting a correct image URL back for your Firebase users.
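A minimal sketch of what that looks like inside cellForRowAt, assuming the cell exposes a playersImage image view, a hypothetical playerImageURLs array holds the download URL strings, and a "placeholder" image exists in the asset catalog (all three names are assumptions, not from the question):

import AlamofireImage

// Inside tableView(_:cellForRowAt:)
if let url = URL(string: playerImageURLs[indexPath.row]) {
    // Shows the bundled placeholder immediately, then swaps in the
    // downloaded (and cached) image when the request finishes.
    cell.playersImage.af_setImage(
        withURL: url,
        placeholderImage: UIImage(named: "placeholder"))
}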
You should call tableView.reloadData() when the images are done downloading. One important thing: initialize your playerImages as playerImages = [UIImage]() instead of playerImages: [UIImage]!. If it is empty it will simply show nothing, instead of crashing because the array is nil.
Update:
if let players = playerImages {
    //code
}

Accessing URL and Array from within a Block of JSON Data

Let's say I have JSON data structured in the following way:
{
  "fruits": {
    "apple": {
      "name": "Gala",
      "color": "red",
      "picture": "//juliandance.org/wp-content/uploads/2016/01/RedApple.jpg",
      "noOfFruit": [1, 2]
    }
  }
}
How would I access the picture and the noOfFruit array using the iOS version of Firebase? I want to make a table view with a cell that lists the apple's name, its color, a picture of the apple, and then the number of fruit. I get how to obtain the "color" and "name" values, but how would I access the array (and turn it into a string) and the image so that it shows up in the table view? Any help is appreciated!
For the array it's really simple. Wherever you have the function that listens for the Firebase changes, let's imagine you have the info under the apple key stored in a variable like let apple.
Then you can cast the value of noOfFruit to an array, like the following:
let apple = // what you had before
guard let noOfFruit = apple["noOfFruit"] as? [Int] else {
    return
}
// Here you have the array of ints called noOfFruit
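For context, a hedged sketch of how that apple dictionary might be obtained in the first place, assuming a database reference pointing at the fruits/apple node (the reference path and variable names are assumptions, using the FIRDatabase API that matches the other snippets in this thread):

let appleRef = FIRDatabase.database().reference().child("fruits/apple")
appleRef.observeSingleEvent(of: .value, with: { snapshot in
    guard let apple = snapshot.value as? [String: Any] else { return }
    // apple["name"], apple["color"], apple["picture"] and apple["noOfFruit"]
    // are now available for the guard-let cast shown above.
})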
For the image, there are several options out there. The first (and a bad one) is to synchronously fetch the data from the URL and set it on an image view, like the following:
let url = URL(string: picture)
let data = try? Data(contentsOf: url!) //this may break due to force unwrapping, make sure it exists
imageView.image = UIImage(data: data!)
The thing with this approach is that it's NOT OK. It will block the main thread while it makes the request and downloads the image, leaving the app unresponsive.
The better approach is to fetch it asynchronously. There are several libraries that really help, such as AlamofireImage, but it can be done with barebones Foundation quite easily. To do that, you should use the URLSession class, like the following:
guard let url = URL(string: picture) else {
    return
}
URLSession.shared.dataTask(with: url) { data, response, error in
    if let error = error {
        print(error)
        return
    }
    // Remember to do UI updates on the main thread
    DispatchQueue.main.async {
        self.myImageView.image = UIImage(data: data!)
    }
}.resume()

Swift - How to retrieve multiple images at once (GCD)?

Let me give you some insight into my application itself.
To put it shortly, I am creating a social networking app. Each post consists of an image, a profile picture, and a caption. Each post exists in my MySQL database. I am using my own framework to retrieve each post. However, once I retrieve a post I still have to retrieve the profile picture and image using the URLs which I got from the database. I would like to retrieve all the images at once rather than one after another.
As of now, there are about 5 posts in the database. Loading the necessary images for one post takes about 4 seconds. Right now I am loading the images for one post and then retrieving the next, sequentially, so the whole process takes around 20 seconds. If I have, say, 50 posts, it will take an extremely long time to load them all. I have some knowledge of GCD (Grand Central Dispatch), but I don't know how to implement it in my app.
Here is my code for retrieving my posts and images:
ConnectionManager.sharedInstance.retrievePosts(UserInformationInstance.SCHOOL) {
    (result: AnyObject) in
    if let posts = result as? [[String: AnyObject]] {
        print("Retrieved \(posts.count) posts.")
        for post in posts {
            let postIDCurrent = post["id"] as? Int
            var UPVOTES = 0
            var UPVOTED: Bool!
            var query = ""
            if let profilePictureCurrent = post["profile_picture"] {
                // Loading profile picture image
                let url = NSURL(string: profilePictureCurrent as! String)
                let data = NSData(contentsOfURL: url!)
                let image = UIImage(data: data!)
                UserInformationInstance.postsProfilePictures.append(image!)
                print("added profile pic")
            } else {
                print("error")
            }
            if let postPictureCurrent = post["image"] {
                if (postPictureCurrent as! String != "") {
                    // Loading image associated with post
                    let url = NSURL(string: postPictureCurrent as! String)
                    let data = NSData(contentsOfURL: url!)
                    let image = UIImage(data: data!)
                    let imageArray: [AnyObject] = [postIDCurrent!, image!]
                    UserInformationInstance.postsImages.append(imageArray)
                    print("added image pic")
                }
            } else {
                print("error")
            }
            UserInformationInstance.POSTS.append(post)
        }
    } else {
        self.loadSearchUsers()
    }
}
So my question is, how can I retrieve all the images at the same time instead of retrieving one after the other?
It would be great if someone could give an explanation as well as some code :)
I would recommend revising your approach. If your server is fine - it is not busy and is easily reachable - so that downloading is limited by the device's network adapter bandwidth (X Mbps), then it does not matter whether you download the images concurrently or sequentially.
Let me show this. If you download 10 files of size Y MB simultaneously, they all finish together, but the download speed per file is 10 times lower:
X/10 - download speed per file
Time = Amount / Speed
T = Y / (X/10) = 10 * Y / X
Now if you are downloading sequentially:
T = 10 * (Y / X) = 10 * Y / X
I would recommend showing the posts immediately, as soon as you have retrieved them from storage, then starting the image downloads asynchronously and setting each image once it has downloaded. That is the best practice in the industry; consider the Facebook, Twitter, and Instagram apps.
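A minimal sketch of that pattern inside cellForRowAt, assuming each cell has a postImageView, the post's image URL string is at hand in imageURLString, and a bundled "placeholder" asset exists (all of these names are illustrative, and the sketch uses the Swift 3 URLSession API rather than the Swift 2 style in the question; in practice a library such as AlamofireImage or Kingfisher also handles the caching for you):

// Configure the cell's text right away and show a placeholder for the image...
cell.postImageView.image = UIImage(named: "placeholder")

// ...then fetch the image off the main thread and apply it when it arrives.
if let url = URL(string: imageURLString) {
    URLSession.shared.dataTask(with: url) { data, _, _ in
        guard let data = data, let image = UIImage(data: data) else { return }
        DispatchQueue.main.async {
            // Guard against recycled cells showing the wrong row's image.
            if tableView.indexPath(for: cell) == indexPath {
                cell.postImageView.image = image
            }
        }
    }.resume()
}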
