Recently, I have been trying to save images from URLs to the user's photo album.
I use the function UIImageWriteToSavedPhotosAlbum, and the didFinishSavingWithError callback tells me the result. My problem is that when there are many image URLs, some images can never be saved to the album this way, even though they have already been downloaded.
I checked the size of one failed image: it is a PNG, 1290 × 1288, and I don't know whether it fails because the file is too big to save. Has anyone run into this kind of issue? Please help, thanks!
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error != nil) {
        isImagesSavedFailed = YES;
    }
}
You might want to write them one by one instead of in parallel, especially since you have many pictures. Firing off all the saves at once lets the photo library become "too busy." Queue the images up and save the next one only after the previous save's callback fires, as in the sketch below.
You can get some more details at ios programming: Using threads to add multiple images to library
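A minimal sketch of that serial approach, assuming a pendingImages NSMutableArray property (a hypothetical name) that holds the already-downloaded UIImage objects:

// Start the queue; each save kicks off the next one from its callback.
- (void)saveNextImage {
    if (self.pendingImages.count == 0) {
        return; // nothing left to save
    }
    UIImage *next = self.pendingImages.firstObject;
    [self.pendingImages removeObjectAtIndex:0];
    UIImageWriteToSavedPhotosAlbum(next, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);
}

- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error != nil) {
        NSLog(@"Save failed: %@", error);
    }
    // Only write the next image once the previous save has finished.
    [self saveNextImage];
}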
I have an iOS application where I am saving an image to the photo library on my iPad. However, what I would like to do is store the image with a custom name (e.g. using the date and time stamp for when the image was actually captured), so that it would be easier to retrieve it in the future.
My code at the moment for storing my images is as follows:
- (IBAction)imageCapture:(id)sender {
    UIImage *myImage = [_imageView currentImage];
    UIImageWriteToSavedPhotosAlbum(myImage, nil, nil, nil);
}
I am able to store my image, but how do I name the image prior to storing it (e.g. "082620131100am.png")?
Since you cannot overwrite any image or video in the user's photo library, you will not be able to give the file a specific filename.
You can attach some metadata to your image if you use ALAssetsLibrary's writeImageDataToSavedPhotosAlbum:metadata:completionBlock: method, but you will not be able to set the filename.
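A minimal sketch of saving with metadata, assuming you already have the image's NSData; the metadata dictionary here is only illustrative (its keys mirror the ImageIO property dictionaries, e.g. kCGImagePropertyTIFFDictionary):

#import <AssetsLibrary/AssetsLibrary.h>

- (void)saveImageData:(NSData *)imageData {
    // Illustrative metadata; it is embedded in the file, but the filename is still chosen by Photos.
    NSDictionary *metadata = @{ @"{TIFF}" : @{ @"ImageDescription" : @"082620131100am" } };

    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageDataToSavedPhotosAlbum:imageData
                                     metadata:metadata
                              completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error != nil) {
            NSLog(@"Saving failed: %@", error);
        } else {
            NSLog(@"Saved asset at %@", assetURL);
        }
    }];
}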
So I have code that deletes images from the camera roll. It works fine and can delete single images from a burst; however, if one particular image is deleted, the entire burst gets deleted, and I can't figure out how to stop that. It usually seems to be the last image in the burst group. In my fetch options, I turn on includeAllBurstAssets.
func deletePhotos(assetsToDelete: [PHAsset]) {
    PHPhotoLibrary.sharedPhotoLibrary().performChanges({
        PHAssetChangeRequest.deleteAssets(assetsToDelete)
        return
    }, completionHandler: { success, error in
        guard let error = error else {
            return
        }
        print(error)
    })
}
I can confirm this behavior. I guess the API is designed to delete the entire burst when one of its photos is deleted.
Please note that the Apple Photos app also has no way to delete a single photo from a burst.
It would make sense to make that behavior customizable, so I would suggest you file a bug report / enhancement request.
I am trying to use Path's FastImageCache library to handle photos in my app. The sample they provide simply reads the images from disk. Does anyone know how I might modify it to read from a URL? In the section about providing source images to the cache they have:
- (void)imageCache:(FICImageCache *)imageCache wantsSourceImageForEntity:(id<FICEntity>)entity withFormatName:(NSString *)formatName completionBlock:(FICImageRequestCompletionBlock)completionBlock {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Fetch the desired source image by making a network request
        NSURL *requestURL = [entity sourceImageURLWithFormatName:formatName];
        UIImage *sourceImage = [self _sourceImageForURL:requestURL];

        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(sourceImage);
        });
    });
}
Has anyone used this API before and knows how to get the source image from the server to pass to the cache? Another example, which still reads from disk, is:
- (void)imageCache:(FICImageCache *)imageCache wantsSourceImageForEntity:(id<FICEntity>)entity withFormatName:(NSString *)formatName completionBlock:(FICImageRequestCompletionBlock)completionBlock {
    // Images typically come from the Internet rather than from the app bundle directly, so this would be the place to fire off a network request to download the image.
    // For the purposes of this demo app, we'll just access images stored locally on disk.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        UIImage *sourceImage = [(FICDPhoto *)entity sourceImage];

        dispatch_async(dispatch_get_main_queue(), ^{
            completionBlock(sourceImage);
        });
    });
}
I worked on Fast Image Cache while I was at Path. The critical portion of Fast Image Cache is that it is the absolute fastest way to go from image data on disk to being rendered by Core Animation. No decoding happens, none of the image data is kept in memory by your app, and no image copies occur.
That said, the responsibility is yours to figure out how to download the images. There's nothing inherently special about downloading images. You can use NSURLConnection or one of many popular networking libraries (like AFNetworking) to actually download the image data from your server. Once you have that image data, you can call the relevant completion block for Fast Image Cache to have it optimize it for future rendering.
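As a concrete illustration, here is a minimal sketch of that delegate method using NSURLSession instead of a local disk read (NSURLSession is just one of the networking options mentioned above; NSURLConnection or AFNetworking would work the same way):

- (void)imageCache:(FICImageCache *)imageCache wantsSourceImageForEntity:(id<FICEntity>)entity withFormatName:(NSString *)formatName completionBlock:(FICImageRequestCompletionBlock)completionBlock {
    NSURL *requestURL = [entity sourceImageURLWithFormatName:formatName];
    NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithURL:requestURL completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        // Decode the downloaded data into a UIImage (nil on failure).
        UIImage *sourceImage = (data != nil && error == nil) ? [UIImage imageWithData:data] : nil;
        dispatch_async(dispatch_get_main_queue(), ^{
            // Hand the image to Fast Image Cache, which stores an optimized copy for rendering.
            completionBlock(sourceImage);
        });
    }];
    [task resume];
}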
If you're looking for a simple way to download an image and display it when it's finished, then use something like SDWebImage. It's great for simple cases like that. If you are running into performance bottlenecks—especially with scrolling—as a result of your app needing to display tons of images quickly, then Fast Image Cache is perfect for you.
Your approach sounds a lot like lazy-loading images from a URL. I had to do this once and used the following library. It doesn't store the images on disk, but keeps them in a cache. Here is the link:
https://github.com/nicklockwood/AsyncImageView
I added the networking logic to our fork: https://github.com/DZNS/FastImageCache#dezine-zync-additions-to-the-class
It uses NSURLSessionDownloadTask and has a couple of optional configuration options. All you need to do is create a new instance of DZFICNetworkController and set it as the delegate of FICImageCache's shared cache instance. It will take care of downloading images, using the sourceImageURLWithFormatName: method on your objects conforming to <FICEntity>.
Since I assume you'd use this in a UITableView or UICollectionView, calling cancelImageRetrievalForEntity:withFormatName: on the image cache will cancel the download operation (if it's still in-flight or hasn't started).
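A rough setup sketch, assuming the fork exposes DZFICNetworkController with a plain initializer and that the shared cache is reached through FICImageCache's sharedImageCache accessor (check the fork's README for the exact names):

// e.g. in -application:didFinishLaunchingWithOptions:
// Keep a strong reference to the controller, since the cache's delegate is not retained.
self.imageNetworkController = [[DZFICNetworkController alloc] init];
[FICImageCache sharedImageCache].delegate = self.imageNetworkController;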
I am banging my head against an issue in iOS 7 development. I use the following piece of code to load an image from a web server:
NSData* data = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:@"http://someServer/someImage.jpg"]];
This works like a charm in the simulator, reading exactly the 134185 bytes that the image has. Creating a UIImage from that data works as intended.
Once I run the exact same code on a device (iPad mini, iOS 7.0.3), though, it reads just 14920 bytes from the same URL. Needless to say, I can't create a UIImage from that data; creation fails and returns nil.
The read does not produce any errors (no console output, and the signature with the error output parameter returns nil here as well). Is there anything I missed in this rather straightforward task? I haven't found anything on the web about this…
Thanks, habitoti
So you don't get any error, and something is downloading. Maybe try reading that response as text and posting it here (I guess it is an HTML/text body)?
You can use this NSString method:
+ (instancetype)stringWithContentsOfURL:(NSURL *)url encoding:(NSStringEncoding)enc error:(NSError **)error;
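For example, a quick check along those lines (using the URL from the question); if the server is actually returning an error page, it will show up here as text:

NSError *error = nil;
NSString *body = [NSString stringWithContentsOfURL:[NSURL URLWithString:@"http://someServer/someImage.jpg"]
                                          encoding:NSUTF8StringEncoding
                                             error:&error];
// body will be nil for binary image data, but an HTML error page will decode as text.
NSLog(@"error: %@\nbody: %@", error, body);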
Can I suggest you use a library like SDWebImage to retrieve your image? It caches images and downloads them asynchronously.
It also has a category on UIImageView, so you can just call [imageView setImageWithURL:] and it will load the image when it's ready.
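A minimal usage sketch with that category (the placeholder image name is just an example; newer SDWebImage versions prefix these methods with sd_):

#import "UIImageView+WebCache.h"

[imageView setImageWithURL:[NSURL URLWithString:@"http://someServer/someImage.jpg"]
          placeholderImage:[UIImage imageNamed:@"placeholder.png"]];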
I am trying to build a nice function that fetches images from the network; when they are found on the web, I store them in a cache system I made.
If the image is already stored in the cache, I return it.
The function is called getImageFromCache and returns an image if it is in the cache; otherwise, it goes to the network and fetches it.
The call might look like this:
UIImageView* backgroundTiles = [[UIImageView alloc] initWithImage:[self getImageFromCache:@"http://www.example.com/1.jpg"]];
Now, I am moving to a threaded approach because of the big latencies caused by network traffic, so I want image views to show a temporary image until the result comes back from the web.
What I want to know is how I can keep track of so many images being fetched and assigned to UIImageViews by this function (getImageFromCache).
Something just won't work there:
-(UIImage*)getImageFromCache:(NSString*)forURL{
    __block NSError* error = nil;
    __block NSData *imageData;
    __block UIImage* tmpImage;

    if(forURL==nil) return nil;

    if(![self.imagesCache objectForKey:forURL])
    {
        // Setting a temporary image until we start getting results
        tmpImage = [UIImage imageNamed:@"noimage.png"];

        NSURL *imageURL = [NSURL URLWithString:forURL];
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
            imageData = [NSData dataWithContentsOfURL:imageURL options:NSDataReadingUncached error:&error];
            if(imageData)
            {
                NSLog(@"Thread fetching image URL:%@", imageURL);
                dispatch_async(dispatch_get_main_queue(), ^{
                    tmpImage = [UIImage imageWithData:imageData];
                    if(tmpImage)
                    {
                        [imagesCache setObject:tmpImage forKey:forURL];
                    }
                    else
                        // Couldn't build an image of this data, probably bad URL
                        [imagesCache setObject:[UIImage imageNamed:@"imageNotFound.png"] forKey:forURL];
                });
            }
            else
                // Couldn't build an image of this data, probably bad URL
                [imagesCache setObject:[UIImage imageNamed:@"imageNotFound.png"] forKey:forURL];
        });
    }
    else
        return [imagesCache objectForKey:forURL];

    return tmpImage;
}
This is not a direct answer to your question, but are you aware that there is no need to use GCD to download things asynchronously (on a background thread)? Just use NSURLConnection and its delegate methods. All your code will be on the main thread but the actual connection and downloading will happen in the background.
(And in fact I have written a class, MyDownloader, that takes care of all this for you:
http://www.apeth.com/iOSBook/ch37.html#_http_requests
Scroll down to the part about MyDownloader and its subclass MyImageDownloader, which does exactly the sort of thing you need done here. Moreover, note the subsequent code in that chapter showing how to use a notification when a download completes, prompting the table view that needs these images to reload the row containing the image view whose image has just arrived.)
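To make the delegate-based pattern concrete, here is a bare-bones sketch of it (the class and property names are illustrative, not taken from MyDownloader):

#import <UIKit/UIKit.h>

@interface ImageDownloader : NSObject <NSURLConnectionDataDelegate>
@property (nonatomic, strong) NSMutableData *receivedData;
@property (nonatomic, copy) void (^completion)(UIImage *image);
@end

@implementation ImageDownloader

- (void)startWithURL:(NSURL *)url completion:(void (^)(UIImage *image))completion {
    self.completion = completion;
    self.receivedData = [NSMutableData data];
    // The loading happens in the background; delegate callbacks arrive on the thread that started the connection (here, the main thread).
    [NSURLConnection connectionWithRequest:[NSURLRequest requestWithURL:url] delegate:self];
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    [self.receivedData appendData:data];
}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
    if (self.completion) self.completion([UIImage imageWithData:self.receivedData]);
}

- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error {
    if (self.completion) self.completion(nil);
}

@end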
It's good that you're building it from scratch, but if you want to save yourself all the work, there's a drop-in replacement: the SDWebImage library. It supports remote images coming from the web and has all the functionality you said you need, like placeholder (temp) images, asynchronous loading, caching, etc.
In your background thread, once the download has completed and you've saved the image to the cache, I'd suggest you post a notification using NSNotificationCenter to let other parts of your app know that the cache has been updated.
This assumes that whichever part of the app manages the image views has registered its interest in those notifications with addObserverForName:object:queue:usingBlock:. When it receives such a notification, it can then attempt to retrieve the images from the cache again and update its image views if appropriate.
Depending on the number of image views, you may want to pass through the image url in the notification in some way (e.g. in the userInfo dictionary), and then based on that decide which image views should be refreshed rather than refreshing them all.
I should add that I would also recommend getting rid of the inner dispatch_async call. There's no need for that, although you may need to add synchronisation to your cache object so it can be safely accessed from the main thread as well as the download thread.
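A small sketch of that notification flow; the notification name, userInfo key, and the refresh helper are made up for illustration:

// In the download completion, right after storing the image in the cache:
[[NSNotificationCenter defaultCenter] postNotificationName:@"ImageCacheDidUpdateNotification"
                                                    object:self
                                                  userInfo:@{ @"url" : forURL }];

// Wherever the image views are managed:
[[NSNotificationCenter defaultCenter] addObserverForName:@"ImageCacheDidUpdateNotification"
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSString *updatedURL = note.userInfo[@"url"];
    // Refresh only the image view(s) that display this URL, rather than reloading everything.
    [self refreshImageViewsForURL:updatedURL]; // hypothetical helper
}];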