Faster way to grab images from XML feed - ios

I'm pulling data into a UITableView via an XML feed from a Wordpress site. I wanted to display the table with an image if the post contained one and a default image if it did not. So in my
- (void) parser:(NSXMLParser *)parser didEndElement:(NSString *)elementname namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName
method, I have an if statement that looks like this:
if ([elementname isEqualToString:@"content:encoded"]) {
    NSString *firstImageURL = [self getFirstImageUrl:currentStory.content];
    currentStory.imageURL = firstImageURL;
    UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:firstImageURL]]];
    currentStory.image = image;
}
This calls getFirstImageUrl:, which looks like this:
- (NSString *)getFirstImageUrl:(NSString *)html {
    NSScanner *theScanner;
    NSString *imageURL = nil;
    theScanner = [NSScanner scannerWithString:html];
    // find start of tag
    [theScanner scanUpToString:@"<img" intoString:NULL];
    if ([theScanner isAtEnd] == NO) {
        [theScanner scanUpToString:@"src=\"" intoString:NULL];
        NSInteger newLoc2 = [theScanner scanLocation] + 5;
        [theScanner setScanLocation:newLoc2];
        // find end of tag
        [theScanner scanUpToString:@"\"" intoString:&imageURL];
    }
    return imageURL;
}
Everything works as it should, but loading the table takes about 5 to 6 seconds and can sometimes take up to 10 seconds, which is not desirable. I was wondering if there is anything I can do to speed up the process of grabbing the first photo.
UPDATE
So after more investigation, it appears that the bottleneck I'm seeing has nothing to do with me downloading images. In fact the actual downloading of images takes no longer than 2 seconds consistently. It looks like the bottleneck happens when I download the RSS feed:
NSData *data = [[NSData alloc] initWithContentsOfURL:url];
This consistently takes the longest.
2012-03-30 14:35:11.506 gbllc[883:3203] inside grabDataForFeed
2012-03-30 14:35:11.510 gbllc[883:3203] reached loadXMLByURL
2012-03-30 14:35:11.512 gbllc[883:3203] after stories alloc
**** 5 seconds ****
2012-03-30 14:35:16.568 gbllc[883:3203] after initWithContentsOfURL
2012-03-30 14:35:16.570 gbllc[883:3203] after initWithData
2012-03-30 14:35:16.573 gbllc[883:3203] about to parse
*** I now parse the XML and download images, takes 2 seconds ***
2012-03-30 14:35:18.066 gbllc[883:3203] Parsed successfully
Right after I alloc my data object, I grab the data for parsing. So my original question is no longer really valid; instead I should probably ask whether there is a faster way to grab the initial data for parsing, or whether I should change my model and use JSON instead.
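The blocking call itself is bound by the server and the network, so it can't be made much faster, but it can at least be moved off the main thread so the UI stays responsive while the feed downloads. A minimal sketch of that idea (not the poster's code), assuming iOS 5+ and a hypothetical loadXMLData: helper that wraps the existing NSXMLParser setup:

// Sketch: fetch the feed asynchronously instead of blocking on initWithContentsOfURL:.
// loadXMLData: is a hypothetical method wrapping the existing parser code.
NSURLRequest *request = [NSURLRequest requestWithURL:url];
[NSURLConnection sendAsynchronousRequest:request
                                   queue:[NSOperationQueue mainQueue]
                       completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
    if (data) {
        [self loadXMLData:data];       // hypothetical: runs the existing NSXMLParser code
        [self.tableView reloadData];
    } else {
        NSLog(@"Feed download failed: %@", error);
    }
}];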

That's because you're downloading the image data itself over the network, synchronously, for every post. You need to offload that work and do it asynchronously: use an NSOperationQueue where you can queue up each image download to happen on a separate thread.
Here's a great example of doing just that: http://davidgolightly.blogspot.com/2009/02/asynchronous-image-caching-with-iphone.html
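For illustration, a minimal sketch of that idea (not the linked article's code); _imageQueue is an assumed NSOperationQueue ivar, and story and self.tableView are assumed names:

// Sketch: download the story image on a background operation queue instead of
// inside the XML parser callback, then hop back to the main queue to update the UI.
[_imageQueue addOperationWithBlock:^{
    NSData *imageData = [NSData dataWithContentsOfURL:[NSURL URLWithString:story.imageURL]];
    UIImage *image = imageData ? [UIImage imageWithData:imageData] : [UIImage imageNamed:@"default"];
    [[NSOperationQueue mainQueue] addOperationWithBlock:^{
        story.image = image;
        [self.tableView reloadData];
    }];
}];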

I marked Joel's answer as the best answer because he gave me the idea for async downloading of images, but the solution I ended up using is here:
http://howtomakeiphoneapps.com/how-to-asynchronously-add-web-content-to-uitableview-in-ios/1732/
It's by far the easiest and most elegant I've seen after hours of searching.

Related

Loading UIImage from disk sometimes fails with permission error in background state?

I have this strange issue that I am having trouble resolving. I am creating an App which allows music to be played back. When the screen is locked (and there is a currently playing song), the lock screen will populate with a bunch of data. One piece is the album art.
The problem is that after the phone is locked and I skip a few tracks (forwards or backwards), the UIImages are no longer being loaded. If I test out the functionality and quickly skip forwards and backwards in my playback queue, the album art will appear for the first 4-5 songs. After that, the images stop appearing because I get an NSFileReadNoPermissionError from my code that grabs the image. I understand that I apparently do not have permission to access the png image files, but I do not understand why. My application created them, saved them on disk, and is now trying to load them from disk while my app is running in a background state.
The relevant code snippet:
+ (void)updateLockScreenInfoAndArtForSong:(Song *)song
{
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *artDirPath = [documentsPath stringByAppendingPathComponent:@"Album Art"];
    NSString *path = artDirPath;

    //-----> LIST ALL FILES for debugging <-----//
    NSLog(@"LISTING ALL FILES FOUND");
    int count;
    NSArray *directoryContent = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:path
                                                                                    error:NULL];
    for (count = 0; count < (int)[directoryContent count]; count++)
    {
        NSLog(@"File %d: %@", (count + 1), [directoryContent objectAtIndex:count]);
    }
    //-----> END DEBUG CODE <-----//

    Song *nowPlayingSong = [MusicPlaybackController nowPlayingSong];
    Class playingInfoCenter = NSClassFromString(@"MPNowPlayingInfoCenter");
    if (playingInfoCenter) {
        NSMutableDictionary *songInfo = [[NSMutableDictionary alloc] init];
        NSError *error;
        NSData *data = [NSData dataWithContentsOfURL:[AlbumArtUtilities albumArtFileNameToNSURL:nowPlayingSong.albumArtFileName]
                                              options:NSDataReadingUncached
                                                error:&error];
        NSInteger code = error.code;
        NSLog(@"Error code: %li", (long)code); // prints 257 sometimes, which is NSFileReadNoPermissionError

        UIImage *albumArtImage = [UIImage imageWithData:data];
        if (albumArtImage == nil) { // song has no album art, check if its album does
            Album *songsAlbum = song.album;
            if (songsAlbum) {
                albumArtImage = [UIImage imageWithData:[NSData dataWithContentsOfURL:
                                    [AlbumArtUtilities albumArtFileNameToNSURL:songsAlbum.albumArtFileName]]];
            }
        }

        [songInfo setObject:nowPlayingSong.songName forKey:MPMediaItemPropertyTitle];
        NSInteger duration = [nowPlayingSong.duration integerValue];
        [songInfo setObject:[NSNumber numberWithInteger:duration]
                     forKey:MPMediaItemPropertyPlaybackDuration];
        [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:songInfo];
    }
}
Any help would be immensely appreciated! I have tried so many things that I am at a loss for what to even try next. Note that the above code snippet is called in
- (void)remoteControlReceivedWithEvent:(UIEvent *)event
when event.subtype is UIEventSubtypeRemoteControlNextTrack or UIEventSubtypeRemoteControlPreviousTrack.
Figured it out after trying everything all day lol. Turns out that, by default in iOS 8, files the app writes are protected and cannot be accessed once the phone is locked...with a small delay, of course. This is why a few album art images were loading but then stopped working after a few seconds: the delay between when the phone locked and when the protection kicked in made the issue look "random".
Anyway, for anyone reading this, my solution involved setting the file protection of all the folders, subfolders, and files leading to the album art directory.
Hint:
[attributes setValue:NSFileProtectionCompleteUntilFirstUserAuthentication forKey:NSFileProtectionKey];
Set that attribute on a file when it is being created (pass the attributes in as an NSDictionary). If an existing directory or file needs to be modified, gather the attributes of the item using NSFileManager, and then set the value as shown above.
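For illustration, a minimal sketch of applying the relaxed protection class to an existing file with NSFileManager; the path used here is a hypothetical example:

// Sketch (not the poster's code): relax data protection on an existing file so it
// stays readable after the device is locked. artPath is a hypothetical example path.
NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *artPath = [[documentsPath stringByAppendingPathComponent:@"Album Art"]
                                    stringByAppendingPathComponent:@"cover.png"];

NSDictionary *attributes = @{ NSFileProtectionKey : NSFileProtectionCompleteUntilFirstUserAuthentication };
NSError *error = nil;
if (![[NSFileManager defaultManager] setAttributes:attributes ofItemAtPath:artPath error:&error]) {
    NSLog(@"Could not change file protection: %@", error);
}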

Play Motion JPG stream in iOS out of memory

I am trying to play video coming from an IP camera in iOS, but the approaches I have tried so far all seem to fill up the memory of my iOS device really fast. I am using ARC for this project.
My IP camera uses Videostream.cgi (Foscam), which is a well-known way for IP cameras to stream 'video' through the browser.
So, I tried three approaches, which all end up crashing my iOS app with an out-of-memory exception.
1. Putting a UIWebView on my UIViewController and calling the CGI directly using an NSURLRequest.
NSString *url = [NSString stringWithFormat:@"http://%@:%@/videostream.cgi?user=%@&pwd=%@&rate=0&resolution=%ld", camera.ip, camera.port, camera.username, camera.password, (long)_resolution];
NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:url]];
webView = [[UIWebView alloc] init];
[webView loadRequest:request];
2. Putting a UIWebView on my UIViewController and creating a piece of HTML (in code) which includes an <img> tag whose source is the CGI mentioned before. (see: IP camera stream with UIWebview works on IOS 5 but not on IOS 6)
NSString *imgHtml = [NSString stringWithFormat:@"<img src='%@'>", url];
webView = [[UIWebView alloc] init];
[webView loadHTMLString:imgHtml baseURL:nil];
3. Using a custom control, based on a UIImageView, which fetches data continuously. https://github.com/mateagar/Motion-JPEG-Image-View-for-iOS
All of these things burn through memory, and even when I try to remove and re-add them after a certain period of time, the problem remains. Memory won't be released and the iPad crashes.
UPDATE:
I am currently modifying option 3 of the solutions I tried. It is based on an NSURLConnection and the data it retrieves.
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    if (!_receivedData) {
        _receivedData = [NSMutableData new];
    }
    [_receivedData appendData:data];

    NSRange endRange = [_receivedData rangeOfData:_endMarkerData
                                          options:0
                                            range:NSMakeRange(0, _receivedData.length)];
    NSUInteger endLocation = endRange.location + endRange.length;
    if (_receivedData.length >= endLocation) {
        NSData *imageData = [_receivedData subdataWithRange:NSMakeRange(0, endLocation)];
        UIImage *receivedImage = [UIImage imageWithData:imageData];
        if (receivedImage) {
            NSLog(@"_receivedData length: %lu", (unsigned long)[_receivedData length]);
            self.image = receivedImage;
            _receivedData = nil;
            _receivedData = [NSMutableData new];
        }
    }

    if (_shouldStop) {
        [connection cancel];
    }
}
_receivedData is an NSMutableData object which I try to "empty" once an image is retrieved from the stream. The code inside if (receivedImage) is called when it is supposed to be called. The length of the _receivedData object is also not increasing; it stays around the same size (~14k), so that part seems to work.
But somehow, with every didReceiveData: call, the memory my app uses increases, even when I disable the line self.image = receivedImage.
UPDATE
As iosengineer suggested, I have been playing with autorelease pools, but this does not solve the problem.
Using Instruments I found out that most of the allocations are done by CFNetwork, in the method HTTPBodyData::appendBytes(unsigned char const*, long). (This allocates 64KB at a time and keeps them alive).
The next step I'd take would be to analyse the request/response patterns using Charles, step through the source using Xcode, and probably write my own solution using NSURLSession and NSURLRequest.
Streams don't just create themselves - something is pulling in data from the responses and not getting rid of it fast enough.
Here's my guess on what is possibly happening:
When you download something using NSURLRequest, you create an instance of NSMutableData to collect the responses in chunks until you are ready to save it to disk. In this case, the stream never ends and so the store grows massive and then bails.
A custom solution to this would have to know when it's safe to ditch the store based on the end of a frame (for example). Good Luck! Instruments is your friend.
P.S. Beware of autoreleased memory - use autoreleasepools wisely
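For illustration, a minimal sketch of that frame-boundary idea (not code from this answer): a JPEG image ends with the end-of-image marker 0xFF 0xD9, which is presumably what the _endMarkerData in the code above holds. Here buffer is an assumed NSMutableData that accumulates the stream:

// Sketch: extract one complete MJPEG frame from an accumulating buffer, then drop
// the consumed bytes so the buffer never grows unbounded.
static UIImage *ExtractFrame(NSMutableData *buffer) {
    static const uint8_t kJPEGEndMarker[] = { 0xFF, 0xD9 };
    NSData *endMarker = [NSData dataWithBytes:kJPEGEndMarker length:sizeof(kJPEGEndMarker)];

    NSRange endRange = [buffer rangeOfData:endMarker
                                   options:0
                                     range:NSMakeRange(0, buffer.length)];
    if (endRange.location == NSNotFound) {
        return nil; // no complete frame yet
    }

    NSUInteger frameEnd = endRange.location + endRange.length;
    UIImage *frame = [UIImage imageWithData:[buffer subdataWithRange:NSMakeRange(0, frameEnd)]];
    [buffer replaceBytesInRange:NSMakeRange(0, frameEnd) withBytes:NULL length:0]; // discard consumed bytes
    return frame;
}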
In your revised question, the code sample shows a few objects that are created as autoreleased memory. The appropriate use of autorelease pools should fix this, and profiling the app with Instruments (the Allocations tool) should make it fairly straightforward to see which object is causing the most problems and whether your changes have solved them.
Of particular interest, the UIImage imageWithData: call should definitely be wrapped, as it creates a new image object every time.
Also, subdataWithRange: creates a new object which is only released once the pool is flushed.
I never use the "new" syntax for object creation, so I can't recall how it behaves here; I always use alloc/init.
Wrap MOST of this whole routine with this:
@autoreleasepool
{
ROUTINE
}
That way, the pool is drained after each chunk of data is received, mopping up any autoreleased objects.
I rewrote the MotionJpegImageView thing, which was causing all my problems:
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
    if (!_receivedData) {
        _receivedData = [NSMutableData new];
    }
    [_receivedData appendData:data];

    NSRange endRange = [_receivedData rangeOfData:_endMarkerData
                                          options:0
                                            range:NSMakeRange(0, _receivedData.length)];
    if (endRange.location == NSNotFound) {
        return;
    }

    @autoreleasepool {
        UIImage *receivedImage = [UIImage imageWithData:_receivedData];
        if (receivedImage) {
            self.image = receivedImage;
        }
        else {
            DDLogVerbose(@"Invalid image data");
        }
    }

    [_receivedData setLength:0];

    if (_shouldStop) {
        [connection cancel];
        DDLogVerbose(@"Should stop connection");
    }
}
Also, it turned out my connections were being opened multiple times because I was not correctly canceling the old ones before starting new ones. Pretty stupid mistake, but I'm mentioning it for anyone wondering how it all works; the code is above.
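For completeness, a minimal sketch of that cancel-before-restart pattern (not the poster's code; _connection is an assumed ivar holding the current NSURLConnection):

// Sketch: always cancel the previous connection before starting a new one,
// so stale connections don't keep streaming data in the background.
- (void)startStreamingWithRequest:(NSURLRequest *)request {
    [_connection cancel];          // safe even if _connection is nil
    _connection = nil;
    [_receivedData setLength:0];   // reset the frame buffer as well
    _connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];
}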

iOS NSXMLPARSER Parsing Media Content Tag Inside RSS Feed

I am using NSXMLParser to parse an RSS feed. Earlier I was using the image URL from the CDATA block, but now I have come across a feed that has the image URL inside
<media:content url="http://cdn.example.com/wp-content/uploads/2013/10/article-2462931-18C5CBC000000578-776_634x452.jpg" medium="image"/>
How do I pick the image URL from this tag? This is what I am trying to use inside the didStartElement: method, but it's not working:
if ([elementName isEqual:@"media:content"])
{
    currentString = [[NSMutableString alloc] init];
    // Basically I need the image url string at this point
    NSString *imageURLString = [self getFirstImageUrl:currentString];
    imageURL = [NSURL URLWithString:imageURLString];
    [self downloadThumbnails:imageURL];
}
How can I pick the image URL string from the media:content tag? Thanks!
I haven't received any answers, but after trying different things I was able to solve this issue. I am answering my own question so that anyone who faces the same problem can solve it without wasting as much time as I did.
In the didStartElement: method of NSXMLParser, I wrote the following:
if ([elementName isEqual:@"media:content"])
{
    NSString *imageURLString = [attributeDict objectForKey:@"url"];
    NSLog(@"imgURL %@", imageURLString);
    // Here you can use the imageURLString to download the image
}

iOS - ASIHTTPRequest - Trying to parse through responseString

I've spent about 16 hours researching and attempting different code changes, but cannot figure this one out. I have an iOS app that consumes a website using ASIHTTPRequest:
- (void)refresh {
    NSURL *url = [NSURL URLWithString:@"http://undignified.podbean.com/"];
    ASIHTTPRequest *request = [ASIHTTPRequest requestWithURL:url];
    [request startSynchronous];
    NSString *response = [request responseString];
    NSLog(@"%@", response);
}
The above code returns the website source and spits it out to the console via NSLog. The goal I'm trying to achieve is to search through the responseString for URLs ending in .mp3, load those into an array, and finally load the mp3 URLs into a UITableView.
To summarize:
Consuming website data with ASIHTTPRequest
Trying to search through the responseString for all links that have *.mp3 extensions and load them into an array.
Add parsed links to UITableView.
I think at this juncture I have attempted too many things to make any sound judgements. Any suggestions or nudges in the right direction would be greatly appreciated.
Below is an example of the response (HTML). Please note this is only a snippet, as the entire HTML source is rather large, but it includes a section with the mp3 files:
a href="http://undignified.podbean.com/mf/web/ih2x8r/UndignifiedShow01.mp3" target="new"><img src="http://www.podbean.com/wp-content/plugins/podpress/images/audio_mp3_button.png" border="0" align="top" class="podPress_imgicon" alt="icon for podbean" /></a> Standard Podcasts [00:40:24m]: <span id="podPressPlayerSpace_2649518_label_mp3Player_2649518_0">Play Now</span> | Play in Popup | Download | Embeddable Player | <a>Hits (214)</a><br/
I really don't like answers that say "Just don't do it that way." But... Just don't do it that way.
It is possible to extract all of the links to mp3s in the html from the address you posted. But this is almost always the wrong way to approach things, and this case is no exception.
Essentially what it seems you are trying to do is create a podcast client. You should give some thought to how others have handled this type of use case before. Generally a podcast will have an associated RSS feed that outlines exactly the data you are looking for, and your podcast is no exception. If one simply navigates to the link supplied in your question and then looks around the page for either "subscribe to podcast" or "RSS", they will find the link that leads to the RSS feed: http://undignified.podbean.com/feed/. This address leads to the XML which contains the items of the podcast.
This document, unlike the document returned by your original address, is valid XML, meaning it can be parsed with an NSXMLParser. NSXMLParser is very powerful and flexible, but a little hard to get started with. Here is some sample code for an NSXMLParser subclass which acts as its own delegate.
UnDigParser.h
#import <Foundation/Foundation.h>

@interface UnDigParser : NSXMLParser <NSXMLParserDelegate>
@property (readonly) NSArray *links;
@end
UnDigParser.m
#import "UnDigParser.h"
#implementation UnDigParser{
NSMutableArray *_links;
}
#synthesize links = _links;
-(void)parserDidStartDocument:(NSXMLParser *)parser{
_links = [[NSMutableArray alloc] init];
}
-(void)parser:(NSXMLParser *)parser didStartElement:(NSString *)elementName namespaceURI:(NSString *)namespaceURI qualifiedName:(NSString *)qName attributes:(NSDictionary *)attributeDict{
if ([elementName isEqualToString:#"enclosure"]){
NSString *link = [attributeDict objectForKey:#"url"];
if (link){
[_links addObject:link];
}
}
}
-(BOOL)parse{
self.delegate = self;
return [super parse];
}
#end
This code can be tested like so:
dispatch_async(dispatch_get_global_queue(0, 0), ^{
    NSURL *url = [NSURL URLWithString:@"http://undignified.podbean.com/feed/"];
    // This is a sync call, thus the background thread
    UnDigParser *parser = [[UnDigParser alloc] initWithContentsOfURL:url];
    [parser parse];
    NSLog(@"links:%@", parser.links);
});
The logged output was:
links:(
"http://undignified.podbean.com/mf/feed/cfdayc/UndignifiedShow03.mp3",
"http://undignified.podbean.com/mf/feed/wbbpjw/UndignifiedShow02Final.mp3",
"http://undignified.podbean.com/mf/feed/ih2x8r/UndignifiedShow01.mp3",
"http://undignified.podbean.com/mf/feed/t4x54d/UndignifiedShow00.mp3"
)
That should be enough to get you started.
I would do this in two steps:
1. Get all the href values.
2. Use a regular expression to check for a valid URL ending in .mp3 (a sketch of this step follows below).
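For illustration, a minimal sketch of step 2 with NSRegularExpression; html stands for the response string, and the pattern is a rough assumption that expects double-quoted href attributes:

// Sketch: pull every href that ends in .mp3 out of an HTML string.
// The pattern is deliberately simple and may need tightening for real-world markup.
NSString *pattern = @"href=\"([^\"]+\\.mp3)\"";
NSError *error = nil;
NSRegularExpression *regex = [NSRegularExpression regularExpressionWithPattern:pattern
                                                                        options:NSRegularExpressionCaseInsensitive
                                                                          error:&error];
NSMutableArray *mp3Links = [NSMutableArray array];
[regex enumerateMatchesInString:html
                        options:0
                          range:NSMakeRange(0, html.length)
                     usingBlock:^(NSTextCheckingResult *result, NSMatchingFlags flags, BOOL *stop) {
    // Capture group 1 is the URL itself, without the surrounding href="..."
    [mp3Links addObject:[html substringWithRange:[result rangeAtIndex:1]]];
}];
NSLog(@"Found %lu mp3 links: %@", (unsigned long)mp3Links.count, mp3Links);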
BTW, I think ASIHTTPRequest is getting outdated, and if you only use it to fetch HTML, I suggest you look into the built-in methods of the iOS frameworks to do just that.
Another approach: use requestDidReceiveResponseHeadersSelector and get the file name from the response headers; try printing every key and value of the headers NSDictionary to see what is available.

library for iOS to split huge binary files by certain sizes?

I am looking for a library for splitting (dividing) a binary file into multiple files.
If there is 20MB size of file named "test.m4v" in iOS temporary folder (NSTemporaryDirectory()),
I would like to split that to
test.m4v.000 (7MB)
test.m4v.001 (7MB)
test.m4v.002 (6MB)
Something like that (it doesn't have to be 7 MB; it could be 5 MB, for example),
like the command-line split command. I don't think we can call that command from inside an iOS app.
Is there an iOS (free/paid) library to do that? I might just need low-level file access and to write it myself, but I am too lazy to do that ;)
This should work assuming the file isn't so large that it freaks out at dataWithContentsOfFile:filename. iOS might do caching in the background, but I don't know.
- (NSUInteger)splitFile:(NSString *)filename chunkSize:(NSUInteger)chunkSize {
    NSUInteger chunksWritten;
    NSFileManager *fm = [[[NSFileManager alloc] init] autorelease];
    NSData *fileData = [NSData dataWithContentsOfFile:filename];
    NSString *newFileName;
    NSRange dataRange;

    for (chunksWritten = 0; chunksWritten * chunkSize < [fileData length]; chunksWritten++) {
        newFileName = [filename stringByAppendingPathExtension:[NSString stringWithFormat:@"%03lu", (unsigned long)chunksWritten]];
        dataRange = NSMakeRange(chunksWritten * chunkSize, MIN(chunkSize, [fileData length] - chunksWritten * chunkSize));
        if (![fm createFileAtPath:newFileName contents:[fileData subdataWithRange:dataRange] attributes:nil]) {
            NSLog(@"Error writing chunk #%lu", (unsigned long)chunksWritten);
            break;
        }
    }
    return chunksWritten;
}
The error checking obviously needs to be more robust.
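If the source file is too large to load in one go (the caveat above), a chunked variant with NSFileHandle avoids reading the whole file into memory. A minimal sketch, assuming ARC:

// Sketch: split a file into numbered chunks without loading it all into memory.
// Reads chunkSize bytes at a time through an NSFileHandle (ARC assumed).
- (NSUInteger)splitLargeFile:(NSString *)filename chunkSize:(NSUInteger)chunkSize {
    NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:filename];
    if (!handle) {
        return 0;
    }

    NSUInteger chunksWritten = 0;
    NSData *chunk;
    while ((chunk = [handle readDataOfLength:chunkSize]).length > 0) {
        NSString *chunkName = [filename stringByAppendingPathExtension:
                                  [NSString stringWithFormat:@"%03lu", (unsigned long)chunksWritten]];
        if (![[NSFileManager defaultManager] createFileAtPath:chunkName contents:chunk attributes:nil]) {
            NSLog(@"Error writing chunk #%lu", (unsigned long)chunksWritten);
            break;
        }
        chunksWritten++;
    }
    [handle closeFile];
    return chunksWritten;
}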
