I am attempting to load image files into an NSString, but they all come up nil with this code:
NSString *path = [[NSBundle mainBundle] pathForResource:[NSString stringWithUTF8String:name.data()] ofType:nil];
NSString *da = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:nil];
I am able to load many other files, but all JPEG and PNG files fail for some reason. I thought it might have something to do with the encoding, so I switched to the usedEncoding: variant, but it still didn't work.
What am I missing?
EDIT:
I have been building a cross-platform (iOS/Android) OpenGL graphics library in C++. Everything works except texture loading. All file loading from disk goes through one function that is abstracted per platform. I need the image file in an STL string so that I can pass it to an image-parsing library and get the raw pixel data.
I just think it's ridiculous that the function I have can open any file except images.
If you run your code, passing an NSError instance instead of nil,
NSError *error = nil;
NSString *string = [NSString stringWithContentsOfFile:filePath
encoding:NSUTF8StringEncoding
error:&error];
you will see that stringWithContentsOfFile: cannot open the image file: it returns nil, and the error given is:
Error Domain=NSCocoaErrorDomain Code=261 "The operation couldn’t be completed. (Cocoa error 261.)"...
Cocoa error 261 is NSFileReadInapplicableStringEncodingError, which means the file's encoding is different from the one you are passing (NSUTF8StringEncoding). I have tried the other encodings, and none of them works for PNG files, because an image file is binary data, not text in any encoding.
You can still achieve what you want by loading the file as a UIImage and then converting the UIImage into a Base64 string.
Since iOS 7, this is easier because you can use the built in method base64EncodedStringWithOptions:
// Load the image and convert it to NSData
UIImage *image = [UIImage imageNamed:@"yourImageName"];
NSData *imageData = UIImagePNGRepresentation(image);
// You can use the equivalent UIImageJPEGRepresentation() for JPEG images
// Convert NSData to a Base64 NSString
NSString *base64ImageString = [imageData base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
Prior to iOS 7, you can do the exact same thing, but you will have to implement your own Base64 encoding method (or import one of the many already available, e.g. nicklockwood/Base64).
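Separately, if the end goal is just to hand the raw file bytes to a C++ image decoder as an std::string, reading the file with NSData sidesteps text encodings entirely, since NSData makes no assumptions about the contents. Here is a minimal Objective-C++ sketch of that idea (the function name loadFileAsString is illustrative, not from your code):
#import <Foundation/Foundation.h>
#include <string>

// Reads any bundle resource as raw bytes into an STL string (Objective-C++, .mm file).
std::string loadFileAsString(const std::string &name)
{
    NSString *path = [[NSBundle mainBundle] pathForResource:[NSString stringWithUTF8String:name.c_str()]
                                                     ofType:nil];
    if (path == nil) {
        return std::string(); // resource not found in the bundle
    }

    // NSData is encoding-agnostic, so PNG and JPEG files load just like text files.
    NSData *data = [NSData dataWithContentsOfFile:path];
    if (data == nil) {
        return std::string();
    }

    // Copy the raw bytes; the resulting string may contain embedded NUL bytes.
    return std::string((const char *)data.bytes, (size_t)data.length);
}
The returned string can then be passed straight to the image-parsing library.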
Related
I am trying to read in Arabic text contained in a .doc file and use it in my app. Unfortunately, the only way I am able to retrieve the text is to convert the document into a .txt file.
Here is the code I have:
NSError *error = nil;
NSString *path = @"MyArabicDocument";
NSString *root = [[NSBundle mainBundle] pathForResource:path ofType:@"doc"];
NSString *myFile = [NSString stringWithContentsOfFile:root encoding:NSUTF8StringEncoding error:&error];
NSLog(@"my file contents are: %@", myFile);
NSLog(@"error is: %@", error);
The output of my NSString object is (null), and the error I get is:
error is: Error Domain=NSCocoaErrorDomain Code=256 "The operation couldn’t be completed. (Cocoa error 256.)" UserInfo=0x7aace470 {NSFilePath=/Users/MyName/Library/Developer/CoreSimulator/Devices/.../data/Containers/Bundle/Application/..MyApp.app/MyArabicDocument.doc}
If I convert my document into an .rtf format, then my output (after changing the extension in the above block of code) is the following:
my file contents are: {\rtf1\ansi\ansicpg1252\cocoartf1347\cocoasubrtf570
{\fonttbl\f0\fnil\fcharset0 LucidaGrande;\f1\fnil\fcharset178 AlBayan;\f2\fnil\fcharset178 GeezaPro;
}
{\colortbl;\red255\green255\blue255;}
\vieww10800\viewh8400\viewkind0
\deftab709
\pard\pardeftab709\pardirnatural
\f0\fs46 \cf0 1
\f1 - \'de\'f3\'dc\'c7\'e1\'f3 \'c7\'c8\'fa\'dc\'e4\'f5 \'c2\'c8\'f3\'f8 \'e6\'f3\'c7\'d3\'fa\'e3\'f5\'dc\'e5\'f5 \'e3\'f5\'cd\'f3\'e3\'f3\'f8\'dc\'cf\'f5
\f0 ~~~
\f1 \'c7\'e1\'e1\'e5\'f3 \'dd\'f6\'dc\'ed \'df\'f5\'dc\'e1\'f6\'f8 \'c7\'e1\'c3\'f5\'e3\'f5\'dc\'e6\'d1\'f6 \'c3\'f3\'cd\'fa\'dc\'e3\'f3\'dc\'cf\'f5 \
...
If I try to use an NSAttributedString object instead of an NSString object, I still get a (null) value for my NSAttributedString object:
NSDictionary *attrs = @{NSDocumentTypeDocumentAttribute: NSRTFTextDocumentType, NSWritingDirectionAttributeName: @[@(NSWritingDirectionRightToLeft | NSTextWritingDirectionOverride)]};
NSAttributedString *text = [[NSAttributedString alloc] initWithFileURL:[[NSBundle mainBundle] URLForResource:@"MyArabicDocument" withExtension:@"doc"] options:attrs documentAttributes:nil error:&error];
The reason this matters is that while my Arabic text does appear in my UITextView, its appearance is nowhere near as nice as in the original document, which is what I would like to maintain in my app. Is this not possible?
The .doc file in question is in a binary format (much like .docx, which is a compressed container):
http://en.wikipedia.org/wiki/Doc_(computing)
So you cannot put it into an NSString as-is, but you can get an NSData:
NSString *path = [[NSBundle mainBundle] pathForResource:@"MyArabicDocument" ofType:@"doc"];
NSData *data = [NSData dataWithContentsOfFile:path];
Unfortunately, you cannot make an NSAttributedString from a .doc file on iOS, though you can on OS X (on iOS only four document types are supported):
NSError *attrError;
NSDictionary *options = @{NSDocumentTypeDocumentAttribute: NSDocFormatTextDocumentType};
NSAttributedString *content = [[NSAttributedString alloc] initWithData:data options:options documentAttributes:nil error:&attrError];
Instead, you can try loading your .doc file into a UIWebView.
Using NSData:
[self.webView loadData:data MIMEType:@"application/msword" textEncodingName:@"UTF-8" baseURL:nil];
But I think it is better to use an NSURLRequest (since you don't need to set up the encoding there):
NSURL *url = [NSURL fileURLWithPath:path];
NSURLRequest *request = [NSURLRequest requestWithURL:url];
[webView loadRequest:request];
NOTE: Any method you choose will very likely break your formatting, i.e. the rendered document will look corrupted. Instead, I recommend converting the .doc to .pdf; in that case it will look good.
For example, the Dropbox app for iOS definitely converts .doc/.docx to PDF and then presents it to the user as a PDF (without, of course, telling the user that it is actually a PDF).
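Alternatively, since RTF is one of the document types UIKit's NSAttributedString can read (iOS 7 and later), exporting the document as .rtf and loading it with initWithData:options:documentAttributes:error: should preserve the original formatting. A hedged sketch, assuming the exported file is named MyArabicDocument.rtf:
NSError *rtfError = nil;
NSString *rtfPath = [[NSBundle mainBundle] pathForResource:@"MyArabicDocument" ofType:@"rtf"];
NSData *rtfData = [NSData dataWithContentsOfFile:rtfPath];

// NSRTFTextDocumentType tells the parser to treat the bytes as RTF rather than plain text,
// so fonts and right-to-left runs survive instead of showing up as raw \rtf1 markup.
NSDictionary *rtfOptions = @{NSDocumentTypeDocumentAttribute: NSRTFTextDocumentType};
NSAttributedString *styledText = [[NSAttributedString alloc] initWithData:rtfData
                                                                   options:rtfOptions
                                                        documentAttributes:nil
                                                                     error:&rtfError];
// textView.attributedText = styledText;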
I think you have an encoding issue when reading the file. Refer to the link below:
https://developer.apple.com/library/ios/documentation/Cocoa/Conceptual/Strings/Articles/readingFiles.html
Maybe it will solve your problem. Best of luck!
The only other information I could find on this error was here, which wasn't helpful.
I get the following error when I try to save images. This seems to happen only when I have several images (~6) at once. It also seems to be completely random as to when it occurs: sometimes everything is fine, sometimes it fails on 1 image, sometimes 3, and sometimes the app crashes outright with an EXC_BAD_ACCESS error.
Error: ImageIO: CGImageReadGetBytesAtOffset : ^^^ ERROR ^^^ CGImageSource was created with data size: 1144891 - current size is only: 1003855
Here is the code that saves the image:
- (void)saveWithImage:(UIImage *)anImage andFileName:(NSString *)aFileName {
    NSString *subDirectory = @"Images";
    NSString *fileName = [aFileName stringByAppendingString:@".png"];
    NSString *documentsPath = [[CMAStorageManager sharedManager] documentsSubDirectory:subDirectory].path;
    NSString *imagePath = [subDirectory stringByAppendingPathComponent:fileName];

    __block NSString *path = [documentsPath stringByAppendingPathComponent:fileName];
    __block NSData *data = UIImagePNGRepresentation(anImage);

    self.image = anImage;
    self.tableCellImage = anImage;
    self.galleryCellImage = anImage;
    self.imagePath = imagePath; // stored path has to be relative, not absolute (iOS 8 changes the app's UUID every run)

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
        if (![data writeToFile:path atomically:YES])
            NSLog(@"Error saving image to path: %@", path);

        dispatch_async(dispatch_get_main_queue(), ^{
        });
    });
}
I get that error and as a result my images aren't saved (or only half of them are saved), which completely messes up the UI display and any subsequent app launches. I've narrowed it down to the UIImagePNGRepresentation call.
On a related note, that code locks up the UI, I think because of the UIImagePNGRepresentation call; however, as far as I know UIImagePNGRepresentation is not thread safe, so I can't move it to the background. Does anyone know a way around this?
Thanks!
In case anyone comes across a similar issue, this is what fixed it for me.
iPhone iOS saving data obtained from UIImageJPEGRepresentation() fails second time: ImageIO: CGImageRead_mapData 'open' failed
and this is the solution I used to save UIImages in a background thread:
Convert UIImage to NSData without using UIImagePngrepresentation or UIImageJpegRepresentation
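One way to do what that linked answer describes, getting PNG bytes without UIImagePNGRepresentation, is to write the CGImage with ImageIO. A rough sketch of that approach (not the linked answer's exact code; the method name is mine):
#import <ImageIO/ImageIO.h>
#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypePNG

// Writes a UIImage to disk as PNG via ImageIO; this can be called from a background queue.
- (BOOL)writeImage:(UIImage *)image toPath:(NSString *)path {
    NSURL *url = [NSURL fileURLWithPath:path];
    CGImageDestinationRef destination =
        CGImageDestinationCreateWithURL((__bridge CFURLRef)url, kUTTypePNG, 1, NULL);
    if (destination == NULL) {
        return NO;
    }
    CGImageDestinationAddImage(destination, image.CGImage, NULL);
    BOOL success = CGImageDestinationFinalize(destination);
    CFRelease(destination);
    return success;
}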
I was getting this error because the API I called was returning a 500 and writing an HTML error page into the tmp file (the NSData) I was trying to convert to a PNG.
You may wish to check that the file you're trying to open is an image at all before converting it.
If you're downloading the file with a downloadTask, check that the statusCode is 200 before moving the file from tmp into the Documents directory as a .png.
Here's my answer in another SO post.
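As a sketch of that check (the URLs here are placeholders), the status code is available on the NSHTTPURLResponse handed to the download task's completion handler:
// Placeholder URLs — substitute the real download URL and destination file name.
NSURL *imageURL = [NSURL URLWithString:@"https://example.com/image.png"];
NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) firstObject];
NSURL *destinationURL = [NSURL fileURLWithPath:[documents stringByAppendingPathComponent:@"image.png"]];

NSURLSessionDownloadTask *task = [[NSURLSession sharedSession]
    downloadTaskWithURL:imageURL
      completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
          NSHTTPURLResponse *httpResponse = (NSHTTPURLResponse *)response;
          if (error == nil && httpResponse.statusCode == 200) {
              // Only promote the temporary file if the server actually sent the image.
              NSError *moveError = nil;
              [[NSFileManager defaultManager] moveItemAtURL:location
                                                      toURL:destinationURL
                                                      error:&moveError];
          } else {
              NSLog(@"Download failed, status %ld: %@", (long)httpResponse.statusCode, error);
          }
      }];
[task resume];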
I'm trying to display a BLOB image (fetched from a web server via JSON) in my iOS app, but when I run my application I get an empty UIImageView. Here is my code:
NSData *dataURL = [NSData dataWithContentsOfURL:[NSURL URLWithString:encodedUrl]];
NSData *profileImage1 = [[NSData alloc] initWithBytes:[dataURL bytes] length:[dataURL length]];
UIImage *profileImage2 = [UIImage imageWithData:profileImage1];
[profilImage setImage:profileImage2];
How can I fix this problem?
[UIImage imageWithData:data] only parses known image file formats such as JPEG, PNG, etc. (full details in the documentation). Passing raw blobs isn't supported by UIImage, so you need to decode the data before you can use it for the UIImage. You can use GTMBase64 for encoding and decoding the data; read its docs and other posts and you'll see how to change your code.
Hope this helps.
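To see what the server is actually returning before handing it to UIImage, it helps to inspect the first bytes: a real PNG starts with 0x89 'P' 'N' 'G' and a JPEG with 0xFF 0xD8, whereas a Base64-encoded blob (or an HTML error page) is plain ASCII. A small diagnostic sketch, reusing dataURL and profilImage from the question's code:
const unsigned char *bytes = [dataURL bytes];
if (dataURL.length >= 4 && bytes[0] == 0x89 && bytes[1] == 'P' && bytes[2] == 'N' && bytes[3] == 'G') {
    // Already a PNG — imageWithData: should work directly.
    [profilImage setImage:[UIImage imageWithData:dataURL]];
} else if (dataURL.length >= 2 && bytes[0] == 0xFF && bytes[1] == 0xD8) {
    // Already a JPEG — imageWithData: should work directly.
    [profilImage setImage:[UIImage imageWithData:dataURL]];
} else {
    // Probably Base64 text; decode it first (built in since iOS 7), then build the image.
    NSString *asText = [[NSString alloc] initWithData:dataURL encoding:NSUTF8StringEncoding];
    NSData *decoded = [[NSData alloc] initWithBase64EncodedString:asText
                                                          options:NSDataBase64DecodingIgnoreUnknownCharacters];
    [profilImage setImage:[UIImage imageWithData:decoded]];
}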
I have developed an iOS 7 app. I use UIImagePickerController to get a UIImage. During the usage of the app, an image is stored to the local app directory and loaded from it. The file is overwritten several times. I use UIImagePNGRepresentation and NSData's writeToFile: to store the picture. To load the image I use UIImage's initWithContentsOfFile:.
It works fine for several store and load cycles. With the debugger I can confirm that the picture is stored and loaded correctly.
But after some time the stored file no longer contains a picture. When I step through the load procedure with the debugger, I can see that a picture is loaded, but it seems to be completely transparent, with no other information.
This phenomenon only occurs on a device, not in the simulator. Furthermore, the effect seems to occur only after the app is kicked out of memory (after a double tap of the home button).
I have debugged the app for hours but I cannot figure out the reason for this behaviour. It is also difficult to debug because of the complete termination of the app. Does anybody know a solution, or has anyone faced the same problem? Could anybody give me a hint as to what to debug to track down the problem?
Thanks in advance. Any help is appreciated. :)
Thanks for your answers. Here is a code excerpt:
Code to store image:
Generate file path in documents directory
NSString *name = [NSString stringWithFormat:@"ImageforButton%i.png", i];
NSArray *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *file = [[path objectAtIndex:0] stringByAppendingPathComponent:name];
Create NSData from UIImage (image is a UIImage that always has the correct image data)
NSData *picData = UIImagePNGRepresentation(image);
Write NSData to file
[picData writeToFile:file atomically:true];
Code to load image:
Generate file path
NSString *name = [NSString stringWithFormat:@"ImageforButton%i.png", i];
NSArray *path = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *file = [[path objectAtIndex:0] stringByAppendingPathComponent:name];
Load picture
UIImage *img;
if ([[NSFileManager defaultManager] fileExistsAtPath:file isDirectory:false] == true)
{
img = [UIImage imageWithContentsOfFile:[self saveFilePath:name]];
}
img does not always contain the right data. Before the app is completely terminated, img contains the expected picture. After the app has been completely terminated, a valid UIImage can still be loaded into img, but it seems to contain only a transparent picture. In short, it no longer contains the right picture.
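One debugging suggestion (an assumption, not a confirmed fix): writeToFile:atomically: fails silently by returning NO, so switching the save to the error-returning variant makes it obvious whether the PNG ever made it to disk, for example when the app is terminated while a write is still pending:
NSError *writeError = nil;
if (![picData writeToFile:file options:NSDataWritingAtomic error:&writeError]) {
    // Surfaces the real reason the PNG was not written.
    NSLog(@"Failed to write %@: %@", file, writeError);
}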
How do I get image data out of JSON returned by a WCF Data Service?
I've got a WCF Data Service serving some objects that include binary data that is an image (the column type in SQL Server is image). It comes across as text gibberish in the JSON. In the iOS client I'm writing for this service, I want to display this data as an image ([UIImage imageWithData:myData]).
Here is what I'm doing:
NSData *networkData = nil; //assume this magically gets data from the service
NSError *err = nil;
//assume the returned JSON has one object and is not an array
NSDictionary *dict = [NSJSONSerialization JSONObjectWithData:networkData
options:NSJSONReadingAllowFragments error:&err];
//handle error if needed
NSString *imageString = [dict objectForKey:@"Image"];
NSData *imageData = [imageString dataUsingEncoding:NSUTF8StringEncoding];
///The service is using UTF-8.
UIImage *image = [UIImage imageWithData:imageData];
Putting that image in a UIImageView doesn't show anything. What am I doing wrong?
It turns out that binary data returned by WCF Data Services is Base64-encoded. I'm using the NSData-Base64 category to parse it, and it's working great. So the solution looks like this:
NSData *imageData = [NSData dataFromBase64String:imageString];
When I imported those files into my project, ARC threw up some errors, but I was able to resolve them by removing a couple of autorelease calls in the .m file.
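On iOS 7 and later the category isn't strictly needed, since NSData can decode Base64 directly; a small sketch of the same step with the built-in initializer:
// imageString is the Base64 field pulled out of the JSON dictionary, as above.
NSData *imageData = [[NSData alloc] initWithBase64EncodedString:imageString
                                                         options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *image = [UIImage imageWithData:imageData];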