How do I convert NSData to a PDF?

I have PDF files stored on Parse.com, and I want to download them and set them as images. I have googled around trying to find out how to do this, but I'm still clueless. I have my Parse object downloading successfully; the PDF file is stored in the field "image":
//DOWNLOAD IMAGE CODE
PFFile *image = object[@"image"];
[image getDataInBackgroundWithBlock:^(NSData *data, NSError *error) {
    // we have data, now we want to convert it to a UIImage
}];
I just have no idea what to do with the data. Can someone please give me some pointers? Thanks.

I haven't done this, but I think what you need to do is first create a data provider out of your data using a call like CGDataProviderCreateWithCFData(), then use the data provider to create a PDF object using CGPDFDocumentCreateWithProvider().
These are Core Foundation calls, so you need to do manual memory management of the CF objects (ARC doesn't manage CF objects).
You should be able to Google around using those terms and find code to do what you want.
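For illustration, a rough sketch of that approach (untested; assumes data is the NSData from the download block above):

#import <CoreGraphics/CoreGraphics.h>

// Wrap the NSData in a data provider, then build a PDF document from it.
CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);
CGPDFDocumentRef document = CGPDFDocumentCreateWithProvider(provider);
CGDataProviderRelease(provider); // ARC doesn't manage CF objects

size_t pageCount = CGPDFDocumentGetNumberOfPages(document);
// ... render pages with CGContextDrawPDFPage() if you need images ...
CGPDFDocumentRelease(document);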

Are the files stored on Parse as images or PDFs?
It sounds to me like you've got actual PDF files on Parse. You can't just convert them to an image, as they aren't images.
If they're PDF files, you can save them to disk and then view them with a QLPreviewController (Quick Look):
https://developer.apple.com/library/prerelease/ios/documentation/NetworkingInternet/Reference/QLPreviewController_Class/index.html#//apple_ref/occ/instp/QLPreviewController/currentPreviewItem
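For illustration, a rough sketch of that approach (untested; assumes data is the downloaded NSData and that self implements QLPreviewControllerDataSource, returning 1 item and [NSURL fileURLWithPath:path] as the preview item, since NSURL adopts QLPreviewItem):

#import <QuickLook/QuickLook.h>

// Save the PDF data to disk...
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"download.pdf"];
[data writeToFile:path atomically:YES];

// ...then present it with Quick Look.
QLPreviewController *preview = [[QLPreviewController alloc] init];
preview.dataSource = self;
[self presentViewController:preview animated:YES completion:nil];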
If they're actually images then you simply use:
[UIImage imageWithData:data]

Related

How do I access image data in Xcode iOS app developer?

I am VERY new to Xcode and even to iOS apps/platforms in general. I have a lot of image-processing experience on other development platforms and am looking to apply it to iOS apps. I have noticed that nothing is mentioned for Xcode in regards to accessing image data and/or directly modifying it. Many people who have made tutorials seem to use an image picker, but never have I seen them say or show how to access the image data.
An answer to this would be great. Guidance would be most appreciated. Thanks.
UIImage provides the ability to display images from several different formats, but it's immutable -- you can't change the data it uses and expect to see the image change. Instead, you'll get an image in one of the underlying types and work with that. You'll probably want to read up on both Core Graphics (Quartz) and Core Image. For example, you can use UIImage's -CGImage method to get the image as a CGImage and then use the Core Graphics CGImage...() functions to find out about the image data (format, bit depth, etc) and get the actual bits.
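For illustration, a rough sketch of that path (untested; assumes image is a UIImage you already have):

// Get the underlying CGImage and ask Core Graphics about its format.
CGImageRef cgImage = image.CGImage;
size_t width = CGImageGetWidth(cgImage);
size_t height = CGImageGetHeight(cgImage);
size_t bitsPerComponent = CGImageGetBitsPerComponent(cgImage);

// Copy out the raw pixel bytes.
CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
const UInt8 *bytes = CFDataGetBytePtr(pixelData);
// ... read or process the bytes here ...
CFRelease(pixelData);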
ALAssetsLibrary can help you get at and modify the image data.
First define the result block, then request the asset:
ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset *photoAsset)
{
    ALAssetRepresentation *imageRepresentation = [photoAsset defaultRepresentation];
    // Do the stuff to get/modify image data from imageRepresentation...
};

ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
[assetsLibrary assetForURL:photoUrl
               resultBlock:resultBlock
              failureBlock:nil];
Using the above code, you will get the image's asset in the result block.
Hope this is what you are looking for.

RestKit CoreData and UIImage

I'm using RestKit with Core Data; one of the Core Data entities has an attribute 'image' of binary type.
I'm still in mockup stage so the image is populated with this code:
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:@"http://lorempixel.com/60/60/people"]]];
entry.image = UIImagePNGRepresentation(image);
Another tab has a collection view that uses a fetchedResultsController.
After creating a new entity, if I only save the context, the image works fine.
But if I push the entity to the web server using 'postObject:', the image is corrupted when it comes back from the server. I've confirmed the server receives the same string representation of the image "<2f396a2f 34414151 536b5a4a 52674142 ... 6a6e502f 32513d3d>" and stores it directly in a MySQL column of type long blob, and at all points the string representation is the same.
But when the collection view is populated using a server call via RestKit, the entity's image is invalid. I think the issue is that the data is being converted into the data representation of the description of the data.
Does anyone have a working example with images? The only thing I can think of is that I need to add a custom transformation, but the documentation and examples are lacking as far as how to actually implement one.
RestKit is storing the plain NSData for the image in Core Data - it has no idea what else you might want to do with it. Generally you don't want to manage images directly in Core Data or using RestKit.
Generally, store the path of the image in Core Data and the file on disk. Download them asynchronously (from the URLs, which would also be in Core Data).
For uploading, you could make RestKit upload the data, but you probably actually want to do a file upload or convert to base64. You will need to write some code for this (which you could have RestKit pick up by using, as the mapping key, the name of a method that returns the appropriate data). A similar process will work for mapping the data in.
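For illustration, a rough sketch of the base64 route (the imageData attribute and imageBase64 method names here are hypothetical):

// On your managed object: expose the raw image bytes as a base64 string,
// so the mapping can treat it as a plain string attribute.
- (NSString *)imageBase64 {
    return [self.imageData base64EncodedStringWithOptions:0];
}

// Mapping the data back in: decode the string into NSData.
NSData *imageData = [[NSData alloc] initWithBase64EncodedString:base64String options:0];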
RestKit data transformers are hard to make work in this situation: you are converting between data and strings, and the transformers are too general to intercept accurately.

UIImage from NSInputStream

I'm downloading a 2400x1600 image from Parse, and I don't want to hold all that data in memory at once. The PFFile object from Parse has a convenient method to get the NSData as an NSInputStream, so when the data is finally downloaded I end up with an NSInputStream.
So now I want to use that NSInputStream to get my UIImage. It should work like creating a UIImage with the contents-of-file method, i.e. without the whole image being loaded into memory at once.
I think that writing the NSInputStream to a file and then using UIImage's contents-of-file method should work fine in my case, but I have found no way to write to a file from an NSInputStream.
Any code example or some guideline would be really appreciated.
Thanks in advance.
To accomplish this you can set up an NSOutputStream and stream the received data to a file. Create your output stream using initToFileAtPath:append: with YES for append. In your input stream callback, pass the data to your output stream by calling write:maxLength: (read more in the docs). Once the stream is complete, you have the full image on file without ever having had it fully in memory.
Henri's answer above is more appropriate since you're using Parse, but this is the general solution.
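For illustration, a rough sketch of that general solution (untested; assumes input is the NSInputStream and path is where the file should go):

// Copy the input stream to a file in fixed-size chunks, so the whole
// image is never held in memory at once.
NSOutputStream *output = [NSOutputStream outputStreamToFileAtPath:path append:YES];
[input open];
[output open];

uint8_t buffer[4096];
NSInteger bytesRead;
while ((bytesRead = [input read:buffer maxLength:sizeof(buffer)]) > 0) {
    [output write:buffer maxLength:bytesRead];
}

[output close];
[input close];
// Later, load it lazily with [UIImage imageWithContentsOfFile:path].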
Parse gives an example of this in its iOS/OS X documentation.
Retrieving the image back involves calling one of the getData variants on the PFFile. Here we retrieve the image file off another UserPhoto named anotherPhoto:
PFFile *userImageFile = anotherPhoto[@"imageFile"];
[userImageFile getDataInBackgroundWithBlock:^(NSData *imageData, NSError *error) {
    if (!error) {
        UIImage *image = [UIImage imageWithData:imageData];
    }
}];
Now, I don't quite see the reason for you to use NSInputStream, mainly for two reasons:
NSInputStream is meant for reading data in, not for handing around data you already have.
NSInputStream is meant for streaming, i.e. for scenarios in which you want to do something with the data as it comes in; from your description it seems you only ever care about the data once the download has completed.
In short, you should use the aforementioned approach unless you truly care about how the data is loaded in, for example because you want to manipulate it as it arrives (highly unlikely in the case you describe).
As for having it all in memory at once: the dimensions you give are not that large. Yes, you could stream it into a file, but assuming you want to show the image full-size in the app, the memory problem would appear at some point anyway, i.e. you would just be postponing the inevitable. If that is not the case (you're not showing it full-size), then it might be a good idea to chop the source image up into tiles and use those instead; it is far quicker to download specific tiles, and easier on memory.

How to upload image in Sqlite Database Browser?

I am making an iOS application in which I will be using Core Data, and I want to save an image in the database. Can anyone tell me how to upload an image in Sqlite Database Browser?
To store an image in the SQLite database using Core Data, you need to convert the image to data, for example:
NSData *photo_data = UIImagePNGRepresentation(aImage);
Now you can store this photo_data. In your Core Data model, create an attribute of type "Binary Data" and use this attribute when storing the image data.
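For illustration, a rough sketch of that last step (the Photo entity and imageData attribute names are hypothetical):

// Create a managed object and store the PNG bytes in its Binary Data attribute.
Photo *photo = [NSEntityDescription insertNewObjectForEntityForName:@"Photo"
                                             inManagedObjectContext:context];
photo.imageData = photo_data;

NSError *error = nil;
[context save:&error];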
You can't just slap an image file into SQLite. You need to transform the image into a format such as base64, or something else that can be stored; when you want to retrieve the image, you must transform it back into its original form. For most cases, all you need to do is store the path to the image, not the image itself.
You can use a blob object in SQLite. First of all, transform your image to NSData:
NSData *imageData = UIImagePNGRepresentation(im);
Then bind your data as a blob object, such as:
sqlite3_bind_blob(statement, .., [imageData bytes], (int)[imageData length], SQLITE_TRANSIENT);
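For completeness, a rough sketch of reading the blob back (untested; assumes a stepped SELECT statement with the blob in column 0):

// Pull the raw bytes out of the result row and rebuild the UIImage.
const void *blobBytes = sqlite3_column_blob(statement, 0);
int blobLength = sqlite3_column_bytes(statement, 0);
NSData *imageData = [NSData dataWithBytes:blobBytes length:blobLength];
UIImage *image = [UIImage imageWithData:imageData];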

I want to convert AlAsset object to nsstring to store it in database

I am trying to store a copy of the data in an ALAsset in my database, as the URL of an image retrieved from my photo library. To do this, I'd like to convert it into an NSString, but I'm not sure how.
My intent is to later pull the URL from the database and load the ALAsset, or use the NSString path, then use it to load a UIImage. Any help or suggestions would be greatly appreciated.
NSLog(#"path : %#", [asset valueForProperty:ALAssetPropertyAssetURL]);
And refer to this answer of mine for how to get that image back using this URL: name of the picked image xcode
