I have generated an image in my app and I would like to share it via UIDocumentInteractionController, which takes its argument as an NSURL. Of course I don't want to directly convert the image data into a URL, so what is the best way to achieve this? Can I temporarily store the image somewhere, get its URL, and then delete it after UIDocumentInteractionController has done its job? Or convert it to NSData somehow and then get the URL of the NSData object? (I tried this and failed, btw.)
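One common approach is exactly what you describe: write the image to a temporary file, hand that file URL to UIDocumentInteractionController, and remove the file once the controller is done. A minimal sketch (inside a sharing method on a view controller; the file name and PNG encoding are arbitrary choices):

// Write the generated UIImage to a temporary file so we have a real file NSURL to share.
let path = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("share.png")
guard let data = UIImagePNGRepresentation(image) where data.writeToFile(path, atomically: true) else {
    return
}
let fileURL = NSURL(fileURLWithPath: path)

// Keep a strong reference to the controller (e.g. a property) and present the share options.
let controller = UIDocumentInteractionController(URL: fileURL)
controller.delegate = self
controller.presentOptionsMenuFromRect(view.bounds, inView: view, animated: true)

// Clean up in the delegate callback once the controller is done, e.g.:
// func documentInteractionControllerDidDismissOptionsMenu(controller: UIDocumentInteractionController) {
//     _ = try? NSFileManager.defaultManager().removeItemAtPath(path)
// }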
I want to post an image using the generated Swift client. After a lot of research I think the best way to specify this is:
/user/profilepicture:
  put:
    description: |
      upload profile picture of user
    consumes:
      - multipart/form-data
    parameters:
      - name: profilePhoto
        in: formData
        type: file
The generated Swift client function signature is:
public class func usersProfilepicturePut(profilePhoto profilePhoto: NSURL? = nil, completion: ((error: ErrorType?) -> Void))
The problem I am having is the NSURL type. The reason is that it seems very difficult to get an NSURL out of a UIImage, especially if the photo has been taken from the camera with the UIImagePickerController.
Then again, I do not want to change the parameter type to a string and use Base64 encoding, because converting the image to a string adds a lot of overhead.
Could someone verify that my YAML spec is correct? (I am choosing the file type because the only other data type I could use to upload a photo is string with format byte, but that would add the overhead of converting the photo to a string.)
If it is indeed correct, does anyone know of a way to get an NSURL from a UIImage? This second question has been asked before, but the answer in "Getting the URL of picture taken by camera with Photos Framework" does not return a URL, only a string identifier. Other answers to similar questions all suggest saving the image and then retrieving it again just to get an NSURL, which seems hacky.
So should I change the generated implementation to accept an NSData type, or do you have anything better to suggest?
It looks like the Swagger-generated API at the time of writing Base64-encodes NSData into the POST body, so avoid that type if you don't want to use Base64 or don't want to extend/modify the generated code.
To send binary data, it looks like you need an NSURL pointing to a local file.
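If you keep the file type (and the NSURL parameter), one hedged sketch is to bridge from the picked UIImage to that NSURL by writing a temporary file. "UserAPI" is an assumed name for the generated class, and the file name and JPEG quality are arbitrary:

// Write the UIImage from UIImagePickerController to a temporary file so the generated
// client, which expects an NSURL, can stream the binary data.
func temporaryFileURLForImage(image: UIImage) -> NSURL? {
    guard let data = UIImageJPEGRepresentation(image, 0.9) else { return nil }
    let path = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("profilePhoto.jpg")
    return data.writeToFile(path, atomically: true) ? NSURL(fileURLWithPath: path) : nil
}

if let fileURL = temporaryFileURLForImage(pickedImage) {
    UserAPI.usersProfilepicturePut(profilePhoto: fileURL) { error in
        // Upload finished; the temporary file can be deleted here.
        print(error)
    }
}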
How can I fetch the URL from images stored in Parse?
I have made an app where I can store images in Parse (works fine).
But how do I retrieve the URL from the image stored in Parse so I can show it in a web browser by typing the URL?
Can I see the URL for the images stored in Parse anywhere?
I'm thinking something like this:
https://www.parse.com/apps/ClassName/collections#class/Photo/CzEgPXalzM
You can get the url of the image from the PFFile object containing the image like this:
NSString *urlString = yourImageFileFromParse.url;
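A hedged Swift sketch of fetching the object and logging that url, assuming a "Photo" class with the PFFile stored in an "imageFile" column (the object id is taken from the question):

// Fetch the object, then read the PFFile's url property; that https URL can be opened in a browser.
let query = PFQuery(className: "Photo")
query.getObjectInBackgroundWithId("CzEgPXalzM") { object, error in
    if let file = object?["imageFile"] as? PFFile, urlString = file.url {
        print(urlString)
    }
}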
I need to be able to access the raw image data from a loaded photo. Originally we were supposed to use Base64; however, the API creator has changed it to raw image data.
In Swift, how would I accomplish the following?
I'm currently using Alamofire for my networking, but I'm not sure that it will work for this part of the API.
I already have the image resource as a variable to access, I just need to know how to get the raw data, and then formulate a POST request with just that data as the body of the request. A method with a callback would be awesome.
UIImagePNGRepresentation
UIImageJPEGRepresentation
These methods can encode a UIImage as NSData.
Upload code:
Alamofire.upload(.POST, "http://your.org/upload", data: imageData)
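If you want the encode-and-upload wrapped in a method with a completion callback, here is a hedged sketch using the same Alamofire 3-era upload API as above (the endpoint URL is a placeholder and the JPEG quality is arbitrary):

import Alamofire
import UIKit

// Encode the UIImage to raw JPEG bytes and POST them as the request body.
func uploadRawImageData(image: UIImage, completion: (NSError?) -> Void) {
    guard let imageData = UIImageJPEGRepresentation(image, 0.9) else {
        completion(NSError(domain: "ImageUpload", code: -1, userInfo: nil))
        return
    }
    Alamofire.upload(.POST, "http://your.org/upload", data: imageData)
        .response { _, _, _, error in
            completion(error)
        }
}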
I am currently working with the 3rd party library SCRecorder to try and create an AVAsset with a URL and then create an AVPlayerItem -> AVPlayer to output a video.
I believe my problem arises because the video data I am trying to play is originally saved to Parse as a raw data file. The reference URL to this file is in the format "http://files.parsetfss.com/something-file". It does not appear that there is a way to create an AVAsset, and ultimately an AVPlayer, from this type of URL.
My question is whether there is a way to create an AVPlayerItem/AVPlayer using this URL, which returns raw data, or to create one from the raw data itself as NSData. If this is not possible, is there a way to save a PFFile to Parse in a different format, one that would be accepted by AVAsset/AVPlayer such as .mov/.mp4?
I figured out the issue... I just needed to use PFFile(data: data, contentType: "video/mp4") :P
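For the playback half: once the PFFile has been saved with that content type, its url points to the uploaded file and can be handed to AVPlayer. A hedged sketch, assuming videoFile is the fetched PFFile:

import AVFoundation

// Build an AVPlayer from the PFFile's https URL.
if let urlString = videoFile.url, url = NSURL(string: urlString) {
    let playerItem = AVPlayerItem(asset: AVURLAsset(URL: url))
    let player = AVPlayer(playerItem: playerItem)
    player.play()
}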
I am getting this kind of URL for pictures on my iPad: assets-library://asset/asset.JPG?id=00000000-0000-0000-0000-000000000BC4&ext=JPG
How can I read the binary content of the picture? Opening the file with, for example, stringFromFileAtURL tells me the path is not found. Thanks.
You are not allowed to read stuff from there as you please, because it is outside your app's sandbox.
This is not going to be as simple as you want it to be. First you must get the actual image file, which involves getting an ALAsset object (you can see this process in this question). Then you get that asset's defaultRepresentation, and from it the representation's fullResolutionImage. That gives you a CGImageRef; you can get its data provider via CGImageGetDataProvider and then a copy of the pixel data via CGDataProviderCopyData, which returns a CFDataRef (you can cast it to NSData *).
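If what you actually need is the original file's bytes rather than decoded pixel data, ALAssetRepresentation can also hand those over directly via getBytes. A hedged Swift sketch using the (now deprecated) AssetsLibrary framework, with the asset URL taken from the question:

import AssetsLibrary

let assetURL = NSURL(string: "assets-library://asset/asset.JPG?id=00000000-0000-0000-0000-000000000BC4&ext=JPG")!
let library = ALAssetsLibrary()

library.assetForURL(assetURL, resultBlock: { asset in
    guard let representation = asset?.defaultRepresentation() else { return }
    let length = Int(representation.size())
    guard let data = NSMutableData(length: length) else { return }
    // Copy the asset's raw file bytes into our buffer.
    let bytesRead = representation.getBytes(UnsafeMutablePointer<UInt8>(data.mutableBytes),
                                            fromOffset: 0, length: length, error: nil)
    print("Read \(bytesRead) of \(length) bytes")
}, failureBlock: { error in
    // Typically means photo library access was denied.
    print("Could not load asset: \(error)")
})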
Do you really need the binary? Or is a UIImage good enough? Not sure of your intent with this.
It is not necessary to use the assets library; the UIImage is the binary representation. If all you want to do is save the image as a file, use:
[UIImagePNGRepresentation(image) writeToFile:pngPath atomically:YES];