How to upload a live photo to the server? - ios

How do I upload Live Photos to a server? I am using AFNetworking to upload images, videos, and slow-motion videos. Uploading images and videos is straightforward.
Upload of Images and Videos
// Physical location, i.e. URL, for a video
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        NSLog(@"%@", ((AVURLAsset *)asset).URL);
    }
}];
// Physical location, i.e. URL, for an image
[asset requestContentEditingInputWithOptions:nil
                           completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
    NSURL *imageURL = contentEditingInput.fullSizeImageURL;
}];
These URLs are then used to upload the image and video files.
// manager is an AFHTTPSessionManager initialised with a background session configuration.
NSURLSessionUploadTask *uploadTask = [manager uploadTaskWithRequest:request
                                                           fromFile:[NSURL URLWithString:fdi.filePath]
                                                           progress:nil
                                                  completionHandler:nil];
Upload of a Slow-Motion Video
A slow-motion asset is returned by the Photos framework as an AVComposition built from two or more tracks rather than a single video file. If you try to upload it like a normal video file, it loses its slow-motion effect. To work around this, we first export the composition to a regular movie file, save it to disk, and then upload that file.
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVComposition class]] && ((AVComposition *)asset).tracks.count == 2) {
        // Added by UD for slow-motion videos. See: https://overflow.buffer.com/2016/02/29/slow-motion-video-ios/
        // Output URL
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = paths.firstObject;
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                                [NSString stringWithFormat:@"mergeSlowMoVideo-%d.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];
        // Begin slow-motion video export
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                if (exporter.status == AVAssetExportSessionStatusCompleted) {
                    NSURL *URL = exporter.outputURL;
                    self.filePath = URL.absoluteString;
                    NSURLSessionUploadTask *uploadTask = [manager uploadTaskWithRequest:request
                                                                               fromFile:[NSURL URLWithString:self.filePath]
                                                                               progress:nil
                                                                      completionHandler:nil];
                    // Use the method above, or the one below.
                    // NSData *videoData = [NSData dataWithContentsOfURL:URL];
                    //
                    // // Upload
                    // [self uploadSelectedVideo:video data:videoData];
                }
            });
        }];
    }
}];
How to upload a Live Photo?
A Live Photo is a combination of two files, a *.jpg and a *.mov. We can neither upload it like a single photo or video, nor merge it into one file as we did with the slow-motion video, because the server needs both files of the Live Photo.
So both files have to be sent to the server. But how do we send multiple files in a single request? One option is a multipart/form-data upload, sketched below.
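Here is a minimal sketch using AFNetworking's multipart support (the endpoint, form field names, and the photoURL/videoURL file URLs are assumptions for illustration, not part of the original code):

AFHTTPSessionManager *manager = [AFHTTPSessionManager manager];
[manager POST:@"https://example.com/upload/livephoto" // hypothetical endpoint
   parameters:nil
constructingBodyWithBlock:^(id<AFMultipartFormData> formData) {
    NSError *error = nil;
    // photoURL / videoURL: assumed file URLs for the Live Photo's two parts
    [formData appendPartWithFileURL:photoURL
                               name:@"photo"
                           fileName:@"livePhoto.jpg"
                           mimeType:@"image/jpeg"
                              error:&error];
    [formData appendPartWithFileURL:videoURL
                               name:@"video"
                           fileName:@"livePhoto.mov"
                           mimeType:@"video/quicktime"
                              error:&error];
} progress:nil success:^(NSURLSessionDataTask *task, id responseObject) {
    NSLog(@"Live Photo uploaded: %@", responseObject);
} failure:^(NSURLSessionDataTask *task, NSError *error) {
    NSLog(@"Upload failed: %@", error);
}];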

I read a very interesting solution to this problem. It suggests that we do not need to upload both the *.jpg and the *.mov for a Live Photo: we can upload only the video and create the image on the server from the midpoint of the video (a client-side sketch of the same idea follows the quote below).
The Live Photo begins with the middle of the video - so just read the
video, find the middle, save the image.
We can also create a Live Photo from a video as the iPhone 6s does.
The 6s captures a short video with the photo and plays the video when
the photo is 3D touched. However, by using this method of creating a
PHLivePhoto and playing it using the PHLivePhotoView, we can get it to
work on any iPhone, such as my regular iPhone 6.
Source: See Here.
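For illustration, the midpoint-frame idea from the quote can also be done client-side with AVAssetImageGenerator. A sketch; videoURL is an assumed file URL for the Live Photo's video:

// Grab the frame at the video's midpoint.
AVAsset *videoAsset = [AVAsset assetWithURL:videoURL];
AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:videoAsset];
generator.appliesPreferredTrackTransform = YES; // respect rotation metadata
CMTime midpoint = CMTimeMultiplyByFloat64(videoAsset.duration, 0.5);
NSError *error = nil;
CMTime actualTime;
CGImageRef cgImage = [generator copyCGImageAtTime:midpoint actualTime:&actualTime error:&error];
if (cgImage) {
    UIImage *stillImage = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    // save or upload stillImage
}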
We can also get the video URL of a Live Photo. See "How to extract the video URL from a Live Photo". A sketch of pulling the Live Photo's files out with PHAssetResource follows.
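Since iOS 9 the Photos framework exposes a Live Photo's underlying files directly through PHAssetResource. A minimal sketch, assuming livePhotoAsset is the Live Photo's PHAsset:

// Write the Live Photo's paired .mov resource to a temporary file.
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:livePhotoAsset];
for (PHAssetResource *resource in resources) {
    if (resource.type == PHAssetResourceTypePairedVideo) {
        NSURL *videoFileURL = [NSURL fileURLWithPath:
            [NSTemporaryDirectory() stringByAppendingPathComponent:@"livePhoto.mov"]];
        [[PHAssetResourceManager defaultManager] writeDataForAssetResource:resource
                                                                    toFile:videoFileURL
                                                                   options:nil
                                                         completionHandler:^(NSError *error) {
            if (!error) {
                // videoFileURL now holds the .mov half of the Live Photo;
                // the PHAssetResourceTypePhoto resource can be written out the same way.
            }
        }];
    }
}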
But I am not sure whether the video-only upload would be a good solution to this problem. Can anyone tell me whether it is a good approach or not?

Related

NSData dataWithContentsOfURL for URL from camera roll

I have some URLs for videos stored in the camera roll. For example:
file:///var/mobile/Media/DCIM/100APPLE/IMG_0249.MP4
I would like to further process those videos as NSData, but if I use
NSURL *videoURL = [NSURL fileURLWithPath:@"file:///var/mobile/Media/DCIM/100APPLE/IMG_0249.MP4"];
NSData *imageData = [NSData dataWithContentsOfURL:videoURL options:NSDataReadingMappedIfSafe error:&error];
The data is always nil. And I get the following error:
Error Domain=NSCocoaErrorDomain Code=257 "The file “IMG_0249.MP4”
couldn’t be opened because you don’t have permission to view it."
UserInfo={NSFilePath=/var/mobile/Media/DCIM/100APPLE/IMG_0249.MP4,
NSUnderlyingError=0x165351d0 {Error Domain=NSPOSIXErrorDomain Code=1
"Operation not permitted"}}
I noticed that if the URL is from my local app document folder, I can generate NSData without any issues. I wonder if it's not allowed to create NSData out of the videos stored in Camera Roll. I wonder if I need to use the Photos framework or some other method in order to have the permission.
It turns out that I can't turn a Camera Roll video URL into NSData directly; I need to use PHImageManager (and the localIdentifier of the video's PHAsset, which I acquired earlier):
PHFetchResult *assets = [PHAsset fetchAssetsWithLocalIdentifiers:@[videoID] options:nil];
PHAsset *asset = assets.firstObject;
if (asset.mediaType == PHAssetMediaTypeVideo) {
    [[PHImageManager defaultManager] requestAVAssetForVideo:asset options:nil resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
        if ([asset isKindOfClass:[AVURLAsset class]]) {
            AVURLAsset *urlAsset = (AVURLAsset *)asset;
            NSData *data = [NSData dataWithContentsOfURL:urlAsset.URL];
            [self sendToTwitter:data withMessage:message account:account];
        }
    }];
}
My guess is that the AVURLAsset carries some additional permission context that my previous approach through [NSURL fileURLWithPath:] doesn't have.

Exporting video using PhotoKit (PHAsset) gives different video file every time

I use the method at the end of this question to retrieve a video from the device. It finds the first video in the library, creates an export session, and exports the video into a MOV file.
After two runs of the application (stopping the app between runs), I compare the two resulting files. They differ. I expected them to be identical, since the same asset is being exported.
One more remark: running the method twice in the same application run gives me two identical files as expected.
Is it possible to make PhotoKit export the same file every time?
- (void)testVideoRetrievalSO {
    PHAsset *oneVideo = [[PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:nil] firstObject];
    PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
    options.networkAccessAllowed = YES;
    options.deliveryMode = PHVideoRequestOptionsDeliveryModeHighQualityFormat;
    options.version = PHVideoRequestOptionsVersionOriginal;
    [[PHImageManager defaultManager] requestExportSessionForVideo:oneVideo
                                                          options:options
                                                     exportPreset:AVAssetExportPresetPassthrough
                                                    resultHandler:
     ^(AVAssetExportSession * _Nullable exportSession, NSDictionary * _Nullable info) {
         NSLog(@"Video test run on asset %@", oneVideo.localIdentifier);
         NSString *folderPath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
         NSString *fileName = [[[NSUUID UUID] UUIDString] stringByAppendingPathExtension:@"mov"];
         NSString *tempFile = [folderPath stringByAppendingPathComponent:fileName];
         NSURL *tempFileUrl = [NSURL fileURLWithPath:tempFile];
         [exportSession setOutputFileType:AVFileTypeQuickTimeMovie];
         [exportSession setOutputURL:tempFileUrl];
         [exportSession exportAsynchronouslyWithCompletionHandler:^{
             NSLog(@"Video test run exported video into file: %@", tempFile);
         }];
     }];
}
UPDATED
It's not entirely clear why, but it seems that exporting a video from the camera roll does not guarantee getting the same file every time. So I copied the video from the camera roll to my Documents folder using its URL (avurlasset.URL) and [NSFileManager copyItemAtURL:toURL:error:]; that produces the same video file every time. For now this is my final solution.
In this case you have to use requestAVAssetForVideo, not requestExportSessionForVideo.
So in your case,
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.version = PHVideoRequestOptionsVersionOriginal;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset
                                                options:options
                                          resultHandler:
 ^(AVAsset * _Nullable avasset,
   AVAudioMix * _Nullable audioMix,
   NSDictionary * _Nullable info)
{
    NSError *error;
    AVURLAsset *avurlasset = (AVURLAsset *)avasset;
    // Write to documents folder
    NSURL *fileURL = [NSURL fileURLWithPath:tmpShareFilePath];
    if ([[NSFileManager defaultManager] copyItemAtURL:avurlasset.URL
                                                toURL:fileURL
                                                error:&error]) {
        NSLog(@"Copied correctly");
    }
}];

Upload Video to Facebook via video-graph.facebook iOS

I am trying to upload a video. I have saved my video file to an MP4 file as follows:
NSString *fileName = [NSString stringWithFormat:@"chatVideo-%d.mp4", arc4random() % 1000];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:fileName];
NSURL *videoURL = [NSURL fileURLWithPath:myPathDocs];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset2 presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = videoURL;
exportSession.outputFileType = AVFileTypeMPEG4;
NSData *videoData = [NSData dataWithContentsOfURL:videoURL];
Then I call the Graph API to upload it to Facebook:
[parameters setValue:videoData forKey:@"source"];
[parameters setValue:thumbData forKey:@"thumb"];
[[[FBSDKGraphRequest alloc] initWithGraphPath:@"me/videos"
                                   parameters:parameters
                                   HTTPMethod:@"POST"]
 startWithCompletionHandler:^(FBSDKGraphRequestConnection *connection, id result, NSError *error) {
     if ([error.userInfo[FBSDKGraphRequestErrorGraphErrorCode] isEqual:@200]) {
         NSLog(@"permission error");
     }
 }];
But I receive this error:
Sorry, the video file you selected is in a format that we don't support.
Yeah, after a long time I found the answer:
[parameters setValue:videoData forKey:@"source"];
It must be:
[parameters setValue:videoData forKey:@"video.mp4"];
I hate Facebook docs ==!.
Have a look at
https://developers.facebook.com/docs/graph-api/reference/user/videos/#Creating
to see the list of supported video types:
3g2, 3gp, 3gpp, asf, avi, dat, divx, dv, f4v, flv, m2ts, m4v, mkv, mod, mov, mp4, mpe, mpeg, mpeg4, mpg, mts, nsv, ogm, ogv, qt, tod, ts, vob, wmv
Ergo, check whether the video really is an MP4 video...
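Note also that in the question's code the export session is never actually started before videoData is read, so the file may not exist yet. A sketch combining that fix with the corrected key (variable names taken from the question's code):

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        // The file exists only after the export has completed.
        NSData *videoData = [NSData dataWithContentsOfURL:videoURL];
        NSMutableDictionary *parameters = [NSMutableDictionary dictionary];
        [parameters setValue:videoData forKey:@"video.mp4"]; // key carries the file extension
        [[[FBSDKGraphRequest alloc] initWithGraphPath:@"me/videos"
                                           parameters:parameters
                                           HTTPMethod:@"POST"]
         startWithCompletionHandler:^(FBSDKGraphRequestConnection *connection, id result, NSError *error) {
             NSLog(@"Upload result: %@ error: %@", result, error);
         }];
    }
}];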

Upload videos from gallery using Photos framework

What is the best way to upload videos from the gallery using the Photos framework?
Previously I used ALAssetRepresentation and this method:
- (NSUInteger)getBytes:(uint8_t *)buffer fromOffset:(long long)offset length:(NSUInteger)length error:(NSError **)error;
This allowed uploading a file without first copying it to the app's temp directory. I don't see any alternative in the Photos framework. The only way seems to be AVAssetExportSession -> export to a local directory -> upload, but that requires additional storage space (which could be a problem if the video is too big).
It seems the only valid way is to request an AVAsset from PHImageManager and check whether the returned asset is an AVURLAsset. In that case its URL can be used to access the file directly and read the needed chunk of bytes:
[[PHImageManager defaultManager] requestAVAssetForVideo:videoAsset options:nil resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        NSURL *URL = [(AVURLAsset *)asset URL];
        // use URL to get file content
    }
}];
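Reading just a chunk of bytes from that URL - analogous to ALAssetRepresentation's getBytes:fromOffset:length:error: - could then be done with NSFileHandle. A sketch; offset and the 1 MB chunk size are placeholders:

NSError *error = nil;
NSFileHandle *handle = [NSFileHandle fileHandleForReadingFromURL:URL error:&error];
if (handle) {
    [handle seekToFileOffset:offset];                      // assumed unsigned long long
    NSData *chunk = [handle readDataOfLength:1024 * 1024]; // read up to 1 MB
    // hand `chunk` to the uploader, advance `offset`, repeat until EOF
    [handle closeFile];
}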
This will not work with slow-motion videos, because an AVComposition is returned instead of an AVURLAsset. A possible solution is to request the original video file version with PHVideoRequestOptionsVersionOriginal:
PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
options.version = PHVideoRequestOptionsVersionOriginal;
[[PHImageManager defaultManager] requestAVAssetForVideo:videoAsset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        NSURL *URL = [(AVURLAsset *)asset URL];
        // use URL to get file content
    }
}];
And to get the full-size image URL:
PHContentEditingInputRequestOptions *options = [[PHContentEditingInputRequestOptions alloc] init];
options.canHandleAdjustmentData = ^BOOL(PHAdjustmentData *adjustmentData) {
    return YES;
};
[imageAsset requestContentEditingInputWithOptions:options completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
    // use contentEditingInput.fullSizeImageURL
}];

How to play sound using AVAssetReader

I am using AVAssetReader to get the individual frames from a video file. I would like to know how I can play the audio from the MP4 file.
The return value of [player play] is false, so there is no sound to play - but why?
Thanks.
Create the AVAssetReader:
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"zhang" ofType:@"mp4"]];
AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetTrack *track1 = [[avasset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
NSMutableDictionary *dic2 = [NSMutableDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                             [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                             [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                             [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                             [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved, nil];
output1 = [[AVAssetReaderTrackOutput alloc] initWithTrack:track1 outputSettings:dic2];
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:avasset error:nil];
[reader addOutput:output1];
[reader startReading];
The output code is as follows:
CMSampleBufferRef sample = [output1 copyNextSampleBuffer];
CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(sample);
size_t length = CMBlockBufferGetDataLength(blockBufferRef);
UInt8 buffer[length];
CMBlockBufferCopyDataBytes(blockBufferRef, 0, length, buffer);
NSData *data = [[NSData alloc] initWithBytes:buffer length:length];
NSString *docDirPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *filePath = [NSString stringWithFormat:@"%@/test.mp3", docDirPath];
[data writeToFile:filePath atomically:YES];
[data release];
NSError *error;
NSURL *fileURL = [NSURL fileURLWithPath:filePath];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error];
player.numberOfLoops = 0;
[player play];
One option would be to avoid using the AVAudioPlayer, and instead use the PCM data you have decoded to fill an AudioQueue for output.
The ideal method would be to create an AVComposition from the audio track of an AVAsset that was created from your source mp4 file. That AVComposition (as a subclass of AVAsset) can be used inside an AVPlayer, which will then play just the audio track for you.
You're not able to simply write out blocks of PCM data into a file with the extension ".mp3" - that's an encoded format. The AVAudioPlayer will be looking for certain encoded data in the file, and the file you have written will be incompatible, which is why you are receiving a return value of false. If you have your heart set on writing the audio out to a file, use an AVAssetWriter with the appropriate settings. This could be written out as a Core Audio file for maximum performance. This step could also encode the audio as another format (say, mp3, or AAC), however this will have a heavy performance cost associated with it, and it's likely that it will not be suitable for real-time playback.
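A minimal sketch of that AVComposition approach, reusing the url from the question's code (standard AVFoundation APIs; this plays the audio track without decoding or writing any file):

// Build a composition containing only the source's audio track, then play it.
AVURLAsset *sourceAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetTrack *audioTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
[compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, sourceAsset.duration)
                          ofTrack:audioTrack
                           atTime:kCMTimeZero
                            error:&error];
// AVComposition is an AVAsset subclass, so it can back a player item directly.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *audioPlayer = [AVPlayer playerWithPlayerItem:item];
[audioPlayer play]; // audio only; no video track was added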
The asset is not immediately playable; you need to observe its status via key-value observing. Refer to Apple's AVCam sample.
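A minimal sketch of that observation, assuming the AVPlayerItem named item from the sketch above and a player held in a property (the addObserver: call goes wherever the item is created):

[item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];

// Start playback only once the item reports it is ready.
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    AVPlayerItem *playerItem = (AVPlayerItem *)object;
    if ([keyPath isEqualToString:@"status"] && playerItem.status == AVPlayerItemStatusReadyToPlay) {
        [self.player play]; // `player` property assumed
    }
}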
