Upload Video to Facebook via video-graph.facebook (iOS)

I am trying to upload a video. I have saved my video file to an MP4 file as follows:
NSString *fileName = [NSString stringWithFormat:@"chatVideo-%d.mp4", arc4random() % 1000];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:fileName];
NSURL *videoURL = [NSURL fileURLWithPath:myPathDocs];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:asset2 presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = videoURL;
exportSession.outputFileType = AVFileTypeMPEG4;
// The export runs asynchronously; read the file only after it completes.
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    NSData *videoData = [NSData dataWithContentsOfURL:videoURL];
    // ... build and send the Graph API request here (see below) ...
}];
Then I call the Graph API to upload it to Facebook:
[parameters setValue:videoData forKey:@"source"];
[parameters setValue:thumbData forKey:@"thumb"];
[[[FBSDKGraphRequest alloc] initWithGraphPath:@"me/videos"
                                   parameters:parameters
                                   HTTPMethod:@"POST"]
 startWithCompletionHandler:^(FBSDKGraphRequestConnection *connection, id result, NSError *error) {
     if ([error.userInfo[FBSDKGraphRequestErrorGraphErrorCode] isEqual:@200]) {
         NSLog(@"permission error");
     }
 }];
But I receive this error:
Sorry, the video file you selected is in a format that we don't support.

Yeah, after a long time I found the answer. This line:
[parameters setValue:videoData forKey:@"source"];
must be:
[parameters setValue:videoData forKey:@"video.mp4"];
I hate Facebook docs ==!
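Putting it together, a minimal sketch of the corrected upload (the thumb parameter and the FBSDK calls are carried over from the question; reading result[@"id"] on success is my assumption about the Graph API response):
// Key the video data by its filename instead of "source".
NSMutableDictionary *parameters = [NSMutableDictionary dictionary];
[parameters setValue:videoData forKey:@"video.mp4"];
[parameters setValue:thumbData forKey:@"thumb"];

[[[FBSDKGraphRequest alloc] initWithGraphPath:@"me/videos"
                                   parameters:parameters
                                   HTTPMethod:@"POST"]
 startWithCompletionHandler:^(FBSDKGraphRequestConnection *connection, id result, NSError *error) {
     if (error) {
         NSLog(@"Upload failed: %@", error);
     } else {
         NSLog(@"Uploaded video id: %@", result[@"id"]); // assumed response field
     }
 }];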

Have a look at
https://developers.facebook.com/docs/graph-api/reference/user/videos/#Creating
to see the list of supported video types:
3g2, 3gp, 3gpp, asf, avi, dat, divx, dv, f4v, flv, m2ts, m4v, mkv, mod, mov, mp4, mpe, mpeg, mpeg4, mpg, mts, nsv, ogm, ogv, qt, tod, ts, vob, wmv
Ergo, check whether the video really is an MP4 video...
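One quick sanity check (my own suggestion, not from the Facebook docs): MP4-family files normally carry an ftyp box right at the start, so you can peek at bytes 4-8 of the exported file before uploading:
// Heuristic sketch: most MP4/QuickTime-family containers have the ASCII
// tag "ftyp" at byte offset 4. This is not a full container parse.
NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:myPathDocs];
NSData *header = [handle readDataOfLength:8];
[handle closeFile];
if (header.length == 8) {
    NSData *tag = [header subdataWithRange:NSMakeRange(4, 4)];
    NSString *tagString = [[NSString alloc] initWithData:tag encoding:NSASCIIStringEncoding];
    if ([tagString isEqualToString:@"ftyp"]) {
        NSLog(@"Looks like an MP4-family container");
    } else {
        NSLog(@"Not an MP4 file (tag: %@)", tagString);
    }
}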

Related

How to upload a live photo to the server?

How do I upload Live Photos to a server? I am using AFNetworking to upload images, videos, and slow-motion videos. Uploading images and videos is very straightforward.
Upload of Images and Videos
// Physical location (URL) for a video
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVURLAsset class]]) {
        NSLog(@"%@", ((AVURLAsset *)asset).URL);
    }
}];
// Physical location (URL) for an image
[asset requestContentEditingInputWithOptions:nil
                           completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
    NSURL *imageURL = contentEditingInput.fullSizeImageURL;
}];
These URLs are then used to upload the image and video files.
// manager is an AFHTTPSessionManager initialised with a background session configuration.
NSURLSessionUploadTask *uploadTask = [manager uploadTaskWithRequest:request
                                                           fromFile:[NSURL URLWithString:fdi.filePath]
                                                           progress:nil
                                                  completionHandler:nil];
Upload of Slow Motion file
A slow-motion file is a combination of two or more video files. If you upload it like a normal video file, it loses its slow-motion effect. To work around this, we first have to compose the slow-motion file, save it to disk, and then upload it.
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.networkAccessAllowed = YES;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset options:options resultHandler:^(AVAsset *asset, AVAudioMix *audioMix, NSDictionary *info) {
    if ([asset isKindOfClass:[AVComposition class]] && ((AVComposition *)asset).tracks.count == 2) {
        // Slow-motion video handling. See: https://overflow.buffer.com/2016/02/29/slow-motion-video-ios/
        // Output URL
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = paths.firstObject;
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeSlowMoVideo-%d.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];
        // Begin slow-mo video export
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                if (exporter.status == AVAssetExportSessionStatusCompleted) {
                    NSURL *URL = exporter.outputURL;
                    self.filePath = URL.absoluteString;
                    NSURLSessionUploadTask *uploadTask = [manager uploadTaskWithRequest:request
                                                                               fromFile:[NSURL URLWithString:self.filePath]
                                                                               progress:nil
                                                                      completionHandler:nil];
                    // Alternatively, load the file into memory and upload the data:
                    // NSData *videoData = [NSData dataWithContentsOfURL:URL];
                    // [self uploadSelectedVideo:video data:videoData];
                }
            });
        }];
    }
}];
How to upload a Live Photo?
A Live Photo is a combination of two files, a *.jpg and a *.mov. We can neither upload it like a single photo or video, nor compose a new file out of the two the way we did with the slow-motion video, because we have to send both files of a Live Photo to the server.
So it is clear that we have to send both files to the server. But how do we send multiple files in a single request?
I read a very interesting solution to this problem. It suggests that we do not need to upload both the *.jpg and the *.mov for a Live Photo: we can upload only the video and create the image on the server from the middle of the video.
The Live Photo begins at the middle of the video - so just read the video, find the middle, and save that frame as the image.
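As an illustration of that idea (a client-side sketch of my own; the linked solution does this on the server), AVAssetImageGenerator can pull the frame at the midpoint of a video:
// Sketch: grab the frame at the midpoint of a video (videoURL assumed known).
AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:videoAsset];
generator.appliesPreferredTrackTransform = YES;
CMTime midpoint = CMTimeMultiplyByFloat64(videoAsset.duration, 0.5);
NSError *error = nil;
CMTime actualTime;
CGImageRef frameRef = [generator copyCGImageAtTime:midpoint actualTime:&actualTime error:&error];
if (frameRef) {
    UIImage *stillImage = [UIImage imageWithCGImage:frameRef];
    CGImageRelease(frameRef);
    // stillImage is the candidate key frame for the Live Photo.
}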
We can also create a Live Photo from a video, as the iPhone 6s does. The 6s captures a short video with the photo and plays the video when the photo is 3D-touched. However, by creating a PHLivePhoto and playing it with a PHLivePhotoView, we can get it to work on any iPhone, such as my regular iPhone 6.
Source: See Here.
We can also get the video URL of a Live Photo; see "How to extract the video URL from a live photo".
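For reference, a minimal sketch of pulling the paired video out of a Live Photo with PHAssetResource (asset is assumed to be the Live Photo's PHAsset, and the output path is made up for the example):
NSArray<PHAssetResource *> *resources = [PHAssetResource assetResourcesForAsset:asset];
for (PHAssetResource *resource in resources) {
    if (resource.type == PHAssetResourceTypePairedVideo) {
        // Write the Live Photo's video component out to a temporary file.
        NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"livePhoto.mov"];
        NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
        [[PHAssetResourceManager defaultManager] writeDataForAssetResource:resource
                                                                    toFile:outputURL
                                                                   options:nil
                                                         completionHandler:^(NSError * _Nullable error) {
            if (!error) {
                NSLog(@"Paired video written to %@", outputURL);
            }
        }];
    }
}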
But I am not sure whether this is a good solution to the problem. Can anyone tell me whether it is?

Exporting video using PhotoKit (PHAsset) gives different video file every time

I use the method at the end of this question to retrieve video from the device. It finds the first video in the library, creates an export session, and exports the video into a MOV file.
After two runs of the application (stopping the app between runs), the two resulting files are compared, and they differ. I expected both files to be the same, since the same asset is being exported.
One more remark: running the method twice within the same application run gives me two identical files, as expected.
Is it possible to make PhotoKit export the same file every time?
- (void)testVideoRetrievalSO {
    PHAsset *oneVideo = [[PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:nil] firstObject];
    PHVideoRequestOptions *options = [[PHVideoRequestOptions alloc] init];
    options.networkAccessAllowed = YES;
    options.deliveryMode = PHVideoRequestOptionsDeliveryModeHighQualityFormat;
    options.version = PHVideoRequestOptionsVersionOriginal;
    [[PHImageManager defaultManager] requestExportSessionForVideo:oneVideo
                                                          options:options
                                                     exportPreset:AVAssetExportPresetPassthrough
                                                    resultHandler:
     ^(AVAssetExportSession * _Nullable exportSession, NSDictionary * _Nullable info) {
         NSLog(@"Video test run on asset %@", oneVideo.localIdentifier);
         NSString *folderPath = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
         NSString *fileName = [[[NSUUID UUID] UUIDString] stringByAppendingPathExtension:@"mov"];
         NSString *tempFile = [folderPath stringByAppendingPathComponent:fileName];
         NSURL *tempFileUrl = [NSURL fileURLWithPath:tempFile];
         [exportSession setOutputFileType:AVFileTypeQuickTimeMovie];
         [exportSession setOutputURL:tempFileUrl];
         [exportSession exportAsynchronouslyWithCompletionHandler:^{
             NSLog(@"Video test run exported video into file: %@", tempFile);
         }];
     }];
}
UPDATED
It is not clearly documented, but it seems that exporting a video from the camera roll does not guarantee the same file every time. So I copied the video from the camera roll to my Documents folder using its URL (avurlasset.URL) with [NSFileManager copyItemAtURL:toURL:error:], and that copies the same video file every time. For now this is my final solution.
In this case you have to use requestAVAssetForVideo, not requestExportSessionForVideo.
So in your case:
PHVideoRequestOptions *options = [PHVideoRequestOptions new];
options.version = PHVideoRequestOptionsVersionOriginal;
[[PHImageManager defaultManager] requestAVAssetForVideo:asset
                                                options:options
                                          resultHandler:
 ^(AVAsset * _Nullable avasset,
   AVAudioMix * _Nullable audioMix,
   NSDictionary * _Nullable info)
 {
     NSError *error;
     AVURLAsset *avurlasset = (AVURLAsset *)avasset;
     // Copy to the Documents folder (tmpShareFilePath is the destination path, defined elsewhere).
     NSURL *fileURL = [NSURL fileURLWithPath:tmpShareFilePath];
     if ([[NSFileManager defaultManager] copyItemAtURL:avurlasset.URL
                                                 toURL:fileURL
                                                 error:&error]) {
         NSLog(@"Copied correctly");
     }
 }];

Play song from iPod library with Superpowered

Since Apple announced that every app has to support 64-bit starting February 1st, I can't use Dirac3LE anymore. So I found Superpowered, which seems to do the same thing. The only problem I currently see is that I can't get it to play songs from the iPod library.
I've tried importing the song via AVAssetExportSession but can't get it to work. This is the code I've tried so far:
NSURL *url = [playingSong valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:songAsset presetName:AVAssetExportPresetPassthrough];
exporter.outputFileType = @"com.apple.coreaudio-format";
// Note: "com.apple.coreaudio-format" is CAF, so the ".mp3" extension below is misleading.
NSString *fname = [@"tmp" stringByAppendingString:@".mp3"];
NSString *tempPath = NSTemporaryDirectory();
NSString *exportFile = [tempPath stringByAppendingPathComponent:fname];
exporter.outputURL = [NSURL fileURLWithPath:exportFile];
[exporter exportAsynchronouslyWithCompletionHandler:^{
    self.player = [[SuperpoweredAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:exportFile]];
    [self.player play];
}];
and opening the file via:
player->open([[url path] fileSystemRepresentation]);
Even if this worked, I'm concerned it wouldn't be fast enough for a music player, since each song has to be imported as soon as the previous one finishes.
Are there any other options?
Thanks!
If you have an MPMediaItem *item obtained from the iTunes library, you can use:
player->open([[item assetURL].absoluteString UTF8String]);
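For completeness, a sketch of fetching such an item with MPMediaQuery (it assumes player is the Superpowered player from the question):
// Fetch the first song from the device's music library.
MPMediaQuery *query = [MPMediaQuery songsQuery];
MPMediaItem *item = query.items.firstObject;
if (item && [item assetURL]) {
    // Superpowered opens the ipod-library:// asset URL directly,
    // so no AVAssetExportSession round-trip is needed.
    player->open([[item assetURL].absoluteString UTF8String]);
}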

Get video size after saving the video to the Photo Library in iOS

I created a camera using UIImagePickerController. After taking a video I tapped Use Video, and I saved the video to the Photo Library using ALAssetsLibrary. Now I want to get the size of the video I have just taken. How can I do that? Please help me. Thanks in advance.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    NSURL *recordedVideoURL = [info objectForKey:UIImagePickerControllerMediaURL];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:recordedVideoURL]) {
        [library writeVideoAtPathToSavedPhotosAlbum:recordedVideoURL
                                    completionBlock:^(NSURL *assetURL, NSError *error) {}];
    }
}
I tried the code below to get the size of the video, but it does not work:
ALAssetRepresentation *rep = [alasset defaultRepresentation];
Byte *buffer = (Byte *)malloc((size_t)rep.size);
NSError *error = nil;
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:&error];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
1) Please see this question to get your video.
2) After getting your video, convert it into NSData, as described in this question.
3) Now use NSData's length property; it gives you the length of your video file in bytes.
You can do it in three ways:
1. Use NSData's length:
ALAssetRepresentation *rep = [alasset defaultRepresentation];
Byte *buffer = (Byte *)malloc((size_t)rep.size);
NSError *error = nil;
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:&error];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
NSLog(@"Size of video %lu", (unsigned long)data.length);
2. Use ALAssetRepresentation's size property directly.
3. Worst case: use the videoPathURL that points to the saved video file, retrieve it again via ALAsset, and calculate the size.
You can also get the duration of the video (its length in seconds, not its file size) like this:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:YOURURL];
CMTime duration = playerItem.duration;
float seconds = CMTimeGetSeconds(duration);
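If what you actually need is the file size in bytes and you still have the file URL from the picker (UIImagePickerControllerMediaURL), NSFileManager can report it without loading the file into memory; a minimal sketch:
// Read the on-disk size of the recorded video.
NSError *error = nil;
NSDictionary *attributes = [[NSFileManager defaultManager] attributesOfItemAtPath:recordedVideoURL.path
                                                                            error:&error];
if (!error) {
    unsigned long long fileSize = [attributes fileSize]; // size in bytes
    NSLog(@"Video file size: %llu bytes", fileSize);
}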

How to play sound with AVAssetReader

I am using AVAssetReader to get the individual frames from a video file, and I would like to know how I can play the audio from the MP4 file.
The return value of [player play] is false, so there is no sound to play, but why?
Thanks.
Creating the AVAssetReader:
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"zhang" ofType:@"mp4"]];
AVURLAsset *avasset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetTrack *track1 = [[avasset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
NSMutableDictionary *dic2 = [NSMutableDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                             [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                             [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                             [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                             [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved, nil];
output1 = [[AVAssetReaderTrackOutput alloc] initWithTrack:track1 outputSettings:dic2];
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:avasset error:nil];
[reader addOutput:output1];
[reader startReading];
The output code is as follows:
CMSampleBufferRef sample = [output1 copyNextSampleBuffer];
CMBlockBufferRef blockBufferRef = CMSampleBufferGetDataBuffer(sample);
size_t length = CMBlockBufferGetDataLength(blockBufferRef);
UInt8 buffer[length];
CMBlockBufferCopyDataBytes(blockBufferRef, 0, length, buffer);
CFRelease(sample); // balance copyNextSampleBuffer
NSData *data = [[NSData alloc] initWithBytes:buffer length:length];
NSString *docDirPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
// Note: this writes raw PCM bytes, so the ".mp3" name does not make it an MP3 file.
NSString *filePath = [NSString stringWithFormat:@"%@/test.mp3", docDirPath];
[data writeToFile:filePath atomically:YES];
[data release];
NSError *error;
NSURL *fileURL = [NSURL fileURLWithPath:filePath];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&error];
player.numberOfLoops = 0;
[player play];
One option would be to avoid using the AVAudioPlayer, and instead use the PCM data you have decoded to fill an AudioQueue for output.
The ideal method would be to create an AVComposition from the audio track of an AVAsset that was created from your source mp4 file. That AVComposition (as a subclass of AVAsset) can be used inside an AVPlayer, which will then play just the audio track for you.
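A minimal sketch of that approach (the file name zhang.mp4 is taken from the question; error handling is elided):
// Build a composition containing only the audio track, then play it.
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"zhang" ofType:@"mp4"]];
AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:url options:nil];
AVAssetTrack *audioTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
[compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, sourceAsset.duration)
                          ofTrack:audioTrack
                           atTime:kCMTimeZero
                            error:&error];

AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *audioPlayer = [AVPlayer playerWithPlayerItem:item];
[audioPlayer play];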
You're not able to simply write out blocks of PCM data into a file with the extension ".mp3" - that's an encoded format. The AVAudioPlayer will be looking for certain encoded data in the file, and the file you have written will be incompatible, which is why you are receiving a return value of false. If you have your heart set on writing the audio out to a file, use an AVAssetWriter with the appropriate settings. This could be written out as a Core Audio file for maximum performance. This step could also encode the audio as another format (say, mp3, or AAC), however this will have a heavy performance cost associated with it, and it's likely that it will not be suitable for real-time playback.
The asset is not playable immediately; you need to observe the value of its status property using key-value observing. Refer to Apple's AVCam sample.
