FFmpeg making a network request for each frame when decoding a stream, slow performance - iOS
I am having an issue playing camera-captured MOV files from an iPhone. My FFmpeg implementation has no problem playing most file formats; this issue is exclusive to camera-captured MOV files.
When opening the file, I can see in the logs that many requests are made, each request decoding only one frame before a new request is made, which results in the video buffering extremely slowly.
It takes roughly a minute to buffer a few seconds of the video.
Another thing to mention: the very same problematic file plays without issue locally. The problem only occurs when decoding while streaming.
I compiled my code with Xcode 11 and the iOS 13 SDK, using the CocoaPods package mobile-ffmpeg-https 4.2.
Here is a rough representation of my code; it's pretty standard.
Here is how I open AVFormatContext:
AVFormatContext *context = avformat_alloc_context();
context->interrupt_callback.callback = FFMPEGFormatContextIOHandler_IO_CALLBACK;
context->interrupt_callback.opaque = (__bridge void *)(handler);
av_log_set_level(AV_LOG_TRACE);

int result = avformat_open_input(&context, [request.urlAsString UTF8String], NULL, NULL);
if (result != 0) {
    if (context != NULL) {
        avformat_free_context(context);
    }
    return nil;
}

result = avformat_find_stream_info(context, NULL);
if (result < 0) {
    avformat_close_input(&context);
    return nil;
}
The video decoder is opened like so (the audio decoder is nearly identical):
AVCodecParameters *params = context->streams[streamIndex]->codecpar;
AVCodec *codec = avcodec_find_decoder(params->codec_id);
if (codec == NULL) {
    return NULL;
}

AVCodecContext *codecContext = avcodec_alloc_context3(codec);
if (codecContext == NULL) {
    return NULL;
}
codecContext->thread_count = 6;

int result = avcodec_parameters_to_context(codecContext, params);
if (result < 0) {
    avcodec_free_context(&codecContext);
    return NULL;
}

result = avcodec_open2(codecContext, codec, NULL);
if (result < 0) {
    avcodec_free_context(&codecContext);
    return NULL;
}
I read the data from the server like so:
AVPacket packet;
int result = av_read_frame(formatContext, &packet);
if (result == 0) {
    avcodec_send_packet(codecContext, &packet);
    // .... decode ....
    av_packet_unref(&packet);
}
Logs after opening the decoders:
// [tls] Request is made here
// [tls] Request response headers are here
Probing mov,mp4,m4a,3gp,3g2,mj2 score:100 size:2048
Probing mp3 score:1 size:2048
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] type:'ftyp' parent:'root' sz: 20 8 23077123
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] ISO: File Type Major Brand: qt
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] type:'wide' parent:'root' sz: 8 28 23077123
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] type:'mdat' parent:'root' sz: 23066642 36 23077123
// [tls] Request is made here
// [tls] Request response headers are here
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] stream 0, sample 4, dts 133333
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] stream 1, sample 48, dts 1114558
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] stream 2, sample 1, dts 2666667
[h264 @ 0x116080200] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
// [tls] Request is made here
// [tls] Request response headers are here
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] stream 0, sample 4, dts 133333
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] stream 1, sample 48, dts 1114558
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x115918e00] stream 2, sample 1, dts 2666667
[h264 @ 0x116080200] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
// [tls] Request is made here
// [tls] Request response headers are here
// ...
Here are some warnings I found in the log:
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x11c030800] interrupted
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x11c030800] stream 0: start_time: 0.000 duration: 11.833
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x11c030800] stream 1: start_time: 0.000 duration: 11.832
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x11c030800] stream 2: start_time: 0.000 duration: 11.833
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x11c030800] stream 3: start_time: 0.000 duration: 11.833
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x11c030800] format: start_time: 0.000 duration: 11.833 bitrate=15601 kb/s
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x11c030800] Could not find codec parameters for stream 0 (Video: h264, 1 reference frame (avc1 / 0x31637661), none(bt709, left), 1920x1080, 1/1200, 15495 kb/s): unspecified pixel format
Consider increasing the value for the 'analyzeduration' and 'probesize' options
[mov,mp4,m4a,3gp,3g2,mj2 @ 0x11c030800] After avformat_find_stream_info() pos: 23077123 bytes read:16293 seeks:1 frames:0
Also, when calling avformat_open_input(...), 2 GET requests are made before it returns.
Notice the "Probing mp3 score:1" line; it does not appear for other MOV files or for any other file.
I have tried different versions of FFmpeg, I have tried adjusting the stream delays, and I tried removing my custom interrupt callback; nothing has worked.
The code works fine with any other video I have tested (mp4, mkv, avi).
Metadata of the test file:
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2019-04-14T08:17:03.000000Z
com.apple.quicktime.make: Apple
com.apple.quicktime.model: iPhone 7
com.apple.quicktime.software: 12.2
com.apple.quicktime.creationdate: 2019-04-14T11:17:03+0300
Duration: 00:00:16.83, bitrate: N/A
Stream #0:0(und), 0, 1/600: Video: h264, 1 reference frame (avc1 / 0x31637661), none(bt709), 1920x1080 (0x0), 0/1, 15301 kb/s, 30 fps, 30 tbr, 600 tbn (default)
Metadata:
creation_time : 2019-04-14T08:17:03.000000Z
handler_name : Core Media Video
encoder : H.264
Stream #0:1(und), 0, 1/44100: Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, 100 kb/s (default)
Metadata:
creation_time : 2019-04-14T08:17:03.000000Z
handler_name : Core Media Audio
Stream #0:2(und), 0, 1/600: Data: none (mebx / 0x7862656D), 0/1, 0 kb/s (default)
Metadata:
creation_time : 2019-04-14T08:17:03.000000Z
handler_name : Core Media Metadata
Stream #0:3(und), 0, 1/600: Data: none (mebx / 0x7862656D), 0/1, 0 kb/s (default)
Metadata:
creation_time : 2019-04-14T08:17:03.000000Z
handler_name : Core Media Metadata
I found a fix for this (sort of):
Set the io_open callback of the AVFormatContext to your own function, and when it is called, invoke the default io_open and then enlarge the buffer size.
static int (*IO_OPEN_DEFAULT)(struct AVFormatContext *s, AVIOContext **pb, const char *url, int flags, AVDictionary **options);

int IO_OPEN_OVERRIDE(struct AVFormatContext *s, AVIOContext **pb, const char *url, int flags, AVDictionary **options) {
    int result = IO_OPEN_DEFAULT(s, pb, url, flags, options);
    pb[0]->buffer_size = 41239179;
    return result;
}

Install it before calling avformat_open_input(): save context->io_open into IO_OPEN_DEFAULT, then set context->io_open = IO_OPEN_OVERRIDE.
This fixes the issue. The value you set is usually very large (20 to 40 MB). You can read it off the second network request's byte range made while opening the format context: the first request uses the range 0-* and the second uses XXXX-*, where XXXX should be the buffer size.
The reason this fixes the issue is that with all of that data buffered, the AVIOContext no longer needs to make a new network request to fetch the audio data; the audio data is already buffered (or at least the start of it is).
There might be a better way to fix this. It seems Apple MOV files separate their video and audio data into massive chunks, which causes FFmpeg to make a network request for nearly every frame.