Encoding SWF to video with Melt - iOS

I'm doing a project which requires converting SWF movies to H.264 video on the server side, so they can be played both in the Flash player and on iPhone/iPad. And I really got stuck.
I'm using Melt from http://www.mltframework.org/ and this is my command line:
melt movie.swf -consumer avformat:video.mp4 r=30 s=640x360 f=mp4 acodec=aac ab=128k ar=48000 vcodec=libx264 b=1000k an=1
It does play in the Flash player, but fails to play on iDevices. I googled for the iPhone video requirements and my files seem to satisfy them (frame size, frame rate and bitrate). What settings should I change to make it play?

I've spent a lot of time on Google, but eventually managed to gather all the pieces. These are the parameters that work for iPhone:
r=30 s=640x360 f=mp4 acodec=aac ab=128k ar=48000 vcodec=libx264 level=30 b=1024k flags=+loop+mv4 cmp=256 partitions=+parti4x4+parti8x8+partp4x4+partp8x8+partb8x8 me_method=hex subq=7 trellis=1 refs=1 bf=0 flags2=+mixed_refs-wpred-dct8x8 coder=0 wpredp=0 me_range=16 g=250 keyint_min=25 sc_threshold=40 i_qfactor=0.71 qmin=10 qmax=51 qdiff=4 maxrate=10M bufsize=10M an=1 threads=0
Also, I use faac -w to convert the audio to the appropriate format, and MP4Box to join the video and sound.
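Putting the pieces together, the pipeline looks roughly like this (a sketch only, not from the original post: the file names are placeholders, the audio is assumed to have already been extracted to audio.wav, and only the -w flag is quoted from above):
melt movie.swf -consumer avformat:video.mp4 an=1 <the parameters listed above>
faac -w -o audio.mp4 audio.wav
MP4Box -add video.mp4 -add audio.mp4 movie.iphone.mp4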

Related

How to set video quality for iOS (270, 360, 480, 720, 1080)

I want to set the video quality for iOS.
I load an m3u8 video URL from the server, download the m3u8 file, pull out the RESOLUTION of each quality level, and collect the bandwidth-specific segment URLs into an array.
When I load the base URL (SAMPLE.m3u8) it has both video and audio. But when I take the base URL of the segments and append a bandwidth-specific URL from the array, the video loads at the selected quality and no audio comes.
To work around this I tried the following:
I play the original URL (which contains both video and audio) and, separately, the low-bandwidth URL (which contains no audio), and try to keep the two in sync.
For example: RESOLUTION=1280x720, SAMPLE_720p_v4.m3u8
SAMPLE.m3u8
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-0",NAME="Default",AUTOSELECT=YES,DEFAULT=YES,URI="segments/SAMPLE_audio_v4.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=30681000,CODECS="avc1.640028",URI="segments/SAMPLE_1080p_iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=30140000,CODECS="avc1.4d001f",URI="segments/SAMPLE_720p_iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=15431000,CODECS="avc1.42001f",URI="segments/SAMPLE_480p_iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=11009000,CODECS="avc1.42001e",URI="segments/SAMPLE_360p_iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=7850000,CODECS="avc1.420015",URI="segments/SAMPLE_270p_iframe.m3u8"
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=4080000,RESOLUTION=1280x720,CODECS="avc1.640028,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_1080p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=3471000,RESOLUTION=1280x720,CODECS="avc1.4d001f,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_720p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1934000,RESOLUTION=854x480,CODECS="avc1.42001f,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_480p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1106000,RESOLUTION=640x360,CODECS="avc1.42001e,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_360p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=837000,RESOLUTION=480x270,CODECS="avc1.420015,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_270p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=185000,CODECS="mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_audio_v4.m3u8
Use the preferredPeakBitRate property on your AVPlayerItem (https://developer.apple.com/documentation/avfoundation/avplayeritem/1388541-preferredpeakbitrate); you need to pass it a valid bandwidth value.
I'm not sure why you are downloading the m3u8 file yourself - AVFoundation manages this for you.
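A minimal sketch of what that looks like in Objective-C (the playlist URL is a placeholder, and the bitrate is the BANDWIDTH value of the 480p variant from the master playlist above):

#import <AVFoundation/AVFoundation.h>

NSURL *url = [NSURL URLWithString:@"https://example.com/SAMPLE.m3u8"]; // placeholder URL
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
item.preferredPeakBitRate = 1934000; // cap playback at the 480p variant's BANDWIDTH
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];

The player should then pick the highest variant whose BANDWIDTH does not exceed that cap, with the audio rendition still attached - no manual playlist parsing needed.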

How to change a URL from m3u8 to .ts

I'm trying to make an IPTV link work on my receiver.
This is the original link that I want to convert:
http://s7.iapi.com:8000/re-NBA/index.m3u8?token=BzyIVQOtO77MTw
And this is the format that I want to reach in the end:
http://pro-vision.dyndns.pro:12580/live/laurent/laurent/2791.ts
An m3u8 file is just a text file that acts as an index for media streams - it contains 'pointers' to the locations of the video and audio streams themselves.
A TS file is a 'container' that holds the video and audio streams themselves - i.e. the actual video and audio data.
You can't simply convert any m3u8 to a ts file or stream, but you can extract a ts file URL from the m3u8 file, which may be what you want.
If you look at the overview section of the m3u8 definition, there is a very simple example which is perhaps the best way to understand this:
https://datatracker.ietf.org/doc/html/draft-pantos-http-live-streaming-19
The m3u8 file includes the .ts references, as can be seen in this extract from the above document:
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXTINF:9.009,
http://media.example.com/first.ts
#EXTINF:9.009,
http://media.example.com/second.ts
#EXTINF:3.003,
http://media.example.com/third.ts
The numbers here are the durations of the individual segments, in seconds. More complex examples allow you to have multiple variants of a particular stream, for example different bit rate versions of a video for Adaptive Bit Rate (ABR) streaming.
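If you do want to pull those .ts URLs out of a playlist programmatically, here is a minimal Objective-C sketch (an illustration only - no error handling, synchronous I/O, and the playlist URL is a placeholder modelled on the example above):

#import <Foundation/Foundation.h>

NSURL *playlistURL = [NSURL URLWithString:@"http://media.example.com/playlist.m3u8"]; // placeholder
NSString *playlist = [NSString stringWithContentsOfURL:playlistURL
                                              encoding:NSUTF8StringEncoding
                                                 error:NULL];
for (NSString *line in [playlist componentsSeparatedByString:@"\n"]) {
    NSString *trimmed = [line stringByTrimmingCharactersInSet:
                         [NSCharacterSet whitespaceAndNewlineCharacterSet]];
    // Lines starting with '#' are tags (#EXTM3U, #EXTINF, ...);
    // everything else is a segment (or variant playlist) URI.
    if (trimmed.length > 0 && ![trimmed hasPrefix:@"#"]) {
        NSLog(@"segment: %@", trimmed);
    }
}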

HLS stream not working on Apple devices

I have a live RTSP stream that I have managed to transcode to HLS via VLC. It works perfectly on Android and in desktop browsers (via Flash).
But not on Apple devices (I can test on an iPad and in desktop Safari on my virtual machine). I can see the player, but when I press the play button all I see is a black rectangle inside the player. In desktop Safari there is also the text 'Loading...' near the play/pause button, and nothing else happens.
My HTML:
<video id="player" controls style="width:100%; height:100%">
<source src="http://178.79.164.114/playlist.m3u8" type="application/x-mpegURL">
</video>
The command for vlc:
vlc -I dummy rtsp://<stream-url> --sout '#transcode{width=320,height=240,fps=25,vcodec=h264,vb=256,acodec=none,venc=x264{aud,profile=baseline,level=30,keyint=30,bframes=0,ref=1,nocabac}}:std{access=livehttp{seglen=10,delsegs=true,numsegs=5,index=/path/to/server/directory/playlist.m3u8,index-url=http://178.79.164.114/seg-########.ts},mux=ts{use-key-frames},dst=/path/to/server/directory/seg-########.ts}'
And an example of the playlist file:
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-VERSION:3
#EXT-X-ALLOW-CACHE:NO
#EXT-X-MEDIA-SEQUENCE:179
#EXTINF:9.60,
http://178.79.164.114/seg-00000179.ts
#EXTINF:9.60,
http://178.79.164.114/seg-00000180.ts
#EXTINF:9.60,
http://178.79.164.114/seg-00000181.ts
#EXTINF:9.61,
http://178.79.164.114/seg-00000182.ts
#EXTINF:9.59,
http://178.79.164.114/seg-00000183.ts
And here is the strange output of ffprobe http://178.79.164.114/playlist.m3u8 (why are there these N/A values, and why is variant_bitrate 0?). Maybe it can help:
Input #0, hls,applehttp, from 'http://178.79.164.114/playlist.m3u8':
Duration: N/A, start: 3995.330722, bitrate: N/A
Program 0
Metadata:
variant_bitrate : 0
Stream #0:0: Video: h264 (Constrained Baseline) ([27][0][0][0] / 0x001B), yuv420p, 320x240 [SAR 11:12 DAR 11:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
I have also configured the correct MIME types for the .m3u8 and .ts files, and spent a day searching and trying different options for the transcode command: width, height, bitrate, fps, different profiles and levels... - nothing works. But if I try one of the examples from Apple (http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8), all is fine, though it's not a live stream.
If anyone has any ideas, or is able to test my stream with mediastreamvalidator - please help.
UPDATE
Now I'm experimenting with a variant playlist, but it changes nothing.
The player might expect muxed video and audio, so add a silent audio track - note that the VLC command above uses acodec=none, so the segments contain no audio at all.
The Apple HLS documentation says:
The media segment files are normally produced by the stream segmenter, based on input from the encoder, and consist of a series of .ts files containing segments of an MPEG-2 Transport Stream carrying H.264 video and AAC, MP3, or AC-3 audio.
Support for audio-only streams is mentioned in Technical Note TN2224, and the 7th revision of the protocol introduced support for alternate renditions (unmuxed streams), but that is done with EXT-X-MEDIA tags in a master playlist controlling the playback (yours is a media playlist).
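A hypothetical sketch of adding a silent track with ffmpeg before segmenting (anullsrc generates silence; this is an untested illustration of the idea, not a drop-in replacement for the VLC pipeline, and older ffmpeg builds may need -strict experimental for the aac encoder):
ffmpeg -i rtsp://<stream-url> -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 -map 0:v -map 1:a -c:v copy -c:a aac -shortest -f mpegts out.ts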

MPMoviePlayer playing audio but not video when seeking into a file

I'm trying to seek into a video file at a certain point. Let's say the video is 5 minutes long and I'm jumping in at 110 seconds.
When I play from the beginning, everything plays through fine. However, when I try to seek into the file, I can hear the audio but I can't see the video. I first thought this might be an issue with the order in which I load the subviews, but I can still see (and use) the player controls. Sliding back to 0:00 starts the video.
The following is code from my video class. The initIntoView method accepts a UIView and returns an amended copy, which then gets written to the main view. Sorry in advance for the messy code - I'm still quite new to Objective-C.
Init the Video view
- (WWFVideo*) initIntoView: (UIView*) view withContent:(NSDictionary*)contentDict{
self=[super init];
viewRef=view;
contentData = contentDict;
NSURL *videoUrl = [[NSURL alloc] initWithString:[contentDict objectForKey:@"cnloc"]]; // Returns an HTTP link to my video file (MP4, H.264, AAC audio)
videoController = [[MPMoviePlayerController alloc] init];
videoController.movieSourceType = MPMovieSourceTypeFile;
[videoController setContentURL:videoUrl];
videoController.view.frame = viewRef.bounds;
[videoController.view setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];
[viewRef addSubview:videoController.view];
return self;
}
Start playing the video
-(void)play:(int)offset { //Offset is "110"
[videoController setInitialPlaybackTime:offset];
[videoController play];
}
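One variation sometimes worth trying for seek-before-play problems (a sketch only, under the assumption that the seek happens before any media data is loaded - not the fix that ultimately worked here) is to defer the seek until the player reports a playable load state:

-(void)play:(int)offset {
    // Hypothetical variant: wait for MPMovieLoadStatePlayable before seeking
    // (real code should also remove this observer once it has fired).
    [[NSNotificationCenter defaultCenter] addObserverForName:MPMoviePlayerLoadStateDidChangeNotification
                                                      object:videoController
                                                       queue:[NSOperationQueue mainQueue]
                                                  usingBlock:^(NSNotification *note) {
        if (videoController.loadState & MPMovieLoadStatePlayable) {
            videoController.currentPlaybackTime = offset; // seek once data is buffered
        }
    }];
    [videoController play];
}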
I've tried adding the videoController to viewRef both before and after the video starts playing but it has the same outcome.
I've also tried using an MPMoviePlayerViewController, to no avail.
Another thing I tried was changing the movie source type to MPMovieSourceTypeStreaming, but it seemed to have no effect.
If I've missed any more vital code, just ask and I'll see what I can do.
Edit:
Xcode 4.6.3
iOS 6
Testing on an iPad 2
Edit #2:
Works perfectly on the simulator, just not on the device.
After trying to piece together a sample app to upload here, I found that the w3 version of Big Buck Bunny worked fine. This indicates it was an encoding problem and not an Objective-C issue.
I've re-encoded the same file I was trying to play before, now with the baseline profile, using the following command:
ffmpeg -i {filename} -acodec aac -ac 2 -strict experimental -ab 160k -s {ssize} -vcodec libx264 -preset slow -profile:v baseline -level 30 -maxrate 10000000 -bufsize 10000000 -b 1200k -f mp4 -threads 0 {filename}.ipad.mp4
I found this command in another Stack Overflow post.
Primarily for low-cost applications that require additional data loss robustness, this profile is used in some videoconferencing and mobile applications. This profile includes all features that are supported in the Constrained Baseline Profile, plus three additional features that can be used for loss robustness (or for other purposes such as low-delay multi-point video stream compositing). The importance of this profile has faded somewhat since the definition of the Constrained Baseline Profile in 2009. All Constrained Baseline Profile bitstreams are also considered to be Baseline Profile bitstreams, as these two profiles share the same profile identifier code value.
-From Wikipedia
I realise this may not help anyone here looking for Objective-C help, but if it saves just one person the 5 hours I spent today trying to get this working, it will be worth it.

Why am I receiving only a few audio samples per second when using AVAssetReader on iOS?

I'm coding something that:
records video+audio with the built-in camera and mic (AVCaptureSession),
does some stuff with the video and audio sample buffers in real time,
saves the result into a local .mp4 file using AVAssetWriter,
then (later) reads the file (video+audio) using AVAssetReader,
does some other stuff with the sample buffers (for now I do nothing),
and writes the result into a final video file using AVAssetWriter.
Everything works well but I have an issue with the audio format:
When I capture the audio samples from the capture session, I can log about 44 samples/sec, which seems to be normal.
When I read the .mp4 file, I only log about 3-5 audio samples/sec!
But the 2 files look and sound exactly the same (in QuickTime).
I didn't set any audio settings for the Capture Session (as Apple doesn't allow it).
I configured the outputSettings of the two audio AVAssetWriterInputs as follows:
NSDictionary *settings = @{
    AVFormatIDKey: @(kAudioFormatLinearPCM),
    AVNumberOfChannelsKey: @(2),
    AVSampleRateKey: @(44100.),
    AVLinearPCMBitDepthKey: @(16),
    AVLinearPCMIsNonInterleaved: @(NO),
    AVLinearPCMIsFloatKey: @(NO),
    AVLinearPCMIsBigEndianKey: @(NO)
};
I pass nil to the outputSettings of the audio AVAssetReaderTrackOutput in order to receive samples as stored in the track (according to the doc).
So the sample rate should be 44100 Hz all the way from the capture session to the final file. Why am I reading only a few audio samples per second? And why does it work anyway? I have the intuition that it will not work well when I have to work with the samples (I need to update their timestamps, for example).
I tried several other settings (such as kAudioFormatMPEG4AAC), but AVAssetReader can't read compressed audio formats.
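For what it's worth, what gets logged per second here is CMSampleBuffers, not individual audio frames: a capture session typically delivers about 1024 PCM frames per buffer (roughly 43 buffers/sec at 44.1 kHz, which matches the "44 samples/sec" above), while AVAssetReader is free to hand back much larger buffers, hence far fewer of them per second. A small diagnostic sketch to verify this (assuming readerOutput is the AVAssetReaderTrackOutput from the question and startReading has already been called):

#import <AVFoundation/AVFoundation.h>

CMSampleBufferRef buffer;
while ((buffer = [readerOutput copyNextSampleBuffer])) {
    // How many audio frames does this single buffer carry?
    CMItemCount frames = CMSampleBufferGetNumSamples(buffer);
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(buffer);
    NSLog(@"buffer at %.3fs carries %ld frames", CMTimeGetSeconds(pts), (long)frames);
    CFRelease(buffer); // copyNextSampleBuffer follows the Create/Copy ownership rule
}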
Thanks for your help :)
