Update:
I record VP8 video on Android, saved with an MP4 extension, which should be supported by Safari. From some research I know the Opus audio format is only partially supported by Safari...
What else do I need to set when saving the file (e.g. output settings)?
Is it because of the Opus audio format that Safari is not playing the VP8 video?
Question:
My application records video from any device and saves it to an AWS server, and an admin can play the recorded videos from any OS/device.
After the iOS update on Dec 14, 2022, my video component is broken: I cannot record or play video on Apple devices.
As I cannot post the entire project code, I have put the code at the link below:
video player code - Github
In the code, the video src points to AWS S3 (storage) and the file was recorded on an Android phone; from the link below you can play the video:
video player testing
This video can be played in Chrome on Mac, but not in Safari on Mac or in Safari and Chrome on iOS devices.
I am looking for a quick solution.
I don't use and don't have access to iOS or Mac Safari, so you will have to test this yourself...
Looking at your MP4 file in a hex editor, it seems you have a WebM file that simply has .mp4 appended to the file name. It is not an MP4 (there is no MPEG header/data inside these file bytes). Your file has "webm" near the start of its bytes, something a real MP4 would never contain; a valid MP4 usually has "ftyp" near the start instead.
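As a quick check without a hex editor (a sketch, assuming a Unix-like shell with xxd available; substitute your own file name), you can dump the first bytes of the file to see which container it really is:

```shell
# Dump the first 32 bytes of the downloaded file.
# A genuine MP4 shows "ftyp" a few bytes in; a WebM/Matroska file
# starts with the EBML magic bytes 1A 45 DF A3.
head -c 32 159_159_1652901854.mp4 | xxd
```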
Possible solutions:
(1) To avoid confusing some decoders (about the file format), try renaming the file as:
159_159_1652901854.webm
Some players use the file-type extension to decide which decoder to use. If the file is named file.mp4 and a player applies an MPEG decoder to those VP8 bytes, then you will get no picture or sound.
(2) Also try setting the correct WebM MIME type (i.e. video/webm, since it is not video/mp4):
<video height="700" controls muted autoplay playsinline loop>
<source src="https://api.tenrol.com/uploads/videos/159_159_1652901854.webm" type="video/webm">
</video>
Note: consider testing a WebM video with no sound, to determine whether Opus audio is the problem.
See if the above <video> tag code works on Mac Safari and iPad Safari. I think Opus (usually the audio in OGV files) and Vorbis (usually the audio in WebM files) are not supported on iPhone (unless you put the audio in a MOV format, but then you lose the video part). Basically, iPhone has no WebM sound.
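The no-sound test can be done with FFmpeg (a sketch; assumes FFmpeg is installed and that the file has been renamed to .webm as suggested above):

```shell
# Copy the VP8 video stream unchanged and drop the Opus audio (-an),
# so Safari is handed a WebM with no audio track at all.
ffmpeg -i 159_159_1652901854.webm -c:v copy -an noaudio_test.webm
```

If the silent file plays, the blocker is the audio codec rather than VP8 video.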
(3) If all else fails then just understand the following:
MP4 plays in all major browsers.
MP4 contains H.264 or H.265 picture codecs, and also AAC or MP3 audio codecs.
Browsers prefer to record in their own in-house/licensed codecs (Google == WebM (VP8/VP9), Mozilla == OGV (Theora), and Apple == MP4 (H.264/H.265)).
Don't use WebM (Chrome/Firefox) if you want guaranteed Safari playback of video.
Don't use MP4/H.265 (Safari) if you want guaranteed Chrome/Firefox video playback.
There is no easy solution. So your options are:
To guarantee that you get H.264/MP4 encoded in Chrome/Firefox browsers, try finding a JavaScript-based H.264 encoder. Here's a search example to start with. Then you also need an audio encoder for the AAC or MP3 formats. Such a file would play in Safari.
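If server-side transcoding is an option for your setup, that is a different approach from a browser-side encoder: convert each uploaded WebM into a Safari-friendly MP4 after upload (a sketch; assumes FFmpeg with libx264 is installed on the server, and file names are placeholders):

```shell
# Re-encode VP8/Opus WebM to H.264/AAC MP4.
# yuv420p maximizes player compatibility; +faststart moves the
# moov atom to the front so playback can begin before full download.
ffmpeg -i recording.webm -c:v libx264 -pix_fmt yuv420p \
       -c:a aac -movflags +faststart recording.mp4
```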
Or else, push for getting WebM decoded by the Safari browser (even if without sound). This means a lot of testing. Use a tool like FFmpeg to output short 10-second test videos of WebM (at different settings) until something displays. Then you can move on to dealing with sound.
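For example, the short test clips could be generated like this (a sketch; assumes FFmpeg built with libvpx, and input.webm is a hypothetical source file):

```shell
# 10-second VP8 test clip, no audio, modest bitrate.
ffmpeg -i input.webm -t 10 -c:v libvpx -b:v 1M -an test_vp8.webm
# Same clip encoded as VP9 for comparison.
ffmpeg -i input.webm -t 10 -c:v libvpx-vp9 -b:v 1M -an test_vp9.webm
```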
The video you recorded uses the VP8 video codec, which should be supported by Safari on Mac, but also the Opus audio codec, which is usually not supported.
Video Info for https://api.tenrol.com/uploads/videos/159_159_1652901854.mp4
Input #0, matroska,webm, from 'test.mp4':
  Metadata:
    encoder         : Chrome
  Duration: N/A, start: 0.000000, bitrate: N/A
  Stream #0:0(eng): Audio: opus, 48000 Hz, mono, fltp (default)
  Stream #0:1(eng): Video: vp8, yuv420p(progressive), 1080x1920, SAR 1:1 DAR 9:16, 1k tbr, 1k tbn, 1k tbc (default)
You can check the video format using ffprobe (https://ffmpeg.org/ffprobe.html)
or online here (it will take a little while): https://getvideoinfo.westshoretechnology.com
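A compact ffprobe invocation for listing just the codecs looks like this (a sketch; substitute your own file or URL):

```shell
# Print codec type and name for each stream, nothing else.
ffprobe -v error \
        -show_entries stream=codec_type,codec_name \
        -of default=noprint_wrappers=1 \
        "https://api.tenrol.com/uploads/videos/159_159_1652901854.mp4"
```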
You can check browser support for codecs in the browser release notes or at sites like this which are usually up to date: https://caniuse.com/opus
The Android video encoding recommendations are available online and, at the time of writing, are pretty good cross-device guides as well: https://developer.android.com/guide/topics/media/media-formats
How can I save what AVPlayer is currently playing (both video and audio) from Live HLS stream?
I know how to load and play m3u8 video file using AVPlayer.
Please note that the HLS stream is live, not video on demand, so I cannot use AVAggregateAssetDownloadTask. In the perfect scenario I would get CMSampleBuffer objects, which can be saved to a file easily. AVPlayerItemOutput is not really an option either, because I cannot see how I would get the audio channel.
This seems not to be possible with the current SDK. I've implemented it using FFmpeg.
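For reference, capturing a live HLS stream with FFmpeg can be as simple as a stream copy (a sketch; the URL is a placeholder and FFmpeg is assumed to be installed):

```shell
# Copy the existing audio/video packets without re-encoding;
# aac_adtstoasc repackages ADTS AAC from the TS segments for MP4.
ffmpeg -i "https://example.com/live/stream.m3u8" \
       -c copy -bsf:a aac_adtstoasc capture.mp4
```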
I have a remote MP4 file that has video and sound. I want to stream only the audio track of the MP4 file; I don't need the video and want to decrease the internet usage in my app. I have no idea where to start with this, and Google doesn't seem to help. Is this impossible? Any ideas?
You can't... If you want audio only, you need to split the video/audio on the server, or put streaming software (like VLC, FFmpeg, or MPlayer) in between that can re-encode the file in real time (so it can drop the video for the new stream).
In my opinion, it's better to process all files on the server and extract the audio track...
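Extracting the audio track on the server can be done with FFmpeg (a sketch; assumes the MP4 carries AAC audio, and file names are placeholders):

```shell
# -vn drops the video; -c:a copy keeps the AAC track as-is (no re-encode).
ffmpeg -i input.mp4 -vn -c:a copy audio_only.m4a
```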
I have managed to read the Matroska container and extract the video and audio streams from an MKV file in my Metro app. Now I don't know how to feed the streams for playback; I want to understand the concept of getting media data onto the display. Another option is to repack the video and audio (plus subtitles) into an MP4 container, which MediaElement can play by default, but then it would not really be like having a Matroska codec in my Metro app.
So, basically, my question is: how do I use MediaElement, or any display graph, to play the video and audio pulled out of an arbitrary video container?
Please guide me. Thanks.
I am trying to upload 3gp files into my Rails application on an AMI instance using Paperclip.
I then move the 3gp file to the Darwin Streaming Server folder to stream it.
For better identification, the file names are prepended with some IDs.
Now when I try to play that video through its RTSP link on mobile, I am not able to play it. Interestingly, when I download the video to my local machine and play it with, say, VLC, I am able to do so.
What could be missing here?
To stream a 3gp or mp4 file via Darwin Streaming Server, you must ensure that the media file is hinted.
To add a hint track to a 3gp or mp4 file, you can use MP4Box:
http://gpac.wp.institut-telecom.fr/mp4box/
mp4box -hint sample.3gp
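After hinting, you can verify that the hint tracks were actually added (a sketch using MP4Box's info output; sample.3gp is the file name from above):

```shell
MP4Box -hint sample.3gp
# Inspect the file; the track list should now include hint tracks
# alongside the original audio/video tracks.
MP4Box -info sample.3gp
```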