How to programmatically change video codec in Xamarin iOS - ios

I've recently run into an issue with taking video on newer iPhones (8 and up) in an app written in Xamarin. When capturing video on older devices the codec is H.264, but on newer devices Apple has switched to H.265 (HEVC). These videos are played back in a browser, and pretty much everything I've checked doesn't support H.265.
Since you can switch between the two in the device settings (High Efficiency, H.265, and Most Compatible, H.264), I figured this must be possible programmatically. I haven't been able to find any information on how to do this, if it's possible at all. Any help would be appreciated.

You can set the codec on an AVCaptureVideoDataOutput, which you add to your session, through WeakVideoSettings, which is just a dictionary of settings.
You can find the keys in the official Apple docs: https://developer.apple.com/documentation/avfoundation/avassetwriterinput/video_settings_dictionaries
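For illustration, here is a minimal Xamarin.iOS sketch of that idea (hedged: session is assumed to be your already-configured AVCaptureSession, and the AVVideo.CodecKey / AVVideo.CodecH264 constants are assumed to be exposed by your Xamarin.iOS version; they correspond to the keys in the linked video settings dictionaries):

    using AVFoundation;
    using Foundation;

    void ConfigureH264Output (AVCaptureSession session)
    {
        var output = new AVCaptureVideoDataOutput
        {
            // Ask for H.264 instead of the HEVC (H.265) default on newer devices.
            // AVVideo.CodecKey / AVVideo.CodecH264 map to the keys documented in
            // Apple's video settings dictionaries (linked above).
            WeakVideoSettings = NSDictionary.FromObjectAndKey (AVVideo.CodecH264, AVVideo.CodecKey)
        };

        if (session.CanAddOutput (output))
            session.AddOutput (output);
    }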

Related

MP4 file created with ffmpeg plays in the Simulator/PC browser but not in the iOS app/Safari on a real device

The backend team uses ffmpeg to create a video from images.
The strange thing is that the video can be played in a Mac browser / the iPhone Simulator, but not in the browser or our iOS app on a real phone.
I tried using AVPlayer to print the error, but the error is nil.
Here is the file: https://firebasestorage.googleapis.com/v0/b/chatchatapp123.appspot.com/o/image_rendered.mp4?alt=media
Here is its metadata: https://www.metadata2go.com/result/46e72635-7fac-46ee-acfe-cb6ffda49692
Has anyone encountered this before, and if so, any ideas as to why?
Thanks.
I've noticed the field order in the metadata is tt, which means:
Interlaced video, top field coded and displayed first
But according to Apple's "HTTP Live Streaming" documentation, there is a note: "Important: Interlaced video is not supported on iOS devices."
https://developer.apple.com/documentation/http_live_streaming/hls_authoring_specification_for_apple_devices
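If interlacing is indeed the cause, one possible fix is to have the backend deinterlace while re-encoding. This is only a sketch; the exact flags depend on the existing ffmpeg pipeline:

    ffmpeg -i input.mp4 -vf yadif -c:v libx264 -pix_fmt yuv420p -movflags +faststart output.mp4

yadif deinterlaces to progressive frames, and yuv420p keeps the pixel format within what iOS devices accept for H.264 playback.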

Convert a video into an iOS Live Photo format without an Apple device

I'm currently making live/animated wallpapers for Android phones, which is pretty easy with GIF/MP4 files. But I would also like to make the live wallpapers natively compatible with iOS/iPhones. I've seen many tutorials on how to convert a video to a Live Photo, but they all involve some kind of app you need to install on your iPhone. The issue is, I don't own an iPhone or any iOS device to do that, and apparently iOS emulators are not a thing, so my question is:
Is it possible to convert an MP4 video to an iOS Live Photo through a third party? If so, how would you do it?
I'm pretty inexperienced with the iOS environment, so thank you for your help!
Well, a Live Photo is a container that holds both a .jpeg and an H.264-encoded .mov file. More details here.
Do you have an Apple computer to do so, or do you want to create it independently of any Apple hardware?
For the first case, there are a bunch of (demo) applications on GitHub you could take a look at.
For the second case, I'm not sure if anything exists.

How to fill metadata info for the tvOS info panel when using AirPlay?

I'm fairly new to iOS.
I'm able to play streams (no local video) via AVPlayer using AirPlay.
MPNowPlayingInfo and the remote command manager are also supported, using external metadata that is not included in the streams.
But I would like to fill the info panel with title, artwork, etc. on Apple TV/tvOS.
The image is from the WWDC17 talk titled "Now Playing and Remote Commands on tvOS".
My question is not about tvOS apps, which the referenced talk is about, but about an iOS app that plays video via AirPlay.
My guess is that the played AVAsset needs to have metadata, which is currently empty.
I've been looking at AVMutableMetadataItem, but I still don't understand whether that's what I need to use, nor how to use it.
Does anyone have any hints?
The WWDC 2019 talk "Delivering Intuitive Media Playback with AVKit" (https://developer.apple.com/videos/play/wwdc2019/503/) covers using external metadata during AirPlay and explains that iOS now provides an API similar to what was already present on tvOS (see the video from around the 7-minute mark, where they explain exactly this). Hope this helps :)
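For what it's worth, here is a rough Xamarin.iOS-flavoured sketch of that idea (hedged: it assumes your SDK exposes the iOS 12.2 ExternalMetadata property on AVPlayerItem that the talk refers to, the raw identifier string stands in for the framework constant, and playerItem is the item you already play over AirPlay):

    using AVFoundation;
    using Foundation;

    void AttachTitleMetadata (AVPlayerItem playerItem)
    {
        var title = new AVMutableMetadataItem
        {
            // "common/title" is the raw value of AVMetadataIdentifier.commonIdentifierTitle;
            // prefer the framework constant if your binding exposes it.
            Identifier = new NSString ("common/title"),
            Value = new NSString ("My stream title")
        };

        // ExternalMetadata (iOS 12.2+) is read by the AirPlay receiver to
        // populate the tvOS info panel.
        playerItem.ExternalMetadata = new AVMetadataItem[] { title };
    }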

MP4 videos embedded on a website with HTML5 do not play on iOS

So I have a couple of videos on my website that I shot using the iPhone 4 and then converted to MP4, WebM and Ogg so that I can use them with HTML5. The thing is, the videos do not play at all on the four iOS devices that I tested, and neither do they on Chrome for Android.
The Chrome issue could be because some of the MP4s are actually M4V files, but even after encoding a video with HandBrake using the iPhone 4 preset and in MP4 format, it still does not play.
What happens, you ask? Well, the play button is shown crossed out with a diagonal bar, and the debug console in Safari does not show any message until I try to access the video directly. Then it says: QuickTime Movie could not be played.
What can I do? I have been trying to encode with ffmpeg and have tried a handful of different solutions, some even found here on Stack Overflow, but to no avail. The videos do get smaller, both in display size and in MBs, but nothing fixes the issue at hand.
I've been trying to get this corrected for a couple of weeks now. Any help and/or suggestions are welcome.
Thank you.
By the way, all the videos are in a registered-users section of the website, but I have one for debugging on the main page, so feel free to test.
https://sidnerwebsite.sytes.net
It seems the issue is caused by iOS requiring an intermediate certificate it trusts in order to play videos over an SSL connection. After disabling the redirect from the website to its SSL counterpart, the videos play correctly on iOS.

"djay for iPad" How do they do it?

For anyone who has seen the new "djay for iPad" application: how have they managed to access and manipulate tracks from the iTunes music library? So far it was only possible to play back the music with MPMusicPlayerController. I have looked at iOS 4.2 but could not find any new features that indicate the trick.
Any idea?
André
It isn't actually manipulating the files in iTunes. It is reading them in and copying them to do the manipulations.
You've been able to stream from the iPod library since iOS 4.1. Have a look at AVFoundation.
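Roughly, the idea looks like this (a C# / Xamarin.iOS sketch to match the first question's context, assuming mediaItem was obtained from an MPMediaPickerController or MPMediaQuery; this is not the actual djay implementation):

    using AVFoundation;
    using Foundation;
    using MediaPlayer;

    AVAssetReader CreateReaderForLibraryItem (MPMediaItem mediaItem)
    {
        // The asset URL exposes the iPod-library track to AVFoundation (iOS 4.1+).
        // It is null for DRM-protected tracks, which cannot be read this way.
        NSUrl assetUrl = mediaItem.AssetURL;
        if (assetUrl == null)
            return null;

        var asset = AVAsset.FromUrl (assetUrl);

        // From here the raw samples can be read (and manipulated) with an
        // AVAssetReader, or the track copied out with AVAssetExportSession.
        NSError error;
        return AVAssetReader.FromAsset (asset, out error);
    }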
