Multiple WebRTC replaceTrack() calls for swapping two video streams - iOS

I'm making a video call application with two video sources, camera and screen recording,
and I let users switch camera <-> screen recording multiple times during a call.
RTCRtpSender.replaceTrack() allows me to replace the camera stream with the screen-recording stream.
But if I try to switch back to the camera stream, it doesn't work on iOS.
On Android, I found this document. It seems that replaceTrack() auto-disposes the track when it's no longer needed. If I use setTrack(_, takeOwnership = false) instead, it works OK.
I wonder if I can do the same on iOS.
I'm developing a Flutter application and using this library, but I wonder if this is even possible on native iOS.
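For reference, here's a minimal sketch of what I imagine the native iOS side would look like, assuming the GoogleWebRTC framework. As far as I can tell there is no setTrack(_, takeOwnership:) counterpart there; RTCRtpSender exposes a writable track property instead, and the idea would be that holding strong references to both tracks keeps the swapped-out one from being released:

```swift
import WebRTC

// Sketch only, assuming the GoogleWebRTC framework: unlike Android, iOS has
// no setTrack(_, takeOwnership:), but RTCRtpSender's `track` property is
// writable. Keeping strong references to both tracks should stop ARC from
// releasing whichever one is currently swapped out.
final class VideoSourceSwitcher {
    private let sender: RTCRtpSender
    private let cameraTrack: RTCVideoTrack   // retained for the whole call
    private let screenTrack: RTCVideoTrack   // retained for the whole call

    init(sender: RTCRtpSender, cameraTrack: RTCVideoTrack, screenTrack: RTCVideoTrack) {
        self.sender = sender
        self.cameraTrack = cameraTrack
        self.screenTrack = screenTrack
    }

    // The replaceTrack() equivalent: reassign the sender's track in place.
    func useCamera() { sender.track = cameraTrack }
    func useScreen() { sender.track = screenTrack }
}
```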

Related

Displaying 360 Video from an iOS App to an AirPlay Device

A client's app displays two versions of a video in the app, one regular, one 360 view. The regular video is handled by AVPlayer. The 360 video is rendered by the open source package Swifty360Player (which works very well, btw). The app would like to be able to display either video on a big screen using AirPlay.
For the normal video, this is no problem. The 360 video, however, is produced by a SceneKit view, so it's technically more akin to a 3D game than a video. I know that we can display game scenes on an AirPlay device if/when the user manually mirrors their iPhone/iPad to the AirPlay screen.
But I wonder, is there any way to generate a live video stream from the SceneKit view, and then transmit that video stream to the AirPlay device in real time? My wish is that the user could then use an AVRoutePickerView to select an AirPlay device from within the app.
ReplayKit does this for streaming services like Twitch but this app isn't looking to broadcast, just to share video with a single screen in the same room.
Is there any way to accomplish this?
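As a point of reference, here is a minimal sketch of the route-picker half of this, which is standard AVKit; `containerView` is an illustrative placeholder. It only presents the system AirPlay picker and does not by itself route SceneKit output anywhere, which is the part in question:

```swift
import AVKit
import UIKit

// Minimal sketch: show Apple's system AirPlay route picker in-app.
// `containerView` is an illustrative placeholder. This only presents the
// picker UI; routing a SceneKit-rendered stream to the chosen device is
// the open question.
let routePicker = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
routePicker.prioritizesVideoDevices = true  // iOS 13+: prefer video-capable routes
containerView.addSubview(routePicker)
```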

RTCEAGLVideoView orientation change freezes the stream

I inherited an iOS WebRTC app from someone else, and now I am stuck with a problem in code I am not familiar with.
It's a pretty generic WebRTC video stream inside a Cordova iOS app. The problem I'm trying to solve is that the video stream freezes upon device orientation change. So when I change from landscape to portrait, the stream freezes. The app is landscape only, so I would prefer that the video stream never changes orientation in the first place. The question is: how can I prevent a WebRTC video stream from switching orientations?
The app uses the RTCEAGLVideoView class to display the video stream. If anyone has any ideas on where to even start to prevent that view from switching and therefore freezing, you would save my day.
I would suggest using RTCCameraPreviewView for your local stream and RTCMTLVideoView for your remote stream; a sketch of the swap is below.
This post has fixes for the same issue:
https://bugs.chromium.org/p/webrtc/issues/detail?id=7442
Also update your GoogleWebRTC SDK in case you are using an older version.
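A minimal sketch of the suggested views, assuming the GoogleWebRTC pod; `capturer` (an RTCCameraVideoCapturer) and `remoteTrack` (an RTCVideoTrack) stand in for objects your app already has:

```swift
import WebRTC

// Local preview: RTCCameraPreviewView renders the capture session directly
// via AVCaptureVideoPreviewLayer, bypassing the WebRTC rotation path.
let localPreview = RTCCameraPreviewView(frame: view.bounds)
localPreview.captureSession = capturer.captureSession

// Remote stream: Metal-backed RTCMTLVideoView instead of RTCEAGLVideoView.
let remoteView = RTCMTLVideoView(frame: view.bounds)
remoteView.videoContentMode = .scaleAspectFit
remoteTrack.add(remoteView)  // register the view as the track's renderer
```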

Using multi-VLC Player in the same UIView cause lag

I'm developing an iOS app using Swift; in this app I want to monitor several IP cameras in the same view.
I created two VLCMediaPlayer instances and fed them two different RTSP links, and both of them lag extremely. I've also changed "network-caching" to 10000 for both VLC players. If I use just one VLCMediaPlayer in that view, the streaming is fine. My setup looks roughly like the sketch below.
I'm wondering, is this the right approach for displaying multiple VLC players? Or should I use another solution or another media player?
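A minimal sketch of the setup described above, assuming MobileVLCKit; the RTSP URLs and drawable views are illustrative placeholders:

```swift
import MobileVLCKit
import UIKit

// Two independent players, each with the 10000 ms network cache from the
// question. The URLs and views are illustrative placeholders.
func makePlayer(url: URL, drawable: UIView) -> VLCMediaPlayer {
    let player = VLCMediaPlayer()
    let media = VLCMedia(url: url)
    media.addOption(":network-caching=10000")  // per-media cache in milliseconds
    player.media = media
    player.drawable = drawable
    return player
}

let player1 = makePlayer(url: URL(string: "rtsp://camera1.example/stream")!, drawable: cameraView1)
let player2 = makePlayer(url: URL(string: "rtsp://camera2.example/stream")!, drawable: cameraView2)
player1.play()
player2.play()
```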

iOS Concurrent Camera Usage

I am developing a background app that periodically makes use of the camera. This app is for my jailbroken device, so there is no problem with public SDK restrictions. I want to take a photo of the different places I go to during the day, in an automatic manner.
I am using AVCaptureSession, grabbing the frames from a video output in order to produce no shutter sound (a sketch is below).
The problem is that if another application wants to make use of the camera (the Camera app, for instance) and my app tries to take a photo, the Camera interface gets frozen. Then I need to reopen the Camera app for it to work again. I guess it is because the startRunning method of AVCaptureSession blocks the usage of previously started camera sessions.
Is there any way to use the camera in shared mode, or concurrently?
I don't know if there is some kind of mixWithOthers property, like the one included for audio session compatibility.
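A minimal sketch of the silent capture path I described, where `frameHandler` stands in for my AVCaptureVideoDataOutputSampleBufferDelegate:

```swift
import AVFoundation

// Pull frames from the video data output instead of the still-image path,
// so no shutter sound is played. `frameHandler` is an illustrative
// AVCaptureVideoDataOutputSampleBufferDelegate.
let session = AVCaptureSession()
guard let camera = AVCaptureDevice.default(for: .video),
      let input = try? AVCaptureDeviceInput(device: camera),
      session.canAddInput(input) else { fatalError("no camera input") }
session.addInput(input)

let output = AVCaptureVideoDataOutput()
output.setSampleBufferDelegate(frameHandler, queue: DispatchQueue(label: "camera.frames"))
session.addOutput(output)

session.startRunning()  // the call that appears to block other camera sessions
```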

linking an inline swf wall post to a native ios app

I have a Facebook application where you can create an animation and share it on your wall. Each feed has an SWF player that lets users play the animation directly on the wall (like any shared video). As iOS does not support Flash, I have also created a native iOS application for playing those animations.
These posts can't be played on iOS devices; when touched, a 'Flash Player update required' placeholder is shown.
What I want to do is have an alternative link for iPads, to play these feeds in my native iOS app (like fb://APP_ID/video_url).
Is it possible to provide a native iOS app URL for inline SWF feeds?
I don't think SWF playback is possible on iOS, inline or not. The only thing you can do is convert that animation to an iOS-compatible format, like QuickTime or HTML5.
