P2P Video in iOS + Unity

What would be the best way to implement P2P video (no audio) streaming between two iOS clients in real time? It would need to run inside a Unity3D (or perhaps Cocos3D) game engine.
I've looked at some WebRTC-based solutions like IceLink and OpenTok, but I don't have much experience with these technologies. Can someone recommend any de facto solutions for this type of task?

You can use the OpenTok WebRTC-based platform to enable video (and audio) communication between two or more peers.
OpenTok has native SDKs for Android and iOS, so it should work for you since you are targeting iOS.
To use it from an engine such as Unity3D or Cocos3D, OpenTok exposes the sent and received video frames (RGB or YUV) to the client, so you can take that frame image data and render it in any view inside the game engine using, for example, OpenGL.
As everything is implemented in the SDK and supported by the OpenTok platform, enabling the video communication is a matter of interacting with the SDKs, so it shouldn't be too hard.
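To make the frame hand-off a bit more concrete, here is a rough sketch (not a drop-in implementation), assuming the OpenTok iOS SDK's custom-renderer hook (OTVideoRender / OTVideoFrame). The VideoFrameSink protocol is a stand-in for whatever native plugin your Unity3D/Cocos3D project uses to turn raw pixels into a texture:

```swift
import OpenTok

// Hypothetical hand-off point into the game engine; in a real project this
// would be your Unity/Cocos native plugin that uploads pixel data to a texture.
protocol VideoFrameSink: AnyObject {
    func pushLumaPlane(_ plane: UnsafeMutableRawPointer, width: Int, height: Int)
}

// Rough sketch of the frame hand-off: instead of letting OpenTok draw the
// subscriber's video, you receive each decoded frame and copy its pixel data
// to wherever the engine can read it.
final class GameEngineVideoRenderer: NSObject, OTVideoRender {
    weak var sink: VideoFrameSink?

    // Called by the SDK for every incoming video frame; the pixel layout
    // depends on frame.format?.pixelFormat (e.g. I420 or NV12).
    func renderVideoFrame(_ frame: OTVideoFrame) {
        guard let format = frame.format,
              let yPlane = frame.planes?.pointer(at: 0) else { return }

        // For I420 you would also forward planes 1 and 2 (U and V) and do the
        // YUV -> RGB conversion in a shader before drawing.
        sink?.pushLumaPlane(yPlane,
                            width: Int(format.imageWidth),
                            height: Int(format.imageHeight))
    }
}

// Wiring it up (subscriber comes from your normal OpenTok session code):
// let renderer = GameEngineVideoRenderer()
// renderer.sink = myEnginePlugin
// subscriber.videoRender = renderer
```

On the engine side, that plugin would typically update a texture each frame and draw it on a quad or UI element.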

Related

WebRTC/Janus on iOS

What are my options for implementing WebRTC (Janus specifically) in my iOS app?
From what I've gathered, using the WebRTC library made by Google is hard and may fail. And since the backend uses Janus, I don't know how much work I'd have to do to make basic WebRTC compatible with the Janus layer.
Also, does WebRTC work in Safari/WKWebView on iOS 11/12? I don't want to stream from the app, I only want to view a stream.

Hybrid mobile app for video chat and recording

I want to build a social networking mobile application that allows users to have video chats. The video chats should also be able to be recorded and shared. So I am opting for hybrid app development, as I can release the app on both iOS and Android.
For my requirements, which hybrid platform should I opt for?
I came across Twilio's services, which provide video chat functionality, but is Twilio compatible with hybrid apps?
If yes, I am more inclined towards Flutter (Google's cross-platform framework); is Flutter compatible with Twilio?
If not, which other hybrid frameworks are compatible with Twilio?
Twilio developer evangelist here.
As Flutter is newly in beta, I don't think many people have attempted a Twilio integration. The Flutter video player plugin is not complete yet either. Flutter might not be the best platform for this.
I believe that Twilio Video can be supported in other frameworks like Xamarin, React Native and Cordova. I've not personally used any of them, so that's as much as I can tell you.
It's still in the development phase at the moment, but we are working on a Flutter plugin for Twilio Programmable Video. It can be found here: https://gitlab.com/twilio-flutter/programmable-video
I can recommend trying ConnectyCube.
They have a hybrid Cordova/PhoneGap SDK and code samples for video chat:
Cordova Video Chat code sample - https://developers.connectycube.com/js/code-samples-videochat-cordova
Javascript/Cordova SDK - https://developers.connectycube.com/js/
Main features, in general:
1-1 video chat
Group video chat
Cross-platform
WebRTC based
Screen sharing
VP8/H264 video codecs supported
Mute/Unmute audio/video stream
Switch video input devices (cameras)
Video recording

External video input into iOS devices

I am working on an experimental project that needs to connect an external video camera to an iPhone.
I found out that we can connect an iPhone to an external interface like an Arduino using a Redpark cable, which ships together with an SDK. But I am not sure how iOS would handle the raw data coming from the external camera.
I am wondering if AVFoundation can handle this part, because we can specify the input device, but I am not sure how to point it at an external device.
Or are there any other frameworks that can handle this task?
I am looking for a tutorial or sample project that I can learn more from.
The decoding you need to do depends entirely on the camera you will use.
But, given the data rate limitation of the serial cable you are considering, you will be practically limited to using a camera that can provide a low-bitrate H.264 stream.
Decoding such a stream can be done with the ffmpeg library. Instructions for integrating it into an iOS project can be found in this SO question.
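If pulling ffmpeg into the project is a concern, a possible alternative (not the ffmpeg route the answer describes) is iOS's built-in VideoToolbox decoder. The sketch below assumes you have already created a VTDecompressionSession (without an output callback) from the camera's SPS/PPS and wrapped each AVCC-framed H.264 access unit in a CMSampleBuffer:

```swift
import CoreMedia
import VideoToolbox

// Alternative sketch: decode one H.264 access unit with VideoToolbox.
// Assumes `session` was created with VTDecompressionSessionCreate (no output
// callback) from a CMVideoFormatDescription built out of the camera's SPS/PPS,
// and that `sampleBuffer` wraps a single AVCC-framed access unit.
func decode(_ sampleBuffer: CMSampleBuffer, with session: VTDecompressionSession) {
    var infoFlags = VTDecodeInfoFlags()
    let status = VTDecompressionSessionDecodeFrame(
        session,
        sampleBuffer: sampleBuffer,
        flags: [._EnableAsynchronousDecompression],
        infoFlagsOut: &infoFlags
    ) { decodeStatus, _, imageBuffer, presentationTime, _ in
        guard decodeStatus == noErr, let pixelBuffer = imageBuffer else { return }
        // Hand the decoded CVPixelBuffer to your renderer / preview layer here.
        print("Decoded frame at \(presentationTime.seconds)s: \(pixelBuffer)")
    }
    if status != noErr {
        print("VTDecompressionSessionDecodeFrame failed: \(status)")
    }
}
```

The trade-off is that ffmpeg tolerates a wider range of bitstreams, whereas VideoToolbox keeps the binary smaller and uses the hardware decoder.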

Video Streaming in iOS through WebRTC

I am trying to build an audio/video streaming app that works cross-platform on iOS and Android mobile devices.
No matter how deep I Google, I end up with suggestions that point me towards the OpenTok/TokBox API, but that is what I wish to avoid.
I've checked a few demos, but WebRTC/HTML5 does not seem to work for streaming video/audio in iOS browsers. For example, the https://apprtc.appspot.com demo does not work in Safari or Opera Mini on iOS.
When I try http://dev.opera.com/articles/media-capture-in-mobile-browsers/demo/ ... I can capture an image using the default iOS camera picker from my browser, but streaming video fails.
It seems like getUserMedia() is not supported by any browser on iOS.
Moreover, I am planning to put this in a WebView in a native iOS app, which sounds like an even bigger stretch.
I wish someone could point me towards something that helps me build a video streaming app (hopefully using HTML5) that works uniformly on iOS and Android (without TokBox).
You might want to look into Ericsson's Bowser app: http://www.ericsson.com/research-blog/context-aware-communication/bowser-openwebrtc-released-open-source. It claims to provide WebRTC on Android and iOS. Apparently the app is currently under review in the App Store, so if you wait it may just be a case of downloading it. However, it's also open source, so if you can't wait you can build it yourself: https://github.com/ericssonresearch/bowser
The getUserMedia and WebRTC peer-to-peer connection APIs are not supported on iOS.
One of the reasons is that, at the moment, efforts around WebRTC focus on the VP8 video codec, which Apple and Microsoft do not support natively. Support in the near future is unlikely, with Microsoft pushing for its own standard.
Doing what you want on iOS requires a natively compatible solution such as OpenCV, which supports video capture. You can find tutorials on Google on how to implement a solution based on OpenCV.
Good news: this will be supported in Safari 11.0:
https://developer.apple.com/library/content/releasenotes/General/WhatsNewInSafari/Safari_11_0/Safari_11_0.html

Built-in AEC or WebRTC AEC on iOS?

I'm new to iOS (coming from Android). I have already successfully done AEC processing on Android with the standalone WebRTC AECM module.
Now I have to do the same thing on iOS, but with little progress so far. The main problem is which approach I should choose for AEC on iOS:
using the iOS built-in AEC.
using the standalone WebRTC AECM module again.
For the first option, I've learnt that iOS has built-in AEC in the Voice Processing I/O unit, but I've heard that the performance of that unit is not good enough. Is that true? (See the setup sketch after this question.)
For the second option, on Android I could use the AudioRecord and AudioTrack APIs and some buffering techniques to calculate the "delay" described in WebRTC's "audio_processing.h", but I have no idea which APIs I should use to calculate the same delay on iOS.
I keep searching Google and the iOS docs, but I'm also eager for any advice here, especially from someone who has already done AEC on iOS.
Thanks a lot.
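For reference, a minimal sketch of what option 1 looks like in code, using Apple's AudioToolbox API to instantiate the Voice Processing I/O unit (the component that performs the built-in AEC). AVAudioSession setup, render callbacks, and error handling are left out:

```swift
import AudioToolbox

// Minimal sketch of option 1: create the built-in Voice Processing I/O unit,
// which applies Apple's echo cancellation to the microphone signal.
// Render callbacks, AVAudioSession setup, and error handling are omitted.
func makeVoiceProcessingIOUnit() -> AudioUnit? {
    var desc = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_VoiceProcessingIO, // AEC lives here
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)

    guard let component = AudioComponentFindNext(nil, &desc) else { return nil }

    var instance: AudioUnit?
    guard AudioComponentInstanceNew(component, &instance) == noErr,
          let ioUnit = instance else { return nil }

    // Enable recording on the input bus (bus 1); playback on bus 0 is on by default.
    var enable: UInt32 = 1
    let status = AudioUnitSetProperty(ioUnit,
                                      kAudioOutputUnitProperty_EnableIO,
                                      kAudioUnitScope_Input,
                                      1, // input bus
                                      &enable,
                                      UInt32(MemoryLayout<UInt32>.size))
    guard status == noErr, AudioUnitInitialize(ioUnit) == noErr else { return nil }
    return ioUnit
}
```

Whether this unit's echo cancellation is good enough is exactly the trade-off the question raises; the alternative is wiring WebRTC's standalone AECM into your own Remote I/O render callbacks and estimating the render-to-capture delay yourself.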
