How to use Camera2 from RTMP with Sceneform? I want to start live streaming from my Android app and add an augmented reality model to the camera view - augmented-reality

YouTube live streaming works on its own, and the augmented reality view also works on its own. I want to merge the two and start a live stream that shows the augmented reality view of the camera. Is that possible?
I am using https://github.com/SceneView/sceneform-android for augmented reality and
https://github.com/pedroSG94/rtmp-rtsp-stream-client-java for opening the camera and sending data packets to YouTube.
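It should be possible, but you generally cannot open the camera twice (ARCore/Sceneform already owns it, much like the two-camera limitation mentioned in the Agora question below). One workable route is to keep rendering AR in the ArSceneView as usual and stream the screen with the library's MediaProjection-based RtmpDisplay instead of RtmpCamera2. The Kotlin sketch below is only a rough outline under my assumptions: the package paths and method names (RtmpDisplay, sendIntent, setIntentResult, prepareAudio, prepareVideo, startStream) vary between library versions and should be checked, and STREAM_URL and the ConnectCheckerRtmp callback are placeholders you provide.

```kotlin
// Rough sketch: stream the whole screen (including the ArSceneView) to YouTube
// using the MediaProjection-based RtmpDisplay from rtmp-rtsp-stream-client-java.
// Package paths and method names are assumptions; verify them against your library version.
import android.app.Activity
import android.content.Intent
import com.pedro.rtplibrary.rtmp.RtmpDisplay
import com.pedro.rtmp.utils.ConnectCheckerRtmp

const val SCREEN_CAPTURE_REQUEST = 1001                              // arbitrary request code
const val STREAM_URL = "rtmp://a.rtmp.youtube.com/live2/STREAM-KEY"  // placeholder stream key

class ArStreamer(private val activity: Activity, connectChecker: ConnectCheckerRtmp) {

    // useOpengl = true routes the captured frames through the library's GL pipeline
    private val rtmpDisplay = RtmpDisplay(activity, true, connectChecker)

    // Step 1: ask the user for screen-capture (MediaProjection) permission.
    fun requestCapture() {
        activity.startActivityForResult(rtmpDisplay.sendIntent(), SCREEN_CAPTURE_REQUEST)
    }

    // Step 2: call from Activity.onActivityResult once the projection is granted.
    fun onCaptureGranted(resultCode: Int, data: Intent) {
        rtmpDisplay.setIntentResult(resultCode, data)
        // width, height, fps, bitrate, rotation, dpi -- assumed parameter order
        if (rtmpDisplay.prepareAudio() &&
            rtmpDisplay.prepareVideo(1280, 720, 30, 2_500_000, 0, 320)) {
            // Everything rendered on screen, including the Sceneform AR scene, gets encoded.
            rtmpDisplay.startStream(STREAM_URL)
        }
    }

    fun stop() = rtmpDisplay.stopStream()
}
```

The trade-off is that the stream shows exactly what is on screen (UI included); if you need an off-screen composition instead, you would have to render the SceneView into a Surface backed by the encoder yourself, which is considerably more work.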

Related

Agora video SDK for Unity is freezing the AR camera

I've been following Agora's tutorial for building an augmented reality video chat app: https://www.agora.io/en/blog/video-chat-with-unity3d-the-arfoundation-version/. When I build it onto my iPhone, the cube the video plays on remains blank. Frustratingly, if I then remove this line of code: mRtcEngine.EnableLocalVideo(false); the videos play but the AR camera freezes! Is there a way I can have both?
I'm using Unity version 2019.3.1
It seems you cannot use two cameras at once. See the SO response here, from what appears to be an Agora.io dev: Can we use Agora video using Unity AR Foundation while simultaneously using back and front camera

VR videos with VR UI - iOS Google VR SDK

During video playback I'd like to overlay a set of controls the user can interact with. Is this possible with the components provided in Google's SDK?
Can I leverage OpenGL to work in conjunction with the video player Google provides, or do I need to create my own player that renders in an OpenGL view?

Capture WKWebView audio for metering

I am currently working on an app that contains a WKWebView in which I have loaded an iframe that streams video from YouTube.
I would like to be able to create an audio visualizer that is shown alongside this iFrame and moves in reaction to the audio from the stream.
I have been following along with this Ray Wenderlich tutorial for creating a music visualizer, but the tutorial uses the setMeteringEnabled and updateMeters functions built in to AVAudioPlayer.
Is there any way to meter audio coming from a WKWebView? I just want an average volume level, not the actual audio stream itself.
I have attempted to look at libraries like The Amazing Audio Engine, but none of them seem to allow you to capture the channel coming from the WKWebView at all, let alone for metering purposes.

Add an arbitrary image above live camera feed using GPUIMage

I'm currently working on an augmented reality app, which is why I would like to add an image on top of the live video feed (GPUImageVideoCamera) AND be able to record the whole thing to an output file.
Live video preview and recording work fine so far, but I can't manage to add the image on screen in a way that also gets recorded to the output file.
What's the best way to achieve this (I mean a GPUImage-compliant way)?

How to build a video wall with multi-streams

I am attempting to make a mosaic display for our video wall. The mosaic will play video files (WMV) in a matrix view (probably 4x2). I am looking for a programmatic approach to stream multiple videos using MMS.
I have accomplished something similar using the VLC mosaic plugin, but it terminates when the first video finishes playing. I am interested in running the tool in a loop.
Here is a vlc mosaic example: https://gist.github.com/1367589
First question: what is the easiest technology to implement this with, the DirectX SDK, the Windows SDK for Media Foundation, OpenGL, or libvlc?
Are there any tutorials for multi video playback coding?
VLC and GStreamer are good options for this, and both support picture-in-picture video mixing.
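If a JVM front end is acceptable, libvlc can also be driven through the vlcj bindings, which makes a looping N-by-M wall fairly compact. The Kotlin sketch below is only a rough outline under my assumptions about the vlcj 4.x API (EmbeddedMediaPlayerComponent, mediaPlayer().media().play(mrl, options)) and the :input-repeat option; treat it as a starting point rather than a tested implementation.

```kotlin
// Rough sketch: a 4x2 looping video wall built on vlcj (Java bindings for libvlc).
// Class and option names are assumptions about vlcj 4.x; verify against the version you use.
import uk.co.caprica.vlcj.player.component.EmbeddedMediaPlayerComponent
import java.awt.GridLayout
import javax.swing.JFrame
import javax.swing.SwingUtilities

fun main(args: Array<String>) {
    // args: up to 8 MRLs (mms://, http://, or local .wmv paths), one per cell
    SwingUtilities.invokeLater {
        val frame = JFrame("Video wall").apply {
            layout = GridLayout(2, 4)                       // 4x2 mosaic
            defaultCloseOperation = JFrame.EXIT_ON_CLOSE
        }
        val cells = args.take(8).map { mrl ->
            val cell = EmbeddedMediaPlayerComponent()       // one embedded player per tile
            frame.add(cell)
            cell to mrl
        }
        frame.setSize(1920, 1080)
        frame.isVisible = true                              // video surfaces must exist before playback
        cells.forEach { (cell, mrl) ->
            // ":input-repeat=65535" keeps each item repeating so the wall never stops;
            // the exact libvlc option and its limits are worth double-checking
            cell.mediaPlayer().media().play(mrl, ":input-repeat=65535")
        }
    }
}
```

Each tile is an independent player, so a finished or failed stream only affects its own cell instead of tearing down the whole mosaic, which avoids the terminate-on-first-finish behaviour of the VLC mosaic plugin.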
