WebRTC on iOS: how to implement the video function? - ios

I am working with WebRTC on iOS. I built the source code and generated a static library, and by following the official demo I got audio working, but I don't know how to implement video on iOS. Can anyone help me or point me to a demo of video over WebRTC? Thanks.
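The exact class names depend on which WebRTC revision you built, since the Objective-C API has changed over time. Below is a minimal sketch assuming a recent WebRTC Objective-C/Swift binary with the Unified Plan API; the track/stream IDs are placeholders, and the `factory`, `peerConnection`, and renderer view are assumed to exist already (e.g. created when you set up the audio-only call).

```swift
import UIKit
import WebRTC

// Assumed to exist already, e.g.:
// let factory = RTCPeerConnectionFactory()
// let peerConnection = factory.peerConnection(with: RTCConfiguration(),
//     constraints: RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil),
//     delegate: self)

func addLocalVideo(to peerConnection: RTCPeerConnection,
                   factory: RTCPeerConnectionFactory,
                   renderTo renderer: RTCMTLVideoView) -> RTCCameraVideoCapturer {
    // Create a video source and a camera capturer that feeds frames into it
    let videoSource = factory.videoSource()
    let capturer = RTCCameraVideoCapturer(delegate: videoSource)

    // Wrap the source in a track and add it to the peer connection
    let videoTrack = factory.videoTrack(with: videoSource, trackId: "video0")
    peerConnection.add(videoTrack, streamIds: ["stream0"])

    // Render the local track in a Metal-backed view
    videoTrack.add(renderer)

    // Start capturing from the front camera with one of its supported formats
    if let device = RTCCameraVideoCapturer.captureDevices().first(where: { $0.position == .front }),
       let format = RTCCameraVideoCapturer.supportedFormats(for: device).last,
       let fps = format.videoSupportedFrameRateRanges.first?.maxFrameRate {
        capturer.startCapture(with: device, format: format, fps: Int(fps))
    }
    return capturer   // keep a strong reference so capture keeps running
}
```

The remote side works the same way in reverse: the remote `RTCVideoTrack` arrives through the peer connection delegate and you attach it to another renderer view.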

Related

How can I start live streaming to an RTMP link from an iPhone using Swift 5?

I'm working on a project in which users can go live using the iPhone camera.
I looked for the same and found certain libraries, but most of them are paid.
So please suggest a better way to achieve the desired result; any library link, blog, or code would be helpful.
The project is built using Swift 5 and Xcode 12.3.
Reference link:
I found this library for Android: https://github.com/TakuSemba/RtmpPublisher
Thanks in advance.
#HappyCoding
Try this: "Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS."
https://github.com/shogo4405/HaishinKit.swift
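A minimal publish sketch with HaishinKit might look like the following. Method names have shifted a little between HaishinKit releases, so treat this as a sketch rather than exact API; the server URL and stream key are placeholders, and `view` is assumed to be a view controller's view.

```swift
import AVFoundation
import HaishinKit
import UIKit

// After camera/microphone permission has been granted:
let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)

stream.attachAudio(AVCaptureDevice.default(for: .audio))
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera,
                                            for: .video,
                                            position: .back))

// Local camera preview
let previewView = MTHKView(frame: view.bounds)
previewView.attachStream(stream)
view.addSubview(previewView)

// Connect to the ingest server, then publish the stream key.
// In a real app, wait for RTMPConnection's "connect success" status
// event before calling publish().
connection.connect("rtmp://your-server/live")   // placeholder URL
stream.publish("your-stream-key")               // placeholder key
```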

PJSIP iOS video call

I want to implement video calling using PJSIP but am unable to integrate it. Audio calls are working fine on my end. Could you please guide me on how to implement video calls in the iOS version? Any clue would be highly appreciated.
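Not a full answer, but the usual starting point with the pjsua C API is to build PJSIP with video enabled and then request a video stream in the call setting. A very rough sketch, assuming PJSIP was built with `PJMEDIA_HAS_VIDEO=1` and the pjsua headers are exposed to Swift through a bridging header; verify the field and function names against your PJSIP version:

```swift
import Foundation

func makeVideoCall(account: pjsua_acc_id, to uri: String) -> pjsua_call_id {
    // Default call setting, then ask for one audio and one video stream.
    var setting = pjsua_call_setting()
    pjsua_call_setting_default(&setting)
    setting.aud_cnt = 1
    setting.vid_cnt = 1   // this is what enables video on the call

    // Note: strdup() here leaks; a real app should manage the C string's lifetime.
    var dst = pj_str(strdup(uri))
    var callId: pjsua_call_id = -1
    pjsua_call_make_call(account, &dst, &setting, nil, nil, &callId)
    return callId
}
```

You typically also enable `vid_in_auto_show` / `vid_out_auto_transmit` on the account config and provide capture/render devices for iOS when initializing the media config.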

iOS playback RTMP with AVFoundation

I need to implement an iOS app that plays back RTMP with AVFoundation. Is there any library for this purpose? I have searched the Internet but didn't find anything, nor any example.
I think I must capture frames and decode them myself, because iOS doesn't play back RTMP natively.
I have no idea where to start.
There are a few providers I came across a few months back.
You might want to refer to them:
cine.io
VideoStream SDK for iOS
Brightcove Player SDK for iOS
RealTimeLibs
All of the above are paid.
There are also some open-source libs:
RTMP-Wrapper
MediaLibDemos
Hope this helps.
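Another option, swapped in from the RTMP-streaming answer above, is HaishinKit, which can also play an RTMP stream. A minimal sketch, assuming a recent HaishinKit release (method names vary between versions) and placeholder server/stream names, inside a view controller:

```swift
import HaishinKit
import UIKit

let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)

// The view that will render the decoded video
let playerView = MTHKView(frame: view.bounds)
playerView.attachStream(stream)
view.addSubview(playerView)

// Connect to the RTMP application, then play the stream name.
// As with publishing, wait for the connection's success status event
// before calling play() in production code.
connection.connect("rtmp://example.com/live")   // placeholder URL
stream.play("streamName")                       // placeholder stream name
```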

How to do screensharing using TLKSimpleWebRTC

I am trying to use TLKSimpleWebRTC from OTalk in my iOS app. It is easy to do video and audio chat, but I can't find a way to share my app's screen using this SDK.
Is there anyone familiar with TLKSimpleWebRTC or the WebRTC lib who can help me?
As far as I know, this isn't possible due to the lack of a screen capture mechanism on iOS.

Stream Video from ios device

I have integrated the FFmpeg libraries into my project.
Now I want to stream video captured with my iOS device (iPhone, iPad, iPod) to an RTMP server using FFmpeg.
I posted a similar question and googled for the same, but did not end up with any solution.
Can anyone suggest a tutorial, or at least point me in the right direction? I am badly stuck here and unable to move ahead.
Thanks in advance.
Maybe you can check the source code of my streaming app:
https://forum.speeddemosarchive.com/post/kumari_alpha_1.html
Check common/rtmp_broadcaster.cpp; it is too long to post here. The code should be the same for iOS; you will only have to change how the captured frame data is fed into ffmpeg. Since you already have the frame data, this should not be too difficult.
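For the capture side on iOS, the frames usually come from an AVCaptureSession video-data output. A minimal sketch of that part; the `encodeAndSend(_:)` call is a hypothetical placeholder for wherever your ffmpeg-based broadcaster takes pixel data:

```swift
import AVFoundation

final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.capture")

    func start() throws {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        let output = AVCaptureVideoDataOutput()
        // Bi-planar YUV is a convenient input format for most encoders
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
        output.setSampleBufferDelegate(self, queue: queue)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Lock the buffer, copy the Y/CbCr planes, and hand them to the
        // ffmpeg encoder (e.g. the rtmp_broadcaster code referenced above).
        // encodeAndSend(pixelBuffer)   // placeholder for your ffmpeg bridge
    }
}
```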
