I am trying to use OTalk's TLKSimpleWebRTC in my iOS app. Video and audio chat are easy to set up, but I can't find a way to share my app's screen using this SDK.
Is there anyone familiar with TLKSimpleWebRTC or the WebRTC library who can help me?
As far as I know, this isn't possible due to the lack of a screen capture mechanism on iOS.
Related
I have some questions about AWS Chime, as we want to build a video calling application.
If we use AWS Chime for video calls in Safari on an iPad, can I share my screen during a call?
For us, it's not working.
Then we built a native iOS app and tried the AWS Chime SDK. This time screen share worked, but if I share my screen and play a YouTube video, the other user can only see the video; they cannot hear any of its audio.
I am tired of contacting the AWS Chime support team. They never answer any of my questions; instead they send me the following and then do not respond. I regret using AWS Chime. It has lots of issues. Pathetic library.
When did the issue first occur?
What is the Meeting ID of the meeting when users experienced this issue?
Could you please confirm that the issue is isolated to the Safari browser-based application?
Please provide the current Safari version, which can be found by following the steps in the following documentation [1].
Apologies for your AWS support experience. Content Share Audio is not yet supported in the Amazon Chime SDK for iOS as of 5/4/2021.
This is on our roadmap and planned for future releases. Feel free to open an issue in our iOS GitHub repo to track this feature request.
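For anyone landing here later, a rough sketch of how in-app screen share looks with the Chime iOS SDK, based on my reading of its content-share API (class names like InAppScreenCaptureSource and ContentShareSource may differ across SDK versions, so treat this as an assumption, not the definitive integration). As noted above, this carries video only:

```swift
import AmazonChimeSDK

// Sketch: share the app's own screen as a content-share video track.
// Assumes `meetingSession` is an already-started MeetingSession.
func startScreenShare(meetingSession: MeetingSession) {
    let logger = ConsoleLogger(name: "ScreenShare")

    // In-app capture only; capturing other apps' screens requires a
    // ReplayKit broadcast upload extension instead.
    let captureSource = InAppScreenCaptureSource(logger: logger)

    let contentShareSource = ContentShareSource()
    contentShareSource.videoSource = captureSource

    captureSource.start()
    meetingSession.audioVideo.startContentShare(source: contentShareSource)

    // Note: video only. Audio played on the sharer's device (e.g. a
    // YouTube video) is not sent, matching the limitation described above.
}
```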
I'm working on a project in which users can go live using the iPhone camera.
I looked around and found several libraries, but most of them are paid.
So please suggest a good way to achieve this; any library link, blog post, or code would be helpful.
The project is built using Swift 5 and Xcode 12.3.
Reference link:
I found this library for Android: https://github.com/TakuSemba/RtmpPublisher
Thanks in advance.
#HappyCoding
Try this:
"Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS."
https://github.com/shogo4405/HaishinKit.swift
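A minimal publish flow with HaishinKit looks roughly like this (API names as of the 1.x releases; they have shifted between versions, and the RTMP URL and stream key are placeholders):

```swift
import AVFoundation
import HaishinKit
import UIKit

// Minimal RTMP publishing sketch. Request camera and microphone
// permissions before attaching devices in a real app.
let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)

stream.attachAudio(AVCaptureDevice.default(for: .audio))
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera,
                                            for: .video,
                                            position: .back))

// Preview the outgoing video.
let previewView = MTHKView(frame: UIScreen.main.bounds)
previewView.attachStream(stream)

// Connect to the ingest server, then publish under a stream key.
// In production, wait for RTMPConnection.Code.connectSuccess before publishing.
connection.connect("rtmp://your.server/live")   // placeholder URL
stream.publish("streamKey")                     // placeholder key
```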
I want to integrate Brightcove (https://www.brightcove.com/en/) into my iOS app to implement live video streaming and broadcasting.
I also want to fetch data from the Brightcove cloud and its APIs.
I know it is possible in Android.
If it is possible on iOS, are there any issues or challenges when uploading the app to the App Store?
Or is there any other alternative or better way of doing this? Please suggest one.
Thanks in Advance.
You can integrate the iOS SDK provided by Brightcove.
The App Store will not reject the app unless it violates the App Store Review Guidelines.
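To give an idea of the shape of the integration, here is a rough sketch of fetching and playing a Video Cloud video with the Brightcove Native Player SDK. Names such as BCOVPlaybackService and findVideo(withVideoID:...) are from older SDK releases and may have changed; the account ID, policy key, and video ID are placeholders:

```swift
import BrightcovePlayerSDK

// Sketch: fetch a video from Brightcove's Playback API and play it.
let manager = BCOVPlayerSDKManager.shared()
let playbackController = manager?.createPlaybackController()
playbackController?.isAutoPlay = true
// Add playbackController?.view to your view hierarchy to display the player.

let playbackService = BCOVPlaybackService(accountId: "yourAccountId",  // placeholder
                                          policyKey: "yourPolicyKey")  // placeholder
playbackService?.findVideo(withVideoID: "yourVideoId",                 // placeholder
                           parameters: nil) { video, jsonResponse, error in
    if let video = video {
        playbackController?.setVideos([video] as NSFastEnumeration)
    }
}
```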
I see documentation for Strategy with a setDistanceType of DISTANCE_TYPE_EARSHOT (which is what I'd like). For iOS, this doesn't seem to be available.
If we set the discoveryMediums to use audio, does this do the same thing? I'm wondering why there is no equivalent for iOS.
Yes, audio on iOS is the same as earshot on Android. Sorry for the confusion. We may add the earshot concept on iOS for consistency across platforms.
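For reference, restricting discovery to audio on iOS looks roughly like this with the Nearby Messages API (a sketch from the documented Swift surface; the API key is a placeholder):

```swift
import NearbyMessages

// Sketch: publish a message using audio-only ("earshot"-like) discovery.
let messageManager = GNSMessageManager(apiKey: "YOUR_API_KEY")  // placeholder

let strategy = GNSStrategy(paramsBlock: { params in
    // Audio-only discovery; per the answer above, this matches
    // Android's DISTANCE_TYPE_EARSHOT.
    params?.discoveryMediums = .audio
})

let message = GNSMessage(content: "hello".data(using: .utf8)!)
let publication = messageManager?.publication(with: message, paramsBlock: { params in
    params?.strategy = strategy
})
// Keep a strong reference to `publication`; releasing it stops publishing.
```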
I'm working on an app that needs to be able to stream videos, music, and photos to other devices such as Chromecast, game consoles, etc., over Wi-Fi.
I have looked around and found libraries that support this, but they are not much help in terms of tutorials or getting things running on iOS. I'm using ARC and Xcode 5, targeting iOS 7+.
If anybody has guides, tutorials or source code to help me achieve this it would be much appreciated.
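For the Chromecast side specifically, the flow with Google's Cast iOS SDK looks roughly like the sketch below. Note this uses the current (v4) SDK names, which require a newer deployment target than the iOS 7 mentioned above; the media URL is a placeholder:

```swift
import GoogleCast

// 1. Initialize the Cast context once, e.g. in the app delegate.
let criteria = GCKDiscoveryCriteria(applicationID: kGCKDefaultMediaReceiverApplicationID)
GCKCastContext.setSharedInstanceWith(GCKCastOptions(discoveryCriteria: criteria))

// 2. After the user connects to a Chromecast (e.g. via GCKUICastButton),
//    load media on the current session.
func castVideo() {
    guard let client = GCKCastContext.sharedInstance().sessionManager
        .currentCastSession?.remoteMediaClient else { return }

    let builder = GCKMediaInformationBuilder(
        contentURL: URL(string: "https://example.com/video.mp4")!)  // placeholder
    builder.streamType = .buffered
    builder.contentType = "video/mp4"
    client.loadMedia(builder.build())
}
```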