Using Agora.io's Web SDK, I am simply trying to connect to a video call/stream. Per the docs below, Safari and iOS are supported, although there are known issues. Agora also documents a way to deal with autoplay blocking, but it raises more questions about implementation than it answers. My question is: has anyone successfully implemented the Agora Web SDK with React and been able to access video, both one-to-one and many-to-many, using Safari on iOS? How did you go about it? Could you provide code snippets and/or links to examples that handle this scenario? Ideally I would like to implement something like Agora's 17-person multistream example below, but when I try to access the page on iOS it disconnects and returns to the initial screen.
Agora docs - supported web browsers:
https://docs.agora.io/en/faq/browser_support#ios
Agora docs - dealing with autoplay:
https://docs.agora.io/en/Audio%20Broadcast/autoplay_policy_web?platform=Web
Agora docs - web on mobile:
https://docs.agora.io/en/faq/web_on_mobile
Agora example - multistream:
https://github.com/AgoraIO/Advanced-Video/tree/master/Web/17-Multistream
Yes, the Agora SDK supports the web on iOS. Use only the Safari browser, since other browsers on iOS don't have the necessary permissions from Apple to perform WebRTC calls.
Tutorial: https://medium.com/agora-io/building-a-group-video-chat-app-bc05e8962c41
Demo: https://digitallysavvy.github.io/group-video-chat/
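To make the autoplay handling concrete, here is a minimal sketch of joining a channel, subscribing to remote users, and recovering from blocked audio autoplay. It assumes the Agora Web SDK 4.x API surface (the `agora-rtc-sdk-ng` package, whose `AgoraRTC.onAudioAutoplayFailed` callback fires when the browser blocks playback); the app ID, channel, token, and the `remote-player` element id are placeholders. The `AgoraRTC` module is passed in as a parameter so the wiring is easy to follow (in a real app you would `import AgoraRTC from "agora-rtc-sdk-ng"` and pass that in).

```javascript
// Sketch only, not production code; assumes the Agora Web SDK 4.x API.

// iOS Safari blocks audio playback until a user gesture occurs. The 4.x SDK
// reports this via AgoraRTC.onAudioAutoplayFailed; show a button, and the
// user's tap itself satisfies the autoplay policy.
function setupAutoplayRecovery(AgoraRTC, doc) {
  AgoraRTC.onAudioAutoplayFailed = () => {
    const btn = doc.createElement("button");
    btn.textContent = "Tap to enable audio";
    btn.onclick = () => btn.remove(); // the tap is the gesture that unblocks audio
    doc.body.append(btn);
  };
}

// Join a channel, subscribe to remote users as they publish, and publish
// local mic/camera tracks. "h264" tends to be the safer codec on iOS Safari.
async function joinAndPublish(AgoraRTC, appId, channel, token) {
  const client = AgoraRTC.createClient({ mode: "rtc", codec: "h264" });
  client.on("user-published", async (user, mediaType) => {
    await client.subscribe(user, mediaType);
    if (mediaType === "video") user.videoTrack.play("remote-player"); // id of a container <div>
    if (mediaType === "audio") user.audioTrack.play();
  });
  await client.join(appId, channel, token || null);
  const [mic, cam] = await AgoraRTC.createMicrophoneAndCameraTracks();
  await client.publish([mic, cam]);
  return client;
}
```

In a React component you would typically call `joinAndPublish` from an effect triggered by a user action (a "Join" button), which also helps satisfy the autoplay policy in the first place.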
Related
I want to develop an audio/video calling application, and I have decided to use Google WebRTC. Is Google WebRTC a good fit for calling functionality? Does it support conference calling? If not, what are its limitations? Please also suggest alternative calling SDKs for iOS (Swift).
I have searched the whole web but could not find any documentation on passing XMPP IQ stanzas to WebRTC. I also see the XMPP Jingle class, but cannot find any documentation on integrating it.
Can someone help me set up a two-way video call using XMPP and WebRTC, ideally with a working Objective-C sample?
I have tried:
Checking https://github.com/YK-Unit/AppRTCDemo and many other GitHub projects.
Do you want to support live video streaming across different platforms?
I have had very good experience with the https://www.nanocosmos.de/ library.
It supports iOS, Android, and the web, and provides demo applications and a trial period.
About documentation:
XMPP Framework: https://github.com/robbiehanson/XMPPFramework/wiki/IntroToFramework
It is very well documented.
WebRTC native code: https://webrtc.org/native-code/ios/
It contains all the information you need about WebRTC.
I am trying to share data on QQ International and Weibo, but I have not found a good way to post data to these messengers. I have tried a lot to share data on Weibo and QQ International, without results.
Please point me in the right direction: what is a good way to share data on both of these messengers?
You might want to look at the SDKs they offer. Third-party service providers encapsulate their services on platforms such as the iPhone in dedicated SDKs (software development kits) so you can easily talk to their servers.
QQ iOS SDK
Weibo iOS SDK
Try the MonkeyKing library.
MonkeyKing helps you post messages to Chinese social networks without their buggy SDKs.
MonkeyKing uses the same analysis process as openshare and supports sharing text, URLs, images, audio, and video to WeChat, QQ, or Weibo. MonkeyKing can also post messages to Weibo via a web page. (Note that audio and video are specific to WeChat and QQ.)
One more thing: MonkeyKing supports OAuth.
I apologise if the title of my question is confusing. To avoid that, I will explain my purpose in detail.
We are currently developing our own WiFi speaker, which is built on a MIPS architecture. The speaker comes with an app that is used to manage it. One of the features we would like to include in the app is accessing Spotify content and playing it on the speakers.
Unfortunately, after going through the iOS SDK documentation and running some tests on the Web API console provided by Spotify, I noticed that Spotify does not allow developers to directly get the URL of a song, except for preview purposes. I also wasn't able to find any way to get the raw bytes of the music streamed from the server. Every piece of content comes with a corresponding URI that is used for requests.
For the device (WiFi speaker) part, we recently contacted Spotify to ask for an SDK that we could use for development. However, Spotify told us that they have SDKs for the x86 and ARM architectures only; they don't have one for MIPS.
Now, here are my questions:
Is there any way for me to push music from an app to the WiFi speakers without having to use an SDK (for the backend device)?
If Spotify can provide an SDK for our device, then how can we integrate the SDK with our platform?
I'll explain my second question for clarity. Android and iOS, for instance, are popular platforms widely used on mobile devices, so if Spotify provides SDKs for those two operating systems, apps can use default system frameworks to play the content (in iOS, the AVFoundation framework). However, if Spotify were able to provide the SDK that we need, how would we be able to integrate it with our own platform?
I will answer your question no. 1:
You should be able to push music from an app using a buffer that you read with Core Audio and forward to a device of your choice. I think what you are looking for can be found in CocoaLibSpotify.
I can't use a native app on iOS; it has to be in-browser JavaScript, and it has to be video chat.
Can I support this with QuickBlox? I know that WebRTC is not currently available on iOS.
I am sure this used to be a supported case before the move to WebRTC.
Does the deprecated API offer this? Do we know when we can expect WebRTC to be supported in the browser on iOS?
At the time, the WebRTC JavaScript API was unfortunately not available in Safari on iOS, and since Chrome and Opera on iOS are wrappers around Apple's WebKit, they were subject to the same restriction. Safari gained WebRTC support starting with iOS 11, so in-browser video chat is now possible there.
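Because iOS browser support has shifted over the years, it is safer to feature-detect WebRTC at runtime than to assume it from the browser name. Here is a small sketch of such a check; it takes a window-like object as a parameter (rather than reading the global `window` directly) purely so the logic is easy to test.

```javascript
// Returns true if the environment exposes the pieces a browser video chat
// needs: RTCPeerConnection plus navigator.mediaDevices.getUserMedia.
function hasWebRTC(win) {
  return Boolean(
    win.RTCPeerConnection &&
    win.navigator &&
    win.navigator.mediaDevices &&
    typeof win.navigator.mediaDevices.getUserMedia === "function"
  );
}
```

In a page you would call `hasWebRTC(window)` before initializing the chat, and show a fallback message (e.g. "please open this page in Safari / update iOS") when it returns false.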