I'm developing audio calling using WebRTC and Janus.
Below are my steps:
1. Create an offer on the peer connection
2. Set the local description on the peer connection
3. Send the offer message over the socket
4. Add local ICE candidates to the peer connection
5. Receive the answer message from the socket
6. Set the remote SDP on the peer connection
7. Receive the remote media stream
But even after receiving the stream, no audio is audible. A condensed sketch of these steps follows; I can post the full snippets on request. Please help.
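Roughly, those steps map to the GoogleWebRTC pod like this. This is only a sketch: delegate wiring and the exact Janus payloads are omitted, and signalingSocket is a placeholder name.

import WebRTC

// Sketch of the caller-side flow. `signalingSocket` is a placeholder for
// whatever transports the Janus messages; delegate wiring is omitted.
final class CallClient {
    private let factory = RTCPeerConnectionFactory()
    private var peerConnection: RTCPeerConnection?

    func makeOffer(delegate: RTCPeerConnectionDelegate) {
        let config = RTCConfiguration()
        config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]
        let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                              optionalConstraints: nil)
        peerConnection = factory.peerConnection(with: config,
                                                constraints: constraints,
                                                delegate: delegate)

        // Step 1: create the offer (audio only).
        let offerConstraints = RTCMediaConstraints(
            mandatoryConstraints: ["OfferToReceiveAudio": "true"],
            optionalConstraints: nil)
        peerConnection?.offer(for: offerConstraints) { [weak self] sdp, error in
            guard let self = self, let sdp = sdp, error == nil else { return }
            // Step 2: set the local description.
            self.peerConnection?.setLocalDescription(sdp) { _ in
                // Step 3: send the offer over the signalling socket.
                // signalingSocket.send(type: "offer", sdp: sdp.sdp)
            }
        }
    }

    // Step 6: when the remote answer arrives, set the remote SDP.
    func handleRemoteAnswer(sdpString: String) {
        let answer = RTCSessionDescription(type: .answer, sdp: sdpString)
        peerConnection?.setRemoteDescription(answer) { error in
            if let error = error { print("setRemoteDescription failed: \(error)") }
        }
    }
}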
Check if this helps. I just found something in Swift:
https://github.com/Igor-Khomich/JanusAudioStreamPlayer
For an iOS application, I need to know when the remote party has stopped its video during an existing call. Is there any API or event triggered by the pjsip library for this?
Thanks in advance for your help.
Method reference for pjsua_call_vid_stream_is_running:
Determine if video stream for the specified call is currently running (i.e. has been created, started, and not being paused) for the specified direction.
https://www.pjsip.org/pjsip/docs/html/group__PJSUA__LIB__CALL.htm#ga23c0bd5a335b5fa0d02404cd03ca0d5e
Using the PJMEDIA_DIR_DECODING direction, we can check whether the incoming video stream is currently running:
/* PJ_TRUE while the incoming (decoding) video stream is running. */
pj_bool_t video_active =
    pjsua_call_vid_stream_is_running(call_id, med_idx, PJMEDIA_DIR_DECODING);
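If the app itself is in Swift, a thin wrapper over that call is enough. A sketch, assuming pjsua is exposed through a bridging header (the wrapper name is mine); re-check it whenever the call's media state changes:

// Hypothetical wrapper, assuming the pjsua headers are imported through
// the app's bridging header.
func isRemoteVideoRunning(callId: pjsua_call_id, mediaIndex: Int32) -> Bool {
    // PJMEDIA_DIR_DECODING = the stream we are receiving from the remote party.
    return pjsua_call_vid_stream_is_running(callId, mediaIndex, PJMEDIA_DIR_DECODING) != 0
}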
I am creating a voice-only (no video) chat application. I have created my own node.js/socket.io based server for signaling.
For WebRTC, I am using the following pod: https://cocoapods.org/pods/WebRTC
I have been successful in creating the peer connection, adding the local stream, setting the local/remote SDP, and sending/receiving ICE candidates. The "didAddStream" delegate method is also called successfully with audio tracks, but I am stuck here: I don't know what I should do with the audio track. What should be the next step? How do I send and receive audio on both sides?
Also, if I integrate CallKit, what changes do I need to make?
I got stuck on this one too. You have to retain the RTCMediaStream object in order for the audio to play. You don't need to do anything with the RTCAudioTrack; it will play automatically. I simply assign it to a property so it gets retained. See my example here: https://github.com/redfearnk/WebRTCVideoChat/blob/master/WebRTCVideoChat/WebRTCClient.swift#L143
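A minimal sketch of that, using the GoogleWebRTC pod's Swift delegate signatures (the unrelated delegate requirements are stubbed out):

import WebRTC

final class AudioCallClient: NSObject, RTCPeerConnectionDelegate {
    // Strong reference: if the stream is deallocated, remote audio goes silent.
    private var remoteStream: RTCMediaStream?

    func peerConnection(_ peerConnection: RTCPeerConnection, didAdd stream: RTCMediaStream) {
        // Retaining the stream is enough; its RTCAudioTrack plays automatically.
        remoteStream = stream
    }

    // Remaining delegate requirements, stubbed for brevity.
    func peerConnection(_ peerConnection: RTCPeerConnection, didChange stateChanged: RTCSignalingState) {}
    func peerConnection(_ peerConnection: RTCPeerConnection, didRemove stream: RTCMediaStream) {}
    func peerConnectionShouldNegotiate(_ peerConnection: RTCPeerConnection) {}
    func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceConnectionState) {}
    func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceGatheringState) {}
    func peerConnection(_ peerConnection: RTCPeerConnection, didGenerate candidate: RTCIceCandidate) {}
    func peerConnection(_ peerConnection: RTCPeerConnection, didRemove candidates: [RTCIceCandidate]) {}
    func peerConnection(_ peerConnection: RTCPeerConnection, didOpen dataChannel: RTCDataChannel) {}
}

As for CallKit, the main change I know of is to start WebRTC audio only once the CXProviderDelegate reports that the audio session is active (provider(_:didActivate:)).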
I want to get notified whenever my iOS device connects to or disconnects from the internet while my app is running. This is possible on Android by using a BroadcastReceiver. Is there any BroadcastReceiver-like component in iOS through which I can be notified of changes in the internet connection?
I have searched the internet a lot but have not found a solution.
You can refer to the Reachability API. You might want to run this on a background queue and post an alert whenever network connectivity changes.
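For iOS 12 and later there is also NWPathMonitor in the Network framework, which fills the same role as the classic Reachability samples; a minimal sketch:

import Foundation
import Network

// NWPathMonitor invokes the handler on every connectivity change,
// similar to Android's broadcast-based approach.
let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    if path.status == .satisfied {
        print("Connected (expensive link: \(path.isExpensive))")
    } else {
        print("No internet connection")
    }
}
// Run it on a background queue, as suggested above.
monitor.start(queue: DispatchQueue(label: "network.monitor"))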
I am trying to write an iOS application that connects to an OBD-II interface over WiFi (specifically the OBDLink MX WiFi scan tool). I have written some basic socket code and I am able to open a socket to 192.168.0.10:35000. I receive the NSStreamEventOpenCompleted event for both the input and output streams.
The first event that fires shortly after is NSStreamEventHasBytesAvailable. I attempt to read the stream, but the length comes back 0. My question is: what is the flow of execution for communicating with these devices? I have tried to issue an ATZ\r command, but nothing happens (no stream events fire).
How do I know if I am connected and the OBD-II interface is ready?
The usual command terminator is \r\n, so first try sending ATZ with that terminator. Only send after you have received the NSStreamEventHasSpaceAvailable event from the NSOutputStream.
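A minimal sketch of that ordering, assuming the streams are already opened to 192.168.0.10:35000 and this object is set as both streams' delegate:

import Foundation

final class OBDConnection: NSObject, StreamDelegate {
    var inputStream: InputStream?
    var outputStream: OutputStream?
    private var didSendReset = false

    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        switch eventCode {
        case .hasSpaceAvailable:
            // Write only after this event has fired, then issue the reset command.
            if !didSendReset, aStream == outputStream {
                let command = Array("ATZ\r\n".utf8)
                _ = outputStream?.write(command, maxLength: command.count)
                didSendReset = true
            }
        case .hasBytesAvailable:
            guard let input = aStream as? InputStream else { return }
            var buffer = [UInt8](repeating: 0, count: 1024)
            let count = input.read(&buffer, maxLength: buffer.count)
            if count > 0 {
                // An ELM327-style reply to ATZ ends with a ">" prompt.
                let reply = String(bytes: buffer[0..<count], encoding: .ascii) ?? ""
                print("Received: \(reply)")
            }
        default:
            break
        }
    }
}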
Another alternative for communicating with this device would be the Car Diagnostics API; access to the API can be found at
https://github.com/HellaVentures/Car-Diagnostic-API
Is there a way to take a device's destination endpoint and monitor what messages are being sent to it via CoreMIDI?
There is a very good free MIDI Monitor available.
I have also built a MIDI monitor with Xcode for iMac and iPhone using CoreMIDI. The only problem I have is with MIDIThruConnectionCreate for MIDI thru. Of course I could send the received MIDI data packets to MIDI out myself, but that is not the right way.
MIDIThruConnectionCreate is exactly for this. See MIDIThruConnection.h in CoreMIDI.framework for more details.
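A minimal sketch of creating a non-persistent thru connection, assuming source and destination are MIDIEndpointRef values you have already looked up:

import CoreMIDI

// Wires `source` directly to `destination` inside the MIDI server, so the
// data is forwarded without round-tripping through the app.
func makeThruConnection(source: MIDIEndpointRef,
                        destination: MIDIEndpointRef) -> MIDIThruConnectionRef? {
    var params = MIDIThruConnectionParams()
    MIDIThruConnectionParamsInitialize(&params)   // fill in the defaults
    params.numSources = 1
    params.sources.0.endpointRef = source
    params.numDestinations = 1
    params.destinations.0.endpointRef = destination

    let paramsData = withUnsafeBytes(of: &params) { Data($0) }
    var connection = MIDIThruConnectionRef()
    // A nil owner ID makes the connection non-persistent.
    let status = MIDIThruConnectionCreate(nil, paramsData as CFData, &connection)
    return status == noErr ? connection : nil
}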