Remote Audio not connecting: iOS, PJSIP 2.6, CallKit, PJSUA2

I am updating an existing iOS VOIP application to use CallKit with PJSIP 2.6 and PJSUA2.
After some effort, the CallKit implementation seems to be working as expected. Incoming calls can be accepted or declined, and if accepted, will be connected and controlled with an in-app active call view controller.
The audio, however, does not appear to be properly connected at the pjsip end. There is no audio coming in from, or going out to, the remote caller; the microphone audio appears to be routed back to the iPhone speaker.
The SIP audio ports should be connected in the callback onCallMediaState:
virtual void onCallMediaState(OnCallMediaStateParam &prm) {
    CallInfo ci = getInfo();
    AudioMedia *audio_media = NULL;
    for (unsigned i = 0; i < ci.media.size(); i++) {
        // Only hook up streams that are active (or held by the remote side).
        if (ci.media[i].type == PJMEDIA_TYPE_AUDIO &&
            (ci.media[i].status == PJSUA_CALL_MEDIA_ACTIVE ||
             ci.media[i].status == PJSUA_CALL_MEDIA_REMOTE_HOLD)) {
            try {
                audio_media = static_cast<AudioMedia *>(getMedia(i));
                if (audio_media != NULL) {
                    // Microphone -> call, and call -> speaker.
                    Endpoint::instance().audDevManager().getCaptureDevMedia().startTransmit(*audio_media);
                    audio_media->startTransmit(Endpoint::instance().audDevManager().getPlaybackDevMedia());
                }
            } catch (Error &err) {
                // PJSUA2 throws pj::Error, which does not derive from
                // std::exception; catching the wrong type lets it escape.
                continue;
            }
        }
    }
}
As described in ticket #1941 (https://trac.pjsip.org/repos/ticket/1941), I set the null audio device using:
ep->audDevManager().setNullDev();
immediately after initializing the Endpoint (ep->libInit(epConfig);), and then I attempt to set the real devices in CXProvider's didActivate callback, like this:
-(void) setSipSoundDevices {
    int captDev, playDev;
    // Ask pjsua which capture/playback devices are currently selected.
    pjsua_get_snd_dev(&captDev, &playDev);
    Endpoint::instance().audDevManager().setPlaybackDev(playDev);
    Endpoint::instance().audDevManager().setCaptureDev(captDev);
}
pjsua_get_snd_dev(&captDev, &playDev) returns -99 for both devices (pjsua's internal id for the null sound device), and the audio does not connect.
My question is this. How can I properly hook up the remote audio sources or ports, on an incoming call using PJSIP 2.6 and CallKit?
Might 2.5.5 work better in this regard?
Any insights are appreciated.

By and by I got the incoming call audio working properly. The crux of the matter is that even though the documentation from both Apple and PJSIP says the audio has to be handled on the iOS side, you still have to set the SIP sound devices in the SIP layer, inside the provider delegate's didActivate and didDeactivate callbacks. Because I use the PJSUA C++ layer, I had to drill down through the Objective-C++ bridging layer to provide this functionality, i.e.:
-(void) activateSipSoundDevices {
    // CallKit has activated the audio session; give pjsua the default devices.
    pj_status_t status = pjsua_set_snd_dev(0, 0);
}

-(void) deactivateSipSoundDevices {
    // Hand the audio hardware back to iOS via the null sound device.
    pj_status_t status = pjsua_set_null_snd_dev();
}
When initializing the SIP account, be sure to set the null sound device, like:
ep->audDevManager().setNullDev();
Hope this helps.
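For reference, here is a minimal sketch (Swift) of the provider delegate side. It assumes a hypothetical SipAudioBridge Objective-C++ wrapper that exposes the two methods above; the wrapper name is illustrative, not part of PJSIP or CallKit:

import CallKit
import AVFoundation

class ProviderDelegate: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {
        // End any ongoing calls here.
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // iOS has activated the audio session: now it is safe to give
        // pjsua the real sound devices.
        SipAudioBridge.shared().activateSipSoundDevices()
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // Return pjsua to the null sound device until the next call.
        SipAudioBridge.shared().deactivateSipSoundDevices()
    }
}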

Related

no sound when using pjsip

I've got a problem with pjsip. I'm trying to make an outgoing call with pjsua_call_make_call. It works, but when I answer the call on another device, I can't hear any sound. However, I can see the iPhone icon indicating that the microphone is in use. Has anybody come across this issue?
I am having a similar issue. I place an outbound call and can hear the audio on the answering device when I pick up the call, but I can't hear any audio on the device that uses pjsip to make the call.
If your audio capture doesn't seem to be working, make sure you have microphone permission, and you have to manually call pjsua_set_snd_dev() when you connect. There is some additional troubleshooting advice here: https://trac.pjsip.org/repos/wiki/Getting-Started/iPhone#Commonproblems
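For the permission part, a minimal sketch (Swift) of requesting microphone access up front; the function name is illustrative, and the app also needs an NSMicrophoneUsageDescription entry in Info.plist:

import AVFoundation

func ensureMicrophoneAccess() {
    AVAudioSession.sharedInstance().requestRecordPermission { granted in
        if granted {
            // Safe to start the sound device now, e.g. pjsua_set_snd_dev(0, 0).
        } else {
            // Without this permission, capture silently produces no audio.
        }
    }
}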
I've no experience with iOS, but I suppose you should connect the audio stream to a device in the on_call_media_state callback (link). Look at this minimal example from a desktop app:
pjsua_call_info ci;
pjsua_call_get_info(call_id, &ci);
for (unsigned i = 0; i < ci.media_cnt; i++) {
    if (ci.media[i].type == PJMEDIA_TYPE_AUDIO &&
        ci.media[i].status == PJSUA_CALL_MEDIA_ACTIVE) {
        // Connect this stream's conference slot to the sound device (slot 0),
        // in both directions.
        pjsua_conf_connect(ci.media[i].stream.aud.conf_slot, 0);
        pjsua_conf_connect(0, ci.media[i].stream.aud.conf_slot);
    }
}
Edit:
iOS code for default audio stream in call:
var callinfo: pjsua_call_info = pjsua_call_info()
pjsua_call_get_info(call_id, &callinfo)
if (callinfo.media_status == PJSUA_CALL_MEDIA_ACTIVE) {
    pjsua_conf_connect(callinfo.conf_slot, 0)
    pjsua_conf_connect(0, callinfo.conf_slot)
}

How to make audio/video calls and get the call type in on_incoming_call function in PJSIP in iOS?

I built the PJSIP library with PJSUA_HAS_VIDEO set to 1. I want to add an option to make audio-only calls. I tried
pjsua_call_setting opt;
pjsua_call_setting_default(&opt);
opt.flag = PJSUA_CALL_INCLUDE_DISABLED_MEDIA;
opt.vid_cnt = 0;
opt.aud_cnt = 1;
pj_status_t status = pjsua_call_make_call((pjsua_acc_id)[self identifier], &uri, &opt, NULL, NULL, &callIdentifier);
At the receiving end, in on_incoming_call() function, I tried
if (callInfo.rem_offerer && callInfo.rem_vid_cnt == 1)
{
    call.hasVideo = YES;
} else {
    call.hasVideo = NO;
}
But rem_vid_cnt is always 1.
How can I set the call type when making a call and detect it correctly at the receiving end? I also want to set CallKit's hasVideo field correctly at the receiving end.
Thanks in advance.
At the app end your code is correct.
You need to disable video from the server side as well.
This is a two-way negotiation: on the caller side set vid_cnt = 0 in the call setting when you initiate the call (rem_vid_cnt itself is read-only), and the receiver will then see rem_vid_cnt as 0.
Hope this will help you :)
/** Number of video streams offered by remote */
unsigned rem_vid_cnt;
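To feed this into CallKit at the receiving end, a minimal sketch (Swift), assuming the call has already been assigned a CallKit UUID; reportNewIncomingCall is the standard CXProvider API:

import CallKit

func reportIncomingCall(provider: CXProvider, uuid: UUID, callInfo: pjsua_call_info, handle: String) {
    let update = CXCallUpdate()
    update.remoteHandle = CXHandle(type: .generic, value: handle)
    // Only advertise video when the remote offer actually contains a video stream.
    update.hasVideo = callInfo.rem_offerer != 0 && callInfo.rem_vid_cnt > 0
    provider.reportNewIncomingCall(with: uuid, update: update) { error in
        if let error = error {
            // The system refused the call (e.g. blocked caller, Do Not Disturb).
            print("reportNewIncomingCall failed: \(error)")
        }
    }
}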

Use Bluetooth LE while app is in background

I am building an iOS app with Xamarin, with this BLE plugin:
https://github.com/aritchie/bluetoothle
I'm just broadcasting a UUID via BLE, and it works. Here is my code:
var data = new Plugin.BluetoothLE.Server.AdvertisementData
{
    LocalName = "MyServer",
};
data.ServiceUuids.Add(new Guid("MY_UUID_HERE"));
await this.server.Start(data);
The only problem is that it stops broadcasting once I put the app in the background, and resumes when I open the app again.
How can I let it continue to broadcast once it's in the background? I read the documentation here:
https://developer.apple.com/library/content/documentation/NetworkingInternetWeb/Conceptual/CoreBluetooth_concepts/CoreBluetoothBackgroundProcessingForIOSApps/PerformingTasksWhileYourAppIsInTheBackground.html
And it says that I have to use the CBCentralManager class to obtain the preservation and restoration feature (so I can keep broadcasting the UUID at all times), but I'm having a hard time translating this to Xamarin/C#.
EDIT
After researching some more, I read that I need to create an instance of CBCentralManager and implement WillRestoreState in the delegate. I did this in the AppDelegate:
[Register("AppDelegate")]
public class AppDelegate : MvxApplicationDelegate, ICBCentralManagerDelegate
{
private IGattServer server = CrossBleAdapter.Current.CreateGattServer();
private CBCentralManager cbCentralManager;
public override bool FinishedLaunching(UIApplication application, NSDictionary launchOptions)
{
// irrelevant code...
this.Ble();
return true;
}
private async Task Ble()
{
try
{
await Task.Delay(5000); // wait for it to finish initializing so I can access BLE (it crashes otherwise)
var options = new CBCentralInitOptions();
options.RestoreIdentifier = "myRestoreIndentifier";
this.cbCentralManager = new CBCentralManager(this,null,options);
var data = new Plugin.BluetoothLE.Server.AdvertisementData
{
LocalName = "MyServer",
};
data.ServiceUuids.Add(new Guid("MY_UUID_HERE"));
await this.server.Start(data);
}
catch (Exception e)
{
}
}
public void UpdatedState(CBCentralManager central)
{
//throw new NotImplementedException();
}
[Export("centralManager:willRestoreState:")]
public void WillRestoreState(CBCentralManager central, NSDictionary dict)
{
//never gets called
}
But it didn't make a difference for me, and the WillRestoreState method never gets called... I don't mind using a different plugin/library at this point...
EDIT 2
I just realized that the app is still broadcasting while it is in the background; I just don't see the service UUID anymore (in the web portal of the beacon I'm testing with), only the phone's identifier.
After doing tons of research, I found that it is simply an iOS restriction: you cannot broadcast the UUID of a BLE service while your app is in the background. Background work is very restrictive on iOS.
EDIT to include Paulw11 comment (which is true):
You can advertise a service, but it is advertised in a way that only another iOS device that is specifically scanning for that service UUID can see.
Although you cannot broadcast the UUID of a BLE service while your iOS app is in the background, anyone trying to do something similar should look into iBeacon, Apple's beacon protocol, which gets OS-level support for background region monitoring.
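For reference, a minimal sketch (Swift) of the monitoring side, which is where iBeacon's background support lives; region monitoring keeps firing while the scanning app is backgrounded (advertising a beacon still requires the foreground). The UUID and identifier are placeholders:

import CoreLocation

class BeaconMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Placeholder UUID; use the one your beacons actually advertise.
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "my.beacon.region")

    override init() {
        super.init()
        manager.delegate = self
        manager.requestAlwaysAuthorization() // required for background monitoring
        manager.startMonitoring(for: region)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // Called even when the app is in the background.
    }
}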

AVAudioSession: Some Bluetooth devices are not working properly on my App

I'm developing a Swift audio/video and text chat iOS app using AVAudioSession.
Whenever I select certain Bluetooth devices, the sound played on the device is not the app's audio stream; it plays only the system sound sent by the text chat library when messages are sent/received. It doesn't happen on all Bluetooth devices; on some of them everything works fine. With the built-in mic and speaker the app works fine too.
Here are the most important methods from my class to manage the device:
class MyAudioSession
{
    private var mAudioSession: AVAudioSession;

    init!()
    {
        self.mAudioSession = AVAudioSession.sharedInstance();
        do {
            try self.mAudioSession.setActive(false);
            try self.mAudioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: .AllowBluetooth);
            try self.mAudioSession.setMode(AVAudioSessionModeVideoChat);
            try self.mAudioSession.setActive(true);
        }
        catch {
            return nil;
        }
    }

    func switchToDevice(device: AVAudioSessionPortDescription!) -> Bool
    {
        var ret = false;
        if (device != nil) {
            do {
                try self.mAudioSession.setPreferredInput(device);
                ret = true;
            }
            catch {
                self.logSwitch(device, error: error);
            }
        }
        return ret;
    }
}
I'd like to understand why my app is not working properly on just SOME Bluetooth devices; these same devices work properly with other apps on my phone.
I did another test: I switched all of this to MPVolumeView, and exactly the same issue occurred, so the problem seems to be in the audio player.
Could anybody give me a suggestion to fix this?
Thanks.
Jorg,
While this might not be the best answer, I have been able to overcome the weird Bluetooth issues. My problem seems similar to yours, as I too was using:
AVAudioSessionCategoryPlayAndRecord
This was causing issues for me on some Bluetooth devices (not all, but some).
What I wound up doing was setting the category to:
AVAudioSessionCategoryPlayback
Then whenever I needed to record I would switch the category over to:
AVAudioSessionCategoryRecord
and back to Playback after completing my recording.
This was the only way at this time I could get a consistent result when switching between the different outputs (Speaker, Headphones, Bluetooth).
Hope that helps some. I'm guessing this is a bug in AVAudioSessionCategoryPlayAndRecord.
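A minimal sketch of that workaround (Swift, using the same pre-iOS-10 constants as the code above); the function names are illustrative:

let session = AVAudioSession.sharedInstance()

func enterPlaybackMode() throws {
    // Default state: playback only, which behaved consistently over Bluetooth.
    try session.setCategory(AVAudioSessionCategoryPlayback)
    try session.setActive(true)
}

func beginRecording() throws {
    // Switch to record-only while capturing...
    try session.setCategory(AVAudioSessionCategoryRecord)
}

func endRecording() throws {
    // ...and back to playback when done.
    try session.setCategory(AVAudioSessionCategoryPlayback)
}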

AudioToolbox MusicPlayer change program has no effect

MIDI noob in training here...
I have been using MusicPlayer/MusicSequence/MusicTrack to play MIDI notes on devices running iOS. The notes are playing fine. I am struggling to change the instrument being played. As far as I can figure this is how to do it:
-(void) setInstrument:(MIDIInstruments) program channel:(int) channel MusicTrack:(MusicTrack*) track time:(float) time {
    if (channel < 0 || channel > 15 || program >= MIDI_INSTRUMENT_COUNT || time < 0) {
        return;
    }
    // 0xC0 | channel is the MIDI program change status byte.
    MIDIChannelMessage programChange = { ((UInt8)0xC) << 4 | ((UInt8)channel), ((UInt8)program), 0, 0 };
    OSStatus result = MusicTrackNewMIDIChannelEvent(*track, time, &programChange);
    if (result != noErr) {
        [NSException raise:@"Set Instrument" format:@"Failed to set instrument error: %@", [NSError errorWithDomain:NSOSStatusErrorDomain code:result userInfo:nil]];
    }
}
In this case channel is 0 or 1; I tried several instruments throughout the range of valid instrument enumerations; the time is 0.0; and the MusicTrack is valid and has ~30 seconds of note events. The call to set the channel event passes back noErr. I am stumped... Anyone?
I had read in other posts that I would be able to generate MIDI using MusicPlayer and friends, and since it provides for program changes, I figured this was supported. After exhausting all theories, I turned to AUGraph. I added a *.sf2 file that I found online and instantiated an AUGraph, two AudioUnits, a MIDIEndpointRef, and a MIDIClientRef, following this tutorial.
It was in the endpoint callback that I had to turn notes on and off using MusicDeviceMIDIEvent on the sampler unit, which did allow for the program change, whereas before I was just loading note events into a MusicTrack and playing/stopping the MusicPlayer.
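For illustration, a minimal sketch (Swift) of sending a program change straight to a sampler unit with MusicDeviceMIDIEvent; samplerUnit is assumed to come from an already-initialized AUGraph with the SoundFont loaded:

import AudioToolbox

// Assumes samplerUnit is a kAudioUnitSubType_Sampler AudioUnit from an
// initialized AUGraph, with the *.sf2 file already loaded into it.
func sendProgramChange(samplerUnit: AudioUnit, channel: UInt32, program: UInt32) {
    // 0xC0 | channel is the MIDI program change status byte; program change
    // messages carry a single data byte, so data2 is 0.
    let status = 0xC0 | (channel & 0x0F)
    MusicDeviceMIDIEvent(samplerUnit, status, program, 0, 0)
}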
