If the broadcaster app (Android) and the client apps (iOS and Android) are connected to Wi-Fi on the same network, everything works fine. But when the broadcaster app is connected over mobile data, the iOS client app shows a black screen, while the Android client still works fine. I searched for workarounds and the suggestions were to add STUN and TURN servers to my peer connection, but I had already added them. It seems not to work when the communication is established across different network connections. This is how I set up my RTCPeerConnection:
var rtcIceServers: [RTCIceServer] = []
rtcIceServers.append(RTCIceServer(urlStrings: [turnUrl], username: "*****", credential: "*********"))
rtcIceServers.append(RTCIceServer(urlStrings: [stunUrl]))
let rtcConf = RTCConfiguration()
rtcConf.iceServers = rtcIceServers
rtcConf.tcpCandidatePolicy = .disabled
rtcConf.bundlePolicy = RTCBundlePolicy.maxBundle
rtcConf.rtcpMuxPolicy = RTCRtcpMuxPolicy.require
rtcConf.continualGatheringPolicy = RTCContinualGatheringPolicy.gatherContinually
rtcConf.keyType = .ECDSA
let mediaConstraints = RTCMediaConstraints.init(mandatoryConstraints: mandatoryConstraints, optionalConstraints: nil)
let pc = self.peerConnectionFactory.peerConnection(with: rtcConf, constraints: mediaConstraints, delegate: self)
The client app only needs to receive the video track; no audio is added in the mandatory constraints. Thank you.
After a few days of debugging and comparing my setup to other sources, I resolved the issue by applying the right parameters to my RTCIceCandidate before adding it to my peer connection. I had mistakenly passed the candidate value to the sdpMid parameter and the id value to the sdp parameter. Since everything works when both clients are on the same network, I never suspected my parameters, not until I understood a bit more about why ICE servers are needed when clients are not on the same network. The RTCIceCandidate values (and the ICE servers) don't matter much when clients are on the same network, which is why it worked even with the wrong parameters applied. My bad...
RTCIceCandidate(sdp: id, sdpMLineIndex: label, sdpMid: candidate) // wrong
RTCIceCandidate(sdp: candidate, sdpMLineIndex: label, sdpMid: id) // correct
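For illustration, a minimal sketch of building the candidate from the signaling message with the parameters in the right order; the payload keys ("candidate", "label", "id") and the dictionary shape are just assumptions here, so adapt them to whatever your signaling format actually sends.
import WebRTC

// Hypothetical signaling payload, e.g. ["candidate": "...", "label": 0, "id": "0"].
// The key names are assumptions; only the argument order of RTCIceCandidate matters.
func addRemoteCandidate(_ message: [String: Any], to peerConnection: RTCPeerConnection) {
    guard let candidate = message["candidate"] as? String,
          let label = message["label"] as? Int,
          let id = message["id"] as? String else { return }

    // sdp <- candidate string, sdpMLineIndex <- label, sdpMid <- id
    let iceCandidate = RTCIceCandidate(sdp: candidate, sdpMLineIndex: Int32(label), sdpMid: id)
    peerConnection.add(iceCandidate)
}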
Hi, I am working on a video call solution using WebRTC directly. I have achieved a 1-1 video call using Firebase as the signaling service and the default Google ICE servers.
Core requirement: multiple users in a room using WebRTC, at least 4 users, with the default ICE/STUN servers available. I'm using the pod 'GoogleWebRTC'.
The issue comes when multiple users join the same room ID.
So, I am maintaining the peer connection reference like this:
var peerConnection: RTCPeerConnection! = nil
When a new user, i.e. a remote user, joins, I set its description as below:
self.peerConnection.setRemoteDescription(offer, completionHandler: { (error: Error?) in
    if error == nil {
        LOG("setRemoteDescription(offer) success")
        self.makeAnswer() // Create an answer if setRemoteDescription succeeds
    } else {
        LOG("setRemoteDescription(offer) ERROR: " + error.debugDescription)
    }
})
What I think is happening: when a third user joins, I again set the remote description with the code above, which makes the previously rendering video stop most of the time.
I looked for solutions and found that I need to maintain multiple peer connection references, but how? Any help with my requirement will be appreciated.
Just a clue or sample code would be really great.
In the case of a multi-user call you should have multiple peer connections, because it isn't possible to set different SDPs on one peer connection.
So you can use something like this
var peerConnectionMap = [String: RTCPeerConnection]()
Where the String key is some constant user id.
When a new user joins the room, you create a new peer connection and store it in this dictionary. Then you exchange SDPs as usual.
Don't forget that you should reuse the local audio and video tracks created when the first peer connection is created.
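As a rough sketch of what that can look like (everything here apart from the dictionary is an assumption: the factory, configuration, constraints and local stream are presumed to already exist on your signaling class, which also acts as the RTCPeerConnectionDelegate):
import WebRTC

// Inside your signaling/call class. peerConnectionFactory, rtcConfig,
// mediaConstraints and localStream are assumed to be created elsewhere.
var peerConnectionMap = [String: RTCPeerConnection]()

// Returns (or lazily creates) the peer connection for a given remote user.
func peerConnection(forUser userId: String) -> RTCPeerConnection {
    if let existing = peerConnectionMap[userId] {
        return existing
    }
    let pc = peerConnectionFactory.peerConnection(with: rtcConfig,
                                                  constraints: mediaConstraints,
                                                  delegate: self)
    // Reuse the local stream (audio/video tracks) captured for the first
    // connection; don't create a new capturer per peer.
    pc.add(localStream)
    peerConnectionMap[userId] = pc
    return pc
}
Offer/answer and ICE candidate exchange then happens per entry in this map, keyed by whatever user id your signaling layer provides.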
I want to check which Bluetooth Devices my iPhone is connected to. In order to do that, I use CBCentralManager.retrieveConnectedPeripherals() like this:
let connectedPerphs = centralManager.retrieveConnectedPeripherals(withServices: []);
My problem is that even if my iPhone is connected to a Bluetooth dongle (it explicitly says "Connected" in Settings), the list returned by retrieveConnectedPeripherals() is always empty. Am I using the method in a wrong way, or can it not be used to detect a Bluetooth connection such as the connection to my dongle? If the latter is the case, how can I detect that connection?
Let me be clear: centralManager.retrieveConnectedPeripherals always returns an empty or nil value if you don't pass any values for serviceUUIDs.
retrieveConnectedPeripherals(withServices:)
Returns a list of the peripherals (containing any of the specified
services) currently connected to the system.
serviceUUIDs:
A list of service UUIDs (represented by CBUUID objects).
Update:
Unfortunately this is the long way to do it. You can create an array of CBUUID statically and then pass it to the method. Please refer to the code below.
let aryUUID = ["1800","18811"]
var aryCBUUIDS = [CBUUID]()
for uuid in aryUUID{
let uuid = CBUUID(string: "1800")
aryCBUUIDS.append(uuid)
}
let connectedPerphs = centralManager.retrieveConnectedPeripherals(withServices: aryCBUUIDS)
List of available services
First, this works only with BLE devices, so if your dongle uses classic Bluetooth you will not get it from here. You could probably use the EAAccessoryManager connectedAccessories: [EAAccessory] property instead, but as far as I know your app must be MFi compliant for that.
That is why the method asks which services your devices expose, as a filter.
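For the classic Bluetooth case, a minimal sketch of listing connected MFi accessories (this assumes your accessory's protocol string is declared under UISupportedExternalAccessoryProtocols in your Info.plist):
import ExternalAccessory

// Lists MFi accessories currently connected over classic Bluetooth (or Lightning).
let accessories = EAAccessoryManager.shared().connectedAccessories
for accessory in accessories {
    print(accessory.name, accessory.protocolStrings)
}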
I'm stumped. iOS 11.4 (15F79), iPhone 6. I cannot get the app to ask for motion data. The Info.plist key has been set via the Xcode editor and double-checked by opening Info.plist in TextWrangler; I also deleted the key and re-saved it via TextWrangler.
<key>NSMotionUsageDescription</key>
<string>This app needs your phone's motion manager to update when the phone is tilted. Please allow this app to use your phone's tilt sensors.</string>
I have deleted and reinstalled the app about 10 times. I have restarted the phone 5 times. I have checked through Settings and my app does NOT show up in Privacy > Motion & Fitness or anywhere else in Settings. I am using a free developer account; maybe that has something to do with it?
I created a new Xcode game template and changed nothing apart from importing CoreMotion and adding this code.
**** Edited: sorry, I forgot to say that I had started the instance; I just forgot to put it here, in case someone thinks that's the problem ****
let motionManager = CMMotionManager()

override func didMove(to view: SKView) {
    motionManager.startDeviceMotionUpdates()
    if motionManager.isDeviceMotionActive == true {
        motionManager.accelerometerUpdateInterval = 0.2
        motionManager.startAccelerometerUpdates(to: OperationQueue.current!) { accelerometerData, error in
            guard let accelerometerData = accelerometerData else { return }
            let acceleration = accelerometerData.acceleration
            print(acceleration)
        }
    } else {
        print(CMMotionActivityManager.authorizationStatus().rawValue)
    }
}
which prints a 0 (an enum, case not determined) to the console.
In my actual app it was a 3 (the same enum, case denied).
As I've said, I have uninstalled and reinstalled, edited the plist via Xcode and TextWrangler (a text editor), tried different versions of the code above, tried the code in different places (in didMove(to:), at class level), and tried code from the Apple docs, etc. I haven't been asked the usage question and the app keeps crashing.
I have looked for ways to get the alert fired up, as in CLLocationManager.requestWhenInUseAuthorization(), but I cannot find a comparable CoreMotion version (I don't think there is one). I have created a new Swift file, imported Foundation and CoreMotion, and just put that code there, but still no alert asking for motion data.
I tried a single view app template instead of a game template, thinking that might be the issue. Nope.
What do I do?
Any help Appreciated. Thanks
You are confusing two related but different classes.
CMMotionManager gives access to accelerometer, magnetometer and gyroscope data. It does not require any user permission as this information is not considered privacy related.
In your else clause you are checking the authorisation status of CMMotionActivityManager. This object reports the device motion type (walking, running, driving). This information is considered privacy related and when you create an instance of this class and request data from it, the permissions alert is displayed.
The reason your else is being triggered is because you are checking isDeviceMotionActive; this will be false until you call startDeviceMotionUpdates, which you never do. Even if you used isAccelerometerActive you would have a problem because you call startAccelerometerUpdates in the if clause which will never be reached.
You probably meant to check isAccelerometerAvailable. If this returns false then there isn't much you can do; the device doesn't have an accelerometer.
Update
It doesn't make sense to check isDeviceMotionActive immediately after calling startDeviceMotionUpdates:
You know it's active; you just started it
I imagine the start up takes some time, so you could expect to get false if you check immediately.
Apple recommends that you do not have more than one observer in place for each motion device type, so the purpose of checking is...Active is to ensure you don't call start... again if you have already done so.
If you only want gyroscope data then you don't need to call startDeviceMotionUpdates at all.
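A minimal sketch of the pattern described above, checking availability rather than the "active" flag (the update interval and queue are just example choices):
import CoreMotion

let motionManager = CMMotionManager()

// Check availability (not "active"), then start accelerometer updates.
// No permission alert is involved for CMMotionManager.
func startAccelerometer() {
    guard motionManager.isAccelerometerAvailable else {
        print("No accelerometer on this device")
        return
    }
    motionManager.accelerometerUpdateInterval = 0.2
    motionManager.startAccelerometerUpdates(to: .main) { data, error in
        if let acceleration = data?.acceleration {
            print(acceleration.x, acceleration.y, acceleration.z)
        }
    }
}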
This query is regarding the PortAudio framework. A little background before I ask the question: I am working on an application in PortAudio to output audio through a multichannel (8-channel) device. However, the device I am using does not expose itself as a single 8-channel device; instead it shows up in my device list as 4 stereo devices. While searching for an approach to handle this, I learned that the WinMME host API in PortAudio supports multiple devices.
Now, I went through the appropriate header file ("pa_win_wmme.h") and followed the suggestions there. But I get the 'Invalid device' error (error number -9996) after calling Pa_OpenStream(). In the above-mentioned header file they have in fact specified the right parameter(s) to use when configuring multiple devices to avoid this error, but despite following them, I still get the error.
So I wanted to know if anybody has faced a similar issue and whether I have missed/wrongly configured anything.
I am pasting the required snippets of code below for reference:
PaStreamParameters outputParameters;
PaWinMmeStreamInfo wmmeStreamInfo;
PaWinMmeDeviceAndChannelCount wmmeDeviceAndNumChannels;
...
...
outputParameters.device = paUseHostApiSpecificDeviceSpecification;
outputParameters.channelCount = 8;
outputParameters.sampleFormat = paFloat32; /* 32 bit floating point processing */
outputParameters.hostApiSpecificStreamInfo = NULL;
wmmeStreamInfo.size = sizeof(PaWinMmeStreamInfo);
wmmeStreamInfo.hostApiType = paMME;
wmmeStreamInfo.version = 1;
wmmeStreamInfo.flags = paWinMmeUseMultipleDevices;
wmmeDeviceAndNumChannels.channelCount = 2;
wmmeDeviceAndNumChannels.device = 3;
wmmeStreamInfo.devices = &wmmeDeviceAndNumChannels;
wmmeStreamInfo.deviceCount = 4;
outputParameters.hostApiSpecificStreamInfo = &wmmeStreamInfo;
The device id = 3 was obtained through
Pa_GetHostApiInfo( Pa_HostApiTypeIdToHostApiIndex( paMME ) )->defaultOutputDevice
I hope I have made the query clear enough. Will be happy to provide more details if required.
Thanks.
I finally figured out the mistake :-)
The configuration for multiple devices must be made as an array. For instance, in the above case
wmmeDeviceAndNumChannels must be an array of 4, with each element's device field containing the device index of the corresponding stereo device. The channelCount of each element remains 2. outputParameters.channelCount still has to be the aggregate number of channels, i.e. 8. With this I was able to run the application with a single stream and, of course, without any errors related to an invalid device or an invalid number of channels. :-)
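For illustration, a sketch of that configuration; the four PaDeviceIndex arguments are placeholders for the actual indices of your four stereo devices, and suggestedLatency is just an example value.
/* Sketch of the multi-device WinMME configuration described above. */
#include "portaudio.h"
#include "pa_win_wmme.h"

PaStreamParameters outputParameters;
PaWinMmeStreamInfo wmmeStreamInfo;
PaWinMmeDeviceAndChannelCount wmmeDevices[4];

void configureMultiDeviceOutput(PaDeviceIndex devA, PaDeviceIndex devB,
                                PaDeviceIndex devC, PaDeviceIndex devD)
{
    PaDeviceIndex devs[4] = { devA, devB, devC, devD };
    for (int i = 0; i < 4; ++i) {
        wmmeDevices[i].device = devs[i];
        wmmeDevices[i].channelCount = 2;      /* each physical device is stereo */
    }

    wmmeStreamInfo.size = sizeof(PaWinMmeStreamInfo);
    wmmeStreamInfo.hostApiType = paMME;
    wmmeStreamInfo.version = 1;
    wmmeStreamInfo.flags = paWinMmeUseMultipleDevices;
    wmmeStreamInfo.devices = wmmeDevices;     /* array of 4 device/channel pairs */
    wmmeStreamInfo.deviceCount = 4;

    outputParameters.device = paUseHostApiSpecificDeviceSpecification;
    outputParameters.channelCount = 8;        /* aggregate channel count */
    outputParameters.sampleFormat = paFloat32;
    outputParameters.suggestedLatency = 0.1;  /* example value */
    outputParameters.hostApiSpecificStreamInfo = &wmmeStreamInfo;
}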
Thanks.
Based on the code pasted above, it looks like you are trying to call open on a single 8-channel device. Instead you will have to get the Pa index of all four devices and call open 4 times, once for each stereo device. You will then have 4 interleaved stereo streams to maintain. My guess is that changing channelCount = 8 to channelCount = 2 will allow the first stream to open.
I'm stuck trying to get the phone's own number and SIM ID (ICCID) using MonoTouch.
I tried:
var v = NSUserDefaults.StandardUserDefaults.ValueForKey((NSString)"SBFormattedPhoneNumber");
var t = NSUserDefaults.StandardUserDefaults.ValueForKey((NSString)"ICCID");
new UIAlertView("Ur phone Number", "" + v.ToString(), null, "Ok", null).Show();
new UIAlertView("Ur ICCID", "" + t.ToString(), null, "Ok", null).Show();
and all the other ValueFor* keys;
it always returns null or " ".
Tried on an iPhone 3G. Please help.
Apple does not want you to access this information, as it can easily be misused. Any application doing so is likely to be rejected from the App Store. See the comment (with more than 30 upvotes) on this answer.
Also note that your code above does not read from the SIM - it reads from the iTunes registration data, which does not have to be set to any value (i.e. you can't trust it).