Offer & Answer from the same iOS device in WebRTC - ios

In the normal setup, two devices connect to each other to communicate. Is it possible for a single iOS device to connect to itself using WebRTC servers (TURN, STUN)?
What I have done so far:
I initialized the RTCPeerConnectionFactory and set up the peer connections.
[_peerConnection offerForConstraints:constraints
                   completionHandler:^(RTCSessionDescription *sdp,
                                       NSError *error) {
    NSLog(@"My SDP is %@", sdp);
    NSLog(@"My Error is %@", error);
    remoteSDP = sdp;
    [self setLocalLocalDescription:sdp];
    [self setRemoteRemoteDescription:sdp];
}];
I received the SDP, which I set as the local description of _peerConnection (setLocalDescription) and as the remote description of _peerConnectionremote (setRemoteDescription).
I then prepared an answer from _peerConnectionremote and received its SDP, which I set as _peerConnectionremote's local description and _peerConnection's remote description.
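In code, that answer leg looks roughly like this (a sketch assuming the completion-handler variant of the API; error handling omitted):
[_peerConnectionremote answerForConstraints:constraints
                          completionHandler:^(RTCSessionDescription *sdp,
                                              NSError *error) {
    [_peerConnectionremote setLocalDescription:sdp
                             completionHandler:^(NSError *err) {}];
    [_peerConnection setRemoteDescription:sdp
                         completionHandler:^(NSError *err) {}];
}];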
I add ICE candidates in didGenerateIceCandidate:
if (peerConn == _peerConnection) {
    [_peerConnection addIceCandidate:candidate];
} else if (peerConn == _peerConnectionremote) {
    [_peerConnectionremote addIceCandidate:candidate];
}
I receive the remote media stream in
- (void)peerConnection:(RTCPeerConnection *)peerConn
          didAddStream:(RTCMediaStream *)stream
which I handle properly.
Finally, the connection reports RTCIceConnectionStateConnected, but nothing further happens.
Is it possible to open two streams at the same time from iOS and connect them via WebRTC?
If yes, how can I separate the sources?
RTCMediaStream *stream1 = [_factory mediaStreamWithStreamId:@"ARDAMS"];
RTCMediaStream *stream2 = [_factoryremote mediaStreamWithStreamId:@"ARDAMS"];
For now, it even stops the first stream.
Any suggestion or thoughts?
I am following this image as the architecture, which works fine for me when the other party is a remote device. But when both parties are the same device, skipping the SIGNALING SERVER, it fails.

Yes, you can do this. Are you retaining the media streams when they get added in the peerConnection:didAddStream: callback? Are you adding the video tracks to an RTCVideoRenderer? The way you differentiate the two sources is by the peer connection associated with the stream when peerConnection:didAddStream: gets called, similar to what you are doing with the ICE candidates.
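For illustration, a minimal sketch of that callback (the stream properties and renderer names here are hypothetical, not from your code):
- (void)peerConnection:(RTCPeerConnection *)peerConn
          didAddStream:(RTCMediaStream *)stream {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (peerConn == _peerConnection) {
            self.remoteStreamA = stream; // retain the stream, or it is released
            [stream.videoTracks.firstObject addRenderer:self.rendererA];
        } else if (peerConn == _peerConnectionremote) {
            self.remoteStreamB = stream;
            [stream.videoTracks.firstObject addRenderer:self.rendererB];
        }
    });
}
Each renderer (an RTCEAGLVideoView, for example) then displays the track from its own peer connection.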

Related

iOS - Get device's WiFi IP Address

I need to get the device IP of the WiFi interface.
According to several Stack Overflow threads, we could assume that "en0" corresponds to the Wi-Fi interface name:
https://stackoverflow.com/a/30754194/12866797
However, this feels like a convention, not a standard.
Is there any consistent/standard way to retrieve the WiFi interface or the device's WiFi IP address using the iOS SDK?
It would be nice if the API were available starting from iOS 11, but I won't be picky.
My best attempt was to use NWPathMonitor (iOS 12+) and monitor network changes corresponding to WiFi interfaces (NWInterface.InterfaceType.wifi):
- (void)MonitorWifiInterface
{
    m_pathMonitor = nw_path_monitor_create_with_type(nw_interface_type_wifi);
    nw_path_monitor_set_update_handler(m_pathMonitor, ^(nw_path_t _Nonnull path) {
        NSLog(@"[NetInterfaceUtilies] Network path changed");
        nw_path_enumerate_interfaces(path, ^bool(nw_interface_t _Nonnull itf) {
            NSLog(@"[NetInterfaceUtilies] Name : %s , Index : %u",
                  nw_interface_get_name(itf), nw_interface_get_index(itf));
            return true; // In order to continue the enumeration
        });
    });
    nw_path_monitor_start(m_pathMonitor);
}
But I am not happy with it, for the following reasons:
NWPathMonitor is meant for monitoring network changes: I haven't managed to get network information on demand, only when WiFi has been switched on or off.
I only managed to get the network interface name. I could combine this with the interfaces returned by getifaddrs() to deduce the correct interface and IP (see the sketch below), but that is only a step forward.
It's "only" available starting from iOS 12.

DJI Onboard SDK to Mobile SDK communication - DJI sample not working (Matrice 600 - Raspberry Pi 3 - iOS)

I want to send commands from a mobile device running the Mobile SDK to the onboard computer (Raspberry Pi 3) on the drone (Matrice 600) running the Onboard SDK. To that end, I am trying to make the DJI sample work. I followed the guide (https://developer.dji.com/onboard-sdk/documentation/guides/component-guide-mobile-communication.html) and the link (https://developer.dji.com/onboard-sdk/documentation/sample-doc/msdk-comm.html). I used the code from https://github.com/dji-sdk/Onboard-SDK/tree/3.8/sample/linux/mobile for the Onboard SDK and the iOS sample https://github.com/dji-sdk/Mobile-OSDK-iOS-App.
The mobile app says it is sending commands, but the onboard program does not recognize any, and the log does not display any errors.
I did not change anything in the code besides putting my API key in the variable. It says the product has been registered correctly. Also, the UART connection between the drone and the Raspberry Pi seems fine, since every other example I tried works. The connection between the mobile device and the drone also works, since other apps like DJI Go are able to send commands to the drone, which are then executed.
I am using the simulator in the DJI Assistant Windows program.
I feel like the problem is in the app itself, because I added logging at different sections of the code, and it seems like code that should be executed is not. I have not developed iOS apps before, so I do not really know what is going on exactly, but maybe you can help me out.
For example, the code inside here is not executed. Does that mean it did not manage to finish "Sending"?
From "Mobile-OSDK-iOS-App/MOS/Network/MOSProductCommunicationManager.m"
[fc sendDataToOnboardSDKDevice:data withCompletion:^(NSError * _Nullable error) {
    if (error) {
        // Handle error locally
    } else {
        NSString *key = [self commandIDStringKeyFromData:data];
        [self.sentCmds setObject:ackBlock forKey:key];
    }
    completion(error);
}];
Also, in the log there are no follow-up entries like these:
From "Mobile-OSDK-iOS-App/MOS/ViewController/MOSJSONDynamicController.m"
[self.appDelegate.model addLog:[NSString stringWithFormat:@"Sending CmdID %@ with %ld Arguments", cmdId, (unsigned long)arguments.count]];
weakCell.commandResultLabel.text = @"Sending...";
[self.appDelegate.productCommunicationManager sendData:data
                                        withCompletion:^(NSError * _Nullable error) {
    [self.appDelegate.model addLog:[NSString stringWithFormat:@"Sent CmdID %@", cmdId]];
    weakCell.commandResultLabel.text = @"Command Sent!";
}
                                           andAckBlock:^(NSData * _Nonnull data, NSError * _Nullable error) {
    NSData *ackData = [data subdataWithRange:NSMakeRange(2, [data length] - 2)];
    uint16_t ackValue;
    [ackData getBytes:&ackValue length:sizeof(uint16_t)];
    NSString *responseMessage = [NSString stringWithFormat:@"Ack: %u", ackValue];
    [self.appDelegate.model addLog:[NSString stringWithFormat:@"Received ACK [%@] for CmdID %@", responseMessage, cmdId]];
    weakCell.commandResultLabel.text = responseMessage;
}];
Here are screenshots of the logs:
I found out that the default (master) branch of the iOS sample app repository is the old version 3.1, which did not work. Since there is no documentation on this, I had used it, because I did not know other versions existed until I checked out the different branches.
In the end, version 3.3 worked best for me.
This is the link to the newest branch/version (3.4): https://github.com/dji-sdk/Mobile-OSDK-iOS-App/tree/3.4

NEHotspotConfigurationManager - can it get the WiFi list?

iOS released the public API NEHotspotConfigurationManager.
It has a function: getConfiguredSSIDs(completionHandler: ([String]) -> Void)
I do not know what this returns. My code looks like this:
[[NEHotspotConfigurationManager sharedManager] getConfiguredSSIDsWithCompletionHandler:^(NSArray *array) {
    NSLog(@"Response: %@", array);
}];
However, the value is null. Why?
Is there any way to get nearby WiFi networks using NEHotspotConfigurationManager without going through NEHotspotHelper?
For security reasons, your app only has access to SSIDs that your app itself has configured.
To configure a WiFi network, you need to enable the Hotspot Configuration entitlement in your app settings and then call:
NEHotspotConfigurationManager.shared.apply()
It's also worth noting that if the user has already manually joined the WiFi network outside of your app, then when your app tries to join that network you will get an error stating that they are already associated, and it won't show up in the configured SSIDs list, so try to support this case as well.
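For illustration, a minimal sketch of configuring a network this way in Objective-C (the SSID and passphrase are placeholders; the Hotspot Configuration entitlement must be enabled):

#import <NetworkExtension/NetworkExtension.h>

NEHotspotConfiguration *config =
    [[NEHotspotConfiguration alloc] initWithSSID:@"MyNetwork"   // placeholder SSID
                                      passphrase:@"password123" // placeholder passphrase
                                           isWEP:NO];
[[NEHotspotConfigurationManager sharedManager] applyConfiguration:config
                                                completionHandler:^(NSError *error) {
    if (error != nil &&
        [error.domain isEqualToString:NEHotspotConfigurationErrorDomain] &&
        error.code == NEHotspotConfigurationErrorAlreadyAssociated) {
        // The user already joined this network manually; treat it as success.
    }
}];

Once the configuration has been applied successfully, that SSID will show up in getConfiguredSSIDsWithCompletionHandler:.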

Trying to store some bytes sent from my iOS app into a variable in my Arduino sketch

I’m currently writing an iOS app that passes strings to an Arduino via Bluetooth Low Energy (BLE). I’m using RedBearLab’s BLE shield, and my code for iOS and Arduino is based on their open-source code from their GitHub: https://github.com/RedBearLab/iOS.
The problem I’m having is I don’t know how to store the NSData that I sent from my iOS app via BLE to a variable in my Arduino’s sketch.
Here is my iOS code:
#import "BLE.h" // from RedBearLab's GitHub

- (IBAction)sendText:(id)sender
{
    NSData *textInput = [self.textField.text dataUsingEncoding:NSUTF8StringEncoding];
    if (ble.activePeripheral.state == CBPeripheralStateConnected) {
        [ble write:textInput]; // ble is an object that follows the CoreBluetooth protocol written by RedBearLab
        NSLog(@"Data sent..");
    }
}
You can find the write:(NSData *)d method in BLE.m.
My Arduino has no problem receiving my textInput. Now let’s say my textInput is “Hello”. The Arduino serial monitor shows the following message when I run the Arduino sketch provided by RedBearLab, SimpleChat.ino:
Evt Device Started: Standby
Advertising started
Evt Connected
Evt Pipe Status
Evt Pipe Status
Pipe Number: 2
Hello // <--------my inputString
Evt link connection interval changed
So, judging by the serial monitor, my Arduino has no problem receiving the NSData from my iOS app. However, I have no idea how to store Hello into a variable (preferably a char[]) so I can pass it to another function in my own Arduino sketch. If you look at SimpleChat.ino, the program has the line Serial.write(ble_read());. I know I have to store the result of ble_read() in a variable, but I have no idea what type to use. According to ble_shield.cpp, the ble_read() function returns an int, but what I really want is something like a char[] that represents the textInput. Can someone familiar with C and Arduino help me out? Thanks.
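For illustration, one way to buffer the incoming bytes might look like the sketch below (an assumption based on RedBearLab's ble_shield.h API, where ble_available() reports pending bytes and ble_read() returns one byte as an int):

#include <ble_shield.h>

char inputString[64]; // holds the received text as a C string

void setup()
{
  Serial.begin(57600);
  ble_begin();
}

void loop()
{
  size_t len = 0;
  // Drain every byte currently available, leaving room for the terminator.
  while (ble_available() && len < sizeof(inputString) - 1) {
    inputString[len++] = (char)ble_read(); // narrow the int down to a char
  }
  inputString[len] = '\0'; // NUL-terminate so it can be used as a C string
  if (len > 0) {
    Serial.println(inputString); // or pass inputString to your own function
  }
  ble_do_events(); // keep the BLE stack running, as in SimpleChat.ino
}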

Getting an audio device with OpenAL

I'm trying to use OpenAL for an iOS game I'm working on, but I'm having an issue opening the audio device. Specifically, when I call the function alcOpenDevice(NULL), I get NULL in return. This is causing issues, of course, but I can't tell what I'm doing wrong.
I'm new to OpenAL, so I've been looking at a couple guides here and here to see what I need to do. If I download their sample projects and test 'em, they both work fine. If I copy their files into my project, and ignore the files I made, they still work fine. I'm assuming something got lost in translation when I started rebuilding the code for use in my project. Asking around and searching online hasn't given me any leads though, so I'm hoping someone here could put me on the right track.
Here's the actual setup code I'm using in my AudioPlayer.m
- (void)setup {
    audioSampleBuffers = [NSMutableDictionary new];
    audioSampleSources = [NSMutableArray new];
    [self setupAudioSession];
    [self setupAudioDevice];
    [self setupNotifications];
}

- (BOOL)setupAudioSession {
//    // This has been deprecated.
//
//    /* Setup the Audio Session and monitor interruptions */
//    AudioSessionInitialize(NULL, NULL, AudioInterruptionListenerCallback, NULL);
//
//    /* Set the category for the Audio Session */
//    UInt32 session_category = kAudioSessionCategory_MediaPlayback;
//    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(session_category), &session_category);
//
//    /* Make the Audio Session active */
//    AudioSessionSetActive(true);

    BOOL success = NO;
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];

    success = [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (!success) {
        NSLog(@"%@ Error setting category: %@", NSStringFromSelector(_cmd), [error localizedDescription]);
        return success;
    }

    success = [session setActive:YES error:&error];
    if (!success) {
        NSLog(@"Error activating session: %@", [error localizedDescription]);
    }
    return success;
}

- (BOOL)setupAudioDevice {
    // 'NULL' uses the default device.
    openALDevice = alcOpenDevice(NULL); // Returns 'NULL'

    ALenum error = alGetError(); // Returns '0'
    NSLog(@"%i", error);

    if (!openALDevice) {
        NSLog(@"Something went wrong setting up the audio device.");
        return NO;
    }

    // Create a context to use with the device, and make it the current context.
    openALContext = alcCreateContext(openALDevice, NULL);
    alcMakeContextCurrent(openALContext);

    [self createAudioSources];

    // Setup was successful
    return YES;
}

- (void)createAudioSources {
    ALuint sourceID;
    for (int i = 0; i < kMaxConcurrentSources; i++) {
        // Create a single source.
        alGenSources(1, &sourceID);
        // Add it to the array.
        [audioSampleSources addObject:[NSNumber numberWithUnsignedInt:sourceID]];
    }
}
Note: I'm running iOS 7.1.1 on a new iPad Air and using Xcode 5.1.1. This issue has been confirmed on the iPad, in the simulator, and on an iPod touch.
The Short Answer:
Apple's implementation of alcOpenDevice() only returns the device once. Every subsequent call returns NULL. This function can be called by a lot of Apple audio code, so take out EVERY TRACE of audio code before using OpenAL and manually calling that function yourself.
The Long Answer:
I spent half a day dealing with this problem while using ObjectAL, and ended up doing exactly what you did: re-making the entire project. It worked, until out of curiosity I copied the entire project over, and then the same problem appeared again; alcOpenDevice(NULL) returned NULL. By chance I stumbled upon the answer. It was this bit of code in my Swift game scene:
let jumpSound = SKAction.playSoundFileNamed("WhistleJump.mp3", waitForCompletion: false)
And then I remembered I had this problem before, without SKAction involved. That time, it turned out I was using ObjectAL in two different ways: I used OALSimpleAudio in one place and OpenAL objects in another, and it was initializing my audio session twice.
The common thread between these two incidents is that both times alcOpenDevice() was called more than once during the life of the application. The first time, ObjectAL called it twice due to my misuse of the library. The second time, SKAction.playSoundFileNamed() must have called alcOpenDevice() before my ObjectAL code did. Upon further research I found this bit in the OpenAL 1.1 Specification:
6.1.1. Connecting to a Device
The alcOpenDevice function allows the application (i.e. the client program) to connect to a device (i.e. the server).
ALCdevice * alcOpenDevice (const ALCchar *deviceSpecifier);
If the function returns NULL, then no sound driver/device has been found. The argument is a null terminated string that requests a certain device or device configuration. If NULL is specified, the implementation will provide an implementation specific default.
My hunch is that Apple's implementation of this function only returns the correct device ONCE for the life of the application. Every time alcOpenDevice is called after that, it returns NULL. So bottom line: Take out every trace of audio code before switching to OpenAL. Even code that seems safe, like SKAction.playSoundFileNamed() still might contain a call to alcOpenDevice() buried deep in its implementation.
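One defensive pattern that follows from this hunch (my own sketch, not documented Apple behavior) is to funnel every caller through a single cached device, so alcOpenDevice() can only ever run once:

#import <OpenAL/al.h>
#import <OpenAL/alc.h>

// Opens the default device on first use and hands out the cached
// pointer ever after, so alcOpenDevice() is called exactly once.
static ALCdevice *SharedOpenALDevice(void) {
    static ALCdevice *device = NULL;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        device = alcOpenDevice(NULL); // NULL requests the default device
    });
    return device;
}

This does not help if a framework (SpriteKit, ObjectAL, etc.) has already opened the device behind your back, which is why removing the other audio code is still the real fix.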
For those using ObjectAL, here is the console output of this problem to help them find their way here from google, as I couldn't find a good answer myself:
OAL Error: +[ALWrapper openDevice:]: Could not open device (null)
OAL Error: -[ALDevice initWithDeviceSpecifier:]: <ALDevice: 0x17679b20>: Failed to create OpenAL device (null)
OAL Error: +[ALWrapper closeDevice:]: Invalid Enum (error code 0x0000a003)
OAL Warning: -[OALAudioSession onAudioError:]: Received audio error notification, but last reset was 0.012216 seconds ago. Doing nothing.
fatal error: unexpectedly found nil while unwrapping an Optional value
This SO answer seems to validate my comment about AVAudioSession conflicting with OpenAL. Try removing AVAudioSession, or initializing OpenAL first (though I imagine this would cause the inverse problem).
Alright, so I ended up starting over in a fresh project with a copy-pasted version of AudioSamplePlayer from the first sample project I linked. It worked.
I then edited it step by step back to the format I had set up in my project. It still works!
I still don't know what I did wrong the first time, and I'm not sure it was even in my audio player anymore, but it's running now. I blame gremlins.
...maybe alien surveillance.
