I am implementing video chat using OpenTok by following this post: http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
I have implemented Parse.com as the backend, which is responsible for session and token creation for OpenTok.
When I run the code, it creates the session ID and active users (which I can see in the Parse.com backend).
But when I try to connect to OpenTok with the following code, I get an error stating "The session failed to connect":
_session = [[OTSession alloc] initWithSessionId:sessionID
                                        delegate:self];
[_session addObserver:self
           forKeyPath:@"connectionCount"
              options:NSKeyValueObservingOptionNew
              context:nil];
[_session connectWithApiKey:kApiKey token:token];
If anyone knows how to solve this problem, please help.
Any suggestions are also appreciated.
Yes, I found the solution now.
The firewall was blocking the session from connecting.
I allowed the connection through the firewall by providing the network ID and password.
Now it is working for me.
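For anyone who hits the same "failed to connect" error, logging the underlying OTError from the session delegate helps tell a blocked network apart from a bad token or API key. A minimal sketch, assuming the OTSessionDelegate callbacks of the SDK version that tutorial targets:
// OTSessionDelegate callbacks (method names assumed from the SDK used in the tutorial).
- (void)sessionDidConnect:(OTSession *)session
{
    NSLog(@"Connected to session %@", session.sessionId);
}

- (void)session:(OTSession *)session didFailWithError:(OTError *)error
{
    // The code and description usually narrow the cause down to an invalid
    // token, a wrong API key, or a blocked network/firewall.
    NSLog(@"Session failed to connect: %d %@", (int)[error code], [error localizedDescription]);
}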
I am developing an iOS app with Swift 3. The application uses the Bonjour service to search the local network for robots that expose a specific service (for example robot.local) and shows them in a list. The Bonjour service gives me the domain of each device. These are examples of the domains found:
Ex.:
robot1.local
robot2.local
The next step happens when the user taps an element of the list. This action starts a WebSocket connection to the device so it can be controlled from the iPhone. I am using a library called RBManager, which uses the SocketRocket library for the connection. This library helps me connect to RosBridge.
I use this code to connect:
NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:@"wss://192.168.0.100:9090"]];
self.socket = [[SRWebSocket alloc] initWithURLRequest:request];
self.socket.delegate = self;
[self.socket open];
The problem is that when I install the app through Xcode I have no problem, but when I install the app from an .ipa file or through TestFlight the connection is rejected with this error:
managerDidFailWithError Optional(Error Domain=NSOSStatusErrorDomain Code=-9807 "(null)" UserInfo={_kCFStreamErrorDomainKey=3, _kCFStreamErrorCodeKey=-9807})
I found this issue in the library, but it is not the solution that I need.
I deactivated ATS in the Info.plist, but I don't know how to solve this error. Could anyone help me?
I found the problem: my RosBridge backend runs with TLS and I had not implemented it on the client.
The solution was to implement the authentication in the client, and now everything works :D
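For reference, error -9807 is errSSLXCertChainInvalid, i.e. the certificate chain could not be validated, which is what you typically see with a self-signed or otherwise untrusted TLS certificate once the app is no longer run from Xcode. A rough sketch of one workaround for testing, assuming a recent SocketRocket version that exposes the allowsUntrustedSSLCertificates initializer (for production, trust or pin the server certificate properly instead):
#import <SocketRocket/SRWebSocket.h>

// Sketch only: accepting untrusted certificates is for testing, not production.
NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:@"wss://192.168.0.100:9090"]];
self.socket = [[SRWebSocket alloc] initWithURLRequest:request
                                            protocols:nil
                       allowsUntrustedSSLCertificates:YES]; // assumed SocketRocket initializer
self.socket.delegate = self;
[self.socket open];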
I am implementing an iOS app where I have to use the Respoke SDK for audio and video calling. Audio and video work fine in development mode, but in production mode I get the error "Api authentication error". I have used this code for production:
[self.client connectWithTokenID:[[aryResult valueForKey:@"data"] valueForKey:@"token"]
                initialPresence:nil
                   errorHandler:^(NSString *errorMessage)
{
    [self showError:errorMessage];
}];
For reference, I have used this: Respoke Documentation
Please tell me what is missing on my end. Please help me out.
Thanks a lot in advance!
It seems most likely you are having one of these problems:
1. The value returned by [[aryResult valueForKey:@"data"] valueForKey:@"token"] is not exactly the same as the value returned by the Respoke server when asking for a brokered authentication token from https://api.respoke.io/v1/tokens, due to URL encoding of the data between the server and your iOS application or something similar.
2. The brokered authentication token is only valid for 20 seconds, so perhaps too much time has passed before your iOS application attempts to use it.
3. You have not switched your application out of development mode on the Respoke developer portal, or have not created a role to use when authenticating. This documentation page explains how to properly set up your application and define a role for using brokered authentication. You can also use the example code on that page to make sure that you are getting a valid token for your application. This would help make sure you have your account configured correctly.
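To rule out the second possibility, one approach is to request the brokered token from your own server immediately before calling connect, so the roughly 20-second lifetime is never exceeded. A minimal sketch, assuming a hypothetical endpoint https://your-server.example/respoke-token on your backend that returns JSON shaped like {"data":{"token":"..."}}:
// Hypothetical endpoint on your own backend that brokers the Respoke token.
NSURL *tokenURL = [NSURL URLWithString:@"https://your-server.example/respoke-token"];
[[[NSURLSession sharedSession] dataTaskWithURL:tokenURL
                             completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (error || !data) {
        NSLog(@"Token request failed: %@", error);
        return;
    }
    // Assumed response shape: {"data":{"token":"..."}}
    NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
    NSString *token = json[@"data"][@"token"];
    // Connect right away so the short token lifetime is not exceeded.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.client connectWithTokenID:token initialPresence:nil errorHandler:^(NSString *errorMessage) {
            [self showError:errorMessage];
        }];
    });
}] resume];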
I have resolved the issue by adding some lines of code. For production mode, the code is now as follows:
if (!sharedRespokeClient)
{
    // Create a Respoke client instance to be used for the duration of the application
    sharedRespokeClient = [[Respoke sharedInstance] createClient];
}
sharedRespokeClient.delegate = self;
[sharedRespokeClient connectWithTokenID:tokenStringFromServer initialPresence:nil errorHandler:^(NSString *errorMessage) {
    [self showError:errorMessage];
}];
I am developing an iPhone app which uses CocoaHTTPServer for remote server communication.
The app sends the request details to the CocoaHTTPServer, which stores the request locally. Once internet connectivity is available, CocoaHTTPServer sends the request to the remote server and gets the server response. Now CocoaHTTPServer has to send this response back to the app.
But I am confused about how to implement this. Is there an inter-app communication API for this?
Any suggestions are greatly appreciated.
Well, I haven't worked with the CocoaHTTP server classes myself, so I can't explain it very well, but I found a couple of tutorials that will surely guide you.
Thanks to Matt Gallagher for such a detailed article.
You can listen for a connection using the NSFileHandle class:
listeningHandle = [[NSFileHandle alloc] initWithFileDescriptor:fileDescriptor
                                                 closeOnDealloc:YES];
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(receiveIncomingConnectionNotification:)
           name:NSFileHandleConnectionAcceptedNotification
         object:nil];
[listeningHandle acceptConnectionInBackgroundAndNotify];
When receiveIncomingConnectionNotification: is invoked, each new incoming connection gets its own NSFileHandle. If you're keeping track of them, you can handle the received message once its header is complete:
if (CFHTTPMessageIsHeaderComplete(incomingRequest))
{
    HTTPResponseHandler *handler =
        [HTTPResponseHandler handlerForRequest:incomingRequest
                                    fileHandle:incomingFileHandle
                                        server:self];
    [responseHandlers addObject:handler];
    [self stopReceivingForFileHandle:incomingFileHandle close:NO];
    [handler startResponse];
    return;
}
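For context, the connection-accepted handler that feeds the snippet above looks roughly like this in the article (condensed; incomingRequests is the CFMutableDictionaryRef the article uses to track a CFHTTPMessage per connection):
- (void)receiveIncomingConnectionNotification:(NSNotification *)notification
{
    // The new connection's file handle is delivered in the notification userInfo.
    NSFileHandle *incomingFileHandle = [notification userInfo][NSFileHandleNotificationFileHandleItem];
    if (incomingFileHandle)
    {
        // Start an empty HTTP message for this connection; bytes read later are
        // appended until CFHTTPMessageIsHeaderComplete() returns true.
        CFHTTPMessageRef message = CFHTTPMessageCreateEmpty(kCFAllocatorDefault, TRUE);
        CFDictionarySetValue(incomingRequests, (__bridge const void *)incomingFileHandle, message);
        CFRelease(message);

        [[NSNotificationCenter defaultCenter]
            addObserver:self
               selector:@selector(receiveIncomingDataNotification:)
                   name:NSFileHandleDataAvailableNotification
                 object:incomingFileHandle];
        [incomingFileHandle waitForDataInBackgroundAndNotify];
    }

    // Re-arm the listening handle so further connections are accepted.
    [listeningHandle acceptConnectionInBackgroundAndNotify];
}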
Note: please go through the full article; it has a nice explanation.
Apart from this, you may have a look at this as well.
Hope this gives you some idea.
Your question is focused on background processing.
When an app goes into the background, it gets very limited time to finish things up. After that the app is suspended. That is not a good situation for starting communication.
Apple states clearly that the priority is always on the running foreground tasks.
The notification mechanism (as listed by RDC above) was created to handle external events. During such a wake-up you can send and receive a little bit of data, but you'll get minimal priority. Since timing is important in communication, I would not go for that either.
I suggest checking connectivity during the wake-up call and starting activities then, and using the notification mechanism to tell the user that the network is up again.
A URL scheme can be used to send the response back to the app. The response from the remote server can be set as a parameter in the URL. The CocoaHTTPServer app can invoke the other app, which is the handler of this unique URL scheme. The link below provides more information.
Inter-AppCommunication using URL scheme
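A minimal sketch of that idea, assuming the receiving app registers a hypothetical myapp:// scheme in its Info.plist and that the response is small enough to travel as a query parameter:
// In the CocoaHTTPServer app: hand the response back via the custom URL scheme.
// "myapp" and responseString are placeholders.
NSString *encoded = [responseString stringByAddingPercentEncodingWithAllowedCharacters:
                        [NSCharacterSet URLQueryAllowedCharacterSet]];
NSURL *callbackURL = [NSURL URLWithString:
                        [NSString stringWithFormat:@"myapp://response?body=%@", encoded]];
[[UIApplication sharedApplication] openURL:callbackURL];

// In the original app's delegate: read the response back out of the URL.
- (BOOL)application:(UIApplication *)application
            openURL:(NSURL *)url
  sourceApplication:(NSString *)sourceApplication
         annotation:(id)annotation
{
    NSURLComponents *components = [NSURLComponents componentsWithURL:url resolvingAgainstBaseURL:NO];
    for (NSURLQueryItem *item in components.queryItems) {
        if ([item.name isEqualToString:@"body"]) {
            NSLog(@"Server response: %@", item.value);
        }
    }
    return YES;
}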
We're developing an HTTP-streaming iOS app that requires us to receive playlists from a secured site. This site requires us to authenticate using a self-signed SSL certificate.
We read the credentials from a .p12 file before we use NSURLConnection with a delegate to react to the authentication challenge.
- (void)connection:(NSURLConnection *)connection didReceiveAuthenticationChallenge:(NSURLAuthenticationChallenge *)challenge
{
    [[challenge sender] useCredential:self.credentials forAuthenticationChallenge:challenge];
}

- (BOOL)connection:(NSURLConnection *)connection canAuthenticateAgainstProtectionSpace:(NSURLProtectionSpace *)protectionSpace
{
    return YES;
}
By making this initial connection to the URL where we get the .m3u8 playlist, we are able to play back the playlist using AVPlayer. The problem is that this method only works in the simulator.
NOTE: We are able to download the playlist using NSURLConnection on the device. This must mean that AVPlayer somehow can't continue using the trust established during this initial connection.
We have also tried adding the credentials to [NSURLCredentialStorage sharedCredentialStorage], without any luck.
Below follows our shotgun approach for that:
NSURLProtectionSpace *protectionSpace =
    [[NSURLProtectionSpace alloc] initWithHost:host
                                          port:443
                                      protocol:@"https"
                                         realm:nil
                          authenticationMethod:NSURLAuthenticationMethodClientCertificate];
[[NSURLCredentialStorage sharedCredentialStorage] setDefaultCredential:creds
                                                    forProtectionSpace:protectionSpace];

NSURLProtectionSpace *protectionSpace2 =
    [[NSURLProtectionSpace alloc] initWithHost:host
                                          port:443
                                      protocol:@"https"
                                         realm:nil
                          authenticationMethod:NSURLAuthenticationMethodServerTrust];
[[NSURLCredentialStorage sharedCredentialStorage] setDefaultCredential:creds
                                                    forProtectionSpace:protectionSpace2];
EDIT: According to this question, the above method doesn't work with certificates.
Any hint as to why it doesn't work on the device, or an alternative solution, is welcome!
From iOS 6 onwards, AVAssetResourceLoader can be used to retrieve an HTTPS-secured playlist or key file.
An AVAssetResourceLoader object mediates resource requests from an AVURLAsset object with a delegate object that you provide. When a request arrives, the resource loader asks your delegate if it is able to handle the request and reports the results back to the asset.
Please find the sample code below.
// AVURLAsset + Loader
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
AVAssetResourceLoader *loader = asset.resourceLoader;
[loader setDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
// AVPlayer
AVPlayer *avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
You will need to implement the resourceLoader:shouldWaitForLoadingOfRequestedResource: delegate method, which will be called when there is an authentication need, and you can use NSURLConnection there to request the secured resource.
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    // Handle the NSURLConnection to the SSL-secured resource here
    return YES;
}
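For illustration, the body of that callback could fetch the resource itself (applying the SSL credentials in its own connection delegate) and then hand the bytes back through the loading request. fetchSecuredDataForURL:completion: below is a placeholder for that download, not a framework method:
- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
{
    NSURL *url = loadingRequest.request.URL;
    // Placeholder: your own NSURLConnection/NSURLSession download that applies
    // the client certificate when the authentication challenge arrives.
    [self fetchSecuredDataForURL:url completion:^(NSData *data, NSError *error) {
        if (error) {
            [loadingRequest finishLoadingWithError:error];
            return;
        }
        [loadingRequest.dataRequest respondWithData:data];
        [loadingRequest finishLoading];
    }];
    return YES; // the request will be finished asynchronously
}
Note that in practice the resource loader delegate is typically only consulted for URLs with a custom (non-HTTP) scheme, so the playlist URL may need to be rewritten to a placeholder scheme before handing it to AVURLAsset.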
Hope this helps!
P.S.: The proxy approach using CocoaHTTPServer works well, but using an AVAssetResourceLoader is a more elegant solution.
It seems that until Apple lets us control which NSURLConnections the AVPlayer uses, the only answer is to implement an HTTP loopback server.
To quote the Apple representative who answered our support question:
Another option is to implement a loopback
HTTP server and point client objects at that. The clients can use
HTTP (because their requests never make it off the device), while the
loopback HTTP server itself can use HTTPS to connect to the origin
server. Again, this gives you access to the underlying
NSURLConnections, allowing you to do custom server trust evaluation.
Using this technique with UIWebView is going to be tricky unless you
completely control the content at the origin server. If the origin
server can return arbitrary content, you have to grovel through the
returned HTTP and rewrite all the links, which is not much fun. A
similar restriction applies to MPMoviePlayerController/AVPlayer, but
in this case it's much more common to control all of the content and
thus be able to avoid non-relative links.
EDIT:
I managed to implement a loopback server using custom implementations of the HTTPResponse and HTTPConnection classes found in CocoaHTTPServer.
I can't disclose the source, but I used NSURLConnection together with a mix of the AsyncHTTPResponse and DataHTTPResponse demonstration responses.
EDIT:
Remember to set myHttpServerObject.interface = @"loopback";
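For reference, a rough sketch of the loopback server setup, assuming stock CocoaHTTPServer classes and a placeholder MyHTTPConnection subclass for the custom connection/response handling mentioned above:
#import "HTTPServer.h"

// Loopback-only HTTP proxy; MyHTTPConnection is a placeholder for the custom
// HTTPConnection/HTTPResponse subclasses described above.
HTTPServer *httpServer = [[HTTPServer alloc] init];
[httpServer setConnectionClass:[MyHTTPConnection class]];
[httpServer setInterface:@"loopback"]; // only reachable from the device itself
[httpServer setPort:12345];            // arbitrary local port

NSError *error = nil;
if (![httpServer start:&error]) {
    NSLog(@"Error starting loopback HTTP server: %@", error);
}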
EDIT: WARNING!!! This approach does not seem to work with AirPlay, since the AirPlay device will ask 127.1.1.1 for the encryption keys. The correct approach seems to be defined here:
https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AirPlayGuide/EncryptionandAuthentication/EncryptionandAuthentication.html#//apple_ref/doc/uid/TP40011045-CH5-SW1
"Specify the keys in the .m3u8 files using an application-defined URL scheme."
EDIT:
An Apple TV and iOS update has resolved the issue mentioned in the edit above!
After more than a few hours of searching, I seem to have reached a dead end. All I am trying to do is get all the iOS devices on the network with Bonjour. I did it like this:
self.serviceBrowser = [[NSNetServiceBrowser alloc] init];
[self.serviceBrowser setDelegate:self];
[self.serviceBrowser searchForServicesOfType:@"_apple-mobdev2._tcp." inDomain:@"local."];
This works fine, though what I get is the following:
local. _apple-mobdev2._tcp. [MAC ADDRESS HERE]
I tried to resolve the connection by using the sync port (62078), since service.port returns -1.
for (NSNetService *service in self.services) {
    NSLog(@"%@", service);
    NSNetService *newService = [[NSNetService alloc] initWithDomain:service.domain
                                                                type:service.type
                                                                name:service.name
                                                                port:62078];
    [newService setDelegate:self];
    [newService resolveWithTimeout:30];
}
This in turn calls netServiceWillResolve: with no problem at all, but it never makes it to netServiceDidResolveAddress:.
But neither does it fail: netService:didNotResolve: isn't called either. I believe it is just waiting for a response to resolve.
To support this claim, it did once make it to the method, and [service hostName] did return Yanniss-iPhone, but that happened at a completely random time after I had left the Mac app running for around half an hour. What could have caused this to happen? Or does anyone know of a different way to get the hostName of the remote device? The other answers do not answer my question, since I am looking for the hostName of the remote device, not of the Mac.
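For reference, the delegate callbacks involved are the standard NSNetServiceDelegate methods; logging from them makes it obvious when, or whether, resolution ever completes:
- (void)netServiceDidResolveAddress:(NSNetService *)service
{
    // Only fires once the service actually answers the resolve query.
    NSLog(@"Resolved %@ -> host: %@ port: %ld", service.name, service.hostName, (long)service.port);
}

- (void)netService:(NSNetService *)service didNotResolve:(NSDictionary *)errorDict
{
    NSLog(@"Failed to resolve %@: %@", service.name, errorDict);
}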
Related to that, I've found that when you kill and restart iTunes, along with iTunes Helper, the very log I mentioned is sent again, which is why I believe the successful resolution was an iTunes-related event. Any help is very much appreciated!
iTunes searches Bonjour for Wi-Fi sync capability. As for the didNotResolve or resolve delay, Bonjour services randomly announce themselves anywhere between a few seconds and 30 minutes.
I am actually trying to connect to iOS devices too, but I could not get any response or any devices returned. :\