GCDWebServer background mode not working on device - iOS

I'm using GCDWebServer to start an HTTP service. I want it to keep accepting connections in background mode. This is my source code:
func startServer() {
    do {
        webServer?.stop()
        webServer?.removeAllHandlers()
        try self.webServer?.start(options: [
            GCDWebServerOption_BonjourName: "",
            GCDWebServerOption_BonjourType: IMS_DOMAIN,
            GCDWebServerOption_Port: IMS_SERVICE_PORT,
            GCDWebServerOption_AutomaticallySuspendInBackground: false
        ])
    } catch {
        print("Start server error: ", error.localizedDescription)
    }
    print("bonjour type", self.webServer?.bonjourType)
}
It works on the simulator, but on a real device it doesn't. These messages are printed when the app comes back to the foreground:
dnssd_clientstub read_all(10) DEFUNCT
[ERROR] Bonjour registration error -72000 (domain 10)
Please help me.

It's not possible to run a web server while your app is in the background (except for the first few minutes at most). See the "GCDWebServer & Background Mode for iOS Apps" section in the GCDWebServer README for details:
Typically you must stop any network servers while the app is in the background and restart them when the app comes back to the foreground.
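A minimal sketch of that pattern (not from the README, just an illustration), assuming the startServer() function and webServer property shown in the question:
private var lifecycleObservers: [NSObjectProtocol] = []

func observeLifecycle() {
    let nc = NotificationCenter.default
    lifecycleObservers.append(nc.addObserver(
        forName: UIApplication.didEnterBackgroundNotification,
        object: nil, queue: .main) { [weak self] _ in
        // Shut the server down before the app gets suspended.
        self?.webServer?.stop()
    })
    lifecycleObservers.append(nc.addObserver(
        forName: UIApplication.willEnterForegroundNotification,
        object: nil, queue: .main) { [weak self] _ in
        // startServer() already stops and restarts, so Bonjour re-registers cleanly.
        self?.startServer()
    })
}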

Related

CallKit - No audio if starting a call from background

This started happening with iOS 13.3.1.
In my app I use CallKit + WebRTC to establish VoIP connections, and I had always managed to establish connections without an issue.
However, since 13.3.1 I'm not able to start a CallKit call if the app isn't in the foreground: the connection is established, but CallKit isn't started (no green icon/bar at the top) and the microphone isn't picked up either.
I always get the following error:
Error requesting transaction ((
" contactIdentifier=(null) video=0 relay=0 upgrade=0 retry=0 emergency=0 isVoicemail=0 ttyType=0 localLandscapeAspectRatio={0, 0} localPortraitAspectRatio={0, 0} dateStarted=(null) localSenderIdentityUUID=(null) shouldSuppressInCallUI=0>"
)): (Error Domain=com.apple.CallKit.error.requesttransaction Code=6 "(null)")
From what I've gathered (there is almost no information about this code 6 error), CallKit may terminate the call if the AVAudioSession isn't active. However, I don't understand what changed in 13.3.1 to affect this in the background (I have the Audio, AirPlay and Picture in Picture / Voice over IP / Background fetch background modes enabled).
In the meantime I have tried numerous things, from activating the session myself (both before callController.request and also before provider.reportOutgoingCall):
do {
    try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playAndRecord, mode: AVAudioSession.Mode.voiceChat, options: .mixWithOthers)
    try AVAudioSession.sharedInstance().overrideOutputAudioPort(.speaker)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error)
}
to playing a silent audio file (trying to force the AVAudioSession to activate), but had no luck whatsoever.
Any suggestions?
I experienced the same thing when I implemented CallKit in my project. I tried everything with the AudioSession, but it turned out to be related to the library I was using for WebRTC and SIP: there was one line inside the WebRTC library that checked whether the application was in the background, and if it was, it would not connect the audio. So my advice is to check the WebRTC code base and search for app-state checks such as UIApplicationStateBackground or direct uses of [UIApplication sharedApplication].applicationState.
I hope this will help!
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSession.Category.playback, options: [.allowBluetooth])
    try session.setMode(AVAudioSession.Mode.voiceChat)
    try session.setPreferredSampleRate(44100.0)
    try session.setPreferredIOBufferDuration(0.005)
} catch {
    print("Error configuring audio session webrtc", error)
}
Note: set the category to "playback". Do not use "playAndRecord" for the AVAudioSession, as that is what causes the no-audio problem in background mode.
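Separately, and only as a hedged sketch rather than a confirmed fix: with CallKit the system activates the audio session for you, so WebRTC audio is usually started only in the provider's didActivate callback. CallManager, webRTCClient, startAudio() and stopAudio() below are hypothetical names:
import CallKit
import AVFoundation

extension CallManager: CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {
        // Clean up any ongoing calls here.
    }

    // CallKit activates the AVAudioSession on your behalf; start WebRTC audio only here.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        webRTCClient.startAudio() // hypothetical helper on your WebRTC wrapper
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        webRTCClient.stopAudio() // hypothetical helper
    }
}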

Swift: Get Mobile Data permission state

I have an application which I had been testing on iOS 10.3 for a few days, and it was working fine. I recently tested it on an iOS 12 device and it did not work as expected: the application was not able to connect to my server.
Upon investigation, I found that the "Mobile Data" switch was turned off for my application in Settings -> AppName. After turning it on, it started working perfectly.
So, given this scenario, is there a way I can determine the status of this switch from my code? If I know the status and it's off, I can redirect the user to the application's settings using:
if let settingsURL = URL(string: UIApplicationOpenSettingsURLString) {
    if #available(iOS 10.0, *) {
        UIApplication.shared.open(settingsURL, options: [:], completionHandler: { success in
            print("Open \(settingsURL): \(success)")
        })
    } else {
        let success = UIApplication.shared.openURL(settingsURL)
        print("Open \(settingsURL): \(success)")
    }
}
P.S: I am not looking for a solution using Reachability as it is not completely reliable.
There is no way to check this setting directly in the latest iOS releases. You have two options to deal with it. The first is to check whether the device is connected to Wi-Fi and not in airplane mode; if the server still can't be reached, it's reasonably safe to assume that cellular data is disabled for your app.
The second option is simply to present the user with an error saying the application is unable to reach the server, and to adjust the UI/notice in the application accordingly to handle any scenario where a network connection can't be established.
CTCellularData is a potential option; however, as of iOS 11 it has some drawbacks regarding how often its state is updated.
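A minimal sketch of the CTCellularData approach (illustrative only; the notifier can fire on a background queue and, as noted, the reported state is not always fresh):
import CoreTelephony

let cellularData = CTCellularData()

// Called whenever the per-app cellular data setting changes (and once initially).
cellularData.cellularDataRestrictionDidUpdateNotifier = { state in
    switch state {
    case .restricted:
        // "Mobile Data" is switched off for this app; prompt the user to open Settings.
        print("Cellular data is restricted for this app")
    case .notRestricted:
        print("Cellular data is allowed")
    case .restrictedStateUnknown:
        print("Cellular data state is unknown")
    @unknown default:
        break
    }
}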

NSURLSession return no connection error when launching after sleep mode

In my app, I use NSURLSession to download data from a server.
The download happens in two scenarios:
1. on app launch
2. when the refresh button is pushed
I noticed one issue with the following sequence of steps:
Open my app
Put iPhone into sleep mode
Awake iPhone out of sleep mode
Push refresh button
ERROR - I get "no connection error"
Push refresh button again - download starts successfully
It seems like my app doesn't have an active internet connection right after the iPhone wakes from sleep mode, but the connection is restored when I push refresh a second time.
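One common mitigation (a sketch only, assuming iOS 11+ and a plain URLSession; the URL below is just a placeholder) is to let the session wait for connectivity instead of failing immediately after wake:
// A session configured this way waits until connectivity is available
// instead of returning a "no connection" error right after the device wakes.
let configuration = URLSessionConfiguration.default
configuration.waitsForConnectivity = true          // iOS 11+
configuration.timeoutIntervalForResource = 30      // give up after 30 seconds

let session = URLSession(configuration: configuration)
let task = session.dataTask(with: URL(string: "https://example.com/data.json")!) { data, response, error in
    if let error = error {
        print("Download failed:", error)
    } else if let data = data {
        print("Downloaded \(data.count) bytes")
    }
}
task.resume()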

Detecting Firebase connection state uses huge amount of memory in react-native app

When I add the following code to my react-native app, memory usage soars from 40MB to 400MB in ten minutes (and keeps going) as soon as I take the app offline.
root.child(".info/connected").on("value", (snap) => {
if (snap.val() === true) {
this.online = true;
info("Going online");
if (this.user) { /* counter already loaded */
debug("posting offline transactions");
}
} else {
this.online = false;
info("Going offline");
}
});
The app itself is entirely quiescent and the .on() listener is not getting triggered. When I bring the phone back online, memory usage stabilizes but does not decrease.
I have no idea how to debug this. I cannot run the app under the Chrome debugger because the phone has to be online to connect to the debugger. I cannot use the iOS simulator because to take it offline you have to take the whole computer offline, and then you get the error: WebSocket connection failed.
The only way I have to debug is to view console.log messages in Xcode; my app logs lots of them, but nothing is happening in the app itself.
I need to monitor offline/online state in order to work around the fact that Firebase transactions consume a large amount of memory when the app is used offline.

Why sending message from WatchKit extension to iOS and getting back a reply is so slow?

I am using the sendMessage method to send a message from a WatchKit extension to an iOS app. It takes about 230 ms on average to receive a reply. The time does not depend on whether the iOS app is on screen or running in the background. 230 ms is roughly the time it takes light to travel around the Earth's circumference and back, yet the phone is sitting 30 cm from my watch when I am testing this.
Questions:
Why is it so slow?
Is it supposed to be so slow?
Is there a way to make it faster?
An observation: according to my previous experiments, communication in watchOS 1 was a bit faster; a round trip used to take about 50 ms.
Send a message from WatchKit extension
let session = WCSession.defaultSession()
session.sendMessage(["message from watch":"🌷"], replyHandler: { reply in
// Getting reply from iOS app here
}, errorHandler: nil)
Receive the message in the iOS app
func session(session: WCSession, didReceiveMessage message: [String : AnyObject], replyHandler: ([String : AnyObject]) -> Void) {
replyHandler(["reply from iOS":"🐱"])
}
Demo app: https://github.com/evgenyneu/WatchKitParentAppBenchmark
iOS: 9.0, watchOS: 2.0
AFAIK, when you send a message to the other device, the message is archived to a file in a local directory (sometimes called the WatchDirectory).
This directory is then synchronized to the other device over Bluetooth, much like an iCloud Drive or Dropbox folder. This approach doesn't require the iOS and watchOS apps to be running while the transfer completes.
When new files arrive in the directory, iOS (or watchOS) invokes the relevant WCSession API to process the content. If needed, iOS (or watchOS) wakes the destination app in the background before dispatching the message.
With watchOS 1, the watch extension ran on iOS and only the remote UI ran on the Apple Watch, so communication was a much simpler process: just inter-process communication on the phone.
sendMessage is a much more expensive method than the other communication APIs provided by WCSession. iOS can only use it while the watch app is running in the foreground, and using sendMessage from watchOS has to wake up the iPhone and launch the iOS app in the background. After the dispatched messages have been handled, iOS may kill the destination app running in the background to reclaim memory.
So, IMO, there is no reason to expect it to be fast.
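If the data is not urgent, the cheaper queued WCSession APIs avoid waking the counterpart app immediately. A minimal sketch, written in the same watchOS 2-era style as the code above:
let session = WCSession.defaultSession()

// Queued transfer: delivered in the background when the system decides,
// without forcing the counterpart app to launch right away.
session.transferUserInfo(["message from watch": "🌷"])

// Or keep only the latest state in sync (newer updates replace older ones).
do {
    try session.updateApplicationContext(["latestState": "🌷"])
} catch {
    print("updateApplicationContext failed: \(error)")
}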
In my case, to refresh my UI instantly on the device:
func session(session: WCSession, didReceiveMessage message: [String : AnyObject]) {
    // Receive message from watch
    dispatch_async(dispatch_get_main_queue()) {
        self.textLabel.text = message["optionSent"]! as? String
    }
}
