iOS: What is IOFence and why is it freezing our app?

We have an AR project (built in Unity) that's running 24/7 on a number of iPad Pros (12.9-inch, 3rd gen) in a museum. It's been live for 2.5 years now. We've always had the occasional freeze (on iOS 13 and below), but lately the freezes have become dramatically more frequent (on iOS 14.4.2 and 14.6).
The problem manifests as follows, at completely random times: our app keeps running in the background (we can hear sound effects and see database activity continuing), but it stops getting redrawn, causing the screen to freeze on whatever frame was drawn last.
We're getting a gpuevent .ips log at freeze time (see below), which contains information about an IOSurface being blocked by an IOFence. There's very little information online about these logs, though, so we're rather stuck with the analysis. From what little I can find, it seems we have some resource, possibly a texture buffer, that gets accessed by two processes simultaneously, causing one of those processes (the one responsible for drawing the screen) to be killed by the OS. Does that sound plausible? And if so, how do we find out where in our code the issue is triggered?
Can anyone here interpret this log better than we can, or does anyone have suggestions as to our next steps?
{"bug_type":"284","timestamp":"2021-07-10 10:51:30.00 +0200","os_version":"iPhone OS 14.6 (18F72)","incident_id":"C3ABD477-2A61-4D43-A3B1-25D7CCD706E6"}
{
  "process_name" : "hats",
  "registers" : {
  },
  "timestamp" : 1625907090,
  "analysis" : {
    "iofence_list" : {
      "iofence_num_iosurfaces" : 1,
      "iofence_iosurfaces" : [
        {
          "iofence_current_queue" : [
            {
              "iofence_acceleratorid" : 1,
              "iofence_backtrace" : [
                -68148358856,
                -68148357636,
                -68144533400,
                -68144532380,
                -68144496896,
                -68144598792,
                -68144425788,
                -68144434312
              ],
              "iofence_direction" : 1
            }
          ],
          "iosurface_id" : 243,
          "iofence_waiting_queue" : [
            {
              "iofence_acceleratorid" : 2,
              "iofence_backtrace" : [
                -68148358856,
                -68148357636,
                -68146420632,
                -68154559340,
                -68146423972,
                -68146427004,
                -68146495056,
                -68154290536
              ],
              "iofence_direction" : 2
            },
            {
              "iofence_acceleratorid" : 1,
              "iofence_backtrace" : [
                -68148358856,
                -68148357636,
                -68144533400,
                -68144532380,
                -68144496896,
                -68144598792,
                -68144425788,
                -68144434312
              ],
              "iofence_direction" : 1
            }
          ]
        }
      ]
    },
    "fw_ta_substate" : {
      "slot0" : 0,
      "slot1" : 0
    },
    "fw_power_state" : 0,
    "fw_power_boost_controller" : 0,
    "guilty_dm" : 1,
    "fw_power_controller_in_charge" : 0,
    "fw_cl_state" : {
      "slot0" : 0
    },
    "fw_perf_state_lo" : 1,
    "fw_ta_state" : {
      "slot0" : 0,
      "slot1" : 0
    },
    "signature" : 625,
    "fw_power_substate" : 4,
    "command_buffer_trace_id" : 490782964,
    "fw_perf_state_select" : 0,
    "restart_reason" : 7,
    "fw_3d_state" : {
      "slot0" : 0,
      "slot1" : 0,
      "slot2" : 0
    },
    "fw_gpc_perf_state" : 0,
    "fw_perf_state_hi" : 1,
    "fw_power_limit_controller" : 7,
    "restart_reason_desc" : "blocked by IOFence"
  }
}

Related

Reading data from Firebase RTDB using Flutter has different behavior on Android and iOS

I'm seeing the following call behave differently on iOS and on Android.
On Android, the .get() call below returns the expected snapshot from the chatRoomID path. On iOS, however, .get() ends up returning a snapshot of the whole node under myUser.userID.
It seems that on iOS the second child() path is disregarded...
DataSnapshot snapshot = await chatsRef
    .child(myUser.userID!)
    .child(chatRoomID)
    .get();
print(snapshot.value);
JSON:
{
  "chats" : {
    "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1" : {
      "c00dca80-9077-11ec-855a-910961fc4253" : {
        "chatAdmin" : "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1",
        "chatImage" : "https://images.pexels.com/photos/887827/pexels-photo-887827.jpeg?auto=compress&cs=tinysrgb&h=650&w=940",
        "chatName" : "chef",
        "chatRoomID" : "c00dca80-9077-11ec-855a-910961fc4253",
        "isActivityChat" : false,
        "isGroupChat" : true,
        "lastMessage" : {
          "lastMessage" : "Hey",
          "lastMessageTime" : "2022-02-18 00:00:56.992308",
          "messageID" : "-MwAC5BUz3GKPmkA1SSQ",
          "nKDsrLrcU0PgtEDV5tKpMumSDuu1" : "true",
          "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1" : "false",
          "psJQRp96VGWIjTDpNpMUShPNWa82" : "true",
          "sendBy" : "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1",
          "senderName" : "Emily"
        },
        "muted" : {
          "nKDsrLrcU0PgtEDV5tKpMumSDuu1" : false,
          "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1" : false,
          "psJQRp96VGWIjTDpNpMUShPNWa82" : false
        },
        "users" : {
          "nKDsrLrcU0PgtEDV5tKpMumSDuu1" : true,
          "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1" : true,
          "psJQRp96VGWIjTDpNpMUShPNWa82" : true
        }
      },
      "nKDsrLrcU0PgtEDV5tKpMumSDuu1_oF1b6J4Hz3NGzRb9RmSVFGJdcYi1" : {
        "chatRoomID" : "nKDsrLrcU0PgtEDV5tKpMumSDuu1_oF1b6J4Hz3NGzRb9RmSVFGJdcYi1",
        "isActivityChat" : false,
        "isGroupChat" : false,
        "lastMessage" : {
          "lastMessage" : "Shut",
          "lastMessageTime" : "2022-02-18 00:01:30.161511",
          "messageID" : "-MwACDHlTWpQhhQFh9k8",
          "nKDsrLrcU0PgtEDV5tKpMumSDuu1" : "true",
          "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1" : "false",
          "sendBy" : "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1",
          "senderName" : "Emily"
        },
        "muted" : {
          "nKDsrLrcU0PgtEDV5tKpMumSDuu1" : false,
          "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1" : false
        },
        "users" : {
          "nKDsrLrcU0PgtEDV5tKpMumSDuu1" : true,
          "oF1b6J4Hz3NGzRb9RmSVFGJdcYi1" : true
        }
      }
    }
  },
}
As discussed in the comments
You should be getting the same behavior on both iOS and Android, as the underlying SDKs should work the same on those platforms. That said, since the FlutterFire libraries wrap the native SDKs and the get() API is relatively new, it might be that there is an unintended difference between how the iOS and Android SDKs implement it.
If that is indeed the case, you might want to try whether once() works for you. While it has an annoying side effect in an edge case (see here), the once() API itself is much older, so it is more likely to be stable across platforms.
Update: I'm having a really hard time reproducing the problem. I've imported your JSON into a database of mine, and then run this code on startup of my app:
FirebaseDatabase database = FirebaseDatabase.instance;
var chatsRef = database.ref('71163140/chats');
var threadRef = chatsRef
    .child('oF1b6J4Hz3NGzRb9RmSVFGJdcYi1')
    .child('nKDsrLrcU0PgtEDV5tKpMumSDuu1_oF1b6J4Hz3NGzRb9RmSVFGJdcYi1');

DataSnapshot snapshot = await threadRef.get();
if (snapshot.value != null) {
  print('get: ${snapshot.value}');
}

DatabaseEvent event = await threadRef.once();
if (event.snapshot.value != null) {
  print('once: ${event.snapshot.value}');
}
This reads a single chat thread out of the data you shared with both get() and once(), and then prints what it gets. On both iOS and Android I never see the c00dca80-9077-11ec-855a-910961fc4253 thread show up in my output; I only get the chat thread where "isGroupChat" : false.
Can you update the code or data in your question to show how I can get the same faulty result as you get?

How do I set microsoft Edge browser binary file for Selenium Remote Webdriver

I am trying to execute remote WebDriver tests using Selenium Grid. My node is a Windows 8 system with an Insider (Beta) version of Edge installed on it. When I try to execute my tests, the node starts the driver service, but it is not able to find the Microsoft Edge binary on the node.
Machine : Windows 8
Edge Version : 77.0.235.9
binary Path : C:\Program Files (x86)\Microsoft\Edge Beta\Application\msedge.exe
Selenium Version : 3.141.59
I tried appending the path to the Edge executable "msedge.exe" to the node's Path variable -> did not work.
I also tried setting the edge_binary path in the nodeconfig.json, but the executable was still not found.
else if (browsername.equalsIgnoreCase("edge")) {
    EdgeOptions options = new EdgeOptions();
    driver = new RemoteWebDriver(new URL("http://192.168.1.107:8889/wd/hub"), options);
    return driver;
}
nodeconfig.json :
{
  "capabilities" : [
    {
      "browserName" : "chrome",
      "maxInstances" : 5,
      "seleniumProtocol" : "WebDriver"
    },
    {
      "browserName" : "firefox",
      "maxInstances" : 5,
      "firefox_binary" : "C:\\Program Files\\Mozilla Firefox\\firefox.exe",
      "seleniumProtocol" : "WebDriver",
      "acceptInsecureCerts" : true,
      "acceptSslCerts" : true
    },
    {
      "browserName" : "internet explorer",
      "version" : "11",
      "maxInstances" : 5,
      "seleniumProtocol" : "WebDriver"
    },
    {
      "browserName" : "MicrosoftEdge",
      "platform" : "WINDOWS",
      "edge_binary" : "C:\\Program Files (x86)\\Microsoft\\Edge Beta\\Application\\msedge.exe",
      "version" : "77.0.235.9",
      "maxInstances" : 5,
      "seleniumProtocol" : "WebDriver"
    }
  ],
  "proxy" : "org.openqa.grid.selenium.proxy.DefaultRemoteProxy",
  "maxSession" : 10,
  "port" : 5555,
  "register" : true,
  "registerCycle" : 5000,
  "hub" : "http://192.168.1.107:8889",
  "nodeStatusCheckTimeout" : 5000,
  "nodePolling" : 5000,
  "role" : "node",
  "unregisterIfStillDownAfter" : 1000,
  "downPollingLimit" : 2,
  "debug" : false,
  "servlets" : [],
  "withoutServlets" : [],
  "custom" : {
    "webdriver.ie.driver" : "drivers/ie/win/IEDriverServer.exe",
    "webdriver.gecko.driver" : "drivers/firefox/win/geckodriver.exe",
    "webdriver.chrome.driver" : "drivers/chrome/win/chromedriver.exe",
    "webdriver.edge.driver" : "drivers/edge/win/msedgedriver.exe"
  }
}
How can I make the executable visible to the client trying to automate it?
EdgeOptions does not allow setting a binary path either.
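One thing that might be worth trying (not verified): instead of the node-side edge_binary entry, pass the browser binary through the ms:edgeOptions capability, which Chromium-based msedgedriver reads much like Chrome's goog:chromeOptions. The untested sketch below is written with the JavaScript/TypeScript selenium-webdriver bindings purely to show the capability shape; the Java EdgeOptions in 3.141.59 should be able to set the same thing via options.setCapability("ms:edgeOptions", ImmutableMap.of("binary", edgePath)).

// Untested sketch: route the request through the grid hub from the question and let
// msedgedriver pick up the Beta-channel binary on the node via "ms:edgeOptions".
import { Builder, WebDriver } from 'selenium-webdriver';

async function createRemoteEdgeDriver(): Promise<WebDriver> {
  return new Builder()
    .usingServer('http://192.168.1.107:8889/wd/hub') // grid hub from the question
    .withCapabilities({
      browserName: 'MicrosoftEdge',
      'ms:edgeOptions': {
        // Point msedgedriver at the Beta-channel binary installed on the node.
        binary: 'C:\\Program Files (x86)\\Microsoft\\Edge Beta\\Application\\msedge.exe',
      },
    })
    .build();
}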

Ricoh Theta S doesn't accept new options

I'm having problems with JSON and Postman while trying to test camera settings. Every time I change the settings, the resulting picture is the same. If I change the shutter speed to 0.0002 (which should be 1/5000), the result is the same as with 0.1 (1/10).
This is my JSON:
{
  "name" : "camera.setOptions",
  "parameters" : {
    "sessionId" : "SID_0001",
    "options" : {
      "clientVersion" : 2,
      "_autoBracket" : {
        "_bracketNumber" : 3,
        "_bracketParameters" : [
          {
            "exposureProgram" : 1,
            "shutterSpeed" : 0.1,
            "iso" : 400,
            "exposureCompensation" : 0,
            "whiteBalance" : "_colorTemperature",
            "_colorTemperature" : 5100
          },
          {
            "exposureProgram" : 1,
            "shutterSpeed" : 0.1,
            "iso" : 320,
            "exposureCompensation" : 0,
            "whiteBalance" : "_colorTemperature",
            "_colorTemperature" : 5100
          },
          {
            "exposureProgram" : 1,
            "shutterSpeed" : 0.1,
            "iso" : 1600,
            "exposureCompensation" : 0,
            "whiteBalance" : "_colorTemperature",
            "_colorTemperature" : 5000
          }
        ]
      }
    }
  }
}
Once I submit those settings using camera.execute.setOptions, the reply says that the settings have been successfully changed, yet when I take a picture using camera.execute.takePicture, the picture turns out the same as with the default settings. Am I missing something? Thanks in advance!
When you are using the _autoBracket option, you have to use camera.startCapture with the parameter "_mode" : "bracket", not camera.takePicture as you would for ordinary photos:
{
  "name" : "camera.startCapture",
  "parameters" : {
    "_mode" : "bracket"
  }
}
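For completeness, here is a minimal sketch (not part of the original answer) of sending that same command from code instead of Postman. It assumes the camera's default access-point address 192.168.1.1 and the standard OSC endpoint /osc/commands/execute; adjust both if your setup differs.

// Sketch: POST the camera.startCapture command to the THETA's OSC API.
// Requires a runtime with a built-in fetch (e.g. Node 18+ or a browser).
async function startBracketCapture(): Promise<void> {
  const response = await fetch('http://192.168.1.1/osc/commands/execute', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      name: 'camera.startCapture',
      parameters: { _mode: 'bracket' },
    }),
  });
  // The OSC command response reports a state such as "inProgress" or "done".
  console.log(await response.json());
}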

Create push notification when value in firebase database change

I want to send a push notification to the user when a value in my Firebase database changes. This is my database:
{
  "Battles" : {
    "00000111-062B3333-4046-4FB4-AA37-C2B05853E497" : {
      "BattleProgress" : "",
      "Player1" : "lzsPuNwHbIZI1J8k40FspYRV4XQ2",
      "Player2" : "tHNBif9csWNCOuftAGLAqvLWNUw1",
      "Score" : "0-0",
      "Turn" : 1
    }
  },
  "users" : {
    "lzsPuNwHbIZI1J8k40FspYRV4XQ2" : {
      "Coins" : 1,
      "Dollars" : 0,
      "FBID" : "ID",
      "GamesLost" : 0,
      "GamesWon" : 0,
      "name" : "Name"
    },
    "tHNBif9csWNCOuftAGLAqvLWNUw1" : {
      "Coins" : 0,
      "Dollars" : 0,
      "FBID" : "ID",
      "GamesLost" : 0,
      "GamesWon" : 0,
      "name" : "Name"
    }
  }
}
So let's say the value of "Turn" changes; then I want a push notification sent out to the player whose turn it is. But how can I detect that the value of "Turn" has changed when the user has shut down the app?
I've read a bit about Firebase Cloud Messaging, but I can't seem to find the answer...
You can do this with Firebase using Cloud Functions. Check out https://firebase.google.com/docs/functions/use-cases and https://firebase.google.com/docs/functions/database-events for more information.
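To make that concrete, here is a minimal, untested sketch of such a Cloud Function (1st-gen Realtime Database trigger, TypeScript). It assumes you additionally store each player's FCM device token at /users/{uid}/fcmToken (that field is not in the data shown above, so the client app would have to write it after requesting a token), and that Turn = 1 means it is Player1's turn while any other value means Player2's turn.

import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';

admin.initializeApp();

// Fires whenever /Battles/{battleId}/Turn is updated, even if the app is closed.
export const notifyOnTurnChange = functions.database
  .ref('/Battles/{battleId}/Turn')
  .onUpdate(async (change) => {
    // Load the whole battle so we can look up the players and the score.
    const battleSnap = await change.after.ref.parent!.once('value');
    const battle = battleSnap.val();

    // Assumed convention: Turn 1 -> Player1, otherwise -> Player2.
    const nextPlayerId = change.after.val() === 1 ? battle.Player1 : battle.Player2;

    // Assumed field: the client app saves its FCM token under the user record.
    const tokenSnap = await admin
      .database()
      .ref(`/users/${nextPlayerId}/fcmToken`)
      .once('value');
    const token = tokenSnap.val();
    if (!token) {
      return;
    }

    await admin.messaging().send({
      token,
      notification: {
        title: 'Your turn!',
        body: `Current score: ${battle.Score}`,
      },
    });
  });

You would deploy this with the Firebase CLI (firebase deploy --only functions); see the linked docs for project setup.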

"Flush Result : No Connectivity" Error for FBSDKAppEvents

I have integrated FBSDKCoreKit.framework to track app events. I am calling [FBSDKAppEvents activateApp] in applicationDidBecomeActive: and enabling logs using [FBSDKSettings enableLoggingBehavior:FBSDKLoggingBehaviorAppEvents].
The log shows the following:
FBSDKLog: FBSDKAppEvents: Flushed # 1473666575, 2 events due to 'Timer' - {
"advertiser_tracking_enabled" = 1;
"anon_id" = "xxxxxx-xxxxx-xxxxx-xxxxx";
"application_tracking_enabled" = 1;
event = "CUSTOM_APP_EVENTS";
extinfo = "[xxx, xxx, xxx]";
"url_schemes" = "[\"xxxxxxxx\"]";
}
Events: [
{
"isImplicit" : false,
"event" : {
"fb_mobile_launch_source" : "Unclassified",
"_session_id" : "xxxxxx-xxxxx-xxxxx-xxxxx",
"fb_mobile_app_interruptions" : 0,
"_logTime" : 1473664599,
"_ui" : "no_ui",
"_eventName" : "fb_mobile_deactivate_app",
"_valueToSum" : 155,
"fb_mobile_time_between_sessions" : "session_quanta_2"
}
},
{
"isImplicit" : false,
"event" : {
"fb_mobile_launch_source" : "Unclassified",
"_ui" : "no_ui",
"_eventName" : "fb_mobile_activate_app",
"_logTime" : 1473665765,
"_session_id" : "96FA9509-AB21-475F-9F44-3005FE5D10BC"
}
}
]
Flush Result : No Connectivity
At the end of the log it shows the error Flush Result : No Connectivity.
Does anyone know why I'm getting this error?
I got it working by following this guide. The FBSDK doc here does not explain each configuration step required for tracking events.