I have a feature in React Native where I load a Wistia video in a WebView. It works fine while the video plays in portrait, but when the video is switched to landscape via the native controller and then closed, the UI gets distorted and the exception below is thrown.
excessive number of pending callbacks: 501. Some pending callbacks that might have leaked by never being called from native code: {"17417":{},"17418":{},"17419":{},"17420":{"module":"UIManager","method":"measureLayout"},"17421":{"module":"UIManager","method":"measureLayout"},"17422":{},"17423":{},"17424":{},"17425":{},"17426":{},"17427":{},"17430":{},"17431":{},"17432":{},"17433":{},"17434":{},"17435":{},"17436":{},"17437":{},"17438":{},"17439":{},"17440":{},"17441":{},"17442":{"module":"UIManager","method":"measureLayout"},"17443":{"module":"UIManager","method":"measureLayout"},"17444":{},"17445":{},"17446":{},"17447":{},"17448":{},"17449":{},"17450":{},"17451":{},"17454":{},"17455":{},"17456":{},"17457":{},"17458":{"module":"UIManager","method":"measureLayout"},"17459":{"module":"UIManager","method":"measureLayout"},"17460":{},"17461":{},"17462":{},"17463":{},"17464":{},"17465":{},"17466":{},"17467":{},"17468":{},"17469":{},"17470":{},"...(truncated keys)...":451}
Existing answers to this error suggest it is caused by a large number of pending async calls, which is not the case here.
The flow is simple:
FlatList > WebView > Wistia link > rotate to landscape > close the video > exception thrown.
I have a toy iOS app that's just for me. It's a native wrapper of a web-based WebAssembly-heavy game.
It's one of those where you set a task, and then wait for it to complete. Great for desktop browsers where you can just leave an open window in the corner of the display and work elsewhere.
WKWebView, however, seems to go inactive and stops processing game content when the app is backgrounded. I'd like any way (evil hacks accepted) to keep this from happening, so that the game does not pause or go inactive.
I've tried a number of hacks, including:
attaching the web view to the key window (deprecated in iOS 13 beta)
using an audio session to keep the app alive in the background
pumping the web view with evaluateJavaScript calls on a timer while in the background
...to no avail. Any solutions welcome!
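For concreteness, the audio-session variant of the hack looked roughly like this (a sketch, not working code from my project; `silence.wav` is an assumed bundled silent file, and the `audio` entry in `UIBackgroundModes` must be present in Info.plist):

```swift
import AVFoundation

var keepAlivePlayer: AVAudioPlayer?

// Hypothetical sketch: loop a silent file so iOS treats the app as actively
// playing audio and keeps it from being suspended in the background.
func startBackgroundKeepAlive() {
    do {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
        try session.setActive(true)
        guard let url = Bundle.main.url(forResource: "silence", withExtension: "wav") else { return }
        keepAlivePlayer = try AVAudioPlayer(contentsOf: url)
        keepAlivePlayer?.numberOfLoops = -1  // loop indefinitely
        keepAlivePlayer?.play()
    } catch {
        print("Keep-alive audio failed: \(error)")
    }
}
```

Even with the session active, the WKWebView content process appears to be throttled independently of the host app, which may be why none of these hacks helped.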
(P.S. I'm aware of a couple of other questions like this one, but I want to ask my own to manage the bounty, and I'm not sure whether the game being WASM makes a difference, vs. the other questions specifically calling out JavaScript.)
My app on iOS 10.3.3 uses outgoing calls only. If no headphones are connected to the iPhone, pressing the lock button ends the call. But what if headphones are connected?
Sometimes (after locking) I see the native call UI. Sometimes I see just a time counter without any buttons (speaker, mute, etc.). Is there a special flag (or a delegate I should implement) to make the native CallKit screen appear after the lock button is pressed? It feels like I have tried everything here...
P.S. I know this screen can be shown when the user takes an incoming call while the iPhone is locked, but I would like to get it (and I did get it at least several times) when locking the device during an outgoing call.
Did I miss something?
I believe this is the native iOS behaviour; CallKit works similarly to native calls, so the call screens cannot be customised in lock mode.
For an iOS app we are developing, I am trying to tightly synchronize the rendering of a 3D scene happening in a WKWebView (using three.js/WebGL) and the OpenGL rendering happening on a separate UIView in the native side of the app (Objective-C). Each time the rendering loop of the native side prepares and presents a frame, a call to a JS function is done via the WKWebView evaluateJavaScript method to render one frame using three.js. The result is that, even though both rendering calls are frame synchronous, visually there's a tiny lag between the display of both scenes that doesn't work for our purposes. This is likely due to the fact that both renderings are happening in separate threads (OpenGL contexts?).
My question is: is it possible to completely synchronize the display of both a WKWebView/WebGL and a UIView (UIGLViewProtocol)?
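The per-frame coupling described above can be sketched like this (assumed names: `renderNativeFrame()` on the native side and a global `renderFrame()` defined by the page's JavaScript):

```swift
import UIKit
import WebKit

// Sketch of driving both renderers from one CADisplayLink tick.
final class FrameSync {
    let webView: WKWebView
    var displayLink: CADisplayLink?

    init(webView: WKWebView) {
        self.webView = webView
    }

    func start() {
        displayLink = CADisplayLink(target: self, selector: #selector(step))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc func step() {
        renderNativeFrame()  // prepare + present the native OpenGL frame (placeholder)
        // evaluateJavaScript is asynchronous: it only *schedules* the JS call
        // in the web content process, so the WebGL frame lands a beat behind
        // the native one even though both are triggered in the same tick.
        webView.evaluateJavaScript("renderFrame();", completionHandler: nil)
    }

    func renderNativeFrame() { /* native OpenGL draw (placeholder) */ }
}
```

The asynchronous, cross-process nature of `evaluateJavaScript` is one likely source of the lag, independent of thread or context differences.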
I'm having an issue with UIWebView running HTML5 games (that another developer is working on). We've tried two different options, and neither is optimal.
Option 1: He renders the HTML5 game with "canvas drawing". Done this way, nothing crashes; however, on iOS 9, when we come back into the app from the background, the web view loads back up but the game runs much slower than normal (the issue is not present on iOS 7.1 and above). By much slower, I mean the animations no longer move at the velocity they had when we first loaded the game. The weird thing is that even if the user opens a different HTML5 game (we're adding multiple games), the animations are slower for that game as well. I've tried dismissing the web view controller when UIApplicationWillResignActiveNotification is posted. Set up that way, the slowness only happens if the app stays in the background for 4 seconds (it's very strange).
Option 2: He renders the game with "WebGL". Rendered this way, the app crashes when it is backgrounded on iOS 8.0 and above. From my research, the crash occurs because iOS can't draw OpenGL ES in the background; I assume the WebGL commands run similar commands to OpenGL ES, hence the crash. Dismissing the web view controller on UIApplicationWillResignActiveNotification still causes the crash.
Has anyone else ever dealt with a situation like this?
I haven't found a good solution to the problem, but I did find a workaround.
When I get the "will resign active" notification, I remove the UIWebView from its view controller's view. When I get the "did become active" notification, I add the UIWebView back.
This works for both cases.
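A sketch of that workaround, in Swift for brevity (assuming the web view lives in a dedicated view controller; names are illustrative):

```swift
import UIKit

// Detach the UIWebView while the app is in the background and reattach it
// on return, so no WebGL/canvas drawing happens while backgrounded.
final class GameViewController: UIViewController {
    let webView = UIWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.frame = view.bounds
        view.addSubview(webView)
        let nc = NotificationCenter.default
        nc.addObserver(self, selector: #selector(detachWebView),
                       name: UIApplication.willResignActiveNotification, object: nil)
        nc.addObserver(self, selector: #selector(attachWebView),
                       name: UIApplication.didBecomeActiveNotification, object: nil)
    }

    @objc func detachWebView() {
        webView.removeFromSuperview()  // stop drawing before backgrounding
    }

    @objc func attachWebView() {
        webView.frame = view.bounds
        view.addSubview(webView)
    }
}
```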
I have a strange situation...
I implemented an in-app camera based on Apple's AVCam sample, and it works just fine. My question is not about the actual camera implementation, but rather: what could cause a view's buttons to work on one iPhone 5S but fail on another iPhone 5S? Both run the same build of the app, they have the same iOS version installed (7.0.4), etc.
The problem is... the camera starts and the camera preview displays just fine, but the buttons on that view (i.e. the shutter release, flash options, front/back camera switch, etc.) all fail to respond. His iPhone 5S is the only one out of four iPhone 5Ses that has the problem.
While trying to narrow down what could be different until I can hook the "sad" iPhone 5S up to my debugger in a few days when I see my client again (it's his phone), we did notice that my phone asked for permission to access my photos and his did not...
Is there perhaps some system setting he could have enabled that would cause this check to be skipped? I ask because I wonder if the camera scene's view controller is waiting for something from that check and is therefore hanging the UI.
Any ideas would be appreciated
Finally tracked down the issue...
The difference was that my developer phone had a few hundred to 1,000 pictures in its camera roll, while my client's phone had about 6,000 pictures, so enumerating them obviously takes longer. If we were patient, the view eventually did come back alive after the enumeration block finished.
Also, I was asking my UICollectionView to scroll to the end (where the newest photos are) after it had finished loading all the camera roll images into itself. With few photos the timing was fine, but with lots of photos the timing was off and it was trying to scroll before enumeration had finished. Since there is no callback for "didFinishReloadingData", the solution was to call the scroll method using -performSelector:withObject:afterDelay: to ensure that it gets called AFTER the enumeration block and reloadData have finished.
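A sketch of the fix, in Swift for brevity (assumed names: `assetsDidFinishEnumerating` is the hook called when the asset enumeration block completes, and `scrollToNewestPhoto` is the deferred scroll):

```swift
import UIKit

final class CameraRollViewController: UIViewController {
    let collectionView = UICollectionView(
        frame: .zero, collectionViewLayout: UICollectionViewFlowLayout())

    // Called once the camera-roll enumeration block completes (assumed hook).
    func assetsDidFinishEnumerating() {
        collectionView.reloadData()
        // Defer the scroll by one run-loop pass so it runs AFTER reloadData
        // has taken effect; a zero delay is enough for the ordering.
        perform(#selector(scrollToNewestPhoto), with: nil, afterDelay: 0)
    }

    @objc func scrollToNewestPhoto() {
        let count = collectionView.numberOfItems(inSection: 0)
        guard count > 0 else { return }
        collectionView.scrollToItem(at: IndexPath(item: count - 1, section: 0),
                                    at: .bottom, animated: false)
    }
}
```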