Can I reload my React Native application using a command? (iOS)

I'm testing on an older iPhone, and I have to shake the phone like a madman to get to the dev menu. Is there a command I can send through the bridge to reload the app or bring up the dev menu? I'm aware of live reload, but that's not what I want.
For example, on my Android device I can run adb shell input keyevent 82 to simulate a shake event and bring up the dev menu.

There is no way to open the dev menu on a physical device without shaking the phone. Here is the GitHub issue tracking this problem: https://github.com/facebook/react-native/issues/10191.
Nevertheless, you can always try your code inside the iOS Simulator and open the dev menu using ⌃⌘Z.

Related

How do UI app inspectors (like the one in Appium) work?

Appium has a way to inspect the view hierarchy of an app using an inspector. I am interested in building one myself.
I know the high-level answer: it uses some WebDriver implementation to accomplish this. But how?
It puzzles me that a separate iOS app can somehow communicate with another app, and even show its screen.
How does it work under the hood? How does the iOS app communicate with the UI inspector to send it screenshots and the view hierarchy?
It puzzles me that a separate iOS app can somehow communicate with another app, and even show its screen.
Yes! Apps should not be able to do this. But there exists a special kind of app, built just for testing, which IS able to do this.
The way this is done is using Apple's XCUITest framework. When you write an XCUITest in Xcode, it builds a special app which is able to launch your app under test and then communicate with it using the XCUITest methods. These methods allow you to inspect elements in the view.
To create a view tree, you start at the root view and iterate over the children, building out the tree with a standard traversal.
Normally, the XCUITest app exits when your test script finishes, which means you won't be able to access it from a desktop app for viewing the tree as it updates. If you write your test script to run an infinite loop and open a network port for communication with an outside process, you can build your viewer on top of that. This is exactly what Appium does, so I suggest you check out the Appium source code, or maybe just use that.
More information is in this blog post.
[edit]: Oh yeah, Appium uses Facebook's WebDriverAgent project as the script that runs on the app. So WebDriverAgent is basically an XCUITest script which runs a server and can take commands during a test. Appium does a ton of work to bundle and package it into the special kind of companion app that is able to access your app, installs it on the iOS device, and then runs the test. WebDriverAgent has a command which iterates over the UI hierarchy and returns the whole tree.
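To make that concrete, here is a minimal sketch using the Appium Python client (an assumption; any WebDriver client works the same way). driver.page_source asks WebDriverAgent for that whole tree as XML, which you can then walk recursively, exactly the traversal described above. The capabilities and bundle id are placeholders:
from xml.etree import ElementTree
from appium import webdriver

# Hypothetical capabilities; adjust for your device and app under test.
caps = {
    "platformName": "iOS",
    "deviceName": "iPhone 8",
    "bundleId": "com.example.myapp",
    "automationName": "XCUITest",
}
driver = webdriver.Remote("http://localhost:4723/wd/hub", caps)

# WebDriverAgent returns the entire view hierarchy as one XML document.
root = ElementTree.fromstring(driver.page_source)

def print_tree(element, depth=0):
    # Recursively visit each node's children to build/print the tree.
    print("  " * depth + element.tag, element.attrib.get("name", ""))
    for child in element:
        print_tree(child, depth + 1)

print_tree(root)
driver.quit()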

Appium cannot find elements in Android after a device popup or activity

I am using Appium 1.8.1 to automate my Android application.
When a device alert appears, Appium stops finding elements after I click the OK button on that popup.
Currently I relaunch the application after any popup appears, and that works, but I need a proper solution; as I understand it, relaunching the application is not the right fix.
In Python:
# switch back to the first window handle after the popup is dismissed
driver.switch_to.window(driver.window_handles[0])
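A more direct approach than relaunching is to dismiss the popup explicitly and then re-establish the native context. A minimal sketch with the Appium Python client, assuming a stock Android AlertDialog (android:id/button1 is the conventional id of its positive/OK button; adjust for custom dialogs):
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

def dismiss_popup(driver, timeout=5):
    # Wait for the popup's OK button to be tappable, then tap it.
    WebDriverWait(driver, timeout).until(
        EC.element_to_be_clickable((By.ID, "android:id/button1"))
    ).click()
    # Make sure subsequent lookups run against the native context again.
    driver.switch_to.context("NATIVE_APP")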

iOS Monitor like Android Monitor for debugging?

I'm new to iOS development. What I'm trying to work out is how to get information about my app when I run it on an iPhone.
For example: when I develop for Android, I connect my phone (with debugging mode enabled) and open Android Monitor, so I can see if something went wrong and the monitor shows me something like:
line 425: NullPointerException ...
The problem: I built my hybrid app with HTML and Ionic on a Windows PC, so I'm using Ionic View to show my app on my friend's iPhone.
Then I downloaded Xcode on my friend's Mac, and I'm trying to find out what's wrong with my app, because it shows me a white screen.
So maybe I could find my app's error if I could do something like what I described with Android Monitor.
Thanks for helping!
There are several places to see errors and log output, depending on what you are doing. If you are running your app via Xcode, you can see the console output in the debug view, which you can open with the debug-view button on the Xcode toolbar.
The debug view appears at the bottom of the Xcode window and can have two panes: a variable view and the console output. The console output area shows output from your app as it runs in Xcode. You can open and close the two panes using the two buttons at the bottom right when the debug view is showing.
If you are not running your app via Xcode, you can connect your device to your Mac with a USB cable and then select Window > Devices from the Xcode menu. The window that opens lets you connect to your device and see the crash logs stored on it. If your app is crashing, this should let you see its crash logs.
Alternatively, you can see the console output (similar to Android Monitor) by running the Console app on your Mac while your iOS device is connected. In the Console app sidebar you should see your iOS device; select it to see that device's console.
The above might show you what is happening in your app as it runs.
Hopefully, this helps :)

Electron JS kiosk mode with touch screen

We are working on a customer-facing Electron app which should run in kiosk mode. The application runs on a touch-enabled device with Windows 10.
Even when the app is in kiosk mode, users can easily get into the OS by using its swipe gestures (swipe left and swipe right).
What is the ideal way to lock down the app and prevent users from interacting with the OS?
There doesn't appear to be a way to disable touch screen gestures in Windows 10.
If you don't need the full Node integration offered in Electron, i.e. if your app could run entirely in Chrome, you can run it from a cheap Android dongle and lock it down much more easily. I've done this a few times, and there are apps which let you add a password, etc.
Alternatively, you could listen for the blur event on your BrowserWindow, which is fired when your app loses focus. At that point you may be able to bring it to the foreground again:
// in the renderer process: get a handle on the current BrowserWindow
const mainWindow = require('electron').remote.getCurrentWindow();

// whenever the window loses focus, force it back to the front in kiosk mode
mainWindow.on('blur', () => {
  mainWindow.restore();
  mainWindow.focus();
  mainWindow.setKiosk(true);
});
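(Note that the remote module is only available in the renderer process; if you create the BrowserWindow yourself in the main process, you can attach the same blur listener to that instance directly.)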
I was able to disable the edge-of-screen touchscreen gestures by editing a Group Policy value:
Run gpedit.msc from the Run box
Computer Configuration > Administrative Templates > Windows Components > Edge UI > Allow edge swipe: set to Disabled
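If you prefer not to click through the policy editor, the same setting can be written directly to the registry. Assuming the standard registry location backing that Group Policy (worth double-checking on your Windows build), the equivalent .reg file, in the same style as the ones further down, would be:
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\EdgeUI]
"AllowEdgeSwipe"=dword:00000000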
A method I've used since Windows 8 is to kill the explorer.exe process shortly after login, achieved with a scheduled task set to run 20-30 seconds after login, with a command like this:
taskkill /F /IM explorer.exe
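For reference, such a task can be created from an elevated command prompt; this is a sketch (the task name is arbitrary, and the 30-second delay matches the suggestion above):
schtasks /create /tn "KillExplorer" /sc onlogon /delay 0000:30 /tr "taskkill /F /IM explorer.exe" /rl highest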
If you need to work on the system, remote in or connect a keyboard and press CTRL+ALT+DEL to bring up Task Manager. Go to File > Run new task, enter explorer in the Create new task dialogue, and hit Enter.
When done, reboot the system and all is back to normal.
What I've done to disable the swipe functionality of Windows is to launch our kiosk application instead of explorer.exe.
So, we often deal with multiple displays, and many of our apps are written for the Chrome browser. That means an automated kiosk has to run a script that launches full-screen incognito/kiosk browser instances on two or more displays. We use AutoHotkey to automate the placement and full-screening of the browser windows, as well as to run a watchdog script that monitors for possible browser crashes and popups.
Unintended defocusing by the OS is the last thing we want.
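As an illustration of that launch-and-watchdog step, here is a minimal Python sketch (the original setup uses AutoHotkey instead; the Chrome path, URLs, and display offsets are placeholder assumptions):
import subprocess
import time

# Assumed Chrome install path and kiosk pages; adjust for the actual machine.
CHROME = r"C:\Program Files\Google\Chrome\Application\chrome.exe"
PAGES = [
    ("http://localhost/left", 0),      # first 1920x1080 display
    ("http://localhost/right", 1920),  # second display, offset one screen width
]

def launch(url, x_offset):
    # A separate profile dir per instance, otherwise Chrome reuses one process.
    return subprocess.Popen([
        CHROME,
        "--kiosk",
        "--incognito",
        "--no-first-run",
        f"--user-data-dir=C:\\kiosk\\profile-{x_offset}",
        f"--window-position={x_offset},0",
        url,
    ])

procs = {spec: launch(*spec) for spec in PAGES}

# Crude watchdog: relaunch any browser instance that exits or crashes.
while True:
    for spec, proc in procs.items():
        if proc.poll() is not None:
            procs[spec] = launch(*spec)
    time.sleep(5)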
Anyway, to replace the Explorer shell with your app, I've found this registry edit to work:
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
"Shell"="D:\\path\\to\\your\\appFile.bat"
Copy that into a .txt file, rename the extension to .reg and run it.
To revert:
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
"Shell"="explorer.exe"
Since Windows cannot run an .ahk file directly as the shell, we just run a .bat file that runs the .ahk file, which in turn launches all the browsers and does all the other initialization magic.
For remote work, we do the same as user Bink mentioned: bring up the Run dialog and run explorer.exe. When done, reboot.
If you have Windows Home and you don't have gpedit.msc (even though you can enable it using https://www.itechtics.com/enable-gpedit-windows-10-home/), I found this working solution:
https://www.tenforums.com/tutorials/48507-enable-disable-edge-swipe-screen-windows-10-a.html

Access an iPad's Device Settings using UIAutomation

Can you access the iPad's Device Settings using the UIAutomation instrument?
I can use deactivateAppForDuration, but that does not allow me to exit the app, navigate to the Settings page, change a setting, and then navigate back to the app and move on.
Any suggestions?
No, you cannot automate anything outside of the target application as of right now. When your app leaves the foreground, the automation tool loses control until the app returns (if it ever does).
