iPhone UI Automation unrestricted to a single app - ios

I hope this is not a duplicate, but I couldn't find what I'm interested in.
I'm trying to build some automated tests that will use an iPhone. The best tool I can find for this seems to be UI Automation, but from what I can tell so far, I need to run it against a specific app. My tests need to be more general. For example, I want to be able to answer an incoming call in the default dialer.
I would like to be able to automate the iPhone itself, not a specific app. My tests might involve switching between apps or making calls. The main features I need are to be able to take a screenshot and touch certain coordinates on the screen, regardless of what is on the screen and running at the time.
Can anyone suggest the proper way to set this up? I'd like to use the supported UI Automation tools, but would like to use them in a more general way.
Thanks
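A minimal sketch of the screenshot-plus-coordinate-tap part, written against the newer XCUITest API rather than the Instruments-based UI Automation mentioned above. The Springboard bundle identifier and the assumption that a test runner may activate it are not guaranteed to cover system UI such as the incoming-call screen:

    import XCTest

    final class DeviceLevelUITests: XCTestCase {
        func testTapPointAndCaptureScreen() {
            // Bring Springboard (the home screen) to the foreground instead of a
            // specific app under test. System UI such as the incoming-call screen
            // is not guaranteed to be reachable this way.
            let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")
            springboard.activate()

            // Tap a point expressed as a fraction of the frame (here: the center).
            springboard
                .coordinate(withNormalizedOffset: CGVector(dx: 0.5, dy: 0.5))
                .tap()

            // Capture whatever is currently on screen, regardless of the frontmost app.
            let screenshot = XCUIScreen.main.screenshot()
            let attachment = XCTAttachment(screenshot: screenshot)
            attachment.lifetime = .keepAlways
            add(attachment)
        }
    }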

Related

Embed Unreal Engine 4 project into another app

I've been trying to work on a proof of concept (POC) where I can embed a UE4 project into an existing application (in my case NativeScript), but this could just as easily apply to Kotlin or React Native.
In the proof of concept I've been able to run the projects on my iPhone, launching from UE4, pretty easily by following the Blueprint and C++ tutorials for the FPS. However, the next stage of my POC requires that I embed the FPS into an existing NativeScript application; this application will manage the root menu, chat, and store aspects of the platform in the POC.
The struggle I'm running into is that I can't find a way to interact with the Xcode project generated from the Blueprint tutorial, and the C++ tutorial generates an Xcode project where I'm unsure where the actual root is that I need to wrap.
Has anyone seen a project doing this before, and if so, are there any blogs or guidance you can point me to? I've been Googling and looking around for a couple of weeks and have hit a dead end. I found a feedback post here from April of 2020 that was referring to a post from January 2020 about how Unity has a way to embed into other applications, as well as a question from 2014 here. But other than that it's a dead end.
A slightly different approach
Disclaimer: I'm not a UE4 developer. Guilty as charged for seeing an unanswered bounty too big to ignore. So I started thinking and looking, and I've found something that could be bent to your needs. Enter Pixel Streaming.
Pixel Streaming is a beta feature that is primarily designed to allow embedding the game in a browser. This opens two-way communication between a server, where the GPU-heavy computation happens, and a browser, where the player can interact with the content; mouse clicks and other events are sent back to the server. Apparently it allows some additional neat stuff, but that is not relevant to the question at hand.
Since you want to embed the Unreal application into your NativeScript tool (a menu of some kind, if I understood correctly), you could build your application from two separate parts:
One part would run the server.
The second part would handle the overlay via Pixel Streaming.
This reduces the issue of embedding UE4 into an application to the (possibly easier) issue of embedding a browser into your application. (Or, if your application is browser-based, voila, problem solved.)
If you don't want to handle remote communication, just have the server side run on localhost (with the nice side effect of saving bandwidth).
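A minimal sketch of that "embed a browser pointed at localhost" idea on the iOS side, assuming a native (or NativeScript-wrapped) view controller and a Pixel Streaming signalling/web server already running locally; the host, port, and any App Transport Security exception for plain-HTTP local traffic are setup-dependent:

    import UIKit
    import WebKit

    // Minimal sketch: show the Pixel Streaming player page inside the app.
    // Assumes the Pixel Streaming signalling/web server is reachable on
    // localhost; the URL below is a placeholder.
    final class PixelStreamViewController: UIViewController {
        private let webView = WKWebView()

        override func viewDidLoad() {
            super.viewDidLoad()
            webView.frame = view.bounds
            webView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            view.addSubview(webView)

            if let url = URL(string: "http://127.0.0.1/") {
                webView.load(URLRequest(url: url))
            }
        }
    }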
Alternatively, if you are feeling adventurous, you could write your own WebRTC support on the application side to bypass the need for the browser altogether. It might not be worth the effort, though.
Side note: the first of the links you provided is a feature request, which hints at the unfortunate fact that UE4 doesn't support embedding. This is further reinforced by one of the people there saying something along the lines of "Unity can do this, it would be nice if UE4 could as well."
Yet another approach:
You could embed and use a virtual display to insert the UE4 part into your controller; you would basically be tricking UE4 into thinking that the desired display device is a canvas inside your application.
This thread suggests a similar approach:
In general, the way to connect two libraries like this would be through a platform-dependent window handle, e.g. an HWND under Windows. Check the UE API to see if there is any way to bind the render target to an HWND. Then you could create a wxWindow in wxWidgets and tell UE to render into that window. That would be a first step.
I'm not sure if anything I've listed will be of much help but hey, at least I tried :-). Good luck with your game.
At the same time, the author suggests that you could:
Reverse the problem:
Use the UE4 Slate framework and the Online Subsystem. You would use the former to create the menus you need directly in UE4 and then use the latter to link to the logic you want to have outside of UE4. However, that is not what you asked for, so I'm listing it only for completeness' sake.

How can I automate specific settings in iOS?

I'm no new user of Apple products, but I am brand new to coding and automation on iPhone. I've created an automation that does something at a specific time of day, but that's as far as I've gotten, because the settings on offer are limited unless I code something in myself. I'm trying to change a specific setting in the Settings app (Notifications -> (App) -> turn notifications off), but I'm not sure how, or even whether it's possible, outside of what's already programmed in. Thanks in advance if anyone figures it out and lends a hand. (Edit: I would automate Do Not Disturb, because that seems like the simple answer, but long story short, I don't want anyone to see banners of messages and read my texts while my phone is locked, and Do Not Disturb doesn't do that as far as I'm aware.)

How to build micro apps in iOS

We have a requirement to build apps within an app, like the main Google app: when we tap the app icon, we get options like Gmail, Chrome, etc., and those apps function independently. How do we organise these apps within the main app?
Could anyone give a basic idea of how to build micro apps for iOS?
From my point of view, there is no such thing as a "micro app". What you seem to want is to put various functions into one single app, which is not a good idea.
Thinking about your example with Google, they aren't doing what you are describing. If the functionality is big enough, they put it in a separate app (Gmail, Maps, ...). The same goes for you.
If you really want to go that way, though, I would highly suggest making separate storyboards for every "micro app" to keep things easier to maintain.
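A minimal sketch of that storyboard-per-module idea; the helper function and the "Mail" storyboard name are hypothetical, and each "micro app" storyboard is assumed to define an initial view controller:

    import UIKit

    // Each feature ("micro app") ships its own storyboard and is presented
    // from the main menu on demand.
    func launchMicroApp(named storyboardName: String, from presenter: UIViewController) {
        let storyboard = UIStoryboard(name: storyboardName, bundle: .main)
        guard let root = storyboard.instantiateInitialViewController() else { return }
        root.modalPresentationStyle = .fullScreen
        presenter.present(root, animated: true)
    }

    // Usage from the main menu:
    // launchMicroApp(named: "Mail", from: self)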

Any way to run an iPhone simulator on a web app?

Is there an iPhone simulator plugin anywhere? We want to load some of our apps onto our demo website for marketing purposes.
Does anyone know of anything that can do this?
Thanks
It's impossible unless Apple releases some sort of plugin itself (which, I can confidently say, will never happen). The closest you can get is to use mockup templates/scripts to simulate the behavior of your app with JavaScript/CSS/HTML5 canvas.
You can look into some web apps that provide similar functionality. Of course they're not like the real thing, but at least they can be a good starting point, and even complete solutions for relatively simple projects. I googled "web iphone app mockup" and got a few results: https://www.fluidui.com/editor/live/ or http://iphonemockup.lkmc.ch/ may be helpful.

Create control like assistive touch

We have functionality that we would like to be available outside of the app, and we are wondering if there is a way to launch a small control that lives on when the app isn't running and can be "overlaid" onto any other app.
The control is just a way to help people with disabilities navigate the phone more easily using new gestures.
Maybe something with a settings bundle?
What I think you want to look into is creating Cydia Substrate (formerly known as Mobile Substrate) tweaks.
Substrate tweaks allow you to inject code into other iOS processes, including the SpringBoard itself, Preferences.app, or any other app or framework.
In addition, you should check out Ryan Petrich's libactivator, which allows you to build extensions that listen for different user actions or gestures and let you respond to them.
This advice assumes you are deploying to jailbroken devices, as your question tag suggests. Without jailbreaking, you won't be able to do much useful in this area.
References
http://iphonedevwiki.net/index.php/Libactivator
http://iphonedevwiki.net/index.php/MobileSubstrate
