I am trying to take screenshots of my iOS app. Before taking a screenshot, I need to get the app to an appropriate state. To get to an appropriate state, a lot of swiping is required.
This would have been fine if I had actual devices, but I don't, so I need to perform the swipes on a simulator using a trackpad. I find this very hard, and sometimes I can't swipe properly, so the gesture is not recognised.
I thought of using the UI Testing library to programmatically perform swipes. However, my app is actually a game in which random events happen, and writing code to handle those random events would not be worth the time. It would be best if I were in control of the swiping.
I have also thought of adding buttons to the app's UI that simulate a swipe when pressed. I could then just click those buttons instead of swiping with my trackpad, which is much easier. However, those buttons would then appear in the screenshots, which I obviously don't want users to see.
Also note that I can't use a tap gesture recogniser as a replacement for the swipe gesture recognisers, because I need to detect swipes in all four directions and do different things depending on the direction.
Essentially, how can I perform a "swipe" more easily on the simulator? It would be great if I could do this by pressing keys on my keyboard, or maybe there is a feature in Xcode that allows this which I am not aware of?
I suggest you automate the UI test.
Recording a test from live actions is a standard Xcode UI test feature: you can record your gestures once and play them back. Fastlane is the icing on the cake, with tools to automatically run a UI test and capture screenshots at every device resolution.
Check it out here:
Fastlane Screenshot
Even if you do not wish to use Fastlane, you can record the gestures in a UI test and have it pause at the right moment so you can take the screenshot yourself.
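As a sketch of what the recorded (or hand-edited) test might look like, assuming you simply replay gestures on the whole window; the class name, swipe sequence, and snapshot name are all placeholders:

```swift
import XCTest

// Sketch of a UI test that replays recorded swipes and then captures a
// screenshot. The swipe sequence and names below are placeholders.
final class ScreenshotTests: XCTestCase {
    func testNavigateAndCapture() {
        let app = XCUIApplication()
        app.launch()

        // Replay the gestures you recorded (or typed in by hand).
        app.swipeUp()
        app.swipeLeft()

        // With fastlane's snapshot helper set up, you would call:
        // snapshot("01-GameScreen")
        // Without fastlane, attach a screenshot to the test result:
        let attachment = XCTAttachment(screenshot: app.screenshot())
        attachment.lifetime = .keepAlways
        add(attachment)
    }
}
```

You can also set a breakpoint (or a long `sleep`) between gestures to pause the test and take the screenshot manually.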
Related
I am working with an app that supports multi-finger swipe gestures (two-finger swipe down, etc.), and I would like to simulate these in XCUITests. I see that XCUIElement contains functions like swipeUp() and swipeLeft(), but they all seem to be single-finger only. I don't see any other APIs that look like they would allow simulating, for instance, a two-finger swipe down.
Does anyone know of a way to do this?
Unfortunately, there isn't a solution for this until we're given a specific API call for it; the existing gestures are all synchronous, so you can't simulate individual fingers doing different things at the same time.
There are pinch and rotate functions, but they can’t be made to do what you’re looking for.
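For reference, the pinch and rotate calls are the only built-in multi-finger gestures; a hedged sketch of what they look like (the element identifier is a placeholder):

```swift
import XCTest

// The only multi-finger gestures XCUIElement currently exposes are
// pinch and rotate; neither can be decomposed into independent
// per-finger swipes.
func exerciseBuiltInMultiTouch(in app: XCUIApplication) {
    let canvas = app.otherElements["canvas"]     // hypothetical identifier
    canvas.pinch(withScale: 2.0, velocity: 1.0)  // two fingers spreading apart
    canvas.rotate(.pi / 2, withVelocity: 1.0)    // two-finger rotation
}
```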
I have a collection view in my app, which is inside a table view cell (yes, I know it's weird), and I want to UI-test the scrolling of the collection view. I use the recording button for this; however, Xcode identifies the swiping on my collection view as taps. If I manually change the generated UI test code from [collectionview tap] to [collectionview swipeLeft], it works, but Xcode won't generate the code for swiping automatically.
What could be the problem?
Xcode only recognises a gesture as a swipe if your trajectory with the gesture is fast, straight and true to the (up/down/left/right) direction you are swiping in.
My guess is that this prevents drag or tap-and-hold gestures from being recorded as swipes, since those are unsupported by the recording tool; if you were going for either of those, a tap would be the closest recorded match.
As a workaround, note where you expected a swipe and switch the generated gesture by hand, as you have been doing, whenever your swipes aren't recorded.
I believe you should file a bug with Apple and include a sample project.
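The manual switch usually amounts to a one-line change in the generated test. A sketch, assuming the collection view is reachable by query (the identifier is a placeholder):

```swift
import XCTest

// Sketch: replace the tap Xcode recorded with an explicit swipe.
// "cardCollection" is a placeholder accessibility identifier.
func scrollRecordedCollection(in app: XCUIApplication) {
    let collection = app.collectionViews["cardCollection"]
    // Recorded by Xcode:  collection.tap()
    // Hand-fixed version: an explicit horizontal swipe
    collection.swipeLeft()
}
```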
It may be hard for the recording system to differentiate between a tap, a long press, and a swipe. While I've found the recording of tap events to be reliable, I end up manually typing any steps involving swipes or typeText. Generally I use the UI test recording feature to help identify particular elements, which I then work with in code to specify the user interactions and assertions.
If you want to create a sample project on github or somewhere with your collectionView-inside-tableViewCell configuration, I'd be willing to take a look.
EDIT: After trying your example project, I was sometimes able to get Xcode to record swipeLeft and swipeRight events from the first cell. It's not the most elegant approach, but with the trackpad on my MacBook Air, I start a horizontal swipe with one finger without pressing the mouse button, and then press the button with another finger while the first finger is still swiping.
There were some instances when Xcode recorded this simply as a tap, and at least one instance where it recorded a twoFingerTap.
I am building an app which will be displayed on an iPad. It will be an app for customers in a shop, who will be able to leave their contact data. I am going to put the iPad into a frame so they won't be able to press the home or lock buttons; they will only have access to the screen.
My question is: is there any way to prevent them from closing my app with a gesture (fingers pinching together), or with the app-switching gesture (a three-finger swipe left/right)?
I suppose I can set it in settings, but I really need it to be set in the code.
Does iOS provide that feature (disabling system gestures programmatically)?
Thank you
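There is no supported API that fully disables these system gestures in code; the closest a view controller can get is to ask the system to defer edge gestures, so the first swipe is delivered to the app and only a repeated swipe triggers the system behaviour. A sketch of that (a true kiosk lock-down still requires Guided Access or Single App Mode, which are configured outside the app):

```swift
import UIKit

// Sketch: defer (not disable) system edge gestures for a kiosk-style
// screen. The first edge swipe goes to the app; a second consecutive
// swipe still triggers the system gesture.
final class KioskViewController: UIViewController {
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return .all
    }

    override var prefersHomeIndicatorAutoHidden: Bool {
        return true
    }
}
```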
I'd like to perform an action when the user is holding a finger on the screen as my app starts up.
To give an example: when the app launches and the launch screen is showing, the user keeps a finger on the screen until the first ViewController appears. When the first ViewController reaches viewDidAppear(), I want to detect that the user's finger is on the screen and perform an action, e.g. jumping straight into the latest received email. Basically, this is supposed to be a kind of shortcut to a main action.
Is there any method to detect a finger already resting on the screen? To be exact, I'd like to check for the touch in viewDidAppear().
Unless the nature of Time has changed since the last time I checked, your app cannot detect what the user was doing before the app launched. The app, ex hypothesi, was not running at that time. And the mere presence of a finger on the screen during app launch will not generate a touch event that the app can detect.
The system can detect it, however, since it is running before your app launches. That is why Apple added force-touch shortcuts (on appropriate hardware). The only way to do what you're asking is to rely on that API; hardware that lacks the feature will simply have to do without it.
(After all, this is how Apple makes money: by making users jealous of hardware they don't have, so that they buy new hardware. You wouldn't want to rob Apple of its income by back-porting this feature onto old hardware, even if you could, now would you?)
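For completeness, jumping straight to a feature at launch via that API is done with a Home-screen quick action handled in the app delegate; a sketch (the shortcut type string is a placeholder):

```swift
import UIKit

// Sketch of handling a Home-screen quick action, the supported way to
// jump straight to a feature at launch. The type string is a placeholder.
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     performActionFor shortcutItem: UIApplicationShortcutItem,
                     completionHandler: @escaping (Bool) -> Void) {
        if shortcutItem.type == "com.example.openLatestEmail" {
            // ...navigate straight to the latest email...
            completionHandler(true)
        } else {
            completionHandler(false)
        }
    }
}
```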
How can I record operations on iOS, such as touches, moves, and selections? For each action, the time it happened, its position, and its type should be recorded. The recording should then be replayable, with the actions triggered sequentially.
Thank you very much!
I would suggest this approach with a gesture recognizer:
- Add a visual tap effect when you press the screen.
- Record yourself using the app on the simulator with a screen capture.
- Add a video player to the tutorial screen that shows how to use the app.
These three steps can be done using the AVFoundation framework.
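On the recording question itself (time, position, and type, replayed sequentially), the bookkeeping half is straightforward. A minimal platform-independent sketch, leaving out the wiring to real touch events (which would typically mean overriding sendEvent(_:) on a UIApplication subclass):

```swift
import Foundation

// Sketch of a minimal touch-event recorder: each event stores its type,
// position, and timestamp, and replay delivers them back in
// chronological order. Codable conformance lets a session be saved to
// disk and loaded later.
struct TouchEvent: Codable {
    enum Kind: String, Codable { case touch, move, select }
    let kind: Kind
    let x: Double
    let y: Double
    let time: TimeInterval   // seconds since the recording started
}

final class TouchRecorder {
    private(set) var events: [TouchEvent] = []

    func record(_ kind: TouchEvent.Kind, x: Double, y: Double, at time: TimeInterval) {
        events.append(TouchEvent(kind: kind, x: x, y: y, time: time))
    }

    /// Replays events in chronological order, calling `perform` for each.
    /// A real replayer would also sleep for the gap between timestamps.
    func replay(perform: (TouchEvent) -> Void) {
        for event in events.sorted(by: { $0.time < $1.time }) {
            perform(event)
        }
    }
}
```

On-device injection of the replayed events is the hard part; outside of a UI test target there is no public API for synthesising touches, which is why screen-capture or external tools are usually suggested instead.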
or
Another way to do this is the Sikuli tool, which lets you automate demo workflows easily: http://www.sikuli.org/download.html