Can Xcode 7 UI Testing be used on an app file alone, without its code? - ios

I have a few app/IPA files. Using Instruments UI Automation, I could perform actions using a JavaScript file and a terminal command; I didn't need the code/project of the app file. But from Xcode 7 onwards, UI Automation is deprecated and Apple has introduced UI Testing. From the limited tutorials available on the internet, I gather that UI Testing can only be implemented on an Xcode project; it cannot be run on an app file individually. Please correct me if my understanding is wrong, and guide me on how to do it.
Thanks in advance :-)

I don't know exactly how you could write UI tests against standalone APP/IPA files.
But as far as needing the app's source code goes, I would say no: UI Testing uses the view hierarchy to find UI elements and perform actions, e.g. app.buttons["Hi"].tap().
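For illustration, here is a minimal sketch of what such a test looks like (the button and label identifiers are hypothetical). Note that the test class has to live in a UI-testing target inside an Xcode project, even though it only queries the running app's accessibility hierarchy:

import XCTest

class ExampleUITests: XCTestCase {
    func testTapHiButton() {
        // XCUIApplication launches the app under test; queries resolve
        // against the live accessibility/view hierarchy, not the source code.
        let app = XCUIApplication()
        app.launch()

        // Find the button by its accessibility label and tap it.
        app.buttons["Hi"].tap()

        // Assert on the resulting UI state (hypothetical label).
        XCTAssertTrue(app.staticTexts["Hello"].exists)
    }
}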

Related

How Do You Debug Swift PlaygroundBook?

While exploring Playground Book sample code, like this one, I find it very tedious to get the code to run because of Playground Book's limitation on where the code can run: only in Swift Playgrounds on an iPad.
There is no way to debug the "Sources" / auxiliary code in Swift Playgrounds on iOS, since all the source files are shown as uneditable plain text. You have to open the source files in Xcode to edit them, but then you can't compile or run them!
This is especially tedious for the sample code above, which uses PlaygroundValue, a persistence API that requires the Playground Book format, so I still couldn't get the code to run by separating all the source code into a standalone Playground file to run on the Mac.
Since the sample code above is outdated, I find it nearly impossible to debug it properly and get the code to run. You'd have to:
Deploy the code on iPad. Run the Book.
See many error messages on iPad.
Go back to Xcode on Mac and debug them one by one, manually.
Deploy the code on iPad again to run. Repeat the process.
Even after all the errors are resolved, you can still be faced with cryptic "Problem Running Playground" without any further concrete explanation.
What's your workflow to productively debug and deploy code with Playground Book? The current workflow seems impractical to me; I think there must be a better way, but I'm not familiar enough with Playground Book, and my online research doesn't yield any reasonable workarounds.
From a bug report / suggestion I sent to Apple, I got the following reply:
We’ve actually built tools to help debug the auxiliary sources and we did a presentation at WWDC 2018 that demonstrates it. Please view the presentation and get access to the tools here: https://developer.apple.com/videos/play/wwdc2018/413/
Upon further research, I found that they have recently released a Playgrounds Author Template:
The Swift Playgrounds Author Template is a starter Xcode project that will help you create, debug, and produce a Playground book. Using the template you can step through the code for your live view as if it were an app so that you can identify bugs more easily and develop an efficient workflow for developing your Playground books.
This template, requiring Swift 4.1 to run, includes three different targets:
PlaygroundBook
Book_Sources
LiveViewTestApp
You can use the LiveViewTestApp to fully debug your Playground Book right on your Mac with Xcode.
I am not aware of any possibility that does not require you to test the Playground on an actual iPad.
Anyway, you can make developing Swift Playgrounds less tedious by
Using iCloud to synchronize your Mac version with the iPad.
Embedding your Playground in an Xcode project as described in one of my previous answers. That way, you can at least achieve autocompletion during development.
Linking your source files to another target, so that compile errors can be caught before running the Playground.
That said, you will still encounter mysterious "Problem Running Playground" errors from time to time.

Known issues for iOS modularized approach?

We have a classic iOS application, developed in Objective-C, which has a lot of features. The same features are used in other similar apps as well.
Now we plan to revamp the entire application. To reduce the development work, one approach is to modularize the features as frameworks, reusing the same Objective-C code, so that every application can use the frameworks and compile times will be shorter.
As part of the revamp, we will also be moving to iOS 10 and Swift 3.
Please share your ideas/feedback: what issues or limitations might come up with this approach?
Appreciate your help!
Thanks,
Srini
Packaging it as a CocoaPod and deploying it to your company's Git server is the fastest way, I suppose. Packaging it as a framework is also fine, but it involves a lot of boilerplate: a single build can't run on both device and simulator; if you use a fat framework, you need to strip the simulator slices out when you archive; and you have to use embedded frameworks if they depend on each other. It's all pretty annoying.
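For what it's worth, here is a minimal sketch of what consuming such an Objective-C feature framework from a Swift 3 app looks like; the framework and class names are hypothetical. Because the framework's public headers are exposed as a module, no bridging header is needed in the app:

// In the consuming app target, after linking/embedding the framework
// (or pulling it in via CocoaPods):
import FeatureKit   // hypothetical Objective-C feature framework

// Objective-C classes declared in the framework's public headers are
// imported into Swift automatically.
let syncManager = FTRSyncManager()   // hypothetical class
syncManager.startSync()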

How do I take screenshots of my UI with Xcode 7 during UI Testing?

So I downloaded the Xcode 7 beta and I've created some UI tests, but I can't figure out how to take screenshots of my app/UI during the tests.
Can someone please help?
UI Testing in Xcode automatically takes screenshots of your app after every step.
Simply go to one of the tests that has already run (Report Navigator > choose a Test), then start expanding your tests. When you hover your mouse over the steps, you will see an eye icon next to each step that has a screenshot.
Here's an example: notice the eye icon at the end of the gray row. If I were to tap on it, I would see a screenshot of the app right when the button in my app was tapped (since the step is Tap the "Button" Button).
If you want to generate screenshots, you can also use snapshot, which describes how to trigger screenshots in UI tests: https://github.com/fastlane/fastlane/tree/master/snapshot#how-does-it-work
It basically rotates the device to .Unknown (Source), which triggers a snapshot without actually modifying your app's state.
Comparing the output with the generated plist file even lets you name the screenshots properly.
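A minimal sketch of how that looks in a UI test, assuming you have added fastlane's SnapshotHelper.swift to your UI-testing target (the screen names and the "Next" button are hypothetical):

import XCTest

class ScreenshotUITests: XCTestCase {
    override func setUp() {
        super.setUp()
        let app = XCUIApplication()
        setupSnapshot(app)   // provided by fastlane's SnapshotHelper.swift
        app.launch()
    }

    func testTakeScreenshots() {
        // Each snapshot() call triggers the rotation trick described above
        // and records a named screenshot for the run.
        snapshot("01MainScreen")
        XCUIApplication().buttons["Next"].tap()   // hypothetical navigation
        snapshot("02DetailScreen")
    }
}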
Facebook's ios-snapshot-test-case and KIF both run as Unit Tests, and therefore are in the same process as the app. As such, they can directly access views and use something like renderView: or snapshotViewAfterScreenUpdates. Xcode UI Testing runs in a separate process, and therefore cannot directly access the views.
UI Automation, Apple's now-deprecated JavaScript UI testing library, had a function called captureScreenWithName.
Unfortunately, the new Xcode UI Testing lacks any similar function in its testing library, which to me seems like a glaring omission, and I encourage you to submit a Radar for it, as taking screenshots is fundamental to perceptual difference tests (which it sounds like you're trying to do). I hope (and expect) that Apple will address this deficiency in later Xcode updates.
In the meantime, there are more creative approaches to taking screenshots. See this Stack Overflow response for a workaround that involves taking the screenshot in the app itself and then sending it to the test process.
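As a rough sketch of the in-app half of that workaround (Swift 3-era APIs; the function name and the save location are my own choices), the app renders its key window into an image and writes it somewhere it can be collected after the run:

import UIKit

// Called from within the app itself (e.g. via a debug-only hook),
// since the UI test process cannot reach the views directly.
func saveScreenshot(named name: String) {
    guard let window = UIApplication.shared.keyWindow else { return }

    // Render the full window hierarchy into an image context.
    UIGraphicsBeginImageContextWithOptions(window.bounds.size, false, 0)
    window.drawHierarchy(in: window.bounds, afterScreenUpdates: true)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    // Write a PNG into Documents, where the test harness can pick it up.
    if let image = image,
       let data = UIImagePNGRepresentation(image),
       let documents = FileManager.default.urls(for: .documentDirectory,
                                                in: .userDomainMask).first {
        try? data.write(to: documents.appendingPathComponent("\(name).png"))
    }
}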
I've created a tool that saves the last n screenshots from tests and generates a JUnit test results report by parsing the TestSummaries plist file from the test logs: https://github.com/nacuteodor/ProcessTestSummaries
Maybe that helps you.
Facebook's FBSnapshotTestCase can be an alternative solution:
https://github.com/facebook/ios-snapshot-test-case

iOS8 - is there an example of UIAutomation framework from code?

I've been reading about UI automation using Instruments, and the old documentation suggested that this is done using a JavaScript library to access the frontmost app and then its UI view hierarchy.
I see that an iOS 8 device has an "Enable UI Automation" option in the Developer menu in Settings. I also see that there's some documentation on the UIAutomation framework in iOS 8, which seems to suggest that it allows you to do UI automation from code.
Are there examples of using the iOS 8 UIAutomation framework from code that I can look at to understand whether this is the framework for me?
I see this screen when looking for info on the UIAutomation framework, and I think it confused me into thinking that it is available in Objective-C or Swift, because of the buttons at the top. Can someone confirm that this framework is NOT available in either Swift or Objective-C and is still a JavaScript framework?
I believe UIAutomation is still a JavaScript-only testing framework. We would have heard otherwise at WWDC or in the release notes if any other language were supported.
Concrete evidence of this, however, is that the "Automation" instrument used when profiling an app has no language drop-down to indicate another language is possible (as there is, say, when creating a new class in Xcode, where there's a drop-down for Swift and Objective-C).
If you use the automatic recording functionality built into the Automation instrument, the code you see is JavaScript. The lack of options to select another language is telling. Instruments also got a minor visual tweak with Xcode 6, and this fact did not change.
Sadly, the UIAutomation framework doesn't seem to get a lot of love of late (not much of anything has changed since its release in 2010 with iOS 4, leading some to speculate that there's either a major revamp in the works or it's being forgotten).
To see what JavaScript code looks like that's aimed at writing tests for iOS, check out Alex Vollmer's Tuneup JS library: http://www.tuneupjs.org. His library provides a higher level abstraction, while still in JavaScript, over Apple's UIAutomation JavaScript classes.
He has a sample project linked there that runs tests on Apple's own UICatalog example application.
Using a library like Tuneup JS is a better way to go than the more primitive JavaScript classes that Apple provides, which are really just a starting point.
You should look into Subliminal. It's a testing framework built on top of UIAutomation that allows you to write your tests in Objective-C or Swift.
https://github.com/inkling/Subliminal
We have UIAutomation working on iOS 8 using the Illuminator framework (which I wrote), and we use it in our CI. It is a set of extensions to the JavaScript API that UIAutomation provides.
Currently, JavaScript is the only language for UIAutomation that Apple supports.

Creating an iOS library or framework using libgdx (roboVM)

Is it possible to create an iOS library or framework using libgdx (RoboVM) that can be imported into Xcode?
Background:
One of my colleagues has created a 3D visualisation app as a libgdx project for Android and Windows desktop. It can be compiled to run on iOS using RoboVM. However, I would like to wrap extra native user interface elements around it using Xcode. I know it's possible to build the user interface programmatically via RoboVM, but I would be keen to investigate whether it's possible to bring the existing work into Xcode. I don't need to edit the 3D visualisation component, just add extra GUI elements around the 3D vis window. I thought compiling the libgdx (RoboVM) code to a framework or library might be a solution that could be imported?!
Yes, you can do it.
All you need is to create a method, say initRoboVM(), which will be called by your code when you want to initialize libgdx. You'll need to pass the app path in, which you can hard-code while testing.
initRoboVM() will need some modifications: namely, it should not call your Java app's main method; at least, that's something well-behaved libraries should not do, IMO. It should also not call rvmShutdown.
You can get further information from here
Thanks :)
I asked the RoboVM team directly. Their answer: It's not a native function, but it certainly can be done.
The complete message...
Hi,
Sorry for the late reply. This use case is not something we're going to do now. It is possible, though, if you're prepared to do some patching of RoboVM. Search the RoboVM Google Group and you should find others who have managed to get this working.
We get this request every now and then, so we will add support for this eventually.
Regards, Niklas
