How does Accessibility Inspector read the content of an iOS app page?

I am doing research on accessibility feature scanning and have a question.
The Accessibility Inspector built into Xcode can break down a page of a simulated app into its components and analyze them. How does it retrieve the information it needs from the app's view?
What is the algorithm for analyzing the view displayed by another program?
Is it possible for one application to analyze an app that is already installed?
I haven't found any application capable of doing this when the analyzed app is already installed on the device (unlike Accessibility Scanner for Android, which can run on your phone). Are there any restrictions that prevent this?
I would be really grateful for any links or information.
I do not need an exact implementation, just an understanding of the steps involved.
Thank you in advance
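
For background on the first question: what Accessibility Inspector displays is the accessibility representation each view exposes through the UIAccessibility API. Below is a minimal Swift sketch (a hypothetical screen, not a description of the Inspector's internals) of the attributes an app sets and that inspection tools can read back through the system accessibility infrastructure:

```swift
import UIKit

// Minimal sketch (hypothetical view controller): the UIAccessibility attributes
// an app sets on its views. Inspection tools read these back through the system
// accessibility API rather than by inspecting the raw view hierarchy.
final class ScoreViewController: UIViewController {
    private let scoreLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        scoreLabel.text = "42"
        scoreLabel.frame = CGRect(x: 20, y: 80, width: 200, height: 40)
        view.addSubview(scoreLabel)

        // The element's accessibility representation: what Accessibility
        // Inspector (or VoiceOver) sees for this label.
        scoreLabel.isAccessibilityElement = true
        scoreLabel.accessibilityLabel = "Current score"
        scoreLabel.accessibilityValue = scoreLabel.text
        scoreLabel.accessibilityTraits = .staticText
        scoreLabel.accessibilityHint = "Updates after every round"
    }
}
```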

Related

Advice on making first screen load super fast in React Native app

This is more of an advice question than a troubleshooting question. Speed is the focus here. I'm currently working on iOS but obviously planning to build for Android.
I have built a React Native app using the react-native-nfc-manager library. The function of the app is simple: read what is on an NFC tag and write to it in one tap. There are 3 screens:
The first screen (upon app open) is the NFC scanning screen, and it is very simple: just a few components to activate the NFC scanner
NFC tag history screen
Account/settings screen
Screens 2 & 3 require API calls to some external (AWS) resources. No external API calls are needed for the first screen to load or for the NFC scanner to be used. I need to ensure that this first screen loads as fast as possible on app cold/warm/hot start. My goal is to use RN for as much of this app as possible. As I see it, there are a few options:
Use RN to write screens 1, 2, and 3
Use Swift/Obj-C to write screen 1, and RN to write screens 2 & 3
Go completely native.
I really want to be able to reuse my RN code for screens 2 & 3 at the very least; it doesn't matter how long those screens take to load. I almost wonder if there is a way to bundle screens 2 & 3 separately. I've done some searching/reading and can't seem to find exactly the information I'm looking for, so I welcome links/reading material that may help as well. Please try to compare any suggestions to the performance of a fully native application, e.g. "Option 1 will be just as fast as an application written entirely in Swift." Thank you.
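
One way to picture option 2 (not a definitive recommendation): keep screen 1 fully native and create the React Native bridge lazily, so the JS bundle is only loaded when the user first opens screen 2 or 3. A rough Swift sketch under that assumption follows; the module name, bundle file name, and navigation are hypothetical, not the asker's real setup.

```swift
import UIKit
import CoreNFC
import React  // assumption: the RN pods are built as frameworks; otherwise use a bridging header

// Rough sketch of option 2: screen 1 is plain UIKit + CoreNFC, so cold start
// never touches the JS bundle. The React Native bridge is created lazily the
// first time screen 2 or 3 is opened.
final class ScanViewController: UIViewController, NFCNDEFReaderSessionDelegate {
    // Lazily created bridge: loading the JS bundle is deferred until needed.
    private lazy var bridge: RCTBridge? = {
        guard let jsBundleURL = Bundle.main.url(forResource: "main", withExtension: "jsbundle")
        else { return nil }
        return RCTBridge(bundleURL: jsBundleURL, moduleProvider: nil, launchOptions: nil)
    }()

    // Screen 1: start an NFC read session with no RN involvement at all.
    func startScanning() {
        let session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                           invalidateAfterFirstRead: true)
        session.begin()
    }

    // Screens 2 & 3: mount a React Native root view on demand.
    // "HistoryScreen" is a hypothetical registered component name.
    func showHistory() {
        guard let bridge = bridge else { return }
        let rnView = RCTRootView(bridge: bridge, moduleName: "HistoryScreen",
                                 initialProperties: nil)
        let historyVC = UIViewController()
        historyVC.view = rnView
        navigationController?.pushViewController(historyVC, animated: true)
    }

    // MARK: - NFCNDEFReaderSessionDelegate

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // Read (and later write) the tag payload here.
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        // The session ended or failed; surface the error or restart as needed.
    }
}
```

The trade-off is that the first RN-backed screen pays the bridge start-up cost when it is first opened instead of at launch, which matches the stated priority that screens 2 & 3 may load slowly.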

Using iPad with cloud9-ide (terminal keyboard)

I have read all of the posts about Cloud9 being a very poor experience on a touch device, but most of them focus on being unable to access the file structure (double-click).
Has anyone been able to use the terminal from an iPad keyboard? I have a light Rails app for which I can leave most of the tabs open (avoiding the double-click issue), but I cannot use a keyboard in the terminal, which means I can't easily run the Rails console or my app.
The Ace editor used in Cloud9 does not yet work properly on mobile devices. There is an open issue for adding that, but as far as I know it is not currently being worked on. We are considering creating a stripped-down IDE for mobile at some point, but we don't have an ETA for it yet.

How to create a Newsstand magazine app using an open-source framework or free tools?

I would like to create an interactive iOS Newsstand magazine app with features including a page-curl effect, jumping directly to a page from a table of contents, double-tapping to show a horizontal row of page icons at the bottom, pages with images, and links inside the magazine that should open web pages in Safari.
Is there any open-source framework/tool (other than Baker) or free tutorial?
Is it possible to do everything using UIPageViewController?
Is there also any restriction on the number of pages in a magazine app (either a minimum or maximum number of pages per issue)?
The target platform is iOS 6.0 and later; the target devices are iPad and iPad mini running iOS 6.0 and later.
Thanks in advance.
This tutorial explains most of what you need to know. You need to be more specific about which open-source tools you need.
Yes; a UIPageViewController with the page-curl transition style can handle the page navigation (see the sketch after these answers).
Apple does not specify a page limit, but you should publish regularly.
It can be a long journey to create a Newsstand app. You could also take the easy road and release your magazine through something like http://uninkd.com.
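
To illustrate the UIPageViewController answer above, here is a minimal Swift sketch of a page-curl reader. (The question targets iOS 6, where this would be Objective-C, but the API shape is the same; MagazinePageViewController and the page count are hypothetical.)

```swift
import UIKit

// Minimal sketch: a magazine reader built on UIPageViewController with the
// page-curl transition. MagazinePageViewController is a hypothetical view
// controller that renders a single page.
final class MagazineReaderViewController: UIPageViewController, UIPageViewControllerDataSource {
    private let pageCount = 24  // assumption: the issue's page count is known up front

    init() {
        super.init(transitionStyle: .pageCurl,
                   navigationOrientation: .horizontal,
                   options: nil)
        dataSource = self
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not supported") }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Show the first page; swiping then curls forward/backward via the data source.
        setViewControllers([makePage(index: 0)], direction: .forward,
                           animated: false, completion: nil)
    }

    private func makePage(index: Int) -> MagazinePageViewController {
        let page = MagazinePageViewController()
        page.pageIndex = index
        return page
    }

    // MARK: - UIPageViewControllerDataSource

    func pageViewController(_ pageViewController: UIPageViewController,
                            viewControllerBefore viewController: UIViewController) -> UIViewController? {
        guard let page = viewController as? MagazinePageViewController,
              page.pageIndex > 0 else { return nil }
        return makePage(index: page.pageIndex - 1)
    }

    func pageViewController(_ pageViewController: UIPageViewController,
                            viewControllerAfter viewController: UIViewController) -> UIViewController? {
        guard let page = viewController as? MagazinePageViewController,
              page.pageIndex < pageCount - 1 else { return nil }
        return makePage(index: page.pageIndex + 1)
    }
}

// Hypothetical single-page view controller; it would render one page's content
// (images, links that open in Safari, and so on).
final class MagazinePageViewController: UIViewController {
    var pageIndex = 0
}
```

Jumping from a table of contents is just a call to setViewControllers with the target page, so the same structure covers that requirement as well.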

Share data between an iOS native app and a browser app

I am a rookie iOS developer trying to figure this out. Pardon me if I am asking a basic question.
I need to set a unique identifier on the device (iPhone or iPad) that can be read by an app, whether a browser app or a native app. Is this feasible?
The reason: the device needs to be uniquely identified, and the user might use either the installed app or the browser.
Thanks.
You can create an HTML element containing any amount of content and hide it with CSS. Sometimes I use an 'input' element just for that and read it with JavaScript. I think you can use a data URI as well.
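
One complementary approach on the native side is to have the app hand an identifier to the browser through the URL it opens; the web page can then store it however it likes (for example in the hidden element described above). A Swift sketch, where the host, parameter name, and use of identifierForVendor are assumptions:

```swift
import UIKit

// Hedged sketch: the native app reads (or generates) a device-scoped identifier
// and hands it to the browser as a URL query parameter.
// "https://example.com/app" and "device_id" are hypothetical.
func openWebApp() {
    // identifierForVendor is stable across apps from the same vendor on this
    // device, but Safari cannot read it directly, so it is passed explicitly.
    let deviceID = UIDevice.current.identifierForVendor?.uuidString ?? UUID().uuidString

    var components = URLComponents(string: "https://example.com/app")!
    components.queryItems = [URLQueryItem(name: "device_id", value: deviceID)]

    if let url = components.url {
        UIApplication.shared.open(url, options: [:], completionHandler: nil)
    }
}
```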

UI Recorder template not found in Instruments

I have installed Xcode 4 and used profiling for leaks, allocations, etc. However, I can't find the new UI Recorder template in Instruments under iOS; instead I find a template named Automation. Please let me know how to enable the UI Recorder template in Instruments for Xcode 4. Also, any idea what this Automation template is for?
Thanks.
There is no UI Recorder template for iOS applications; it can't be done. The Automation template is for automated UI tests. It runs a JavaScript test script that you put together, which can drive the UI and record successes or failures. You can find details about UI Automation in the Instruments User Guide.
It helps a lot to read the documentation first and ask specific questions later, when you don't understand something in the documentation or something doesn't work as expected. If you don't read the documentation, you'll just get answers where people either reiterate the documentation or point you to it.
Well, instead you can record the user actions of the iOS simulator.
Make sure that the 'Enable access for assistive devices' setting is enabled in your Universal Access pane of System Preferences.
In Instruments, choose the 'UI Recorder' template (Mac OS X).
Start the iOS Simulator and connect it to Instruments by attaching to the 'iPhone Simulator' process in the 'Choose Target' selection box.
Press 'Record'.
I've just tried Roland's suggestion and it works. Not perfectly (it fails to replay some recorded interactions), but it works and creates an editable script! I'm using Xcode 4.6.2, Instruments 4.6, and the iPhone 6.1 Simulator.
