Custom Assistive Touch - iOS

Is it possible to create an app (App A) with a programmatically created view that works very much like Assistive Touch: it stays on top of the UI and is visible pretty much at any time (while using App B, App C, etc.)?
The objective is to be able to take screenshots in arbitrary parts of the interface and then launch an app that can process these screenshots (from the camera roll or wherever else).

Related

How to build a PagerView in watchOS

How can I build, or get a library for, a PagerView on watchOS? I can't find a way to do it without SwiftUI. I have a library for UIKit (https://github.com/WenchaoD/FSPagerView), but I can't use it on watchOS.
Use SpriteKit for custom UI on watchOS
If you can't use SwiftUI, watchOS only offers SpriteKit and SceneKit for custom UI.
Your app isn't a game, but there's no difference to the computer, only to the user. A game is just a fun app.
You need a WKInterfaceController that hosts a WKInterfaceSKScene, and the WKInterfaceSKScene presents an SKScene.
For a pager-type view of images that you can swipe between, you might have a scene with a few SKSpriteNodes, for example three: one on screen, one to the left, one to the right. Move them around and swap their images as the user drags. Or use whatever you want, maybe an SKTileMapNode or so.
For dragging, use a WKPanGestureRecognizer on the WKInterfaceController; be aware that this only works in a fullscreen or navigation-based interface, not in a page-based one. You also want to allow the Digital Crown to be used, so implement WKCrownDelegate.
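Something like the following could serve as a starting point. This is only a minimal sketch, assuming a storyboard with a WKInterfaceSKScene outlet and a WKPanGestureRecognizer wired to the panned(_:) action; the scene size, image names and paging logic are placeholders, not a complete pager:

```swift
import WatchKit
import SpriteKit

class PagerInterfaceController: WKInterfaceController, WKCrownDelegate {
    // Hooked up in the storyboard.
    @IBOutlet weak var skInterface: WKInterfaceSKScene!

    private let scene = SKScene(size: CGSize(width: 184, height: 224))
    private var pages: [SKSpriteNode] = []
    private var startX: CGFloat = 0

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        scene.scaleMode = .aspectFill
        // Three sprites: one on screen, one off to each side (image names are placeholders).
        for (i, name) in ["page0", "page1", "page2"].enumerated() {
            let node = SKSpriteNode(imageNamed: name)
            node.position = CGPoint(x: (CGFloat(i) + 0.5) * scene.size.width,
                                    y: scene.size.height / 2)
            pages.append(node)
            scene.addChild(node)
        }
        skInterface.presentScene(scene)
        crownSequencer.delegate = self
    }

    override func didAppear() {
        super.didAppear()
        crownSequencer.focus()   // route Digital Crown events to this controller
    }

    // Wired to a WKPanGestureRecognizer in the storyboard.
    @IBAction func panned(_ sender: WKPanGestureRecognizer) {
        let translation = sender.translationInObject()
        switch sender.state {
        case .began:
            startX = pages[0].position.x
        case .changed:
            // Drag all pages together; swap textures as they scroll off screen.
            for (i, node) in pages.enumerated() {
                node.position.x = startX + CGFloat(i) * scene.size.width + translation.x
            }
        default:
            break
        }
    }

    func crownDidRotate(_ crownSequencer: WKCrownSequencer?, rotationalDelta: Double) {
        // Let the crown scroll the pages as well.
        for node in pages {
            node.position.x -= CGFloat(rotationalDelta) * scene.size.width
        }
    }
}
```

The idea is simply to move the sprites with the pan translation (and the crown delta) and recycle or re-texture them as they leave the screen; snapping to page boundaries is left out for brevity.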

How can I use ARKit while using Slide Over/Split Screen on iPadOS?

I have an app that uses ARKit to detect faces and send the coordinates of interest over the network, which works well. I would like this app to keep running in the background, still sending the data over the network, while I use another app (almost) full screen.
The 'Enable multiple windows' option is activated in Info.plist, but as soon as I launch the other app, the ARKit app stops sending information (the app probably stops altogether).
Is there a simple way to do this, and is it feasible at all? Thanks!
This is not possible at this point. Camera and AR access is disabled at the system level when an app is displayed in Slide Over or Split View.
I'd recommend displaying a warning message when Slide Over/Split View is in use, telling the user to run the app in full screen. See this answer under a different question for details.
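As a rough illustration of that suggestion (a minimal sketch assuming UIKit; the method names and the window-width heuristic are mine, not from the linked answer):

```swift
import UIKit

extension UIViewController {
    /// In Slide Over/Split View the hosting window is narrower than the screen.
    var isRunningFullScreen: Bool {
        guard let window = view.window else { return true }
        return window.frame.width >= UIScreen.main.bounds.width
    }

    /// Ask the user to go full screen so ARKit can use the camera again.
    func warnIfMultitasking() {
        guard !isRunningFullScreen else { return }
        let alert = UIAlertController(
            title: "Camera unavailable",
            message: "Face tracking is paused in Slide Over/Split View. Please use the app in full screen.",
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default))
        present(alert, animated: true)
    }
}
```

You could call warnIfMultitasking() from viewDidAppear(_:) and again from viewWillTransition(to:with:) so the warning shows up whenever the layout changes.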

Passing Data From native iOS app to Unity App

I have an iOS app and I want to add some animation to it, so I'm thinking of creating a Unity app for that animation. I want to open the Unity app from my iOS app with a button tap.
The animation generates a dynamic number of balls, so I have to pass parameters (total number of balls, number of balls to generate, ball colour code) from the iOS app to the Unity app.
Is there any possible way to pass this data from the iOS app to Unity? I'm thinking of using a URL scheme; is there any other method available?
I am assuming you want to communicate between two apps. There are several ways to do that.
To open another app from within an app, please check this answer here.
Solution 1:
Don't use two apps to do a single animation. That's not good practice and not really feasible. Do the animation inside your iOS app; iOS is powerful enough to handle it, and Apple already offers a suite of game technologies to support game development natively on iOS. You can check this link here.
Solution 2:
The first part of your question is confusing, though. You want to show an animation in your iOS app but create that animation in a Unity app? That is not going to work, because when you open the Unity app, the current app is suspended by default and you will only see the Unity app at that time.
As for the second part of your question, you want to pass some arguments from iOS so the animation plays in the Unity app. To do that, you have to save the data locally (in some kind of database, whatever you like): the total number of balls, the number of balls to generate, and the ball colour code. Then the Unity code has to read those parameters from the database and run the animation accordingly.
Solution 3:
If you can build the app in Unity, build the whole app in Unity.
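If you do go the URL-scheme route from the question, the iOS side might look roughly like this. This is just a sketch: the unityballs:// scheme and the parameter names are invented for illustration, the Unity app would have to register the scheme and parse the query items itself, and the scheme must be listed under LSApplicationQueriesSchemes in the iOS app's Info.plist for canOpenURL(_:) to return true:

```swift
import UIKit

/// Launch the (hypothetical) Unity app and hand over the ball parameters.
func openUnityAnimation(totalBalls: Int, ballsToGenerate: Int, colorHex: String) {
    var components = URLComponents()
    components.scheme = "unityballs"   // made-up scheme the Unity app would register
    components.host = "animate"
    components.queryItems = [
        URLQueryItem(name: "total", value: String(totalBalls)),
        URLQueryItem(name: "generate", value: String(ballsToGenerate)),
        URLQueryItem(name: "color", value: colorHex)
    ]
    guard let url = components.url, UIApplication.shared.canOpenURL(url) else { return }
    UIApplication.shared.open(url, options: [:], completionHandler: nil)
}
```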

Objective-C - Detect if finger is held on screen at app startup

I'd like to perform an action when the user holds a finger on the screen while my app starts up.
To give an example: when the app launches and the launch screen is showing, the user keeps a finger on the screen until the first ViewController appears. When the first ViewController reaches viewDidAppear(), I want to detect that the user's finger is on the screen and perform an action, for example jumping straight to the latest received email. Basically, this is supposed to be a kind of shortcut to a main action.
Is there any method to detect a finger that is already resting on the screen? To be exact, I'd like to check for the touch in viewDidAppear().
Unless the nature of Time has changed since the last time I checked, your app cannot detect what the user was doing before the app launched. The app, ex hypothesi, was not running at that time. And the mere presence of a finger on the screen during app launch will not generate a touch event that the app can detect.
The system can detect it, however, since it is running before your app launches. That is why Apple added force-touch shortcuts (on appropriate hardware). The only way to do what you're asking is to rely on that API. Hardware that lacks the feature will simply have to do without it.
(After all, this is how Apple makes money: by trying to make users jealous of hardware they don't have, so that they buy new hardware. You wouldn't want to rob Apple of its income by porting this feature back onto old hardware, even if you could, now would you?)
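For reference, the quick-action API the answer points to looks roughly like this; the shortcut type string and the email-navigation helper are assumptions for illustration, not part of the asker's project:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Register a dynamic quick action; static ones can be declared in Info.plist instead.
        application.shortcutItems = [
            UIApplicationShortcutItem(type: "com.example.latest-email",
                                      localizedTitle: "Latest Email")
        ]
        return true
    }

    func application(_ application: UIApplication,
                     performActionFor shortcutItem: UIApplicationShortcutItem,
                     completionHandler: @escaping (Bool) -> Void) {
        // Called when the app is launched (or resumed) from the quick action.
        let handled = shortcutItem.type == "com.example.latest-email"
        if handled {
            // navigateToLatestEmail()  // hypothetical helper in the asker's app
        }
        completionHandler(handled)
    }
}
```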

Create a floating GUI button on iOS home screen

I'm creating an iOS Swift app that will allow users to play sounds while in another app. The app is hard to explain, but I would like to know how to create something like the Assistive Touch or Zoom controllers. Like this: http://www.phonecruncher.com/siteimage/scale/0/0/368401.gif
As you can see, there is a button on the screen (always on top) and the user needs to be able to open a menu from that button. The user will play the sounds from there.
Is there any way I can do this in Swift in Xcode?

Resources