I have an iOS app and I want to add some animation to it, so I am thinking of creating a Unity app for that animation. I want to open the Unity app from my iOS app with a button click action.
The animation generates a dynamic number of balls, so I have to pass parameters (total number of balls, number of balls to generate, ball colour code) from the iOS app to the Unity app.
So is there any way to pass data from the iOS app to Unity? I'm thinking of using a URL scheme. Is there any other method available?
I am assuming you want to communicate between two apps. There are several ways to do that.
To open another app from within an app, please check this answer here.
Solution 1:
Don't use two apps to do a single animation. That's not good practice and not feasible. Do the animation inside your iOS app; I believe iOS is powerful enough to handle it. Apple already provides a suite of game technologies to support game development natively on iOS. You can check this link here.
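For example, here is a minimal sketch of how the ball animation could be done natively with SpriteKit. The class name, radius, and parameters are illustrative only:

```swift
import UIKit
import SpriteKit

/// A minimal SpriteKit scene that spawns a given number of coloured balls
/// and lets the physics engine animate them. Names and values are placeholders.
class BallScene: SKScene {
    let totalBalls: Int
    let ballColour: UIColor

    init(size: CGSize, totalBalls: Int, ballColour: UIColor) {
        self.totalBalls = totalBalls
        self.ballColour = ballColour
        super.init(size: size)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func didMove(to view: SKView) {
        // Keep the balls inside the visible scene.
        physicsBody = SKPhysicsBody(edgeLoopFrom: frame)

        for _ in 0..<totalBalls {
            let ball = SKShapeNode(circleOfRadius: 12)
            ball.fillColor = ballColour
            ball.position = CGPoint(x: CGFloat.random(in: 0...size.width),
                                    y: size.height - 20)
            ball.physicsBody = SKPhysicsBody(circleOfRadius: 12)
            addChild(ball)
        }
    }
}
```

You would present this scene in an SKView inside your view controller, e.g. `skView.presentScene(BallScene(size: skView.bounds.size, totalBalls: 20, ballColour: .red))`.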
Solution 2:
If you can build the app in Unity, build the whole app in Unity.
Solution 3:
The first part of your question is confusing, though. You want to show the animation in your iOS app but create it in a Unity app? That is not going to work, because when you open the Unity app, the current app is suspended by default and you will only see the Unity app at that time.
As for the second part of your question, you want to pass some arguments from iOS so the animation can play in the Unity app. To do that you have to save the data locally (in some kind of database, whatever you like): the total number of balls, the number of balls to generate, and the ball colour code. Then the Unity app has to read those parameters from that storage and run the animation accordingly.
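Alternatively, since you mentioned URL schemes: if the Unity build registers a custom URL scheme (the scheme name `myunityballs` below is hypothetical), a rough sketch of passing the parameters when opening it could look like this:

```swift
import UIKit

/// Builds a custom-scheme URL carrying the ball parameters and asks iOS to open it.
/// Assumes the Unity app declares the (hypothetical) scheme "myunityballs" in its Info.plist,
/// and that this app lists that scheme under LSApplicationQueriesSchemes (iOS 9+).
func openUnityAnimation(totalBalls: Int, ballsToGenerate: Int, colourHex: String) {
    var components = URLComponents()
    components.scheme = "myunityballs"
    components.host = "animate"
    components.queryItems = [
        URLQueryItem(name: "total", value: String(totalBalls)),
        URLQueryItem(name: "generate", value: String(ballsToGenerate)),
        URLQueryItem(name: "colour", value: colourHex)
    ]

    guard let url = components.url, UIApplication.shared.canOpenURL(url) else {
        print("Unity app not installed or scheme not whitelisted")
        return
    }
    UIApplication.shared.open(url, options: [:], completionHandler: nil)
}
```

The Unity app would then have to parse the incoming URL's query items on its side before starting the animation.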
Related
I have an app that uses ARKit to detect faces and send the coordinates of interest over the network, which works well. I would like this app to keep running in the background, still sending the data over the network, while I use another app (almost) fullscreen.
The option 'Enable multiple windows' is activated in Info.plist, but as soon as I launch my other app, the ARKit app stops sending information (the app probably actually stops).
Is there a simple way to do this, and is it even feasible? Thanks!
This is not possible at this point. Camera and AR functionality is disabled at a system level when apps are displayed in Slide Over or Split View.
I'd recommend displaying a warning message when Slide Over/Split View is being used, saying that the app should be used in full-screen mode. See this answer under a different question for details.
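As a rough idea, one way to detect that the app is not full screen is to compare the window's bounds with the screen's. The check and the alert text below are just placeholders:

```swift
import UIKit

extension UIViewController {
    /// Rough check for Slide Over / Split View: the window no longer
    /// spans the full screen width. Good enough for showing a warning.
    var isRunningInSplitScreen: Bool {
        guard let window = view.window else { return false }
        return window.bounds.width < UIScreen.main.bounds.width
    }

    /// Example of warning the user; the message text is only a placeholder.
    func warnIfNotFullScreen() {
        guard isRunningInSplitScreen else { return }
        let alert = UIAlertController(
            title: "Full screen required",
            message: "Face tracking is unavailable in Slide Over or Split View. Please use the app in full screen.",
            preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default))
        present(alert, animated: true)
    }
}
```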
I am writing a piece of code in Swift for iOS.
It uses location services. I need to take a screenshot of the current screen every time the location gets updated.
The thing is, when the app is running in the background and updating the location, the screenshots taken are of my app, not the app that is currently on the screen.
Assume the user is playing a game on their phone while my app runs in the background updating the location. When the user's location changes significantly, the screenshot should be taken, and it should show the game, not my app's screen.
Can someone tell me how to achieve the screenshot capability I want?
Short answer: You can't.
Apps can only take screenshots of their own content. Starting with (I think) iOS 7, Apple blocked the ability of apps to take screenshots of anything but their own content. This is for security reasons.
Imagine your app echoes each character of your password as you type it, and then replaces it with a bullet as you type the next character. Now imagine an app that runs in the background and takes a screenshot every time your app displays a new character of the user's password.
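For completeness, the only thing that is allowed is rendering your own window hierarchy, roughly like the sketch below; it will never include another app's content:

```swift
import UIKit

/// Captures the app's own key window. This only works for your own content;
/// whatever another app is showing on screen is never included.
func snapshotOwnWindow() -> UIImage? {
    guard let window = UIApplication.shared.keyWindow else { return nil }
    let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
    return renderer.image { _ in
        window.drawHierarchy(in: window.bounds, afterScreenUpdates: true)
    }
}
```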
When my Google Cardboard app starts up for the first time, it launches a Google page that says "Google Cardboard. Let's get you set up. Pair your phone with your viewer for the best experience."
I'd like to control this experience and show my own initial screen before the Google Cardboard SDK launches. One reason for this is that I want people without a Cardboard viewer to be able to start experiencing the app in non-Cardboard mode immediately, without having to go through the pairing process.
I know I can launch the pairing / viewer profile selector later on with cardboard.ShowSettingsDialog().
One way to do this with the current version of the SDK is to have a starter scene without a Cardboard component in it. It will not be in VR, and it won't trigger onboarding.
[edit] The function OnFocus() in CardboardiOSDevice.cs is where the SDK decides to launch the onboarding dialog. You can suppress that by editing this function. But you may want to keep using the onboarding dialog for first-time users, rather than the settings dialog, because it walks them through the scanning process.
I am developing an iPhone app for internal purposes (I don't want to put it on the App Store).
I want some functionality similar to the built-in VoiceOver feature on the iPhone.
I want my app to stay in the background, and when the user touches any button, text field, or other UI element, my backgrounded app should be able to read the accessibility label of the element that was touched and then perform some action (similar to the speech in VoiceOver). Are there any private APIs available to get this information? If it is only possible on a jailbroken device, that is still OK for me.
You should try to use the IOHID framework to do this.
Look at this question, because it has the code showing how to do this:
iOS touch event notifications (private API)
Although I am not sure this method still applies in iOS 7 (I believe it was discussed in the iOS 6 timeframe).
I want to make an app as my master's thesis on iOS 7, but I need to know if I can run the camera in the background. I need to record video after the user presses the home button (it should be a traffic cam for the car while the user uses another app, e.g. navigation). I know it was not possible in iOS 6, but I know that iOS 7 has better support for background tasks... Is it possible? I would be grateful for every answer. Thank you.
There is a list of long-running tasks permitted in the background in Apple's documentation.
Camera access is not one of them. You will need to have your app in the foreground to use the camera.