ARKit iOS (how to make a button to choose from multiple models) - ios

I’m just starting in iOS development. I’m currently working on an AR app that allows me to choose from different models/equipment to place. I cannot figure out how to add a button or option that lets the user choose one of the multiple models and select it.

There's no AR button offered natively by RealityKit at the moment, but you can add an entity to your scene and pick it up with a tap gesture on the ARView.
Self-promotion warning, but as it's exactly relevant to what you're looking for, I tried to solve this issue recently. Here's a GitHub repo and Swift package you can use in your app to add a button in AR pretty easily, handling the gestures for you:
https://github.com/maxxfrazer/RealityUI
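If you'd rather not pull in a package, an ordinary UIKit control layered over the ARView also works as the "button". Below is a minimal sketch of that approach, assuming two hypothetical .usdz models in the app bundle; the segmented control, the asset names, and the raycast-on-tap placement are my own choices, not part of RealityUI:
import UIKit
import RealityKit
import ARKit

// Sketch: a UISegmentedControl overlaid on the ARView picks the model,
// and a tap on the ARView places the currently selected model.
class ModelPickerViewController: UIViewController {
    let arView = ARView(frame: .zero)
    let modelNames = ["chair", "table"]   // hypothetical .usdz assets in the bundle
    var selectedModelName = "chair"

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)

        // Plain UIKit control layered on top of the AR content acts as the "AR button".
        let picker = UISegmentedControl(items: modelNames)
        picker.selectedSegmentIndex = 0
        picker.addTarget(self, action: #selector(modelChanged(_:)), for: .valueChanged)
        picker.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(picker)
        NSLayoutConstraint.activate([
            picker.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            picker.bottomAnchor.constraint(equalTo: view.safeAreaLayoutGuide.bottomAnchor, constant: -16)
        ])

        // Tapping the ARView places the selected model at the tapped location.
        arView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(placeModel(_:))))
    }

    @objc func modelChanged(_ sender: UISegmentedControl) {
        selectedModelName = modelNames[sender.selectedSegmentIndex]
    }

    @objc func placeModel(_ tap: UITapGestureRecognizer) {
        let point = tap.location(in: arView)
        // Raycast against an estimated horizontal plane to find where to anchor the model.
        guard let result = arView.raycast(from: point, allowing: .estimatedPlane, alignment: .horizontal).first,
              let model = try? Entity.loadModel(named: selectedModelName) else { return }
        let anchor = AnchorEntity(world: result.worldTransform)
        anchor.addChild(model)
        arView.scene.addAnchor(anchor)
    }
}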

Related

Instantiate to a specific position in Swift Xcode

I'm looking for some help.
I'm doing homework for school in Xcode and I have an issue.
I'm trying to make multiple magazine covers that the user can scroll through. I created a UIScrollView and instantiated each .xib file (magazine cover) in it, but they stack on top of each other.
What I'm actually trying to do is instantiate the covers side by side (so the user can swipe through them like Snapchat's filters). Is there a way to set a certain position?
SPECS OF THE PROJECT:
- I'm on the latest Xcode beta.
- The target is an iPad Pro on iOS 10.3.
I believe UICollectionView is what you're looking for.
Try this tutorial. It doesn't show exactly what you need, but once you see how collection views work you'll figure out how to implement your task (or come back with more questions :))
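In case it helps, here is a minimal sketch of that collection view approach: a horizontal flow layout with paging, so the covers sit side by side and snap one per swipe. The asset names, the cell identifier, and the image view standing in for your .xib covers are placeholders:
import UIKit

// Sketch: full-screen horizontal cells with paging, one magazine cover per page.
class CoversViewController: UIViewController, UICollectionViewDataSource, UICollectionViewDelegateFlowLayout {
    let coverImages = ["cover1", "cover2", "cover3"]   // hypothetical image assets

    override func viewDidLoad() {
        super.viewDidLoad()
        let layout = UICollectionViewFlowLayout()
        layout.scrollDirection = .horizontal   // cells laid out side by side
        layout.minimumLineSpacing = 0

        let collectionView = UICollectionView(frame: view.bounds, collectionViewLayout: layout)
        collectionView.isPagingEnabled = true   // snap to one cover per swipe
        collectionView.dataSource = self
        collectionView.delegate = self
        collectionView.register(UICollectionViewCell.self, forCellWithReuseIdentifier: "CoverCell")
        view.addSubview(collectionView)
    }

    func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        return coverImages.count
    }

    func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "CoverCell", for: indexPath)
        // In the real app you would load the cover's .xib view here; an image view stands in.
        cell.contentView.subviews.forEach { $0.removeFromSuperview() }
        let imageView = UIImageView(frame: cell.contentView.bounds)
        imageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        imageView.contentMode = .scaleAspectFit
        imageView.image = UIImage(named: coverImages[indexPath.item])
        cell.contentView.addSubview(imageView)
        return cell
    }

    func collectionView(_ collectionView: UICollectionView, layout collectionViewLayout: UICollectionViewLayout, sizeForItemAt indexPath: IndexPath) -> CGSize {
        return collectionView.bounds.size   // one full-screen cover per page
    }
}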

iOS Swift 3 card stack overlay

I am using Swift 3 with Xcode 8 to develop an iPhone app. I want to achieve the following effect: there is a set of cards, and I can scroll and tap a card to select it; the selected card is then displayed by itself.
I am quite new to iOS development. I am thinking of using a collection view, but how can I achieve this kind of card overlay effect with a collection view?
Or should I use something else? Can anyone give me a clue? Thanks!
Have a look at MMCardView, currently supporting Swift 3.
In terms of implementing this yourself, a collection view would be a start; looking deeper into the libraries out there, this is the common way to do it. If you don't want to pull in a library, take a look at the library's files, see how they were implemented, and adapt that to what you want.
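To show the core of the effect without a library, here is a rough sketch of a custom UICollectionViewLayout that stacks the cards so each one peeks out above the next. A single section is assumed, and the card height and visible stripe are made-up numbers; MMCardView layers its selection and expansion animations on top of the same idea:
import UIKit

// Sketch: every card is offset by a small stripe, so later cards overlap earlier ones.
class OverlappingCardsLayout: UICollectionViewLayout {
    let cardHeight: CGFloat = 180
    let visibleStripe: CGFloat = 60   // how much of each card peeks out above the next
    private var cache: [UICollectionViewLayoutAttributes] = []

    override func prepare() {
        cache.removeAll()
        guard let collectionView = collectionView else { return }
        let width = collectionView.bounds.width - 32
        for item in 0..<collectionView.numberOfItems(inSection: 0) {
            let indexPath = IndexPath(item: item, section: 0)
            let attributes = UICollectionViewLayoutAttributes(forCellWith: indexPath)
            // Each card starts one stripe below the previous one, so only a stripe of
            // every card is visible until it becomes the topmost card.
            attributes.frame = CGRect(x: 16, y: CGFloat(item) * visibleStripe, width: width, height: cardHeight)
            attributes.zIndex = item   // later cards draw on top of earlier ones
            cache.append(attributes)
        }
    }

    override var collectionViewContentSize: CGSize {
        guard let collectionView = collectionView, collectionView.numberOfItems(inSection: 0) > 0 else { return .zero }
        let items = CGFloat(collectionView.numberOfItems(inSection: 0))
        return CGSize(width: collectionView.bounds.width,
                      height: (items - 1) * visibleStripe + cardHeight)
    }

    override func layoutAttributesForElements(in rect: CGRect) -> [UICollectionViewLayoutAttributes]? {
        return cache.filter { $0.frame.intersects(rect) }
    }

    override func layoutAttributesForItem(at indexPath: IndexPath) -> UICollectionViewLayoutAttributes? {
        return indexPath.item < cache.count ? cache[indexPath.item] : nil
    }
}
Tapping a card would then be handled in collectionView(_:didSelectItemAt:), where you can expand or present the selected card by itself.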

Show taps in iOS App Demo Video

I am making demo videos of my iOS apps, some of which I made with Xcode and others in Unity3D. I plan to use the Elgato Game Capture HD to capture my demo videos, but I am not sure how to show the "taps."
I found Touchpose, but when I changed main.m as suggested by the instructions on the GitHub page I got error messages saying that QAppDelegate and QTouchposeApplication were undefined. I added import "QTouchposeApplication" but got an error message suggesting I change QAppDelegate to AppDelegate. When I did this the build failed. When I left it as QAppDelegate, added QAppDelegate to the project, and imported it into main.m, the error messages went away but the build still failed. Am I doing something incorrectly? I found no tutorials on Touchpose online and am confused.
Alternatively, is there some other easy solution, for example using the Elgato software or some other software or framework? I also tried Reflector, but my Wi-Fi is not fast enough to support good frame rates with it. I am aware that I could use Unity and "build for OSX" or, in the case of Xcode apps, just screen-record the simulator, but I would prefer a single, foolproof solution for all of my apps. Thanks!
One way is to use AssistiveTouch and create your own touch. You can do this by going to Settings > General > Accessibility > AssistiveTouch, turning it on, tapping the control that pops up, choosing Custom, and creating your own gesture. Then use that gesture to show taps in a screen recording.
It's a bad solution, but I'll post it in case it fits your needs.
You would have to superimpose multiple videos to make it so that only the tap shows and not the AssistiveTouch icon. Also, you can't scroll while using it.
It's also explained here: https://www.youtube.com/watch?v=4JqjU0-4Cek
Are you able to demo in a simulator? If so, Giphy Capture is a good solution; it has an option for showing taps in its settings.
If you can make your screen recording in the iOS Simulator, you can enable displaying taps with the following command. Open your terminal, run it, and then restart your simulator.
defaults write com.apple.iphonesimulator ShowSingleTouches 1
I had trouble with Touchpose initially but figured out how to use it. The issue is that you have to go to Build Phases and add the .m file. As for screen recording, I am using the Elgato Game Capture HD. My only remaining problem is showing the dots in Unity3D applications, which I am still working on.

iOS: Facebook chat heads behaviour and animations

This question is probably a little out of date, but I've been using the new Facebook for iOS with the "chat heads" feature (with the chat heads only present within the app), and was wondering how Facebook went about implementing this. For example, how did they handle the drag animations for the chat heads, and (when tapping a chat head) how did they manage to overlay a UITableView on top of the "base" UIViews in the background?
Is this all part of UIKit, or did they create their own classes to handle this?
To answer @StuartM's question in the comments: in the last couple of months I've had a bit more experience with UIKit, and I think I have a rough idea of how I would implement something like this if I were going to do it.
For the chat head, I would create a styled UIButton and add it as a subview of the main window. For the dragging, I would add a gesture recognizer to the UIButton to respond to the drags, and for the "snapping to edges" I would use iOS 7's new UIKit Dynamics (http://www.raywenderlich.com/50197/uikit-dynamics-tutorial).
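Something along these lines for the drag-and-snap part, assuming a single chat head owned by a view controller; the names (chatHead, dragChatHead) and the snap points are mine, not anything Facebook ships:
import UIKit

// Sketch: a round button you can drag around, which snaps to the nearest edge on release.
class ChatHeadViewController: UIViewController {
    let chatHead = UIButton(type: .custom)
    var animator: UIDynamicAnimator!
    var snap: UISnapBehavior?

    override func viewDidLoad() {
        super.viewDidLoad()
        animator = UIDynamicAnimator(referenceView: view)

        chatHead.frame = CGRect(x: 20, y: 100, width: 60, height: 60)
        chatHead.layer.cornerRadius = 30
        chatHead.clipsToBounds = true
        chatHead.backgroundColor = .blue
        view.addSubview(chatHead)

        chatHead.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(dragChatHead(_:))))
    }

    @objc func dragChatHead(_ pan: UIPanGestureRecognizer) {
        switch pan.state {
        case .began:
            // Stop any in-flight snap so the finger takes over.
            if let snap = snap { animator.removeBehavior(snap) }
        case .changed:
            // Follow the finger while dragging.
            let translation = pan.translation(in: view)
            chatHead.center = CGPoint(x: chatHead.center.x + translation.x,
                                      y: chatHead.center.y + translation.y)
            pan.setTranslation(.zero, in: view)
        case .ended, .cancelled:
            // Snap to the nearest horizontal edge, the way the chat heads behave.
            let snapX: CGFloat = chatHead.center.x < view.bounds.midX ? 50 : view.bounds.maxX - 50
            let newSnap = UISnapBehavior(item: chatHead, snapTo: CGPoint(x: snapX, y: chatHead.center.y))
            animator.addBehavior(newSnap)
            snap = newSnap
        default:
            break
        }
    }
}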
As for showing the UITableView overlay with the chat history, I would use a child view controller (https://developer.apple.com/library/ios/featuredarticles/ViewControllerPGforiPhoneOS/CreatingCustomContainerViewControllers/CreatingCustomContainerViewControllers.html#//apple_ref/doc/uid/TP40007457-CH18-SW6), and for the pop-open animation I would just use the default UIView animations, maybe with animation transactions, as I'm not sure everything can be done with the implicit animations alone.
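And a sketch of the child view controller / pop-open part, building on the controller above; the plain UITableViewController is only a stand-in for the real chat history screen:
// Sketch: present the chat history as a child view controller with a simple scale animation.
extension ChatHeadViewController {
    @objc func showChatHistory() {   // wire this to chatHead's .touchUpInside action
        let chat = UITableViewController(style: .plain)   // stand-in for the real chat screen
        addChild(chat)
        chat.view.frame = view.bounds.insetBy(dx: 16, dy: 80)
        chat.view.alpha = 0
        chat.view.transform = CGAffineTransform(scaleX: 0.1, y: 0.1)
        view.insertSubview(chat.view, belowSubview: chatHead)
        chat.didMove(toParent: self)

        // Default UIView animation is enough for the pop-open effect.
        UIView.animate(withDuration: 0.3) {
            chat.view.transform = .identity
            chat.view.alpha = 1
        }
    }
}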
And I think that should be it. To be honest, anyone who has a handle on those frameworks should be able to build something like this in their iOS apps, and creating a "chat heads"-style sample project in your spare time should give you a pretty in-depth knowledge of how those frameworks work.

iOS drag and drop with combination

I'm trying to figure out if this sort of thing is possible in iOS.
I'd like to have a UI where the user can drag "bubbles", each representing a noun, from a source pool into a destination panel. In addition, I'd like to have another pool of bubbles, each representing an adjective. These adjective bubbles could be dragged over noun bubbles already in the destination panel in order to modify them, making a combined bubble.
Mock-up of what I'm envisioning:
Is this possible in iOS (any version)? Preferably using stock controls, but any way is fine. I will admit I have never worked with Objective-C or iOS development before, but I am aiming for this sort of interface for my app and want to see if it's at all possible.
Yes, this is certainly possible, but not something you can do with 'stock' controls. I would recommend learning about the UIGestureRecognizer classes; try this tutorial for starters:
UIGestureRecognizer Tutorial in iOS 5: Pinches, Pans, and More!
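To make that more concrete, here is a small Swift sketch along the same lines: each bubble is a view with its own UIPanGestureRecognizer, and when an adjective bubble is dropped so that its frame intersects a noun bubble, the two are merged. The bubble views, colors, and merge rule are all placeholders for whatever model the app really needs:
import UIKit

// Sketch: draggable "bubble" labels; dropping an adjective onto a noun combines them.
class BubbleBoardViewController: UIViewController {
    var nounBubbles: [UILabel] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        makeBubble(text: "ball", at: CGPoint(x: 200, y: 300), isNoun: true)
        makeBubble(text: "red", at: CGPoint(x: 100, y: 500), isNoun: false)
    }

    func makeBubble(text: String, at center: CGPoint, isNoun: Bool) {
        let bubble = UILabel(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
        bubble.center = center
        bubble.text = text
        bubble.textAlignment = .center
        bubble.layer.cornerRadius = 50
        bubble.clipsToBounds = true
        bubble.backgroundColor = isNoun ? .cyan : .orange
        bubble.isUserInteractionEnabled = true
        bubble.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(dragBubble(_:))))
        view.addSubview(bubble)
        if isNoun { nounBubbles.append(bubble) }
    }

    @objc func dragBubble(_ pan: UIPanGestureRecognizer) {
        guard let bubble = pan.view as? UILabel else { return }
        // Follow the finger.
        let translation = pan.translation(in: view)
        bubble.center = CGPoint(x: bubble.center.x + translation.x,
                                y: bubble.center.y + translation.y)
        pan.setTranslation(.zero, in: view)

        // On release, merge an adjective bubble into any noun bubble it overlaps.
        if pan.state == .ended, !nounBubbles.contains(bubble) {
            for noun in nounBubbles where noun.frame.intersects(bubble.frame) {
                noun.text = "\(bubble.text ?? "") \(noun.text ?? "")"   // e.g. "red ball"
                bubble.removeFromSuperview()
                break
            }
        }
    }
}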
