When the user drags the ModelEntity, I want the ModelEntity to switch between planes smoothly.
I want to use the gestures that RealityKit provides:
self.arView.installGestures(.all, for: usdzEntity).forEach { gestureRecognizer in
    gestureRecognizer.addTarget(self, action: #selector(self.handleGesture(_:)))
}
But I don't know how to switch between different planes (using RealityKit's gestures) while keeping the shadow that RealityKit provides.
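For reference, a minimal stub of the handler referenced in the snippet above, assuming RealityKit's EntityTranslationGestureRecognizer (the drag recognizer that installGestures(.all, for:) returns); the re-anchoring logic itself is the open question:

@objc func handleGesture(_ recognizer: UIGestureRecognizer) {
    // installGestures(.all, for:) returns RealityKit's entity gesture
    // recognizers, so the drag recognizer can be downcast.
    guard let translationGesture = recognizer as? EntityTranslationGestureRecognizer,
          let entity = translationGesture.entity else { return }
    if translationGesture.state == .ended {
        // This is where the entity would need to be re-anchored onto
        // whichever plane it was dropped on (the open question above).
        print("Drag ended for \(entity.name)")
    }
}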
This is my way -- RealityKit_ARQL
But I think AR Quick Look handles this better than my code does.
I migrated from old Mapbox Maps SDKs for iOS and macOS v6.x.x to mapbox-maps-ios v10.1.0.
A lot of the API changed.
I was able to restore the same functionality using the new SDK; however, I could not find a way to animate annotations upon selection.
Before, I used MGLAnnotationView or MGLAnnotationImage to display annotations, and I could easily animate an MGLAnnotationView by applying an animation block with the transformations I needed.
In the new SDK it's not the same. Annotations are now represented by the struct PointAnnotation, so the animation API is not accessible, as it's no longer a UIView subclass.
What I'm looking for is a simple scale animation that should happen when the user taps an annotation.
The animation scales the pin by a factor of 1.1 and returns it to 1.0, with a duration of 350 ms. It is not a repeating animation; it should happen only once, when the user taps the annotation.
I checked the example projects, and from what I understand animation is possible by manipulating layers, but I'm not sure how to do that exactly.
Looking for help.
As of Mapbox v10.4.3, the solution that works for me is to use View Annotations. You can add custom gesture recognizers, animations, etc. inside your custom annotation view: https://docs.mapbox.com/ios/maps/guides/annotations/view-annotations/#create-a-view-annotation
import MapboxMaps
import RxSwift
import RxGesture

let sampleCoordinate = CLLocationCoordinate2D(latitude: 39.7128, longitude: -75.0060)
let options = ViewAnnotationOptions(geometry: Point(sampleCoordinate), allowOverlap: true, anchor: .center)

let annotationView = CustomAnnotationView()
annotationView.rx.tapGesture().when(.recognized).subscribe(onNext: { [weak self] _ in
    guard let self = self else { return }
    self.viewModel.input.tappedAnnotation()
    // You can animate the annotation view's resize here when tapping a specific
    // annotation view, or add the animation inside the custom annotation view.
}).disposed(by: self.disposeBag)

try? self.mapView.viewAnnotations.add(annotationView, options: options)

class CustomAnnotationView: UIView {
    // Do your customizations here
}
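For the specific animation the question asks for (scale to 1.1 and back over 350 ms, once per tap), a minimal sketch of a helper inside CustomAnnotationView could look like this; playTapAnimation() is a hypothetical name you would call from the tap subscription above:

// Inside CustomAnnotationView:
func playTapAnimation() {
    // Scale up to 1.1x for the first ~175 ms...
    UIView.animate(withDuration: 0.175, animations: {
        self.transform = CGAffineTransform(scaleX: 1.1, y: 1.1)
    }, completion: { _ in
        // ...then back to 1.0 for the remaining ~175 ms.
        UIView.animate(withDuration: 0.175) {
            self.transform = .identity
        }
    })
}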
I have a view controller in which the user can move a UIButton, a UISlider, and a custom UIView-based control around the screen by panning. This view controller is used to let the user create a custom layout of controls. It is all done by adding a UIPanGestureRecognizer to each UIControl and moving the control's position to follow the user's finger.
let panRecognizer = UIPanGestureRecognizer(target: self, action: #selector(pan))
panRecognizer.delegate = self;
myslider.addGestureRecognizer(panRecognizer)
// Pan handler method
@objc func pan(_ gestureRecognizer: UIPanGestureRecognizer) {
    let translation = gestureRecognizer.translation(in: view)
    guard let gestureView = gestureRecognizer.view else {
        return
    }
    // Move the center to the user's finger location
    gestureView.center = CGPoint(x: gestureView.center.x + translation.x,
                                 y: gestureView.center.y + translation.y)
    gestureRecognizer.setTranslation(.zero, in: view)
    if gestureRecognizer.state == .ended {
        // Save the new location inside data. Not relevant here.
    }
}
This worked fine in iOS 13 and below for all the controls I mentioned above (the UISlider was a bit glitchy, but it still responded to the pan gesture, and I don't need the value of the UISlider anyway, so it's safe to ignore). However, testing the app on iOS 14 reveals that UISlider completely ignores the pan gesture (proven by a breakpoint inside the pan handler that never gets hit).
I have looked at Apple's documentation on UISlider and found no change at all related to gesture handling, so this must have been done deliberately at a deeper level. My question is: is there any way to "force" my custom gesture to be executed instead of UISlider's own gesture, without creating a transparent button overlay (which I don't know will work or not) or a dummy slider just for this?
Additionally, I also added a UILongPressGestureRecognizer and a UITapGestureRecognizer to the controls. These worked fine on the other UIButton but are completely ignored by the UISlider on iOS 14 (on iOS 13 everything worked fine).
Thanks.
Okay, I found the answer myself after some more digging and a cue from "claude31" about making a new clean project and testing from there.
The problem was the overridden beginTracking function. This UISlider of mine is actually subclassed into a custom class, and there the beginTracking function is overridden, as per the code below:
override func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
    // Jump the slider value straight to the touch location.
    let percent = Float(touch.location(in: self).x / bounds.size.width)
    let delta = percent * (maximumValue - minimumValue)
    let newValue = minimumValue + delta
    self.setValue(newValue, animated: false)
    super.sendActions(for: UIControl.Event.valueChanged)
    return true
}
This makes the slider jump immediately to the user's finger location when a touch lands inside the bounds of the UISlider (without first having to grab the thumb and slide it to the desired position).
On iOS 13 this function does not block the gesture recognizers from being recognized. On iOS 14 beta 4, however, overriding this function with a sendActions(for:) call inside it causes the added gesture recognizers to be ignored completely, be it the pan, long-press, or even tap gesture.
For me, the solution is simply to add a state flag for whether the pan gesture is required in a given view controller, because, luckily, I only need the added gesture recognizers in view controllers that do not need the customized beginTracking behavior, and vice versa.
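A minimal sketch of that state flag, assuming a hypothetical jumpsToTouch property on the slider subclass (names are illustrative):

class JumpToTouchSlider: UISlider {
    // Hypothetical flag: leave false on screens that rely on added
    // gesture recognizers; set true where tap-to-jump is wanted.
    var jumpsToTouch = false

    override func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        guard jumpsToTouch else {
            // Default tracking keeps the added gesture recognizers working on iOS 14.
            return super.beginTracking(touch, with: event)
        }
        let percent = Float(touch.location(in: self).x / bounds.size.width)
        setValue(minimumValue + percent * (maximumValue - minimumValue), animated: false)
        sendActions(for: .valueChanged)
        return true
    }
}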
Edit:
Originally I wrote that the cause of the problem was the always-true return value; however, I just reread the documentation, and the default return value of this function is also true. So I think the root cause of the problem is the sendActions(for:) method, which causes the added gesture recognizers to be ignored. My answer above has been edited to reflect this.
I'm not sure how to handle the following situation:
I exported an NSObject from PaintCode into a .swift file (someObject.swift).
public class PlanATrip: NSObject {
    class func drawRectangle1(frame targetFrame: CGRect = CGRect(x: 0, y: 0, width: 200, height: 100),
                              resizing: ResizingBehavior = .aspectFit) {
        // ...
    }
}
I also overrode the draw() function in a UIView (someObjectView.swift).
So how can I add a gesture recognizer to a bezier path (for example, rectangle1 = UIBezierPath(...)) that lives in someObject.swift?
I tried to add something like:
let tapGestureA: UITapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(touchAction))
However, the scoping confused me; if I put the touchAction function outside the drawRectangle1 function, I cannot access rectangle1.
How can I modify this to make such a gesture recognizer work?
You are trying to manipulate UIBezierPath objects generated by PaintCode directly, which isn't PaintCode's purpose.
What you are trying to achieve is possible, but not without some hacking.
For example: How to access a layer inside a UIView custom class generated with PaintCode?
In my opinion, you are "misusing" PaintCode.
Instead, you would be much better off adding the tapping logic inside your custom UIView, the thing you called someObjectView.swift.
For example: https://github.com/backslash-f/paint-code-ui-button
In case you really need to check whether gestures occur inside the boundaries of a UIBezierPath, you need to create a public reference to it (e.g., a var that points to it outside of the drawRectangle1 function) and then use the code proposed by @Sparky.
The main problem with this approach is that your changes will be overwritten every time you use PaintCode's export feature, so I wouldn't recommend it.
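If you do go that route, a minimal sketch of such a public reference, assuming a hypothetical static var added to the generated class (and illustrative path geometry):

public class PlanATrip: NSObject {
    // Hypothetical: exposes the path so a view can hit-test gestures against it.
    public static var rectangle1Path: UIBezierPath?

    class func drawRectangle1(frame targetFrame: CGRect = CGRect(x: 0, y: 0, width: 200, height: 100),
                              resizing: ResizingBehavior = .aspectFit) {
        let rectangle1 = UIBezierPath(rect: CGRect(x: 0, y: 0, width: 200, height: 100)) // illustrative
        rectangle1Path = rectangle1 // keep a reference for gesture handling
        // ... PaintCode-generated drawing code ...
    }
}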
If I understand your question correctly, you'd like to add a UITapGestureRecognizer that will recognize a gesture within a portion of your view defined by a UIBezierPath.
In this case, assign the selector to the gesture recognizer as you have in your question; then, in the body of touchAction(recognizer:), test whether the gesture lies within the UIBezierPath. The simple case, where you only care whether the gesture is inside the UIBezierPath, might be handled as follows:
@objc func touchAction(recognizer: UITapGestureRecognizer) {
    // Ignore taps that land outside the path.
    guard rectangle1.contains(recognizer.location(in: self)) else { return }
    // Execute your code for the gesture here
    // ...
}
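For completeness, the recognizer still has to be attached to the view that draws the path, e.g. in the custom view's setup; a minimal sketch:

// In someObjectView.swift, e.g. in init(frame:) or awakeFromNib():
let tapGestureA = UITapGestureRecognizer(target: self, action: #selector(touchAction(recognizer:)))
addGestureRecognizer(tapGestureA)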
I currently use an app and I'm trying to replicate an action that occurs in the app.
Here's an example:
As you can see, when my finger is on the selected "UIButton" (at least that's what I think they are), the background image is highlighted. Without releasing my finger, scrolling through the different options highlights that specific UIButton.
I believe each one is a UIButton with no text label, and they've just added a UILabel for each UIButton.
I've done something similar, as shown:
However, what I can't figure out is how they managed to make the background highlighted between each selection.
It looks like a combination of UIControlEvents.TouchDown and UIControlEvents.TouchUpInside.
I've coded up something similar:
cameraButton.addTarget(self, action: #selector(AddClothesViewController.selectCameraOptions(_:)), forControlEvents: .TouchUpInside)
cameraButton.addTarget(self, action: #selector(AddClothesViewController.selectCameraOptionsHighlighted(_:)), forControlEvents: .TouchDown)

func selectCameraOptionsHighlighted(button: UIButton) {
    if button == cameraButton {
        print("GO HERE")
        cameraButton.setImage(UIImage(named: "highlighted background"), forState: .Normal)
    }
}

func selectCameraOptions(button: UIButton) {
    if button == cameraButton {
        // DO STUFF
    }
}
However, this approach does not work: with .TouchDown, the camera options somehow disappear. And if I just add a target for .TouchUpInside, that obviously doesn't work either, because the button is only highlighted once I release my finger, and by then it has already taken me to a different view, i.e. the Camera or Photo Library.
What's the best approach for how to achieve this?
Thanks.
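For reference, one way to make the highlight follow the finger across several options is to track the touch in a parent view and hit-test the option buttons, rather than relying on each button's own control events (which stay confined to the button that received the touch-down). A minimal modern-Swift sketch, all names illustrative; the buttons need isUserInteractionEnabled = false so the container receives the touches:

class OptionStripView: UIView {
    var optionButtons: [UIButton] = []

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        updateHighlight(for: touches)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        updateHighlight(for: touches)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Fire the action of whichever button the finger was released over.
        if let point = touches.first?.location(in: self),
           let tapped = optionButtons.first(where: { $0.frame.contains(point) }) {
            tapped.sendActions(for: .touchUpInside)
        }
        optionButtons.forEach { $0.isHighlighted = false }
    }

    private func updateHighlight(for touches: Set<UITouch>) {
        guard let point = touches.first?.location(in: self) else { return }
        // Highlight only the button currently under the finger.
        for button in optionButtons {
            button.isHighlighted = button.frame.contains(point)
        }
    }
}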
I recently started using SceneKit in iOS 8. I am facing difficulty in detecting whether the user has tapped or pressed on an object. Is there any way to do that?
See the documentation for the hitTest method. Call it from wherever you're handling touch events to get a list of the 3D scene objects/locations "under" a 2D screen point.
An easy way to get sample code that shows hitTest in action is to create a sample app using the Game template in Xcode 6: create a new project and select the "Game" template.
The hitTest code should be there in the implementation of:
- (void) handleTap:(UIGestureRecognizer*)gestureRecognize
Add a tap gesture recognizer to the scene view and check whether the tap hit an SCNNode:
@objc func tapGestureRec(sender: UITapGestureRecognizer) {
    // Hit-test the tap location in the SCNView's coordinate space.
    let location: CGPoint = sender.location(in: self.sceneKitView)
    let hits = self.sceneKitView.hitTest(location, options: nil)
    if let tappedNode: SCNNode = hits.first?.node {
        // ...
    }
}
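For completeness, the recognizer still needs to be attached to the SCNView, e.g. in viewDidLoad; a minimal sketch assuming the same sceneKitView property:

let tap = UITapGestureRecognizer(target: self, action: #selector(tapGestureRec(sender:)))
sceneKitView.addGestureRecognizer(tap)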