Relatively new to ARKit, I was wondering if there's a way to remove a 3D object after it has been placed in a scene.
Just call this method on the node you want to remove:
Swift:
node.removeFromParentNode()
Objective-C:
[node removeFromParentNode];
I suggest reading the documentation for ARKit, SceneKit, and their basic classes.
To stop a node from appearing on screen, you need to remove it from the scene graph, which means removing it from its parent node. Read more about this under SCNNode - Managing the Node Hierarchy in Apple's documentation.
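For example, a minimal sketch (assuming you gave the placed node a name such as "candle" when you added it; both the name and the sceneView property are placeholders for your own setup):
// Look up the previously placed node by its (hypothetical) name and detach it from the scene graph.
SCNNode *placedNode = [self.sceneView.scene.rootNode childNodeWithName:@"candle" recursively:YES];
[placedNode removeFromParentNode];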
To remove an object (SCNNode) from your scene view, you can use a long-press gesture. Just add the code below in your viewDidLoad.
UILongPressGestureRecognizer *longPressGestureRecognizer =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                  action:@selector(handleRemoveObjectFrom:)];
longPressGestureRecognizer.minimumPressDuration = 0.5;
[self.sceneView addGestureRecognizer:longPressGestureRecognizer];
Then handle the gesture in the recognizer's action method, like below:
- (void)handleRemoveObjectFrom:(UILongPressGestureRecognizer *)recognizer {
    if (recognizer.state != UIGestureRecognizerStateBegan) {
        return;
    }
    CGPoint holdPoint = [recognizer locationInView:self.sceneView];
    NSArray<SCNHitTestResult *> *result = [self.sceneView hitTest:holdPoint
                                                          options:@{SCNHitTestBoundingBoxOnlyKey: @YES,
                                                                    SCNHitTestFirstFoundOnlyKey: @YES}];
    if (result.count == 0) {
        return;
    }
    SCNHitTestResult *hitResult = [result firstObject];
    [[hitResult.node parentNode] removeFromParentNode];
}
Hope this helps you solve your problem.
Thanks
Related
I'm working on a project with ARKit, and I'm trying to add objects when I tap on the screen. I used the touchesBegan function and the app seemed to be working fine, adding objects where I pressed. But I then wanted to add a longPressGestureRecognizer so that I could remove a certain object from my AR SceneView, so I decided to scrap the touchesBegan in favor of a tapGestureRecognizer, since the touchesBegan was taking precedence over the longPressGesture.
My issue is that the tap gesture recognizer started adding the AR Objects in more erratic places and as far as I can tell, the code was similar to my touchesBegan. I can't seem to find the difference, and I would rather work with the tapGestureRecognizer, or if there is a way to have the gestureRecognizer work alongside touchesBegan would also be useful.
My code for the touchesBegan function is as follows:
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    NSArray<UITouch *> *touchArray = [touches allObjects];
    UITouch *touch = [touchArray firstObject];
    NSArray<ARHitTestResult *> *resultArray = [_sceneView hitTest:[touch locationInView:_sceneView] types:ARHitTestResultTypeFeaturePoint];
    ARHitTestResult *result = [resultArray lastObject];
    SCNMatrix4 hitTransform = SCNMatrix4FromMat4(result.worldTransform);
    SCNVector3 hitVector = SCNVector3Make(hitTransform.m41, hitTransform.m42, hitTransform.m43);
    [self insertCandleWithPosition:hitVector];
}
For the tapGestureRecognizer, my code is as follows:
- (void)addObject:(UITapGestureRecognizer *)recognizer {
    CGPoint tapPoint = [recognizer locationInView:self.view];
    NSArray<ARHitTestResult *> *result = [_sceneView hitTest:tapPoint types:ARHitTestResultTypeFeaturePoint];
    ARHitTestResult *hitResult = [result lastObject];
    SCNMatrix4 hitTransform = SCNMatrix4FromMat4(hitResult.worldTransform);
    SCNVector3 hitVector = SCNVector3Make(hitTransform.m41, hitTransform.m42, hitTransform.m42);
    [self insertCandleWithPosition:hitVector];
}
The only difference between the two pieces of code, as far as I can tell, is how the touch location is initially obtained.
Any help would be highly appreciated.
Is there a way to determine when an MKMapView drag or zoom stops?
Right now I've added a UIPanGestureRecognizer for dragging the MKMapView, but I receive gestureRecognizer.state == UIGestureRecognizerStateEnded as soon as the user lifts their finger, even though the map is still scrolling. What I'm trying to figure out is how to avoid loading server data for my map annotations while the map is still moving and/or the user touches the map again to keep dragging it. The data load mechanism should only be called when the map has stopped moving and zooming and has been standing still for some predefined time.
This is what I've implemented so far:
- (void)viewDidLoad {
    ...
    UIPanGestureRecognizer *panRec = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(didDragMap:)];
    [panRec setDelegate:self];
    [panRec setDelaysTouchesBegan:YES];
    [panRec setDelaysTouchesEnded:YES];
    [panRec setCancelsTouchesInView:YES];
    [self.mapView addGestureRecognizer:panRec];
}
And the selector method didDragMap:
- (void)didDragMap:(UIGestureRecognizer *)gestureRecognizer {
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan) {
        _searchBar.text = @"";
        _filtered = NO;
        _crosshair.hidden = NO;
        [self removeAllAnnotationExceptOfOriginalLocation];
    }
    else if (gestureRecognizer.state == UIGestureRecognizerStateEnded) {
        [self performSelector:@selector(delayAddressResolving:) withObject:nil afterDelay:1.0];
    }
}
The method delayAddressResolving: loads the needed data from the server to display the information for my annotations.
Any hints are welcome!
Use the following MKMapViewDelegate methods:
- (void)mapView:(MKMapView *)mapView regionWillChangeAnimated:(BOOL)animated
- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated
These delegate methods are called every time the map region starts changing and every time it finishes changing.
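If you only want to load data once the map has been standing still for a while, one rough sketch (reusing your existing delayAddressResolving: method; the one-second delay is arbitrary) is to cancel any pending request when the region starts moving again and to re-schedule it when the movement finishes:
- (void)mapView:(MKMapView *)mapView regionWillChangeAnimated:(BOOL)animated {
    // The region started changing again, so cancel any pending data load.
    [NSObject cancelPreviousPerformRequestsWithTarget:self
                                             selector:@selector(delayAddressResolving:)
                                               object:nil];
}

- (void)mapView:(MKMapView *)mapView regionDidChangeAnimated:(BOOL)animated {
    // Only load if the map then stays still for the full delay.
    [self performSelector:@selector(delayAddressResolving:) withObject:nil afterDelay:1.0];
}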
See also: determine if MKMapView was dragged/moved
Apple's Breadcrumb sample code may also help you: http://developer.apple.com/library/ios/#samplecode/Breadcrumb/Listings/Classes_BreadcrumbViewController_m.html
I added a swipe gesture recognizer and a pan gesture recognizer to the same view. These gestures should be exclusive to each other.
To do this, I added the following requirement on the swipe gesture
[swipeGesture requireGestureRecognizerToFail:panGesture];
(because the pan gesture should take precedence)
The problem is that the pan gesture is always invoked, even during a very fast swipe.
To overcome this, I set myself as the pan gesture's delegate and set up some code in the delegate method as follows:
- (BOOL)gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    // check if it is the relevant view
    if (gestureRecognizer.view == self.myViewWithTwoGestures)
    {
        // check that it is the pan gesture
        if ([gestureRecognizer isKindOfClass:[UIPanGestureRecognizer class]])
        {
            UIPanGestureRecognizer *pan = (UIPanGestureRecognizer *)gestureRecognizer;
            CGPoint velocity = [pan velocityInView:gestureRecognizer.view];
            // added an arbitrary velocity for failure
            if (ABS(velocity.y) > 100)
            {
                // fail if the swipe was fast enough - this should allow the swipe gesture to be invoked
                return NO;
            }
        }
    }
    return YES;
}
Is there a suggested velocity to ensure good behavior? Is there another way to force the pan gesture to fail?
According to Apple's documentation here (under Declaring a Specific Order for Two Gesture Recognizers), the way to get both a UIPanGestureRecognizer and a UISwipeGestureRecognizer to work on the same view is to require the UISwipeGestureRecognizer to fail before the UIPanGestureRecognizer is recognized (the opposite of what you wrote). This probably has something to do with the fact that a swipe gesture is also a pan gesture, but the opposite is not necessarily true (see this SO question).
I wrote this little piece of code and it manages to recognize both pan and swipe gestures:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(panned:)];
UISwipeGestureRecognizer *swipe = [[UISwipeGestureRecognizer alloc] initWithTarget:self action:@selector(swiped:)];
[pan requireGestureRecognizerToFail:swipe];
swipe.direction = (UISwipeGestureRecognizerDirectionLeft | UISwipeGestureRecognizerDirectionRight);
- (void)panned:(UIPanGestureRecognizer *)gesture
{
    NSLog(@"Pan");
}
- (void)swiped:(UISwipeGestureRecognizer *)gesture
{
    NSLog(@"Swipe");
}
This doesn't work as well as you'd hope (since the swipe gesture has to fail first, there's a small delay before the pan gesture starts), but it does work.
The code you posted, however, gives you the ability to fine-tune the gestures to your liking.
Late response, but I was having a similar issue where I wanted the pan to be recognized before the swipe. The only way I could get it working was to use a long press (or something similar) to set a flag that decides whether the pan gesture is treated as a pan or as a swipe. I ended up not using swipe gestures at all. I.e.:
- (void)handleLongPress:(UILongPressGestureRecognizer *)gestureRecognizer
{
    if (gestureRecognizer.state == UIGestureRecognizerStateBegan)
    {
        _canSwipe = YES;
    }
    else if (gestureRecognizer.state == UIGestureRecognizerStateEnded)
    {
        _canSwipe = NO;
    }
}
- (void)handleDragging:(id)sender
{
    UIPanGestureRecognizer *pan = (UIPanGestureRecognizer *)sender;
    GLKVector2 dragDelta = GLKVector2Make(0., 0.);
    if (pan.state == UIGestureRecognizerStateBegan || pan.state == UIGestureRecognizerStateChanged)
    {
        _mousePosition = [pan translationInView:self.view];
        if (_beginDragging == NO)
        {
            _beginDragging = YES;
        }
        else
        {
            dragDelta = GLKVector2Make(_mousePosition.x - _prevMousePosition.x, _mousePosition.y - _prevMousePosition.y);
        }
        _prevMousePosition = _mousePosition;
    }
    else
    {
        _beginDragging = NO;
    }
    if (_canSwipe == YES)
    {
        if (dragDelta.x > 0)
        {
            _canSwipe = NO;
            [self.navigationController popToRootViewControllerAnimated:YES];
            NSLog(@"swipe right");
        }
        else if (dragDelta.x < 0)
        {
            _canSwipe = NO;
            [self performSegueWithIdentifier:@"toTableSegue" sender:pan];
            NSLog(@"swipe left");
        }
    }
    else
    {
        _dragDeltaTranslation = GLKVector2Make(dragDelta.x/90, dragDelta.y/90);
        _translationXY = GLKVector2Make(_translationXY.x + _dragDeltaTranslation.x, _translationXY.y - _dragDeltaTranslation.y);
    }
}
So essentially:
1. Use a long press (or some other mechanism) to activate a swiping state (a long press is nice because as soon as you release, the state goes to UIGestureRecognizerStateEnded).
2. Then use the pan direction to determine the direction of the swipe.
I have a view with several UIButtons. I have successfully implemented a UILongPressGestureRecognizer with the following as the selector:
- (void)longPress:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateEnded) {
        NSLog(@"Long Press");
    }
}
What I need to know within this method is which UIButton received the long press, since I need to do something different depending on which button it was.
Hopefully the answer is not some issue of mapping the coordinates of where the long press occurred to the bounds of the buttons; I would rather not go there.
Any suggestions?
Thanks!
This is available in gesture.view.
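For example, a minimal sketch (assuming you attached a separate UILongPressGestureRecognizer to each button, rather than a single recognizer to their common superview):
- (void)longPress:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateEnded) {
        // gesture.view is the view the recognizer was added to, i.e. the pressed button
        UIButton *pressedButton = (UIButton *)gesture.view;
        NSLog(@"Long press on %@", pressedButton);
    }
}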
Are you adding the long press gesture recognizer to the UIView that has the UIButtons as subviews? If so, something along the lines of @Magic Bullet Dave's approach is probably the way to go.
An alternative is to subclass UIButton and add a UILongPressGestureRecognizer to each button. You can then get your button to do whatever you like; for example, it could send a message identifying itself to a view controller. The following snippet illustrates methods for the subclass.
- (void)setupLongPressForTarget:(id)target
{
    [self setTarget:target]; // property used to hold the target (add @property and @synthesize as appropriate)
    // the button (self) is both the target of the recognizer and the view it is attached to
    UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(longPress:)];
    [self addGestureRecognizer:longPress];
    [longPress release];
}

- (void)longPress:(UIGestureRecognizer *)recogniser
{
    if (![recogniser isEnabled]) return; // code to prevent multiple long press messages
    [recogniser setEnabled:NO];
    [recogniser performSelector:@selector(setEnabled:) withObject:[NSNumber numberWithBool:YES] afterDelay:0.2];
    NSLog(@"long press detected on button");
    if ([[self target] respondsToSelector:@selector(longPressOnButton:)])
    {
        [[self target] longPressOnButton:self];
    }
}
In your view controller you might have code something like this:
- (void)viewDidLoad
{
    // set up buttons (if not already done in Interface Builder)
    [buttonA setupLongPressForTarget:self];
    [buttonB setupLongPressForTarget:self];
    // finish any other set up
}

- (void)longPressOnButton:(id)sender
{
    if (sender == [self buttonA])
    {
        // handle button A long press
    }
    if (sender == [self buttonB])
    {
        // handle button B long press
    }
    // etc.
}
If your view contains multiple subviews (like lots of buttons), you can determine which one was tapped:
// Get the position of the point tapped in the window co-ordinate system
CGPoint tapPoint = [gesture locationInView:nil];
UIView *viewAtBottomOfHierarchy = [self.window hitTest:tapPoint withEvent:nil];
if ([viewAtBottomOfHierarchy isKindOfClass:[UIButton class]]) {
    // the long press landed on a button; handle it here
}
Based on the image, could you please advise me which API I should use to build this feature?
I'm not sure; is it implemented with UIPopover?
Any ideas? Thank you.
Source: Music.app, iOS 5 beta 2
You can use a UIGestureRecognizer. Specifically, what you are looking for is a UILongPressGestureRecognizer
You should instantiate one and attach it to the view you would like to track the gesture on:
UILongPressGestureRecognizer *gestureRecognizer = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleGesture:)];
[view addGestureRecognizer:gestureRecognizer];
Then, in your handler method you would do the rest:
- (void)handleGesture:(UILongPressGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // the long press has been recognized; show your UI (e.g. the popover) here
    } else if (recognizer.state == UIGestureRecognizerStateEnded) {
        // the finger was lifted; dismiss or confirm here
    }
}
EDIT: for the popover implementation, have a look at WEPopover