Touch events in a Vuforia iOS app

I am creating an augmented-reality iOS app using Vuforia, and I have integrated the Vuforia SDK into my project. I need to show some objects over the target image while it is being scanned, and that part works fine. I also need to show a message on screen when the user touches one of the objects. How can I identify which object the user touched? And how do touch events work when the device zooms in or out? Please help me.

Try combining the ImageTargets and Dominoes samples. Touch handling in the Dominoes sample starts in EAGLView.mm:
// Pass touch events through to the Dominoes module
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_DOWN, 0, location.x, location.y);
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_CANCEL, 0, location.x, location.y);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_UP, 0, location.x, location.y);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    dominoesTouchEvent(ACTION_MOVE, 0, location.x, location.y);
}
Study how the sample handles these touches and do the same in your project. For more detail, see this thread on the Vuforia developer forum: https://developer.vuforia.com/forum/ios/getting-touch-event-3d-model
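For the "which object was touched" part, one common approach (not taken from the Vuforia samples, so treat this as a rough sketch) is to keep a screen-space rectangle for each augmentation you render and hit-test the touch against those rectangles. ARObject, screenRect and self.renderedObjects below are assumed names you would fill in yourself by projecting each object's 3D bounds with your current projection/model-view matrices:

#import <UIKit/UIKit.h>

// Hypothetical model of one rendered augmentation; screenRect would be
// refreshed every frame by projecting the object's 3D bounds into view
// coordinates with the current projection/model-view matrices.
@interface ARObject : NSObject
@property (nonatomic, copy) NSString *name;
@property (nonatomic, assign) CGRect screenRect;
@end

@implementation ARObject
@end

// In the view that receives the touches (e.g. the EAGLView subclass).
// self.renderedObjects is an assumed NSArray<ARObject *> * property.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint location = [[touches anyObject] locationInView:self];
    for (ARObject *object in self.renderedObjects) {
        // Because screenRect is recomputed from the current projection,
        // the same 2D test keeps working when the camera zooms in or out.
        if (CGRectContainsPoint(object.screenRect, location)) {
            NSLog(@"Touched object: %@", object.name);   // show your message here
            break;
        }
    }
}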

Related

Moving a button like Assistive Touch

Hi, I want to move a UIButton wherever I want in my app, like the Assistive Touch button on the iPhone. Is it possible to do that? If so, please help me out with code or some links.
This question has been asked many times... here is an answer:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchLocation = [touch locationInView:self.view];
    // frame.origin cannot be assigned through the property directly;
    // moving the button via its center is simpler.
    button.center = touchLocation;
}
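As an alternative that avoids overriding the touch methods, you could attach a pan gesture recognizer to the button. This is a rough sketch assuming `button` is the same UIButton property used above:

// In viewDidLoad (or wherever the button is created), attach the recognizer once.
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handlePan:)];
[button addGestureRecognizer:pan];

// Target method: move the button to wherever the finger currently is.
- (void)handlePan:(UIPanGestureRecognizer *)recognizer
{
    recognizer.view.center = [recognizer locationInView:self.view];
}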

iOS locationInView does not log the correct value when I tap the view directly

This seems strange to me: I have a dark view (darkScreenView) and a green view.
I am using the following:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchViewPoint = [touch locationInView:touch.view];
    NSLog(@"[self.darkScreenView pointInside:touchViewPoint withEvent:event]:%d",
          [self.darkView pointInside:touchViewPoint withEvent:event]);
}
When I tap on the dark view or the green view, the log still shows 1. Why?
When I tap on the green view I expect the log to show 0, but I still get 1.
I am using Xcode 6.3.1.
Does anyone know what the problem is here?
Thank you~
This will work:
UITouch *touch = [[event allTouches] anyObject];
CGPoint touchViewPoint = [touch locationInView:touch.view];
CGPoint pointOfDarkView = [touch.view convertPoint:touchViewPoint toView:self.darkView];
NSLog(@"[self.darkScreenView pointInside:pointOfDarkView withEvent:event]:%d",
      [self.darkView pointInside:pointOfDarkView withEvent:event]);
Your problem:
touchViewPoint is in touch.view's coordinate system.
You have to convert the point to darkView's coordinate system first; then you can call pointInside: with pointOfDarkView.
Try checking which view received the touch:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    if (touch.view == GREEN_VIEW) {
        // Write your code for the green view here
    }
    else if (touch.view == DARK_VIEW) {
        // Write your code for the dark view here
    }
}
Your touchViewPoint is relative to the touched view, so get the location relative to self.darkView instead, like below.
CGPoint touchViewPoint = [touch locationInView: self.darkView];
Here is the problem: change the view you are referring to.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint touchViewPoint = [touch locationInView:self.darkView]; // <----- this was the problem line; use darkView here
    NSLog(@"[self.darkScreenView pointInside:touchViewPoint withEvent:event]:%d",
          [self.darkView pointInside:touchViewPoint withEvent:event]);
}

Move a sprite with touchesMoved

I am moving a sprite using the touchesMoved method. Currently the sprite jumps to the point where the screen is touched, but I want the sprite to move only when it is touched directly.
My code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        CGPoint newPosition = CGPointMake(location.x, self.size.height / 2);
        self.sprite.position = newPosition;
    }
}
Check whether the touch location is inside the sprite, like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint positionInScene = [touch locationInNode:self];
    // SKNode has no boundingBox; use the node's frame (in its parent's coordinates).
    if (CGRectContainsPoint(self.sprite.frame, positionInScene)) {
        CGPoint newPosition = CGPointMake(positionInScene.x, self.size.height / 2);
        self.sprite.position = newPosition;
    }
}
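If the sprite should keep following the finger once a drag has started on it (rather than stopping the moment the finger slips outside its frame), one common pattern is to latch a flag in touchesBegan and clear it in touchesEnded. A minimal sketch, assuming self.sprite is a child of the scene and _dragging is a BOOL instance variable you add yourself:

// In your SKScene subclass; _dragging is an assumed BOOL ivar.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint location = [[touches anyObject] locationInNode:self];
    // Start dragging only if the touch landed on the sprite itself.
    _dragging = [self.sprite containsPoint:location];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!_dragging) {
        return;   // ignore moves that did not start on the sprite
    }
    CGPoint location = [[touches anyObject] locationInNode:self];
    self.sprite.position = CGPointMake(location.x, self.size.height / 2);
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    _dragging = NO;
}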

How to make an object draggable?

Hey guys, I was wondering if there is a way to make an object like a UIImageView draggable. I am aware of UITouch and the touches methods, i.e.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *drag = [[event allTouches] anyObject];
    player.center = [drag locationInView:self.view];
}
but this way the user can make the object jump across the screen to wherever he or she touches, and I want the user to have to drag the object across the screen manually.
Please explain with code, and let me know if I need to be more specific or clear...
You can do this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    CGPoint location = [aTouch locationInView:self.view];
    if (CGRectContainsPoint(self.playerView.frame, location)) {
        [UIView beginAnimations:@"Dragging A DraggableView" context:nil];
        self.playerView.center = location;
        [UIView commitAnimations];
    }
}
This should give you a smooth animation.
If you want your user to drag and drop, you first have to check that the touch is inside the image view's frame. To improve the user experience, you can also store the touch's offset from the image view's center; a good place to do this is in touchesBegan:withEvent:. After that, you update the position of the image view, like this:
CGPoint touchOffset; // declare this as a private instance variable in your interface

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *drag = [[event allTouches] anyObject];
    CGPoint touchLocation = [drag locationInView:self.view];
    if (CGRectContainsPoint(player.frame, touchLocation)) {
        // Remember how far from the center the finger landed,
        // so the view does not jump when the drag starts.
        touchOffset = CGPointMake(player.center.x - touchLocation.x,
                                  player.center.y - touchLocation.y);
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *drag = [[event allTouches] anyObject];
    CGPoint touchLocation = [drag locationInView:self.view];
    if (CGRectContainsPoint(player.frame, touchLocation)) {
        player.center = CGPointMake(touchLocation.x + touchOffset.x,
                                    touchLocation.y + touchOffset.y);
    }
}
P.S. - Next time try to search similar questions...
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    UIView *aView = [aTouch view];
    if (aView != self.view) {
        [self.view bringSubviewToFront:aView];
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *aTouch = [touches anyObject];
    UIView *aView = [aTouch view];
    if (aView != self.view) {
        aView.center = [aTouch locationInView:self.view];
    }
}
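For completeness, UIKit's gesture recognizers handle the same drag-and-drop case without overriding the touch methods. A minimal sketch, assuming `player` is the UIImageView you want to drag (a UIImageView ignores touches unless userInteractionEnabled is YES):

// Setup: enable touches on the image view and attach a pan recognizer once.
player.userInteractionEnabled = YES;
UIPanGestureRecognizer *pan =
    [[UIPanGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(dragPlayer:)];
[player addGestureRecognizer:pan];

// Move the view by the amount the finger has moved since the last callback.
- (void)dragPlayer:(UIPanGestureRecognizer *)recognizer
{
    CGPoint translation = [recognizer translationInView:self.view];
    UIView *view = recognizer.view;
    view.center = CGPointMake(view.center.x + translation.x,
                              view.center.y + translation.y);
    // Reset so the next callback reports only the incremental movement.
    [recognizer setTranslation:CGPointZero inView:self.view];
}

Because the movement here is incremental, the view follows the finger instead of jumping, which is the behaviour the question asks for.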

touchesCancelled triggered when using three fingers

I have a simple app that prints the coordinates of all touch events.
In the simulator it works great.
On my device (an iPhone) it works great with up to two fingers. When I tap with three fingers in a fast sequence, the touchesCancelled event is triggered.
Could someone please explain this to me?
This is the code that does the printing (in case the problem lies there); it sits in my UIView.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"Began %f %f", location.x, location.y);
    }
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"Moved %f %f", location.x, location.y);
    }
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"Ended %f %f", location.x, location.y);
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    NSLog(@"Phase: Touches cancelled");
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInView:touch.view];
        NSLog(@"Cancelled %f %f", location.x, location.y);
    }
}
An example of such a sequence:
Began 38.000000 263.000000
Began 173.500000 238.500000
Moved 41.500000 263.000000
Phase: Touches cancelled <<<< third touch
Cancelled 41.500000 263.000000
Cancelled 173.500000 238.500000
Thank you.
