iOS 9.0 added a new property, force, to the UITouch class. On newer iPhones (6S and later) this makes it possible to read the pressure of the user's finger.
The value of the force property seems to range between 0 and 6.66667.
iOS 9 also added the peek and pop feature: when a user applies a certain level of finger pressure to some controls, a programmed action is triggered.
My question is: what are these pressure levels (for peek and pop) in terms of the value of the force property of UITouch?
In other words, to what value do I need to set a threshold on the force property so that the user has to apply the same finger pressure as when they use the 'peek' (or pop) feature?
You can observe the force value using the function below. It seems that for a peek the force value is 1.33 (normalized force = 0.20) and for a pop the force value is 5.0 (normalized force = 0.75). At the peek force level, the system triggers the UIViewControllerPreviewingDelegate method previewingContext:viewControllerForLocation: for the peek.
- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    UITouch *touch = [touches anyObject];
    CGFloat maximumPossibleForce = touch.maximumPossibleForce;
    // maximumPossibleForce is 0 on devices without 3D Touch; avoid dividing by zero.
    if (maximumPossibleForce <= 0) {
        return;
    }
    CGFloat force = touch.force;
    NSLog(@"***** force value : %f", force);
    CGFloat normalizedForce = force / maximumPossibleForce;
    NSLog(@"Normalized force : %f", normalizedForce);
    if (normalizedForce > 0.75)
    {
        // Pop
    }
    else if (normalizedForce > 0.20)
    {
        // Peek
    }
}
By default you don't need to set a force threshold for the pop and peek operations; the values are predefined in the framework. You can refer to this link https://developer.apple.com/library/ios/documentation/UserExperience/Conceptual/Adopting3DTouchOniPhone/3DTouchAPIs.html
on how to implement peek and pop in your view controller. If you want to customise when to peek and pop, then you would have to check the force value yourself, which is not recommended. As per Apple's documentation:
The force of the touch, where a value of 1.0 represents the force of an average touch (predetermined by the system, not user-specific).
Peek is basically for showing a preview, for which you will have to implement a few things. You can get sample code here https://developer.apple.com/library/ios/samplecode/ViewControllerPreviews/Introduction/Intro.html. Pop is the normal action.
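To make the adoption steps concrete, here is a minimal sketch of the previewing API (not taken from the linked sample; DetailViewController is a hypothetical stand-in for whatever you want to preview):
// In a view controller that conforms to UIViewControllerPreviewingDelegate,
// register for previewing once 3D Touch is known to be available.
- (void)viewDidLoad {
    [super viewDidLoad];
    if (self.traitCollection.forceTouchCapability == UIForceTouchCapabilityAvailable) {
        [self registerForPreviewingWithDelegate:self sourceView:self.view];
    }
}

// Peek: return the view controller to preview at the touched location.
- (UIViewController *)previewingContext:(id<UIViewControllerPreviewing>)previewingContext
              viewControllerForLocation:(CGPoint)location {
    return [[DetailViewController alloc] init]; // hypothetical preview controller
}

// Pop: the user pressed harder, so commit (present) the previewed controller.
- (void)previewingContext:(id<UIViewControllerPreviewing>)previewingContext
     commitViewController:(UIViewController *)viewControllerToCommit {
    [self showViewController:viewControllerToCommit sender:self];
}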
Related
I am writing a game and when a user touches the left or right side of the screen, a "sustain level" is increased. Currently the increase only happens when the user first touches the screen (my code is below). I want the increase to be applied for as long as the user holds their finger on the screen. What do I have to do?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:self.view];
    if (self.x >= touchPoint.x) {
        self.sustain += 1;
    } else if (self.x <= touchPoint.x) {
        self.sustain += 1;
    }
}
You won't get any messages while the user holds their finger still; your next message will be touchesEnded:. So you need to start a timer and keep increasing the value as desired every time the timer fires, until you get touchesEnded:.
You need to figure out how rapidly you want this increase to happen, then use a timer mechanism to change the value. This could be an NSTimer that fires repeatedly at a fixed interval: start the timer when you get touchesBegan: and stop it when you get touchesEnded:, as in the sketch below.
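A minimal sketch of that timer approach, assuming hypothetical sustain and sustainTimer properties on the responder:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Start a repeating timer; it fires every 0.1 s while the finger is down.
    self.sustainTimer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                                         target:self
                                                       selector:@selector(increaseSustain)
                                                       userInfo:nil
                                                        repeats:YES];
}

- (void)increaseSustain {
    self.sustain += 1;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Stop increasing as soon as the finger lifts.
    [self.sustainTimer invalidate];
    self.sustainTimer = nil;
}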
If you're using SpriteKit, the scene has a built-in timer that triggers its update: method. You can use a flag to indicate that a touch is present and change the "sustain" value in update:; see the sketch below.
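A sketch of the SpriteKit variant, assuming a hypothetical touchActive BOOL property on the scene:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    self.touchActive = YES; // a finger is down
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    self.touchActive = NO;
}

- (void)update:(NSTimeInterval)currentTime {
    if (self.touchActive) {
        self.sustain += 1; // runs once per frame while the finger is down
    }
}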
So I'm making a minesweeper clone for iOS, and I have an array of 135 UIButtons (the minesweeper board). It looks great and theoretically should work great, but I was having trouble detecting which button was being hit. I tried working around the problem by using this code:
UITouch *touched = [[event allTouches] anyObject];
CGPoint location = [touched locationInView:touched.view];
NSLog(@"x=%.2f y=%.2f", location.x, location.y);
int pointX = location.x;
int pointY = location.y;
My goal was to grab the coordinates of the touch and then use some basic math to figure out which button was being pressed. However, it doesn't work. At all. No button is pressed, no function runs, essentially nothing happens. I'm left with a minesweeper board that you can't interact with. Any ideas?
Assign a separate number to the tag of each button. Use the button's target-action mechanism, not the UITouch code. When you get a button press, query the tag, as in the sketch below.
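A minimal sketch of that approach, assuming a 15-column board and a hypothetical tilePressed: handler:
- (void)buildBoard {
    for (NSInteger i = 0; i < 135; i++) {
        UIButton *button = [UIButton buttonWithType:UIButtonTypeCustom];
        button.tag = i; // row = i / 15, column = i % 15
        // set button.frame from the row/column here
        [button addTarget:self
                   action:@selector(tilePressed:)
         forControlEvents:UIControlEventTouchUpInside];
        [self.view addSubview:button];
    }
}

- (void)tilePressed:(UIButton *)sender {
    NSInteger row = sender.tag / 15;
    NSInteger column = sender.tag % 15;
    NSLog(@"Tile pressed at row %ld, column %ld", (long)row, (long)column);
}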
You could subclass the buttons and then program what needs to happen when a touch occurs in a button inside of that subclass.
The UIButton * can be accessed by calling:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
on self (a UIView *, I imagine). So I suppose you can set the button to the pushed state, and when touchesEnded: is called, set it back.
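A rough sketch of that idea, assuming the touch handling lives in the board's container view (a UIView subclass):
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:self];
    // Ask the view hierarchy which subview is under the touch.
    UIView *hit = [self hitTest:point withEvent:event];
    if ([hit isKindOfClass:[UIButton class]]) {
        ((UIButton *)hit).highlighted = YES; // set back in touchesEnded:
    }
}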
I am using touchesMoved with a coordinate system to detect and respond to user touches within certain areas of the screen. For example, if I have a virtual keyboard and the user swipes across the keys, it reads the coordinates and responds:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint point = [touch locationInView:touch.view];
    if (point.y < 333 && point.y > 166 && point.x < 90 && point.x > 20)
    {
        //do something
    }
}
...However, the problem is that if the user slowly drags across the keys, or along the border between keys, the method is triggered several times in a row, playing the piano key sound in a stutter.
How can I prevent this stutter? I think setting a minimum delay of 0.25 seconds between successive triggers of the same if statement would help. Also, this delay should apply only to that specific if statement -- I want the user to be able to drag across the keys quickly and trigger different keys' if statements as fast as they want.
Does anyone know how to code something like this?
Try this:
BOOL _justPressed; // Declare this in your @interface
...
- (void)unsetJustPressed {
    _justPressed = NO;
}
Then, in your touchesMoved:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (_justPressed) {
        // A key was just pressed, so do nothing.
        return;
    }
    else {
        _justPressed = YES;
        // Do stuff here
        [self performSelector:@selector(unsetJustPressed)
                   withObject:nil
                   afterDelay:0.25];
    }
}
This way, you set _justPressed to YES every time touchesMoved:withEvent: is called (or within a specific conditional in there, depending on what you want to do), and you use performSelector:withObject:afterDelay: to set _justPressed back to NO after a certain time period. So when touchesMoved: is called, you can simply check whether _justPressed is YES to find out whether it was called recently.
Remember, you don't have to return from the method as in the example above; you can simply use _justPressed to check whether you should play the sound, but still perform your other actions. The example is just to give you a basic idea of what to do.
I'm using Xcode 4.4, developing for iOS 5 on an iPad, and am using the Storyboard layout when creating my custom button.
I have the touch event correctly working and logging but now I want to get the x/y coordinates of the tap on my custom button.
If possible, I'd like the coordinates to be relative to the custom button instead of relative to the entire iPad screen.
Here's my code in the .h file:
- (IBAction)getButtonClick:(id)sender;
and my code in the .m file:
- (IBAction)getButtonClick:(id)sender {
    NSLog(@"Image Clicked.");
}
Like I said, that correctly logs when I tap the image.
How can I get the coordinates of the tap?
I've tried a few different examples from the internet, but they always freeze when they display a bunch of numbers (maybe the coordinates) in the log box. I'm VERY new to iOS development, so please make it as simple as possible. Thanks!
To get the touch location you can use another variant of the button action method, myAction:forEvent: (if you create it from the IB interface, note the "sender and event" option in the arguments field).
Then in your action handler you can get the touch location from the event parameter, for example:
- (IBAction)myAction:(UIButton *)sender forEvent:(UIEvent *)event {
    NSSet *touches = [event touchesForView:sender];
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:sender];
    NSLog(@"%@", NSStringFromCGPoint(touchPoint));
}
In case of Swift 3.0 the accepted answer works the same, except the syntax changes as follows:
Swift 3.0:
@IBAction func buyTap(_ sender: Any, forEvent event: UIEvent) {
    let myButton = sender as! UIButton
    let touches = event.touches(for: myButton)
    let touch = touches?.first
    let touchPoint = touch?.location(in: myButton)
    print("touchPoint: \(String(describing: touchPoint))")
}
For your overall coordinates (with reference to the screen), you need to create a CGPoint that contains the coordinates of your touch. But to do that, you need to get that touch first. So start by getting the touch event, then make that point using the locationInView: method. Now, depending on when you want to log the touch (when the user touches down, or when they lift their finger), you have to implement this code in the touchesBegan: or touchesEnded: method. Let's say you do touchesEnded:, which passes an NSSet called "touches" containing all the touch events.
UITouch *tap = [touches anyObject];
CGPoint touchPoint = [tap locationInView:self.view];
"touchPoint" will now contain the point at which the user lifts their finger. To print out the coordinates, you just access the x and y properties of that point:
CGFloat pointX = touchPoint.x;
CGFloat pointY = touchPoint.y;
NSLog(@"Coordinates are: %f, %f", pointX, pointY);
That should output the coordinates of the touch. Now, to have it referenced to whatever button you're using, the simplest solution is to manually subtract the button's coordinates from the point. Alternatively, you can get coordinates with reference to another object by passing that object's view to locationInView: instead of self.view, as sketched below.
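In fact, locationInView: accepts any view, so you can get button-relative coordinates directly by passing the button itself. A short sketch, assuming a hypothetical self.myButton outlet:
UITouch *tap = [touches anyObject];
// Passing the button makes the coordinates relative to the button's bounds.
CGPoint pointInButton = [tap locationInView:self.myButton];
NSLog(@"Relative to button: %@", NSStringFromCGPoint(pointInButton));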
For more info on touches, there's a great set of tutorials here.
I've been busy for a few days trying to figure out how to handle touch in my Cocos2d project. The situation is a bit different than normal. I have a few different game layers with items on them that I need to control with touch:
ControlLayer: Holds the game controls (movement, action button). This layer is on top.
GameplayLayer: Holds the game objects (CCSprites). This layer is directly beneath the ControlLayer.
Now my touches work fine in the ControlLayer: I can move my playable character around and make him jump and do other silly stuff. Yet I cannot grasp how to implement touches on some of my CCSprites.
The information I've gathered so far makes me think I need to get all my touch input from the control layer. Then I somehow need to 'cascade' the touch information to the GameplayLayer so I can handle the input there. Another option would be to get the CGRect information from my sprites by somehow creating an array with pointers to the objects that should be touchable. I should be able to use that information in the ControlLayer to check, for each item in that list, whether the item was touched.
What is the best option for this, and how do I implement it? I'm kind of new to programming with Cocoa and Objective-C, so I'm not really sure what the best option is for this language, or how to access a sprite's CGRect information ([mySpriteName boundingBox]) in a class other than the layer it is rendered in.
At the moment the only way I'm sure would work is to create duplicate CGRects for each CCSprite position so I can check them, but I know this is not the right way to do it.
What I have so far (to test) is this:
ControlLayer.m
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:[touch view]];
    CGRect rect = CGRectMake(0.0f, 0.0f, 100.0f, 100.0f);
    // Tried some stuff here to see if I could get a sprite by tag name
    // so I could use its bounding box, but that didn't work
    // Check for touch with specific location
    if (CGRectContainsPoint([tree boundingBox], location)) {
        CCLOG(@"CGRect contains the location, touched!");
    }
    CCLOG(@"Layer touched at %@", NSStringFromCGPoint(location));
}
Thanks in advance for helping me!
The easiest and simplest way to solve your problem, IMO, is to use ccTouchBegan/Moved/Ended instead of ccTouchesBegan/Moved/Ended. Meaning, you handle a single touch at a particular moment, so you avoid getting confused over multiple touches. Plus, the most important feature of ccTouchBegan is that a CCLayer can 'consume' the touch and stop it from propagating to the next layers. More explanation after the code samples below.
Here are steps to do it. Implement these sets of methods in all CCLayer subclasses that should handle touch events:
First, register with CCTouchDispatcher:
- (void)registerWithTouchDispatcher {
    [[CCTouchDispatcher sharedDispatcher] addTargetedDelegate:self priority:0 swallowsTouches:YES];
}
Next, implement ccTouchBegan; the example below is from a game I've created (some parts omitted, of course):
- (BOOL)ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event {
    if (scene.state != lvlPlaying) {
        // don't accept touch if not playing
        return NO;
    }
    CGPoint location = [self convertTouchToNodeSpace:touch];
    if (scene.mode == modePlaying && !firstTouch) {
        if (CGRectContainsPoint(snb_putt.sprite.boundingBox, location)) {
            touchOnPutt = touch.timestamp;
            // do stuff
            // return YES to consume the touch
            return YES;
        }
    }
    // default to not consume touch
    return NO;
}
And finally, implement ccTouchMoved and ccTouchEnded like their ccTouches* counterparts, except that they handle a single touch instead of a set of touches. The touch passed to these methods is restricted to the one consumed in ccTouchBegan, so there is no need to do validation in these two methods; a sketch follows.
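For completeness, a minimal sketch of those two methods (the bodies are up to your game logic):
- (void)ccTouchMoved:(UITouch *)touch withEvent:(UIEvent *)event {
    // Only called for the touch that was consumed in ccTouchBegan.
    CGPoint location = [self convertTouchToNodeSpace:touch];
    // drag handling here
}

- (void)ccTouchEnded:(UITouch *)touch withEvent:(UIEvent *)event {
    // The consumed touch has lifted; reset any state set in ccTouchBegan.
}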
Basically this is how it works: a touch event is passed by CCScene to each of its CCLayers one by one, based on z-ordering (i.e. starting from the top layer down to the bottom layer), until one of the layers consumes the touch. So if a layer at the top (e.g. the control layer) consumes the touch, the touch won't be propagated to the next layer (e.g. the object layer). This way each layer only has to worry about itself when deciding whether to consume the touch. If it decides that the touch cannot be used, it just has to not consume it (return NO from ccTouchBegan) and the touch will automatically propagate down the layers.
Hope this helps.