I'm looking to create a UISwipeGestureRecognizer attached to my UIView. What I want to achieve is to let the user swipe a finger across the screen to set the screen brightness.
Here is what I've attempted:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentPosition = [touch locationInView:self.view];
    CGFloat deltaXX = (startPoint.x - currentPosition.x);

    float brightVal;
    if (deltaXX > 0) {
        brightVal = [[UIScreen mainScreen] brightness] / ((startPoint.x / deltaXX) * 1.0);
    } else {
        brightVal = [[UIScreen mainScreen] brightness] - ((startPoint.x / deltaXX) * 1.0);
    }

    NSLog(@"%f", brightVal);
    [[UIScreen mainScreen] setBrightness:brightVal];
}
However, this doesn't work properly: towards the right-hand edge of the screen the brightness changes far too quickly.
I think the problem is that I can't work out how to normalize the value so that 1.0 is the left-hand side of the screen and 0.0 is the right-hand side.
Any suggestions?
You cannot normalize unless you have a total; in this case that is your screen width. It is also counter-intuitive for the brightness to change faster the further right the swipe starts. It would be better to map any point directly onto the brightness scale:
CGFloat width = self.view.bounds.size.width;
brightVal = position.x / width;
Now, if the start value differs from the current brightness, the brightness will jump to that point on the scale as soon as the move starts. This is pretty much how a UISlider works, and you don't even need to calculate the delta to the start point.
You could have this be triggered only after deltaXX is beyond a certain threshold.
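A minimal sketch of that slider-style approach (assuming the question's convention that the left edge maps to 1.0 and the right edge to 0.0; drop the 1.0 - to use the direct mapping above):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint position = [touch locationInView:self.view];
    CGFloat width = self.view.bounds.size.width;

    // Map the horizontal position onto the brightness scale; left edge = 1.0.
    CGFloat brightVal = 1.0 - (position.x / width);

    // Clamp, since a touch can report coordinates slightly outside the bounds.
    brightVal = MAX(0.0, MIN(1.0, brightVal));

    [[UIScreen mainScreen] setBrightness:brightVal];
}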
I'm using ACEDrawingView to draw within a view.
How would I detect the width and height of the drawing, so that I can crop around it, something like this:
Update: After @Duncan pointed me in the right direction, I was able to look through the source code and found the following:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // save all the touches in the path
    UITouch *touch = [touches anyObject];

    previousPoint2 = previousPoint1;
    previousPoint1 = [touch previousLocationInView:self];
    currentPoint = [touch locationInView:self];

    if ([self.currentTool isKindOfClass:[ACEDrawingPenTool class]]) {
        CGRect bounds = [(ACEDrawingPenTool *)self.currentTool addPathPreviousPreviousPoint:previousPoint2 withPreviousPoint:previousPoint1 withCurrentPoint:currentPoint];

        CGRect drawBox = bounds;
        drawBox.origin.x -= self.lineWidth * 2.0;
        drawBox.origin.y -= self.lineWidth * 2.0;
        drawBox.size.width += self.lineWidth * 4.0;
        drawBox.size.height += self.lineWidth * 4.0;

        self.drawingBounds = bounds; // I added this property to allow me to extract the bounds and use it in my view controller

        [self setNeedsDisplayInRect:drawBox];
    }
    else if ([self.currentTool isKindOfClass:[ACEDrawingTextTool class]]) {
        [self resizeTextViewFrame:currentPoint];
    }
    else {
        [self.currentTool moveFromPoint:previousPoint1 toPoint:currentPoint];
        [self setNeedsDisplay];
    }
}
However I get this when I test the bounds:
I'm going to keep trying to figure it out, but if anyone could help that would be great!
Update 3: Using CGContextGetPathBoundingBox I was finally able to achieve it.
Every time you get a touchesMoved, record the location of the point you are now drawing. When you are all done, you have all the points. Now look at the largest x value and the smallest x value and the largest y value and the smallest y value in all of those points. That's the bounding box of the drawing.
Another approach (which you've already discovered) is to save the CGPath and then call CGContextGetPathBoundingBox. Basically that does exactly the same thing.
Note that a path has no thickness, whereas your stroke does. You will need to inset the bounding box negatively to allow for this (my screencast doesn't do that).
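A sketch of that point-tracking approach; recordedPoints is a hypothetical NSMutableArray of NSValue-wrapped CGPoints collected in touchesMoved:

// Compute the bounding box of all recorded points, then outset it to
// account for the stroke width (the path itself has no thickness).
- (CGRect)drawingBoundsWithLineWidth:(CGFloat)lineWidth {
    if (self.recordedPoints.count == 0)
        return CGRectNull;

    CGPoint first = [self.recordedPoints[0] CGPointValue];
    CGFloat minX = first.x, maxX = first.x, minY = first.y, maxY = first.y;

    for (NSValue *value in self.recordedPoints) {
        CGPoint p = [value CGPointValue];
        minX = MIN(minX, p.x); maxX = MAX(maxX, p.x);
        minY = MIN(minY, p.y); maxY = MAX(maxY, p.y);
    }

    CGRect box = CGRectMake(minX, minY, maxX - minX, maxY - minY);
    // A negative inset grows the rect by half the stroke width on each side.
    return CGRectInset(box, -lineWidth / 2.0, -lineWidth / 2.0);
}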
I'm not familiar with the ACEDrawingView class, but I can tell you how to do it with the iOS frameworks:
Create your path as a UIBezierPath.
Interrogate the bounds property of the path.
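For example (a standalone sketch, not ACEDrawingView's API):

UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:CGPointMake(20, 20)];
[path addLineToPoint:CGPointMake(120, 80)];
[path addQuadCurveToPoint:CGPointMake(40, 160) controlPoint:CGPointMake(160, 140)];

// The bounds property is the path's bounding box; it ignores stroke
// width, so inset it negatively before cropping around a stroked drawing.
CGRect box = path.bounds;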
I am developing a space shooter with SpriteKit for testing purposes.
The game field is 700 x 700 pixels/points, so obviously it does not fit on an iPhone screen. That means I need some sort of scrolling that affects players + enemies + bullets + asteroids. I searched Google, but most material on scrolling refers to Cocos2D, and nearly always to Cocos2D-specific features that Sprite Kit does not provide. This is my first game, so I am not sure what the right way is to implement multidirectional scrolling. I hope you guys can give me a solution and/or hints/tutorials or anything else that helps me :D
Create your "spaceship", create a camera node, and in the update method make the camera always center on the spaceship.
Something similar to this question: How to make camera follow SKNode in Sprite Kit?
- (void)centerOnNode:(SKNode *)node {
    CGPoint cameraPositionInScene = [node.scene convertPoint:node.position fromNode:node.parent];
    // Offset the world node so the camera node ends up centered. (Zeroing one
    // axis first, e.g. cameraPositionInScene.x = 0, would restrict scrolling
    // to the other axis; for multidirectional scrolling adjust both.)
    node.parent.position = CGPointMake(node.parent.position.x - cameraPositionInScene.x,
                                       node.parent.position.y - cameraPositionInScene.y);
}
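To keep the camera centered every frame, call this from the scene's update cycle; didSimulatePhysics is a good spot since it runs after physics has moved the nodes (this assumes a _camera node added as a child of your scrollable world node):

- (void)didSimulatePhysics {
    [self centerOnNode:_camera]; // _camera is a child of the world node
}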
Then you'll have to figure out your own game mechanics for how the ship moves, but if you simply want the ship to follow your finger you can use SKActions to animate to the finger location. Example: https://stackoverflow.com/a/19172574/525576
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint location = [touch locationInNode:self];
        CGPoint diff = CGPointMake(location.x - _myPlayer.position.x,
                                   location.y - _myPlayer.position.y);
        CGFloat angleRadians = atan2f(diff.y, diff.x);

        [_myPlayer runAction:[SKAction sequence:@[
            [SKAction rotateToAngle:angleRadians duration:1.0],
            [SKAction moveByX:diff.x y:diff.y duration:3.0]
        ]]];
    }
}
Solved it, thanks to Ray Wenderlich:
CGSize winSize = self.size;

// Clamp the target position so the view never scrolls past the world edges.
int x = MAX(player.position.x, winSize.width / 2);
int y = MAX(player.position.y, winSize.height / 2);
x = MIN(x, worldSize.width - winSize.width / 2);
y = MIN(y, worldSize.height - winSize.height / 2);
CGPoint actualPosition = CGPointMake(x, y);

CGPoint centerOfView = CGPointMake(winSize.width / 2, winSize.height / 2);
// ccpSub is a Cocos2D helper; in Sprite Kit, subtract the points directly.
CGPoint viewPoint = CGPointMake(centerOfView.x - actualPosition.x,
                                centerOfView.y - actualPosition.y);
world.position = viewPoint;
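Running that each frame (for example in the scene's update: method) keeps the player centered while clamping the world offset, so the view never shows empty space beyond the 700 x 700 field.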
I am creating a custom UIControl object as detailed here. It is all working well except for the touch area.
I want to find a way to limit the touch area to only part of the control, in the example above I want it to be restricted to the black circumference only rather than the whole control area.
Any idea?
Cheers
You can override UIView's pointInside:withEvent: to reject unwanted touches.
Here's a method that checks if the touch occurred in a ring around the center of the view:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // The point parameter is already expressed in the view's coordinate
    // system, so there is no need to pull a touch out of the event.
    CGRect bounds = self.bounds;
    CGPoint center = { CGRectGetMidX(bounds), CGRectGetMidY(bounds) };
    CGVector delta = { point.x - center.x, point.y - center.y };
    CGFloat squareDistance = delta.dx * delta.dx + delta.dy * delta.dy;

    // Reject touches outside the ring's outer circle.
    CGFloat outerRadius = bounds.size.width * 0.5;
    if (squareDistance > outerRadius * outerRadius)
        return NO;

    // Reject touches inside the ring's hole.
    CGFloat innerRadius = outerRadius * 0.5;
    if (squareDistance < innerRadius * innerRadius)
        return NO;

    return YES;
}
To detect other hits on more complex shapes you can use a CGPath to describe the shape and test using CGPathContainsPoint. Another way is to use an image of the control and test the pixel's alpha value.
All that depends on how you build your control.
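A sketch of the CGPath variant, assuming the control keeps its hit shape in a hypothetical hitPath property (a CGPathRef built once in the control's coordinate system):

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    // The last argument selects the even-odd fill rule; NO means winding.
    return CGPathContainsPoint(self.hitPath, NULL, point, NO);
}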
I'm trying to create an application that allows the user to move a frame over an image, so that I can apply some effects to a selected region.
I need to allow the user to precisely drag and scale the masked-frame on the image. I need this to be exact, just like any other photo app does.
My strategy is to get the user's touch points on a touch-moved event and scale my frame accordingly. That was pretty intuitive. I coded the following for handling the touch-moved event:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:[self view]];
    float currX = touchPoint.x;
    float currY = touchPoint.y;
    /* proceed with other operations on currX and currY,
       which is coming out quite well */
}
But the only problem is that the currX and currY coordinates are not quite where they are supposed to be. There is a parallax-like offset, and it varies from device to device. I also think the x and y coordinates get swapped on an iPad.
Could you please help me to figure out how to get the exact touch coordinates?
My background image is in one view (imageBG) and the masked frame is in a separate one (maskBG). I have tried out:
CGPoint touchPoint = [touch locationInView:[maskBG view]];
and
CGPoint touchPoint = [touch locationInView:[imageBG view]];
...but the same problem persists. I have also noticed that the touch error is worse on an iPad than on an iPhone or iPod.
image.center = [[[event allTouches] anyObject] locationInView:self.view];
Your issue is that the image and the iPhone screen are not necessarily in the same aspect ratio, so your touch point might not translate correctly onto your actual image.
- (UIImage *)getCroppedImage {
    CGRect rect = self.movingView.frame;

    // Translate the crop rect from view coordinates into image coordinates.
    CGPoint a;
    a.x = rect.origin.x - self.imageView.frame.origin.x;
    a.y = rect.origin.y - self.imageView.frame.origin.y;
    a.x = a.x * (self.imageView.image.size.width / self.imageView.frame.size.width);
    a.y = a.y * (self.imageView.image.size.height / self.imageView.frame.size.height);
    rect.origin = a;
    rect.size.width = rect.size.width * (self.imageView.image.size.width / self.imageView.frame.size.width);
    rect.size.height = rect.size.height * (self.imageView.image.size.height / self.imageView.frame.size.height);

    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();

    // translated rectangle for drawing sub image
    CGRect drawRect = CGRectMake(-rect.origin.x, -rect.origin.y, self.imageView.image.size.width, self.imageView.image.size.height);

    // clip to the bounds of the image context
    // not strictly necessary as it will get clipped anyway?
    CGContextClipToRect(context, CGRectMake(0, 0, rect.size.width, rect.size.height));

    // draw image
    [self.imageView.image drawInRect:drawRect];

    // grab image
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return croppedImage;
}
This is what I did to crop; movingView's frame is the rect I pass for cropping. Note how it is translated so it maps correctly onto the image. Make sure the image view the user sees uses aspect-fit content mode.
Note: I make the image view's rect fit the aspect-fit image. Use this to do it:
- (CGSize)makeSize:(CGSize)originalSize fitInSize:(CGSize)boxSize
{
    float widthScale = boxSize.width / originalSize.width;
    float heightScale = boxSize.height / originalSize.height;
    float scale = MIN(widthScale, heightScale);
    CGSize newSize = CGSizeMake(originalSize.width * scale, originalSize.height * scale);
    return newSize;
}
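For example, with hypothetical sizes, fitting a 400 x 300 image into a 320 x 480 area:

CGSize fitted = [self makeSize:CGSizeMake(400, 300) fitInSize:CGSizeMake(320, 480)];
// fitted is 320 x 240 (scale = MIN(0.8, 1.6) = 0.8); size the image view's
// frame to this so the frame matches the visible, aspect-fit image exactly.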
Have you tried this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint touchPoint = [touch locationInView:selectedImageView];
    float currX = touchPoint.x / selectedImageView.frame.size.width;
    float currY = touchPoint.y / selectedImageView.frame.size.height;
    /* proceed with other operations on currX and currY,
       which is coming out quite well */
}
Or you can also use a UIPanGestureRecognizer.
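A minimal sketch of the gesture-recognizer alternative, reusing selectedImageView from the snippet above:

// e.g. in viewDidLoad:
UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                                      action:@selector(handlePan:)];
[selectedImageView addGestureRecognizer:pan];
selectedImageView.userInteractionEnabled = YES;

- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
    // locationInView: already reports the point in the image view's coordinates.
    CGPoint touchPoint = [gesture locationInView:selectedImageView];
    float currX = touchPoint.x / selectedImageView.frame.size.width;
    float currY = touchPoint.y / selectedImageView.frame.size.height;
    // proceed with currX and currY as before
}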
I have a UIImageView showing an image of an arrow. When the user taps the UIView, this arrow should point in the direction of the tap, maintaining its position; only its transform should change. I have implemented the following code, but it is not working as expected. I have added a screenshot: when I touch the point at the upper left, the arrow direction should be as shown, but that is not what happens.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    touchedPoint = [touch locationInView:touch.view];
    imageViews.transform = CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(rangle11));
    previousTouchedPoint = touchedPoint;
}
- (CGFloat)pointPairToBearingDegrees:(CGPoint)startingPoint secondPoint:(CGPoint)endingPoint
{
    CGPoint originPoint = CGPointMake(endingPoint.x - startingPoint.x, endingPoint.y - startingPoint.y); // translate so the starting point is the origin
    float bearingRadians = atan2f(originPoint.y, originPoint.x);  // get bearing in radians
    float bearingDegrees = bearingRadians * (180.0 / M_PI);       // convert to degrees
    bearingDegrees = (bearingDegrees > 0.0 ? bearingDegrees : (360.0 + bearingDegrees)); // correct discontinuity
    return bearingDegrees;
}
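If rangle11 is supposed to come from that helper, wiring it up would look something like this (a sketch; it assumes the bearing is measured from the image view's center and that the arrow artwork points right at 0 degrees):

CGFloat bearingDegrees = [self pointPairToBearingDegrees:imageViews.center
                                             secondPoint:touchedPoint];
imageViews.transform = CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(bearingDegrees));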
I assume you want the arrow image to point wherever you touch. I tried it, and this is what I could come up with. I put in an image view with an arrow pointing upwards (I haven't tried starting from any other position; the log gives correct angles), and on touching different locations it rotates and points to the touched location. Hope it helps (tried some old math :-)).
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    touchedPoint = [touch locationInView:touch.view];
    CGFloat angle = [self getAngle:touchedPoint];
    imageView.transform = CGAffineTransformMakeRotation(angle);
}
- (CGFloat)getAngle:(CGPoint)touchedPoints
{
    CGFloat x1 = imageView.center.x;
    CGFloat y1 = imageView.center.y;
    CGFloat x2 = touchedPoints.x;
    CGFloat y2 = touchedPoints.y;
    CGFloat x3 = x1;
    CGFloat y3 = y2;

    CGFloat oppSide = sqrtf(((x2 - x3) * (x2 - x3)) + ((y2 - y3) * (y2 - y3)));
    CGFloat adjSide = sqrtf(((x1 - x3) * (x1 - x3)) + ((y1 - y3) * (y1 - y3)));
    CGFloat angle = atanf(oppSide / adjSide);

    // Quadrant identification
    if (x2 < imageView.center.x)
    {
        angle = 0 - angle;
    }
    if (y2 > imageView.center.y)
    {
        angle = M_PI / 2 + (M_PI / 2 - angle);
    }

    NSLog(@"Angle is %.2f", angle * 180 / M_PI);
    return angle;
}
-anoop4real
Given what you told me, I think the problem is that you are not resetting your transform in touchesBegan. Try changing it to something like this and see if it works better:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    touchedPoint = [touch locationInView:touch.view];
    imageViews.transform = CGAffineTransformIdentity;
    imageViews.transform = CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(rangle11));
    previousTouchedPoint = touchedPoint;
}
Do you need the line to "remove the discontinuity"? atan2f() returns values between -π and +π; won't those work directly with CATransform3DMakeRotation()?
What you need is for the arrow to point at the last tapped point. To simplify and test, I have used a tap gesture (but it's similar to touchesBegan:withEvent:).
In the viewDidLoad method, I register the gesture :
UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapped:)];
[self.view addGestureRecognizer:tapGesture];
[tapGesture release];
The method called on each tap :
- (void)tapped:(UITapGestureRecognizer *)gesture
{
    CGPoint imageCenter = mFlecheImageView.center;
    CGPoint tapPoint = [gesture locationInView:self.view];
    double deltaY = tapPoint.y - imageCenter.y;
    double deltaX = tapPoint.x - imageCenter.x;
    double angleInRadians = atan2(deltaY, deltaX) + M_PI_2;

    mFlecheImageView.transform = CGAffineTransformMakeRotation(angleInRadians);
}
One key is the + M_PI_2, because UIKit coordinates have their origin at the top-left corner (while in trigonometry the origin is at the bottom left).