Knob rotation gesture recognizer - iOS

I'm trying to create a gesture recognizer that detects a four-finger rotation (similar to rotating a volume knob).
My idea was to subclass UIRotationGestureRecognizer and override its methods. In -touchesBegan:withEvent: I check the number of touches; if it is lower than 4, the gesture fails. Otherwise I pass the touch locations to an algorithm that finds the diameter of their convex hull. If you think about it, the fingers are the vertices, and I just need to find the two vertices with the maximum distance between them. Once I have these two points I keep them as ivars and pass them to the superclass, as if this were a simple two-finger rotation.
It doesn't work:
- detecting the touches seems pretty hard;
- -touchesMoved:withEvent: is called only rarely;
- when it is called, it hangs most of the time.
Can someone help me?
Here is the code:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (touches.count < 4) {
        //FAIL
        self.state = UIGestureRecognizerStateFailed;
        return;
    }
    //Find the diameter of the convex hull
    NSArray *touchesArray = [touches allObjects];
    NSMutableArray *pointsArray = @[].mutableCopy;
    for (UITouch *touch in touchesArray) {
        [pointsArray addObject:[NSValue valueWithCGPoint:[touch locationInView:touch.view]]];
    }
    DiameterType convexHullDiameter = getDiameterFromPoints(pointsArray);
    CGPoint firstPoint = convexHullDiameter.firstPoint;
    CGPoint secondPoint = convexHullDiameter.secondPoint;
    for (UITouch *touch in touchesArray) {
        if (CGPointEqualToPoint([touch locationInView:touch.view], firstPoint)) {
            self.fistTouch = touch;
        }
        else if (CGPointEqualToPoint([touch locationInView:touch.view], secondPoint)) {
            self.secondTouch = touch;
        }
    }
    //Calculating the rotation center as the midpoint between the diameter vertices
    CGPoint rotationCenter = (CGPoint) {
        .x = (convexHullDiameter.firstPoint.x + convexHullDiameter.secondPoint.x) / 2,
        .y = (convexHullDiameter.firstPoint.y + convexHullDiameter.secondPoint.y) / 2
    };
    self.rotationCenter = rotationCenter;
    //Passing touches to super as a fake rotation gesture
    NSSet *touchesSet = [[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil];
    [super touchesBegan:touchesSet withEvent:event];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (touches.count < 4) {
        self.state = UIGestureRecognizerStateFailed;
        return;
    }
    [super touchesMoved:[[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil] withEvent:event];
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:[[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil] withEvent:event];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:[[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil] withEvent:event];
}

The reason initial detection is hard is that the touches may not all start at the same time; touchesBegan will likely be called multiple times as separate touches land on the screen. You can use the event parameter to query all of the current touches with event.allTouches. So your current approach to failing the gesture will not work: do not set the state to failed when touches.count < 4; instead, just return if event.allTouches.count < 4. You could use a timer to set the state to failed if the fourth touch does not arrive within a certain time of the first.
touchesMoved likely has problems because the touches in the event object do not match up with those in the set that you pass to super.

If you think about it, your fingers are the vertices and I just need to find the two vertices with the max distance.
I don't think this will work in practice, even if you are able to trick the UIRotationGestureRecognizer.
This is how I would implement the algorithm the 'correct' way:
1. Remember the 'old' touches.
2. When you're given 'new' touches, try to match each finger to a previous touch. If you can't, fail.
3. Compute the center of the 'new' + 'old' touches.
4. For each of the 4 fingers matched in step 2, compute the angle traveled in radians, approximated as the distance |new(i) - old(i)| divided by the finger's distance to the center.
5. If any angle is too big (> 0.5 rad), fail. This guarantees that the approximation is valid.
6. Compute the average of the 4 angles.
Congratulations, you now have the rotation angle (measured in radians).

I would put this in a comment if I had enough rep.
[super touchesMoved:[[NSSet alloc] initWithObjects:self.fistTouch, self.secondTouch, nil] withEvent:event];
You're using something called fistTouch, which doesn't sound like what you want. My guess is you want firstTouch.
Additionally, there may be collisions with other gestures that override yours. Did you know iOS 7 has a system-wide 4-finger zoom-out gesture? Also, a 4-finger zoom-in inside an app will close it.

Related

objective c - SKSpriteNode - when touch ends outside of the sprite

I followed this wonderful guide about a Mario-style game:
http://www.raywenderlich.com/62053/sprite-kit-tutorial-make-platform-game-like-super-mario-brothers-part-2
However, I wanted to convert the movement controls to arrow keys, implemented by SKSpriteNodes with names that are detected by:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
    // other left, up, down arrows with same code here
    if ([node.name isEqualToString:@"rightArrow"]) {
        self.player.moveRight = YES;
        self.rightOriginalTouchLocation = location;
        ...
    }
}
self.player.moveRight is a boolean value (much like moveForward in the guide) that tells the character to move at update time.
It is terminated at:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInNode:self];
    SKNode *node = [self nodeAtPoint:location];
    // other left, up, down arrows with same code here
    if ([node.name isEqualToString:@"rightArrow"]) {
        self.player.moveRight = NO;
    }
    ...
}
However, I encounter the following problem: when I start the touch on the arrow, drag it outside the arrow, and then release the tap, it is not recognized as 'touch ended' for the arrow node (and the character doesn't stop moving because of that).
I tried to solve it in many ways (even calculating the touch's distance from its original location and cancelling the movement if it is too far), but I always manage to reproduce the constant-motion problem.
The issue lies in the fact that I can tap two arrows at the same time, so it is not enough to remember the last node tapped.
Since I want to allow movement in different directions at the same time, I can't stop all movements when one button is released. I need to know specifically which button was released so I can stop only that direction's movement.
Do you have any ideas for me? Should I implement it another way, given that I want arrow keys, or is there a method to detect which node was released even though the touch is no longer at its original location?
Thank you very much!
If anyone is interested in the issue: I had a problem where touching an SKSpriteNode and moving off it didn't call touchesEnded for that SKSpriteNode (since the 'release' of the touch was not inside the node).
I solved it by keeping the CGPoint of the touch at touchesBegan, and when touchesEnded is called, calculating the distance to the nearest key using a simple distance function:
-(int)calculateDistanceWithPoints:(CGPoint)point1 andPoint:(CGPoint)point2 {
    float d = sqrtf(powf(point1.x - point2.x, 2) + powf(point1.y - point2.y, 2));
    return (int)d;
}
Then, in touchesEnded, I check which key the distance is minimal to (I have 3 keys, so that key is most likely the one that was 'released') and perform the action required for that key release.

Unpinch custom gesture recognizer with three fingers in iOS

I want to make a custom gesture recognizer with three fingers. Which is similar to unpinch gesture recognizer.
All I need is an idea about how to recognize it.
My gesture needs to recognize three fingers moving in three directions. For example:
I hope the images make sense. I need it to be flexible for any three opposite directions. Thanks in advance; any help would be appreciated.
I am aware of the subclassing methods, and I've already created custom single-finger gestures like a semicircle and a full circle. What I need is an idea of how to code this one.
You need to create a UIGestureRecognizer subclass of your own (let's call it DRThreeFingerPinchGestureRecognizer) and in it to implement:
- touchesBegan:withEvent:
- touchesMoved:withEvent:
- touchesEnded:withEvent:
- touchesCancelled:withEvent:
These methods are called when touches are accepted by the system, possibly before they are delivered to the view itself (depending on how you set up the gesture recognizer). Each of these methods gives you a set of touches, for which you can check the current and previous location in your view. Since a pinch gesture is relatively simple, this information is enough for you to test whether the user is performing one, and otherwise fail the test (UIGestureRecognizerStateFailed). If the state has not been set to failed by the time -touchesEnded:withEvent: runs, you can recognize the gesture.
I say pinch gestures are simple because you can easily track each touch and see how it moves relative to the other touches and to itself. If an angle threshold is broken, you fail the test; otherwise you allow it to continue. If the touches do not move at separate angles from each other, you fail the test. You will have to experiment with which vector angles are acceptable, because 120 degrees is not optimal for the three most common fingers (thumb + index + middle). You may just want to check that the vectors are not colliding.
Make sure to read the UIGestureRecognizer documentation for an in-depth look at the various methods, as well as subclassing notes.
Quick note for future readers: the way you do an unpinch/pinch with three fingers is to add the distances ab, bc, and ac.
However, if your graphics package happens to have an "area of a triangle" function on hand, simply use that. ("It saves one whole line of code!")
Hope it helps.
All you need to do is track the distance between the three fingers!
Simply add up "every" permutation (well, there are three: ab, ac and cb; just add those, and that's all there is to it).
When that value, say, triples from the start value, that's an "outwards triple unpinch".
Amazingly, it's that simple.
Angles are irrelevant.
Footnote if you want to be a smartass: this applies to any pinch/unpinch gesture with 2, 3, or however many fingers:
track the derivative of the sum-distance (that is, the velocity) rather than the distance itself. (Bizarrely, this is often easier to do, because it is stateless: you need only look at the previous frame.)
In other words, the gesture is triggered when the expansion/contraction velocity of the fingers reaches a certain value, rather than a multiple of the start value.
More interesting footnote!
However, there is a subtle problem here: whenever you do anything like this (on any platform) you have to be careful to measure "on the glass".
If you are just using distance (i.e., my first solution above), everything cancels out and you can just say "if it doubles" (in pixels, points, whatever). But if velocity is part of the calculation in any gesture, then, somewhat surprisingly, you have to literally find the velocity in meters per second in the real world, which sounds weird at first! Of course you can't do this exactly (particularly on Android), because glass sizes vary somewhat, but you have to get close to it. Here is a long post discussing this problem: http://answers.unity3d.com/questions/292333/how-to-calculate-swipe-speed-on-ios.html In practice you usually have to make do with "screen-widths-per-second", which is pretty good. (But this may be vastly different on phones, large tablets, and "Surface"-type things. On a whole iMac screen, 0.1 screen-widths-per-second may be fast, but on an iPhone that is nothing, not a gesture.)
Final footnote: I simply don't know whether Apple uses "distance multiple" or "glass velocity" in their gesture recognition, or, quite likely, some subtle mix. I've never read an article from them commenting on it.
Another footnote: if for whatever reason you do want to find the "center" of the triangle (I mean the center of the three fingers), this is a well-travelled problem for game programmers because, after all, all 3D meshes are triangles.
Fortunately it's trivial to find the center of three points: just add the three vectors and divide by three! (Confusingly, this even works in higher dimensions.)
You can see endless posts on this issue:
http://answers.unity3d.com/questions/445442/calculate-uv-at-center-of-triangle.html
http://answers.unity3d.com/questions/424950/mid-point-of-a-triangle.html
Conceivably, if you were incredibly fussy, you would want the "barycenter", which is more like the center of mass; just google that if you want it.
I think tracking angles is leading you down the wrong path. It's likely a more flexible and intuitive gesture if you don't constrain it based on the angles between the fingers, and it will be less error-prone if you just treat it as a three-fingered pinch regardless of how the fingers move relative to each other. This is what I'd do:
if (presses != 3) {
    state = UIGestureRecognizerStateCancelled;
    return;
}
// After three fingers are detected, begin tracking the gesture.
state = UIGestureRecognizerStateBegan;
central_point_x = (point1.x + point2.x + point3.x) / 3;
central_point_y = (point1.y + point2.y + point3.y) / 3;
// Record the central point and the average finger distance from it.
central_point = make_point(central_point_x, central_point_y);
initial_pinch_amount = (distance_between(point1, central_point) + distance_between(point2, central_point) + distance_between(point3, central_point)) / 3;
Then, on each update for touches moved:
if (presses != 3) {
    state = UIGestureRecognizerStateEnded;
    return;
}
// Get the new central point.
central_point_x = (point1.x + point2.x + point3.x) / 3;
central_point_y = (point1.y + point2.y + point3.y) / 3;
central_point = make_point(central_point_x, central_point_y);
// Find the new average distance.
pinch_amount = (distance_between(point1, central_point) + distance_between(point2, central_point) + distance_between(point3, central_point)) / 3;
// Determine the multiplicative factor between them.
difference_factor = pinch_amount / initial_pinch_amount
Then you can do whatever you want with difference_factor. If it's greater than 1, the pinch has moved away from the center; if it's less than 1, it has moved towards the center. This also gives the user the ability to hold two fingers stationary and move only a third to perform your gesture, which addresses certain accessibility issues your users may encounter.
Also, you could always track the incremental change between touch move events, but they won't be equally spaced in time and I suspect you'll have more troubles dealing with it.
I also apologize for the pseudo-code. If something isn't clear I can look at doing up a real example.
A simple subclass of UIGestureRecognizer. It calculates the relative triangular center of the 3 points, then the average distance from that center; the angle is not important. You then check the average distance in your gesture handler.
.h
#import <UIKit/UIKit.h>
#import <UIKit/UIGestureRecognizerSubclass.h>
@interface UnPinchGestureRecognizer : UIGestureRecognizer
@property CGFloat averageDistanceFromCenter;
@end
.m
#import "UnPinchGestureRecognizer.h"
@implementation UnPinchGestureRecognizer
-(CGPoint)centerOf:(CGPoint)pnt1 pnt2:(CGPoint)pnt2 pnt3:(CGPoint)pnt3
{
    CGPoint center;
    center.x = (pnt1.x + pnt2.x + pnt3.x) / 3;
    center.y = (pnt1.y + pnt2.y + pnt3.y) / 3;
    return center;
}
-(CGFloat)averageDistanceFromCenter:(CGPoint)center pnt1:(CGPoint)pnt1 pnt2:(CGPoint)pnt2 pnt3:(CGPoint)pnt3
{
    CGFloat distance;
    distance = (sqrt((pnt1.x - center.x) * (pnt1.x - center.x) + (pnt1.y - center.y) * (pnt1.y - center.y)) +
                sqrt((pnt2.x - center.x) * (pnt2.x - center.x) + (pnt2.y - center.y) * (pnt2.y - center.y)) +
                sqrt((pnt3.x - center.x) * (pnt3.x - center.x) + (pnt3.y - center.y) * (pnt3.y - center.y))) / 3;
    return distance;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 3) {
        [super touchesBegan:touches withEvent:event];
        NSArray *touchObjects = [touches allObjects];
        CGPoint pnt1 = [[touchObjects objectAtIndex:0] locationInView:self.view];
        CGPoint pnt2 = [[touchObjects objectAtIndex:1] locationInView:self.view];
        CGPoint pnt3 = [[touchObjects objectAtIndex:2] locationInView:self.view];
        CGPoint center = [self centerOf:pnt1 pnt2:pnt2 pnt3:pnt3];
        self.averageDistanceFromCenter = [self averageDistanceFromCenter:center pnt1:pnt1 pnt2:pnt2 pnt3:pnt3];
        self.state = UIGestureRecognizerStateBegan;
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if ([touches count] == 3) {
        NSArray *touchObjects = [touches allObjects];
        CGPoint pnt1 = [[touchObjects objectAtIndex:0] locationInView:self.view];
        CGPoint pnt2 = [[touchObjects objectAtIndex:1] locationInView:self.view];
        CGPoint pnt3 = [[touchObjects objectAtIndex:2] locationInView:self.view];
        CGPoint center = [self centerOf:pnt1 pnt2:pnt2 pnt3:pnt3];
        self.averageDistanceFromCenter = [self averageDistanceFromCenter:center pnt1:pnt1 pnt2:pnt2 pnt3:pnt3];
        self.state = UIGestureRecognizerStateChanged;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    self.state = UIGestureRecognizerStateEnded;
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    self.state = UIGestureRecognizerStateFailed;
}
@end
Implementation of the gesture handler. I use a maximum average distance to start and a minimum to end; you can also check during the Changed state:
-(IBAction)handleUnPinch:(UnPinchGestureRecognizer *)sender
{
    switch (sender.state) {
        case UIGestureRecognizerStateBegan:
            //If you want a maximum starting distance
            self.validPinch = (sender.averageDistanceFromCenter < 75);
            break;
        case UIGestureRecognizerStateEnded:
            //Minimum distance from relative center
            if (self.validPinch && sender.averageDistanceFromCenter >= 150) {
                NSLog(@"successful unpinch");
            }
            break;
        default:
            break;
    }
}

Restrict movement of UIButton along a UIBezierPath path

I have a circular UIBezierPath that I use to draw the outline of a 24-hour clock on my view. I have a UIButton whose position depends on the current time; the button acts like an hour hand. I want users to be able to move the UIButton along the circular path; I call it the "visit the future/past" feature. How do I restrict the button's movement to the path I have?
Override the touchesBegan: and touchesMoved: methods in your view:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([[event touchesForView:button] count])
    {
        //User is trying to move the button; set a variable to indicate this.
    }
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self];
    /* Compare the x and y coordinates with the button's center property.
       If x or y is greater, set the button's center to the next point on
       the circle, or to the previous point if either is smaller. */
}
Note that you will have to save all the points of your circle in an array beforehand, or calculate the points on the circumference from the radius.
The easiest way is in touchesMoved: you can ignore any touch that is not inside your circle view by using:
    CGPoint point = [touch locationInView:circleView];
    if (![circleView pointInside:point withEvent:event]) {
        return;
    }

Cocos2d ccDrawLine performance issue

I use cocos2d 2.0 and Xcode 4.5. I am trying to learn how to draw a line. I can draw one, but after I draw a few lines a serious performance issue occurs in the Simulator.
The Simulator starts to freeze and draws lines very, very slowly, and worst of all (I guess because -(void)draw is called every frame), the label on the screen becomes bold.
before lines:
after lines:
I use the following code:
.m
-(id) init
{
    if ((self = [super init])) {
        CCLabelTTF *label = [CCLabelTTF labelWithString:@"Simple Line Demo" fontName:@"Marker Felt" fontSize:32];
        label.position = ccp(240, 300);
        [self addChild:label];
        _naughtytoucharray = [[NSMutableArray alloc] init];
        self.isTouchEnabled = YES;
    }
    return self;
}
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    BOOL isTouching = YES;
    // determine if it's a touch you want, then return the result
    return isTouching;
}
-(void) ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint new_location = [touch locationInView:[touch view]];
    new_location = [[CCDirector sharedDirector] convertToGL:new_location];
    CGPoint oldTouchLocation = [touch previousLocationInView:touch.view];
    oldTouchLocation = [[CCDirector sharedDirector] convertToGL:oldTouchLocation];
    oldTouchLocation = [self convertToNodeSpace:oldTouchLocation];
    // add my touches to the naughty touch array
    [_naughtytoucharray addObject:NSStringFromCGPoint(new_location)];
    [_naughtytoucharray addObject:NSStringFromCGPoint(oldTouchLocation)];
}
-(void)draw
{
    [super draw];
    ccDrawColor4F(1.0f, 0.0f, 0.0f, 1.0f);
    for (int i = 0; i < [_naughtytoucharray count]; i += 2)
    {
        CGPoint start = CGPointFromString([_naughtytoucharray objectAtIndex:i]);
        CGPoint end = CGPointFromString([_naughtytoucharray objectAtIndex:i + 1]);
        ccDrawLine(start, end);
    }
}
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    ManageTraffic *line = [ManageTraffic node];
    [self addChild:line z:99 tag:999];
}
I've seen a few air-traffic-control games, such as Flight Control and ATC Mania, that work really well.
Does this performance issue occur because of ccDrawLine/UITouch handling, or is it a common issue?
What might Flight Control and ATC Mania be using for line drawing?
Thanks in advance.
EDIT:
OK, I guess the problem is not ccDrawLine. The problem is that I call ManageTraffic *line = [ManageTraffic node]; every time a touch ends, which calls the node's init and overrides the scene:
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    ManageTraffic *line = [ManageTraffic node];
    [self addChild:line z:99 tag:999];
}
There are three things going on:
1. You assess performance on the Simulator. Test it on a device, as Ben says.
2. You store points as strings and convert the strings back to CGPoint. That is terribly inefficient.
3. ccDrawLine is not exactly efficient. For a couple dozen line segments it's OK; in your case maybe not (see below).
For #2, create a point class with only a CGPoint property and use that to store points in the array. This removes the string conversion (or packing into NSData).
For #3, make sure a new point is only added if it is at least n points away from the previous point. For example, a distance of 10 should reduce the number of points while still allowing relatively fine line detail.
Also regarding #3, I notice you add both the current and the previous point to the array. Why? You only need to add the new point, and then draw from index 0 to 1, from 1 to 2, and so on; you only have to special-case having a single point. The previous touch event's location is always the next touch event's previousLocation, so you are storing twice as many points as you need.

rotate an image with touch

I am making an app which sets a sleep timer with a clock. Basically it is a clock with a single hand which the user can move to set his sleep time. I tried to rotate the image with UITouch, but it rotates from the middle, and I want it to rotate from the tip. Secondly, I want the image to rotate only when the user is touching its tip, but in my project it also rotates when the user touches any part of the screen. Also, I want to rotate the image in both directions, but in my project it moves only clockwise, due to this method:
image.transform = CGAffineTransformRotate(image.transform, degreesToRadians(1));
Can anybody give me hints or solutions about how this can be done?
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    touch = [[event allTouches] anyObject];
    touchLocation = [touch locationInView:touch.view];
    NSLog(@"began");
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // get touch event
    [image.layer setAnchorPoint:CGPointMake(0.0, 0.0)];
    if ([touch view] == image) {
        image.transform = CGAffineTransformRotate(image.transform, degreesToRadians(1));
        //image.center = touchLocation;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"end");
}
In detail, you can create a custom rotateView, then:
1. In the "touchesBegan" delegate method, get the finger's initialPoint and the initialAngle.
2. During "touchesMoved", get the finger's newPoint:
    CGPoint newPoint = [[touches anyObject] locationInView:self];
    [self pushTouchPoint:newPoint date:[NSDate date]];
    double angleDif = [self angleForPoint:newPoint] - [self angleForPoint:initialPoint];
    self.angle = initialAngle + angleDif;
    [[imageView layer] setTransform:CATransform3DMakeRotation(angle, 0, 0, 1)];
3. At last, in "touchesEnded" you can calculate the final angular velocity.
If anything is confusing, write back and I can add more detail.
To rotate it the other way you just negate the angle:
image.transform = CGAffineTransformRotate(image.transform, degreesToRadians(-1));
And to rotate from the tip you should use the layer's anchorPoint (like you used in your code, but set it on image.layer once, before the gesture starts).
More on the subject: setAnchorPoint for UIImage?
I tried to rotate the image with uitouch but it rotates from middle but i want it to rotate from the tip
I don't know of any way in the SDK to rotate an element around its extreme end. If you want that effect, take a clock-hand image twice the length, with half of it transparent. It is not a direct solution, but a workaround.
Also i want to rotate the image in both directions but in my project it moves only clockwise due to this method
Per the documentation, a positive angle value specifies counterclockwise rotation in the default Core Graphics coordinate space; in UIKit's flipped coordinate system the same positive value appears clockwise on screen. Pass a negative value to rotate the other way.
