Rotate the wheel to a specific point - iOS

The wheel has 6 pieces. The top of the wheel is the specific point, and it shows information about whichever piece is currently there; right now it shows information about the blue piece. If I tap one of the pieces, for example purple, I need the purple piece to rotate to the specific point and the information about the purple piece to be shown automatically.
CGFloat topPositionAngle = radiansToDegrees(atan2(view.transform.a, view.transform.b));
-180 - pink
-120 - blue
-60 - orange
0 - purple
60 - yellow
120 - green
Right now topPositionAngle reads -120, which is blue; when purple reaches the specific point it reads 0.
UITouch *touch = [touches anyObject];
CGPoint currentTouchPoint = [touch locationInView:view];
CGFloat currentAngle = radiansToDegrees(atan2(currentTouchPoint.x, currentTouchPoint.y));
CGFloat angleTransform = ???
CGAffineTransform current = view.transform;
[UIView animateWithDuration:0.2f animations:^{
[view setTransform:CGAffineTransformRotate(current, angleTransform)];
}];
How can I get the wheel to rotate automatically to the specific point? Just like the Dansk Bank app (see the following YouTube link), something similar to the video from 0:21 - 0:25.
http://www.youtube.com/watch?v=hulBh_KNGjE

// wrapd(): helper that wraps the angle into the range [0, 2*PI)
float fromAngle = atan2(m_locationBegan.y - img.center.y, m_locationBegan.x - img.center.x);
float toAngle = atan2(_location.y - img.center.y, _location.x - img.center.x);
float newAngle = wrapd(m_currentAngle + (toAngle - fromAngle), 0, 2 * M_PI);
angle = newAngle;
CGAffineTransform transform = CGAffineTransformMakeRotation(newAngle);
img.transform = transform;
Use something like this. Hope this helps.
Here you need to know the touch point and your specific point.

You need to know the touched point's angle:
CGFloat touchedPointAngle = atan2f(touchPoint.y - centerSuper.y, touchPoint.x - centerSuper.x) + M_PI;
if ((touchedPointAngle < sliceInRadians) && (touchedPointAngle > 0)) {
angleTransform = sliceInRadians;
} else if ...
Then you can transform.
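Putting that together, here is a minimal sketch (not the question's exact code) of snapping a tapped slice to the top. It assumes 6 slices of 60 degrees, that the wheel starts with a slice centred at the top (as in the question), and that the touch is taken in the wheel's superview ([touch locationInView:view.superview]) so the angle can be measured around the wheel's centre; view is the wheel view from the question:
static const CGFloat kSliceAngle = M_PI / 3.0; // 60 degrees per slice

- (void)rotateTappedSliceToTop:(CGPoint)touchPoint
{
// Angle of the touch around the wheel's centre (UIKit coordinates, y points down)
CGPoint center = view.center;
CGFloat touchAngle = atan2f(touchPoint.y - center.y, touchPoint.x - center.x);

// "Top" is straight up from the centre, i.e. -90 degrees in this coordinate system
CGFloat topAngle = -M_PI_2;

// Rotate by the difference, snapped to a whole number of slices so a slice
// centre (roughly the one that was tapped) ends up at the top
CGFloat delta = topAngle - touchAngle;
CGFloat angleTransform = roundf(delta / kSliceAngle) * kSliceAngle;

CGAffineTransform current = view.transform;
[UIView animateWithDuration:0.2f animations:^{
view.transform = CGAffineTransformRotate(current, angleTransform);
}];
}
After the animation finishes you can re-read topPositionAngle (or keep your own running angle) and use the -180...120 table above to decide which piece's information to show.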

Related

Restrict Touch on ColorPicker UIView - iOS

I am using a third-party color picker wheel (ISColorWheel - https://github.com/justinmeiners/ios-color-wheel) to pick a color and display it on screen. I need to prevent the user from selecting the blue color when a particular button is enabled.
Looking at the color picker library class, it implements the following code to keep the knob view from moving outside the color wheel.
- (void)setTouchPoint:(CGPoint)point
{
CGFloat width = self.bounds.size.width;
CGFloat height = self.bounds.size.height;
CGPoint center = CGPointMake(width / 2.0, height / 2.0);
// Check if the touch is outside the wheel
if (ISColorWheel_PointDistance(center, point) < _radius)
{
//NSLog(#"Distance is %f and Radius is %f",ISColorWheel_PointDistance(center, point),_radius);
_touchPoint = point;
}
else
{
// If so, we need to create a direction vector and calculate the constrained point
CGPoint vec = CGPointMake(point.x - center.x, point.y - center.y);
float extents = sqrtf((vec.x * vec.x) + (vec.y * vec.y));
vec.x /= extents;
vec.y /= extents;
_touchPoint = CGPointMake(center.x + vec.x * _radius, center.y + vec.y * _radius);
NSLog(#"Touch Point is %f %f",_touchPoint.x,_touchPoint.y);
}
[self updateKnob];
}
The above code prevents the user from moving the knob view outside the circle. In my case I need to prevent the user from selecting the blue color in the color picker. How can I implement this? How do I find the region of the wheel that is blue?
You should define a triangle that covers the color blue as you see it (how far it extends toward green on one side and toward purple on the other), then check whether your point lies inside that triangle. One way to do it is here: https://stackoverflow.com/a/9755252/1870192
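One common form of that inside-triangle test is the same-side / cross-product check sketched below. This is not code from ISColorWheel: the three corner points of the "blue" triangle (blueA, blueB, blueC) are placeholders you would measure on the wheel yourself, in the same coordinate space as the touch point.
// Sign of the 2D cross product of (b - a) x (p - a);
// tells which side of the edge a->b the point p lies on.
static CGFloat CrossSign(CGPoint p, CGPoint a, CGPoint b)
{
return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// YES if p is inside (or on the edge of) triangle a-b-c, for either winding order.
static BOOL PointInTriangle(CGPoint p, CGPoint a, CGPoint b, CGPoint c)
{
CGFloat d1 = CrossSign(p, a, b);
CGFloat d2 = CrossSign(p, b, c);
CGFloat d3 = CrossSign(p, c, a);
BOOL hasNeg = (d1 < 0) || (d2 < 0) || (d3 < 0);
BOOL hasPos = (d1 > 0) || (d2 > 0) || (d3 > 0);
return !(hasNeg && hasPos);
}
In setTouchPoint: you could then ignore (or clamp) the touch when your restriction flag is on and PointInTriangle(point, blueA, blueB, blueC) returns YES.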

convert uitouch dragging points to custom logic of goal selection

Right now I am working on a custom control for a fitness-related app. The following is an image of the control I have developed so far.
The red view can be dragged vertically only, from the innermost circle to the outer circle.
So far I have managed to drag the view vertically from the innermost circle to the outer circle.
Here is what I want to achieve:
Each circle represents 2500 steps; for example, the innermost circle is 0, then 2500, 5000, and so on up to 15000 at the outer circle.
The user can select a goal in increments of 500 only.
So I am stuck on how to convert the red view's sliding points into steps.
I have also noticed that when I swipe fast I get irregular intervals between the current touch points, so the step increments are low on a fast swipe and high on a slow swipe.
Any help pointing toward a solution is highly appreciated.
Here is my code so far.
Circle Code
I would use an overlay UIScrollView for the red draggable part, allow only vertical scrolling, set pagingEnabled to NO, and set its delegate to the view controller.
Then you can implement your own paging of 500 steps per page as follows, by setting kCellHeight to the number of pixels corresponding to 500 steps in your image:
- (void)scrollViewWillEndDragging:(UIScrollView *)scrollView withVelocity:(CGPoint)velocity targetContentOffset:(inout CGPoint *)targetContentOffset
{
CGFloat kMaxIndex = 30; // = 15000 / 500
CGFloat targetY = scrollView.contentOffset.y + velocity.y * 60.0;
CGFloat targetIndex = round(targetY / (kCellHeight + kCellSpacing));
if (targetIndex < 0)
targetIndex = 0;
if (targetIndex > kMaxIndex)
targetIndex = kMaxIndex;
targetContentOffset->y = targetIndex * (kCellHeight + kCellSpacing);
}
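A minimal setup sketch for that overlay scroll view, under the same assumptions (circleView, kCellHeight and kCellSpacing are placeholders for your own view and constants; the 30 pages come from the 15000/500 figure above):
// Hypothetical overlay scroll view sitting on top of the circle control
UIScrollView *overlay = [[UIScrollView alloc] initWithFrame:self.circleView.bounds];
overlay.pagingEnabled = NO;            // custom 500-step paging is done in the delegate
overlay.showsVerticalScrollIndicator = NO;
overlay.delegate = self;               // so scrollViewWillEndDragging: above gets called
overlay.contentSize = CGSizeMake(overlay.bounds.size.width,
30 * (kCellHeight + kCellSpacing) + overlay.bounds.size.height); // max offset = 30 pages
[self.view addSubview:overlay];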
-- New Solution After looking at your code --
I see that for the iPhone 5 (width = 320) the radius delta between adjacent circles is roughly 20 points, and this space corresponds to 2500 steps.
So 1 point = 125 steps
Since you want 500-step increments, that corresponds to 4 points on the iPhone 5/5s screen. Maybe when the user's touch ends, you can try to snap the navigator to the closest 4-point grid:
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
NSLog(@"%s", __FUNCTION__);
UITouch *touch = [touches anyObject];
touchPoint = [touch locationInView:_navView];
CGPoint newCenter = CGPointMake(_navView.center.x, _navView.center.y + (touchPoint.y - touchStart.y));
// Snap the vertical offset from the innermost circle to the nearest 4-point grid (500 steps)
CGFloat yDelta = newCenter.y - _navMinPoint.y;
int yDeltaMultiple = yDelta / 4;
int newYDelta = yDeltaMultiple * 4;
newCenter.y = _navMinPoint.y + newYDelta;
_navView.center = newCenter;
_labelSteps.center = [self setStepsLabelfromNavView];
}
I didn't write these in Xcode, so there might be typos/errors. If this works on the iPhone 5/5s, we will then need to change the 4 points to a variable.
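To turn those 4 points into a variable, one possible sketch (my assumption, not from the answer above) is to derive the grid from the actual distance between the innermost and outermost circles. _navMinPoint comes from the code above; _navMaxPoint (the centre position at the outer circle) and the 15000-step total are assumptions you would adapt:
// Points of travel that cover the whole 0..15000 step range
- (CGFloat)pointsPerIncrement
{
CGFloat trackLength = _navMaxPoint.y - _navMinPoint.y; // assumed: centre position at the outer circle
CGFloat pointsPerStep = trackLength / 15000.0f;
return pointsPerStep * 500.0f;                          // points per 500-step increment
}

// Convert a snapped centre position back into the selected goal
- (NSInteger)stepsForCenterY:(CGFloat)centerY
{
CGFloat grid = [self pointsPerIncrement];
NSInteger increments = lround((centerY - _navMinPoint.y) / grid);
return increments * 500;
}
In touchesEnded: you would then snap to [self pointsPerIncrement] instead of the hard-coded 4, and feed the result of stepsForCenterY: into _labelSteps.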

GLKit Object not rotating properly

I have a 3D object.
I am rotating it with touches like this:
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
CGPoint location = [touch locationInView:self.view];
CGPoint lastLoc = [touch previousLocationInView:self.view];
CGPoint diff = CGPointMake(lastLoc.x - location.x, lastLoc.y - location.y);
float rotX = -1 * GLKMathDegreesToRadians( diff.y / 2.0 );
float rotY = GLKMathDegreesToRadians( diff.x / 3.0 );
bool isInvertible = false; // flag filled in by GLKMatrix4Invert
GLKVector3 yAxis = GLKMatrix4MultiplyAndProjectVector3(GLKMatrix4Invert(_rotMatrix, &isInvertible), GLKVector3Make(0, 0, 1) );
_rotMatrix = GLKMatrix4Rotate(_rotMatrix, rotY, yAxis.x, yAxis.y, yAxis.z);
GLKVector3 xAxis = GLKVector3Make(1, 0, 0);
_rotMatrix = GLKMatrix4Rotate(_rotMatrix, rotX, xAxis.x, xAxis.y, xAxis.z);
}
and setting the matrices like this:
_modelViewMatrix = GLKMatrix4Identity;
_modelViewMatrix = GLKMatrix4Translate(_modelViewMatrix, 0.0f, 0.0f, -60.0f);
_modelViewMatrix = GLKMatrix4Translate(_modelViewMatrix, 0.0f, 5.5f, -4.0f);
// I know I can do this in one call
// rotation steps: the first is so the model (the air conditioner) faces the camera,
// the second is the transform matrix coming from the finger gestures
// I am turning off the 90-degree rotation
_modelViewMatrix = GLKMatrix4RotateX(_modelViewMatrix, GLKMathDegreesToRadians(90.0f));
_modelViewMatrix = GLKMatrix4Multiply(_modelViewMatrix, _rotMatrix);
_modelViewMatrix = GLKMatrix4Translate(_modelViewMatrix, 0.0f, -5.5f, +4.0f);
self.reflectionMapEffect.transform.modelviewMatrix = _modelViewMatrix;
I am translating the modelViewMatrix to the object's centre, rotating it, then translating back, then translating -65 on z. But every time I try it, it rotates around the same vector. I think the object has its own centre, and it is rotating around both its own centre and the scene's centre.
How can I change the object's centre in code, or how can I rotate this object properly?
The matrix multiplication works in terms of the object's basis vectors. You can imagine it as looking from a first-person perspective (from the object/model, that is). If you first move the object (translate) and then rotate it, the object will still be at the same position but facing a different direction, which means it will not simulate orbiting. If you change the order of operations to rotate first and then move, it will simulate orbiting (but rotating as well). For instance, if you rotate the model to face your right and then translate forward, it will seem as if it translated to your right. So a true orbit consists of first rotating by some angle, then translating by the radius, and then rotating by the same negative angle. Again, try looking at it from the model's perspective.
I hope this helps, as you did not explain what exactly it is you want/need to accomplish.
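To make the ordering concrete, here is a small GLKit sketch (not the asker's exact matrices); angle is a placeholder accumulated from touches, and the -60 z offset is borrowed from the question's own translate:
// Spin in place: translate call first, rotate call second (M = T * R).
// The rotation happens in the object's local frame, so it pivots on the
// object's own centre at its translated position.
GLKMatrix4 spinInPlace = GLKMatrix4MakeTranslation(0.0f, 0.0f, -60.0f);
spinInPlace = GLKMatrix4Rotate(spinInPlace, angle, 0.0f, 1.0f, 0.0f);

// Orbit the origin: rotate call first, translate call second (M = R * T).
// The object is swung around the world origin at a radius of 60.
GLKMatrix4 orbitOrigin = GLKMatrix4MakeYRotation(angle);
orbitOrigin = GLKMatrix4Translate(orbitOrigin, 0.0f, 0.0f, -60.0f);
If the model's geometry is not centred on its own origin, translate to its centre first, apply the rotation, and translate back, keeping that whole rotate-about-centre group together before the final placement.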

Problems with offsets in portrait mode Cocos2d

I am working from Ray Wenderlich's tutorial on rotating turrets in Cocos2d (see here: http://www.raywenderlich.com/25791/rotating-turrets-how-to-make-a-simple-iphone-game-with-cocos2d-2-x-part-2). I need my game to be in portrait mode, and I have managed to get the turret positioned correctly:
The turret manages to shoot right, but not left. Here is my code:
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
if (_nextProjectile != nil) return;
// Choose one of the touches to work with
UITouch *touch = [touches anyObject];
CGPoint location = [self convertTouchToNodeSpace:touch];
// Set up initial location of projectile
CGSize winSize = [[CCDirector sharedDirector] winSize];
_nextProjectile = [[CCSprite spriteWithFile:@"projectile2.png"] retain];
_nextProjectile.position = ccp(160, 20);
// Determine offset of location to projectile
CGPoint offset = ccpSub(location, _nextProjectile.position);
// Bail out if you are shooting down or backwards
if (offset.x <= 0) return;
// Determine where you wish to shoot the projectile to
int realX = winSize.width + (_nextProjectile.contentSize.width/2);
float ratio = (float) offset.y / (float) offset.x;
int realY = (realX * ratio) + _nextProjectile.position.y;
CGPoint realDest = ccp(realX, realY);
// Determine the length of how far you're shooting
int offRealX = realX - _nextProjectile.position.x;
int offRealY = realY - _nextProjectile.position.y;
float length = sqrtf((offRealX*offRealX)+(offRealY*offRealY));
float velocity = 480/1; // 480pixels/1sec
float realMoveDuration = length/velocity;
// Determine angle to face
float angleRadians = atanf((float)offRealY / (float)offRealX);
float angleDegrees = CC_RADIANS_TO_DEGREES(angleRadians);
float cocosAngle = -1 * angleDegrees;
float rotateDegreesPerSecond = 180 / 0.5; // Would take 0.5 seconds to rotate 180 degrees, or half a circle
float degreesDiff = _player.rotation - cocosAngle;
float rotateDuration = fabs(degreesDiff / rotateDegreesPerSecond);
[_player runAction:
[CCSequence actions:
[CCRotateTo actionWithDuration:rotateDuration angle:cocosAngle],
[CCCallBlock actionWithBlock:^{
// OK to add now - rotation is finished!
[self addChild:_nextProjectile];
[_projectiles addObject:_nextProjectile];
// Release
[_nextProjectile release];
_nextProjectile = nil;
}],
nil]];
// Move projectile to actual endpoint
[_nextProjectile runAction:
[CCSequence actions:
[CCMoveTo actionWithDuration:realMoveDuration position:realDest],
[CCCallBlockN actionWithBlock:^(CCNode *node) {
[_projectiles removeObject:node];
[node removeFromParentAndCleanup:YES];
}],
nil]];
_nextProjectile.tag = 2;
}
Thanks for the help!
You are checking the x axis instead of y:
// Bail out if you are shooting down or backwards
if (offset.x <= 0) return;
Did you actually set the application to run in portrait mode, or have you just rotated the simulator and repositioned the turret?
If you didn't explicitly set the app to run in portrait, your x and y coordinates will be swapped (x will run from the home button to the top of the phone, not across as you would expect).
If it is converted properly, I have answered this question before :)
The issue here is that you've copy-pasted the math instead of editing it for your purposes. Ray's code makes assumptions that rely on you always shooting to the right of the turret, not up, down, or left.
Here's the math code you should be looking at:
// Determine offset of location to projectile
CGPoint offset = ccpSub(location, _nextProjectile.position);
// Bail out if you are shooting down or backwards
if (offset.x <= 0) return;
Note that offset.x will be less than 0 if the tap location is to the left of the turret, so this is an assumption you took from Ray but did not revise. As gheesse said, for your purposes this check should be on offset.y, since you don't want them shooting south of the projectile's original location. But this is only part of the problem here.
// Determine where you wish to shoot the projectile to
int realX = winSize.width + (_nextProjectile.contentSize.width/2);
Here's your other big issue: you did not revise Ray's math for determining where the projectile should go. In Ray's code the projectile always ends up at a location off the right edge of the screen, so he uses the screen width and the projectile's size to determine the destination. This causes your issue, because you no longer have the assumption that the projectile always heads right - yours will always go up (hint: code similar to this should be used for your realY).
float ratio = (float) offset.y / (float) offset.x;
int realY = (realX * ratio) + _nextProjectile.position.y;
Again, Ray makes assumptions in his math for his game and you haven't corrected them in this realY. Your code has the turret turning in ways that will affect the realX coordinate instead of the realY, which is the coordinate that Ray's always-shoots-right turret needed to affect.
You need to sit down and re-do the math.
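For reference, a hedged sketch of what the re-done offset/destination math might look like for a portrait, always-shoots-up turret. This simply mirrors Ray's right-shooting code with the axes swapped; it is not from the original answer, and the exact bail-out condition may need tuning for your layout:
// Determine offset of the tap location to the projectile
CGPoint offset = ccpSub(location, _nextProjectile.position);

// Bail out if you are shooting down or backwards (below the turret)
if (offset.y <= 0) return;

// Destination is always off the top edge of the screen
int realY = winSize.height + (_nextProjectile.contentSize.height / 2);
float ratio = (float)offset.x / (float)offset.y;
int realX = (realY * ratio) + _nextProjectile.position.x;
CGPoint realDest = ccp(realX, realY);
You will likely need to revisit the angle-to-face computation (atanf(offRealY / offRealX)) in the same way, since it also assumes a turret whose zero rotation points right.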

rotation along x and y axis

I'm using GLKit along with PowerVR library for my opengl-es 2.0 3D app. The 3D scene loads with several meshes, which simulate a garage environment. I have a car in the center of the garage. I am trying to add touch handling to the app, where the user can rotate the room around (e.g., to see all 4 walls surrounding the car). I also want to allow a rotation on the x axis, though limited to a small range. Basically they can see from a little bit of the top of the car to just above the floor level.
I am able to rotate on the Y OR on the X, but not both. As soon as I rotate on both axes, the car is thrown off-axis and is no longer level with the camera. I wish I could explain this better, but hopefully you guys will understand.
Here is my touches implementation:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch * touch = [touches anyObject];
CGPoint location = [touch locationInView:self.view];
CGPoint lastLoc = [touch previousLocationInView:self.view];
CGPoint diff = CGPointMake(lastLoc.x - location.x, lastLoc.y - location.y);
float rotX = -1 * GLKMathDegreesToRadians(diff.x / 4.0);
float rotY = GLKMathDegreesToRadians(diff.y / 5.0);
PVRTVec3 xAxis = PVRTVec3(1, 0, 0);
PVRTVec3 yAxis = PVRTVec3(0,1,0);
PVRTMat4 yRotMatrix, xRotMatrix;
// create rotation matrices with angle
PVRTMatrixRotationXF(yRotMatrix, rotY);
PVRTMatrixRotationYF(xRotMatrix, -rotX);
_rotationY = _rotationY * yRotMatrix;
_rotationX = _rotationX * xRotMatrix;
}
Here's my update method:
- (void)update {
// Use the loaded effect
m_pEffect->Activate();
PVRTVec3 vFrom, vTo, vUp;
VERTTYPE fFOV;
vUp.x = 0.0f;
vUp.y = 1.0f;
vUp.z = 0.0f;
// We can get the camera position, target and field of view (fov) with GetCameraPos()
fFOV = m_Scene.GetCameraPos(vFrom, vTo, 0);
/*
We can build the world view matrix from the camera position, target and an up vector.
For this we use PVRTMat4LookAtRH().
*/
m_mView = PVRTMat4::LookAtRH(vFrom, vTo, vUp);
// rotate the camera based on the users swipe in the X direction (THIS WORKS)
m_mView = m_mView * _rotationX;
// Calculates the projection matrix
bool bRotate = false;
m_mProjection = PVRTMat4::PerspectiveFovRH(fFOV, (float)1024.0/768.0, CAM_NEAR, CAM_FAR, PVRTMat4::OGL, bRotate);
}
I've tried multiplying the new X rotation matrix into the current scene rotation first and then the new Y rotation matrix second. I've tried the reverse of that, thinking the order of multiplication was my problem, but that didn't help. Then I tried adding the new X and Y rotation matrices together before multiplying them into the current rotation, but that didn't work either. I feel that I'm close, but at this point I'm just out of ideas.
Can you guys help? Thanks. -Valerie
Update: In an effort to solve this, I'm trying to simplify it a little. I've updated the above code, removing any limit in the range of the Y rotation. Basically I calculate the X and Y rotation based on the user swipe on the screen.
If I understand this correctly, I think I want to rotate the View matrix (camera/eye) with the calculation for the _rotationX.
I think I need to use the World matrix (origin 0,0,0) for the _rotationY calculation. I'll try and get some images of exactly what I'm talking about.
Wahoo, got this working! I rotated the view matrix (created by the LookAt method) with the X rotation matrix, and I rotated the model-view matrix with the Y rotation matrix.
Here's the modified Update method:
- (void)update {
PVRTVec3 vFrom, vTo, vUp;
VERTTYPE fFOV;
// We can get the camera position, target and field of view (fov) with GetCameraPos()
fFOV = m_Scene.GetCameraPos(vFrom, vTo, 0);
/*
We can build the world view matrix from the camera position, target and an up vector.
For this we use PVRTMat4LookAtRH().
*/
m_mView = PVRTMat4::LookAtRH(vFrom, vTo, PVRTVec3(0.0f, 1.0f, 0.0f));
// rotate on the X axis (finger swipe Y direction)
m_mView = m_mView * _rotationY;
// Calculates the projection matrix
m_mProjection = PVRTMat4::PerspectiveFovRH(fFOV, (float)1024.0/768.0, CAM_NEAR, CAM_FAR, PVRTMat4::OGL, false);
}
Here's the modified touch moved method:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch * touch = [touches anyObject];
CGPoint location = [touch locationInView:self.view];
CGPoint lastLoc = [touch previousLocationInView:self.view];
CGPoint diff = CGPointMake(lastLoc.x - location.x, lastLoc.y - location.y);
float rotX = -1 * GLKMathDegreesToRadians(diff.x / 2.5);
float rotY = GLKMathDegreesToRadians(diff.y / 2.5);
PVRTMat4 rotMatrixX, rotMatrixY;
// create rotation matrices with angle
PVRTMatrixRotationYF(rotMatrixX, -rotX);
PVRTMatrixRotationXF(rotMatrixY, rotY);
_rotationX = _rotationX * rotMatrixX;
_rotationY = _rotationY * rotMatrixY;
}
