Setting position of UIView using IF statement - ios

In the application I am creating, I have a UIView named AdvancedView1. The user can drag the view around the screen and needs to drop it at a certain position. If the view is dropped within a certain area, it should "snap" to a defined location. When I try to express this with an if statement that ANDs two conditions, I get an error. Any suggestions?
Also "origin" is a CGPoint declared in the .h file.
- (void)pan:(UIPanGestureRecognizer *)gesture
{
    if (gesture.state == UIGestureRecognizerStateChanged)
    {
        CGPoint origin = [gesture locationInView:[self superview]];
        [self bringSubviewToFront:[self superview]];
        [self setCenter:origin];
    }
    if (gesture.state == UIGestureRecognizerStateEnded)
    {
        if (origin.x >= 574.0 && origin.x <= 724.0) {
            self.frame = CGRectMake(574.0, 184.0, self.frame.size.width, self.frame.size.height);
        }
    }
}
I am getting the following error for the origin.x >= ... line:
"Request for member 'x' in something not a structure or union"

The scope of the local origin variable is limited to the first if block, so it is not visible in the second one. Move its declaration up to method scope:
- (void)pan:(UIPanGestureRecognizer *)gesture
{
    CGPoint origin = [gesture locationInView:[self superview]];
    if (gesture.state == UIGestureRecognizerStateChanged)
    {
        [self bringSubviewToFront:[self superview]];
        [self setCenter:origin];
    }
    if (gesture.state == UIGestureRecognizerStateEnded)
    {
        if (origin.x >= 574.0 && origin.x <= 724.0) {
            self.frame = CGRectMake(574.0, 184.0, self.frame.size.width, self.frame.size.height);
        }
    }
}
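If you also want the snap to feel smooth rather than instantaneous, one option (a minimal sketch on my part, not something the original answer requires) is to wrap the frame change in a UIView animation:

    if (gesture.state == UIGestureRecognizerStateEnded)
    {
        if (origin.x >= 574.0 && origin.x <= 724.0) {
            // Animate into the snap position instead of jumping there.
            [UIView animateWithDuration:0.2 animations:^{
                self.frame = CGRectMake(574.0, 184.0, self.frame.size.width, self.frame.size.height);
            }];
        }
    }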

Related

UIPanGestureRecognizer get swipe distance in pixels

I'm trying to track the horizontal distance that a user has swiped using a UIPanGestureRecognizer, but I'm having difficulty interpreting the results. Here is the code that I am using:
- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
    CGPoint startLocation;
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        startLocation = [recognizer locationInView:self.view];
    }
    if (recognizer.state == UIGestureRecognizerStateEnded) {
        CGPoint stopLocation = [recognizer locationInView:self.view];
        CGFloat dx = stopLocation.x - startLocation.x;
        NSLog(@"dx: %f", dx);
    }
}
If I swipe left-to-right, I get output something like this:
dx: 50328911327402404790403072.000000
My screen is only 320 pixels wide, so my end result cannot be greater than 320. Is there a problem with my code or am I just interpreting this number incorrectly? How can I get this value stated in pixels? Thanks!
startLocation is a local variable, so it doesn't persist between calls to handlePan:; by the time the Ended state fires you are reading an uninitialized value, which is why you get garbage.
Either declare startLocation as static or save it in an instance variable / property.
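A minimal sketch of the instance-variable version (the controller and ivar names here are just illustrative):

@interface MyViewController () {
    CGPoint _panStartLocation; // persists between gesture callbacks
}
@end

@implementation MyViewController

- (IBAction)handlePan:(UIPanGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        _panStartLocation = [recognizer locationInView:self.view];
    }
    if (recognizer.state == UIGestureRecognizerStateEnded) {
        CGPoint stopLocation = [recognizer locationInView:self.view];
        CGFloat dx = stopLocation.x - _panStartLocation.x;
        NSLog(@"dx: %f", dx);
    }
}

@end

Alternatively, -[UIPanGestureRecognizer translationInView:] already reports the cumulative offset from where the pan began, so in the Ended state you could read its x component directly instead of tracking the start point yourself.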
You can use UIPanGestureRecognizer as below:
CGPoint translatedPoint = [(UIPanGestureRecognizer *)sender translationInView:self.view];
if ([(UIPanGestureRecognizer *)sender state] == UIGestureRecognizerStateBegan) {
    firstX = [[sender view] center].x;
    firstY = [[sender view] center].y;
}
translatedPoint = CGPointMake(firstX + translatedPoint.x, firstY);
[[sender view] setCenter:translatedPoint];
if ([(UIPanGestureRecognizer *)sender state] == UIGestureRecognizerStateEnded) {
    CGFloat velocityX = (0.2 * [(UIPanGestureRecognizer *)sender velocityInView:self.view].x);
    CGFloat finalX = translatedPoint.x + velocityX;
    CGFloat finalY = firstY;
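The snippet above is cut off before the handler closes. A minimal way to finish it (my assumption, not the original code; firstX and firstY are taken to be CGFloat instance variables) is to animate the view toward the computed final position and close the braces:

    // Hypothetical completion: glide to the final position using the pan velocity.
    [UIView animateWithDuration:0.35 animations:^{
        [[sender view] setCenter:CGPointMake(finalX, finalY)];
    }];
}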

UIAttachmentBehavior doesn't work. UIDynamicAnimator

I am having trouble with initWithItem:attachedToItem:, which (per the documentation) initializes an attachment behavior that connects the center point of a dynamic item to the center point of another dynamic item.
But when I change the center point of the top view in my pan method, only the top view moves around; I can't get the other views to move. Shouldn't they be moving together?
(BTW, I am trying to implement a pile of cards that all move together when I pan the card on top.)
- (void)pinch:(UIPinchGestureRecognizer *)gesture
{
    if (gesture.state == UIGestureRecognizerStateChanged) {
        CGPoint pinchCen = [gesture locationInView:self.cardArea];
        if (gesture.scale <= 0.5 && !self.pileStat) {
            self.pileStat = !self.pileStat;
            NSUInteger number = [self.cardViews count];
            UIView *topView = [self.cardViews lastObject];
            [topView addGestureRecognizer:[[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(pan:)]];
            for (int i = 0; i < number; i++) {
                UIView *cardView = self.cardViews[i];
                [UIView animateWithDuration:0.5 animations:^{
                    cardView.center = CGPointMake(pinchCen.x + i % 10 * 0.5, pinchCen.y + i % 10 * 0.5);
                } completion:^(BOOL finished) {
                    if (i != number - 1) {
                        UIAttachmentBehavior *attach = [[UIAttachmentBehavior alloc] initWithItem:cardView attachedToItem:topView];
                        [self.animator addBehavior:attach];
                    }
                }];
            }
        }
        else if (gesture.scale > 1.5 && self.pileStat)
        {
            self.pileStat = !self.pileStat;
        }
    } else if (gesture.state == UIGestureRecognizerStateEnded) {
        gesture.scale = 1.0f;
    }
}
- (void)pan:(UIPanGestureRecognizer *)gesture
{
    if (gesture.state == UIGestureRecognizerStateChanged || UIGestureRecognizerStateEnded) {
        UIView *topView = [self.cardViews lastObject];
        CGPoint trans = [gesture translationInView:self.cardArea];
        topView.center = CGPointMake(trans.x + topView.center.x, trans.y + topView.center.y);
        [gesture setTranslation:CGPointMake(0, 0) inView:self.cardArea];
    }
}
setCenter does not play nice with UIDynamicAnimator. Instead of changing the center coordinate of your top view, you should use another UIAttachmentBehavior that you attach to your touch. Replace your pan gesture handler with this method:
//Add a variable in your interface or header section called "touchAttachment" of type UIAttachmentBehavior
- (void)pan:(UIPanGestureRecognizer *)sender {
    UIView *topCard = [self.cardViews lastObject];
    if (sender.state == UIGestureRecognizerStateBegan) {
        _touchAttachment = [[UIAttachmentBehavior alloc] initWithItem:topCard
                                                     attachedToAnchor:[sender locationInView:self.view]];
        [self.animator addBehavior:_touchAttachment];
    }
    else if (sender.state == UIGestureRecognizerStateChanged) {
        [_touchAttachment setAnchorPoint:[sender locationInView:self.view]];
    }
    else if (sender.state == UIGestureRecognizerStateEnded) {
        [self.animator removeBehavior:_touchAttachment];
        _touchAttachment = nil;
    }
}
Make sure you add the "touchAttachment" variable.
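For reference, a minimal sketch of the declarations this handler assumes (the class name is just illustrative; the animator is assumed to exist already):

@interface CardViewController ()
@property (nonatomic, strong) UIDynamicAnimator *animator;           // referenced as self.animator above
@property (nonatomic, strong) UIAttachmentBehavior *touchAttachment; // drives the top card while panning
@end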
Hope it helps :)

iOS: drag gesture doesn't move element

In my app I have this code to drag an object.
I set up my gesture recognizer:
UILongPressGestureRecognizer *downwardGesture = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(dragGestureChanged:)];
downwardGesture.minimumPressDuration = 0.2;
[grid_element addGestureRecognizer:downwardGesture];
for (UIGestureRecognizer *gestureRecognizer in self.view.gestureRecognizers)
{
    [gestureRecognizer requireGestureRecognizerToFail:downwardGesture];
}
and in my method:
- (void)dragGestureChanged:(UILongPressGestureRecognizer *)gesture {
    UIImageView *imageToMove;
    CGPoint pointInSelfView;
    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        dragging = TRUE;
        CGPoint location = [gesture locationInView:grid_element];
        NSIndexPath *selectedIndexPath = [grid_element indexPathForItemAtPoint:location];
        if (selectedIndexPath == nil) {
            dragging = FALSE;
            return;
        }
        indexToHide = selectedIndexPath.row;
        imageToMove = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"image_1.png"]];
        [self.view addSubview:imageToMove];
        pointInSelfView = [gesture locationInView:self.view];
        [imageToMove setCenter:pointInSelfView];
    }
    else if (gesture.state == UIGestureRecognizerStateChanged)
    {
        pointInSelfView = [gesture locationInView:self.view];
        [imageToMove setCenter:pointInSelfView];
    }
    else if (gesture.state == UIGestureRecognizerStateEnded ||
             gesture.state == UIGestureRecognizerStateCancelled ||
             gesture.state == UIGestureRecognizerStateFailed)
    {
        dragging = FALSE;
    }
}
It works in UIGestureRecognizerStateBegan: imageToMove is added to self.view in the correct position. But imageToMove doesn't drag in UIGestureRecognizerStateChanged, even though an NSLog of pointInSelfView in UIGestureRecognizerStateChanged shows it changing correctly. Where is the problem?
EDIT
If I use imageToMove as an IBOutlet it works fine, but I don't understand the difference.
imageToMove is a local variable, so the image view you create in UIGestureRecognizerStateBegan is not the one you reference in the later states; each call to dragGestureChanged: starts with a fresh local, and under ARC that local is nil until you assign it.
Store the view in an instance variable or a strong (retain) property instead, so the same object is still reachable when UIGestureRecognizerStateChanged fires; that is also why the IBOutlet version works.
The reason there is no error on [imageToMove setCenter:pointInSelfView] is that sending a message to nil in Objective-C is a silent no-op, so the call simply does nothing.
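A minimal sketch of the property-based fix (class and property names are just illustrative, and the index-path bookkeeping from the original is omitted):

@interface MyViewController ()
@property (nonatomic, strong) UIImageView *imageToMove; // keeps the dragged view alive between gesture callbacks
@end

- (void)dragGestureChanged:(UILongPressGestureRecognizer *)gesture {
    CGPoint pointInSelfView = [gesture locationInView:self.view];
    if (gesture.state == UIGestureRecognizerStateBegan) {
        self.imageToMove = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"image_1.png"]];
        [self.view addSubview:self.imageToMove];
        [self.imageToMove setCenter:pointInSelfView];
    }
    else if (gesture.state == UIGestureRecognizerStateChanged) {
        [self.imageToMove setCenter:pointInSelfView];
    }
    else {
        // Ended / cancelled / failed: drop the reference (and optionally remove the view).
        self.imageToMove = nil;
    }
}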

iOS (iPad) Drag & Drop in a UISplitViewController

I'm working on drag &amp; drop in a UISplitViewController (based on the Xcode template project), from the MasterViewController to the DetailViewController.
Basically, what I'm doing is creating a UILongPressGestureRecognizer and placing it on the self.tableView property of the MasterViewController.
Below is my gesture handler.
- (void)gestureHandler:(UIGestureRecognizer *)gesture
{
    CGPoint location;
    NSIndexPath *indexPath = [self.tableView indexPathForRowAtPoint:[gesture locationInView:self.tableView]];
    if (gesture.state == UIGestureRecognizerStateBegan) {
        // Create draggable view
        NSString *imageCanvasName = [self imageNameByIndexPath:indexPath isDetail:YES];
        UIImage *imageCanvas = [UIImage imageNamed:imageCanvasName];
        _draggedView = [[UIImageView alloc] initWithImage:imageCanvas];
        // Create drag-feedback window, add the drag-view and make the drag-window visible
        CGRect windowFrame = self.view.window.frame;
        _dragFeedbackWindow = [[UIWindow alloc] initWithFrame:windowFrame];
        location = [gesture locationInView:gesture.view.window];
        [_draggedView setCenter:location];
        [_dragFeedbackWindow addSubview:_draggedView];
        [_dragFeedbackWindow setHidden:NO];
    }
    else if (gesture.state == UIGestureRecognizerStateChanged) {
        // Update drag-view location
        location = [gesture locationInView:gesture.view.window];
        [_draggedView setCenter:location];
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        // Disconnect drag-view and hide drag-feedback window
        [_draggedView removeFromSuperview];
        [_dragFeedbackWindow setHidden:YES];
        // If drop is in a valid location...
        if ([self.tableView pointInside:_draggedView.center withEvent:nil] == NO)
        {
            // Get final location in detailViewController coordinates
            location = [gesture locationInView:self.detailViewController.view];
            [_draggedView setCenter:location];
            [self.detailViewController.view addSubview:_draggedView];
        }
    }
    else {
        NSLog(@"%s unrecognized gesture %d", __FUNCTION__, gesture.state);
    }
}
All works very nicely when the iPad is in portrait mode - the drag, the drop - the works.
My problem starts when the iPad is rotated...
In that case, my _draggedView appears "counter-rotated" - it "reflects" the iPad's rotation - until it is dropped.
It seems like I need to apply some rotation to _dragFeedbackWindow, but everything I tried failed...
Any idea?
Thanks!
OK - I figured out the "WHY" this happens and the "HOW" to fix it (the handler code that does this correctly is below).
Here's the code... "Just" add it to your MasterViewController (if - like me - you want to drag from the master to the detail...)
// A simple UIView extension to rotate it to a given orientation
@interface UIView (oriented)
- (void)rotateToOrientation:(UIInterfaceOrientation)orientation;
@end

@implementation UIView (oriented)
- (void)rotateToOrientation:(UIInterfaceOrientation)orientation {
    CGFloat angle = 0.0;
    switch (orientation) {
        case UIInterfaceOrientationPortraitUpsideDown:
            angle = M_PI;
            break;
        case UIInterfaceOrientationLandscapeLeft:
            angle = -M_PI / 2.0f;
            break;
        case UIInterfaceOrientationLandscapeRight:
            angle = M_PI / 2.0f;
            break;
        default: // as UIInterfaceOrientationPortrait
            angle = 0.0;
            break;
    }
    self.transform = CGAffineTransformMakeRotation(angle);
}
@end
Now, the LongPress gesture handler...
- (void)gestureHandler:(UIGestureRecognizer *)gesture
{
    CGPoint location;
    UIWindow *dragFeedback = [UIApplication sharedApplication].delegate.window;
    if (gesture.state == UIGestureRecognizerStateBegan) {
        // Create draggable view
        _draggedView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"someImage.png"]];
        // Required to adapt orientation... WORKS, BUT WHY NEEDED???
        [_draggedView rotateToOrientation:[[UIApplication sharedApplication] statusBarOrientation]];
        // Create drag-feedback window, add the drag-view and make the drag-window visible
        location = [gesture locationInView:dragFeedback];
        [_draggedView setCenter:location];
        [dragFeedback addSubview:_draggedView];
    }
    else if (gesture.state == UIGestureRecognizerStateChanged) {
        // Update drag-view location
        location = [gesture locationInView:dragFeedback];
        [_draggedView setCenter:location];
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        // Disconnect drag-view and hide drag-feedback window
        [_draggedView removeFromSuperview];
        // Get final location in detailViewController coordinates
        location = [gesture locationInView:self.detailViewController.view];
        [_draggedView setCenter:location];
        // "Normalize" orientation... WORKS, BUT WHY NEEDED???
        [_draggedView rotateToOrientation:UIInterfaceOrientationPortrait];
        [self.detailViewController.view addSubview:_draggedView];
    }
    else {
        NSLog(@"%s unrecognized gesture %d", __FUNCTION__, gesture.state);
    }
}
This works like a charm - let me know how it works for you (if you need a sample project, let me know...)
Thanks for your solution! You got me 90% of the way there. In payment, I will give you the last 10% :)
The issue seems to be that the UIWindow never gets rotated, only its subviews do! So the solution is to use a subview.
I've also added some code that shows how to see which cell is selected (assuming your master view controller is a UITableViewController).
- (IBAction)handleGesture:(UIGestureRecognizer *)gesture
{
    CGPoint location;
    UIWindow *window = [UIApplication sharedApplication].delegate.window;
    // The window doesn't seem to rotate, so we'll get the subview, which does!
    UIView *dragFeedback = [window.subviews objectAtIndex:0];
    if (gesture.state == UIGestureRecognizerStateBegan) {
        location = [gesture locationInView:dragFeedback];
        // Which cell are we performing the long hold over?
        NSIndexPath *indexPath = [self.tableView indexPathForRowAtPoint:[gesture locationInView:self.tableView]];
        UITableViewCell *selectedCell = [self.tableView cellForRowAtIndexPath:indexPath];
        NSLog(@"Cell %@", selectedCell);
        // Create draggable view
        _draggedView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"someImage.png"]];
        // Create drag-feedback window, add the drag-view and make the drag-window visible
        [_draggedView setCenter:location];
        [dragFeedback addSubview:_draggedView];
    }
    else if (gesture.state == UIGestureRecognizerStateChanged) {
        // Update drag-view location
        location = [gesture locationInView:dragFeedback];
        [_draggedView setCenter:location];
    }
    else if (gesture.state == UIGestureRecognizerStateEnded)
    {
        // Disconnect drag-view and hide drag-feedback window
        [_draggedView removeFromSuperview];
        // Get final location in detailViewController coordinates
        location = [gesture locationInView:self.detailViewController.view];
        [_draggedView setCenter:location];
        [self.detailViewController.view addSubview:_draggedView];
    }
    else {
        NSLog(@"%s unrecognized gesture %d", __FUNCTION__, gesture.state);
    }
}
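For completeness, a minimal sketch of how the recognizer might be wired up in the master controller's viewDidLoad (the selector matches the handler above; the press duration is just an illustrative value):

- (void)viewDidLoad {
    [super viewDidLoad];
    // Long press on a table row starts the drag.
    UILongPressGestureRecognizer *longPress =
        [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                      action:@selector(handleGesture:)];
    longPress.minimumPressDuration = 0.3;
    [self.tableView addGestureRecognizer:longPress];
}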

Detecting if a CGAffineTransformed view is out of bounds of a screen/UIView

I have several views that I can drag around, rotate, and scale. I want to make it so they can't be dragged, rotated, or scaled off the screen.
Dragging isn't an issue, since I'm not using a transform there: I can generate the new position and check whether it would put the view off the screen.
When I rotate or scale I use a CGAffineTransform (CGAffineTransformRotate or CGAffineTransformScale), and I can't seem to work out what the new frame would be without actually applying the transform to my view.
CGRect newElementBounds = CGRectApplyAffineTransform(element.bounds, CGAffineTransformScale([[gestureRecognizer view] transform], [gestureRecognizer scale], [gestureRecognizer scale]));
CGRect elementBoundsInSuperView = [element convertRect:newElementBounds toView:[element superview]];
elementBoundsInSuperView is not the rect I would expect it to be; it's way off.
I've also tried getting the bounds in the superview first and then applying the transform to that, and that's not right either:
CGRect elementBoundsInSuperView = [element convertRect:element.bounds toView:[element superview]];
CGRect newElementBounds = CGRectApplyAffineTransform(elementBoundsInSuperView, CGAffineTransformScale([[gestureRecognizer view] transform], [gestureRecognizer scale], [gestureRecognizer scale]));
([gestureRecognizer view] should be the same as element.)
I came up with some gesture handlers that keep the view you are manipulating from going off the area you specify. My view palette is defined by kscreenEditorSpace, 2048.
The pan gesture handler just calls the calcCenterFromXposition:yPosition:fromBoundsInSuperView: method to set the element's center; if the center would fall out of bounds, the method adjusts it and keeps the element in bounds.
//--------------------------------------------------------------------------------------------------------
// handlePanGesture
// Description: Called when the element is panned.
//              Moves the element while keeping it inside the editor area.
//
//--------------------------------------------------------------------------------------------------------
- (void)handlePanGesture:(UIPanGestureRecognizer *)gestureRecognizer {
    UIView *element = [gestureRecognizer view];
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
        [[self superview] bringSubviewToFront:self];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        // Front and center, Mr Element!
        // Find out where we are going
        CGPoint translation = [gestureRecognizer translationInView:[element superview]];
        CGRect elementBoundsInSuperView = [element convertRect:element.bounds toView:[element superview]];
        CGFloat xPosition = CGRectGetMidX(elementBoundsInSuperView) + translation.x;
        CGFloat yPosition = CGRectGetMidY(elementBoundsInSuperView) + translation.y;
        CGPoint newCenter = [self calcCenterFromXposition:xPosition yPosition:yPosition fromBoundsInSuperView:elementBoundsInSuperView];
        // Reposition ourselves
        [element setCenter:newCenter];
        // Set the translation back to the zero point
        [gestureRecognizer setTranslation:CGPointZero inView:[element superview]];
        [self setNeedsDisplay];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateEnded) {
    }
}
The pinch and rotation handlers are pretty similar. Instead of calling the calc-center method directly, they call a helper method to determine whether the transform would keep the element in bounds.
//--------------------------------------------------------------------------------------------------------
// handlePinchGesture
// Description: Called when the element is pinched.
//              Scales the element unless the new size would put it off the editor area.
//
//--------------------------------------------------------------------------------------------------------
- (void)handlePinchGesture:(UIPinchGestureRecognizer *)gestureRecognizer {
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
        [[self superview] bringSubviewToFront:self];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        BOOL aSelectedElementOffscreen = FALSE;
        if ([[gestureRecognizer view] pinchOffScreen:[gestureRecognizer scale]]) {
            aSelectedElementOffscreen = TRUE;
        }
        if (!aSelectedElementOffscreen) {
            [gestureRecognizer view].transform = CGAffineTransformScale([[gestureRecognizer view] transform], [gestureRecognizer scale], [gestureRecognizer scale]);
            // Update ourself
            [self contentSizeChanged];
        }
        [gestureRecognizer setScale:1];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateEnded) {
        if (![self becomeFirstResponder]) {
            NSLog(@" %@ - %@ - couldn't become first responder", INTERFACENAME, NSStringFromSelector(_cmd));
            return;
        }
    }
}
Pinch Off Screen Method
//--------------------------------------------------------------------------------------------------------
// pinchOffScreen
// Description: Called to see if the pinch gesture would cause the element to go off screen
//
//--------------------------------------------------------------------------------------------------------
- (BOOL)pinchOffScreen:(CGFloat)scale {
    // Save our current transform in case we go off screen
    CGAffineTransform elementOrigTransform = [self transform];
    // Apply our transform
    self.transform = CGAffineTransformScale([self transform], scale, scale);
    // Get our new bounds in the superview
    CGRect newElementBoundsInSuperView = [self convertRect:self.bounds toView:[self superview]];
    // Find out where we are in the superview
    CGFloat xPosition = CGRectGetMidX(newElementBoundsInSuperView);
    CGFloat yPosition = CGRectGetMidY(newElementBoundsInSuperView);
    // See if we are off the screen
    BOOL offScreen = [self calcOffEditorFromXposition:xPosition yPosition:yPosition fromBoundsInSuperView:newElementBoundsInSuperView];
    // We just wanted to check. Revert to where we were
    self.transform = elementOrigTransform;
    return offScreen;
}
The rotation handler is similar to pinch; a helper method checks whether the rotation would take the element off screen.
//--------------------------------------------------------------------------------------------------------
// handleRotationGesture
// Description: Called when we get a rotation gesture.
//              Rotates the element unless the rotation would put it off the editor area.
//
//--------------------------------------------------------------------------------------------------------
- (void)handleRotationGesture:(UIRotationGestureRecognizer *)gestureRecognizer {
    UIView *element = [gestureRecognizer view];
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan) {
        [[self superview] bringSubviewToFront:self];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateBegan || [gestureRecognizer state] == UIGestureRecognizerStateChanged) {
        BOOL aSelectedElementOffscreen = FALSE;
        if ([element rotateOffScreen:[gestureRecognizer rotation]]) {
            aSelectedElementOffscreen = TRUE;
        }
        if (!aSelectedElementOffscreen) {
            [gestureRecognizer view].transform = CGAffineTransformRotate([element transform], [gestureRecognizer rotation]);
            // Update ourself
            [self contentSizeChanged];
        }
        [gestureRecognizer setRotation:0];
    }
    if ([gestureRecognizer state] == UIGestureRecognizerStateEnded) {
    }
}
Rotate Off Screen method
//--------------------------------------------------------------------------------------------------------
// rotateOffScreen
// Description: Called to see if the rotation gesture would cause the element to go off screen
//
//--------------------------------------------------------------------------------------------------------
- (BOOL)rotateOffScreen:(CGFloat)rotation {
    // Save our current transform in case we go off screen
    CGAffineTransform elementOrigTransform = [self transform];
    // Apply our transform
    self.transform = CGAffineTransformRotate([self transform], rotation);
    // Get our new bounds in the superview
    CGRect newElementBoundsInSuperView = [self convertRect:self.bounds toView:[self superview]];
    // Find out where we are in the superview
    CGFloat xPosition = CGRectGetMidX(newElementBoundsInSuperView);
    CGFloat yPosition = CGRectGetMidY(newElementBoundsInSuperView);
    // See if we are off the screen
    BOOL offScreen = [self calcOffEditorFromXposition:xPosition yPosition:yPosition fromBoundsInSuperView:newElementBoundsInSuperView];
    // We just wanted to check. Revert to where we were
    self.transform = elementOrigTransform;
    return offScreen;
}
Calc Screen Positioning Helper Methods
#pragma mark -
#pragma mark === Calc Screen Positioning ===
#pragma mark
//--------------------------------------------------------------------------------------------------------
// calcCenterFromXposition: yPosition: fromBoundsInSuperView:
// Description: Calculate the center point in the element's superview from x, y,
//              clamped so the element stays inside the editor area
//
//--------------------------------------------------------------------------------------------------------
- (CGPoint)calcCenterFromXposition:(CGFloat)xPosition yPosition:(CGFloat)yPosition fromBoundsInSuperView:(CGRect)elementBoundsInSuperView {
    // Get the height/width based on the superview bounds
    CGFloat elementWidth = CGRectGetWidth(elementBoundsInSuperView);
    CGFloat elementHeight = CGRectGetHeight(elementBoundsInSuperView);
    // Determine our center.x from the new x
    if (xPosition < elementWidth / 2) {
        xPosition = elementWidth / 2;
    } else if (xPosition + elementWidth / 2 > kscreenEditorSpace) {
        xPosition = kscreenEditorSpace - elementWidth / 2;
    }
    // Determine our center.y from the new y
    if (yPosition < elementHeight / 2) {
        yPosition = elementHeight / 2;
    } else if (yPosition + elementHeight / 2 > kscreenEditorSpace) {
        yPosition = kscreenEditorSpace - elementHeight / 2;
    }
    return CGPointMake(xPosition, yPosition);
}
//--------------------------------------------------------------------------------------------------------
// calcOffEditorFromXposition: yPosition: fromBoundsInSuperView:
// Description: Determine whether moving the element to x, y would put it off the editor screen
//
//--------------------------------------------------------------------------------------------------------
- (BOOL)calcOffEditorFromXposition:(CGFloat)xPosition yPosition:(CGFloat)yPosition fromBoundsInSuperView:(CGRect)elementBoundsInSuperView {
    BOOL offScreen = NO;
    // Get the height/width based on the superview bounds
    CGFloat elementWidth = CGRectGetWidth(elementBoundsInSuperView);
    CGFloat elementHeight = CGRectGetHeight(elementBoundsInSuperView);
    // Off screen on the left
    if (xPosition < elementWidth / 2) {
        offScreen = YES;
    }
    // Off screen on the right
    if (xPosition + elementWidth / 2 > kscreenEditorSpace) {
        offScreen = YES;
    }
    // Off screen at the top
    if (yPosition < elementHeight / 2) {
        offScreen = YES;
    }
    // Off screen at the bottom
    if (yPosition + elementHeight / 2 > kscreenEditorSpace) {
        offScreen = YES;
    }
    return offScreen;
}
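As an aside on the original question: you can also estimate the prospective frame without applying and then reverting the transform, because a view's frame is essentially its bounds pushed through its transform and re-centered on view.center. A minimal sketch (my own, not part of the answer above; it assumes the default anchor point of (0.5, 0.5)):

// Returns the frame the view would have if the given transform were applied,
// expressed in the superview's coordinate space.
static CGRect ProspectiveFrame(UIView *view, CGAffineTransform transform) {
    CGRect transformedBounds = CGRectApplyAffineTransform(view.bounds, transform);
    return CGRectMake(view.center.x - transformedBounds.size.width / 2.0,
                      view.center.y - transformedBounds.size.height / 2.0,
                      transformedBounds.size.width,
                      transformedBounds.size.height);
}

// Example: would scaling by the pinch gesture's current scale keep the view on screen?
// CGAffineTransform t = CGAffineTransformScale(view.transform, scale, scale);
// BOOL onScreen = CGRectContainsRect(view.superview.bounds, ProspectiveFrame(view, t));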
