I am creating a notes application in which the user can add any number of text views. The views have borders (like the image below) so that the user can resize them.
I am thinking of creating a UIBezierPath, but I don't know how to add gesture recognizers to a UIBezierPath.
NOTE: I can't use UIView's maskView property because that turns the UITextViews into a single view. My UITextViews are movable via pan, rotate, and pinch gestures, and maskView doesn't allow that.
Please look at these 2 pictures to understand what I am trying to achieve:
The first picture is what I already have. The background image of the tiger is a UIImageView. On top of it I have 2 UITextViews, "Love" and "Yolo"; there can be more than 2 UITextViews. I am trying to convert the first picture to look like the second one. Basically, I want the UITextViews to show transparent, see-through text, while everything that is not a UITextView gets a grey color.
Now note the 3rd image, where the 2 UITextViews overlap each other. When they overlap, I want both UITextViews to be see-through, like this image:
How can I achieve this effect? I am just looking for guidelines on how to approach it.
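For reference, one common way to get this kind of knockout effect (not necessarily the only one) is to render a grey overlay image and erase each text view's glyphs with the .destinationOut blend mode. A minimal Swift sketch, assuming the text views have clear backgrounds and ignoring their pan/rotate/pinch transforms for brevity:

```swift
import UIKit

// Sketch only: builds a grey overlay with the text "punched out" so the
// photo underneath shows through. Function and parameter names are made up.
func knockoutOverlay(size: CGSize, textViews: [UITextView]) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { ctx in
        // 1. Cover everything with the grey tint.
        UIColor(white: 0.15, alpha: 0.75).setFill()
        ctx.fill(CGRect(origin: .zero, size: size))

        // 2. Erase the text. .destinationOut removes already-drawn pixels
        //    wherever new content is drawn, so overlapping text views simply
        //    erase the same area twice, which matches the 3rd image.
        ctx.cgContext.setBlendMode(.destinationOut)
        for textView in textViews {
            ctx.cgContext.saveGState()
            ctx.cgContext.translateBy(x: textView.frame.minX, y: textView.frame.minY)
            textView.layer.render(in: ctx.cgContext)
            ctx.cgContext.restoreGState()
        }
    }
}
```

The resulting image would sit in its own UIImageView above the tiger photo; the real text views stay in place for the pan/rotate/pinch gestures, and the overlay is re-rendered whenever they move.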
I have a map and I want to turn different regions of it into clickable elements. I know I could just slice up the map in Photoshop and turn each region into an individual button, but that feels a bit hacky, and I don't know whether the aspect ratio of everything would stay the same from device to device when I piece the puzzle back together. What is the best way to take a single image and divide it up into several complexly shaped clickable areas?
The most general-purpose solution is probably to make the entire view (image view) clickable by attaching a tap gesture recognizer to it and then interpreting the tap gesture.
I'd suggest creating a custom subclass of UIView that has an image view inside it, attaches a tap gesture recognizer, and responds to the messages from the tap gesture recognizer to figure out which region was tapped.
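As a rough Swift sketch of that idea (the class name, region dictionary, and callback below are made up for illustration; the region outlines are assumed to be UIBezierPaths defined in the view's own coordinate space):

```swift
import UIKit

final class RegionMapView: UIView {
    private let imageView = UIImageView()

    /// Region name -> outline, defined in this view's coordinate space.
    var regions: [String: UIBezierPath] = [:]
    var onRegionTapped: ((String) -> Void)?

    var image: UIImage? {
        get { imageView.image }
        set { imageView.image = newValue }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        imageView.frame = bounds
        imageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        addSubview(imageView)

        // One tap recognizer for the whole map; the handler decides which
        // region (if any) the tap landed in.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        addGestureRecognizer(tap)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: self)
        // Hit-test the tap point against each region's path.
        for (name, path) in regions where path.contains(point) {
            onRegionTapped?(name)
            return
        }
    }
}
```

If the view can end up at different sizes on different devices, the paths would need to be scaled to the current bounds before hit-testing, which keeps the regions aligned regardless of aspect ratio.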
Here's the scenario I am trying to implement:
I already have a view that lets the user draw doodles by touching inside the view directly (like a doodle canvas). This view implements touchesBegan, touchesMoved, and touchesEnded handlers to draw lines from the touch event parameters.
Now, instead of that, I want the user to be able to drag another UIView around on this canvas view and still draw lines, just as if they were touching it directly. For example, the user could drag a pen image view across the canvas view to draw pen-style lines.
In this case, how can I transfer the movement of this pen image view to the canvas so that the canvas recognizes it? One more question: if I want the canvas view to only recognize movement from dragging other views, rather than direct touches, what should I do?
(Sorry that this question is a little general; I just want some pointers.) Thanks!
A better way to look at the problem is:
How can I transfer movement on the canvas into the location of a pen image view?
That's easy. You already have all the code that keeps track of movement in the canvas (touchesBegan, touchesMoved, touchesEnded), so all you need to do is change the center property of the pen view to track the movement in the canvas. (Obviously, you'll need to apply small X and Y offsets to put the center of the pen view at the correct location.)
The only non-obvious detail that you need to be aware of is that the pen view must have userInteractionEnabled set to NO. That way, the pen view won't interfere with touches reaching the canvas view.
Note that UIImageView has user interaction disabled by default, so you don't need to do anything if the pen view is a UIImageView. However, if you're using a generic UIView to display the pen, then you need to disable user interaction either in the storyboard's Attributes inspector or in code, e.g. in viewDidLoad.
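A minimal sketch of that setup, assuming the pen image view is a subview of the canvas view (the asset name and the tip offsets are placeholders):

```swift
import UIKit

class DoodleCanvasView: UIView {
    // Hypothetical pen image; UIImageView already has user interaction
    // disabled by default, so touches fall through to the canvas.
    let penImageView = UIImageView(image: UIImage(named: "pen"))

    override init(frame: CGRect) {
        super.init(frame: frame)
        penImageView.isUserInteractionEnabled = false  // explicit, for clarity
        addSubview(penImageView)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        // ... existing line-drawing code for the doodle goes here ...

        // Offset so the pen's tip, rather than its center, sits on the line.
        penImageView.center = CGPoint(x: point.x + 20, y: point.y - 20)
    }
}
```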
Essentially, what I'm trying to do is use the gesture functionality demonstrated below:
https://www.youtube.com/watch?v=tG3lzBDMRQQ
http://www.vogella.com/articles/AndroidGestures/article.html
Except instead of just setting a color, I want to be able to add a variety of visual effects to the lines drawn during a gesture motion.
E.g. pulsating thickness, color changes, particle effects like a sparkler firework, etc.
Where would one start in attempting such a venture?
Edit: One method I'm considering is to set the gesture color to transparent but have a separate listener for touches, as in some paint-type apps, so that it simultaneously creates the gesture and draws the proper image over top of it. Would this work? Can the screen listen for input from two views at once?
GestureOverlayView is a normal view that has the capability to draw on screen. You can simply extend GestureOverlayView and add your custom effects: set a custom paint style, or override dispatchTouchEvent() to add your own effects while it is drawing.
I am working on a project in which I have added a UIImageView to show an image selected by the user. The requirement is that if the user wants to crop the image, they can touch the image along the desired crop path, and while they are cropping, a line should be drawn wherever they have touched. I know the UITouch class is used for this, but I am unable to get it working.
A couple of possible solutions.
1) Create a custom UIView that contains a UIImageView as its subview. Make the UIImageView the same size as the custom UIView. Then add code to detect the touches on the custom UIView and draw the lines on top of the UIImageView subview.
2) Create a custom clone of UIImageView that draws the image first as the 'background' and then detects the touches and draws the lines on top of the image.
Also make sure you have userInteractionEnabled set to YES on the view receiving the touches.
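As a rough illustration of the second option in Swift (the class name, line color, and width are just placeholders):

```swift
import UIKit

final class CropLineView: UIView {
    var image: UIImage? { didSet { setNeedsDisplay() } }
    private let path = UIBezierPath()

    override init(frame: CGRect) {
        super.init(frame: frame)
        isUserInteractionEnabled = true   // required to receive the touches
        backgroundColor = .clear
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        path.move(to: point)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        path.addLine(to: point)
        setNeedsDisplay()   // redraw the line over the image
    }

    override func draw(_ rect: CGRect) {
        // 1. Image first, as the 'background'.
        image?.draw(in: bounds)
        // 2. Then the crop line the user has traced so far.
        UIColor.red.setStroke()
        path.lineWidth = 2
        path.stroke()
    }
}
```

(For the first option, note that a view's own drawn content renders below its subviews, so the line would go in a separate transparent overlay view or layer added above the image view.)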