How can I modify the Android Gesture visuals (for on screen gesture detection)? - android-view

Essentially what I'm trying to do is use this gesture functionality as demonstrated below
https://www.youtube.com/watch?v=tG3lzBDMRQQ
http://www.vogella.com/articles/AndroidGestures/article.html
Except instead of just setting a color, I want to be able to add a variety of visual effects to the lines drawn during a gesture motion.
E.g. pulsating thickness, changing colors, particle effects like a sparkler firework, etc.
Where would one start in attempting such a venture?
edit: One method I'm considering is to set the gesture color to transparent, but have a separate listener for touches as in some paint-type apps, so that it simultaneously creates the gesture and draws the proper image on top of it. Would this work? Can the screen be listening for input from two views at once?

GestureOverlayView is a normal view that can draw on screen. You can simply extend GestureOverlayView and add your own effects: set a custom paint style, or override dispatchTouchEvent() to apply your effects while the gesture is being drawn.

Related

How to make different regions of image into buttons (swift/xcode)?

I have a map and I want to turn different regions of it into clickable elements. I know I could just slice up the map in Photoshop and turn each region into a button individually, but that feels a bit hacky to me, and I don't know if the aspect ratio of everything would stay the same from device to device when I piece the puzzle back together. What is the best way to take a single image and divide it up into several complexly shaped clickable areas?
The most general-purpose solution is probably to make the entire view (image view) clickable by attaching a tap gesture recognizer to it and then interpreting the tap gesture.
I'd suggest creating a custom subclass of UIView that has an image view inside it, attaches a tap gesture recognizer, and responds to the messages from the tap gesture recognizer to figure out which region was tapped.
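A minimal Swift sketch of that approach; the region names and their normalized rectangles are invented for illustration, and the real ones would come from your map:

import UIKit

final class RegionMapView: UIView {

    private let imageView = UIImageView()

    // Regions expressed in unit coordinates (0...1) so they scale with the view.
    // These two entries are placeholders.
    private let regions: [String: CGRect] = [
        "north": CGRect(x: 0.0, y: 0.0, width: 1.0, height: 0.5),
        "south": CGRect(x: 0.0, y: 0.5, width: 1.0, height: 0.5)
    ]

    // Called with the name of the tapped region, if any.
    var onRegionTapped: ((String) -> Void)?

    override init(frame: CGRect) {
        super.init(frame: frame)
        setUp()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setUp()
    }

    private func setUp() {
        imageView.frame = bounds
        imageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        addSubview(imageView)

        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: self)
        // Convert the tap into unit coordinates so it matches the normalized regions.
        let unitPoint = CGPoint(x: point.x / bounds.width, y: point.y / bounds.height)
        if let hit = regions.first(where: { $0.value.contains(unitPoint) }) {
            onRegionTapped?(hit.key)
        }
    }
}

Because the regions are stored in unit coordinates, the hit areas scale with the view, so the proportions stay correct from device to device.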

Divide an Image to a clickable parts

I'm trying to divide one image into more than one clickable part. For example, if the image is of a body and I tap the head, it should take me to the HeadViewController, but if I tap the left hand, it should take me to a different view controller.
Any idea how to do that?
Easy method:
Add UIButtons on top of the image with a clear background color. You can do this with Auto Layout and always get correct proportions for the areas when scaling up and down.
Hard method:
Add a UITapGestureRecognizer to the UIImageView and work out from the received touch point (a CGPoint) which region was hit. This is more complicated and the regions must be calculated correctly.
I suggest the first method.
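A possible Swift sketch of the easy method; the asset name, the single "head" hot area, and its width/height multipliers are made up for illustration:

import UIKit

// Stub standing in for the controller named in the question.
final class HeadViewController: UIViewController {}

final class BodyViewController: UIViewController {

    private let bodyImageView = UIImageView(image: UIImage(named: "body")) // hypothetical asset

    override func viewDidLoad() {
        super.viewDidLoad()

        bodyImageView.isUserInteractionEnabled = true
        bodyImageView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(bodyImageView)
        NSLayoutConstraint.activate([
            bodyImageView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            bodyImageView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            bodyImageView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            bodyImageView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])

        // An invisible button over the "head" area; its size is expressed as a
        // fraction of the image view so the proportions hold on every device.
        let headButton = UIButton(type: .custom)
        headButton.backgroundColor = .clear
        headButton.translatesAutoresizingMaskIntoConstraints = false
        headButton.addTarget(self, action: #selector(headTapped), for: .touchUpInside)
        bodyImageView.addSubview(headButton)
        NSLayoutConstraint.activate([
            headButton.topAnchor.constraint(equalTo: bodyImageView.topAnchor),
            headButton.centerXAnchor.constraint(equalTo: bodyImageView.centerXAnchor),
            headButton.widthAnchor.constraint(equalTo: bodyImageView.widthAnchor, multiplier: 0.3),
            headButton.heightAnchor.constraint(equalTo: bodyImageView.heightAnchor, multiplier: 0.2)
        ])
    }

    @objc private func headTapped() {
        navigationController?.pushViewController(HeadViewController(), animated: true)
    }
}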
Attach a tap gesture recognizer to your image view. Set user interaction enabled to true.
In the handler for the tap gesture, fetch the coordinates of the user's tap and write custom code that figures out which "hot box" the user tapped in.
Alternately you could create a custom subclass of UIGestureRecognizer that has multiple tap regions.
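If you go the custom-recognizer route, a rough Swift sketch might look like the following; the class name and the idea of handing it an array of regions are assumptions, not an existing API:

import UIKit
// Needed in Swift to override touch handling and write to `state`
// in a UIGestureRecognizer subclass.
import UIKit.UIGestureRecognizerSubclass

// A tap recognizer that only fires when the tap lands inside one of the
// supplied regions (in the attached view's coordinate space).
final class RegionTapGestureRecognizer: UITapGestureRecognizer {

    var regions: [CGRect] = []

    // The region the successful tap landed in, if any.
    private(set) var tappedRegionIndex: Int?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        guard let view = view, let point = touches.first?.location(in: view) else {
            state = .failed
            return
        }
        if let index = regions.firstIndex(where: { $0.contains(point) }) {
            tappedRegionIndex = index
            super.touchesBegan(touches, with: event)
        } else {
            // Outside every hot box: refuse to recognize so other recognizers can run.
            state = .failed
        }
    }
}

You would attach it to the image view (with user interaction enabled) like any other recognizer and read tappedRegionIndex in the action method.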

UIScrollView zoom just with 2 Fingers

I want to build an app that lets you draw on an image while zoomed in.
I have working code that lets you draw on an image, but the problem is that I also want to zoom in. I need to find a way to zoom and move around with two fingers and draw with one finger.
I tried:
self.scrollView.panGestureRecognizer.minimumNumberOfTouches = 2;
But the problem is that this disables the drawing.
I also tried to add a UIPanGestureRecognizer, but that won't fit the framework I prefer to use.
So I am asking for a way to make the scroll view ignore one-finger gestures, or for a drawing framework that supports zooming.
Set your UIViewController (or UIView, whatever you use for presenting) as the delegate of your recognizers. Then implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: and return YES for your case, or always if you don't have other recognizers.
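In Swift, that might look roughly like the sketch below; it assumes your drawing is driven by a one-finger pan recognizer you own (the outlet names and the handleDraw method are placeholders), and it keeps the two-finger setting on the scroll view's built-in pan recognizer from the question:

import UIKit

final class DrawingViewController: UIViewController, UIGestureRecognizerDelegate {

    @IBOutlet var scrollView: UIScrollView!   // assumed to be wired up in the storyboard
    @IBOutlet var canvasView: UIView!         // the view your drawing code lives in

    // Hypothetical one-finger recognizer that drives the drawing code.
    private lazy var drawRecognizer: UIPanGestureRecognizer = {
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handleDraw(_:)))
        pan.minimumNumberOfTouches = 1
        pan.maximumNumberOfTouches = 1
        pan.delegate = self
        return pan
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Two-finger pan scrolls, as in the question.
        scrollView.panGestureRecognizer.minimumNumberOfTouches = 2
        canvasView.addGestureRecognizer(drawRecognizer)
    }

    @objc private func handleDraw(_ gesture: UIPanGestureRecognizer) {
        let point = gesture.location(in: canvasView)
        // Feed `point` into your existing drawing code here.
        _ = point
    }

    // Let the drawing recognizer and the scroll view's recognizers run together.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }
}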

Custom UISlider with pips ios

Wondering if the above can be created using UISlider? If not, what other ways can this be accomplished?
You can customize components of a UISlider, such as the currentThumbImage (see: "Appearance of Sliders").
However, it is almost certainly easier to just re-implement a slider for this much customization. Simply use a background UIImageView with the scale image, then add a separate UIView (or UIImageView) for the arrow. Finally, attach a UIPanGestureRecognizer to the arrow view so the user can drag it vertically.
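A rough Swift sketch of that re-implementation, assuming a vertical ruler; the image names, the arrow size, and the 0...1 value range are placeholders:

import UIKit

final class VerticalSliderView: UIView {

    private let rulerImageView = UIImageView(image: UIImage(named: "ruler")) // hypothetical asset
    private let arrowImageView = UIImageView(image: UIImage(named: "arrow")) // hypothetical asset

    // 0.0 at the bottom of the ruler, 1.0 at the top.
    private(set) var value: CGFloat = 0.0

    override init(frame: CGRect) {
        super.init(frame: frame)
        setUp()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setUp()
    }

    private func setUp() {
        rulerImageView.frame = bounds
        rulerImageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        addSubview(rulerImageView)

        arrowImageView.frame = CGRect(x: 0, y: bounds.height - 30, width: 30, height: 30)
        arrowImageView.isUserInteractionEnabled = true
        addSubview(arrowImageView)

        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        arrowImageView.addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: self)
        // Move the arrow vertically only, clamped to the slider's bounds.
        var y = arrowImageView.center.y + translation.y
        y = max(0, min(bounds.height, y))
        arrowImageView.center.y = y
        gesture.setTranslation(.zero, in: self)

        value = 1.0 - y / bounds.height
    }
}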
You can change a lot in the appearance of a UISlider, like setting the thumb to a red arrow. You can also replace the background image with the inches ruler, using different rulers for the different device types and display sizes.
The one thing that I don't see is how you would turn the slider to work vertically; I only know of them working left to right.
If I'm right, your only option is to have the ruler as a background image and a view that contains the arrow and a label with the actual value. That whole view can be panned and tapped using gesture recognizers.

iOS touch event triggering by movement of other view

Here's the scenario I am trying to implement:
I already have a view that lets the user draw doodles by touching inside the view directly (like a doodle canvas). This view implements touchesBegan, touchesMoved, and touchesEnded handlers to draw lines from the touch event parameters.
Now instead of that, I want the user to be able to drag and move another UIView over this canvas view and still draw lines just as if they were touching it directly. For example, the user could drag a pen image view around the canvas view to draw pen-style lines.
In this case, how can I transfer the movement of this pen image view to the canvas so that it recognizes it? One more question: if I want this canvas view to respond only to the dragging of other views rather than to direct touches, what should I do?
(Sorry this question is a little general, I just want some pointers)... Thanks!
A better way to look at the problem is:
How can I transfer movement on the canvas into the location of a pen image view?
That's easy. You already have all the code that keeps track of movement in the canvas (touchesBegan, touchesMoved, touchesEnded), so all you need to do is change the center property of the pen view to track the movement in the canvas. (Obviously, you'll need to apply small X and Y offsets to put the center of the pen view at the correct location.)
The only non-obvious detail that you need to be aware of is that the pen view must have userInteractionEnabled set to NO. That way, the pen view won't interfere with touches reaching the canvas view.
Note that UIImageView has user interaction disabled by default, so you don't need to do anything if the pen view is a UIImageView. However, if you're using a generic UIView to display the pen, then you need to disable user interaction in storyboard under the Attributes inspector or disable it in code, e.g. in viewDidLoad.
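A small Swift sketch of that setup; the pen asset name and tip offsets are illustrative, and the actual line-drawing code is elided:

import UIKit

final class CanvasView: UIView {

    // The pen image the user "drags" around the canvas.
    let penImageView = UIImageView(image: UIImage(named: "pen")) // hypothetical asset

    override init(frame: CGRect) {
        super.init(frame: frame)
        setUpPen()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setUpPen()
    }

    private func setUpPen() {
        // Already the default for UIImageView, but shown for clarity:
        // touches must reach the canvas, not get swallowed by the pen view.
        penImageView.isUserInteractionEnabled = false
        addSubview(penImageView)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }

        // ... existing line-drawing code would use `point` here ...

        // Offset so the pen's tip, not its center, sits on the touch point.
        penImageView.center = CGPoint(x: point.x + penImageView.bounds.width / 2,
                                      y: point.y - penImageView.bounds.height / 2)
    }
}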
