Creating a custom UITableView index on iOS

Our designers have come up with a nice design that calls for the index on the right side of the UITableView to be taller than the UITableView itself. It also has custom images for search and a better font/color than the default implementation. However, since UITableView does not support customizing its index (that I know of), I've been trying to build this manually. The UI looks like this:
This should be a simple task, but I'm having a hell of a time making it work. I have set up a UIView to the right of the table with UIButtons lined up from top to bottom to represent my index. The goal is that as the user drags in and out of the UIButtons, I jump the UITableView to the corresponding section.
I have searched around, and it seems the way to do this is to listen for the UIControlEventTouchDragOutside and UIControlEventTouchDragEnter events to know when the touch moves in and out of one of the UIButtons.
To this end, I have set up the entire list of buttons as an IBOutletCollection, which I initialize in viewDidLoad like so:
@property (retain, nonatomic) IBOutletCollection(UIButton) NSArray *indexButtons;

@implementation ViewBeerListViewController
...
@synthesize indexButtons = _indexButtons;

- (void)viewDidLoad
{
    [super viewDidLoad];
    ...
    // Go through the index buttons and set the change function
    for (int i = 0; i < [self.indexButtons count]; ++i)
    {
        UIButton *button = [self.indexButtons objectAtIndex:i];
        [button addTarget:self action:@selector(touchDragOutside:) forControlEvents:UIControlEventTouchDragOutside];
        [button addTarget:self action:@selector(touchDragEnter:) forControlEvents:UIControlEventTouchDragEnter];
    }
}
The functions touchDragOutside and touchDragEnter look like this:
- (IBAction)touchDragOutside:(UIButton *)sender
{
    NSLog(@"Button %@ touch dragged outside", [sender titleLabel].text);
}

- (IBAction)touchDragEnter:(UIButton *)sender
{
    NSLog(@"Button %@ touch dragged enter", [sender titleLabel].text);
}
This all builds and runs. However, it appears that I only ever get events for the button on which the touch started. For example, if I touch down on the letter "G" and start dragging up and down, I only see logs for dragging out of "G" and back into "G"; I get no events for any of the other UIButtons as the touch passes over them.
Any help with resolving this issue will be immensely appreciated. I've been stuck on what seems to be a very trivial problem for a very long time.
Thanks!

Instead of making them with UIButtons, try creating a custom UIView that contains UILabels with each letter. Then in the custom UIView class, override the following:
– touchesBegan:withEvent:
– touchesMoved:withEvent:
– touchesEnded:withEvent:
and use those to determine which labels are being touched. Within touchesMoved, for example, you could have:
UITouch *touch = [touches anyObject];
CGPoint point = [touch locationInView:self];

for (UIView *indexView in self.indexViews) {
    if (CGRectContainsPoint(indexView.frame, point)) {
        // indexView was touched. do something here
        break;
    }
}
Note that you can also use a UIImageView instead of a UILabel for the item at the top (the search icon), and this will still work.
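Building on that answer, here is a more complete sketch of the idea. None of the names below come from the original answer: the IndexView class, its delegate protocol, and the indexViews property are illustrative, and the property attributes assume manual reference counting to match the question's code. The view tracks touches across its labels and reports the touched position to a delegate, which can then scroll the table.

@protocol IndexViewDelegate <NSObject>
- (void)indexView:(UIView *)indexView didSelectSection:(NSInteger)section;
@end

@interface IndexView : UIView
@property (nonatomic, assign) id<IndexViewDelegate> delegate;
@property (nonatomic, retain) NSArray *indexViews;   // the UILabels (and the UIImageView at the top)
@end

@implementation IndexView

- (void)handleTouches:(NSSet *)touches
{
    CGPoint point = [[touches anyObject] locationInView:self];
    [self.indexViews enumerateObjectsUsingBlock:^(UIView *indexView, NSUInteger idx, BOOL *stop) {
        if (CGRectContainsPoint(indexView.frame, point)) {
            // Report which entry the finger is over; the controller maps it to a section.
            [self.delegate indexView:self didSelectSection:idx];
            *stop = YES;
        }
    }];
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { [self handleTouches:touches]; }
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event { [self handleTouches:touches]; }

@end

In the table view controller, the delegate method could then call scrollToRowAtIndexPath:atScrollPosition:animated: with row 0 of the selected section to mimic the built-in index behaviour.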

Related

iOS: Sorting an array of UIButtons based on their position in relation to their siblings?

I have an NSMutableArray, called self.tappableButtons, of UIButtons which all share the same parent view. At times they overlap, and I would love to sort my array based on their order relative to each other (starting with the topmost button and ending with the bottommost one in the hierarchy). What's the easiest way to achieve this?
I'm basically using fast enumeration to check whether the current touch in my custom pan gesture recognizer falls within a button's bounds. The problem is that if two buttons overlap, the loop returns whichever one comes first in my array, rather than the topmost button in the overlap.
// Search through all our possible buttons
for (UIButton *button in self.tappableButtons) {
    // Check if our tap falls within a button's view
    if (CGRectContainsPoint(button.bounds, tapLocation)) {
        return button;
    }
}
The easiest way (without knowing more of your code) is probably to use the subview order to determine the topmost button under your touch.
UIView *buttonParentView = ....
NSEnumerator *topToBottom = [buttonParentView.subviews reverseObjectEnumerator];

for (id theView in topToBottom) {
    if (![self.tappableButtons containsObject:theView]) {
        continue;
    }
    UIButton *button = (UIButton *)theView;
    // Check if our tap falls within a button's view
    if (CGRectContainsPoint(button.bounds, tapLocation)) {
        return button;
    }
}
If this function is executed a lot and your self.tappableButtons remains relatively stable, or your parent view has a lot of subviews that are not in self.tappableButtons, it's probably better to keep your original loop but sort tappableButtons first based on where the buttons appear in their parent's subviews, as sketched below.
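A minimal sketch of that sorting step, assuming all buttons share the same superview (buttonParentView is the same placeholder used above):

// Sort tappableButtons so the topmost button (the last one in its superview's subviews) comes first.
NSArray *subviews = buttonParentView.subviews;
[self.tappableButtons sortUsingComparator:^NSComparisonResult(UIButton *a, UIButton *b) {
    NSUInteger indexA = [subviews indexOfObjectIdenticalTo:a];
    NSUInteger indexB = [subviews indexOfObjectIdenticalTo:b];
    // A higher subview index means the button is drawn later, i.e. on top, so it sorts first.
    if (indexA > indexB) return NSOrderedAscending;
    if (indexA < indexB) return NSOrderedDescending;
    return NSOrderedSame;
}];

After this, the original fast-enumeration loop returns the topmost overlapping button automatically.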

Dragging views on a scroll view: touchesBegan is received, but no touchesEnded or touchesCancelled

As an iOS programming newbie I am struggling with a word game for iPhone.
The app structure is: scrollView -> contentView -> imageView -> image 1000 x 1000 (here fullscreen):
I think I have finally understood how to use a UIScrollView with Auto Layout enabled in Xcode 5.1:
I just specify enough constraints for the contentView (its 1000 x 1000 dimensions, plus zero spacing to the parent), and this defines the _scrollView.contentSize (I don't have to set it explicitly). After that my game board scrolls and zooms just fine.
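For illustration only, a programmatic equivalent of the constraint setup described above might look like this; the asker set these constraints up in Interface Builder, and _scrollView and _contentView are simply the names used in the question:

// Sketch of the constraint setup, not the asker's actual code.
_contentView.translatesAutoresizingMaskIntoConstraints = NO;
NSDictionary *views = @{ @"content" : _contentView };

// Pin the content view to all four edges of the scroll view and give it a fixed 1000 x 1000 size.
[_scrollView addConstraints:[NSLayoutConstraint constraintsWithVisualFormat:@"H:|[content(1000)]|"
                                                                    options:0 metrics:nil views:views]];
[_scrollView addConstraints:[NSLayoutConstraint constraintsWithVisualFormat:@"V:|[content(1000)]|"
                                                                    options:0 metrics:nil views:views]];

// With the content view's edges pinned and its size fixed, UIScrollView derives its
// contentSize from the constraints, which is why it never needs to be set explicitly.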
However, I have trouble with my draggable letter tiles, implemented in Tile.m.
I use touchesBegan, touchesMoved, touchesEnded, and touchesCancelled rather than gesture recognizers (as often suggested by StackOverflow users), because I display a larger letter tile image with a shadow (the bigImage) on touchesBegan.
My dragging is implemented in the following way:
In touchesBegan I remove the tile from the contentView (and add it to the main app view) and display bigImage with a shadow.
In touchesMoved I move the tile.
In touchesEnded or touchesCancelled I display smallImage with a shadow again and either add the tile back to the contentView or leave it in the main view (if the tile is at the bottom of the app).
My problem:
Mostly this works, but sometimes (often) I see that only touchesBegan was called, but the other touchesXXXX methods are never called:
2014-03-22 20:20:20.244 ScrollContent[8075:60b] -[Tile touchesBegan:withEvent:]: Tile J 10 {367.15002, 350.98877} {57.599998, 57.599998}
Instead the scrollView is scrolled by the finger, underneath the big tile.
This results in many big tiles with shadows sitting on the screen of my app, while the scroll view is being dragged underneath them:
How to fix this please?
I know for sure that my structure of the app (with custom UIViews dragged in/out of a UIScrollView) is possible - by looking at popular word games.
I use tile.exclusiveTouch = YES and a custom hitTest method for the contentView - but this doesn't help:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *result = [super hitTest:point withEvent:event];
    return result == self ? nil : result;
}
UPDATE 1:
I've tried adding the following code to handleTileTouched:
_contentView.userInteractionEnabled = NO;
_scrollView.userInteractionEnabled = NO;
_scrollView.scrollEnabled = NO;
and then set it back to YES in handleTileReleased of ViewController.m - but this does not help and also looks more like a hack to me.
UPDATE 2:
Having read probably everything related to UIScrollView, hitTest:withEvent: and pointInside:withEvent: on the web (for example, Hacking the responder chain and Matt Neuburg's Programming iOS book), StackOverflow, and Safari, it seems to me that a solution would be to implement the hitTest:withEvent: method for the main view of my app:
If a Tile object is hit, it should be returned. Otherwise - the scrollView should be returned.
Unfortunately, this doesn't work - I am probably missing something minor.
And I am sure that a good solution exists, judging by popular word games for iOS. For example, dragging and placement of letter tiles works very smoothly in Zynga's Words with Friends® app, and in the screenshots below you can see that they are probably using a UIScrollView (the scroll bars are visible in the corner) and displaying a tile shadow (probably in the touchesBegan method):
UPDATE 3:
I've created a new project to test the gesture recognizer approach suggested by TomSwift, and it shows the problem I have with gesture recognizers: the tile size changes too late. It happens when the user starts moving the tile, not at the moment he touches it:
The problem here is that removing a view from the view hierarchy confuses the system and the touch is lost. The gesture recognizer approach has the same issue (internally, gesture recognizers use the same touchesBegan: API).
https://github.com/LeoNatan/ios-newbie/commit/4cb13ea405d9f959f4d438d08638e1703d6c0c1e
(I created a pull request.)
What I changed was to not remove the tile from the content view when touches begin, but only to move it when touches end or are cancelled. But this creates a problem: when dragging to the bottom, the tile is hidden below the view (because the scroll view clips to its bounds). So I created a cloned tile, added it as a subview of the view controller's view, and moved it together with the original tile. When touches end, I remove the cloned tile and place the original where it should go.
This is necessary because the bottom bar is not part of the scroll view hierarchy. If it were, the entire tile cloning would not be necessary.
I also streamlined the positioning of tiles quite a bit.
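This is not the code from the linked commit, just a rough sketch of the cloning idea, assuming ARC. It also assumes Tile is the UIView subclass from the question, that bigImage is the larger tile image the question mentions, and that dragClone is a hypothetical UIImageView property added to Tile for the duration of a drag:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Leave the real tile in the content view so the touch sequence isn't interrupted,
    // and add a visual clone above the scroll view so it isn't clipped.
    UIView *rootView = self.window.rootViewController.view;
    self.dragClone = [[UIImageView alloc] initWithImage:self.bigImage];
    self.dragClone.center = [rootView convertPoint:self.center fromView:self.superview];
    [rootView addSubview:self.dragClone];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Move the real (possibly clipped) tile and keep the clone on top of it.
    self.center = [[touches anyObject] locationInView:self.superview];
    self.dragClone.center = [self.superview convertPoint:self.center toView:self.dragClone.superview];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Drop the clone; this is where the real tile gets reparented (content view vs. bottom bar).
    [self.dragClone removeFromSuperview];
    self.dragClone = nil;
}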
You could set userInteractionEnabled on the scroll view to NO while you are dragging the tile, and set it back to YES when the tile drag ends.
You should really try using a gesture recognizer instead of the raw touchesBegan/touchesMoved. I say this because UIScrollView is using gesture recognizers and by default these will cede to any higher-level gesture recognizer that is running.
I put together a sample that has a UIScrollView with an embedded UIImageView. As with your screenshot, below the scrollView I have some UIButton "Tiles", which I subclassed as TSTile objects. The only reason I did this was to expose some NSLayoutConstraints to access/alter their height/width (since you're using auto layout vs. frame manipulation). The user can drag tiles from their starting place into the scroll view.
This seems to work well; I didn't hook up the ability to drag a tile once it is re-parented in the scrollview. But that shouldn't be too hard. For that you might consider placing a long-tap gesture recognizer in each tile, then when it fires you would turn off scrolling in the scrollview, such that the top-level pan gesture recognizer would kick in.
Or, you might be able to subclass the UIScrollView and intercept the UIScrollView's pan-gesture-recognizer delegate callbacks to hinder panning when the user starts from a tile.
@interface TSTile : UIButton

// hook these up to width/height constraints in your storyboard!
@property (nonatomic, readonly) IBOutlet NSLayoutConstraint *widthConstraint;
@property (nonatomic, readonly) IBOutlet NSLayoutConstraint *heightConstraint;

@end

@implementation TSTile

@synthesize widthConstraint, heightConstraint;

@end
@interface TSViewController () <UIScrollViewDelegate, UIGestureRecognizerDelegate>
@end

@implementation TSViewController
{
    IBOutlet UIImageView* _imageView;
    TSTile* _dragTile;
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    UIPanGestureRecognizer* pgr = [[UIPanGestureRecognizer alloc] initWithTarget: self action: @selector( pan: )];
    pgr.delegate = self;
    [self.view addGestureRecognizer: pgr];
}

- (UIView*) viewForZoomingInScrollView:(UIScrollView *)scrollView
{
    return _imageView;
}

- (BOOL) gestureRecognizerShouldBegin:(UIGestureRecognizer *)gestureRecognizer
{
    CGPoint pt = [gestureRecognizer locationInView: self.view];
    UIView* v = [self.view hitTest: pt withEvent: nil];
    return [v isKindOfClass: [TSTile class]];
}

- (void) pan: (UIGestureRecognizer*) gestureRecognizer
{
    CGPoint pt = [gestureRecognizer locationInView: self.view];

    switch ( gestureRecognizer.state )
    {
        case UIGestureRecognizerStateBegan:
        {
            NSLog( @"pan start!" );

            _dragTile = (TSTile*)[self.view hitTest: pt withEvent: nil];

            [UIView transitionWithView: self.view
                              duration: 0.4
                               options: UIViewAnimationOptionAllowAnimatedContent
                            animations:^{
                                _dragTile.widthConstraint.constant = 70;
                                _dragTile.heightConstraint.constant = 70;
                                [self.view layoutIfNeeded];
                            }
                            completion: nil];
        }
        break;

        case UIGestureRecognizerStateChanged:
        {
            NSLog( @"pan!" );
            _dragTile.center = pt;
        }
        break;

        case UIGestureRecognizerStateEnded:
        {
            NSLog( @"pan ended!" );

            pt = [gestureRecognizer locationInView: _imageView];

            // reparent:
            [_dragTile removeFromSuperview];
            [_imageView addSubview: _dragTile];

            // animate:
            [UIView transitionWithView: self.view
                              duration: 0.25
                               options: UIViewAnimationOptionAllowAnimatedContent
                            animations:^{
                                _dragTile.widthConstraint.constant = 40;
                                _dragTile.heightConstraint.constant = 40;
                                _dragTile.center = pt;
                                [self.view layoutIfNeeded];
                            }
                            completion:^(BOOL finished) {
                                _dragTile = nil;
                            }];
        }
        break;

        default:
            NSLog( @"pan other!" );
            break;
    }
}

@end
I also think you should use a UIGestureRecognizer, and more precisely a UILongPressGestureRecognizer on each tile that, once recognized, handles the pan.
For fine-grained control you can still use the recognizer's delegate.
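A minimal sketch of that suggestion. The handler name handleTilePress:, the tile variable, and the _scrollView ivar are assumptions; minimumPressDuration is set to 0 so the recognizer begins on touch-down, which also addresses the "tile size changes too late" problem from UPDATE 3:

// When creating each tile (for example in viewDidLoad):
UILongPressGestureRecognizer *lpgr =
    [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleTilePress:)];
lpgr.minimumPressDuration = 0.0;   // begin on touch-down, like touchesBegan
[tile addGestureRecognizer:lpgr];

// A long-press recognizer keeps tracking after it begins, so it can drive the whole drag:
- (void)handleTilePress:(UILongPressGestureRecognizer *)gr
{
    UIView *tile = gr.view;
    CGPoint pt = [gr locationInView:self.view];

    switch (gr.state) {
        case UIGestureRecognizerStateBegan:
            _scrollView.scrollEnabled = NO;     // keep the scroll view from stealing the drag
            // swap in the big tile image here
            break;
        case UIGestureRecognizerStateChanged:
            tile.center = [self.view convertPoint:pt toView:tile.superview];
            break;
        case UIGestureRecognizerStateEnded:
        case UIGestureRecognizerStateCancelled:
            _scrollView.scrollEnabled = YES;
            // swap back to the small tile image and reparent the tile if needed
            break;
        default:
            break;
    }
}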

Ignoring Touch Events of embedded UIButtons when scrolling UICollectionView

I have a custom UICollectionViewCell that has a few custom UIView objects residing inside it. Each of these UIViews has a UIButton which responds to Touch Down and Touch Up Inside, linked via IBActions. Basically, I want these buttons to shrink when pressed down and spring back to their original size when let go. I can easily accomplish this with the controls, and the press down and press up work. However, the problem I am facing happens when scrolling is introduced into the mix. The UICollectionView these cells are a part of is a scrolling one. If I happen to touch a button as I start my scroll, the Touch Down event is triggered as well as the scrolling event of the UICollectionView. If I recall correctly, this was never the case pre-iOS 7: when a scrolling event was started, the UIButton event wasn't fired; I think it had to do with delaysContentTouches. This looks to be broken or changed now. It actually still works decently on iPhone, just not on iPad. If I scroll my view on iPad, with my touch starting inside the embedded UIButton, the button will shrink and the button's action will be fired off.
So to restate the issue as plainly as I can: is there any way to ignore touches on embedded UIButtons when scrolling is occurring? Touches work fine when there is no scrolling triggered; I just don't want the events to fire if the user is indeed scrolling. Is there any workaround?
If you need any more specific details, I would be happy to help you understand.
You need to subclass the scroll view (UICollectionView or UITableView) and override:
- (BOOL)touchesShouldCancelInContentView:(UIView *)view {
    if ([view isKindOfClass:UIButton.class]) {
        return YES;
    }
    return [super touchesShouldCancelInContentView:view];
}
Swift:
override func touchesShouldCancelInContentView(view: UIView) -> Bool {
    if view is UIButton {
        return true
    }
    return super.touchesShouldCancelInContentView(view)
}
That's it. Now you can scroll over a button without losing the button's tap event.
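For completeness, a sketch of where the Objective-C override lives; the class name here is illustrative, and you would set it as the custom class of your collection view in the storyboard:

// Hypothetical UICollectionView subclass. Note that touchesShouldCancelInContentView: is only
// consulted when canCancelContentTouches is YES (the default for UIScrollView).
@interface ButtonFriendlyCollectionView : UICollectionView
@end

@implementation ButtonFriendlyCollectionView

- (BOOL)touchesShouldCancelInContentView:(UIView *)view {
    // Allow a drag that starts on a button to cancel the button's tracking so the scroll wins.
    if ([view isKindOfClass:UIButton.class]) {
        return YES;
    }
    return [super touchesShouldCancelInContentView:view];
}

@end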
In a UICollectionView of mine, buttons inside of UICollectionViewCells registered Touch Up Inside taps even though the UICollectionView was still decelerating, which sounds like a problem similar to yours. I created a UIButton subclass that overrides beginTrackingWithTouch:withEvent: and returns NO if the UIScrollView it's contained in is decelerating or dragging.
- (BOOL)beginTrackingWithTouch:(UITouch *)touch withEvent:(UIEvent *)event
{
    UIView *superView = self;
    while ((superView = [superView superview])) {
        if ([superView isKindOfClass:UIScrollView.class]) {
            UIScrollView *scrollView = (UIScrollView *)superView;
            if (scrollView.isDecelerating || scrollView.isDragging) {
                return NO;
            }
        }
    }
    return [super beginTrackingWithTouch:touch withEvent:event];
}
The easiest thing to try that comes to mind is to check if the UIScrollView (your UICollectionView) is scrolling or dragging when the button action is triggered.
if (!self.collectionView.dragging && !self.collectionView.decelerating)
{
    // do action because we are not moving
}
Have you tried that?

I have 15 UIButtons in my iPhone project; after clicking a button an image pops up, and the images are different for each button

I have 15 UIButtons in my iPhone project. After clicking a button, an image should pop up, and the image is different for each button. How can I achieve this using touch handling?
How should I load the proper image for a particular button?
I don't want to use 15 UIButtons and 15 separate UIImageViews.
You could have all 15 buttons hooked to one single "IBAction" method.
When the method is called, the parameter that is sent (e.g. "(id) sender") is the object that sent the message.
If you assign different tags to each button, you'll know which button called the action method and then you could have a different image appear for each different button.
All you need here is one "IBAction" method and one "IBOutlet" for a single image view. You will still need to have some kind of logic to load 15 different images into that single image view, though.
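A minimal sketch of that last step, assuming the single image view outlet is called popupImageView and the image assets are named button0.png through button14.png to match each button's tag (both names are assumptions made for illustration):

- (IBAction)buttonClicked:(UIButton *)sender
{
    // Pick the image for this button based on its tag.
    NSString *imageName = [NSString stringWithFormat:@"button%ld", (long)sender.tag];
    self.popupImageView.image = [UIImage imageNamed:imageName];
}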
Here is how you can do it:
This is called after any button is clicked:
- (IBAction)buttonClicked:(UIButton *)sender {
    if (sender.tag == 0) {
        NSLog(@"Button one");
    } else if (sender.tag == 1) {
        NSLog(@"Button two"); ... // likewise, do it for all 15 buttons
    }
}
All you have to do is assign a tag to the button when you create it.
I think you must be running a for loop 15 times to create those buttons, so:
for (int i = 0; i < 15; i++) {
    // create button...
    button.tag = i;
}
Use a background image for the view that shows all 15 buttons, divided into 15 regions for detecting touches.
Get the touch location (coordinates) and, based on which predefined region it falls in, update the image in the popup UIImageView accordingly.
This might help with getting the location of the touch in the view:
UIView touch location coordinates
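A rough sketch of that approach. The 3 x 5 grid, the backgroundImageView and popupImageView outlets, and the popup0.png ... popup14.png naming scheme are all assumptions made for illustration:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[touches anyObject] locationInView:self.backgroundImageView];
    if (!CGRectContainsPoint(self.backgroundImageView.bounds, p)) {
        return;   // the touch landed outside the button area
    }

    // Assumed layout: 3 columns x 5 rows of equally sized regions over the background image.
    CGFloat cellWidth  = self.backgroundImageView.bounds.size.width  / 3.0;
    CGFloat cellHeight = self.backgroundImageView.bounds.size.height / 5.0;
    NSInteger column = (NSInteger)(p.x / cellWidth);
    NSInteger row    = (NSInteger)(p.y / cellHeight);
    NSInteger index  = row * 3 + column;   // 0 ... 14

    self.popupImageView.image = [UIImage imageNamed:[NSString stringWithFormat:@"popup%ld", (long)index]];
}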
You could use an IBOutletCollection; this requires that you hook up the 15 UIButtons in a XIB.
The following code assumes you have a collection of N buttons. For a given gesture, it iterates over the buttons in the collection and determines whether the location of the touch is within a button's frame.
Edit: IBOutletCollections are available in iOS 4 and up.
Assuming you have the following as properties:
@property (strong, nonatomic) IBOutletCollection(UIButton) NSArray *collection_buttons;
@property (weak, nonatomic) IBOutlet UIImageView *imageView;
And the following synthesizes:
@synthesize collection_buttons = _collection_buttons, imageView = _imageView;
Then you could use the following code in the gesture callback:
- (IBAction)tapGestureOnButton:(UITapGestureRecognizer *)gesture {
    CGPoint point = [gesture locationInView:self.view_buttonContainer];
    for (int index = 0; index < self.collection_buttons.count; index++) {
        UIButton *button = [_collection_buttons objectAtIndex:index];
        if (CGRectContainsPoint(button.frame, point)) {
            // Button at index has been pressed
            [_imageView setImage:[button backgroundImageForState:UIControlStateNormal]];
            break;
        }
    }
}

iOS custom navigation (vertical) advice

I have an idea for an iOS 5 navigation scheme I'm building for an app, and I thought it wise to get some constructive criticism from Stack Overflow about it.
Idea:
UIView containing 6 or so buttons stacked vertically
UIButtons have a selected state.
A static/global variable keeps track of the last touched button, and the last touched button is reset whenever a new UIButton is touched.
Question:
Can you read and access the children of the UIView?
e.g. (pseudocode):
for (i in [myView children]) {
    [[myView getChildAt:i] doSomethingToThisButton];
}
Thanks all!
Yes. Here's the non-pseudocode (well, mostly):
for (UIView *subview in [myView subviews]) {
    [subview doSomethingToThisButton];
}
Or, if you prefer
for (int i = 0; i < [myView.subviews count]; i++) {
    [[myView.subviews objectAtIndex:i] doSomethingToThisButton];
}
Don't make your last touched button a static variable because then you can only have one such control in your whole app. Make a UIView subclass to act as the container for your buttons, and have the last selected view be a property of that class.
You may also want to make your containing view a subclass of UIControl instead of UIView, then you can make it send events and bind to it using drag and drop in interface builder, just like a regular control (e.g. a button).
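A sketch of that container idea. The class name, the selectedButton property, and the use of awakeFromNib to wire up the child buttons are all illustrative choices; assign properties are used to stay compatible with manual reference counting:

// Hypothetical container that owns the stacked buttons and remembers the last selection.
@interface VerticalNavView : UIControl
@property (nonatomic, assign) UIButton *selectedButton;
@end

@implementation VerticalNavView

- (void)awakeFromNib
{
    [super awakeFromNib];
    // Wire each child UIButton back to the container (assumes the buttons were laid out in Interface Builder).
    for (UIView *subview in self.subviews) {
        if ([subview isKindOfClass:[UIButton class]]) {
            [(UIButton *)subview addTarget:self
                                    action:@selector(selectButton:)
                          forControlEvents:UIControlEventTouchUpInside];
        }
    }
}

- (void)selectButton:(UIButton *)button
{
    self.selectedButton.selected = NO;    // reset the previously selected button
    button.selected = YES;
    self.selectedButton = button;
    // Behave like a regular control: anything bound to this UIControl gets notified.
    [self sendActionsForControlEvents:UIControlEventValueChanged];
}

@end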
