UIRefreshControl for "Pull Up" to refresh?

How do I add a UIRefreshControl to the bottom of a UICollectionView? That is, how does it work when it comes to scrolling up (to see old data, for example)?

You can't use UIRefreshControl to do that, but if you're OK with a simpler solution, you could just set up your collection view to automatically load more data when you scroll to the bottom. (Incidentally, this is a far more common user interface: pull-up-to-refresh is not common, but automatically retrieving more data when you hit the bottom is.)
The most primitive rendition of that would be to respond to the UIScrollViewDelegate method and determine whether you've scrolled to the bottom of the collection view (which is, itself, a subclass of UIScrollView):
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    if ((scrollView.contentOffset.y + scrollView.frame.size.height) >= scrollView.contentSize.height) {
        if (!self.isLoadingMoreData) {
            self.loadingMoreData = YES;
            // proceed with the loading of more data
        }
    }
}
Even better, if you have more data to load, show a cell at the bottom that says "please wait, loading more data", perhaps with a UIActivityIndicatorView. For example, if you have more data to load, add a section to the end with this one cell. If you format this additional cell properly (e.g. a single cell that spans the full width of the collection view), it can definitely achieve the effect you're looking for.
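A minimal sketch of that trailing "loading" section, assuming hypothetical "DataCell" and "LoadingCell" reuse identifiers registered elsewhere, an items array, and a moreDataAvailable flag maintained by the controller (none of these names are from the original answer):

- (NSInteger)numberOfSectionsInCollectionView:(UICollectionView *)collectionView
{
    // One section for the real data, plus a trailing section for the loading indicator.
    return self.moreDataAvailable ? 2 : 1;
}

- (NSInteger)collectionView:(UICollectionView *)collectionView numberOfItemsInSection:(NSInteger)section
{
    return (section == 0) ? self.items.count : 1;
}

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    if (indexPath.section == 1) {
        // The single "please wait" cell, with a spinner tagged 1 in its nib.
        UICollectionViewCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"LoadingCell" forIndexPath:indexPath];
        [(UIActivityIndicatorView *)[cell viewWithTag:1] startAnimating];
        return cell;
    }
    UICollectionViewCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"DataCell" forIndexPath:indexPath];
    // ... configure the normal data cell here ...
    return cell;
}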

Here is CCBottomRefreshControl, a category on the UIScrollView class (a parent class of UICollectionView) that implements a bottomRefreshControl property. It's compatible with both the iOS 6 and iOS 7 native refresh controls.
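If you go that route, the wiring is essentially just assigning a standard UIRefreshControl to that property. A rough sketch, assuming a collectionView outlet; the header name and the loadOlderData action are illustrative, so check the category's README for the exact import:

#import "UIScrollView+BottomRefreshControl.h"  // header name may differ by version

UIRefreshControl *refreshControl = [UIRefreshControl new];
[refreshControl addTarget:self action:@selector(loadOlderData) forControlEvents:UIControlEventValueChanged];
self.collectionView.bottomRefreshControl = refreshControl;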

You can't. You can't really customise UIRefreshControl at all.

UITableView scrolls down when contentInset is set and UITextField in tableViewHeader becomes active

Problem
I'm trying to make a custom search text field, added to a UITableViewController, always stick to the top of the page.
Like this:
Everything works fine, but when I tap the search text field to make it becomeFirstResponder, the tableView strangely scrolls down.
I have discovered that this only happens when I manually set the contentInset for the UITableView (so that the tableView section headers don't hide behind the search view).
How I have built it
Layers:
- TableView
- UIView (as UITableView.tableViewHeaderView)
- UIView (whose frame is updated as the user scrolls the tableView)
- UITextField (for search input)
Code for fixing the search view frame when the user scrolls:
// Called as the tableView scrolls: pin the search view 64 pt from the top (status bar + navigation bar).
var rect = frame
rect.origin.y = scrollView.contentOffset.y + 64
frame = rect
I may be doing this all wrong, but I would like the view to stick to the top, because users will rely on search in this app more than in most others. Any ideas?
My app requires iOS 8, so I have not been able to test this on iOS 7. I can make a screen recording if that helps.
UPDATE
I've created an example project with the same objects I'm using in my main project.
https://www.dropbox.com/s/qdlx0milebbuf3p/Search%20bug%20Example.zip?dl=0
It's a UIKit bug. To fix this I've created a UITableView subclass that NEVER changes the content offset while isScrollEnabled is NO.
.h
@interface CUISearchTableView : UITableView
@end

.m
#import "CUISearchTableView.h"

@implementation CUISearchTableView

- (void)setContentOffset:(CGPoint)contentOffset {
    if (self.isScrollEnabled) {
        [super setContentOffset:contentOffset];
    }
}

@end
I'm not sure you need to do all that. If you just used a UIViewController and added a UITableView to it, then you could place the search view anywhere (i.e., at the top of the view). You just need to make sure the search view is the top-most view (bring subview to front).
I get that you're using a UITableViewController, but if you want to use your own search view instead of the TableViewController + search approach, then this is your best bet. The only thing that would be necessary is adding the top content offset (which you're already doing). You could then just delete the CUISearchTableView.
Does this make sense or am I missing something?
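A rough sketch of that layout, assuming a plain UIViewController with hypothetical tableView and searchView properties and a 44 pt search bar (all illustrative, not a definitive implementation):

// In a plain UIViewController: the table view fills the view, and the custom
// search view is pinned on top of it.
- (void)viewDidLoad {
    [super viewDidLoad];

    self.tableView = [[UITableView alloc] initWithFrame:self.view.bounds style:UITableViewStylePlain];
    self.tableView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
    // Push the table content down so the first rows start below the fixed search view.
    self.tableView.contentInset = UIEdgeInsetsMake(44.0, 0, 0, 0);
    [self.view addSubview:self.tableView];

    self.searchView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, CGRectGetWidth(self.view.bounds), 44.0)];
    [self.view addSubview:self.searchView];          // added after the table view...
    [self.view bringSubviewToFront:self.searchView]; // ...and explicitly kept top-most
}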

App Store app view layout

I would like to create a view similar to the App Store's app view on the iPhone:
http://cl.ly/image/0Q1h273v3C00
Basically, when you scroll down, part of the header remains fixed at the top:
http://cl.ly/image/3m0I3Y0t1901
Here's the approach I tried, but it doesn't work as expected:
- I use a tableView as the 'global' view.
- In this example, my tableView header is the top part with the app icon, name, etc., and the UISegmentedControl.
- Below the header is my regular tableView with cells.
Then I simply implement the method below:
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    // Keep the menu always visible when the user scrolls.
    // 155 is the original y position of the menu (the UISegmentedControl here).
    if (scrollView.contentOffset.y > 155) {
        self.menuView.frame = CGRectMake(0,
                                         scrollView.contentOffset.y,
                                         self.menuView.frame.size.width,
                                         self.menuView.frame.size.height);
    }
    else {
        self.menuView.frame = CGRectMake(0,
                                         155,
                                         self.menuView.frame.size.width,
                                         self.menuView.frame.size.height);
    }
}
The menu remains at the top just as I want, but it's still not working as expected:
My tableView section headers overlap the menu when I scroll. The only fix I found is to put the header layer on top using this code:
self.tableView.tableHeaderView.layer.zPosition = MAXFLOAT;
But now the scrollbar is hidden behind the header, so I'm not satisfied with this either.
I also tried this call in viewDidLoad and in ...viewForHeaderInSection:
[self.tableView bringSubviewToFront:self.tableView.tableHeaderView];
but it just doesn't work, since the section headers are always added after the table header view, and I haven't found a way to know when the section headers are added so that I can put them behind the tableView header.
When I scroll down past the menu's initial position, even though the menu remains fixed at the top, I can't tap it; all the touch events go to the tableView cell behind it.
Finally, I want to keep the same header when I tap the other buttons of the UISegmentedControl, just like in the App Store app view, and only change the content below. I don't want to include the same header in each of my views; I would like to make the header generic.
Does anyone have any ideas on how to tackle the problems I'm facing, or a different approach to this?
P.S.: YouTube does roughly the same thing for the channel view, but their solution is not as good as Apple's, as you can't move the content if you start scrolling from the header.
Thanks for your help!

Can two UICollectionViews respond to a single gesture?

I have two fullscreen child UICollectionViews. One is a transparent overlay on the other. I'd like them both to respond when I drag around the screen - both of them when it's a horizontal drag and only one of them when it's a vertical drag, a little like some media centre home screens. Is this possible without reimplementing the private UICollectionView gesture recognisers, and if so how?
If not then any pointers to example reimplementations would be appreciated.
Some things I know, or have tried:
- I have a pan gesture recogniser on the view controller with a delayed begin that can detect vertical or horizontal movement before events are sent through to the views.
- I know that simply forwarding events from my parent view's touchesBegan: etc. won't work, because the touches' view property is set to my parent view, and UITouches can't be copied (naively at least) since they don't implement the NSCopying protocol. Perhaps I can synthesise suitable UITouch events and forward them?
- I know I can send scrollToItemAtIndexPath:atScrollPosition:animated: messages manually, but I'd prefer to have the natural drag, swipe and snap paging behaviour for the collections.
Alternatively, is it possible to modify the private gesture recognisers' delegates and implement gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: - without explicitly accessing private APIs - to allow both collections to see the touches? Is the responder chain smart enough to call this with gesture recognisers from two sibling views?
Another approach might be to manually control the overlay, and not manage it as a Collection View, but Collection Views seem like a more natural fit, and in theory provide the interactivity I'd like out of the box. The box, at the moment, seems to need a crowbar to get in!
This question seems similar (if less explicit), and has no answers. The other questions I've looked at all seem to be about adding pinch, or having subviews of collections also respond to gestures; not quite my situation.
I'm scratching my head a little, so thanks for any pointers.
The short answer is that you can't; not easily, anyway.
The approach that worked for me is a lot simpler and cleaner: embed one collection view within another. The containing one is limited to horizontal scrolling and the overlay one to vertical, both with paging turned on. Both share the same controller as their delegate and data source, and, since a collection view is a subclass of scroll view, the controller also keeps track of which container and overlay page we're on in the scrollViewDidEndDecelerating: method:
- (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
{
    if ([scrollView isEqual:containerCollection]) {
        containerNumber = scrollView.contentOffset.x / scrollView.frame.size.width;
    }
    else {
        overlayNumber = scrollView.contentOffset.y / scrollView.frame.size.height;
    }
}
The only real bit of trickery was in my cellForItemAtIndexPath: method: when I instantiate the container cell, I need to register the .xibs for reuse (each overlay is different), then use the remembered overlay page and issue both scrollToItemAtIndexPath: and reloadItemsAtIndexPaths: to the embedded overlay collection to get it to appear correctly.
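For illustration only (the ContainerCell class, reuse identifiers, nib names and overlayCollectionView outlet below are placeholders, not the poster's actual code), that container-cell branch might look roughly like this:

- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    if ([collectionView isEqual:containerCollection]) {
        ContainerCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"ContainerCell"
                                                                        forIndexPath:indexPath];

        // Each overlay page has its own .xib, so register it on the embedded collection.
        [cell.overlayCollectionView registerNib:[UINib nibWithNibName:@"OverlayCell" bundle:nil]
                     forCellWithReuseIdentifier:@"OverlayCell"];
        cell.overlayCollectionView.dataSource = self;
        cell.overlayCollectionView.delegate = self;

        // Restore the overlay page remembered in scrollViewDidEndDecelerating:.
        NSIndexPath *overlayPath = [NSIndexPath indexPathForItem:overlayNumber inSection:0];
        [cell.overlayCollectionView reloadItemsAtIndexPaths:@[overlayPath]];
        [cell.overlayCollectionView scrollToItemAtIndexPath:overlayPath
                                           atScrollPosition:UICollectionViewScrollPositionTop
                                                   animated:NO];
        return cell;
    }

    // Otherwise the embedded overlay collection is asking for one of its vertical pages.
    return [collectionView dequeueReusableCellWithReuseIdentifier:@"OverlayCell" forIndexPath:indexPath];
}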
I've managed to keep both cells as separate .xibs as well, with associated convenience classes for any extra data they need (and in the case of the container collection the overlay collection IBOutlet).
And not a gesture recogniser in sight.

How can I get an accessibility swipe to navigate a collection of subviews correctly?

A single swipe gesture is intended by Apple to move the VoiceOver cursor through the screen's items in order, but mine aren't read in order because the user can move them around!
My main view has a set of buttons and labels, but it also has two collections of custom subviews; let's call the instances SVA1 to SVA9 and SVB1 to SVB9, in 'ascending order' from left to right. That is, SVA is one custom UIView class and SVB is the second. When I drag, say, SVA3 to the current position of SVA6, I end up with an order of SVA1, SVA2, SVA4, SVA5, SVA3, SVA6, SVA7, SVA8 and SVA9. The collections are 'linked' so that the same order is now also mirrored in the SVB subview collection via my code.
My problem is that swiping to the right, expecting VoiceOver to read out my items as I see them on screen, results in a different order. It gets quite a lot of them right, but then it will suddenly move the VoiceOver cursor from the first collection to the second, or change its direction. After a move my code is aware of the new order of all the subviews, but I'd like to be able to get that information to VoiceOver.
Each custom subview is an accessibilityElement. Is there a way that I can tell VoiceOver to read back my items in the order I'd expect? I've come across the -accessibilityElementAtIndex: method but don't see whether or how that fits with my situation.
Thank you.
If you want to change the order, then accessibilityElementAtIndex: and the rest of the UIAccessibilityContainer protocol are what you are looking for.
Assume that you have an array called accessibleElements that stores the elements in the order you want them to appear:
- (NSInteger)accessibilityElementCount {
    return self.accessibleElements.count;
}

- (id)accessibilityElementAtIndex:(NSInteger)index {
    return self.accessibleElements[index];
}

- (NSInteger)indexOfAccessibilityElement:(id)element {
    return [self.accessibleElements indexOfObject:element];
}
The container can't be an accessibility element itself, so you should also override isAccessibilityElement:
- (BOOL)isAccessibilityElement {
    return NO;
}
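One hedged way to keep VoiceOver in sync after a drag, assuming accessibleElements is an NSMutableArray and using an illustrative moveElementFromIndex:toIndex: method name, is to reorder the array alongside your own model and post a layout-changed notification:

// Illustrative only: keep accessibleElements in the same order as the on-screen subviews.
- (void)moveElementFromIndex:(NSUInteger)fromIndex toIndex:(NSUInteger)toIndex
{
    id element = self.accessibleElements[fromIndex];
    [self.accessibleElements removeObjectAtIndex:fromIndex];
    [self.accessibleElements insertObject:element atIndex:toIndex];

    // Tell VoiceOver that the layout changed so it re-queries the container's elements.
    UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);
}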

UIPanGestureRecognizer.maximumNumberOfTouches not respected in nested scroll views?

I have a root UIScrollView that only scrolls vertically; this scroll view represents the rows of my jagged grid. I have configured this scroll view's pan gesture recognizer with two touches for both the minimum and maximum number of touches required.
Inside this scroll view I have one or more UIScrollView instances that only scroll horizontally; these scroll views each represent a single row in my jagged grid view. I have configured the pan gesture recognizers for all of these scroll views with one touch minimum and two touches maximum.
So far it works: I get a nice jagged grid view where I can scroll vertically between rows, and horizontally to scroll each row independently. I have intentionally set the minimum number of touches to 2 so as not to interfere with scrolling if I add, for example, a UITableView as a subview of any cell within this jagged grid view (cell == a position defined by a row and a column in that row).
Using a UITableView as a cell works; that is, the table view itself works as expected. But scrolling with two fingers also scrolls inside the table view, not the root scroll view that scrolls vertically between rows.
I have tried configuring the table view's pan gesture recognizer to allow a maximum of one touch, in the hope that two-finger touches would be ignored. This does not work; the maximumNumberOfTouches property of the table view's pan gesture recognizer seems to be ignored.
What could I have done wrong?
A screen shot displaying the layout to clarify what I have done:
Multiple scrolling tends to get tricky, and I don't know for sure, but I think Apple does not encourage this. Even so, I still think it's possible. It may be that vertical scrolling on the table view gets mixed up with the scroll view's vertical scrolling, or something else.
Try checking whether the delegates for the gesture recognizers are correctly set.
Another way around this is:
- have a scroll view with buttons, from which you can open popovers with custom controllers (insert whatever you want there).
- create a big UITableViewController and set the cells' contents to scroll views, etc. I think you could get the same result.
My advice is not to get stuck on just one method when there could be other, simpler and more intuitive ones.
TableViews on scroll views are generally not a great idea. When a TableView receives touches, even if it doesn't need to do anything with them, it won't send them to its superview.
You might want to try either of these two things:
In your TableView you could send the touches to your superview manually and let it handle them appropriately. I've seen this method being used in one of my side projects, but I'm not able to post an example of it at this time (a rough sketch follows below).
The second thing might be easier to implement. Since TableView is a subclass of ScrollView, you can rely on the delaysContentTouches property of those TableViews. This property will delay the touch-down event on that TableView until it can determine whether scrolling is the intent, as is written in the Apple docs: http://developer.apple.com/library/ios/#documentation/uikit/reference/UIScrollView_Class/Reference/UIScrollView.html#//apple_ref/occ/cl/UIScrollView
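As a sketch of the first approach only (the ForwardingTableView class name is hypothetical, and whether forwarding like this suits your layout needs testing), a UITableView subclass could pass its touches on to the next responder; the second approach is simply a property set on the table view:

// Illustration: forward touches from a UITableView subclass to the next responder,
// so the containing view also gets a chance to handle them.
@interface ForwardingTableView : UITableView
@end

@implementation ForwardingTableView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesBegan:touches withEvent:event];
    [self.nextResponder touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesMoved:touches withEvent:event];
    [self.nextResponder touchesMoved:touches withEvent:event];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesEnded:touches withEvent:event];
    [self.nextResponder touchesEnded:touches withEvent:event];
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [super touchesCancelled:touches withEvent:event];
    [self.nextResponder touchesCancelled:touches withEvent:event];
}

@end

// The second approach is a single line on the table view (a subclass of UIScrollView):
// self.tableView.delaysContentTouches = YES;  // (the default) delay touch-down until scrolling intent is known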
Let me know if either of these two approaches works for you; I'm quite curious about this subject in general.
Why don't you try a few tricks rather than implementing all of these changes:
1) By default, disable scrolling of the TableView when the view is created.
2) Once the view has been generated, recognize the gestures to see whether the user is scrolling with single or multiple touches and whether the touch is inside the child scroll view. Check the view's tag and, based on the gesture, enable the TableView's scrolling.
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
{
    // Get the tag of the scroll view.
    // Check whether it is the parent or the child scroll view.
    // If it's the child, just enable scrolling of the table view.
}

- (void)tapAction:(UIGestureRecognizer *)gestureRecognizer
{
    // CGPoint point = [Tile locationInView:gestureRecognizer.view];
    // [[[gestureRecognizer.view] objectAtIndex:0] removeFromSuperview];
    // imageContent = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 200, 250)];
    // [imageContent setImage:[UIImage imageNamed:@"Default.png"]];
    NSLog(@"You tapped this: %ld", (long)gestureRecognizer.view.tag);
    // Recognize the gestures.
}
There may be some unnecessary code here, since there is no code snippet with your question. I gave it a try and showed a trick in case it works for you and solves the problem. ;)
Try this link and pay attention to how they solve nested views.
Remember the best practices for handling multi-touch events:
When handling events, both touch events and motion events, there are a few recommended techniques and patterns you should follow.
- Always implement the event-cancellation methods. In your implementation, restore the state of the view to what it was before the current multi-touch sequence, freeing any transient resources set up for handling the event. If you don't implement the cancellation method, your view could be left in an inconsistent state. In some cases, another view might receive the cancellation message. (A sketch of a cancellation handler follows this list.)
- If you handle events in a subclass of UIView, UIViewController, or (in rare cases) UIResponder:
  - implement all of the event-handling methods (even if it is a null implementation);
  - do not call the superclass implementation of the methods.
- If you handle events in a subclass of any other UIKit responder class:
  - you do not have to implement all of the event-handling methods;
  - but in the methods you do implement, be sure to call the superclass implementation. For example:
    [super touchesBegan:theTouches withEvent:theEvent];
- Do not forward events to other responder objects of the UIKit framework. The responders that you forward events to should be instances of your own subclasses of UIView, and all of these objects must be aware that event forwarding is taking place and that, in the case of touch events, they may receive touches that are not bound to them.
- Custom views that redraw themselves in response to events should only set drawing state in the event-handling methods and perform all of the drawing in the drawRect: method.
- Do not explicitly send events up the responder chain (via nextResponder); instead, invoke the superclass implementation and let UIKit handle responder-chain traversal.
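For instance, a cancellation handler in a direct UIView subclass (which, per the rules above, implements all of the touch methods and does not call super) might look roughly like this; the dragInProgress and centerBeforeDrag properties are hypothetical state such a view might keep:

// Illustrative only: restore pre-gesture state if the touch sequence is cancelled.
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.dragInProgress = NO;              // hypothetical flag set in touchesBegan:
    self.center = self.centerBeforeDrag;   // hypothetical value captured in touchesBegan:
    // Deliberately no call to super here, per the guideline above for direct UIView subclasses.
}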
