I have a UINavigationController with a bottom toolbar whose height I set programmatically, like so:
navigationController?.toolbar.frame.size.height += 43.0
navigationController?.toolbar.frame.origin.y -= 43.0
navigationController?.hidesBarsOnTap = true
When I tap on my view to hide the bars and tap again to show them, the bottom bar returns to its default state.
How can I preserve the height after the bar shows again?
Thanks a lot! :)
There's not a great way to do that, but you can do something like placing a UITapGestureRecognizer on self.view and counting the number of taps.
Something like:
var numTaps = 0

@IBAction func tapOnView(_ sender: UITapGestureRecognizer) {
    numTaps += 1
    // Every second tap is the one that brings the bars back, so re-apply the custom height.
    if numTaps % 2 == 0 {
        self.navigationController?.toolbar.frame.size.height += 43.0
        self.navigationController?.toolbar.frame.origin.y -= 43.0
    }
}
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                       shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    return true
}
It is a little hackish but might work; you might also try a slight delay to make sure you set the height after the toolbar's position is set.
Or try one of the answers provided in Is there a way to change the height of a UIToolbar? and subclass UIToolbar to override sizeThatFits.
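For reference, here is a minimal sketch of the sizeThatFits route, assuming the extra height is the fixed 43 points from the question (the subclass name TallToolbar is made up):

import UIKit

class TallToolbar: UIToolbar {
    override func sizeThatFits(_ size: CGSize) -> CGSize {
        // Report a taller size so the extra height survives every layout pass,
        // including the one that runs when hidesBarsOnTap shows the bars again.
        var fitted = super.sizeThatFits(size)
        fitted.height += 43.0
        return fitted
    }
}

You would then create the navigation controller with UINavigationController(navigationBarClass: nil, toolbarClass: TallToolbar.self) so UIKit instantiates this class for its toolbar.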
Related
My view controller setup is as follows: a UISearchBar at the top of the screen, the keyboard at the bottom, and other views between them: a custom UIView, a UIView, and a UITableView. When I tap on table view cells and on UIButtons inside the UIViews, my actions trigger successfully while the UISearchBar stays visible on screen. But when I tap on UIViews that do not have buttons, this triggers searchBarShouldEndEditing as if the search bar loses focus. I want to disable this behavior and let only the Cancel button and the keyboard's Done button trigger the UISearchBar's dismissal.
I thought this had to do with event bubbling and tried to implement this fix:
extension UIViewController: UIGestureRecognizerDelegate {
    public func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldReceive touch: UITouch) -> Bool {
        return touch.view == gestureRecognizer.view
    }

    @IBAction func didTap(_ gestureRecognizer: UITapGestureRecognizer) {
        //
    }
}
but this did not address the issue, and UISearchBar still calls searchBarShouldEndEditing. Of course, I could add logic to searchBarShouldEndEditing to return false, but I'm not sure that approach is robust enough.
How can I ensure UISearchBar remains in place while tapping on other elements like UIViews?
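For what it's worth, a minimal sketch of the flag-based variant mentioned above, assuming the delegate lives on the view controller (the class name SearchHostViewController and the flag allowSearchBarToResign are hypothetical):

import UIKit

class SearchHostViewController: UIViewController, UISearchBarDelegate {
    @IBOutlet weak var searchBar: UISearchBar!
    private var allowSearchBarToResign = false   // hypothetical flag

    // Refuse to end editing unless an explicit dismissal set the flag first.
    func searchBarShouldEndEditing(_ searchBar: UISearchBar) -> Bool {
        return allowSearchBarToResign
    }

    // Only the Cancel button and the keyboard's return key dismiss the search bar.
    func searchBarCancelButtonClicked(_ searchBar: UISearchBar) {
        dismissSearchBar(searchBar)
    }

    func searchBarSearchButtonClicked(_ searchBar: UISearchBar) {
        dismissSearchBar(searchBar)
    }

    private func dismissSearchBar(_ searchBar: UISearchBar) {
        allowSearchBarToResign = true
        searchBar.resignFirstResponder()
        allowSearchBarToResign = false
    }
}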
I have a UICollectionViewCell subclass which has a UIStackView on it.
The stack view is populated with UIButtons and everything looks fine;
however, the buttons are untappable.
If I lay out the same buttons on a UIScrollView (for the sake of experiment) instead of on the stack view, the buttons respond fine, so it seems like something about the stack view is causing the issue.
Any ideas?
Here is the code showing how I add buttons to the stack view:
func prepareStackView(buttonsArray: [UIButton]) {
    var rect: CGRect = (stackView?.frame)!
    // Size the stack view to fit the buttons (they all have the same width).
    rect.size.width = buttonsArray[0].frame.size.width * CGFloat(buttonsArray.count)
    stackView?.frame = rect // The frame of the stack view is set like this so that it looks exactly like I want
    // Add all the buttons
    for btn in buttonsArray {
        // The buttons already have their selectors set
        stackView?.addArrangedSubview(btn)
    }
}
Use this code so the UIButton responds to tap events:
override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
    if clipsToBounds || isHidden || alpha == 0 {
        return nil
    }
    // Walk the subviews front to back and return the first one that claims the touch,
    // even when the touch lands outside this view's own bounds.
    for subview in subviews.reversed() {
        let subPoint = subview.convert(point, from: self)
        if let result = subview.hitTest(subPoint, with: event) {
            return result
        }
    }
    return nil
}
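If I read the stack-view question correctly, this override would presumably go in the UICollectionViewCell subclass (or whichever superview is clipping the buttons), so that touches landing outside that view's bounds are still forwarded to the buttons.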
Helpful link:
Capturing touches on a subview outside the frame of its superview using hitTest:withEvent:
I also encountered a similar problem and was able to resolve it. Please check my post, where I describe this kind of scenario and explain how I resolved it:
UIButton is not clickable after first click
I have the following setup:
TableView
ScrollView
ContainerView
ScrollView lies on top of TableView and completely covers it, but they are siblings.
ScrollView scrolls left and right (paging enabled) and TableView scrolls up and down.
I want to scroll the right thing depending on scroll direction.
I tried subclassing the ScrollView and overriding gestureRecognizerShouldBegin(_:):
override func gestureRecognizerShouldBegin(_ gestureRecognizer: UIGestureRecognizer) -> Bool {
    isUserInteractionEnabled = true
    if let gR = gestureRecognizer as? UIPanGestureRecognizer {
        // Only let the horizontal scroll view's pan begin when the gesture is mostly horizontal.
        let vel = gR.velocity(in: gR.view)
        let isHorizontal = abs(vel.y) < abs(vel.x)
        isUserInteractionEnabled = isHorizontal
        return isHorizontal
    }
    return true
}
That approach didn't work out properly.
I also tried
scrollView.addGestureRecognizer(tableView.panGestureRecognizer)
but that didn't work either, because it removes the panGestureRecognizer from the TableView.
I have an ios-charts chart as a subview that takes up half the screen. When I pan up on any other subview, the view scrolls, but not when I pan on the chart.
I tried setting:
[self.chart setDefaultTouchEventsEnabled:YES];
//and
[self.chart setScaleEnabled:NO];
The documentation says:
defaultTouchEventsEnabled
enables/disables default touch events to be handled. When disabled, touches are not passed to parent views, so scrolling inside a UIScrollView won't work.
What can I do to enable scrolling when panning/dragging on the chart?
I was struggling with this as well. Since I needed the chart to stay scrollable, @Entrabiter's solution didn't work for me. The only solution that worked for me was assigning the chart view's UIPanGestureRecognizer delegate to my ViewController and implementing UIGestureRecognizerDelegate.
This solution is in Swift, but it should also work fine in Objective-C.
class MyViewController: UIViewController, UIGestureRecognizerDelegate {

    // MARK: Outlets
    @IBOutlet weak var contentView: UIScrollView!
    @IBOutlet weak var myChart: LineChartView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Become the delegate of the chart's own pan gesture recognizer.
        if let gestureRecognizers = myChart.gestureRecognizers {
            for gestureRecognizer in gestureRecognizers {
                if gestureRecognizer is UIPanGestureRecognizer {
                    gestureRecognizer.delegate = self
                }
            }
        }
    }
}
The important part is to tell the gesture recognizer to recognize the scroll view's panGestureRecognizer and the chart view's panGestureRecognizer simultaneously.
// MARK: UIGestureRecognizerDelegate
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    if otherGestureRecognizer == contentView.panGestureRecognizer {
        return true
    }
    return false
}
Set userInteractionEnabled on the chart view to NO.
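In Swift that would be the isUserInteractionEnabled property (the outlet name chart is assumed from the question):

// Hit testing skips the chart entirely, so pans are handled by the views around it instead.
chart.isUserInteractionEnabled = false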
I think I figured it out...
I removed all the UIPanGestureRecognizers from the ios-charts subview.
This allowed the scroll view to handle all the pan events.
for (UIGestureRecognizer *rec in self.chart.gestureRecognizers) {
    if ([rec isKindOfClass:[UIPanGestureRecognizer class]]) {
        [self.chart removeGestureRecognizer:rec];
    }
}
Is there a way to determine the panning location of a UIPageViewController while sliding left/right? I have been trying to accomplish this but it's not working. I have a UIPageViewController added as a subview, and I can slide it horizontally left/right to switch between pages; however, I need to determine the x,y coordinates of where I am panning on the screen.
I figured out how to do this. Basically, a UIPageViewController uses UIScrollViews as its subviews. I loop over the subviews, find the ones that are UIScrollViews, and assign their delegates to my ViewController.
/**
* Set the UIScrollViews that are part of the UIPageViewController to delegate to this class,
* that way we can know when the user is panning left/right
*/
- (void)initializeScrollViewDelegates
{
    UIScrollView *pageScrollView;
    for (UIView *view in self.pageViewController.view.subviews) {
        if ([view isKindOfClass:[UIScrollView class]]) {
            pageScrollView = (UIScrollView *)view;
            pageScrollView.delegate = self;
        }
    }
}
- (void)scrollViewDidScroll:(UIScrollView *)scrollView {
    NSLog(@"I'm scrolling, yay!");
}
My personal preference is not to rely too much on the internal structure of the UIPageViewController, because it can change later and break your code without your noticing.
My solution is to use a pan gesture recogniser. Inside viewDidLoad, add the following:
let gestureRecognizer = UIPanGestureRecognizer(target: self, action: #selector(handler))
gestureRecognizer.delegate = yourDelegate
view.addGestureRecognizer(gestureRecognizer)
Inside yourDelegate's definition, you should implement the following method to allow your gesture recognizer to process touches alongside the page view controller's own recognizers:
func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer, shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
    return true
}
Now you can track the user's pan in the handler (the translation here; see the note below for the absolute location):
@objc func handler(_ sender: UIPanGestureRecognizer) {
    // How far the finger has moved since the gesture began.
    let totalTranslation = sender.translation(in: view)
    //...
}
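If you need the absolute X/Y coordinates of the touch (as asked in the question) rather than the accumulated translation, location(in:) on the same recognizer returns that point, e.g. inside the handler above:

let touchPoint = sender.location(in: view) // absolute x/y of the finger in the view's coordinate space
print("Panning at x: \(touchPoint.x), y: \(touchPoint.y)")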