unexpected nil window in _UIApplicationHandleEventFromQueueEvent, _windowServerHitTestWindow - iOS

I am trying to set up an edge swipe gesture in iOS 8 on iPad but am getting an error that seems like a bug.
I have the following code:
UIScreenEdgePanGestureRecognizer *edgeRecognizer = [[UIScreenEdgePanGestureRecognizer alloc] initWithTarget:self action:@selector(handleRightEdgeSwipe:)];
edgeRecognizer.edges = UIRectEdgeRight;
[self.view addGestureRecognizer:edgeRecognizer];
and then I handle the gesture:
-(void)handleRightEdgeSwipe:(UIGestureRecognizer*)sender
{
//slide in view code here
}
The problem is that it doesn't detect the right edge swipe every time, and sometimes it detects it multiple times.
Whether or not it detects the swipe, it always prints the following to the console when swiping from the right edge on iPad:
2014-10-07 00:04:40.386 Office Log[1531:500896] unexpected nil window in _UIApplicationHandleEventFromQueueEvent, _windowServerHitTestWindow: ; layer = >
What does this message mean and how can I fix it so that the right edge swipe is detected consistently?

I think it's a bug in iOS, which I can confirm on an iPad mini 2 and an iPad Air on iOS 7 or higher, even on the Home Screen.
In Landscape Left (Home button on the left), a right-edge gesture starting from outside the screen does not work for me. Test it yourself on the Home Screen.
I reported a bug to Apple nine months ago, but nothing further has happened.
Update:
I played a bit with the UIWindow initialization: when the window is slightly bigger than it really is, the gesture works. Of course this is a horrible fix.
self.window = [UIWindow new];
self.window.rootViewController = [[UIViewController alloc] init];
// Real Size
CGRect frame = [UIScreen mainScreen].bounds;
// Real Size + 0.000001
self.window.frame = CGRectMake(0, 0, frame.size.width+0.000001, frame.size.height+0.000001);
[self.window makeKeyAndVisible];

I got the same issue.
My solution works fine: just set your window to hidden in your XIB.
I don't really understand why it works, but it does.
EDIT 1:
I found another solution that I think is better and easier to understand.
Put this code in willFinishLaunchingWithOptions: in your app delegate:
- (BOOL)application:(UIApplication *)application willFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
CGRect bounds = [[UIScreen mainScreen] bounds];
[self.window setFrame:bounds];
[self.window setBounds:bounds];
return YES;
}
Then, in your didFinishLaunchingWithOptions:
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
// Your code...
self.window.rootViewController = self.navigationController;
[self.window makeKeyAndVisible];
return YES;
}
You can then set your window's hidden property to NO, and it should work.

I got this issue when testing an iPhone-only app on an iPad. There were no problems in the Simulator, and no problem if I compiled the app as universal and ran it on the iPad.
unexpected nil window in _UIApplicationHandleEventFromQueueEvent, _windowServerHitTestWindow: <UIClassicWindow: 0x1276065a0; frame = (0 0; 768 1024); userInteractionEnabled = NO; gestureRecognizers = <NSArray: 0x1740557e0>; layer = <UIWindowLayer: 0x17403fd80>>
Perhaps the frame is reported wrong? (frame = (0 0; 768 1024))

iOS 8 has a bug where any touch that begins exactly on an iPad right edge when in Portrait Upside-Down (home button on top) or Landscape Left (home button on left) mode fails to be hit tested to the correct view.
The fix is to subclass UIWindow to hit test the right side properly.
@implementation FixedWindow
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hit = [super hitTest:point withEvent:event];
    if (!hit && point.x >= CGRectGetMaxX(self.bounds))
        hit = [super hitTest:CGPointMake(point.x - 0.5, point.y) withEvent:event];
    return hit;
}
@end
Attach the window to your application delegate via its window property.
@implementation AppDelegate
- (UIWindow *)window
{
    if (!_window)
        _window = [[FixedWindow alloc] initWithFrame:[UIScreen mainScreen].bounds];
    return _window;
}
@end
In Portrait and Landscape Right modes, I've confirmed that right-edge touches always land at least 0.5 px from the edge rather than exactly on it, so nudging the point in by 0.5 px simply reproduces the behavior of the working orientations.
Expanding the window frame
Note that firebug's fix (slightly expanding the window frame to include the right edge) will also work. However:
If you do this at application:willFinishLaunchingWithOptions: or application:didFinishLaunchingWithOptions:, your view hierarchy doesn't get resized to the new frame and right edge touches won't make it through the hierarchy.
If you rotate the device, the window may not be centered correctly. This leads to either smearing or bumping interface elements slightly.
iOS 7:
iOS 7 has a similar bug in that the hit test also fails but with a non-nil result and unrotated geometry. This fix should work with both iOS 7 and 8:
@implementation FixedWindow
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    UIView *hit = [super hitTest:point withEvent:event];
    if (!hit || hit == self)
    {
        CGRect bounds = self.bounds;
        hit = [super hitTest:CGPointMake(MIN(MAX(point.x, CGRectGetMinX(bounds) + 0.5), CGRectGetMaxX(bounds) - 0.5),
                                         MIN(MAX(point.y, CGRectGetMinY(bounds) + 0.5), CGRectGetMaxY(bounds) - 0.5))
                   withEvent:event];
    }
    return hit;
}
@end

One possible fix is to remove or comment out the code that hides the status bar, if you have any.
I was pulling my hair out trying to solve this, and I could only reproduce it on my root view. It appears that if you hide the status bar, you cannot drag down the Today widgets/Notification Center (you can, with some effort).
/* <- add this
- (BOOL)prefersStatusBarHidden
{
return YES;
}
add this -> */

Set your deployment target to 8.x or above, and set a launch screen XIB as your launch screen file.
Done!

It may be too late, but some people may still need this:
normally the cause is that you haven't supplied correctly sized launch images or a launch screen, and/or Main Interface is not set to your own storyboard under General > Deployment Info.

Related

iOS in-Call indicator is pushing down view/content, modifying root view `frame`

I have a problem where my root view (the UIViewController's view) is being pushed down by the in-call indicator: window.rootViewController.view.frame is being modified (Y is set to 20). Since I respond to the will/didChangeStatusBarFrame callbacks on my own, I don't want this behaviour.
I'm looking for the property, or setup, that prevents the modification of the frame in response to an in-call status bar. I use other APIs to respond to changes in the top/bottom frames and iPhone X safe areas.
I've tried things like autoResizingMask, extendedLayoutIncludesOpaqueBars, edgesForExtendedLayout, viewRespectsSystemMinimumLayoutMargins but can't get anything working.
If relevant, the view is also animating down, indicating it's not some side-effect but an intended behaviour somewhere.
I've read many reports of similar behaviour but have yet to figure out if they actually resolved it and/or what the solution actually was (each solution appears to address a slightly different problem).
Related questions: Prevent In-Call Status Bar from Affecting View (Answer has insufficient detail), Auto Layout and in-call status bar (Unclear how to adapt this)
--
I can't provide a simple reproduction, but the portions of code setting up the view look something like this:
Window setup:
uWindow* window = [[uContext sharedContext] window];
window.rootViewController = (UIViewController*)[[UIApplication sharedApplication] delegate];
[window makeKeyAndVisible];
Our AppDelegate implementation (relevant part)
@interface uAppDelegate : UIViewController<#(AppDelegate.Implements:Join(', '))>
...
@implementation uAppDelegate
- (id)init
{
CGRect screenBounds = [UIScreen mainScreen].bounds;
uWindow* window = [[uWindow alloc] initWithFrame:screenBounds];
return self;
}
We assign our root view to the above delegate's .view property (the delegate doubles as the UIViewController).
@interface OurRootView : UIControl<UIKeyInput>
UIControl* root = [[::OurRootView alloc] init];
[root setUserInteractionEnabled: true];
[root setMultipleTouchEnabled: true];
[root setOpaque: false];
[[root layer] setAnchorPoint: { 0.0f, 0.0f }];
// some roundabout calls that effectively do rootViewController.view = root
[root sizeToFit];
The goal is that OurRootView occupies the entire screen space at all times, regardless of what frames/controls/margins are adjusted. I'm using other APIs to detect those frames and adjust the contents accordingly. I'm not using any other controller, view, or layout.
It's unclear if there is a flag to disable this behaviour. I did however find a way that negates the effect.
Whatever is causing the frame to shift down does so by modifying the frame of the root view. It's possible to override this setter and block the movement. In our case the root view is fixed in position, thus I did this:
@implementation OurRootView
- (void)setFrame:(CGRect)frame
{
    frame.origin.y = 0;
    [super setFrame:frame];
}
@end
This keeps the view in a fixed location when the in-call display is shown (we handle the new size ourselves via a change in the statusBarFrame and/or safeAreaInsets). I do not know why this also avoids the animation of the frame, but it does.
If for some reason you cannot override setFrame, you can get a similar effect by overriding the app delegate's didChangeStatusBarFrame and modifying the root view's frame there (setting the origin back to 0). The animation still plays with this route.
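A minimal sketch of that fallback, assuming (as in our setup) the app delegate is also the root view controller, so self.view is the root view:
// Sketch of the delegate fallback described above. Assumes the delegate
// doubles as the root UIViewController, so self.view is the root view.
- (void)application:(UIApplication *)application didChangeStatusBarFrame:(CGRect)oldStatusBarFrame
{
    CGRect frame = self.view.frame;
    frame.origin.y = 0; // undo the shift applied for the in-call bar
    self.view.frame = frame;
}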
I hope I understand your problem: you have some indicator, like the in-call bar, or in my case the location indicator shown while Maps uses your location. You need to detect at launch that an indicator is present and re-set the frame of the whole window. My solution for this:
In didFinishLaunchingWithOptions you check the frame of the status bar, because the in-call indicator is part of the status bar.
CGFloat height = [UIApplication sharedApplication].statusBarFrame.size.height;
if (height == 20) {
self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
}
else {
CGRect frame = [[UIScreen mainScreen] bounds];
frame.size.height = frame.size.height - height +20;
frame.origin.y = height-20;
self.window = [[UIWindow alloc] initWithFrame:frame];
}
You can listen to the notification UIApplicationDidChangeStatusBarFrameNotification in your view controller(s) to catch when the status bar has changed. Then you adjust your view controller's main view rectangle to always cover the entire screen.
// Declare in your class
#property (strong, nonatomic) id<NSObject> observer;
- (void)viewDidLoad {
[super viewDidLoad];
_observer = [[NSNotificationCenter defaultCenter] addObserverForName:UIApplicationDidChangeStatusBarFrameNotification object:nil queue:nil usingBlock:^(NSNotification * _Nonnull note) {
CGFloat newHeight = self.view.frame.size.height + self.view.frame.origin.y;
self.view.frame = CGRectMake(0.0, 0.0, self.view.frame.size.width, newHeight);
}];
}
-(void)dealloc {
[[NSNotificationCenter defaultCenter] removeObserver:_observer];
}
I tried it on various models, and it works fine, as far as I can tell. On iPhone X the notification is not posted since it does not alter the status bar height on calls.
There is also a corresponding UIApplicationWillChangeStatusBarFrameNotification which is fired before the status bar changes, in case you want to prepare your view in some way.
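For example, a sketch that reads the upcoming frame out of the notification's userInfo (the key is the standard UIApplicationStatusBarFrameUserInfoKey):
// Sketch: observe the will-change notification and read the new status
// bar frame before the change is applied.
[[NSNotificationCenter defaultCenter] addObserverForName:UIApplicationWillChangeStatusBarFrameNotification object:nil queue:nil usingBlock:^(NSNotification * _Nonnull note) {
    CGRect newFrame = [note.userInfo[UIApplicationStatusBarFrameUserInfoKey] CGRectValue];
    NSLog(@"Status bar frame will become %@", NSStringFromCGRect(newFrame));
}];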

Custom UIWindows do not rotate correctly in iOS 8

Get your application into landscape mode and execute the following code:
UIWindow *toastWindow = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
toastWindow.hidden = NO;
toastWindow.backgroundColor = [[UIColor cyanColor] colorWithAlphaComponent:0.5f];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
[toastWindow removeFromSuperview];
});
In iOS 7 you will get a transparent blue overlay on top of the entire screen that disappears after 5 seconds. In iOS 8 you will get a transparent blue overlay that covers a little over half the screen.
This obviously has something to do with the change Apple made in iOS 8, where screen coordinates are now interface-oriented instead of device-oriented, but in true Apple fashion they seem to have left a myriad of bugs in landscape mode and rotation.
I can "fix" this by checking if the device orientation is landscape and flip the width and height of the main screen bounds but that seems like a horrible hack that is going to break when Apple changes everything again in iOS 9.
CGRect frame = [[UIScreen mainScreen] bounds];
if (UIInterfaceOrientationIsLandscape([UIDevice currentDevice].orientation))
{
frame.size.width = frame.size.height;
frame.size.height = [[UIScreen mainScreen] bounds].size.width;
}
UIWindow *toastWindow = [[UIWindow alloc] initWithFrame:frame];
toastWindow.hidden = NO;
toastWindow.backgroundColor = [[UIColor cyanColor] colorWithAlphaComponent:0.5f];
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(5 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
[toastWindow removeFromSuperview];
});
Has anyone been experiencing this problem and come across a better, less brittle solution?
EDIT: I know I could just use a UIView and add it to the key window but I would like to put something on top of the status bar.
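A separate window is what lets me raise content above the status bar level, e.g. (a sketch):
// Roughly what I'm after: a window raised above the status bar level.
toastWindow.windowLevel = UIWindowLevelStatusBar + 1;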
Your 'fix' isn't great for another reason, in that it doesn't actually rotate the window so that text and other subviews appear with the appropriate orientation. In other words, if you ever wanted to enhance the window with other subviews, they would be oriented incorrectly.
...
In iOS 8, you need to set the rootViewController of your window, and that rootViewController needs to return the appropriate values from shouldAutorotate and supportedInterfaceOrientations. There is some more about this at: https://devforums.apple.com/message/1050398#1050398
If you don't have a rootViewController for your window, you are effectively telling the framework that the window should never autoRotate. In iOS7 this didn't make a difference, since the framework wasn't doing that work for you anyway. In iOS8, the framework is handling the rotations, and it thinks it is doing what you requested (by having a nil rootViewController) when it restricts the bounds of your window.
Try this:
@interface MyRootViewController : UIViewController
@end
@implementation MyRootViewController
- (UIInterfaceOrientationMask)supportedInterfaceOrientations
{
    return UIInterfaceOrientationMaskAll;
}
- (BOOL)shouldAutorotate
{
    return YES;
}
@end
Now, add that rootViewController to your window after it is instantiated:
UIWindow *toastWindow = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
toastWindow.rootViewController = [[MyRootViewController alloc] init];
I believe you have to convert coordinates using UICoordinateSpace.
On iPhone 6 Plus apps can now launch already in landscape orientation, which messed up an application I am working on since it only supports portrait orientation throughout most of the app, except one screen (meaning it needed to support landscape orientations in the plist).
The code that fixed this was as follows:
self.window = [[UIWindow alloc] initWithFrame:[self screenBounds]];
and calculating the bounds with the following code:
- (CGRect)screenBounds
{
    CGRect bounds = [UIScreen mainScreen].bounds;
    if ([[UIScreen mainScreen] respondsToSelector:@selector(fixedCoordinateSpace)]) {
        id<UICoordinateSpace> currentCoordSpace = [[UIScreen mainScreen] coordinateSpace];
        id<UICoordinateSpace> portraitCoordSpace = [[UIScreen mainScreen] fixedCoordinateSpace];
        bounds = [portraitCoordSpace convertRect:[[UIScreen mainScreen] bounds] fromCoordinateSpace:currentCoordSpace];
    }
    return bounds;
}
Hopefully this will lead you in the right direction.
In iOS 7 and earlier, UIWindow's coordinate system did not rotate. In iOS 8 it does. I am guessing the frame supplied from [[UIScreen mainScreen] bounds] does not account for rotation, which could cause issues like what you're seeing.
Instead of getting the frame from UIScreen, you could grab the frame from the appDelegate's current key window.
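For example (a sketch; it assumes the app's main window is already key and correctly sized):
// Take the frame from the existing key window, which is already
// interface-oriented on iOS 8, instead of from UIScreen.
CGRect frame = [UIApplication sharedApplication].keyWindow.frame;
UIWindow *toastWindow = [[UIWindow alloc] initWithFrame:frame];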
However, it does not appear you really need functionality supplied by UIWindow. I'd like to echo others' recommendation that you use a UIView for this purpose. UIView is more generic than UIWindow, and should be preferred.
The best thing to do is use views instead of windows:
Most iOS applications create and use only one window during their lifetime. This window spans the entire main screen [...]. However, if an application supports the use of an external display for video out, it can create an additional window to display content on that external display. All other windows are typically created by the system
But if you think you've got a valid reason to create more than one window, I suggest you create a subclass of UIWindow that handles sizing automatically.
Also note that windows use a lot of RAM, especially on 3x Retina screens. Having more than one of them is going to reduce the amount of memory the rest of your application can use before receiving low-memory warnings and eventually being killed.
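If you do go that route, a minimal sketch of such a subclass might look like this (the class name is mine; it assumes a screen-sized window is always what you want):
// Sketch: a window that re-applies the screen bounds whenever the
// status bar orientation changes. Assumes screen-sized windows only.
@interface AutoSizingWindow : UIWindow
@end
@implementation AutoSizingWindow
- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(orientationDidChange:)
                                                     name:UIApplicationDidChangeStatusBarOrientationNotification
                                                   object:nil];
    }
    return self;
}
- (void)orientationDidChange:(NSNotification *)note
{
    self.frame = [UIScreen mainScreen].bounds; // interface-oriented on iOS 8
}
- (void)dealloc
{
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}
@end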

Why is the top portion of my UISegmentedControl not tappable?

While I was playing on my phone, I noticed that my UISegmentedControl was not very responsive. It would take 2 or more tries to make my taps register. So I decided to run my app in Simulator to more precisely probe what was wrong. By clicking dozens of times with my mouse, I determined that the top 25% of the UISegmentedControl does not respond (the portion is highlighted in red with Photoshop in the screenshot below). I am not aware of any invisible UIView that could be blocking it. Do you know how to make the entire control tappable?
self.segmentedControl = [[UISegmentedControl alloc] initWithItems:[NSArray arrayWithObjects:@"Uno", @"Dos", nil]];
self.segmentedControl.selectedSegmentIndex = 0;
[self.segmentedControl addTarget:self action:@selector(segmentedControlChanged:) forControlEvents:UIControlEventValueChanged];
self.segmentedControl.height = 32.0;
self.segmentedControl.width = 310.0;
self.segmentedControl.segmentedControlStyle = UISegmentedControlStyleBar;
self.segmentedControl.tintColor = [UIColor colorWithWhite:0.9 alpha:1.0];
self.segmentedControl.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleRightMargin;
UIView* toolbar = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.width, HEADER_HEIGHT)];
toolbar.autoresizingMask = UIViewAutoresizingFlexibleWidth;
CAGradientLayer *gradient = [CAGradientLayer layer];
gradient.frame = CGRectMake(
toolbar.bounds.origin.x,
toolbar.bounds.origin.y,
// * 2 for enough slack when iPad rotates
toolbar.bounds.size.width * 2,
toolbar.bounds.size.height
);
gradient.colors = [NSArray arrayWithObjects:
(id)[[UIColor whiteColor] CGColor],
(id)[[UIColor
colorWithWhite:0.8
alpha:1.0
] CGColor
],
nil
];
[toolbar.layer insertSublayer:gradient atIndex:0];
toolbar.backgroundColor = [UIColor navigationBarShadowColor];
[toolbar addSubview:self.segmentedControl];
UIView* border = [[UIView alloc] initWithFrame:CGRectMake(0, HEADER_HEIGHT - 1, toolbar.width, 1)];
border.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;
border.backgroundColor = [UIColor colorWithWhite:0.7 alpha:1.0];
border.autoresizingMask = UIViewAutoresizingFlexibleWidth;
[toolbar addSubview:border];
[self.segmentedControl centerInParent];
self.tableView.tableHeaderView = toolbar;
Screenshot: http://scs.veetle.com/soget/session-thumbnails/5363e222d2e10/86a8dd984fcaddee339dd881544ecac7/5363e222d2e10_86a8dd984fcaddee339dd881544ecac7_20140509171623_536d6fd78f503_68_896x672.jpg
As already written in other answers, UINavigationBar grabs the touches made near the nav bar itself, but not because it has subviews extending over its edges: that is not the reason.
If you log the whole view hierarchy, you will see that the UINavigationBar doesn't extend beyond its defined edges.
The reason it receives the touches is different:
UIKit has many "special cases", and this is one of them.
When you tap the screen, a process called "hit testing" starts. Starting from the frontmost UIWindow, every view is asked two questions: is the tapped point inside your bounds, and which subview should receive the touch event?
These questions are answered by these two methods:
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event;
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event;
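To watch this yourself, you can override hitTest: in a UIView subclass and log the calls (a sketch; the class name is mine):
// Sketch: log every hit test a view receives, then defer to the default.
@interface HitTestLoggingView : UIView
@end
@implementation HitTestLoggingView
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    NSLog(@"hitTest with point %@", NSStringFromCGPoint(point));
    return [super hitTest:point withEvent:event];
}
@end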
Ok, now we can continue.
After the tap, UIApplicationMain starts the hit-testing process. The hit test starts from the main UIWindow (and is executed even on the status bar window and the alert view window, for example) and goes through all subviews.
This process is executed 3 times:
twice starting from UIWindow
once starting from _UIApplicationHandleEvent
If you tap on the navigation bar, you will see that hitTest on UIWindow returns the UINavigationBar (all three times).
If you tap on the area below the navigation bar, however, you will see something strange:
the first two hitTest calls return your UISegmentedControl
the last hitTest returns UINavigationBar
Why is this?
If you swizzle or subclass UIView, overriding hitTest, you will see that the first two times the tapped point is correct. The third time, something changes the point, doing something like point - 15 (or a similar number).
After a lot of searching, I have found where this is happening:
UIWindow has a (private) method called
-(CGPoint)warpPoint:(CGPoint)point;
Debugging it, I saw that this method changes the tapped point if it is immediately below the status bar.
Debugging further, I saw that only three calls in the stack make this possible:
[UINavigationBar _isChargeEnabled]
[UINavigationBar isEnabled]
[UINavigationBar _isAlphaHittableAndHasAlphaHittableAncestors]
So, in the end, this warpPoint method checks whether the UINavigationBar is enabled and hittable; if so, it "warps" the point. The point is warped by between 0 and 15 pixels, and the warp increases as you get closer to the navigation bar.
Now that you know what happens behind the scenes, you have to know how to avoid it (if you want).
You can't simply override warpPoint: if the application must go on the AppStore: it's a private method and your app will be rejected.
You have to find another approach (for example, as suggested, overriding sendEvent:, but I'm not sure whether that will work).
Because this question is interesting, I will think about a legal solution tomorrow and update this answer. (One good starting point could be subclassing UINavigationBar, overriding hitTest and pointInside, and returning nil/NO if, given the same event over multiple calls, the point changes. But I must test whether it works tomorrow.)
EDIT
Ok, I've tried many solutions, but it's not simple to find one that is both legal and stable.
I've described the actual behavior of the system, which could vary across versions (hitTest called more or fewer than 3 times, warpPoint shifting the point by about 15 px, which could change, etc.).
The most stable is obviously the illegal override of warpPoint: in a UIWindow subclass:
- (CGPoint)warpPoint:(CGPoint)point
{
    return point;
}
However, I've found that a method like this (in a UIWindow subclass) is stable enough and does the trick:
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Note: this is not safe if you deliberately tap the screen twice at the
    // same x position with y positions less than 16 px apart, because it moves the point.
    if (self.lastPoint.x == point.x)
    {
        // The points are on the same vertical line.
        if ((0 < (self.lastPoint.y - point.y)) && ((self.lastPoint.y - point.y) < 16))
        {
            // A difference of ~15 px in the y position means the point
            // has been warped, so restore the previous y.
            point.y = self.lastPoint.y;
        }
    }
    self.lastPoint = point;
    return [super hitTest:point withEvent:event];
}
This method records the last point tapped, and if a subsequent tap arrives at the same x with a y at most 16 px lower, it reuses the previous point.
I've tested it a lot and it seems stable.
If you want, you can add more checks to enable this behavior only in particular controllers, or only on a defined portion of the window, etc.
If I find another solution, I'll update the post.
I believe the problem is because the buttons in the UINavigationBar have a larger than normal touch area. See this SO post. You can also find plenty of discussion on this with a 'UINavigationBar touch area' Google search.
As a possible solution, you could put the segmented control IN the navigation bar, but you would know better than I if that fits your use cases or not.
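If that fits, a sketch of the standard way to do it, via the navigation item's titleView:
// Host the segmented control inside the navigation bar itself, so the
// bar's enlarged touch area works for the control instead of against it.
self.navigationItem.titleView = self.segmentedControl;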
I've come up with an alternate solution that to me seems safer than LombaX's. It uses the fact that both events come in with the same timestamp to reject the subsequent event.
@interface RFNavigationBar ()
@property (nonatomic, assign) NSTimeInterval lastOutOfBoundsEventTimestamp;
@end
@implementation RFNavigationBar
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
// [rfillion 2014-03-28]
// UIApplication/UIWindow/UINavigationBar conspire against us. There's a band under the UINavigationBar for which the bar will return
// subviews instead of nil (to make those tap targets larger, one would assume). We don't want that. To do this, it seems to end up
// calling -hitTest twice. Once with a value out of bounds which is easy to check for. But then it calls it again with an altered point
// value that is actually within bounds. The UIEvent it passes to both seem to be the same. However, we can't just compare UIEvent pointers
// because it looks like these get reused and you end up rejecting valid touches if you just keep around the last bad touch UIEvent. So
// instead we keep around the timestamp of the last bad event, and try to avoid processing any events whose timestamp isn't larger.
if (point.y > self.bounds.size.height)
{
self.lastOutOfBoundsEventTimestamp = event.timestamp;
return nil;
}
if (event.timestamp <= self.lastOutOfBoundsEventTimestamp + 0.001)
{
return nil;
}
return [super hitTest:point withEvent:event];
}
@end
You might want to check which view is receiving the touches. Try this method:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch *touch = [touches anyObject];
[touch locationInView:self.view];
if([touch.view isKindOfClass:[UISegmentedControl class]])
{
NSLog(#"This is UISegment");
}
else if([touch.view isKindOfClass:[UITabBar class]])
{
NSLog(#"This is UITabBar");
} else if(...other views...) {
...
}
}
Once you figure that out, you may be able to narrow down your problem.
It looks as if you're using a category extension to set width/height on views, as well as to center them in their parent. Perhaps there is a hidden issue here; can you refactor to do your layout without this category?
I copied your code into a clean project and ran it in a UITableViewController's viewDidLoad method - it works fine and I have no dead spots like you report. I had to change your code slightly since I don't have the same category extension that you're using.
Also, if you're running this code in viewDidLoad, you should verify that your view has a defined size (you access your view.width). If you're creating your UITableViewController programmatically (vs from a nib/storyboard) then the frame may be CGRectZero. Mine was loaded from a nib so the frame was preset.
I'd also try temporarily removing your border view to see if it's the culprit.
I recommend that you avoid having touch-sensitive UI in such close proximity to the nav bar or toolbar. These areas are typically known as "slop factors" making it easier for users to perform touch events on buttons without the difficulty of performing precision touches. This is also the case for UIButtons for example.
But if you want to capture the touch event before the navigation bar or toolbar receives it, you can subclass UIWindow and override: -(void)sendEvent:(UIEvent *)event;
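A minimal sketch of that interception point (the class name is mine; what you do with the event is up to you):
// Sketch: a window that sees every event before the navigation bar
// or toolbar does.
@interface EventInterceptingWindow : UIWindow
@end
@implementation EventInterceptingWindow
- (void)sendEvent:(UIEvent *)event
{
    // Inspect event.allTouches here before normal dispatch.
    [super sendEvent:event];
}
@end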
An easy way to debug this is to try using DCIntrospect in your project. It's a very easy-to-use library that makes finding out which views are where in the simulator a breeze.
Install the library and configure it
Run the application in the simulator and navigate to the screen with the issue
Press the spacebar on the keyboard (the computer keyboard, not the simulator's keyboard)
Click on the 25% area and see what gets highlighted.
If what's highlighted isn't the segmented view controller, that view could be what's covering up the touch event.
Create a category on UINavigationBar: (add a new file and paste in the code below)
/******** file: UINavigationBar+BelowSpace.h*******/
"UINavigationBar+BelowSpace.h"
#import <Foundation/Foundation.h>
@interface UINavigationBar (BelowSpace)
@end
/*******- file: UINavigationBar+BelowSpace.m*******/
#import "UINavigationBar+BelowSpace.h"
@implementation UINavigationBar (BelowSpace)
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event {
    int errorMargin = 5; // margin by which to shrink the tappable area
    CGRect smallerFrame = CGRectMake(0, 0 - errorMargin, self.frame.size.width, self.frame.size.height);
    BOOL isTouchAllowed = CGRectContainsPoint(smallerFrame, point);
    if (isTouchAllowed) {
        self.userInteractionEnabled = YES;
    } else {
        self.userInteractionEnabled = NO;
    }
    return [super hitTest:point withEvent:event];
}
@end
Hope this helps ^ ^
Try this
self.navigationController!.navigationBar.userInteractionEnabled = false;

View being cut off (landscape vs. portrait orientation)

I have had this problem countless times and cannot figure out how to fix it. I am working in an Xcode project (an empty project, no XIBs!). I have my orientation set to landscape:
But this keeps happening:
The view is being cut off. No matter what I do, it doesn't seem to set to the proper size. For some reason it is displaying the view in landscape using portrait bounds. Does anyone know how to fix this? I also want to restrict the orientation to ONLY landscape.
UPDATE
The view does not get cut off if I hard-code 1024 as the width and 768 as the height. This is obviously a terrible solution, but I cannot figure it out. Does anyone out there know of a solution?
Check the rootViewController that you are setting in application:didFinishLaunchingWithOptions: in your app delegate class. Make sure you are returning the proper allowed orientations in the view controller class whose object you set as the rootViewController:
- (NSUInteger)supportedInterfaceOrientations {
    return UIInterfaceOrientationMaskLandscape;
}
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation {
    return UIInterfaceOrientationIsLandscape(toInterfaceOrientation);
}
In your app delegate add this function :
- (NSUInteger)application:(UIApplication *)application supportedInterfaceOrientationsForWindow:(UIWindow *)window {
    return UIInterfaceOrientationMaskLandscape;
}
I have the answer! One of my friends helped me out with this one.
Views are not oriented until they appear. So, if you are going to add components to the view and expect them to adhere to an orientation other than the default (which I suspect is portrait), you must add those components in -(void)viewDidAppear:(BOOL)animated. I was calling methods that added several components to the view from within viewDidLoad; however, the view has not appeared, and the orientation is not set, when that method is called. Moving my initialization code into viewDidAppear fixed my problem.
Here is an example:
-(void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    // You probably don't want to draw stuff in here, but if you did,
    // it would adhere to the correct orientation!
    CAShapeLayer *layer = [CAShapeLayer layer];
    layer.backgroundColor = [UIColor blueColor].CGColor;
    layer.anchorPoint = CGPointMake(0, 0);
    layer.bounds = CGRectMake(0, 0, self.view.bounds.size.width, 300);
    [self.view.layer addSublayer:layer];
    // Call methods from here
    [self initializeScrollView];
    [self addItemToScrollView];
    [self addGraphToView];
}

iPad Landscape messing up touches began

My app is only allowable in the landscape orientations, and it launches in landscape. Thus, the top left corner when the iPad is landscape is (0,0), so everything works fine.
However, when I pick up "touchesBegan"...it isn't working properly. Only when I tap on like the right two-thirds of the iPad does it pick up the touches. I know it has something to do with the orientation, because the app is literally just blank screen with a single UIView and nothing else. It is quite simple. There are no other possible problems that would cause this.
To be specific, if I print out the x location in the touchesBegan function, and if the iPad is held with the home button on the left, the width is 1024. And 1024 - 768 = 256. This is exactly the x position where it begins to sense my touches. Anything to the left of x = 256 does not sense the touches.
How do I fix this?
Check Struts and Springs and make sure that whatever should pick up the touches is covering the whole area and locked to the 4 sides.
To do it programmatically,
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    CGRect appFrame = [[UIScreen mainScreen] applicationFrame];
    [self.view setFrame:CGRectMake(0.0f, 0.0f, appFrame.size.width, appFrame.size.height)];
    return YES;
    // If you want to support only landscape mode, use the line below instead:
    /*
    return UIInterfaceOrientationIsLandscape(interfaceOrientation);
    */
}
This sets the view to occupy the full screen.
The answer is that, when defining the UIWindow, it needs to be defined as
self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
and not with hard-coded coordinates.
