I have managed to support Portrait and Landscape orientations for my app.
However, my implementation is based on UIDeviceOrientationDidChangeNotification; I do not use viewWillTransition(to:with:).
Apple's official AlternateViews sample code uses the notification-based approach, so I assumed it should be fine.
Is there any significant advantage/disadvantage of using one over the other?
Could it ever happen that my app does not receive the device-orientation-change notification?
Please clarify.
I am using a combination of both, but to me viewWillTransition(to:with:) gives more control because you can execute code
before rotation begins
during rotation
after rotation
whereas with the notification you can only execute code after rotation (see the sketch below).
I'm not sure, but I found this through experimentation rather than through the docs.
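For illustration, here is a minimal Swift sketch of where those three hooks fall in viewWillTransition(to:with:); the print statements and the setNeedsLayout() call are just placeholders for whatever work you actually need at each stage.

```swift
import UIKit

class RotationAwareViewController: UIViewController {

    override func viewWillTransition(to size: CGSize,
                                     with coordinator: UIViewControllerTransitionCoordinator) {
        super.viewWillTransition(to: size, with: coordinator)

        // Before rotation begins: the old size is still on screen here.
        print("Will transition to \(size)")

        coordinator.animate(alongsideTransition: { _ in
            // During rotation: runs inside the system's rotation animation,
            // so layout changes made here animate along with it.
            self.view.setNeedsLayout()
        }, completion: { _ in
            // After rotation: the new size is in place.
            print("Finished transitioning to \(size)")
        })
    }
}
```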
Related
I'm trying NSNotificationCenter with this code:
but it does not print anything and never enters the rotate function. Any ideas? Thanks a lot.
Did you confirm that you've enabled the appropriate orientations under App General settings?
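Without seeing the snippet it's hard to be sure, but another common cause is never asking UIDevice to generate the notifications in the first place. A minimal sketch of a working subscription (the selector name rotated is just an example):

```swift
import UIKit

class OrientationObservingViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // Device-orientation notifications are only delivered after this call.
        UIDevice.current.beginGeneratingDeviceOrientationNotifications()

        NotificationCenter.default.addObserver(self,
                                               selector: #selector(rotated),
                                               name: UIDevice.orientationDidChangeNotification,
                                               object: nil)
    }

    @objc func rotated() {
        print("Device orientation changed: \(UIDevice.current.orientation.rawValue)")
    }

    deinit {
        UIDevice.current.endGeneratingDeviceOrientationNotifications()
        NotificationCenter.default.removeObserver(self)
    }
}
```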
I'm trying to detect if an iPhone was moved (even a slight movement), and if the device was moved, I want the iPhone to play a sound.
I'm new to iOS programming. I've looked through a bunch of Apple documentation and found that I must use Core Motion, but I'm not sure how to implement it for what I'd like to do in my app.
I've also searched Google for some help (e.g. NSHipster...), but I couldn't find anything that meets my needs.
Can anyone assist?
Thank you.
It seems that you need the accelerometer data from CMMotionManager.
https://developer.apple.com/library/ios/documentation/CoreMotion/Reference/CMMotionManager_Class/index.html#//apple_ref/occ/instm/CMMotionManager/accelerometerData
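As a rough sketch of that approach (the 0.02 g threshold and the MovementDetector type are illustrative assumptions, not values from Apple's docs), you could compare successive accelerometer samples and treat any change above the threshold as movement:

```swift
import CoreMotion

final class MovementDetector {
    private let motionManager = CMMotionManager()
    private var lastAcceleration: CMAcceleration?

    /// Starts accelerometer updates and calls `onMovement` whenever a reading
    /// differs from the previous one by more than `threshold` (in g) on any axis.
    func start(threshold: Double = 0.02, onMovement: @escaping () -> Void) {
        guard motionManager.isAccelerometerAvailable else { return }

        motionManager.accelerometerUpdateInterval = 0.1
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let current = data?.acceleration else { return }
            defer { self.lastAcceleration = current }
            guard let last = self.lastAcceleration else { return }

            // Compare against the previous sample to catch even slight movements.
            if abs(current.x - last.x) > threshold ||
               abs(current.y - last.y) > threshold ||
               abs(current.z - last.z) > threshold {
                onMovement()
            }
        }
    }

    func stop() {
        motionManager.stopAccelerometerUpdates()
    }
}
```

Keep the detector alive as a property, and play your sound (e.g. with AVAudioPlayer) inside the onMovement closure.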
It seems you want to detect an iPhone shake. Apple has an article on the subject:
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/motion_event_basics/motion_event_basics.html#//apple_ref/doc/uid/TP40009541-CH6-SW2
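If a shake (rather than any slight movement) is enough for your case, the motion-event route from that article is simpler; a minimal sketch:

```swift
import UIKit

class ShakeDetectingViewController: UIViewController {

    // The view controller must be first responder to receive motion events.
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
        if motion == .motionShake {
            print("Device was shaken")  // play the sound here
        }
    }
}
```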
I need to track device orientation even though the device orientation is locked to portrait mode. What I really need is to accomplish behaviour similar to what the Instagram camera view has: when you rotate the device, it rotates the buttons over the camera view, even when your device is locked.
I used to track orientation with UIDeviceOrientationDidChangeNotification, but that is not fired when device orientation is locked :(
Is there perhaps an implementation somewhere using the accelerometer and/or gyroscope? I'm surprised I couldn't find something like that.
Use the accelerometer to detect device orientation yourself. You can use the Core Motion framework to get the data you need. There's a sample snippet in the linked docs that shows how to get the data. Use a low-pass filter to isolate the force of gravity from relatively short-term changes due to user movement. Apple has a sample project called AccelerometerGraph that demonstrates this.
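A rough Swift sketch of that idea (the filter factor, update interval, and axis-to-orientation mapping are illustrative assumptions and may need tuning):

```swift
import CoreMotion
import UIKit

final class OrientationTracker {
    private let motionManager = CMMotionManager()
    private var gravity = CMAcceleration(x: 0, y: 0, z: 0)

    /// Reports the physical orientation even when the UI orientation is locked.
    func start(onChange: @escaping (UIDeviceOrientation) -> Void) {
        guard motionManager.isAccelerometerAvailable else { return }
        var lastOrientation = UIDeviceOrientation.unknown

        motionManager.accelerometerUpdateInterval = 0.1
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }

            // Low-pass filter: keep the slowly changing gravity component and
            // discard short-term changes caused by user movement.
            let k = 0.1
            self.gravity.x = a.x * k + self.gravity.x * (1 - k)
            self.gravity.y = a.y * k + self.gravity.y * (1 - k)

            let orientation: UIDeviceOrientation
            if abs(self.gravity.y) > abs(self.gravity.x) {
                orientation = self.gravity.y < 0 ? .portrait : .portraitUpsideDown
            } else {
                orientation = self.gravity.x < 0 ? .landscapeLeft : .landscapeRight
            }

            if orientation != lastOrientation {
                lastOrientation = orientation
                onChange(orientation)
            }
        }
    }

    func stop() {
        motionManager.stopAccelerometerUpdates()
    }
}
```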
I want to implement all my rotation animations myself, but if I only return YES for UIInterfaceOrientationPortrait in shouldAutorotateToInterfaceOrientation:, I no longer get the willRotateToInterfaceOrientation:duration: and didRotateFromInterfaceOrientation: callbacks.
Can I get notifications for rotations while also disabling the default rotation animations?
You should still be able to subscribe to UIDeviceOrientationDidChangeNotification even if you've disabled view controller autorotation via shouldAutorotateToInterfaceOrientation:.
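The question uses the older shouldAutorotateToInterfaceOrientation: API, but the idea looks roughly like this in more recent Swift (the animation body is just a placeholder for your own rotation effect): lock the interface, then drive your own animations from the notification.

```swift
import UIKit

class CustomRotationViewController: UIViewController {

    // Lock the interface to portrait so UIKit performs no rotation animation.
    override var supportedInterfaceOrientations: UIInterfaceOrientationMask { .portrait }

    override func viewDidLoad() {
        super.viewDidLoad()
        UIDevice.current.beginGeneratingDeviceOrientationNotifications()
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(deviceOrientationDidChange),
                                               name: UIDevice.orientationDidChangeNotification,
                                               object: nil)
    }

    @objc private func deviceOrientationDidChange() {
        // Run your own animation instead of the system rotation,
        // e.g. rotate individual subviews to match UIDevice.current.orientation.
        UIView.animate(withDuration: 0.3) {
            // custom transforms/layout here
        }
    }
}
```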
I know this question is a bit old, but I just thought I'd add a link to a tutorial I found when I was in the same situation you were in (wanting to get notifications for device rotation without allowing the views to rotate).
This tutorial gives a nice and simple overview of how to handle device rotation using UIDeviceOrientationDidChangeNotification.
I was wondering if there is a specific method that gets called when we press the two iPhone buttons (Home button & power on/off) to take a screenshot. If so, I would like to know its name so I can use it in my code.
There used to be a UIGetScreenImage() function that you could use to capture the screen. Apple no longer allows use of that function in App Store apps, so you have a few other options. CALayer has a -renderInContext: method—Google it—that you can use to copy a view’s contents to a graphics context; this does not, however, work for OpenGL content, video, or live imagery from a device’s camera. I’m not sure about solutions for the first two, but for the latter—getting images from the camera—you’ll need to use the AVFoundation framework.
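For the view-rendering route, a minimal Swift sketch of the render(in:) approach mentioned above (the snapshot(of:) function name is just for illustration):

```swift
import UIKit

/// Renders a view hierarchy into a UIImage using CALayer's render(in:).
/// As noted above, this won't capture OpenGL content, video, or the live camera feed.
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    view.layer.render(in: context)
    return UIGraphicsGetImageFromCurrentImageContext()
}
```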
It is a system-level service for which the app never receives any notification or method call.
I believe that would be a native method, not accessible from the iPhone SDK. In what context are you going to be using this? You might be looking for this: Take screenshot from code