Track device orientation when orientation is locked - iOS

I need to track device orientation even though the device's orientation is locked to portrait mode. What I really need is behaviour similar to the Instagram camera view: when you rotate the device, the buttons overlaid on the camera view rotate with it, even when orientation is locked.
I used to track orientation with UIDeviceOrientationDidChangeNotification, but that notification is not fired when device orientation is locked :(
Is there perhaps an implementation somewhere using the accelerometer and/or gyroscope? I'm surprised I couldn't find something like that.

Use the accelerometer to detect device orientation yourself. The Core Motion framework gives you the data you need, and there's a sample snippet in the linked docs that shows how to read it. Use a low-pass filter to isolate the force of gravity from relatively short-term changes caused by user movement. Apple has a sample project called AccelerometerGraph that demonstrates this.
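For illustration, here is a minimal sketch of that approach in Swift: raw accelerometer samples are smoothed with a low-pass filter to isolate gravity, and the dominant gravity axis picks the orientation. This keeps working while the orientation lock is on, since it never consults UIDevice. The class name, update rate, and filter factor are assumptions for the example, not anything from the docs:

```swift
import CoreMotion
import UIKit

final class OrientationTracker {
    private let motionManager = CMMotionManager()
    private var gravity = (x: 0.0, y: 0.0)   // low-pass-filtered components
    private let filterFactor = 0.1           // smaller = smoother but slower

    /// Called whenever the detected orientation changes.
    var onOrientationChange: ((UIDeviceOrientation) -> Void)?
    private var lastOrientation: UIDeviceOrientation = .portrait

    func start() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 1.0 / 30.0
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            // Low-pass filter: isolate gravity from short-term user movement.
            self.gravity.x = a.x * self.filterFactor + self.gravity.x * (1 - self.filterFactor)
            self.gravity.y = a.y * self.filterFactor + self.gravity.y * (1 - self.filterFactor)

            // The axis pointing up reads about -1g at rest, so the sign of the
            // dominant component tells us which edge of the device is up.
            let orientation: UIDeviceOrientation
            if abs(self.gravity.y) >= abs(self.gravity.x) {
                orientation = self.gravity.y < 0 ? .portrait : .portraitUpsideDown
            } else {
                orientation = self.gravity.x < 0 ? .landscapeLeft : .landscapeRight
            }
            if orientation != self.lastOrientation {
                self.lastOrientation = orientation
                self.onOrientationChange?(orientation)
            }
        }
    }

    func stop() { motionManager.stopAccelerometerUpdates() }
}
```

You would start this from the camera view controller and rotate the overlay buttons (shutter, flash, etc.) inside onOrientationChange, which is essentially what the Instagram-style camera does.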

Related

How to determine iPhone is being used/unlocked

I have a requirement to detect whether the user is using the iPhone, even while my app is in the background. So far I've found the following workarounds, but none of them detects the unlocking mechanism directly:
applicationProtectedDataWillBecomeUnavailable: tells you whether the user has locked/unlocked the device, but only works if a passcode is set.
Darwin lock/unlock notifications: not accepted on the App Store.
Proximity sensor (UIDeviceProximityStateDidChangeNotification): only fires when the user is on a call.
[[UIScreen mainScreen] brightness]: can be queried in applicationDidEnterBackground to check whether the phone was locked or the home button was pressed. If brightness is greater than 0, the home button was pressed or there was a transition to another app; if it is 0, the device was locked (see the sketch below).
Motion sensors: given the battery drain, checking the device's roll or its acceleration in some coordinate system is a fairly unreliable approach.
UIDeviceOrientationDidChangeNotification: only works while the app is in the foreground.
The technology behind Apple's "pick up to wake" implementation: I have seen that the iPhone 6s brightens the screen when you pick it up. They must be using something more efficient than the accelerometer?
I don't see any other good approaches. Did I miss something?
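For what it's worth, a minimal sketch of the brightness heuristic from the list above; the interpretation of the readings is a heuristic, not anything Apple documents:

```swift
import UIKit

// Sketch: place this in your UIApplicationDelegate. A brightness of 0 at the
// moment of backgrounding suggests a lock; anything above 0 suggests a home
// button press or an app switch.
func applicationDidEnterBackground(_ application: UIApplication) {
    if UIScreen.main.brightness == 0 {
        print("Brightness == 0: device was most likely locked")
    } else {
        print("Brightness > 0: home button press or switch to another app")
    }
}
```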

Google Street View for iOS with device orientation / gyroscope sensor

Is there an easy way, in Google Street View for iOS, to make the GMSPanoramaView's camera follow the device's orientation using data from its motion sensors?
If not, has anyone already done it and can share a code snippet that takes data from Core Motion, manipulates it as needed to create a GMSPanoramaCamera, and passes it to the GMSPanoramaView with animateToCamera:animationDuration:?
Any relevant Android code would also be useful.
Upon checking the Maps SDK for iOS docs (Street View), there is no built-in function/implementation for the device orientation/gyroscope sensor.
According to Ziem's answer, you can try implementing this yourself (create the function). He also gives pointers to study the following:
Set the camera orientation point of view
Animate the camera movements
Reference:
Blog
GitHub
Using these resources, people have successfully created a function that lets you browse Google Street View panoramas with your smartphone/tablet as if you were inside them, just by moving your phone like a window to the world.
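As a rough sketch of what such a function might look like, assuming the GMSPanoramaCamera(heading:pitch:zoom:) initializer and animate(to:animationDuration:) from the Maps SDK for iOS; the mapping from CMAttitude to heading/pitch is approximate and will need tuning for your reference frame and UI orientation:

```swift
import CoreMotion
import GoogleMaps

final class PanoramaMotionController {
    private let motionManager = CMMotionManager()

    func startTracking(panoramaView: GMSPanoramaView) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
        // Reference the attitude to magnetic north so heading behaves like a compass.
        motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical,
                                               to: .main) { motion, _ in
            guard let attitude = motion?.attitude else { return }
            // Crude mapping from attitude (radians) to Street View degrees.
            let heading = -attitude.yaw * 180.0 / .pi
            let pitch = attitude.pitch * 180.0 / .pi - 90.0  // 0 = horizon when held upright
            let camera = GMSPanoramaCamera(heading: heading, pitch: pitch, zoom: 1)
            panoramaView.animate(to: camera, animationDuration: 1.0 / 30.0)
        }
    }

    func stopTracking() { motionManager.stopDeviceMotionUpdates() }
}
```

Using a short animation duration that matches the update interval keeps the camera motion smooth instead of jumping between samples.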

PJSIP 2.4 video orientation change propagation

In my application I am running into issues handling a scenario where the capture device on one side of the conversation changes its orientation, which needs to be reflected on the rendering device on the other side.
I am using iOS and have figured out how to use pjsua_vid_win_rotate() to deal with changes of orientation assuming the capture side of the conversation is using a static orientation. The issue seems to be that the render side of the conversation does not get notified that the orientation of the video being sent to him has changed. What is the proper way to handle this with pjsip?
So basically the problem is as follows:
User A is in portrait.
User B is also in portrait and sets window rotation to 270. This leads to a proper video render.
User A changes orientation to landscape mid call.
User B needs to change his window rotation to 0 to accommodate the change but is not aware a change has been made.
There is an RTP header extension that may be used to carry mobile orientation data (see the summary of existing standards for CVO, Coordination of Video Orientation), but it isn't supported in PJSIP yet.
Alternatively, you may wish to use application-specific RTCP APP packets to transmit the orientation in a custom format (freesoft.org/CIE/RFC/1889/33.htm).
Either option will require changes to the way PJSIP listens to and creates RTP. This can be accomplished by creating a Media Transport Adapter: PJSIP Media Transport Adapter
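To make the second option concrete, here is a hypothetical sketch of packing a rotation angle into an RTCP APP packet following the layout in the linked RFC (version, padding, subtype, PT=204, length, SSRC, 4-byte ASCII name, app data). The "ROTN" name and payload layout are invented for illustration; wiring the packet into the stream still requires the media transport adapter mentioned above:

```swift
import Foundation

func makeOrientationRTCPPacket(ssrc: UInt32, rotationDegrees: UInt16) -> Data {
    var packet = Data()
    packet.append(0x80)                                    // V=2, P=0, subtype=0
    packet.append(204)                                     // PT = 204 (APP)
    // Length field: number of 32-bit words minus one.
    // Header (1) + SSRC (1) + name (1) + app data (1) = 4 words -> 3.
    withUnsafeBytes(of: UInt16(3).bigEndian) { packet.append(contentsOf: $0) }
    withUnsafeBytes(of: ssrc.bigEndian) { packet.append(contentsOf: $0) }
    packet.append(contentsOf: Array("ROTN".utf8))          // 4-byte ASCII name
    withUnsafeBytes(of: rotationDegrees.bigEndian) { packet.append(contentsOf: $0) }
    packet.append(contentsOf: [0x00, 0x00])                // pad app data to 32 bits
    return packet
}
```

On the receiving side you would parse the same layout and call pjsua_vid_win_rotate() with the recovered angle, so user B learns about user A's mid-call rotation.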

Suitable sensors in iPhone to detect user interaction

I'm working on an app, and one of its main features is to detect whether the user is using the phone, interacting with it in any way, or even touching it!
What are the suitable sensors in the iPhone that can help me detect these things?
And how can I use those sensors to implement this feature?
Thanks
The iPhone has accelerometers, gyros, and GPS. With these, you can monitor motion of the phone, sudden shocks (like when the phone is picked up or put down), orientation, and overall movement. If outdoors, you can also use the GPS to pick up on motion and position (latitude, longitude, course, speed, altitude).
When interacting with the app, you've also got touch events and multi-touch events (like using two fingers to zoom in, zoom out, or rotate). Most of the gestures are coded and defined by Apple, so you don't need to infer the user's intent, just respond to the gesture event.
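A small sketch combining the two ideas, assuming hypothetical names and an empirically tuned shock threshold: Core Motion watches for sudden handling of the device, and a predefined gesture recognizer catches on-screen interaction:

```swift
import CoreMotion
import UIKit

final class InteractionMonitor: NSObject {
    private let motionManager = CMMotionManager()

    func start(in view: UIView) {
        // 1. Detect sudden shocks (pick-up / put-down) via user acceleration,
        //    which Core Motion reports with gravity already removed.
        if motionManager.isDeviceMotionAvailable {
            motionManager.deviceMotionUpdateInterval = 1.0 / 50.0
            motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
                guard let a = motion?.userAcceleration else { return }
                let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
                if magnitude > 0.5 {   // threshold in g; tune empirically
                    print("Device handled (shock detected)")
                }
            }
        }
        // 2. Detect on-screen interaction with one of Apple's predefined
        //    gesture recognizers.
        let pinch = UIPinchGestureRecognizer(target: self,
                                             action: #selector(handlePinch(_:)))
        view.addGestureRecognizer(pinch)
    }

    @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        print("User pinched, scale: \(gesture.scale)")
    }
}
```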
Numerous sensor-monitoring apps exist, e.g.:
http://wavefrontlabs.com/Wavefront_Labs/Sensor_Data.html
Tutorials on how to do some of this stuff :
https://www.youtube.com/watch?v=Hml2jB_Qpds
https://www.youtube.com/watch?v=Xk5cJlhePCI
https://www.youtube.com/watch?v=qY4xCMTejH8
https://www.youtube.com/watch?v=7GHc8ySyWcY
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/GestureRecognizer_basics/GestureRecognizer_basics.html

camera view - can settings disrupt button response on a camera view on iPhone 5S?

I have a strange situation...
I implemented an in-app camera based on Apple's AVCam sample. It works just fine. My question is not about the actual camera implementation, but rather: what could cause a view's buttons to work on one iPhone 5S but fail on another? Both are running the same build of the app and have the same iOS version installed (7.0.4), etc.
The problem is...the camera starts and the camera preview displays just fine, but the buttons on that view (i.e. the shutter release, flash options, front/back camera switch, etc) all fail to respond. His iPhone 5S is the only one out of 4 iPhone 5S's that has the problem.
While trying to narrow down what could be different until I can hook the "sad" iPhone 5S up to my debugger in a few days when I see my client again (it's his), we did notice that my phone asked for permission to access my photos and his did not...
Is there perhaps some system setting he could have enabled that would cause this check to be skipped? I ask because I wonder if the camera scene's view controller is waiting for something from that check and therefore hanging the UI.
Any ideas would be appreciated
Finally tracked down the issue...
The difference was that on my developer phone, I had a few hundred to a thousand pictures in my camera roll. On my client's phone, there were about 6,000 pictures, so enumerating them obviously takes longer. If we were patient, the view did eventually come back to life after the enumeration block finished.
Also, I was asking my UICollectionView to scroll to the end (where the newest photos are) after it had finished loading all the camera roll images into itself. With few photos the timing was fine, but with many photos the timing was off and it was trying to scroll before the enumeration had finished. The solution, since there is no "didFinishReloadingData" callback, was to call the scroll method using -performSelector:withObject:afterDelay: to ensure it gets called AFTER the enumeration block and reloadData have finished.
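For reference, a sketch of that timing fix with hypothetical view-controller and hook names. This variant swaps the performSelector delay for a forced layout pass after reloadData, so the scroll runs only once the collection view's content is actually in place rather than guessing a delay:

```swift
import UIKit

final class CameraRollViewController: UIViewController {
    @IBOutlet var collectionView: UICollectionView!

    // Hypothetical hook: called once the asset enumeration block has finished.
    func enumerationDidFinish() {
        DispatchQueue.main.async {
            self.collectionView.reloadData()
            // Force layout so contentSize reflects the reloaded data
            // before we try to scroll to the newest photo.
            self.collectionView.layoutIfNeeded()
            let count = self.collectionView.numberOfItems(inSection: 0)
            guard count > 0 else { return }
            let last = IndexPath(item: count - 1, section: 0)
            self.collectionView.scrollToItem(at: last, at: .bottom, animated: false)
        }
    }
}
```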
