I develop an application for BlackBerry. On the Storm 1 (OS 4.7) and Storm 2 (OS 5.0) I need to disable the accelerometer in my application. I want my application not to react to the accelerometer, but without affecting other applications. Is this possible?
There is no way to disable the accelerometer, but you can lock the screen orientation:
// To force portrait view in a BlackBerry API application,
// use code like this before invoking UiApplication.pushScreen().
// Requires net.rim.device.api.ui.Ui and net.rim.device.api.system.Display.
int directions = Display.DIRECTION_NORTH | Display.DIRECTION_SOUTH;
Ui.getUiEngineInstance().setAcceptableDirections(directions);
See "Specifying the orientation and direction of the screen" in the BlackBerry developer documentation.
I have an app that uses ARKit to detect faces and send the coordinates of interest over the network, which works well. I would like this app to keep running in the background, still sending the data, while I use another app (almost) full screen.
The option 'Enable multiple windows' is activated in Info.plist, but as soon as I launch the other app, the ARKit app stops sending information (the app probably stops entirely).
Is there a simple way to do this, and is it at least feasible? Thanks!
This is not possible at this point. Camera access and ARKit are disabled at the system level when an app is displayed in Slide Over or Split View.
I'd recommend displaying a warning message when Slide Over/Split View is in use, telling the user to switch the app to full-screen mode. See this answer under a different question for details.
I'm trying to develop my first application, which will just be sideloaded onto my device. I am unsure whether what I'm looking for is even possible: I want to change the device's current location from within my application. I know this can be done with the Xcode debugger, but can it be done solely in the application, with no connection to the computer? Basically, I have a few buttons, and I want each one to change the location to somewhere different. Do I just set the coordinates in the button handler, and that's all? Will this show up in my Maps application and so on? Thanks.
What you ask is not possible. The GPS hardware is what reports location to the phone. Xcode is able to do this because it reports a location to the simulator, which has no GPS hardware of its own.
In my application I am running into issues handling a scenario where the capture device on one side of the conversation changes its orientation, which needs to be reflected on the rendering device on the other side.
I am using iOS and have figured out how to use pjsua_vid_win_rotate() to handle orientation changes, assuming the capture side of the conversation uses a static orientation. The issue is that the render side of the conversation is not notified that the orientation of the video being sent to it has changed. What is the proper way to handle this with PJSIP?
So basically the problem is as follows:
User A is in portrait.
User B is also in portrait and sets window rotation to 270. This leads to a proper video render.
User A changes orientation to landscape mid call.
User B needs to change his window rotation to 0 to accommodate the change but is not aware a change has been made.
There is an RTP header extension that can carry mobile orientation data, but it isn't supported in PJSIP yet; see this summary of existing standards for CVO (Coordination of Video Orientation).
Alternatively, you can use application-specific RTCP APP packets to transmit the data in a custom format (freesoft.org/CIE/RFC/1889/33.htm).
Either one of these options will require changes to the way PJSIP listens to and creates RTP. This can be accomplished by creating a Media Transport Adapter: PJSIP Media Transport Adapter
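For a feel of what would travel over the wire with the first option: the CVO extension (defined in 3GPP TS 26.114) carries the orientation in a single byte laid out as 0 0 0 0 C F R1 R0, where C is the camera indicator, F is a horizontal-flip flag, and R1 R0 give the rotation in 90-degree steps. A minimal sketch of packing and unpacking that byte; the helper names are mine, not PJSIP APIs:

```javascript
// CVO byte layout (per 3GPP TS 26.114): 0 0 0 0 C F R1 R0
// C = camera indicator, F = horizontal flip, R1 R0 = rotation in 90° steps.
// Illustrative helpers only; PJSIP has no built-in CVO support.
function packCvo(camera, flip, rotationDegrees) {
  const r = (rotationDegrees / 90) & 0x03; // 0..3 -> 0/90/180/270 degrees
  return ((camera & 1) << 3) | ((flip & 1) << 2) | r;
}

function unpackCvo(byte) {
  return {
    camera: (byte >> 3) & 1,
    flip: (byte >> 2) & 1,
    rotationDegrees: (byte & 0x03) * 90,
  };
}
```

On the receive side, the unpacked rotation would drive the same pjsua_vid_win_rotate() call the asker already uses.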
I'm working on an app, and one of its main features is detecting whether the user is using the phone, interacting with it in any way, or even touching it!
Which sensors in the iPhone can help me detect these things?
And how can I use these sensors to implement this feature?
Thanks
The iPhone has accelerometers, gyroscopes, and GPS. With these, you can monitor the motion of the phone, sudden shocks (like when the phone is picked up or put down), orientation, and overall movement. If outdoors, you can also use the GPS to track motion and position (latitude, longitude, course, speed, altitude).
When interacting with the app, you've also got touch events and multi-touch events (like using two fingers to zoom in, zoom out, or rotate). Most of the gestures are coded and defined by Apple, so you don't need to figure out the user's intent, just respond to the event.
Numerous sensor-monitoring apps exist, e.g.:
http://wavefrontlabs.com/Wavefront_Labs/Sensor_Data.html
Tutorials on how to do some of this stuff:
https://www.youtube.com/watch?v=Hml2jB_Qpds
https://www.youtube.com/watch?v=Xk5cJlhePCI
https://www.youtube.com/watch?v=qY4xCMTejH8
https://www.youtube.com/watch?v=7GHc8ySyWcY
https://developer.apple.com/library/ios/documentation/EventHandling/Conceptual/EventHandlingiPhoneOS/GestureRecognizer_basics/GestureRecognizer_basics.html
I am trying to lock device rotation at certain points in my application. For example, menu navigation could be done in both portrait and landscape mode, but gameplay would be landscape-only.
Is there any way to achieve this currently with PhoneGap, short of brute-force detecting rotation with jQuery etc. and counter-rotating the page?
It would be nice if there were an API call to lock or unlock rotation.
P.S. I am using iOS 7 and PhoneGap 3.
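A sketch of how such per-screen locking could look, assuming a Cordova screen-orientation plugin that exposes the W3C Screen Orientation API as screen.orientation (the plugin and API shape are an assumption, not verified against PhoneGap 3). The orientation object is passed in so the policy logic can be exercised off-device:

```javascript
// Hypothetical per-screen rotation policy for a Cordova/PhoneGap app.
// 'orientation' is expected to look like the W3C screen.orientation object
// (as provided by a screen-orientation plugin) with lock()/unlock() methods.
function applyRotationPolicy(orientation, screenName) {
  if (screenName === 'gameplay') {
    orientation.lock('landscape'); // gameplay: landscape only
  } else {
    orientation.unlock();          // menus: both orientations allowed
  }
}
```

In the app itself this would be called on scene changes, e.g. applyRotationPolicy(screen.orientation, 'gameplay') when entering the game.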