PJSIP 2.4 video orientation change propagation - iOS

In my application I am running into issues handling a scenario where the capture device on one side of the conversation changes its orientation, which needs to be reflected on the rendering device on the other side.
I am using iOS and have figured out how to use pjsua_vid_win_rotate() to deal with orientation changes, assuming the capture side of the conversation uses a static orientation. The issue is that the render side of the conversation never gets notified that the orientation of the video being sent to it has changed. What is the proper way to handle this with PJSIP?
So basically the problem is as follows:
User A is in portrait.
User B is also in portrait and sets window rotation to 270. This leads to a proper video render.
User A changes orientation to landscape mid call.
User B needs to change his window rotation to 0 to accommodate the change but is not aware a change has been made.
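To make the receiver side of the fix concrete, the handling reduces to mapping the sender's reported orientation to an angle for pjsua_vid_win_rotate(). This is a minimal sketch: the enum, the degree values, and the mapping itself are assumptions chosen to match the portrait-270 / landscape-0 pairing in the scenario above, and would need calibrating against your actual devices.

```c
/* Hypothetical receiver-side mapping: given the orientation User A
 * reports (however it ends up being signaled), pick the angle User B
 * should pass to pjsua_vid_win_rotate(). All values here are
 * assumptions matching the scenario above, not constants defined by
 * PJSIP. */
typedef enum {
    ORIENT_PORTRAIT,
    ORIENT_LANDSCAPE_LEFT,
    ORIENT_PORTRAIT_UPSIDEDOWN,
    ORIENT_LANDSCAPE_RIGHT
} sender_orientation;

static int window_rotation_for(sender_orientation o)
{
    switch (o) {
    case ORIENT_PORTRAIT:            return 270; /* User B's initial setting */
    case ORIENT_LANDSCAPE_LEFT:      return 0;   /* the change User B misses */
    case ORIENT_PORTRAIT_UPSIDEDOWN: return 90;
    case ORIENT_LANDSCAPE_RIGHT:     return 180;
    default:                         return 0;
    }
}
```

On receiving the new orientation, User B would call pjsua_vid_win_rotate(wid, window_rotation_for(o)) so the rendered video stays upright.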

There is an RTP header extension that can carry mobile orientation data (CVO, Coordination of Video Orientation), but it isn't supported in PJSIP yet; see this summary of existing standards for CVO.
Alternatively, you may use application-specific RTCP APP packets to transmit the orientation in a custom format (freesoft.org/CIE/RFC/1889/33.htm).
Either of these options will require changes to the way PJSIP listens to and creates RTP/RTCP. This can be accomplished by creating a Media Transport Adapter: PJSIP Media Transport Adapter
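As a sketch of the second option, here is what a custom RTCP APP packet carrying the orientation could look like. The framing (PT=204, length counted in 32-bit words minus one) comes from the RTP spec, but the 4-byte name "ORNT" and the payload layout are invented for this example; a media transport adapter would inject such a packet into the outgoing RTCP stream.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical RTCP APP packet (PT=204, RFC 3550/1889) carrying the
 * sender's orientation. The name "ORNT" and the one-word payload are
 * inventions for this sketch; PJSIP does not define them. */
#define RTCP_PT_APP 204

static size_t build_orientation_app(uint8_t *buf, uint32_t ssrc,
                                    uint16_t rotation_deg)
{
    /* 4 words total: header, SSRC, name, payload -> length field = 3 */
    buf[0] = (2u << 6) | 0;       /* V=2, P=0, subtype=0 */
    buf[1] = RTCP_PT_APP;         /* packet type: APP */
    buf[2] = 0; buf[3] = 3;       /* length in 32-bit words minus one */
    buf[4] = (uint8_t)(ssrc >> 24); buf[5] = (uint8_t)(ssrc >> 16);
    buf[6] = (uint8_t)(ssrc >> 8);  buf[7] = (uint8_t)ssrc;
    memcpy(buf + 8, "ORNT", 4);   /* application-defined name */
    buf[12] = 0; buf[13] = 0;     /* reserved */
    buf[14] = (uint8_t)(rotation_deg >> 8);   /* rotation in degrees, */
    buf[15] = (uint8_t)(rotation_deg & 0xff); /* network byte order   */
    return 16;
}
```

The receiver's transport adapter would recognize the "ORNT" name, read the rotation, and call pjsua_vid_win_rotate() accordingly.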

Related

How can I use ARKit while using Slide Over/Split Screen on iPadOS?

I have an app that uses ARKit to detect faces and send the coordinates of interest over the network, which works well. I would like this app to keep running in the background, still sending the data over the network, while I use another app (almost) fullscreen.
The option 'Enable multiple windows' is activated in Info.plist, but as soon as I launch the other app, the ARKit app stops sending information (the app probably stops running altogether).
Is there a simple way to do this, and is it even feasible? Thanks!
This is not possible at this point. Camera and AR features are disabled at a system level when apps are displayed in Slide Over or Split View.
I'd recommend displaying a warning message when Slide Over/Split Screen is in use, telling the user to switch the app to full screen mode. See this answer under a different question for details.

Modifying iPhone Camera/Flashlight?

I was wondering if there's any way to modify how the iPhone takes pictures. Specifically, whether you can turn off the dimmer focus flash that shows up before the full flash, so that when you take a picture only the full flash comes on, immediately. Thanks!
Unfortunately, no, I think not. From the apple developer site:
Flash Modes
There are three flash modes:
AVCaptureFlashModeOff: The flash will never fire.
AVCaptureFlashModeOn: The flash will always fire.
AVCaptureFlashModeAuto: The flash will fire depending on the ambient light conditions.
You use hasFlash to determine whether a device has a flash. If that method returns YES, you then use the isFlashModeSupported: method, passing the desired mode, to determine whether the device supports a given flash mode, then set the mode using the flashMode property.
And there is no mention of the dimmer flash.
EDIT: Maybe you can; there are lots of iOS apps that do this.
2nd Edit: After looking at this, I have concluded that you cannot do so.

How to check whether the device is set to vibration mode or not in iOS programmatically?

I am making a VoIP application for iOS. For incoming calls, I have set some custom ringtones. It's working fine, but I want to check whether the device settings are set to Vibration mode or not programmatically.
I have searched on the web regarding this issue, but I've only found answers for silent mode detection. Instead, I want to check whether the device is in vibration mode or not.
When an incoming call comes to my app, I want to use vibration mode if the device settings are set to vibration mode.
Could anyone help me?
At first look it seems that none of the Audio Session properties allow you to read the value of this setting. However, an alternative, albeit not exactly what you're looking for, is to check whether the ringer is set on or off and provide at least a semi-expected vibration experience to your user.
Ronak Chaniyara pointed you to the right answer here, but that approach is deprecated as of iOS 7.0. Instead, use AVAudioSession's -setCategory: to set the proper category for your audio. If you expect your audio to be muted by the silent switch or screen lock, use AVAudioSessionCategorySoloAmbient; otherwise use AVAudioSessionCategoryPlayback.
More details on AVAudioSession, its settings, and its properties can be found here.

iOS Objective-C: how to display a different video feed on the big screen

According to the first comment at the end of this article:
http://www.imore.com/keynote-iphone-ipad-review
And here:
http://help.apple.com/keynote/ipad/2.2/#/tand1a4ee7c
It seems that you can configure Keynote for iPad such that, when you use a dongle to plug into a projector or big screen, you see the speaker notes on the iPad but not on the big screen.
Is this functionality only afforded to Keynote through some private API at the OS level, or does anyone know of a way of achieving this programmatically? My use case doesn't need to make it into the App Store, so a private API hack could work for me.
No need for private APIs. You can observe UIScreenDidConnectNotification to know when a second screen is connected, whether it's AirPlay or HDMI.
You then provide a view/view controller for that screen.

Track device orientation when orientation is locked

I need to track device orientation even though the device orientation is locked to Portrait mode. What I really need is to accomplish behaviour similar to what Instagram's camera view has: when you rotate the device, it rotates the buttons over the camera view, even when orientation is locked.
I used to track orientation with UIDeviceOrientationDidChangeNotification, but that notification is not fired when device orientation is locked :(
Is there perhaps an implementation somewhere using the accelerometer and/or gyroscope? I'm surprised I couldn't find one.
Use the accelerometer to detect device orientation yourself. You can use the Core Motion framework to get the data you need. There's a sample snippet in the linked docs that shows how to get the data. Use a low-pass filter to isolate the force of gravity from relatively short-term changes due to user movement. Apple has a sample project called AccelerometerGraph that demonstrates this.
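The low-pass idea from the answer can be sketched in a few lines of plain C. ALPHA and the dominant-axis orientation check are illustrative assumptions, not code from Apple's AccelerometerGraph sample:

```c
#include <math.h>

/* Blend each new accelerometer sample with the running estimate so that
 * short-term user movement is smoothed out and the steady pull of
 * gravity remains. Lower ALPHA = smoother but slower to react. */
#define ALPHA 0.1

typedef struct { double x, y, z; } vec3;

static void lowpass_step(vec3 *gravity, vec3 sample)
{
    gravity->x = ALPHA * sample.x + (1.0 - ALPHA) * gravity->x;
    gravity->y = ALPHA * sample.y + (1.0 - ALPHA) * gravity->y;
    gravity->z = ALPHA * sample.z + (1.0 - ALPHA) * gravity->z;
}

/* Coarse orientation from the filtered vector: whichever axis gravity
 * dominates tells you how the device is held, regardless of the lock. */
static int is_landscape(const vec3 *g)
{
    return fabs(g->x) > fabs(g->y);
}
```

Feed each incoming CMAccelerometerData sample through lowpass_step; once the filtered vector settles, the dominant axis gives you the physical orientation independently of the orientation lock, and you can rotate your camera-view buttons accordingly.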
