QNX: Using mmrplay to play video on different displays

I am trying to play 2 different videos on 2 different monitors connected to a board running QNX Neutrino. I tried to do this using the mmrplay utility on QNX.
The normal command to play a video on the default primary monitor is:
mmrplay -v screen: /path/to/video.mp4
How do I make another video play on the secondary monitor?
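I was expecting to be able to start one mmrplay instance per display, along these lines (the display selector in the second command is a guess on my part; I have not found a documented output-URL parameter for it):
mmrplay -v screen: /path/to/video1.mp4 &
mmrplay -v "screen:?display=2" /path/to/video2.mp4 &
If the answer instead involves creating a Screen window group on the secondary display and handing it to mmrplay via something like screen:?wingrp=..., an example of that would also help.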

Related

Can iOS Swift play two separate audio files out of two separate pairs of AirPods?

I've been trying to play two different audio files out of two different pairs of AirPods simultaneously from a Swift app. After some trial and error, I've come to the following roadblock:
There can only be one AVAudioSession
The only way for an AVAudioSession to have AVAudioPlayers playing out of different output devices is to set its category to multiRoute
The only way for an AVAudioSession to detect "A2DP" devices for output (such as AirPods) is to set the category option allowBluetoothA2DP
The multiRoute category does not allow for the allowBluetoothA2DP option
Therefore, I feel I am stuck. Is there something I'm missing?
I tried detecting my bluetooth AirPods with a multiRoute AVAudioSession, with no success. I tried adding the allowBluetoothA2DP option to the multiRoute category, and the app crashed.
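For reference, this is roughly the configuration I'm attempting (a minimal Swift sketch; the commented-out line is the combination that crashes for me):

import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // multiRoute is the only category that lets separate players target
    // separate output routes...
    try session.setCategory(.multiRoute, mode: .default, options: [])

    // ...but the option that makes A2DP devices (AirPods) available for output
    // is not accepted together with multiRoute; this is the line that crashes:
    // try session.setCategory(.multiRoute, mode: .default, options: [.allowBluetoothA2DP])

    try session.setActive(true)
} catch {
    print("Audio session configuration failed: \(error)")
}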

How to fill metadata info for the tvOS info panel when using AirPlay?

I'm fairly new to iOS.
I'm able to play streams (not local video) via AVPlayer using AirPlay.
MPNowPlayingInfo and RemoteCommandManager are also supported, using external metadata that is not embedded in the streams.
But I would like to fill the info panel with title, artwork, etc. on Apple TV/tvOS.
The image is part of the WWDC17 talk titled "Now Playing and Remote Commands on tvOS".
My question is not about tvOS apps, which the referenced talk covers, but about an iOS app that plays video via AirPlay.
My guess is that the played AVAsset needs to have metadata, which is currently empty.
I've been looking at AVMutableMetadataItem, but I still don't understand whether that's what I need to use, nor how to do it.
Does anyone have any hints?
The WWDC 2019 talk "Delivering Intuitive Media Playback with AVKit" (https://developer.apple.com/videos/play/wwdc2019/503/) discusses using external metadata during AirPlay, and explains that iOS now provides an API similar to what was already available on tvOS (the relevant part starts around the 7-minute mark). Hope this helps :)
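As a rough Swift sketch of what that looks like in practice (the stream URL and artwork data below are placeholders, and AVPlayerItem.externalMetadata requires iOS 12.2 or later):

import AVFoundation
import CoreMedia

// Placeholders: substitute your actual stream URL and artwork JPEG bytes.
let streamURL = URL(string: "https://example.com/stream.m3u8")!
let artworkData = Data()

let titleItem = AVMutableMetadataItem()
titleItem.identifier = .commonIdentifierTitle
titleItem.value = "My Stream Title" as NSString
titleItem.extendedLanguageTag = "und"

let artworkItem = AVMutableMetadataItem()
artworkItem.identifier = .commonIdentifierArtwork
artworkItem.value = artworkData as NSData
artworkItem.dataType = kCMMetadataBaseDataType_JPEG as String
artworkItem.extendedLanguageTag = "und"

// externalMetadata is what the tvOS info panel reads during AirPlay playback.
let item = AVPlayerItem(url: streamURL)
item.externalMetadata = [titleItem, artworkItem]
let player = AVPlayer(playerItem: item)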

Is there any way to control system volume from an app in Objective-C?

I downloaded an app named "aurioTouch" from Apple for the iOS platform. It captures background sound using the microphone and plays it simultaneously with other media sound (iTunes songs, online songs, etc.). I want to control the captured sound volume and the media sound volume separately. My question is whether that is possible, and if so, how it can be achieved.
Please help. Thanks in advance.
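To make it concrete, here is roughly what I am hoping is possible (a Swift sketch just to illustrate the goal; micCaptureGain is a made-up value I would apply to the captured samples myself, not an existing API):

import MediaPlayer

// A gain I would apply myself to the microphone samples in the render callback,
// independent of the device/media volume (made-up variable, just the idea).
var micCaptureGain: Float = 0.5
// e.g. in the render callback: sample *= micCaptureGain

// The media/system volume: as far as I can tell, Apple only lets the user
// change this through MPVolumeView's slider rather than setting it in code.
let volumeView = MPVolumeView(frame: .zero)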

CoreAudio: Creating kAudioUnitSubType_Reverb2 presets

I have a basic workflow for an iOS app in which .aupreset files are created for various AudioUnits such as AUSampler, delay, etc., and are then loaded in my iOS app.
I need a reverb effect and see that the reverb available on the desktop (which AULab can use) is kAudioUnitSubType_MatrixReverb; however, the reverb that iOS can use is kAudioUnitSubType_Reverb2.
Hence I have no way of designing kAudioUnitSubType_Reverb2 preset files.
Any ideas on how I could load this unit into AULab, or otherwise create the preset files?
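One direction I've been wondering about (just a sketch based on an assumption on my part, namely that an .aupreset file is simply the unit's kAudioUnitProperty_ClassInfo dictionary serialized as an XML plist): instantiate Reverb2 on the iOS device itself, set its parameters in code, and write ClassInfo out as a plist. In Swift that would look roughly like this, with example parameter values and a made-up output path:

import AudioToolbox
import Foundation

// Instantiate the iOS Reverb2 audio unit.
var desc = AudioComponentDescription(componentType: kAudioUnitType_Effect,
                                     componentSubType: kAudioUnitSubType_Reverb2,
                                     componentManufacturer: kAudioUnitManufacturer_Apple,
                                     componentFlags: 0, componentFlagsMask: 0)
guard let comp = AudioComponentFindNext(nil, &desc) else { fatalError("Reverb2 not available") }
var unit: AudioUnit?
AudioComponentInstanceNew(comp, &unit)
guard let reverb = unit else { fatalError("could not instantiate Reverb2") }
AudioUnitInitialize(reverb)

// Example parameter values to bake into the preset.
AudioUnitSetParameter(reverb, AudioUnitParameterID(kReverb2Param_DryWetMix),
                      AudioUnitScope(kAudioUnitScope_Global), 0, 40.0, 0)
AudioUnitSetParameter(reverb, AudioUnitParameterID(kReverb2Param_DecayTimeAt0Hz),
                      AudioUnitScope(kAudioUnitScope_Global), 0, 2.5, 0)

// ClassInfo is the unit's full state as a property list; the caller owns the
// returned object, hence takeRetainedValue(). Serializing it as XML should
// give the same plist structure an .aupreset file contains (my assumption).
var classInfo: Unmanaged<CFPropertyList>?
var size = UInt32(MemoryLayout<Unmanaged<CFPropertyList>?>.size)
AudioUnitGetProperty(reverb, AudioUnitPropertyID(kAudioUnitProperty_ClassInfo),
                     AudioUnitScope(kAudioUnitScope_Global), 0, &classInfo, &size)
if let plist = classInfo?.takeRetainedValue(),
   let data = try? PropertyListSerialization.data(fromPropertyList: plist, format: .xml, options: 0) {
    try? data.write(to: URL(fileURLWithPath: "/tmp/Reverb2.aupreset"))
}

Would that be a sane way to produce the preset files, given that AULab cannot host Reverb2?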

What is the best tech to create an iPad app that both records and plays local videos?

The key functionality of the app would be 1) recording short videos (typically 20-30 sec), 2) playing the videos 1-5 times right after shooting them (slow motion and pausing are a must) and 3) drawing over the videos, i.e. I'd need an additional data layer on top of the raw video.
I've been studying the HTML5 app platforms (PhoneGap, Titanium) because I'd like to minimize writing native iOS code, but it seems both recording and showing embedded video don't work on these platforms. The record-play-edit process is pretty simple, but it needs to be super-smooth and fast.
If you want to use JS/HTML5 and generate the app with e.g. PhoneGap, then one option could be to build a custom PhoneGap plugin for media capture and use HTML5 for the app logic.
Objective-C Media Capture:
http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html#//apple_ref/doc/uid/TP40010188-CH5
Example Phonegap plugin for Audio Capture:
https://github.com/purplecabbage/phonegap-plugins/tree/master/iPhone/AudioRecord
More info about PhoneGap plugin creation for iOS can be found on the PhoneGap wiki.
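The native side of such a plugin is essentially a thin wrapper around AVFoundation capture. A minimal sketch of that setup (shown in Swift for brevity; the plugin classes in the links above are Objective-C, and the output file URL and recording delegate are left as placeholders):

import AVFoundation

// Minimal capture setup: camera + microphone feeding a movie file output.
let session = AVCaptureSession()
session.sessionPreset = .high

if let camera = AVCaptureDevice.default(for: .video),
   let mic = AVCaptureDevice.default(for: .audio),
   let videoInput = try? AVCaptureDeviceInput(device: camera),
   let audioInput = try? AVCaptureDeviceInput(device: mic) {
    if session.canAddInput(videoInput) { session.addInput(videoInput) }
    if session.canAddInput(audioInput) { session.addInput(audioInput) }
}

let movieOutput = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }

session.startRunning()
// movieOutput.startRecording(to: fileURL, recordingDelegate: self) would begin
// writing the 20-30 second clip; stopRecording() ends it.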
