muteLocalVideoStream not muting the video in agora - ios

I am new to React Native. I am using Agora RTC (3.1.3) for video calling in my app. It works perfectly. We have some actions like switching the camera, muting the video, etc. For muting the video I am using the code below:
const toggleVideo = async () => {
  let mute = vidMute;
  console.log('Video toggle', mute);
  await engine.muteLocalVideoStream(!mute);
  setVidMute(!mute);
};
Here engine is my RtcEngine, created using my App ID. The issue is that muteLocalVideoStream has no effect on my video. I am testing on an iPhone. Please help.

The muteLocalVideoStream method only stops transmission of the video stream to remote users; it does not stop video capture, i.e. the local user can still view their own video normally.
If you want to stop the video on the local user's device as well, you can use the enableLocalVideo method instead: replace the muteLocalVideoStream call with await engine.enableLocalVideo(mute); (note the argument is not negated here, because enableLocalVideo takes true to turn the camera on, whereas muteLocalVideoStream takes true to mute).
I tried out your code on v3.1.6 of react-native-agora and everything works as expected.
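Putting the answer together, a corrected toggle might look like the sketch below. This is only an illustration: it assumes `engine` is an initialised react-native-agora RtcEngine and that `vidMute`/`setVidMute` come from a React useState hook, as in the question.

```javascript
// Sketch only: `engine` is assumed to be an initialised RtcEngine instance
// and vidMute/setVidMute a useState pair tracking whether video is muted.
async function toggleVideo(engine, vidMute, setVidMute) {
  // enableLocalVideo(false) stops capture (the local preview goes dark too);
  // enableLocalVideo(true) re-enables it. When vidMute is false we want to
  // disable the camera, so the flag is passed without negation.
  await engine.enableLocalVideo(vidMute);
  setVidMute(!vidMute);
}
```

Unlike muteLocalVideoStream, this also stops the local preview, which is usually what a "video off" button is expected to do.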

Related

How to start video recording in Android TV (development)

I would like to implement video recording on Android TV, or at least a button that fires the recording event. I added the Gradle dependency, but I could not find any RecordAction or RecordButton, only MultiAction, FastForwardAction, etc.
What I need is a button that, when clicked/pressed, saves the current time of the currently streamed media.
You may follow this documentation. To tell the system that your TV input service supports recording, set the android:canRecord attribute in your service metadata XML file to true:
<tv-input xmlns:android="http://schemas.android.com/apk/res/android"
    android:canRecord="true"
    android:setupActivity="com.example.sampletvinput.SampleTvInputSetupActivity" />
Alternatively, you can indicate recording support in your code using these steps:
In your TV input service onCreate() method, create a new TvInputInfo object using the TvInputInfo.Builder class.
When creating the new TvInputInfo object, call setCanRecord(true) before calling build() to indicate your service supports recording.
Register your TvInputInfo object with the system by calling TvInputManager.updateTvInputInfo().
Regarding a button the user clicks/presses, unfortunately I can't find any samples for that. According to the same reference above, the system invokes the RecordingSession.onStartRecording() callback after it calls RecordingSession.onTune(), and your app must then start recording immediately.

What is the equivalent of Android's MediaCodec.queueInputBuffer() in iOS?

I am trying to play audio with a specific PTS (= DTS, decoding timestamp).
I have tried using AudioQueueEnqueueBufferWithParameters() with the inStartTime parameter to delay the start of playback for each buffer, but that is not working.
I know that on Android, the MediaCodec class's queueInputBuffer() method can queue audio data with a PTS (see the description of MediaCodec.queueInputBuffer).
I want to find an iOS API like MediaCodec's queueInputBuffer() method. If there is no such API on iOS, how can I play each audio buffer at a specific PTS?

Vimeo Froogaloop events not firing on iOS

I'm trying to append an element to the page when a Vimeo embed starts playing. The code I'm using is:
var iframe = $videoObj.find("iframe")[0],
    player = $f(iframe);

function onPlayProgress(id) {
    if (!$videoObj.find(".video-play-pause").length) {
        $videoObj.append(controlHTML);
    }
}

player.addEvent('ready', function() {
    player.addEvent('play', onPlayProgress);
});
The event fires fine on desktop browsers, but it doesn't seem to fire on iOS7. I've also tried the play/pause events, but these don't work either. The ready event seems to be the only one that fires.
Is there a workaround for making the events fire or are they simply unsupported by iOS?
UPDATE
It seems that setting a video ID and using it with the API is what was missing to make it work on iOS (although that's not required in Chrome/Safari on desktop).
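Concretely, what the update likely refers to is Froogaloop's requirement that the embed URL contain api=1 and a player_id query parameter matching the iframe's id attribute, which the library uses to route postMessage events back to the right element. A sketch, with illustrative video and element ids:

```javascript
// Build an embed URL carrying api=1 and a player_id that matches the
// iframe's id attribute (video id and element id are made-up examples).
function buildVimeoSrc(videoId, playerId) {
  return 'https://player.vimeo.com/video/' + videoId +
         '?api=1&player_id=' + encodeURIComponent(playerId);
}

var src = buildVimeoSrc('76979871', 'vimeo-player');
// used as: <iframe id="vimeo-player" src="...">

// The event wiring stays as in the question (browser-only, hence the guard):
if (typeof $f !== 'undefined') {
  var player = $f(document.getElementById('vimeo-player'));
  player.addEvent('ready', function () {
    player.addEvent('play', onPlayProgress);
  });
}
```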

Phonegap / Cordova Stop audio after time when in background IOS

I have an audio app that loops sounds for playback using Cordova 2.2 and its Audio API.
At the moment I have set up a number of loops that stop when finished, based on a predetermined time (calculated from n seconds per loop over 3 hours). This method generally works.
playMainAudio = new Media(url,
    // success callback
    function() {
        console.log("playAudio(): Audio Success");
    },
    // error callback
    function(err) {
        console.log("playAudio(): Audio Error: " + err);
    });

// Play audio
playMainAudio.play({ numberOfLoops: 123, playAudioWhenScreenIsLocked: true });
But I'd prefer a native-code addition where I could just set all audio to stop after 3 hours, rather than working it out based on time, but I'm not sure where to look or even where to place the code. The catch is that it has to work when the device is locked or the app is in the background (currently I have the correct background mode set, so the audio will play in the background).
Is there a native timer that is background-compatible?
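For reference, the loop-count workaround described in the question can be sketched as follows; the 30-second loop length and the asset path are made-up values.

```javascript
// Derive numberOfLoops so that playback stops after roughly three hours.
var LOOP_SECONDS = 30;                // hypothetical length of one loop
var TOTAL_SECONDS = 3 * 60 * 60;      // three hours
var numberOfLoops = Math.floor(TOTAL_SECONDS / LOOP_SECONDS);

// Cordova's Media plugin only exists on-device, hence the guard.
if (typeof Media !== 'undefined') {
  var url = 'sounds/loop.mp3';        // hypothetical asset path
  var playMainAudio = new Media(url,
      function () { console.log('playAudio(): Audio Success'); },
      function (err) { console.log('playAudio(): Audio Error: ' + err); });
  playMainAudio.play({ numberOfLoops: numberOfLoops,
                       playAudioWhenScreenIsLocked: true });
}
```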
If you are keen on editing AppDelegate.m in Objective-C (not your language of choice), you can stop the audio inside
- (void)applicationDidEnterBackground:(UIApplication *)application
but delay the stop using performSelector:withObject:afterDelay:.
See the documentation here:
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIApplicationDelegate_Protocol/Reference/Reference.html
https://developer.apple.com/library/mac/#documentation/Cocoa/Reference/Foundation/Classes/NSObject_Class/Reference/Reference.html
For examples on how to play audio in Objective-C:
How to play a sound in objective C iphone coding
Play Audio iOS Objective-C
How can I Add Audio Player in iphone App

How to play audio in the background in BlackBerry

I have created a Player object as:
player = Manager.createPlayer(inputStream, "audio/mpeg");
and I play the audio as:
player.realize();
player.prefetch();
player.start();
It starts playing the stream. Here inputStream refers to a live streaming URL. My question: when I click the back button, the application closes, so the player also stops playing. I need the audio to keep playing in the background even when the application is closed, and after relaunching the app I don't want to initialise the Player object again; for this I have to maintain the Player object as a singleton. I am using the BlackBerry 4.7 API. Can someone please tell me how all of this is possible?
thanks
venu
Override the "onClose()" method in your Screen class to catch the close event and put your app in the background:
public boolean onClose() {
    Application.getApplication().requestBackground();
    return false;
}
Take a look at the multi-part blog post by Tim Windsor from RIM on running applications in the background.
Part 1
Part 2
Part 3
Part 4
Basically, you need to override the behaviour of the back button and just send your app to the background without closing it. Then the player will continue to play. There are many resources and tutorials on this; maybe something from the links @Ted Hopp posted might help you.
