I am building a WebRTC application that works fine in every browser and setting except iOS Safari on the iPad. I've pinpointed the problem to the audio level being reported as values like 0.05243 and sometimes -1. Here is my audio track object printed out:
_containerHandlers: {abort: function, canplaythrough: function, ended: function, error: function}
_events: {track.trackMuteChanged: function, track.audioLevelsChanged: function}
_eventsCount: 2
_maxListeners: undefined
_sourceName: undefined
_streamInactiveHandler: function()
addEventListener: function(e,t)
audioLevel: -1
conference: jc {connection: $c, xmpp: Ur, eventEmitter: r, options: Object, eventManager: Qn, …}
containers: [<audio id="ecf6be16audio2">] (1)
disposed: false
handlers: Map {} (0)
hasBeenMuted: false
isP2P: false
muted: false
off: function(e,t)
ownerEndpointId: "ecf6be16"
removeEventListener: function(e,t)
rtc: va {eventEmitter: r, on: function, addEventListener: function, off: function, removeEventListener: function, …}
ssrc: 79236293
stream: MediaStream {listeners: Object, oninactive: function, id: "ecf6be16-audio-1", active: false, onaddtrack: null, …}
track: MediaStreamTrack {listeners: Object, kind: "audio", id: "1260ece4-f00d-4c98-98c3-f5137affeb3a-1", label: "remote audio", enabled: true, …}
type: "audio"
videoType: undefined
volume: 1
Notice that while the volume says 1, the audioLevel is -1. Here are several examples of trying to change the volume; id references the audio HTML element's id, and $(`#${id}`)[0] is the HTMLMediaElement.
if (track && track.type == 'audio') {
  $(`#${id}`)[0].volume = 1;
  track.volume = 1;
  $(`#${id}`)[0].play();

  let AudioContext = window.AudioContext || window.webkitAudioContext;
  if (AudioContext) {
    console.log("Audio Context exists");
    let elSound = $(`#${id}`)[0];
    let audioContext = new AudioContext();
    let gainNode = audioContext.createGain();
    gainNode.gain.value = 1;
    audioContext.createMediaElementSource(elSound)
      .connect(gainNode)
      .connect(audioContext.destination);
    elSound.play();
  } else {
    console.log("Audio Context does not exist");
  }
}
Only the audio fails; video works fine. And it only fails in iOS Safari on the iPad; it works fine everywhere else. Any clues as to what I might be doing wrong?
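One likely culprit on iOS Safari is the autoplay policy rather than the volume itself: an AudioContext created outside a user gesture starts out "suspended", and HTMLMediaElement.play() can silently fail, so remote audio stays inaudible even though volume reads 1. Below is a minimal sketch of a one-shot unlock handler; makeUnlockHandler is a hypothetical helper name, and the element lookup follows the jQuery pattern above.

```javascript
// Hypothetical helper: returns a handler that must run inside a real user
// gesture on iOS, where it resumes the suspended AudioContext and retries
// playback on the media element.
function makeUnlockHandler(audioContext, mediaElement) {
  return function unlock() {
    if (audioContext.state === 'suspended') {
      audioContext.resume();
    }
    mediaElement.play();
  };
}

// In the page, wire it to the first touch (browser only):
// document.addEventListener(
//   'touchend',
//   makeUnlockHandler(audioContext, $(`#${id}`)[0]),
//   { once: true }
// );
```

The key point is that both resume() and play() happen synchronously inside the gesture callback, which is what iOS Safari requires.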
I have already implemented the Twilio dominant speaker detection code, shown below.
The problem in my code is with the dominantSpeakerChanged handler: when any user is added to the room, or mutes/unmutes audio or video, I get a response. But I am not getting any response when someone is talking.
When any user talks during the call, the function is not called. I need an indication of who is speaking whenever a user talks during the call, as in the image below.
Image Link -
https://github.com/Soumojit1995/Issue/blob/main/Google_Meet_QA.max-2000x2000.jpg
Code -
var connectOptions = {
  name: roomName,
  // logLevel: 'debug',
  _useTwilioConnection: true,
  audio: { name: 'microphone' },
  video: { name: 'camera' },
  dominantSpeaker: true
};

if (previewTracks) {
  connectOptions.tracks = previewTracks;
}
// Join the Room with the token from the server and the
// LocalParticipant's Tracks.
Video.connect(data.token, connectOptions).then(roomJoined, function(error) {
});
After Video.connect succeeds, the roomJoined function is called; that is where I register the dominantSpeakerChanged handler.
// Successfully connected!
function roomJoined(room) {
  room.on('dominantSpeakerChanged', participant => {
    console.log('The new dominant speaker in the Room is:', participant);
  });
}
I use the FileTransfer plugin to download a file over an API. After the download I want to open the file with the FileOpener plugin. FileOpener requires the MIME type of the file, but data.type (the MIME type) returns null.
File.resolveLocalFilesystemUrl(cordova.file.cacheDirectory)
  .then((dirEntry) => {
    this.transfer.download(url, path, true, { headers: headers })
      .then((entry) => {
        entry.file((data) => {
          console.log('MimeType: ', data.type); // returns null
        });
      });
  });
It works without setting the MIME type, but I think it is better with it. Does anybody know about this problem? I am working on iOS.
data returns:
end: 683008
lastModified: 1497256786396.6
lastModifiedDate: 1497256786396.6
localURL: "cdvfile://localhost/cache/file.ext"
name: "file.ext"
size: 683008
start: 0
type: null
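Since data.type comes back null on iOS, one common workaround is to derive the MIME type from the file extension before handing the file to FileOpener. A sketch with a small, illustrative extension table follows; mimeTypeFromName is a hypothetical helper, and the table should be extended to whatever types your API actually serves.

```javascript
// Small illustrative subset of extension -> MIME type mappings.
const MIME_BY_EXTENSION = {
  pdf: 'application/pdf',
  jpg: 'image/jpeg',
  jpeg: 'image/jpeg',
  png: 'image/png',
  mp4: 'video/mp4',
  txt: 'text/plain'
};

// Hypothetical helper: looks up the MIME type by file extension, falling
// back to the generic binary type when the extension is unknown.
function mimeTypeFromName(fileName) {
  const ext = fileName.split('.').pop().toLowerCase();
  return MIME_BY_EXTENSION[ext] || 'application/octet-stream';
}

// Usage inside the download callback:
// this.fileOpener.open(entry.toURL(), mimeTypeFromName(data.name));
```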
I want to make a react-native app that is able to:
show live streaming
upload live streaming
save streaming
I have an RTMP URL and a playback URL. I tried to achieve these goals using "react-native-video-stream"; however, the stream doesn't start and there are no apparent errors.
How can I live stream videos in my app, and which library should be used?
Please provide an example / demo app that does live streaming.
I found a simple platform called Mux for creating a live stream, uploading it, and saving it to play later. react-native-nomediaclient will help you stream your video. On the other side, you can just use react-native-video to play the stream.
Here is a blog post describing the whole process.
There are other platforms for creating streams as well, but the point is that you can stream to any of them using the react-native-nomediaclient library.
Update:
Here is the react-native-nomediaclient configuration to create a live stream using Mux:
<NodeCameraView
  style={styles.nodeCameraView}
  ref={(vb) => { this.vb = vb }}
  outputUrl={`rtmp://live.mux.com/app/${this.state.streamId}`}
  camera={{ cameraId: 0, cameraFrontMirror: true }}
  audio={{ bitrate: 32000, profile: 1, samplerate: 44100 }}
  video={{
    preset: 4,
    bitrate: 2000000,
    profile: 2,
    fps: 30,
    videoFrontMirror: false
  }}
  autopreview={true}
/>
To get the streamId:
createLive = async () => {
  const auth = {
    username: MUX_ACCESS_TOKEN,
    password: MUX_SECRET
  };
  const param = {
    "reduced_latency": true,
    "playback_policy": "public",
    "new_asset_settings": { "playback_policy": "public" }
  };
  const res = await axios.post('https://api.mux.com/video/v1/live-streams', param, { auth: auth })
    .catch((error) => {
      throw error;
    });
  console.log(res.data.data);
  const data = res.data.data;
  this.setState({
    streamId: data.stream_key
  });
}
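For the playback side, the same create-live-stream response also carries a playback id (under playback_ids in Mux's API), which viewers can use with react-native-video over HLS. A small sketch, assuming the playback id is read from res.data.data.playback_ids[0].id; muxPlaybackUrl is a hypothetical helper.

```javascript
// Hypothetical helper: Mux serves live streams as HLS at stream.mux.com,
// keyed by the stream's playback id (not the stream_key, which is secret).
function muxPlaybackUrl(playbackId) {
  return `https://stream.mux.com/${playbackId}.m3u8`;
}

// In the viewer app, with react-native-video:
// <Video source={{ uri: muxPlaybackUrl(this.state.playbackId) }} style={styles.player} />
```

Note the distinction: the stream_key goes into the broadcaster's RTMP output URL and should never be shared, while the playback id is what you hand to viewers.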
Update 2
I have also found another platform, called Bambuser, which I think is better than Mux. It provides the easiest installation process for your React Native application and has many advanced features; for example, you can stream to multiple platforms at a time. It provides high-quality audio and video streaming with minimal lag. I have used it in my app and it works without any issues.
Here are the libraries you can use with your React Native application:
react-native-bambuser-player: lets you play the stream in your viewer-side app.
react-native-bambuser-broadcaster: lets you build a broadcaster app that streams video to your viewer-side app.
Follow the installation steps properly and you're good to go.
Also, if you don't want to build your own broadcaster app, they provide apps of their own for creating a live stream. These have most of the features a broadcast app should have; you just log in and they start streaming for your player app.
Bambuser (Android)
Bambuser (iOS)
It also offers a 14-day free trial for testing.
Sample Code
Import the Bambuser player:
import RNBambuserPlayer from 'react-native-bambuser-player';
Declare constants for your credentials:
const BambuserApplicationIds = {
android: 'ANDROID_APPLICATION_ID', // your bambuser android application id
ios: 'IOS_APPLICATION_ID' // your bambuser ios application id
}
const BambuserResourceUri = 'YOUR_BAMBUSER_RESOURCE_URI';
Here are the details on how to get the applicationId and resourceUri.
Render the Bambuser player view:
<RNBambuserPlayer
  style={{ flex: 1 }}
  ref={ref => {
    this.myPlayerRef = ref;
  }}
  applicationId={
    Platform.OS === 'ios'
      ? BambuserApplicationIds.ios
      : BambuserApplicationIds.android
  }
  requiredBroadcastState={
    RNBambuserPlayer.REQUIRED_BROADCAST_STATE.LIVE
  }
  videoScaleMode={RNBambuserPlayer.VIDEO_SCALE_MODE.ASPECT_FILL}
  resourceUri={BambuserResourceUri}
  onTotalViewerCountUpdate={viewer => {
    this.setState({ views: viewer }); // handle viewer-count updates here
  }}
  onPlaying={() => {
    // code to handle when the stream is playing
  }}
  onPlaybackError={error => {
    // handle when some error occurs
    Alert.alert('Error', error.message);
  }}
  onPlaybackComplete={() => {
    // called when the stream is complete; handle it, for example:
    this.setState({ isPlaying: false, isLiveEnded: true }, () => {
      this.props.navigation.setParams({ isLive: false });
    });
  }}
  onStopped={() => {
    // called when the stream stops
    this.setState({ isPlaying: false }, () => {
      this.props.navigation.setParams({ isLive: false });
    });
  }}
/>
You can read more about the props here.
Requesting the mic with audio: true works perfectly fine; however, requesting webcam video or screen video fails with SourceUnavailableError. I have added the browser's domain to the preference media.getusermedia.screensharing.allowed_domains.
Code for requesting webcam video:
var param = {
  // audio: true, // if just this, it works fine
  video: true // {mediaSource: 'screen'} does not work either
};

navigator.mediaDevices.getUserMedia(param)
  .then(function(stream) {
    console.log('success');
  })
  .catch(function(err) {
    console.error('err:', err);
  });
To reproduce this error, open the browser console and paste the code above.
Error given is:
err: MediaStreamError { name: "SourceUnavailableError", message: "Failed to allocate videosource", constraint: "", stack: "" }
Do you know how I can get around this SourceUnavailableError?
I did a lot of digging here - https://dxr.mozilla.org/mozilla-central/source/browser/modules/webrtcUI.jsm#201 - but no success yet.
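For what it's worth, current Firefox exposes screen capture through navigator.mediaDevices.getDisplayMedia rather than a mediaSource constraint plus the allowed_domains preference, and it must be invoked from a user gesture (e.g. a click handler), not pasted into the console. A minimal sketch follows; mediaDevices is injected as a parameter only to make the flow explicit, and in a page you would pass navigator.mediaDevices.

```javascript
// Sketch: prompt the user to pick a screen or window to share.
// Returns a promise resolving to a MediaStream (or rejecting, e.g. if the
// user denies the prompt or the call is not gesture-initiated).
function captureScreen(mediaDevices) {
  return mediaDevices.getDisplayMedia({ video: true });
}

// Usage in the page, from inside a click handler:
// captureScreen(navigator.mediaDevices)
//   .then(stream => console.log('success', stream))
//   .catch(err => console.error('err:', err));
```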
Thanks
This seems to be a recurring issue, as there have been a number of similar posts on the YouTube API Google Group over the past couple of years. Using the YouTube iFrame API (I've only tried HTML5, not Flash), the onStateChange event fires correctly (and calls the specified handler), but the data property passed is undefined. I'm running OS X Mountain Lion and the error occurs in both Chrome (23.0.1271.64) and Safari (6.0.2).
Any thoughts on what the issue might be would be most appreciated.
Selected code snippets
myapplication.controllers.Player = function(containerId, videoId) {
  [...]
  this.player = new YT.Player(this.currentView_.playerContainer_, {
    videoId: videoId,
    playerVars: { 'showinfo': 0, 'modestbranding': 1, 'rel': 0 },
    suggestedQuality: 'medium'
  });
  [...]
  goog.events.listen(this.player, 'onStateChange', this.onPlayerStateChange_,
      false, this);
  [...]
};
myapplication.controllers.Player.prototype.onPlayerStateChange_ = function(e) {
  if (e.data == 1) {
    this.startTimer_(100);
  } else {
    this.stopTimer_();
  }
};
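One likely cause: goog.events.listen registers a Closure listener, but YT.Player dispatches its own events through the iframe API's event system, so the handler ends up being invoked without the API's event object and e.data is undefined. Registering the handler through the API's own events config (or player.addEventListener) should deliver a populated event. A sketch, with buildPlayerConfig as a hypothetical helper around the constructor arguments shown above:

```javascript
// Hypothetical helper: same player options as in the question, but with the
// state-change handler registered via the iframe API's `events` config.
function buildPlayerConfig(videoId, onStateChange) {
  return {
    videoId: videoId,
    playerVars: { 'showinfo': 0, 'modestbranding': 1, 'rel': 0 },
    events: { 'onStateChange': onStateChange }
  };
}

// In the Player controller, binding `this` with Closure:
// this.player = new YT.Player(this.currentView_.playerContainer_,
//     buildPlayerConfig(videoId, goog.bind(this.onPlayerStateChange_, this)));
```

With this registration, the handler receives the API's event object, and comparing e.data against YT.PlayerState.PLAYING (rather than the literal 1) makes the intent clearer.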