I am new to React. I am trying to access the mobile camera in my app using getUserMedia.
I can access the camera on Android devices, but not on iOS.
if (!('getUserMedia' in navigator.mediaDevices)) {
  navigator.mediaDevices.getUserMedia = function (constraints) {
    var getUserMedia = navigator.webkitGetUserMedia || navigator.mozGetUserMedia;
    if (!getUserMedia) {
      return Promise.reject(new Error('getUserMedia is not implemented!'));
    }
    return new Promise(function (resolve, reject) {
      getUserMedia.call(navigator, constraints, resolve, reject);
    });
  };
}
navigator.mediaDevices.getUserMedia({
  video: { facingMode: 'user', width: 1200, height: 600 },
  audio: true
}).then((stream) => {
  console.log('recording started');
  this.setState({
    cameraAccessGranted: true,
  });
  return this.startRecording(stream);
}).catch((err) => {
  console.log('getUserMedia error: ' + err);
});
Can anyone please help me with this?
Thanks in advance.
It doesn't matter whether it is React or not; WebRTC does not depend on the framework.
Have you tried checking the examples from https://webrtc.github.io/samples/src/content/devices/input-output/ on your Android device?
They work well for me.
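If it helps, here is a minimal sketch (not tied to React) along the lines of those samples: it feature-detects getUserMedia and lists the available video inputs, which is a quick way to see what iOS Safari actually exposes. The videoPreview element id is just an assumption for illustration.
// Minimal sketch: feature-detect getUserMedia and list the video inputs.
// Assumes a <video id="videoPreview" autoplay playsinline muted></video> on the page.
async function probeCamera() {
  if (!navigator.mediaDevices || !navigator.mediaDevices.getUserMedia) {
    console.log('getUserMedia is not supported in this browser');
    return;
  }
  // Ask for the front camera; iOS Safari only allows this on HTTPS pages.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'user' },
    audio: false
  });
  document.getElementById('videoPreview').srcObject = stream;

  // List the cameras the browser reports (labels appear once permission is granted).
  const devices = await navigator.mediaDevices.enumerateDevices();
  devices
    .filter((d) => d.kind === 'videoinput')
    .forEach((d) => console.log('camera:', d.label || d.deviceId));
}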
I am calling the native camera (iOS/Android) as follows:
async function takePhoto() {
  const photo = await ImagePicker.launchCameraAsync(cameraOptions);
  if (photo.cancelled) {
    return '';
  }
  return photo.uri;
}
Since upgrading from Expo 39 to 42 it is broken (see screenshots: portrait and landscape).
It seems to me that the camera is being opened as a modal. I don't know where to change this.
Expected behaviour:
The camera is displayed fullscreen, like the native camera on iOS.
Update 2021-07-30: In the meantime this has been opened as a bug/issue:
https://github.com/expo/expo/issues/13614
Any ideas or suggestions, especially in terms of a workaround?
Thanks a lot.
I've done a massive upgrade from Expo SDK 37 to Expo SDK 42 and had to change a lot of things around the camera, location and permissions.
I do not experience this behaviour when using the following (I cannot see your import statements or your package versions, but this is what I've implemented and I see no issue):
// Import statements...
import * as ImagePicker from 'expo-image-picker';
import * as FileSystem from 'expo-file-system';
import { Camera } from 'expo-camera';
// Code within Component
const takePicture = async () => {
  // You MUST ask for permissions first.
  const permissions = {
    [Camera]: await Camera.requestPermissionsAsync()
  };
  // If denied, let the user know it's required.
  if (permissions[Camera].status !== 'granted') {
    return Promise.reject(new Error('Camera Permission Required'));
  }
  // Then let them launch the camera and perform any other task
  await ImagePicker.launchCameraAsync({
    allowsEditing: false
  })
    .then(({ uri }) => imageProcesser(uri))
    .then(res => onImageAdded(res))
    .catch((e) => console.log(e));
};
// These are the relevant package versions
"expo-camera": "^11.2.2",
"expo-file-system": "~11.1.3",
"expo-image-picker": "~10.2.2",
"expo": "^42.0.3"
I am new to RTCPeerConnection (WebRTC), so please bear with me.
So far I can replace tracks on the fly by switching the camera or starting screen sharing in my app. But I noticed, testing in two browser tabs, that the newly replaced track is only shown on the partner/remote peer, not in the initiator's tab. The initiator's tab keeps showing the old stream even though the track has been replaced.
It would be nice if the initiator could also see what he/she is sharing. I have tried but had no luck so far. Looking for some assistance.
My code looks like:
function screenShare() {
  (async () => {
    try {
      await navigator.mediaDevices.getDisplayMedia({
        cursor: true
      }).then(stream => {
        // localStream = stream;
        let videoTrack = stream.getVideoTracks()[0];
        // Find the RTCRtpSender that is currently sending video
        var sender = senders.find(function (s) {
          return s.track.kind == videoTrack.kind;
        });
        sender.replaceTrack(videoTrack);
        // When screen sharing stops, switch back to the camera track
        videoTrack.onended = function () {
          sender.replaceTrack(localStream.getTracks()[1]);
        };
      });
    } catch (err) {
      console.log('(async () =>: ' + err);
    }
  })();
}
Thanks in advance.
By design, replaceTrack only replaces the track that is sent over the RTCPeerConnection. It does not affect the local video element. Set the srcObject of the local video element to the new stream to update your own preview.
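A minimal sketch of that, slotted into the getDisplayMedia handler of your screenShare function; the #localVideo element id is an assumption, not something from your code:
// Inside the .then(stream => { ... }) handler, after replaceTrack:
sender.replaceTrack(videoTrack);

// Also update the initiator's own preview element (assumed to be #localVideo).
const localVideo = document.getElementById('localVideo');
localVideo.srcObject = stream;

// When screen sharing ends, switch both the sender and the preview back.
videoTrack.onended = function () {
  sender.replaceTrack(localStream.getTracks()[1]);
  localVideo.srcObject = localStream;
};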
We have an application developed using WebRTC. iOS 11 is said to support WebRTC, but the application is not working in Safari on iOS 11. Is there anything we need to do on our end to support the Safari browser? Do we have to make any changes to the script? Please help.
Have you tried the latest adapter.js, which takes care of browser compatibility?
Regards
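For reference, a minimal sketch of pulling in adapter.js before any WebRTC code runs (either as the npm package or a script tag), followed by a plain getUserMedia call as a quick smoke test in Safari:
// Option 1: as an npm module, imported before any WebRTC code.
import 'webrtc-adapter';

// Option 2: as a script tag, loaded before your own scripts:
// <script src="https://webrtc.github.io/adapter/adapter-latest.js"></script>

// Quick smoke test once adapter.js is loaded.
navigator.mediaDevices.getUserMedia({ video: true, audio: false })
  .then((stream) => {
    console.log('got stream with', stream.getTracks().length, 'track(s)');
  })
  .catch((err) => {
    console.log('getUserMedia failed:', err.name, err.message);
  });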
Here is some sample code that worked for me:
// create the video element first
var video = document.createElement('video');
video.style.width = window.innerWidth + 'px';
video.style.height = window.innerHeight + 'px';
video.setAttribute('autoplay', '');
video.setAttribute('muted', '');
video.setAttribute('playsinline', '');
document.body.appendChild(video);

// set up your constraints
var constraints = {
  audio: false,
  video: {
    facingMode: 'user' // front camera
  }
};

// ask the browser to allow access
navigator.mediaDevices.getUserMedia(constraints).then(function success(stream) {
  video.srcObject = stream;
});
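One more thing worth noting for iOS Safari: getUserMedia is only available on HTTPS pages, and playback tends to be more reliable when it is started from a user gesture. A small, assumed wiring that builds on the sample above (the start button is not part of the original sample):
// Assumed: a <button id="startButton">Start camera</button> somewhere on the page.
document.getElementById('startButton').addEventListener('click', function () {
  navigator.mediaDevices.getUserMedia(constraints)
    .then(function (stream) {
      video.srcObject = stream;
      // Explicitly calling play() inside the click handler helps on iOS.
      return video.play();
    })
    .catch(function (err) {
      console.log('getUserMedia failed: ' + err);
    });
});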
I am developing an iOS app with Apache Cordova in Xcode. The problem is that when I test my app on a device, Cordova handles events far too late. For example, when I tap the share button on iOS, the share sheet appears after about 2 minutes. Another example: I use the AdMob Pro plugin, and the ads appear about 5 minutes after deviceready. I noticed that this problem only exists with Cordova and its plugins' events.
I checked and tried everything but couldn't find a solution. On the Android platform everything works fine.
How can I fix this? Can anybody help me?
The AdMob calls respond after 2-5 minutes. I wrote a simple JavaScript function that shows an alert on deviceready, and that works fine.
The code snippet is below:
function reklamYukle() {
  var admobid = {};
  // TODO: replace the following ad units with your own
  if (/(android)/i.test(navigator.userAgent)) {
    admobid = { // for Android
      banner: 'ca-app-pub-5534669392136777/3711456161',
      interstitial: 'ca-app-pub-5534669392134577/5454702358'
    };
  } else if (/(ipod|iphone|ipad)/i.test(navigator.userAgent)) {
    admobid = { // for iOS
      banner: 'ca-app-pub-5534669392136777/7457497261',
      interstitial: 'ca-app-pub-5534669392136777/5896501200'
    };
  } else {
    admobid = { // for Windows Phone
      banner: 'ca-app-pub-6869992474017983/8878394753'
    };
  }

  AdMob.createBanner({
    adId: admobid.banner,
    position: AdMob.AD_POSITION.BOTTOM_CENTER,
    overlap: false,
    offsetTopBar: true,
    bgColor: 'black',
    autoshow: true
  });

  AdMob.prepareInterstitial({
    adId: admobid.interstitial,
    autoShow: true
  });
}

function onDeviceReady() {
  reklamYukle();
}

document.addEventListener("deviceready", onDeviceReady, false);
I created a live streaming app. I used the Streaming Media plugin to play the videos; it works well on Android but does not work on iOS.
The video player opens successfully, but the video does not play and no errors appear in the console.
Link to the plugin:
playVideo(url) {
  let options: StreamingVideoOptions = {
    successCallback: () => { console.log('Finished Video') },
    errorCallback: (e) => { console.log('Error: ', e) },
    orientation: 'portrait'
  };
  this.streamingMedia.playVideo(url, options);
}
Use the Ionic InAppBrowser instead of the Streaming Media plugin.
InAppBrowser docs:
https://ionicframework.com/docs/native/in-app-browser/
Use this code for iOS:
const browser = this.iab.create(url, '_self');
browser.show();
Use this code for Android:
this.streamingMedia.playVideo(url, options);
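Putting the two together, a minimal sketch of a platform check; it assumes Ionic's Platform service is injected as this.platform alongside this.iab and this.streamingMedia, which is an assumption about your setup:
playVideo(url) {
  if (this.platform.is('ios')) {
    // iOS: open the stream in the in-app browser instead of the media plugin.
    const browser = this.iab.create(url, '_self');
    browser.show();
  } else {
    // Android: the streaming media plugin works as expected.
    const options = {
      successCallback: () => { console.log('Finished Video'); },
      errorCallback: (e) => { console.log('Error: ', e); },
      orientation: 'portrait'
    };
    this.streamingMedia.playVideo(url, options);
  }
}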