Camera video not working - React Native - iOS

Question
I have been trying for some time to figure out why this is not working. I have tried a lot of example code, but I still cannot figure it out.
Code
takeVideo() {
  console.log('started to take video');
  this.camera.capture({
    audio: true,
    mode: Camera.constants.CaptureMode.video,
    target: Camera.constants.CaptureTarget.disk
  }).then((data) => {
    this.setState({ path: data.path });
    console.log(data);
  }).catch((err) => console.log(err));
}

stopVideo() {
  this.camera.stopCapture();
  console.log(this.state.path);
}

renderCamera() {
  return (
    <View>
      <Camera
        ref={(cam) => {
          this.camera = cam;
        }}
        style={styles.preview}
        aspect={Camera.constants.Aspect.fill}
        captureTarget={Camera.constants.CaptureTarget.disk}
        captureMode={Camera.constants.CaptureMode.video}
      >
        <TouchableHighlight
          style={styles.capture}
          onPressIn={this.takeVideo.bind(this)}
          onPressOut={this.stopVideo.bind(this)}
          underlayColor="rgba(255, 255, 255, 0.5)"
        >
          <View />
        </TouchableHighlight>
      </Camera>
    </View>
  );
}
What's not working
When I console.log(this.state.path) it outputs false, which means the state never changed and the video did not record.
Info
This is on iOS.
This works if I change Camera.constants.CaptureMode.video to Camera.constants.CaptureMode.still (.video => .still)
RN version:
react-native-cli: 2.0.1
react-native: 0.44.0
Repo
I found this repo that is trying to do almost the exact same thing as me and is having the same issue. Here is the repo: https://github.com/MiLeung/record

Everything in your code is fine; however, you're missing one important thing.
this.camera.capture({
  audio: true,
  mode: Camera.constants.CaptureMode.video,
  target: Camera.constants.CaptureTarget.disk
}).then((data) => {
  this.setState({ path: data.path });
  console.log(data);
}).catch((err) => console.log(err));
In the code above, you tell React to store path in state only after the capture data has been saved.
But here:
stopVideo() {
  this.camera.stopCapture();
  console.log(this.state.path);
}
you're reading path before the data has been saved.
Just try this:
this.camera.capture({
  audio: true,
  mode: Camera.constants.CaptureMode.video,
  target: Camera.constants.CaptureTarget.disk
}).then((data) => {
  this.setState({ path: data.path }, () => {
    console.log(this.state.path); // path is set once setState has applied
  });
  console.log(data);
}).catch((err) => console.log(err));
The stopCapture function tells the native code to stop recording and save the video, which can take some time, so reading this.state.path immediately after stopCapture does not work.
For more info check this out https://developer.mozilla.org/pl/docs/Web/JavaScript/Reference/Global_Objects/Promise
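To make the timing concrete, here is a minimal plain-JavaScript mock of this flow (makeCamera, the camera object, and the '/tmp/video.mov' path are illustrative stand-ins, not the real react-native-camera API): reading the path right after stopCapture() returns sees the old value, while the .then() callback sees the saved one.

```javascript
// Mock: capture() resolves only after stopCapture() lets the "native
// side" finish writing the file, one timer tick later.
function makeCamera() {
  let resolveCapture;
  return {
    capture() {
      return new Promise((resolve) => { resolveCapture = resolve; });
    },
    stopCapture() {
      // Simulate the native code saving the video asynchronously.
      setTimeout(() => resolveCapture({ path: '/tmp/video.mov' }), 0);
    },
  };
}

const state = { path: false };
const camera = makeCamera();

const done = camera.capture().then((data) => {
  state.path = data.path; // only available once the promise resolves
  return state.path;
});

camera.stopCapture();
console.log(state.path); // still false -- the promise has not resolved yet

done.then((path) => console.log(path)); // '/tmp/video.mov'
```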

Related

react-native-webrtc Mic not closing after video call on iOS

Our iOS app has audio video calling implemented using the following technologies:
"react-native": "0.63.4"
"react": "16.13.1"
"react-native-webrtc": "^1.87.3"
"react-native-incall-manager": "^3.3.0"
iOS version 14.4.1
Our calling module works like the following:
First request and initiate audio call
Then request and initiate video call
On the code side things work like this:
We call the getStream() function, which gets the user media for the audio call (audio only)
Then we call the startStream() function which connects the peer connection
On requesting video we call the getVideoStream() method to get Audio and Video streams
Call startStream() again to start peer connection with video
The scenario is as follows:
We start off by connecting an audio call. On success the audio call is connected and works fine as expected
We request for video and connect video call, all works fine as expected and I receive video on both ends
When I disconnect the call and stop tracks using this.state.localStream.getTracks(), the mic does not close. An orange indicator for mic is visible on iOS.
Important Notes:
Disconnecting from the audio call closes the mic just fine
Even if we get video stream on audio call and disconnect without connecting video it still works fine and closes both tracks
It's only when I connect the video that the issue arises.
Calling InCallManager.stop() closes the mic, but the mic then does not open on the second call and the orange mic indicator on iOS is not shown.
Get User Media Audio Call
getStream() {
  InCallManager.setSpeakerphoneOn(false);
  InCallManager.setMicrophoneMute(false);
  mediaDevices.enumerateDevices().then((sourceInfos) => {
    let videoSourceId;
    for (let i = 0; i < sourceInfos.length; i++) {
      const sourceInfo = sourceInfos[i];
      if (
        sourceInfo.kind === 'videoinput' &&
        sourceInfo.facing === (true ? 'front' : 'back')
      ) {
        videoSourceId = sourceInfo.deviceId;
      }
    }
    mediaDevices
      .getUserMedia({
        audio: true,
      })
      .then((stream) => {
        this.setState({
          localStream: stream,
        });
      })
      .catch((error) => {
        // Log error
        console.log('stream get error', error);
      });
  });
}
Get User Media for Video Call
getVideoStream() {
  this.state.peerConn.removeStream(this.state.localStream);
  InCallManager.setSpeakerphoneOn(false);
  InCallManager.setMicrophoneMute(false);
  mediaDevices.enumerateDevices().then((sourceInfos) => {
    let videoSourceId;
    for (let i = 0; i < sourceInfos.length; i++) {
      const sourceInfo = sourceInfos[i];
      if (
        sourceInfo.kind === 'videoinput' &&
        sourceInfo.facing === (true ? 'front' : 'back')
      ) {
        videoSourceId = sourceInfo.deviceId;
      }
    }
    mediaDevices
      .getUserMedia({
        audio: true,
        mirror: true,
        video: {
          mandatory: {
            minWidth: 500,
            minHeight: 300,
            minFrameRate: 30,
          },
          facingMode: true ? 'user' : 'environment',
          optional: videoSourceId ? [{ sourceId: videoSourceId }] : [],
        },
      })
      .then((stream) => {
        this.setState(
          {
            localStream: stream,
          },
          () => {
            this.startStream();
          },
        );
      })
      .catch((error) => {
        // Log error
        console.log('stream get error', error);
      });
  });
}
Start Stream Function
startStream() {
  console.log('start Stream');
  this.newPeerConnection();
  setTimeout(() => {
    this.state.peerConn
      .createOffer()
      .then((sessionDescription) =>
        this.setLocalAndSendMessage(sessionDescription),
      )
      .catch((error) => this.defaultErrorCallback(error));
  }, 3000);
}
newPeerConnection()
newPeerConnection() {
  var peerConn = new RTCPeerConnection({
    iceServers: turnServer,
  });
  peerConn.onicecandidate = (evt) => {
    console.log(`OnIceCan`);
    if (evt.candidate) {
      this.state.connection.invoke(
        'addIceCandidate',
        parseInt(this.state.ticket.pkTicketId),
        JSON.stringify({
          type: 'candidate',
          sdpMLineIndex: evt.candidate.sdpMLineIndex,
          sdpMid: evt.candidate.sdpMid,
          candidate: evt.candidate.candidate,
        }),
      );
    }
  };
  peerConn.addStream(this.state.localStream);
  peerConn.addEventListener(
    'addstream',
    (stream) => {
      InCallManager.setForceSpeakerphoneOn(false);
      this.setState({
        isSpeakerEnabled: false,
      });
      this.setState({
        remoteStream: stream,
        showAudioCallTimer: true,
      });
    },
    false,
  );
  this.setState({
    peerConn,
  });
}
Close Tracks
if (this.state.localStream) {
  const tracks = this.state.localStream.getTracks();
  tracks.forEach((track) => track.stop());
}
if (this.state.peerConn) {
  this.state.peerConn.removeStream(this.state.localStream);
  this.state.peerConn.close();
  if (!this.state.callRatingSubmitted && this.state.remoteStream) {
    this._handleCallFeedbackModal(true);
  }
}
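One pattern sometimes used for this kind of mic leak, offered here as a hedged sketch rather than a confirmed fix: after the audio-to-video renegotiation, the peer connection may still hold a sender for a track that is no longer in this.state.localStream, so stopping only the stream's tracks leaves that track live. The sketch below assumes the peer connection exposes the standard getSenders() API (available in recent react-native-webrtc versions); stopAllLocalTracks is a hypothetical helper name, not part of the original app.

```javascript
// Stop every local track we might be holding: the tracks of the current
// local stream, plus any tracks still attached to the peer connection's
// senders (which can differ after renegotiation).
function stopAllLocalTracks(peerConn, localStream) {
  if (localStream) {
    localStream.getTracks().forEach((track) => track.stop());
  }
  if (peerConn && typeof peerConn.getSenders === 'function') {
    peerConn.getSenders().forEach((sender) => {
      if (sender.track) {
        sender.track.stop();
      }
    });
  }
}
```

This would replace the `getTracks()` loop in the close-tracks code above, before `peerConn.close()` is called.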

React native: crash while requesting permissions

I'm following this tutorial to get user's location on iOS React Native app:
https://hackernoon.com/react-native-basics-geolocation-adf3c0d10112
Using this code to get current location:
navigator.geolocation.getCurrentPosition((position) => {
  console.log(position); // TBD
  this.setState({ location: true });
}, (error) => {
  console.log(error); // Handle this
  this.setState({ location: false });
}, {
  enableHighAccuracy: true,
  timeout: 20000,
  maximumAge: 1000,
});
But the app crashes at this file:
PermissionsAndroid.js:
const shouldShowRationale = await NativeModules.PermissionsAndroid.shouldShowRequestPermissionRationale(
with error:
TypeError: Cannot read property shouldShowRequestPermissionRationale of undefined at PermissionsAndroid.Request$
But I'm not even running on Android - I'm running iOS.
Could this be an RN bug, or is it how I'm using it?
Just had to request permissions first:
await navigator.geolocation.requestAuthorization();
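Putting the fix together with the original call, a sketch of the full flow (getPositionWithAuth is a hypothetical wrapper name; it assumes, as the line above does, that requestAuthorization() can be awaited):

```javascript
// Hypothetical wrapper: request permission first, then resolve with the
// position. `geo` is the geolocation object (navigator.geolocation in
// the app), passed in explicitly so the flow is easy to follow.
async function getPositionWithAuth(geo) {
  await geo.requestAuthorization(); // triggers the iOS permission prompt
  return new Promise((resolve, reject) => {
    geo.getCurrentPosition(resolve, reject, {
      enableHighAccuracy: true,
      timeout: 20000,
      maximumAge: 1000,
    });
  });
}
```

In the app this would be called as `getPositionWithAuth(navigator.geolocation).then((position) => ...)`.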

react-mic: Recommended Options to Convert weba Audio to MP3, MP4, or WAV

I recently incorporated react-mic in my app. My app uses React on the front-end and Rails on the backend. While I do love using react-mic, I'm having trouble converting the audio blob to another format, particularly mp3, mp4, or even wav.
I reviewed the react-mic documentation and GitHub issues to see if there were any recommended courses of action, but all I could find is that developers need some solution outside of the library to take care of the conversion. The react-mic author is also looking into format conversion as a future enhancement. I'm new to all this and would LOVE to hear how others are handling it. Please note the following: my React code takes the audio blob and sends it to my Rails back-end as part of a POST fetch request, and the back-end uses CarrierWave to upload the audio file. Seeing the popularity of react-mic, I was hoping for guidance on how to take care of the conversion.
Here is my React code using react-mic library:
import React from 'react';
import { ReactMic } from 'react-mic';

const hasGetUserMedia = !!(navigator.getUserMedia || navigator.webkitGetUserMedia ||
  navigator.mozGetUserMedia || navigator.msGetUserMedia);

class AudioMic extends React.Component {
  constructor(props) {
    super(props);
    this.state = {
      recordAudio: false,
      blobAudio: null,
      blobURL: null
    };
  }

  startRecording = () => {
    this.setState({ recordAudio: true });
  }

  stopRecording = () => {
    this.setState({ recordAudio: false });
  }

  onStart = () => {
    console.log('You can tap into the onStart callback');
  }

  onStop = (blobAudio) => {
    this.setState({
      blobAudio: blobAudio,
      blobURL: blobAudio.blobURL
    });
    this.onUpload();
  };

  onUpload = () => {
    let reader = new FileReader();
    reader.onload = (event) => {
      // Save audio blob in FormData and use FileReader to get it into the proper format
      let formData = new FormData();
      formData.append('audio', event.target.result);
      fetch('/api/v1/user_response', {
        credentials: 'same-origin',
        method: 'POST',
        body: formData,
        headers: {
          'Accept': 'application/json, */*'
        }
      }).then(response => {
        if (response.ok) {
          return response;
        } else {
          let errorMessage = `${response.status} (${response.statusText})`;
          throw new Error(errorMessage);
        }
      })
        .then(response => response.json())
        .then(body => {
          console.log('MADE IT HERE');
          console.log(body);
        })
        .catch(error => console.error(`Error in fetch: ${error.message}`));
    };
    reader.readAsDataURL(this.state.blobAudio.blob);
  };

  componentDidMount = () => {
    if (!hasGetUserMedia) {
      alert('Your browser cannot stream from your audio. Please switch to Chrome or Firefox.');
    }
  };

  render() {
    return (
      <div>
        <ReactMic
          className='oscilloscope'
          record={ this.state.recordAudio }
          backgroundColor='#FF4081'
          visualSetting='sinewave'
          audioBitsPerSecond={ 128000 }
          onStop={ this.onStop }
          onStart={ this.onStart }
          strokeColor='#000000'
        />
        <div>
          <audio ref='audioSource' controls='controls' src={ this.state.blobURL }></audio>
        </div>
        <Button animated='fade' onClick={ this.startRecording }>
          <Button.Content visible>Start Recording</Button.Content>
          <Button.Content hidden><Icon name='microphone' /></Button.Content>
        </Button>
        <Button animated='fade' onClick={ this.stopRecording }>
          <Button.Content visible>Stop Recording</Button.Content>
          <Button.Content hidden><Icon name='stop' /></Button.Content>
        </Button>
        <Button animated='fade' onClick={ this.onUpload }>
          <Button.Content visible>Upload Response</Button.Content>
          <Button.Content hidden><Icon name='cloud upload' /></Button.Content>
        </Button>
      </div>
    );
  }
}
Here is my Rails uploader for the audio file:
class UserResponseUploader < CarrierWave::Uploader::Base
  include CarrierWave::Audio

  if Rails.env.test?
    storage :file
  else
    storage :fog
  end

  def store_dir
    "uploads/#{model.class.to_s.underscore}/#{mounted_as}/#{model.id}"
  end
end
Any suggestions? I tried using carrierwave-audio to convert to mp3, but I found myself going down a rabbit hole that mentioned using sox for the audio conversion. Is this something that could be taken care of with another JS library on the front-end? All help is appreciated. Thanks in advance.
Okay, I know this is such an old question but I also greatly struggled with this problem as well. So I think others will benefit from the solution.
The answer is so simple. Let's say you want to convert the audio file to mp3; then all you have to do is add the following as a property on the ReactMic component:
mimeType='audio/mp3'
Hope this helped at least one person.
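Applied to the component in the question, the property goes on the ReactMic element (prop name exactly as given in this answer; whether a given format is actually produced may depend on the react-mic version and browser codec support):

```jsx
<ReactMic
  className='oscilloscope'
  record={ this.state.recordAudio }
  mimeType='audio/mp3'  // the property from this answer
  onStop={ this.onStop }
  onStart={ this.onStart }
/>
```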

Open pdf in default app using pdfmake

I am able to create a pdf in my Ionic app, and if I run the app in Chrome it opens perfectly. However, if I install the app on an Android device it doesn't open. Below is my code. Can someone please tell me if I have to do something extra to open it on the device? I want to open it with the default PDF application on the device.
pdfMake.createPdf(dd).open();
OK. After banging my head against the wall for three days I finally found the solution, and I'm sharing it here so that other people facing this issue can get help. I create the pdf and save it using the Cordova File plugin. After a successful save I open it in the default application using the Cordova File Opener plugin. Below is my code.
pdfMake.createPdf(YOUR_DEFINITION_HERE).getBlob(buffer => {
  this.file.resolveDirectoryUrl(this.file.externalRootDirectory)
    .then(dirEntry => {
      this.file.getFile(dirEntry, 'test1.pdf', { create: true })
        .then(fileEntry => {
          fileEntry.createWriter(writer => {
            writer.onwrite = () => {
              this.fileOpener.open(fileEntry.toURL(), 'application/pdf')
                .then(res => { })
                .catch(err => {
                  const alert = this.alertCtrl.create({ message: err.message, buttons: ['Ok'] });
                  alert.present();
                });
            };
            writer.write(buffer);
          });
        })
        .catch(err => {
          const alert = this.alertCtrl.create({ message: err, buttons: ['Ok'] });
          alert.present();
        });
    })
    .catch(err => {
      const alert = this.alertCtrl.create({ message: err, buttons: ['Ok'] });
      alert.present();
    });
});

React Native NetInfo always returns offline

I'm currently trying to implement some functionality in my react native app where I use information stored locally if the device is offline, and perform a fetch if the device is online.
I used NetInfo after reading this How to handle network failure in React-Native, when network is off, but unfortunately I ran into an error where NetInfo always returns offline. I found this github issue, which recommended that I change the host in RCTReachability.m from 'htpp://apple.com' to 'apple.com'. However, I couldn't find a file with that name in the project directory. Instead I found the only mention of 'apple.com' in any file, which was in RCTNetInfo.m, which was in the correct form.
Does anybody know a way to fix this problem? Or possibly a different way to go about performing one action if the device is online, and another if the device is offline?
Here's the relevant code:
fetchData() {
  NetInfo.isConnected.fetch().done((isConnected) => {
    console.log('First, is ' + (isConnected ? 'online' : 'offline'));
    if (isConnected) {
      fetch(REQUEST_URL)
        .then((response) => response.json())
        .then((responseData) => {
          store.save('contacts', responseData.feed.entry)
            .then(() => store.get('contacts'))
            .then((contacts) => {
              this.setState({
                dataSource: this.state.dataSource.cloneWithRows(contacts),
                isLoading: false
              });
            });
        })
        .catch((error) => { console.error(error); });
    } else {
      store.get('contacts')
        .then(contacts => {
          if (contacts == null) {
            this.setState({
              dataSource: this.state.dataSource.cloneWithRows(CONTACT_DATA),
              isLoading: false
            });
          } else {
            this.setState({
              dataSource: this.state.dataSource.cloneWithRows(contacts),
              isLoading: false
            });
          }
        });
    }
  });
}
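The online/offline branching in fetchData() can be factored into a small helper that is easy to test in isolation. A hedged sketch (loadContacts, checkConnected, fetchRemote, loadLocal, and fallback are hypothetical names, not part of the original app); note also that in current React Native, NetInfo lives in the @react-native-community/netinfo package, where NetInfo.fetch() resolves with a state object whose isConnected field replaces the old NetInfo.isConnected.fetch() call.

```javascript
// Decide between the network and the local store based on connectivity.
// The connectivity check and both data sources are injected as async
// functions so the logic does not depend on any native module.
async function loadContacts({ checkConnected, fetchRemote, loadLocal, fallback }) {
  const isConnected = await checkConnected();
  if (isConnected) {
    return fetchRemote();           // online: hit the network
  }
  const cached = await loadLocal(); // offline: use locally stored data
  return cached == null ? fallback : cached;
}
```

With the modern package, `checkConnected` could be `async () => (await NetInfo.fetch()).isConnected`.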
