Can I use experimental WebKit features in my iOS app?

I am developing an iOS app with react-native. I wanted to use MediaRecorder, which is still in the 'experimental' phase. I turned it on in Safari's advanced settings, but when I try to use it in my app:
var mediaRecorder = new MediaRecorder(stream)
I get this error:
ReferenceError: Can't find variable: MediaRecorder
This feature works well in Safari, but I can't get it to work in my app. Is there a way to turn it on in Xcode or react-native settings?
EDIT:
Here is the larger section of my code. I use react-native-webrtc, which provides the mediaDevices module. I do capture the stream; the problem I have is with MediaRecorder. I know that MediaRecorder works in the Safari browser; my question is whether it can be used in a native iOS app and, if so, how to enable it.
import {
  RTCPeerConnection,
  RTCIceCandidate,
  RTCSessionDescription,
  RTCView,
  MediaStream,
  MediaStreamTrack,
  mediaDevices,
  registerGlobals
} from 'react-native-webrtc';

var mediaRecorder;

const pc_config = {
  "iceServers": [
    {
      urls: 'stun:stun.l.google.com:19302'
    }
  ]
};

var pc = new RTCPeerConnection(pc_config);

const success = (stream) => {
  mediaRecorder = new MediaRecorder(stream); // this line throws the error
  pc.addStream(stream);
};

const failure = (e) => {
  console.log('getUserMedia Error: ', e);
};

const constraints = {
  audio: true,
  video: {
    mandatory: {
      minWidth: 200,
      minHeight: 200 * (16 / 9),
      minFrameRate: 24
    },
    facingMode: "user"
  }
};

mediaDevices.getUserMedia(constraints)
  .then(success)
  .catch(failure);

The MediaRecorder constructor syntax is
var mediaRecorder = new MediaRecorder(stream[, options]);
as in
navigator.mediaDevices.getUserMedia(constraints).then(function(stream) {
  var mediaRecorder = new MediaRecorder(stream);
});
When running only the following line in the Safari 13 console with Experimental MediaRecorder activated:
var mediaRecorder = new MediaRecorder(stream)
I get the following (expected) output:
ReferenceError: Can't find variable: stream
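Note that this output shows the MediaRecorder symbol itself resolving in Safari once the flag is on; the asker's 'Can't find variable: MediaRecorder' means the app's JS runtime has no such global at all. React Native executes JS in JavaScriptCore (or Hermes), not in a WebKit page context, so Safari's Experimental Features toggles do not reach it. A minimal guard, reusing the success callback from the question:
// Sketch: feature-detect MediaRecorder before constructing it.
// In React Native the JS runs in JavaScriptCore/Hermes, not in a WebKit
// page context, so Safari's experimental-feature toggles don't apply here.
const success = (stream) => {
  if (typeof MediaRecorder === 'undefined') {
    console.warn('MediaRecorder is not available in this JS runtime');
    return;
  }
  mediaRecorder = new MediaRecorder(stream);
  pc.addStream(stream);
};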

Related

Ionic - Some functionalities are not working when compiling for iOS

Context:
Hello, I am working on an Ionic application (written in TypeScript). In this application, I control a Wi-Fi camera via HTTP requests to its IP (http://192.72.1.1/myCommand, to be precise).
Main actions I do with the camera:
Start recording
Stop recording
Get videos list
Download a video
When I use the Ionic DevApp:
With the Ionic DevApp, everything works perfectly; I can do all the main actions without a problem.
When I compile the application for iOS:
I compile with the command ionic cordova build ios --prod, then I archive with Xcode and send it to the App Store to test it with TestFlight.
I get no errors while compiling / archiving the application. But when I try it on my iPhone, I can start / stop recording, but can't download the video.
Problem:
Some commands are not working, but I don't know whether it's getting the list or downloading the video that fails; I have no logs. I don't understand why some commands work but others don't.
Is iOS blocking the download requests? How can I solve this?
Notes:
I already tried all the basic things like deleting the iOS platform, recompiling, uninstalling, ...
I tried different Ionic HTTP plugins; same problem with all of them.
Some code:
Start / Stop the camera: (it is the same command to start / stop).
startCamera() {
  var url = "http://192.72.1.1/changeRecordStatus";
  var result = this.http.get(url);
  result.subscribe(data => {
    console.log("Works");
  },
  err => {
    console.log("Error" + err);
  });
}
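(Aside: since the question mentions having no logs, a small debugging sketch; assuming this.http here is Angular's HttpClient, stringifying the error object exposes the status code that plain string concatenation hides:)
result.subscribe(
  data => console.log("Works"),
  // JSON.stringify exposes err.status and err.message;
  // "Error" + err would only print "[object Object]"
  err => console.log("Error " + JSON.stringify(err))
);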
Getting the name of the last video:
getLastVideo() {
  var url = "http://192.72.1.1/listVideos";
  this.http.get(url, {}, {})
    .then(data => {
      var xml = data.data;
      var xmlDOM = new DOMParser().parseFromString(xml, 'text/xml');
      var temp = this.xmlToJson(xmlDOM); // function that converts XML to JSON
      var resultArray = Object.keys(temp).map(function(i) {
        let ite = temp[i];
        return ite;
      });
      resultArray = resultArray[0]['file'].reverse();
      this.lastVideo = resultArray[0]['name']; // lastVideo is a global variable
    },
    (error) => {
      console.log("Error while getting the name of the last video" + error);
    });
}
Downloading the file from the camera:
downloadFileFromCamera() {
  this.getLastVideo();
  var basename_file = this.lastVideo;
  var url = "http://192.72.1.1" + basename_file;
  this.fileTransfer.download(encodeURI(url), this.file.dataDirectory + '/videos/' + basename_file, true).then((entry) => {
    this.video[this.counterVideos] = entry; // video is a global array
    this.counterVideos += 1;
  }, (error) => {
    console.log("Error while downloading the last video" + error);
  });
}
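(Aside: independent of iOS, note that downloadFileFromCamera reads this.lastVideo right after calling getLastVideo(), which resolves asynchronously, so the download may start with a stale or empty name. A sketch that chains the two, assuming getLastVideo() is changed to return its promise:)
downloadFileFromCamera() {
  // Wait for getLastVideo() to resolve so this.lastVideo is populated;
  // assumes getLastVideo() is modified to return its this.http.get(...) chain.
  this.getLastVideo().then(() => {
    var basename_file = this.lastVideo;
    var url = "http://192.72.1.1" + basename_file;
    return this.fileTransfer.download(encodeURI(url), this.file.dataDirectory + '/videos/' + basename_file, true);
  }).then((entry) => {
    this.video[this.counterVideos] = entry;
    this.counterVideos += 1;
  }, (error) => {
    console.log("Error while downloading the last video " + error);
  });
}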
If someone knows how to solve my problem, I would be so grateful! Thanks in advance.

Google Sign-In JavaScript client not working on PWA App

Since yesterday, when I use gapi.auth2 to do a Google Sign-In from an installed PWA on Android, the app opens the browser window to select the user, but it remains blank.
The same page in the Chrome browser on Android opens the user selection as usual. The code is the same, from the same server, and has not been modified in more than 15 days. I presume the problem is some change in the gapi JS client code served from Google's servers.
Inspecting the PWA Google Sign-In tab in Chrome shows the following error:
Uncaught Failed to get parent origin from URL hash!
The origins in the Google Developer Console are OK.
Does anyone have a clue how to solve this?
Edit 1: Code chunk
initGoogle() {
  this.ngRedux.dispatch({ type: SN_INIT_GOOGLE });
  Observable.create((observer: Observer<any>) => {
    let head = document.getElementsByTagName('head');
    (<any>window).__ongload = () => {
      gapi.load('auth2', () => {
        gapi.auth2.init({
          client_id: `${AppConfig.google.clientID}`
        }).then(() => {
          this.auth2 = gapi.auth2.getAuthInstance();
          this.googleInitiated();
          observer.complete();
        }, (err) => {
          this.log.error(err);
          observer.error(err);
        });
      });
    };
    let script: HTMLScriptElement = document.createElement('script');
    script.src = 'https://apis.google.com/js/platform.js?onload=__ongload';
    script.type = 'text/javascript';
    head[0].appendChild(script);
  }).pipe(
    timeout(AppConfig.google.timeout),
    retry(AppConfig.google.retries),
    catchError(error => {
      this.googleInitError();
      return observableEmpty();
    }),
    take(1)
  ).subscribe();
}
async googleLogin(scope: string = 'profile email', rerequest: boolean = false, type: string = SN_GOOGLE_LOGIN): Promise<GoogleUser> {
  let goopts = {
    scope: this.ngRedux.getState().socialNetworks.getIn([ 'google', 'grantedScopes' ]),
    prompt: rerequest ? 'consent' : undefined
  };
  try {
    const user: GoogleUser = await this.auth2.signIn(<any>goopts);
    ...
    return user;
  } catch (error) {
    ...
    return error;
  }
}
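For reference, a hypothetical call site for googleLogin above; getBasicProfile() and its getters are part of the standard gapi.auth2 GoogleUser API:
async showProfile() {
  // Hypothetical usage sketch: sign in, then read the basic profile.
  const user = await this.googleLogin();
  console.log(user.getBasicProfile().getName());
  console.log(user.getBasicProfile().getEmail());
}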
Edit 2: Error screenshot
I had a similar issue, as mentioned here. I had not registered my domain under Credentials -> My OAuth Client ID -> Authorized JavaScript origins. After adding it, sign-in started working. Check whether the same applies to your app. It may help.
This bug appears to be fixed; I cannot reproduce it any more.

preloadjs getResult(audio) doesn't work

I am trying to use PreloadJS to load an mp3 file. I also use the HTMLAudioPlugin from SoundJS, but when I try to get the audio DOM element with the getResult function, it works in Chrome but not on iOS.
preloadManifest = [{ src: "bgm.mp3", id: 'bgm', type: createjs.AbstractLoader.SOUND }]; // type must be the constant, not the string 'createjs.AbstractLoader.SOUND'
createjs.Sound.registerPlugins([createjs.HTMLAudioPlugin]);
loader = new createjs.LoadQueue(true, null, true);
loader.addEventListener('complete', function () {
  loader.getResult('bgm'); // undefined on iOS
});
loader.loadManifest(preloadManifest); // the manifest was never loaded in the original snippet
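One thing worth ruling out, assuming the file itself loads: iOS Safari only unlocks HTML audio from within a user gesture, so a common workaround is to trigger the first play from a touch handler:
// Sketch: iOS allows HTML audio to start only inside a user gesture,
// so kick off the first play from a touch/click handler.
createjs.Sound.registerSound("bgm.mp3", "bgm");
document.addEventListener("touchend", function unlock() {
  document.removeEventListener("touchend", unlock);
  createjs.Sound.play("bgm"); // first play must originate from the gesture
});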

Ionic 2 Cannot use audio file from resource on iOS Emulator

I have an Ionic 2 app that works fine on Android.
Now I need to build the iOS version. The app can stream and download audios from SoundCloud.
When users tap to download an audio file, it is stored in the app's data directory provided by the Cordova file plugin.
The logic I use to store and retrieve data shouldn't matter, because it works on Android. The problem is recovering and playing the audio on iOS.
This is the part of code where I create a MediaObject and later play, pause it:
var pathalone: string = '';
if (this.platform.is('ios')) {
  pathalone = this.audio.filePath.replace(/^file:\/\//, '');
  console.log("Pathalone: " + pathalone);
  this.mediaPlayer = this.media.create(pathalone, onStatusUpdate, onSuccess, onError);
} else {
  pathalone = this.audio.filePath.substring(8);
  this.mediaPlayer = this.media.create(pathalone, onStatusUpdate, onSuccess, onError);
}
setTimeout(() => {
  if (this.mediaPlayer) {
    this.togglePlayPause();
    this.updateTrackTime();
  }
}, 1000);
The Media plugin docs say that if you're still having problems on iOS, you should create the file first. So I tried that, but I still have the same problem:
this.file.createFile(cordova.file.dataDirectory, this.audio.fileName, true).then(() => {
  pathalone = this.audio.filePath.replace(/^file:\/\//, '');
  console.log("Pathalone: " + pathalone);
  this.mediaPlayer = this.media.create(pathalone, onStatusUpdate, onSuccess, onError);
});
In the console I get these logs:
console.log: Pathalone:
/Users/ivan/Library/Developer/CoreSimulator/Devices/0D75C1A9-591A-4112-BBE4-AB901953A38F/data/Containers/Data/Application/509D1136-86F6-4C9B-84B5-8E0D0D203DAC/Library/NoCloud/311409823.mp3
[14:07:48] console.log: Cannot use audio file from resource
'/Users/ivan/Library/Developer/CoreSimulator/Devices/0D75C1A9-591A-4112-BBE4-AB901953A38F/data/Containers/Data/Application/509D1136-86F6-4C9B-84B5-8E0D0D203DAC/Library/NoCloud/311409823.mp3'
Maybe it's happening on the emulator but wouldn't happen on a real device? I can't test on an iPhone because I haven't got one :/
PS:
A curious fact is that the SoundCloud API returns a redirect when you make a GET request to download an audio file.
The response has a status of 301, so I use response.headers.Location to handle the redirect, and from there I can perform the download.
I'm also noticing that on iOS it never follows the redirect; it goes straight down the 'success' path and the console says 'downloaded'. Maybe it's never really downloaded...
You do indeed need to create a file before creating the recorder media object. Recordings are possible even in the emulator, on both iOS and Android. The following illustrates how this is done with Ionic 2+.
First you need to import the following
import { NavController, Platform } from 'ionic-angular';
import { Media, MediaObject } from '@ionic-native/media';
import { File } from '@ionic-native/file';
Make sure you inject the imports in your constructor as follows:
constructor(public navCtrl: NavController, private media: Media, private file: File, public platform: Platform) {
  // some more code here
}
Declare the required variables before the constructor as follows
filePath: string;
fileName: string;
audio: MediaObject;
The code for starting the recording is as follows:
startRecord() {
  if (this.platform.is('ios')) {
    this.fileName = 'record' + new Date().getDate() + new Date().getMonth() + new Date().getFullYear() + new Date().getHours() + new Date().getMinutes() + new Date().getSeconds() + '.3gp';
    this.filePath = this.file.dataDirectory;
    this.file.createFile(this.filePath, this.fileName, true).then(() => {
      this.audio = this.media.create(this.filePath.replace(/^file:\/\//, '') + this.fileName);
      this.audio.startRecord();
    });
  } else if (this.platform.is('android')) {
    this.fileName = 'record' + new Date().getDate() + new Date().getMonth() + new Date().getFullYear() + new Date().getHours() + new Date().getMinutes() + new Date().getSeconds() + '.3gp';
    this.filePath = this.file.externalDataDirectory;
    this.file.createFile(this.filePath, this.fileName, true).then(() => {
      this.audio = this.media.create(this.filePath.replace(/^file:\/\//, '') + this.fileName);
      this.audio.startRecord();
    });
  }
}
Notice the difference in path handling: createFile takes this.filePath as-is, while media.create needs the file:// prefix stripped.
The code for stopping the recording is as follows
stopRecord() {
  this.audio.stopRecord();
}
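Playback follows the same pattern; a minimal sketch, assuming the file recorded above and the play() method of MediaObject:
playRecord() {
  // media.create needs the file:// scheme stripped, exactly as when recording
  this.audio = this.media.create(this.filePath.replace(/^file:\/\//, '') + this.fileName);
  this.audio.play();
}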

react native live streaming

I want to make a react-native app that is able to:
show a live stream
upload a live stream
save the stream
I have an RTMP URL and a playback URL. I tried to achieve my goals using "react-native-video-stream", however the stream doesn't start and there are no apparent errors.
How can I live stream videos in my app, and which library should be used?
Please provide an example / demo app which does live streaming.
I found a simple platform called Mux for creating a live stream, uploading it, and saving it to play later. react-native-nodemediaclient (the library the NodeCameraView below comes from) will help you stream your video. On the other side, you can just use react-native-video to play the stream.
Here is the blog post covering the whole process.
There are also other streaming platforms, but the point is that you can stream to any of them using the react-native-nodemediaclient library.
Update:
Here is the react-native-nodemediaclient configuration to create a live stream using Mux:
<NodeCameraView
  style={styles.nodeCameraView}
  ref={(vb) => { this.vb = vb }}
  outputUrl={`rtmp://live.mux.com/app/${this.state.streamId}`}
  camera={{ cameraId: 0, cameraFrontMirror: true }}
  audio={{ bitrate: 32000, profile: 1, samplerate: 44100 }}
  video={{
    preset: 4,
    bitrate: 2000000,
    profile: 2,
    fps: 30,
    videoFrontMirror: false
  }}
  autopreview={true}
/>
To get the streamId:
createLive = async () => {
  const auth = {
    username: MUX_ACCESS_TOKEN,
    password: MUX_SECRET
  };
  const param = { "reduced_latency": true, "playback_policy": "public", "new_asset_settings": { "playback_policy": "public" } };
  const res = await axios.post('https://api.mux.com/video/v1/live-streams', param, { auth: auth }).catch((error) => {
    throw error;
  });
  console.log(res.data.data);
  const data = res.data.data;
  this.setState({
    streamId: data.stream_key
  });
}
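For the player side mentioned above, a sketch with react-native-video, assuming the playback ID is read from the same createLive response (res.data.data.playback_ids[0].id) and Mux's standard HLS URL format:
// Sketch: play the Mux live stream with react-native-video.
// `playbackId` is an assumption: res.data.data.playback_ids[0].id from createLive.
import React from 'react';
import Video from 'react-native-video';

const LivePlayer = ({ playbackId }) => (
  <Video
    source={{ uri: `https://stream.mux.com/${playbackId}.m3u8` }}
    style={{ flex: 1 }}
    resizeMode="contain"
    controls
  />
);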
Update 2
I have also found another platform, Bambuser, which I think is better than Mux. It provides the easiest installation process for a react native application and has many advanced features; for example, you can stream to multiple platforms at a time. It provides high-quality audio and video streaming with minimal lag. I have used it in my app and it works without any issues.
Here are the libraries you can use with your react-native application:
react-native-bambuser-player: lets you play the stream in your user-side app.
react-native-bambuser-broadcaster: lets you build a broadcaster app that streams video to your user-side app.
Follow the installation steps properly and you're good to go.
Also, if you don't want to build your own broadcaster app, they provide their own app for creating a live stream. It has most of the features a broadcast app needs; you just log in and it starts streaming to your player app.
Bambuser (Android)
Bambuser (iOS)
It also gives a 14-day free trial for testing.
Sample Code
Import the Bambuser player:
import RNBambuserPlayer from 'react-native-bambuser-player';
Declare consts for your credentials:
const BambuserApplicationIds = {
  android: 'ANDROID_APPLICATION_ID', // your Bambuser Android application id
  ios: 'IOS_APPLICATION_ID' // your Bambuser iOS application id
};
const BambuserResourceUri = 'YOUR_BAMBUSER_RESOURCE_URI';
Here are the details on how to get the applicationId and resourceUri.
Render the Bambuser player view:
<RNBambuserPlayer
  style={{ flex: 1 }}
  ref={ref => {
    this.myPlayerRef = ref;
  }}
  applicationId={
    Platform.OS === 'ios'
      ? BambuserApplicationIds.ios
      : BambuserApplicationIds.android
  }
  requiredBroadcastState={
    RNBambuserPlayer.REQUIRED_BROADCAST_STATE.LIVE
  }
  videoScaleMode={RNBambuserPlayer.VIDEO_SCALE_MODE.ASPECT_FILL}
  resourceUri={BambuserResourceUri}
  onTotalViewerCountUpdate={viewer => {
    this.setState({ views: viewer }); // handle views update here
  }}
  onPlaying={() => {
    // code to handle when the stream is playing
  }}
  onPlaybackError={error => {
    // handle when some error occurs
    Alert.alert('Error', error.message);
  }}
  onPlaybackComplete={() => {
    // called when the stream is complete; handle it like so:
    this.setState({ isPlaying: false, isLiveEnded: true }, () => {
      this.props.navigation.setParams({ isLive: false });
    });
  }}
  onStopped={() => {
    // called when the stream stops
    this.setState({ isPlaying: false }, () => {
      this.props.navigation.setParams({ isLive: false });
    });
  }}
/>
You can read more about the props here.
