Expo AV audio not playing on iOS/iPhone

I have used Expo AV and developed a screen in my app to play audio files fetched from my server. It works fine on Android, but doesn't play anything on iPhone.
When I press the button to play the audio, it loads and plays the file:
soundObject.loadAsync({ uri: this.state.file });
soundObject.playAsync();
It returns an error:
This media format is not supported. - The AVPlayerItem instance has failed with the error code -11828 and domain "AVFoundationErrorDomain".
Here is my code that loads and plays the audio:
async loadAudio() {
  soundObject = new Audio.Sound();
  try {
    await soundObject.loadAsync({ uri: this.state.file });
    console.log("File loaded: " + this.state.file);
  } catch (error) {
    console.log(error);
  }
}
async playAudio() {
  if (!this.state.isPlayingAudio) {
    try {
      await soundObject.playAsync();
    } catch (error) {
      console.log(error);
    }
  } else {
    soundObject.pauseAsync();
  }
}
I have tried changing the audio format to m4a, wav and caf while recording and fetching the file, but that did not help.
I'm running the app on an iPhone 7 Plus, iOS 14.2.
Any suggestions/fixes, please? Thanks in advance.

You're calling loadAsync improperly.
The call should look like this:
await Audio.Sound.createAsync(
  { uri: this.state.file },
  { shouldPlay: true }
);

I'm passing the { uri } source object and a second initialStatus argument, { shouldPlay: true }; createAsync creates the sound, loads it, and starts playback in one call.
This plays my mp3 files from an Amazon S3 server. The same pair of arguments also works with the instance method:
await soundObject.loadAsync({ uri: this.state.file }, { shouldPlay: true })
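For context, a minimal sketch of how the asker's two methods could collapse into one using createAsync; this.state.file and isPlayingAudio are the asker's fields, while togglePlayback and this.sound are illustrative names:
togglePlayback = async () => {
  if (!this.sound) {
    // createAsync creates the Sound object, loads the remote file and,
    // with shouldPlay: true, starts playback in a single call.
    const { sound } = await Audio.Sound.createAsync(
      { uri: this.state.file },
      { shouldPlay: true }
    );
    this.sound = sound;
    this.setState({ isPlayingAudio: true });
  } else if (this.state.isPlayingAudio) {
    await this.sound.pauseAsync();
    this.setState({ isPlayingAudio: false });
  } else {
    await this.sound.playAsync();
    this.setState({ isPlayingAudio: true });
  }
}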

Please call this method before playing sound via expo-av:
// INTERRUPTION_MODE_ANDROID_DO_NOT_MIX is exported by expo-av of this era;
// newer SDKs rename it to InterruptionModeAndroid.DoNotMix
import { Audio, INTERRUPTION_MODE_ANDROID_DO_NOT_MIX } from 'expo-av';

const enableAudio = async () => {
  await Audio.setAudioModeAsync({
    playsInSilentModeIOS: true,
    staysActiveInBackground: false,
    interruptionModeAndroid: INTERRUPTION_MODE_ANDROID_DO_NOT_MIX,
    shouldDuckAndroid: false,
  })
}
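A typical place to call it is once before the first load, e.g. (a sketch, reusing the asker's soundObject):
await enableAudio();
await soundObject.loadAsync({ uri: this.state.file }, { shouldPlay: true });
playsInSilentModeIOS is the important flag for iPhones: it defaults to false, and with the hardware mute switch on, iOS then routes no audio at all, which is a classic cause of "works on Android, silent on iPhone".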

I was on Expo 44; downgrading to Expo 43 did the trick. Run expo upgrade 43.

Related

Using addListener (willBlur) is crashing on tester's iOS device

I have created an IOS app using React Native. The app consists of a Song Menu screen and a song screen. On the song screen the user is able to press the play button to play the song.
I am currently testing it out using TestFlight. It is working fine on my phone. However, on my friend's phone it keeps crashing. The error is very generic, giving only RCTFatal.
However, I have narrowed the problem down to code which stops songs from playing when the user navigates away from the Song page.
The relevant code is here:
export default class Song extends Component {
  playAudio = (file) => {
    var s = new Sound('audio/' + file, Sound.MAIN_BUNDLE, (error) => {
      if (error) {
        console.log('error', error)
      } else {
        s.play(() => {
          s.release()
        })
      }
    })
    /* Problem LINE #1 */
    this.willBlurSubscription = this.props.navigation.addListener(
      'willBlur',
      () => s.release()
    )
  }
  /* Problem LINE #2 */
  componentWillUnmount() {
    this.willBlurSubsciption.remove()
  }
  /* Other code here for displaying UI and the clickable audio button
     which loads the playAudio function. */
  ...
}
When I remove the subscription code above (shown on two lines) then there is no crash. However, the song will not stop playing after the user goes back to the main menu. When I put the lines back in there it crashes for my tester with the RCTFatal error.
Note: The app crashes whether my tester plays the audio file or not, but always when he navigates back to the Song Menu.
What is the problem with my code? Why is it generating such a cryptic error? And why does my iOS device not crash but his does?
I guess, from the react-navigation event you're using (willBlur), that the version of react-navigation is <= 4.x, so the event subscription methods look correct. If you're using react-navigation version 5, these events have totally changed.
You should be very careful using these event subscriptions when disposing native modules.
You should try to be safe by checking that each function/method exists before calling it, and of course wrap every call in a try...catch error handler:
playAudio = (file) => {
  // If audio playback is triggered by the user, always check whether
  // there is already a sound object; if there is, either exit the
  // function or stop/dispose the current sound and start a new one.
  if (this.sound) {
    // either return and let the current sound play
    // return;
    // or stop and release the sound if you have play/stop/pause controls
    try {
      this.sound.release()
    } catch (error) {}
    finally {
      this.sound = null
    }
  }
  this.sound = new Sound(`audio/${file}`, Sound.MAIN_BUNDLE, (error) => {
    if (error) {
      console.log('error', error)
      this.sound = null;
    } else {
      this.sound.play(() => {
        try {
          if (this.sound) {
            this.sound.release()
          }
        } catch (error) {}
        finally {
          this.sound = null
        }
      })
    }
  })
  /* Problem LINE #1 */
  this.willBlurSubscription = this.props.navigation.addListener(
    'willBlur',
    () => {
      try {
        if (this.sound) {
          this.sound.release()
        }
      } catch (error) {}
      finally {
        this.sound = null
      }
    }
  )
}
/* Problem LINE #2 */
componentWillUnmount() {
  try {
    this.willBlurSubscription &&
      this.willBlurSubscription.remove &&
      this.willBlurSubscription.remove()
  } catch (error) {}
  finally {
    this.willBlurSubscription = null
  }
}
I guess this.willBlurSubscription = this.props.navigation.addListener will create one listener every time playAudio is invoked. So from the second time on, s will be released twice, which causes the crash.
Try to add the listener in componentDidMount, or remove the existing listener before adding a new one, like:
/* Problem LINE #1 */
if (this.willBlurSubscription) {
  this.willBlurSubscription.remove();
  this.willBlurSubscription = null;
}
this.willBlurSubscription = this.props.navigation.addListener(
  'willBlur',
  () => s.release()
)
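A sketch of the componentDidMount variant; since the listener can then no longer close over a local s from playAudio, it assumes the this.sound bookkeeping from the first answer:
componentDidMount() {
  // One listener for the lifetime of the component,
  // instead of one per playAudio() call.
  this.willBlurSubscription = this.props.navigation.addListener(
    'willBlur',
    () => {
      if (this.sound) {
        this.sound.release()
        this.sound = null
      }
    }
  )
}

componentWillUnmount() {
  this.willBlurSubscription && this.willBlurSubscription.remove()
}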

Cordova plugin media doesn't fire onStatus callbacks properly on iOS 11.0 and 12.0

I have a project built on Ionic 4.2.1. The project is developed for iOS and uses audio playback. First I tried the @ionic-native/media lib for tracking statuses (I actually need only two of them: 'started' and 'stopped'); tested in the browser it all worked well, but on iOS there is a big problem.
The first time I play audio, no callback fires at all.
The second time, just after I click the button, two callbacks fire at the same time: start and stop.
After that, the second case repeats every time. The audio itself plays properly throughout, though.
I tried to use cordova-plugin-media itself, without @ionic-native/media, but the problem is in the Cordova plugin and nothing changed.
Tested on iOS 11.x and 12.1
ionic 4.2.1
cordova-plugin-media ^5.0.1
@ionic-native/media ^4.17.0
With the pure cordova-plugin-media:
this.platform.ready().then(() => {
  let file = new (<any>window).Media(path,
    () => {
      console.log("playAudio(): Audio Success")
    },
    (err) => {
      console.log("playAudio(): Audio Error: " + err)
    },
    (status) => {
      this.addConsole('status=' + status)
      if (status === 1) {
        this.isAudioActive = true
        this.addConsole('played')
      }
      if (status === 4) {
        this.addConsole('stopped')
        this.isAudioActive = false
      }
    }
  )
  this.file = file
  this.file.play()
})
With @ionic-native/media:
let file: MediaObject = this.media.create(path);
this.file = file
this.addConsole('play')
this.file.play()
this.file.onStatusUpdate.subscribe(
  (status) => {
    this.addConsole('status=' + status)
    if (status === 1) {
      this.isAudioActive = true
      this.addConsole('played')
    }
    if (status === 4) {
      this.addConsole('stopped')
      this.isAudioActive = false
    }
  }
)
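As an aside, cordova-plugin-media exposes named constants for these numeric statuses, which would make the handlers above easier to read; note that 1 is actually Media.MEDIA_STARTING, while "really playing" is MEDIA_RUNNING (2). A sketch against the same subscription:
const Media = (<any>window).Media; // the plugin's global

this.file.onStatusUpdate.subscribe((status) => {
  if (status === Media.MEDIA_RUNNING) { // 2: playback has actually started
    this.isAudioActive = true
  }
  if (status === Media.MEDIA_STOPPED) { // 4: playback finished or stopped
    this.isAudioActive = false
  }
})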
I hope something has changed in the cordova-plugin-media callback methods since then.
Any help will be appreciated.

React Native live streaming

I want to make a React Native app that is able to:
show a live stream
upload a live stream
save the stream
I have an RTMP URL and a playback URL. I tried to achieve my goals using "react-native-video-stream", however the stream doesn't start and there are no apparent errors.
How can I live stream videos in my app, and which library should be used?
Please provide an example / demo app which does live streaming.
I found a simple platform called Mux to create a live stream, upload it and save it to play later. react-native-nomediaclient will help you stream your video. On the other side you can just use react-native-video to play the stream.
Here is the blog of the whole process.
There are also other platforms for creating streams. But the point is that you can stream to any of them using the react-native-nomediaclient library.
Update:
Here is the nomediaclient configuration to create a live stream using Mux:
<NodeCameraView
  style={styles.nodeCameraView}
  ref={(vb) => { this.vb = vb }}
  outputUrl={`rtmp://live.mux.com/app/${this.state.streamId}`}
  camera={{ cameraId: 0, cameraFrontMirror: true }}
  audio={{ bitrate: 32000, profile: 1, samplerate: 44100 }}
  video={{
    preset: 4,
    bitrate: 2000000,
    profile: 2,
    fps: 30,
    videoFrontMirror: false
  }}
  autopreview={true}
/>
To get the streamId:
createLive = async () => {
  const auth = {
    username: MUX_ACCESS_TOKEN,
    password: MUX_SECRET
  };
  const param = {
    "reduced_latency": true,
    "playback_policy": "public",
    "new_asset_settings": { "playback_policy": "public" }
  }
  const res = await axios.post('https://api.mux.com/video/v1/live-streams', param, { auth: auth }).catch((error) => {
    throw error;
  });
  console.log(res.data.data);
  const data = res.data.data;
  this.setState({
    streamId: data.stream_key
  });
}
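For the viewer side mentioned above, the same POST /video/v1/live-streams response carries a playback ID next to the stream_key, and Mux serves it as an HLS URL; a sketch of playing it with react-native-video (PlayerScreen and the playbackId prop are illustrative names):
import Video from 'react-native-video';

// data.playback_ids[0].id comes from the same response as data.stream_key
const PlayerScreen = ({ playbackId }) => (
  <Video
    source={{ uri: `https://stream.mux.com/${playbackId}.m3u8` }}
    style={{ flex: 1 }}
    controls={true}
    resizeMode="contain"
  />
);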
Update 2
I have also found another platform, called Bambuser, which is better than Mux. It provides the easiest installation process for your React Native application. It also has many advanced features; for example, you can stream to multiple platforms at a time. It provides high-quality audio and video streaming with minimal lag time. I have used it in my app and it's working without any issues.
Here are the libraries that you can use with your react-native application:
react-native-bambuser-player: allows you to play the stream in your user-side app.
react-native-bambuser-broadcaster: using this library you create your broadcaster app to stream video for your user-side app.
Follow the installation steps properly and you're good to go.
Also, if you don't want to build your own broadcaster app, they provide their own apps to create a live stream. They have most of the features a broadcast app should have. You just have to log in and it starts streaming for your player app.
Bambuser (Android)
Bambuser (iOS)
It also gives a 14-day free trial for testing.
Sample Code
Import the Bambuser player:
import RNBambuserPlayer from 'react-native-bambuser-player';
Declare consts for your credentials:
const BambuserApplicationIds = {
  android: 'ANDROID_APPLICATION_ID', // your Bambuser Android application id
  ios: 'IOS_APPLICATION_ID' // your Bambuser iOS application id
}
const BambuserResourceUri = 'YOUR_BAMBUSER_RESOURCE_URI';
Here are the details about how you can get the applicationId and resourceUri.
Render the Bambuser player view:
<RNBambuserPlayer
  style={{ flex: 1 }}
  ref={ref => {
    this.myPlayerRef = ref;
  }}
  applicationId={
    Platform.OS === 'ios'
      ? BambuserApplicationIds.ios
      : BambuserApplicationIds.android
  }
  requiredBroadcastState={
    RNBambuserPlayer.REQUIRED_BROADCAST_STATE.LIVE
  }
  videoScaleMode={RNBambuserPlayer.VIDEO_SCALE_MODE.ASPECT_FILL}
  resourceUri={BambuserResourceUri}
  onTotalViewerCountUpdate={viewer => {
    this.setState({ views: viewer }); // handle views update here
  }}
  onPlaying={() => {
    // code to handle when the stream is playing
  }}
  onPlaybackError={error => {
    // handle when some error occurs
    Alert.alert('Error', error.message);
  }}
  onPlaybackComplete={() => {
    // called when the stream is complete; handle it like:
    this.setState({ isPlaying: false, isLiveEnded: true }, () => {
      this.props.navigation.setParams({ isLive: false });
    });
  }}
  onStopped={() => {
    // called when the stream stops.
    this.setState({ isPlaying: false }, () => {
      this.props.navigation.setParams({ isLive: false });
    });
  }}
/>
You can read here more about props.

SourceUnavailableError when requesting getUserMedia for video from about: page

Requesting the mic with audio: true works perfectly fine; however, requesting webcam video or screen video fails with SourceUnavailableError. I have added the browser string to the preference media.getusermedia.screensharing.allowed_domains.
Code for requesting webcam video:
var param = {
// audio: true, // if just this it works fine
video: true // {mediaSource: 'screen'} does not work either
};
navigator.mediaDevices.getUserMedia(param).then(function(stream) {
console.log('success');
})
.catch(function(err) {
console.error('err:', err)
});
To reproduce this error, open up the browser console and paste the code above.
Error given is:
err: MediaStreamError { name: "SourceUnavailableError", message: "Failed to allocate videosource", constraint: "", stack: "" }
Do you know how I can get around this SourceUnavailableError?
I did a lot of digging here - https://dxr.mozilla.org/mozilla-central/source/browser/modules/webrtcUI.jsm#201 - but no success yet.
Thanks
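For reference, a guess rather than a confirmed fix: on current browsers, screen capture moved to a separate API, so the Firefox-specific { mediaSource: 'screen' } getUserMedia constraint may simply be unavailable; the modern equivalent looks like this:
// getDisplayMedia replaces the old { video: { mediaSource: 'screen' } } path
navigator.mediaDevices.getDisplayMedia({ video: true })
  .then(function (stream) {
    console.log('screen capture started');
  })
  .catch(function (err) {
    console.error('err:', err);
  });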

Stream video to an iOS app from MongoDB using gridfs-stream

I'm a relatively new programmer trying to create an iOS video-streaming app in Swift. I've written the backend in Node.js and have a Mongoose connection to MongoDB. I've been able to upload videos to the database using the following code and GridFS:
function (req, res, next) {
  req.pipe(gfs.createWriteStream({ filename: 'filename' }));
  res.send("success");
};
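One aside on this handler: res.send("success") runs before the pipe finishes, so a failed upload would still report success. A sketch that waits for gridfs-stream's close event instead (the error handling is illustrative):
function (req, res, next) {
  var writeStream = gfs.createWriteStream({ filename: 'filename' });
  writeStream.on('close', function (file) {
    // the file document (_id, length, ...) is now committed to GridFS
    res.send("success");
  });
  writeStream.on('error', function (err) {
    res.status(500).send(err.message);
  });
  req.pipe(writeStream);
};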
I'm attempting to stream the videos using the following code
gfs.findOne({ _id: req.params.id }, function (err, file) {
  if (err) {
    return res.status(400).send(err);
  } else if (!file) {
    return res.status(404).send(' ');
  } else {
    res.header("Content-Type", "video/mp4");
    res.header("X-Content-Type-Options", "nosniff");
    res.header("Accept-Ranges", "bytes");
    res.header("Content-Length", file.length);
    var readStream = gfs.createReadStream({ _id: file._id });
    readStream.on('open', function () {
      console.log('Starting download...');
    });
    readStream.on('data', function (chunk) {
      console.log('Loading...');
    });
    readStream.on('end', function () {
      console.log('Video is ready to play');
    });
    readStream.on('error', function (err) {
      console.log('There was an error with the download' + err);
      res.end();
    });
    readStream.pipe(res);
  }
});
When I run a server on localhost and attempt to access the videos via Google Chrome, all I get is the default playback screen but no video. However, ultimately I am trying to play back in an iOS app. When I try the same thing in Swift (using the localhost URL passed to MPMoviePlayer), there is also no video playback.
I know the GET request is going through because my console outputs the proper response with the file size. I have even been fiddling with Alamofire on the front end and can see the hexadecimal representation of the video in the response data.
Can anybody help with this code? Do I need to update my res.header to fit some iOS specification? Should I even be using Alamofire at all for this? Thanks in advance for your responses.
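One likely culprit, offered as a guess rather than a confirmed fix: the handler advertises Accept-Ranges: bytes but always returns the whole file with a 200, while iOS players (MPMoviePlayer/AVPlayer) probe video URLs with Range requests and expect 206 Partial Content back. gridfs-stream supports partial reads via its range option, so a sketch of honoring the header inside the findOne callback (the header parsing is deliberately minimal):
// inside gfs.findOne(..., function (err, file) { ... })
if (req.headers.range) {
  // e.g. "bytes=0-" or "bytes=0-1023"; real code should validate this
  var parts = req.headers.range.replace(/bytes=/, '').split('-');
  var start = parseInt(parts[0], 10);
  var end = parts[1] ? parseInt(parts[1], 10) : file.length - 1;

  res.status(206); // Partial Content
  res.header("Content-Type", "video/mp4");
  res.header("Accept-Ranges", "bytes");
  res.header("Content-Range", "bytes " + start + "-" + end + "/" + file.length);
  res.header("Content-Length", end - start + 1);

  gfs.createReadStream({
    _id: file._id,
    range: { startPos: start, endPos: end } // gridfs-stream's range option
  }).pipe(res);
} else {
  // fall back to the full-file response shown above
}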
