I'm a relatively new programmer trying to create an iOS video-streaming app in Swift. I've written the backend in Node.js with a Mongoose connection to MongoDB. I've been able to upload videos to the database with GridFS using the following code:
function (req, res, next) {
  var writeStream = gfs.createWriteStream({ filename: 'filename' });
  req.pipe(writeStream);
  // Respond only once GridFS has finished persisting the upload
  writeStream.on('close', function (file) {
    res.send("success");
  });
};
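(For context, gfs in these snippets is a gridfs-stream handle; the setup presumably looks something like the sketch below. The connection string and variable names are illustrative, not from my actual code.)

var mongoose = require('mongoose');
var Grid = require('gridfs-stream');

// Hypothetical setup producing the `gfs` handle used above
var conn = mongoose.createConnection('mongodb://localhost/videos');
var gfs;
conn.once('open', function () {
  gfs = Grid(conn.db, mongoose.mongo);
});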
I'm attempting to stream the videos using the following code:
gfs.findOne({ _id: req.params.id }, function (err, file) {
  if (err) {
    return res.status(400).send(err);
  } else if (!file) {
    return res.status(404).send(' ');
  } else {
    res.header("Content-Type", "video/mp4");
    res.header("X-Content-Type-Options", "nosniff");
    res.header("Accept-Ranges", "bytes");
    res.header("Content-Length", file.length);
    var readStream = gfs.createReadStream({ _id: file._id });
    readStream.on('open', function () {
      console.log('Starting download...');
    });
    readStream.on('data', function (chunk) {
      console.log('Loading...');
    });
    readStream.on('end', function () {
      console.log('Video is ready to play');
    });
    readStream.on('error', function (err) {
      console.log('There was an error with the download: ' + err);
      res.end();
    });
    readStream.pipe(res);
  }
});
When I run the server on localhost and try to access a video in Google Chrome, all I get is the default playback screen but no video. Ultimately, though, I'm trying to play it back in an iOS app: when I pass the localhost URL to an MPMoviePlayerController in Swift, there is also no video playback. I know the GET request is going through, because my console outputs the proper response with the file size, and after fiddling with Alamofire on the front end I can even see the hexadecimal representation of the video in the response data.
Can anybody help with this code? Do I need to update my res.header calls to fit some iOS specification? Should I even be using Alamofire for this? Thanks in advance for your responses.
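One thing worth checking: iOS players (and Safari generally) will only play HTTP video if the server actually honors Range requests, not just advertises Accept-Ranges. A rough sketch of a range-aware version of the handler above, assuming the gridfs-stream module (whose createReadStream accepts a range option), might look like this:

gfs.findOne({ _id: req.params.id }, function (err, file) {
  if (err) return res.status(400).send(err);
  if (!file) return res.status(404).end();

  res.header("Content-Type", "video/mp4");
  res.header("Accept-Ranges", "bytes");

  var range = req.headers.range;
  if (range) {
    // e.g. "bytes=0-" or "bytes=1000-1999"
    var parts = range.replace(/bytes=/, '').split('-');
    var start = parseInt(parts[0], 10);
    var end = parts[1] ? parseInt(parts[1], 10) : file.length - 1;

    res.status(206); // Partial Content
    res.header("Content-Range", "bytes " + start + "-" + end + "/" + file.length);
    res.header("Content-Length", end - start + 1);

    gfs.createReadStream({
      _id: file._id,
      range: { startPos: start, endPos: end }
    }).pipe(res);
  } else {
    res.header("Content-Length", file.length);
    gfs.createReadStream({ _id: file._id }).pipe(res);
  }
});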
Related
I have used Expo AV to build a screen in my app that plays audio files fetched from my server. It works fine on Android, but doesn't play anything on iPhone.
When I press the button that loads and plays the file:
soundObject.loadAsync({ uri: this.state.file });
soundObject.playAsync();
It returns an error:
This media format is not supported. - The AVPlayerItem instance has failed with the error code -11828 and domain "AVFoundationErrorDomain".
Here is my code that loads and plays the audio:
async loadAudio() {
  soundObject = new Audio.Sound();
  try {
    await soundObject.loadAsync({ uri: this.state.file });
    console.log("File loaded: " + this.state.file);
  } catch (error) {
    console.log(error);
  }
}

async playAudio() {
  if (!this.state.isPlayingAudio) {
    try {
      await soundObject.playAsync();
    } catch (error) {
      console.log(error);
    }
  } else {
    soundObject.pauseAsync();
  }
}
I have tried changing the audio format to m4a, wav, and caf while recording and fetching the file, but that did not help.
I'm running the app on an iPhone 7 Plus, iOS 14.2.
Any suggestions or fixes, please? Thanks in advance.
You're calling loadAsync improperly.
The call should look like this:
await Audio.Sound.createAsync(
  { uri: this.state.file },
  { shouldPlay: true }
);
I'm passing the URI object and a second initial-status argument, { shouldPlay: true }, to the createAsync method.
This plays my mp3 files from an Amazon S3 server:
await soundObject.loadAsync({ uri: this.state.file }, { shouldPlay: true })
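Worth noting, as a sketch based on the expo-av API: createAsync resolves with the created sound object, so you can keep a reference around for pausing later.

// createAsync resolves with { sound, status }
const { sound } = await Audio.Sound.createAsync(
  { uri: this.state.file },
  { shouldPlay: true }
);
this.soundObject = sound; // keep it for pauseAsync() later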
Also add this method before playing sound via "expo-av":
import { Audio, INTERRUPTION_MODE_ANDROID_DO_NOT_MIX } from 'expo-av';

const enableAudio = async () => {
  await Audio.setAudioModeAsync({
    playsInSilentModeIOS: true,
    staysActiveInBackground: false,
    interruptionModeAndroid: INTERRUPTION_MODE_ANDROID_DO_NOT_MIX,
    shouldDuckAndroid: false,
  });
};
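Presumably you'd call it once before loading the sound, e.g.:

// Configure the audio session first, then load and play
await enableAudio();
await soundObject.loadAsync({ uri: this.state.file });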
I was on Expo SDK 44; downgrading to SDK 43 did the trick. Run expo upgrade 43.
function authenticate() {
  return gapi.auth2.getAuthInstance()
    .signIn({ scope: "https://www.googleapis.com/auth/youtube.readonly" })
    .then(function () { console.log("Sign-in successful"); },
          function (err) { console.error("Error signing in", err); });
}

function loadClient() {
  gapi.client.setApiKey("YOUR_API_KEY");
  return gapi.client.load("https://www.googleapis.com/discovery/v1/apis/youtube/v3/rest")
    .then(function () { console.log("GAPI client loaded for API"); },
          function (err) { console.error("Error loading GAPI client for API", err); });
}

// Make sure the client is loaded and sign-in is complete before calling this method.
function execute() {
  return gapi.client.youtube.liveStreams.list({
    "part": ["snippet,cdn,contentDetails,status"],
    "mine": true
  })
    .then(function (response) {
      // Handle the results here (response.result has the parsed body).
      console.log("Response", response);
      var responseData = JSON.stringify(response);
      alert(responseData);
      var itemsArr = response.result.items;
      var itemObj = itemsArr[0];
      alert('streamName = ' + itemObj.cdn.ingestionInfo.streamName);
    },
    function (err) { console.error("Execute error", err); });
}

gapi.load("client:auth2", function () {
  gapi.auth2.init({ client_id: "YOUR_CLIENT_ID" });
});
<script src="https://apis.google.com/js/api.js"></script>
<button onclick="authenticate().then(loadClient)">authorize and load</button>
<button onclick="execute()">execute</button>
I am new to YouTube live streaming and am doing it through my application. I have gone through various questions and answers on this portal but couldn't find or understand a way to get it.
Is there any way (with YouTube Data API v3) to get the live stream's tiny URL (something like https://youtu.be/OHi8m4o8XeQ) so that I can share my live stream with my audience?
I have got a stream key/name (a 20-character alphanumeric key with four hyphens in between) from YouTube Data API v3, which I will use to stream to YouTube.
I am adding a screenshot for reference: I want the tiny URL (something like https://youtu.be/someid) shown in the upper right.
Yes: take your channel URL and append /live (e.g. https://www.youtube.com/c/YourChannel/live).
YouTube's shortened URL for a given video -- identified by its ID VIDEO_ID -- is of the form:
https://youtu.be/VIDEO_ID,
where (usually, though not officially documented as such) VIDEO_ID obeys the following regex pattern:
^[0-9a-zA-Z_-]{11}$.
In the case of live streaming, to be able to share the shortened URL of a stream you've created, you need to obtain the video ID associated with that stream.
That video ID is the value of the id property of the LiveBroadcasts resource bound to your live stream.
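As a sketch (using the same gapi client as above, and assuming an already-authorized session), you could list your broadcasts and build the share URL from the returned id:

function getShareUrl() {
  return gapi.client.youtube.liveBroadcasts.list({
    "part": ["id"],
    "mine": true
  }).then(function (response) {
    var broadcast = response.result.items[0];
    // The broadcast id is the video ID
    var shareUrl = "https://youtu.be/" + broadcast.id;
    console.log("Share URL:", shareUrl);
    return shareUrl;
  });
}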
I am recording audio and sending it as a blob to a Node.js server. The server then sends it to all connected users who are not currently recording.
Sending the blob:
mediaRecorder.onstop = function (e) {
  var blob = new Blob(this.chunks, { 'type': 'audio/ogg; codecs=opus' });
  socket.emit('radio', blob);
};
Server receiving the blob:
socket.on('radio', function (blob) {
  socket.broadcast.emit('voice', blob);
});
Listener receiving the blob:
socket.on('voice', function (arrayBuffer) {
  var blob = new Blob([arrayBuffer], { 'type': 'audio/ogg; codecs=opus' });
  var audio = document.getElementById('audio');
  audio.src = window.URL.createObjectURL(blob);
  audio.play();
});
This works in all browsers/devices except Safari and any iOS device. Digging further with Safari's inspector, I found this:
Does Safari require something else in its headers for blob objects to be interpreted properly? I've researched the accepted audio types and tried aac/mp3/ogg without any success. Reading further, I've seen references to a bug with streaming blob audio/video data in Safari and iOS, though I'm not too clear on the details.
Guidance in the right direction would be very helpful!
EDIT: It looks like the line audio.src = window.URL.createObjectURL(blob); in the receiving code is what is causing the blob errors (image I linked).
EDIT 2: I tried using another format instead of a blob, opting for a base64-encoded string. This works on all devices and browsers except iOS and Safari. I'm getting the impression it has something to do with how iOS interprets/loads the data...
For me the solution was to insert a source element into the audio element, and use the sourceElement.src attribute to refer to the blob. I didn't even need to attach the audio element to the DOM. Example below; hope it helps someone.
var audioElement = document.createElement('audio');
var sourceElement = document.createElement('source');
audioElement.appendChild(sourceElement);
sourceElement.src = '<your blob url>';
sourceElement.type = 'audio/mp3'; // or whatever
audioElement.load();
audioElement.play();
I haven't been able to find a solution using an audio element, however the Web Audio Api seems to do the trick: https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API
var audioContext = new (window.AudioContext || window.webkitAudioContext)();

socket.on('voice', function (arrayBuffer) {
  audioContext.decodeAudioData(arrayBuffer, function (audioData) {
    var source = audioContext.createBufferSource();
    source.buffer = audioData;
    source.connect(audioContext.destination);
    source.start();
  });
});
You may still have an issue on iOS as any audio/video must be triggered by a user action.
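One possible workaround (a sketch only, assuming a button with id "unmute" exists in your page) is to resume the context from a user gesture:

// iOS keeps the AudioContext suspended until a user gesture occurs
document.getElementById('unmute').addEventListener('click', function () {
  if (audioContext.state === 'suspended') {
    audioContext.resume();
  }
});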
I had a similar problem this week. I was recording the audio on Safari and the audio blob was being generated just fine, but when I tried to send the blob to the server, no data was getting there. The solution below reads the blob with FileReader to convert it to base64 and then sends it to the server. Here is what worked for me:
const reader = new FileReader();
reader.readAsDataURL(audioBlob);
reader.onloadend = () => {
  const base64data = reader.result;
  const audioName = uuid.v4() + '.mp3';
  this.http.post(this.app.cloudStorageEndpoint + '/audios/local?audioName=' + audioName, base64data, header).subscribe(
    res => { resolve(audioName); },
    err => { reject(err); }
  );
};
I've tested with Safari 12 on both iPad and iPhone and it worked fine.
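On the receiving end, the data URL prefix has to be stripped before decoding. A rough Node sketch (the route, file path, and text body parsing are my assumptions, not the poster's server code):

// Assumes app.use(express.text({ limit: '10mb' })) so req.body is the data URL string
app.post('/audios/local', function (req, res) {
  var base64 = req.body.replace(/^data:audio\/\w+;base64,/, '');
  var buffer = Buffer.from(base64, 'base64');
  require('fs').writeFile('/tmp/' + req.query.audioName, buffer, function (err) {
    if (err) return res.status(500).send(err);
    res.send('ok');
  });
});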
I'm developing an iOS app using the Ionic framework, and I have a problem when I try to call a web service over a 3G network.
Here is my service in UserService:
function getUserStat(user_id) {
  var request = $http({
    method: "get",
    url: "http://www.example.com/user.php",
    params: {
      action: "stat",
      user_id: user_id
    },
    data: {}
  });
  return request.then(handleSuccess, handleError);
}

function handleError(response) {
  // The API response from the server should be returned in a
  // normalized format. However, if the request was not handled by the
  // server (or was not handled properly - e.g. a server error), then we
  // may have to normalize it on our end, as best we can.
  if (!angular.isObject(response.data) || !response.data.message) {
    return $q.reject("An unknown error occurred.");
  }
  // Otherwise, use the expected error message.
  return $q.reject(response.data.message);
}

// I transform the successful response, unwrapping the application data
// from the API response payload.
function handleSuccess(response) {
  return response.data;
}
The getUserStat() function returns JSON.
Here is my controller:
UserService.getUserStat($scope.user_id).then(function (data) {
  alert("Result: " + JSON.stringify(data));
});
In my controller I just display the JSON.
I build this code to my iPhone and test it over a Wi-Fi network, and everything works fine: if I update the server side, UserService.getUserStat in the controller shows the update. The problem is that when I test it on a 3G network, the iPhone always shows the old JSON returned from the server (even if I change the server-side data).
Any idea how to solve this problem?
Thank you
I had a similar problem when I tried to upload a camera photo to my data server. When I tested the app on my local Wi-Fi it worked perfectly, but when I tested it outside I noticed it failed to upload the file. Eventually the problem turned out to be that, since the connection outside is much slower, the app moved to another view without finishing the upload action.
So, for example, if your controller looks something like this:
.controller('Ctrl1', function (UserService, $scope, $state) {
  UserService.getUserStat($scope.user_id).then(function (data) {
    alert("Result: " + JSON.stringify(data));
  });
  $state.go('app.posts');
});
it should be like this:
.controller('Ctrl1', function (UserService, $scope, $state) {
  UserService.getUserStat($scope.user_id).then(function (data) {
    alert("Result: " + JSON.stringify(data));
  })
  .finally(function () {
    $state.go('app.posts');
  });
});
I want to import and export CSVs. I have figured out how to get the iPad to recognize my app as one that opens CSV files.
From there, though, I am lost. I have found explanations of how the iPad sends in my file via application:didFinishLaunchingWithOptions or handleOpenURL ...
I've figured out that adding a function called handleOpenURL(url) to my JS file passes me the URL for the file... so now I have this.
That is great because I now know that someone has opened my app this way. Cool... BUT how do I grab the contents of that URL?
GOT IT! Woot, this is what I did...
function handleOpenURL(url) {
  window.resolveLocalFileSystemURI(url, onResolveSuccess, fail);
}

function onResolveSuccess(fileEntry) {
  fileEntry.file(win, fail);
}

function win(file) {
  var reader = new FileReader();
  reader.onloadend = function (evt) {
    alert("success");
    alert(evt.target.result);
  };
  reader.readAsText(file);
}

function fail() {
  alert('fail');
}
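From there, a naive split turns the text delivered by readAsText into rows and fields (a sketch only: it doesn't handle quoted fields with embedded commas):

// Minimal CSV parsing of evt.target.result
function parseCsv(text) {
  return text.trim().split(/\r?\n/).map(function (line) {
    return line.split(',');
  });
}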