Wrong duration in MediaRecorder stream on iOS

I am trying to use multiple files with ffmpeg, and I am having a problem with the duration of the files created on iOS devices.
I am recording the videos through the device's webcam with:
mediaRecorder.current = new MediaRecorder(videoPlayer.current.srcObject);
mediaRecorder.current.start();
mediaRecorder.current.ondataavailable = (e) => {
  setVideoCreatedData(e.data);
};
Then I am downloading the file to my computer and checking its metadata through https://www.metadata2go.com/; in the resulting report, the duration is 0s.
I then want to use this file with ffmpeg; however, it does not detect the duration, which causes problems.
How should I record the video so that the file's metadata contains a proper duration?
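For context, here is a minimal sketch of the full recording flow, assuming the same videoPlayer/mediaRecorder refs and setVideoCreatedData setter as above; the onstop handler and Blob assembly are the parts the snippet doesn't show, and this alone does not guarantee a duration header:
// Sketch only: assumes videoPlayer.current.srcObject is a live camera stream.
const chunks = [];
const options = MediaRecorder.isTypeSupported('video/mp4')
  ? { mimeType: 'video/mp4' }   // Safari on iOS records MP4
  : { mimeType: 'video/webm' }; // most other browsers

mediaRecorder.current = new MediaRecorder(videoPlayer.current.srcObject, options);

mediaRecorder.current.ondataavailable = (e) => {
  if (e.data && e.data.size > 0) chunks.push(e.data);
};

mediaRecorder.current.onstop = () => {
  // Assemble every chunk into a single Blob instead of relying on one dataavailable event.
  const blob = new Blob(chunks, { type: mediaRecorder.current.mimeType });
  setVideoCreatedData(blob);
};

mediaRecorder.current.start(1000); // request a chunk roughly every second
Even with the chunks assembled this way, browsers often write the recording without a duration in the container header; if the file still reports 0s, a stream-copy remux such as ffmpeg -i in.mp4 -c copy fixed.mp4 usually rewrites the header with a computed duration.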

Related

DASH Streaming ExoPlayer android studio

I'm streaming video from a URL stored in Firebase Storage, and I'm using the following code to stream the video with ExoPlayer:
BandwidthMeter bandwidthMeter = new DefaultBandwidthMeter();
TrackSelector trackSelector = new DefaultTrackSelector(new AdaptiveTrackSelection.Factory(bandwidthMeter));
LoadControl loadControl = new CustomLoadControl();
exoPlayer = ExoPlayerFactory.newSimpleInstance(SafetyTVHomeActivity.this, trackSelector, loadControl);
Uri videoUri = Uri.parse(videourl);
DefaultHttpDataSourceFactory dataSourceFactory = new DefaultHttpDataSourceFactory("exoplayer_video");
ExtractorsFactory extractorsFactory = new DefaultExtractorsFactory();
MediaSource mediaSource = new ExtractorMediaSource(videoUri, dataSourceFactory, extractorsFactory, null, null);
exoPlayerView.setPlayer(exoPlayer);
exoPlayer.addListener(new PlayerEventListener());
exoPlayer.prepare(mediaSource, false, false);
exoPlayer.seekTo(0, 0);
Everything is fine and the video gets streamed. But the problem I'm facing is that the initial load time before the video starts is too long (5+ seconds). I want to reduce it to 0-2 seconds. Is there a way to achieve this using ExoPlayer?
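Since the CustomLoadControl above isn't shown, here is a hedged sketch of what a load control tuned for a fast start might look like; the buffer values are illustrative, and the builder method names are those of the ExoPlayer 2.x versions that still ship ExoPlayerFactory:
import com.google.android.exoplayer2.DefaultLoadControl;
import com.google.android.exoplayer2.LoadControl;

// Sketch only: starts playback once ~0.5 s is buffered instead of the 2.5 s default.
LoadControl fastStartLoadControl = new DefaultLoadControl.Builder()
        .setBufferDurationsMs(
                2000,   // minBufferMs: keep at least ~2 s buffered during playback
                30000,  // maxBufferMs: stop buffering past 30 s
                500,    // bufferForPlaybackMs: start after ~0.5 s is buffered
                2000)   // bufferForPlaybackAfterRebufferMs: resume ~2 s after a stall
        .createDefaultLoadControl();
Passing fastStartLoadControl to ExoPlayerFactory.newSimpleInstance(...) in place of the CustomLoadControl instance should remove most of the fixed buffering delay; whatever remains is network latency fetching the MP4 from Firebase Storage.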
I also tried using a DASH media source in ExoPlayer with the code below:
Uri videoUri = Uri.parse(videourl);
DataSource.Factory dataSourceFactory = new DefaultHttpDataSourceFactory(Util.getUserAgent(SafetyTVHomeActivity.this, "app-name"));
MediaSource mediaSource = new DashMediaSource.Factory(dataSourceFactory).createMediaSource(videoUri);
exoPlayer = ExoPlayerFactory.newSimpleInstance(this);
exoPlayer.prepare(mediaSource);
exoPlayerView.setPlayer(exoPlayer);
exoPlayer.addListener(new PlayerEventListener());
I used the same Firebase Storage URL in the DASH media source, but I'm getting the following error:
ExoPlayerImplInternal: Source error.
com.google.android.exoplayer2.ParserException: org.xmlpull.v1.XmlPullParserException: Unexpected token (position:TEXT G#��B�%���������...#2:79 in java.io.InputStreamReader#c587547) at com.google.android.exoplayer2.source.dash.manifest.DashManifestParser.parse(DashManifestParser.java:105) at........
Could anyone please help me work around this?
My main objective is to stream video from a URL with the initial load time before playback kept to 0-2 seconds (the way TikTok does it). Any help would be appreciated.

Google cloud speech very inaccurate and misses words on clean audio

I am using Google Cloud Speech through Python and finding that many transcriptions are inaccurate and miss several words. This is a simple script I'm using to return a transcript of an audio file, in this case 'out307.wav':
import io
from google.cloud import speech

client = speech.SpeechClient()
with io.open('out307.wav', 'rb') as audio_file:
    content = audio_file.read()
audio = speech.types.RecognitionAudio(content=content)
config = speech.types.RecognitionConfig(
    enable_word_time_offsets=True,
    language_code='en-US',
    audio_channel_count=1)
response = client.recognize(config, audio)
for result in response.results:
    alternative = result.alternatives[0]
    print(u'Transcript: {}'.format(alternative.transcript))
This returns the following transcript:
to do this the tensions and suspicions except
This is very far off from what the actual audio says (I've uploaded it at https://vocaroo.com/i/s1zdZ0SOH1Ki). The audio is a .wav and very clear, with no background noise. This is worse than average: in some cases it gets the transcription of a 10-second audio file fully correct, or misses just a couple of words. Is there anything I can do to improve results?
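As an aside, since enable_word_time_offsets=True is already set in the config above, the response also carries per-word timings; a minimal sketch of printing them with the same pre-1.0 speech.types client, which can help show exactly where words get dropped:
# Sketch only: print the per-word offsets returned alongside each alternative.
for result in response.results:
    for word_info in result.alternatives[0].words:
        start = word_info.start_time.seconds + word_info.start_time.nanos / 1e9
        end = word_info.end_time.seconds + word_info.end_time.nanos / 1e9
        print(u'{}: {:.2f}s - {:.2f}s'.format(word_info.word, start, end))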
This is weird: I tried your audio file with your code and get the same result, but if I change the language_code to "en-UK" I am able to get the full response.
I work for Google Cloud, and I have created a public issue for you here; you can track the updates there.
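For concreteness, the suggested change amounts to swapping one field in the RecognitionConfig shown earlier (everything else in the request stays the same; 'en-GB' is the standard BCP-47 tag for British English if 'en-UK' is not accepted):
# Sketch of the suggested change: same request, different language_code.
config = speech.types.RecognitionConfig(
    enable_word_time_offsets=True,
    language_code='en-UK',  # as suggested above; 'en-GB' is the standard tag
    audio_channel_count=1)
response = client.recognize(config, audio)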

Actionscript netStream play mp4 with ios

I'm trying to play a video in an app built with Flash Builder 4.7, AIR SDK 31.0 and iOS 12.
private function init():void {
    holder.addChild(video);
    this.addElement(holder);
    nc.connect(null);
    ns = new NetStream(nc);
    ns.client = {};
    ns.client.onMetaData = ns_onMetaData;
    ns.client.onCuePoint = ns_onCuePoint;
    video.attachNetStream(ns);
    ns.play("Videos/video.mp4");
    ns.addEventListener(NetStatusEvent.NET_STATUS, statusNet);
}
This works on simulators and on Android devices, but not on iOS devices. I've seen a couple of similar questions, but they are trying to stream an mp4 from an "http" address, whereas mine uses a local file.
I've been asked to stick to the mp4 format, although I have read that using an FLV file should work.
Adobe's documentation ("Special considerations for H.264 video in AIR 3.0 for iOS") notes:
"For H.264 video, the iOS APIs for video playback accept only a URL to a file or stream. You cannot pass in a buffer of H264 video data to be decoded."
So do I need to find a new way of playing the video other than NetStream, or am I best to swap to a different file type?
As a side note, Adobe says to write your mp4 URLs like this:
("mp4:samples/myvideo.mp4");
My app can't find the file with "mp4:" at the front of the URL.
If you want to play videos that are packaged with your iOS app, it's important to ensure you are actually including them when you compile the app.
This is untested, but something like the following should work:
var _dFile:File;
var _ns:NetStream;
var _nc:NetConnection;
var _customClient:Object;
var _video:Video;

_customClient = new Object();
_customClient.onMetaData = metaDataHandler;
_nc = new NetConnection();
_nc.connect(null);
_ns = new NetStream(_nc);
_ns.client = _customClient;

// this is the important bit for finding files within the .ipa bundle.
_dFile = File.applicationStorageDirectory.resolvePath("nameOfYourVideoDirectory/nameOfVideo.mp4");
_ns.play(_dFile.url);

_video = new Video(480, 340);
_video.attachNetStream(_ns);
_ns.addEventListener(NetStatusEvent.NET_STATUS, onNSComplete, false, 0, true);

private function metaDataHandler(infoObject:Object):void {
    trace("Length of video", infoObject.duration);
}

private function onNSComplete(e:NetStatusEvent):void {
    if (e.info.code == "NetStream.Buffer.Empty") {
        // do something
    }
}
However, I would highly recommend using an ANE to play video on mobile via the native media player. Take a look at Distriqt MediaPlayer ANE.

Getting audio visualization using Web Audio API to work on iOS

I'm developing an HTML5 audio player for use specifically on iPhones, and am trying to get an EQ visualizer working. From what I've found, there are two ways to set this up:
One is to load the mp3 file on demand using an XMLHttpRequest:
var request = new XMLHttpRequest();
request.open('GET', 'sampler.mp3', true);
request.responseType = 'arraybuffer';
request.addEventListener('load', bufferSound, false);
request.send();

function bufferSound(event) {
  var request = event.target;
  var buffer = myAudioContext.createBuffer(request.response, false);
  source = myAudioContext.createBufferSource();
  source.buffer = buffer;
}
You then use the source.noteOn and source.noteOff functions to play and pause the audio. Working this way, I AM able to get the EQ visualization going. BUT, you have to wait until the mp3 file completely loads to start playing, which won't work in our situation.
The other way to do this is to have an <audio> element already on the page, and you get the audio data from that using:
source = myAudioContext.createMediaElementSource(document.querySelector('audio'));
You then use the audio tag's play and pause functions. This solves the loading problem as it allows the media to be played immediately once the page loads... BUT, EQ visualization is gone.
Both methods show the EQ when testing on Chrome (WIN), so there seems to be something specific to iOS/iPhone that isn't allowing me to get the data from an <audio> tag, but will allow me to get it if I load the mp3 file on demand.
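For reference, the analyser wiring for the visualizer is the same in both approaches; a minimal sketch, assuming the myAudioContext and source variables from the snippets above and a canvas renderer of your own:
// Sketch only: route the source (buffer source or media-element source)
// through an AnalyserNode so frequency data can be read for the EQ bars.
var analyser = myAudioContext.createAnalyser();
analyser.fftSize = 256;
source.connect(analyser);
analyser.connect(myAudioContext.destination);

var bins = new Uint8Array(analyser.frequencyBinCount);
function drawEQ() {
  requestAnimationFrame(drawEQ);
  analyser.getByteFrequencyData(bins); // current spectrum, one byte per bin
  // ...draw `bins` onto a canvas here...
}
drawEQ();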
...
Any ideas out there?
Unfortunately Safari doesn't properly support MediaElementSource; it's a known bug. See "Why aren't Safari or Firefox able to process audio data from MediaElementSource?"

Flex/Flash Builder/Actionscript/AIR/Mobile iOS How to take video using the camera and/or browse for & view/access video stored in the 'Camera Roll'

My understanding currently is that:
CameraUI
I can use the CameraUI to access the built-in camera for MediaType.VIDEO; that delegates to the built-in video camera app and lets me record a video. My app does that now.
When I stop recording and click the "Use" button, I am returned to my app, and theoretically I have a valid MediaPromise.
iOS does -not- provide a valid/usable url/filename to the recorded video (or to photos), so I would have to use a Loader to bring in/use/access the 'recorded' video... AND... iOS does not actually create a file anywhere on the device, most importantly in the Camera Roll, where one would expect it given the normal behavior when one uses the system's native camera/video app.
The documentation says that the Loader can load various image types and SWFs, but nothing about video data, so I conclude that I cannot actually use the CameraUI to generate a valid MediaPromise that I can then pass to a Loader class or similar to read in the information created by the system camera and then manipulate it (upload, save to applicationStorageDirectory, and/or display in one of the two video player components available in the API).
CameraRoll
I can have video entities in the iOS Camera Roll, but the AS3/AIR 3.5 CameraRoll class won't let me view/access/reference them in any way.
Normal File I/O
All my attempts to use the AIR 3.5 File classes to browse to the storage location of the iOS Camera Roll have been rebuffed.
------- Questions -------
Am I correct in believing that there is a way to take video but no way to use the video that's been captured (no way to use the resulting MediaPromise successfully)?
I believe you can take video and access it on Android, but there's nothing in the documentation that says you cannot on iOS.
Am I correct in believing that iOS sandboxes apps so that they cannot browse to video/photo storage using standard File I/O, but only through the apparently non-workable means I've tried (CameraUI & CameraRoll)?
Am I wrong to think that these are rather obvious needs that one can meet via the Xcode / Objective-C++ etc. route, but that the AIR Mobile framework does not allow, either because Apple blocks the functionality or because Adobe has failed to meet reasonable expectations?
One item of ironic note: if I use the iOS system camera app to record a video, a thumbnail of that video then appears in the Gallery/Camera Roll, and of course I can share it, view it, or whatever. If I use AIR's CameraRoll.browseForImage(), provided I haven't used the camera to take another image since, the folder icon for the pictures folder uses the thumbnail of the last object added... in this case, the video I took. But if I then enter the folder, the video cannot be found. It's teasing us: it knows it's there, but it is apparently forbidden fruit.
I can't answer all your questions, so this entry may not be acceptable, but I found this page while searching for a solution to some of the problems you described and thought that someone else may find this answer (partially) useful.
To save the movie you just took, you need to open the promise and read the data from it.
iOS won't save the file anywhere, so MediaPromise.file is always null.
This is my solution to the problem:
private var camera:CameraUI;
private var dataInput:IDataInput;

public function recordVideo():void
{
    // Start the camera and ask for a video
    camera = new CameraUI();
    camera.addEventListener(MediaEvent.COMPLETE, onCameraComplete);
    camera.launch(MediaType.VIDEO);
}

private function onCameraComplete(event:MediaEvent):void
{
    // event.data is a MediaPromise and MediaPromise.open() returns an IDataInput
    // Let's cast it to a dispatcher and check when it's complete
    dataInput = event.data.open();
    var dispatcher:IEventDispatcher = IEventDispatcher(dataInput);
    dispatcher.addEventListener(Event.COMPLETE, onDataInputComplete);
}

private function onDataInputComplete(event:Event):void
{
    // We can do whatever we want with the data, so we'll store it in a File
    // ("recordedVideo.mp4" is just an example destination in app storage)
    var file:File = File.applicationStorageDirectory.resolvePath("recordedVideo.mp4");
    var bytes:ByteArray = new ByteArray();
    var stream:FileStream = new FileStream();
    // Reading the data from the opened MediaPromise
    dataInput.readBytes(bytes);
    stream.open(file, FileMode.WRITE);
    stream.writeBytes(bytes, 0, bytes.bytesAvailable);
    stream.close();
}
Also, I'm still looking for a way to put the movie in the CameraRoll.
