I'm trying to play audio with a specific PTS (presentation timestamp; in my case it is equal to the DTS, the decoding timestamp).
So I have tried using AudioQueueEnqueueBufferWithParameters() with the inStartTime parameter to delay the start of each buffer, but that is not working.
I know that on Android, the MediaCodec class's queueInputBuffer method can play audio data with a PTS (see the description of MediaCodec.queueInputBuffer).
I want to find an iOS API that works like MediaCodec's queueInputBuffer method.
If there is no such API in iOS, how can I play each audio buffer at a specific PTS?
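Roughly, this is what I tried, as a minimal Swift sketch (the helper name is mine, and the queue/buffer setup is assumed to exist elsewhere):

import AudioToolbox

// Sketch: schedule one already-filled buffer so it starts playing at a given PTS
// (expressed in sample time) instead of immediately.
func enqueue(_ buffer: AudioQueueBufferRef,
             on queue: AudioQueueRef,
             atPTSSeconds pts: Double,
             sampleRate: Double) -> OSStatus {
    var startTime = AudioTimeStamp()
    startTime.mFlags = .sampleTimeValid
    startTime.mSampleTime = pts * sampleRate   // desired start, in sample time

    return AudioQueueEnqueueBufferWithParameters(
        queue,
        buffer,
        0, nil,        // no packet descriptions (constant-bit-rate PCM assumed)
        0, 0,          // no trimming at start/end
        0, nil,        // no parameter events
        &startTime,    // requested start time for this buffer
        nil)           // actual start time (not inspected here)
}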
I need to determine if the Chromecast remote media client is currently playing media or if it's paused in Swift. Chromecast provides the two following methods:
GCKCastContext.sharedInstance().sessionManager.currentSession?.remoteMediaClient?.play()
GCKCastContext.sharedInstance().sessionManager.currentSession?.remoteMediaClient?.pause()
But I can't seem to find any property to determine if it's currently playing or paused.
The equivalent property for what I'm looking for in AVPlayer is:
avPlayer.timeControlStatus == .playing
What is the equivalent way to do this (check if media is playing or paused) in Chromecast?
It's as easy as:
GCKCastContext.sharedInstance().sessionManager.currentCastSession?.remoteMediaClient?.mediaStatus?.playerState
This is an enum with all the cases you're looking for (see: documentation)
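As a minimal Swift sketch (the helper name is mine; it assumes an active cast session with a loaded media status):

import GoogleCast

// Returns true if the receiver reports it is currently playing.
// .paused, .buffering, etc. can be checked the same way.
func isRemotePlaying() -> Bool {
    let state = GCKCastContext.sharedInstance()
        .sessionManager.currentCastSession?
        .remoteMediaClient?.mediaStatus?.playerState
    return state == .playing
}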
I am new to React Native. I am using Agora RTC (3.1.3) for video calling in my app, and it works perfectly. We have some actions like switching the camera and muting video. For muting video I am using the code below:
const toggleVideo = async () => {
  let mute = vidMute;
  console.log('Video toggle', mute);
  await (engine.muteLocalVideoStream(!mute));
  setVidMute(!mute);
}
where engine is my RtcEngine created using my app ID. The issue is that muteLocalVideoStream has no effect on my video. I am testing on an iPhone. Please help.
The muteLocalVideoStream method only stops transmission of the video stream to remote users; it does not stop video capture, i.e. the local user can still see their own video normally.
If you want to stop the video on the local user's device as well, you can use the enableLocalVideo method instead. You can edit the code on line 4 to: await (engine.enableLocalVideo(!mute));
I tried out your code on v3.1.6 of react-native-agora and everything works as expected.
I want to get the playback time of an m4a audio file without implementing a player.
I found an AVAsset implementation, but it didn't work for me.
playbackTime can only be used with MPMediaPlayback.
From AVAsset you can only get the duration of the asset, not the playback time.
Below is a reference link:
How to get the duration of an audio file in iOS?
You can access the duration property of the audio file but you cannot get the playback time.
Playback time will only be available when you are using a player. :-)
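As a small illustration of reading just the duration without a player (a sketch; the file path is a placeholder):

import AVFoundation

// Read the duration of an m4a file without creating a player.
// The path below is a placeholder; point it at your own file.
let url = URL(fileURLWithPath: "/path/to/audio.m4a")
let asset = AVURLAsset(url: url)
let seconds = CMTimeGetSeconds(asset.duration)   // duration is a CMTime
print("Duration: \(seconds) seconds")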
I placed two VLCMediaPlayer instances in an iPad view controller.
Then I wanted to mute one of the players.
I executed the following code, using the VLCAudio class:
[VLCMediaPlayer.audio setMute:YES];
But the player's audio was still on.
Then I added another piece of code:
[VLCMediaPlayer.audio setVolume:0];
Nothing changed.
Is it because both the setMute and setVolume functions don't work in the iOS VLCKit?
If so, how can I mute a VLCMediaPlayer in code?
Set the current audio track to -1. Performance-wise, this is more efficient, too, since the audio information isn't even decoded.
Volume control (including mute) isn't supported in current versions of MobileVLCKit on iOS; it is only supported on the Mac.
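For illustration, a minimal Swift sketch of the track-deselection approach (assuming MobileVLCKit; the helper name is mine):

import MobileVLCKit

// Deselecting the audio track (-1 = none) effectively mutes this player
// and skips audio decoding entirely.
func mute(_ player: VLCMediaPlayer) {
    player.currentAudioTrackIndex = -1
}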
If you want to mute from the beginning, I found a way to do it.
Before you send the play message to the player instance, you should initialize it with options, such as:
self.player = [[VLCMediaPlayer alloc] initWithOptions:@[@"--gain=0"]];
where "--gain=0" sets the audio gain to 0. This is not a documented method and may not work with every version of the mobile VLC framework, but it works for me.
If you want to mute while playing, you can try
self.player.currentAudioTrackIndex = -1;
This also works for me!
I'm a new iOS developer working on a video player app for a video sharing site, where a recording sometimes consists of two video streams (one showing the presenter, the other showing a recording of their screen). I'm trying to play this second video with AVFoundation, creating an AVPlayer. With some videos it works very well, but with others it runs out of memory. After a lot of investigating I figured out that it tries to buffer the whole video into memory.
I've spent hours googling it, but couldn't find anything.
I created a small project just to demonstrate this:
github project. It sets up two AVPlayers for two different video streams and updates the UI to show the loadedTimeRanges of the players' AVPlayerItems. For the first video it only buffers ~60 seconds, which is nice, but for the second video it keeps buffering.
self.player1 = [AVPlayer playerWithURL:url1];
self.player2 = [AVPlayer playerWithURL:url2];
and the two text labels:
self.data1.text = [NSString stringWithFormat:@"Player 1 loadedTimeRanges: %@",
                   self.player1.currentItem.loadedTimeRanges];
self.data2.text = [NSString stringWithFormat:@"Player 2 loadedTimeRanges: %@",
                   self.player2.currentItem.loadedTimeRanges];
Maybe this could be important: the over-buffering video does not have an audio track, just video.
UPDATE: I reproduced the problem using MPMoviePlayerController instead of AVPlayer and checking the playableDuration property. With the first movie it stops around 60 seconds; with the second movie it keeps going and then runs out of memory.
UPDATE 2: I got the actual video files, uploaded them to Dropbox, and tried to stream them from there: then I don't have the problem! It buffers the whole movie, but it does not run out of memory. It only runs out of memory when I stream them from the original site (our video sharing site). The URLs are in the GitHub project.
I'm really looking forward to any hints what could cause this.
Thank you!
This problem is indeed caused by the lack of an audio track in the video streams sent from the Wowza media server. (I have inferred from your stream URLs that you're using Wowza media server to stream your videos.)
To verify this issue, I created a 5-minute video file with no audio track.
mplayer -nolirc -vo null -ao null -frames 0 -identify test_60.mp4
...
Opening video decoder: [ffmpeg] FFmpeg's libavcodec codec family
Selected video codec: [ffh264] vfm: ffmpeg (FFmpeg H.264)
==========================================================================
ID_VIDEO_CODEC=ffh264
Audio: no sound
Starting playback...
...
Then I added an MP3 track to that video file using MP4Box.
MP4Box -new -add test_60.mp4 -add test_music.mp3 test_60_music.mp4
And verified that there was indeed an audio track.
mplayer -nolirc -vo null -ao null -frames 0 -identify /tmp/test_60_music.mp4
...
AUDIO: 44100 Hz, 2 ch, floatle, 320.0 kbit/11.34% (ratio: 40000->352800)
ID_AUDIO_BITRATE=320000
ID_AUDIO_RATE=44100
ID_AUDIO_NCH=2
Selected audio codec: [ffmp3float] afm: ffmpeg (FFmpeg MPEG layer-3 audio)
==========================================================================
AO: [null] 44100Hz 2ch floatle (4 bytes per sample)
ID_AUDIO_CODEC=ffmp3float
Starting playback...
...
Then I put both test_60.mp4 and test_60_music.mp4 in the Wowza content directory and tested them. I actually wrote a small test app similar to yours to examine the loadedTimeRanges, but just loading the videos via Safari on the device should be sufficient to see the difference.
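The check in that test app was roughly like this (a sketch of my own; the function name and polling approach are illustrative, not the exact code):

import AVFoundation

// Create a paused AVPlayer for a stream and periodically log how far it has buffered,
// by reading loadedTimeRanges of the current item.
func watchBuffering(of urlString: String) -> Timer? {
    guard let url = URL(string: urlString) else { return nil }
    let player = AVPlayer(url: url)
    player.pause()

    return Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { _ in
        guard let range = player.currentItem?.loadedTimeRanges.first?.timeRangeValue else { return }
        print("Buffered up to \(CMTimeGetSeconds(CMTimeRangeGetEnd(range))) s")
    }
}

// e.g. watchBuffering(of: "http://wowza_server:1935/vod/mp4:test_60.mp4/playlist.m3u8")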
I opened wowza_server:1935/vod/mp4:test_60.mp4/playlist.m3u8 and pressed pause as soon as it started playing. The buffer indicator kept increasing until the full 5 minute video was loaded.
Then, I opened wowza_server:1935/vod/mp4:test_60_music.mp4/playlist.m3u8 and did the same, but only the first 1/5th (roughly 1 minute) was loaded.
So it seems like a problem with the Wowza server's packetization; note that this problem does not happen for me on Adobe (Flash) Media Server 5.0, where only 60 seconds is buffered regardless of whether the video contains an audio track.
Hope that's helpful. I've asked for input from the Wowza folks on the Wowza forums.