I am working on a watchOS app.
There is a local .ogg Opus file that I need to play with AVPlayer.
But AVPlayer currently does not support that format.
As I understand it, to play it with AVPlayer I first need to decode the Opus file. So I followed these two threads,
"How to decode self-delimited opus in iOS" and "How to encode and decode Real-time Audio using OpusCodec in IOS?", but I couldn't succeed.
I always get an error while decoding the data.
I have been trying for two days to find a solution. Can anybody explain how I can do it?
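One workaround worth considering, since the problem is the Ogg container rather than the Opus codec itself: Core Audio can play Opus when it is packaged in a CAF file (iOS 11+ / watchOS 4+), so you can repackage the file offline, for example with `ffmpeg -i audio.ogg -c:a copy audio.caf`, and skip manual decoding entirely. A minimal sketch, assuming the bundled file has already been converted to CAF:

```swift
import AVFoundation

// Sketch: play an Opus file that was repackaged into a CAF container
// at build time (e.g. `ffmpeg -i audio.ogg -c:a copy audio.caf`).
// Core Audio reads Opus-in-CAF natively, so no manual decode is needed.
func playOpus(at url: URL) throws -> AVAudioPlayer {
    let player = try AVAudioPlayer(contentsOf: url)
    player.prepareToPlay()
    player.play()
    return player   // keep a strong reference, or playback stops
}
```

If you must keep the .ogg file as-is, the alternative is to demux the Ogg pages yourself (libogg) and decode the packets with libopus, which is what the linked threads attempt.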
Related
How can I save what AVPlayer is currently playing (both video and audio) from Live HLS stream?
I know how to load and play m3u8 video file using AVPlayer.
Please note that the HLS stream is live, not video on demand, so I cannot use AVAggregateAssetDownloadTask. In the perfect scenario I would get CMSampleBuffer objects, which can be saved to a file easily. AVPlayerItemOutput is also not entirely an option, because I cannot see how I would get the audio channel.
Seems not possible with the current SDK. I've implemented it using ffmpeg.
I need to make an app that sends recorded audio in .wav format to a REST endpoint. I will be using AFNetworking for the POST, but I am not sure where to start with recording the audio and then converting it. I know I should use AVFoundation to record, but I am not sure how to convert. Would Core Audio help?
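One way to sidestep the conversion step entirely is to record straight to Linear PCM, which is what a .wav file contains, so the recorded file can be POSTed as-is. A sketch using AVAudioRecorder (the sample rate and channel count here are assumptions; adjust them for your use case):

```swift
import AVFoundation

// Sketch: record directly to WAV (Linear PCM) so no post-recording
// conversion is needed before uploading the file.
func makeWavRecorder(to url: URL) throws -> AVAudioRecorder {
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,   // uncompressed PCM -> .wav
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false
    ]
    let recorder = try AVAudioRecorder(url: url, settings: settings)
    recorder.record()
    return recorder
}
```

Call `stop()` on the returned recorder when done, then hand the file at `url` to AFNetworking for the multipart POST.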
I am using AVCaptureSession to record video and audio of the user. I am getting the real-time video and audio streams independently, and I am able to encode them using an H.264 encoder and an AAC encoder respectively. Now I do not understand how to multiplex them both into a single stream. How do I send it to a specific server URL that is protected by a username and password? If it can be done using RTMP, that is also fine.
I have taken a reference from here, but I am not getting much out of it.
Is there any RTSP library project which can help me?
I have been struggling with this for a long time.
Is there any solution to my problem?
Thanks in advance.
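For the multiplexing part of the question above: AVFoundation has no built-in RTMP or RTSP publisher, but it can mux already-encoded H.264 and AAC sample buffers into an MP4 with AVAssetWriter by passing `outputSettings: nil` (passthrough, no re-encoding). A sketch, assuming your encoders hand you CMSampleBuffers; actually streaming the result to a server needs a third-party RTMP client library or server-side ingestion of the written file/segments:

```swift
import AVFoundation

// Sketch: mux pre-encoded H.264 video and AAC audio CMSampleBuffers
// into a single MP4 file. outputSettings: nil means passthrough.
final class Muxer {
    private let writer: AVAssetWriter
    private let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
    private let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        videoInput.expectsMediaDataInRealTime = true
        audioInput.expectsMediaDataInRealTime = true
        writer.add(videoInput)
        writer.add(audioInput)
    }

    func start(at time: CMTime) {
        writer.startWriting()
        writer.startSession(atSourceTime: time)
    }

    func append(video buffer: CMSampleBuffer) {
        if videoInput.isReadyForMoreMediaData { videoInput.append(buffer) }
    }

    func append(audio buffer: CMSampleBuffer) {
        if audioInput.isReadyForMoreMediaData { audioInput.append(buffer) }
    }

    func finish(_ completion: @escaping () -> Void) {
        videoInput.markAsFinished()
        audioInput.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```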
I am working on an iOS application which needs to show video coming from a server. The video data arrives in packets, in H.264 encoded format. After looking through the AVFoundation options, I didn't find anything that plays video data received in packets over a socket connection. If someone could point me in the right direction it would be very helpful. I have been stuck on this for a couple of days.
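One direction worth exploring here is AVSampleBufferDisplayLayer, which accepts compressed H.264 frames directly and decodes/displays them, bypassing AVPlayer. The hard part is wrapping the incoming NAL units in CMSampleBuffers; the sketch below only shows building the CMFormatDescription from SPS/PPS, and assumes you have already split the socket byte stream into individual NAL units (that parsing code is omitted):

```swift
import AVFoundation
import CoreMedia

// Sketch: display raw H.264 from a socket with AVSampleBufferDisplayLayer.
// Assumes SPS/PPS NAL units have been extracted from the stream and
// frame NAL units are AVCC-style (4-byte length prefix, not start codes).
let displayLayer = AVSampleBufferDisplayLayer()

func makeFormatDescription(sps: [UInt8], pps: [UInt8]) -> CMFormatDescription? {
    var formatDesc: CMFormatDescription?
    sps.withUnsafeBufferPointer { spsPtr in
        pps.withUnsafeBufferPointer { ppsPtr in
            let paramSets = [spsPtr.baseAddress!, ppsPtr.baseAddress!]
            let paramSizes = [sps.count, pps.count]
            CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: paramSets,
                parameterSetSizes: paramSizes,
                nalUnitHeaderLength: 4,   // 4-byte AVCC length prefix
                formatDescriptionOut: &formatDesc)
        }
    }
    return formatDesc
}

// Each frame, wrapped in a CMSampleBuffer that carries this format
// description, is then handed to the layer for decode and display:
// displayLayer.enqueue(sampleBuffer)
```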
I need to play an RTMP stream on a website.
I want it to run on iOS devices, but I know RTMP does not play on them.
So, my idea is to convert the RTMP stream to some audio format (MP4, for example), send it to the user, and play it with HTML5. Do you have a better idea for doing this? I have PHP 5 on the server, but I have no idea how to convert continuous (streaming) data into an HTML5-compatible audio format. For now, it's only an idea I have...
Thanks in advance.
There is currently no live format that works with HTML5. MP4 cannot be used as a live source because it is not a streaming format. The only built-in support for live audio/video on iOS is HLS.