How to play streaming AV data in iOS?

I'm trying to play streaming AV data from a server (e.g. a VLC server). The server sends the AV file data to the client over TCP/IP; the client receives the streaming data, queues it in buffers, and plays from that buffer queue, all before the file has finished downloading (so it is literally 'streaming').
Is there any API in the iOS frameworks to do this?

I solved it with AudioFileStream and Audio Queue Services. Audio Queues support a variety of audio formats; please refer to the Apple docs.
FYI.
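For anyone who lands here later, below is a minimal sketch in Swift of the AudioFileStream + Audio Queue approach. It assumes an MP3 stream arriving over TCP and omits error handling, buffer recycling, and threading, so treat it as an outline of the callback flow rather than a finished player:

```swift
import Foundation
import AudioToolbox

final class StreamPlayer {
    private var fileStream: AudioFileStreamID?
    private var queue: AudioQueueRef?

    init() {
        let selfPtr = Unmanaged.passUnretained(self).toOpaque()
        // Open a parser with an MP3 hint; it fires the property callback as it
        // learns the format, and the packets callback for each parsed packet.
        AudioFileStreamOpen(selfPtr, { clientData, streamID, propertyID, _ in
            let player = Unmanaged<StreamPlayer>.fromOpaque(clientData).takeUnretainedValue()
            if propertyID == kAudioFileStreamProperty_ReadyToProducePackets {
                player.createQueue(from: streamID)
            }
        }, { clientData, numBytes, numPackets, inputData, packetDescs in
            let player = Unmanaged<StreamPlayer>.fromOpaque(clientData).takeUnretainedValue()
            player.enqueue(inputData, byteCount: numBytes,
                           packetCount: numPackets, descriptions: packetDescs)
        }, kAudioFileMP3Type, &fileStream)
    }

    /// Feed raw bytes here as they arrive from the TCP socket.
    func parse(_ data: Data) {
        guard let fileStream = fileStream else { return }
        data.withUnsafeBytes { raw in
            guard let base = raw.baseAddress else { return }
            AudioFileStreamParseBytes(fileStream, UInt32(raw.count), base, [])
        }
    }

    private func createQueue(from streamID: AudioFileStreamID) {
        var asbd = AudioStreamBasicDescription()
        var size = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
        AudioFileStreamGetProperty(streamID, kAudioFileStreamProperty_DataFormat, &size, &asbd)
        let selfPtr = Unmanaged.passUnretained(self).toOpaque()
        AudioQueueNewOutput(&asbd, { _, _, _ in
            // A finished buffer comes back here; a real player would recycle it.
        }, selfPtr, nil, nil, 0, &queue)
    }

    private func enqueue(_ bytes: UnsafeRawPointer, byteCount: UInt32, packetCount: UInt32,
                         descriptions: UnsafeMutablePointer<AudioStreamPacketDescription>?) {
        guard let queue = queue else { return }
        var buffer: AudioQueueBufferRef?
        AudioQueueAllocateBuffer(queue, byteCount, &buffer)
        guard let buf = buffer else { return }
        memcpy(buf.pointee.mAudioData, bytes, Int(byteCount))
        buf.pointee.mAudioDataByteSize = byteCount
        AudioQueueEnqueueBuffer(queue, buf, packetCount, descriptions)
        // Start (or keep) the output queue running.
        AudioQueueStart(queue, nil)
    }
}
```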

Related

How to access audio stream in real time when using Tokbox

I'm building a voice-only iOS (Swift) app and Tokbox is my VoIP provider.
My app is simple: user1 talks to user2. However, I would like to get access to the audio stream in real time. I'm OK with either option: 1. The audio stream goes through my code and I then stream it back to Tokbox. 2. The audio stream is forked to Tokbox and to my code in parallel.
The only way I've been able to get my hands on the audio stream is by using their archiving capabilities, but that is too late (the archive is only available after the session ends).
Any ideas? Or maybe other providers that offer this option?
Option 1 can be done using the external/custom audio driver; take a look at this example of how to use/implement it: https://github.com/opentok/opentok-ios-sdk-samples/tree/master/Custom-Audio-Driver
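If it helps, here is an illustrative Swift sketch of the general "fork the samples to your own code" idea using an AVAudioEngine tap. Note this is not the OpenTok API: the linked sample implements the SDK's own custom audio driver, and processAudio below is a hypothetical handler you would write yourself:

```swift
import AVFoundation

/// Illustration only: obtain real-time microphone PCM buffers with an
/// AVAudioEngine tap. `processAudio` is a hypothetical callback; the actual
/// OpenTok integration goes through its custom audio driver instead.
func startMicTap(engine: AVAudioEngine,
                 processAudio: @escaping (AVAudioPCMBuffer) -> Void) throws {
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    // The tap delivers the newest samples roughly every 1024 frames.
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        processAudio(buffer)
    }
    try engine.start()
}
```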

What is the role of Streaming server like Wowza?

I have been exploring how to live stream from an iPhone. From what I've learned, I will have to publish a stream to a URL on a Wowza server, and I will need an iOS library to encode and compress the camera output and send that stream over the RTMP protocol to the Wowza server. At the receiving end, there should be a player that can decode and decompress the stream coming from Wowza on a device such as an iPhone (for a user who wants to watch the live stream).
My question is: if encoding is done by a particular iOS SDK, RTMP has a role as the protocol, and a player at the receiving end handles decoding, then what is the role of Wowza? What is its function that makes it so important in the live-streaming process?
I have been searching for the function of a media streaming server like Wowza for three days, but I could not work out its exact purpose.
I am desperate for an answer.
Any explanation will be appreciated; thanks in advance!
I actually did a year-long project involving media streaming on iOS, and I used Wowza. Wowza's role is to act as a media server that receives the video broadcast from an iOS device over the RTMP protocol. With Wowza, you can send HTTP parameters that instruct the server to begin or stop recording the live video being streamed (see the sketch below). You also have the option of embedding video players in websites for live viewing.
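To make the "HTTP parameters" part concrete, here is a hedged Swift sketch assuming a LiveStreamRecord-style HTTP interface. The host, port, path, and query parameters are placeholders; check your Wowza version's documentation for the actual endpoint:

```swift
import Foundation

// Assumed endpoint shape only; substitute the real host, port, and
// parameters from your Wowza configuration.
func setRecording(_ start: Bool) {
    var components = URLComponents(string: "http://your-wowza-host:8086/livestreamrecord")!
    components.queryItems = [
        URLQueryItem(name: "app", value: "live"),
        URLQueryItem(name: "streamname", value: "myStream"),
        URLQueryItem(name: "action", value: start ? "startRecording" : "stopRecording"),
    ]
    URLSession.shared.dataTask(with: components.url!) { _, response, error in
        // Wowza answers with a short status response; inspect it here.
        print(response ?? error ?? "no response")
    }.resume()
}
```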

iOS: How to stream audio data from client to server?

I need to send audio chunks to a server as a stream while recording audio on an iPhone. What is the best way to implement this? I looked at HLS, but it supports server-to-client streaming; I need client-to-server streaming. Please suggest the best way to stream audio from a client (an iOS device) to a server.
HLS is not supported from client to server, only from server to client. See
"Streaming audio or video to iPhone, iPod touch, iPad, or Apple TV":
https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html
Probably not the answer you were hoping for.
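One workable alternative (an assumption on my part, not an official pattern) is to capture PCM with an AVAudioEngine tap and push each chunk over a plain TCP connection with the Network framework. The host, port, and raw-Float32 wire format below are placeholders; a real service would agree on a framing or use something like RTP or WebSockets:

```swift
import AVFoundation
import Network

/// Sketch: stream microphone PCM to a server over TCP. The caller keeps the
/// returned engine and connection alive for as long as the stream should run.
func streamMic(to host: NWEndpoint.Host, port: NWEndpoint.Port) throws
    -> (AVAudioEngine, NWConnection) {
    let connection = NWConnection(host: host, port: port, using: .tcp)
    connection.start(queue: .global())

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)
    // Each tap callback hands us the newest chunk; send channel 0 as raw bytes.
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        guard let channel = buffer.floatChannelData?[0] else { return }
        let data = Data(bytes: channel,
                        count: Int(buffer.frameLength) * MemoryLayout<Float>.size)
        connection.send(content: data, completion: .contentProcessed { _ in })
    }
    try engine.start()
    return (engine, connection)
}
```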

iOS RTP live audio receiving

I'm trying to receive a live RTP audio stream on my iPhone, but I don't know where to start. I've been looking for samples but can't find any.
I have a Windows desktop app which captures audio from the selected audio interface and streams it as µ-law or A-law. This app works as an audio server that serves the stream to any incoming connection. I should mention that I've already developed an Android app that receives this stream and works, so I want to replicate that functionality on iOS. On Android there is the "android.net.rtp" package for managing this and for transmitting or receiving data streams over the network.
Is there any kind of equivalent package for iOS? Could you give me any kind of reference or sample, or just tell me where to start?
You can look at this library: HTTPLiveStreaming. Its protocol may not be a standard one, though; you can also check my fork, aelam/HTTPLiveStreaming-1. I'm still working on it, but its output can be played by ffplay. Give it a try.
Check the file rtp.c in FFmpeg; I think it will help.
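As a starting point on the iOS side, here is a hedged Swift sketch of the receive path, assuming RTP over UDP carrying G.711 µ-law with a fixed 12-byte header (no CSRCs or extensions). Playback is left out; the decoded PCM could be fed to an Audio Queue as in the first answer on this page:

```swift
import Foundation
import Network

/// Standard G.711 µ-law expansion to 16-bit linear PCM.
func ulawToLinear(_ byte: UInt8) -> Int16 {
    let u = ~byte
    let t = ((Int32(u & 0x0F) << 3) &+ 0x84) << Int32((u & 0x70) >> 4)
    return Int16(truncatingIfNeeded: (u & 0x80) != 0 ? 0x84 - t : t - 0x84)
}

/// Listen for RTP/UDP packets and decode their µ-law payloads.
func startReceiver(on port: NWEndpoint.Port) throws {
    let listener = try NWListener(using: .udp, on: port)
    listener.newConnectionHandler = { connection in
        connection.start(queue: .global())
        receiveNext(on: connection)
    }
    listener.start(queue: .global())
}

private func receiveNext(on connection: NWConnection) {
    connection.receiveMessage { data, _, _, _ in
        if let packet = data, packet.count > 12 {
            // Skip the fixed 12-byte RTP header, decode the rest.
            let pcm = packet.dropFirst(12).map(ulawToLinear)
            _ = pcm // ...enqueue for playback...
        }
        receiveNext(on: connection)
    }
}
```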

Objective C: iOS: Audio Streaming and Audio Uploading using RTMP

I have to record a high volume of sound; a recording can be up to 60 seconds long. The application already exists in Flash, and the current Flash application uses RTMP (a Red5 server) to stream the recording to the server in real time as FLV. From the iOS application I have uploaded the audio file to the server as MP3, but this approach is not workable because the volume is very high, as mentioned above. So I want to use RTMP (the Red5 server) from the iOS application too. Is it possible to record the audio, upload it to this server, and also stream from there?
Edit-1: I want to ask whether Apple can reject the application if we use an RTMP iOS library in it.
Edit-2: I did some research and found some third-party libraries, such as:
http://www.aftek.com/afteklab/aftek-iphone-RTMP-library.shtml
I want to know whether it is possible to connect directly from iOS to an RTMP server,
or whether these libraries use a middle-layer approach, going over HTTP to reach the RTMP server.
Example: iOS -> HTTP Server -> RTMP Server (Red5)
I would appreciate the help.
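For reference, the record-then-upload path described above can be sketched like this in Swift (iOS records AAC natively rather than MP3, and the upload URL is a placeholder). Whether a direct RTMP connection is possible depends entirely on the third-party library; nothing in the system frameworks speaks RTMP:

```swift
import AVFoundation

/// Sketch of the existing record-then-upload approach from the question.
/// The settings record AAC (MP3 encoding is not built into iOS), and
/// https://example.com/upload is a placeholder endpoint.
func recordClip(to fileURL: URL) throws -> AVAudioRecorder {
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1,
    ]
    let recorder = try AVAudioRecorder(url: fileURL, settings: settings)
    recorder.record(forDuration: 60) // the 60-second cap from the question
    return recorder
}

func upload(fileAt fileURL: URL) {
    var request = URLRequest(url: URL(string: "https://example.com/upload")!)
    request.httpMethod = "POST"
    URLSession.shared.uploadTask(with: request, fromFile: fileURL).resume()
}
```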
