Listening in on active Asterisk calls from a custom iOS app - ios

I am working on a mobile app to listen in on ongoing Asterisk calls. Asterisk is set up to record calls; however, the inbound and outbound voices get saved to different wav files. The first obstacle was streaming the wav files while they are still being written to, which I achieved using Node.js. Now I need to mix the two files together and stream the result, which would be straightforward if the files were not being written to at the same time.
The first option would be to figure out how to programmatically mix the two files, continuously checking whether the EOF has moved, while also streaming the result. (This feels above my pay grade.)
The second option would be to stream the two files independently to the client iOS application, which would play them at the same time. Even if the challenge of playing two streams simultaneously were solved, this would require a very stable connection, so I don't see it as a viable option.
A third possibility would be to embed a softphone into the iOS app and use it as a client for ChanSpy. Would that be possible, and what library could help me achieve it?
What do you suggest? Perhaps there are more options out there?
Thanks

What about using Application_MixMonitor instead?

Why not just build a SIP client on iOS and use ChanSpy to listen to the calls live?
http://www.voip-info.org/wiki/view/Asterisk+cmd+ChanSpy

You can supply the m option to the Monitor application, or use sox to do the mixing.
https://wiki.asterisk.org/wiki/display/AST/Application_Monitor
http://leifmadsen.wordpress.com/tag/mixmonitor-sox-mixing-asterisk-script/

Related

How to play wav file with 8Kbps bit rate in iOS application?

I am developing an iOS application where I call a web service that returns an audio file (as a byte array) and then play it to the user.
The problem is the audio file format: it has an 8 kbps bit rate, and no player inside the app can play it. When I convert it to any other bit rate, for example 13 kbps, on the server side as a test, it plays properly. However, I have a huge number of files, so converting them manually is impossible. Is there any way to convert the files inside the iOS app code?
To address your comment about converting the files manually: Is it a plausible solution for you to do it automatically, server side? If so, you probably have a few options.
One would be to install Audacity, which handles all of these bit rates, and use the Chains feature for batch processing. If it's a one-off conversion you could initiate it manually, otherwise you may need to find a way of scripting the process (if new files come in from an external source, that is).
As for playing these files in iOS, have you considered embedding libpd (just one example of many, but one that you can get up and running in minutes)? It has a fairly open approach to file playback, and may handle these file formats. If you send me an example I can test right away!

Can iOS8 CloudKit support streaming behind the scenes?

Is there any way, using currently available SDK frameworks on Cocoa (touch) to create a streaming solution where I would host my mp4 content on some server and stream it to my iOS client app?
I know how to write such a client, but it's a bit confusing on server side.
AFAIK, CloudKit is not suitable for that task because behind the scenes it keeps a synced local copy of the datastore, which is NOT what I want. I want to store media content remotely and stream it to the client so that it does not take up precious space on a poor 16 GB iPad mini.
Can I accomplish that server solution using Objective-C / Cocoa Touch at all?
Should I instead resort to Azure and C#?
It's not 100% clear why you would do anything like that.
If you have control over the server side, why don't you just set up a basic HTTP server, and on the client side use AVPlayer to fetch the mp4 and play it back to the user? It is very simple; a basic Apache setup would do the job.
If it is live media content you want to stream, then this guide is worth reading as well:
https://developer.apple.com/Library/ios/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/StreamingMediaGuide.pdf
Edited after your comment:
If you would like to use AVPlayer as a player, then I think those two things don't fit that well. AVPlayer needs to buffer different ranges ahead (for some container formats the second/third request is reading the end of the stream). As far as I can see CKFetchRecordsOperation (which you would use to fetch the content from the server) is not capable of seeking in the stream.
If you have your private player which doesn't require seeking, then you might be able to use CKFetchRecordsOperation's perRecordProgressBlock to feed your player with data.
Yes, you could do that with CloudKit. First, it is not true that CloudKit keeps a local copy of the data. It is up to you what you do with the downloaded data. There isn't even any caching in CloudKit.
To do what you want to do, assuming the content is shared between users, you could upload it to CloudKit in the public database of your app. I think you could do this with the CloudKit web interface, but otherwise you could create a simple Mac app to manage the uploads.
The client app could then download the files. It couldn't stream them though, as far as I know. It would have to download all the files.
If you want a streaming solution, you would probably have to figure out how to split the files into small chunks, and recombine them on the client app.
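The split-and-recombine step could look roughly like this in Node.js (the chunk size is arbitrary; in this scheme each chunk would be uploaded as its own CloudKit record or asset):

```javascript
// Split a file's bytes into fixed-size pieces and reassemble them in order.
function splitIntoChunks(buf, chunkSize) {
  const chunks = [];
  for (let off = 0; off < buf.length; off += chunkSize) {
    chunks.push(buf.subarray(off, Math.min(off + chunkSize, buf.length)));
  }
  return chunks;
}

function recombine(chunks) {
  return Buffer.concat(chunks);
}
```

The client would fetch chunks sequentially and start playback as soon as it has enough of them buffered, which approximates streaming without true range requests.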
I'm not sure whether this document is up to date, but there is a paragraph, "Requirements for Apps", which requires the use of HTTP Live Streaming if you deliver any video exceeding 10 minutes or 5 MB.

Is it possible to cast the input from the phone microphone to the receiver?

I would like to know if it is possible to cast the audio taken directly from the microphone of an iOS device to the receiver (in a live way).
I've downloaded all the git example projects, and all of them use a "loadMedia" method to start the casting. Here is one example:
- (NSInteger)loadMedia:(GCKMediaInformation *)mediaInfo
              autoplay:(BOOL)autoplay
          playPosition:(NSTimeInterval)playPosition;
Can I follow this approach to do what I want? If so, what's the expected delay?
Thanks a lot
Echo is likely if the device (iOS, Android, or Chrome) is in range of the speakers. That said, pick a fast codec that is supported, such as CELT/Opus or Vorbis. I haven't tried either of the following approaches, but they should be possible:
1. Implement your own protocol using CastChannel that passes the binary data. You'll want to do some simple conversion of the stream from binary to something a bit more friendly. Take a look at Intro to Web Audio for using AudioContext.
2. Set up a trivial server on your device to stream from, then tell the Receiver to access that local server.
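For the binary-to-friendly conversion step in option 1, here is a small sketch of turning raw 16-bit PCM from the microphone into the Float32 samples an AudioContext buffer expects (this assumes the mic data arrives as signed Int16 PCM, which is an assumption about your capture pipeline):

```javascript
// Convert signed 16-bit PCM samples to the [-1, 1) Float32 range
// used by Web Audio API buffers on the receiver.
function int16ToFloat32(int16) {
  const out = new Float32Array(int16.length);
  for (let i = 0; i < int16.length; i++) {
    out[i] = int16[i] / 32768; // 32768 = 2^15, the Int16 magnitude
  }
  return out;
}
```

On the receiver you would copy the result into an AudioBuffer channel and schedule it for playback.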

What is the simplest way to stream from an iOS device to a Wowza server?

I have tried to use LiVu and Broadcast Me, but they do not work smoothly with what I am trying to do. I need to live stream audio/video from the iPhone to our servers (while saving locally).
I have tried to implement a RTSP UDP stream but it is proving to be more of a challenge than we initially thought.
RTSP/UDP is preferred, but whatever gets the stream to the servers in a timely fashion will work.
Any advice or framework suggestions would really help. I have already looked at iOS-RTMP-Library, but it's too expensive for us at this point.
I don't know about your budget, but you might check the ANGL lib, which worked fine for us with RTMP.

iOS streaming audio from network -- random access of a 6-hour file

A potential client has come to me asking for an app which will stream a six-hour audio file. The user needs to be able to set the "playback head" to any position in the file. Presumably, this means the app must not be forced to download the entire file before it begins playing back from an arbitrary position.
An added complication -- there are actually four files which need to be streamed and mixed simultaneously.
My questions are:
1) Is there an out-of-the-box technology which will allow me random access of streaming audio on iOS? Can this be done with standard server technology and a single long file, or will it involve some fancy server tech?
2) Which iOS framework is best suited for this. Is there anything high-level that would allow me to easily mix these four audio files?
3) Can this be done entirely with standard browser technology on the client side? (i.e. HTML5)
Have a close look at the MP3 format. It is remarkably easy and efficient to parse, chop up into little bits, and reassemble into a custom stream.
Hence rolling your own server-side code to grab what you want and send to the client will not be as crazy or difficult as it may sound.
MP3 is also widely supported by various clients. I strongly suspect any HTML5-capable browser will be able to play the stream you generate via a long-lived, bit-rate-regulated HTTP request.
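As an illustration of how approachable the parsing is, here is a small Node.js sketch that locates MP3 frame boundaries by scanning for the frame sync (0xFF followed by a byte whose top three bits are set); a real server would also need to skip ID3 tags and reject false syncs:

```javascript
// Find the next MP3 frame boundary at or after `from`.
// MP3 frames begin with an 11-bit sync word: 0xFF, then a byte
// whose top 3 bits are all 1 (byte & 0xE0 === 0xE0).
function nextFrameOffset(buf, from) {
  for (let i = from; i + 1 < buf.length; i++) {
    if (buf[i] === 0xff && (buf[i + 1] & 0xe0) === 0xe0) return i;
  }
  return -1; // no frame sync found
}
```

With frame boundaries in hand, the server can start a response at the frame nearest any requested timestamp and stream whole frames from there, which is exactly the random-access behavior the question asks about.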
