iOS: developing audio (music) streaming from server

I'm researching how to start developing an audio streaming app. The first thing I'd like to ask about is which framework to use. I've tested DOUAudioStreamer, tumtumtum/StreamingKit, RadioKit, and others.
The framework needs to be able to seek to an arbitrary point in a song with low latency. Spotify has that amazing seek feature: find a song you've never listened to before and play it, then slide the time slider to the middle, and the song continues to play with low latency, as if the whole song had already been downloaded.
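For what it's worth, if you end up on plain AVPlayer, seek latency is largely a matter of how much tolerance you allow: a wide tolerance lets the player resume from the nearest segment or keyframe boundary rather than an exact position. A minimal sketch, with a placeholder stream URL:

```swift
import AVFoundation

// Minimal sketch of seeking with AVPlayer; the stream URL is a placeholder.
// Wide tolerances let the player resume from the nearest keyframe/segment boundary,
// which is usually much faster than a sample-accurate seek.
let player = AVPlayer(url: URL(string: "https://example.com/stream/song.m3u8")!)
player.play()

func seek(toSeconds seconds: Double) {
    let target = CMTime(seconds: seconds, preferredTimescale: 600)
    player.seek(to: target,
                toleranceBefore: .positiveInfinity,
                toleranceAfter: .positiveInfinity) { finished in
        if finished { print("Resumed near \(seconds) s") }
    }
}
```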

I guess you could do some predictive analysis as to what songs the user will listen to next, i.e. start caching songs before they actually start to play. That will at least improve the experience somewhat.
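One way to sketch that caching idea with stock AVFoundation is AVQueuePlayer, which begins buffering the next enqueued item while the current one plays. The URLs below are placeholders, and the "predicted next" logic is assumed to live elsewhere:

```swift
import AVFoundation

// Sketch of the "cache ahead" idea with AVQueuePlayer, which starts buffering the
// next enqueued item while the current one is still playing. URLs are placeholders.
let current = AVPlayerItem(url: URL(string: "https://example.com/songs/current.mp3")!)
let predictedNext = AVPlayerItem(url: URL(string: "https://example.com/songs/next.mp3")!)

let queuePlayer = AVQueuePlayer(items: [current, predictedNext])
queuePlayer.play()

// If the prediction changes, swap out the upcoming item without interrupting playback.
func replaceUpcomingItem(with url: URL) {
    queuePlayer.remove(predictedNext)
    queuePlayer.insert(AVPlayerItem(url: url), after: queuePlayer.items().first)
}
```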

Related

Audio playlist autoplaying queue on iOS

Greetings to all fellow front-end coders.
I have quite a headache because of Apple's autoplay policy. As you've probably experienced, you can't autoplay a video / audio element with sound without user interaction. (I'm referring to this)
I'm currently trying to create my own music web application that will support playlists. I work with the Next.js framework and use react-player as the player. Since it doesn't natively support playlists, I created a two-player pendulum system, where one player plays the current sound file while the other loads the next; then they swap roles.
Everything works great in all browsers except those on iOS, because the swapped-in player relies on autoplay to start.
I've been stuck on this for several days. I've already considered making Apple users press the play button every time the song changes, as a punishment, but of course that's not a good solution.
I think many of you have encountered this and have found a solution. Please help me :(

Why would AVPlayer observedBitrate (and playback bit rate) seriously degrade on second playback in iOS 13?

I have an app that uses AVPlayer to play live streams and on-demand video. That in-house app (and a recent in-house update) plays video beautifully in all sorts of conditions. Or it did, until a user in the company updated to iOS 13.
The same app run under iOS 12 from the same location works perfectly, as before.
Now, after a reboot of the phone, the first video played plays perfectly, and if it is a live stream it will run for an indefinite amount of time without trouble. The second video played will always play audio only or fail, depending on the bit rates available for the video, even for the exact same video.
Quitting and restarting the app makes no difference in the results. Restarting the phone WILL fix it for the next video played.
What appears to be happening is that on the first play, observedBitrate in the event log is correct and the correct bit rate stream is played. On the second playback, observedBitrate starts out an order of magnitude lower, and no reasonable amount of time sees that rate change significantly.
If the connection is an order of magnitude better than necessary (evidently that is true for most of these users, myself included, before testing with the Network Link Conditioner), then everything appears to work normally and life is good. What is even stranger is that on these higher-quality connections I don't see the same observedBitrate drop. Also, it appears that video served from a different IP (but not a different domain on the same IP) will work once and then fail the second time, as if some kind of per-connection bit rate cache is being used. These last two observations have not been repeated enough to be cast in stone, but they have been observed more than once.
I've scoured the iOS 13 release notes in hopes that I'm missing some change or need for a new key but nothing strikes me as relevant.
Any ideas appreciated!!!
A very similar question was posted earlier this year: Video playback issues only on iOS 13 with AVPlayerViewController and AVPlayer when using HLS video. The unaccepted answer to that question does not apply here (and may not apply there either, for all we know). I do wait for StreamPlayerItemStatusObserverContext to change to AVPlayerStatusReadyToPlay.
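Not an answer, but it may help anyone trying to reproduce this: you can log what the player believes the throughput is on each playback by watching the item's access log. The snippet below is only a diagnostic sketch using standard AVFoundation notifications, nothing taken from the affected app:

```swift
import AVFoundation

// Diagnostic sketch (not a fix): print what AVPlayer reports for throughput each time
// a new access-log entry is written, so iOS 12 and iOS 13 runs can be compared.
func observeAccessLog(for item: AVPlayerItem) -> NSObjectProtocol {
    return NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewAccessLogEntry,
        object: item,
        queue: .main
    ) { _ in
        guard let event = item.accessLog()?.events.last else { return }
        print("observedBitrate:", event.observedBitrate,   // measured network throughput
              "indicatedBitrate:", event.indicatedBitrate, // bitrate of the selected variant
              "switchBitrate:", event.switchBitrate)
    }
}
```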

iOS App for Live Concert Ideas

I want to make an app for concerts. Basically, it will sync lights to the music played at the concert; the flashing of the lights should be in sync with the beat of the music.
Something like this:
https://itunes.apple.com/us/app/dan-deacon/id536378735?mt=8
My questions are:
1. Is the app listening to the music through the microphone?
2. If it is listening, how does it know when to flash the lights based on the beat of the music, or the music itself?
3. The example app I posted doesn't use any WiFi or mobile data, so does that mean it's standalone and does #1 and #2 on its own?
I'm asking because I'm new to app development. I want to build those features, but my idea is to have a server we control that simply sends the light-flashing pattern to the apps. My concern is that if there were 100,000 people at the event, all with a WiFi connection, would sending the flashing pattern be a problem, especially when you're sending hundreds of thousands of commands at the same time? I would prefer it to work in offline mode, but how exactly would that work?
That is an amazing app. You need to synchronize the apps to the music, and maybe the apps among themselves.
I picked some links for you:
http://content.iospress.com/articles/journal-of-embedded-computing/jec00021
http://www.cse.wustl.edu/~cdgill/PDF/RTSS08.pdf
http://blog.trifork.com/2013/02/18/build-build-massively-scalable-soft-real-time-systems-with-erlang/
In practice, I can imagine that the user starts playing the song and the app synchronizes itself using the recorded sound...
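As a rough illustration of that offline approach (listen through the microphone, treat loud frames as beats, flash the torch), here is a sketch; the RMS threshold and buffer size are arbitrary example values, and a real implementation would need proper onset detection rather than a plain level check:

```swift
import AVFoundation

// Rough sketch: tap the microphone, compute a crude loudness (RMS) per buffer, and
// toggle the torch when it exceeds a threshold. Requires microphone permission and an
// active AVAudioSession; the 0.3 threshold and 1024 buffer size are example values.
final class BeatFlasher {
    private let engine = AVAudioEngine()

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            guard let samples = buffer.floatChannelData?[0] else { return }
            let n = Int(buffer.frameLength)
            var sum: Float = 0
            for i in 0..<n { sum += samples[i] * samples[i] }
            let rms = (sum / Float(max(n, 1))).squareRoot()
            if rms > 0.3 { self.toggleTorch() }   // naive "beat" = loud frame
        }
        try engine.start()
    }

    private func toggleTorch() {
        guard let device = AVCaptureDevice.default(for: .video),
              device.hasTorch, (try? device.lockForConfiguration()) != nil else { return }
        device.torchMode = device.torchMode == .on ? .off : .on
        device.unlockForConfiguration()
    }
}
```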

AVPlayer seekToTime downloads an insane amount of media segment files, consuming a lot of data

I'm working on an app where I can play an HLS m3u8 playlist from a streaming radio station (audio only) without any problem using an instance of AVPlayer. Using Charles I can see that the playlist updates at a normal pace (every 9-10 seconds, fetching one media segment file). When I perform a seekToTime: (back in time), the player successfully plays the stream from where I want, but in Charles I observe the player start downloading a huge number of media segment files, consuming a lot of data. It seems that the player downloads all the media segment files up to that time and then returns to its normal behaviour.
I understand that the correct behaviour would be to download the media segment file for the time I'm seeking to, start playing it, and then keep downloading 1 or 2 media segment files every 9-10 seconds, as it does when I play the stream without timeshift.
My question is whether this is normal behaviour, or whether something could be wrong with my m3u8 playlist or the client implementation. Could anyone help me clarify this?
UPDATED: I can confirm this doesn't happen in iOS 7, so it seems to be a bug introduced by iOS 8.
I've been told by Apple that this is not a bug, but a feature. They've made the buffer bigger since iOS 8.
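If anyone hits this on newer systems: since iOS 10 you can at least hint how far ahead the player should buffer with preferredForwardBufferDuration. It postdates the iOS 8 behaviour described above and is only a hint, but it is the closest public knob I know of. A sketch with a placeholder playlist URL and example values:

```swift
import AVFoundation

// Sketch: hint how far ahead AVPlayer should buffer (iOS 10+). The playlist URL,
// the 30 s buffer and the 120 s seek target are example values, not from the question.
let item = AVPlayerItem(url: URL(string: "https://example.com/radio/playlist.m3u8")!)
item.preferredForwardBufferDuration = 30   // seconds of media to keep ahead of the playhead

let player = AVPlayer(playerItem: item)
player.play()

// Seeking back with a coarse tolerance lets the player start from a segment boundary
// instead of fetching everything needed for an exact position.
let target = CMTime(seconds: 120, preferredTimescale: 1)
player.seek(to: target, toleranceBefore: .positiveInfinity, toleranceAfter: .positiveInfinity)
```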

iOS - Streaming Multitrack Audio

I'm building an app in which I need to stream multiple tracks of audio that make up a song. They all need to be synchronized so the song plays back naturally.
I've been able to play back local multitrack audio very well with the solution on this thread: multi track mp3 playback for iOS application, but it looks like AVAudioPlayer isn't able to stream.
I've been looking into working with DOUAudioStreamer because I've read that it's the best solution for streaming audio on iOS without going fairly low-level, but it seems to lack an equivalent of the -playAtTime: method, which is how the tracks were synced up with AVAudioPlayer.
Does anyone know a workaround for this using DOUAudioStreamer, or have any advice on another way I should approach this? Thanks.
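One workaround that stays with stock AVFoundation, rather than DOUAudioStreamer: use one AVPlayer per track (AVPlayer can stream, unlike AVAudioPlayer), preroll them all, and start them at the same host time with setRate(_:time:atHostTime:), which fills roughly the role -playAtTime: did. A sketch with placeholder track URLs:

```swift
import AVFoundation

// Sketch of syncing multiple streamed stems with AVPlayer: preroll each player, then
// start them all at the same host clock time. Track URLs are placeholders.
let trackURLs = [
    URL(string: "https://example.com/song/drums.m3u8")!,
    URL(string: "https://example.com/song/bass.m3u8")!,
    URL(string: "https://example.com/song/vocals.m3u8")!,
]
let players = trackURLs.map { AVPlayer(url: $0) }

func startInSync() {
    let group = DispatchGroup()
    for player in players {
        // Host-time starts are rejected unless automatic stall-waiting is turned off.
        player.automaticallyWaitsToMinimizeStalling = false
        group.enter()
        player.preroll(atRate: 1.0) { _ in group.leave() }
    }
    group.notify(queue: .main) {
        // Start everyone half a second in the future, on the same host clock tick.
        let start = CMTimeAdd(CMClockGetTime(CMClockGetHostTimeClock()),
                              CMTime(seconds: 0.5, preferredTimescale: 600))
        for player in players {
            player.setRate(1.0, time: .invalid, atHostTime: start)
        }
    }
}
```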
