Why does video start buffering with an offline server? - iOS

I am implementing functionality to download videos and play them in offline mode. I am using NexPlayer with GCDWebServer; my videos are encoded and need to sync with the server. I use GCDWebServer for offline playback, but after the video plays for a while it starts buffering.
So my question is: this is an offline server and we already have all the data, so why is it buffering? I don't understand this. Please suggest something, or tell me whether I can use another server instead of GCDWebServer.
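For reference, the offline-server side of such a setup boils down to pointing GCDWebServer at the downloaded files with HTTP range requests enabled so the player can seek. A minimal sketch (the directory, port, and file name are placeholders, not the asker's actual configuration):

import Foundation
import GCDWebServer

// Serve the app's downloaded videos over a local HTTP server.
let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]

let webServer = GCDWebServer()
webServer.addGETHandler(forBasePath: "/",
                        directoryPath: documentsPath,
                        indexFilename: nil,
                        cacheAge: 3600,
                        allowRangeRequests: true)   // range requests let the player seek without re-buffering
_ = webServer.start(withPort: 8080, bonjourName: nil)

// Point the player at the local copy, e.g. http://127.0.0.1:8080/myEncodedVideo.mp4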

Sorry for the late reply, but I found the solution to my problem.
I had set some buffering values on NexPlayer as if it were playing online video, but since the file is stored locally, we don't want any buffering thresholds to interrupt playback.
So I simply removed them (set them to 0) and it works perfectly, as expected.
No buffering, happy life :)

Related

YouTube encoder won't start for live streams

I'm trying to get a live stream working on YouTube. I want to stream 360° content with H.264 video and AAC audio. The stream is started with the YouTube Live API from my mobile app, and librtmp is used to deliver video and audio packets. I easily get to the point where the live stream health is good and my broadcast and stream are bound successfully.
However, when I try to transition to "testing" like this:
YoutubeManager.this.youtube.liveBroadcasts().transition("testing", liveBroadcast.getId(), "status").execute();
I get stuck on the "startTesting" status every time (100% reproducible), while I expect it to change to "testing" after a few seconds so I can then change it to "live".
I don't know what's going on, as everything looks fine in the YouTube Live Control Room, but the encoder won't start.
Is this a common issue? Is there a way to access the encoder logs? If you need more information, feel free to ask.
Regards.
I found a temporary fix!
I noticed two things:
When the autostart option was on, the stream changed its state to startLive as soon as I stopped sending data. This suggested that the encoder was trying to start but was too slow to do so before some other data packet was received (I guess).
When I tried to stream to the "Stream now" URL, as #noogui suggested, it worked! So I checked what the difference was between the Stream now and event configurations.
It turned out I just had to activate the low latency option, as is done by default in the Stream now configuration.
I consider it a temporary fix because I don't really know why the encoder isn't starting otherwise, and because it doesn't work with the autostart option... So I hope it won't break again if YouTube makes another change to their encoder.
So, if you have to work with the YouTube API, good luck!
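For anyone configuring the broadcast programmatically rather than through the control room: the flag in question lives on the broadcast's contentDetails. A rough sketch of toggling it with a plain REST call via URLSession (the broadcast ID and OAuth token are placeholders; newer API revisions express the same idea through contentDetails.latencyPreference):

import Foundation

// Rough sketch: set enableLowLatency on an existing broadcast through the
// YouTube Live Streaming API. Note that liveBroadcasts.update overwrites the
// whole contentDetails part, so include any other contentDetails fields you
// want to keep.
func enableLowLatency(broadcastID: String, accessToken: String) {
    var request = URLRequest(url: URL(string:
        "https://www.googleapis.com/youtube/v3/liveBroadcasts?part=contentDetails")!)
    request.httpMethod = "PUT"
    request.setValue("Bearer \(accessToken)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "id": broadcastID,
        "contentDetails": ["enableLowLatency": true]
    ]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, response, error in
        // Inspect the returned broadcast resource / HTTP status here.
    }.resume()
}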

Web Audio API audio editor saving edited clip back onto web server

I am making a drum machine and have implemented a recording function using the recorderJS library. The problem, as you might expect, is limited functionality in terms of not being able to edit the recorded clips. So my question is: if I were to implement an audio editor that allows the user to trim a clip, how would I go about saving the edited clip back onto the web server?
Is this even possible using Web Audio API?
Many Thanks
The Web Audio API doesn't do this for you; you need a back-end server that can accept uploads. You'll also probably want to re-encode the audio data (as WAV, MP3, OGG, etc.).

Can iOS 8 CloudKit support streaming behind the scenes?

Is there any way, using the currently available SDK frameworks in Cocoa (Touch), to create a streaming solution where I would host my mp4 content on some server and stream it to my iOS client app?
I know how to write such a client, but it's a bit confusing on server side.
AFAIK CloudKit is not suitable for that task because behind the scenes it keeps a synced local copy of the datastore, which is NOT what I want. I want to store media content remotely and stream it to the client so that it does not take up precious space on a poor 16 GB iPad mini.
Can I accomplish that server solution using Objective-C / Cocoa Touch at all?
Should I instead resort to Azure and C#?
It's not 100% clear why you would do anything like that.
If you have control over the server side, why don't you just set up a basic HTTP server and, on the client side, use AVPlayer to fetch the mp4 and play it back to the user? It is very simple; a basic Apache setup would do the job.
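The client side of that suggestion is only a few lines. A minimal sketch with AVPlayer and AVPlayerViewController (the URL is a placeholder):

import UIKit
import AVKit
import AVFoundation

// Minimal sketch: AVPlayer progressively downloads and plays an mp4 served by
// any plain HTTP server (Apache, nginx, ...). The URL is a placeholder.
func playRemoteMovie(from presenter: UIViewController) {
    let url = URL(string: "https://example.com/videos/movie.mp4")!
    let player = AVPlayer(url: url)

    let playerController = AVPlayerViewController()
    playerController.player = player

    presenter.present(playerController, animated: true) {
        player.play()
    }
}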
If it is live media content you want to stream, then this guide is worth reading as well:
https://developer.apple.com/Library/ios/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/StreamingMediaGuide.pdf
Edited after your comment:
If you would like to use AVPlayer as the player, then I think those two things don't fit together that well. AVPlayer needs to buffer different ranges ahead (for some container formats the second/third request reads the end of the stream). As far as I can see, CKFetchRecordsOperation (which you would use to fetch the content from the server) is not capable of seeking in the stream.
If you have your own player which doesn't require seeking, then you might be able to use CKFetchRecordsOperation's perRecordProgressBlock to feed your player with data.
Yes, you could do that with CloudKit. First, it is not true that CloudKit keeps a local copy of the data. It is up to you what you do with the downloaded data. There isn't even any caching in CloudKit.
To do what you want to do, assuming the content is shared between users, you could upload it to CloudKit in the public database of your app. I think you could do this with the CloudKit web interface, but otherwise you could create a simple Mac app to manage the uploads.
The client app could then download the files. It couldn't stream them though, as far as I know. It would have to download all the files.
If you want a streaming solution, you would probably have to figure out how to split the files into small chunks, and recombine them on the client app.
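A rough sketch of that upload/download flow with CKAsset against the public database (the record type and field name are made up for illustration; error handling is omitted):

import CloudKit

let publicDB = CKContainer.default().publicCloudDatabase

// Upload, e.g. from a small Mac uploader app as suggested above.
func upload(movieFileURL: URL) {
    let record = CKRecord(recordType: "Video")           // "Video" and "file" are illustrative names
    record["file"] = CKAsset(fileURL: movieFileURL)
    publicDB.save(record) { _, error in
        // handle error
    }
}

// Download on the client: the whole file arrives before you can play it.
func download(recordID: CKRecord.ID, completion: @escaping (URL?) -> Void) {
    publicDB.fetch(withRecordID: recordID) { record, error in
        let asset = record?["file"] as? CKAsset
        completion(asset?.fileURL)                        // a local temporary file containing the video
    }
}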
I'm not sure whether this document is up to date, but there is a paragraph, "Requirements for Apps", which requires using HTTP Live Streaming if you deliver any video exceeding 10 minutes or 5 MB.

What is the simplest way to stream from an iOS device to a Wowza server?

I have tried LiVu and Broadcast Me, but they do not work smoothly with what I am trying to do. I need to live stream audio/video from the iPhone to our servers (while saving locally).
I have tried to implement an RTSP/UDP stream, but it is proving to be more of a challenge than we initially thought.
RTSP/UDP is preferred, but whatever gets the stream to the servers in a timely fashion will work.
Any advice or framework suggestions would really help. I have already looked at iOS-RTMP-Library, but it's too expensive for us to use at this point.
I don't know about your budget, but you might check out the ANGL lib, which worked fine for us over RTMP.

How can I stream a movie in iOS and playback from the filesystem later?

I've got an app that currently ships with all the videos it can play embedded in it. This doesn't scale well, and unless you want to play all the movies, it wastes disk space. It also makes it less desirable to upgrade the app, because you have to re-download all the movies.
What I would like to do is download the movie on the fly, play it back while downloading, and then if it's successfully downloaded, save it to the file system so that next time they want to watch it, it streams from the local file.
I can do whatever is needed to the video, but currently I'm serving it up as an .mp4 file from Amazon S3 with a MIME type of video/mp4, and so the first half of my issue works fine: the movie downloads, and MPMoviePlayerController will start playing it as soon as it thinks it has downloaded "enough."
Is there any way to tap into the cache of that video file so that I can save it and control how long it resides on the filesystem? This seems like it would be the easiest approach.
I am targeting iOS 5+6, but if the only solution available required iOS 6, I would consider it also. Thanks!
UPDATE: Using AFNetworking, I am now halfway there, I think. I am downloading the video file from the server and listening for the download progress. Once I see that 25% of the video has been downloaded, I start playback of the local file using an MPMoviePlayerController.
The main issue I'm running into now is that playback seems to get screwed up. It's going along fine, 25% downloaded, playback starts... the download continues normally... then the file finishes downloading completely, and shortly thereafter the video freezes. The onscreen playback timer still indicates playback is ongoing and I don't see any "playback finished" type notifications, but the video is frozen. My guess, based on the behavior, is that perhaps the initial buffer for the video playback was used up, and it isn't detecting that more video is now available on disk?
Is there any way to interact with MPMoviePlayerController to let it know periodically to refresh the buffer it's playing out of? Or some other way to handle this situation?
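For what it's worth, the setup described in that update looks roughly like the sketch below, with URLSession standing in for AFNetworking (a substitution for brevity); the destination file name and the 25% threshold are placeholders taken from the description:

import Foundation
import MediaPlayer

// Download the movie progressively, append it to a local file, and start
// MPMoviePlayerController on that (still growing) file once ~25% has arrived.
final class ProgressiveDownloader: NSObject, URLSessionDataDelegate {
    private let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent("movie.mp4")
    private var fileHandle: FileHandle?
    private var expectedLength: Int64 = 0
    private var receivedLength: Int64 = 0
    private var player: MPMoviePlayerController?

    func start(remoteURL: URL) {
        _ = FileManager.default.createFile(atPath: destination.path, contents: nil)
        fileHandle = try? FileHandle(forWritingTo: destination)
        let session = URLSession(configuration: .default, delegate: self, delegateQueue: .main)
        session.dataTask(with: remoteURL).resume()
    }

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask,
                    didReceive response: URLResponse,
                    completionHandler: @escaping (URLSession.ResponseDisposition) -> Void) {
        expectedLength = response.expectedContentLength
        completionHandler(.allow)
    }

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        fileHandle?.write(data)                          // append the partial movie to disk
        receivedLength += Int64(data.count)

        if player == nil, expectedLength > 0,
           Double(receivedLength) / Double(expectedLength) >= 0.25 {
            let moviePlayer: MPMoviePlayerController = MPMoviePlayerController(contentURL: destination)
            moviePlayer.movieSourceType = .file          // add moviePlayer.view to your view hierarchy
            moviePlayer.prepareToPlay()
            moviePlayer.play()
            player = moviePlayer
        }
    }
}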
UPDATE: Make sure to see the newer answer from #TomHamming.
I have yet to find a conclusive answer, but at this time I believe the answer is: you can't reliably do this. At least not without a lot of work which seems too much like a hack. I filed a feature request with Apple as it really seems like this should be possible with some adjustments to MPMoviePlayerController.
I will go over the variety of things I tried or considered, and the results I encountered.
1. Pass MPMoviePlayerController a URL to your movie file, which allows it to stream, and then pull the file out of the cache it was saved into, into your local Documents folder. Won't work, as of iOS 6. I filed a feature request with Apple, but as it stands now there's no way to get your hands on the file they are downloading, AFAIK.
2. Start downloading the movie file with NSURLConnection (or something like AFNetworking), and then when a "decent amount" has been downloaded to the device, pass the file URL to the MPMoviePlayerController and let it stream from disk. Sort of works, but not well. Three problems:
   - It's really hard to know when to start playing the file. I haven't figured out the algorithm Apple uses, and so I always erred on the side of caution, waiting for 25% to be downloaded before playing.
   - The MPMoviePlayerController interface provides no sense of the movie being streamed, as it does when Apple is doing the calculations via the network. It appears to the user that the file is totally downloaded when it really is not.
   - And most importantly, MPMoviePlayerController seems to not work well with playing a file that is not completely downloaded. I experienced playback problems once the file finished downloading, or if the player caught up with the amount downloaded, and never found a graceful way to handle these situations.
3. Same procedure as above, but use AVFoundation classes to more finely control the playback process, and avoid the issues described above regarding playback stopping, etc. Might work, but I want all the features of MPMoviePlayerController. Re-implementing MPMoviePlayerController myself just to get this one feature seems like a waste of time.
4. Same procedure as #1 above, but run a small web server in your app to handle streaming the video from the disk to MPMoviePlayerController, with the hope being that the streaming would work more like it normally does when streaming the file directly from an external web server. Works, but results were still sporadic and performance seemed to suffer. I did my test with CocoaHTTP. I decided against this approach because it just felt like a terrible hack.
5. Run a lightweight HTTP proxy, thus intercepting the downloaded movie file data as it gets streamed from the internet into your MPMoviePlayerController. Not sure if this works or not. I was not able to test this yet, as I have not found a lightweight HTTP proxy written in Objective-C, and at this point don't feel like implementing one just to try this experiment. It seems like the next easiest of all these hacks to implement -- if you don't have to write the proxy!
At this point I've decided to go the less-hacky, but also less user-friendly route of simply downloading the file completely, and then passing it to MPMoviePlayerController, until a better solution comes along.
You can do this as of iOS 10 with AVAssetDownloadTask. See this WWDC 2016 session and this documentation.
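A minimal sketch of that route; note it applies to HLS assets (the .m3u8 URL and session identifier below are placeholders):

import AVFoundation

// iOS 10+: download an HLS asset for offline playback. The same asset can be
// handed to an AVPlayer while the download is still in progress.
final class MovieDownloader: NSObject, AVAssetDownloadDelegate {
    private var session: AVAssetDownloadURLSession!

    func start() {
        let config = URLSessionConfiguration.background(withIdentifier: "movie-downloads")
        session = AVAssetDownloadURLSession(configuration: config,
                                            assetDownloadDelegate: self,
                                            delegateQueue: .main)

        let asset = AVURLAsset(url: URL(string: "https://example.com/movie/master.m3u8")!)
        let task = session.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "Movie",
                                                 assetArtworkData: nil,
                                                 options: nil)
        task?.resume()
    }

    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Persist `location` (a path inside the app's container) so the movie
        // can be opened straight from disk on the next launch.
    }
}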
Alternatively, if your movie isn't DRM'd, you can do it with AVAssetResourceLoaderDelegate, which effectively lets you give an AVPlayer an arbitrary stream of bytes. See this walkthrough.
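And a compact sketch of the resource-loader route: a custom URL scheme keeps AVFoundation from loading the URL itself, so every byte-range request is handed to your delegate, which can answer from a partially downloaded local file (the scheme, file path, and content length are placeholders, and a real implementation would check how much of the file actually exists before responding):

import AVFoundation

final class PartialFileLoader: NSObject, AVAssetResourceLoaderDelegate {
    private let localFileURL: URL

    init(localFileURL: URL) {
        self.localFileURL = localFileURL
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Describe the content once so the player knows its type and total length.
        if let info = loadingRequest.contentInformationRequest {
            info.contentType = AVFileType.mp4.rawValue
            info.isByteRangeAccessSupported = true
            info.contentLength = 0                       // set to the real total length of the movie
        }

        // Serve the requested byte range from the (possibly partial) local file.
        if let dataRequest = loadingRequest.dataRequest,
           let handle = try? FileHandle(forReadingFrom: localFileURL) {
            handle.seek(toFileOffset: UInt64(dataRequest.requestedOffset))
            let data = handle.readData(ofLength: dataRequest.requestedLength)
            dataRequest.respond(with: data)
            loadingRequest.finishLoading()
        }
        return true
    }
}

// Usage: a custom scheme forces AVFoundation to consult the delegate.
let loader = PartialFileLoader(localFileURL: URL(fileURLWithPath: "/path/to/partial.mp4"))
let asset = AVURLAsset(url: URL(string: "fakescheme://movie.mp4")!)
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "movie.loader"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))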
