Goal: collect all network traffic in my app.
To capture all traffic, I registered a custom URLProtocol subclass. It works fine with normal GET or POST requests returning JSON, but it breaks the audio-playback part of my app, which is based on AVPlayer and AVPlayerItem.
If I unregister the custom protocol, audio playback works again. Is there some connection between the two?
According to https://forums.developer.apple.com/thread/75328, AVPlayer does go through the URL loading system, but its requests are made in a helper process (mediaserverd) and thus don't 'see' your custom NSURLProtocol subclass.
Here is my implementation of the custom protocol: https://github.com/JimmyOu/JODevelop/blob/master/JODevelop/Tool/performance_Monitor/Network/NEHTTPMonitor.m
Thanks all.
There is no way to deal with it: you can't use a custom protocol to intercept AVPlayer traffic. My compromise is filtering out .mp4 and .mp3 URLs, sacrificing some of the captured traffic, but at least it no longer breaks playback.
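A minimal sketch of that filtering compromise, assuming a monitor protocol like the one in the linked repo (the class name and the extension list here are illustrative, not from the original code):

```swift
import Foundation

class FilteringHTTPMonitor: URLProtocol {

    override class func canInit(with request: URLRequest) -> Bool {
        guard let url = request.url,
              url.scheme == "http" || url.scheme == "https" else {
            return false
        }
        // Let media requests pass through untouched so AVPlayer keeps
        // working, at the cost of not recording their traffic.
        let mediaExtensions: Set<String> = ["mp3", "mp4", "m4a", "m3u8", "ts"]
        if mediaExtensions.contains(url.pathExtension.lowercased()) {
            return false
        }
        // Don't re-handle a request this protocol already intercepted.
        return URLProtocol.property(forKey: "Handled", in: request) == nil
    }

    override class func canonicalRequest(for request: URLRequest) -> URLRequest {
        return request
    }

    override func startLoading() {
        // Forward the request with URLSession and record metrics here.
    }

    override func stopLoading() { }
}
```

Registered with `URLProtocol.registerClass(FilteringHTTPMonitor.self)`, this captures ordinary JSON traffic while declining the media URLs that AVPlayer's helper process would mishandle.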
Related
The use case is that I want the user to be able to preview a song hosted at a remote URL. Fine, you say, just use AVPlayer. Yes, but I also want to cache the file locally if it is completely downloaded in the course of being previewed. As far as I can tell, there's no way to access the local data that AVPlayer downloads for its streaming playback. Likewise, there's no way to access the data being downloaded by URLSession until the download completes. So, I'd like to find a solution using one of these approaches:
Access the data in AVPlayer once it has completed downloading it and save to a file.
Download the data progressively to the cached URL and have AVPlayer play it as enough data becomes available.
Are either of these scenarios possible? Is there some other scenario which will achieve what I am looking to do?
So the solution to this came from the Alamofire source code. If you use a URLSessionDataTask in the traditional, non-Combine way, with a controller conforming to URLSessionDataDelegate, you can implement the urlSession(_:dataTask:didReceive:) delegate method to receive the data as it arrives rather than waiting for it at completion. This allows you to write the data directly to a file of your choosing, completely under your app's control.
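A sketch of that approach, assuming a simple wrapper class (the class name and destination handling here are illustrative):

```swift
import Foundation

final class ProgressiveDownloader: NSObject, URLSessionDataDelegate {
    private let destination: URL
    private var handle: FileHandle?

    init(destination: URL) {
        self.destination = destination
    }

    func start(url: URL) {
        FileManager.default.createFile(atPath: destination.path, contents: nil)
        handle = try? FileHandle(forWritingTo: destination)
        let session = URLSession(configuration: .default,
                                 delegate: self, delegateQueue: nil)
        session.dataTask(with: url).resume()
    }

    // Called repeatedly as chunks arrive, instead of once at completion.
    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask,
                    didReceive data: Data) {
        handle?.write(data)
    }

    func urlSession(_ session: URLSession, task: URLSessionTask,
                    didCompleteWithError error: Error?) {
        try? handle?.close()   // the file now holds the complete download
    }
}
```

Since the bytes land in a file you control, you can hand that file's URL to AVPlayer once enough data is available, or keep it as the cached copy after playback.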
"Example Playlist Files for use with HTTP Live Streaming: Basic Variant Playlist" contains the snippet below:
Note: A variant playlist is not re-read. Once the client has read the variant playlist, it assumes the set of variations isn't changing. As soon as the client sees the endlist tag on one of the individual variants, it ends the stream.
But I could not find any API that reports when the "endlist tag" occurs.
Does anyone have a suggestion for getting this event via a delegate or callback?
Thank you !
Web
Safari will just fire the ended event on the HTML5 video element when playback finishes.
If you are using a third-party player like hls.js or the Bitmovin Player, you may want to check its API; most of them provide cross-browser convenience functions for this.
Apps
For native apps on iOS, the AVPlayerItemDidPlayToEndTimeNotification notification might be what you want to look at.
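A minimal sketch of observing that notification in Swift, where it is exposed as `.AVPlayerItemDidPlayToEndTime` (the stream URL here is a placeholder):

```swift
import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)
let player = AVPlayer(playerItem: item)

let observer = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemDidPlayToEndTime,
    object: item,          // observe this specific item only
    queue: .main
) { _ in
    // For an HLS stream, this fires after the client reaches the
    // end of a playlist terminated by EXT-X-ENDLIST.
    print("Playback reached the end of the stream")
}

player.play()
```

Passing the specific AVPlayerItem as the `object` avoids being notified about unrelated items elsewhere in the app.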
I have a black box container. I love black boxes; they obfuscate things so well.
This black box is an encrypted zip (sort of) that contains some HTML files (this is the short, not-so-painful-to-explain version).
Those files need to be displayed in a UIWebView. Now, the easy way to do it: decrypt, unzip to the filesystem, load the file from the filesystem. That's good, except the black box contains secret stuff and can't just lie around on the filesystem, not even for a second. So, I made a C library that streams the contents directly out of the box.
Now I have this streaming capability and have to somehow make it work with UIWebView. The first thing that comes to mind is a mini local HTTP server to which the UIWebView can send its requests. I would then handle the requests myself and return the contents the UIWebView requires using the streaming library I've written. That would probably work, but I think a mini HTTP server might be a little bit of overkill.
So, I was wondering: is there another way to sit between UIWebView and the filesystem? Maybe using a custom scheme, like myschema://? Every time the UIWebView makes a request to myschema://myfile.html, I would somehow intercept it and return the data it needs.
Is such an idea viable? Where should I start? Maybe NSURLRequest?
EDIT: I found this: iPhone SDK: Loading resources from custom URL scheme. It sounds good; however, how will the browser know the size of the response, its type (XML/binary/XHTML), and all the other information HTTP puts in its headers?
Create a custom NSURLProtocol subclass and register it so that it handles the HTTP requests. This will allow you to handle the requests that come from the UIWebView however you see fit, including supplying the data from your library. You can examine an implementation that performs disk caching of requests to allow offline browsing by looking at RNCachingURLProtocol. I personally use a custom NSURLProtocol subclass that I wrote to inject some JavaScript code into pages loaded in the UIWebView, and it works very well.
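A sketch of that approach in Swift; `BlackBox.loadData(for:)` is a hypothetical stand-in for the asker's C streaming library, and the scheme and MIME table are illustrative. Note that the URLResponse carries exactly the metadata the EDIT asks about (size, type, encoding):

```swift
import Foundation

class BlackBoxURLProtocol: URLProtocol {

    override class func canInit(with request: URLRequest) -> Bool {
        return request.url?.scheme == "myschema"
    }

    override class func canonicalRequest(for request: URLRequest) -> URLRequest {
        return request
    }

    override func startLoading() {
        guard let url = request.url else { return }

        // Stand-in for the decryption/streaming library.
        let data = BlackBox.loadData(for: url.lastPathComponent)

        // The response object is how the web view learns the content
        // length, MIME type, and text encoding -- no HTTP headers needed.
        let mimeTypes = ["html": "text/html", "xml": "text/xml",
                         "png": "image/png", "js": "application/javascript"]
        let mime = mimeTypes[url.pathExtension.lowercased()]
            ?? "application/octet-stream"
        let response = URLResponse(url: url, mimeType: mime,
                                   expectedContentLength: data.count,
                                   textEncodingName: "utf-8")

        client?.urlProtocol(self, didReceive: response,
                            cacheStoragePolicy: .notAllowed)
        client?.urlProtocol(self, didLoad: data)
        client?.urlProtocolDidFinishLoading(self)
    }

    override func stopLoading() { }
}

// Hypothetical wrapper around the C library.
enum BlackBox {
    static func loadData(for name: String) -> Data {
        Data()  // decrypt + unzip in memory here
    }
}
```

Register it once at startup with `URLProtocol.registerClass(BlackBoxURLProtocol.self)`; for large files, `urlProtocol(_:didLoad:)` can be called repeatedly with successive chunks instead of one buffer.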
I have a video in the form of an NSInputStream object that I'm writing data to. I know MPMoviePlayerController can take a file location as an NSURL; what I'm wondering is whether it provides any functionality for reading bytes from an NSInputStream, or how this could be achieved.
I've heard mentions that NSURLProtocol can be used to set up a custom protocol for this sort of thing, but I've not seen any samples with readable code.
I fear this is not possible. The init method
- (id)initWithContentURL:(NSURL *)url
requires a file URL, so we cannot be sure that MPMoviePlayerController uses the URL loading system at all. Very likely it uses the File Manager API.
The preferred API would be (anyway):
- (id)initWithStream:(NSInputStream*)inputStream;
I would suggest filing an enhancement request with Apple.
I'm building an app that has mp3 files stored on Amazon S3, and want to allow users to listen to the audio files from their browsers.
The original plan was to use the HTML5 audio tag, but since that won't work in older browsers, an alternative is needed.
I've never worked with streaming audio before and don't know what is needed to get started. Do I need to use an outside player to do this? Can it be done in HTML? JavaScript? What is the best way to approach this?
Thanks!
You can use HTML5 as the main solution and fall back to JavaScript or Flash where it is not supported. Something like jPlayer might work well: http://jplayer.org/