I am developing an app that can upload or record a video. I set a duration limit on the video, but when I upload a video from the library that exceeds the limit, a pop-up appears saying "Video Too Long to Send", yet I can still choose the video.
My issue is: can I disable the Choose button, or do something else to stop the video from being uploaded?
Before uploading a video you should check its file size. I have done this in my Swift application with the following code (here videoURL stands for the URL of the selected video):
if let data = NSData(contentsOf: videoURL), // videoURL: the URL of the selected video
   !isFileSizeUpTo10Mebibytes(data.length) {
    // File is more than 10 MiB: reject it before uploading.
}

func isFileSizeUpTo10Mebibytes(_ fileSize: Int) -> Bool {
    return fileSize <= 10_485_760 // 10 MiB = 10 * 1024 * 1024 bytes
}
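Note that this reads the entire file into memory just to measure it. A lighter-weight sketch (assuming, again, that videoURL is a file URL) reads the size attribute from the file system instead:
func fileSize(at url: URL) -> Int {
    let values = try? url.resourceValues(forKeys: [.fileSizeKey])
    return values?.fileSize ?? 0
}

if !isFileSizeUpTo10Mebibytes(fileSize(at: videoURL)) {
    // File is larger than 10 MiB; block the upload here.
}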
I am trying to retrieve the size of Apple Music items downloaded from the Apple Music library as an iPod-library URL (e.g. ipod-library://item/item.m4a?id=8536687568586472929), but without exporting them to the documents directory I am unable to get the size.
I tried the code below.
Code:
do {
    let resources = try url.resourceValues(forKeys: [.fileSizeKey])
    let fileSize = resources.fileSize ?? 0
    print("size \(fileSize)")
} catch {
    print("Error: \(error)")
}
Output:
Couldn't fetch size for file The file “item.m4a” couldn’t be opened because there is no such file.
Size: 0
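Since an ipod-library:// URL is not a plain file URL, resourceValues(forKeys:) cannot open it. One possible workaround (a rough sketch only; it yields an estimate, not the exact on-disk size) is to derive a size from the asset's duration and its tracks' estimated bitrates:
import AVFoundation

// Rough estimate: total bits per second across tracks, times duration.
// This is an approximation, not the exact file size.
func estimatedSize(of url: URL) -> Int {
    let asset = AVURLAsset(url: url)
    let seconds = CMTimeGetSeconds(asset.duration)
    let bitsPerSecond = asset.tracks.reduce(Float(0)) { $0 + $1.estimatedDataRate }
    return Int(Double(bitsPerSecond) / 8.0 * seconds) // bytes
}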
I am developing an iOS video trimmer with Swift 4. I am trying to render a horizontal list of video thumbnails spread out over various durations, both from local video files and remote URLs.
When I test it in the simulator, the thumbnails are generated in less than a second, which is OK. However, when I run this code on an actual device the thumbnail generation is really slow and sometimes crashes. I tried moving the actual image generation to a background thread and updating the UI on the main thread when it completes, but that doesn't seem to work very well and the app crashes after rendering the screen a few times. I am not sure if that is because I am navigating away from the screen while tasks are still trying to complete.
I would like to generate the thumbnails faster and without crashing. Here is the code I am using; I would really appreciate any assistance with this issue.
func renderThumbnails(view: UIView, videoURL: URL, duration: Float64) {
    for i in 0..<self.IMAGE_COUNT {
        DispatchQueue.global(qos: .userInitiated).async {
            // Compute the offset locally so the concurrent closures don't race on a shared var.
            let offset = Float64(i) * (duration / Float64(self.IMAGE_COUNT))
            let thumbnail = thumbnailFromVideo(videoUrl: videoURL,
                                               time: CMTimeMake(Int64(offset), 1))
            DispatchQueue.main.async {
                self.addImageToView(image: thumbnail, view: view, index: i)
            }
        }
    }
}
static func thumbnailFromVideo(videoUrl: URL, time: CMTime) -> UIImage {
    let asset = AVAsset(url: videoUrl)
    let imgGenerator = AVAssetImageGenerator(asset: asset)
    imgGenerator.appliesPreferredTrackTransform = true
    do {
        let cgImage = try imgGenerator.copyCGImage(at: time, actualTime: nil)
        return UIImage(cgImage: cgImage)
    } catch {
        // Fall through and return an empty placeholder image on failure.
    }
    return UIImage()
}
The first sentence of the documentation says not to do what you’re doing! And it even tells you what to do instead.
Generating a single image in isolation can require the decoding of a large number of video frames with complex interdependencies. If you require a series of images, you can achieve far greater efficiency using the asynchronous method, generateCGImagesAsynchronously(forTimes:completionHandler:), which employs decoding efficiencies similar to those used during playback.
(Italics mine.)
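Here is a minimal sketch of that batch approach, reusing IMAGE_COUNT and addImageToView(...) from the question and assuming a thumbnailGenerator property that keeps the generator alive while it works:
import AVFoundation
import UIKit

func renderThumbnails(view: UIView, videoURL: URL, duration: Float64) {
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    self.thumbnailGenerator = generator // keep a strong reference while generating

    // Request every time in a single batch so decoding work can be shared.
    let step = duration / Float64(IMAGE_COUNT)
    let times = (0..<IMAGE_COUNT).map {
        NSValue(time: CMTime(seconds: Float64($0) * step, preferredTimescale: 600))
    }

    generator.generateCGImagesAsynchronously(forTimes: times) { requestedTime, cgImage, _, result, _ in
        guard result == .succeeded, let cgImage = cgImage else { return }
        // Recover the slot index from the requested time.
        let index = Int((requestedTime.seconds / step).rounded())
        DispatchQueue.main.async {
            self.addImageToView(image: UIImage(cgImage: cgImage), view: view, index: index)
        }
    }
}
When you navigate away from the screen, call thumbnailGenerator?.cancelAllCGImageGeneration() so outstanding requests don't call back into a view that is gone, which may well be the source of the crashes.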
Since iOS 10, Apple has provided support for downloading HLS (m3u8) video for offline viewing.
My question is: can we only download HLS while it is being played, or can we download it when the user presses a download button and show progress?
Has anyone implemented this in Objective-C? My previous app is written in Objective-C, and now I want to add support for downloading HLS rather than MP4 (previously I downloaded MP4 for offline viewing).
Please share your thoughts, or any code if you have implemented this.
I used the Apple code guide to download HLS content with the following code:
var configuration: URLSessionConfiguration?
var downloadSession: AVAssetDownloadURLSession?
var downloadIdentifier = "\(Bundle.main.bundleIdentifier!).background"

func setupAssetDownload(videoUrl: String) {
    // Create new background session configuration.
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)

    // Create a new AVAssetDownloadURLSession with background configuration, delegate, and queue.
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!,
                                                assetDownloadDelegate: self,
                                                delegateQueue: OperationQueue.main)

    if let url = URL(string: videoUrl) {
        let asset = AVURLAsset(url: url)

        // Create new AVAssetDownloadTask for the desired asset.
        let downloadTask = downloadSession?.makeAssetDownloadTask(asset: asset,
                                                                  assetTitle: "Some Title",
                                                                  assetArtworkData: nil,
                                                                  options: nil)
        // Start task and begin download.
        downloadTask?.resume()
    }
} // end method

func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    // Do not move the asset from the download location.
    UserDefaults.standard.set(location.relativePath, forKey: "testVideoPath")
}
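If you also want to show progress while downloading (as the question asks), the same delegate receives loading callbacks; here is a sketch along the lines of Apple's sample code:
func urlSession(_ session: URLSession,
                assetDownloadTask: AVAssetDownloadTask,
                didLoad timeRange: CMTimeRange,
                totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                timeRangeExpectedToLoad: CMTimeRange) {
    // Sum the loaded time ranges to estimate overall progress (0.0 ... 1.0).
    var percentComplete = 0.0
    for value in loadedTimeRanges {
        let loadedTimeRange = value.timeRangeValue
        percentComplete += loadedTimeRange.duration.seconds / timeRangeExpectedToLoad.duration.seconds
    }
    // Update your progress UI here, e.g. progressView.progress = Float(percentComplete).
}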
If you don't understand what's going on, read up on it here:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Now you can use the stored HLS content to play the video in AVPlayer with the following code:
// Get the saved link from the user defaults.
let savedLink = UserDefaults.standard.string(forKey: "testVideoPath")
let baseUrl = URL(fileURLWithPath: NSHomeDirectory())    // the app's home directory
let assetUrl = baseUrl.appendingPathComponent(savedLink!) // append the saved link to the home path
Now use that path to play the video in AVPlayer:
let avAsset = AVAsset(url: assetUrl)
let playerItem = AVPlayerItem(asset: avAsset)
let player = AVPlayer(playerItem: playerItem) // video path coming from the code above
let playerViewController = AVPlayerViewController()
playerViewController.player = player
self.present(playerViewController, animated: true, completion: {
    player.play()
})
The only way you can do this is to set up an HTTP server to serve the files locally after you've downloaded them.
The live playlist uses a sliding window: you need to periodically reload it after the target duration elapses and download only the new segments as they appear in the list (older segments are removed over time).
Here are some related answers: Can IOS devices stream m3u8 segmented video from the local file system using html5 video and phonegap/cordova?
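As a rough illustration of that reload loop (a sketch only: playlistURL is an assumption, and targetDuration would come from the playlist's #EXT-X-TARGETDURATION tag):
import Foundation

// Periodically re-fetch the live playlist and diff its segment list.
func pollPlaylist(_ playlistURL: URL, targetDuration: TimeInterval) {
    URLSession.shared.dataTask(with: playlistURL) { data, _, _ in
        guard let data = data,
              let playlist = String(data: data, encoding: .utf8) else { return }
        // Compare `playlist` against the segments you already have and
        // download only the new ones here.
        _ = playlist
        // Reload again once the target duration has elapsed.
        DispatchQueue.global().asyncAfter(deadline: .now() + targetDuration) {
            pollPlaylist(playlistURL, targetDuration: targetDuration)
        }
    }.resume()
}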
You can easily download an HLS stream with AVAssetDownloadURLSession's makeAssetDownloadTask. Have a look at the AssetPersistenceManager in Apple's sample code: https://developer.apple.com/library/content/samplecode/HLSCatalog/Introduction/Intro.html
It should be fairly straightforward to use the Objective-C version of the API.
Yes, you can download a video stream served over HLS and watch it later.
There is a very straightforward sample app (HLSCatalog) from Apple on this, and the code is fairly simple. You can find it here: https://developer.apple.com/services-account/download?path=/Developer_Tools/FairPlay_Streaming_Server_SDK_v3.1/FairPlay_Streaming_Server_SDK_v3.1.zip
You can find more about offline HLS streaming here.
I'm using AVPlayer to play URLs I'm fetching from my backend. Initially I downloaded the items to my documents directory and used those URLs to play the files via AVAudioPlayer. I switched over to AVPlayer so I can stream the audio instead of downloading it. I can see that the URLs are being fetched successfully, but once I try to play them I get no audio. Below is an example of a URL I'm fetching:
/Users/ellie/Desktop/ellie/sound/uploads/ellie1/Track5.m4a
var player: AVPlayer!
var fetchedURL: NSURL?

func tableView(tableView: UITableView, didSelectRowAtIndexPath indexPath: NSIndexPath) {
    // I left out the fetching process.
    self.fetchedURL = NSURL(string: parseString!)
    print("fetchedURL is \(self.fetchedURL!)")
    self.playCell()
}

func playCell() {
    let audioSession: AVAudioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayback)
    } catch _ {
    }
    do {
        try audioSession.setActive(true)
    } catch _ {
    }
    print("fetchedURL is \(self.fetchedURL!)")
    player = AVPlayer(URL: self.fetchedURL!)
    player.play()
}
AVPlayer will play locally stored and remotely hosted video files, and will also play streaming links. To stream, you need to ensure that a proper streaming link is used; some examples can be found here: https://stackoverflow.com/questions/10104301/hls-streaming-video-url-need-for-testing.
Note that it is not a trivial task to convert a video file into a hosted streaming link. Services such as Vimeo provide the ability to upload and encode video files, but will give you a streaming link only under the 'pro' plan.
Other options include configuring AWS S3 buckets to host and encode your video files: http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/TutorialStreamingJWPlayer.html
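Note that the value being fetched in the question (/Users/ellie/...) is a filesystem path on the server, not an HTTP(S) URL, so AVPlayer has nothing it can reach over the network. A minimal sketch of what a working setup looks like (the URL is a placeholder for a real hosted media file or streaming link):
import AVFoundation

// Hypothetical URL; substitute a real hosted media or streaming link.
if let streamURL = URL(string: "https://example.com/audio/Track5.m4a") {
    let player = AVPlayer(url: streamURL)
    player.play()
}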
I need to make a video file from thousands of images generated by code.
I cannot put those images into an array because there are too many: a 30-second video would consist of 1800 image frames. And I don't get all the images at once.
To get each image, the app first triggers a JavaScript function in the web view asking whether a video frame (a UIImage) should be generated. If yes, my code makes one UIImage for a video frame at a time; the app then does other things, and at some point asks the web view again for permission to generate another image. It does this a thousand times or more.
If the delegate gets a message that says no, the last image was the final video frame, so a complete video file should be made at that point and saved to the Documents directory.
How should I do this? Objective-C solutions would be acceptable too. Thank you in advance.
// Ask the web view if another video frame should be made.
func askWebView() {
    webView?.evaluateJavaScript("Ask JS function", completionHandler: nil)
}
Delegate
// Delegate method.
func userContentController(userContentController: WKUserContentController, didReceiveScriptMessage message: WKScriptMessage) {
    let body: String = message.body as! String
    if body == "makeAFrame" {
        let videoFrame = self.makeImage()
        // Should assemble video frames here.
    } else {
        // No more video frames; write a complete video file to the Documents directory.
    }
}
Generating a UIImage for a video frame:
func makeImage() -> UIImage {
    // Make an image and return it.
}
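One way to handle this incremental flow (a sketch under my own assumptions: the VideoAssembler name, the outputURL and size parameters, and the 60 fps frame rate, which matches 1800 frames in 30 seconds, are all illustrative) is AVAssetWriter with a pixel-buffer adaptor. You append one frame from the "makeAFrame" branch and finish the file from the "no more frames" branch:
import AVFoundation
import UIKit

final class VideoAssembler {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var frameIndex: Int64 = 0
    private let frameRate: Int32 = 60 // 1800 frames / 30 s

    init(outputURL: URL, size: CGSize) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: Int(size.width),
            AVVideoHeightKey: Int(size.height)
        ])
        input.expectsMediaDataInRealTime = false
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
                kCVPixelBufferWidthKey as String: Int(size.width),
                kCVPixelBufferHeightKey as String: Int(size.height)
            ])
        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)
    }

    // Append one frame; call this from the "makeAFrame" branch.
    func append(_ image: UIImage) {
        guard input.isReadyForMoreMediaData,
              let pool = adaptor.pixelBufferPool,
              let cgImage = image.cgImage else { return } // frame dropped if the writer is busy
        var buffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &buffer)
        guard let pixelBuffer = buffer else { return }

        // Draw the UIImage into the pixel buffer.
        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                width: width, height: height,
                                bitsPerComponent: 8,
                                bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        context?.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])

        // Stamp the frame at frameIndex / frameRate seconds.
        adaptor.append(pixelBuffer, withPresentationTime: CMTime(value: frameIndex, timescale: frameRate))
        frameIndex += 1
    }

    // Finalize the file; call this from the "no more frames" branch.
    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
This way only one frame lives in memory at a time, which avoids holding thousands of UIImages in an array.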