I've been playing around with the new Xcode 8 beta. I'm able to load an image into the .image property, but I haven't succeeded in loading an audio file with the .mediaFileURL property.
Here's my code:
var message = MSMessage()
var template = MSMessageTemplateLayout()

override func viewDidLoad() {
    super.viewDidLoad()

    if let filePath2 = Bundle.main.path(forResource: "synth", ofType: "wav") {
        let fileUrl = URL(fileURLWithPath: filePath2)
        template.mediaFileURL = fileUrl
    }
    message.layout = template

    guard let conversation = activeConversation else {
        fatalError("Expected a conversation")
    }
    conversation.insert(message, localizedChangeDescription: nil) { error in
        if let error = error {
            print(error)
        }
    }
}
According to Bug Reporter, I should use the insertAttachment API to insert MP3, WAV, and M4A files:
conversation.insertAttachment(fileUrl, withAlternateFilename: "fileAudio") { error in
    if let error = error {
        print(error)
    }
}
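Putting the two pieces together, here is a minimal sketch (my own, assuming the same synth.wav resource in the main bundle); insertAttachment sends the audio file itself rather than attaching it to the template layout:

if let audioURL = Bundle.main.url(forResource: "synth", withExtension: "wav") {
    conversation.insertAttachment(audioURL, withAlternateFilename: "fileAudio") { error in
        if let error = error {
            // Insertion failed; the staged attachment is discarded
            print(error)
        }
    }
}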
I have already copied the file's absolute path and pasted it into the simulator's file browser, and the image opens fine there. But fileExists fails, and I don't know why. Can anyone help?
let defaultImage = "302C3FA1-E4E1-4CD8-B6DF-2FF4E4E24C11.jpeg"
loadImage(at: defaultImage)

func fileExists(at path: String) -> Bool {
    return FileManager.default.fileExists(atPath: path)
}

func loadImage(at path: String) -> UIImage? {
    let tempPath = URL(fileURLWithPath: NSTemporaryDirectory(), isDirectory: true)
    let imagePath = "\(tempPath)\(path.trimmingCharacters(in: .whitespacesAndNewlines))"
    guard fileExists(at: imagePath) else { return nil }
    guard let image = UIImage(contentsOfFile: imagePath) else { return nil }
    return image
}
You need to split the filename and the file extension. If you use the main bundle, you can follow this code:
let stringPath = Bundle.main.path(forResource: "your_filename", ofType: "txt")
let urlPath = Bundle.main.url(forResource: "your_filename", withExtension: "txt")
Or you can use my code:

func readConfigFromBundle(fileExtension: String) -> TCBConfigure? {
    let bundle = Bundle.main
    if let resPath = bundle.resourcePath {
        do {
            let dirContents = try FileManager.default.contentsOfDirectory(atPath: resPath)
            let filteredFiles = dirContents.filter { $0.contains(fileExtension) }
            for fileName in filteredFiles {
                let sourceURL = bundle.bundleURL.appendingPathComponent(fileName)
                if let fileData = try? Data(contentsOf: sourceURL) {
                    // implement your logic
                }
            }
        } catch {
            // implement when error
        }
    }
    return nil
}
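To make the "split the filename and the extension" point concrete, here is a small sketch of my own (not from the answer above) that takes a full filename such as "config.txt", splits it with the standard NSString path helpers, and resolves it against the main bundle:

func bundleURL(forFullFileName fullName: String) -> URL? {
    // "config.txt" -> resource name "config", extension "txt"
    let name = (fullName as NSString).deletingPathExtension
    let ext = (fullName as NSString).pathExtension
    return Bundle.main.url(forResource: name, withExtension: ext)
}

Bundle.main.url(forResource:withExtension:) returns nil if the resource isn't in the bundle, so the optional result doubles as the existence check.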
When I use the code below and pull my https video link from Firebase over Wi-Fi, everything is smooth: the video plays immediately with zero issues. When I use the same code over cellular, everything moves extremely slowly, and the video pauses and takes forever to load.
If it plays from a file, whether I'm on cellular or Wi-Fi shouldn't matter. What is the issue here?
DataModel:
class Video {
    var httpsStr: String?
    var videoURL: URL?

    convenience init(dict: [String: Any]) {
        self.init()
        if let httpsStr = dict["httpsStr"] as? String {
            self.httpsStr = httpsStr
            let url = URL(string: httpsStr)!
            let assetKeys = ["playable", "duration"]
            let asset = AVURLAsset(url: url)
            asset.loadValuesAsynchronously(forKeys: assetKeys, completionHandler: {
                DispatchQueue.main.async {
                    self.videoURL = asset.url
                    // save videoURL to FileManager to play video from disk
                }
            })
        }
    }
}
Firebase Pull:
ref.observeSingleEvent(of: .value) { (snapshot) in
    guard let dict = snapshot.value as? [String: Any] else { return }
    let video = Video(dict: dict)
    self.video = video
    DispatchQueue.main.asyncAfter(deadline: .now() + 2, execute: {
        self.playVideo()
    })
}
Play Video:
func playVideo() {
    // init AVPlayer ...
    guard let videoURL = self.video.videoURL else { return }
    let lastPathComponent = videoURL.lastPathComponent
    let file = FileManager...appendingPathComponent(lastPathComponent)
    if FileManager.default.fileExists(atPath: file.path) {
        let asset = AVAsset(url: file)
        play(asset)
    } else {
        let asset = AVAsset(url: videoURL)
        play(asset)
    }
}

func play(_ asset: AVAsset) {
    self.playerItem = AVPlayerItem(asset: asset)
    self.player?.automaticallyWaitsToMinimizeStalling = false // I also set this to true
    self.playerItem?.preferredForwardBufferDuration = TimeInterval(1)
    self.player?.replaceCurrentItem(with: playerItem!)
    // play video
}
I followed this answer, and now everything seems to work smoothly on cellular data. I needed to include the tracks property in the assetKeys.
You create an asset from a URL using AVURLAsset. Creating the asset,
however, does not necessarily mean that it’s ready for use. To be
used, an asset must have loaded its tracks.
class Video {
    var httpsStr: String?
    var videoURL: URL?

    convenience init(dict: [String: Any]) {
        self.init()
        if let httpsStr = dict["httpsStr"] as? String {
            self.httpsStr = httpsStr
            let url = URL(string: httpsStr)!
            let assetKeys = ["playable", "duration", "tracks"] // <----- "tracks" added here
            let asset = AVURLAsset(url: url)
            asset.loadValuesAsynchronously(forKeys: assetKeys, completionHandler: {
                var error: NSError? = nil
                let status = asset.statusOfValue(forKey: "tracks", error: &error)
                switch status {
                case .loaded:
                    // Successfully loaded, continue processing
                    DispatchQueue.main.async {
                        self.videoURL = asset.url
                        // save videoURL to FileManager to play video from disk
                    }
                case .failed:
                    // Examine NSError pointer to determine failure
                    print("Error", error?.localizedDescription as Any)
                default:
                    // Handle all other cases
                    print("default")
                }
            })
        }
    }
}
I know that SDWebImage loads the image on a background thread, so you're not blocking the UI/main thread while the download is going on. Furthermore, it also disk-caches all the images you've downloaded and will never re-download an image from the same URL.
So I wonder: is there something similar, or the same, for videos?
Something to note: I add videos as a sublayer.
let videoURL = URL(string: postArray[indexPath.item].media[0].videoURLString!) // need to do error handling here
print(videoURL as Any, "<-- video url in display")
let player = AVPlayer(url: videoURL! as URL)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = CGRect(x: -8, y: 0, width: 138, height: 217) // cell.frame
cell.imageOrVideoView.layer.addSublayer(playerLayer)
// Other code and play()
This was recommended in the past, but it seems like it does something different, or at the very least has too much extra functionality I don't need.
Update:
What I am testing:
DispatchQueue.global(qos: .default).async(execute: {
    var downloadedData: Data? = nil
    if let url = URL(string: videoURL) {
        do {
            downloadedData = try Data(contentsOf: url)
        } catch {
            print(error, "downloaded Data failed")
        }
    }
    if downloadedData != nil {
        // STORE IN FILESYSTEM
        var cachesDirectory = NSSearchPathForDirectoriesInDomains(.cachesDirectory, .userDomainMask, true)[0]
        var file = URL(fileURLWithPath: cachesDirectory).appendingPathComponent(videoURL).absoluteString
        do {
            try downloadedData?.write(to: URL(string: file)!)
        } catch {
            print(error, "error downloading data and writing it")
        }
        // STORE IN MEMORY
        if let downloadedData = downloadedData {
            memoryCache?.setObject(downloadedData as AnyObject, forKey: videoURL as AnyObject)
        }
    }
    // NOW YOU CAN CREATE AN AVASSET OR UIIMAGE FROM THE FILE OR DATA
})
I do not understand, however, whether I should do something right after the last line, after the }), or whether I need to add a UI update there.
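For the narrow "where does the UI update go" part, a minimal sketch of the usual pattern (my own illustration, not the solution shown next; self.updateUI() is a hypothetical method) is to hop back to the main queue inside the background block once the data has been stored:

DispatchQueue.global(qos: .default).async {
    // ... download and store the data as above ...
    DispatchQueue.main.async {
        // Back on the main thread: safe to create the AVAsset/UIImage and touch the UI here
        self.updateUI() // hypothetical
    }
}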
So I was able to solve the problem with the following:
Swift 4:
import Foundation
public enum Result<T> {
    case success(T)
    case failure(NSError)
}

class CacheManager {

    static let shared = CacheManager()

    private let fileManager = FileManager.default

    private lazy var mainDirectoryUrl: URL = {
        let documentsUrl = self.fileManager.urls(for: .cachesDirectory, in: .userDomainMask).first!
        return documentsUrl
    }()

    func getFileWith(stringUrl: String, completionHandler: @escaping (Result<URL>) -> Void) {
        let file = directoryFor(stringUrl: stringUrl)

        // return file path if already exists in cache directory
        guard !fileManager.fileExists(atPath: file.path) else {
            completionHandler(Result.success(file))
            return
        }

        DispatchQueue.global().async {
            if let videoData = NSData(contentsOf: URL(string: stringUrl)!) {
                videoData.write(to: file, atomically: true)
                DispatchQueue.main.async {
                    completionHandler(Result.success(file))
                }
            } else {
                DispatchQueue.main.async {
                    let error = NSError(domain: "SomeErrorDomain", code: -2001 /* some error code */, userInfo: ["description": "Can't download video"])
                    completionHandler(Result.failure(error))
                }
            }
        }
    }

    private func directoryFor(stringUrl: String) -> URL {
        let fileName = URL(string: stringUrl)!.lastPathComponent
        let file = self.mainDirectoryUrl.appendingPathComponent(fileName)
        return file
    }
}
Usage:
CacheManager.shared.getFileWith(stringUrl: videoURL) { result in
    switch result {
    case .success(let url):
        // do some magic with path to saved video
        break
    case .failure(let error):
        // handle error
        print(error, "failure in the Cache of video")
    }
}
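As a follow-up usage example, here is one way the success case might feed straight into playback. This is a sketch of my own, assuming an AVPlayerLayer named playerLayer has already been added as a sublayer, as in the question:

CacheManager.shared.getFileWith(stringUrl: videoURL) { result in
    switch result {
    case .success(let url):
        // url points at the cached file on disk, so playback no longer hits the network
        let player = AVPlayer(url: url)
        playerLayer.player = player
        player.play()
    case .failure(let error):
        print(error, "failure in the Cache of video")
    }
}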
I am trying to play a video using UIWebView, but it's not showing any video even though the video is downloading from the server. Does anyone know what I'm doing wrong here?
Here is my code:
override func viewDidLoad() {
    super.viewDidLoad()
    self.pdfView.delegate = self
    self.pdfView.mediaPlaybackRequiresUserAction = false

    if "" != video?.videoPath {
        self.loadFromUrl(path: (video?.videoPath)!)
        self.activityIND.isHidden = true
        self.activityIND.stopAnimating()
    } else {
        let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
        let strName = video?.id
        let filePath = "\(documentsPath)/" + strName! + ".wmv"
        let fileManager = FileManager.default
        self.activityIND.startAnimating()

        if fileManager.fileExists(atPath: filePath) {
            self.loadFromUrl(path: filePath)
            return
        }

        let reference = FIRStorage.storage().reference(forURL: (self.video?.videoURL)!)
        reference.data(withMaxSize: 50 * 1024 * 1024) { (data, error) -> Void in
            if (error != nil) {
                print("unable to download video file from Firebase Storage")
                self.activityIND.isHidden = false
                self.activityIND.startAnimating()
            } else {
                if ((try! data?.write(to: URL.init(fileURLWithPath: filePath, isDirectory: false))) != nil) {
                    self.loadFromUrl(path: filePath)
                    print("video file is downloaded from Firebase Storage")
                    self.db.upDate(id: (self.video?.id)!, videoPath: filePath)
                    self.activityIND.isHidden = true
                }
            }
        }
    }
}
func loadFromUrl(path: String) {
    let url = NSURL(string: path)
    pdfView.loadRequest(NSURLRequest(url: url! as URL) as URLRequest)
    activityIND.isHidden = true
    activityIND.startAnimating()
}
It turns out the WMV format is not supported; once I changed the format to MP4, everything worked.
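If you want to check up front whether a downloaded file is in a format the system can actually play, one option (a sketch of my own, not part of the original answer) is to ask AVFoundation before handing the path to the web view:

import AVFoundation

func checkPlayability(of fileURL: URL, completion: @escaping (Bool) -> Void) {
    let asset = AVURLAsset(url: fileURL)
    asset.loadValuesAsynchronously(forKeys: ["playable"]) {
        var error: NSError?
        let status = asset.statusOfValue(forKey: "playable", error: &error)
        DispatchQueue.main.async {
            // WMV will typically report as not playable, while MP4/M4V/MOV should load fine
            completion(status == .loaded && asset.isPlayable)
        }
    }
}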
I'm doing a test program dealing with audio and a table view. Right now I have the controller working and playing the chosen song correctly, but I'm trying to display the metadata from the MP3 file and it's not doing anything. Currently I have it like this:
Function to print the data:
func testAudioStuff(testURL testURL: NSURL) {
    let audioInfo = AVPlayerItem(URL: testURL)
    let metaDataList = audioInfo.asset.metadata as [AVMetadataItem]
    // here it should print something; it does not
    for item in metaDataList {
        if item.commonKey == nil {
            continue
        }
        if let key = item.commonKey, let value = item.value {
            print(key)
            print(value as! String)
            if key == "title" {
                print(key)
                print("here is the title " + (value as! String))
            }
            if key == "artist" {
                print(key)
                print("here is the artist " + (value as! String))
            }
            if key == "artwork" {
                print("here is the artwork " + (value as! String))
            }
        }
    }
}
Function to play the music:
func playSelectedMusic(song: Int, section: Int) {
    if section == 1 {
        //print(NSBundle.mainBundle().description + "\(song) \(section)")
        if let starMusic = productsAndMusic["Music"] {
            //print("We are looking at \(starMusic[song])")
            let play = starMusic[song].componentsSeparatedByString(".")[0]
            //print(play)
            if let fileToPlay = NSBundle.mainBundle().pathForResource(play, ofType: "mp3") {
                //print(fileToPlay)
                testAudioStuff(testURL: NSURL(string: fileToPlay)!)
                do {
                    player = try AVAudioPlayer(contentsOfURL: NSURL(string: fileToPlay)!)
                } catch Errors.GeneralError {
                } catch let error as NSError {
                    print("I don't know, this does not come with instructions \(error)")
                } catch let something as ErrorType {
                    print("something else \(something)")
                }
                player.prepareToPlay()
                player.play()
            }
            //player = AVAudioPlayer(contentsOfURL: NSURL(string: NSBundle.mainBundle().pathForResource(play, ofType: "mp3")!))
            //player.prepareToPlay()
            //print(NSBundle.mainBundle().pathForResource(play, ofType: "mp3"))
        }
        //print(NSBundle.mainBundle().pathForResource((productsAndMusic["Music"]![song] as String).componentsSeparatedByString(".")[0], ofType: "mp3")!)
        //print((productsAndMusic["Music"]![song] as String).componentsSeparatedByString(".")[0])
    } else {
    }
}
As I mentioned, it plays the selected song in the simulator, but it does not print the metadata. Why? Any help? This is in Xcode 7 beta 2.
I have tried something, but no dice:
var secondTestTitle: AVAsset = AVAsset(URL: testURL)
print(secondTestTitle.description)
var metaStuff: [AVMetadataItem] = secondTestTitle.commonMetadata as [AVMetadataItem]
print(metaStuff.count)
Result:
<AVURLAsset: 0x78ea3930, URL = /Users/pedro/Library/Developer/CoreSimulator/Devices/304FE5A7-9506-4A9B-B685-5CDBE9AFB4C4/data/Containers/Bundle/Applica ... ock1.mp3>
0
If this is an MP3 file in your app bundle, then let's suppose it is called test.mp3. Then you can say:
let url = NSBundle.mainBundle().URLForResource("test", withExtension:"mp3")
let asset = AVAsset(URL: url!)
let meta = asset.metadata
print(meta)
If you don't see any metadata in the console when you say that, then this file has no metadata.
However, that's not what I would do, because it's not a very realistic test — and it isn't likely to be very useful, because most information is not written into the file as metadata. More likely, you're going to want to know about a song in the user's Music library. In that case you would use the Media Player framework, as I describe in my book.
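If you do go the Media Player route, here is a minimal sketch of reading song info from the user's Music library (in current Swift syntax, and my own example rather than the answerer's); it assumes the NSAppleMusicUsageDescription key is set in Info.plist and that the user has granted media library access:

import MediaPlayer

func listSongsFromMusicLibrary() {
    // Query every song in the local Music library
    let query = MPMediaQuery.songs()
    for item in query.items ?? [] {
        // MPMediaItem exposes title, artist, artwork, etc. directly; no AVAsset metadata parsing needed
        print(item.title ?? "Unknown title", "-", item.artist ?? "Unknown artist")
    }
}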