How to get or edit EXIF metadata for a video in iOS

I want to edit the EXIF metadata for a video.
I shoot video with a custom camera, but the recorded video lacks EXIF information. I searched the Internet for a long time and did not find any way to do this.
PHAsset is not an option, because I don't need to save the video to the Photos album.
Can anyone help me? Thank you!

You can use this guide to get metadata.
To work with metadata you will use AVMetadataItem objects.
To load metadata, use the following example:
let url = Bundle.main.url(forResource: "audio", withExtension: "m4a")!
let asset = AVAsset(url: url)
let formatsKey = "availableMetadataFormats"
asset.loadValuesAsynchronously(forKeys: [formatsKey]) {
    var error: NSError? = nil
    let status = asset.statusOfValue(forKey: formatsKey, error: &error)
    if status == .loaded {
        for format in asset.availableMetadataFormats {
            let metadata = asset.metadata(forFormat: format)
            // Process the format-specific metadata collection.
        }
    }
}
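The question also asks about editing metadata, which the snippet above does not cover. As a rough, untested sketch of that direction (sourceURL, outputURL, and the specific metadata identifiers here are placeholders of my own, not from the original question), you can attach AVMutableMetadataItem values when re-exporting the asset with AVAssetExportSession. Note that video containers store QuickTime/ISO metadata rather than literal EXIF tags:

import AVFoundation

// Sketch only: re-export the video and attach custom metadata items.
// sourceURL and outputURL are placeholders you must supply yourself.
let sourceAsset = AVAsset(url: sourceURL)

let creationDate = AVMutableMetadataItem()
creationDate.identifier = .commonIdentifierCreationDate
creationDate.value = Date() as NSDate

let location = AVMutableMetadataItem()
location.identifier = .commonIdentifierLocation
location.value = "+37.3349-122.0090/" as NSString   // ISO 6709 location string

if let export = AVAssetExportSession(asset: sourceAsset,
                                     presetName: AVAssetExportPresetPassthrough) {
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.metadata = [creationDate, location]
    export.exportAsynchronously {
        if export.status == .completed {
            print("Exported with metadata")
        } else {
            print("Export failed: \(String(describing: export.error))")
        }
    }
}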

Related

iOS development: navigate video chapters programmatically

I want to programmatically navigate the chapters of an mp4 video.
The chapters work in QuickTime, so I assume the video format isn't the issue.
The code from this page should return an array of the chapters, but it only returns an empty one instead:
https://developer.apple.com/documentation/avfoundation/media_playback/presenting_chapter_markers
let asset = AVAsset(url: <# Asset URL #>)
let chapterLocalesKey = "availableChapterLocales"
asset.loadValuesAsynchronously(forKeys: [chapterLocalesKey]) {
    var error: NSError?
    let status = asset.statusOfValue(forKey: chapterLocalesKey, error: &error)
    if status == .loaded {
        let languages = Locale.preferredLanguages
        let chapterMetadata = asset.chapterMetadataGroups(bestMatchingPreferredLanguages: languages)
        // Process chapter metadata.
    } else {
        // Handle other status cases.
    }
}
Does anyone have an idea how to do this?
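For reference (not a fix for the empty array), once chapterMetadataGroups(bestMatchingPreferredLanguages:) does return groups, navigating to a chapter is normally just a seek on the player. A minimal illustrative sketch, assuming you already have an AVPlayer named player and a non-empty groups array:

import AVFoundation

// Illustrative only: jump playback to the start of a chosen chapter.
func seek(toChapter index: Int, in groups: [AVTimedMetadataGroup], player: AVPlayer) {
    guard groups.indices.contains(index) else { return }
    player.seek(to: groups[index].timeRange.start)
}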

How to upload a video from iOS photo album to Azure Blob Storage

I am struggling with uploading videos from the iOS photo album to Azure Blob Storage. I am using AZSClient.
Uploading images is straightforward, i.e. I get the image Data from the PHAsset and then upload it to Azure Storage using the AZSCloudBlockBlob.uploadFromData method.
Can anyone guide me on how to upload a video to an Azure blob, preferably in Swift?
There was a similar thread for this; they used the code below, along with the iOS library found here:
// Upload to Azure Blob Storage with the help of a SAS token
func uploadBlobSAS(container: String, sas: String, blockname: String, fromfile: String) {
    // If using a SAS token, fill it in here. If using Shared Key access, comment out the following line.
    var containerURL = "https://yourblobstorage.blob.core.windows.net/\(container)\(sas)" // here we append the SAS string
    print("containerURL with SAS: \(containerURL)")
    var container: AZSCloudBlobContainer
    var error: NSError?
    container = AZSCloudBlobContainer(url: NSURL(string: containerURL)! as URL, error: &error)
    if error != nil {
        print("Error in creating blob container object. Error code = %ld, error domain = %@, error userinfo = %@", error!.code, error!.domain, error!.userInfo)
    } else {
        let blob = container.blockBlobReference(fromName: blockname)
        blob.uploadFromFile(withPath: fromfile, completionHandler: { (NSError) -> Void in
            NSLog("Ok, uploaded!")
        })
    }
}
I found the answer in this thread
let manager = PHImageManager.default()
manager.requestAVAsset(forVideo: asset, options: nil, resultHandler: { (avasset, audio, info) in
    if let avassetURL = avasset as? AVURLAsset {
        guard let video = try? Data(contentsOf: avassetURL.url) else {
            return
        }
        videoData = video
    }
})
Once you get the Data object, you can use AZSCloudBlockBlob.uploadFromData to upload it to Azure Storage.
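For completeness, a rough sketch of that last step, reusing the container object from the uploadBlobSAS function above. The exact Swift spelling of AZSClient's uploadFromData:completionHandler: selector may differ depending on the library version, so treat this as an assumption rather than verified API:

// Sketch only: upload the video Data obtained above to a block blob.
// `container` is an AZSCloudBlobContainer created as in uploadBlobSAS,
// and "myVideo.mov" is a placeholder blob name.
let blob = container.blockBlobReference(fromName: "myVideo.mov")
blob.uploadFromData(videoData, completionHandler: { error in
    if let error = error {
        NSLog("Upload failed: %@", error.localizedDescription)
    } else {
        NSLog("Ok, uploaded!")
    }
})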

Swift - How can I convert Saved Audio file conversations to Text?

I am working on speech recognition. I have solved text-to-speech and speech-to-text with the iOS frameworks. But now I want to convert saved audio file conversations to text. How can I solve this? Thank you for all replies.
I have worked on the same thing, and the following works for me.
I have an audio file in my project bundle, so I wrote the following code to convert the audio to text.
let audioURL = Bundle.main.url(forResource: "Song", withExtension: "mov")
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
let request = SFSpeechURLRecognitionRequest(url: audioURL!)
request.shouldReportPartialResults = true
if (recognizer?.isAvailable)! {
    recognizer?.recognitionTask(with: request) { result, error in
        guard error == nil else { print("Error: \(error!)"); return }
        guard let result = result else { print("No result!"); return }
        print(result.bestTranscription.formattedString)
    }
} else {
    print("Device doesn't support speech recognition")
}
First, get the audio URL from wherever you have stored the audio file.
Then create an instance of SFSpeechRecognizer with the locale you want.
Create an instance of SFSpeechURLRecognitionRequest, which is used to request the recognitionTask.
recognitionTask will give you a result and an error. The result contains bestTranscription.formattedString; formattedString is the text result of your audio file.
If you set request.shouldReportPartialResults = true, it will give you a partial result for every line spoken in the audio.
I hope this will help you.
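One detail the answer above skips: SFSpeechRecognizer requires user authorization (and an NSSpeechRecognitionUsageDescription entry in Info.plist) before recognitionTask will return results. A short sketch of that one-time setup:

import Speech

// Ask for speech recognition permission before creating any recognition requests.
SFSpeechRecognizer.requestAuthorization { status in
    switch status {
    case .authorized:
        print("Speech recognition authorized")
    default:
        print("Speech recognition not authorized: \(status)")
    }
}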

Retrieving a video from Documents Directory, but app cannot find location

I have implemented Core Data to save string-formatted URLs in my application. These URLs are the URLs of videos the users have recorded.
I used Core Data because I want the videos to still be available to them after they exit the app. I am able to save and retrieve the URLs. However, when I use them to get the video thumbnails it does not work.
Here is where I declare the video file location:
let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString
let cropUniqueId = NSUUID().uuidString
let outputPath = "\(documentsPath)/\(cropUniqueId).mov"
Then I convert it to a string and save the data to Core Data:
arrayOfStringPaths.append(outputPath)
stringOfArrayPaths = stringOfArrayPaths + arrayOfStringPaths.joined(separator: ",")
saveData(arrayPath: stringOfArrayPaths)

func saveData(arrayPath: String) {
    let savedVideo = VideoPath(context: context)
    savedVideo.fileLocations = arrayPath
    appDelegate.saveContext()
    print("Saved")
}
Everything so far works fine. It saves the URLs just as they are; I checked them with various print statements.
Now I retrieve the information when the user opens the app.
var data = [VideoPath]()

func fetchSavedData() {
    do {
        data = try context.fetch(VideoPath.fetchRequest())
        for each in data {
            // I append each url to the array.
            videosArray.append(URL(fileURLWithPath: each.fileLocations!))
            // They all print out correctly
            print(each.fileLocations!)
        }
        for video in videosArray {
            print("This is in video array")
            // This prints out correctly as the URL I recorded earlier
            print(video)
            // This is where everything messes up
            let thumbnail = getThumbnail(video)
            thumbnails.append(thumbnail)
        }
    } catch {
        print("There was an error")
    }
}
When I try to get the thumbnail of the video, it gives me this error:
"The requested URL was not found on this server." UserInfo={NSLocalizedDescription=The requested URL was not found on this server., NSUnderlyingError=0x17064c330 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}}: file /Library/Caches/com.apple.xbs/Sources/swiftlang/swiftlang-802.0.53/src/swift/stdlib/public/core/ErrorType.swift

AVAssetExportSession fails to convert .mov from photo library. Why?

Scenario:
I wish to reduce the size of individual videos from my iTouch photo library.
1. Collect videoAssets from library.
2. Get a thumbnail of the PHAsset - works.
3. Get the actual video from the library.
4. Request the AVAssetForVideo from the library.
5. Convert the video via ExportSessions... loading assorted parameters.
6. Attempt to run the export into a tmp directory for use.
* FAILS *
Here's the debug output:
Here's the error message:
func getVideoFromPhotoLibrary() {
    let videoAssets = PHAsset.fetchAssetsWithMediaType(.Video, options: nil)
    videoAssets.enumerateObjectsUsingBlock {
        (obj: AnyObject!, index: Int, stop: UnsafeMutablePointer<ObjCBool>) in
        let mySize = CGSizeMake(120, 120)
        let myAsset = obj as! PHAsset
        let imageManager = PHImageManager.defaultManager()
        var myVideo: BlissMedium?
        // Request the poster frame or the image of the video
        imageManager.requestImageForAsset(myAsset, targetSize: mySize, contentMode: .AspectFit, options: nil) {
            (imageResult, info) in
            let thumbnail = UIImage(named: "videoRed")
            myVideo = BlissMedium(blissImage: imageResult, creationDate: myAsset.creationDate)
            myVideo!.mediumType = .video
        }
        // Actual video:
        imageManager.requestAVAssetForVideo(myAsset, options: nil, resultHandler: { result, audio, info in
            let asset = result as! AVURLAsset
            let mediaURL = asset.URL
            let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality)
            let filename = "composition.mp4"
            session.outputURL = NSURL(string: NSTemporaryDirectory());
            session.outputFileType = AVFileTypeQuickTimeMovie;
            session.exportAsynchronouslyWithCompletionHandler({ () -> Void in
                dispatch_async(dispatch_get_main_queue(), {
                    if session.status == AVAssetExportSessionStatus.Completed {
                        println("Success")
                    } else {
                        println(session.error?.localizedDescription)
                        // The requested URL was not found on this server.
                    }
                })
            })
        })
        if nil != myVideo {
            self.gBlissVideoMedia.append(myVideo!)
        }
    }
}
I checked to be sure the target path/file exists; then I added the 'AVFileTypeMPEG4' output type to match the intended .mp4:
let targetDir = createTempDirectory("bliss/composition.mp4") as String?
if NSFileManager.defaultManager().fileExistsAtPath(targetDir!) {
    println("*** file exists! ***")
} else {
    return
}
session.outputURL = NSURL(string: targetDir!);
session.outputFileType = AVFileTypeMPEG4
I'm still having problems:
* file exists! *
Optional("The operation could not be completed")
What am I doing wrong; what's missing?
Update:
I'm able to successfully run the export to my NSHomeDirectory() vs NSTemporaryDirectory() in Objective-C.
However... the same code written in Swift fails.
I notice a change in absolute path to the target output in Swift, not found in Objective-C:
Perhaps it's a Swift 1.2 bug???
I am not sure if you can save in the root of the temp directory; I normally use this function to create a new temp directory that I can use:
func createTempDirectory(myDir: String) -> String? {
    let tempDirectoryTemplate = NSTemporaryDirectory().stringByAppendingPathComponent(myDir)
    let fileManager = NSFileManager.defaultManager()
    var err: NSErrorPointer = nil
    if fileManager.createDirectoryAtPath(tempDirectoryTemplate, withIntermediateDirectories: true, attributes: nil, error: err) {
        return tempDirectoryTemplate
    } else {
        return nil
    }
}
Try to make your conversion in the directory returned by this function.
I hope that helps you!
I didn't quite understand what that last part of the code does, where you find out whether a file exists or not. Which file are you locating?
Since I didn't understand that, this might be irrelevant, but in your topmost code I notice that you set the filename to composition.mp4, but let the outputURL be NSURL(string: NSTemporaryDirectory()). With my lack of Swiftness I might be missing something, but it seems to me as if you're not using the filename at all, and are trying to write the file as a folder. I believe setting a proper URL might fix the problem, but I'm not sure. An Objective-C example of this could be:
NSURL *outputURL = [[NSURL alloc]
    initFileURLWithPath:[NSString pathWithComponents:
        @[NSTemporaryDirectory(), @"composition.mp4"]]];
The outputURL is supposed to point to the actual file, not the folder it lies in. I think..
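In Swift (matching the 1.2-era syntax used in the question), the same idea might look roughly like this; an untested sketch, reusing the existing session variable:

// Sketch: point outputURL at the file itself, not the directory it lives in.
let outputPath = NSTemporaryDirectory().stringByAppendingPathComponent("composition.mp4")
session.outputURL = NSURL(fileURLWithPath: outputPath)
session.outputFileType = AVFileTypeMPEG4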
Anyway, if that doesn't work I do have a few other thoughts as well.
Have you tried it on an actual device? There may be a problem with the simulator.
Also, sadly, I have gotten the error -12780 countless times with different root-problems, so that doesn't help very much.
And, I see you check if session.status == AVAssetExportSessionStatus.Completed, have you checked what the actual status is? Is it .Failed, or perhaps .Unknown? There are several statuses.
This might be a long shot, but in one of my apps I am using the camera to capture video/audio, then encode/convert it using AVAssetExportSession. There were strange errors when starting to record, as well as after recording (exporting). I found out that I could change the AVAudioSession, which apparently has something to do with how the device handles media.
I have no idea how to Swift, but here's my code (in viewDidAppear of the relevant view)
NSError *error;
AVAudioSession *aSession = [AVAudioSession sharedInstance];
[aSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[aSession setMode:AVAudioSessionModeVideoRecording error:&error];
[aSession setActive:YES error:&error];
The category PlayAndRecord allowed me to start the camera much faster, as well as getting rid of the occasional hanging AVAssetExportSessionStatus.Unknown and the occasional crash .Failed (which also threw the -12780-error).
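In Swift, the equivalent session setup would be roughly the following (untested sketch, using the Swift 2/3-style throwing API rather than the NSError out-parameters above):

// Sketch: configure the shared AVAudioSession before capturing/exporting.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.setMode(AVAudioSessionModeVideoRecording)
    try session.setActive(true)
} catch {
    print("Could not configure AVAudioSession: \(error)")
}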
