iOS development: navigate video chapters programmatically

I want to programmatically navigate the chapters of an MP4 video.
The chapters work in QuickTime, so I assume the video format isn't the issue.
The code from this page should return an array of the chapters, but it only returns an empty one:
https://developer.apple.com/documentation/avfoundation/media_playback/presenting_chapter_markers
import AVFoundation

let asset = AVAsset(url: <# Asset URL #>)
let chapterLocalesKey = "availableChapterLocales"
asset.loadValuesAsynchronously(forKeys: [chapterLocalesKey]) {
    var error: NSError?
    let status = asset.statusOfValue(forKey: chapterLocalesKey, error: &error)
    if status == .loaded {
        let languages = Locale.preferredLanguages
        let chapterMetadata = asset.chapterMetadataGroups(bestMatchingPreferredLanguages: languages)
        // Process chapter metadata.
    } else {
        // Handle other status cases.
    }
}
Does anyone have an idea how to do this?
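For what it's worth, once chapterMetadataGroups(bestMatchingPreferredLanguages:) returns a non-empty array, navigating to a chapter amounts to seeking the player to that group's time range. A minimal sketch (assuming an AVPlayer named player; this does not address the empty-array problem itself):

import AVFoundation

func seek(toChapter index: Int, in chapters: [AVTimedMetadataGroup], with player: AVPlayer) {
    guard chapters.indices.contains(index) else { return }
    // Each AVTimedMetadataGroup carries the chapter's CMTimeRange.
    player.seek(to: chapters[index].timeRange.start)
}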

Related

How to get or edit exif for video in iOS

I want to edit the EXIF data for a video.
I shoot video with a custom camera, but the video lacks EXIF information. I searched the Internet for a long time and found no way to do it.
PHAsset doesn't work for me, because I don't need to save the video to the Photos album.
Can anyone help? Thank you!
You can use this guide to get metadata.
To work with metadata you will use AVMetadataItem objects.
To load metadata, use the following example:
import AVFoundation

let url = Bundle.main.url(forResource: "audio", withExtension: "m4a")!
let asset = AVAsset(url: url)
let formatsKey = "availableMetadataFormats"
asset.loadValuesAsynchronously(forKeys: [formatsKey]) {
    var error: NSError? = nil
    let status = asset.statusOfValue(forKey: formatsKey, error: &error)
    if status == .loaded {
        for format in asset.availableMetadataFormats {
            let metadata = asset.metadata(forFormat: format)
            // Process the format-specific metadata collection.
        }
    }
}
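The question also asks about editing metadata. One approach (my sketch, not from the original answer) is to export a copy of the video with AVAssetExportSession, whose metadata property accepts an array of AVMetadataItem. The identifier, value, and output path below are illustrative:

import AVFoundation

func writeCreationDate(to asset: AVAsset, outputURL: URL, completion: @escaping (Error?) -> Void) {
    // Build a metadata item; the identifier and value are just examples.
    let item = AVMutableMetadataItem()
    item.identifier = .commonIdentifierCreationDate
    item.value = NSDate()

    guard let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetPassthrough) else {
        completion(nil)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.metadata = [item] // attach the items to the exported copy
    session.exportAsynchronously {
        completion(session.error)
    }
}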

Swift - How can I convert Saved Audio file conversations to Text?

I am working on speech recognition. I have solved text-to-speech and speech-to-text with iOS frameworks, but now I want to convert saved audio-file conversations to text. How can I solve this? Thank you for all replies.
I have worked on the same thing, and the following works for me.
I have an audio file in my project bundle, and I wrote the following code to convert the audio to text.
import Speech

let audioURL = Bundle.main.url(forResource: "Song", withExtension: "mov")
let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
let request = SFSpeechURLRecognitionRequest(url: audioURL!)
request.shouldReportPartialResults = true
if (recognizer?.isAvailable)! {
    recognizer?.recognitionTask(with: request) { result, error in
        guard error == nil else { print("Error: \(error!)"); return }
        guard let result = result else { print("No result!"); return }
        print(result.bestTranscription.formattedString)
    }
} else {
    print("Device doesn't support speech recognition")
}
First, get the URL of the audio file from wherever you have stored it.
Then create an instance of SFSpeechRecognizer with the locale you want.
Create an instance of SFSpeechURLRecognitionRequest, which is used to request the recognitionTask.
The recognitionTask gives you a result and an error. The result contains bestTranscription.formattedString, and formattedString is the text result of your audio file.
If you set request.shouldReportPartialResults = true, it will give you a partial result for every line spoken in the audio.
I hope this helps you.
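One caveat worth adding (my note, not part of the original answer): speech recognition requires user authorization, plus an NSSpeechRecognitionUsageDescription entry in Info.plist. A minimal check before starting the task:

import Speech

SFSpeechRecognizer.requestAuthorization { status in
    switch status {
    case .authorized:
        break // safe to create the recognizer and start the recognition task
    default:
        print("Speech recognition not authorized: \(status.rawValue)")
    }
}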

Choosing a picture causes resave to camera roll

I have a program in which the user chooses a photo to put on the screen and the code puts it into a custom album automatically. But whenever they choose a picture, it resaves it to the camera roll, creating duplicates. How do I make it stop doing this?
func fetchAssetCollectionForAlbum() -> PHAssetCollection? {
    let fetchOptions = PHFetchOptions()
    fetchOptions.predicate = NSPredicate(format: "title = %@", albumName)
    // Fetch the asset collection for the album.
    let collection = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .any, options: fetchOptions)
    return collection.firstObject
}
func save(image: UIImage) {
    if assetCollection == nil {
        return
    }
    PHPhotoLibrary.shared().performChanges({
        let assetChangeRequest = PHAssetChangeRequest.creationRequestForAsset(from: image)
        let assetPlaceHolder = assetChangeRequest.placeholderForCreatedAsset
        let albumChangeRequest = PHAssetCollectionChangeRequest(for: self.assetCollection)
        let enumeration: NSArray = [assetPlaceHolder!]
        albumChangeRequest!.addAssets(enumeration)
    }, completionHandler: nil)
}
I posted a similar question:
Swift 3 or 4 Saving to custom album creates duplicate images
But I got nothing but crickets as well. Luckily, I think I found the answer, so I'll answer my own question here too.
The code you have (which was the same code I had) is for CREATING A NEW ASSET. It is useful only for saving the image to your custom album after the user has taken a picture with the camera; it is for brand-new assets.
However, for existing assets, you do not want to create a new asset. Instead, you want to add the existing asset to the custom album. To do this, you need a different method. Here is the code I created, and it seems to be working. Keep in mind that you have to get the asset ID FIRST, so that you can send it to your method and access the existing asset.
So, in your imagePickerController, you have to determine whether the user chose an existing image or whether the method is being called from a new camera action.
let pickerSource = picker.sourceType
switch pickerSource {
case .savedPhotosAlbum, .photoLibrary:
    if let refURL = info[UIImagePickerControllerReferenceURL] as? NSURL {
        let refURLString = refURL.absoluteString
        /* The value of refURLString looks something like
           assets-library://asset/asset.JPG?id=82A6E75C-EA55-4C3A-A988-4BF8C7F3F8F5&ext=JPG */
        let refID = {function here to extract the id query param from the url string}
        /* The above gets you the asset ID. You can also fetch the asset directly,
           but that is only available in iOS 11+. */
        MYPHOTOHELPERCLASS.transferImage(toAlbum: "myalbumname", withID: refID!, ...)
    }
case .camera:
    ...
}
Now, in your photo-helper class (or in any function anywhere, whatever), to EDIT the asset instead of creating a new one, this is what I have. I am assuming the changeRequest variable can be omitted; I was just playing around until I got this right. Going through the completely ridiculous Apple docs, I was at least able to notice that there were other methods to play with. I found that the NSFastEnumeration parameter can be an NSArray of PHAssets, and not just placeholder PHObjectPlaceholder objects.
public static func transferImage(toAlbum albumName: String, withID imageID: String, onSuccess success: @escaping (String) -> Void, onFailure failure: @escaping (Error?) -> Void) {
    guard let album = self.getAlbum(withName: albumName) else {
        ... // failure here, albumNotFoundError
        return
    }
    if self.hasImageInAlbum(withIdentifier: imageID, fromAlbum: albumName) {
        ... // failure here, image already exists in the album, do not make another
        return
    }
    let theAsset = self.getExistingAsset(withLocalIdentifier: imageID)
    if theAsset == nil {
        ... // failure, no asset for asset id
        return
    }
    PHPhotoLibrary.shared().performChanges({
        let albumChangeRequest = PHAssetCollectionChangeRequest(for: album)
        let changeRequest = PHAssetChangeRequest(for: theAsset!)
        let enumeration: NSArray = [theAsset!]
        let cnt = album.estimatedAssetCount
        if cnt == 0 {
            albumChangeRequest?.addAssets(enumeration)
        } else {
            albumChangeRequest?.insertAssets(enumeration, at: [0])
        }
    }) { didSucceed, error in
        OperationQueue.main.addOperation {
            didSucceed ? success(imageID) : failure(error)
        }
    }
}
So, it is pretty much the same, except that instead of creating an asset-creation request and generating a placeholder for the created asset, you use the existing asset ID to fetch the existing asset, and you pass that existing asset in the addAssets/insertAssets NSArray parameter instead of a newly created asset placeholder.
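The helper methods used above (getAlbum, hasImageInAlbum, getExistingAsset) aren't shown in the answer. A minimal sketch of the fetch-by-identifier one, assuming the ID you extracted works as a local identifier, might look like this inside the helper class:

import Photos

static func getExistingAsset(withLocalIdentifier id: String) -> PHAsset? {
    // Fetch the already-saved asset instead of creating a new one.
    let result = PHAsset.fetchAssets(withLocalIdentifiers: [id], options: nil)
    return result.firstObject
}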

aws dynamodb how to use object mapper with batch get in ios swift

Thanks in advance for any help. I am trying to batch-get (load multiple) items from one DynamoDB table using the AWS iOS SDK (Swift). I can load one item using the block syntax, but I need to load 10 or more, and I don't want to make 10 separate block calls to load them individually. I tried to follow the attached Stack Overflow link (where a similar solution is given), but I am getting a compiler error. I come from a Java background, so it could also be a syntax issue. Is this the right way to load multiple items? I don't want to use the low-level API. Any help on where I am going wrong is appreciated. Thanks.
aws dynamodb how to use object mapper with batch get in ios
let dynamoDBObjectMapper = AWSDynamoDBObjectMapper.default()
var tasksList = Array<AWSTask<AnyObject>>()
for i in 1...10 {
    tasksList.append(dynamoDBObjectMapper.load(AWSCards.self, hashKey: "SH_" + String(i), rangeKey: nil))
}
AWSTask.init(forCompletionOfAllTasksWithResults: tasksList).continueWithBlock { (task) -> AnyObject? in
    if let cards = task.result as? [AWSCards] {
        print(cards.count)
    } else if let error = task.error {
        print(error.localizedDescription)
    }
    return nil
}
Try the following code (Swift 4.1, Feb 9th, 2018):
let dynamoDBObjectMapper = AWSDynamoDBObjectMapper.default()
var tasksList = Array<AWSTask<AnyObject>>()
for i in 1...10 {
    tasksList.append(dynamoDBObjectMapper.load(AWSCards.self, hashKey: "SH_" + String(i), rangeKey: nil))
}
AWSTask<AnyObject>.init(forCompletionOfAllTasksWithResults: tasksList).continueWith { (task) -> Any? in
    if let cards = task.result as? [AWSCards] {
        print(cards.count)
    } else if let error = task.error {
        print(error.localizedDescription)
    }
    return nil
}
Your question is "how to use the object mapper", but it might be more efficient for you not to use it.
However, there is a way to use it. See Niklas's answer here and here (he copied and pasted), but something about it strikes me as fishy. I would assert that it is not as fast as the built-in batch-get function, but I am unsure; I suspect it does not fetch the items in parallel, or at least not as efficiently as BatchGetItem does.
See the docs: "In order to minimize response latency, BatchGetItem retrieves items in parallel."
According to Yosuke, "Currently, AWSDynamoDBObjectMapper does not support the batch get item. You need to load one item at a time if you want to use the object mapper", as of 2016. This still seems to be the case (I am using a version a couple of releases behind, but not too far behind; someone should check).
In conclusion, if you are loading one item at a time, you are likely missing out on the whole purpose of BatchGetItem (low latency).
Pulling from various sources, including John Davis's question here, I have tested and run this BatchGetItem code. Here ya go.
import AWSDynamoDB

let primaryKeyToSortKeyDict: [String: String] = .... // Your stuff
var keys = [Any]()
for key in primaryKeyToSortKeyDict.keys {
    let partitionKeyValue = AWSDynamoDBAttributeValue()
    partitionKeyValue?.s = String(key)
    let sortValue = AWSDynamoDBAttributeValue()
    sortValue?.s = String(primaryKeyToSortKeyDict[key]!)
    keys.append(["partitionKeyAttributeName": partitionKeyValue, "sortKeyAttributeName": sortValue])
}
let keysAndAttributesMap = AWSDynamoDBKeysAndAttributes()
keysAndAttributesMap?.keys = keys as? [[String: AWSDynamoDBAttributeValue]]
keysAndAttributesMap?.consistentRead = true
let tableMap = [table: keysAndAttributesMap] // `table` is your table name
let request = AWSDynamoDBBatchGetItemInput()
request?.requestItems = tableMap as? [String: AWSDynamoDBKeysAndAttributes]
request?.returnConsumedCapacity = AWSDynamoDBReturnConsumedCapacity.total
guard request != nil else {
    print("Handle some error")
    return
}
AWSDynamoDB.default().batchGetItem(request!) { (output, error) in
    print("Here is the batchgetitem output")
    if error == nil {
        // do output stuff
    } else {
        // handle error
    }
}
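The answer stops at the completion handler, so one more hedged sketch: if I remember the SDK's shape correctly, the matched items come back in output.responses, keyed by table name, as arrays of attribute-value dictionaries. Something like this could replace the // do output stuff branch (the attribute names are the hypothetical ones used above):

if error == nil, let items = output?.responses?[table] {
    for item in items {
        // e.g. read a string attribute back out
        print(item["partitionKeyAttributeName"]?.s ?? "?")
    }
}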

AVAssetExportSession fails to convert .mov from photo library. Why?

Scenario:
I wish to reduce the size of individual videos from my iPod touch photo library.
1. Collect videoAssets from library.
2. Get a thumbnail of the PHAsset - works.
3. Get the actual video from the library.
4. Request the AVAssetForVideo from the library.
5. Convert the video via ExportSessions... loading assorted parameters.
6. Attempt to run the export into a tmp directory for use.
* FAILS *
Here's the debug output and error message (screenshots omitted), followed by my code:
func getVideoFromPhotoLibrary() {
    let videoAssets = PHAsset.fetchAssetsWithMediaType(.Video, options: nil)
    videoAssets.enumerateObjectsUsingBlock {
        (obj: AnyObject!, index: Int, stop: UnsafeMutablePointer<ObjCBool>) in
        let mySize = CGSizeMake(120, 120)
        let myAsset = obj as! PHAsset
        let imageManager = PHImageManager.defaultManager()
        var myVideo: BlissMedium?
        // Request the poster frame or the image of the video
        imageManager.requestImageForAsset(myAsset, targetSize: mySize, contentMode: .AspectFit, options: nil) {
            (imageResult, info) in
            let thumbnail = UIImage(named: "videoRed")
            myVideo = BlissMedium(blissImage: imageResult, creationDate: myAsset.creationDate)
            myVideo!.mediumType = .video
        }
        // Actual video:
        imageManager.requestAVAssetForVideo(myAsset, options: nil, resultHandler: { result, audio, info in
            let asset = result as! AVURLAsset
            let mediaURL = asset.URL
            let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality)
            let filename = "composition.mp4"
            session.outputURL = NSURL(string: NSTemporaryDirectory())
            session.outputFileType = AVFileTypeQuickTimeMovie
            session.exportAsynchronouslyWithCompletionHandler({ () -> Void in
                dispatch_async(dispatch_get_main_queue(), {
                    if session.status == AVAssetExportSessionStatus.Completed {
                        println("Success")
                    } else {
                        println(session.error?.localizedDescription)
                        // The requested URL was not found on this server.
                    }
                })
            })
        })
        if nil != myVideo {
            self.gBlissVideoMedia.append(myVideo!)
        }
    }
}
I checked to be sure the target path/file exists; then I added the 'AVFileTypeMPEG4' output type to match the intended .mp4:
let targetDir = createTempDirectory("bliss/composition.mp4") as String?
if NSFileManager.defaultManager().fileExistsAtPath(targetDir!) {
    println("*** file exists! ***")
} else {
    return
}
session.outputURL = NSURL(string: targetDir!)
session.outputFileType = AVFileTypeMPEG4
I'm still having problems:
* file exists! *
Optional("The operation could not be completed")
What am I doing wrong; what's missing?
Update:
I'm able to successfully run the export to my NSHomeDirectory() vs NSTemporaryDirectory() in Objective-C.
However... the same code written in Swift fails.
I notice a change in the absolute path to the target output in Swift that isn't there in Objective-C (screenshot omitted).
Perhaps it's a Swift 1.2 bug?
I am not sure you can save to the root of the temp directory; I normally use this function to create a new temp directory that I can use:
func createTempDirectory(myDir: String) -> String? {
    let tempDirectoryTemplate = NSTemporaryDirectory().stringByAppendingPathComponent(myDir)
    let fileManager = NSFileManager.defaultManager()
    var err: NSErrorPointer = nil
    if fileManager.createDirectoryAtPath(tempDirectoryTemplate, withIntermediateDirectories: true, attributes: nil, error: err) {
        return tempDirectoryTemplate
    } else {
        return nil
    }
}
Try to make your conversion in the directory returned by this function.
I hope that helps you!
I didn't quite understand what the last part of your code does, where you find out whether a file exists or not. Which file is it you are locating?
Since I didn't understand that, this might be irrelevant, but in your topmost code I notice that you set filename to composition.mp4 but let the outputURL be NSURL(string: NSTemporaryDirectory()). With my lack of Swiftness I might be missing something, but it seems to me as if you're not using the filename at all and are trying to write the file as a folder. I believe setting a proper URL might fix the problem, but I'm not sure. An Objective-C example of this could be:
NSURL *outputURL = [[NSURL alloc]
    initFileURLWithPath:[NSString pathWithComponents:
        @[NSTemporaryDirectory(), @"composition.mp4"]]];
The outputURL is supposed to point to the actual file, not the folder it lies in. I think..
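In modern Swift, the rough equivalent (my sketch) would be to build a file URL and append the filename:

let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("composition.mp4")
session.outputURL = outputURL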
Anyway, if that doesn't work I do have a few other thoughts as well.
Have you tried it on an actual device? There may be a problem with the simulator.
Also, sadly, I have gotten the error -12780 countless times with different root problems, so that by itself doesn't narrow things down much.
And I see you check whether session.status == AVAssetExportSessionStatus.Completed; have you checked what the actual status is? Is it .Failed, or perhaps .Unknown? There are several statuses.
This might be a long shot, but in one of my apps I use the camera to capture video/audio, then encode/convert it using AVAssetExportSession. There were strange errors when starting to record, as well as after recording (exporting). I found out that I could change the AVAudioSession, which apparently has something to do with how the device handles media.
I have no idea how to Swift, but here's my code (in viewDidAppear of the relevant view):
NSError *error;
AVAudioSession *aSession = [AVAudioSession sharedInstance];
[aSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[aSession setMode:AVAudioSessionModeVideoRecording error:&error];
[aSession setActive: YES error: &error];
The category PlayAndRecord allowed me to start the camera much faster, as well as getting rid of the occasional hanging AVAssetExportSessionStatus.Unknown and the occasional crash .Failed (which also threw the -12780-error).
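A rough Swift translation of that snippet (my sketch; Swift 4-era constants, which later SDKs rename to .playAndRecord and .videoRecording):

import AVFoundation

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try audioSession.setMode(AVAudioSessionModeVideoRecording)
    try audioSession.setActive(true)
} catch {
    print("AVAudioSession error: \(error)")
}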
