QuickTime metadata APIs and iTunes

I'm trying to set some metadata in a .mov file with the quicktime metadata APIs and have it show up in iTunes. I've got it working for most of the properties, but I can't get the description field to populate. Here is the code I'm using (shortened to only show what I think is the relevant portion).
const char* cString = [@"HELLO WORLD" cStringUsingEncoding:NSMacOSRomanStringEncoding];
QTMovie* qtMovie = [[QTMovie alloc] initWithFile:filename error:&error];
Movie movie = [qtMovie quickTimeMovie];
QTMetaDataRef metaDataRef = NULL;
OSStatus err = noErr;
err = QTCopyMovieMetaData(movie, &metaDataRef);
// 'key' is defined elsewhere in the full code; for the description field it
// would be the common description key, e.g. kQTMetaDataCommonKeyDescription.
QTMetaDataItem outItem;
QTMetaDataAddItem(metaDataRef,
                  kQTMetaDataStorageFormatiTunes,
                  kQTMetaDataKeyFormatCommon,
                  (const UInt8 *)&key,
                  sizeof(key),
                  (const UInt8 *)cString,
                  strlen(cString),
                  kQTMetaDataTypeUTF8,
                  &outItem);
I found the following link, which states that for the information and description properties I should be using kQTMetaDataStorageFormatQuickTime, but that doesn't seem to make any difference. Has anyone else had any success getting the description column to populate when importing metadata into iTunes videos?
http://lists.apple.com/archives/quicktime-api/2006/May/msg00115.html

I ended up using AtomicParsley (http://atomicparsley.sourceforge.net/) without any issues. It also has the benefit of supporting mp4 and m4v files, not just mov files, which was something else I needed. With it the descriptions showed up fine, and it was much easier to use than the QTMetaData API.
Edit: Argh... I just found out that AtomicParsley doesn't work with mov files. It works with mp4 and m4v files, but I guess the original question still stands, because I would like to support mov files as well.

Figured it out finally with the help of this post and some deep debugging into the contents of my tagged media.
Retrieving the key name on AVMetadataItem for an AVAsset in iOS
I set the data format to kQTMetaDataStorageFormatiTunes and the key format to kQTMetaDataKeyFormatiTunesShortForm. And then the tags I use are the encoded id3 tags like in the post above. The common keys (kQTMetaDataCommonKeyArtist, kQTMetaDataCommonKeyComment) will generally not work if your goal is to view the data in iTunes. It seems a couple of them still do work, but in general they don't map over properly to their id3 counterparts.
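If you want to double-check which short-form keys actually made it into the file, you can read them back on the AVFoundation side, as in the linked post. This is just a verification sketch, with url standing in for the tagged movie file:

import AVFoundation

// Verification sketch: list the iTunes-keyspace items that were written.
// `url` is assumed to point at the tagged movie file.
let asset = AVAsset(url: url)
for item in asset.metadata(forFormat: .iTunesMetadata) {
    // item.key comes back as the short-form iTunes code; stringValue is the stored text.
    print(String(describing: item.key), "=", item.stringValue ?? "")
}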

Related

iOS, Swift, Image Metadata, XMP, DJI Drones

I'm writing an iOS Swift app to fetch metadata from DJI drone images. I'm trying to access the Xmp.drone-dji.X metadata. The iOS/Swift CGImageSource and CGImageMetadata classes get almost all of the metadata out of the image, but not the Xmp.drone-dji values: when I get a list of tags, those tags/values are not listed. I know the tags/data are in the images because I've examined the images using exif, exiv2, etc.
Any suggestions?
Here is the code I'm using so far:
result.itemProvider.loadDataRepresentation(forTypeIdentifier: UTType.image.identifier) { data, err in
    if let data = data {
        let src = CGImageSourceCreateWithData(data as CFData, nil)!
        // Standard properties (EXIF, TIFF, GPS, ...); the drone-dji values are not in here.
        let md = CGImageSourceCopyPropertiesAtIndex(src, 0, nil) as! NSDictionary
        // Full metadata tree; the Xmp.drone-dji.* tags are missing here as well.
        let md2 = CGImageSourceCopyMetadataAtIndex(src, 0, nil)
    }
}
Thanks,
Bobby
So, after a lot of searching, trial and error, I have found an answer.
I was not able to get any of the CGImage Swift APIs to extract this info for me.
Adobe has a C++ library that parses XMP/XML data out of images, and it purports to support iOS. I didn't want the hassle of building C++ on iOS, importing that into Xcode, and then dealing with the fact that thrown errors do not propagate well from C++/Objective-C to Swift.
So, at a high level, I did the following (a rough sketch follows the list):
- Get the bytes of the raw image as CFData or Data, then cast them to a String.
- Use String.range() to find the beginning of the XML/XMP data in the image, searching for the substring <?xpacket begin.
- Use String.range() to find the end of the XML/XMP data in the image, using the substring <?xpacket end.*?>.
- Extract the XML document out of the image data String.
- Use the Swift XMLParser class to parse the XML, copying attributes and elements as necessary. I simply added what I wanted to the already existing Exif NSDictionary returned by the CGImage classes.
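A rough Swift sketch of those steps (the helper name is mine, and it assumes the XMP packet is stored as plain text inside the image bytes):

import Foundation

// Sketch: locate and extract the raw XMP packet from image bytes by scanning
// for the xpacket begin/end markers, then hand the XML to XMLParser.
func extractXMPPacket(from data: Data) -> String? {
    // A lossy byte-to-character decoding is enough to search for the ASCII markers.
    guard let text = String(data: data, encoding: .isoLatin1) else { return nil }
    guard let start = text.range(of: "<?xpacket begin"),
          let end = text.range(of: "<?xpacket end", range: start.upperBound..<text.endIndex),
          let close = text.range(of: "?>", range: end.upperBound..<text.endIndex)
    else { return nil }
    return String(text[start.lowerBound..<close.upperBound])
}

// The returned string can then be fed to XMLParser; the Xmp.drone-dji.* values
// show up as attributes/elements in the parser delegate callbacks.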
Happy to answer questions on this approach. My code will eventually be uploaded to GitHub under OpenAthenaIOS project.
Bobby

Adding metadata to generated audio file

I'm generating an audio file programmatically, and I'd like to add metadata to it, such as the title and artist. I don't particularly care what format the file is written in, as long as AVPlayer will read it and send it to the playing device. (The whole goal is to send this generated audio and its track name to a Bluetooth device. I'm happy to explore easier ways to achieve this on iPhone that don't require writing the file or adding metadata directly to the file.)
So far I've discovered that AVAssetWriter will often just throw away metadata that it doesn't understand, without generating errors, so I'm stumbling a bit trying to find what combinations of file formats and keys are acceptable. So far I have not found a file format that I can auto-generate that AVAssetWriter will add any metadata to. For example:
let writer = try AVAssetWriter(outputURL: output, fileType: .aiff)
let title = AVMutableMetadataItem()
title.identifier = .commonIdentifierTitle
title.dataType = kCMMetadataBaseDataType_UTF8 as String
title.value = "The Title" as NSString
writer.metadata = [title]
// setup the input and write the file.
I haven't found any combination of identifiers or fileTypes (that I can actually generate) that will include this metadata in the file.
My current approach is to create the file as an AIFF, and then use AVAssetExportSession to rewrite it as an m4a. Using that I've been able to add enough metadata that iTunes will show the title. However, Finder's "File Info" is not able to read the title (which it does for iTunes m4a files). My assumption is that if it doesn't even show up in File Info, it's not going to be sent over Bluetooth (I'll be testing that soon, but I don't have the piece of hardware I need handy).
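For reference, the AIFF-to-m4a re-export described above looks roughly like this (a sketch; aiffURL and m4aURL are placeholder names):

import AVFoundation

// Sketch: re-export the generated AIFF as .m4a with a title metadata item.
let asset = AVAsset(url: aiffURL)
let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A)!
export.outputURL = m4aURL
export.outputFileType = .m4a

let title = AVMutableMetadataItem()
title.identifier = .commonIdentifierTitle
title.value = "The Title" as NSString
export.metadata = [title]

export.exportAsynchronously {
    // Check export.status and export.error before using the file.
}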
Studying iTunes m4a files, I've found some tags that I cannot recreate with AVMetadataItem. For example, Sort Name (sonm). I don't know how to write tags that aren't one of the known identifiers (and I've tested all 263 AVMetadataIdentifiers).
With that background, my core questions:
What metadata tags are read by AVPlayer and sent to Bluetooth devices (i.e. AVRCP)?
Is it possible to write metadata directly with AVAssetWriter to a file format that supports Linear PCM (or some other easy-to-generate format)?
Given a known tag/value that does not match any of the AVMetadataIdentifiers, is it possible to write it with AVAssetExportSession?
I'll explore third-party id3 frameworks later, but I'd like to achieve it with AVFoundation (or other built-in framework) if possible.
I've been able to use AVAssetWriter to store metadata values in a .m4a file using the iTunes key space:
let songID = AVMutableMetadataItem()
songID.value = "songID" as NSString
songID.identifier = .iTunesMetadataSongID
let songName = AVMutableMetadataItem()
songName.value = "songName" as NSString
songName.identifier = .iTunesMetadataSongName
You can write compressed .m4a files directly using AVAssetWriter by specifying the correct settings when you set up the input object, so there’s no need to use an intermediate AIFF file.
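A minimal sketch of that setup (the sample-rate and bit-rate values are just illustrative; outputURL is assumed, and songID/songName are the items from the snippet above):

import AVFoundation

// Sketch: write AAC directly into an .m4a container with AVAssetWriter,
// so the iTunes-keyspace items above end up in the file.
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .m4a)

let aacSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44_100,
    AVNumberOfChannelsKey: 2,
    AVEncoderBitRateKey: 128_000
]
let input = AVAssetWriterInput(mediaType: .audio, outputSettings: aacSettings)
writer.add(input)
writer.metadata = [songID, songName]

// Then startWriting(), startSession(atSourceTime:), append sample buffers,
// and finishWriting(completionHandler:) as usual.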

Finding video type from NSData

I am loading a video from a URL provided by a third party. There is no file extension (or filename, for that matter) on the URL, as it is an obscured URL. I can take the data from this (in the form of NSData) and load it into a video player and display it fine.
I want to persist this data to a file. However, I don't know what format the data is in (mp4, wav, etc.). I assume it is mp4 (since it's a video from the web), but is there a programmatic way of finding out for sure? I've looked around StackOverflow and at the documentation and haven't been able to find anything. I just want to know the appropriate file extension, and whether the data is an image or a video.
You should check the content type returned by the server:
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)theResponse
{
    NSString *content_type = [[(NSHTTPURLResponse *)theResponse allHeaderFields] valueForKey:@"Content-Type"];
    // content_type might be image/jpeg, video/mp4, etc.
}
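The NSURLConnection delegate above is the older API; with URLSession the same check looks roughly like this (a sketch, with url standing in for the obscured URL):

import Foundation

// Sketch: read the server-reported type before saving the data and pick an extension from it.
let task = URLSession.shared.dataTask(with: url) { data, response, error in
    // mimeType is derived from the Content-Type header, e.g. "video/mp4" or "image/jpeg".
    print("MIME type:", response?.mimeType ?? "unknown")
}
task.resume()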

AVMutableMetadataItem's time & duration INVALID after reading

I have a question.
Recently I needed to add custom tags to recorded video (a local video on the device, not a streamed video). The task is to add some event-specific tags to the video, whose position can be set by pressing forward/backward buttons, like in any player.
It is not important whether the movie file is in mov or mp4 format.
I searched the forum and found several samples showing how to add metadata using AVAssetExportSession, and that worked.
However, when I tried to add metadata using AVAssetWriter, I wasn't able to append the attributes to the video.
What I do not understand is that after adding an attribute, the returned time & duration properties are always invalid.
For instance, let's say I have a video with a duration of 2 seconds.
I have tried different key spaces, but I am not able to write keys from the ID3 space.
Is ID3 only used for streamed video? (As far as I understand, ID3 is the metadata format of .mp3.) In any case, I was not able to write it into an MPEG-4 file.
I also tried QuickTimeUserData & ISOUserData, but the results are the same.
Here is an example
AVMutableMetadataItem *item2 = [AVMutableMetadataItem new];
item2.keySpace = AVMetadataKeySpaceiTunes;
item2.key = AVMetadataiTunesMetadataKeyUserComment;
item2.value = @"One two three";
item2.duration = CMTimeMakeWithSeconds(1, 1);
item2.time = CMTimeMakeWithSeconds(0, 1);
After reading I got the following:
AVMutableMetadataItem: 0xa4301f0, keySpace=itsk, key=\U00a9cmt, commonKey=(null), locale= (null), value=One two three, time={INVALID}, duration={INVALID}, extras={\n dataType = 1;\n}
I would like to use time & duration properties for metadata instead of writing custom data and processing it after that.
Ideally it would be great to append an array of items with time = t1, duration = d1, ..., (tn, dn).
Does anyone know how to accomplish that?
I've ended with a solution adding chapters to a video file instead of using metadata.
I looked at the available libraries and took mpv4lib.
The library is not currently compiled for iOS, so I ported the source project into a static library for the iOS platform.
That library allows you to add custom "atoms" to an mp4 file, and one of them is a QuickTime text track containing chapters.
I did something similar to that post.
The library is located here.

How to extract the song name from a live audio stream on the Blackberry Storm?

Hi,
I am new to Blackberry.
I am developing an application to get the song name from a live audio stream. I am able to get the mp3 stream bytes from the particular radio server. To get the song name I add the header "Icy-metadata: 1", so I am getting the headers from the stream. To get the mp3 block size I use "icy-metaint". How do I recognize the metadata blocks using this mp3 block size? I am using the following code; can anyone help me get this working? Here b[off+k] is the bytes that come from the server. I am converting the whole stream into a charArray, which is wrong, but how do I recognize the metadata headers according to the mp3 block size?
b[off + k] = buffers[PlayBuf][PlayByte];
String metaSt = httpConn.getHeaderField("icy-metaint");
metaInt = Integer.parseInt(metaSt);
for (int i = 0; i < b[off + k]; i++)
{
    // This is the part that is wrong: it appends the whole buffer on every pass.
    metadataHeader += new String(b).toCharArray();
    System.out.println(metadataHeader);
    metadataLength--;
}
Blackberry has no native regex functionality; I would recommend grabbing the regexp-me library (http://code.google.com/p/regexp-me/) and compiling it into your code. I've used it before and its regex support is pretty good. I think the regex in the code you posted would work just fine.
