I’m trying to attach some XMP metadata to a QuickTime video I'm exporting using AVAssetExportSession.
AVFoundation does support writing metadata (AVMetadataItem) and I’ve managed to export simple values which can subsequently be examined using exiftool:
AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
item.identifier = [AVMetadataItem identifierForKey:AVMetadataCommonKeyTitle keySpace:AVMetadataKeySpaceCommon];
item.value = @"My Title";
exportSession.metadata = @[item];
But I’m having trouble configuring my AVMetadataItems to correctly encode XMP. According to the Adobe XMP spec, XMP in QuickTime videos should live under the moov / udta / XMP_ atoms, but I can’t see a way to create hierarchical metadata using the AVFoundation API, or any key space that corresponds to this part of the metadata.
I also need to write XMP metadata to images, and Image I/O does have direct support for this (CGImageMetadataCreateFromXMPData), but I can't find anything equivalent in AVFoundation.
If it's not possible using AVFoundation (or similar), I'll probably look at integrating XMP-Toolkit-SDK but this feels like a clunky solution when AVFoundation almost seems to do what I need.
I finally managed to figure this out after trying lots of variations of keys/key spaces and other attributes of AVMetadataItem:
- Use a custom XMP_ key in the AVMetadataKeySpaceQuickTimeUserData key space
- Set the value not as an NSString but as an NSData containing the UTF-8 payload
- Set the dataType to raw data
This results in XMP attributes that can be read by exiftool as expected.
NSString *payload =
    @"<x:xmpmeta xmlns:x=\"adobe:ns:meta/\" x:xmptk=\"MyAppXMPLibrary\">"
     "<rdf:RDF xmlns:rdf=\"http://www.w3.org/1999/02/22-rdf-syntax-ns#\">"
     "<rdf:Description rdf:about=\"\" xmlns:xmp=\"http://ns.adobe.com/xap/1.0/\">"
     "<xmp:CreatorTool>My App</xmp:CreatorTool>"
     "</rdf:Description>"
     "</rdf:RDF>"
     "</x:xmpmeta>";
NSData *data = [payload dataUsingEncoding:NSUTF8StringEncoding];

AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
item.identifier = [AVMetadataItem identifierForKey:@"XMP_"
                                          keySpace:AVMetadataKeySpaceQuickTimeUserData];
item.dataType = (NSString *)kCMMetadataBaseDataType_RawData;
item.value = data;
exportSession.metadata = @[item];
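You can verify the result after export; something like exiftool -xmp -b out.mov should dump the XMP packet back out.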
I'm attempting to print a PDF file in my Cordova application on iOS.
The file is generated using jsPDF in the Cordova app and then I've modified the katzer cordova-plugin-printer to accept the raw PDF data as a string, convert it to NSData and print it out.
- (void)printPDFFromData:(CDVInvokedUrlCommand *)command
{
    if (!self.isPrintingAvailable) {
        return;
    }

    NSArray *arguments = [command arguments];
    NSString *documentData = [arguments objectAtIndex:0];
    NSData *pdfData = [documentData dataUsingEncoding:NSUTF8StringEncoding];

    UIPrintInteractionController *controller = printController;
    [self adjustSettingsForPrintController:controller];
    controller.printingItem = pdfData;
    [self openPrintController:controller];
    [self commandDelegate];
}
Using the iOS print simulator (I don't have access to an AirPrint printer), the PDF appears to print out, except that the background image is not printed, just the vector drawings overlaying it.
When the same raw output data is saved to a PDF file, the file displays the background image, and when you print that file the background image is printed.
Is this just an anomaly of the printer simulator or do I need to somehow set the print controller to be able to print the image in the document?
I found a solution to the issue. Something was getting lost in the decoding of the string data from JavaScript into Objective-C.
To get around this I Base64 encoded the PDF document in my JS side before sending it off to the plugin:
// 28 = length of the "data:application/pdf;base64," prefix
var startIndexOfBase64Data = 28;
var base64Document = doc.output('dataurlstring').substring(startIndexOfBase64Data);
window.plugin.printer.printPDFFromData(base64Document);
Then I needed to add NSData+Base64.m and NSData+Base64.h from this sample project into my plugins directory to allow this line of code to convert the Base64 string into NSData:
NSData* pdfData = [NSData dataFromBase64String:documentData];
The document then printed out untainted.
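As an aside, if you can target iOS 7 or later, Foundation can decode Base64 directly, so the category shouldn't be needed. A minimal sketch (not part of the original plugin code):

NSData *pdfData = [[NSData alloc] initWithBase64EncodedString:documentData
                                                      options:NSDataBase64DecodingIgnoreUnknownCharacters];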
Now I'm off to see if I can get it working with Android.
Given an AVAsset representing a movie that has at least one audio track one can determine various properties of this audio track by obtaining an AudioStreamBasicDescription instance corresponding to it:
AVAssetTrack *audio_track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CMFormatDescriptionRef formatDescriptionRef = (__bridge CMFormatDescriptionRef)[audio_track.formatDescriptions objectAtIndex:0];
const AudioStreamBasicDescription *ASBD = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescriptionRef);
This instance (ASBD) can then be examined; for example:
ASBD->mFormatID == kAudioFormatLinearPCM // True if the track is PCM
ASBD->mFormatFlags & kAudioFormatFlagIsBigEndian // nonzero if the format is big endian
However I cannot seem to find a way to determine the bit-depth of the sample. This is necessary as it will be supplied as a value for the key AVLinearPCMBitDepthKey in a dictionary that will get passed as output settings to +[AVAssetWriterInput assetWriterInputWithMediaType: outputSettings:].
How may this information be extracted from an AVAsset or an AVAssetTrack?
(The context is re-encoding the video in an AVAsset, but leaving the audio as-is)
The bit depth is stored in ASBD->mBitsPerChannel.
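For example, the settings dictionary for +[AVAssetWriterInput assetWriterInputWithMediaType:outputSettings:] can mirror the source format. A minimal sketch, assuming the source track is linear PCM and reusing formatDescriptionRef from above (variable names are mine):

const AudioStreamBasicDescription *asbd =
    CMAudioFormatDescriptionGetStreamBasicDescription(formatDescriptionRef);

NSDictionary *outputSettings = @{
    AVFormatIDKey:               @(kAudioFormatLinearPCM),
    AVSampleRateKey:             @(asbd->mSampleRate),
    AVNumberOfChannelsKey:       @(asbd->mChannelsPerFrame),
    AVLinearPCMBitDepthKey:      @(asbd->mBitsPerChannel),
    AVLinearPCMIsBigEndianKey:   @((asbd->mFormatFlags & kAudioFormatFlagIsBigEndian) != 0),
    AVLinearPCMIsFloatKey:       @((asbd->mFormatFlags & kAudioFormatFlagIsFloat) != 0),
    AVLinearPCMIsNonInterleaved: @((asbd->mFormatFlags & kAudioFormatFlagIsNonInterleaved) != 0),
};

AVAssetWriterInput *audioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:outputSettings];

Also worth noting: if the goal is to leave the audio entirely untouched, passing nil for outputSettings makes the writer input pass samples through without re-encoding.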
I have to create an app that provides online radio streaming (Icecast), preferably in .ogg format.
So I have the following questions:
1. How can I play an .ogg-format audio stream? Are there any supporting classes? I can't find any, so I suspect it's impossible without a lot of bitwise work using CFNetwork, CoreAudio, AudioToolbox, etc. (I'm not considering cocos2d, because that would be ridiculous.) Am I wrong?
2. I'm playing an mp3 stream for now (.ogg is not a possibility for me). I've tried AVPlayer, MPMoviePlayerController, and the AudioStreamer libs by Matt Gallagher and DigitalDJ, and none of these solutions gives me metadata access.
For AVPlayer:
- (void)playButtonPressed:(id)sender
{
    NSURL *grindURL = [NSURL URLWithString:@"http://radio.goha.ru:8000/grind.fm"];
    grindFMPlayer = [[AVPlayer alloc] initWithURL:grindURL];
    [grindFMPlayer.currentItem addObserver:self forKeyPath:@"status" options:0 context:nil];
    AVPlayerItem *item = grindFMPlayer.currentItem;
    [grindFMPlayer play];
}
- (void)stopButtonPressed:(id)sender
{
    AVURLAsset *asset = (AVURLAsset *)grindFMPlayer.currentItem.asset;
    NSArray *arr = [asset commonMetadata];
    NSArray *it_meta = [grindFMPlayer.currentItem timedMetadata];
    [grindFMPlayer pause];
}
The counts of arr and it_meta are always 0: no song, artist, or any other metadata.
The same goes for MPMoviePlayerController; MetadataUpdate: is never called:
streamAudioPlayer = [[MPMoviePlayerController alloc]
    initWithContentURL:[NSURL URLWithString:@"http://radio.goha.ru:8000/grind.fm"]];
streamAudioPlayer.movieSourceType = MPMovieSourceTypeStreaming;
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(MetadataUpdate:)
                                             name:MPMoviePlayerTimedMetadataUpdatedNotification
                                           object:nil];
[streamAudioPlayer play];
and in the stop button method:
timedMeta = [streamAudioPlayer timedMetadata];
if ([streamAudioPlayer timedMetadata] != nil && [[streamAudioPlayer timedMetadata] count] > 0)
{
    NSLog(@"metadata count = %lu", (unsigned long)[[streamAudioPlayer timedMetadata] count]);
    for (MPTimedMetadata *metadata in [streamAudioPlayer timedMetadata])
    {
        NSLog(@"description %@", metadata.allMetadata);
        if ([[metadata.allMetadata valueForKey:@"key"] isEqualToString:@"title"])
        {
            NSString *text = [metadata.allMetadata valueForKey:@"value"];
            NSString *filename = text;
        }
    }
}
[streamAudioPlayer timedMetadata] is always nil.
I've tried
https://github.com/mattgallagher/AudioStreamer
https://github.com/DigitalDJ/AudioStreamer
These two projects are for Shoutcast and Icecast streams: http://www.mikejablonski.org/2009/04/17/reading-shoutcast-metadata-from-a-stream/
But I still have no luck getting the currently playing track info, which only shows up in the SHOUTcast app as
1st metadata = 'StreamTitle=',
2nd metadata = '' and bitrate = '128000'. (So I think I have to deal with the bytes of the HTTP response, or something like that? But that's Shoutcast metadata, and my radio stream is Icecast. I have no idea.)
I would be grateful for any help!
Icecast and Shoutcast are compatible with each other. The difference lies in the way they respond to an HTTP GET request. When you send an HTTP GET request to an Icecast server, it will reply with an HTTP 200 OK response. The response headers will contain values for the icy-br, icy-metaint, icy-name, icy-genre and icy-url keys.
When you send an HTTP GET request to a Shoutcast server, it will respond with an ICY 200 OK response. In this case you'll have to parse the response data yourself, because the metadata will not be available in the response headers.
The most important metadata key is the icy-metaint key. This value will tell you how often the metadata is sent in the stream. For more information about parsing this metadata have a look at this website: Shoutcast Metadata Protocol
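For illustration, here is a minimal Objective-C sketch of that framing (the function name, and the assumption that buffer holds one complete audio block plus its metadata, are mine):

#import <Foundation/Foundation.h>

static NSString *ICYMetadataFromBuffer(const uint8_t *buffer, NSUInteger icyMetaInt)
{
    // After icyMetaInt audio bytes comes a single length byte; the metadata
    // block that follows is (length * 16) bytes, null-padded, and looks like
    // "StreamTitle='Artist - Song';StreamUrl='';".
    NSUInteger metadataLength = buffer[icyMetaInt] * 16;
    if (metadataLength == 0) {
        return nil; // no metadata in this block
    }
    // Drop the null padding before building the string.
    size_t textLength = strnlen((const char *)buffer + icyMetaInt + 1, metadataLength);
    return [[NSString alloc] initWithBytes:buffer + icyMetaInt + 1
                                    length:textLength
                                  encoding:NSUTF8StringEncoding];
}

The StreamTitle value can then be pulled out by splitting the result on ';' or with a simple regex.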
To play ogg streams you'll need to use the open source FFmpeg library. This library can be compiled for the iOS platform and used to connect and decode ogg streams.
To play an ogg-formatted audio stream, you would need to supply a codec; iOS does not have native support for the ogg format.
I think the first thing to do is to "order" the metadata by setting the request header:
CFHTTPMessageSetHeaderFieldValue(message, CFSTR("Icy-MetaData"), CFSTR("1"));
In my case I added this line to Matt Gallagher's AudioStreamer. The result was that I could actually hear the metadata when the stream was playing, because it was contained in the stream data. The next step was to filter the metadata out and interpret it. To achieve this, all the necessary information is already mentioned here.
Good luck!
All:
I am recording a movie, using AVCaptureMovieFileOutput. As various events occur, I wish to store the event's title/time in the QuickTime movie being written. Thus I might have 20-30 data points that I wish to associate with a particular movie.
My strategy is to use metadata, but I have not been having much luck. Can someone please tell me, first of all:
a) Can I store arbitrary metadata, or just those keys and values as defined in AVMetadataFormat.h? I would like to be able to store an array.
b) If I can store an arbitrary array, what key does the trick? If not, could I store my metadata in a comment field (ugly, but I could parse 20-30 points quickly enough).
c) The code shown below does not appear to work: no matter what I put in for item.key (AVMetadataQuickTimeMetadataKeyArtist, AVMetadataCommonKeyArtist, or all sorts of other things ending in Artist), I never see anything in iTunes' Get Info window.
- (IBAction)recordEvent:(id)sender {
    NSLog(@"Record a metadata point here ...");

    // Is there any metadata associated with the file yet?
    NSArray *existingMetaData = self.aMovieFileOutput.metadata;
    NSMutableArray *newMetadataArray = nil;
    if (existingMetaData) {
        newMetadataArray = [existingMetaData mutableCopy];
    } else {
        newMetadataArray = [[NSMutableArray alloc] init];
    }

    AVMutableMetadataItem *item = [[AVMutableMetadataItem alloc] init];
    item.keySpace = AVMetadataKeySpaceCommon;
    item.key = AVMetadataQuickTimeMetadataKeyArtist;
    item.value = @"Enya, really!"; // in practice this will be the title of (UIButton *)sender
    item.time = CMTimeMake(0, 1);  // CMTimeMake takes integer arguments, not doubles

    [newMetadataArray addObject:item];
    self.aMovieFileOutput.metadata = newMetadataArray;
}
Any advice would be greatly appreciated.
Thanks!
Storing metadata in a QuickTime file via AVCaptureMovieFileOutput and AVMutableMetadataItem only allows you to store values for keys predefined in the AVMetadataKeySpaceCommon key space, i.e. AVMetadataCommonKey*-style keys.
All other data is ignored.
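So, for the code in the question, pairing the common key space with a key from that same key space should work. A sketch based on the question's own snippet (the value is just its placeholder):

AVMutableMetadataItem *item = [[AVMutableMetadataItem alloc] init];
item.keySpace = AVMetadataKeySpaceCommon;
item.key = AVMetadataCommonKeyArtist; // key must come from the same key space
item.value = @"Enya, really!";
self.aMovieFileOutput.metadata = @[item];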
Given that PDFKit is not available on iOS, how is it possible to get the outline of a PDF document in that environment? Are commercial libraries like FastPdfKit or PSPDFKit the only solution?
It's not TOO tricky to access the pdf outline. My outline parser has about 420 LOC. I'll post some snippets, so you'll get the idea. I can't post the full code as it's a commercial library.
You basically start like this:
CGPDFDictionaryRef outlineRef;
if(CGPDFDictionaryGetDictionary(pdfDocDictionary, "Outlines", &outlineRef)) {
going down to
NSArray *outlineElements = nil;
CGPDFDictionaryRef firstEntry;
if (CGPDFDictionaryGetDictionary(outlineRef, "First", &firstEntry)) {
    NSMutableArray *pageCache = [NSMutableArray arrayWithCapacity:CGPDFDocumentGetNumberOfPages(documentRef)];
    outlineElements = [self parseOutlineElements:firstEntry level:0 error:&error documentRef:documentRef cache:pageCache];
} else {
    PSPDFLogWarning(@"Error while parsing outline. First entry not found!");
}
you parse single items like this:
// parse title
NSString *outlineTitle = stringFromCGPDFDictionary(outlineElementRef, @"Title");
PSPDFLogVerbose(@"outline title: %@", outlineTitle);
if (!outlineTitle) {
    if (error_) {
        *error_ = [NSError errorWithDomain:kPSPDFOutlineParserErrorDomain code:1 userInfo:nil];
    }
    return nil;
}
NSString *namedDestination = nil;
CGPDFObjectRef destinationRef;
if (CGPDFDictionaryGetObject(outlineElementRef, "Dest", &destinationRef)) {
    CGPDFObjectType destinationType = CGPDFObjectGetType(destinationRef);
The most annoying thing is that you have Named Destinations in most pdf documents, which need additional steps to resolve. I save those in an array and resolve them later.
It took quite a while to "get it right" as there are LOTS of differences in the PDFs that are around, and even if you implement everything in compliance to the PDF reference, some files won't work until you apply further tweaking. (PDF is a mess!)
It is now possible in iOS 11+.
https://developer.apple.com/documentation/pdfkit
You can get the PDFOutline of a PDFDocument: the document's outlineRoot property returns the root outline item if the document has an outline, and nil if it has none.
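A minimal sketch of walking it (assuming documentURL points at your PDF; recurse into each child for nested entries):

#import <PDFKit/PDFKit.h>

PDFDocument *document = [[PDFDocument alloc] initWithURL:documentURL];
PDFOutline *root = document.outlineRoot; // nil if the PDF has no outline
for (NSUInteger i = 0; i < root.numberOfChildren; i++) {
    PDFOutline *entry = [root childAtIndex:i];
    NSLog(@"%@ -> page %@", entry.label, entry.destination.page.label);
}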