Store data in video frames while capturing the video in iOS

How can I store data such as a timestamp or latitude in video frames, per second, while capturing video, and then retrieve that data from the saved video in iOS?

Most video formats support metadata that describes the whole video at the asset level. When using AVAssetWriter, you can add metadata items before writing begins to associate them with the output file: https://developer.apple.com/reference/avfoundation/avassetwriter#//apple_ref/occ/instp/AVAssetWriter/metadata
There are common keys (https://developer.apple.com/reference/avfoundation/1668870-av_foundation_metadata_key_const/1669056-common_metadata_keys) you can use to store the information if you like.
Note this is only at the file level, not per frame.
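As a rough illustration of that file-level approach, here is a minimal sketch in Swift; the title and location values are placeholder assumptions:

import AVFoundation

// A minimal sketch: attach file-level metadata to an AVAssetWriter before writing
// starts. Identifiers are from the common key space; the outputURL, file type and
// values are placeholder assumptions.
func makeWriter(outputURL: URL) throws -> AVAssetWriter {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    let titleItem = AVMutableMetadataItem()
    titleItem.identifier = .commonIdentifierTitle
    titleItem.value = "My capture session" as NSString

    let locationItem = AVMutableMetadataItem()
    locationItem.identifier = .commonIdentifierLocation
    // An ISO 6709 string is the conventional representation for location metadata.
    locationItem.value = "+37.3318-122.0312/" as NSString

    // Metadata must be set before writing begins; it applies to the whole file.
    writer.metadata = [titleItem, locationItem]
    return writer
}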
If you want to store information at a "frames per second" type time reference, you could build a custom solution that interacts with the sample buffers "vended" by AVFoundation (see, for example, "Recording locations in time domain with AVFoundation"). You could then write your own custom storage for that information, synchronised to the video file, and read it back and process it yourself.
I don't believe there's a way to encapsulate that per-frame location info within the actual video file itself (you could perhaps hack it by repurposing a subtitle AVAssetTrack, writing the info there, and pulling it off without displaying it, but this would behave unpredictably when the video is played on other devices).
ADDITIONAL INFO
Following on from a comment a year after I wrote this, I did some more investigation. While you could use and abuse the subtitle track as suggested, a better solution is to use the metadata track type, which exists specifically for this. https://developer.apple.com/documentation/avfoundation/avmediatype/1390709-subtitle
There are many different AVAssetTrack types which allow you to tie data to a point in time on a video, including the following (a sketch of writing a timed metadata track follows the list):
Audio
closedCaption
depthData (BETA at time of edit)
metaData <- This is probably what you want
metaDataObject <- In combination with this one too
muxed
text
timecode
video
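For the metadata track in particular, one way this might look (a sketch, not taken from the original answer): add an AVAssetWriterInput with the metadata media type to your AVAssetWriter and append AVTimedMetadataGroup objects through an AVAssetWriterInputMetadataAdaptor. The key string below is a hypothetical custom identifier, and `writer` is assumed to be an AVAssetWriter you have already configured with your video/audio inputs.

import AVFoundation
import CoreMedia

// A minimal sketch of writing a timed metadata track alongside video using
// AVAssetWriterInputMetadataAdaptor. "com.example.capture.location" is a
// hypothetical custom key.
let locationIdentifier = AVMetadataItem.identifier(forKey: "com.example.capture.location",
                                                   keySpace: .quickTimeMetadata)

func makeLocationItem(_ iso6709: String) -> AVMutableMetadataItem {
    let item = AVMutableMetadataItem()
    item.identifier = locationIdentifier
    item.dataType = kCMMetadataBaseDataType_UTF8 as String
    item.value = iso6709 as NSString
    return item
}

func addTimedMetadataInput(to writer: AVAssetWriter) -> AVAssetWriterInputMetadataAdaptor? {
    // Build a format description from a prototype group so the writer knows what
    // kind of metadata items this track will carry.
    let prototype = AVTimedMetadataGroup(items: [makeLocationItem("+00.0000+000.0000/")],
                                         timeRange: CMTimeRange(start: .zero, duration: .invalid))
    guard let description = prototype.copyFormatDescription() else { return nil }

    let input = AVAssetWriterInput(mediaType: .metadata,
                                   outputSettings: nil,
                                   sourceFormatHint: description)
    input.expectsMediaDataInRealTime = true
    guard writer.canAdd(input) else { return nil }
    writer.add(input)
    return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
}

// While capturing, append one group per moment you want to annotate; the group's
// start time is what ties the value to a point on the video's timeline.
func appendLocation(_ iso6709: String, at time: CMTime,
                    using adaptor: AVAssetWriterInputMetadataAdaptor) {
    let group = AVTimedMetadataGroup(items: [makeLocationItem(iso6709)],
                                     timeRange: CMTimeRange(start: time, duration: .invalid))
    adaptor.append(group)
}

On playback, the same values can be read back with AVAssetReaderOutputMetadataAdaptor, or observed during playback with AVPlayerItemMetadataOutput.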

Related

Play video using Data content without saving

My question is simple...
I have an array of bytes, or really a Data object, containing my video, and I want to play it.
Every library I have tried saves the Data and then uses the URL of the saved file.
But I don't want to save the Data...
How can I play a video from bytes or Data without saving it?
but I don't want to save the Data
Yes, you do. You never want to play a video from data held in memory; it's way too big and will crash your app. And in fact the system isn't even set up to play video out of memory! It's set up to play video out of files.
So, as you obtain the data you save it, and in order to play it, you supply the file URL where you saved it. That's the right way.
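A minimal sketch of that approach in Swift; the file name and the .mp4 extension are assumptions about what the Data actually contains:

import UIKit
import AVKit
import AVFoundation

// Write the received Data to a temporary file, then hand that file URL to AVPlayer.
func play(videoData: Data, from presenter: UIViewController) throws {
    let fileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("received-video.mp4")
    try videoData.write(to: fileURL, options: .atomic)

    let playerController = AVPlayerViewController()
    playerController.player = AVPlayer(url: fileURL)
    presenter.present(playerController, animated: true) {
        playerController.player?.play()
    }
}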

How to create real-time video on iOS: Apple docs say AVAssetWriter not for real-time processing?

We have been trying to create real-time videos on iOS, but have experienced many frustrating problems with AVAssetWriter, like this one where the error claims media is being appended after a session ends, though our code does not appear to do this.
Upon reading the Apple docs more carefully, it appears AVAssetWriter is not meant for real-time processing:
Note: The asset reader and writer classes are not intended to be used for real-time processing. In fact, an asset reader cannot even be used for reading from a real-time source like an HTTP live stream. However, if you are using an asset writer with a real-time data source, such as an AVCaptureOutput object, set the expectsMediaDataInRealTime property of your asset writer's inputs to YES. Setting this property to YES for a non-real-time data source will result in your files not being interleaved properly.
If not AVAssetWriter, how are you supposed to capture input from the front camera and make a video in real-time (with different overlays/watermarks appearing at different points in the video)?
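For what it's worth, the note quoted above does not rule AVAssetWriter out for capture; it only says that inputs fed from a real-time source such as an AVCaptureOutput must have expectsMediaDataInRealTime set. A rough Swift sketch of that wiring (output settings and dimensions are placeholder assumptions; capture session setup, the audio input, and the finishWriting call are omitted):

import AVFoundation

final class RealtimeRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let writer: AVAssetWriter
    private let videoInput: AVAssetWriterInput
    private var sessionStarted = false

    init(outputURL: URL) throws {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ])
        input.expectsMediaDataInRealTime = true   // the key requirement for live sources
        writer.add(input)
        self.writer = writer
        self.videoInput = input
        super.init()
    }

    // Set this object as the sample buffer delegate of an AVCaptureVideoDataOutput.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        if !sessionStarted {
            guard writer.startWriting() else { return }
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            sessionStarted = true
        }
        // Drop frames the writer is not ready for rather than blocking the capture queue.
        if videoInput.isReadyForMoreMediaData {
            videoInput.append(sampleBuffer)
        }
    }
}

Overlays and watermarks would typically be composited into the pixel buffers (for example with Core Image and an AVAssetWriterInputPixelBufferAdaptor) before appending.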

Why does no AVPlayerItemAudioOutput exist in AVFoundation?

AVPlayerItemVideoOutput is a subclass of AVPlayerItemOutput in AVFoundation; with it I can get the visual data as pixel buffers (through copyPixelBufferForItemTime:) and process them.
However, there is no corresponding AVPlayerItemAudioOutput. How can I process the audio data?
Do I have to use the AVAssetReader class to get this?
This is a great question. -[AVPlayerItem addOutput:] mentions audio, but there is nothing to be found on it in AVPlayerItemOutput.h (unless you're meant to get audio via the AVPlayerItemLegibleOutput class - I'm only half joking; as a class that vends CMSampleBuffers, I think a hypothetical AVPlayerItemAudioOutput would look a lot like it).
So I don't know where AVPlayerItemAudioOutput is, but yes you can use AVAssetReader to get at audio data.
However if you're already using an AVPlayer, your most painless path would be using MTAudioProcessingTap to play the role of the hypothetical AVPlayerItemAudioOutput.
You can add a tap to the inputParameters of your AVPlayer's currentItem's audioMix to receive (and even modify) the audio of your chosen audio tracks.
It's probably easier to read some example code than it is to parse what I just wrote.
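In that spirit, a condensed sketch (Swift) of the tap approach described above; the pass-through process callback and the choice of the first audio track are assumptions:

import AVFoundation
import MediaToolbox

// Attach an MTAudioProcessingTap to the player item's audio mix so the process
// callback receives (and may modify) the PCM audio as it is played.
func installTap(on playerItem: AVPlayerItem) {
    guard let audioTrack = playerItem.asset.tracks(withMediaType: .audio).first else { return }

    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: nil,
        init: nil,
        finalize: nil,
        prepare: nil,
        unprepare: nil,
        process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
            // Pull the source audio through the tap; inspect or modify bufferListInOut here.
            MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                               flagsOut, nil, numberFramesOut)
        })

    var tap: Unmanaged<MTAudioProcessingTap>?
    let status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                            MTAudioProcessingTapCreationFlags(kMTAudioProcessingTapCreationFlag_PostEffects),
                                            &tap)
    guard status == noErr, let tap = tap else { return }

    let params = AVMutableAudioMixInputParameters(track: audioTrack)
    params.audioTapProcessor = tap.takeRetainedValue()

    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]
    playerItem.audioMix = mix
}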

iOS UIImagePickerController for Video URL Only

I'm using UIImagePickerController to select a video from the device's camera roll. However, I'm not interested in viewing the video at this time; I want to save the URL (in Core Data) so that when the user chooses the name of the video from, for example, a pickerView, the video will load and play at that time.
My understanding (which may be wrong) is that UIImagePickerController makes a compressed copy into the sandbox and provides two different URLs (in the info dictionary). It is kind of a guess at this point, but what I think is:
UIImagePickerControllerMediaURL is the url that points to the original video; and
UIImagePickerControllerReferenceURL is the url that points to the copy.
Here are my questions:
a) Is my assumption correct as to what the two URLs point to, and can I count on the ReferenceURL to point to the selected video so long as it is on the device's camera roll?
and
b) Under the circumstances, is there any way to avoid the compression? From reading on SO, I'm thinking there may not be, but I haven't seen any posts that relate exactly to what I'm doing. The structure of my app is such that there could be a lot of these videos, and users will not want to get rid of the originals, so there is no point in having both the original and a compressed version around.
All I'm interested in is a URL I can use to access the video in the camera roll. I will also have to get a thumbnail of it to store with the URL, but I think I see how to do that.
Any help on this will be greatly appreciated.
If you only want a URL to access the video, then you can use UIImagePickerControllerMediaURL; this specifies the filesystem URL for the movie (if editing is enabled, it points to the edited/trimmed video). If you want the original video URL, you can use UIImagePickerControllerReferenceURL; this is the Assets Library URL for the original version of the video (the truly selected item, without editing). You can, of course, set controller.allowsEditing = NO to prevent the user from editing the video, so that UIImagePickerControllerMediaURL contains the URL of the original, unedited video.
AFAIK there is no compression applied to the recorded/selected video by default; this only happens if you press the Share button and try to send the file over MMS, MobileMe, etc. Just make sure you set controller.videoQuality = UIImagePickerControllerQualityTypeHigh to get the highest quality.
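A sketch (Swift) of picking a video and reading the two URLs discussed above out of the info dictionary. Note that UIImagePickerControllerReferenceURL has since been deprecated in favour of the Photos framework, so treat this as era-appropriate rather than current best practice:

import UIKit
import MobileCoreServices

final class VideoPicker: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func present(from viewController: UIViewController) {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.mediaTypes = [kUTTypeMovie as String]
        picker.allowsEditing = false          // avoid the edited/trimmed copy
        picker.videoQuality = .typeHigh       // ask for the highest quality
        picker.delegate = self
        viewController.present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // Filesystem URL of the (possibly transcoded) movie in the app sandbox.
        let mediaURL = info[.mediaURL] as? URL
        // Assets Library URL of the original item in the camera roll (deprecated).
        let referenceURL = info[.referenceURL] as? URL
        print("mediaURL: \(String(describing: mediaURL)), referenceURL: \(String(describing: referenceURL))")
        picker.dismiss(animated: true)
    }
}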

AudioPlayer iOS and m4a

I've made an app that plays music using AVAudioPlayer. It either uploads or downloads songs, writes them to Core Data, then recalls them to play when selected. All fifteen of the songs I've been testing with play normally in both the iPhone Music app and on my own computer.
However, three of them don't play back on the app. Specifically, I can upload these fifteen songs in any order, clear my Model.sqlite, download them again into the app, and find that three of them just don't play. They do, however, have the right title and artist.
Looking into this, I noticed that the difference is that the non-working files are .m4a. How do I play files of that format with AVAudioPlayer?
EDIT ("Whats "recalling?", what URL do you initialise AVAudioPlayer with?"):
There is a server with songs that the user can access through the app. After choosing which subset S to retrieve, the app then downloads S and writes it to a CoreModel using NSManagedObjectContext. Each song is stored as a separate entity with a unique ID and a relationship to a subset entity (in this case, S).
When I "recall" using the AppDelegate to get the right song using the context, the data is returned as well. I then initialize the AVAudioPlayer like so:
[[AVAudioPlayer alloc] initWithData:(NSData *)[currentSong valueForKey:@"data"] error:nil];
... So I wrote that and then realized that I hadn't actually checked what the error is (silly me). I found that it's OSStatus error 1954115647, which corresponds to "Unsupported File Type". Looking into this a bit more, I found the question "iPhone: AVAudioPlayer unsupported file type". A solution is presented there as either trimming off bad data at the beginning or initializing from the contents of a URL. Is it possible to find where the data is written in the Core Data model so I can feed that location in as the URL?
EDIT: (Compare files. Are they different?)
Yes, they are. I'm grabbing a sample .m4a file from my server, which was uploaded by the app, and comparing it to the one in iTunes. What I found is that the file is cut off before offset 229404 (out of 2906191 bytes), which starts with the bytes 20680001 A0000E21. In the iTunes version, 0028D83B 6D646174 comes before those bytes. Before that is a big block of zeroes, preceded by a big block of data, preceded by iTunes encoding information. At the very top is more encoding information listing the file as M4A.
Are you sure the codec is supported on iOS? AVAudioPlayer ought to play any format that iOS supports; you can read the list of supported formats here: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/MultimediaPG/UsingAudio/UsingAudio.html#//apple_ref/doc/uid/TP40009767-CH2-SW6
I suggest you try manually adding those files to your device through iTunes and playing them in the iPod app; if they won't play, then the problem is not your code or the SDK, but the format.
How are you recalling them to play? Are you writing them to a temporary file with an .m4a extension? That extension is probably required.
This is not a direct solution, but you probably shouldn't be saving the blobs in Core Data directly. Write the files to a cached location and save the file paths in Core Data. This will both use the database more efficiently and give you a local file path to give to your AVAudioPlayer, which will bypass the problem.
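A sketch (Swift) of that suggestion: write the song data fetched from Core Data to a cached file with an .m4a extension, then initialise the player from the file URL. The file-naming scheme here is an assumption.

import AVFoundation

func makePlayer(songData: Data, songID: String) throws -> AVAudioPlayer {
    let cachesDir = try FileManager.default.url(for: .cachesDirectory,
                                                in: .userDomainMask,
                                                appropriateFor: nil,
                                                create: true)
    let fileURL = cachesDir.appendingPathComponent("\(songID).m4a")
    try songData.write(to: fileURL, options: .atomic)

    // Initialising from a URL (rather than initWithData:) lets the framework use the
    // file extension as a type hint, which sidesteps the "unsupported file type" error.
    return try AVAudioPlayer(contentsOf: fileURL)
}

Alternatively, if you must stay in memory, AVAudioPlayer also offers initWithData:fileTypeHint:error:, where the hint serves the same purpose as the file extension.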
