I'm using AVCaptureSession to record a video with audio. Everything seems to work properly for short videos, but for some reason, if I record a video that is longer than about 12 seconds, the audio doesn't work.
Edit (because this answer is still getting upvotes): This answer works to mitigate the problem, but the likely root cause of the issue is addressed in jfeldman's answer.
I found the solution as an answer to a completely different question.
The issue is the movieFragmentInterval property in AVCaptureMovieFileOutput.
The documentation for this property explains what these fragments are:
A QuickTime movie is comprised of media samples and a sample table
identifying their location in the file. A movie file without a sample
table is unreadable.
In a processed file, the sample table typically appears at the
beginning of the file. It may also appear at the end of the file, in
which case the header contains a pointer to the sample table at the
end. When a new movie file is being recorded, it is not possible to
write the sample table since the size of the file is not yet known.
Instead, the table must be written when recording is complete. If
no other action is taken, this means that if the recording does not
complete successfully (for example, in the event of a crash), the file
data is unusable (because there is no sample table). By periodically
inserting “movie fragments” into the movie file, the sample table can
be built up incrementally. This means that if the file is not written
completely, the movie file is still usable (up to the point where the
last fragment was written).
It also says:
The default is 10 seconds. Set to kCMTimeInvalid to disable movie
fragment writing (not typically recommended).
So for some reason my recording is getting messed up whenever a fragment is written. I just added the line movieFileOutput.movieFragmentInterval = kCMTimeInvalid; (where movieFileOutput is the AVCaptureMovieFileOutput I've added to the AVCaptureSession) to disable fragment writing, and the audio now works.
We also experienced this issue. Disabling movie fragment writing works, but it doesn't actually explain the problem. Most likely you are recording to an output file with a file extension that does not support this feature, like mp4. If you pass an output file with the extension mov, you should have no issues using movie fragment writing, and the output file will have audio.
Updating videoFileOutput.movieFragmentInterval = kCMTimeInvalid solved this for me.
However, I accidentally set the movieFragmentInterval after calling startRecordingToOutputFileURL. An agonizing hour later I realized my mistake. For newbies like me, note the required order:
videoFileOutput.movieFragmentInterval = kCMTimeInvalid
videoFileOutput.startRecordingToOutputFileURL(filePath, recordingDelegate: recordingDelegate)
kCMTimeInvalid is now deprecated. This is how to assign it now:
videoFileOutput?.movieFragmentInterval = CMTime.invalid
Related
I'm working on a tvOS application where I'm using the AVPlayer to play an HLS playlist which provides audio in two formats for some languages. For example:
French (AAC)
French (EC-3)
English
I'm trying to display a custom dialog that would allow the users to select between each of these tracks.
The playlist looks like this:
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-mp4a.40.2",NAME="Français",DEFAULT=YES,AUTOSELECT=YES,LANGUAGE="fr",URI="..."
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-ec-3",NAME="Français",DEFAULT=NO,AUTOSELECT=YES,LANGUAGE="fr",URI="..."
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-mp4a.40.2",NAME="English",DEFAULT=NO,AUTOSELECT=YES,LANGUAGE="en",URI="..."
The problem is that, from what I can tell, the AVPlayer groups the tracks by language and never returns all three tracks.
(lldb) po player?.currentItem?.asset.mediaSelectionGroup(forMediaCharacteristic: .audible)
▿ Optional<AVMediaSelectionGroup>
- some : <AVAssetMediaSelectionGroup: 0x283961940, options = (
"<AVMediaSelectionKeyValueOption: 0x2839a5a00, language = fr, mediaType = 'soun', title = Français, default = YES>",
"<AVMediaSelectionKeyValueOption: 0x2839a5b00, language = en, mediaType = 'soun', title = English>"
), allowsEmptySelection = YES>
I went deeper into the French item (player?.currentItem?.asset.mediaSelectionGroup(forMediaCharacteristic: .audible)?.options.first) but I still couldn't find anything useful. I also tried looking at other fields from the AVPlayer with no success.
Even when I use the AVPlayerViewController I only see two audio tracks to choose from.
Is there any way to get all the available audio tracks?
So the issue here is actually the playlist. If you check the HLS specification, there are some notes explaining this under the Rendition Groups subsection of EXT-X-MEDIA (https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis-07#section-4.4.6.1.1)
A Playlist MAY contain multiple Groups of the same TYPE in order to
provide multiple encodings of that media type. If it does so, each
Group of the same TYPE MUST have the same set of members, and each
corresponding member MUST have identical attributes with the
exception of the URI and CHANNELS attributes.
Generally, the way to think about it is, anything within a given GROUP-ID is selectable by the user (and so AVFoundation reveals it to you). But which GROUP-ID is playing is selected by the player, and (for this scenario) this is determined by the AUDIO attribute of the EXT-X-STREAM-INF that the player has selected.
If you want the surround audio track to be selectable, it needs to exist in the same GROUP-ID as the rest of the AUDIO tracks. If you don't have control of the manifest, you can test this by rewriting the GROUP-ID of the surround French track (using something like Charles Proxy) from audio-ec-3 to audio-mp4a.40.2; it should appear in AVFoundation after that. But a word of warning: for the HLS to remain valid, the CODECS attribute of all of the EXT-X-STREAM-INF tags has to be updated to include the codecs defined by the surround track (otherwise playback may fail).
If you want to leave it up to the player to select, and you do not have a surround English track, you still have to give the English option in the surround group to remain valid HLS, but you can just leave the URI identical to the one defined in the stereo group. Again CODECS will have to be updated in this scenario.
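As a sketch of what that could look like (the URIs are elided as in the question, and the NAME and CHANNELS values here are my own illustration, not from the original manifest), a single rendition group carrying all three renditions might be:

```
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-mp4a.40.2",NAME="Français",DEFAULT=YES,AUTOSELECT=YES,LANGUAGE="fr",URI="..."
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-mp4a.40.2",NAME="Français (Surround)",DEFAULT=NO,AUTOSELECT=NO,LANGUAGE="fr",CHANNELS="6",URI="..."
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-mp4a.40.2",NAME="English",DEFAULT=NO,AUTOSELECT=YES,LANGUAGE="en",URI="..."
```

With all three renditions in one group, AVFoundation should expose each as a separate option; remember that every EXT-X-STREAM-INF referencing this group must then list both mp4a.40.2 and ec-3 in its CODECS attribute.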
This video from WWDC gives a good explanation of all of this (relevant section starts around 42:39): https://developer.apple.com/videos/play/wwdc2018/502/
I am doing some light video editing in Swift 4, reading the video with AVAsset and then using AVAssetExportSession to export the result. Everything works fine except one thing: the resulting video keeps the metadata of the original video.
This metadata includes (for example) the time and location where the video was taken.
I saw that AVAssetExportSession has a metadata: [AVMetadataItem] property, but I don't know how to use it. I set it to nil and it didn't work; it still kept the old metadata.
I read Apple's documentation about it, and it says that you don't create instances and can't modify a metadata item, so what can I do? How can I erase that metadata, or write newly generated metadata to it?
There is a lot of info about reading metadata, but not much on writing it.
Thanks in advance.
Additional links
https://developer.apple.com/documentation/avfoundation/avassetexportsession
You can filter metadata with AVMetadataItemFilter.forSharing().
From the docs: "Removes user-identifying metadata items, such as location information, and leaves only metadata related to commerce or playback itself." (see https://developer.apple.com/documentation/avfoundation/avmetadataitemfilter/1387905-forsharing)
Just add it to your export session:
let exportSession = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetPassthrough)! // choose your appropriate preset
exportSession.metadataItemFilter = AVMetadataItemFilter.forSharing()
Fun bug in the MPMedia API
I have had an ongoing bug in my music app that I have finally tracked down (now that I am rewriting it in Swift). It has a few facets. (Using systemMusicPlayer.)
I think I have narrowed the issues down to an MPMediaItem that has the following properties.
MPMediaItemPropertyIsCloudItem = true
assetURL = nil
These two make sense, but the following corner case (well, probably pretty common) threw me for a while:
Both of the above can be true, but if the item was copied from iTunes it /can/ still be playable (such items would play most of the time for me), and there is no way to tell. I have tested this over and over and it seems to be the case; these MPMediaItems might only sometimes cause an issue, or they may be fine. But you can't find out which songs these are.
If you are playing a queue and systemMusicPlayer comes across a song that is in your library but not downloaded or copied from iTunes, I believe it will automatically skip the song, similar to systemMusicPlayer.skipToNextItem(), though I think internally it's a different mechanism.
2a. This behavior causes a basically unrecoverable problem if you are using systemMusicPlayer.skipToPreviousItem() and come across an Item that would have been skipped over - meaning, it doesn't recognize that you are trying to move back in the queue and just throws the error and moves the queue forward.
2b. As far as I could tell, when the error hits going forward, the MPMediaItem never becomes the nowPlayingItem. The problems going backwards get compounded: the MPMediaItem metadata (which is always available, whether the item is local or not) gets loaded, but the song, on trying to play, is immediately sent forward in the queue again.
OK, so asinine and infuriating.
Now to my question:
I can't do anything about not being able to know if a cloud item is on the device or not (via iTunes). I /should/ be able to just filter on whether an item has an assetURL, however, which would guarantee that it is local and available.
let filter:MPMediaPropertyPredicate = MPMediaPropertyPredicate(value: "ipod", forProperty: MPMediaItemPropertyAssetURL, comparisonType: MPMediaPredicateComparison.Contains)
This returns 0 items. Does anyone know of a way to filter on this property? Doing it here seems like it would be the cleanest, and should leave the query returning items and itemSections. All my tables populate from the queries, and I don't think there's a way to reconstruct one manually.
The URL has a format like this: ipod-library://item/item.m4a?id=5314739480586915369
Now, I suspect it is possible to add catches when populating table views and such, but it feels really messy.
This is iOS 9.2.1, Swift 2, Xcode 7.2.1.
I have not yet wiped the phone and re-copied the songs. Manually downloading them from the Music app is the only way the items get an assetURL if it was not present.
It's not as efficient, but one thing you can do:
let query = MPMediaQuery()
let allItems = query.items ?? []
let items = allItems.filter { $0.assetURL?.scheme?.hasPrefix("ipod") ?? false }
From MPMediaItem.h, you can see that only these properties are filterable (they are marked "filterable" in the header comments):
MPMediaItemPropertyPersistentID
MPMediaItemPropertyMediaType
MPMediaItemPropertyTitle
MPMediaItemPropertyAlbumTitle
MPMediaItemPropertyAlbumPersistentID
MPMediaItemPropertyArtist
MPMediaItemPropertyArtistPersistentID
MPMediaItemPropertyAlbumArtist
MPMediaItemPropertyAlbumArtistPersistentID
MPMediaItemPropertyGenre
MPMediaItemPropertyGenrePersistentID
MPMediaItemPropertyComposer
MPMediaItemPropertyComposerPersistentID
MPMediaItemPropertyIsCompilation
MPMediaItemPropertyIsCloudItem
MPMediaItemPropertyHasProtectedAsset
MPMediaItemPropertyPodcastTitle
MPMediaItemPropertyPodcastPersistentID
MPMediaItemPropertyPlayCount
So it is impossible to build a query with a condition on the assetURL property. Trying to do anything with assetURL is a dead end unless you fetch all the MPMediaItems and search the array yourself.
Also, I vaguely remember from somewhere in Apple's docs that you cannot, by any means, find out whether a cloud item has been downloaded or not.
However, if you want to investigate cloud/local issues further, I suggest you take into account the user's music settings (whether iCloud Music Library is turned on or off) and look into MPMediaItemPropertyHasProtectedAsset.
If the purpose is to detect whether a song is local or not, you can just build a query on both isCloudItem == false AND hasProtectedAsset == false; in that case assetURL does not matter.
I'm trying to figure out how I can specify a custom end time for an embedded YouTube video. I know that I can customize the start time by adding &start=30, but I haven't seen anything relating to the end time.
I need to be able to do this for a web app I'm building, so if there is no way provided by YouTube, how might I be able to accomplish this anyway?
I've skimmed over the documentation to no avail. Thanks!
I just found out that the following works:
https://www.youtube.com/embed/[video_id]?start=[start_at_second]&end=[end_at_second]
Note: the time must be an integer number of seconds (e.g. 119, not 1m59s).
I tried the method of mystic11 (https://stackoverflow.com/a/11422551/506073) and got redirected around. Here is a working example URL:
http://youtube.googleapis.com/v/WA8sLsM3McU?start=15&end=20&version=3
If the version=3 parameter is omitted, the video starts at the correct place but runs all the way to the end. From the documentation for the end parameter I am guessing version=3 asks for the AS3 player to be used. See:
end (supported players: AS3, HTML5)
Additional Experiments
Autoplay
Autoplay of the clipped video portion works:
http://youtube.googleapis.com/v/WA8sLsM3McU?start=15&end=20&version=3&autoplay=1
Looping
Adding looping as per the documentation unfortunately starts the second and subsequent iterations at the beginning of the video:
http://youtube.googleapis.com/v/WA8sLsM3McU?start=15&end=20&version=3&loop=1&playlist=WA8sLsM3McU
To do this properly, you probably need to set enablejsapi=1 and use the JavaScript API.
FYI, the above video looped: http://www.infinitelooper.com/?v=WA8sLsM3McU&p=n#/15;19
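With the JavaScript API, one approach (a sketch; the helper name is mine, and the commented wiring assumes YouTube's IFrame API script is loaded in a browser) is to seek back to the clip start whenever the player reports that the clip has ended:

```javascript
// Decide where playback should jump next to keep looping a clip
// [clipStart, clipEnd): returns the seek target, or null if no seek is needed.
function nextSeek(currentTime, clipStart, clipEnd) {
  return currentTime >= clipEnd ? clipStart : null;
}

/*
// Browser-only wiring (assumes https://www.youtube.com/iframe_api is loaded):
var player = new YT.Player('player', {
  videoId: 'WA8sLsM3McU',
  playerVars: { start: 15, end: 20, enablejsapi: 1 },
  events: {
    onStateChange: function (evt) {
      if (evt.data === YT.PlayerState.ENDED) {
        var target = nextSeek(player.getCurrentTime(), 15, 20);
        if (target !== null) {
          player.seekTo(target); // jump back to the clip start
          player.playVideo();
        }
      }
    }
  }
});
*/
```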
Remove Branding and Related Videos
To get rid of the YouTube logo and the list of videos to click on at the end of the video you want to watch, add these parameters (&modestbranding=1&rel=0):
http://youtube.googleapis.com/v/WA8sLsM3McU?start=15&end=20&version=3&autoplay=1&modestbranding=1&rel=0
Remove the uploader info with showinfo=0:
http://youtube.googleapis.com/v/WA8sLsM3McU?start=15&end=20&version=3&autoplay=1&modestbranding=1&rel=0&showinfo=0
This eliminates the thin strip with the video title, up and down thumbs, and info icon at the top of the video. The final version produced is fairly clean and doesn't have the downside of giving your viewers an exit into unproductive clicking around YouTube at the end of watching the video portion that you wanted them to see.
Use the start/end parameters (in seconds), i.e. youtube.com/v/VIDEO_ID?start=4&end=117
Live DEMO:
https://puvox.software/software/youtube_trimmer.php
YouTube doesn't provide any option for an end time, but there are alternative sites that provide this, like Tubechop. Otherwise, try writing a function that either pauses the video or skips to the next one when your video has played its desired duration.
OR: using the YouTube JavaScript player API, you could do something like this:
var done = false;

function stopVideo() {
  player.stopVideo(); // `player` is the YT.Player instance
}

function onPlayerStateChange(evt) {
  if (evt.data == YT.PlayerState.PLAYING && !done) {
    setTimeout(stopVideo, 6000); // stop 6 seconds after playback starts
    done = true;
  }
}
YouTube API blog
Today I found that the old ways are not working very well.
So I used:
"Customize YouTube Start and End Time - Acetrot.com"
from http://www.youtubestartend.com/
They provide a link of the form
https://xxxx.app.goo.gl/yyyyyyyyyy
e.g. https://v637g.app.goo.gl/Cs2SV9NEeoweNGGy9
The link contains a redirect to a URL in this format:
https://www.youtube.com/embed/xyzabc123?start=17&end=21&version=3&autoplay=1
I was just trying to look up how to do this and found there is a Clip feature, now added by YouTube right under the video, that I had never noticed before!
I use this signature:
youtube.com/embed/[YOUR_VIDEO_ID]?start=[TIME_IN_SEC]&end=[TIME_IN_SEC]&autoplay=1
https://www.youtube.com/embed/2EWejmkKlxs?start=1230&end=1246&autoplay=1
I have an HTML5 video player with a custom seek bar, that's working great on the iPhone (playing inline) and on the browser.
It plays great on the iPad too and the seek bar is updated as the movie plays, but for some reason, I can't seek.
All of the values are correct and I'm trying to set:
myPlayer.currentTime = XX;
Unfortunately, the iPad refuses to set the .currentTime attribute.
From what I can gather the difference between the browser and iPad is that on the browser I get:
myPlayer.networkState = 3
myPlayer.readyState = 4
On the iPad I get:
myPlayer.networkState = 2
myPlayer.readyState = 3
It's exactly the same code, running a local MP4 video.
Any idea why this is happening?
Cheers,
Andre
I've had all kinds of problems getting JavaScript to control audio elements, and a lot of frustration with the currentTime property, along with Apple's restrictions on what constitutes direct user initiation of events.
It wouldn't surprise me if there were some kind of weird bug with JavaScript & HTML5 video playback on the iPad (or "feature" that's undocumented), which requires a workaround. From my experience, the iPad has a unique way of doing things than what's in the official documentation.
You should check the error, buffered, seekable, and seeking properties of the video element. Looking at your readyState & networkState values, the iPad seems to think that the video has not been completely loaded - which is odd for a local resource.
buffered and seekable should be equal to the time range of your entire video. seeking should be TRUE. That should at least give you a little more information about the problem.
Have you tested it with other videos? It might be that there is some kind of encoding problem with the video that the iPad has a problem with.
Other than that - there was a bug in a previous iPad OS version that broke the ability to set the currentTime property. Are you using the latest OS version?
This issue is related with value used on the video.currentTime property. On my specific case I fixed the problem by always making sure I was using floating point numbers with 1 decimal digit during the seek.
Setting video.currentTime to ZERO on iOS 3.2 will indeed seek the video to the beginning but the value won't update after that - timeupdate event is still dispatched normally but if you try to read currentTime it will always return the same value.
To seek to the beginning of the video, use 0.1 instead of 0; to seek to 12.2345, use 12.2.
PS: you can use (+(12.2345).toFixed(1)) to limit the number of decimal digits to 1.
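Both rules above can be captured in a small helper (the function name is mine; this is a sketch of the workaround, not an official API):

```javascript
// Workaround for iOS 3.2 seeking: round the seek target to one decimal
// digit, and never pass exactly 0 (which stops currentTime from updating).
function iPadSeekTime(t) {
  var rounded = +t.toFixed(1);          // e.g. 12.2345 -> 12.2
  return rounded === 0 ? 0.1 : rounded; // 0 never works; use 0.1 instead
}

// Usage: video.currentTime = iPadSeekTime(desiredTime);
```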
Kyle's answer is a good one. I would add, you can't assume the seekable attribute is filled in after any particular event. This means you can't wait for events like loadedmetadata, canplay, or any other and assume that you're able to set currentTime safely at that point.
It seems the safest way to do it is to check seekable after every video-related event, and if it encompasses the time you want to seek to, set currentTime at that point. On iPad, seekable may not be filled until after the canplaythrough event, which is quite late.
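That check can be sketched as a small helper (the helper name and the event list are mine; `seekable` follows the standard TimeRanges interface with length, start(i), and end(i)):

```javascript
// Return true only once the target time falls inside a seekable range,
// meaning it is safe to set video.currentTime to that value.
function canSeekTo(seekable, t) {
  for (var i = 0; i < seekable.length; i++) {
    if (t >= seekable.start(i) && t <= seekable.end(i)) return true;
  }
  return false;
}

/*
// Browser-only wiring: retry the pending seek after every video-related
// event, since no single event guarantees `seekable` is populated on iPad.
var pendingSeek = 42.0; // hypothetical target time
["loadedmetadata", "canplay", "canplaythrough", "progress"].forEach(function (name) {
  video.addEventListener(name, function () {
    if (pendingSeek !== null && canSeekTo(video.seekable, pendingSeek)) {
      video.currentTime = pendingSeek;
      pendingSeek = null;
    }
  });
});
*/
```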
See my blog post for more on this.
I am having the same issue - here are the properties in my case:
UIWebView - iPad Simulator
duration=4.861666679382324
startTime=0
currentTime=4.861666679382324
buffered(1)=[0-0]
seekable(0)=
seeking=false
error=null
readystate=4
networkstate=3
Chrome:
duration=4.9226298332214355
startTime=0
currentTime=4.9226298332214355
buffered(1)=[0-4.9226298332214355]
seekable(1)=[0-4.9226298332214355]
seeking=false
error=null
readystate=4
networkstate=1
So: nothing is getting buffered and nothing is seekable. I am playing a local clip from the resources directory of an iPad bundle, via a UIWebView.
In my case, all I need is to reset to the top of the video after each play, and I was able to accomplish this via a call to load().