I have a URL. It looks like this:
https://content.stage.someCompany.net/deliveries/artistNameHere/songNameHere-128.mp3?Expires=someNumberHere&Signature=someReallyReallyReallyLongStringHere&Key-Pair-Id=someIdHere
Let me break it down into pieces:
https://content.stage.someCompany.net/deliveries/artistNameHere/songNameHere-128.mp3
?Expires=someNumberHere
&Signature=someReallyReallyReallyLongStringHere
&Key-Pair-Id=someIdHere
As you can see, it's just a glorified .mp3 restricted to 128 kbps, with some security stuff at the end.
If I load it in Safari on my Mac, it will play. If I pass it to an AVPlayer constructor in my iOS app, it will play as well.
If, however, I use it to create an AVURLAsset, it reports that .isPlayable is false. If I stubbornly persist in further creating an AVPlayerItem based on that asset, it will report AVPlayerItemStatusFailed.
Needless to say, in these conditions my AVURLAsset + AVPlayerItem + AVPlayer infrastructure, which culminates in player.play(), actually plays no music.
However, it does successfully play if I substitute other URLs, like Apple's own https://devimages.apple.com.edgekey.net/streaming/examples/bipbop_4x3/bipbop_4x3_variant.m3u8
or
(a random .mp3 from another Stack Overflow topic) http://podcast.cbc.ca/mp3/podcasts/asithappens_20160907_50906.mp3
The differences that I see: Apple's URL is in fact a "playlist" of some sort, while the second one is a plain, "civilised" .mp3, with no security mumbo-jumbo at the end of the link.
Why won't my URL play? Do I need to do something specific with the security stuff? Right now, I'm just naively saying "hey, AVURLAsset... here's my (entire) URL... do your stuff with it..."
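For reference, a minimal sketch of that pipeline, with the asynchronous key loading AVFoundation expects (signedURL is a placeholder for the full signed URL above):
import AVFoundation

let asset = AVURLAsset(url: signedURL) // signedURL: placeholder for the full signed URL

// isPlayable is only meaningful once the "playable" key has finished loading.
asset.loadValuesAsynchronously(forKeys: ["playable"]) {
    var error: NSError?
    guard asset.statusOfValue(forKey: "playable", error: &error) == .loaded,
          asset.isPlayable else {
        print("Asset not playable: \(String(describing: error))")
        return
    }
    DispatchQueue.main.async {
        let item = AVPlayerItem(asset: asset)
        let player = AVPlayer(playerItem: item)
        player.play() // keep a strong reference to player, or playback stops
    }
}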
Found it.
The links that I receive are valid only once. Apparently that's why "the security" is in place at the end of them.
Testing one in Safari and "seeing that it works" invalidates it. Subsequently, trying the same one in the app results in .isPlayable = false.
Simply requesting a fresh link and using it directly in the app results in .isPlayable = true.
So AVURLAsset + AVPlayerItem + AVPlayer are working just fine. I was just a bloody fool.
Related
I'm working on a project where I have an instance of AVPlayer capable of playing different audio content that I retrieve from a backend, from podcasts to music and streams. Every piece of content has two types of URLs: one with an mp3 and another with an m3u8 file. All the mp3 files work well. However, some m3u8 files work fine and others don't. In particular, those that don't work cause the AVPlayer to fail with the error:
Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action"
UserInfo={NSLocalizedRecoverySuggestion=Try again later.,
NSLocalizedDescription=Cannot Complete Action.}
I don't understand what the problem is. According to this answer it is a malformed manifest file, which in my case is, for example, the following:
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,URI="_64/index.m3u8",GROUP-ID="2#48000-64000",NAME="AAC 64",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-MEDIA:TYPE=AUDIO,URI="_80/index.m3u8",GROUP-ID="2#48000-80000",NAME="AAC 80",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-MEDIA:TYPE=AUDIO,URI="_96/index.m3u8",GROUP-ID="2#48000-96000",NAME="AAC 96",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-STREAM-INF:BANDWIDTH=133336,CODECS="mp4a.40.2",AUDIO="2#48000-96000"
_96/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=100641,CODECS="mp4a.40.2",AUDIO="2#48000-64000"
_64/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=116989,CODECS="mp4a.40.2",AUDIO="2#48000-80000"
_80/index.m3u8
On the Apple forum, I found this answer which says iOS 14+ is at fault. Unfortunately I cannot test with an iOS 13 physical device.
Do you have any suggestions?
Tested on Xcode 13.1 with an iPhone 7 Plus running iOS 15.0.2.
Finally I found a solution for this issue. What worked for me was this. I believe the problem was that my manifest files were structured like the following:
#EXT-X-MEDIA:TYPE=AUDIO,URI="_64/index.m3u8", GROUP-ID="1#48000-64000",NAME="Audio 64",DEFAULT=NO,AUTOSELECT=NO
In particular, they had DEFAULT=NO,AUTOSELECT=NO. Therefore, before calling replaceCurrentItem(with:), I now do the following:
let asset = AVAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)

// Explicitly select a rendition in each media selection group, since
// DEFAULT=NO,AUTOSELECT=NO means AVPlayer won't pick one on its own.
for characteristic in asset.availableMediaCharacteristicsWithMediaSelectionOptions {
    if let group = asset.mediaSelectionGroup(forMediaCharacteristic: characteristic),
       let option = group.options.first {
        playerItem.select(option, in: group)
    }
}
This makes all my HLS audio playable by the AVPlayer.
I don't see a version in your .m3u8. Try adding #EXT-X-VERSION:3 to your playlist. AVPlayer needs the version to be included in the playlist (Android's ExoPlayer does not need it). Here is an example of a playlist that might work:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA:TYPE=AUDIO,URI="_64/index.m3u8",GROUP-ID="2#48000-64000",NAME="AAC 64",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-MEDIA:TYPE=AUDIO,URI="_80/index.m3u8",GROUP-ID="2#48000-80000",NAME="AAC 80",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-MEDIA:TYPE=AUDIO,URI="_96/index.m3u8",GROUP-ID="2#48000-96000",NAME="AAC 96",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-STREAM-INF:BANDWIDTH=133336,CODECS="mp4a.40.2",AUDIO="2#48000-96000"
_96/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=100641,CODECS="mp4a.40.2",AUDIO="2#48000-64000"
_64/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=116989,CODECS="mp4a.40.2",AUDIO="2#48000-80000"
_80/index.m3u8
Through AVCaptureSession I record a video and then immediately play it back via an AVPlayer once recording has stopped.
My problem is that the audio from the video sometimes plays out of the ear speaker at a really low volume and other times plays out of the bottom speaker.
How can I default the audio to output to the bottom speaker?
I've looked at other related posts with instances of the code below, which I tried, but to no avail. Any guidance would be appreciated.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord)
    try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
    try session.setActive(true)
} catch {
    print("audio session error: \(error)")
}
You're explicitly turning that off here:
try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
If you want to prefer the speaker, you'd use:
try session.overrideOutputAudioPort(.speaker)
AVAudioSession is very complicated, and many parts of it are not intuitive. Do not copy code you find on the internet without reading the docs on each command. The docs are pretty good, but you have to read them.
That said, rather than doing this, I'd probably switch your category and options when you switch to playback. You can do that at any time:
try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
It is generally best to keep your category aligned with what you're doing. If you set .playback here as the category, you may not even need .defaultToSpeaker (that option is only valid with .playAndRecord anyway), depending on what precisely you're trying to achieve.
Be certain to read all the relevant docs on .defaultToSpeaker, setCategory, overrideOutputAudioPort, etc. Don't just copy my suggestions. These settings have many subtle (and documented) interactions; you need to configure the session based on your actual use case, not just copy something that "seems to work." You may be very surprised at what happens when the user switches to Bluetooth, plugs in headphones, or switches to CarPlay.
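For what it's worth, here's a minimal sketch of that lifecycle, assuming you control both the recording and the playback phases (adapt to your use case after reading the docs):
import AVFoundation

let session = AVAudioSession.sharedInstance()

func configureForRecording() throws {
    // .defaultToSpeaker is only valid together with .playAndRecord.
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setActive(true)
}

func configureForPlayback() throws {
    // Playback-only audio routes to the bottom speaker by default.
    try session.setCategory(.playback)
    try session.setActive(true)
}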
You can change the audio output device for a given AVPlayer instance by setting the instance property audioOutputDeviceUniqueID to the unique ID of the desired device.
I can confirm that this works as expected on macOS 10.11.6, using Key-Value Coding (setValue:forKey:).
Apple's doc on this:
Instance Property
audioOutputDeviceUniqueID
Specifies the unique ID of the Core Audio output device used to play audio.
Declaration
@property(nonatomic, copy) NSString *audioOutputDeviceUniqueID;
Discussion
The default value of this property is nil, indicating that the default audio output device is used. Otherwise the value of this property is a string containing the unique ID of the Core Audio output device to be used for audio output.
Core Audio's kAudioDevicePropertyDeviceUID is a suitable source of audio output device unique IDs.
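For illustration, here's a sketch of both ways to set it; the device UID string is hypothetical, and in real code you'd obtain it from Core Audio via kAudioDevicePropertyDeviceUID:
import AVFoundation

let player = AVPlayer(url: movieURL)  // movieURL: placeholder
let deviceUID = "SomeDeviceUID"       // hypothetical; query Core Audio for a real one

// Direct property access (macOS only):
player.audioOutputDeviceUniqueID = deviceUID

// Or via Key-Value Coding, as confirmed above:
player.setValue(deviceUID, forKey: "audioOutputDeviceUniqueID")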
I'm working on a video app. We are changing from regular mp4 files to HLS; one of the many reasons we have to make the change is that we get much more control over the bandwidth usage of videos (we load lots of other stuff in our player, so we need to optimize the experience the best way we can).
So, AVFoundation introduced in iOS 10 the ability to control the bandwidth using:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.urlAsset];
playerItem.preferredForwardBufferDuration = 30.0;
playerItem.preferredPeakBitRate = 200000.0; // Remember this line
There's also a configuration introduced in iOS 11 to set the maximum resolution of the item with preferredMaximumResolution, so we're using it, but we still need a solution for iOS 10 devices.
Well, now we have control over preferredPeakBitRate, which is nice, but we have a problem: not all the HLS sources are generated by us. Say we want to set a maximum resolution of 480p when the user is not connected to a Wi-Fi network; today I have no way to achieve that, because I won't always know how much bandwidth the 480p source of the selected HLS playlist needs.
One thing I was thinking about is reading the information inside the m3u8 file, to at least know which quality sources my player can show and how much bandwidth each one needs.
One way to do this would be to download the m3u8 playlist as plain text and use a regex to read and process the data. Well, I'm trying to avoid that; I'd expect this to be far less difficult.
I cannot read this information from the tracks, because a) I can't find the information, and b) the tracks are replaced dynamically when changing quality, yeah, 1 track for every quality level.
So, I don't know how I can get this information. I've searched Google and Stack Overflow and I can't find it. Can anyone help me?
Here's an example for what I want to do, I have this example playlist:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=314000,RESOLUTION=228x128,CODECS="mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_192k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=478000,RESOLUTION=400x224,CODECS="avc1.42001e,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_400k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=691000,RESOLUTION=480x270,CODECS="avc1.42001e,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_600k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1120000,RESOLUTION=640x360,CODECS="avc1.4d001f,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_1000k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1661000,RESOLUTION=960x540,CODECS="avc1.4d001f,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_1500k.m3u8
And I just want to have that information available in an array inside my code, something like this:
NSArray<ZZMetadata *> *metadataArray = self.urlAsset.bandwidthMetadata;
NSLog(@"Metadata info: %@", metadataArray);
And print something like this:
<__NSArrayM 0x123456789> (
    <ZZMetadata 0x234567890> {
        trackId: 1
        neededBandwidth: 314000
        resolution: 228x128
        codecs: ...
        ...
    }
    <ZZMetadata 0x345678901> {
        trackId: 2
        neededBandwidth: 478000
        resolution: 400x224
    }
    ...
)
Given an HLS manifest with multiple variants/renditions:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1612430,CODECS="avc1.4d0020,mp4a.40.5",RESOLUTION=640x360
a.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=3541136,CODECS="avc1.4d0020,mp4a.40.5",RESOLUTION=960x540
b.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=5086455,CODECS="avc1.640029,mp4a.40.5",RESOLUTION=1280x720
c.m3u8
Is it possible to get an array of the three variants (with attributes such as bandwidth and resolution) from either the AVAsset or the AVPlayerItem?
I am able to get the currently playing AVPlayerItemTrack by using KVO on the AVPlayerItem, but again, it's only the track that's actively being played, not the full list of variants.
I'm interested in knowing whether the asset is being played at its highest possible quality, so that I can decide whether the user has enough bandwidth to start a simultaneous secondary video stream.
To know which variant you are currently playing, you can observe AVPlayerItemNewAccessLogEntryNotification; by looking at the AVPlayerItemAccessLogEvent in the access log, you can tell the current bitrate and any change in bitrate.
// In the handler for AVPlayerItemNewAccessLogEntryNotification:
AVPlayerItemAccessLog *accessLog = [((AVPlayerItem *)notif.object) accessLog];
AVPlayerItemAccessLogEvent *lastEvent = accessLog.events.lastObject;
if (lastEvent.indicatedBitrate != self.previousBitrate)
{
    self.bitrate = lastEvent.indicatedBitrate;
}
As for knowing the entire list of available bitrates: you can simply make a GET request for the master m3u8 playlist and parse it. You will only need to do it once, so it's not much overhead.
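If it helps, here's a rough sketch of that approach in Swift; VariantInfo is a made-up type, and the naive comma split would mangle quoted attributes such as CODECS (BANDWIDTH and RESOLUTION survive intact):
import Foundation

struct VariantInfo {
    let bandwidth: Int
    let resolution: String?
}

func loadVariants(from masterURL: URL,
                  completion: @escaping ([VariantInfo]) -> Void) {
    URLSession.shared.dataTask(with: masterURL) { data, _, _ in
        guard let data = data,
              let playlist = String(data: data, encoding: .utf8) else {
            completion([])
            return
        }
        var variants: [VariantInfo] = []
        for line in playlist.components(separatedBy: .newlines)
        where line.hasPrefix("#EXT-X-STREAM-INF:") {
            var bandwidth: Int?
            var resolution: String?
            for attribute in line.dropFirst("#EXT-X-STREAM-INF:".count)
                                 .components(separatedBy: ",") {
                let pair = attribute.components(separatedBy: "=")
                guard pair.count == 2 else { continue }
                if pair[0] == "BANDWIDTH" { bandwidth = Int(pair[1]) }
                if pair[0] == "RESOLUTION" { resolution = pair[1] }
            }
            if let bandwidth = bandwidth {
                variants.append(VariantInfo(bandwidth: bandwidth,
                                            resolution: resolution))
            }
        }
        completion(variants)
    }.resume()
}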
New in iOS 15, there's AVAssetVariant.
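A quick sketch of how that might look (iOS 15+; assumes the asset is an HLS AVURLAsset):
import AVFoundation

// Log each variant's peak bit rate and resolution.
func logVariants(of asset: AVURLAsset) async throws {
    let variants = try await asset.load(.variants)
    for variant in variants {
        // peakBitRate / presentationSize correspond to the BANDWIDTH and
        // RESOLUTION attributes of #EXT-X-STREAM-INF.
        print("peak bit rate: \(variant.peakBitRate)")
        if let size = variant.videoAttributes?.presentationSize {
            print("resolution: \(Int(size.width))x\(Int(size.height))")
        }
    }
}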
I'm working with AVAudioPlayer and AVAudioSession. I have an iPad and an audio interface (sound card).
This audio interface has 4 outputs (2 stereo pairs), connects via a Lightning cable, and draws power from the iDevice; it works excellently.
I've coded a simple play()/stop() AVAudioPlayer setup that works fine, BUT I need to assign specific channels of the audio interface (1-2 & 3-4). My idea is to send two audio files (A & B) each to its own output/channel pair (1-2 or 3-4).
I've read the AVAudioPlayer documentation and it says channelAssignments is for assigning channels to an audio player.
The problem is: I've created an AVAudioSession that gets the data of the device plugged into the USB port (the sound card). And I got:
let route = AVAudioSession.sharedInstance().currentRoute
for port in route.outputs {
    if port.portType == AVAudioSessionPortUSBAudio {
        let portChannels = port.channels  // [AVAudioSessionChannelDescription]?
        let sessionOutputs = route.outputs
        let dataSources = port.dataSources
        dataText.text = String(describing: portChannels) + "\n"
            + String(describing: sessionOutputs) + "\n"
            + String(describing: dataSources)
    }
}
Log:
outputs
Which data must I take and use to send the audio out with play()?
Wow - I had no idea that AVAudioPlayer had been developed at all since AVPlayer came out in iOS 4. Yet here we are in 2016, and AVAudioPlayer has channelAssignments while the fancy streaming, subtitle-displaying AVPlayer does not.
But I don't think you will be able to play two files A and B through one AVAudioPlayer, as each player can only open one file. That leaves:
1. creating two players (player(A) and player(B)) and setting the channelAssignments of each to one half of the audio device's output channels, dealing with the joys of synchronising the players (see the sketch below), or
2. creating a four-channel file, AB, and playing it through one player, assigning channelAssignments the full four channels you found above, dealing with the joys of slightly non-standard audio files.
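Here's a rough sketch of option 1 (untested; assumes urlA and urlB point to local audio files and that the USB interface exposes at least four output channels in the current route):
import AVFoundation

func playPair(urlA: URL, urlB: URL) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(AVAudioSessionCategoryPlayback)
    try session.setActive(true)

    // Find the USB output's channel descriptions in the current route.
    guard let usbPort = session.currentRoute.outputs
            .first(where: { $0.portType == AVAudioSessionPortUSBAudio }),
          let channels = usbPort.channels, channels.count >= 4 else {
        print("No 4-channel USB interface found"); return
    }

    // One channel description per channel of the file (stereo file -> two).
    let playerA = try AVAudioPlayer(contentsOf: urlA)
    let playerB = try AVAudioPlayer(contentsOf: urlB)
    playerA.channelAssignments = Array(channels[0...1]) // outputs 1-2
    playerB.channelAssignments = Array(channels[2...3]) // outputs 3-4

    playerA.play() // in real code, retain both players beyond this scope
    playerB.play()
}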
Sanity check: is your session.outputNumberOfChannels returning 4?
Personally, when I do this kind of thing I create a 4-channel Remote I/O audio unit, as I've found the higher-level APIs cause too much heartache once you do anything a little unusual. I also use AVAudioSessionCategoryMultiRoute because I don't have any >2-channel sound cards, so I have to cobble together the headphone jack plus a USB sound card to get 4 channels, but you shouldn't need this.
Despite it not having procedural output (like Remote I/O audio units have), you may also be able to use AVAudioEngine to do what you want.