Intro: We are creating a bot to record Teams meetings. After the meeting ends, we should receive the recording files in video format.
Issue: We have implemented this sample (https://github.com/microsoftgraph/microsoft-graph-comms-samples/tree/master/Samples/V1.0Samples/AksSamples/teams-recording-bot) on Azure Kubernetes Service. The bot joins the Teams meeting, recording starts automatically, and we receive the audio as a WAV file inside a ZIP archive on our local machine, but we need video files too. How can we get the video files into the .zip archive on our local machine?
Implementation: We have used IVideoSocket, but we don't know how to implement the code that captures the video so that we can save it locally in the .zip file.
```csharp
private void OnVideoMediaReceived(object sender, VideoMediaReceivedEventArgs e)
{
    this.GraphLogger.Info($"[{e.SocketId}]: Received Video: [VideoMediaReceivedEventArgs(Data=<{e.Buffer.Data.ToString()}>, Length={e.Buffer.Length}, Timestamp={e.Buffer.Timestamp}, Width={e.Buffer.VideoFormat.Width}, Height={e.Buffer.VideoFormat.Height}, ColorFormat={e.Buffer.VideoFormat.VideoColorFormat}, FrameRate={e.Buffer.VideoFormat.FrameRate})]");

    // TBD: Compliance Recording bots can record the Video here.
    // e.Buffer.Data is an unmanaged pointer to one raw frame in the logged ColorFormat;
    // to record, copy it into managed memory before the buffer is disposed, e.g.
    //   var frame = new byte[e.Buffer.Length];
    //   Marshal.Copy(e.Buffer.Data, frame, 0, (int)e.Buffer.Length);
    // and then append the frame to a file or feed it to an encoder for later zipping.
    e.Buffer.Dispose();
}
```
I'm working on a project where I have an instance of AVPlayer that plays different audio content retrieved from a backend, from podcasts to music and live streams. Each piece of content has two URLs: one for an mp3 file and one for an m3u8 file. All the mp3 files play fine. However, some m3u8 files work and others don't. In particular, the ones that don't work cause AVPlayer to fail with the error:
Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action"
UserInfo={NSLocalizedRecoverySuggestion=Try again later.,
NSLocalizedDescription=Cannot Complete Action.}
I don't understand what the problem is. According to this answer it is a malformed manifest file, which in my case is, for example, the following:
```
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,URI="_64/index.m3u8",GROUP-ID="2#48000-64000",NAME="AAC 64",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-MEDIA:TYPE=AUDIO,URI="_80/index.m3u8",GROUP-ID="2#48000-80000",NAME="AAC 80",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-MEDIA:TYPE=AUDIO,URI="_96/index.m3u8",GROUP-ID="2#48000-96000",NAME="AAC 96",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-STREAM-INF:BANDWIDTH=133336,CODECS="mp4a.40.2",AUDIO="2#48000-96000"
_96/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=100641,CODECS="mp4a.40.2",AUDIO="2#48000-64000"
_64/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=116989,CODECS="mp4a.40.2",AUDIO="2#48000-80000"
_80/index.m3u8
```
On the Apple forums I found this answer, which says iOS 14+ is at fault. Unfortunately, I cannot test on a physical iOS 13 device.
Do you have any suggestions?
Tested on Xcode 13.1 with an iPhone 7 Plus running iOS 15.0.2.
I finally found a solution for this issue; what worked for me was this. I believe the problem was that my manifest files were structured like the following:
```
#EXT-X-MEDIA:TYPE=AUDIO,URI="_64/index.m3u8",GROUP-ID="1#48000-64000",NAME="Audio 64",DEFAULT=NO,AUTOSELECT=NO
```
In particular, they had DEFAULT=NO,AUTOSELECT=NO, which means the player does not select any audio rendition on its own. Therefore, before calling replaceCurrentItem(with:), I now do the following:
```swift
let asset = AVAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)

// Explicitly select the first option in each media selection group,
// since DEFAULT=NO,AUTOSELECT=NO prevents any automatic selection.
for characteristic in asset.availableMediaCharacteristicsWithMediaSelectionOptions {
    if let group = asset.mediaSelectionGroup(forMediaCharacteristic: characteristic),
       let option = group.options.first {
        playerItem.select(option, in: group)
    }
}
```
This makes all my HLS audio playable by AVPlayer.
I don't see a version in your .m3u8. Try adding #EXT-X-VERSION:3 to your playlist. AVPlayer needs the version to be included in the playlist (Android's ExoPlayer does not). Here is an example of a playlist that might work:
```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-MEDIA:TYPE=AUDIO,URI="_64/index.m3u8",GROUP-ID="2#48000-64000",NAME="AAC 64",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-MEDIA:TYPE=AUDIO,URI="_80/index.m3u8",GROUP-ID="2#48000-80000",NAME="AAC 80",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-MEDIA:TYPE=AUDIO,URI="_96/index.m3u8",GROUP-ID="2#48000-96000",NAME="AAC 96",DEFAULT=NO,AUTOSELECT=NO
#EXT-X-STREAM-INF:BANDWIDTH=133336,CODECS="mp4a.40.2",AUDIO="2#48000-96000"
_96/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=100641,CODECS="mp4a.40.2",AUDIO="2#48000-64000"
_64/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=116989,CODECS="mp4a.40.2",AUDIO="2#48000-80000"
_80/index.m3u8
```
I want to cache videos that are displayed in a table view. However, I am not sure what to cache. I'm using AVFoundation, in particular AVPlayer, and creating AVPlayerItems.
My question is: what do I cache? Is it the AVPlayer, the AVPlayerItem, or the underlying asset of the AVPlayerItem, the AVAsset?
Please include a code sample (or library) with your answer. Thanks!
You need to have a private var mediaCache = NSMutableDictionary() for caching videos; the object you cache is the AVAsset.
How to cache:
```swift
let assetForCache = AVAsset(url: URL(string: cell.videoRef)!)
self.mediaCache.setObject(assetForCache, forKey: cacheKey as NSCopying)
```
How to use: just check whether an object exists in the cache for this row. If yes, use it; if not, download it. For example:
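A minimal sketch of that lookup, assuming cell.videoRef is the video's URL string and that same string is used as the cache key (the helper function is hypothetical):
```swift
import AVFoundation

let mediaCache = NSMutableDictionary()

// Hypothetical helper: return the cached AVAsset for a URL string if present;
// otherwise create it, store it in the cache, and return it.
func cachedAsset(for urlString: String) -> AVAsset? {
    let cacheKey = urlString as NSString
    if let cached = mediaCache.object(forKey: cacheKey) as? AVAsset {
        return cached
    }
    guard let url = URL(string: urlString) else { return nil }
    let asset = AVAsset(url: url)
    mediaCache.setObject(asset, forKey: cacheKey)
    return asset
}

// In cellForRowAt, build the player item from the cached asset:
// if let asset = cachedAsset(for: cell.videoRef) {
//     let playerItem = AVPlayerItem(asset: asset)
// }
```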
Caching means downloading the video for each cell and reusing the downloaded file the second time around. You can use either the temp location or the Documents directory.
Execute a download call to fetch the video from its URL.
In cellForRowAt, check whether the video is available locally. If it is, play it; otherwise stream it the first time and simultaneously start a background download of that video, so that the second time you can play the local copy without internet. A sketch of this check is below.
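A minimal sketch of that approach, assuming the local copy is stored in the Documents directory under the remote URL's last path component (the helper name is hypothetical):
```swift
import AVFoundation

// Hypothetical helper: play the local copy if it was downloaded earlier;
// otherwise stream the remote URL and download a copy in the background.
func playerItem(for remoteURL: URL) -> AVPlayerItem {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let localURL = documents.appendingPathComponent(remoteURL.lastPathComponent)

    if FileManager.default.fileExists(atPath: localURL.path) {
        return AVPlayerItem(url: localURL)   // second time: play without internet
    }

    // First time: stream now, and save a copy for next time.
    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, _ in
        guard let tempURL = tempURL else { return }
        try? FileManager.default.moveItem(at: tempURL, to: localURL)
    }.resume()

    return AVPlayerItem(url: remoteURL)
}
```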
I am a newcomer to Objective-C and have only 12 months of experience in iPhone development.
I am recording audio files in one UIViewController and playing them in another UIViewController. For playback I save a date string used to generate the URL, and that works properly.
But now my problem is that I want to play the previous audio recording for some time, and after that play the next audio file using its URL. I am saving all the data using NSUserDefaults. Please help me.
```objc
NSMutableArray *dateString;

NSURL *recordFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingString:[self.dateString objectAtIndex:sender.tag]]];
```
For playing:
```objc
player = [[AVAudioPlayer alloc] initWithContentsOfURL:recordFile error:&error];
```
As shown in the figure, when I tap tag 2, I want to play the first 5 seconds of tag 1, and after that play tag 2.
Record the audio -> save it in the temp directory -> track the saved path of the audio file (you can store the saved paths in an array). Repeat the same steps for the next file.
Use AVQueuePlayer for playing items one after the other; see the sketch below.
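The question is in Objective-C, but here is a minimal Swift sketch of the idea, assuming the two recordings were saved in the temp directory under hypothetical file names. Setting forwardPlaybackEndTime ends the first item after 5 seconds, and AVQueuePlayer then advances to the next item:
```swift
import AVFoundation

// Hypothetical paths for the two saved recordings (tag 1 and tag 2).
let firstURL = URL(fileURLWithPath: NSTemporaryDirectory() + "tag1.wav")
let secondURL = URL(fileURLWithPath: NSTemporaryDirectory() + "tag2.wav")

let firstItem = AVPlayerItem(url: firstURL)
// End the first recording after 5 seconds so the queue moves on to the next one.
firstItem.forwardPlaybackEndTime = CMTime(seconds: 5, preferredTimescale: 600)

let queuePlayer = AVQueuePlayer(items: [firstItem, AVPlayerItem(url: secondURL)])
queuePlayer.play()
```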
I have video that lives here:
http://195.16.112.71/adaptive/66aebabb-2632-44fc-abf1-df29bca6b941.video/66aebabb-2632-44fc-abf1-df29bca6b941.m3u8
Ffmpeg says that this video has 5 tracks, and that is correct.
But if I use AVURLAsset with that link, it tells me there aren't any tracks:
```objc
NSArray* const tracks = asset.tracks; // it's empty
```
I modified Apple's StitchedStreamPlayer sample to reproduce this problem; it lives here:
https://yadi.sk/d/hV3jfbx1Z9sfC
Simply click 'Load Movie', then the 'Play' button: the movie plays perfectly, but if you check the tracks variable in the prepareToPlayAsset function, you will find it is empty.
The question is: why is it empty if the video really has 5 tracks, and how can the video play at all if, as AVURLAsset claims, no tracks exist?
Thanks for your help in advance!
If you are streaming the video directly, the AVURLAsset won't report any tracks: for HLS content the asset itself exposes no media tracks, and track information only becomes available on the AVPlayerItem once playback starts. If you need tracks on the asset, you can download the file and ask for the tracks of the local video file's asset. A sketch of inspecting the player item's tracks is below.
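A minimal sketch of observing the track information that appears during playback (using the playlist URL from the question):
```swift
import AVFoundation

let url = URL(string: "http://195.16.112.71/adaptive/66aebabb-2632-44fc-abf1-df29bca6b941.video/66aebabb-2632-44fc-abf1-df29bca6b941.m3u8")!
let asset = AVURLAsset(url: url)
print(asset.tracks.count)   // 0: an HLS asset exposes no tracks itself

let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)

// AVPlayerItem.tracks is KVO-observable and fills in once playback begins.
let observation = item.observe(\.tracks, options: [.new]) { item, _ in
    print("Player item now reports \(item.tracks.count) tracks")
}
player.play()
```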