I need to use ReplayKit (a broadcast extension with broadcast UI) to cast content from an iPhone to a TV (Chromecast).
Currently I am using the HaishinKit library: I write the content (CMSampleBuffer) to an HTTPStream and then use that URL to cast to the TV, but it doesn't work.
let url = URL(string: "abc.m3u8")!   // HLS playlists use the .m3u8 extension
let mediaInfoBuilder = GCKMediaInformationBuilder(contentURL: url)
mediaInfoBuilder.streamType = GCKMediaStreamType.buffered
mediaInfoBuilder.contentID = url.absoluteString
mediaInfoBuilder.contentType = url.mimeType()   // mimeType() is presumably a custom URL extension in this project
mediaInfoBuilder.hlsSegmentFormat = .TS
mediaInfoBuilder.hlsVideoSegmentFormat = .MPEG2_TS
mediaInfoBuilder.streamDuration = .infinity
Where am I going wrong?
Can I use some other way to stream content to the Chromecast? With HTTPStream the content is delayed by about 5 to 10 seconds.
Thanks.
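For reference, here is a minimal sketch of the missing load step, assuming the Google Cast SDK 4.x: the built GCKMediaInformation has to be loaded on the current Cast session's remote media client, and the playlist URL must be an absolute http(s) address that the Chromecast itself can reach on the network. The URL and function name below are placeholders.

import GoogleCast

// Minimal sketch (Google Cast SDK 4.x assumed); the playlist URL is a placeholder
// and must be reachable by the Chromecast, not just by the phone.
func castLiveStream() {
    guard let url = URL(string: "http://192.168.1.10:8080/abc.m3u8") else { return }

    let builder = GCKMediaInformationBuilder(contentURL: url)
    builder.contentType = "application/x-mpegURL"   // HLS playlist MIME type
    builder.streamType = .live                      // a screen broadcast is live, not buffered
    builder.hlsSegmentFormat = .TS
    let mediaInformation = builder.build()

    // Without a load call the receiver never gets the stream.
    GCKCastContext.sharedInstance()
        .sessionManager
        .currentCastSession?
        .remoteMediaClient?
        .loadMedia(mediaInformation)
}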
I'm trying to get the file list from the Music Library (iPod music library), but I can't do it; my list is always empty. I'm sure that I have tracks in the Music Library, I checked it in another app and it works there. But as I remember, that application asked me for permission to access the Music Library. Perhaps I also need to make such a request? Help me solve the problem. I use this code to get the file list:
func fetchFileList() {
    let mediaItems = MPMediaQuery.songs().items
    let mediaCollection = MPMediaItemCollection(items: mediaItems ?? [])
    print("mediaCollectionItems: \(mediaCollection.items)") // It's always empty

    // Then I'd like to get the URL of the track
    //let item = mediaCollection.items[0]
    //let pathURL = item.value(forProperty: MPMediaItemPropertyAssetURL) as? URL
    //print("pathURL: \(pathURL)")
}
If you want to access the Music Library, you have to add the NSAppleMusicUsageDescription key to your Info.plist with a description of what you want to do with the music.
See the Apple documentation for more info: MediaPlayer Documentation
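Besides the Info.plist key, the app may also need to request authorization at runtime (the question mentions the other app showing a permission prompt). A minimal sketch using MPMediaLibrary.requestAuthorization, where fetchFileList() is the function from the question:

import MediaPlayer

// Request Music Library access before querying it; fetchFileList() is the
// function from the question above.
func fetchFileListIfAuthorized() {
    MPMediaLibrary.requestAuthorization { status in
        guard status == .authorized else {
            print("Music Library access was not granted (status \(status.rawValue))")
            return
        }
        DispatchQueue.main.async {
            fetchFileList()
        }
    }
}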
First I want to mention that I'm using the MobilePlayer library, but it's just a skin on top of AVPlayer.
When I tap to play a video, it only plays the sound and I do not see any video. I've checked the view bounds, but the problem is not there. At the beginning of the video it shows loading progress, but later I hear only the sound without any video.
My code looks like this:
let bundle = Bundle.main
let config = MobilePlayerConfig(fileURL: bundle.url(forResource: "Skin", withExtension: "json")!)
let videoURL = NSURL(string: channelURL!)
self.playerVC = MobilePlayerViewController(contentURL: videoURL! as URL, config: config)
self.playerVC.activityItems = [videoURL! as URL]
self.present(self.playerVC, animated: true)
I also want to mention that I can see the video when I run it on the simulator, but on a real device there is no video at all. What could the problem be?
If you need any other part of the code, please ask.
I'm sharing to Facebook with the FB share dialog like this:
guard let localIdentifier = localIdentifier else { return }

let assetURL = "assets-library://asset/asset.MOV?id=" + localIdentifier + "&ext=MOV"

let video = FBSDKShareVideo()
video.videoURL = URL(string: assetURL)

let content = FBSDKShareVideoContent()
content.video = video

let shareDialog = FBSDKShareDialog()
shareDialog.shareContent = content
shareDialog.delegate = self

DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    shareDialog.show()
}
I checked that the URL is valid. This is a local MOV file which I first copy to the camera roll; then I retrieve it using PHManager and try to share it with the FBSDK share dialog.
The FB sharing dialog with the movie appears and I press Post. FB seems to process the file and I get a checkmark, so the video was probably shared. Afterwards, however, I get a callback saying the share was cancelled. Has anybody experienced the same? What might be the problem?
Thanks in advance.
What version of the Facebook SDK are you using? I had this problem too, and it was because the version I was using was not yet compatible with iOS 11. I think it is compatible from v4.27 on; I was using v4.23. Hope it helps.
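If updating the SDK alone does not fix it, it can also help to log all three FBSDKSharingDelegate callbacks to tell a real cancellation apart from a failure. A minimal sketch, assuming FBSDKShareKit 4.x (the exact bridged Swift signatures can differ slightly between SDK versions, and ShareViewController is a placeholder for the presenting view controller):

import FBSDKShareKit

// ShareViewController is a placeholder name; the exact bridged signatures may
// differ slightly between FBSDKShareKit versions.
extension ShareViewController: FBSDKSharingDelegate {

    func sharer(_ sharer: FBSDKSharing!, didCompleteWithResults results: [AnyHashable: Any]!) {
        print("Share completed: \(results ?? [:])")
    }

    func sharer(_ sharer: FBSDKSharing!, didFailWithError error: Error!) {
        print("Share failed: \(String(describing: error))")
    }

    func sharerDidCancel(_ sharer: FBSDKSharing!) {
        print("Share was cancelled")
    }
}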
I am trying to use the LFLiveKit SDK to send RTMP streams to a server. I tried to stream the pixel buffer like so:
var Lsession: LFLiveSession = {
    let audioConfiguration = LFLiveAudioConfiguration.defaultConfiguration(for: LFLiveAudioQuality.high)
    let videoConfiguration = LFLiveVideoConfiguration.defaultConfiguration(for: LFLiveVideoQuality.low3)
    let session = LFLiveSession(audioConfiguration: audioConfiguration, videoConfiguration: videoConfiguration)
    return session!
}()

let stream = LFLiveStreamInfo()
stream.url = "rtmp://domain.com:1935/show/testS"
Lsession.pushVideo(frame.capturedImage)
How can I initialize the session with screen capture? Any pointers?
I had to set captureType in the session initialization like so:
let session = LFLiveSession(audioConfiguration: audioConfiguration, videoConfiguration: videoConfiguration, captureType: LFLiveCaptureTypeMask.inputMaskVideo)
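Putting the answer together with the snippets from the question, here is a minimal sketch of how the pieces might fit for a screen broadcast, assuming LFLiveKit's video input mask and manually pushed pixel buffers; the RTMP URL is the placeholder from the question, and startBroadcast() / processVideoSampleBuffer(_:) are illustrative helper names:

import LFLiveKit
import CoreMedia

// Minimal sketch: create the session with the video input mask so LFLiveKit does
// not open the camera itself, then push pixel buffers taken from the broadcast
// extension's sample buffers. Helper names are illustrative.
let session: LFLiveSession = {
    let audioConfiguration = LFLiveAudioConfiguration.defaultConfiguration(for: .high)
    let videoConfiguration = LFLiveVideoConfiguration.defaultConfiguration(for: .low3)
    return LFLiveSession(audioConfiguration: audioConfiguration,
                         videoConfiguration: videoConfiguration,
                         captureType: LFLiveCaptureTypeMask.inputMaskVideo)!
}()

func startBroadcast() {
    let stream = LFLiveStreamInfo()
    stream.url = "rtmp://domain.com:1935/show/testS"
    session.startLive(stream)
}

// Call this for every video CMSampleBuffer delivered by the broadcast extension.
func processVideoSampleBuffer(_ sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    session.pushVideo(pixelBuffer)
}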
I would like to play a video file that is on my local PC (URL: "http://192.168.1.5:9000/assets/uploadedFiles/4.mp4"). My iPhone and my PC are both on the same network. The following code works fine when I play videos that are on my iPhone; otherwise it displays only a black screen.
self.objMoviePlayerController = MPMoviePlayerController(contentURL: url!)
self.objMoviePlayerController.movieSourceType = MPMovieSourceType.Unknown
self.objMoviePlayerController.view.frame = self.videoView.bounds
self.objMoviePlayerController.scalingMode = MPMovieScalingMode.AspectFill
self.objMoviePlayerController.controlStyle = MPMovieControlStyle.Embedded
self.objMoviePlayerController.contentURL = url!
self.objMoviePlayerController.shouldAutoplay = true
self.videoView.addSubview(self.objMoviePlayerController.view)
self.objMoviePlayerController.prepareToPlay()
self.objMoviePlayerController.play()
I don't want to download the video file to my iPhone. Is there a way to stream the video file from my PC? Thanks.
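Since MPMoviePlayerController has been deprecated, one option is AVKit's AVPlayerViewController, which streams the file over HTTP instead of downloading it first. A minimal sketch under that assumption, using the URL from the question (playRemoteVideo(from:) is just an illustrative helper; a plain-http server may also need an App Transport Security exception in Info.plist):

import UIKit
import AVKit
import AVFoundation

// Minimal sketch: stream the MP4 from the PC with AVPlayer instead of the
// deprecated MPMoviePlayerController. playRemoteVideo(from:) is an illustrative helper.
func playRemoteVideo(from viewController: UIViewController) {
    guard let url = URL(string: "http://192.168.1.5:9000/assets/uploadedFiles/4.mp4") else { return }

    let player = AVPlayer(url: url)
    let playerVC = AVPlayerViewController()
    playerVC.player = player

    viewController.present(playerVC, animated: true) {
        player.play()
    }
}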