Screen Mirroring iPhone to smart TV - iOS

I want to create an application that mirrors the whole iPhone screen to a smart TV.
I have found some apps on the market that do the trick.
This is a sample:
https://apps.apple.com/ec/app/gomirror-screen-mirroring/id1484792219
I paid for it and tried the trial version, then mirrored the iPhone screen to a Samsung smart TV, and it worked.
I also tried something else: I used the Reflector Mac application (a desktop app) together with this code:
// Inside a UIViewController subclass; requires import UIKit, AVKit (for AVRoutePickerView)
// and AVFoundation (for AVPlayer and friends).
override func viewDidLoad() {
    super.viewDidLoad()

    // AirPlay route picker button
    let routePickerView = AVRoutePickerView(frame: CGRect(x: 0.0, y: 30.0, width: 30.0, height: 30.0))
    routePickerView.backgroundColor = UIColor.lightGray
    self.view.addSubview(routePickerView)

    // Sample video player
    let avAsset = AVAsset(url: URL(string: "https://sample-videos.com/video123/mp4/720/big_buck_bunny_720p_1mb.mp4")!)
    let avPlayerItem = AVPlayerItem(asset: avAsset)
    let avPlayer = AVPlayer(playerItem: avPlayerItem)
    let avPlayerLayer = AVPlayerLayer(player: avPlayer)
    avPlayerLayer.frame = CGRect(x: 0.0, y: 40.0, width: self.view.frame.size.width, height: self.view.frame.size.height - 40.0)
    self.view.layer.addSublayer(avPlayerLayer)

    avPlayer.seek(to: CMTime.zero)
    avPlayer.play()
}
This code shows a player and adds the ability to mirror ONLY the player to the Mac screen. But I was just playing around; what I really want is to mirror the full screen to a smart TV.
Edit 1:
I found that some applications record the phone's screen and send it as video to the TV, which introduces a delay between the action on the iPhone and the preview on the TV. If anyone knows how to do that programmatically, that would be great.
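For reference, one way this record-and-stream approach can be done in-process is ReplayKit's RPScreenRecorder, which hands you the captured frames as CMSampleBuffers. Below is a minimal sketch of the capture side only; note that RPScreenRecorder records just your own app's UI (mirroring the entire device screen needs a Broadcast Upload Extension with RPBroadcastSampleHandler), and sendFrameToTV(_:) is a hypothetical placeholder for whatever transport to the TV you implement yourself.
import ReplayKit
import CoreMedia

// Minimal in-process screen capture sketch. Captures only this app's own UI.
func startCapturingScreen() {
    let recorder = RPScreenRecorder.shared()
    guard recorder.isAvailable else { return }

    recorder.startCapture(handler: { sampleBuffer, bufferType, error in
        guard error == nil, bufferType == .video else { return }
        // Each CMSampleBuffer is one captured video frame; encode it and
        // send it over your own transport to the TV.
        sendFrameToTV(sampleBuffer)
    }, completionHandler: { error in
        if let error = error {
            print("Could not start capture: \(error)")
        }
    })
}

// Hypothetical placeholder: encode the frame (e.g. H.264) and stream it to the TV.
func sendFrameToTV(_ frame: CMSampleBuffer) {
    // implementation-specific
}
The delay mentioned above comes from exactly this encode-and-stream pipeline, which is why recording-based apps always lag slightly behind the live screen.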

Related

AirPlay support in iOS

I'm trying to add AirPlay to my audio app. Here is my code:
let buttonView = UIView(frame: CGRect(x: 0, y: 0, width: 50, height: 50))
let routerPickerView = AVRoutePickerView(frame: buttonView.bounds)
routerPickerView.tintColor = UIColor.white
routerPickerView.activeTintColor = UIColor.white
buttonView.addSubview(routerPickerView)
self.btnsStack.addArrangedSubview(buttonView)
This works well with my Mac, but when I try to play to my Samsung smart TV, a loader keeps spinning. I tried connecting my TV with other apps like Spotify, and it works.
Found my solution: for AirPlay casting to a smart TV, this one line of code is a must:
player?.allowsExternalPlayback = false
Here, player is your AVPlayer.
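For context, here is a minimal sketch putting the route picker and the player together for an audio app; the stream URL and the class name are placeholders, not from the original post.
import UIKit
import AVFoundation
import AVKit

final class AudioPlayerViewController: UIViewController {
    private var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Placeholder stream URL
        player = AVPlayer(url: URL(string: "https://example.com/stream.m4a")!)
        // The key line from the answer above: disable external (video) playback
        // so AirPlay treats the route as an audio stream.
        player?.allowsExternalPlayback = false

        // Standard AirPlay route picker button
        let routePickerView = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 50, height: 50))
        routePickerView.tintColor = .white
        routePickerView.activeTintColor = .white
        view.addSubview(routePickerView)

        player?.play()
    }
}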

How do I use screen mirroring in iOS apps

I just need to trigger screen mirroring on an enabled device when the user taps a button on my app's screen. There is an AirPlay button in my app; tapping it lists available devices such as a TV or laptop, but I don't think that is screen mirroring. I want to invoke screen mirroring from my app (mirroring, not casting) on that screen.
I don't know Objective-C well, so please use Swift if you include example code. Thanks.
Please check below; this might help you understand how apps currently achieve this.
Screen Mirroring iPhone to smart TV
// Inside a UIViewController subclass; requires import UIKit, AVKit (for AVRoutePickerView)
// and AVFoundation (for AVPlayer and friends).
override func viewDidLoad() {
    super.viewDidLoad()

    // AirPlay route picker button
    let routePickerView = AVRoutePickerView(frame: CGRect(x: 0.0, y: 30.0, width: 30.0, height: 30.0))
    routePickerView.backgroundColor = UIColor.lightGray
    self.view.addSubview(routePickerView)

    // Sample video player
    let avAsset = AVAsset(url: URL(string: "https://sample-videos.com/video123/mp4/720/big_buck_bunny_720p_1mb.mp4")!)
    let avPlayerItem = AVPlayerItem(asset: avAsset)
    let avPlayer = AVPlayer(playerItem: avPlayerItem)
    let avPlayerLayer = AVPlayerLayer(player: avPlayer)
    avPlayerLayer.frame = CGRect(x: 0.0, y: 40.0, width: self.view.frame.size.width, height: self.view.frame.size.height - 40.0)
    self.view.layer.addSublayer(avPlayerLayer)

    avPlayer.seek(to: CMTime.zero)
    avPlayer.play()
}
This code shows a player and adds the ability to mirror ONLY the player to the Mac screen.
You can check below as well.
how-to-do-screen-mirroring-using-airplay-from-the-application-not-from-control
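As far as I know, there is no public API to start system-level screen mirroring from inside an app; AVRoutePickerView only presents the system route picker, and mirroring itself has to be enabled by the user (for example from Control Center). What an app can do is react once mirroring or an external display is active and draw its own content on that screen. A minimal sketch, assuming you want custom content on the TV once the user starts mirroring (ExternalViewController is a placeholder name):
import UIKit

final class MirroringViewController: UIViewController {
    private var externalWindow: UIWindow?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Fired when AirPlay mirroring starts or an external display is connected.
        NotificationCenter.default.addObserver(forName: UIScreen.didConnectNotification,
                                               object: nil,
                                               queue: .main) { [weak self] note in
            guard let screen = note.object as? UIScreen else { return }
            let window = UIWindow(frame: screen.bounds)
            window.screen = screen                                // deprecated in iOS 13, but still functional
            window.rootViewController = ExternalViewController()  // placeholder controller
            window.isHidden = false
            self?.externalWindow = window                         // keep a strong reference
        }

        // Tear the window down when the external screen goes away.
        NotificationCenter.default.addObserver(forName: UIScreen.didDisconnectNotification,
                                               object: nil,
                                               queue: .main) { [weak self] _ in
            self?.externalWindow = nil
        }
    }
}

// Placeholder for whatever you want to show on the TV instead of a plain mirror image.
final class ExternalViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .black
    }
}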

Play YouTube video on ARImageAnchor in Swift

I want to play a YouTube video in my image-tracking module: when the image is tracked, play the YouTube video on it.
Can anyone help me?
private func makeVideo(with url: URL, size: CGSize) -> SCNNode? {
    // Configure the YouTube player view (loaded asynchronously on the main queue)
    DispatchQueue.main.async {
        self.player.frame = CGRect(x: 0, y: 0, width: 650, height: 400)
        self.player.autoplay = true
        self.player.loadPlayer()
    }
    // Use the player view as the plane's material
    let avMaterial = SCNMaterial()
    avMaterial.diffuse.contents = player
    // Plane sized to the tracked image
    let videoPlane = SCNPlane(width: size.width, height: size.height)
    videoPlane.materials = [avMaterial]
    // Node that sits on the image anchor
    let videoNode = SCNNode(geometry: videoPlane)
    videoNode.eulerAngles.x = -.pi / 2
    return videoNode
}
The player is a YTSwiftyPlayer from YoutubeKit:
import YoutubeKit

let player = YTSwiftyPlayer(playerVars: [.videoID("GJQsT-h0FTU")])
I tried this, but it shows a black screen on the image while the audio plays.
This may not be the exact solution to your issue, but I hope it helps in some way.
What you can do, in addition to the image detection, is display a UIWebView:
'UIWebView' was deprecated in iOS 12.0: No longer supported; please adopt WKWebView.
Even though Xcode will warn you with a message like the one above, WKWebView does not display anything in this setup, so you will need to stick with UIWebView until the issue is fixed.
So, assuming you have figured out the image detection part and can locate and place nodes over the detected image, you can implement the following function:
func displayWebSite(on rootNode: SCNNode, horizontalOffset: CGFloat) {
    DispatchQueue.main.async {
        // Open YouTube
        let request = URLRequest(url: URL(string: "https://youtu.be/7ehEPsrw1X8")!)
        // Define the size of the web view
        let webView = UIWebView(frame: CGRect(x: 0, y: 0, width: 650, height: 900))
        webView.loadRequest(request)
        // Plane sized relative to the detected image
        let webViewPlane = SCNPlane(width: horizontalOffset, height: horizontalOffset * 1.45)
        webViewPlane.cornerRadius = 0.025
        // Node carrying the plane
        let webViewNode = SCNNode(geometry: webViewPlane)
        // Set the web view as the plane's material
        webViewNode.geometry?.firstMaterial?.diffuse.contents = webView
        // Start fully transparent, then fade in below (otherwise the node stays invisible)
        webViewNode.opacity = 0
        // Put it a little in front to avoid merging with the detected image
        webViewNode.position.z += 0.04
        // Add the node and fade it in
        rootNode.addChildNode(webViewNode)
        webViewNode.runAction(.fadeIn(duration: 1.0))
    }
}
For reference, you can check one of my projects to see how that function works within a full project.
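For completeness, here is a sketch of how displayWebSite(on:horizontalOffset:) might be called from the image-detection callback. It assumes the function lives in a view controller (called ViewController here purely as a placeholder) that is set as the ARSCNView's delegate.
import ARKit
import SceneKit

extension ViewController: ARSCNViewDelegate {
    // Called when ARKit adds a node for a newly detected anchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        // Size the web view plane roughly to the physical width of the detected image.
        let width = imageAnchor.referenceImage.physicalSize.width
        displayWebSite(on: node, horizontalOffset: width)
    }
}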

WKWebView does not work with AVAudioSession's category

Setting the app's AVAudioSession category to AVAudioSessionCategoryPlayback with the mixWithOthers option makes sound from video in a UIWebView mix with third-party background music (such as Spotify).
But it does NOT work at all once I switch to WKWebView. Code sample:
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayback, with: .mixWithOthers)
    try audioSession.setActive(true)
} catch {
    print(error)
}

/*
// Mixed background music & video sound in web works with UIWebView
let webview = UIWebView(frame: CGRect(x: 0, y: 0, width: self.view.frame.width, height: 300))
self.view.addSubview(webview)
webview.loadRequest(URLRequest(url: URL(string: "http://www.ultimedia.com/deliver/generic/iframe/mdtk/01509739/src/8l50lp/ad/yes")!))
*/

// Mixed background music & video sound in web does NOT work with WKWebView
let webview = WKWebView(frame: CGRect(x: 0, y: 0, width: self.view.frame.width, height: 300))
self.view.addSubview(webview)
webview.load(URLRequest(url: URL(string: "http://www.ultimedia.com/deliver/generic/iframe/mdtk/01509739/src/8l50lp/ad/yes")!))
// end of viewDidLoad
Code above tested on Xcode 8.2.1, iOS 10.2.1
Any idea or workaround would be really helpful.
Many thanks.
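For readers on current Swift, the same audio session configuration is written with the typed constants; this is only a syntax update of the snippet above, with unchanged behaviour.
import AVFoundation

let audioSession = AVAudioSession.sharedInstance()
do {
    // Same category and option as above: playback, mixed with other apps' audio.
    try audioSession.setCategory(.playback, mode: .default, options: [.mixWithOthers])
    try audioSession.setActive(true)
} catch {
    print(error)
}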

SCNVideoNode playing extremely fast

I'm trying to show a video in augmented reality using Vuforia - but for the sake of this question, just showing the scene and video would be fine.
What's expected:
Show the video (playing) at the correct speed for video and audio and have them both in sync.
What's happening:
Audio plays at the correct speed. Video plays at a seriously fast speed - like 10x.
Tried:
I've tried changing the rate - it's ignored completely.
I've tried using different ways (AVPlayer, AVPlayerLayer, SKVideoNode(withURL)) of putting the video into the scene - all suffer from hyperactive-video syndrome.
I've tried other file formats - nope
I've tried local files and URL - no dice
I've tried throwing my laptop at a wall - it made the video go away
Code to return the scene with the video:
private func createVideoScene(with view: VuforiaEAGLView) -> SCNScene {
    // create the asset & player and grab the dimensions
    let asset = AVAsset(URL: NSURL(string: "https://inm-baobab-prod-eu-west-1.s3.amazonaws.com/public/inm/media/video/2016/09/02/61537094SansSouciGirlsSchool.mp4")!)
    let size = asset.tracksWithMediaType(AVMediaTypeVideo)[0].naturalSize
    let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))

    // SpriteKit scene that hosts the video node
    let videoNode = SKVideoNode(AVPlayer: player)
    videoNode.size = size
    videoNode.position = CGPoint(x: size.width * 0.5, y: size.height * 0.5)
    let videoScene = SKScene(size: size)
    videoScene.addChild(videoNode)

    // SceneKit plane that uses the SpriteKit scene as its material
    let videoWrapperNode = SCNNode(geometry: SCNPlane(width: 10, height: 8))
    videoWrapperNode.position = SCNVector3(x: 0, y: 0, z: 0)
    videoWrapperNode.geometry?.firstMaterial?.diffuse.contents = videoScene
    videoWrapperNode.geometry?.firstMaterial?.doubleSided = true
    videoWrapperNode.scale.y = -1
    videoWrapperNode.name = "video"

    let scene = SCNScene()
    scene.rootNode.addChildNode(videoWrapperNode)
    return scene
}
Thank you
PS. Help in Objective-C is also welcome :)
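As an aside, the snippet above uses pre-Swift 3 API names that no longer compile. The same scene construction in current Swift reads roughly as follows; this is only a syntax update (the unused VuforiaEAGLView parameter is dropped), not a fix for the playback-rate issue.
import SceneKit
import SpriteKit
import AVFoundation

private func createVideoScene() -> SCNScene {
    // Create the asset & player and grab the video dimensions
    let url = URL(string: "https://inm-baobab-prod-eu-west-1.s3.amazonaws.com/public/inm/media/video/2016/09/02/61537094SansSouciGirlsSchool.mp4")!
    let asset = AVAsset(url: url)
    let size = asset.tracks(withMediaType: .video)[0].naturalSize
    let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))

    // SpriteKit scene hosting the video node
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = size
    videoNode.position = CGPoint(x: size.width * 0.5, y: size.height * 0.5)
    let videoScene = SKScene(size: size)
    videoScene.addChild(videoNode)

    // SceneKit plane using the SpriteKit scene as its material
    let videoWrapperNode = SCNNode(geometry: SCNPlane(width: 10, height: 8))
    videoWrapperNode.geometry?.firstMaterial?.diffuse.contents = videoScene
    videoWrapperNode.geometry?.firstMaterial?.isDoubleSided = true
    videoWrapperNode.scale.y = -1
    videoWrapperNode.name = "video"

    let scene = SCNScene()
    scene.rootNode.addChildNode(videoWrapperNode)
    return scene
}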
