I'm implementing an app that retrieves a video stream using WebRTC (the libjingle_peerconnection library). At some point, the stream (RTCVideoTrack) may be removed. When this happens, the UIView (RTCEAGLVideoView) still shows the last frame of the stream. I want to set that view to black. How can I do that?
For now I'm removing the stream with the following code, but as I said, the last frame keeps showing in the view.
remoteVideoTrack.setEnabled(false) // RTCVideoTrack object
remoteVideoTrack.remove(videoView) // videoView is the RTCEAGLVideoView UI object
remotePeerConnection.close()
I encountered a similar issue. However, in my case a black view wasn't acceptable and I needed the renderer view to become completely transparent.
Since the video chat view controller in my case is displayed in a container, I was able to make the renderer last frame disappear by completely reloading the container.
This is the relevant code:
// Kill the renderer by tearing down and re-creating the child view controller
vcWebRtc?.willMove(toParent: nil)
vcWebRtc?.view.removeFromSuperview()
vcWebRtc?.removeFromParent()

vcWebRtc = UIStoryboard.instance(from: .WebRTC).instantiateInitialViewController() as? WebRtcVC
if let vcWebRtc = vcWebRtc {
    addChild(vcWebRtc)
    viewWebRtcContainer.addSubview(vcWebRtc.view)
    vcWebRtc.view.frame = viewWebRtcContainer.bounds
    vcWebRtc.didMove(toParent: self)
    vcWebRtc.delegate = self
}
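If fully reloading the container is more than you need, and a plain black view is acceptable (as in the question), a lighter option is to cover the renderer with an opaque view once the track is removed. A minimal sketch, with the overlay as my own placeholder; videoView is assumed to be the RTCEAGLVideoView from the question:

// Cover the stale last frame with an opaque black view;
// remove it again when a new track is attached.
let blackout = UIView(frame: videoView.bounds)
blackout.backgroundColor = .black
blackout.autoresizingMask = [.flexibleWidth, .flexibleHeight]
videoView.addSubview(blackout)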
I have a QR code scanner view containing an AVCaptureVideoPreviewLayer and a UIButton.
When this view displays, the text within the button does not show; it should display the word 'Cancel'. If I touch the button, or swipe over it without tapping, the button text appears at that point.
Does anyone know how I can get the button text to display properly?
Here's what my view hierarchy looks like:
When I first enter the Scanner view, the buttons look like this:
There is no text. The text will show only after I touch the buttons:
One last thing: this seems to be an issue only in iOS 10...
Any suggestions welcome. Thanks!
Yes. You need to add an overlay view:
The main view contains the video preview and the overlay view.
The overlay view contains the buttons.
For each button, you can use these events if you want to animate (a sketch follows this list):
TouchDown
TouchUpInside
TouchUpOutside
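A minimal sketch of that layering, assuming an existing AVCaptureSession named session (the view and button names are placeholders of mine, and buttonDown/buttonUp are hypothetical @objc animation handlers):

// Camera preview layer at the bottom of the main view
let previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.frame = view.bounds
view.layer.addSublayer(previewLayer)

// Transparent overlay view on top, holding the controls
let overlayView = UIView(frame: view.bounds)
overlayView.backgroundColor = .clear
view.addSubview(overlayView)

// The Cancel button lives in the overlay, not next to the preview layer
let cancelButton = UIButton(type: .system)
cancelButton.setTitle("Cancel", for: .normal)
cancelButton.frame = CGRect(x: 20, y: view.bounds.height - 60, width: 80, height: 40)
overlayView.addSubview(cancelButton)

// Optional press-animation hooks
cancelButton.addTarget(self, action: #selector(buttonDown), for: .touchDown)
cancelButton.addTarget(self, action: #selector(buttonUp), for: [.touchUpInside, .touchUpOutside])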
As the documentation of AVCaptureSession states, the startRunning() call blocks the thread it's called on, so invoking it on the main thread keeps your UI from being drawn correctly.
The startRunning() method is a blocking call which can take some time,
therefore you should perform session setup on a serial queue so that
the main queue isn't blocked (which keeps the UI responsive). See
AVCam-iOS: Using AVFoundation to Capture Images and Movies for an
implementation example.
If you check Apple's example (Swift 3 and Objective-C), you can easily set a new queue for those actions without blocking your main thread.
This is the example for Swift 2.3 if you need it.
// Swift 2.3
private let sessionQueue = dispatch_queue_create("session queue", nil)

override func viewDidLoad() {
    super.viewDidLoad()

    // Configure the capture session off the main thread...
    dispatch_async(sessionQueue) {
        self.configureSession()
    }
    // ...and keep the UI setup on the main queue
    dispatch_async(dispatch_get_main_queue()) {
        self.setupScreen()
    }
}
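For reference, here is the same pattern in Swift 3+ syntax (a sketch; configureSession() and setupScreen() are the same placeholder methods as above):

private let sessionQueue = DispatchQueue(label: "session queue")

override func viewDidLoad() {
    super.viewDidLoad()

    // Session setup happens on a serial background queue
    sessionQueue.async {
        self.configureSession()
    }
    // UI work stays on the main queue
    DispatchQueue.main.async {
        self.setupScreen()
    }
}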
Use layer.insertSublayer(..., above: ...) or layer.insertSublayer(..., below: ...) to control the stacking order of your CALayers.
Here's a link to a similar answer, plus a slightly more detailed manual for Swift on how to insert an image on top of a video stream.
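A quick sketch of that call, with placeholder layer names of my own:

// Keep the button overlay's layer above the camera preview layer
view.layer.insertSublayer(buttonOverlayLayer, above: previewLayer)
// The below: variant expresses the same ordering from the other side:
// view.layer.insertSublayer(previewLayer, below: buttonOverlayLayer)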
My application reads data from a MySql server and displays it in a table view. At the start of the call to the server, the application covers the tableView with a view (named "pdView") that has a light gray background color. That view is part of the same scene as the tableView (designed in IB) but starts out hidden, until just before the call to the server, where it becomes visible using
pdView.hidden = false
After the app gets the data and fills the tableview, using
dispatch_async(dispatch_get_main_queue(), { () -> Void in
    tableview.reloadData()
})
one can see the data displayed in the tableView, underneath the gray view.
I then try to hide the view again using
pdView.hidden = true
but it takes about 44 seconds for the gray color to disappear. During that time the app behaves normally and I can scroll the table up and down.
I tried putting the code that hides the view inside dispatch_async(), but to no avail.
What needs to be done so that pdView disappears as soon as it is set back to hidden?
ALL your UI code needs to run on the main thread. That includes things like changing the state of a view's hidden flag.
Do you have that code wrapped in a call to
dispatch_async(dispatch_get_main_queue())
as well?
It sounds like you're trying to update the UI from a background thread. You need to give your pdView.hidden = true some context. You can try
dispatch_async(dispatch_get_main_queue(), { () -> Void in
    tableview.reloadData()
    self.pdView.hidden = true
})
Note the self in self.pdView.hidden.
Hope that helps.
I'm trying to create an app that uses UIImagePickerController for its picture-taking. I'm using a cameraOverlayView to accomplish this (for now it just contains two buttons, one for taking/keeping the picture and the other for canceling/retaking). The steps below outline what occurs.
UIImagePickerController.view (with cameraOverlayView) is shown
I click "Shoot" and the imagePicker does its magic and I'm able to get an image out of it. However going past the image acquisition section (after I dismiss the custom view), the imagePicker goes onto "still" preview screen mode without any controls other than the "<" back button at the top of the screen. I'd like to avoid that and instead continue to use my custom view provided to the cameraOverlayView to handle "Keep/Retake" actions until I'm finished.
As you can see in the image above, there is "black" band at the bottom of the view with my two custom buttons in it. I have no idea how the black band gets there since I did not add it. My custom View only has the two buttons.
Attempting to add an UIImageView on the fly (to display a still preview of the image taken to the user while still keeping my action buttons) has the effect below .
The red arrow is pointing to an extra section below the UIImage I added after acquiring the picture data from the imagePicker.
That section is actually displaying the live preview the imagePicker is constantly producing.
What I'd like to accomplish is to get the bounds of that live preview section so I can calculate the correct bounds for the UIImageView I add on the fly, since hard-coding the height will absolutely fail when ported to different devices.
Any suggestions?
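One way to estimate the live preview's frame (this is my own assumption, not something from the post): the iPhone camera preview has a 4:3 aspect ratio, so its on-screen height can be derived from the screen width:

// Sketch: estimate the camera preview rect inside the picker's view,
// assuming the standard 4:3 camera aspect ratio anchored at the top.
let screenWidth = UIScreen.main.bounds.width
let previewHeight = screenWidth * 4.0 / 3.0
let previewFrame = CGRect(x: 0, y: 0, width: screenWidth, height: previewHeight)

// Size the on-the-fly UIImageView to cover exactly that area
let stillPreview = UIImageView(frame: previewFrame)
stillPreview.contentMode = .scaleAspectFill
stillPreview.clipsToBounds = true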
Hi, I am trying to build an application like this one (I can't upload a video to Stack Overflow, so I uploaded it to my blog and linked it here):
http://blog.naver.com/cooksunghun/220488172371
There are multiple videos in a table view, but video plays in only one tableViewCell at a time.
I made a tableViewCell and embedded a video in it, but I can't figure out how to play only the one cell depending on the scroll position (I am new to programming).
I tried scrollViewDidScroll and visibleCells:
func scrollViewDidScroll(scrollView: UIScrollView) {
    let cell = VideoCell()
    let visibleCell = tableView.visibleCells
    var count = visibleCell[0]
    switch count {
    case visibleCell[0]: cell.playVideo()
    case visibleCell[1]: print("test2")
    case visibleCell[2]: print("test3")
    case visibleCell[3]: print("test4")
    default: print("testDefault")
    }
}
I hope someone can give me an idea. Thank you! (Sorry for my bad English; I'm waiting for your ideas!)
Maybe it's too late, but you're on the wrong track. You don't need the UIScrollViewDelegate methods; the only thing you need is these two methods from UITableViewDelegate:
- tableView:willDisplayCell:forRowAtIndexPath:
For what? According to the Apple docs:
A table view sends this message to its delegate just before it uses
cell to draw a row, thereby permitting the delegate to customize the
cell object before it is displayed. This method gives the delegate a
chance to override state-based properties set earlier by the table
view, such as selection and background color. After the delegate
returns, the table view sets only the alpha and frame properties, and
then only when animating rows as they slide in or out.
So, as you scroll, this method fires while the next cell is being prepared for display, and this is where you should set up your video content to play.
- tableView:didEndDisplayingCell:forRowAtIndexPath:
For what? According to the Apple docs:
Use this method to detect when a cell is removed from a table view, as
opposed to monitoring the view itself to see when it appears or
disappears.
So, as you scroll, the cell that was last seen is being prepared for teardown while the next cell is being prepared for display. This is where you should stop your video content. You didn't give any information about your video content class, which could be MPMoviePlayerController or AVFoundation, so look up how to play and stop your particular video content.
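A minimal sketch of that wiring, assuming a VideoCell class with hypothetical playVideo()/stopVideo() methods (the post doesn't say which player class is actually used):

// Start playback just before a cell comes on screen
func tableView(_ tableView: UITableView, willDisplay cell: UITableViewCell,
               forRowAt indexPath: IndexPath) {
    (cell as? VideoCell)?.playVideo()
}

// Stop playback once a cell has scrolled off screen
func tableView(_ tableView: UITableView, didEndDisplaying cell: UITableViewCell,
               forRowAt indexPath: IndexPath) {
    (cell as? VideoCell)?.stopVideo()
}

Note that as written this starts every cell that scrolls into view; to keep only one video playing at a time, you would additionally compare indexPath against, say, the topmost visible row before calling playVideo().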
The long version: I am writing an app for iOS 8 in Swift, and its main functionality requires it to display a slideshow of pictures and videos. Since, to my knowledge, an AVPlayer cannot display pictures (I've tried feeding it an image URL the same way I feed it a video URL, with no luck), I have to add and remove a UIImageView or an AVPlayer from the view in order to display the appropriate content.
However, in order to show the content correctly, I have to hide the app's navigation bar by default. The idea is to show it again if the user taps on the view; the view has a TapGestureRecognizer that serves this purpose. These tap gestures work correctly as long as the view's content is the image view. However, if the view presently contains the AVPlayer, the gestures are not received, leaving me unable to let the user show the navigation bar and leave the view. Here is how the AVPlayer is added to the view:
if (Utilities.IsVideo(url)) {
    var escapedUrl = url.stringByAddingPercentEscapesUsingEncoding(NSUTF8StringEncoding)
    let player = AVPlayer(URL: NSURL(string: escapedUrl!))

    // Embed an AVPlayerViewController as a child of this view controller
    var cont = AVPlayerViewController()
    cont.player = player
    self.addChildViewController(cont)
    self.view.addSubview(cont.view)
    cont.view.frame = self.view.frame
    player.play()
}
The short version: I have a view in an iOS app that contains a TapGestureRecognizer. The taps do not pass through an AVPlayer added to that view. How do I detect these taps?
It's worth mentioning that if the user taps one of the media controls (such as the play button), the tap gesture is recognized and its action is triggered.
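One workaround to sketch (my own assumption, not something confirmed in the post): since the built-in playback controls are what end up swallowing the touches, you can hide them and attach a recognizer to the player controller's view yourself, at the cost of losing the built-in play/pause UI. viewTapped is a hypothetical handler name, and the snippet uses modern Swift selector syntax:

// AVPlayerViewController's control layer intercepts touches,
// so hide the controls and listen for taps directly.
cont.showsPlaybackControls = false

let tap = UITapGestureRecognizer(target: self, action: #selector(viewTapped))
cont.view.addGestureRecognizer(tap)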