I'm capturing ARFrames and applying filters, which works fine. The issue is that I want to be able to turn the filters off and go back to the original camera feed. This code applies the filter:
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // setFilter(_:) returns a filtered CIImage; context is a CIContext
    let filterImage = setFilter(session.currentFrame)
    sceneView.scene.background.contents = context.createCGImage(filterImage, from: filterImage.extent)
}
Once I set sceneView.scene.background.contents, I cannot set it back to the original source. The original source is an object of type SCNCaptureDeviceOutputConsumerSource, which is not in the documentation. I tried saving that object and setting the background contents back to it, but it just displays the last frame it was holding (a still image); it does not continuously update. I don't know how to make sceneView.scene.background.contents pull data from the same source it used before I replaced it.
I tried setting sceneView.session.delegate = nil, but that did not work either: the view just stops updating and the screen appears to freeze.
Is there a way to reset ARSCNView background contents to the original source it was getting data from?
If I reload the ARSCNView, it works, but it takes at least a second and a half to reload:
sceneView.session.delegate = nil
sceneView = nil
sceneView = ARSCNView(frame: view.bounds)
view.addSubview(sceneView)
sceneView.delegate = self
sceneView.session.run(ARFaceTrackingConfiguration())
Thank you for any help you can give.
I'm new to posting on Stack Overflow, so apologies if the quality of this answer isn't that high. I had the same problem as you; here's how I fixed it.
First, save the original sceneView.scene.background.contents in a variable.
Then you can change sceneView.scene.background.contents to, for example, UIColor.lightGray.
If you want to reset it back to its original source, just do the following:
sceneView.scene.background.contents = originalSource
Just make sure not to overwrite the originalSource variable, because then you lose the reference to the original source.
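For reference, here is a minimal sketch of that idea (the property and method names other than the SceneKit ones are placeholders of mine):

// Saved once, before any filter is applied; background.contents is typed Any?
var originalSource: Any?

func applyFilter(_ filteredFrame: CGImage) {
    if originalSource == nil {
        // Keep a reference to the live camera feed source
        originalSource = sceneView.scene.background.contents
    }
    sceneView.scene.background.contents = filteredFrame
}

func removeFilter() {
    // Hand the saved source back to the scene background
    sceneView.scene.background.contents = originalSource
}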
Hope this helped,
Best Regards
I apply real-time effects to video using Core Image while it is played with AVPlayer. The problem is that when the player is paused, the filters are not applied if you tweak the filter parameters using a slider.
let videoComposition = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler: {[weak self] request in
// Clamp to avoid blurring transparent pixels at the image edges
let source = request.sourceImage.clampedToExtent()
let output:CIImage
if let filteredOutput = self?.runFilters(source, filters: array)?.cropped(to: request.sourceImage.extent) {
output = filteredOutput
} else {
output = source
}
// Provide the filter output to the composition
request.finish(with: output, context: nil)
})
As a workaround, I used this answer, which worked up to iOS 12.4 but no longer works on iOS 13 beta 6. I'm looking for a solution that works on iOS 13.
After reporting this as a bug to Apple and getting some helpful feedback, I have a fix:
player.currentItem?.videoComposition = player.currentItem?.videoComposition?.mutableCopy() as? AVVideoComposition
The explanation I got was:
AVPlayer redraws a frame when AVPlayerItem’s videoComposition property gets a new instance or, even if it is the same instance, a property of the instance has been modified.
As a result, forcing a redraw can be achieved by creating a 'new' instance simply by copying the existing one.
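As a rough sketch of how this can be wired up when a filter parameter changes while the player is paused (the slider action and the filterIntensity property are placeholders for whatever actually drives your CIFilter parameters):

@objc func sliderValueChanged(_ slider: UISlider) {
    // Update whatever value the applyingCIFiltersWithHandler block reads
    filterIntensity = CGFloat(slider.value)

    // Re-assigning a copy makes AVPlayer treat the composition as a new
    // instance and redraw the current frame, even while paused
    player.currentItem?.videoComposition =
        player.currentItem?.videoComposition?.mutableCopy() as? AVVideoComposition
}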
I couldn't find a good answer that works for my problem. I am creating a SpriteKit game on iOS. I'm going to give a quick overview of my setup and then go over the problem; I think that's the best way to explain this.
The Setup:
Basically, in my didMove(to:) method I call two helper functions: setupNodes() and setupLevel(). These functions load the different scenes and SKSpriteNodes into memory and then add them to the scene. Here is the code I use to do this:
func setupNodes(){
doubleVent = (loadForegroundOverlayTemplate("DoubleVent").copy() as! SKSpriteNode)
}
func loadForegroundOverlayTemplate(_ fileName: String) -> SKSpriteNode{
let overlayScene = SKScene(fileNamed: fileName)!
let overlayTemplate = overlayScene.childNode(withName: "ForegroundOverlay")!.copy() as! SKSpriteNode
return overlayTemplate
}
This stores a copy of the node named "ForegroundOverlay" from the loaded scene. I then pass it to a function called createForegroundOverlay() like so:
let overlaySprite = doubleVent
createForegroundOverlay(doubleVent)
and then "createForegroundOverlay" adds the SKSpriteNode stored in doubleVent to the scene like so:
func createForegroundOverlay(_ overlayTemplate: SKSpriteNode) {
let foregroundOverlay = overlayTemplate.copy() as? SKSpriteNode
lastOverlayPosition.x = lastOverlayPosition.x + (lastOverlayWidth + ((foregroundOverlay?.size.width)!)/2)
lastOverlayWidth = (foregroundOverlay?.size.width)!/2.0
foregroundOverlay?.position = lastOverlayPosition
//TESTING PURPOSES
if TESTING_NODE_POS{
print("The last overlay position x:\(lastOverlayPosition.x), y:\(lastOverlayPosition.y)")
print("The last overlay width is \(lastOverlayWidth)")
print("Adding a foreground overlay \(count)")
count += 1
}
//TESTING PURPOSES
if TESTING_BOX_OUTLINE {
let foregroundPos = foregroundOverlay?.position
let boxLocation = foregroundOverlay?.childNode(withName: "Box1")?.position
if boxLocation != nil {
print("The location of the box \(String(describing: boxLocation))")
print("The foregroundLocation is \(String(describing: foregroundPos))")
print("last overlay position is \(lastOverlayPosition)")
}
}
//add the foreground overlay as a child of the foreground node
fgNode.addChild(foregroundOverlay!)
}
The positioning variables (lastOverlayPosition, lastOverlayWidth, etc.) are just properties of my GameScene class used to know where and when to add the overlay that was passed in.
The fgNode is a node that I stored from my GameScene.sks file like so:
let worldNode = self.childNode(withName: "World")!
bgNode = worldNode.childNode(withName: "Background")!
bgCityNode = worldNode.childNode(withName: "BackgroundCity")
cityBGOverlayTemplate = (bgCityNode.childNode(withName: "Overlay")!.copy() as! SKNode)
cityOverlayWidth = bgCityNode.calculateAccumulatedFrame().width
backgroundOverlayTemplate = (bgNode.childNode(withName: "Overlay")!.copy() as! SKNode)
backgroundOverlayWidth = backgroundOverlayTemplate.calculateAccumulatedFrame().width
fgNode = worldNode.childNode(withName: "Foreground")!
This was also done in my setupNodes() method.
The problem:
Maybe some of you have already spotted the problem, but here it is: when I launch my game, it crashes and I get the message:
"Thread 1: EXC_BAD_ACCESS: (code = 1, address = 0x78)"
It is the exact same message every single time a crash occurs. I think I understand what the error is saying: there is a pointer to some location in memory (0x78) that has nothing there, so dereferencing it causes a fault. Here is where I get confused: this only happens about 50% of the time. The project builds successfully every time and then crashes 50% of the time with that error message. Secondly, this occurs at the very beginning. That is odd to me, because how can some memory already have been freed, leaving a bad pointer, at the very start of the game? Also, if the crash doesn't occur at launch, it never occurs at all, except when I reload the scene after the game-over scene is displayed, which is basically the same as the game being relaunched.
With some time I narrowed the problem to one line of code:
fgNode.addChild(foregroundOverlay!)
If I comment this line out, no crash ever occurs (I tried building and running 50 times). The problem must be with the foregroundOverlay variable, which was set up using the code in the setup section above. I have no idea how to fix this. Any ideas?
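For what it's worth, here is a defensive variant of createForegroundOverlay I can use while debugging (just a sketch: it replaces the force unwraps with a guard and a log, but it does not explain the underlying crash):

func createForegroundOverlay(_ overlayTemplate: SKSpriteNode) {
    // Bail out with a log instead of crashing if the copy fails
    guard let foregroundOverlay = overlayTemplate.copy() as? SKSpriteNode else {
        print("createForegroundOverlay: copy() did not produce an SKSpriteNode")
        return
    }
    lastOverlayPosition.x += lastOverlayWidth + foregroundOverlay.size.width / 2
    lastOverlayWidth = foregroundOverlay.size.width / 2
    foregroundOverlay.position = lastOverlayPosition
    fgNode.addChild(foregroundOverlay)
}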
P.S. It might also be worth noting that adding a child to the scene this way only became a noticeable problem when I upgraded to Xcode 10.0, and that I have considered enabling Zombie Objects, but I don't think that would help since the crash only happens at the launch of the game.
If I was unclear anywhere please let me know.
After being all over Stack Overflow and the deepest corners of the internet, I have yet to find a solution. I hope someone out there will be able to help me out with a problem I've had for days now.
The app in question has a collection view with lots of items. When you tap an item you get to a preview collection view. And finally (where I need help), when you tap "GO" you come to a collection view with cells that fill the entire screen. Each cell consists of an AVPlayerLayer and an AVPlayer. Each time you scroll right or left you see another video. The following code works:
UICollectionViewCell
class PageCell: UICollectionViewCell {
var player: AVPlayer?
var playerLayer: AVPlayerLayer?
var playerItem: AVPlayerItem?
var videoAsset: AVAsset?
var indexPath: IndexPath?
var page: Page? {
didSet {
guard let unwrappedPage = page, let unwrappedIndexPath = indexPath else { return }
addingVideo(videoID: unwrappedPage.steps[unwrappedIndexPath.item].video)
}
}
func setupPlayerView() {
player = AVPlayer()
playerLayer = AVPlayerLayer(player: player)
playerLayer?.videoGravity = .resizeAspectFill
videoView.layer.addSublayer(playerLayer!)
layoutIfNeeded()
playerLayer?.frame = videoView.bounds
}
func addingVideo(videoID: String) {
setupPlayerView()
guard let url = URL(string: videoID) else { return }
videoAsset = AVAsset(url: url)
activityIndicatorView.startAnimating()
videoAsset?.loadValuesAsynchronously(forKeys: ["duration"]) {
guard self.videoAsset?.statusOfValue(forKey: "duration", error: nil) == .loaded else { return }
self.player?.play()
self.playerItem = AVPlayerItem(asset: self.videoAsset!)
self.player?.replaceCurrentItem(with: self.playerItem)
}
}
In the UICollectionViewController I'm reusing cells. Since there is no direct way to know the number of AVPlayer instances, I simply made a helper variable inside PageCell, and it seems like three cells are getting reused (the normal amount when dequeuing and reusing cells). When I close this UICollectionViewController, the AVPlayer instances seem to disappear/close.
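(A helper along these lines, trimmed to just the counter, is enough to see how many cells, and therefore players, stay alive; it is incremented in init and decremented in deinit:)

class PageCell: UICollectionViewCell {
    // Rough instance counter, for debugging only
    static var liveCellCount = 0

    override init(frame: CGRect) {
        super.init(frame: frame)
        PageCell.liveCellCount += 1
        print("PageCell instances alive: \(PageCell.liveCellCount)")
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        PageCell.liveCellCount += 1
    }

    deinit {
        PageCell.liveCellCount -= 1
        print("PageCell instances alive: \(PageCell.liveCellCount)")
    }
}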
Now, the problem arises when I want to loop the videos. Using AVPlayerLooper is not an option because it is simply too laggy (I've implemented it in a dozen different ways without luck). So my solution was to use a periodic time observer inside the videoAsset?.loadValuesAsynchronously block:
self.timeObserver = self.player?.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 2),
queue: DispatchQueue.global(), using: { (progressTime) in
if let totalDuration = self.player?.currentItem?.duration{
if progressTime == totalDuration {
self.player?.seek(to: kCMTimeZero)
self.player?.play()
}
}
})
Problem
Video showcasing problem: Problem video
The problem arises when about 17 or more AVPlayer instances are running simultaneously: suddenly the videos won't load anymore. iOS devices have a hardware limit of somewhere between 4 and 18 AVPlayer instances running at the same time (most likely depending on RAM); see the following Stack Overflow posts, to mention a few:
AVPlayerItemStatusFailed error when too many AVPlayer instances created
How many AVPlayers are allowed to be created at the same time?
AVPlayer not able to play after playing some items
Loading issues with AVPlayerItem from local URL
See also these articles for more insight:
Building native video Pins
Too Many AVPlayers?
Because the problem only occurs when adding the time observer, I suspect that it keeps the AVPlayer instances "alive" even though I've closed the collection view.
Notes on the Problem video: every time I press "GO", one AVPlayer instance is created. When I swipe to the right, two more cells are created and hence two more AVPlayer instances. All in all, 3 AVPlayer instances are created each time, eventually accumulating to about 17 instances, at which point the videos will not load anymore. I've tried scrolling through all the videos each time I've pressed "GO", but this does not change the outcome: a maximum of three cells are reused at any time.
What I've tried to solve the problem
Try 1:
I made playerLayer a global variable and in viewDidDisappear inside the UICollectionViewController I added:
playerLayer?.player = nil
This caused one AVPlayer instance to disappear/close when I closed the cells and returned to the "GO" page (i.e. the view disappeared), which meant I hit the 17-instance limit a bit later than without this code. Together with the code above I also tried adding playerLayer = nil, playerLayer?.player?.replaceCurrentItem(with: nil) and playerLayer?.removeFromSuperlayer(), but the only thing that changed anything was playerLayer?.player = nil. It should be noted that if I do not scroll and simply open the first video and close it, open it and close it, and so on, I can do this "forever" without any problems. So in this case one instance is created and then closed/disappears afterwards.
Video showcasing try 1: Try 1 video
Try 2:
I changed the addPeriodicTimeObserver block to:
self.timeObserver = playerLayer?.player?.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale:
2), queue: DispatchQueue.global(), using: { (progressTime) in
if let totalDuration = playerLayer?.player?.currentItem?.duration{
if progressTime == totalDuration {
playerLayer?.player?.seek(to: kCMTimeZero)
playerLayer?.player?.play()
}
}
})
In this block I essentially changed all self.player? to playerLayer?.player?.
This let me open and close the view with the videos as much as I wanted, so the AVPlayer instances were somehow being closed/released. However, the looping no longer worked. The first video would loop initially, but when I swiped to the second cell, that video would not loop. Then I swiped back to the first cell and now it wouldn't loop either. If I added playerLayer?.player = nil as in "Try 1", it had no effect on either the open/close behavior or the loops.
Video showcasing try 2: Try 2 video
Try 3:
I made the timeObserver variable global and tried many, many things. But when I tried to remove the observer(s), it always resulted in the following error:
'NSInvalidArgumentException', reason: 'An instance of AVPlayer cannot remove a time observer that was added by a different instance of AVPlayer.'
The error indicates that the time observers are clearly all over the place. This became obvious when I added helper variables (counters) inside the addPeriodicTimeObserver block and found that the time observers were quickly adding up. At one point, with all the different combinations of the code presented in this post, I was able to remove a time observer at any time during scrolling, but that only removed a single time observer out of many, leaving me in the same situation as in "Try 1", where I was able to remove a single AVPlayer instance each time I closed the view.
General note:
To test whether it really was only three AVPlayer instances being created each time I pressed "GO" and swiped, I made a test where I scrolled through over 20 videos, and there was no problem loading them. Therefore I'm quite sure, as mentioned earlier, that a maximum of three AVPlayer instances are created in the view each time.
Conclusion
The problem, as I see it, is that when I add the time observers in order to loop the videos, they accumulate and keep the AVPlayer instances "alive". Remember that I had no problem at all without the time observers. When applying playerLayer?.player = nil it seems as though I was able to "save" an instance, but then again it's hard to tell when I don't know how many AVPlayer instances are currently "active". Simply put, all I want to do is destroy every AVPlayer instance the moment the view disappears, and it will be happy days. If anyone got this far, I will be overly grateful if you are able to help me out.
There is a retain cycle. Use [weak self] in escaping closures. The retain cycle you created was:
cell -> Player -> observer closure -> cell
Try this:
self.timeObserver = self.player?.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 2),
queue: DispatchQueue.global(), using: { [weak self] (progressTime) in
if let totalDuration = self?.player?.currentItem?.duration{
if progressTime == totalDuration {
self?.player?.seek(to: kCMTimeZero)
self?.player?.play()
}
}
})
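Beyond breaking the cycle, it also helps to tear the player down when the cell is reused or goes away; a sketch of what that could look like in PageCell (prepareForReuse is called before the cell is dequeued again):

override func prepareForReuse() {
    super.prepareForReuse()
    // Remove the observer from the same player that created it, otherwise
    // you get the "different instance of AVPlayer" exception
    if let observer = timeObserver {
        player?.removeTimeObserver(observer)
        timeObserver = nil
    }
    player?.replaceCurrentItem(with: nil)
    playerLayer?.removeFromSuperlayer()
    playerLayer = nil
    player = nil
}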
I have a photo/GIF app I made previously using AVFoundation as the base for the camera and taking photos, but I wanted to upgrade it to add some live filtering and post-capture filtering too.
After some digging I found GPUImage/GPUImage2, and since my project is in Swift 3, I started replacing my previous camera module with GPUImage2.
I got the camera to work again, but I have issues capturing a photo from the camera and storing it as a UIImage until it is uploaded to a server.
do {
self.videoCamera = try Camera(sessionPreset: AVCaptureSessionPresetPhoto, location: .frontFacing)
} catch {
self.videoCamera = nil
print("Couldn't initialize camera with error: \(error)")
}
This is my init, and this is where I place the camera feed in the view:
self.filterView!.frame = self.view.frame
self.filterView!.orientation = .portraitUpsideDown
self.filterView!.fillMode = .preserveAspectRatioAndFill
self.videoCamera! --> self.filterView!
self.videoCamera!.startCapture()
As you can see, for the moment I don't want to use any filters; I'm trying to first get the basic functionality back (i.e. showing a camera feed, then taking 1-5 images in a row).
I noticed there is a saveNextFrameToURL, but it saves the file on the device and I only want the UIImage, so this is what I put in place of the content of my takePhoto method (images is nil on the first run):
func takePhoto(){
if self.images == nil {
self.images = []
}
let pictureOutput = PictureOutput()
pictureOutput.encodedImageFormat = .jpeg
pictureOutput.imageAvailableCallback = {image in
self.images!.append(image)
}
self.videoCamera! --> pictureOutput
}
My issue is that imageAvailableCallback is simply never called (I tried placing a breakpoint in it, but nothing), while the rest of the method runs without raising any errors or warnings.
What am I doing wrong? Is it even possible to capture a still image from an unfiltered feed? If so, how can I add a filter that does not visibly change the image, so that I can still do some unedited photo capture in my app?
I've been going at it for over two weeks now, and every time I search for anyone with the same issue I only find questions about editing a still image or a filtered image. When I tried filtering the image like this:
self.filterView!.frame = self.view.frame
self.filterView!.orientation = .portraitUpsideDown
self.filterView!.fillMode = .preserveAspectRatioAndFill
self.baseFilter = BrightnessAdjustment()
self.videoCamera! --> self.baseFilter --> self.filterView!
self.videoCamera!.startCapture()
and the takePhoto method
func takePhoto(){
if self.images == nil {
self.images = []
}
let pictureOutput = PictureOutput()
pictureOutput.encodedImageFormat = .jpeg
pictureOutput.imageAvailableCallback = {image in
self.images!.append(image)
}
self.baseFilter! --> pictureOutput
}
I get a white screen instead of my camera feed and still no image.
Any help would be appreciated, thank you.
I found where my problem was by looking at similar issues and their comments (I found one where Brad Larson commented here).
Basically, it has to do with the lifespan of my pictureOutput variable: since it was declared inside a method, it didn't live long enough for the callback to fire and the image to be saved. By making pictureOutput a class-level property, I solved my issue.
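In case it helps anyone, this is roughly what the fix looks like (a trimmed-down sketch; the class name and surrounding properties are placeholders for my own view controller):

class CameraViewController: UIViewController {
    var videoCamera: Camera?
    var images: [UIImage]?
    // Kept as a property so it outlives takePhoto() and the callback can fire
    let pictureOutput = PictureOutput()

    func takePhoto() {
        if images == nil {
            images = []
        }
        pictureOutput.encodedImageFormat = .jpeg
        pictureOutput.imageAvailableCallback = { [weak self] image in
            self?.images?.append(image)
        }
        if let camera = videoCamera {
            camera --> pictureOutput
        }
    }
}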
I'm probably missing something. I'm trying to change the filter on my GPUImageView. It actually works the first two times (sometimes only once), and then stops responding to changes. I couldn't find a way to remove the target from my GPUImageView.
Code
for x in filterOperations {
    x.filter.removeAllTargets()
}
let f = filterOperations[randomIntInRange].filter
let media = GPUImagePicture(image: self.largeImage)
media?.addTarget(f as! GPUImageInput)
f.addTarget(g_View)
media?.processImage()
Any suggestions? (Note: I'm processing a still image from my library.)
UPDATE
Updated Code
//Global
var g_View: GPUImageView!
var media = GPUImagePicture()
override func viewDidLoad() {
super.viewDidLoad()
media = GPUImagePicture(image: largeImage)
}
func changeFilter(filterIndex : Int)
{
media.removeAllTargets()
let f = returnFilter(filterIndex) //i.e. GPUImageSepiaFilter()
media.addTarget(f as! GPUImageInput)
f.addTarget(g_View)
//second Part
f.useNextFrameForImageCapture()
let sema = dispatch_semaphore_create(0)
media.processImageWithCompletionHandler({
dispatch_semaphore_signal(sema)
return
})
dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER)
let img = f.imageFromCurrentFramebufferWithOrientation(largeImage.imageOrientation)
if img != nil
{
//Useable - update UI
}
else
{
// Something Went wrong
}
}
My primary suggestion would be to not create a new GPUImagePicture every time you want to change the filter or its options that you're applying to an image. This is an expensive operation, because it requires a pass through Core Graphics and a texture upload to the GPU.
Also, since you're not maintaining a reference to your GPUImagePicture beyond the above code, it is being deallocated as soon as you pass out of scope. That tears down the render chain and will lead to a black image or even crashes. processImage() is an asynchronous operation, so it may still be in action at the time you exit your above scope.
Instead, create and maintain a reference to a single GPUImagePicture for your image, swap out filters (or change the options for existing filters) on that, and target the result to your GPUImageView. This will be much faster, churn less memory, and won't leave you open to premature deallocation.
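A sketch of that structure, using the names from the question (g_View, largeImage and returnFilter are taken from the updated code above; I'm assuming returnFilter(_:) returns a GPUImageOutput subclass, and currentFilter is a new property to keep track of the active filter):

// Created once, e.g. in viewDidLoad, and kept for the lifetime of the view
var media: GPUImagePicture!
var currentFilter: GPUImageOutput?

func setupPicture() {
    media = GPUImagePicture(image: largeImage)
}

func changeFilter(filterIndex: Int) {
    // Detach the old chain on both ends, then rebuild
    // picture -> filter -> view and re-render
    media.removeAllTargets()
    currentFilter?.removeAllTargets()

    let f = returnFilter(filterIndex) // e.g. GPUImageSepiaFilter()
    media.addTarget(f as! GPUImageInput)
    f.addTarget(g_View)
    media.processImage()

    currentFilter = f
}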