How to stop Swift ARKit app from reverting SCNTechnique, and world tracking issues - ios

I am making an ARKit Swift app and have followed this post on how to apply a mirroring effect to the camera feed. However, I've run into a couple of problems. First, the mirroring effect stops after a while. This often happens when the images I am searching for come into the frame, but it also happens when I'm looking at something totally unrelated. The camera simply reverts to the default, unmirrored view and does not go back to the SCNTechnique mirrored view.
I have been able to confirm that this is not triggered by anchors being placed in the world, since those lines of code are not executing when the mirroring effect reverts. Is there a way to stop this from happening and make sure the mirroring SCNTechnique remains? I am unsure whether this is simply a hardware limitation of the iPhone 11 the app is running on, or whether there is an issue with the code.
Also, whenever the image reverts from the mirrored effect to the default camera feed, the world tracking is severely broken. The coordinate axes placed in the view essentially just move around with the camera and do not seem to be attached to any specific place. Is this related to the above issue, or would this be a separate problem with the code or implementation?
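For reference, the setup is roughly shaped like this (a sketch, not my exact code; `mirrorTechnique` stands in for the technique built per the linked post). It includes the session callbacks where the technique could at least be reapplied whenever ARKit reports an interruption or degraded tracking, since limited tracking is also when placed nodes drift with the camera:

import ARKit
import UIKit

// A sketch, not exact code: reapply the technique whenever the session
// reports an interruption or degraded tracking, which are the moments the
// feed reverts and placed nodes start drifting with the camera. Assumes
// sceneView.delegate = self was set during setup.
class MirrorViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()
    var mirrorTechnique: SCNTechnique?  // built per the linked post

    func sessionInterruptionEnded(_ session: ARSession) {
        // Reapply after an interruption (e.g. returning from the background).
        sceneView.technique = mirrorTechnique
    }

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        // Limited tracking (relocalizing, excessive motion, low detail) is
        // also when anchors appear to move with the camera.
        if case .limited(let reason) = camera.trackingState {
            print("Tracking limited: \(reason)")
            sceneView.technique = mirrorTechnique
        }
    }
}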

Related

ARKit insideFrustumOf ambiguous result

I want to show a navigation guide (tell the user to move left or right if a node is not visible), and for this I am using this code snippet to check whether the node is in the camera's view or not:
renderer.isNode(pathNode, insideFrustumOf: navigationNode)
Most of the time it works fine, but when the object/node is on the ground (i.e. near your feet, or even a little behind them) and not visible in the camera, it still returns true. Is there anything I am missing, or anything that can be done to make the experience better?
After some more debugging, I found out that the third-party library I am using for creating the custom geometry scene node is at fault; the function itself works well.
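For later readers, a minimal sketch of the check as documented (assuming `pathNode` is the node being guided toward): the second argument of isNode(_:insideFrustumOf:) is the point of view to test against, normally renderer.pointOfView. The test is based on the node's bounding box rather than its visible geometry, which is why an inflated bounding box from a third-party geometry library (as found above) can return true for off-screen content.

import SceneKit

// Sketch: `pathNode` is the node the user is being guided toward. The frustum
// test is against a point-of-view node, normally the renderer's camera node.
func isNodeVisible(_ pathNode: SCNNode, in renderer: SCNSceneRenderer) -> Bool {
    guard let pointOfView = renderer.pointOfView else { return false }
    // True when pathNode's bounding box intersects the camera frustum, so an
    // oversized bounding box can report true for off-screen geometry.
    return renderer.isNode(pathNode, insideFrustumOf: pointOfView)
}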

Positioning and Resizing handles similar to Apple Pages

Apple's Pages app lets you add an image, text box, or shape layer to the page and then resize it by tapping it, at which point the handles appear. A similar thing also happens in the Pixelmator app and a few others. Is this something made by Apple that I can use in my app, or would I have to build it myself?
As far as I know there is no system support for resize handles, so you will need to build them yourself. That's what I've done when I needed them: I added views on top of the thing I wanted to resize, with pan gesture recognizers attached, as sketched below.
I have an app called CIFilterTest on GitHub (written in Objective-C, unfortunately) that uses resize handles to let the user move points and rects around when they are needed for the various Core Image filters. Even though it's written in Objective-C, it should give you the idea.
Note that most Core Image filters run VERY slowly on the simulator, making the app seem extremely laggy. That's an artifact of running Core Image filters on the simulator; they run much faster on an actual iOS device.
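To illustrate the approach, here is a minimal Swift sketch (sizes and styling are arbitrary): a handle view pinned to the target's bottom-right corner, with a pan gesture recognizer that grows or shrinks the target's frame.

import UIKit

// A sketch of a DIY resize handle: add an instance to the same superview as
// `target`, and dragging the handle resizes the target.
class ResizeHandle: UIView {
    weak var target: UIView?

    init(target: UIView) {
        self.target = target
        super.init(frame: CGRect(x: 0, y: 0, width: 24, height: 24))
        backgroundColor = .blue
        layer.cornerRadius = 12
        // Start pinned to the target's bottom-right corner.
        center = CGPoint(x: target.frame.maxX, y: target.frame.maxY)
        addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(handlePan)))
    }

    required init?(coder: NSCoder) { fatalError("not supported") }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard let target = target, let superview = target.superview else { return }
        let translation = gesture.translation(in: superview)

        // Grow or shrink the target by the drag amount, with a minimum size.
        var frame = target.frame
        frame.size.width = max(44, frame.size.width + translation.x)
        frame.size.height = max(44, frame.size.height + translation.y)
        target.frame = frame

        // Keep the handle pinned to the corner and reset the translation.
        center = CGPoint(x: frame.maxX, y: frame.maxY)
        gesture.setTranslation(.zero, in: superview)
    }
}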

Simple JavaScript Sprite Animation - line flicker, but only on iOS devices

There is a problem with the sprite animation on the homepage of one of my clients, but it only appears when the site is viewed on an iOS device, namely an iPhone or iPad. I can't replicate the problem on any other device or emulator, so I'm having trouble troubleshooting it (I don't own an iPhone or iPad). The problem: what looks to be a 1px line appears on the right edge of the animation frame pretty much all the time, and a similar line flickers occasionally at the top of the frame as the animation runs. The animation itself is a simple JavaScript sprite-sheet animation. I'm operating under the assumption that the sprite animation is programmed correctly, since it appears correctly on every other device, platform, and browser I've checked. It even works in IE.
Two questions:
What would cause a simple sprite animation to display differently when rendered by iOS?
As a small business consultant, I don't have the time and my clients don't have the budget for me to physically test on every single device, so I have to rely on emulators. What other options do I have if the emulators don't properly demonstrate what the device will display?
I'm not entirely sure of the protocol regarding posting a link to my client's production website, but I'm happy to send a link to anyone willing to help who responds and/or messages me.
Welcome to SO.
I spend a lot of time working specifically with iOS on the web and have run into similar situations. Without tweaking an example you post I won't be able to prove it exactly, but this should at least give you some direction.
Flickering or semi-opaque lines are often caused by the scaling applied to the asset. In the world of high-DPI displays and fluid layouts, there are differences in rounding that result in fine lines, shimmers, and the like. Is there any scaling set on the assets, e.g. background-size or downsampling?
The emulators are displaying the software correctly; these issues are a result of the hardware. The best thing you can do is buy a flagship device for each of the platforms you test on, or look into local resources like Clearleft's Device Testing Lab.
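To make the rounding point concrete (sketched in Swift for consistency with the rest of this page; on the web the usual fixes are keeping background-size and background-position at whole-pixel values, or adding transparent gutters between sprite frames so neighboring frames can't bleed through):

import CoreGraphics

// On a 2x or 3x display, an edge that lands on a fractional device pixel can
// be resolved differently from frame to frame, which reads as a 1px line or
// shimmer. Snapping coordinates to the device pixel grid avoids it.
func pixelAligned(_ value: CGFloat, scale: CGFloat) -> CGFloat {
    (value * scale).rounded() / scale
}

let snapped = pixelAligned(10.3, scale: 2.0)
// 10.3pt on a 2x screen is 20.6 device pixels; snapped = 10.5pt, exactly 21.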

Qt screen orientation change

I'm using the Qt 5.1 beta on iOS, and I am deploying my app on an iPad.
The problems I am having concern how touch events are sensed and processed by Qt. As long as I keep the iPad oriented upright (i.e. front camera at the top), everything works fine. In that configuration, if I touch the screen, the coordinates of the touch point sensed through mousePressEvent(QMouseEvent *e) indicate that, as expected, the origin of the coordinate system is in the upper left corner of the screen.
When I turn my iPad, say to the left, so that the camera is on the left, my UI correctly rotates so that my buttons stay aligned to the screen. However, if I sense the touch events as described above, the origin of the coordinate system has not changed, so it is now in the lower left corner of the screen. Because of this, the buttons act as if their touch areas had turned with the device instead of staying aligned to the screen (even though they are rendered aligned): tapping a button where it is displayed doesn't register, but tapping where it would have been without the rotation does.
Does anyone have any idea what might cause this or how it could be fixed?
Please ask if you would like to see code. I did not put any as I would not know what might be of interest and my app is quite big already.
Thanks,
Corneliu
You should file a bug report about it with the Qt developers, and also check whether there is already an existing report:
http://bugreports.qt-project.org/
In the meantime, you could push your coordinates through a mapping function of some sort based on the orientation of the device, as sketched below.
Also, calling `adjustSize()` on widgets tends to fix sizing and positioning with layouts.
Hope that helps.
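To make the interim mapping concrete, the idea looks like this (sketched in Swift for consistency with the rest of this page; in Qt the same arithmetic applies to a QPoint, keyed off the reported screen rotation, and the exact cases depend on which way your coordinates failed to rotate):

import CoreGraphics

// Sketch of the interim workaround: remap a raw touch point whose origin did
// not rotate with the UI. `screenSize` is the size of the space the raw point
// is reported in, and `rotationDegrees` is how far the UI has rotated
// (counter-clockwise) relative to that space. All names are illustrative.
func mapTouch(_ p: CGPoint, screenSize: CGSize, rotationDegrees: Int) -> CGPoint {
    switch rotationDegrees {
    case 90:   // UI rotated counter-clockwise.
        return CGPoint(x: p.y, y: screenSize.width - p.x)
    case 180:  // UI upside down: both axes flip.
        return CGPoint(x: screenSize.width - p.x, y: screenSize.height - p.y)
    case 270:  // UI rotated clockwise.
        return CGPoint(x: screenSize.height - p.y, y: p.x)
    default:   // No rotation; pass the point through.
        return p
    }
}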

Waving flag effect in an iOS application

There is a requirement in my application to make a flag image wave as if air were blowing across it.
I have found one similar effect, the ripple effect, but it is specifically for water ripples. It gives more or less the same feel as blowing air, but the problem is that it always emanates from the center of the view, whereas air blows across a flag from left to right.
Any suggestion will be helpful.
This is probably not the answer you'd desire, but you could:
a) use a movie with a waving flag as a background for your view
b) animate a series of images of a waving flag as a background for your view (sketched below)
c) try some OpenGL; this tutorial seems useful: http://nehe.gamedev.net/tutorial/flag_effect_(waving_texture)/16002/
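For option b, UIImageView can loop a set of pre-rendered frames as a flipbook. A minimal sketch, assuming frames named "flag_0" through "flag_29" in the asset catalog (the names and frame count are illustrative):

import UIKit

// Flipbook animation of pre-rendered flag frames. Frame names and counts are
// illustrative; render the frames from any waving-flag source you like.
let flagView = UIImageView(frame: CGRect(x: 0, y: 0, width: 300, height: 200))
flagView.animationImages = (0..<30).compactMap { UIImage(named: "flag_\($0)") }
flagView.animationDuration = 1.0   // one full wave per second
flagView.animationRepeatCount = 0  // 0 means repeat forever
flagView.startAnimating()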
