OpenTok - cycling camera on iOS degrades video quality

I've used OpenTok to create a web platform, and it works fine in web browsers and on Android devices. I've swapped the encoding to support iOS, and whenever someone cycles the camera on an iOS device, the video quality degrades a lot, making it fuzzy, and sometimes the camera feed freezes.
Is this just an encoding issue? I can't figure out why this happens or what causes it.

Related

MTKView vs GLKView performance on old devices

I am running into weird performance issues on an older device (a 2014 iPad mini 2) ever since migrating from OpenGLES-based views to MTKView. To summarize, I render a live video preview using MTKView and can optionally record videos (as in the RosyWriter sample code). My new app version is written in Swift and supports both Metal and OpenGLES, so I have both rendering paths to compare. I also have a legacy version of the app written in Objective-C that uses OpenGLES for all preview and rendering. Here are the performance issues I see on the iPad mini 2:
When using MTKView, if I debug the app in Xcode and zoom the video (using the AVCaptureDevice zoom API, with no Metal code to crop or zoom frames), the frame rate drops significantly (from 30 to 10 fps). If I record video while zooming (which adds a video-encoding pipeline on another thread), there are frame drops in the recorded video as well.
When I profile the app with Time Profiler, MTKView is smooth enough while zooming. However, if I then start recording, the MTKView fps drops again or preview frames get delayed, but there is no frame-drop issue in the recording -- the recorded video is still smooth.
If I debug a Release build of the app, the behavior is the same as while profiling.
If I switch back to GLKView instead of MTKView, there are no preview issues even in a Debug build. There are still issues when recording -- the preview gets delayed while zooming. But if I profile the app, there are NO delays or frame drops!
Finally, the original legacy version in Objective-C, which uses OpenGLES for everything, has no issues at all.
Now the question is: what tools in Instruments can I use to nail down the exact cause of the frame drops in preview and recording? Can Metal not match the performance of OpenGLES on older devices?
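Before reaching for Instruments, it can help to quantify the drops directly in code. The sketch below is a hypothetical frame-interval monitor (not part of MetalKit; the type and names are illustrative) that you would feed a `CACurrentMediaTime()` timestamp from the MTKView delegate's `draw(in:)` callback. Comparing its counts across Debug, Release, and profiled runs narrows down where the stalls occur.

```swift
// Minimal frame-drop monitor: feed it one timestamp per rendered frame.
// Hypothetical helper for instrumentation -- not a MetalKit API.
struct FrameDropMonitor {
    let expectedInterval: Double   // e.g. 1.0 / 30.0 for a 30 fps capture
    private(set) var frameCount = 0
    private(set) var droppedFrames = 0
    private var lastTimestamp: Double?

    mutating func recordFrame(at timestamp: Double) {
        if let last = lastTimestamp {
            frameCount += 1
            let gap = timestamp - last
            // A gap of ~2x the expected interval means at least one frame was missed.
            if gap > expectedInterval * 1.5 {
                droppedFrames += Int((gap / expectedInterval).rounded()) - 1
            }
        }
        lastTimestamp = timestamp
    }
}

// Synthetic check: 30 fps stream where every third frame is missing.
var monitor = FrameDropMonitor(expectedInterval: 1.0 / 30.0)
var t = 0.0
for i in 0..<9 {
    t += (i % 3 == 2) ? 2.0 / 30.0 : 1.0 / 30.0
    monitor.recordFrame(at: t)
}
// monitor.frameCount == 8, monitor.droppedFrames == 3
```

In the real delegate you would call `monitor.recordFrame(at: CACurrentMediaTime())` at the top of `draw(in:)`; if the drop count rises only while the encoder thread is active, the contention is between rendering and encoding rather than inside the Metal pass itself.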

ARKit just showing blue screen in Unity, not using camera?

OK, I have no idea what is going on here, and I can't find any solutions anywhere. Here is what happens when I try to run this ARKit Unity demo (or any AR demo, for that matter), https://github.com/FusedVR/PetAR, built to my iPhone:
The UI shows up, but where the camera capture is supposed to appear, I just have a blue screen. This is not what happens in their demo video online, and it seems no one else has this problem.
I am on Unity 5.6.6; I was on 2017 before, and that did not work either. I made sure I had some text in the "Camera Usage Description" field so the iPhone would allow camera access, and I am out of solutions at this point.
How can I get ARKit to work in Unity deployed to iOS? What am I doing wrong here?
I am deploying the Unity build via the most recent Xcode 9 beta.
There are certain hardware and software requirements for running ARKit-based applications.
https://developer.apple.com/arkit/
"High Performance Hardware and Rendering Optimizations: ARKit runs on the Apple A9 and A10 processors."
Practically, that means you need an iPhone 6s or newer.
"Introducing ARKit: iOS 11 introduces ARKit, a new framework..."
So iOS 11 is also required.
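At runtime on iOS 11+, the authoritative check is `ARWorldTrackingConfiguration.isSupported`. As a rough illustration of the hardware cutoff, here is a hypothetical helper that infers likely support from the machine identifier string (e.g. "iPhone8,1" is the iPhone 6s, the first A9 device); the function name and the iPhone-only scope are assumptions for brevity.

```swift
// Heuristic ARKit hardware check based on the machine identifier.
// Illustrative only: prefer ARWorldTrackingConfiguration.isSupported at runtime.
// Assumption: the iPhone8,x generation (iPhone 6s family, A9) is the cutoff.
func likelySupportsARKit(machineIdentifier id: String) -> Bool {
    guard id.hasPrefix("iPhone") else { return false } // iPads omitted for brevity
    let digits = id.dropFirst("iPhone".count).prefix(while: { $0 != "," })
    guard let major = Int(String(digits)) else { return false }
    return major >= 8 // "iPhone8,1" == iPhone 6s (A9)
}
```

If the heuristic and `isSupported` both pass but the camera feed is still blue, the problem is more likely in the Unity-to-Xcode build settings than in the hardware.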

Removing background (hiss) noise from video in iOS

I am recording videos and playing them back using AVFoundation. Everything is perfect except for a hiss that is present throughout the whole video. You can hear this hiss in every video captured on any iPad; even videos captured with Apple's built-in Camera app have it.
To hear it clearly, record a video in a place as quiet as possible without saying anything. It is easily detected through headphones with the volume at maximum.
After researching, I found out that this hiss is produced by the device's preamplifier and cannot be avoided while recording.
The only possible solution is to remove it during post-processing of the audio. Hiss can be reduced with a low-pass filter (to attenuate the high-frequency noise) combined with a noise gate. There are applications like Adobe Audition which can perform this operation. This video shows how it is achieved using Adobe Audition.
I have searched the Apple docs and found nothing that can do this directly. So I want to know if there is any library, API, or open-source project that can perform this operation. If not, how can I start going in the right direction? It does look like a complex task.
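For a sense of what such a post-processing step involves, here is a minimal sketch of the per-sample math: a one-pole low-pass filter (to attenuate high-frequency hiss) followed by a naive noise gate. The `HissReducer` type and its parameters are illustrative assumptions; a real pipeline would read PCM samples from the recorded file with AVAssetReader, process them (ideally with vDSP/Accelerate), and write them back with AVAssetWriter.

```swift
import Foundation

// Sketch of hiss reduction: one-pole low-pass filter + simple noise gate.
// Hypothetical type; real code would process buffers from AVAssetReader.
struct HissReducer {
    let alpha: Double          // low-pass smoothing factor, 0 < alpha <= 1
    let gateThreshold: Double  // samples whose filtered level is below this are muted
    private var state = 0.0

    init(cutoffHz: Double, sampleRate: Double, gateThreshold: Double) {
        // One-pole RC low-pass: alpha = dt / (RC + dt), with RC = 1 / (2*pi*fc)
        let dt = 1.0 / sampleRate
        let rc = 1.0 / (2.0 * Double.pi * cutoffHz)
        self.alpha = dt / (rc + dt)
        self.gateThreshold = gateThreshold
    }

    mutating func process(_ sample: Double) -> Double {
        state += alpha * (sample - state)                // low-pass smoothing
        return abs(state) < gateThreshold ? 0.0 : state  // noise gate
    }
}
```

A steady full-scale signal passes through almost unchanged, while near-silent passages (where only the preamp hiss remains) are gated to zero. The cutoff and threshold would need tuning by ear against real recordings.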

iOS 7+: is it possible to capture video from the front camera while showing another video on the screen?

I have a task: there is an app I should create for an iOS device.
The app shows a video file (a local file on the device) while the front camera simultaneously captures the user's face.
I can see that FaceTime and Skype for iOS do this. But the former was created by Apple (they can do whatever they like on their devices), and the latter is owned by Microsoft (big companies with big money are sometimes allowed more than ordinary developers).
Moreover, I have doubts about whether video capture can coexist with video playback at the same time.
So I am not sure that this task is 100% implementable and publishable.
Is it possible on iOS 7+?
Is it allowed by Apple (there are many technical possibilities on iOS, but only some of them are OK with Apple, especially during the review process)?
Are there good technical references?
I believe so. A search on the App Store shows a number of video-conferencing apps:
Zoom cloud
Polycom
VidyoMobile
Fuze
Just search for "video conferencing".
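Capture and playback can indeed coexist. The sketch below (untested here, since it needs a real device, and written against current API names rather than their iOS 7-era equivalents) shows the two key pieces: an AVAudioSession category that permits simultaneous playback and recording, and independent AVCaptureSession and AVPlayer objects. The class name and the "movie.mp4" asset are placeholders.

```swift
import AVFoundation
import UIKit

// Sketch: front-camera capture running while an AVPlayer plays a local file.
final class CaptureWhilePlayingController: UIViewController {
    let captureSession = AVCaptureSession()
    var player: AVPlayer?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Allow recording and playback at the same time.
        try? AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                         options: [.mixWithOthers])

        // Front camera into the capture session.
        if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                for: .video, position: .front),
           let input = try? AVCaptureDeviceInput(device: camera),
           captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }
        captureSession.startRunning()

        // Local video file playing on screen at the same time.
        // "movie.mp4" is a placeholder asset name.
        if let url = Bundle.main.url(forResource: "movie", withExtension: "mp4") {
            let player = AVPlayer(url: url)
            let layer = AVPlayerLayer(player: player)
            layer.frame = view.bounds
            view.layer.addSublayer(layer)
            player.play()
            self.player = player
        }
    }
}
```

Nothing here is forbidden by App Review as long as the usual camera/microphone permission prompts are honored; the video-conferencing apps above are evidence that this pattern is approved routinely.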

Could anyone guide me on developing with QR codes on BlackBerry?

I am developing a camera application that decodes QR codes in real time while the camera is on.
When I open the camera, I set a timer with about a 3-second interval to capture a picture of the screen with the Display.screenshot() method, and then decode it with the ZXing library while the camera is on. Sometimes it captures the QR code and decodes it successfully, but otherwise it is hard to decode the picture. I think the problem is the camera: I cannot use autofocus, so the captured picture is blurry. I want to know how to use autofocus with the camera.
My application targets OS version 5.0 and above.
Could anyone help me or suggest a new solution?
Thank you so much.
Try BBBarcodeLib, available for download here: http://aliirawan-wen.blogspot.com/2011/05/barcode-scanner-for-blackberry-os-50.html.
Note that for OS 6 devices you can use the built-in BlackBerry barcode-scanning library (built on Google's ZXing).
The main problem is likely that 3-second pause. You should be decoding frames as fast as you can, which should be many times per second. This will get you a successful scan much faster.
You can and should integrate the latest ZXing library instead of the one built into RIM's OS if you can, as it has small improvements that will be helpful.
